
High Memory and CPU Usage in Redis Container for Self-Hosted Sentry Application

Open Swaroop-1613 opened this issue 1 year ago • 6 comments

Body

We are using a self-hosted Sentry application, version 24.5.1, with Docker version 20.10.25.

Today, our Redis container in the Sentry application is experiencing high memory and CPU utilization. How can we identify which process is causing this within the container?

The following two processes are using the most memory:

```
USER     PID   %CPU %MEM VSZ      RSS      TTY STAT START TIME  COMMAND
libstor+ 7125  21.6 41.0 54768416 26520600 ?   Ssl  10:56 27:08 redis-server *:6379
libstor+ 29693 87.3 40.9 54768588 26460832 ?   R    13:00 0:09  redis-rdb-bgsave *:6379
```
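(Note that `redis-rdb-bgsave` is the child process Redis forks to write an RDB snapshot; because of copy-on-write, it briefly appears to duplicate the server's memory.) To see what is actually consuming memory inside the container, standard `redis-cli` introspection commands can help. This is a sketch; the container name `sentry-self-hosted-redis-1` is an assumption and may differ on your deployment:

```shell
# Overall memory breakdown: dataset size, peak usage, fragmentation.
docker exec sentry-self-hosted-redis-1 redis-cli INFO memory

# Sample the keyspace for the largest keys (uses SCAN, safe on a live server).
docker exec sentry-self-hosted-redis-1 redis-cli --bigkeys

# Sentry's Celery queues are Redis lists; check the backlog on a queue,
# e.g. the "default" queue.
docker exec sentry-self-hosted-redis-1 redis-cli LLEN default
```

A large, growing list length usually means events are arriving faster than the workers can drain them.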

Swaroop-1613 avatar Jun 21 '24 12:06 Swaroop-1613

@hubertdeng123 we need your suggestion, please.

balaG4046 avatar Jun 21 '24 16:06 balaG4046

Wonder if the memory limit discussed here helps.

azaslavsky avatar Jun 21 '24 23:06 azaslavsky
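For reference, capping Redis memory generally means setting `maxmemory` and an eviction policy in the Redis configuration. The file location, service wiring, and values below are illustrative assumptions, not recommendations for this deployment:

```
# redis.conf (or passed as arguments to redis-server)
maxmemory 10gb
maxmemory-policy allkeys-lru   # evict least-recently-used keys when the cap is hit
```

Note the trade-off: an eviction policy that drops keys can silently discard queued events, while `noeviction` makes writes fail once the cap is reached. Which failure mode is acceptable depends on the deployment.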

Hi, could you please help us with the following questions regarding our Sentry application's use of Redis?

  1. What type of messages are coming into the queue, and how frequently are these messages arriving?
  2. Is there a way to limit these messages to prevent overutilization of the Redis container?

We have observed that when any project sends a high volume of error messages, our Redis container utilization increases significantly. In our Sentry application, we currently have the option to set a rate limit for sending error messages only at the project level. Is there a way to set a rate limit at the organization level? Thank you for your assistance.

Swaroop-1613 avatar Jun 24 '24 06:06 Swaroop-1613

Rate limiting can only be done at the project level, not at the organization level, for self-hosted instances. By high volume, what is the approximate number of error messages coming in? Self-hosted Sentry is not designed to operate at larger scales.

  1. Unprocessed event data
  2. You may want to create your own standalone rate limiter before events start hitting redis

hubertdeng123 avatar Jun 25 '24 23:06 hubertdeng123
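A minimal sketch of the standalone rate limiter idea above: a token bucket that a proxy in front of Sentry's ingest endpoint could consult before forwarding an event. The class, limits, and placement are hypothetical illustrations, not part of Sentry:

```python
import time

class TokenBucket:
    """Org-wide token bucket: allow roughly `rate` events/sec, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start full
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens accrued since the last call, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False                # caller should drop the event or return HTTP 429

# Hypothetical org-wide limit: ~100 events/sec, bursts up to 200.
bucket = TokenBucket(rate=100, capacity=200)
```

Run in a reverse proxy (rejecting excess events with HTTP 429 before they reach Sentry), this keeps bursts from ever landing in Redis, which is where the suggestion differs from capping Redis memory after the fact.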

Hey @hubertdeng123. Upon searching, we found the following link: URL. We are considering setting the Redis memory limit to 10GB. Questions:

  1. Is setting the Redis memory limit to 10GB an efficient solution to our problem?
  2. Is this the correct way to handle the issue?

Please let us know your thoughts.

Swaroop-1613 avatar Jun 27 '24 06:06 Swaroop-1613

This will cap the amount of memory that Redis can use, but it won't ensure your Sentry instance runs smoothly under high load. This is going beyond what self-hosted Sentry is designed to do. What does your event load look like? I think you should consider creating your own standalone rate limiter, as mentioned above, which may help with this.

hubertdeng123 avatar Jun 28 '24 21:06 hubertdeng123

This issue has gone three weeks without activity. In another week, I will close it.

But! If you comment or otherwise update it, I will reset the clock, and if you remove the label Waiting for: Community, I will leave it alone ... forever!


"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀

getsantry[bot] avatar Jul 20 '24 07:07 getsantry[bot]