Many concurrent users
Hi,
I am sorry to hijack GitHub issues for a question, but I really did not find a "final answer" to my questions anywhere.
We're about to set up a very simple live chat with JWT authorization (just to make sure unsubscribed users can't listen or chat).
We're going to have about 300k concurrent users, but only for a maximum of 15 minutes (it's kind of a webinar with live streaming and live chat). There will be big chaos like in a Twitch stream, we know that, but it has to get done.
I've already set up a working chat environment, but I am not sure how to test everything.
We have an Intel i7 6400 quad-core (8 logical cores with Hyper-Threading) and 64 GB of RAM. There is no other service running on this server, only Node and SocketCluster. We've tuned limits.conf and sysctl.conf up to the highest practical values.
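To give an idea of the kind of tuning I mean, settings along these lines (the values are illustrative, not our exact config):

```
# /etc/security/limits.conf: raise the per-process open-file limit,
# since each WebSocket connection consumes one file descriptor
*    soft    nofile    1000000
*    hard    nofile    1000000

# /etc/sysctl.conf: illustrative values
fs.file-max = 1000000                 # system-wide file descriptor cap
net.core.somaxconn = 65535            # listen() accept backlog
net.ipv4.tcp_max_syn_backlog = 65535  # pending-handshake backlog
```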
My questions are: how many concurrent users will this server be able to handle? I've read that there is a Linux limit of 65k connections per port. How do we get around this?
Do I have to set up an nginx proxy to load-balance (using something like 5-10 ports)?
Do I have to worry about Node.js garbage collection? https://blog.jayway.com/2015/04/13/600k-concurrent-websocket-connections-on-aws-using-node-js/
Is there any advice you could give me on this? Do we need more servers to achieve what we need?
I'd love to donate some Bitcoin fractions in advance.
My questions are: how many concurrent users will this server be able to handle? It depends. This sounds like a single application with multiple users who are all allowed to send and receive messages. With that in mind, you need to ensure you have enough workers; 1 per CPU core seems good.
Message size (really just chat messages), message rate, and the number of subscribers/users will limit performance.
I'm just making a wild guess here, but I would assume that server would be able to support 100,000 users. You can easily test this by spinning up some AWS EC2 instances, each spawning 10,000 clients that randomly send and receive messages. Once one instance is running, add another to see how your server performs.
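A load-generator instance could look roughly like the sketch below. This assumes the pre-v15 socketcluster-client API (connect / subscribe / watch / publish); the channel name "chat", the hostname, the message size, and the publish intervals are all placeholders you'd replace with your own setup:

```javascript
// load-client.js: hypothetical load-test sketch, one process = many fake users.
const NUM_CLIENTS = 10000;   // clients per load-generator instance (assumption)
const MESSAGE_BYTES = 200;   // rough size of one chat message (assumption)

// Build a dummy chat message of a realistic size.
function makeMessage(clientId) {
  return ('client-' + clientId + ': ').padEnd(MESSAGE_BYTES, 'x');
}

function startClients() {
  // Lazy require so the helpers can be loaded without the dependency installed.
  const socketCluster = require('socketcluster-client');
  for (let i = 0; i < NUM_CLIENTS; i++) {
    const socket = socketCluster.connect({ hostname: 'your-server-host', port: 8000 });
    const channel = socket.subscribe('chat');  // assumed channel name
    channel.watch(function (data) {
      // Count received messages here to measure fan-out throughput.
    });
    // Publish at random intervals to simulate bursty chat traffic.
    setInterval(function () {
      socket.publish('chat', makeMessage(i));
    }, 5000 + Math.random() * 25000);
  }
}

// Only start when explicitly asked, so the file can be inspected/tested safely.
if (process.env.RUN_LOAD_TEST) startClients();

module.exports = { makeMessage };
```

Run one copy per EC2 instance, watch the server's CPU and memory, and keep adding instances until something degrades.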
Do I have to set up an nginx proxy to load-balance (using something like 5-10 ports)?
Not if this is a single server.
Do I have to worry about Node.js garbage collection? https://blog.jayway.com/2015/04/13/600k-concurrent-websocket-connections-on-aws-using-node-js/
I'm not sure.
Is there any advice you could give me on this? Do we need more servers to achieve what we need?
If possible, use cloud and Kubernetes. Since each "live chat event" runs for ~15 min, it's much better to spin up too many resources for a short amount of time than to overload the server. I've tried to break SCC by overloading it with too many users, and it breaks hard. The only way to recover in our tests was a full restart of the environment while also adding more resources.
@toredash thank you so much for your quick and friendly answer. We will consider using cloud instances. Do you have any understandable (code) example of how to set up horizontal scaling? I just found a paper describing some MQTT techniques, but I did not really understand how to achieve this.
Last question: 8 CPUs = 8 workers, but how many brokers? -w 8 -b 8?
Thank you!
1-2 brokers should be enough IMHO.
I would consider using Kubernetes. Google Cloud has Kubernetes as a Service (https://cloud.google.com/kubernetes-engine/) that works well. They also provide load-balancing services that can distribute load based on the number of connections to each backend, so you get efficient distribution.
That is kind of a political decision based on our privacy policy. Do you have any self-hosted server examples?