I'm building a website where users can participate in, like, and dislike any given challenge.
Problem: if 10k or 1 million users join a given challenge at the same time, it can cause race conditions in my MySQL database and also in Redis.
What I want: to aggregate the participating (joined) users, likes, and dislikes.
Solution: I'm thinking about using Kafka as a message queue/broker; user events would then be consumed one by one, saved into Redis and the database, and aggregated.
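Roughly, this is what I imagine for the consumer side (just a sketch assuming the kafka-python and redis-py clients; the `challenge-events` topic and the event fields are placeholder names):

```python
import json

import redis
from kafka import KafkaConsumer

# Consume JSON events like {"challenge_id": 1, "user_id": 2, "action": "join"}
consumer = KafkaConsumer(
    "challenge-events",
    bootstrap_servers="localhost:9092",
    group_id="challenge-aggregator",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
r = redis.Redis()

for message in consumer:
    event = message.value
    key = f"challenge:{event['challenge_id']}"
    # HINCRBY runs atomically on the Redis server.
    r.hincrby(key, event["action"], 1)  # fields: "join", "like", "dislike"
    # Periodically (or in batches) flush these counters to MySQL for durability.
```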
One problem here: saving and aggregating takes time, so how can I show users that they have successfully joined the challenge?
One solution: when a user joins the challenge, I send the event to the Kafka queue, then immediately update that user's UI and show a success message (I don't push the updated join count to other users, because I am not using WebSockets).
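Something like this Django view is what I mean (again a sketch assuming kafka-python; the endpoint, topic, and JSON shape are made up):

```python
import json

from django.http import JsonResponse
from django.views.decorators.http import require_POST
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

@require_POST
def join_challenge(request, challenge_id):
    # Enqueue the event; a consumer updates Redis/MySQL asynchronously.
    producer.send(
        "challenge-events",
        {"challenge_id": challenge_id, "user_id": request.user.id, "action": "join"},
    )
    # Respond right away so the current user's UI can show "joined" optimistically.
    return JsonResponse({"status": "joined"})
```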
Other app example: take https://stackshare.io posts, where users can like, dislike, and comment on posts.
Estimated users: 1 million
Stack: Django, MySQL, Redis, and Kafka
Questions
- How can I manage these kinds of things?
- How do big tech companies handle this?
- Where am I right or wrong?
- Are there other tools that can help me in this situation?
- I am using locks in Redis when the total like, dislike, and joined-user counts are incremented or decremented. Should I be doing this? Is the same true for transactions in MySQL? (See the sketch just below this list.)
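To make that last question concrete, here are the two variants I'm comparing: the lock I use today and the lock-free increment I'm asking about (a sketch assuming redis-py; the `Challenge` model and its `likes_count` field are hypothetical):

```python
import redis
from django.db.models import F

r = redis.Redis()

# What I do today: take a lock, read the counter, increment, write it back.
with r.lock("challenge:42:lock"):
    likes = int(r.hget("challenge:42", "like") or 0)
    r.hset("challenge:42", "like", likes + 1)

# The alternative I'm asking about: HINCRBY is a single atomic Redis command.
r.hincrby("challenge:42", "like", 1)

# MySQL side via the Django ORM: an F() expression compiles to
# "UPDATE ... SET likes_count = likes_count + 1" in one statement.
# Challenge.objects.filter(id=42).update(likes_count=F("likes_count") + 1)
```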
I'm looking for the best approach to handle this situation, one that can also scale.
Thanks in advance for reading my post and giving me suggestions on this. ☺️
Consider using the SQL support in Apache Pinot, an online analytical processing (OLAP) datastore against which you can write complex SQL queries and join tables in Pinot with those in other datastores. Pinot enables you to build dashboards for quick analysis and reporting on aggregated data.
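For example, if the same Kafka topic is also ingested into Pinot, the aggregated counts could be queried from Python with the pinotdb client (a sketch; the `challenge_events` table and its columns are assumptions based on the events described in the question):

```python
from pinotdb import connect

# Connect to the Pinot broker's SQL endpoint.
conn = connect(host="localhost", port=8099, path="/query/sql", scheme="http")
curs = conn.cursor()

# Aggregate joins, likes, and dislikes per challenge.
curs.execute(
    """
    SELECT challenge_id,
           SUM(CASE WHEN action = 'join' THEN 1 ELSE 0 END)    AS joins,
           SUM(CASE WHEN action = 'like' THEN 1 ELSE 0 END)    AS likes,
           SUM(CASE WHEN action = 'dislike' THEN 1 ELSE 0 END) AS dislikes
      FROM challenge_events
     GROUP BY challenge_id
    """
)
for challenge_id, joins, likes, dislikes in curs:
    print(challenge_id, joins, likes, dislikes)
```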