Also, on the database side: I'm using Mongo for this huge personal project, but I looked at Aerospike (roughly Redis + Mongo combined); you could use it instead of Mongo for better performance. If you plan to stick with Mongo, try TokuMX by Percona (http://percona.com/). It's literally billed as "MongoDB on steroids" (I tried it, but it didn't fit my setup). Percona is well known for pumping up databases like MySQL for better performance.
Ajay M
Hello, we have been in a similar situation to yours. We serve around 1 TB of images for one of our clients, and the price skyrockets far too quickly with any managed cloud provider service. If, and only if, you have the time or DevOps people to spare on research, I'd recommend setting up your own CDN service on one or more cloud providers. In my experience a custom-built solution is cheaper. We use https://github.com/ilhaan/kubeCDN, a CDN cluster setup that runs on a Kubernetes cluster.
For image compression, we have an in-house library that compresses images without losing much resolution, making it almost impossible for a regular human to detect the changes: [CImage](https://github.com/Comet-App/CImage).
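Our library isn't the only way to get that effect. As a rough, hypothetical sketch of the same idea (re-encode at a lower JPEG quality, which is usually hard to spot by eye), here's what it looks like with the Pillow library instead of CImage; the quality value and test image are just placeholders:

```python
# Hypothetical sketch using Pillow, NOT the CImage API.
import os
from io import BytesIO
from PIL import Image

def compress_image(data: bytes, quality: int = 75) -> bytes:
    """Re-encode an image as JPEG at the given quality (1-95)."""
    img = Image.open(BytesIO(data)).convert("RGB")
    out = BytesIO()
    # optimize=True lets the encoder pick better Huffman tables;
    # quality around 75 is typically visually indistinguishable.
    img.save(out, format="JPEG", quality=quality, optimize=True)
    return out.getvalue()

# Demo input: random-noise PNG (worst case for lossless compression).
noisy = Image.frombytes("RGB", (256, 256), os.urandom(256 * 256 * 3))
src = BytesIO()
noisy.save(src, format="PNG")
compressed = compress_image(src.getvalue())
print(len(compressed) < len(src.getvalue()))
```

In practice you'd tune the quality per image class (photos tolerate more than screenshots with text) and compare structural-similarity scores rather than eyeballing it.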
Testing and benchmarking are always messy; in our case we can't scale the tests with existing tools, so we simulate an actual 10k requests against the client's platform. For individual scores, we prefer [GTmetrix](https://gtmetrix.com/) and [loader.io](https://loader.io/). We are also working on a custom multi-region webpage/API benchmarking platform; I'll share it with you if you're interested.
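If you want to roll the brute-force simulation yourself, a minimal sketch with only the Python standard library looks like this (the URL, request count, and concurrency below are placeholders, not our client's real numbers):

```python
# Minimal load-test sketch: fire concurrent GETs and collect latencies.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def hit(url: str, timeout: float = 10.0) -> tuple[int, float]:
    """Fire one GET and return (status_code, latency_seconds)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
        return resp.status, time.perf_counter() - start

def run_load_test(url: str, n_requests: int, concurrency: int = 50) -> dict:
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(lambda _: hit(url), range(n_requests)))
    latencies = sorted(lat for _, lat in results)
    return {
        "ok": sum(1 for status, _ in results if status == 200),
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
    }

# Usage (placeholder target):
# stats = run_load_test("https://example.com/", n_requests=10_000)
```

This won't tell you anything about multi-region behavior (that's why we use GTmetrix/loader.io and are building our own platform), but it's enough to find an endpoint's rough saturation point.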
Go with RabbitMQ over Redis. Not that I had any problem with Redis, but I used RabbitMQ (without any problems) with over 10 million records on a system with 1 GB of RAM and 3 CPUs, across 25+ nodes (50 workers each) handling scraping for our project. I'm sure Redis could handle that too, but it would likely need more RAM.
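The worker-pool pattern behind that setup is the same regardless of broker. Here's a broker-less sketch of it using Python's `queue.Queue` standing in for RabbitMQ (in production each worker would be a process on a node consuming from a RabbitMQ queue, e.g. via the pika client; `scrape` is a placeholder for the real job):

```python
# Broker-less sketch of the producer / worker-pool pattern.
# queue.Queue stands in for the RabbitMQ queue; scrape() is a placeholder.
import queue
import threading

def scrape(record_id: int) -> str:
    # Placeholder for the real scraping work.
    return f"scraped-{record_id}"

def worker(tasks: queue.Queue, results: list, lock: threading.Lock) -> None:
    while True:
        record_id = tasks.get()
        if record_id is None:          # sentinel: shut this worker down
            tasks.task_done()
            return
        out = scrape(record_id)
        with lock:                     # results list is shared across workers
            results.append(out)
        tasks.task_done()

def run_pool(record_ids, n_workers: int = 50) -> list:
    tasks: queue.Queue = queue.Queue()
    results: list = []
    lock = threading.Lock()
    threads = [threading.Thread(target=worker, args=(tasks, results, lock))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for rid in record_ids:
        tasks.put(rid)
    for _ in threads:
        tasks.put(None)                # one shutdown sentinel per worker
    tasks.join()                       # wait until every task is acknowledged
    return results
```

The part RabbitMQ adds on top of this is durability and delivery acknowledgements: if a node dies mid-task, the unacked message goes back on the queue for another worker, which is exactly what you want for long scraping runs.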