HAProxy vs Varnish: What are the differences?
Developers describe HAProxy as "The Reliable, High Performance TCP/HTTP Load Balancer". HAProxy (High Availability Proxy) is a free, very fast, and reliable solution offering high availability, load balancing, and proxying for TCP- and HTTP-based applications. Varnish, on the other hand, is described as a "High-performance HTTP accelerator". Varnish Cache is a web application accelerator, also known as a caching HTTP reverse proxy. You install it in front of any server that speaks HTTP and configure it to cache the contents. Varnish Cache is really, really fast: it typically speeds up delivery by a factor of 300x to 1000x, depending on your architecture.
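As a sketch of what "install it in front of any server that speaks HTTP" looks like in practice, here is a minimal hypothetical Varnish configuration (VCL). The backend address, port, and TTL below are assumptions for illustration, not values from this article:

```vcl
vcl 4.1;

# Hypothetical backend: the origin HTTP server Varnish caches for.
backend default {
    .host = "127.0.0.1";   # assumed origin address
    .port = "8080";        # assumed origin port
}

sub vcl_backend_response {
    # Cache cacheable responses for 5 minutes (assumed TTL);
    # responses the origin marked uncacheable keep a zero TTL.
    if (beresp.ttl > 0s) {
        set beresp.ttl = 5m;
    }
}
```

With this in place, repeat requests for the same URL are served from Varnish's cache instead of hitting the origin, which is where the large speedup comes from.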
HAProxy belongs to the "Load Balancer / Reverse Proxy" category of the tech stack, while Varnish can be primarily classified under "Web Cache".
"Load balancer", "High performance" and "Very fast" are the key factors why developers consider HAProxy; whereas "High-performance", "Very Fast" and "Very Stable" are the primary reasons why Varnish is favored.
Varnish is an open source tool with 908 GitHub stars and 216 GitHub forks.
Pinterest, 9GAG, and Twitch are some of the popular companies that use Varnish, whereas HAProxy is used by Instagram, Dropbox, and Medium. Varnish has broader approval, being mentioned in 1007 company stacks and 140 developer stacks, compared to HAProxy, which is listed in 457 company stacks and 211 developer stacks.
Around the time of their Series A, Pinterest's stack included Python and Django, with Tornado and Node.js as web servers. Memcached / Membase and Redis handled caching, with RabbitMQ handling queueing. Nginx, HAProxy, and Varnish managed static delivery and load balancing, with persistent data storage handled by MySQL.
We're using Git through GitHub for public repositories and GitLab for our private repositories due to its easy-to-use features. Docker and Kubernetes are a must-have for our highly scalable infrastructure, complemented by HAProxy with Varnish in front of it. We are using a lot of npm and Visual Studio Code in our development sessions.
We use HAProxy to load balance between our web servers. It balances TCP connections between the machines round-robin and leaves everything else to Node.js, leaving the connections open with a reasonably long time to live to support WebSockets and reuse of a TCP connection for AJAX polling.
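A minimal haproxy.cfg sketch of a setup like the one described above: round-robin TCP balancing with long client/server timeouts so long-lived WebSocket connections stay open. The server names, addresses, ports, and timeout values are assumptions for illustration:

```haproxy
# Hypothetical haproxy.cfg: round-robin TCP balancing for Node.js
# backends, with long timeouts for WebSockets and connection reuse.
defaults
    mode tcp
    timeout connect 5s
    timeout client  1h    # assumed long TTL for idle WebSocket clients
    timeout server  1h

frontend www
    bind *:80
    default_backend node_app

backend node_app
    balance roundrobin
    server node1 10.0.0.11:3000 check
    server node2 10.0.0.12:3000 check
```

Because the balancing happens at the TCP layer, HAProxy never inspects the HTTP traffic; the Node.js processes handle everything above the transport layer themselves.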
When you visit the site, you talk to a load balancer, which chooses a Varnish front-end, which in turn talks to our web front-ends, which used to run nine Python processes. Each of these processes serves the exact same version on any given web front-end.
Varnish sits as a secondary cache layer behind Akamai. Two servers operate in a Primary/Secondary configuration with failover managed by HAProxy. Requests peak at around 10k/s.
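A primary/secondary pair with HAProxy-managed failover, as described above, can be sketched with HAProxy's `backup` server keyword: traffic goes to the primary while its health check passes, and the secondary only takes over on failure. The server names, addresses, and health-check path below are assumptions:

```haproxy
# Hypothetical sketch of a primary/secondary Varnish pair behind HAProxy.
backend varnish_cache
    option httpchk GET /healthcheck   # assumed health-check endpoint
    server varnish-primary   10.0.1.10:6081 check
    # "backup" means this server receives traffic only when the
    # primary's health check fails.
    server varnish-secondary 10.0.1.11:6081 check backup
```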
HAProxy manages internal and origin load balancing using Keepalived. Two small servers host the entire site, never moving above 15% load even during the largest load spikes.
We use HAProxy to balance traffic at various points in our stack, including nginx nodes on different physical machines and API nodes on the backend.
I use HAProxy primarily for application routing and SSL termination. I also use its logs and statistics to visualize incoming traffic in Kibana.
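SSL termination and application routing in HAProxy can be sketched as a frontend that terminates TLS and chooses a backend by Host header. The certificate path, hostnames, and backend names here are assumptions, not details from the quote above:

```haproxy
# Hypothetical sketch: HAProxy terminates TLS, then routes by Host header.
frontend https_in
    mode http
    bind *:443 ssl crt /etc/haproxy/certs/site.pem   # assumed cert bundle
    use_backend api_servers if { hdr(host) -i api.example.com }
    default_backend web_servers

backend web_servers
    mode http
    server web1 10.0.2.10:8080 check

backend api_servers
    mode http
    server api1 10.0.2.20:8080 check
```

Terminating TLS at the proxy means the backends see plain HTTP, and HAProxy's logs capture every request, which is what makes the Kibana traffic visualization mentioned above possible.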
We use HAProxy to load balance web requests for our web application, but also for some internal load balancing of microservices.
Varnish is the HTTP caching proxy we use to achieve maximum performance on high-traffic e-commerce sites.