Gunicorn vs OpenResty: What are the differences?
Developers describe Gunicorn as "A Python WSGI HTTP Server for UNIX". Gunicorn uses a pre-fork worker model ported from Ruby's Unicorn project. The Gunicorn server is broadly compatible with various web frameworks, simply implemented, light on server resources, and fairly fast. OpenResty, on the other hand, is described as "Turning Nginx into a Full-fledged Web App Server". OpenResty (aka ngx_openresty) is a full-fledged web application server built by bundling the standard Nginx core, many third-party Nginx modules, and most of their external dependencies.
Gunicorn and OpenResty are both primarily classified as "Web Servers" tools.
Gunicorn and OpenResty are both open source. OpenResty, with 7.12K GitHub stars and 984 forks, appears to be more popular than Gunicorn, with 5.96K stars and 1.12K forks.
Instagram, reddit, and Hike are some of the popular companies that use Gunicorn, whereas OpenResty is used by Shopify, Kong, and Pagar.me. Gunicorn has broader approval: it is mentioned in 184 company stacks and 51 developer stacks, compared to OpenResty, which is listed in 37 company stacks and 5 developer stacks.
At Kong, while building an internal tool, we struggled to route metrics to Prometheus and logs to Logstash without adding too much latency to our metrics collection.
We replaced nginx with OpenResty on the edge of our tool which allowed us to use the lua-nginx-module to run Lua code that captures metrics and records telemetry data during every request’s log phase. Our code then pushes the metrics to a local aggregator process (written in Go) which in turn exposes them in Prometheus Exposition Format for consumption by Prometheus. This solution reduced the number of components we needed to maintain and is fast thanks to NGINX and LuaJIT.
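A minimal sketch of this log-phase pattern, under assumptions not stated in the original: an upstream named `upstream_app` and a local aggregator listening on `127.0.0.1:9145` (both hypothetical). `log_by_lua_block` runs after the response is sent, and since cosockets are unavailable in the log phase, the push is handed off to a zero-delay `ngx.timer.at` callback:

```nginx
http {
    lua_shared_dict metrics 10m;  # shared memory zone for counters

    server {
        listen 80;

        location / {
            proxy_pass http://upstream_app;

            # log phase: runs after the response is delivered, so this
            # adds no latency to the request itself
            log_by_lua_block {
                local metrics = ngx.shared.metrics
                local key = ngx.var.status .. ":" .. ngx.var.uri
                metrics:incr(key, 1, 0)  -- initialize to 0 if absent

                -- cosockets are disabled in the log phase; a 0-delay
                -- timer runs in a context where they are allowed
                ngx.timer.at(0, function(premature)
                    if premature then return end
                    local sock = ngx.socket.tcp()
                    local ok = sock:connect("127.0.0.1", 9145)
                    if ok then
                        sock:send(key .. " " .. (metrics:get(key) or 0) .. "\n")
                        sock:close()
                    end
                end)
            }
        }
    }
}
```

In the setup described above, the aggregator process (in Go) would then re-expose these values in Prometheus Exposition Format; that side is not shown here.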
We use nginx and OpenResty as our API proxy running on EC2 for auth, caching, and some rate limiting for our dozens of microservices. Since OpenResty supports embedded Lua, we were able to write a custom access module that calls out to our authentication service with the resource, method, and access token. If that succeeds, critical account info is passed down to the underlying microservice. This proxy approach keeps all authentication and authorization in one place and provides a unified CX for our API users. Nginx is fast and cheap to run, though we are always exploring alternatives that are also economical. What do you use?
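One way to sketch such an access module, assuming a hypothetical internal auth service at `auth-service.internal` and an upstream named `microservice_upstream` (neither named in the original). The `access_by_lua_block` directive runs before the request is proxied, and `ngx.location.capture` issues an internal subrequest to the auth service:

```nginx
server {
    listen 443 ssl;

    # internal-only location that fronts the (hypothetical) auth service
    location = /auth {
        internal;
        proxy_pass http://auth-service.internal/check;
        proxy_set_header X-Resource     $request_uri;
        proxy_set_header X-Method       $request_method;
        proxy_set_header Authorization  $http_authorization;
    }

    location /api/ {
        access_by_lua_block {
            -- access phase: subrequest to the auth service with the
            -- resource, method, and access token forwarded above
            local res = ngx.location.capture("/auth")
            if res.status ~= 200 then
                return ngx.exit(ngx.HTTP_UNAUTHORIZED)
            end
            -- pass account info from the auth response down to the
            -- microservice behind the proxy (header name is illustrative)
            ngx.req.set_header("X-Account-Info", res.header["X-Account-Info"])
        }
        proxy_pass http://microservice_upstream;
    }
}
```

Keeping the check in the access phase means every route behind `/api/` gets the same auth behavior without any changes to the microservices themselves.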
Gunicorn is a WSGI container that we used to run our Tornado code, as it supports asynchronous operations on Tornado.
Gunicorn runs as the HTTP application server, serving the Django application in WSGI mode.
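For context, this is the interface Gunicorn speaks: a minimal, illustrative WSGI application (a real Django deployment would instead point Gunicorn at the project's generated `wsgi.py` module rather than hand-writing a callable like this):

```python
# app.py -- a minimal WSGI application of the kind Gunicorn serves

def application(environ, start_response):
    """WSGI callable: receives the request environ dict and a
    start_response callback, and returns an iterable of body bytes."""
    body = b"Hello from a WSGI app\n"
    status = "200 OK"
    headers = [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ]
    start_response(status, headers)
    return [body]
```

It could then be served with `gunicorn app:application` (for a Django project, `gunicorn myproject.wsgi`; for Tornado code as in the comment above, Gunicorn's tornado worker class is selected with `-k tornado`).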