We used Redis to cache server responses, which cut response times on most requests by a factor of ten or more.
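For a sense of what that looks like in practice, here is a minimal sketch of per-view caching in Django; the django-redis backend, Redis URL, and five-minute TTL are illustrative assumptions, not our exact configuration:

```python
# settings.py -- assumes the django-redis cache backend; the Redis URL is hypothetical
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379/1",
    }
}

# views.py -- cache the rendered response so repeat requests skip the database entirely
from django.views.decorators.cache import cache_page

@cache_page(60 * 5)  # hypothetical 5-minute TTL
def dashboard(request):
    ...  # the expensive queries run only on a cache miss
```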
PyCharm is our preferred IDE for Python apps: it makes writing code a pleasure, and it makes it just as easy to fire up a Django shell, run a web server, or run the test suite.
Each component of the app ran in its own container so that none of them had to share resources: the front end in one, the back end in another, a third for Celery, a fourth for Celery Beat, and a fifth for RabbitMQ. Under load, we actually ended up running four front-end containers and eight back-end ones.
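A docker-compose sketch of that layout might look like the following; the service names, images, and replica counts are illustrative, not our production config:

```yaml
# docker-compose.yml -- one container per component; names and images are hypothetical
services:
  frontend:
    image: ourapp/frontend
    deploy:
      replicas: 4        # scaled out under load
  backend:
    image: ourapp/backend
    deploy:
      replicas: 8        # scaled out under load
  celery:
    image: ourapp/backend
    command: celery -A project worker
  celery-beat:
    image: ourapp/backend
    command: celery -A project beat
  rabbitmq:
    image: rabbitmq:3
```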
The front end was built on an Angular template supplied by the client. We leveraged Angular's flexibility and speed to deliver complex matrices of data quickly and with polish.
Django REST Framework served all the content to the BI layer, querying the Postgres database, aggregating numeric data, and automatically associating related data models at row-creation time.
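As an illustration, an aggregation endpoint along these lines pushes the number-crunching down to Postgres; the `Metric` model, its fields, and the grouping are hypothetical stand-ins for the client's data:

```python
# views.py -- a minimal sketch; the Metric model and its fields are hypothetical
from django.db.models import Avg, Sum
from rest_framework.response import Response
from rest_framework.views import APIView

from myapp.models import Metric  # hypothetical app and model


class MetricSummary(APIView):
    def get(self, request):
        # Postgres does the numeric aggregation; DRF only serializes the result
        totals = (
            Metric.objects
            .values("category")  # group by category
            .annotate(total=Sum("value"), average=Avg("value"))
        )
        return Response(list(totals))
```

The row-creation association could be done with a `post_save` signal or an overridden `perform_create`, depending on the models involved.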
Our back end was written in Django. We took advantage of the ready-made admin interface as a go-to place for the client to authorize users and handle other administrative tasks, while most of the work was done through the Django REST Framework.
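The admin side needs almost no code; a sketch of the sort of registration involved, with a hypothetical model:

```python
# admin.py -- a minimal sketch; the Report model is a hypothetical example
from django.contrib import admin

from myapp.models import Report  # hypothetical


@admin.register(Report)
class ReportAdmin(admin.ModelAdmin):
    list_display = ("name", "owner", "created_at")
    list_filter = ("owner",)
```

User and group authorization then comes for free through django.contrib.auth, which the admin exposes out of the box.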
We hooked up New Relic to monitor the health of our servers and containers, as well as to track actual app performance.
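Attaching the New Relic Python agent to a Django app takes only a few lines; a sketch assuming the standard newrelic.ini setup (the license key and app name live in that file):

```python
# wsgi.py -- a minimal sketch of attaching the New Relic Python agent
import newrelic.agent

newrelic.agent.initialize("newrelic.ini")  # must run before the app is imported

from django.core.wsgi import get_wsgi_application

application = newrelic.agent.WSGIApplicationWrapper(get_wsgi_application())
```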
We used Celery, with RabbitMQ as the broker and Celery Beat as the scheduler, to run periodic tasks as well as some user-initiated long-running tasks on the server.
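A minimal sketch of that wiring; the project name, broker URL, schedule, and task are all hypothetical:

```python
# celery.py -- a minimal sketch; names and the broker URL are hypothetical
from celery import Celery

app = Celery("project", broker="amqp://guest:guest@rabbitmq:5672//")

# Celery Beat reads this schedule and enqueues the task through RabbitMQ
app.conf.beat_schedule = {
    "refresh-report-data-nightly": {
        "task": "project.refresh_report_data",
        "schedule": 60 * 60 * 24,  # every 24 hours
    },
}


@app.task(name="project.refresh_report_data")
def refresh_report_data():
    ...  # long-running work happens in the worker, off the request cycle
```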