Multiple queues with Celery and RabbitMQ and Flask

I've got a couple of flask applications running on my VPS that both run lengthy tasks upon receiving an HTTP request. As such they both hand the tasks over to Celery to run asynchronously so they can respond to the HTTP request in a timely manner. It took me a little while to figure out how to get them using completely separate queues on the same RabbitMQ instance so here is the (actually quite simple) solution.

The flask apps and celery workers all run via systemd services, and RabbitMQ uses the default configuration (i.e. I didn't change anything after installing it).

In the flask app you'll have some function that you decorate with the Celery stuff:

from flask import Flask
from celery import Celery

brokerurl = 'amqp://'   # default RabbitMQ broker on localhost
APP_ROUTE = '/run'      # whatever route your app uses

app = Flask(__name__)
celery = Celery(app.name, broker=brokerurl)

@celery.task(bind=True, queue="my_queue1")
def celery_stuff(self):
    # do the lengthy work here
    pass

@app.route(APP_ROUTE, methods=['GET', 'POST'])
def flask_app():
    # hand the task to the worker and respond straight away
    celery_stuff.delay()
    return 'Task queued'

The important bit here is the queue="my_queue1" argument in the Celery task decorator, which specifies the queue this app's tasks get published to.

Similarly in your other app you would have the same but the decorator would be:

@celery.task(bind=True, queue="my_queue2")

Your systemd conf files for the celery workers should then look something like:

Description=My cool app

ExecStart=/path/to/app/venv/bin/celery -A flask_app.celery worker -f /path/to/app/log-celery.log --concurrency=1 -n app1_worker -Q my_queue1


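Fleshed out into a complete unit file, that might look something like the following (the user, paths and names are placeholders for your own setup):

```ini
[Unit]
Description=My cool app celery worker
After=network.target rabbitmq-server.service

[Service]
Type=simple
User=myappuser
WorkingDirectory=/path/to/app
ExecStart=/path/to/app/venv/bin/celery -A flask_app.celery worker -f /path/to/app/log-celery.log --concurrency=1 -n app1_worker -Q my_queue1
Restart=on-failure

[Install]
WantedBy=multi-user.target
```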
Again the important bit is specifying the queue with the -Q my_queue1 option. The conf file for the second app's worker should look the same, but with its own queue and a unique worker node name:

ExecStart=/path/to/app/venv/bin/celery -A flask_app2.celery worker -f /path/to/app/log-celery.log --concurrency=1 -n app2_worker -Q my_queue2

Now when you run the apps and their celery workers, each app's tasks land on its own queue and the two workers run them asynchronously without interfering with each other. I assume this extends beyond two flask apps as well, though I haven't tried that yet.
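If you want to confirm the two queues really are separate, you can ask RabbitMQ directly on the VPS (this needs permission to talk to the local node, hence the sudo):

```shell
# List all queues and their pending message counts.
# After both apps have fired off some tasks you should see
# my_queue1 and my_queue2 listed independently.
sudo rabbitmqctl list_queues name messages
```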