Forums

Celery

I am trying to get Celery to run, but it seems to have problems... is this impossible to do? I need to set up a broker and result backend, and schedule the sending of emails.

Hi there, as I understand it, the latest version of Celery now requires you to run a broker like RabbitMQ, and we don't support that. Older versions let you coordinate via the database, but support for that has since been dropped. So I think your choices are either to revert to an old version of Celery, or to "roll your own" task scheduling system, e.g. using a database table to coordinate jobs...
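
If you do go the roll-your-own route, here's a rough sketch of the database-table idea -- purely illustrative, using SQLite with made-up table and column names; a real multi-worker setup would need stricter locking so two workers can't claim the same row:

    import sqlite3
    import time

    conn = sqlite3.connect("jobs.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS jobs ("
        " id INTEGER PRIMARY KEY,"
        " payload TEXT,"
        " status TEXT DEFAULT 'pending')"
    )
    conn.commit()

    def claim_next_job(conn):
        # Grab one pending job and mark it as running.
        with conn:
            row = conn.execute(
                "SELECT id, payload FROM jobs WHERE status = 'pending' LIMIT 1"
            ).fetchone()
            if row is None:
                return None
            conn.execute("UPDATE jobs SET status = 'running' WHERE id = ?", (row[0],))
        return row

    while True:
        job = claim_next_job(conn)
        if job is None:
            time.sleep(5)  # queue empty; poll again in a few seconds
            continue
        print("processing", job[1])  # replace with real work, e.g. sending an email
        conn.execute("UPDATE jobs SET status = 'done' WHERE id = ?", (job[0],))
        conn.commit()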

In most cases you should use the Advanced Python Scheduler (APScheduler) for this rather than rolling your own, and if you are running in Flask you should use the Flask-APScheduler plugin for it.

It's good to run it under Flask because then you can easily create a UI for monitoring the tasks.

One other thing: APScheduler is basically "cron for Python", but it does other things as well. I think maybe they should have called it "PyCron" and then it would get better recognition :)
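
To make that concrete, here's a minimal sketch of scheduling a daily email send with APScheduler -- send_pending_emails is just a placeholder for your own function:

    from apscheduler.schedulers.blocking import BlockingScheduler

    def send_pending_emails():
        # Placeholder: replace with your real email-sending code.
        print("sending queued emails...")

    scheduler = BlockingScheduler()
    # "cron" triggers take the same kinds of fields as a crontab entry;
    # this runs the job every day at 08:00.
    scheduler.add_job(send_pending_emails, "cron", hour=8, minute=0)
    scheduler.start()  # blocks, so run this as its own long-lived process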

Interesting! It seems a bit odd that you'd run it inside an existing Flask application, though -- a typical website might have a number of separate Flask processes handling its requests, which suggests you'd end up with multiple APScheduler instances running -- and I imagine that wouldn't work out well.

Or have I misunderstood how it's meant to be used?

Is there any reason that Celery with Amazon SQS as the broker won't work on PythonAnywhere?

Hmm, none that spring to mind. Just to make sure I understand how that would work -- you'd be using the Celery libraries in your code, but the only server processes that would be handling queuing would be SQS -- is that right?

Yes, a PythonAnywhere web app would use the Celery library to queue tasks using Amazon SQS. Then you run a Celery worker/consumer script as a PythonAnywhere Always-on task. See:

The Django doc above uses RabbitMQ by default, but I think you can change it to SQS by setting a variable in settings.py, e.g. CELERY_BROKER_URL = f"sqs://{aws_access_key}:{aws_secret_key}@"
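
For what it's worth, here's a sketch of what that might look like in settings.py -- this assumes your Celery app reads Django settings with the CELERY namespace, as in the Django doc, and the key variables are placeholders for your real credentials:

    # settings.py -- pointing Celery at SQS instead of RabbitMQ
    from urllib.parse import quote

    aws_access_key = "..."  # placeholder: your AWS access key ID
    aws_secret_key = "..."  # placeholder: your AWS secret access key

    # Secret keys can contain characters like "/" or "+", so URL-encode both parts.
    CELERY_BROKER_URL = (
        "sqs://"
        + quote(aws_access_key, safe="")
        + ":"
        + quote(aws_secret_key, safe="")
        + "@"
    )

Note that SQS is broker-only; if you also need a result backend, you'd have to configure one separately.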

That certainly should be OK, then.

I have configured Celery with SQS on PA, but I need to keep my queue running all the time. Can you tell me what would be the best way to do that?

Take a look at https://help.pythonanywhere.com/pages/AlwaysOnTasks/
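
For example, assuming your project is called yourproject, the always-on task's command would be something like:

    celery -A yourproject worker --loglevel=INFO

(run from your project directory, with your virtualenv active).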