Is there any way to run celery on PA?

I am running a Django project, but need to offload sending some emails as an asynchronous task.

Is there any way to have a worker running something like celery on PythonAnywhere?

I could use scheduled jobs, but that would only send the emails once an hour.

I believe you can have multiple scheduled jobs and stagger them, and/or have them wait for a designated time, so in effect you can get a faster rate.

Or why not just have one job lasting a long time, and polling or whatever you need every 10 seconds / minute / whatever?
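A long-running task along those lines could be as simple as a sleep loop. This sketch assumes a hypothetical `send_pending_emails` function standing in for whatever work needs doing:

```python
import time

def send_pending_emails():
    # Hypothetical placeholder: find unsent emails and send them.
    pass

def poll_forever(interval_seconds=10, max_iterations=None):
    """Run the job repeatedly, sleeping between passes.

    max_iterations exists only so the loop can be tested; in
    production you would leave it as None and loop forever.
    """
    completed = 0
    while max_iterations is None or completed < max_iterations:
        send_pending_emails()
        completed += 1
        time.sleep(interval_seconds)
    return completed

poll_forever(interval_seconds=0, max_iterations=3)
```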

hth Jim

Your first solution might actually work, but just seems like a horrible work-around.

I guess I could have a script constantly running, if the scheduler allows this. This won't affect the CPU-cycle allowance, right?

Agreed re. the first - I thought I should mention it as it might help someone.

I actually use the second approach all the time, sleeping for either 10 seconds or 240 seconds. The only problem is that occasionally the process may exit, e.g. certain errors not trapped, or a short PA outage. There have been discussions of how to mitigate this risk in various threads here, e.g. by starting the job as a Scheduled task quite frequently, but having it exit if it's already running. The art lies in how you detect the 'already running' state.
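One common way to detect the "already running" state on a Unix-style system is a non-blocking `fcntl` lock on a file: the lock is released automatically when the holding process dies, so a crashed run never blocks the next one. The lock-file path here is made up:

```python
import fcntl
import sys

def acquire_single_instance_lock(path):
    """Try to take an exclusive, non-blocking lock on a file.

    Returns the open file handle if we got the lock (keep it open
    for the lifetime of the process), or None if another copy of
    the script already holds it.
    """
    handle = open(path, "w")
    try:
        fcntl.lockf(handle, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        handle.close()
        return None
    return handle

# Hypothetical lock path -- use somewhere in your own home directory.
lock = acquire_single_instance_lock("/tmp/my_poller.lock")
if lock is None:
    sys.exit(0)  # another instance is running; let it carry on
# ... run the long-lived polling loop here ...
```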

hth Jim

What Jim said. We plan to look into better solutions at some point though, I'll add a +1 to the ticket.

Thank you Jim, for all the help; and Harry for confirming. I will get started on that right away.

Hi, do you guys support celery now? E.g. I want to insert records into Django models from CSV files, and I don't want the user to wait for the results, or to break the process because the user refreshed or reloaded the page while the CSV file was being processed. So is it possible to use celery in this scenario on PA yet?

Unfortunately not -- I'll add an upvote on your behalf. Most people are implementing this kind of behaviour by adding rows to a database in the Django code, then running scheduled tasks to process and delete (or mark as processed) the rows, which isn't ideal but does work.

I have questions about setting up a scheduler as well. I want my app to scrape a site and store the results in the database by adding a row daily. How do you set up Django to do that? Also, how do you set up scheduled row deletion? Is this behaviour implemented within the Django script or by PA?

It's not implemented in either Django or PythonAnywhere. That's code that you'll have to write for yourself. We only provide a way for you to run the task periodically and a place for the task to run.

Just wondering if I can add Celery to my app now?

No, not yet.

Just wondering if we are now allowed to run celery on PA

It was never a matter of "allowed". It's always been because it won't work reliably in our infrastructure and that hasn't changed.

Noted. Will keep checking the blogs... Thanks.

Just wondering what the alternative is? We have gotten to a point where we need Celery as well....

@lightup, I've seen some docs here; it's not celery, but if your use case is close to that then it might work. Otherwise....

This doesn't seem like a suitable way to send emails to registered users... I can only see an hourly option to run a script. How would I go about doing this in the background and sending an email as soon as a user registers on my website?

I would LOVE to have celery support!

You would set up a few staggered hourly tasks (e.g. one every 15 min).

A couple of hourly tasks? Can you please elaborate? I need to run it every 15 minutes.

Go to Dashboard | Schedule and set up four "staggered" hourly tasks, e.g. at 0, 15, 30 and 45 minutes past the hour.

No celery support yet?

No -- but there is some infrastructure work under way that should eventually allow this.

Subscribed to this topic, as I'd like to see support for celery too.

Same here. Thumbs up for celery

OK -- thanks for the upvotes! Like Conrad said, we're working on something that -- not immediately, but down the line -- should make running Celery possible. I've made a note to announce something on this forum thread when it's ready, but don't hold your breath -- there are several changes that need to be made first, and none of them are simple.

Another up-vote for Celery integration from me.

I don't need it per se, but there's a really useful Wagtail plugin that scans my website for broken links. Unfortunately it requires a Celery daemon to be running to perform these tasks, so yeah, +1 from me.


Another up-vote.

I'll be paying attention!

Noted :-)

This would be a great addition. Following.

Same here. thanks!

Two more up-votes added :-)

[edit: typo]

+1 for celery! Cheers


+1 for celery. Thanks!

thanks- upvoted. it's definitely something we have in mind!

I guess no Celery support yet? Would be great if it worked.


Another vote for Celery support - all Flask apps should use it for async processing.


Another vote for celery support. It becomes crucial for me to have async processing using Django.

Ok. I have added it.

Another vote for Celery, though really the ability to run a server would be :100:. I'm trying to run a scrapy spider, and running asynchronously, whether through Celery or Scrapyd, is pretty much necessary to run at scale.

Will stay tuned for more info, would love to beta test anything as well.

Noted! Just to clarify the status: this is one of the things we have in mind for always-on tasks. Our plan is to make sure they're bedded in properly, then to extend them so that you can run a server inside an always-on task. Celery should work as soon as we've got that running. There are a couple of things we need to finish before we start work on it, though.

(A third phase would be to provide an option to allow routing to those servers from the public Internet, if they're running HTTP-like servers, which would allow websocket servers, Tornado, Flask-aiohttp, and so on, but that would be further down the line.)

For me, this problem is already solved using always-on tasks.

You can set up an APScheduler script that makes REST requests to your web app. Secure it using a token or similar. Then any task that could be kicked off by a user could also be kicked off by that request on a schedule. You can also do things like retries, storing tasks in a db, etc. You could also do logic like: if webapp A returns false, run request Y on webapp B.

This was difficult/impossible to do before by using scheduled tasks, because they would only run at certain times, not more than once an hour.

Celery is fine, but it can be overkill unless you really need distributed tasks.
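A sketch of the trigger side of that approach, with the HTTP request built from the standard library. The endpoint URL and the token header scheme are made-up examples of what your web app might expose:

```python
import urllib.request

def build_task_request(url, token):
    """Build an authenticated POST request for a task-trigger endpoint.

    The URL and Authorization header here are assumptions -- use
    whatever endpoint and auth scheme your web app actually exposes.
    """
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Token {token}"},
    )

def trigger_task(url, token):
    # Fire the request and return the HTTP status code.
    request = build_task_request(url, token)
    with urllib.request.urlopen(request, timeout=30) as response:
        return response.status

# With APScheduler installed (pip install apscheduler), you could then
# run this every 15 minutes from an always-on task:
#
#   from apscheduler.schedulers.blocking import BlockingScheduler
#   scheduler = BlockingScheduler()
#   scheduler.add_job(trigger_task, "interval", minutes=15,
#                     args=["https://yourapp.example.com/tasks/run", "SECRET"])
#   scheduler.start()
```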

That's a clever trick, I wouldn't have thought of that! I guess the one case it doesn't handle is when a task will take longer than a few minutes; if a website takes more than a certain amount of time (I think it's currently 180 seconds) to process a request, the system will assume that it's crashed and will restart the process. So anything that takes longer than that would be better done inside the always-on task, or (in the future) via some kind of worker process hanging off something like Celery.