Forums

Is there any way to run celery on PA?

I am running a Django project, but need to offload sending some emails as an asynchronous task.

Is there any way to have a worker running something like celery on PythonAnywhere?

I could use scheduled jobs, but that would only send the emails once an hour.

I believe you can have multiple scheduled jobs and stagger them, and/or have them wait for a designated time point, so in effect you can have a faster rate.

Or why not just have one job lasting a long time, and polling or whatever you need every 10 seconds / minute / whatever?

hth Jim

Your first solution might actually work, but just seems like a horrible work-around.

I guess I could have a script constantly running, if the scheduler allows this. This won't affect the CPU-cycle allowance, right?

Agreed re. the first - I thought I should mention it as it might help someone.

I actually use the second approach all the time, sleeping for either 10 seconds or 240 seconds. The only problem is that occasionally the process may exit, e.g. certain errors not trapped, or a short PA outage. There have been discussions of how to mitigate this risk in various threads here, e.g. by starting the job as a Scheduled task quite frequently, but having it exit if it's already running. The art lies in how you detect the 'already running' state.

hth Jim
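A minimal sketch of that second approach: a polling loop started fairly frequently as a scheduled task, which exits straight away if another copy already holds a lock. The lock-file path and poll_once() are illustrative names; a lock file is just one way of detecting the 'already running' state.

import fcntl
import sys
import time

LOCK_PATH = "/tmp/my_poller.lock"  # hypothetical path

def poll_once():
    pass  # do the actual work here (check for emails to send, etc.)

if __name__ == "__main__":
    lock_file = open(LOCK_PATH, "w")
    try:
        # Non-blocking exclusive lock: fails if another instance holds it.
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        sys.exit(0)  # another copy is already running
    while True:
        poll_once()
        time.sleep(10)  # or 240 seconds, as above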

What Jim said. We plan to look into better solutions at some point though, I'll add a +1 to the ticket.

Thank you Jim, for all the help; and Harry for confirming. I will get started on that right away.

Hi, do you guys support celery now? For example, I want to insert records into Django models from CSV files, and I don't want the user to wait for the results, or the process to break because the user refreshed or reloaded the page while the CSV file was being processed. So is it possible to use celery in this scenario on PA yet?

Unfortunately not -- I'll add an upvote on your behalf. Most people implement this kind of behaviour by adding rows to a database in the Django code, then running scheduled tasks to process and delete (or mark as processed) the rows, which isn't ideal but does work.
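A rough sketch of the enqueueing side of that pattern: the web code only adds a row, and a scheduled task picks up unprocessed rows later. The model, field names and the handle_csv helper are illustrative, not anything specific to PythonAnywhere.

from django.db import models

class CsvImportJob(models.Model):
    uploaded_file = models.FileField(upload_to="imports/")
    processed = models.BooleanField(default=False)
    created_at = models.DateTimeField(auto_now_add=True)

# In the view: just create the row and return immediately, e.g.
#   CsvImportJob.objects.create(uploaded_file=request.FILES["csv"])

# In an hourly scheduled task:
def process_pending_jobs():
    for job in CsvImportJob.objects.filter(processed=False):
        handle_csv(job.uploaded_file)  # hypothetical helper that does the real work
        job.processed = True
        job.save()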

I have questions about setting up a scheduler as well. I want my app to scrape a site daily and store the results in the database by adding a row. How do you set up Django to do that? Also, how do you set up scheduled row deletion? Is this behaviour implemented within the Django script or with PA?

It's not implemented in either Django or PythonAnywhere. That's code that you'll have to write for yourself. We only provide a way for you to run the task periodically and a place for the task to run.

Just wondering if I can add Celery to my app now?

No, not yet.

Just wondering if we are now allowed to run celery on PA

It was never a matter of "allowed". It's always been because it won't work reliably in our infrastructure and that hasn't changed.

Noted. Will keep checking the blogs.. Thanks.

Just wondering, what's the alternative? We have gotten to a point where we need Celery as well...

@lightup, I've seen some docs here: https://makecodecode.blogspot.co.ke/2016/03/scheduling-tasks-on-pythonanywhere-to.html and https://help.pythonanywhere.com/pages/LongRunningTasks. It's not celery, but if your use case is close to that, then it might work.

This doesn't seem like a suitable way to send emails to registered users... I can only see an hourly option to run a script. How would I go about doing this in the background, sending an email as soon as a user registers on my website?

I would LOVE to have celery support!

You would set up a couple of staggered hourly tasks (e.g. every 15 min).

A couple of hourly tasks? Can you please elaborate? I need to run it every 15 minutes.

Go to Dashboard | Schedule and set up four 'staggered' hourly tasks, e.g. at 0, 15, 30 and 45 minutes.

No celery support yet?

No -- but there is some infrastructure work under way that should eventually allow this.

Subscribed to this topic, as I'd like to see support for celery too.

Same here. Thumbs up for celery

OK -- thanks for the upvotes! Like Conrad said, we're working on something that -- not immediately, but down the line -- should make running Celery possible. I've made a note to announce something on this forum thread when it's ready, but don't hold your breath -- there are several changes that need to be made first, and none of them are simple.

Another up-vote for Celery integration from me.

I don't need it per se, but there's a really useful Wagtail plugin that scans my website for broken links. Unfortunately it requires a Celery daemon to be running to perform these tasks, so yeah, +1 from me.

noted!

Another up-vote.

I'll be paying attention!

Noted :-)

This would be a great addition. Following.

Same here. thanks!

Two more up-votes added :-)


+1 for celery! Cheers

ok

+1 for celery. Thanks!

thanks- upvoted. it's definitely something we have in mind!

I guess no Celery support yet? Would be great if it worked.

no

Another vote for Celery support - all Flask apps should use it for async processing.

ok

Another vote for celery support. It becomes crucial for me to have async processing using Django.

Ok. I have added it.

Another vote for Celery, though really the ability to run a server would be even better. I'm trying to run a scrapy spider, and running asynchronously, whether through Celery or Scrapyd, is pretty much necessary when running at scale.

Will stay tuned for more info, would love to beta test anything as well.

Noted! Just to clarify the status: this is one of the things we have in mind for always-on tasks. Our plan is to make sure they're bedded in properly, then to extend them so that you can run a server inside an always-on task. Celery should work as soon as we've got that running. There are a couple of things we need to finish before we start work on it, though.

(A third phase would be to provide an option to allow routing to those servers from the public Internet, if they're running HTTP-like servers, which would allow websocket servers, Tornado, Flask-aiohttp, and so on, but that would be further down the line.)

For me, this problem is already solved using always-on tasks.

You can set up an APScheduler script that makes REST requests to your web app. Secure it with a token or something similar. Then any task that could be kicked off by a user could also be kicked off by that request on a schedule. You can also do things like retries, storing tasks in the db, etc. You could also implement logic like: if webapp A returns false, run request Y on webapp B. A rough sketch follows below.

This was difficult/impossible to do before by using scheduled tasks, because they would only run at certain times, not more than once an hour.

Celery is fine, but it can be overkill unless you really need distributed tasks.
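A minimal sketch of that APScheduler idea, run as an always-on task; the base URL, endpoint and token below are made-up placeholders, and the view behind the endpoint has to do the actual work (and check the token) itself.

import requests
from apscheduler.schedulers.blocking import BlockingScheduler

API_BASE = "https://yourusername.pythonanywhere.com"  # placeholder
TOKEN = "your-secret-token"                           # placeholder

scheduler = BlockingScheduler()

@scheduler.scheduled_job("interval", minutes=5)
def kick_off_task():
    # Asks the web app to run the task; retries/logging could be added here.
    resp = requests.post(
        API_BASE + "/tasks/send-emails/",
        headers={"Authorization": "Bearer " + TOKEN},
        timeout=60,
    )
    resp.raise_for_status()

scheduler.start()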

That's a clever trick, I wouldn't have thought of that! I guess the one case it doesn't handle is when a task will take longer than a few minutes; if a website takes more than a certain amount of time (I think it's currently 180 seconds) to process a request, the system will assume that it's crashed and will restart the process. So anything that takes longer than that would be better done inside the always-on task, or (in the future) via some kind of worker process hanging off something like Celery.

Anything new about Celery?

Not yet, but we're working towards a solution.

Thank you for the quick response! Can you please give some info about how close you are to getting it working? I don't ask for a specific date, just a time frame... a few days, weeks, months...?

We have something experimental that works, so we at least know how to do it. However, exactly when we make it production-ready depends on how long the (relatively few) things ahead of it on the development queue take, so we can't give a good estimate right now.

Hi PA, is there any update on Celery implementation for it to be available for us?

Thank you!

Best, Marc

hi Marc, no updates as of yet

Ok, thank you!

+1 for Celery for me as well please. Thanks.

Thanks! +1'd in our issue tracker

Another vote for celery support on PA, thanks.

Noted

+1 for Celery

Noted

You can learn more about it here: https://docs.celeryproject.org/en/stable/getting-started/first-steps-with-celery.html

Thanks.

celery support on PA yet?

Have you guys started supporting celery yet? If yes, could you point me to the documentation? Thanks.

no unfortunately not, no updates yet

Guess what I am going to ask? Any support for Celery yet? How about Redis or any other async queue system?

Sending synchronous emails makes the site pause for way too long.

No, we have no updates on that yet.

Does anyone have any sample code for making email sending async?

Is it a matter of adding an entry to the database and using an "always running" script to do the processing every 30 seconds or so?

Sure that would work - just mark the entry as done when the email has been sent.
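A hedged sketch of the worker side of that, for an always-on task: poll for unsent rows, send, mark them done. EmailJob is a hypothetical model with to_address, subject, body and sent fields, and DJANGO_SETTINGS_MODULE is assumed to be set in the environment.

import time

import django
django.setup()  # assumes DJANGO_SETTINGS_MODULE is set

from django.core.mail import send_mail
from myapp.models import EmailJob  # hypothetical app/model

while True:
    for job in EmailJob.objects.filter(sent=False):
        send_mail(job.subject, job.body, "noreply@example.com", [job.to_address])
        job.sent = True
        job.save()
    time.sleep(30)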

+1 for Celery

Noted!

To "work around" this shortcoming, I used AWS SQS and a Lambda function. It was a good excuse to learn some more AWS and it is only a few more dollars a month but it would be lovely if PA came up with a solution to this.

Lots of people have been asking forever and Celery seems to be the accepted way to handle this type of work with Flask. The thread is 7 years old.

I just want the user to be able to do something else for the 10 seconds it takes to send a small confirmation email in the background. I wanted to use Celery (+1 to celery from me too!). What else can I do?

@fecobiome -- thanks for the feedback, adding your vote to the ticket. For the time being, maybe the ideas on this help page will give you some inspiration?

Any update on celery yet?? This is 2022

Not yet. Happy New Year!

+1

Noted!

I guess this could be mitigated if you guys offered some kind of "guardian" for always-running tasks: something that checks every so often and, if the task is down, restarts it. That way we could have an always-running task polling the database or whatever and executing the code.

@luisdaniel Always-on tasks already work like that: when a task crashes, it is re-run. See also https://blog.pythonanywhere.com/198/

Now in 2023, is celery available?

Nope.

What's the best Celery alternative that's been listed here so far? I just need a secondary process to run if an error occurs on my site.

We have a help page on how to implement that here: https://help.pythonanywhere.com/pages/AsyncInWebApps/

+1 for celery

Thanks, noted!

Hello, we are in 2024)

Is there Celery support in PA?

if not +1 for celery)

We don't support it, no. But one option (with a paid account) would be to get a message queue as a service from a third party -- eg. RabbitMQ from CloudAMQP or Redis from Redis Labs -- then you could use Celery with that as the backend.
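A minimal sketch of that setup, assuming you already have a broker URL from a third-party service such as CloudAMQP or Redis Labs; the URL below is a placeholder, not a real credential, and the worker itself would still need somewhere to run (e.g. an always-on task).

from celery import Celery

app = Celery(
    "myproject",
    broker="redis://:your-password@your-instance.redislabs.com:12345/0",  # placeholder
)

@app.task
def send_welcome_email(user_id):
    ...  # whatever work you want off the request/response cycle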

+1 for celery. Ten years later, this should be taken into consideration. ;)

We are considering it, but with nice external services around, it is always prioritized lower than our core offering.

Hello, I have set up my Redis Labs configuration and celery is running, but I keep getting:

raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: Authentication required

Redis configuration provided by Redis Labs

REDIS_HOST = 'redis-18792.c14.us-east-1-2.ec2.cloud.redislabs.com'
REDIS_PORT = 18792
REDIS_PASSWORD = '**'

Redis connection

r = redis.Redis(
    host=REDIS_HOST,
    port=REDIS_PORT,
    password=REDIS_PASSWORD,
)

Celery configuration

CELERY_BROKER_URL = f'redis://{REDIS_HOST}:{REDIS_PORT}/0'
CELERY_RESULT_BACKEND = f'redis://{REDIS_HOST}:{REDIS_PORT}/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TASK_ALWAYS_EAGER = False
CELERYD_PREFETCH_MULTIPLIER = 50

This is really annoying, I have been on this for a whole week. What exactly is wrong with PA??

Your server is normally very slow, and I decided to use Celery to offload tasks, but you still don't allow it???

What do you mean by "we do not allow it"? It looks like you have auth problems with the service you're trying to connect to.
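One guess from the config posted above (not something we can confirm from here): the password never makes it into the Celery broker URL, so the broker rejects the connection. A sketch with the password included in the URL:

CELERY_BROKER_URL = f'redis://:{REDIS_PASSWORD}@{REDIS_HOST}:{REDIS_PORT}/0'
CELERY_RESULT_BACKEND = f'redis://:{REDIS_PASSWORD}@{REDIS_HOST}:{REDIS_PORT}/0'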

I did not have any AUTH problem

I have configured everything properly; the page just keeps loading, and later I get an error message about the load balancer. So is this on my side??

Did you check the web app's error log?

There was no error. I noticed that the always-on tasks even failed to run, and I was unable to use channels. Don't worry, I have unsubscribed and am currently checking out Heroku, because I need real-time communication in my app.

You still use a paid account, though.