Adding URLs in Django and processing them in an always-on task


I would like some suggestions on the following. I have a Django web app. A request sent to the app can contain a parameter that is an external URL. I want to add this URL to a queue to be scraped. I've read that there is no thread support in the web app, so I want to hand the URL off to an always-on task that picks it up. How can I do this? What is the best way of doing it? After scraping, the always-on task has to add the scraped information to the database used by Django.

I read some similar topics, but I couldn't find exactly the answer I was looking for.

Why not put the URLs in your database and have the always-on task mark the ones it has done? Then, when the task finishes a URL, it can put the result back in the database where Django can see it.

Because I've read multiple times that this is not a good idea (but I still can't find what the best practice would be). Yeah, I thought about having a status of "pending" or "ready" in the database, but I think it's not a good idea to read from and write to the database all the time.

Reading and writing is what databases are for. They are not the best solution for queuing, but they can work for queuing.
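To make the polling concrete: the always-on task can loop forever, claim one pending URL at a time, and sleep when the queue is empty so it doesn't hammer the database. A self-contained sketch, again in plain sqlite3 with made-up table and column names (`max_jobs` exists only so the loop is testable; the real task would run without it):

```python
import sqlite3
import time

def run_worker(conn, scrape, idle_sleep=5.0, max_jobs=None):
    """Claim pending URLs one at a time, scrape them, and write the
    result back with status 'done'. `scrape` is whatever function
    does the actual fetching (a placeholder here)."""
    handled = 0
    while max_jobs is None or handled < max_jobs:
        row = conn.execute(
            "SELECT id, url FROM scrape_jobs "
            "WHERE status = 'pending' ORDER BY id LIMIT 1"
        ).fetchone()
        if row is None:
            # Queue is empty: back off instead of polling in a tight loop.
            time.sleep(idle_sleep)
            continue
        job_id, url = row
        conn.execute(
            "UPDATE scrape_jobs SET status = 'done', result = ? WHERE id = ?",
            (scrape(url), job_id),
        )
        conn.commit()
        handled += 1
```

A few-second sleep between empty polls keeps the database load negligible for this kind of workload; only at much higher volumes would a dedicated queue be worth the extra moving parts.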