Implement caching for Flask + Pythonanywhere

So currently I use these for my webapp:

Flask-Cache - to cache pages with a custom timeout and the ability to create a key for each cache entry (so I can cache by Flask's request.args).

cachetools - basically the same functionality as Flask-Cache (I use a key built from Flask's request.args); it can also build cache entries from a function's parameters.

and then there's flask_limiter, which tracks users' attempts to access a resource and stops them from overusing an API endpoint.
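For the cachetools part, a minimal sketch of parameter-based caching with a TTL (the function and its arguments here are made up; in a Flask view you'd feed it values pulled from request.args):

```python
# Sketch, assuming the `cachetools` package: cache a function's result
# keyed on its parameters, with a 60-second TTL.
from cachetools import TTLCache, cached
from cachetools.keys import hashkey

cache = TTLCache(maxsize=128, ttl=60)

@cached(cache, key=lambda page, per_page: hashkey(page, per_page))
def expensive_lookup(page, per_page):
    # Stand-in for a slow query; the arguments become the cache key.
    return {"page": page, "per_page": per_page}

first = expensive_lookup(1, 10)
second = expensive_lookup(1, 10)  # served from the cache, not recomputed
```

Note that this cache lives in the process's own memory, which is exactly why it isn't shared between workers.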

So I basically have three things in place (though I am thinking of getting rid of Flask-Cache and using cachetools as my sole caching layer). Development went smoothly on my personal server: I can quickly fetch the resource I want because it is "saved" in its memory.

Upon deploying to PythonAnywhere, I found out that each worker has its own cache/rate limiter, and these are not shared between workers. I have read the documentation for every cache module I use, and they all support backends like Redis, Memcached, etc. -- none of which PythonAnywhere supports.

Is there an option where I could "trick" these tools into thinking that MySQL is the new Redis (if you get what I mean)? Or what other options could I use to implement caching (where I can create keys [cache prefixes/suffixes], TTLs, and/or cache based on a function's parameters) that are PythonAnywhere-friendly? What about rate limiting?

Thank you so much for your help!

(PS: Here is my GitHub repository if you would like to take a look at how I currently implement things.)

You can try redislite.

For flask rate limiting, you could follow this?

Hm, that is a possibility, thanks! But do you think redislite on PythonAnywhere's disk would be fast enough?

Local disk access is probably going to be faster than (or about the same as) hitting a database, which has to go across the network.

I believe that by default (or via configuration) redislite can use /dev/shm (i.e. store the db in memory).

If I stored the redislite db in memory, would all the workers have access to it? I'm not too familiar with redislite atm.

/dev/shm is a shared in-memory file system. All processes that are on the same server (i.e. all the workers for a particular web app) share the same filesystem there. It's fairly limited in size (1M), though.
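A quick stdlib-only demonstration of that point (the filename here is made up): two processes on the same machine see the same files under /dev/shm, which is why all the workers for one web app could share a db stored there.

```python
import os
import subprocess
import sys
import tempfile

# Use /dev/shm if it exists (Linux); otherwise fall back to a temp dir.
shm_dir = '/dev/shm' if os.path.isdir('/dev/shm') else tempfile.gettempdir()
path = os.path.join(shm_dir, 'shared_demo.txt')

# A separate process writes a value...
subprocess.run(
    [sys.executable, '-c',
     f"open({path!r}, 'w').write('written by another process')"],
    check=True,
)

# ...and this process reads it straight back.
with open(path) as f:
    content = f.read()
os.remove(path)
print(content)
```

The same visibility does not hold across different machines, which matters for consoles vs. web workers (see below).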

How would I set up redislite with /dev/shm?

I think you could just open Redis('/dev/shm/redis.db'). But you will have to double-check that the db is shared between different processes (I think it might not be, in which case you should just use the local file system).

Also, unless you think your app is going to be super busy, even if your Redis data were in memory it would probably get swapped out, which means that in terms of performance it won't be much different from local disk access.

Hm, I've opened two Python consoles within the same virtualenv. They don't seem to share /dev/shm...

First one:

>>> from redislite import Redis
>>> redis_connection = Redis('/dev/shm/redis.db')
>>> redis_connection.keys()
>>> redis_connection.set('key', 'value')
>>> redis_connection.get('key')

Second console:

>>> from redislite import Redis
>>> redis_connection = Redis('/dev/shm/redis.db')
>>> redis_connection.get('key')
>>> redis_connection.get('key')
>>> redis_connection.keys()

Hmm- ya I think perhaps just put it on local disk.

If you're running them in different consoles, they are probably on different machines. That's why I said it works for processes on the same machine.

Would they be on the same machine if it's the same web app?

They would be, yes.

I see from the server logs that PA uses uWSGI as the app server. Has anyone tried implementing the uWSGI web cache inside an app hosted on PA? If not, do you think it could be done easily (and do users have the permissions), given that it is part of uWSGI?

I also saw that Werkzeug has an adapter to connect to it easily from any Python app (the last adapter listed):


Unfortunately I don't think so -- it would require custom uWSGI config for your website, which we don't support. I think the Flask-Cache option above using Redislite with the storage in /dev/shm would work better.

Hi, has anyone been able to implement redislite with flask-limiter?

I want to use redislite storage for my flask-limiter, what should I put in storage_uri=""?

Maybe use /dev/shm, but I'm not sure.

I was able to play with redislite. But upon checking the contents of redis.db, the values for pidfile and unixsocket change every time I reload my web app. How do I prevent this so that I can still use those keys after updating my code?

I think that you'll always wind up with a new redislite instance after restarting your website; if you're only using it for caching, that should be OK if your site is live and not being constantly changed.