Forums

Best way to cache for Flask with SQLAlchemy

I am using a cache to reduce the load of querying my MySQL database. Currently I set the cache type to 'simple', but I am not sure whether that is the best option with the PythonAnywhere MySQL setup, or whether I should use Redis or memcached instead. I'm not very familiar with databases.

Both abc and def will be called at the same time; will that be an issue?

from flask import Flask, jsonify, abort, make_response, request
from flask_sqlalchemy import SQLAlchemy  # the flask.ext.* import style is deprecated
from flask_cache import Cache
from sqlalchemy import text
from collections import OrderedDict
import time

app = Flask(__name__)

SQLALCHEMY_DATABASE_URI = "mysql+mysqlconnector://{username}:{password}@{hostname}/{databasename}".format(
    username="vinus",
    password="liverpoolfc",
    hostname="vinus.mysql.pythonanywhere-services.com",
    databasename="vinus$default",
    )
app.config["SQLALCHEMY_DATABASE_URI"] = SQLALCHEMY_DATABASE_URI
app.config["SQLALCHEMY_POOL_RECYCLE"] = 299

db = SQLAlchemy(app)
# define the cache config keys, remember that it can be done in a settings file
app.config['CACHE_TYPE'] = 'simple'

# register the cache instance and binds it on to your app
app.cache = Cache(app)

@app.route("/")
@app.cache.cached(timeout=300)  # cache this view for 5 minutes
def cached_view():
    return time.ctime()

@app.route('/abc', methods=['GET'])
@app.cache.cached(timeout=300)
def abc():
    try:
        call = db.session.execute(text(
            "SELECT Market_sentiment, BUY_total, HOLD_total, SELL_total FROM KLSE"))
        col = ['Market_sentiment', 'BUY_total', 'HOLD_total', 'SELL_total']
        d = call.fetchone()
        d1 = [OrderedDict(zip(col, d))]
    except Exception:
        return 'Error: unable to fetch items'
    # the return must sit at function level, outside the except block,
    # otherwise the success path never returns anything
    return jsonify({'Trade': d1})

@app.route('/def', methods=['GET'])
@app.cache.cached(timeout=300)
def def_view():  # renamed: two view functions both named abc() make Flask raise on registration
    try:
        call = db.session.execute(text(
            "SELECT Market_sentiment, BUY_total, HOLD_total, SELL_total FROM KLSE"))
        col = ['Market_sentiment', 'BUY_total', 'HOLD_total', 'SELL_total']
        d = call.fetchone()
        d1 = [OrderedDict(zip(col, d))]
    except Exception:
        return 'Error: unable to fetch items'
    return jsonify({'Trade': d1})
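On the concurrency question: Flask-Cache keys each cached view by its request path, so /abc and /def are stored under separate keys and hitting them at the same time is not a problem. Here is a minimal sketch (my own illustration, not Flask-Cache's actual implementation) of that kind of per-key, time-limited store:

```python
import time

class SimpleTTLCache:
    """Toy version of a 'simple' in-process cache: a dict of
    key -> (expiry_timestamp, value), one entry per cached view."""

    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expiry, value = entry
        if time.time() > expiry:   # entry older than its timeout: discard
            del self._store[key]
            return None
        return value

    def set(self, key, value, timeout=300):
        self._store[key] = (time.time() + timeout, value)

cache = SimpleTTLCache()
# each view gets its own key, so the two routes never overwrite each other
cache.set('view//abc', {'Trade': 'abc-result'})
cache.set('view//def', {'Trade': 'def-result'})
```

Because the two routes live under different keys, a request to one never evicts or corrupts the other's cached response.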

I'm not sure how SimpleCache is implemented, but it is probably fine. If you want to use Redis, try redislite and configure it to use /dev/shm (i.e. store the db in memory).

Hi conrad,

I refer to this link for the simple cache: http://brunorocha.org/python/flask/using-flask-cache.html. I am not sure whether pickle or Redis will use resources more efficiently on PythonAnywhere.

  1. How much space is allocated for Redis (using /dev/shm) or the simple cache?

  2. Can it hold 100MB of data from MySQL?

  1. We provide 1MB in /dev/shm
  2. No
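Given that ~1MB budget, one rough way to check whether a query result will fit is to measure its pickled size, since the cache stores serialized values. A sketch using a made-up row shaped like the KLSE query results above:

```python
import pickle
from collections import OrderedDict

# Hypothetical row, shaped like the KLSE query result in the code above.
row = OrderedDict([('Market_sentiment', 'BUY'),
                   ('BUY_total', 120),
                   ('HOLD_total', 45),
                   ('SELL_total', 30)])

def cached_size_bytes(obj):
    """Approximate size of the object as a cache would store it (pickle bytes)."""
    return len(pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL))

size = cached_size_bytes([row])
fits = size < 1_000_000  # the ~1MB /dev/shm budget mentioned above
```

A single small row is only a few hundred bytes, but a full 100MB table clearly exceeds the limit, which matches the answer above.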

I used Flask-Cache with the Redis backend (CACHE_TYPE = 'redis'). Its configuration keys are:

  CACHE_DEFAULT_TIMEOUT
  CACHE_KEY_PREFIX
  CACHE_REDIS_HOST
  CACHE_REDIS_PORT
  CACHE_REDIS_PASSWORD
  CACHE_REDIS_DB
  CACHE_ARGS
  CACHE_OPTIONS
  CACHE_REDIS_URL

What should I fill in for CACHE_REDIS_HOST, CACHE_REDIS_PORT, CACHE_REDIS_PASSWORD, CACHE_REDIS_DB, CACHE_REDIS_URL and CACHE_KEY_PREFIX?

Thanks

You'll be best off working it out from the documentation. Here are the redislite docs, and you probably already know that the Flask-Cache docs are here.
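To get you started, here is a config sketch under two assumptions I have not verified against your setup: (1) redislite is installed, and (2) Flask-Cache passes CACHE_REDIS_HOST straight through to the underlying cache backend, which accepts a Redis client object as well as a hostname. redislite runs an embedded server, so pointing its db file at /dev/shm keeps it in memory:

```python
from flask import Flask
from flask_cache import Cache
from redislite import Redis

app = Flask(__name__)

# redislite starts its own embedded Redis server; the db file path below
# is an example name, and putting it in /dev/shm keeps the data in memory.
app.config['CACHE_TYPE'] = 'redis'
app.config['CACHE_REDIS_HOST'] = Redis('/dev/shm/flask_cache.rdb')
app.config['CACHE_KEY_PREFIX'] = 'myapp:'    # any string; namespaces your keys
app.config['CACHE_DEFAULT_TIMEOUT'] = 300    # seconds

# With an embedded redislite server there is no separate host, port or
# password, so CACHE_REDIS_PORT, CACHE_REDIS_PASSWORD, CACHE_REDIS_DB and
# CACHE_REDIS_URL can simply be left unset.

cache = Cache(app)
```

If that pass-through assumption turns out not to hold for your Flask-Cache version, the redislite docs show alternative ways to point a client at the embedded server.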