Forums

"Exceeded 'max_user_connections' resource" error: where am I leaking connections?

My sqlalchemy db object is defined as follows:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()
app = Flask(__name__)
db.init_app(app)

and I use pandas' read_sql_query in my view functions to run queries:

df = pd.read_sql_query(sql, db.engine)

An AJAX call uses this repeatedly, and with just a few concurrent users I get the dreaded 'exceeded max_user_connections resource' error. Can someone help point out where the problem is?
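One thing worth trying is checking the connection out explicitly with a context manager instead of handing `db.engine` straight to pandas, so the connection is returned to the pool deterministically when the query finishes. A minimal sketch (the in-memory SQLite engine here is just a stand-in for your real `db.engine`):

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Stand-in for db.engine; in the Flask app you would not create this yourself.
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE t (x INTEGER)"))
    conn.execute(text("INSERT INTO t VALUES (1), (2)"))

# Check the connection out explicitly; the `with` block guarantees it is
# handed back to the pool as soon as the query completes.
with engine.connect() as conn:
    df = pd.read_sql_query("SELECT x FROM t", conn)

print(df["x"].tolist())  # [1, 2]
```

This doesn't shrink the pool, but it rules out connections being held open longer than a single request.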

Is it possible that pandas' read_sql_query is not very well optimised? Can you find out whether it's using connection pooling properly, or whether there's a way of manually telling it to disconnect after each query?

show global variables where variable_name='max_user_connections'

returns 0. Is that problematic?

Typing 'show processlist' shows the following:

+----------+----------+------------------------------------+-------------------+---------+------+-------+------------------+
| Id       | User     | Host                               | db                | Command | Time | State | Info             |
+----------+----------+------------------------------------+-------------------+---------+------+-------+------------------+
| 11441409 | lightson | ip-10-183-39-79.ec2.internal:40344 | NULL              | Sleep   |  488 |       | NULL             |
| 11441435 | lightson | ip-10-183-39-79.ec2.internal:40379 | lightson$lightson | Query   |    0 | NULL  | show processlist |
| 11442341 | lightson | ip-10-9-180-81.ec2.internal:54603  | lightson$lightson | Sleep   |    2 |       | NULL             |
| 11442343 | lightson | ip-10-9-180-81.ec2.internal:54605  | lightson$lightson | Sleep   |    3 |       | NULL             |
| 11442440 | lightson | ip-10-9-180-81.ec2.internal:54672  | lightson$lightson | Sleep   |   13 |       | NULL             |
| 11442501 | lightson | ip-10-9-180-81.ec2.internal:54726  | lightson$lightson | Sleep   |    3 |       | NULL             |
| 11442567 | lightson | ip-10-9-180-81.ec2.internal:54775  | lightson$lightson | Sleep   |   52 |       | NULL             |
| 11442584 | lightson | ip-10-9-180-81.ec2.internal:54791  | lightson$lightson | Sleep   |   14 |       | NULL             |
+----------+----------+------------------------------------+-------------------+---------+------+-------+------------------+

This is after invoking many AJAX calls (hence all the 'Sleep' commands). I did encounter a crash once in the middle, but it came back very quickly.

Re: max_user_connections, try doing it without the "global":

show variables where variable_name='max_user_connections' ;

Re: the processlist, that does seem like quite a lot of connections, so I suspect pandas is being badly behaved... Can you figure out how to make it use a connection pool, or failing that, how to force it to disconnect after each query?
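If forcing a disconnect after every query turns out to be the right answer, SQLAlchemy's NullPool does exactly that: it opens a fresh DBAPI connection per checkout and closes it on release, so nothing lingers in 'Sleep' on the server. A sketch, using a temp-file SQLite database as a stand-in for the real MySQL URL:

```python
import os
import tempfile

import pandas as pd
from sqlalchemy import create_engine, text
from sqlalchemy.pool import NullPool

# NullPool = no pooling: every checkout is a brand-new connection that is
# fully closed when released. The SQLite file stands in for your MySQL URL.
db_path = os.path.join(tempfile.mkdtemp(), "demo.db")
engine = create_engine(f"sqlite:///{db_path}", poolclass=NullPool)

with engine.begin() as conn:
    conn.execute(text("CREATE TABLE t (x INTEGER)"))
    conn.execute(text("INSERT INTO t VALUES (42)"))

# Each read_sql_query now opens and then completely closes its own connection.
df = pd.read_sql_query("SELECT x FROM t", engine)
print(df["x"].tolist())  # [42]
```

The trade-off is paying the connection-setup cost on every request, so a small bounded pool is usually preferable if it fits under the server's limit.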

You should be able to control connection pooling explicitly. You can use

 app.config['SQLALCHEMY_POOL_SIZE'] = 20

(the default is 5) or other relevant parameters; see this doc: https://pythonhosted.org/Flask-SQLAlchemy/config.html
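Putting the relevant keys together (the key names are from the Flask-SQLAlchemy config docs linked above; the particular values here are illustrative assumptions, not measurements), a small bounded pool would look like:

```python
# Keep the pool small so a handful of worker processes stay under
# the server's max_user_connections limit.
app.config['SQLALCHEMY_POOL_SIZE'] = 3       # persistent connections per process
app.config['SQLALCHEMY_MAX_OVERFLOW'] = 0    # no burst connections beyond the pool
app.config['SQLALCHEMY_POOL_RECYCLE'] = 280  # recycle before server-side idle timeouts
```

Remember the pool is per process, so multiply by your number of web workers when sizing it against the server limit.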

Or you could control the engine directly with

 from sqlalchemy import create_engine
 engine = create_engine('postgresql://me@localhost/mydb', pool_size=20, max_overflow=0)

as documented here: http://docs.sqlalchemy.org/en/rel_0_9/core/pooling.html?highlight=pooling

HTH

Yes, what jmfrank63 said, but don't use a pool size of 20, or you'll definitely hit the max_user_connections limit. Two or three should be enough...