MySQL or Postgres


I'm loading a page that does a lot of database processing (I guess), and it's taking about 3 minutes to load.

I am trying to understand how to improve this. Unfortunately, my knowledge of databases is still at an early stage.

A brief explanation of the query would be:

A table called journal has an entry for each day.

Each of those entries has multiple many-to-many fields pointing at another table (analytic), around 10 per journal entry.

This analytic table has around 100 columns.

I'm currently loading a full year of journal entries into a webpage, which takes about 3 minutes, which is obviously not good.

I'm trying to understand if there is a parameter in the PythonAnywhere plan that I can upgrade to get better performance, like web workers or something?

Or would Postgres make a difference? Django recommends it, and I'm currently using MySQL.

Or is there any other advice someone can give me?


Workers won't help, unfortunately -- that only controls how many concurrent requests you can use, not the speed of any individual request.

That does sound very slow for what should be a simple query, though. You're using MySQL, which is good (SQLite is quite slow on our platform, and has concurrency issues). Postgres probably wouldn't be much faster.

My best guess as to the easiest way to speed things up would be to check whether you have indexes on the columns that you're querying. If you do, then perhaps you could share the code that you're using to make the query?

I'm not sure I understood your question about indexed columns, but I assume the answer is yes, through an epoch column.

I'm ordering the values by time.

Initially I created an epoch column as a BigIntegerField to avoid using a DateTimeField, as I thought it would perform better, even though I'm clueless about any actual performance difference at this point.

I'm seriously considering dropping the epoch approach, as it's adding complexity with conversions to regular human-readable dates.
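For context, this is the kind of back-and-forth conversion I mean. A small stdlib-only sketch (the values and names are just illustrative, not my real data):

```python
# Sketch of the conversions an integer epoch column forces on you,
# using only the standard library.
from datetime import datetime, timezone

epoch = 1700000000  # what a BigIntegerField would store

# epoch -> human-readable date (what the template needs to show)
dt = datetime.fromtimestamp(epoch, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00

# human-readable date -> epoch (what a filter or order_by needs)
back = int(datetime(2023, 11, 14, 22, 13, 20, tzinfo=timezone.utc).timestamp())
print(back == epoch)  # True
```

With a DateTimeField, Django and the database handle both directions for you, which is why I'm tempted to switch back.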

This is the single query I am doing to list all the journal entries in the webpage.

journalEntries = Journal.objects.filter(user=request.user).order_by('-epoch')

Indexes are not created automatically on non-primary-key columns, so if you don't know that there's an index on a particular column, then you probably do not have an index on that column.
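To see what difference that makes, here's a quick sketch using the stdlib sqlite3 module (not MySQL, but the principle is the same) with a query shaped like yours; the table and index names are just illustrative:

```python
# Demonstrates how an index changes the query plan for a
# "filter by user, order by epoch" query, using stdlib sqlite3.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE journal (id INTEGER PRIMARY KEY, user_id INTEGER, epoch INTEGER)"
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows put the human-readable detail in column 3.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM journal WHERE user_id = 1 ORDER BY epoch DESC"

plan_before = plan(query)
print(plan_before)  # without an index: a full table scan plus a temp B-tree sort

con.execute("CREATE INDEX journal_user_epoch ON journal (user_id, epoch)")

plan_after = plan(query)
print(plan_after)  # with the index: both the lookup and the ordering use it
```

The composite index covers both the WHERE clause and the ORDER BY, so the sort step disappears entirely.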

Thanks glenn. I wasn't aware of how that works, then. I'm going to do some research and run some tests.

Quick update on this one: I found out that the loading time is not related to the database but to all the associated HTML/JavaScript being generated for the webpage.

Thanks for all the previous insights.