
Django search bar very slow on paid account MySQL DB - How to speed it up?

My view has a search bar, which searches extremely slowly with the MySQL DB on my $12 account. The weirdest thing is that it was really fast on SQLite. I'm thinking it could be the connection to the MySQL DB. There are four tables that I'm trying to pull information from: Restaurant, Food, Menu, and Quality.

When I run it in the shell, it's actually normal speed. No single line takes super long, so I'm not sure why it's so slow on the server. I've tried commenting out individual lines and running it on the server, and it doesn't really change the speed.

Here is the code that runs very slowly:

from django.db.models import Q

in_radius = [...]  # short list of zip codes
list_ids = [...]   # short list of ids

# Restaurants table is small
find_objects = Restaurants.objects.filter(Q(zipcode__in=in_radius) | Q(id__in=list_ids)).distinct()

# Menu table has about 1 million observations
objects_selected = Food.objects.filter(menu__menuid=pk)
table = objects_selected.filter(restaurants__in=find_objects).order_by('price').prefetch_related('restaurants')

# Food table is small
food_info = Food.objects.filter(foodid=pk)

# Quality table has about 600K objects
quality_info = Quality.objects.filter(food__foodid=pk, locality=str(5))[:1]

Remember that Django evaluates queries lazily, so the statement

find_objects = Restaurants.objects.filter(Q(zipcode__in=in_radius) | Q(id__in=list_ids)).distinct()

...doesn't actually do the query. It's only when you look at the results that it will run slowly.

So for your timing tests in the shell, I suggest you do something like this to force Django to actually hit the database:

list(Restaurants.objects.filter(Q(zipcode__in=in_radius) | Q(id__in=list_ids)).distinct())

That will at least help pin down the query that's causing the problem.
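If it helps, here's a rough way to time each queryset from the shell. This is just a sketch that reuses the variable names from the code you posted above, so you'd need those assigned in your shell session first:

import time

def timed(label, queryset):
    # list() forces Django to actually run the query so we can measure it.
    start = time.perf_counter()
    rows = list(queryset)
    print(f"{label}: {len(rows)} rows in {time.perf_counter() - start:.2f}s")
    return rows

timed("find_objects", find_objects)
timed("table", table)
timed("food_info", food_info)
timed("quality_info", quality_info)

Whichever one dominates the total time is the one worth looking at first.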

As to why it's slow in MySQL and fast in SQLite -- do you have indexes on all of the same columns in both databases?
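If you're not sure what indexes the MySQL database actually has, you can list them from a Django shell. A minimal sketch -- 'myapp_restaurants' is just a placeholder, substitute the real table names for Restaurants, Food, Menu and Quality:

from django.db import connection

# SHOW INDEX lists every index MySQL has on the given table.
with connection.cursor() as cursor:
    cursor.execute("SHOW INDEX FROM myapp_restaurants")
    for row in cursor.fetchall():
        print(row)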

Thank you Giles! Your answer was great. So, it's the query on the largest table that's very slow, and I do need to add indexes. What's the best way to do it now that I have already built the database?

It's taken me two weeks to build the database, since I had to load all the tables via CSV. I kept getting version/fixture issues when I tried to load my local database on pythonanywhere.

Should I take a copy of the database and dump it, build the indexes and then reload? Is there a way to build the indexes without rebuilding the database?
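From what I've read, it looks like I might be able to just declare the index on the model and let a migration add it in place -- is that right? A rough sketch (not my real models, the field names are made up):

from django.db import models

class Food(models.Model):
    # ... existing fields ...
    price = models.DecimalField(max_digits=8, decimal_places=2)

    class Meta:
        indexes = [
            models.Index(fields=['price'], name='food_price_idx'),
        ]

Then makemigrations/migrate would just run a CREATE INDEX on the existing table without touching the data?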

I see you asked about adding the indexes in a separate forum thread, so I'll reply there.

Hello. Are the necessary features and libraries for search bars available in the free plan?

Do you have any particular features or libraries in mind? You can check the pre-installed Python packages here. You can use SQLite or MySQL on the free plan as well; see this.