Forums

Recent Slowdowns? And can robots.txt help with web speeds?

First question - Since around Christmas, I've had on-and-off periods where my website will not load; 12/26 was particularly rough. It's usually accompanied by an error like the one below. Have there been any platform issues? I didn't see anything on the blog.

2023-12-31 04:53:45,881: OSError: write error

Second question - I use Wagtail and occasionally change the URLs on my site. I'm seeing two entries (not necessarily always together) in the error logs: one looking for a URL that has been changed and no longer exists, and one looking for a robots.txt.

2023-12-31 11:29:25,158: Not Found: /learn/life-skills/making-calls/
2023-12-31 12:18:23,634: Not Found: /robots.txt

Could the bad URL reference be caused by crawlers? Is that why I'm seeing the robots.txt request after some of them? Or is there something in the web app I need to update so it ignores old links? I didn't think it would help, but I did reboot the site, and I've just added a robots.txt to see if that changes anything.
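For reference, a minimal robots.txt along these lines can ask polite crawlers to back off; the paths are illustrative, and note that `Crawl-delay` is a non-standard directive that some major crawlers (Googlebot, for one) ignore:

```
# Applies to all crawlers that honour robots.txt
User-agent: *
# Ask crawlers to wait 10 seconds between requests (not universally honoured)
Crawl-delay: 10
# Keep crawlers out of admin pages (illustrative path)
Disallow: /admin/
```

Serving even an empty robots.txt will at least stop the "Not Found: /robots.txt" log entries.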

Also, could a lack of robots.txt be causing the site slowdowns? And if so, why does that prompt the OS write error? Or is that OS error completely unrelated?

Thanks for any help!

I took a look at your sites, and one of them (I won't put the full hostname here for your privacy, but it's the one starting with "b") seems to have had significantly more traffic since Christmas -- you can see that on the hit charts on the "Web" page inside PythonAnywhere.

It could all be related to crawlers, yes, though it could just be increased organic traffic for some reason.

The "write error" messages are happening when your site is trying to send a response to a browser, but the browser has already closed the connection. If your site is running slowly, then you'll see a lot of these if people try to access it and give up on waiting for it to load, or if there's a timeout on the browser side.

The bots are likely accessing old URLs because they scanned your site previously, and have added the URLs that they found at that time to an index, which they are now re-scanning, looking for changes.

I would say, though, that the amount of traffic you're getting, while higher than it was, is not high enough that I'd expect to see slowdowns. Are there any particular views that take a long time to process, or something like that?

You might also find our help page on diagnosing site performance issues useful -- it has a lot of background and debugging tips.

Thanks for taking a look. I'm working through the troubleshooting page now to see if anything applies. I briefly considered problems with views and processing, but dismissed that because I hadn't made any big changes recently. I hadn't considered that an increase in traffic could be aggravating something that wasn't noticeably a problem before, so thank you for mentioning that.

I also added some different images to the site. I need to look into how to optimize those with Wagtail.

Appreciate the help!

We don't use Wagtail ourselves, so it's hard to be specific, but a good general starting point is to measure what is slow. It could be a database query, some computation, file access, or an external API call...
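As a starting point, you can wrap the suspect steps in simple timers and see which one dominates. This is a minimal sketch using only the standard library; `timed()` is a hypothetical helper, not part of Django or Wagtail:

```python
# Sketch: measure where time goes inside a view or task.
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print how long the wrapped block took, in milliseconds."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{label}: {elapsed_ms:.1f} ms")

# Usage: time each candidate step separately.
with timed("db query"):
    rows = list(range(10_000))   # stand-in for a database query

with timed("computation"):
    total = sum(rows)            # stand-in for some processing
```

Once you know which step is slow, you can dig in with more targeted tools (e.g. logging the SQL your ORM generates, or profiling the slow function).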