Problem with crawling

Hey! I'm crawling a site to get stock prices for a particular company. When I run it as a Python script from the Bash console, it works. But when I run it as a view, it gives me `AttributeError: 'NoneType' object has no attribute 'findAll'`. My account is a paid one, so that's not where the problem is. Can anyone tell me why this is happening?

It's hard to say without seeing more of the code. Could you post the line that's triggering the exception, plus a few of the lines around it? Alternatively, I can take a look at your code from our side if you give your permission -- just let me know the file, directory, and line.
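For what it's worth, that error almost always means the `find(...)` call before the `findAll` returned `None` because the selector didn't match (often a sign the fetched page isn't what you expected, e.g. a wrong URL). A minimal sketch of the guard pattern, using only the standard library -- the HTML snippet, pattern, and function name here are illustrative, not your actual code:

```python
import re

# Stand-in for the fetched page; your real scraper gets this over HTTP.
HTML = '<span class="price">123.45</span>'

def extract_price(html, pattern=r'class="price">([\d.]+)<'):
    """Return the price if found, else None -- never raises AttributeError."""
    match = re.search(pattern, html)
    if match is None:  # e.g. a wrong URL returned an unexpected page
        return None
    return float(match.group(1))

print(extract_price(HTML))             # 123.45
print(extract_price("<html></html>"))  # None
```

The same check applies with BeautifulSoup: test the result of `find(...)` for `None` before calling `findAll` on it.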

Hi! I fixed it. The URL that was being passed was wrong. There's actually one more thing I want to ask. Recently my account was put in the tarpit for the first time. I have two scripts that are each scheduled to run three times a day, and it is essential for my project that they run. Even if the account is in the tarpit, the processes will still run (albeit with lower priority), won't they?

That's right, they will run, just a little slower.

One caveat: we do occasionally have downtime. It's usually scheduled, but there are sometimes unscheduled outages too, which might mean a period of a few minutes during which tasks will not be started. It's rare, but it happens. If it's absolutely essential that your tasks run three times a day without fail, I'd recommend scheduling six runs a day, alternating between running the task 'for real' and a checker task that verifies the previous run completed, and re-runs it if it detects it was skipped or aborted.
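The checker pattern described above can be sketched roughly like this -- the marker path, the five-hour threshold, and the script name are all placeholder assumptions, not anything your setup requires:

```python
import os
import time

MARKER = "last_run_ok.txt"  # hypothetical path written by the 'real' task
MAX_AGE = 5 * 3600          # re-run if the last success is older than ~5 hours

def record_success(marker=MARKER):
    """Called at the end of the 'real' task to note a completed run."""
    with open(marker, "w") as f:
        f.write(str(time.time()))

def needs_rerun(marker=MARKER, max_age=MAX_AGE, now=None):
    """True if the last recorded success is missing or too old."""
    if not os.path.exists(marker):
        return True
    with open(marker) as f:
        last = float(f.read())
    now = time.time() if now is None else now
    return (now - last) > max_age

# The checker task, scheduled a couple of hours after each real run, would do
# something like (fetch_prices.py is a hypothetical script name):
#     if needs_rerun():
#         subprocess.run(["python", "fetch_prices.py"])
```

The real task calls `record_success()` as its last step, so a crash or skipped run leaves the marker stale and the checker picks it up.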

During the downtime, does the site stop functioning? That is, if someone accesses the URL during the downtime, does the site still work as normal? I have to present this as my final-year project, and it would be really sad if the downtime and presentation dates coincided.

Well, look, I'm not going to promise 100% uptime here. We do occasionally have unplanned outages, and our current stats are about 99.7% uptime. Of the outages we do have, 90% are during scheduled maintenance windows. We notify our users at least 24 hours in advance of these, and we schedule them during quiet periods (usually around 6AM GMT).

Yeah, I know. I get all the downtime notifications. And by the way, I didn't mean any offence -- I think my wording was a bit off. What I'm wondering is whether users will still be able to use my site during the downtime.

During planned outages, all sites are down. During unplanned outages, it depends on the incident -- things that only affect console servers don't affect web servers, for example.

OK, thanks. I'll keep my fingers crossed that my presentation date and a downtime date don't coincide.

If you let us know the presentation date/time when you know it, we can see if we can plan around it -- no guarantees, because obviously we have to take the situation of quite a lot of people into consideration, but there's sometimes wriggle room :-)

Thanks for the consideration, but I don't know the dates yet -- they're not finalized. However, I know that the mock presentation is in mid-April and the final one at the end of May.

That's fine -- just drop us a line when you know.