urllib 403 Forbidden with paid account (works in IPython, but not in bash virtualenv)

I'm using urllib.request.urlopen to try to hit " ". This works in IPython (Python 3.6) if I import my file and run the function that makes the request, but when I run the same script from bash I get a 403 Forbidden. Why is this the case?

Update: It is now working, I just needed to wait some time.

If it was a bash console that you'd started before you upgraded your account, then it would have been using the old settings -- perhaps that was the problem?

(If it then suddenly started working after our system upgrade, which happened about an hour after you originally posted your message here, that would have been because the system upgrade force-restarted all consoles on the system.)

I started a new bash console after upgrading my account. Yes, that is when it didn't work (during your force-restart). Actually this is happening at this very moment as well. Does this force-restart occur every day at this time?

Hi there, consoles do need to be restarted every so often for maintenance reasons. If you need a script to be "always-on", you should head over to the files tab.

Alternatively, we have a new "always-on tasks" service that's in preview/beta, which you could trial if you like?

Hello! My file doesn't run for very long, but I need to call scripts at certain times throughout the day to scrape some sites. About 30 minutes ago, though, I started seeing a 403 from my scheduled script again. Is the only solution to build in a retry until your servers get restarted, or is there something else going on here as well?
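For what it's worth, a retry along those lines could be sketched like this (a minimal sketch: the attempt count and delay are arbitrary, and the `opener` parameter is just an illustration hook, not anything PythonAnywhere-specific):

```python
import time
import urllib.error
import urllib.request

def fetch_with_retry(url, opener=urllib.request.urlopen, attempts=3, delay=30):
    """Retry on HTTP 403, re-raising any other error immediately."""
    last_error = None
    for attempt in range(attempts):
        try:
            return opener(url)
        except urllib.error.HTTPError as e:
            if e.code != 403:
                raise  # not the intermittent 403 -- give up straight away
            last_error = e
            if attempt < attempts - 1:
                time.sleep(delay)  # wait before the next try
    raise last_error  # all attempts returned 403
```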

Hmmm, this is strange. The console restart thing only matters when you've just upgraded your account; consoles keep the internet access settings that applied to your account at the time you created them, so if you create one while you have a free account and then upgrade, you need to kill and restart the console to get the paid account settings.

But this behaviour you're seeing, where intermittently you get a 403, looks like something different. Could you try printing out the content of the response? It's possible that the NASDAQ API is sometimes sending back a 403 code -- it's just a generic "access denied" response, not something specific to PythonAnywhere.
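For reference, urllib's HTTPError is itself file-like, so the error page the server sent can be read straight off the exception. A sketch (the URL below is just a placeholder):

```python
import urllib.error
import urllib.request

def fetch(url):
    try:
        return urllib.request.urlopen(url).read()
    except urllib.error.HTTPError as e:
        # HTTPError is file-like: read() returns the body the server
        # sent along with the error status (e.g. an "Access Denied" page).
        print("HTTP", e.code, e.reason)
        print(e.read())
        raise
```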

This seems like a PA problem (I think). If it means anything, I consistently get blocked when running my script through bash, but IPython still consistently works. I'm running this in a Python 3 virtualenv.

  File "/home/joepeer00000/", line 20, in get_most_advance
    soup = bs(urlopen(url), "html.parser")
  File "/usr/lib/python3.6/urllib/request.py", line 223, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.6/urllib/request.py", line 532, in open
    response = meth(req, response)
  File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python3.6/urllib/request.py", line 570, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 650, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden

b'<HTML><HEAD>\n<TITLE>Access Denied</TITLE>\n</HEAD><BODY>\n<H1>Access Denied</H1>\n \nYou don\'t have permission to access "http&#58;&#47;&#47;www&#46;nasdaq&#46;c
om&#47;extended&#45;trading&#47;premarket&#45;mostactive&#46;aspx" on this server.<P>\nReference&#32;&#35;18&#46;96111cb8&#46;1512174140&#46;504052b\n</BODY>\n</HTML

The thing is, though, that error is definitely not one we're producing. It's coming from the server you're trying to access.

It's possible that your bash console and your IPython ones are running on different servers inside our cluster. (If by IPython you mean an IPython/Jupyter notebook, then that's definitely the case. If you mean an IPython console, then it's still possible.)

If they're on different servers, then perhaps the server's owners have blocked one IP address but not the other.
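If the block turns out to be based on the client signature rather than the IP, one thing worth trying (just a guess, not a confirmed fix) is sending an explicit User-Agent, since urllib's default "Python-urllib/3.x" is commonly rejected by scraping-hostile sites:

```python
import urllib.request

def fetch_with_ua(url, user_agent="Mozilla/5.0 (compatible; my-scraper)"):
    # urllib identifies itself as "Python-urllib/3.x" by default;
    # some servers refuse that outright, so supply a header of our own.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read()
```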