Urlopen issues

I've got this code to get HTML pages:

from urllib import urlopen
def getPage(page):
    return urlopen(page).read()
newPage = getPage("")
print newPage
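The helper above can be made a little more defensive. Here's a sketch that works under both Python 2 and Python 3 (the names `get_page` and the wrapped error message are just illustrative choices, not anything the platform requires):

```python
try:
    # Python 3
    from urllib.request import urlopen
    from urllib.error import URLError
except ImportError:
    # Python 2
    from urllib2 import urlopen, URLError

def get_page(url):
    # urlopen raises URLError on network problems and follows
    # ordinary 30x redirects automatically.
    try:
        response = urlopen(url)
    except URLError as exc:
        raise RuntimeError("could not retrieve %s: %s" % (url, exc))
    return response.read()
```

Note that `read()` returns bytes, so you may need to decode before treating the result as text.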

I'm trying to get the list of plus-ones from my Google Plus account (and will then be turning it into an RSS feed so that I can incorporate it into my blog), but I'm getting an error:

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN" "">
<html><head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>ERROR: The requested URL could not be retrieved</title>
<style type="text/css"><!--
 %l
 body :lang(fa) { direction: rtl; font-size: 100%; font-family: Tahoma, Roya, sans-serif; float: right; }
 :lang(he) { direction: rtl; float: right; }
--></style>
</head><body>
<div id="titles">
<h1>ERROR</h1>
<h2>The requested URL could not be retrieved</h2>
</div>
<hr>
<div id="content">
<p>The following error was encountered while trying to retrieve the URL: <a href=""></a></p>
<blockquote id="error">
<p><b>Unsupported Request Method and Protocol</b></p>
</blockquote>
<p>Squid does not support all request methods for all access protocols. For example, you can not POST a Gopher request.</p>
<p>Your cache administrator is <a href="mailto:webmaster%W">webmaster</a>.</p>
<br>
</div>
<hr>
<div id="footer">
<p>Generated Wed, 18 Jul 2012 12:48:46 GMT by giles-liveproxy (squid/2.7.STABLE9)</p>
<!-- ERR_UNSUP_REQ -->
</div>
</body></html>

Is there any way around this on this platform?

This works fine for me, so I'm guessing the domain just needs to be added to the whitelist for free accounts.

Yes: just drop a mail to the developers and they'll open it up for you.

Hey there -- actually * is already on our whitelist, so that's not the problem. I think it's actually to do with the way urlopen (and requests too, for that matter) implements the https protocol, and that this is causing problems with Squid -- if you take a look, you'll see that the http request is being redirected with a 301 to https.
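You can watch that 301-to-https behaviour in isolation. Here's a small self-contained sketch (Python 3 standard library; the `/old` and `/new` paths are made up for the demo) that stands in a local redirecting server and shows urlopen silently following the hop:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class RedirectDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            # mimic the http -> https hop with a 301
            self.send_response(301)
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            body = b"final page"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("127.0.0.1", 0), RedirectDemo)
threading.Thread(target=server.serve_forever, daemon=True).start()

response = urlopen("http://127.0.0.1:%d/old" % server.server_port)
final_url = response.geturl()  # the redirect target, not the URL we asked for
body = response.read()
server.shutdown()
```

`geturl()` is the giveaway: it reports the URL you actually ended up at, so if an http request comes back with an https `geturl()`, a redirect happened somewhere along the way.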

But you'll find that using curl or wget from a Bash console works fine. I'll dig into it a little further.

What email address should I send to?

OK, looking into it I think it's probably our proxy's fault. We're running quite an old version of squid, so we'll try upgrading it. Will keep you posted, but it may be a week or so before we can get the fix in...

If you ever need to talk to us by email, use

Thank you Harry!

I'm still confused as to why it would be intermittent then. I say intermittent because when I typed the sample code above, I did not get an error.

New question. Would it be okay with PA if I created a second free account if I only planned to use it for testing of issues like this?

For clarification, when I say second free account I mean second to my first, which is this paid account -- I'm not suggesting that I already have a free account in addition to this one. Wow, I really have a way of making simple things complex. All I really want to know is whether PA objects to a single user having multiple accounts. And if the answer is "yes, we object", then is there an exception for testing issues like this?

Hi a2j -- there's absolutely no problem with you creating as many free accounts as you like. Hey, at least one person has several paying accounts. Not that we're trying to push you in that direction or anything ;-)

Re: you not getting the error and rishimaharaj getting it -- it's because your account is a paid-for one. The way our whitelisting system works is that free users are blocked via iptables from accessing any external sites at all, apart from the proxy server. Paid accounts don't have any iptables blocks, so all of your internet access is direct.
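As a purely hypothetical sketch of that scheme (the proxy host name and rules below are illustrative, not the platform's actual configuration), a free account's outbound traffic might be restricted roughly like this:

```shell
# Illustrative only -- hypothetical host names and rules, not the real config.
# Allow free-account traffic to reach the proxy server...
iptables -A OUTPUT -p tcp -d proxy.internal.example --dport 3128 -j ACCEPT
# ...and reject direct TCP access to everything else.
iptables -A OUTPUT -p tcp -j REJECT
```

Under rules like these, anything the proxy refuses (such as a non-whitelisted domain, or a request method it doesn't support) fails for free accounts, while paid accounts with no such rules go straight out to the internet.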

Yes, so if I had tested with both my a2j and a free account I would have seen the difference.

As for multiple paid accounts: I'm for it... who wants to tell my wife?