No longer able to connect over HTTP

I have an app that scrapes info from a website and emails it. People can go to my site to sign up. Everything was working fine, but now I get this when it attempts to reach the website to scrape info from:

requests.exceptions.SSLError: bad handshake: Error([('SSL routines', 'SSL3_GET_SERVER_CERTIFICATE', 'certificate verify failed')],)

I'm using Python 2.7.6. It all worked when I was using the PythonAnywhere default URL for the sign-up portion. I believe I started using my own domain (via GoDaddy) and the "PythonAnywhere HTTPS automatically renewing certificate" around the time this problem showed up. I tried verify=False, but that did not help. I also tried changing http to https in the request. That didn't change anything either.

requests.get("https://<website-to-scrape>", verify=False)
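For reference, here is a minimal stdlib sketch of what verify=False is asking for under the hood (my understanding of the general mechanism, not anything PythonAnywhere-specific): requests normally verifies the server's certificate chain and hostname, and verify=False tells it to skip both checks.

```python
import ssl

# What requests does by default: build a context that verifies the
# server's certificate chain and checks the hostname.
strict = ssl.create_default_context()
assert strict.verify_mode == ssl.CERT_REQUIRED
assert strict.check_hostname is True

# Roughly what verify=False asks for: no hostname check, no
# certificate validation. Fine for a quick debugging test, but it
# removes protection against man-in-the-middle attacks, so it
# shouldn't stay in production code.
lax = ssl.create_default_context()
lax.check_hostname = False  # must be disabled before verify_mode
lax.verify_mode = ssl.CERT_NONE
```

Note that check_hostname has to be switched off before setting verify_mode to CERT_NONE, or the ssl module raises a ValueError.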

Any help would be greatly appreciated. Thank you.

I don't understand what you mean by the PythonAnywhere default URL for the sign-up portion. Do you mean that you used to use one site, and now you have started using a different one?

In particular, is this a site you are hosting, or a site that you are scraping?

Hi Conrad. The site I am hosting (which allows people to sign up to receive emails) used to be on the PythonAnywhere default URL but is now on my own domain.

Part of my app scrapes a completely different website. That part is no longer working. I don't know if it's related to the domain change.

that should not be related to the domain change. what happens if you try to do the requests.get("https://<website-to-scrape>", verify=False) from your local machine?

I don't know what happened, but now it works fine. I even removed the verify=False and that works perfectly, too. So I am back to my original code - which hadn't worked for the last few days, but now works. Any idea what could cause this? Thanks for your help Conrad!

Possibly the site you were trying to scrape had something wrong with their SSL config for a while? Not sure.