virtualenv not working with scrapy for django while using python script


I was trying to use a virtualenv with Scrapy for Django, but somehow the virtualenv does not get activated. In my settings file, I added the following to activate my virtualenv:

activate_this = '/home/astitwlathe/.virtualenvs/dscrapy27/bin/activate_this.py'
execfile(activate_this, {'__file__': activate_this})

This should normally activate the virtualenv, but it did not. When I then imported django and checked its version, the version shown was not the one installed in the virtualenv.
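One quick way to see which environment (and hence which Django) a script is actually picking up is to inspect the interpreter prefix and a module's import location. This is a minimal sketch, not from the thread, written for Python 3 and using the stdlib "json" module as a stand-in for "django":

```python
import sys
import importlib.util

# sys.prefix points at the active environment's root: inside a virtualenv
# it is the virtualenv directory rather than the system Python prefix.
print(sys.prefix)

# find_spec shows where a module would be imported from; substitute
# "django" for "json" to see which installation your script resolves.
spec = importlib.util.find_spec("json")
print(spec.origin)
```

If the printed path lives under the system site-packages rather than under ~/.virtualenvs/..., the virtualenv was not active when the import happened.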

Hi there,

We have a new way of supporting virtualenvs. If you look on the web tab you'll find a new field in which you can enter a virtualenv path. There's also a link to the docs.

(the old version should still work, but the new way is better in terms of avoiding shadowing issues...)


Thanks for your reply. Actually, I was trying to activate the virtualenv from Scrapy. I have tried two ways: one of them was through execfile, and the other was the one you pointed out. However, even when running Scrapy with the virtualenv active, it does not work. The Scrapy I use is not installed in the virtualenv; it is installed in the normal directory. I am accessing a Django that is installed in the virtualenv. Do you think this is the problem?

I just checked by adding Scrapy to the virtualenv; it then also picks up PythonAnywhere's Django version.

When you tried the web tab virtualenv version, did you remove the execfile/activate_this from your wsgi file?

Hi, it has started working. I had to restart the shell for it to work.

Yes, I did remove that part. Thanks for your help. Just wanted to ask: should the virtualenv have copies of all the libraries I need, or can I also use the standard (PythonAnywhere's) libraries? If I install all the libraries (some are duplicates), extra space is wasted unnecessarily. Just wanted to know your opinion.


The best thing is probably to install duplicate copies of the libraries; then you can be sure you've got the versions you want.
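To check that the virtualenv really does supply the version you want, you can query the installed distribution metadata. A small sketch, not from the thread, written for Python 3.8+; the distribution name "Django" is just an example, substitute whichever library you care about:

```python
from importlib import metadata

def installed_version(dist_name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

# Prints e.g. "4.2.1" if Django is installed in this environment,
# or None if it is not.
print(installed_version("Django"))
```

Run this once inside the virtualenv and once outside it to confirm which copy each environment resolves.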

OK, sure, I will do that from now on...