Forums

export.pkl file missing essential classes when uploaded to pythonanywhere

Hello, I am trying to use a trained machine learning model in my website. The model comes in the form of a .pkl file located in a directory named 'model'.

In a bash console, I am able to run:

learn = load_learner('/home/AndrewGoldmann/mysite/static/model')

This works.

The problem is that when I run the same line in my website's app.py, I get the following error:

2020-07-07 14:03:45,204: An error occurred during a request.
Traceback (most recent call last):
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/AndrewGoldmann/mysite/app.py", line 182, in wrap
    return f(*args, **kwargs)
  File "/home/AndrewGoldmann/mysite/app.py", line 342, in api_receiveMarkup
    c.learn = load_learner('/home/AndrewGoldmann/mysite/static/model')
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/fastai/basic_train.py", line 622, in load_learner
    state = torch.load(source, map_location='cpu') if defaults.device == torch.device('cpu') else torch.load(source)
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/torch/serialization.py", line 593, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/home/AndrewGoldmann/.virtualenvs/flaskk/lib/python3.8/site-packages/torch/serialization.py", line 773, in _legacy_load
    result = unpickler.load()
AttributeError: Can't get attribute 'LabelSmoothingCrossEntropy' on <module '__main__' (built-in)>

Is there any explanation for why code run in a bash console might produce a different result from the same code run in app.py, given that I am using the same virtual environment for both?

[edit by admin: formatting]

It looks like when the pickle file was created, there was a class called LabelSmoothingCrossEntropy in the module __main__, which is the name Python assigns to the module of the script being run directly (e.g. myscript.py if you're running python3.8 myscript.py). A pickle stores only the class's name and module, not its definition. When the code is running as a website, __main__ is a different module, so of course it doesn't have the same classes inside it, and unpickling fails with exactly this AttributeError.
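You can reproduce the behaviour with plain pickle, no fastai or torch involved. This is just an illustrative sketch: a stand-in class is planted on __main__, pickled, and then removed to simulate a process whose __main__ is a different script:

```python
import pickle
import sys

# Stand-in for a class that was defined at the top level of the
# training script, i.e. in the module named __main__.
class LabelSmoothingCrossEntropy:
    pass

# Pretend the class really lived in __main__ when the pickle was made.
LabelSmoothingCrossEntropy.__module__ = "__main__"
LabelSmoothingCrossEntropy.__qualname__ = "LabelSmoothingCrossEntropy"
setattr(sys.modules["__main__"], "LabelSmoothingCrossEntropy",
        LabelSmoothingCrossEntropy)

# The pickle records only the name "__main__.LabelSmoothingCrossEntropy",
# not the class definition itself.
payload = pickle.dumps(LabelSmoothingCrossEntropy())

# While __main__ still exposes the class, unpickling succeeds:
restored = pickle.loads(payload)

# In the web app a different module is __main__, so the name is missing.
# Simulate that by removing the attribute -- unpickling now fails:
delattr(sys.modules["__main__"], "LabelSmoothingCrossEntropy")
try:
    pickle.loads(payload)
    error_message = None
except AttributeError as exc:
    error_message = str(exc)
```

The AttributeError raised at the end carries the same "Can't get attribute ... on <module '__main__' ...>" message as your traceback.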

Is LabelSmoothingCrossEntropy one of your own classes? Or does it belong to a library that you're using?