
FastAPI could not find model definition when run with uvicorn

I want to host a PyTorch model in a FastAPI backend. When I run the code with python, it works fine: the unpickled model can use the defined class. But when the same file is started with uvicorn, it cannot find the class definition.

The source code looks like this:

import uvicorn
import json
from typing import List
from fastapi import Body, FastAPI
from fastapi.encoders import jsonable_encoder
import requests
from pydantic import BaseModel

#from model_ii import Model_II_b

import dill as pickle
import torch as T
import sys

app = FastAPI()
current_model = 'model_v2b_c2_small_ep15.pkl'
verbose_model = False  # for model v2

class Model_II_b(T.nn.Module):
[...]
@app.post('/function')
def API_call(req_json: dict = Body(...)):
    try:
        # load model...
        model = pickle.load(open('models/' + current_model, 'rb'))
        result = model.dosomething_with(req_json)

        return result

    except Exception as e:
        # re-raise so the full traceback is logged; note that a
        # return statement placed after raise would never execute
        raise e

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)

When I run this with python main.py, it works fine and I am getting results. When I run it with uvicorn main:app and send a request, I get the following error:

AttributeError: Can't get attribute 'Model_II_b' on <module '__mp_main__' from '/opt/webapp/env/bin/uvicorn'>

Both should be using the same Python env, as I run the uvicorn binary from within the env.

I hope someone has an idea what is wrong with my setup or code.

Update Stacktrace:

(model_2) root@machinelearning-01:/opt/apps# uvicorn main:app --env-file /opt/apps/env/pyvenv.cfg --reload
INFO:     Loading environment from '/opt/apps/env/pyvenv.cfg'
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [164777] using statreload
INFO:     Started server process [164779]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:33872 - "POST /ml/v2/predict HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/opt/apps/env/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py", line 385, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/opt/apps/env/lib/python3.6/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/fastapi/applications.py", line 183, in __call__
    await super().__call__(scope, receive, send)  # pragma: no cover
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/applications.py", line 102, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/middleware/errors.py", line 181, in __call__
    raise exc from None
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/middleware/errors.py", line 159, in __call__
    await self.app(scope, receive, _send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/exceptions.py", line 82, in __call__
    raise exc from None
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/exceptions.py", line 71, in __call__
    await self.app(scope, receive, sender)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/routing.py", line 550, in __call__
    await route.handle(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/routing.py", line 227, in handle
    await self.app(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/routing.py", line 41, in app
    response = await func(request)
  File "/opt/apps/env/lib/python3.6/site-packages/fastapi/routing.py", line 197, in app
    dependant=dependant, values=values, is_coroutine=is_coroutine
  File "/opt/apps/env/lib/python3.6/site-packages/fastapi/routing.py", line 149, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/concurrency.py", line 34, in run_in_threadpool
    return await loop.run_in_executor(None, func, *args)
  File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "./main.py", line 155, in API_call
    raise e
  File "./main.py", line 129, in API_call
    model = pickle.load(open('models/' + current_model, 'rb'))
  File "/opt/apps/env/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/opt/apps/env/lib/python3.6/site-packages/dill/_dill.py", line 473, in load
    obj = StockUnpickler.load(self)
  File "/opt/apps/env/lib/python3.6/site-packages/dill/_dill.py", line 463, in find_class
    return StockUnpickler.find_class(self, module, name)
AttributeError: Can't get attribute 'Model_II_b' on <module '__mp_main__' from '/opt/apps/env/bin/uvicorn'>

With the help of @lsabi I found the solution here: https://stackoverflow.com/a/51397373/13947506

With a custom unpickler, my problem was solved:

class CustomUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # resolve the model class manually instead of trusting the
        # module path stored in the pickle
        if name == 'Model_II_b':
            from model_ii_b import Model_II_b
            return Model_II_b
        return super().find_class(module, name)

current_model = 'model_v2b_c2_small_ep24.pkl'

model = CustomUnpickler(open('models/' + current_model, 'rb')).load()
