TorchServe: MODEL_LOG – Fatal Python error: Py_Initialize: Unable to get the locale encoding

I am serving my PyTorch model with TorchServe in a Docker container.

I am using the image pytorch/torchserve:0.5.0-cpu from Docker Hub. When the workers try to load the model, they fail with Fatal Python error: Py_Initialize: Unable to get the locale encoding.

Strangely, it also cannot find torch, even though it is already installed in the image.
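
As a sanity check on whether torch is importable by the interpreter on PATH inside the container, I run python3 directly (my-torchserve is just a placeholder for whatever tag my local build gets):

# check the interpreter that is on PATH in the base image
docker run --rm --entrypoint python3 pytorch/torchserve:0.5.0-cpu -c "import torch; print(torch.__version__)"
# same check against the image built from the Dockerfile below
docker run --rm --entrypoint python3 my-torchserve -c "import torch; print(torch.__version__)"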

2021-11-26 06:50:07,113 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: ponzi, count: 12
2021-11-26 06:50:07,302 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2021-11-26 06:50:07,328 [INFO ] W-9007-ponzi_0.1-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9007-ponzi_0.1-stdout
2021-11-26 06:50:07,303 [WARN ] W-9011-ponzi_0.1-stderr MODEL_LOG - Fatal Python error: Py_Initialize: Unable to get the locale encoding
2021-11-26 06:50:07,334 [INFO ] W-9000-ponzi_0.1-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-ponzi_0.1-stdout
2021-11-26 06:50:07,337 [WARN ] W-9007-ponzi_0.1 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9007-ponzi_0.1-stderr

2021-11-26 06:50:07,337 [WARN ] W-9002-ponzi_0.1-stderr MODEL_LOG - Traceback (most recent call last):
2021-11-26 06:50:07,338 [WARN ] W-9002-ponzi_0.1-stderr MODEL_LOG -   File "/home/model-server/tmp/models/0b8b97efbc714b16a9df197319201e15/encodings.py", line 1, in <module>
2021-11-26 06:50:07,338 [WARN ] W-9002-ponzi_0.1-stderr MODEL_LOG - ModuleNotFoundError: No module named 'torch'

The Dockerfile is as follows.

FROM pytorch/torchserve:0.5.0-cpu

USER root
RUN printf "\nservice_envelope=json" >> /home/model-server/config.properties \
    && pip install pytorch-lightning torchmetrics google-cloud-storage scikit-learn
USER model-server


COPY src /home/model-server/src
COPY load_model.py /home/model-server/src
COPY configs /home/model-server/configs
COPY /models/state_dict.pt /home/model-server/src/
RUN mkdir "model_store"

RUN torch-model-archiver \
    --model-name ponzi \
    --version 0.1 \
    --model-file /home/model-server/src/model.py \
    --serialized-file /home/model-server/src/state_dict.pt \
    --handler /home/model-server/src/handler.py \
    --export-path /home/model-server/model_store \
    --extra-files /home/model-server/configs/features.yml,/home/model-server/configs/model_configs.yml,/home/model-server/src/utils.py,/home/model-server/src/encodings.py,/home/model-server/src/dataset.py,/home/model-server/src/__init__.py

CMD ["torchserve", 
     "--start", 
     "--ncs", 
     "--model-store model_store", 
     "--ts-config=/home/model-server/config.properties", 
     "--models ponzi=ponzi.mar"]

Any idea what is wrong here? I appreciate your help.
