How do I communicate with a process and get files out of a container (like Docker) on a cloud server?

  cloud, containers, deep-learning, docker, pytorch

I have a Python machine learning script for which I need special hardware (an 8-GPU machine or Tensor Processing Units). Therefore I run the code on a cloud server.

Using containers, as known from Docker, looks like a very convenient way to execute the deep learning PyTorch code. But there will likely be errors. How do I communicate with the Docker process (console output), and how do I get the resulting files out of the container? Do I need to upload them over the network to a file server?
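A minimal sketch of both parts, assuming a hypothetical container named `train-job`, a hypothetical image `my-training-image`, and an assumed output path `/app/output` inside the container:

```shell
# Follow the console output (stdout/stderr) of the container's main process:
docker logs -f train-job

# Copy a result file out of the container (works on running or stopped
# containers; the in-container path /app/output/model.pt is an assumption):
docker cp train-job:/app/output/model.pt ./model.pt

# Alternatively, bind-mount a host directory when starting the container,
# so result files land directly on the host with no copy step:
docker run --name train-job -v "$PWD/output:/app/output" my-training-image
```

With the bind-mount approach, no upload to a separate file server is needed as long as you can reach the cloud server itself (e.g. via `scp` or `rsync` afterwards).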

Another question is how a Docker process accesses the hardware of the machine.
I need a CUDA interface for Docker and a cloud machine with the appropriate hardware for deep learning applications (TensorFlow, PyTorch, …).
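GPU access is usually handled by the NVIDIA Container Toolkit together with Docker's `--gpus` flag; the image tag below is an assumption and would be swapped for whatever CUDA-enabled image you use:

```shell
# Prerequisites on the host: NVIDIA driver + NVIDIA Container Toolkit
# (package name: nvidia-container-toolkit). Many cloud "deep learning"
# VM images ship with both preinstalled.

# Expose all GPUs to the container and check that PyTorch sees them:
docker run --rm --gpus all pytorch/pytorch \
    python -c "import torch; print(torch.cuda.is_available())"

# Restrict the container to specific GPUs on a multi-GPU machine:
docker run --rm --gpus '"device=0,1"' pytorch/pytorch nvidia-smi
```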

Are there best practices for running deep learning code in containers on cloud services?
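One common pattern is to bake the code and dependencies into the image and keep all results in one well-known directory that callers bind-mount. A minimal sketch of such a Dockerfile, where the base image tag, `requirements.txt`, and `train.py` are assumptions:

```dockerfile
# Base tag is an assumption; pick one matching your host's CUDA driver.
FROM pytorch/pytorch:2.2.0-cuda12.1-cudnn8-runtime

WORKDIR /app

# Install dependencies first so this layer is cached between code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY train.py .

# Write all results into /app/output, which the caller bind-mounts
# to a host directory at run time (docker run -v "$PWD/output:/app/output").
CMD ["python", "train.py", "--output-dir", "/app/output"]
```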

Thanks for your help and advice.

Source: Docker Questions