Dockerizing a legacy Python application

I have an application that I’m trying to expose via docker-compose. I’ve figured out all of the services/volumes the application needs except for one component, an awful bit of legacy code that I don’t currently have the time or resources to update.

Basically, it’s a Python app that isn’t pip-installable; instead it gets installed via a git clone, and its setup involves running some ad hoc shell and Python code. To make matters worse, the logs/config files are stored in a subdirectory of the source tree (don’t ask…).

Anyway, I’m wondering whether the solution I’ve come up with (below) sounds reasonable, or whether anyone has a better suggestion.

My (ad hoc) solution: install the app into a Docker data volume, then define a service whose instances each simply run the app’s Python executable against that volume, with parameters specifying which config files (also on the volume) to use.
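
To illustrate the idea, each instance would just be its own service entry sharing the same volume and pointing at a different config file. These entries would sit under services: in the compose file; the base image, mount path, entry-point script, and config file names below are all placeholders, not the real ones:

  tutor_1:
    image: python:2.7               # whichever Python the legacy app actually needs
    volumes:
      - tutor_content:/opt/tutor    # the data volume holding the installed source + config
    working_dir: /opt/tutor
    command: python run_tutor.py --config config/instance1.conf

  tutor_2:
    image: python:2.7
    volumes:
      - tutor_content:/opt/tutor
    working_dir: /opt/tutor
    command: python run_tutor.py --config config/instance2.conf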

EDIT: this is what I’m currently thinking for the docker-compose.yml:

version: '3'

services:
  db:
    image: mysql # or probably a more specific mysql image
    volumes:
      # mount the db_data volume here (e.g. db_data:/var/lib/mysql)
    # probably some config as well

  msg_queue:
    image: redis
    # probably more config necessary  

  myapp:
    image:  # some custom image I create elsewhere which exposes my Django app via uWSGI
    volumes:  # mount the volumes it requires (config, logs)

  web: # for serving static/media files, reverse-proxy, load balancing 
    image: nginx
    # again more config necessary

  tutor: # the legacy chatbot code, which communicates with myapp via msg_queue and also makes use of db
    image: # custom image I create elsewhere
    volumes: # config necessary for tutor_content

volumes:
  logs:
  web_content:
  tutor_content:
  db_data:
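
One piece not shown above is how the legacy code actually gets onto the tutor_content volume in the first place. I’m imagining a one-off installer service along these lines (the repo URL, mount path, and setup script names are placeholders, and it doesn’t guard against re-running onto a non-empty volume):

  installer:
    image: python:2.7    # or whatever base image has the right Python plus git
    volumes:
      - tutor_content:/opt/tutor
    # clone the legacy source onto the shared volume, then run its ad hoc setup in place
    command: >
      sh -c "git clone https://example.com/legacy/tutor.git /opt/tutor
             && cd /opt/tutor
             && sh install.sh
             && python post_install.py"

The plan would be to run that once with docker-compose run installer rather than leaving it in the normal up cycle.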

I figure I need to add some depends_on entries (roughly as sketched below) and a bit more configuration, but hopefully this shows what I’m attempting to do. Does that make my question clear enough?
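
For the depends_on part, I assume it’s just the short form for startup ordering (no health checks), roughly:

  myapp:
    depends_on:
      - db
      - msg_queue

  tutor:
    depends_on:
      - db
      - msg_queue

  web:
    depends_on:
      - myapp

Plus whatever wait-for-it logic the services themselves need, since depends_on only orders container startup and doesn’t wait for readiness.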

Any thoughts would be appreciated.

Source: StackOverflow