I’m developing an app that live-streams video and/or audio from different entities. Those entities’ IDs and configurations are stored as records in my DB. My app’s current architecture looks like the following:
- a CRUD API endpoint for system-wide functionality, such as logging in or editing an entity’s configuration.
- N other endpoints, one per entity, where each endpoint’s route is derived from that entity’s ID (e.g. "/:id/api/"). Each entity is loaded by the app on initialization, and each of these endpoints acts both as a REST API handler and as a WebSocket server that live-streams media received from the backend configured for that entity.
On top of that, there’s an NGINX instance that acts as a reverse proxy and serves our client files.
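To make the current setup concrete, here’s roughly how the per-entity endpoints get built at startup (a minimal Python sketch; `load_entities` and the `stream` suffix are stand-ins, not my real code):

```python
# Sketch of the current single-process design: every entity record in
# the DB becomes its own "/:id/api/" route group when the app starts.
# load_entities() is a hypothetical stand-in for the real DB query.

def load_entities():
    # pretend DB records: id + per-entity stream configuration
    return [
        {"id": "cam-1", "config": {"media": "video"}},
        {"id": "mic-7", "config": {"media": "audio"}},
    ]

def build_route_table(entities):
    """Map each entity to its REST route and its WebSocket route."""
    routes = {}
    for ent in entities:
        base = f"/{ent['id']}/api/"
        routes[base] = "rest"           # per-entity REST handler
        routes[base + "stream"] = "ws"  # live media over WebSocket
    return routes

routes = build_route_table(load_entities())
# Adding or removing an entity means rebuilding this table,
# which is why the server currently needs a restart.
```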
Obviously, this isn’t very scalable at the moment (a single server instance handles an ever-growing number of entities), and it requires restarting the server whenever an entity is added or deleted, which isn’t ideal. I was thinking of splitting my app’s server into microservices: one for the system-wide CRUD, and N others, one per entity defined in my DB. Ultimately, I’d like those microservices to run as Docker containers. The problems (or questions I don’t know the answers to) I’m facing at the moment are:
- How does one run Docker containers dynamically, driven by a DB (i.e. programmatically)? Is that even possible?
- How does one update a running Docker container so that its entity can be reconfigured at run-time?
- How would one even configure NGINX to proxy those dynamic microservices? I’m guessing I’ll need something like Consul?
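For the NGINX question: before reaching for Consul, one pattern I’ve come across is a single location block that captures the entity ID from the URL and resolves container names through Docker’s embedded DNS, so newly started containers are picked up without reloading NGINX. A hypothetical fragment (the `entity-<id>` naming scheme, the port, and NGINX sharing a Docker network with the containers are all assumptions):

```nginx
# Assumes NGINX runs on the same Docker network as the entity
# containers, which are named "entity-<id>" and listen on port 8080.
resolver 127.0.0.11 valid=10s;  # Docker's embedded DNS

location ~ ^/(?<entity>[^/]+)/api/ {
    # Using a variable in proxy_pass forces a runtime DNS lookup,
    # so new containers are reachable without an NGINX reload.
    proxy_pass http://entity-$entity:8080;

    # WebSocket upgrade for the live-stream endpoints
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```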
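For the first two questions, here is the kind of thing I’m imagining with Docker’s Python SDK (docker-py). The image name, env vars, and the `container_spec` helper are all hypothetical; the point is that containers are created, and recreated on config changes, from DB records at run-time rather than at deploy-time:

```python
def container_spec(entity):
    """Translate one entity DB record into docker-py run() kwargs.

    Image name, env-var names, and labels are hypothetical.
    """
    return {
        "image": "myapp/entity-service:latest",
        "name": f"entity-{entity['id']}",
        "environment": {
            "ENTITY_ID": entity["id"],
            "MEDIA": entity["config"]["media"],
        },
        "labels": {"app": "myapp", "entity": entity["id"]},
        "detach": True,
    }

def sync_entity(entity):
    """Start the entity's container; recreate it if one already exists.

    Containers are effectively immutable once created, so "reconfiguring"
    an entity usually means: stop and remove the old container, then run
    a new one from the updated DB record.
    """
    import docker  # requires the docker-py SDK and a reachable daemon
    client = docker.from_env()
    spec = container_spec(entity)
    try:
        old = client.containers.get(spec["name"])
        old.stop()
        old.remove()
    except docker.errors.NotFound:
        pass  # first deployment of this entity
    return client.containers.run(**spec)
```

Deleting an entity would map to the same `stop()`/`remove()` pair, and docker-py mirrors the CLI (`docker run`/`docker stop`/`docker rm`), so the same flow could be driven from a shell script instead.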
I’m not very knowledgeable, so pardon me if I’m being naive in thinking I can achieve such an architecture. Also, if you can think of a better architecture, I’d love to hear your suggestions.