I am creating a FastAPI server with simple CRUD functionality, with PostgreSQL as the database. Everything works well in my local environment. However, when I tried to run it in containers using docker-compose up, it failed with this error:

    rest_api_1 | File "/usr/local/lib/python3.8/site-packages/psycopg2/__init__.py", line 122, in connect
    rest_api_1 |     conn = _connect(dsn, connection_factory=connection_factory, ..
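A frequent cause of this traceback is the API container trying to reach Postgres on localhost, or starting before the database accepts connections. A minimal sketch, assuming the compose service is named db and using placeholder postgres/postgres credentials:

    import time
    from sqlalchemy import create_engine, text

    # Inside the compose network the hostname is the service name ("db"),
    # not localhost; credentials and database name are placeholders.
    DATABASE_URL = "postgresql+psycopg2://postgres:postgres@db:5432/postgres"
    engine = create_engine(DATABASE_URL)

    # Retry because the API container usually starts before Postgres is ready;
    # depends_on only orders startup, it does not wait for readiness.
    for attempt in range(10):
        try:
            with engine.connect() as conn:
                conn.execute(text("SELECT 1"))
            break
        except Exception:
            time.sleep(2)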
Some background: I have a relatively simple docker-compose setup with a Python-based backend service running FastAPI and SQLAlchemy, which connects to a Postgres container that holds a decent (but not crazy) amount of data. All of the queries I run from psql directly inside the Postgres container are as fast as expected (<1s ..
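To tell whether the time goes to Postgres itself or to transfer/ORM overhead in the app container, per-statement timing can be logged with SQLAlchemy's event hooks. A sketch with a placeholder connection URL:

    import logging
    import time
    from sqlalchemy import create_engine, event

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("query_timing")

    # Placeholder URL; inside compose the host is the postgres service name.
    engine = create_engine("postgresql+psycopg2://user:pass@db:5432/mydb")

    @event.listens_for(engine, "before_cursor_execute")
    def before_exec(conn, cursor, statement, parameters, context, executemany):
        conn.info.setdefault("query_start", []).append(time.perf_counter())

    @event.listens_for(engine, "after_cursor_execute")
    def after_exec(conn, cursor, statement, parameters, context, executemany):
        elapsed = time.perf_counter() - conn.info["query_start"].pop()
        log.info("%.3fs %s", elapsed, statement[:80])

If these timings match the psql numbers, the slowdown is in result processing on the Python side rather than in the database itself.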
I have a Flask application using SQLAlchemy that I want to connect to MySQL. I am running a Flask container that should connect to a MySQL container, and I am using docker-compose for this. Below are the error message, my docker-compose file, and my database connection file.

    version: '3'
    services:
      flask: ..
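In this kind of setup the usual fix is to point the URI at the MySQL service's compose name instead of localhost. A minimal sketch, assuming the service is named mysql, with placeholder credentials:

    from flask import Flask
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)

    # "mysql" must match the service name in docker-compose.yml; user,
    # password, and database name are placeholders.
    app.config["SQLALCHEMY_DATABASE_URI"] = (
        "mysql+pymysql://user:password@mysql:3306/mydb"
    )
    db = SQLAlchemy(app)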
Whenever I deploy the application using Docker, it reinitializes the database with empty tables as per my model packages. I want to reinitialize only if the tables are not already present in the database. ..
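With SQLAlchemy this behavior is largely built in: metadata.create_all() skips tables that already exist, and the inspector can gate initialization explicitly. A sketch, where myapp.models.Base and the "users" table are hypothetical names:

    from sqlalchemy import create_engine, inspect
    from myapp.models import Base  # hypothetical declarative Base

    engine = create_engine("postgresql+psycopg2://user:pass@db:5432/mydb")

    # create_all only creates missing tables (checkfirst defaults to True)
    # and never drops existing data.
    Base.metadata.create_all(engine, checkfirst=True)

    # Or gate the whole initialization on a sentinel table:
    if not inspect(engine).has_table("users"):
        Base.metadata.create_all(engine)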
Hi everyone. I'm using Python JupyterLab inside a Docker container, which runs on an AWS EC2 instance. This Docker container has the Oracle Instant Client installed, so everything is set up. The problem is that I'm still having trouble connecting from this container to my AWS RDS Oracle database, but only when using SQLAlchemy. ..
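One pattern that tends to work for RDS Oracle is building the DSN explicitly with cx_Oracle.makedsn() and passing it into the SQLAlchemy URL. A sketch in which the endpoint, port, service name, and credentials are all placeholders taken from the RDS console:

    import cx_Oracle
    from sqlalchemy import create_engine, text

    # makedsn builds the full connection descriptor, avoiding URL-parsing
    # surprises with long RDS hostnames.
    dsn = cx_Oracle.makedsn(
        "mydb.xxxxxxxx.us-east-1.rds.amazonaws.com", 1521, service_name="ORCL"
    )
    engine = create_engine(f"oracle+cx_oracle://admin:secret@{dsn}")

    with engine.connect() as conn:
        print(conn.execute(text("SELECT 1 FROM dual")).scalar())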
The following code is running inside a Docker container. I have a connection specified as follows, which is working, as I could run a simple query with it:

    from sqlalchemy import create_engine

    engine = create_engine("mssql+pyodbc://username:[email protected]?driver_name")
    con_xpf = engine.connect()
    con_xpf.execute("use db_name;")

After that I create a sqlite3 DB and connect to it:

    DBNAME = "data/NEWDB.db" ..
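If the goal is to copy query results from the MSSQL server into the new sqlite3 file, one hedged sketch is to stream chunks through pandas; the table name is a placeholder, and the driver parameter below shows the URL-encoded form the pyodbc dialect expects:

    import sqlite3
    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder credentials/host; note the driver name is URL-encoded.
    engine = create_engine(
        "mssql+pyodbc://username:password@server:1433/db_name"
        "?driver=ODBC+Driver+17+for+SQL+Server"
    )

    DBNAME = "data/NEWDB.db"
    lite = sqlite3.connect(DBNAME)

    # Stream in chunks so a large result set does not exhaust memory.
    for chunk in pd.read_sql("SELECT * FROM some_table", engine, chunksize=50_000):
        chunk.to_sql("some_table", lite, if_exists="append", index=False)
    lite.close()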
I am trying to create_engine() with a schema that I know for sure works with SQLite and MySQL. I have created a Docker image of PostgreSQL with the following docker-compose.yml:

    services:
      db:
        image: postgres
        restart: always
        environment:
          POSTGRES_USER: test
          POSTGRES_PASSWORD: test
          POSTGRES_DB: test
        ports:
          - "5325:5432"
        expose:
          - 5325

And I am trying to connect ..
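With that mapping, the right port depends on where the client runs: from the host, the published port 5325; from another container on the compose network, the service name and the container port 5432. The expose: 5325 entry has no effect, since Postgres listens on 5432 inside the container. A sketch:

    from sqlalchemy import create_engine, text

    # From the host: 5325 is published and maps to the container's 5432.
    host_engine = create_engine("postgresql://test:test@localhost:5325/test")

    # From another container on the same compose network: service name + 5432.
    container_engine = create_engine("postgresql://test:test@db:5432/test")

    with host_engine.connect() as conn:
        print(conn.execute(text("SELECT version()")).scalar())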
I am trying to dockerize powerdns-admin, and it needs to connect to my MariaDB via SQLAlchemy. The docker-compose.yml:

    version: "3"
    services:
      app:
        image: ngoduykhanh/powerdns-admin:latest
        container_name: powerdns_admin
        ports:
          - "9191:80"
        logging:
          driver: json-file
          options:
            max-size: 50m
        environment:
          - SQLALCHEMY_DATABASE_URI=mysql://root:[email protected]/powerdns
          - GUNICORN_TIMEOUT=60
          - GUNICORN_WORKERS=2
          - GUNICORN_LOGLEVEL=DEBUG
          - OFFLINE_MODE=False # True for offline, False for external resources
    ..
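Two things worth checking here: a plain mysql:// URI makes SQLAlchemy load mysqlclient, so if the image only bundles PyMySQL the URI needs the mysql+pymysql:// prefix (an assumption about the image worth verifying), and the database host must actually be reachable from inside the container. A sketch of a connectivity probe, with placeholder host and password standing in for the redacted values above:

    from sqlalchemy import create_engine, text

    # Same URI shape the container receives; host and password are placeholders.
    uri = "mysql+pymysql://root:password@192.168.1.10/powerdns"
    engine = create_engine(uri)

    with engine.connect() as conn:
        print(conn.execute(text("SELECT 1")).scalar())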
I am trying to connect to an external MySQL database with a public IP from a Docker container, with no success. The script I am trying to execute is Python, using SQLAlchemy and PyMySQL. The script works correctly if I execute it directly, but it fails when I run it from the Docker container. ..
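A quick way to see why it fails from inside the container is a bare PyMySQL connection with a short timeout, so network or firewall problems surface immediately instead of hanging. Host and credentials below are placeholders:

    import pymysql

    try:
        conn = pymysql.connect(
            host="203.0.113.10",   # the database's public IP (placeholder)
            user="user",
            password="secret",
            database="mydb",
            connect_timeout=5,
        )
        print("connected")
        conn.close()
    except pymysql.err.OperationalError as exc:
        # Error 2003 ("Can't connect") usually means the container cannot
        # reach the host/port, e.g. MySQL's bind-address or a firewall rule.
        print("failed:", exc)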
I find the workflow for working with database migrations in a containerized environment confusing. I have a web API with an attached database. The API runs in one container and the database in another. The project file structure is as follows:

    .
    ├── docker-compose.yml
    ├── Dockerfile
    └── app
    |   ├── __init__.py
    |   ├── database
    |   ..
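One common workflow is to run migrations from the API container on startup, against the database container, before the web server starts. A sketch using Alembic's programmatic API, assuming an alembic.ini at the project root with the database URL injected by docker-compose:

    from alembic import command
    from alembic.config import Config

    # Apply all pending migrations, then let the API start; running this in
    # the container entrypoint keeps schema and code in the same deploy step.
    cfg = Config("alembic.ini")
    command.upgrade(cfg, "head")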