Safely getting env vars into a Docker image to run testing stage

  continuous-integration, devspace, docker, dockerfile

My Dockerfile pretty closely resembles this one:

# Dockerfile
# Uses multi-stage builds requiring Docker 17.05 or higher
# See https://docs.docker.com/develop/develop-images/multistage-build/

# Creating a python base with shared environment variables
FROM python:3.8.1-slim as python-base
ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1 \
    PIP_NO_CACHE_DIR=off \
    PIP_DISABLE_PIP_VERSION_CHECK=on \
    PIP_DEFAULT_TIMEOUT=100 \
    POETRY_HOME="/opt/poetry" \
    POETRY_VIRTUALENVS_IN_PROJECT=true \
    POETRY_NO_INTERACTION=1 \
    PYSETUP_PATH="/opt/pysetup" \
    VENV_PATH="/opt/pysetup/.venv"

ENV PATH="$POETRY_HOME/bin:$VENV_PATH/bin:$PATH"


# builder-base is used to build dependencies
FROM python-base as builder-base
RUN apt-get update \
    && apt-get install --no-install-recommends -y \
        curl \
        build-essential

# Install Poetry - respects $POETRY_VERSION & $POETRY_HOME
ENV POETRY_VERSION=1.0.5
RUN curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | python

# We copy our Python requirements here to cache them
# and install only runtime deps using poetry
WORKDIR $PYSETUP_PATH
COPY ./poetry.lock ./pyproject.toml ./
RUN poetry install --no-dev  # respects $POETRY_VIRTUALENVS_IN_PROJECT


# 'development' stage installs all dev deps and can be used to develop code.
# For example using docker-compose to mount local volume under /app
FROM python-base as development
ENV FASTAPI_ENV=development

# Copying poetry and venv into image
COPY --from=builder-base $POETRY_HOME $POETRY_HOME
COPY --from=builder-base $PYSETUP_PATH $PYSETUP_PATH

# Copying in our entrypoint
COPY ./docker/docker-entrypoint.sh /docker-entrypoint.sh
RUN chmod +x /docker-entrypoint.sh

# The venv already has the runtime deps installed, so this install
# is quick: only the dev deps are added
WORKDIR $PYSETUP_PATH
RUN poetry install

WORKDIR /app
COPY . .

EXPOSE 8000
ENTRYPOINT /docker-entrypoint.sh $0 $@
CMD ["uvicorn", "--reload", "--host=0.0.0.0", "--port=8000", "main:app"]



# 'test' stage runs our unit tests with pytest and
# coverage.  Build will fail if test coverage is under 95%
FROM development AS test
RUN coverage run --rcfile ./pyproject.toml -m pytest ./tests
RUN coverage report --fail-under 95


# 'production' stage uses the clean 'python-base' stage and copies
# in only our runtime deps that were installed in the 'builder-base'
FROM python-base as production
ENV FASTAPI_ENV=production

COPY --from=builder-base $VENV_PATH $VENV_PATH
COPY ./docker/gunicorn_conf.py /gunicorn_conf.py

COPY ./docker/docker-entrypoint.sh /docker-entrypoint.sh
RUN chmod +x /docker-entrypoint.sh

COPY ./app /app
WORKDIR /app

ENTRYPOINT /docker-entrypoint.sh $0 $@
CMD ["gunicorn", "--worker-class", "uvicorn.workers.UvicornWorker", "--config", "/gunicorn_conf.py", "main:app"]
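The docker/docker-entrypoint.sh referenced above isn't shown in the post. For context, a minimal sketch of what such an entrypoint typically looks like (this is an assumption, not the author's actual script):

```shell
#!/bin/sh
# Hypothetical sketch of docker/docker-entrypoint.sh.
# A typical entrypoint performs any one-time setup, then execs the
# command passed in (the CMD), so the server replaces the shell as
# PID 1 and receives signals directly.
set -e

# (setup steps, e.g. waiting for the database, would go here)

exec "$@"
```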

I’m using devspace to deploy to local development containers, and it prompts me for the env vars (database connection strings, etc.) when I run devspace dev.

My development stage runs just fine, and the env vars are evidently making it into the container the image runs in.

The problem is the test stage. The build fails, with the app itself citing missing env vars that it requires in order to run.

For starters, I need to better understand why the development stage runs fine while test doesn’t.

  • If I’m not mistaken, CMD [...] runs when a container using the image is started. If so, devspace is adding the env vars to the container the image is deployed into, so they exist at runtime.
  • RUN, on the other hand, is saying, "Run this command right now," at build time, and, as you can see, no env vars are being provided there.
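That build-time vs. run-time split can be confirmed with a tiny standalone Dockerfile (stage and variable names here are hypothetical, not from the post). RUN only sees values baked in via ENV or passed as ARG at build time; values injected into a running container are invisible during the build:

```dockerfile
# Hypothetical illustration -- names are made up for the demo.
FROM python:3.8.1-slim as demo

# Only populated if passed at build time:
#   docker build --build-arg DATABASE_URL=... .
ARG DATABASE_URL

# RUN executes during the build; it sees only ARG/ENV values,
# never the env vars devspace injects into a running container.
RUN echo "build-time value: '$DATABASE_URL'"

# CMD runs later, inside the container, where injected vars do exist.
CMD ["sh", "-c", "echo run-time value: $DATABASE_URL"]
```

One caution: build args can be recovered from docker history, so plain ARG is fine for non-sensitive toggles but not for real secrets.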

So my question is: How should I safely get the env vars into the test stage?

Although I’m just playing with devspace right now, this will also be an issue in my CI pipeline.
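One common approach (not from the post, so treat it as a suggestion) is BuildKit build secrets: the value is mounted only for the duration of a single RUN instruction and is never written into an image layer or the build cache. A sketch adapted to the test stage above, with a hypothetical secret id:

```dockerfile
# syntax=docker/dockerfile:1
# Hypothetical adaptation of the 'test' stage using BuildKit secrets.
FROM development AS test
# The secret is mounted at /run/secrets/<id> for this RUN only;
# it never lands in an image layer, unlike ARG or ENV.
RUN --mount=type=secret,id=database_url \
    DATABASE_URL="$(cat /run/secrets/database_url)" \
    coverage run --rcfile ./pyproject.toml -m pytest ./tests
RUN coverage report --fail-under 95
```

Built with something like: DOCKER_BUILDKIT=1 docker build --secret id=database_url,src=.env.db --target test . (the src filename here is made up; point it at wherever the value lives in your CI).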

Source: Docker Questions
