I am currently working on deploying an R application via a CI/CD pipeline. Right now I have a DockerfileBase, which builds into an image containing all the R libraries needed for the project. Now I want to integrate functionality that detects when a library is added to the DockerfileBase and ..
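One common way to trigger a rebuild only when the base Dockerfile changes is a change-based rule in the pipeline config. A minimal sketch, assuming GitLab CI and a registry reachable via the predefined `CI_REGISTRY_IMAGE` variable (the job name and tags are placeholders):

```yaml
# .gitlab-ci.yml — hypothetical job; adjust image names and stages to your setup
build-base:
  stage: build
  script:
    - docker build -f DockerfileBase -t $CI_REGISTRY_IMAGE/base:latest .
    - docker push $CI_REGISTRY_IMAGE/base:latest
  rules:
    # run this job only when DockerfileBase changed in the pushed commits
    - changes:
        - DockerfileBase
```

Downstream jobs can then simply use `image: $CI_REGISTRY_IMAGE/base:latest`, picking up the refreshed base on their next run.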
My Dockerfile pretty closely resembles this one:

```dockerfile
# Dockerfile
# Uses multi-stage builds requiring Docker 17.05 or higher
# See https://docs.docker.com/develop/develop-images/multistage-build/

# Creating a python base with shared environment variables
FROM python:3.8.1-slim as python-base
ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1 \
    PIP_NO_CACHE_DIR=off \
    PIP_DISABLE_PIP_VERSION_CHECK=on \
    PIP_DEFAULT_TIMEOUT=100 \
    POETRY_HOME="/opt/poetry" \
    POETRY_VIRTUALENVS_IN_PROJECT=true \
    POETRY_NO_INTERACTION=1 \
    PYSETUP_PATH="/opt/pysetup" \
    VENV_PATH="/opt/pysetup/.venv"
ENV PATH="$POETRY_HOME/bin:$VENV_PATH/bin:$PATH"

# builder-base is used to build dependencies
FROM ..
```
In the scope of CI/CD (GitLab CI with a runner on Kubernetes), I would like to test a script that runs a one-shot command wrapped in a Docker image. Because the runner is unprivileged and not controlled by me, I tried to avoid dind and went for a low-level solution to run the one ..
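If the goal is just to execute one command inside a given image, an unprivileged runner can often avoid dind entirely by letting the job itself run in that image. A minimal sketch (the image name and command are placeholders, not from the original question):

```yaml
# .gitlab-ci.yml — sketch; no docker daemon needed, the runner's executor
# starts the job container from the named image directly
run-one-shot:
  image: registry.example.com/my-tool-image:latest  # hypothetical image
  script:
    - my-tool --once   # hypothetical one-shot command
```

This only covers the "run a command in an image" case; building images on an unprivileged runner is a separate problem (typically solved with tools like kaniko or buildah).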
I want my image-creation process to run on my own system instead of on the server. To do this, I set up a self-hosted agent on my system and also installed Docker. Build, restore, and tests all work, but when creating images it fails with a "not permission" error. My ..
I’m investigating the possibility of replacing our default production Jenkins server with a dockerized Jenkins server. The main reason is to be able to deploy additional Jenkins servers quickly in the future while preserving all configuration, etc. Does anyone have experience with this? Does it even make sense to use a containerized Jenkins for ..
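The usual pattern for keeping configuration across redeploys is to put `JENKINS_HOME` on a named volume, so the container itself stays disposable. A minimal sketch using the official image (ports and volume name are illustrative):

```yaml
# docker-compose.yml — sketch; the named volume holds JENKINS_HOME
# (jobs, plugins, credentials), so it survives container recreation
services:
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"
      - "50000:50000"   # inbound agent port
    volumes:
      - jenkins_home:/var/jenkins_home
volumes:
  jenkins_home:
```

Spinning up another Jenkins then amounts to pointing a fresh container at a copy of that volume (often combined with Configuration as Code to make setups reproducible).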
Is it ever possible to get access to Nexus, started as a Docker container, from a GitLab runner that is also started as a Docker container? I have Sonatype Nexus for Maven artifacts, which runs as a Docker container:

docker run -d -p 8081:8081 --name nexus --network cigitlab --hostname nexus --mount source=nexus,target=/nexus-data klo2k/nexus3

And also gitlab-runner as a Docker container:

docker run -d --name ..
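Since both containers share the user-defined `cigitlab` network, they can reach each other by container name (e.g. `http://nexus:8081`). The catch is that, with the Docker executor, the runner spawns a *new* container per job, and that job container must join the same network too. A sketch of the relevant runner setting (assuming the Docker executor; the rest of the file is omitted):

```toml
# /etc/gitlab-runner/config.toml — sketch
[[runners]]
  [runners.docker]
    # attach job containers to the same user-defined network as Nexus,
    # so builds can resolve it as http://nexus:8081
    network_mode = "cigitlab"
```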
I want to write a simple GitHub Action that runs my Django app’s tests when I push to GitHub. GitHub runs the workflow on push, but for some reason it doesn’t pick up any of the tests, even though running python ./api/manage.py test locally works. The Run tests section of the job summary shows this: ..
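A frequent cause of "Ran 0 tests" in CI is invoking `manage.py` from a different directory than locally, so Django's test discovery starts from the wrong place. A sketch of a workflow that pins the working directory (the Python version and requirements path are assumptions, not from the question):

```yaml
# .github/workflows/ci.yml — sketch; assumes manage.py lives in api/
name: Django tests
on: push
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt   # hypothetical requirements path
      - name: Run tests
        working-directory: api
        run: python manage.py test
```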
I’m having trouble while running a job on a GitLab pipeline. See the error below:

error Could not write file "/usr/src/app/yarn-error.log": "ENOSPC: no space left on device, open '/usr/src/app/yarn-error.log'"
error An unexpected error occurred: "ENOSPC: no space left on device, copyfile '/usr/local/share/.cache/yarn/v6/npm-caniuse-lite-1.0.30001249-90a330057f8ff75bfe97a94d047d5e14fabb2ee8-integrity/node_modules/caniuse-lite/data/features/svg-filters.js' -> '/usr/src/app/node_modules/caniuse-lite/data/features/svg-filters.js'"
The command '/bin/sh -c yarn install' returned a non-zero code: 1

Can ..
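ENOSPC during a `docker build` usually means the host running the build has filled its disk with old images, stopped containers, and build cache, rather than the project itself being too large. One mitigation, assuming a runner with direct access to the host's Docker daemon (the job name and image tag below are placeholders):

```yaml
# .gitlab-ci.yml — sketch; reclaims space before building
build-image:
  stage: build
  before_script:
    # remove unused images, containers, build cache, and volumes
    - docker system prune -af --volumes
  script:
    - docker build -t my-app .   # 'my-app' is a placeholder tag
```

Note that `--volumes` also deletes unused named volumes, so drop that flag if the host keeps data in volumes.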
I’m quite new to CI and Docker, and I’m trying to set up a GitHub Actions CI workflow for my project. However, when I run this workflow:

```yaml
name: CI
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/[email protected]
      - name: Build the Docker development environment ..
```
I need a simple solution to build a Docker image, push it to ECR, and deploy it to ECS. The final part, which deploys the ECR image to ECS, is working. (I’m using a short deploy.py script that uses Python’s AWS boto3 SDK; I found that easier than making the ECS Orb work.) However, I’m struggling ..
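Since a Python deploy.py already exists, the build-and-push step can live in the same script. A minimal sketch (account ID, region, and repository name are placeholders; it assumes `docker login` to ECR has already been done, e.g. via `aws ecr get-login-password | docker login ...`):

```python
# build_push.py — sketch of the build-and-push step, complementing deploy.py
import subprocess


def ecr_image_uri(account_id: str, region: str, repo: str, tag: str) -> str:
    """Compose the fully qualified ECR image URI for a repository and tag."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repo}:{tag}"


def build_and_push(uri: str, context: str = ".") -> None:
    """Build the image for the given URI and push it to ECR."""
    subprocess.run(["docker", "build", "-t", uri, context], check=True)
    subprocess.run(["docker", "push", uri], check=True)


# Example (placeholder values):
#   uri = ecr_image_uri("123456789012", "eu-west-1", "my-app", "latest")
#   build_and_push(uri)
```

Keeping the URI construction in a pure function makes it easy to reuse the same tag in the ECS task-definition update that deploy.py already performs.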