I have developed a dockerized Django web app using docker-compose. It runs fine locally. The problem is that when I define a CI pipeline, specifically CircleCI (I don’t know how it works with any other alternative), to upload it to GCloud App Engine, the workflow completes fine, but when visiting the URL it ..
I’m running custom training jobs in Google’s Vertex AI. A simple gcloud command to execute a custom job would use something like the following syntax (complete documentation for the command can be seen here): gcloud beta ai custom-jobs create --region=us-central1 --display-name=test --config=config.yaml In the config.yaml file, it is possible to specify the machine and accelerator ..
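For context, the config.yaml passed to `gcloud beta ai custom-jobs create` might look like the following sketch; the machine type, accelerator, and image URI are illustrative assumptions, not values from the question:

```yaml
# Hypothetical config.yaml for a Vertex AI custom job
workerPoolSpecs:
  - machineSpec:
      machineType: n1-standard-4          # assumed machine type
      acceleratorType: NVIDIA_TESLA_T4    # assumed accelerator
      acceleratorCount: 1
    replicaCount: 1
    containerSpec:
      imageUri: gcr.io/my-project/trainer:latest  # placeholder training image
```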
I am very lost on the steps with gcloud versus docker. I have some Gradle code that built a Docker image, and I see it in images like so: (base) Deans-MacBook-Pro:stockstuff-all dean$ docker images REPOSITORY TAG IMAGE ID CREATED SIZE gcr.io/prod-stock-bot/stockstuff latest b041e2925ee5 27 minutes ago 254MB I am unclear if I need to run ..
I am trying to read a .env file using the "dotenv" package, but process.env.DB_HOST returns undefined after publishing to Cloud Run. When I log all files, I see every file in the root directory except the .env file. I do have a .env file at the root of my project. Not sure why ..
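A frequent cause of this symptom is that .env is matched by a .dockerignore or .gcloudignore rule and therefore never reaches the deployed image. A minimal local check of that hypothesis (file contents here are hypothetical) might be:

```shell
# Reproduce the setup in a scratch directory and confirm whether an
# ignore rule would exclude .env from the build context.
mkdir -p /tmp/envcheck && cd /tmp/envcheck
printf 'DB_HOST=localhost\n' > .env
printf '.env\n' > .dockerignore
# If this matches, the file is dropped at build time, dotenv finds no
# file at runtime, and process.env.DB_HOST comes back undefined.
grep -qx '.env' .dockerignore && echo "excluded"
```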
I want to deploy my Flask project to Firebase using Docker, but when I run gcloud builds submit it shows the error below. Does anyone know how to fix this? Source: Docker..
The documentation says we can build and deploy a container from Container Registry to Cloud Run using a cloudbuild.yaml file: steps: # Build the container image - name: 'gcr.io/cloud-builders/docker' args: ['build', '-t', 'gcr.io/PROJECT_ID/IMAGE', '.'] # Push the container image to Container Registry - name: 'gcr.io/cloud-builders/docker' args: ['push', 'gcr.io/PROJECT_ID/IMAGE'] # Deploy container image to Cloud Run ..
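Reassembled, the documented cloudbuild.yaml pattern the question refers to looks roughly like this; the service name and region are placeholders, and `$PROJECT_ID` is Cloud Build's built-in substitution:

```yaml
steps:
  # Build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/IMAGE', '.']
  # Push the container image to Container Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/IMAGE']
  # Deploy the container image to Cloud Run
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args: ['run', 'deploy', 'SERVICE', '--image', 'gcr.io/$PROJECT_ID/IMAGE', '--region', 'us-central1']
images: ['gcr.io/$PROJECT_ID/IMAGE']
```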
My pod can’t be created because of the following problem: Failed to pull image "europe-west3-docker.pkg.dev/<PROJECT_ID>/<REPO_NAME>/my-app:1.0.0": rpc error: code = Unknown desc = Error response from daemon: Get https://europe-west3-docker.pkg.dev/v2/<PROJECT_ID>/<REPO_NAME>/my-app/manifests/1.0.0: denied: Permission "artifactregistry.repositories.downloadArtifacts" denied on resource "projects/<PROJECT_ID>/locations/europe-west3/repositories/<REPO_NAME>" (or it may not exist) I’ve never experienced anything like it. Maybe someone can help me out. Here is what ..
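The usual fix for this error is to grant the cluster's node service account the Artifact Registry reader role, which includes the denied artifactregistry.repositories.downloadArtifacts permission. A sketch of the grant follows; the project and service account names are placeholders, and the gcloud command is echoed rather than executed since it needs real credentials:

```shell
# Placeholders: substitute your real project, repo, and node service account.
PROJECT_ID=my-project
NODE_SA="my-nodes@${PROJECT_ID}.iam.gserviceaccount.com"
# roles/artifactregistry.reader includes
# artifactregistry.repositories.downloadArtifacts.
echo gcloud artifacts repositories add-iam-policy-binding REPO_NAME \
  --project="$PROJECT_ID" \
  --location=europe-west3 \
  --member="serviceAccount:${NODE_SA}" \
  --role=roles/artifactregistry.reader
```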
I am very new to GCP and I would greatly appreciate some help here … I have a docker containerized application that runs in AWS/Azure but needs to access gcloud SDK as well as through "Google cloud client libraries". what is the best way to setup gcloud authentication from an application that runs outside of ..
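Outside GCP, the standard pattern is a service account key surfaced through GOOGLE_APPLICATION_CREDENTIALS, which the client libraries pick up automatically as Application Default Credentials. A sketch, with a hypothetical key path; the gcloud step is shown as a comment because it needs a real key file:

```shell
# Client libraries read this variable for Application Default Credentials.
export GOOGLE_APPLICATION_CREDENTIALS=/secrets/sa-key.json  # hypothetical path
# The gcloud CLI authenticates separately from the client libraries:
#   gcloud auth activate-service-account --key-file="$GOOGLE_APPLICATION_CREDENTIALS"
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```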
I have a project with the following structure: [email protected]:~/personal-projects/bertQA_server$ ls bert-server-env cloudbuild.yaml Dockerfile main.py mymodel Procfile __pycache__ README.md requirements.txt target My .dockerignore: Dockerfile README.md *.pyc *.pyo *.pyd __pycache__ .pytest_cache bert-server-env This is my Dockerfile: FROM tensorflow/tensorflow # Allow statements and log messages to immediately appear in the Knative logs ENV PYTHONUNBUFFERED True # Copy ..
I have 2 containers: one with gcloud/gsutil and clickhouse (based on debian/buster-slim, with no additional user or permissions set in the Dockerfile) and a git-sync container. I am running them side by side in one Pod with a shared volume. I pull a repository of sh scripts into the shared volume with the git-sync container. Here is the manifest: apiVersion: v1 kind: Pod metadata: ..
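The truncated manifest presumably continues along these lines; this is a hedged reconstruction with placeholder names and images, showing only the described shape: two containers sharing one volume:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: clickhouse-gitsync           # placeholder name
spec:
  volumes:
    - name: repo
      emptyDir: {}                   # assumed shared-volume type
  containers:
    - name: clickhouse
      image: my-clickhouse:latest    # hypothetical image with gcloud/gsutil + clickhouse
      volumeMounts:
        - name: repo
          mountPath: /workspace
    - name: git-sync
      image: registry.k8s.io/git-sync/git-sync:v4.2.1  # assumed git-sync image
      volumeMounts:
        - name: repo
          mountPath: /workspace
```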