Questions tagged kubeflow-pipelines

Explore the latest questions and answers asked by our top developers.

Kubeflow: Can’t import from modules in docker image

I’ve set up a Kubeflow pipeline where I have an image and a function that imports a module from within the image:

    import kfp
    from kfp.components import func_to_container_op, InputPath, OutputPath

    def process_data(experiment_name):
        import os
        print("Listing dir: ", os.system("ls"))
        import driver

    # decorator with an arg
    process_data = func_to_container_op(process_data, base_image=image)

    @kfp.dsl.pipeline(name="", description="")
    def pipeline(experiment_name):
        process_data(experiment_name)

    if __name__ == […]
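
One workaround, sketched below under the assumption that driver.py is baked into the image at /usr/src/app (a hypothetical path): the serialized function does not necessarily execute from the directory that contains the module, so put that directory on sys.path before importing.

    import kfp
    from kfp.components import func_to_container_op

    def process_data(experiment_name):
        import sys
        # Assumption: driver.py lives here inside the base image.
        sys.path.append("/usr/src/app")
        import driver
        print("Loaded driver from:", driver.__file__)

    process_data_op = func_to_container_op(
        process_data,
        base_image="gcr.io/my-project/my-image:latest",  # hypothetical image
    )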

Use a run parameter as an argument in Kubeflow

I am trying to use a Kubeflow run parameter as an argument for my pipeline step. Every time I compile the YAML file, however, it gets changed from an Integer to a LocalPath.

    @dsl.pipeline(
        name='First Pipeline',
        description='generates a random set of numbers then performs operations on them returning a json object')
    def first_pipeline(generate_n_arg: int = 10): […]
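
A hedged sketch of what usually keeps the parameter an integer in the KFP v1 SDK: annotate the component input with a plain int (an InputPath annotation is what turns an input into a LocalPath file argument) and forward the pipeline parameter directly. The component names here are illustrative.

    import kfp.dsl as dsl
    from kfp.components import func_to_container_op

    def generate_numbers(n: int) -> str:
        import json, random
        return json.dumps([random.random() for _ in range(n)])

    generate_numbers_op = func_to_container_op(generate_numbers)

    @dsl.pipeline(
        name='First Pipeline',
        description='generates a random set of numbers then performs operations on them')
    def first_pipeline(generate_n_arg: int = 10):
        generate_numbers_op(generate_n_arg)  # passed by value, not as a file path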

Kubeflow Kale: specify a container image for a pipeline step

Kale lets the user specify only the step name and its dependencies from the UI. However, I would also like to specify the Docker image to use for the step; I can’t figure out how to set a custom Docker image for a pipeline step from the Kale UI. Any suggestions on how to implement this? […]
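
Kale persists its settings in the notebook file itself, so one hedged workaround, assuming Kale honors a docker_image key under the kubeflow_notebook metadata block (an assumption about Kale’s metadata schema, not a documented UI feature), is to edit the .ipynb metadata directly:

    import json

    with open("pipeline.ipynb") as f:  # hypothetical notebook file
        nb = json.load(f)

    # Assumption: Kale reads its per-notebook configuration from this block.
    nb.setdefault("metadata", {}).setdefault("kubeflow_notebook", {})[
        "docker_image"] = "gcr.io/my-project/my-image:latest"  # hypothetical image

    with open("pipeline.ipynb", "w") as f:
        json.dump(nb, f, indent=1)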

Kubeflow dynamically provisioned PVC gets out-of-memory (OOM) error

Dear Stack Overflow community and K8s/Kubeflow experts, I am fairly new to using K8s for MLOps, and I am running into issues configuring a Kubeflow pipeline that mounts an NFS volume and transfers HDF5 data from that volume to a PVC. The first two tasks work as expected, but when I run the preprocessing tasks on those files, […]
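
Two things commonly bite here and are worth ruling out: a PVC sized smaller than the HDF5 files (which fills up during preprocessing) and steps without explicit memory requests (which get OOM-killed). A minimal sketch with the KFP v1 SDK, with sizes and image names as assumptions:

    import kfp.dsl as dsl

    @dsl.pipeline(name='nfs-to-pvc', description='copy then preprocess')
    def pipeline():
        # Provision the PVC with headroom for the dataset (size is an assumption).
        vop = dsl.VolumeOp(
            name='create-pvc',
            resource_name='data-pvc',
            size='50Gi',
            modes=dsl.VOLUME_MODE_RWO,
        )
        preprocess = dsl.ContainerOp(
            name='preprocess',
            image='gcr.io/my-project/preprocess:latest',  # hypothetical image
            pvolumes={'/mnt/data': vop.volume},
        )
        # Explicit memory settings so the step is scheduled with enough RAM.
        preprocess.set_memory_request('8G')
        preprocess.set_memory_limit('16G')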

Kubeflow returns "no such file or directory" on container start

I’m currently trying to deploy a pipeline on Kubeflow, but every time I start it, it returns:

    This step is in Failed state with this message: OCI runtime create failed:
    container_linux.go:345: starting container process caused "exec: "python
    /usr/src/app/FeatureExtractor.py": stat python /usr/src/app/FeatureExtractor.py:
    no such file or directory": unknown

This is my pipeline: it currently fails on all […]
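
The quoting in that message is the usual tell: the runtime is stat-ing a single file literally named "python /usr/src/app/FeatureExtractor.py", which means the command was passed as one string instead of an argv list. A hedged sketch of the fix in a KFP v1 ContainerOp (the image name is an assumption):

    import kfp.dsl as dsl

    step = dsl.ContainerOp(
        name='feature-extractor',
        image='gcr.io/my-project/feature-extractor:latest',  # hypothetical image
        # argv list: the executable and its argument as separate items.
        command=['python', '/usr/src/app/FeatureExtractor.py'],
    )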

Is there a way to run a Python script in Cloud Build steps?

I have a series of Cloud Build steps in which I am uploading a pipeline to Kubeflow on GCP. Now I want to run that pipeline in the next step, so I have written a Python script for it; what I want is to run this Python script in my next Cloud Build step. Here is my Python script […]
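
A hedged sketch of such a script with the KFP SDK, runnable from any Cloud Build step whose image has the kfp package installed (the endpoint, pipeline name, and experiment name below are assumptions):

    import kfp

    client = kfp.Client(host='https://<your-kfp-endpoint>')  # assumption: KFP host
    pipeline_id = client.get_pipeline_id('covertype_training_pipeline')  # look up by name
    experiment = client.create_experiment('cloudbuild-runs')

    client.run_pipeline(
        experiment_id=experiment.id,
        job_name='triggered-from-cloud-build',
        pipeline_id=pipeline_id,
    )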

How to persist variables across Google Cloud Build steps?

I have a cloudbuild.json which is used to upload a pipeline to Kubeflow on GCP. Now I want to add another step in which I fetch the latest pipeline ID and then run the pipeline as an experiment. My main issue is how to get the pipeline ID into the subsequent steps. […]
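
Cloud Build mounts the same /workspace volume into every step, so one common pattern is to write the value to a file in one step and read it back in the next. A minimal sketch (the endpoint and pipeline name are assumptions):

    import kfp

    client = kfp.Client(host='https://<your-kfp-endpoint>')  # assumption: KFP host
    pipeline_id = client.get_pipeline_id('covertype_training_pipeline')

    # /workspace persists across Cloud Build steps, unlike environment variables.
    with open('/workspace/pipeline_id.txt', 'w') as f:
        f.write(pipeline_id)

    # A later step can then read it back:
    # with open('/workspace/pipeline_id.txt') as f:
    #     pipeline_id = f.read().strip()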

I have created a cloudbuild.json for Kubeflow pipeline deployment, but it gives an error saying the file is not present

This is my cloudbuild.json:

    {
      "steps": [
        {
          "name": "gcr.io/cloud-builders/docker",
          "args": ["build", "-t", "trainer_image", "."],
          "dir": "./trainer_image/"
        },
        {
          "name": "gcr.io/cloud-builders/docker",
          "args": ["build", "-t", "base_image", "."],
          "dir": "./base_image/"
        },
        {
          "name": "gcr.io/dmgcp-pkg-internal-poc-oct-04/kfp-cli",
          "args": ["dsl-compile --py covertype_training_pipeline.py --output covertype_training_pipeline.yaml"],
          "env": [
            "BASE_IMAGE=gcr.io/dmgcp-pkg-internal-poc-oct-04/base_image:test",
            "TRAINER_IMAGE=gcr.io/dmgcp-pkg-internal-poc-oct-04/trainer_image:test",
            "RUNTIME_VERSION=1.15",
            "PYTHON_VERSION=3.7",
            "COMPONENT_URL_SEARCH_PREFIX=https://raw.githubusercontent.com/kubeflow/pipelines/0.2.5/components/gcp/",
            "USE_KFP_SA=False"
          ],
          "dir": "./pipeline/" […]
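
A hedged guess at the cause, judging from the working YAML variant of this build in the next question: the entire dsl-compile command is passed as a single args string, so the step looks for an executable literally named "dsl-compile --py …" and reports it as missing. Assuming the kfp-cli image has a shell entrypoint (as the '-c' flag in the YAML version suggests), the args would be split like this:

    "args": [
      "-c",
      "dsl-compile --py covertype_training_pipeline.py --output covertype_training_pipeline.yaml"
    ]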

Is there a way to automate the build of a Kubeflow pipeline on GCP?

Here is my cloudbuild.yaml file:

    - name: 'gcr.io/cloud-builders/docker'
      args: ['build', '-t', 'gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME', '.']
      dir: $_PIPELINE_FOLDER/trainer_image

    # Build the base image for lightweight components
    - name: 'gcr.io/cloud-builders/docker'
      args: ['build', '-t', 'gcr.io/$PROJECT_ID/$_BASE_IMAGE_NAME:$TAG_NAME', '.']
      dir: $_PIPELINE_FOLDER/base_image

    # Compile the pipeline
    - name: 'gcr.io/$PROJECT_ID/kfp-cli'
      args:
      - '-c'
      - |
        dsl-compile --py $_PIPELINE_DSL --output $_PIPELINE_PACKAGE
      env:
      - 'BASE_IMAGE=gcr.io/$PROJECT_ID/$_BASE_IMAGE_NAME:$TAG_NAME'
      - 'TRAINER_IMAGE=gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME' […]
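
To close the loop after dsl-compile, a final build step can upload and run the compiled package with the KFP SDK. A hedged sketch (the endpoint, experiment, and job names are assumptions):

    import kfp

    client = kfp.Client(host='https://<your-kfp-endpoint>')  # assumption: KFP host
    pipeline = client.upload_pipeline(
        pipeline_package_path='covertype_training_pipeline.yaml',
        pipeline_name='covertype_training_pipeline',
    )
    run = client.run_pipeline(
        experiment_id=client.create_experiment('ci-builds').id,
        job_name='build-triggered-run',
        pipeline_id=pipeline.id,
    )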
