Building Your First Kubeflow Pipeline: A Simple Example

Kubeflow Pipelines is a platform for building, deploying, and managing end-to-end machine learning workflows. It simplifies creating and executing ML pipelines, making it easier for data scientists and engineers to collaborate on model development and deployment. In this tutorial, we will walk through building and running a simple Kubeflow Pipeline in Python.
Prerequisites
- Kubeflow Pipelines installed and set up (follow my previous tutorial, “Kubeflow Pipelines: A Step-by-Step Guide”)
- Familiarity with Python programming
Step 1: Install Kubeflow Pipelines SDK
First, you need to install the Kubeflow Pipelines SDK on your local machine. Run the following command in your terminal or command prompt:
pip install kfp
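Note that this tutorial uses the v1 SDK's dsl.ContainerOp API, which was removed in the 2.x releases of the SDK. If pip installs a 2.x version by default on your machine, pin the package to a 1.x release:
pip install "kfp<2"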
Step 2: Create a Simple Pipeline in Python
Create a new Python script (e.g., my_first_pipeline.py) and add the following code:
import kfp
from kfp import dsl


def load_data_op():
    # Simulates a data-loading step by echoing a message and sleeping.
    return dsl.ContainerOp(
        name="Load Data",
        image="python:3.7",
        command=["sh", "-c"],
        arguments=["echo 'Loading data' && sleep 5"],
    )


def preprocess_data_op():
    # Simulates a preprocessing step.
    return dsl.ContainerOp(
        name="Preprocess Data",
        image="python:3.7",
        command=["sh", "-c"],
        arguments=["echo 'Preprocessing data' && sleep 5"],
    )


def train_model_op():
    # Simulates a model-training step.
    return dsl.ContainerOp(
        name="Train Model",
        image="python:3.7",
        command=["sh", "-c"],
        arguments=["echo 'Training model' && sleep 5"],
    )


@dsl.pipeline(
    name="My First Pipeline",
    description="A simple pipeline that demonstrates loading, preprocessing, and training steps.",
)
def my_first_pipeline():
    # .after() enforces the execution order: load -> preprocess -> train.
    load_data = load_data_op()
    preprocess_data = preprocess_data_op().after(load_data)
    train_model = train_model_op().after(preprocess_data)


if __name__ == "__main__":
    # Compile the pipeline into a YAML package that can be uploaded to Kubeflow.
    kfp.compiler.Compiler().compile(my_first_pipeline, "my_first_pipeline.yaml")
This Python script defines a simple pipeline with three steps: loading data, preprocessing data, and training a model. Each step is defined as a function that returns a ContainerOp object, which represents a containerized operation in the pipeline. The .after() calls make each step wait for the previous one, so the three steps run in sequence. The @dsl.pipeline decorator marks the function as a pipeline definition, and kfp.compiler.Compiler().compile() compiles the pipeline into a YAML file that can be uploaded to Kubeflow.
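To generate the YAML package, simply run the script; the compiled file is written to the current directory:
python my_first_pipeline.py
# produces my_first_pipeline.yaml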
Step 3: Upload and Run the Pipeline
- Access the Kubeflow Pipelines dashboard by navigating to the URL provided during the setup process.
- Click on the “Pipelines” tab in the left-hand sidebar.
- Click the “Upload pipeline” button in the upper right corner.
- In the “Upload pipeline” dialog, click “Browse” and select the my_first_pipeline.yaml file generated in the previous step.
- Click “Upload” to upload the pipeline to the Kubeflow platform.
- Once the pipeline is uploaded, click on its name to open the pipeline details page.
- Click the “Create run” button to start a new run of the pipeline.
- On the “Create run” page, you can give your run a name and choose a pipeline version. Click “Start” to begin the pipeline run.
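If you prefer to submit runs from Python rather than through the UI, the v1 SDK's kfp.Client can upload the compiled package and start a run directly. The sketch below is minimal; the host URL is a placeholder and depends on how your Kubeflow Pipelines endpoint is exposed in your cluster:
import kfp

# Placeholder endpoint; replace with the URL of your Kubeflow Pipelines API.
client = kfp.Client(host="http://localhost:8080")

# Register the compiled package so it appears in the Pipelines list (optional).
client.upload_pipeline("my_first_pipeline.yaml", pipeline_name="My First Pipeline")

# Submit a run of the compiled package directly.
run = client.create_run_from_pipeline_package(
    "my_first_pipeline.yaml",
    arguments={},
    run_name="my-first-run",
)
print(run.run_id)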
Step 4: Monitor the Pipeline Run
After starting the pipeline run, you will be redirected to the “Run details” page. Here, you can monitor the progress of your pipeline, view the logs for each step, and inspect the output artifacts.
- The pipeline graph will show the status of each step in the pipeline, with different colors indicating success, failure, or in-progress status.
- To view the logs for a specific step, click on the step in the pipeline graph and then click the “Logs” tab in the right-hand pane.
- To view the output artifacts, click on the step in the pipeline graph and then click the “Artifacts” tab in the right-hand pane.
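You can also watch a run from Python instead of the dashboard. This is a small sketch assuming you started the run with the client shown earlier; the host and run ID are placeholders. It blocks until the run finishes, then prints its final status:
import kfp

# Same placeholder endpoint as before; the run ID comes from the run you started.
client = kfp.Client(host="http://localhost:8080")
run_detail = client.wait_for_run_completion(run_id="<your-run-id>", timeout=600)
print(run_detail.run.status)  # e.g. "Succeeded" or "Failed"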
Congratulations! You have built and executed your first Kubeflow Pipeline using Python. From here you can experiment with more complex pipelines, integrate additional components, and automate your machine learning workflows end to end. Now that you have a basic understanding of how to create and run pipelines in Kubeflow, you can explore more advanced features and build more sophisticated pipelines for your own projects.