Deploying Kubeflow Pipelines on Alibaba Cloud


What Is Kubeflow Pipelines

Kubeflow Pipelines consists of the following components:

  • A console for running and tracking experiments.
  • The workflow engine Argo, which schedules multi-step machine learning workflows.
  • An SDK for defining workflows. Currently, the SDK only supports Python.

Its goals are:

  • End-to-end orchestration: enables and simplifies the orchestration of machine learning pipelines. Pipelines can be triggered directly, at a scheduled time, by an event, or even by data changes.
  • Easy experiment management: makes it easy for you to try numerous ideas and techniques and manage your experiments. Kubeflow Pipelines also makes the transition from experiments to production much easier.
  • Easy re-use: enables you to re-use components and pipelines to quickly create end-to-end solutions without having to rebuild experiments each time.

Deploy Kubeflow Pipelines on Alibaba Cloud

Kubeflow Pipelines is typically deployed as part of Kubeflow, but the standard deployment has two problems:

  1. Pipelines are deployed by using Kubeflow. However, Kubeflow has many built-in components, and using Ksonnet to deploy Kubeflow is complex.
  2. Pipelines depend on Google Cloud Platform and cannot run on other cloud platforms or on bare metal servers.


Install kustomize. The following commands download the latest kustomize release from GitHub and install it to /usr/bin:

opsys=linux  # or darwin, or windows
curl -s https://api.github.com/repos/kubernetes-sigs/kustomize/releases/latest |\
grep browser_download |\
grep $opsys |\
cut -d '"' -f 4 |\
xargs curl -O -L
mv kustomize_*_${opsys}_amd64 /usr/bin/kustomize
chmod u+x /usr/bin/kustomize
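The download step above simply scrapes the GitHub release JSON for an asset URL matching your operating system. Running the same grep/cut filter over a small hand-written fragment (the URLs below are illustrative, not real release assets) shows what gets extracted:

```shell
# Stand-in for the release JSON returned by the GitHub API.
cat <<'EOF' > /tmp/release.json
{"assets": [
  {"browser_download_url": "https://example.com/kustomize_v3_darwin_amd64"},
  {"browser_download_url": "https://example.com/kustomize_v3_linux_amd64"}
]}
EOF
opsys=linux
# Keep only asset-URL lines, keep the one for our OS, take the 4th
# double-quote-delimited field (the URL itself).
grep browser_download /tmp/release.json | grep $opsys | cut -d '"' -f 4
# → https://example.com/kustomize_v3_linux_amd64
```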
  • For more information about creating a Kubernetes cluster in Alibaba Cloud Container Service, see the Container Service for Kubernetes (ACK) documentation.


yum install -y git
git clone --recursive https://github.com/AliyunContainerService/kubeflow-aliyun
yum install -y openssl
domain=example.com  # replace with the domain name you will use for the Pipelines UI
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout kubeflow-aliyun/overlays/ack-auto-clouddisk/tls.key -out kubeflow-aliyun/overlays/ack-auto-clouddisk/tls.crt -subj "/CN=$domain/O=$domain"
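To sanity-check the certificate the openssl command produces, you can generate a throwaway copy under /tmp (here example.com stands in for your $domain) and read the subject back out of it:

```shell
domain=example.com  # placeholder domain for illustration
# Same self-signed certificate generation as above, but written to /tmp.
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout /tmp/tls.key -out /tmp/tls.crt \
  -subj "/CN=$domain/O=$domain" 2>/dev/null
# Print the subject that was baked into the certificate.
openssl x509 -noout -subject -in /tmp/tls.crt
```

The second command should echo a subject containing CN and O set to your domain, confirming the certificate is ready to drop into the overlay directory.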
yum install -y httpd-tools
htpasswd -c kubeflow-aliyun/overlays/ack-auto-clouddisk/auth admin
New password:
Re-type new password:
Adding password for user admin
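htpasswd prompts interactively for the password, which is awkward in scripts. Its default scheme is Apache's apr1 (an MD5 variant), and the same kind of entry can be produced non-interactively with openssl; "secret" and the fixed salt below are placeholders for illustration:

```shell
# Build an htpasswd-style line "user:hash" without prompting.
# -apr1 selects the Apache MD5 scheme htpasswd uses by default;
# -salt fixes the salt so the output is reproducible here.
printf 'admin:%s\n' "$(openssl passwd -apr1 -salt abcdefgh secret)" > /tmp/auth
cat /tmp/auth
```

The resulting line has the form admin:$apr1$abcdefgh$..., the same format htpasswd writes to the auth file.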
cd kubeflow-aliyun/
kustomize build overlays/ack-auto-clouddisk > /tmp/ack-auto-clouddisk.yaml
For example, to deploy in the cn-hangzhou region instead of cn-beijing and to use a 200 GiB cloud disk instead of 100 GiB:

sed -i.bak 's/regionid: cn-beijing/regionid: cn-hangzhou/g' /tmp/ack-auto-clouddisk.yaml
sed -i.bak 's/zoneid: cn-beijing-e/zoneid: cn-hangzhou-g/g' /tmp/ack-auto-clouddisk.yaml
sed -i.bak 's/storage: 100Gi/storage: 200Gi/g' /tmp/ack-auto-clouddisk.yaml
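The sed substitutions above are plain string rewrites on the generated manifest. Applying the same set to a minimal stand-in file shows their effect without building the real manifest first:

```shell
# Minimal stand-in for the fields the real manifest contains.
cat <<'EOF' > /tmp/demo.yaml
regionid: cn-beijing
zoneid: cn-beijing-e
storage: 100Gi
EOF
# Same region/zone/disk-size rewrites as above; -i.bak edits in place
# and keeps a .bak backup of the original file.
sed -i.bak 's/regionid: cn-beijing/regionid: cn-hangzhou/g' /tmp/demo.yaml
sed -i.bak 's/zoneid: cn-beijing-e/zoneid: cn-hangzhou-g/g' /tmp/demo.yaml
sed -i.bak 's/storage: 100Gi/storage: 200Gi/g' /tmp/demo.yaml
cat /tmp/demo.yaml
# → regionid: cn-hangzhou
# → zoneid: cn-hangzhou-g
# → storage: 200Gi
```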
kubectl create --validate=true --dry-run=true -f /tmp/ack-auto-clouddisk.yaml
kubectl create -f /tmp/ack-auto-clouddisk.yaml
kubectl get ing -n kubeflow
NAME             HOSTS   ADDRESS   PORTS     AGE
ml-pipeline-ui   *                 80, 443   11m


  • Delete the Kubeflow Pipelines components.
kubectl delete -f /tmp/ack-auto-clouddisk.yaml


Original Source



