This repository builds a first end-to-end cut of an MLOps pipeline on Vertex AI.
The overall architecture of the project is shown below:
As a toy example, we tackle the problem of default payment risk prediction. The dataset for the problem can be found here.
To do that, I will follow these steps:
- Data analysis to understand the problem and the dataset;
- Baseline machine learning model for classification;
- More complex machine learning model for classification;
- Development of Vertex Pipeline for training and deployment;
- BigQuery table creation for model metadata storage;
- Development of a custom prediction image to handle preprocessing at prediction time;
- Setup of Vertex Model monitoring for data and concept drifts.
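For the baseline step above, a model as simple as a majority-class predictor already gives a useful floor that every later model must beat. A minimal sketch in pure Python (the labels below are hypothetical and not taken from the actual dataset):

```python
from collections import Counter

class MajorityClassBaseline:
    """Baseline classifier that always predicts the most frequent training label.

    Any real model for default payment risk should at least beat this.
    """

    def fit(self, labels):
        # Remember the most common label seen during training.
        self.majority_, _ = Counter(labels).most_common(1)[0]
        return self

    def predict(self, n):
        # Predict the majority label for every one of the n samples.
        return [self.majority_] * n

# Hypothetical labels: 1 = defaulted, 0 = paid on time.
train_labels = [0, 0, 0, 1, 0, 1, 0, 0]
baseline = MajorityClassBaseline().fit(train_labels)
print(baseline.predict(3))  # → [0, 0, 0]
```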
To run the pipeline, you first need a GCP service account with all the required permissions.
Once the service account is ready, you can point the pipeline at the bucket with the data by setting the input_path field in parameter_values.json. Note that the path must be a GCS FUSE path to the .csv file containing the training data.
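For reference, a parameter_values.json could look like the following; only the input_path field name comes from this repository, while the bucket and file names are placeholders (GCS FUSE mounts buckets under /gcs/):

```json
{
  "input_path": "/gcs/your-bucket-name/data/training_data.csv"
}
```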
After that, you can run it with:

```shell
python run_pipeline.py --service_account yoursa@project_id.iam.gserviceaccount.com
```
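As a rough illustration of how such a launcher could accept that flag, here is a stdlib argparse sketch; this is an assumption about the script's interface, not the repository's actual implementation, and any other arguments it takes are not shown:

```python
import argparse

def parse_args(argv=None):
    # Parse the service account used to submit the Vertex AI pipeline job.
    parser = argparse.ArgumentParser(
        description="Launch the Vertex AI training and deployment pipeline."
    )
    parser.add_argument(
        "--service_account",
        required=True,
        help="Service account email, e.g. yoursa@project_id.iam.gserviceaccount.com",
    )
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args()
    print(f"Submitting pipeline as {args.service_account}")
```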