Hi, I’m developing a Vertex AI pipeline using the KFP SDK, and as far as I know there’s currently no official google_cloud_pipeline_components component for batch prediction that supports the new Model Monitoring feature.
To work around this, I’m writing a custom component using the google-cloud-aiplatform Python library, specifically these classes:
google.cloud.aiplatform_v1beta1.services.job_service.JobServiceClient
and
google.cloud.aiplatform_v1beta1.types.BatchPredictionJob
The problem is that, when polling the model monitoring job, the job status doesn’t have a predefined set of possible status codes/error messages, and there’s no guarantee they will even be present in the google.cloud.aiplatform_v1beta1.types.BatchPredictionJob.model_monitoring_status object. Some of my jobs have a blank status object, some have both “code”: 3 and “message”: “(detailed message)” (when batch prediction fails), and some have a single “message”: “FINISHED” (when the job succeeds).
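For reference, this is roughly the classification logic I’m experimenting with. It’s only a sketch under my own assumptions: I’m treating model_monitoring_status as a google.rpc.Status-shaped object (a `code` following google.rpc.Code semantics, where 0 means OK, plus a free-form `message`), and the `MonitoringStatus` dataclass below is just a hypothetical stand-in for the proto field so the logic can be tested without the library. Real code would read the field off the job returned by the client.

```python
from dataclasses import dataclass


# Hypothetical stand-in for the google.rpc.Status-shaped object I see in
# BatchPredictionJob.model_monitoring_status; in the actual component this
# would be the proto field on the job fetched via the JobServiceClient.
@dataclass
class MonitoringStatus:
    code: int = 0      # 0 means OK in google.rpc.Status semantics
    message: str = ""


def classify_monitoring_status(status: MonitoringStatus) -> str:
    """Map the loosely-specified status onto SUCCEEDED / FAILED / UNKNOWN.

    Covers the three shapes I've observed: a blank object, a non-zero
    code with a detailed message, and a bare "FINISHED" message.
    """
    if status.code != 0:
        # Non-zero code (e.g. 3 == INVALID_ARGUMENT in google.rpc.Code)
        # showed up when the batch prediction itself failed.
        return "FAILED"
    if status.message.upper() == "FINISHED":
        return "SUCCEEDED"
    # Blank object: monitoring may not have started or reported yet,
    # so keep polling rather than deciding either way.
    return "UNKNOWN"


# The three shapes observed in practice:
print(classify_monitoring_status(MonitoringStatus()))                        # blank object
print(classify_monitoring_status(MonitoringStatus(code=3, message="oops")))  # failed job
print(classify_monitoring_status(MonitoringStatus(message="FINISHED")))      # succeeded job
```

I’m unsure whether treating an empty status as “still running” (and retrying) is safe, or whether it can also mean monitoring was never attached to the job.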
How should I properly poll the status of the model monitoring job in order to report success/failure to my KFP pipeline?
Thanks!