Vertex AI: Google Vertex AI is an integrated suite of machine learning tools and services for building and using ML models with AutoML or custom code. It offers both novices and experts the best workbench for the entire machine learning development lifecycle.
Install this library in a virtualenv using pip. virtualenv is a tool to
create isolated Python environments. The basic problem it addresses is one of
dependencies and versions, and indirectly permissions.
With virtualenv, it's possible to install this library without needing system
install permissions, and without clashing with the installed system
dependencies.
The last version of this library compatible with Python 3.6 is google-cloud-aiplatform==1.12.1.
Overview
This section provides a brief overview of the Vertex AI SDK for Python. You can also reference the notebooks in vertex-ai-samples for examples.
All publicly available SDK features can be found in the google/cloud/aiplatform directory.
Under the hood, Vertex SDK builds on top of GAPIC, which stands for Google API CodeGen.
The GAPIC library code sits in google/cloud/aiplatform_v1 and google/cloud/aiplatform_v1beta1,
and it is auto-generated from Google's service proto files.
For most programmatic needs, you can follow these steps to figure out which libraries to import:
Look through google/cloud/aiplatform first -- the Vertex SDK's APIs will almost always be easier to use and more concise compared with GAPIC
If the feature that you are looking for cannot be found there, look through aiplatform_v1 to see if it's available in GAPIC
If it is still in beta phase, it will be available in aiplatform_v1beta1
If none of the above helps you find the right tools for your task, please feel free to open a GitHub issue and send us a feature request.
Importing
Vertex AI SDK resource based functionality can be used by importing the following namespace:
from google.cloud import aiplatform
Initialization
Initialize the SDK to store common configurations that you use with the SDK.
aiplatform.init(
    # your Google Cloud Project ID or number
    # environment default used if not set
    project='my-project',

    # the Vertex AI region you will use
    # defaults to us-central1
    location='us-central1',

    # Google Cloud Storage bucket in same region as location
    # used to stage artifacts
    staging_bucket='gs://my_staging_bucket',

    # custom google.auth.credentials.Credentials
    # environment default credentials used if not set
    credentials=my_credentials,

    # customer managed encryption key resource name
    # will be applied to all Vertex AI resources if set
    encryption_spec_key_name=my_encryption_key_name,

    # the name of the experiment to use to track
    # logged metrics and parameters
    experiment='my-experiment',

    # description of the experiment above
    experiment_description='my experiment description'
)
Datasets
Vertex AI provides managed tabular, text, image, and video datasets. In the SDK, datasets can be used downstream to
train models.
Vertex AI supports a variety of dataset schemas. References to these schemas are available under the
aiplatform.schema.dataset namespace. For more information on the supported dataset schemas please refer to the
Preparing data docs.
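As a concrete example, the sketch below creates a managed image dataset using one of these schema references; the bucket path and import file are placeholders, not required values:

from google.cloud import aiplatform

# Sketch: create a managed image dataset from a JSONL import file in GCS.
# The bucket and file paths are placeholders.
my_dataset = aiplatform.ImageDataset.create(
    display_name="my-image-dataset",
    gcs_source=["gs://my-bucket/image-annotations.jsonl"],
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.single_label_classification,
)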
Training
The Vertex AI SDK for Python allows you to train Custom and AutoML Models.
You can train custom models using a custom Python script, custom Python package, or container.
Preparing Your Custom Code
Vertex AI custom training enables you to train on Vertex AI datasets and produce Vertex AI models. To do so your
script must adhere to the following contract:
It must read datasets from the environment variables populated by the training service:
os.environ['AIP_DATA_FORMAT']          # provides format of data
os.environ['AIP_TRAINING_DATA_URI']    # uri to training split
os.environ['AIP_VALIDATION_DATA_URI']  # uri to validation split
os.environ['AIP_TEST_DATA_URI']        # uri to test split
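Putting this together, a custom training run might look like the sketch below; the script path, container URIs, and machine configuration are placeholders rather than required values:

job = aiplatform.CustomTrainingJob(
    display_name="my-custom-training-job",
    script_path="training_script.py",
    container_uri="gcr.io/cloud-aiplatform/training/tf-cpu.2-2:latest",
    model_serving_container_image_uri="gcr.io/cloud-aiplatform/prediction/tf2-cpu.2-2:latest",
)

# Run the training job on the managed dataset; a Vertex AI Model is returned.
model = job.run(
    my_dataset,
    replica_count=1,
    machine_type="n1-standard-4",
)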
In the code block above, my_dataset is the managed dataset created in the Datasets section above. The model variable is a managed Vertex AI model that can be deployed or exported.
AutoMLs
The Vertex AI SDK for Python supports AutoML tabular, image, text, video, and forecasting.
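For example, an AutoML tabular training run could look roughly like the sketch below, where the dataset, target column, and budget are placeholders:

job = aiplatform.AutoMLTabularTrainingJob(
    display_name="my-automl-tabular-job",
    optimization_prediction_type="regression",
)

# Train on a managed tabular dataset; returns a Vertex AI Model.
model = job.run(
    dataset=my_tabular_dataset,
    target_column="target_column_name",
    budget_milli_node_hours=1000,
)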
To get the model evaluation resource for a given model:
model = aiplatform.Model('projects/my-project/locations/us-central1/models/{MODEL_ID}')

# returns the first evaluation with no arguments, you can also pass the evaluation ID
evaluation = model.get_model_evaluation()

eval_metrics = evaluation.metrics
You can also create a reference to your model evaluation directly by passing in the resource name of the model evaluation:
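For example (a sketch; the model and evaluation IDs are placeholders):

evaluation = aiplatform.ModelEvaluation(
    evaluation_name='projects/my-project/locations/us-central1/models/{MODEL_ID}/evaluations/{EVALUATION_ID}')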
You can also create a batch prediction job asynchronously by including the sync=False argument:
batch_prediction_job = model.batch_predict(..., sync=False)

# wait for resource to be created
batch_prediction_job.wait_for_resource_creation()

# get the state
batch_prediction_job.state

# block until job is complete
batch_prediction_job.wait()
To create a Vertex AI Pipeline run and monitor until completion:
# Instantiate PipelineJob object
pl = PipelineJob(
    display_name="My first pipeline",

    # Whether or not to enable caching
    # True = always cache pipeline step result
    # False = never cache pipeline step result
    # None = defer to cache option for each pipeline component in the pipeline definition
    enable_caching=False,

    # Local or GCS path to a compiled pipeline definition
    template_path="pipeline.json",

    # Dictionary containing input parameters for your pipeline
    parameter_values=parameter_values,

    # GCS path to act as the pipeline root
    pipeline_root=pipeline_root,
)

# Execute pipeline in Vertex AI and monitor until completion
pl.run(
    # Email address of service account to use for the pipeline run
    # You must have iam.serviceAccounts.actAs permission on the service account to use it
    service_account=service_account,

    # Whether this function call should be synchronous (wait for pipeline run to finish before terminating)
    # or asynchronous (return immediately)
    sync=True
)
To create a Vertex AI Pipeline without monitoring until completion, use submit instead of run:
# Instantiate PipelineJob object
pl = PipelineJob(
    display_name="My first pipeline",

    # Whether or not to enable caching
    # True = always cache pipeline step result
    # False = never cache pipeline step result
    # None = defer to cache option for each pipeline component in the pipeline definition
    enable_caching=False,

    # Local or GCS path to a compiled pipeline definition
    template_path="pipeline.json",

    # Dictionary containing input parameters for your pipeline
    parameter_values=parameter_values,

    # GCS path to act as the pipeline root
    pipeline_root=pipeline_root,
)

# Submit the Pipeline to Vertex AI
pl.submit(
    # Email address of service account to use for the pipeline run
    # You must have iam.serviceAccounts.actAs permission on the service account to use it
    service_account=service_account,
)
Explainable AI: Get Metadata
To get metadata in dictionary format from TensorFlow 1 models:
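A sketch of what this can look like, assuming the SDK's SavedModelMetadataBuilder for TF1 saved models (the model directory below is a placeholder):

from google.cloud.aiplatform.explain.metadata.tf.v1 import saved_model_metadata_builder

# Build explanation metadata from a TF1 SavedModel directory (placeholder path).
builder = saved_model_metadata_builder.SavedModelMetadataBuilder(
    'gs://my-bucket/path/to/saved_model_dir', tags=['serve']
)
generated_md = builder.get_metadata()  # metadata as a dictionary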
To use Explanation Metadata in endpoint deployment and model upload:
explanation_metadata = builder.get_metadata_protobuf()

# To deploy a model to an endpoint with explanation
model.deploy(..., explanation_metadata=explanation_metadata)

# To deploy a model to a created endpoint with explanation
endpoint.deploy(..., explanation_metadata=explanation_metadata)

# To upload a model with explanation
aiplatform.Model.upload(..., explanation_metadata=explanation_metadata)
Cloud Profiler
Cloud Profiler allows you to profile your remote Vertex AI Training jobs on demand and visualize the results in Vertex AI Tensorboard.
To start using the profiler with TensorFlow, update your training script to include the following:
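A minimal sketch of that change, assuming the cloud_profiler helper shipped with the SDK's training_utils: import it and initialize it near the start of training:

from google.cloud.aiplatform.training_utils import cloud_profiler

# Initialize the profiler once, early in the training script
# (after any distributed-strategy setup).
cloud_profiler.init()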
Finally, visit your TensorBoard in the Google Cloud Console, navigate to the "Profile" tab, and click the Capture Profile button to capture profiling statistics for the running jobs.
Kurtis brings up two ideas - one is adding comments, the other is adding a run_sample() wrapper function. Both are valid ideas - and I want the team to make the choice here. My personal opinion is that comments work better in the Python ecosystem. But again, happy to let the team decide.
I want to make sure the solution is one that we can scalably roll out across all ~100 of our samples. So I want the change to be made in the generator rather than as one-offs in changes to already-generated samples.
lassification.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_extraction.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_sentiment.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_action_recognition.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_classification.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_object_tracking.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/export_evaluated_data_items_config.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional '--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 29 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/export_evaluated_data_items_config.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_forecasting.proto:20:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_image_classification.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_image_object_detection.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_image_segmentation.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_tables.proto:20:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_classification.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_extraction.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_sentiment.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_action_recognition.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_classification.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_object_tracking.proto:19:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
from gapic.cli import generate
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
from gapic import generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
from .generator import Generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
from gapic.samplegen import manifest, samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
from gapic.samplegen import samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
from gapic.schema import wrappers
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
from gapic.schema.api import API
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
[9 / 15] checking cached actions
INFO: Elapsed time: 2.311s, Critical Path: 1.72s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-aiplatform/synth.py", line 34, in <module>
bazel_target="//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py",
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 193, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py']' returned non-zero exit status 1.
2021-01-21 05:20:57,780 autosynth [ERROR] > Synthesis failed
2021-01-21 05:20:57,780 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6d3027b chore(deps): update dependency google-cloud-aiplatform to v0.4.0 (#172)
2021-01-21 05:20:57,788 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-21 05:20:57,795 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
import_data_*_sample tests are not done in the same way (some using mocks, some calling the actual service). They are also flaky and sometimes get TimeoutError.
oud/aiplatform/v1beta1/accelerator_type.proto is unused.
google/cloud/aiplatform/v1beta1/data_labeling_job.proto:24:1: warning: Import google/cloud/aiplatform/v1beta1/specialist_pool.proto is unused.
google/cloud/aiplatform/v1beta1/dataset.proto:25:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/operation.proto:22:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/deployed_model_ref.proto:21:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/model.proto:28:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/model.proto:21:1: warning: Import google/cloud/aiplatform/v1beta1/dataset.proto is unused.
google/cloud/aiplatform/v1beta1/pipeline_state.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/training_pipeline.proto:30:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/training_pipeline.proto:23:1: warning: Import google/cloud/aiplatform/v1beta1/machine_resources.proto is unused.
google/cloud/aiplatform/v1beta1/training_pipeline.proto:24:1: warning: Import google/cloud/aiplatform/v1beta1/manual_batch_tuning_parameters.proto is unused.
google/cloud/aiplatform/v1beta1/dataset_service.proto:28:1: warning: Import google/cloud/aiplatform/v1beta1/training_pipeline.proto is unused.
google/cloud/aiplatform/v1beta1/endpoint.proto:25:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/study.proto:25:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/study.proto:21:1: warning: Import google/protobuf/duration.proto is unused.
google/cloud/aiplatform/v1beta1/study.proto:24:1: warning: Import google/protobuf/wrappers.proto is unused.
google/cloud/aiplatform/v1beta1/hyperparameter_tuning_job.proto:27:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/job_service.proto:31:1: warning: Import google/protobuf/timestamp.proto is unused.
google/cloud/aiplatform/v1beta1/job_service.proto:27:1: warning: Import google/cloud/aiplatform/v1beta1/operation.proto is unused.
google/cloud/aiplatform/v1beta1/migratable_resource.proto:22:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/migration_service.proto:19:1: warning: Import google/cloud/aiplatform/v1beta1/dataset.proto is unused.
google/cloud/aiplatform/v1beta1/migration_service.proto:20:1: warning: Import google/cloud/aiplatform/v1beta1/model.proto is unused.
google/cloud/aiplatform/v1beta1/model_evaluation.proto:24:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/model_evaluation_slice.proto:23:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
from gapic.cli import generate
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
from gapic import generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
from .generator import Generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
from gapic.samplegen import manifest, samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
from gapic.samplegen import samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
from gapic.schema import wrappers
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
from gapic.schema.api import API
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 2.493s, Critical Path: 1.87s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-aiplatform/synth.py", line 34, in <module>
bazel_target="//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py",
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 197, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py']' returned non-zero exit status 1.
2021-01-28 05:21:10,320 autosynth [ERROR] > Synthesis failed
2021-01-28 05:21:10,320 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6589383 fix(deps): remove optional dependencies (#187)
2021-01-28 05:21:10,327 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-28 05:21:10,333 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
Essentially what we want here is that I, as a developer, should be able to run the tests on these samples without modifying them to confirm they still work. However, unless I'm on the team maintaining the samples, I likely don't have access to the resources they're tested with. Using env variables means all I need to do when running locally is change the env var, rather than changing the model IDs when there is a problem.
So something like this is perfectly acceptable:
resource_id = os.getenv("YOUR-RESOURCE-NAME")
For bonus points you can include a link to instructions for creating the resource, if you really want to make it accessible for other developers.
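As a hypothetical illustration (the variable names and defaults below are made up), a sample test can read every resource ID from the environment so another developer only has to change env vars, not the sample itself:

import os

# Hypothetical configuration for a sample test: each resource ID comes from
# an env var, with a documented fallback, so anyone can substitute their own.
PROJECT_ID = os.getenv("GOOGLE_CLOUD_PROJECT", "my-test-project")
LOCATION = os.getenv("VERTEX_LOCATION", "us-central1")
MODEL_ID = os.getenv("MODEL_ID", "1234567890")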
state = <grpc._channel._RPCState object at 0x7f07efd52850>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efbc9550>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907542.929950362","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_evaluation_video_object_tracking_sample.py:33: in get_model_evaluation_video_object_tracking_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
state = <grpc._channel._RPCState object at 0x7f07efbc3950>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07ef1dea00>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606908373.506968741","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
list_model_evaluation_slices_sample.py:33: in list_model_evaluation_slices_sample
response = client.list_model_evaluation_slices(parent=parent)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1264: in list_model_evaluation_slices
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
state = <grpc._channel._RPCState object at 0x7f07efd8c990>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efd8b4b0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907541.908400298","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_evaluation_sample.py:33: in get_model_evaluation_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
state = <grpc._channel._RPCState object at 0x7f07efbfa350>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efc12f00>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907541.391839330","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_evaluation_tabular_classification_sample.py:33: in get_model_evaluation_tabular_classification_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
state = <grpc._channel._RPCState object at 0x7f07efbf1310>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efdb7370>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907543.324315067","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_sample.py:30: in get_model_sample
response = client.get_model(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:580: in get_model
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
Eventually, the library should move to using the latest version of the microgenerator to get new features and bug fixes.
I've confirmed that continuing to use the Docker image is fine for the time being, and AC Tools will continue to publish it in addition to supporting the Bazel workflow.
state = <grpc._channel._RPCState object at 0x7f07efc98790>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07fe73f820>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907539.997788449","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
export_model_tabular_classification_sample.py:37: in export_model_tabular_classification_sample
response = client.export_model(name=name, output_config=output_config)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:937: in export_model
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
state = <grpc._channel._RPCState object at 0x7f07efc98e90>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f08039cc870>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907541.010522189","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_evaluation_slice_sample.py:38: in get_model_evaluation_slice_sample
response = client.get_model_evaluation_slice(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1184: in get_model_evaluation_slice
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
def retry_target(target, predicate, sleep_generator, deadline, on_error=None):
"""Call a function and retry if it fails.
This is the lowest-level retry helper. Generally, you'll use the
higher-level retry helper :class:`Retry`.
Args:
target(Callable): The function to call and retry. This must be a
nullary function - apply arguments with `functools.partial`.
predicate (Callable[Exception]): A callable used to determine if an
exception raised by the target should be considered retryable.
It should return True to retry or False otherwise.
sleep_generator (Iterable[float]): An infinite iterator that determines
how long to sleep between retries.
deadline (float): How long to keep retrying the target. The last sleep
period is shortened as necessary, so that the last retry runs at
``deadline`` (and not considerably beyond it).
on_error (Callable[Exception]): A function to call while processing a
retryable exception. Any error raised by this function will *not*
be caught.
Returns:
Any: the return value of the target function.
Raises:
google.api_core.RetryError: If the deadline is exceeded while retrying.
ValueError: If the sleep generator stops yielding values.
Exception: If the target raises a method that isn't retryable.
"""
if deadline is not None:
deadline_datetime = datetime_helpers.utcnow() + datetime.timedelta(
seconds=deadline
)
else:
deadline_datetime = None
last_exc = None
for sleep in sleep_generator:
try:
self = <google.api_core.operation.Operation object at 0x7fa736b2bcd0>
retry = <google.api_core.retry.Retry object at 0x7fa73dd074d0>
def _done_or_raise(self, retry=DEFAULT_RETRY):
"""Check if the future is done and raise if it's not."""
kwargs = {} if retry is DEFAULT_RETRY else {"retry": retry}
if not self.done(**kwargs):
raise _OperationNotComplete()
E google.api_core.future.polling._OperationNotComplete
The above exception was the direct cause of the following exception:
self = <google.api_core.operation.Operation object at 0x7fa736b2bcd0>
timeout = 1800, retry = <google.api_core.retry.Retry object at 0x7fa73dd074d0>
def _blocking_poll(self, timeout=None, retry=DEFAULT_RETRY):
"""Poll and wait for the Future to be resolved.
Args:
timeout (int):
How long (in seconds) to wait for the operation to complete.
If None, wait indefinitely.
"""
if self._result_set:
return
retry_ = self._retry.with_deadline(timeout)
try:
kwargs = {} if retry is DEFAULT_RETRY else {"retry": retry}
target = functools.partial(<bound method PollingFuture._done_or_raise of <google.api_core.operation.Operation object at 0x7fa736b2bcd0>>)
predicate = <function if_exception_type..if_exception_type_predicate at 0x7fa73dcfaf80>
sleep_generator = <generator object exponential_sleep_generator at 0x7fa736b99b50>
deadline = 1800, on_error = None
def retry_target(target, predicate, sleep_generator, deadline, on_error=None):
"""Call a function and retry if it fails.
This is the lowest-level retry helper. Generally, you'll use the
higher-level retry helper :class:`Retry`.
Args:
target(Callable): The function to call and retry. This must be a
nullary function - apply arguments with `functools.partial`.
predicate (Callable[Exception]): A callable used to determine if an
exception raised by the target should be considered retryable.
It should return True to retry or False otherwise.
sleep_generator (Iterable[float]): An infinite iterator that determines
how long to sleep between retries.
deadline (float): How long to keep retrying the target. The last sleep
period is shortened as necessary, so that the last retry runs at
``deadline`` (and not considerably beyond it).
on_error (Callable[Exception]): A function to call while processing a
retryable exception. Any error raised by this function will *not*
be caught.
Returns:
Any: the return value of the target function.
Raises:
google.api_core.RetryError: If the deadline is exceeded while retrying.
ValueError: If the sleep generator stops yielding values.
Exception: If the target raises a method that isn't retryable.
"""
if deadline is not None:
deadline_datetime = datetime_helpers.utcnow() + datetime.timedelta(
seconds=deadline
)
else:
deadline_datetime = None
last_exc = None
for sleep in sleep_generator:
try:
return target()
# pylint: disable=broad-except
# This function explicitly must deal with broad exceptions.
except Exception as exc:
if not predicate(exc):
raise
last_exc = exc
if on_error is not None:
on_error(exc)
now = datetime_helpers.utcnow()
if deadline_datetime is not None:
if deadline_datetime <= now:
six.raise_from(
exceptions.RetryError(
"Deadline of {:.1f}s exceeded while calling {}".format(
deadline, target
),
last_exc,
),
value = None, from_value = _OperationNotComplete()
???
E google.api_core.exceptions.RetryError: Deadline of 1800.0s exceeded while calling functools.partial(<bound method PollingFuture._done_or_raise of <google.api_core.operation.Operation object at 0x7fa736b2bcd0>>), last exception:
:3: RetryError
During handling of the above exception, another exception occurred:
capsys = <_pytest.capture.CaptureFixture object at 0x7fa736b23610>
dataset_name = 'projects/580378083368/locations/us-central1/datasets/1868888292242489344'
import_data_text_entity_extraction_sample.py:40: in import_data_text_entity_extraction_sample
import_data_response = response.result(timeout=timeout)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/future/polling.py:129: in result
self._blocking_poll(timeout=timeout, **kwargs)
self = <google.api_core.operation.Operation object at 0x7fa736b2bcd0>
timeout = 1800, retry = <google.api_core.retry.Retry object at 0x7fa73dd074d0>
def _blocking_poll(self, timeout=None, retry=DEFAULT_RETRY):
"""Poll and wait for the Future to be resolved.
Args:
timeout (int):
How long (in seconds) to wait for the operation to complete.
If None, wait indefinitely.
"""
if self._result_set:
return
retry_ = self._retry.with_deadline(timeout)
try:
kwargs = {} if retry is DEFAULT_RETRY else {"retry": retry}
retry_(self._done_or_raise)(**kwargs)
except exceptions.RetryError:
raise concurrent.futures.TimeoutError(
"Operation did not complete within the designated " "timeout."
)
E concurrent.futures._base.TimeoutError: Operation did not complete within the designated timeout.
I'm using the latest pip version where the default dependency resolver changed, and there are some issues installing google-cloud-aiplatform alongside apache-beam due to incompatible versions of mock.
(env) $ python --version
Python 3.8.5
(env) $ pip --version
pip 20.3.3 from /usr/local/google/home/dcavazos/src/sandbox/env/lib/python3.8/site-packages/pip (python 3.8)
(env) $ pip install apache-beam==2.27.0 google-cloud-aiplatform==0.4.0
# Installation fails after a very long time
I searched through the python-aiplatform repo and mock is only used for testing, so it could safely be part of tests_require instead of install_requires in the setup.py file.
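A hedged sketch of that change in setup.py (the argument values are illustrative, not the package's actual configuration):

from setuptools import setup

setup(
    name="google-cloud-aiplatform",
    # ... other arguments unchanged ...
    install_requires=[
        # runtime dependencies only; mock removed from here
        "google-api-core[grpc] >= 1.22.2",
    ],
    tests_require=[
        "mock >= 4.0.2",  # only needed by the test suite
    ],
)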
The mock dependency should also be a part of the tests_require in Apache Beam as well, but due to BEAM-8840 the setup_requires and tests_require sections were removed.
Using an older version of pip like 20.2.* throws an error/warning, but the installation still succeeds.
# This error/warning shows when using pip 20.2.4, both packages can still be installed.
ERROR: google-cloud-aiplatform 0.4.0 has requirement mock>=4.0.2, but you'll have mock 2.0.0 which is incompatible.
Starting with pip 20.3, the new dependency resolver cannot install both libraries together due to the mock versions incompatibility.
Environment details
OS type and version: Linux 5.7.17-1rodete4-amd64 #1 SMP Debian 5.7.17-1rodete4 (2020-10-01) x86_64
Python version: Python 3.8.5
pip version: pip 20.3.3 from /usr/local/google/home/dcavazos/src/sandbox/env/lib/python3.8/site-packages/pip (python 3.8)
google-cloud-aiplatform version: 0.4.0 but cannot be installed alongside apache-beam==2.27.0
Steps to reproduce
Update pip to the latest version.
pip install -U pip
Install apache-beam and google-cloud-aiplatform (it takes a really long time to resolve dependencies as well, but that's out of the scope for this).
In the setup.py file, create a new section called tests_require and move the mock dependency to it.
Workaround
In the meantime, the only workaround is to downgrade your pip version before installing your requirements, which is not always possible in some managed services.
I have a working model endpoint deployed on the AI Platform (Unified).
However, the response size is exceeding the maximum allowed by the gRPC max message setting.
How can I increase this limit of the tensorflow server running on the endpoint?
Environment details
OS type and version:
Python version: python --version
pip version: pip --version
google-cloud-aiplatform version: pip show google-cloud-aiplatform
Steps to reproduce
Deployed an object detection model trained from the tensorflow/models repository on the AI Platform (Unified)
Send a single base64 encoded image using the aiplatform prediction client.
InvalidArgument: 400 Failed to handle request. endpoint_id: 3414397020616523776, deployed_model_id: 7183628433748918272 with error: `Response size too large. Received at least 3288694 bytes; max is 2000000.`
truct_54__handle_cancellation_from_core.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132284:72: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_55__schedule_rpc_coro.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132290:65: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_56__handle_rpc.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132296:67: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_57__request_call.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132302:71: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_58__server_main_loop.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132308:59: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_59_start.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132314:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_60__start_shutting_down.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132320:62: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_61_shutdown.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132326:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_62_wait_for_termination.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'PyObject* __Pyx_decode_c_bytes(const char*, Py_ssize_t, Py_ssize_t, Py_ssize_t, const char*, const char*, PyObject* (*)(const char*, Py_ssize_t, const char*))':
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:136866:45: warning: 'PyObject* PyUnicode_FromUnicode(const Py_UNICODE*, Py_ssize_t)' is deprecated [-Wdeprecated-declarations]
return PyUnicode_FromUnicode(NULL, 0);
^
In file included from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/unicodeobject.h:1026:0,
from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/Python.h:97,
from bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:4:
bazel-out/host/bin/external/local_config_python/_python3/_python3_include/cpython/unicodeobject.h:551:42: note: declared here
Py_DEPRECATED(3.3) PyAPI_FUNC(PyObject*) PyUnicode_FromUnicode(
^~~~~~~~~~~~~~~~~~~~~
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'void __pyx_f_7_cython_6cygrpc__unified_socket_write(int)':
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:72692:3: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
(void)(write(__pyx_v_fd, ((char *)"1"), 1));
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: At global scope:
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:144607:1: warning: 'void __Pyx_PyAsyncGen_Fini()' defined but not used [-Wunused-function]
__Pyx_PyAsyncGen_Fini(void)
^~~~~~~~~~~~~~~~~~~~~
Target //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 3.271s, Critical Path: 2.64s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 790, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/root/.cache/synthtool/python-aiplatform/synth.py", line 31, in <module>
library = gapic.py_library(
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 45, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 182, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 27, in run
return subprocess.run(
File "/usr/local/lib/python3.9/subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py']' returned non-zero exit status 1.
2020-12-05 00:10:08,888 autosynth [ERROR] > Synthesis failed
2020-12-05 00:10:08,889 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 5362a4d chore: update sample test resouce names (#120)
2020-12-05 00:10:08,895 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-12-05 00:10:08,901 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/usr/local/lib/python3.9/subprocess.py", line 456, in check_returncode
raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f66280e58d0>
request = endpoint: "projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976"
instances {
struct_value... string_value: "The Chicago Bears is a great football team!"
}
}
}
}
parameters {
struct_value {
}
}
state = <grpc._channel._RPCState object at 0x7f66280e5490>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f66280e6be0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.FAILED_PRECONDITION
E details = "Endpoint projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split."
E debug_error_string = "{"created":"@1605785199.361124669","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"Endpoint projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split.","grpc_status":9}"
E >
predict_text_sentiment_analysis_sample.py:41: in predict_text_sentiment_analysis_sample
endpoint=endpoint, instances=instances, parameters=parameters
../../google/cloud/aiplatform_v1beta1/services/prediction_service/client.py:438: in predict
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.FAILED_PRECONDITION
details = "Endpoint projects...ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split.","grpc_status":9}"
???
E google.api_core.exceptions.FailedPrecondition: 400 Endpoint projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split.
I am using the Google AI Platform (Unified) Python client to export a trained model to a Google Cloud Storage bucket, following the sample code from export_model_sample.
The application is running with "owner" credentials at the moment, to rule out a permissions issue. However, when I try to execute the sample code I get the following error:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 923, in call
return _end_unary_response_blocking(state, call, False, None)
File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.FAILED_PRECONDITION
details = "Exporting artifact for model projects/101010101010/locations/us-central1/models/123123123123123 in format is not supported." debug_error_string = "{"created":"@1611864688.554145696","description":"Error received from peer ipv4:172.217.12.202:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Exporting artifact for model `projects/110101010101/locations/us-central1/models/123123123123123` in format is not supported.","grpc_status":9}"
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/app/main.py", line 667, in <module>
    response = aiplatform_model_client.export_model(name=name, output_config=output_config)
  File "/usr/local/lib/python3.8/site-packages/google/cloud/aiplatform_v1beta1/services/model_service/client.py", line 937, in export_model
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.FailedPrecondition: 400 Exporting artifact for model projects/111101010101/locations/us-central1/models/123123123123123123 in format `` is not supported.
(I have redacted the project ID and the model ID, using placeholders like 10101 and 123123.)
I have verified my inputs and everything seems correct.
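For context, a minimal sketch of how the export call and format might be specified with the aiplatform_v1beta1 GAPIC client. The empty-format error above usually means no export_format_id was set, or the chosen format is not supported by the model; the project, model ID, bucket, and the "tf-saved-model" format ID below are placeholder assumptions, not values taken from this report.
from google.cloud import aiplatform_v1beta1

# Placeholder values: adjust project, region, model ID, and bucket for your setup.
client_options = {"api_endpoint": "us-central1-aiplatform.googleapis.com"}
client = aiplatform_v1beta1.ModelServiceClient(client_options=client_options)
name = "projects/10101/locations/us-central1/models/123123"

# Inspect which export formats this model actually supports before exporting.
model = client.get_model(name=name)
print([f.id for f in model.supported_export_formats])

output_config = {
    "export_format_id": "tf-saved-model",  # assumption: pick an ID from the list above
    "artifact_destination": {"output_uri_prefix": "gs://my-bucket/exports/"},
}
operation = client.export_model(name=name, output_config=output_config)
print("export_model_response:", operation.result(timeout=1800))  # long-running operation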
I get the following error when running my Python prediction script, even though I have provided the JSON key file for Cloud authentication. The same script works without any problem on some other machines.
runfile('C:/bot/googlex.py', wdir='C:/bot')
collecting data
Traceback (most recent call last):
File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 58, in error_remapped_callable
return callable_(*args, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\grpc\_channel.py", line 923, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "C:\ProgramData\Anaconda3\lib\site-packages\grpc\_channel.py", line 826, in _end_unary_response_blocking
raise _InactiveRpcError(state)
_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "502:Bad Gateway"
debug_error_string = "{"created":"@1606950672.430000000","description":"Error received from peer ipv4:172.217.14.234:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"502:Bad Gateway","grpc_status":14}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\bot\googlex.py", line 278, in <module>
predict_tabular_regression_sample(
File "C:\bot\googlex.py", line 261, in predict_tabular_regression_sample
response = client.predict(
File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\client.py", line 438, in predict
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
File "c:\programdata\anaconda3\lib\site-packages\google\api_core\gapic_v1\method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 60, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
ServiceUnavailable: 503 502:Bad Gateway
runfile('C:/bot/googlex.py', wdir='C:/bot')
collecting data
Traceback (most recent call last):
File "C:\bot\googlex.py", line 278, in <module>
predict_tabular_regression_sample(
File "C:\bot\googlex.py", line 251, in predict_tabular_regression_sample
client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)
File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\client.py", line 327, in __init__
self._transport = Transport(
File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\transports\grpc.py", line 160, in __init__
self._grpc_channel = type(self).create_channel(
File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\transports\grpc.py", line 217, in create_channel
return grpc_helpers.create_channel(
File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 276, in create_channel
composite_credentials = _create_composite_credentials(
File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 223, in _create_composite_credentials
request = google.auth.transport.requests.Request()
AttributeError: module 'google.auth.transport' has no attribute 'requests'
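A hedged workaround sketch for the AttributeError above: google.auth.transport.requests is a submodule that is not imported automatically along with google.auth, so importing it explicitly before constructing the client (and keeping google-auth, google-api-core, and requests up to date) is one common way this failure has been avoided. The endpoint below is a placeholder.
import google.auth.transport.requests  # explicit submodule import avoids the AttributeError
from google.cloud import aiplatform

# Placeholder endpoint; match the region of your AI Platform (Unified) resources.
client_options = {"api_endpoint": "us-central1-aiplatform.googleapis.com"}
client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)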
state = <grpc._channel._RPCState object at 0x7f07efc0f890>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efd6e820>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907542.423832671","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_evaluation_video_classification_sample.py:33: in get_model_evaluation_video_classification_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.
The snippets section of this GitHub repository does not have a separate example for getting the export status. Because of this, I was expecting that the last line, print("export_model_response:", export_model_response), would print the status of the operation, the way the REST & command-line example does.
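A sketch of how the export status could be read, assuming export_model returns a standard google.api_core long-running operation as in the GAPIC client; the name, output_config, and timeout are placeholders carried over from the export sketch above.
operation = client.export_model(name=name, output_config=output_config)

print("operation name:", operation.operation.name)  # identifies the long-running operation
print("done:", operation.done())                    # non-blocking status check

export_model_response = operation.result(timeout=1800)  # blocks until the export completes
print("export_model_response:", export_model_response)
print("metadata:", operation.metadata)              # operation metadata, once available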
state = <grpc._channel._RPCState object at 0x7f07efb4d410>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efde3f00>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907540.586318659","description":"Error received from peer ipv4:142.250.107.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >
get_model_evaluation_sample.py:33: in get_model_evaluation_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.