sachua / mlflow-docker-compose
MLflow deployment with 1 command
License: Apache License 2.0
Hello,
First of all, I want to say that this is a great repo. Thank you for making it so easy to use.
I wanted to mention that
mlflow models serve -m s3://mlflow/0/98bdf6ec158145908af39f86156c347f/artifacts/model -p 1234
didn't work for me until I added --env-manager conda.
Also, I had to downgrade Python in the new environment created by
mlflow run https://github.com/mlflow/mlflow-example.git -P alpha=0.42
Lastly, I couldn't get the curl request to work. It says the scoring protocol has changed, but I can't figure out what it changed to, and I couldn't find a Swagger spec or anything similar, so I'm not sure what to send for the example.
Thank you again for this. Great work!
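The curl failure is likely the MLflow 2.x scoring-protocol change: the server still listens on /invocations, but the JSON body must now be wrapped in one of the documented keys (dataframe_split, dataframe_records, instances, or inputs). A minimal sketch — the column name and value here are illustrative, not taken from the example model:

```shell
# MLflow >= 2.0 rejects bare pandas-style payloads; wrap the data in
# one of: dataframe_split, dataframe_records, instances, inputs.
PAYLOAD='{"dataframe_split": {"columns": ["fixed acidity"], "data": [[7.0]]}}'
echo "$PAYLOAD"
# Then POST it to the serving endpoint started above:
# curl -X POST http://127.0.0.1:1234/invocations \
#   -H 'Content-Type: application/json' \
#   -d "$PAYLOAD"
```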
In order to start MySQL, the "dbdata" volume needs to be mounted as ./dbdata (a bind mount relative to the compose file), not just dbdata (a named volume).
The leading ./ is important.
If not, we get this error:
mlflow_server | 2023/01/16 12:12:33 WARNING mlflow.store.db.utils: SQLAlchemy engine could not be created. The following exception is caught.
mlflow_server | (pymysql.err.OperationalError) (1045, "Access denied for user 'mlflow_user'@'172.30.0.5' (using password: YES)")
A working setting:

db:
  restart: always
  image: mysql/mysql-server@sha256:fcbe88694872e88ae406bc69540211505eae922a182690d85be6af1a48e5ca0a
  container_name: mlflow_db
  ports:
    - "3306:3306"
  environment:
    - MYSQL_DATABASE=${MYSQL_DATABASE}
    - MYSQL_USER=${MYSQL_USER}
    - MYSQL_PASSWORD=${MYSQL_PASSWORD}
    - MYSQL_ROOT_PASSWORD=${MYSQL_ROOT_PASSWORD}
  volumes:
    - ./dbdata:/var/lib/mysql
There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.
Location: renovate.json
Error type: Invalid JSON (parsing failed)
Message: Syntax error near }, ] }
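A syntax error "near }, ] }" in renovate.json is usually a trailing comma after the last array or object element; renovate.json must be strict JSON. A hypothetical fix (the "extends" value is illustrative) — deleting the comma after the final element makes it parse:

```json
{
  "extends": ["config:base"]
}
```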
What happened with the last few commits, exactly? I noticed you updated an image to address "command not found", but I think in doing so all the data in my volume was wiped. I lost over an hour trying to recover it and eventually couldn't (and noticed the disk shrank), so I had to restore it from a daily snapshot (thank goodness for that).
Is this an issue with my filesystem/Docker somehow, or are there commands in the docker-compose file that could have removed the data?
I'm noticing the files were in docker/overlay2/l/
instead of docker/volumes/
... that makes me suspect that my previous deployment was never writing to the minio_data
volume at all, but rather to the overlay filesystem inside the container, which is why creating a new container made the files disappear. That would explain why upgrading made it look like all my files disappeared: the container changed.
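The distinction above can be checked from the paths alone: files under Docker's volumes/ directory persist across container recreation, while files under overlay2/ belong to a container's writable layer and disappear with it. A small sketch (the path is illustrative, not from this deployment):

```shell
# Classify a path under /var/lib/docker: named volumes persist,
# overlay2 layers are deleted when the container is recreated.
path="/var/lib/docker/overlay2/l/abc123/minio-data"
case "$path" in
  */docker/volumes/*)  location="named volume (persists)" ;;
  */docker/overlay2/*) location="container layer (ephemeral)" ;;
  *)                   location="unknown" ;;
esac
echo "$location"
# For a live container, the authoritative check is:
#   docker inspect <container> --format '{{ json .Mounts }}'
# which shows whether each mount is a "volume" or a "bind".
```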
This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

docker-compose.yml
- minio/minio sha256:7b53a7bc5ca0b19350c5b852ffc399f7ad3be5020abd0303617a69423f2d926b
- minio/mc sha256:92b495576197e48e9c7f864aac6aaa64b7eddf17db4d2e7c305bd69ca526848e
- mysql/mysql-server sha256:5b40d96b11333570143d98d3a74100fefadb9abb17b27a95dbc9ad33544ec142

mlflow/Dockerfile
- python 3.10-slim-buster

mlflow/requirements.txt
- cryptography ==42.0.7
- boto3 ==1.34.117
- mlflow ==2.13.1
- pymysql ==1.1.1
Hi, a Jupyter notebook fails on saving artefacts, but the same script run from the terminal works fine. The error says: An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The AWS Access Key Id you provided does not exist in our records. Do you know how to fix this, please?
Thanks
Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/boto3/s3/transfer.py", line 292, in upload_file
future.result()
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/s3transfer/futures.py", line 103, in result
return self._coordinator.result()
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/s3transfer/futures.py", line 266, in result
raise self._exception
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/s3transfer/tasks.py", line 139, in __call__
return self._execute_main(kwargs)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/s3transfer/tasks.py", line 162, in _execute_main
return_value = self._main(**kwargs)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/s3transfer/upload.py", line 758, in _main
client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/botocore/client.py", line 530, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/botocore/client.py", line 964, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The Access Key Id you provided does not exist in our records.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "train.py", line 70, in <module>
mlflow.sklearn.log_model(lr, "model")
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/sklearn/__init__.py", line 436, in log_model
return Model.log(
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/models/model.py", line 563, in log
mlflow.tracking.fluent.log_artifacts(local_path, mlflow_model.artifact_path)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/tracking/fluent.py", line 903, in log_artifacts
MlflowClient().log_artifacts(run_id, local_dir, artifact_path)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/tracking/client.py", line 1137, in log_artifacts
self._tracking_client.log_artifacts(run_id, local_dir, artifact_path)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/tracking/_tracking_service/client.py", line 465, in log_artifacts
self._get_artifact_repo(run_id).log_artifacts(local_dir, artifact_path)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/store/artifact/s3_artifact_repo.py", line 170, in log_artifacts
self._upload_file(
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/mlflow/store/artifact/s3_artifact_repo.py", line 146, in _upload_file
s3_client.upload_file(Filename=local_file, Bucket=bucket, Key=key, ExtraArgs=extra_args)
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/boto3/s3/inject.py", line 143, in upload_file
return transfer.upload_file(
File "/opt/homebrew/Caskroom/miniforge/base/envs/mlflow/lib/python3.8/site-packages/boto3/s3/transfer.py", line 298, in upload_file
raise S3UploadFailedError(
boto3.exceptions.S3UploadFailedError: Failed to upload /var/folders/c0/z_mclgbd4xv322t8wdpyptz80000gp/T/tmpkg7o2um3/model/python_env.yaml to mlflow/1/6e54103d2d9849669c5eaa88033390ab/artifacts/model/python_env.yaml: An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The Access Key Id you provided does not exist in our records.
Can we please get the latest versions of minio/mysql in the docker-compose file? Thanks.
Try creating the credentials file in ~/.aws (make the directory first if it doesn't exist):
mkdir -p ~/.aws
cat > ~/.aws/credentials <<EOF
[default]
aws_access_key_id=minio
aws_secret_access_key=minio123
EOF
you can see where boto3 tries to find the credentials here: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
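If the notebook still can't see the credentials, note that a Jupyter kernel launched from a GUI often doesn't inherit the shell environment. An alternative sketch is to export the same credentials as environment variables before starting Jupyter — the endpoint URL and port here assume this repo's default MinIO setup:

```shell
# Assumption: MinIO is reachable on localhost:9000 with the
# minio/minio123 credentials from the answer above. Export these in
# the shell that launches Jupyter so the kernel inherits them.
export AWS_ACCESS_KEY_ID=minio
export AWS_SECRET_ACCESS_KEY=minio123
export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
echo "$AWS_ACCESS_KEY_ID"
```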
Originally posted by @sachua in #5 (comment)
Hi SongAnn,
Thank you for the wonderful mlflow-docker-compose repo. It is exactly what I am looking for.
Question:
Looking forward to your reply.
Regards,
James