
InnerEye-Inference

This project is now archived

This project is no longer under active maintenance. It is read-only, but you can still clone or fork the repo. Please contact [email protected] if you run into trouble with the archived state of the repo.

Introduction

InnerEye-Inference is a Python App Service web app that runs inference on medical imaging models trained with the InnerEye-DeepLearning toolkit.

You can also integrate this with DICOM using the InnerEye-Gateway.

Getting Started

Operating System

If developing or using this tool locally, we highly recommend using Ubuntu 20.04 as your operating system, since the Azure App Service base image is Ubuntu. Developing locally on Ubuntu maximizes repeatability between local and cloud behaviour.

For Windows users this is easily done through the Windows Subsystem for Linux (WSL).

Installing Conda or Miniconda

Download a Conda or Miniconda installer for your platform and run it.

Creating a Conda environment

Note that in order to create the Conda environment you will need build tools installed on your machine. On Windows, they should already be installed with the Conda distribution.

You can install build tools on Ubuntu (and Debian-based distributions) by running:

sudo apt-get install build-essential

If you are running CentOS/RHEL distributions, you can install the build tools by running:

yum install gcc gcc-c++ kernel-devel make

Start the conda prompt for your platform. In that prompt, navigate to your repository root and run:

conda env create --file environment.yml
conda activate inference

Configuration

Create a script named set_environment.sh to set your environment variables; it can be sourced on Linux. The app will read this file if the environment variables are not already set.

#!/bin/bash
export CUSTOMCONNSTR_AZUREML_SERVICE_PRINCIPAL_SECRET=
export CUSTOMCONNSTR_API_AUTH_SECRET=
export CLUSTER=
export WORKSPACE_NAME=
export EXPERIMENT_NAME=
export RESOURCE_GROUP=
export SUBSCRIPTION_ID=
export APPLICATION_ID=
export TENANT_ID=
export DATASTORE_NAME=
export IMAGE_DATA_FOLDER=

Run it with: source set_environment.sh
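The fallback described above (reading the file when the environment variables are absent) can be sketched in Python. This is an illustrative helper, not the repo's actual implementation:

```python
import os

def load_env_file(path="set_environment.sh"):
    """Read `export KEY=VALUE` lines from a shell script and set any
    variables that are not already present in the environment."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line.startswith("export "):
                continue  # skip the shebang, comments, and blank lines
            key, _, value = line[len("export "):].partition("=")
            # Only fill in variables the environment does not already define.
            os.environ.setdefault(key, value)
```

Using os.environ.setdefault means real environment variables always win over the values in the file.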

Running flask app locally

  • Run flask run to test the app locally

Testing flask app locally

The app can be tested locally using curl.

Ping

To check that the server is running, issue this command from a local shell:

curl -i -H "API_AUTH_SECRET: <val of CUSTOMCONNSTR_API_AUTH_SECRET>" http://localhost:5000/v1/ping

This should produce an output similar to:

HTTP/1.0 200 OK
Content-Type: text/html; charset=utf-8
Content-Length: 0
Server: Werkzeug/1.0.1 Python/3.7.3
Date: Wed, 18 Aug 2021 11:50:20 GMT

Start

To test DICOM image segmentation of a file, first create Tests/TestData/HN.zip containing a zipped set of the test DICOM files in Tests/TestData/HN. Then assuming there is a model PassThroughModel:4, issue this command:

curl -i \
    -X POST \
    -H "API_AUTH_SECRET: <val of CUSTOMCONNSTR_API_AUTH_SECRET>" \
    --data-binary @Tests/TestData/HN.zip \
    http://localhost:5000/v1/model/start/PassThroughModel:4

This should produce an output similar to:

HTTP/1.0 201 CREATED
Content-Type: text/plain
Content-Length: 33
Server: Werkzeug/1.0.1 Python/3.7.3
Date: Wed, 18 Aug 2021 13:00:13 GMT

api_inference_1629291609_fb5dfdf9

Here api_inference_1629291609_fb5dfdf9 is the run id of the newly submitted inference job.
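The Tests/TestData/HN.zip payload used above can also be built programmatically. A minimal sketch using Python's zipfile module; make_test_zip is a hypothetical helper, and the paths follow the text:

```python
import zipfile
from pathlib import Path

def make_test_zip(src_dir, out_path):
    """Zip every file under src_dir (e.g. Tests/TestData/HN) into out_path,
    storing paths relative to src_dir."""
    src = Path(src_dir)
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for file in sorted(src.rglob("*")):
            if file.is_file():
                zf.write(file, file.relative_to(src))

# Example: make_test_zip("Tests/TestData/HN", "Tests/TestData/HN.zip")
```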

Results

To monitor the progress of the previously submitted inference job, issue this command:

curl -i \
    -H "API_AUTH_SECRET: <val of CUSTOMCONNSTR_API_AUTH_SECRET>" \
    --head \
    http://localhost:5000/v1/model/results/api_inference_1629291609_fb5dfdf9 \
    --next \
    -H "API_AUTH_SECRET: <val of CUSTOMCONNSTR_API_AUTH_SECRET>" \
    --output "HN_rt.zip" \
    http://localhost:5000/v1/model/results/api_inference_1629291609_fb5dfdf9

If the run is still in progress then this should produce output similar to:

HTTP/1.0 202 ACCEPTED
Content-Type: text/html; charset=utf-8
Content-Length: 0
Server: Werkzeug/1.0.1 Python/3.7.3
Date: Wed, 18 Aug 2021 13:45:20 GMT

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0

If the run is complete then this should produce an output similar to:

HTTP/1.0 200 OK
Content-Type: application/zip
Content-Length: 131202
Server: Werkzeug/1.0.1 Python/3.7.3
Date: Wed, 18 Aug 2021 14:01:27 GMT

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  128k  100  128k    0     0   150k      0 --:--:-- --:--:-- --:--:--  150k

and download the inference result as a zipped DICOM-RT file to HN_rt.zip.
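The curl call above makes two requests: a status check followed by a download. The same logic can be sketched in Python, keyed on the documented status codes (202 in progress, 200 complete). classify_result and download_result below are hypothetical helpers, not part of the repo:

```python
import urllib.request

def classify_result(status_code):
    """Map the documented status codes to a job state."""
    if status_code == 202:
        return "in progress"
    if status_code == 200:
        return "complete"
    return "error"

def download_result(base_url, run_id, auth_secret, out_path="HN_rt.zip"):
    """Fetch the result for run_id; save the zipped DICOM-RT file if ready."""
    req = urllib.request.Request(
        f"{base_url}/v1/model/results/{run_id}",
        headers={"API_AUTH_SECRET": auth_secret},
    )
    with urllib.request.urlopen(req) as resp:
        state = classify_result(resp.status)
        if state == "complete":
            with open(out_path, "wb") as f:
                f.write(resp.read())
        return state
```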

Running flask app in Azure

  1. Install Azure CLI: curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
  2. Login: az login --use-device-code
  3. Deploy: az webapp up --sku S1 --name test-python12345 --subscription <your_subscription_name> -g InnerEyeInference --location <your region> --runtime PYTHON:3.7
  4. In the Azure portal go to Monitoring > Log Stream for debugging logs

Deployment build

If you would like to reproduce the automatic deployment of the service for testing purposes:

  • az ad sp create-for-rbac --name "<name>" --role contributor --scope /subscriptions/<subs>/resourceGroups/InnerEyeInference --sdk-auth
  • The previous command returns a JSON object; store its contents in the secrets.AZURE_CREDENTIALS variable used by .github/workflows/deploy.yml
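With the --sdk-auth flag, the command emits the credentials as a single JSON object, roughly of this shape (values redacted; the exact set of keys may vary with the CLI version):

```json
{
  "clientId": "<application id>",
  "clientSecret": "<service principal secret>",
  "subscriptionId": "<subscription id>",
  "tenantId": "<tenant id>"
}
```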

Deploying Behind a WAF

If you would like to deploy your Azure App Service behind a Web Application Firewall (WAF) then please see this documentation.

Images

During inference the image data zip file is copied to the IMAGE_DATA_FOLDER in the AzureML workspace's DATASTORE_NAME datastore. At the end of inference the copied image data zip file is overwritten with a simple line of text. At present we cannot delete these. If you would like these overwritten files removed from your datastore you can add a policy to delete items from the datastore after a period of time. We recommend 7 days.
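Such a time-based deletion policy can be set up with Azure Storage lifecycle management. A sketch of one possible policy (the rule name and prefix are placeholders; point the prefix at the container and IMAGE_DATA_FOLDER you use):

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "delete-inference-image-data",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 7 } }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "<container-name>/<IMAGE_DATA_FOLDER>" ]
        }
      }
    }
  ]
}
```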

Changing Dependencies

The Azure App Service will use the packages specified in requirements.txt to create the python virtual environment in which the flask app is run. The environment.yml is used for local environments only. Therefore if you want to change the packages your app service has access to, you must update requirements.txt.

Help and Bug Reporting

  1. Guidelines for how to report bugs.

Licensing

MIT License

You are responsible for the performance and any necessary testing or regulatory clearances for any models generated.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit the Microsoft CLA site.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Disclaimer

The InnerEye-DeepLearning toolkit, InnerEye-Gateway and InnerEye-Inference (collectively the “Research Tools”) are provided AS-IS for use by third parties for the purposes of research, experimental design and testing of machine learning models. The Research Tools are not intended or made available for clinical use as a medical device, clinical support, diagnostic tool, or other technology intended to be used in the diagnosis, cure, mitigation, treatment, or prevention of disease or other conditions. The Research Tools are not designed or intended to be a substitute for professional medical advice, diagnosis, treatment, or judgment and should not be used as such. All users are responsible for reviewing the output of the developed model to determine whether the model meets the user’s needs and for validating and evaluating the model before any clinical use. Microsoft does not warrant that the Research Tools or any materials provided in connection therewith will be sufficient for any medical purposes or meet the health or medical requirements of any person.

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct.

InnerEye-Inference Issues

Env locking breaking app service still

The environment locking allows the web app to be deployed, however it does not run successfully and will return 404 responses to curl requests as specified in the README. The locking script needs to be updated in order to place the pip requirements into the requirements.txt file if the Inference service is to run successfully.

Inference Service Returns 404 for Azure Health Checks

When running the inference service behind an Azure Application Gateway, health checks are needed to ensure that a healthy connection is being maintained. However, the health checks default to the root path, /, which on a healthy inference service returns a 404 as there is nothing configured in app.py to return anything. It would be useful to have this path instead return a 200 response to show that the inference service is alive and running.
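One possible fix, sketched below, is to register a root route in app.py that answers health probes with a 200 (this is a suggestion, not code from the repo):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def health_check():
    # Return 200 with an empty body so gateway health probes succeed.
    return "", 200
```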

Deployment Failure

Unable to deploy code to an app and service principal pair due to conflicting module version requirements. The tail of the error message is shown below:

ERROR: Cannot install -r requirements.txt (line 12), -r requirements.txt (line 9) and azure-core==1.22.1 because these package versions have conflicting dependencies.
[10:22:51+0000] INFO: pip is looking at multiple versions of absl-py to determine which version is compatible with other requirements. This could take a while.
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
[10:22:51+0000]
[10:22:51+0000] The conflict is caused by:
[10:22:51+0000] The user requested azure-core==1.22.1
[10:22:51+0000] azure-identity 1.7.0 depends on azure-core<2.0.0 and >=1.11.0
[10:22:51+0000] azure-mgmt-core 1.3.1 depends on azure-core<2.0.0 and >=1.23.0
[10:22:51+0000]
[10:22:51+0000] To fix this you could try to:
[10:22:51+0000] 1. loosen the range of package versions you've specified
[10:22:51+0000] 2. remove package versions to allow pip attempt to solve the dependency conflict
[10:22:51+0000]
WARNING: You are using pip version 22.0.4; however, version 22.3.1 is available.
You should consider upgrading via the '/tmp/8dadc2ab9883def/antenv/bin/python -m pip install --upgrade pip' command. ", 'Timestamp': '2022-12-12 10:22:52', 'ExitCode': '1', 'Recommendation': 'Please review your requirements.txt', 'UrlToGetMoreInformation': 'https://aka.ms/troubleshoot-python'}], 'warnings': []}}. Please run the command az webapp log deployment show -n inf-app-osairis-net -g rg-osairis-net

Move docs to Readthedocs

Documentation is a little spread out - could do with being hosted on Readthedocs to unify things a little more.

Add quick test documentation

Add a brief explanation of how to run a quick test that the server is working.

e.g.

curl -i -H "API_AUTH_SECRET: <val of CUSTOMCONNSTR_API_AUTH_SECRET>" http://localhost:5000/v1/ping

Documentation improvements

The following improvements should be added to the inference documentation:

  • Add instructions on how to run pytest tests
  • Clarify where logs can be found in a deployed App Service (Monitoring -> Log Stream)
  • Add explanation and instructions for the WAF setup

Problems with running the app service

I updated the InnerEye-Inference project to the newest version and now it doesn't run at all. I'm using the same commands that worked before, which are mentioned in README.md:

az login --use-device-code
az webapp up --sku S1 --name inferenceapp --subscription <subscription_name> -g InnerEyeInference --location <location>

Which gives me the output:
Could not auto-detect the runtime stack of your app.
HINT: Are you in the right folder?
For more information, see 'https://go.microsoft.com/fwlink/?linkid=2109470'

I created set_environment.sh with all variables set with correct data (they have worked the whole time for inference).

Do you have any idea what could be the problem here and how we could solve it? (for example, should we define any additional path in the configs?)

Add Conda Environment Locking

To improve reproducibility and stability, the repository needs environment locking in the same way as the InnerEye-DeepLearning repo.

Add daily build trigger

Make sure that the action is rerun daily to pick up any possible problems with 3rd party packages changing.
