This repository contains the code for our capstone project: a web application that integrates OpenAI's CLIP to generate labels for images.
Check out the TIP submodule for access to the TIP Python classes:
git submodule update --recursive --remote
Create and activate a virtual environment, then install the required packages:
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
Copy the environment template and load it:
cp .env.template .env
source .env
This sets the `REDIS_URL` environment variable to `redis://localhost:6379` by default. If you are using a different Redis instance, update the `.env` file accordingly.
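As a sketch of how the application might consume this variable (the exact lookup inside `app/` is an assumption, and the helper name is hypothetical), falling back to the template's default keeps local development working even when the variable is unset:

```python
# Sketch: read REDIS_URL with a fallback to the .env.template default.
# The helper name is hypothetical; the app's actual config code may differ.
import os

def get_redis_url():
    # Fall back to the local default when the variable is unset.
    return os.environ.get("REDIS_URL", "redis://localhost:6379")
```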
Start the server:
uvicorn app.main:app --reload
Interact with the application at http://localhost:8000/{path}
- `/clip/cache`: Generates the TIP cache used in inference. This is required before using the `/clip` endpoint.
- `/clip`: Generates a label for the image provided in the request body. This endpoint requires the `/clip/cache` endpoint to be called first and the received `cache_id` to be passed in the body.
- `/world`: A simple health check for the application. It returns a simple JSON response.
For detailed endpoint documentation, check out the documentation section. FastAPI also provides an interactive documentation page at http://localhost:8000/docs.
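From a client, the two-step `/clip` workflow above might look like the following sketch. The request field names (`cache_id`, `image`), the use of JSON POST bodies, and the helper names are all assumptions; consult the `/docs` page for the actual schemas:

```python
# Hypothetical client for the two-step /clip workflow. Field names and
# HTTP methods are assumptions; check http://localhost:8000/docs for the
# real request/response schemas.
import json
import urllib.request

BASE_URL = "http://localhost:8000"

def build_label_request(cache_id, image_b64):
    # Body for /clip: the cache_id received from /clip/cache plus the image.
    return {"cache_id": cache_id, "image": image_b64}

def post_json(path, body):
    # POST a JSON body and decode the JSON response.
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # 1. Build the TIP cache, then 2. request a label with the received cache_id.
    cache = post_json("/clip/cache", {})
    print(post_json("/clip", build_label_request(cache["cache_id"], "<base64-image>")))
```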
Pre-built `linux/amd64` images for this repository are published to our public Docker Hub repository.
A test suite is set up using Pytest. To run the tests, run the following command:
pytest
Ruff is set up to lint the code. To run the linter and auto-fix issues, run the following command:
ruff check app --fix