In this tutorial, we'll see how to use the LlamaIndex Instrumentation module to send intermediate steps of a RAG pipeline to the frontend for an intuitive user experience.
We use Server-Sent Events (SSE), which are received by the Vercel AI SDK on the frontend.
First clone the repo:
$ git clone https://github.com/rsrohan99/rag-stream-intermediate-events-tutorial.git
$ cd rag-stream-intermediate-events-tutorial
cd into the backend directory:

$ cd backend
Set the OPENAI_API_KEY environment variable:

OPENAI_API_KEY=****
Install the dependencies, generate the index, and start the backend server:

$ poetry install
$ poetry run python app/engine/generate.py
$ poetry run python main.py
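Conceptually, the backend forwards intermediate pipeline steps by having an instrumentation event handler push events onto a queue, which the SSE endpoint drains while the query runs. A stdlib-only sketch of that producer/consumer pattern (the function names and the stand-in query are illustrative, not the repo's actual code):

```python
import asyncio
import json

async def stream_with_events() -> list[str]:
    """Drain intermediate events from a queue while a query runs,
    formatting each one as an SSE `data:` frame."""
    queue: asyncio.Queue = asyncio.Queue()

    async def fake_rag_query() -> None:
        # Stand-in for the real pipeline: in the tutorial's app, the
        # event handler puts events here as retrieval/synthesis fire.
        for step in ("retrieve", "synthesize"):
            await queue.put({"type": step})
        await queue.put(None)  # sentinel: query finished

    frames = []
    task = asyncio.create_task(fake_rag_query())
    while (event := await queue.get()) is not None:
        frames.append(f"data: {json.dumps(event)}\n\n")
    await task
    return frames

frames = asyncio.run(stream_with_events())
```

In the real app the frames would be yielded from a streaming response rather than collected into a list; the queue decouples the pipeline's event callbacks from the HTTP stream.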
cd into the frontend directory:
$ cd frontend
Install the frontend dependencies and start the dev server:

$ bun i
$ bun run dev