This project demonstrates a minimal setup for live streaming with LangChain, ChatGPT, and Next.js, sending real-time data from the backend to the frontend.
- Copy the files from the repository into your project (do not clone the repo; it is not stand-alone):
  https://github.com/sachio222/langchain-nextjs-streaming.git
- Install dependencies:

  ```bash
  cd your-project   # the directory you copied the files into
  npm install -S langchain
  ```
- Create a `.env.local` file in the root directory and add your OpenAI API key:

  ```
  OPENAI_API_KEY=YOUR_API_KEY
  ```
- Start the development server:

  ```bash
  npm run dev
  ```
- Navigate to http://localhost:3000 in your browser; `index.js` runs automatically.
- Click the "Start Stream" button to initiate the chatbot stream.
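Under the hood, the frontend reads the streamed response chunk by chunk as it arrives. Below is a minimal sketch (not the repo's exact code) of a client-side reader using only standard Web Streams APIs; the `/api/chat` endpoint name and the `setOutput` state setter in the usage comment are hypothetical, so check the copied files for the actual names.

```javascript
// Read a streamed fetch() response incrementally, invoking onToken for
// each decoded chunk and returning the full accumulated text at the end.
async function readStream(response, onToken) {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let full = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const chunk = decoder.decode(value, { stream: true });
    full += chunk;
    onToken(chunk); // e.g. append the chunk to React state for live rendering
  }
  return full;
}

// Hypothetical usage inside the "Start Stream" click handler:
// const res = await fetch("/api/chat", { method: "POST", body: JSON.stringify({ prompt }) });
// await readStream(res, (t) => setOutput((prev) => prev + t));
```

Decoding with `{ stream: true }` matters: it keeps multi-byte characters that straddle chunk boundaries from being garbled.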
- LangChain JS for LLM chains
- ChatGPT using the `gpt-3.5-turbo` model
- Next.js for server-side rendering and real-time updates
- Tailwind CSS for minimal styling
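On the server side, LangChain's streaming callbacks can forward each generated token straight into the HTTP response. The sketch below shows only the forwarding piece; it assumes LangChain's 2023 JS callback API (`handleLLMNewToken`), and the `pages/api/chat.js` route in the trailing comment is hypothetical, so verify the wiring against the copied files.

```javascript
// Returns a LangChain callback handler that writes each generated token
// into the Next.js API response as soon as it arrives (assumption: the
// route sends a plain-text response that is not buffered).
function tokenForwarder(res) {
  return {
    handleLLMNewToken(token) {
      res.write(token); // flush each token to the client immediately
    },
  };
}

// Hypothetical pages/api/chat.js using the forwarder:
// import { ChatOpenAI } from "langchain/chat_models/openai";
// import { HumanChatMessage } from "langchain/schema";
//
// export default async function handler(req, res) {
//   res.writeHead(200, { "Content-Type": "text/plain; charset=utf-8" });
//   const chat = new ChatOpenAI({
//     modelName: "gpt-3.5-turbo",
//     streaming: true,
//     callbacks: [tokenForwarder(res)],
//   });
//   await chat.call([new HumanChatMessage(req.body.prompt)]);
//   res.end();
// }
```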
Copyright (c) 2023 J. Krajewski, released under the MIT license. This project was inspired by LangChain's example code and by the lack of an end-to-end streaming example elsewhere.