A code sample that shows how to use 🦜️🔗langchain, FAISS, and a hosted LLM endpoint to do Q&A over the 500+ movies in the Kaggle Rotten Tomatoes Top Movies dataset.
Home Page: https://octoai-moviebot.streamlit.app/
moviebot's Issues
I'm seeing this deprecation warning: `LLMPredictor is deprecated, please use LLM instead.` It is thrown during:
```python
def create_service_context(llm_predictor, embeddings):
    """Create and return ServiceContext instance."""
    if "service_context" not in session:
        service_context = ServiceContext.from_defaults(
            llm_predictor=llm_predictor,
            chunk_size_limit=400,
            embed_model=embeddings,
        )
        session["service_context"] = service_context
    return session["service_context"]
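A possible fix, assuming a newer llama_index release where `ServiceContext.from_defaults` accepts the LLM directly: drop the `LLMPredictor` wrapper and pass the model via the `llm` argument (the `session` dict is taken to be the same caching dict used above; the parameter names here are a sketch, not the project's confirmed API usage):

```python
def create_service_context(llm, embeddings, session):
    """Create (or reuse) a cached ServiceContext, avoiding the
    deprecated LLMPredictor wrapper."""
    if "service_context" not in session:
        # Import here so the cache-hit path has no llama_index dependency;
        # assumes a llama_index version exposing ServiceContext.from_defaults.
        from llama_index import ServiceContext

        session["service_context"] = ServiceContext.from_defaults(
            llm=llm,              # pass the LLM directly instead of llm_predictor=
            chunk_size=400,       # newer releases renamed chunk_size_limit
            embed_model=embeddings,
        )
    return session["service_context"]
```

On a cache hit the function simply returns the stored instance, so repeated calls within one session do not rebuild the context.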