Comments (4)
This issue is in RunnableWithMessageHistory.get_output_schema; I was able to work around it by defining my chain as follows:
from langchain_core.output_parsers.string import StrOutputParser
json_parser = StrOutputParser()
chain_with_parser = (
chain_with_history |
json_parser
)
then passing chain_with_parser to add_routes:
# Edit this to add the chain you want to add
add_routes(app, chain_with_parser)
Not sure exactly how to solve the issue inside RunnableWithMessageHistory itself, though.
from langserve.
I'm having this same error. My code is very similar:
# imports added for completeness
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.runnables import ConfigurableFieldSpec, RunnableWithMessageHistory
from langserve import add_routes

agent_executor = create_agent_executor(database=DATABASE)

def get_message_history(jwt):
    return RedisChatMessageHistory(url=REDIS_URL, ttl=600, session_id=jwt)

with_message_history = RunnableWithMessageHistory(
    agent_executor,
    get_message_history,
    input_messages_key="input",
    history_messages_key="history",
    history_factory_config=[
        ConfigurableFieldSpec(
            id="jwt",
            annotation=str,
            name="User JWT",
            description="Unique session identifier for the user.",
            default="",
            is_shared=True,
        ),
    ],
)

add_routes(
    app,
    with_message_history,
    per_req_config_modifier=_per_req_config_modifier,
    path="/agent",
)
I suspect something with pydantic versions. I'm using pydantic = "^2.8.2" in my pyproject.toml. I'm using an app generated with langchain-cli last week.
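Since the suspicion is a pydantic-v2 incompatibility, it can help to confirm which pydantic major version the running environment actually resolves. A stdlib-only sketch (langserve's own compatibility handling is not shown here):

```python
from importlib import metadata

def pydantic_major():
    """Return the installed pydantic major version, or None if not installed."""
    try:
        return int(metadata.version("pydantic").split(".")[0])
    except metadata.PackageNotFoundError:
        return None

print(pydantic_major())
```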
I'm trying to integrate RunnableWithMessageHistory with the chat-with-persistence playground (playground_type='chat'):
# imports added for completeness
from typing import List, Union
from fastapi import FastAPI
from pydantic import BaseModel, Field
from langchain_community.chat_models import ChatLlamaCpp
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langserve import add_routes

app = FastAPI()

llm_model_path = "model/llama-3-8b.gguf"
llm = ChatLlamaCpp(model_path=llm_model_path, n_gpu_layers=-1, n_ctx=512, chat_format='chatml')

# Declare a chain
# prompt = ChatPromptTemplate.from_messages(
#     [
#         ("system", "You are a helpful AI assistant. Please provide a concise list of course with university that suits users question."),
#         MessagesPlaceholder(variable_name="history"),
#         ("human", "{human_input}"),
#     ]
# )
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful AI assistant that recommends courses along with universities to the students. Please provide a concise list of course with university that suits users question."),
        MessagesPlaceholder(variable_name="messages"),
    ]
)

chain = prompt | llm | StrOutputParser()

def get_session_history():
    return InMemoryChatMessageHistory()

class InputChat(BaseModel):
    """Input for the chat endpoint."""
    messages: List[Union[HumanMessage, AIMessage, SystemMessage]] = Field(
        ...,
        description="The chat messages representing the current conversation.",
    )

chain_with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="chat_history",
).with_types(input_type=InputChat)

add_routes(
    app,
    chain_with_history,
    path="/chat",
    playground_type="chat",
)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
TypeError: issubclass() arg 1 must be a class
But RunnableWithMessageHistory is a class, right? Is there an option to make this work?
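The error usually doesn't mean the runnable itself isn't a class: under pydantic v2, schema generation can end up calling issubclass() on a typing construct rather than on a real class. The mechanism can be reproduced in isolation with the stdlib alone (nothing below is langserve code):

```python
import typing

# issubclass() demands a real class as its first argument. A typing
# construct such as List[str] -- which schema-walking code can hand it
# instead of a class -- triggers the same TypeError as in the traceback.
try:
    issubclass(typing.List[str], dict)
except TypeError as exc:
    print(exc)  # issubclass() arg 1 must be a class
```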
My configs are:
pydantic 2.8.2
pydantic-core 2.20.1
Yes, it's solved!
Still, the chat playground is only supported for chains that take one of the following as input:
- a dict with a single key containing a list of messages
- a dict with two keys: one a string input, one a list of messages
and which return either an AIMessage or a string.
You can test this chain in the default LangServe playground instead.
To use the default playground, set playground_type="default" when adding the route in your backend.
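The accepted input shapes can be restated as a small stdlib-only checker (a sketch of the rule quoted above, not langserve's actual validation code):

```python
def chat_playground_compatible(payload: dict) -> bool:
    """Sketch: True if a chain-input dict matches one of the two shapes
    the chat playground supports (a single message-list key, or one
    string key plus one message-list key)."""
    values = list(payload.values())
    if len(values) == 1:
        return isinstance(values[0], list)
    if len(values) == 2:
        return (any(isinstance(v, str) for v in values)
                and any(isinstance(v, list) for v in values))
    return False

# Shapes discussed in this thread:
print(chat_playground_compatible({"messages": []}))                # True
print(chat_playground_compatible({"input": "hi", "history": []}))  # True
print(chat_playground_compatible({"a": 1, "b": 2, "c": 3}))        # False
```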