dataelement / bisheng
Bisheng is an open LLM DevOps platform for next-generation AI applications.
Home Page: https://bisheng.dataelem.com/
License: Apache License 2.0
curl -X POST https://bisheng.dataelem.com/api/v1/process/e3b6523a-c761-4306-8ec7-3f8f4dd519af -H 'Content-Type: application/json' -d '{"inputs": {"input": "message"}, "tweaks": {
"ConversationChain-A1J5d": {},
"ProxyChatLLM-JHRd9": {}
}}'
Internal Server Error
How should the Milvus connection_args be written?
I can't save it.
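For reference, a minimal connection_args sketch in the same shape as the vectorstores section of the default config (host and port are placeholders for your own Milvus instance; note that in YAML the secure flag should be lowercase false, not Python's False):

```yaml
vectorstores:
  Milvus:
    connection_args: {"host": "127.0.0.1", "port": "19530", "user": "", "password": "", "secure": false}
```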
When creating a knowledge base, the code only allows choosing OpenAI:
def decide_embeddings(model: str) -> Embeddings:
    model_list = settings.get_knowledge().get('embeddings')
    if model == 'text-embedding-ada-002':
        return OpenAIEmbeddings(**model_list.get(model))
    else:
        return HostEmbeddings(**model_list.get(model))
I want to switch to a different embedding model; do I need to implement it myself?
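One possible direction: a registry pattern would let decide_embeddings dispatch to arbitrary backends instead of hard-coding one model name. A self-contained sketch (the factories below are stand-in placeholders, not Bisheng's real OpenAIEmbeddings/HostEmbeddings classes):

```python
from typing import Callable, Dict

EMBEDDING_REGISTRY: Dict[str, Callable] = {}

def register_embedding(name: str):
    """Register a factory under a model name."""
    def wrap(factory: Callable) -> Callable:
        EMBEDDING_REGISTRY[name] = factory
        return factory
    return wrap

@register_embedding("text-embedding-ada-002")
def make_openai(**kwargs):
    return ("OpenAIEmbeddings", kwargs)    # stand-in for OpenAIEmbeddings(**kwargs)

@register_embedding("my-custom-model")
def make_custom(**kwargs):
    return ("MyCustomEmbeddings", kwargs)  # stand-in for a user-provided backend

def decide_embeddings(model: str, **kwargs):
    # Dispatch by registry lookup instead of a hard-coded if/else on one model name.
    factory = EMBEDDING_REGISTRY.get(model)
    if factory is None:
        raise KeyError(f"no embedding backend registered for {model!r}")
    return factory(**kwargs)

print(decide_embeddings("my-custom-model")[0])  # MyCustomEmbeddings
```

New backends then only need a registration call, with no change to the dispatch function.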
(It seems issues cannot be filed against bisheng-rt, so I am posting one here.)
Please let the model-serving feature support .safetensors.
We currently have the following questions:
1. We have some custom embedding models, but their interfaces differ from the expected definition, so they cannot be plugged in directly. Will an open, customizable interface be supported?
2. Can the vector store use knowledge bases that already exist in other applications, i.e. ones not created inside Bisheng?
3. After saving a newly created skill, it cannot be opened directly; the Open button only appears after refreshing the page.
After a knowledge-base skill is configured, can different sessions dynamically select and switch between knowledge bases?
As the following video recording shows, the ReactFlow display gets mixed up when switching between flowIds in some cases; you have to open it twice to get the right flow displayed.
Also, can the AutoPlanningV1 model that is currently under development be deployed locally?
There is no Contributors section in the README file.
As we know, contributions are what make the open-source community such an amazing place to learn, inspire, and create.
The Contributors section in a README.md file is important as it acknowledges and gives credit to those who have contributed to a project, fosters community and collaboration, adds transparency and accountability, and helps document the project's history for current and future maintainers. It also serves as a form of recognition, motivating contributors to continue their efforts.
Uploading a PDF with a Chinese file name fails. The frontend shows no prompt and no reaction; the docker-backend-1 log shows:
2023-09-22 11:36:23 pymysql.err.DataError: (1366, "Incorrect string value: '\\xE7\\x88\\xB6\\xE8\\xBE\\x88...' for column 'file_name' at row 1")
...
2023-09-22 11:36:23 File "/usr/local/lib/python3.10/site-packages/pymysql/err.py", line 107, in raise_mysql_exception
2023-09-22 11:36:23 raise errorclass(errno, errval)
2023-09-22 11:36:23 sqlalchemy.exc.DataError: (pymysql.err.DataError) (1366, "Incorrect string value: '\\xE7\\x88\\xB6\\xE8\\xBE\\x88...' for column 'file_name' at row 1")
2023-09-22 11:36:23 [SQL: INSERT INTO knowledgefile (user_id, knowledge_id, file_name, md5, status, object_name) VALUES (%(user_id)s, %(knowledge_id)s, %(file_name)s, %(md5)s, %(status)s, %(object_name)s)]
2023-09-22 11:36:23 [parameters: {'user_id': 1, 'knowledge_id': 1, 'file_name': 'baike.baidu.com-父辈的荣耀2023年康洪雷刘翰轩执导的电视剧_百度百科(1).pdf', 'md5': '81fae05d46dde3653a5f91b18672c9f89043cb17edcf79bc8d23aff7616a44b9', 'status': 1, 'object_name': None}]
2023-09-22 11:36:23 (Background on this error at: https://sqlalche.me/e/14/9h9h)
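Error 1366 means the file_name column's character set cannot store the multi-byte Chinese file name (likely a latin1 column). If you have direct access to the bisheng MySQL database, one possible fix (table and column names taken from the INSERT statement in the log) is to convert the column to utf8mb4:

```sql
ALTER TABLE knowledgefile
    MODIFY file_name VARCHAR(255)
    CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
```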
I deployed it successfully myself, but cannot open a new conversation: as soon as I create a new session it errors out. The browser console shows the error is WebSocket (ws) related.
Environment: BT Panel (宝塔) + Docker
Update after testing: this happens when HTTPS is not configured; once HTTPS is configured it works.
Also, I hope the local agent models from XAgent can be supported.
Running the backend privately from source, with the vector database using the repository's default configuration:
vectorstores:
  # Milvus requires at least 4 CPU cores / 8 GB RAM; 4C 16G recommended
  Milvus: # to switch to another vectordb, make sure the other service is already running, then fill in its parameters
    connection_args: { "host": "110.16.193.170", "port": "50032", "user": "", "password": "", "secure": False }
Milvus is also deployed locally with Docker, but the backend project does not pick up this configuration.
Now, when configuring document Q&A, the Milvus component fails to connect to the vector database; the address it tries is 192.168.106.116:19530.
Where is that address configured, which code path reads it, and how should it be changed to make this work?
For example, when adding LLM- or embedding-related components, could the backend load a configuration at startup and use it as the default?
Then new components would not need each of these fields filled in one by one.
Only the few that need customization would be customized.
At least in our current usage, having to configure every single one is quite tedious.
Hello, I deployed the bisheng-rt backend. Accessing http://server_ip:9001/ returns HTTP 400, while telnet shows port 9001 is reachable. Could this be a problem with the request path?
Startup command:
docker run --gpus=all -p 9001:9001 -p 9002:9002 -itd --workdir /opt/bisheng-rt --shm-size=10G --name bisheng-rt -v /home/admin/docker-content/bisheng/models:/opt/bisheng-rt/models/model_repository dataelement/bisheng-rt:0.0.1 ./bin/rtserver f
Docker log:
=============================
== Triton Inference Server ==
=============================
NVIDIA Release 22.08 (build )
Triton Server Version 2.25.0
Copyright (c) 2018-2022, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES. All rights reserved.
This container image and its contents are governed by the NVIDIA Deep Learning Container License.
By pulling and using the container, you accept the terms and conditions of this license:
https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license
I0914 08:11:09.141870 1 main.cc:2300] bisheng runtime server v0.0.1 is starting...
2023-09-14T08:11:09Z I 1 pinned_memory_manager.cc:240] Pinned memory pool is created at '0x7f7970000000' with size 268435456
2023-09-14T08:11:09Z I 1 cuda_memory_manager.cc:105] CUDA memory pool is created on device 0 with size 67108864
2023-09-14T08:11:09Z I 1 cuda_memory_manager.cc:105] CUDA memory pool is created on device 1 with size 67108864
2023-09-14T08:11:09Z I 1 cuda_memory_manager.cc:105] CUDA memory pool is created on device 2 with size 67108864
2023-09-14T08:11:09Z I 1 cuda_memory_manager.cc:105] CUDA memory pool is created on device 3 with size 67108864
2023-09-14T08:11:09Z I 1 server.cc:563]
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+
2023-09-14T08:11:09Z I 1 server.cc:633]
+-------+---------+--------+
| Model | Version | Status |
+-------+---------+--------+
+-------+---------+--------+
2023-09-14T08:11:09Z I 1 metrics.cc:870] Collecting metrics for GPU 0: NVIDIA GeForce RTX 3090
2023-09-14T08:11:09Z I 1 metrics.cc:870] Collecting metrics for GPU 1: NVIDIA GeForce RTX 3090
2023-09-14T08:11:09Z I 1 metrics.cc:870] Collecting metrics for GPU 2: NVIDIA GeForce RTX 3090
2023-09-14T08:11:09Z I 1 metrics.cc:870] Collecting metrics for GPU 3: NVIDIA GeForce RTX 3090
2023-09-14T08:11:09Z I 1 metrics.cc:761] Collecting CPU metrics
2023-09-14T08:11:09Z I 1 grpc_server.cc:4842] Started GRPCInferenceService at 0.0.0.0:9000
2023-09-14T08:11:09Z I 1 http_server.cc:4469] Started HTTPService at 0.0.0.0:9001
2023-09-14T08:11:09Z I 1 http_server.cc:193] Started Metrics Service at 0.0.0.0:9002
HTTP response:
HTTP/1.1 400 Bad Request
Content-Length: 0
Content-Type: text/plain
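Likely not a deployment failure: Triton's HTTP server does not serve the root path, so GET / returning 400 is expected even on a healthy server. The KServe-style endpoints from Triton's HTTP/REST API are better probes (replace server_ip with your host):

```
curl -i http://server_ip:9001/v2/health/ready   # 200 OK when the server is ready
curl -s http://server_ip:9001/v2                # server metadata as JSON
```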
How do I close a conversation after creating a new one?
How should elasticKeywordsSearch be configured? I went through the Feishu docs and found that the content there has not been updated. Is there any documentation for this configuration?
The log is as follows:
2023-10-31 12:57:20 [04:57:20] INFO [04:57:20] - INFO - Logger set up with log logger.py:28
2023-10-31 12:57:20 level: 10(DEBUG)
2023-10-31 12:57:20 INFO [04:57:20] - INFO - Log file: data/bisheng.log logger.py:30
2023-10-31 12:57:15 INFO: Stopping parent process [1]
2023-10-31 12:57:16 INFO: Uvicorn running on http://0.0.0.0:7860 (Press CTRL+C to quit)
2023-10-31 12:57:16 INFO: Started parent process [1]
2023-10-31 12:57:30 INFO: Started server process [8]
2023-10-31 12:57:30 INFO: Waiting for application startup.
2023-10-31 12:57:30 INFO: Started server process [9]
2023-10-31 12:57:30 INFO: Waiting for application startup.
2023-10-31 12:57:30 ERROR: Traceback (most recent call last):
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
2023-10-31 12:57:30 self.dialect.do_execute(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
2023-10-31 12:57:30 cursor.execute(statement, parameters)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/cursors.py", line 163, in execute
2023-10-31 12:57:30 result = self._query(query)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/cursors.py", line 321, in _query
2023-10-31 12:57:30 conn.query(q)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 505, in query
2023-10-31 12:57:30 [04:57:30] DEBUG [04:57:30] - DEBUG - Creating database and tables base.py:21
2023-10-31 12:57:30 DEBUG [04:57:30] - DEBUG - Database and tables created base.py:38
2023-10-31 12:57:30 successfully
2023-10-31 12:57:30 self._affected_rows = self._read_query_result(unbuffered=unbuffered)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 724, in _read_query_result
2023-10-31 12:57:30 result.read()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 1069, in read
2023-10-31 12:57:30 first_packet = self.connection._read_packet()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 676, in _read_packet
2023-10-31 12:57:30 packet.raise_for_error()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/protocol.py", line 223, in raise_for_error
2023-10-31 12:57:30 err.raise_mysql_exception(self._data)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/err.py", line 107, in raise_mysql_exception
2023-10-31 12:57:30 raise errorclass(errno, errval)
2023-10-31 12:57:30 pymysql.err.DataError: (1406, "Data too long for column 'value' at row 1")
2023-10-31 12:57:30
2023-10-31 12:57:30 The above exception was the direct cause of the following exception:
2023-10-31 12:57:30
2023-10-31 12:57:30 Traceback (most recent call last):
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 677, in lifespan
2023-10-31 12:57:30 async with self.lifespan_context(app) as maybe_state:
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 566, in aenter
2023-10-31 12:57:30 await self._router.startup()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 656, in startup
2023-10-31 12:57:30 handler()
2023-10-31 12:57:30 File "/app/bisheng/database/base.py", line 79, in create_db_and_tables
2023-10-31 12:57:30 session.commit()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 1451, in commit
2023-10-31 12:57:30 self._transaction.commit(_to_root=self.future)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 829, in commit
2023-10-31 12:57:30 self._prepare_impl()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 808, in _prepare_impl
2023-10-31 12:57:30 self.session.flush()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3386, in flush
2023-10-31 12:57:30 self.flush(objects)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3525, in flush
2023-10-31 12:57:30 with util.safe_reraise():
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 70, in exit
2023-10-31 12:57:30 compat.raise(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 208, in raise
2023-10-31 12:57:30 raise exception
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3486, in _flush
2023-10-31 12:57:30 flush_context.execute()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 456, in execute
2023-10-31 12:57:30 rec.execute(self)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 630, in execute
2023-10-31 12:57:30 util.preloaded.orm_persistence.save_obj(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
2023-10-31 12:57:30 _emit_insert_statements(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 1238, in _emit_insert_statements
2023-10-31 12:57:30 result = connection._execute_20(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1705, in _execute_20
2023-10-31 12:57:30 return meth(self, args_10style, kwargs_10style, execution_options)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/sql/elements.py", line 333, in _execute_on_connection
2023-10-31 12:57:30 return connection._execute_clauseelement(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1572, in _execute_clauseelement
2023-10-31 12:57:30 ret = self._execute_context(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1943, in _execute_context
2023-10-31 12:57:30 self.handle_dbapi_exception(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 2124, in handle_dbapi_exception
2023-10-31 12:57:30 util.raise(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 208, in raise
2023-10-31 12:57:30 raise exception
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
2023-10-31 12:57:30 self.dialect.do_execute(
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
2023-10-31 12:57:30 cursor.execute(statement, parameters)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/cursors.py", line 163, in execute
2023-10-31 12:57:30 result = self._query(query)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/cursors.py", line 321, in _query
2023-10-31 12:57:30 conn.query(q)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 505, in query
2023-10-31 12:57:30 self._affected_rows = self._read_query_result(unbuffered=unbuffered)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 724, in _read_query_result
2023-10-31 12:57:30 result.read()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 1069, in read
2023-10-31 12:57:30 first_packet = self.connection._read_packet()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 676, in _read_packet
2023-10-31 12:57:30 packet.raise_for_error()
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/protocol.py", line 223, in raise_for_error
2023-10-31 12:57:30 err.raise_mysql_exception(self._data)
2023-10-31 12:57:30 File "/usr/local/lib/python3.10/site-packages/pymysql/err.py", line 107, in raise_mysql_exception
2023-10-31 12:57:30 raise errorclass(errno, errval)
2023-10-31 12:57:30 sqlalchemy.exc.DataError: (pymysql.err.DataError) (1406, "Data too long for column 'value' at row 1")
2023-10-31 12:57:30 [SQL: INSERT INTO config (value, key
, comment) VALUES (%(value)s, %(key)s, %(comment)s)]
2023-10-31 12:57:30 [parameters: {'value': ' unstructured_api_url: "" # 毕昇非结构化数据解析服务地址,提供包括OCR文字识别、表格识别、版式分析等能力。非必填,填写后能够提升PDF、图片、\n embeddings: # 配置知识库的embedding服务,以下示例填写了两类embedding服务的配置方法 ... (875 characters truncated) ... # 如果要支持溯源功能,由于溯源会展示源文件,必须配置 oss 存储\n MINIO_ENDPOINT: ""\n MINIO_SHAREPOIN: ""\n MINIO_ACCESS_KEY: ""\n MINIO_SECRET_KEY: ""\n\n# 全局配置大模型', 'key': 'knowledges', 'comment': None}]
2023-10-31 12:57:30 (Background on this error at: https://sqlalche.me/e/14/9h9h)
2023-10-31 12:57:30
2023-10-31 12:57:30 ERROR: Application startup failed. Exiting.
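Not an official fix, but the traceback points at the config table's value column (likely a small VARCHAR) being too short for the default knowledges YAML blob. If you have direct access to the bisheng MySQL database, widening the column (table and column names taken from the INSERT statement in the log) is one workaround:

```sql
ALTER TABLE config MODIFY `value` TEXT;
```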
Environment: Ubuntu 20.04
Version: v0.1.3-4-g28276a7
When restarting the backend Docker container, the container starts normally, but checking the log shows the following error:
[16:23:34] INFO [16:23:34] - INFO - Logger set up with log logger.py:31
level: 10(DEBUG)
INFO: Started server process [9]
INFO: Waiting for application startup.
Creating database and tables
DEBUG [16:23:34] - DEBUG - Creating database and tables base.py:15
INFO: Started server process [8]
INFO: Waiting for application startup.
Creating database and tables
DEBUG [16:23:34] - DEBUG - Creating database and tables base.py:15
Error creating database and tables: (pymysql.err.OperationalError) (1050, "Table 'flow' already exists")
[SQL:
CREATE TABLE flow (
update_time DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
data JSON,
name VARCHAR(255) NOT NULL,
user_id INTEGER,
description VARCHAR(255),
logo VARCHAR(255),
status INTEGER,
create_time DATETIME,
id CHAR(32) NOT NULL,
PRIMARY KEY (id),
UNIQUE (id)
)
]
(Background on this error at: https://sqlalche.me/e/14/e3q8)
ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
cursor.execute(statement, parameters)
File "/usr/local/lib/python3.10/site-packages/pymysql/cursors.py", line 163, in execute
result = self._query(query)
File "/usr/local/lib/python3.10/site-packages/pymysql/cursors.py", line 321, in _query
conn.query(q)
File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 505, in query
self._affected_rows = self._read_query_result(unbuffered=unbuffered)
File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 724, in _read_query_result
result.read()
File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 1069, in read
first_packet = self.connection._read_packet()
File "/usr/local/lib/python3.10/site-packages/pymysql/connections.py", line 676, in _read_packet
packet.raise_for_error()
File "/usr/local/lib/python3.10/site-packages/pymysql/protocol.py", line 223, in raise_for_error
err.raise_mysql_exception(self._data)
File "/usr/local/lib/python3.10/site-packages/pymysql/err.py", line 107, in raise_mysql_exception
raise errorclass(errno, errval)
pymysql.err.OperationalError: (1050, "Table 'flow' already exists")
The MySQL container already has the bisheng database, including the flow table. From the error, it looks like the application re-creates the tables during initialization, which causes the conflict.
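For context, this pattern (the error is logged but the container keeps running) fits a create-if-missing race between the two server processes seen in the log ([8] and [9]): both can pass the existence check and then both issue CREATE TABLE, so the loser logs "already exists" harmlessly. The idempotent pattern, sketched with stdlib sqlite3 rather than the project's actual SQLAlchemy setup:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# IF NOT EXISTS makes table creation safe to repeat across restarts/workers.
ddl = "CREATE TABLE IF NOT EXISTS flow (id TEXT PRIMARY KEY, name TEXT NOT NULL)"
conn.execute(ddl)   # creates the table
conn.execute(ddl)   # second run is a no-op instead of an error
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)  # ['flow']
```

SQLAlchemy's metadata.create_all(checkfirst=True) behaves similarly, but its check-then-create is not atomic, which is why two workers starting simultaneously can still collide.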
DataError: (pymysql.err.DataError) (1406, "Data too long for column 'intermediate_steps' at row 1")
[SQL: INSERT INTO chatmessage (message, intermediate_steps, is_bot, type, category, flow_id, chat_id, user_id, files) VALUES
(%(message)s, %(intermediate_steps)s, %(is_bot)s, %(type)s, %(category)s, %(flow_id)s, %(chat_id)s, %(user_id)s, %(files)s)]
[parameters: {'message': None, 'intermediate_steps': '分析出错,Error: <ParamError: (code=1, message=The dimension of query
entities[[-0.0287322998046875, 0.00139617919921875, 0.00464630126953125, -0.01043701 ... (161395 characters truncated) ... ,
0.07525634765625, 0.06427001953125, 0.06951904296875, -0.021453857421875, -0.0511474609375, -0.0220489501953125]] is different from
schema [1536])>', 'is_bot': 1, 'type': 'end', 'category': 'processing', 'flow_id': '1e942287b1c2498bbceccb34c87787a4', 'chat_id': None,
'user_id': '1', 'files': ''}]
(Background on this error at: https://sqlalche.me/e/14/9h9h)
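The root cause inside the truncated parameters above is a dimension mismatch: the query embedding's length differs from the Milvus collection schema's dim of 1536 (the size used when the collection was created), which usually means the knowledge base was built with a different embedding model than the one now serving queries. A tiny guard sketch (the 1536 default is taken from the error message; the function is illustrative, not Bisheng's code):

```python
def check_dim(vector: list[float], expected_dim: int = 1536) -> None:
    # Fail fast with a readable message instead of a Milvus ParamError downstream.
    if len(vector) != expected_dim:
        raise ValueError(
            f"embedding dim {len(vector)} != collection schema dim {expected_dim}; "
            "query with the same embedding model the collection was built with")

check_dim([0.0] * 1536)   # ok
# check_dim([0.0] * 768)  # would raise ValueError
```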
When a logged-in user logs out, the cookie cannot be deleted, because it was created with the HttpOnly attribute. This probably needs server-side code: for example, the server sends a response header telling the browser to delete the specific cookie, and the browser then removes it.
Version: 0.1.9.5
Code location: src\layout\MainLayout.tsx, lines 23-32
Code snippet:
function clearAllCookies() {
    var cookies = document.cookie.split(";");
    for (var i = 0; i < cookies.length; i++) {
        var cookie = cookies[i];
        var eqPos = cookie.indexOf("=");
        var name = eqPos > -1 ? cookie.substr(0, eqPos) : cookie;
        document.cookie = name + "=;expires=Thu, 01 Jan 1970 00:00:00 GMT";
    }
}
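As the report says, JavaScript cannot clear an HttpOnly cookie; the server has to send an expired Set-Cookie header for the same name and path. A minimal sketch of such a header (the cookie name access_token is an assumption, not necessarily what Bisheng uses):

```python
def expire_cookie_header(name: str, path: str = "/") -> str:
    # An expired Set-Cookie value makes the browser drop the cookie;
    # Path must match the path the cookie was originally set with.
    return f"{name}=; Path={path}; Expires=Thu, 01 Jan 1970 00:00:00 GMT; HttpOnly"

print(expire_cookie_header("access_token"))
# access_token=; Path=/; Expires=Thu, 01 Jan 1970 00:00:00 GMT; HttpOnly
```

In a FastAPI backend the logout endpoint would attach this as a Set-Cookie response header; Response.delete_cookie does the equivalent.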
When embedding knowledge or chatting with GPT, how should the reverse-proxy address be configured so it can be reached normally?
The port works when tested with Postman, but both knowledge-base file parsing on upload and the ChatGPT conversation fail.
The proxy address is: https://xxx/proxy/v1/chat/completions
After receiving a request, the proxy server forwards it to https://api.openai.com/v1/chat/xxx
Whichever address I fill in, it fails.
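One thing worth checking (an assumption about this failure, grounded only in the OpenAI API's path layout): the configured base URL should stop at .../v1, because OpenAI-style clients append /chat/completions themselves. Configuring the full path as the base produces a doubled route:

```python
def chat_url(base: str) -> str:
    # OpenAI-style clients join the configured base URL with the route themselves.
    return base.rstrip("/") + "/chat/completions"

print(chat_url("https://xxx/proxy/v1"))
# https://xxx/proxy/v1/chat/completions

print(chat_url("https://xxx/proxy/v1/chat/completions"))
# https://xxx/proxy/v1/chat/completions/chat/completions  (doubled route, will fail)
```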
After deploying locally, the model's answers are not streamed; the entire answer appears all at once. How can streaming output be enabled?
On the bisheng-rt:0.0.4 image, using the suggested config, the output does not make sense. Was vLLM + Qwen-14B-chat tested?
Assistant ', '0
Assistant ', '100 ', nan ', '}, 评价', nan', nan', '}, nan', '}, '}, nan ', '20', nan', ;', nan',10', nan', '}, nan', '}, nan', nan', '}, 10', '}'', nan, '}, '}, '}, '}, nan ', '}',2 ', '}',10', '}'', nan', '}',10', ',10,10', ', '}',', '}, '}, nan, ;', '}, '}',201', nan, '}'', 评价', '}, nan, '}, nan,10', nan, nan,1', nan,5', nan,0,5', nan', '}, '}, nan', '}, ;', nan', nan', nan',1', nan ', nan', nan,10', nan', nan,0,1', nan',5', '}, '}, nan', '}, nan ', ',', nan ',10,2 ', nan,1', nan,
Steps taken:
1. Exported the Python code from the advanced editor and opened it in PyCharm; it reported that load_flow_from_json is not a bisheng module.
2. Replaced that line with from langflow import load_flow_from_json; it then complained that langflow was not installed, so I installed langflow 0.5.3.
3. Filled in the local JSON file path on the flow = load_flow_from_json( line.
(Nothing else was modified.)
4. It now runs, but errors with:
File "C:\Users\jqiu.conda\envs\bisheng_py39cu118tf4310\lib\site-packages\langflow\graph\vertex\base.py", line 191, in _build_params
raise ValueError(f"File path not found for {self.vertex_type}")
Additional note: the JSON file was also exported from the advanced editor and was not modified. (I wondered whether the Chinese characters in the file name matter, but there is no file-not-found error.)
When uploading a txt file to the knowledge base, it reports KeyError: 'page'. How can this be resolved? Thanks.
Model: multilingual-e5-large
Backend screenshot:
The error output is:
docker-backend-1 | /app/bisheng/api/v1/knowledge.py:317: SAWarning: DELETE statement on table 'knowledgefile' expected to delete 1 row(s); 0 were matched. Please set confirm_deleted_rows=False within the mapper configuration to prevent this warning.
docker-backend-1 | session.commit()
docker-backend-1 | INFO: 172.18.0.4:35386 - "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 49 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | INFO: 172.18.0.4:35394 - "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 49 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | INFO: 172.18.0.4:35410 - "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 49 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | INFO: 172.18.0.4:35426 - "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 286 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | [15:59:46] INFO [15:59:46] - INFO - act=delete_vector file_id=37 res=(insert count: 0, delete count: 0, upsert count: 0, timestamp: 446114558432247810, success count: 0, err count: 0) knowledge.py:304
docker-backend-1 | DEBUG [15:59:46] - DEBUG - Starting new HTTP connection (1): 192.168.1.14:9200 connectionpool.py:246
docker-backend-1 | DEBUG [15:59:46] - DEBUG - http://192.168.1.14:9200 "GET /col_1701791122_ab9c9ef8 HTTP/1.1" 200 None connectionpool.py:474
docker-backend-1 | INFO [15:59:46] - INFO - GET http://192.168.1.14:9200/col_1701791122_ab9c9ef8 [status:200 duration:0.006s] _transport.py:335
docker-backend-1 | DEBUG [15:59:46] - DEBUG - http://192.168.1.14:9200 "POST /col_1701791122_ab9c9ef8/_refresh HTTP/1.1" 200 49 connectionpool.py:474
docker-backend-1 | INFO [15:59:46] - INFO - POST http://192.168.1.14:9200/col_1701791122_ab9c9ef8/_refresh [status:200 duration:0.004s] _transport.py:335
docker-backend-1 | DEBUG [15:59:46] - DEBUG - http://192.168.1.14:9200 "POST /col_1701791122_ab9c9ef8/_delete_by_query HTTP/1.1" 200 215 connectionpool.py:474
docker-backend-1 | INFO [15:59:46] - INFO - POST http://192.168.1.14:9200/col_1701791122_ab9c9ef8/_delete_by_query [status:200 duration:0.006s] _transport.py:335
docker-backend-1 | INFO [15:59:46] - INFO - act=delete_es file_id=37 res={'took': 2, 'timed_out': False, 'total': 0, 'deleted': 0, 'batches': 0, 'version_conflicts': 0, 'noops': 0, 'retries': {'bulk': 0, 'search': 0}, 'throttled_millis': 0, 'requests_per_second': -1.0, 'throttled_until_millis': 0, 'failures': []} knowledge.py:314
docker-backend-1 | INFO: 172.18.0.4:35412 - "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "DELETE /api/v1/knowledge/file/37 HTTP/1.1" 200 49 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | INFO: 172.18.0.4:35432 - "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 286 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | INFO: 172.18.0.4:35438 - "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 286 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | INFO: 172.18.0.4:35448 - "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 OK
docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:46 +0000] "GET /api/v1/knowledge/file_list/4?page_size=20&page_num=1 HTTP/1.1" 200 286 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-"
docker-backend-1 | [15:59:56] DEBUG [15:59:56] - DEBUG - Calling on_part_begin with no data multipart.py:586
docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling on_header_field with data[42:61] multipart.py:583
docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling on_header_value with data[63:104] multipart.py:583
docker-backend-1 | DEBUG [15:59:56]
- DEBUG - Calling on_header_end multipart.py:586 docker-backend-1 | with no data docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling multipart.py:583 docker-backend-1 | on_header_field with data[106:118] docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling multipart.py:583 docker-backend-1 | on_header_value with data[120:130] docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling on_header_end multipart.py:586 docker-backend-1 | with no data docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling multipart.py:586 docker-backend-1 | on_headers_finished with no data docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling on_part_data multipart.py:583 docker-backend-1 | with data[134:1073] docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling on_part_end multipart.py:586 docker-backend-1 | with no data docker-backend-1 | DEBUG [15:59:56] - DEBUG - Calling on_end with no multipart.py:586 docker-backend-1 | data docker-backend-1 | INFO: 172.18.0.4:40154 - "POST /api/v1/knowledge/upload HTTP/1.1" 201 Created docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:56 +0000] "POST /api/v1/knowledge/upload HTTP/1.1" 201 130 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-" docker-backend-1 | [15:59:57] INFO [15:59:57] - INFO - fileName=qa.txt knowledge.py:132 docker-backend-1 | col=col_1701791122_ab9c9ef8 docker-backend-1 | INFO: 172.18.0.4:40158 - "POST /api/v1/knowledge/process HTTP/1.1" 201 Created docker-nginx-1 | 192.168.1.101 - - [05/Dec/2023:15:59:57 +0000] "POST /api/v1/knowledge/process HTTP/1.1" 201 32 "http://192.168.1.14:3001/filelib/4" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-" docker-backend-1 | DEBUG [15:59:57] - DEBUG - Using previous milvus.py:206 docker-backend-1 | connection: 729e04139ed94c7eaeaa3c061c316285 docker-backend-1 | DEBUG [15:59:57] - DEBUG - Nothing to insert, 
milvus.py:423 docker-backend-1 | skipping. docker-backend-1 | DEBUG [15:59:57] - DEBUG - Starting new HTTP connectionpool.py:246 docker-backend-1 | connection (1): 192.168.1.14:9200 docker-backend-1 | DEBUG [15:59:57] - DEBUG - connectionpool.py:474 docker-backend-1 | http://192.168.1.14:9200 "GET docker-backend-1 | /col_1701791122_ab9c9ef8 HTTP/1.1" 200 docker-backend-1 | None docker-backend-1 | INFO [15:59:57] - INFO - GET _transport.py:335 docker-backend-1 | http://192.168.1.14:9200/col_1701791122_ab docker-backend-1 | 9c9ef8 [status:200 duration:0.006s] docker-backend-1 | DEBUG [15:59:57] - DEBUG - connectionpool.py:474 docker-backend-1 | http://192.168.1.14:9200 "POST docker-backend-1 | /col_1701791122_ab9c9ef8/_refresh docker-backend-1 | HTTP/1.1" 200 49 docker-backend-1 | INFO [15:59:57] - INFO - POST _transport.py:335 docker-backend-1 | http://192.168.1.14:9200/col_1701791122_ab docker-backend-1 | 9c9ef8/_refresh [status:200 docker-backend-1 | duration:0.004s] docker-backend-1 | INFO [15:59:57] - INFO - chunk_split knowledge.py:375 docker-backend-1 | file_name=qa.txt size=1 docker-backend-1 | DEBUG [15:59:57] - DEBUG - Starting new HTTP connectionpool.py:246 docker-backend-1 | connection (1): 172.17.0.1:9001 docker-backend-1 | DEBUG [15:59:57] - DEBUG - connectionpool.py:474 docker-backend-1 | http://172.17.0.1:9001 "POST docker-backend-1 | /v2.1/models/multilingual-e5-large/inf docker-backend-1 | er HTTP/1.1" 200 20427 docker-backend-1 | ERROR [15:59:57] - ERROR - 'page' knowledge.py:385 docker-backend-1 | Traceback (most recent call last): docker-backend-1 | File "/app/bisheng/api/v1/knowledge.py", docker-backend-1 | line 377, in addEmbedding docker-backend-1 | vectore_client.add_texts(texts=texts, docker-backend-1 | metadatas=metadatas) docker-backend-1 | File docker-backend-1 | "/usr/local/lib/python3.10/site-packages/la docker-backend-1 | ngchain/vectorstores/milvus.py", line 454, docker-backend-1 | in add_texts docker-backend-1 | insert_list = 
[insert_dict[x][i:end] docker-backend-1 | for x in self.fields] docker-backend-1 | File docker-backend-1 | "/usr/local/lib/python3.10/site-packages/la docker-backend-1 | ngchain/vectorstores/milvus.py", line 454, docker-backend-1 | in <listcomp> docker-backend-1 | insert_list = [insert_dict[x][i:end] docker-backend-1 | for x in self.fields] docker-backend-1 | KeyError: 'page'
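A likely cause of the `KeyError: 'page'` above is that the Milvus collection schema includes a `page` metadata field, while some of the newly uploaded documents' metadata dicts lack that key, so `add_texts` cannot build a column for it. A minimal, hypothetical workaround is to normalize the metadata before calling `add_texts`; the helper name and default value below are illustrative, not part of bisheng:

```python
def normalize_metadatas(metadatas, default=""):
    """Ensure every metadata dict carries the same keys, padding
    missing ones with a default so Milvus sees a uniform schema."""
    keys = set()
    for meta in metadatas:
        keys.update(meta)
    return [{k: meta.get(k, default) for k in sorted(keys)} for meta in metadatas]

# Example: the second document (a plain txt file) is missing 'page'.
metadatas = [{"source": "a.pdf", "page": 1}, {"source": "qa.txt"}]
fixed = normalize_metadatas(metadatas)
# Every dict now has both 'page' and 'source', so a call like
# vectore_client.add_texts(texts=texts, metadatas=fixed)
# would no longer raise KeyError: 'page'.
```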
At present there is no source code for the bisheng-rt module; we can only pull the image. Is there a plan to release the source?
Error Log:
[07:20:45] ERROR [07:20:45] - ERROR - LLM return error {'error': {'message': "[] is too short - 'functions'", 'type': 'invalid_request_error', 'param': None, 'code': None}} base.py:58
POST data Log:
INFO [07:20:40] - INFO - params: {'messages': [{'role': 'user', 'content': 'Answer the following questions as best you can. You have access to the following tools:\n\nbing_search: A wrapper around Bing Search. Useful for when you need to answer questions about current events. Input should be a search query.\n\nUse the following format:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\n\nanswer in Chinese.\nBegin!Question: HiThought:'}], 'model': 'gpt-3.5-turbo', 'top_p': 0.9, 'temperature': 0.7, 'max_tokens': 2048, 'stop': ['\nObservation:', '\n\tObservation:'], 'function_call': None, 'functions': []} proxy_llm.py:195
That is to say, the request sends 'functions': [] — an empty list, which the API rejects with "[] is too short".
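One way to avoid the "[] is too short - 'functions'" error is to drop the functions/function_call keys from the payload whenever the list is empty. This is a hedged sketch of that idea, not the actual proxy_llm.py code; the helper name is made up:

```python
def build_chat_params(messages, model, functions=None, function_call=None, **extra):
    """Assemble an OpenAI-style chat payload, omitting 'functions'
    entirely when it is empty, since the API rejects an empty list."""
    params = {"messages": messages, "model": model, **extra}
    if functions:  # only include when non-empty
        params["functions"] = functions
        if function_call is not None:
            params["function_call"] = function_call
    return params

params = build_chat_params(
    [{"role": "user", "content": "Hi"}],
    "gpt-3.5-turbo",
    functions=[],        # empty: the key is dropped from the payload
    function_call=None,
    temperature=0.7,
)
```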
Additional note: functions and function_call are not always needed.
For example, for ZeroShotAgent, the prompt already includes the tool information:
https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/agents/mrkl/prompt.py#L7C1-L7C60
# flake8: noqa
PREFIX = """Answer the following questions as best you can. You have access to the following tools:"""
FORMAT_INSTRUCTIONS = """Use the following format:
Question: the input question you must answer
Thought: you should always think about what to do
Action: the action to take, should be one of [{tool_names}]
Action Input: the input to the action
Observation: the result of the action
... (this Thought/Action/Action Input/Observation can repeat N times)
Thought: I now know the final answer
Final Answer: the final answer to the original input question"""
SUFFIX = """Begin!
Question: {input}
Thought:{agent_scratchpad}"""
    @classmethod
    def create_prompt(
        cls,
        tools: Sequence[BaseTool],
        prefix: str = PREFIX,
        suffix: str = SUFFIX,
        format_instructions: str = FORMAT_INSTRUCTIONS,
        input_variables: Optional[List[str]] = None,
    ) -> PromptTemplate:
        """Create prompt in the style of the zero shot agent.

        Args:
            tools: List of tools the agent will have access to, used to format the
                prompt.
            prefix: String to put before the list of tools.
            suffix: String to put after the list of tools.
            input_variables: List of input variables the final prompt will expect.

        Returns:
            A PromptTemplate with the template assembled from the pieces here.
        """
        tool_strings = "\n".join([f"{tool.name}: {tool.description}" for tool in tools])
        tool_names = ", ".join([tool.name for tool in tools])
        format_instructions = format_instructions.format(tool_names=tool_names)
        template = "\n\n".join([prefix, tool_strings, format_instructions, suffix])
        if input_variables is None:
            input_variables = ["input", "agent_scratchpad"]
        return PromptTemplate(template=template, input_variables=input_variables)
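To see that the tool information really lives in the prompt text (so no functions array is needed), here is a self-contained re-implementation of the assembly logic above. The `Tool` type is a stand-in for langchain's `BaseTool`, and the format-instructions string is abridged:

```python
from typing import List, NamedTuple

class Tool(NamedTuple):  # stand-in for langchain's BaseTool
    name: str
    description: str

PREFIX = ("Answer the following questions as best you can. "
          "You have access to the following tools:")
FORMAT_INSTRUCTIONS = ("Use the following format:\n"
                       "Question: the input question you must answer\n"
                       "Thought: you should always think about what to do\n"
                       "Action: the action to take, should be one of [{tool_names}]")
SUFFIX = "Begin!\n\nQuestion: {input}\nThought:{agent_scratchpad}"

def assemble_template(tools: List[Tool]) -> str:
    """Mirror ZeroShotAgent.create_prompt: inline each tool's name and
    description into the prompt text itself."""
    tool_strings = "\n".join(f"{t.name}: {t.description}" for t in tools)
    tool_names = ", ".join(t.name for t in tools)
    fmt = FORMAT_INSTRUCTIONS.format(tool_names=tool_names)
    return "\n\n".join([PREFIX, tool_strings, fmt, SUFFIX])

template = assemble_template(
    [Tool("bing_search", "A wrapper around Bing Search.")]
)
```

The resulting template contains "bing_search: A wrapper around Bing Search." directly in the text, which is why such an agent works without any `functions` payload.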
Can the LLM component return its output as a stream?
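For context, langchain-based LLM wrappers typically stream by invoking a callback for each generated token. The sketch below is purely illustrative of that callback pattern (the class and function names are made up, and no real LLM is involved):

```python
class TokenCollector:
    """Illustrative callback in the spirit of langchain's
    on_llm_new_token hook: collects tokens as they stream in."""
    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str) -> None:
        self.tokens.append(token)

def stream_completion(token_iter, handler):
    """Feed tokens from any iterator to the handler as they arrive,
    then return the fully assembled text."""
    for token in token_iter:
        handler.on_llm_new_token(token)
    return "".join(handler.tokens)

handler = TokenCollector()
text = stream_completion(iter(["Hel", "lo", "!"]), handler)
```

Whether a given bisheng component exposes this hook to the frontend is a separate question for the maintainers.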
Some database columns are defined too short and their lengths need to be increased, mainly:
1. the config and remark columns of the modeldeploy table;
2. the gpu column of the server table.
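A migration along these lines could widen the columns named above. The target types here are assumptions, not bisheng's official schema; verify against the live data before running anything:

```python
# Hypothetical MySQL statements widening the columns from the issue.
ALTER_STATEMENTS = [
    "ALTER TABLE modeldeploy MODIFY COLUMN config TEXT",
    "ALTER TABLE modeldeploy MODIFY COLUMN remark VARCHAR(4096)",
    "ALTER TABLE server MODIFY COLUMN gpu VARCHAR(255)",
]

# With a live connection (e.g. via pymysql), each statement would be
# executed roughly like this:
# with conn.cursor() as cur:
#     for stmt in ALTER_STATEMENTS:
#         cur.execute(stmt)
# conn.commit()
```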