cohere-ai / cohere-toolkit
Cohere Toolkit is a collection of prebuilt components enabling users to quickly build and deploy RAG applications.
License: MIT License
Current behaviour:
The default file upload tool added to the toolkit only parses PDFs.
Expected behaviour:
We would love to see even more file types supported, but I will close this issue once CSV is supported.
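As a hedged sketch of what CSV support could look like (the helper name and interface are illustrative, not the toolkit's actual upload API), the standard library already covers the parsing step:

```python
import csv
import io

def parse_csv_upload(file_bytes: bytes) -> list[dict]:
    """Decode an uploaded CSV file into a list of row dicts.

    Illustrative helper only; the toolkit's real upload tool may
    expose a different interface.
    """
    # utf-8-sig strips the BOM that spreadsheet exports often prepend.
    text = file_bytes.decode("utf-8-sig")
    return list(csv.DictReader(io.StringIO(text)))
```

The resulting row dicts could then be chunked and indexed the same way parsed PDF pages are.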
No response
I followed the manual for the installation, but it simply doesn't work. It has wasted several days of my time.
No response
https://github.com/BerriAI/litellm is a popular project that is used as a model router within companies. Adding support for this would be great, and I'm happy to contribute with some support!
It would also mean that the UI could be used with other models. This could increase Cohere's popularity, but I understand if it means that Cohere might not be interested in supporting this feature.
The error logs:
(cohere-toolkit) ➜ cohere-toolkit git:(main) make migrate
docker compose run --build backend alembic -c src/backend/alembic.ini upgrade head
WARN[0000] The "NEXT_PUBLIC_API_HOSTNAME" variable is not set. Defaulting to a blank string.
WARN[0000] /Users/ifuryst/projects/ai/cohere-toolkit/docker-compose.yml: `version` is obsolete
[+] Creating 1/0
✔ Container cohere-toolkit-db-1 Running 0.0s
[+] Building 7.9s (12/12) FINISHED docker-container:xenodochial_wilson
=> [backend internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 1.16kB 0.0s
=> [backend internal] load metadata for docker.io/library/python:3.11 1.3s
=> [backend internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [backend 1/6] FROM docker.io/library/python:3.11@sha256:e453eb723bc8ecac7a797498f9a5915d13e567620d48dcd3568750bac3b59f31 0.0s
=> => resolve docker.io/library/python:3.11@sha256:e453eb723bc8ecac7a797498f9a5915d13e567620d48dcd3568750bac3b59f31 0.0s
=> [backend internal] load build context 0.0s
=> => transferring context: 13.49kB 0.0s
=> CACHED [backend 2/6] WORKDIR /workspace 0.0s
=> CACHED [backend 3/6] COPY pyproject.toml poetry.lock ./ 0.0s
=> CACHED [backend 4/6] RUN pip install --no-cache-dir poetry==1.6.1 && poetry install 0.0s
=> CACHED [backend 5/6] COPY src/backend/ src/backend/ 0.0s
=> CACHED [backend 6/6] COPY .en[v] .env 0.0s
=> [backend] exporting to docker image format 6.5s
=> => exporting layers 0.0s
=> => exporting manifest sha256:6c850bff9065e54db15bcb778ae3e50244523e9427c13ec32d8ef0f02ebee2aa 0.0s
=> => exporting config sha256:1cabb0196a31aaf589436f856b77564862794efce4312b9c692f654a91cbc757 0.0s
=> => sending tarball 6.5s
=> [backend] importing to docker 0.0s
Traceback (most recent call last):
File "/workspace/.venv/bin/alembic", line 8, in <module>
sys.exit(main())
^^^^^^
File "/workspace/.venv/lib/python3.11/site-packages/alembic/config.py", line 641, in main
CommandLine(prog=prog).main(argv=argv)
File "/workspace/.venv/lib/python3.11/site-packages/alembic/config.py", line 631, in main
self.run_cmd(cfg, options)
File "/workspace/.venv/lib/python3.11/site-packages/alembic/config.py", line 608, in run_cmd
fn(
File "/workspace/.venv/lib/python3.11/site-packages/alembic/command.py", line 403, in upgrade
script.run_env()
File "/workspace/.venv/lib/python3.11/site-packages/alembic/script/base.py", line 583, in run_env
util.load_python_file(self.dir, "env.py")
File "/workspace/.venv/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 95, in load_python_file
module = load_module_py(module_id, path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/.venv/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 113, in load_module_py
spec.loader.exec_module(module) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/workspace/./src/backend/alembic/env.py", line 9, in <module>
from backend.models import *
File "/workspace/src/backend/models/__init__.py", line 4, in <module>
from backend.models.database import *
File "/workspace/src/backend/models/database.py", line 11, in <module>
SQLALCHEMY_DATABASE_URL = os.environ["DATABASE_URL"]
~~~~~~~~~~^^^^^^^^^^^^^^^^
File "<frozen os>", line 679, in __getitem__
KeyError: 'DATABASE_URL'
make: *** [migrate] Error 1
I ran the commands in this order:
poetry install
poetry run black .
poetry run isort .
make migrate
and I tried to reset the DB:
make reset-db
make migrate
It also throws the same error.
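The KeyError comes from an unguarded `os.environ["DATABASE_URL"]` lookup in `database.py`. A minimal sketch of a guarded read that fails with an actionable message instead (the function name is illustrative, not the toolkit's code):

```python
import os

def get_database_url(env=os.environ) -> str:
    """Return DATABASE_URL, failing loudly with a hint when it is unset."""
    url = env.get("DATABASE_URL")
    if not url:
        # A descriptive error beats a bare KeyError from os.environ.
        raise RuntimeError(
            "DATABASE_URL is not set; make sure your .env file exists "
            "and is loaded before running alembic."
        )
    return url
```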
Python 3.11.9
MacOS Sonoma 14.4.1
Docker info
Client:
Version: 26.0.0
Context: desktop-linux
...
Server:
Containers: 6
Running: 2
Paused: 0
Stopped: 4
Images: 13
Server Version: 26.0.0
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Using metacopy: false
Native Overlay Diff: true
userxattr: false
...
I'm running this through Docker, and I'm getting an invalid API token message when I start chatting. Earlier on, it also raises WARNING:root:Couldn't get models from Cohere API.
I suspect this is because I'm using a trial API key. Is this only available for production API keys?
No response
When I set up the toolkit locally on Mac, the Python interpreter tool does not work.
Using make attach and the Python debugger, I see:
-> res = requests.post(self.interpreter_url, json={"code": code})
(Pdb) p self.interpreter_url
'http://localhost:8080/'
(Pdb) n
2024-05-03 22:22:45,509 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (1): localhost:8080
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff4a881390>: Failed to establish a new connection: [Errno 111] Connection refused'))
> /workspace/src/backend/tools/function_tools/python_interpreter.py(48)call()
SOLUTION: Update PYTHON_INTERPRETER_URL in the .env file to specify the Docker container name cohere-toolkit-terrarium-1 instead of localhost. Then this works:
PYTHON_INTERPRETER_URL='http://cohere-toolkit-terrarium-1:8080'
I found this name in Docker Desktop; the Python tool is running on port 8080.
This must be a Docker networking issue. Outside Docker, in a shell, curl works:
curl --location 'http://localhost:8080/' \
--header 'User-Id: me' \
--header 'Content-Type: application/json' \
--data '{
"code": "1+3"
}'
{"success":true,"output_files":[],"final_expression":4,"std_out":"","std_err":"","code_runtime":26}
No response
Running make first-run on Mac will trigger the following command:
poetry install --with setup,community --verbose
Then it produces this error:
RuntimeError: Failed to load shared library '/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib': dlopen(/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64'))
make[1]: *** [setup] Error 1
make: *** [first-run] Error 2
The llama-cpp-python library was introduced in PR #77. I have two proposals to help ease the setup:
No response
Not just deployment on localhost.
Write documentation supporting deployment of cohere-toolkit on a KVM VPS.
Avoid the error "http://localhost:8000/tools/ net::ERR_CONNECTION_REFUSED".
No response
Hi, fantastic toolkit and a great project to get started with quickly.
I was wondering if there is any plan to migrate it to the App Router.
Most (if not all) new Next.js apps are built using the App Router, and due to small differences in how things are set up, it's not easy to transfer code samples from the toolkit to the App Router. I believe having it on the App Router would increase adoption and attract collaborators.
No response
Current behaviour: The Coral interface only interacts with one user. Conversations, messages, files, etc. are only associated with a single user.
Expected behaviour: Add user management to Coral interface. When a user deploys Coral interface, they go to a screen that asks them to authenticate. Then they can only access conversations that are associated with their user ID.
I'm running Ubuntu 22 on an x86 machine and getting this message when I deploy locally.
! terrarium The requested image's platform (linux/arm64/v8) does not match the detected host platform (linux/amd64/v3) and no specific platform was requested 0.0s
Watch enabled
I tried specifying the platform in the docker-compose.yml and got this error instead.
[+] Running 0/1
⠇ terrarium Pulling 0.9s
image with reference ghcr.io/cohere-ai/terrarium:latest was found but does not match the specified platform: wanted linux/amd64/v3, actual: linux/arm64/v8
Is this terrarium image only built for arm64?
No response
first run errors out after a terabyte of download :(
git clone https://github.com/cohere-ai/cohere-toolkit.git
cd cohere-toolkit
make first-run
make setup
make[1]: Entering directory '/content/cohere-toolkit'
poetry install --with setup,community --verbose
The currently activated Python version 3.10.12 is not supported by the project (~3.11).
Trying to find and use a compatible version.
Using python3.11 (3.11.9)
Using virtualenv: /root/.cache/pypoetry/virtualenvs/backend-g3NRkvH2-py3.11
Installing dependencies from lock file
Finding the necessary packages for the current system
Package operations: 1 install, 0 updates, 0 removals, 216 skipped
- Installing aiohttp (3.9.5): Pending...
- Installing aiosignal (1.3.1): Pending...
- Installing alembic (1.13.1): Pending...
- Installing alembic (1.13.1): Skipped for the following reason: Already installed
- Installing aiosignal (1.3.1): Pending...
[...]
Note: This error originates from the build backend, and is likely not a problem with poetry but with psycopg2 (2.9.9) not supporting PEP 517 builds. You can verify this by running 'pip wheel --no-cache-dir --use-pep517 "psycopg2 (==2.9.9)"'.
[....]
- Installing wikipedia (1.4.0): Skipped for the following reason: Already installed
make[1]: *** [Makefile:23: setup] Error 1
make[1]: Leaving directory '/content/cohere-toolkit'
make: *** [Makefile:29: first-run] Error 2
---------------------------------------------------------------------------
CalledProcessError Traceback (most recent call last)
<ipython-input-19-b1788b39fc6f> in <cell line: 1>()
----> 1 get_ipython().run_cell_magic('shell', '', '#git clone https://github.com/cohere-ai/cohere-toolkit.git\ncd cohere-toolkit\nmake first-run\n')
When running the quick start docker command, I get the error KeyError: 'DATABASE_URL'
Steps:
Run docker command from README:
docker run -e COHERE_API_KEY=<KEY> -p 8000:8000 -p 4000:4000 ghcr.io/cohere-ai/cohere-toolkit:latest
Images are pulled down (which took a long while!) - everything looks promising
Greeted with an improperly set env variable.
Error message:
Traceback (most recent call last):
File "/workspace/.venv/bin/alembic", line 8, in <module>
sys.exit(main())
^^^^^^
File "/workspace/.venv/lib/python3.11/site-packages/alembic/config.py", line 641, in main
CommandLine(prog=prog).main(argv=argv)
File "/workspace/.venv/lib/python3.11/site-packages/alembic/config.py", line 631, in main
self.run_cmd(cfg, options)
File "/workspace/.venv/lib/python3.11/site-packages/alembic/config.py", line 608, in run_cmd
fn(
File "/workspace/.venv/lib/python3.11/site-packages/alembic/command.py", line 403, in upgrade
script.run_env()
File "/workspace/.venv/lib/python3.11/site-packages/alembic/script/base.py", line 583, in run_env
util.load_python_file(self.dir, "env.py")
File "/workspace/.venv/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 95, in load_python_file
module = load_module_py(module_id, path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/.venv/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 113, in load_module_py
spec.loader.exec_module(module) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/workspace/./src/backend/alembic/env.py", line 9, in <module>
from backend.models import *
File "/workspace/src/backend/models/__init__.py", line 4, in <module>
from backend.models.database import *
File "/workspace/src/backend/models/database.py", line 11, in <module>
SQLALCHEMY_DATABASE_URL = os.environ["DATABASE_URL"]
~~~~~~~~~~^^^^^^^^^^^^^^^^
File "<frozen os>", line 679, in __getitem__
KeyError: 'DATABASE_URL'
No response
Current behaviour: Users cannot change the colours and logos used within in the Coral application to make the interface fit their brand better.
Expected behaviour: Users should have the option to change the following:
1. Define primary and secondary colour groups
2. Add a different logo in the top lefthand corner
I tried running the demo in Codespaces, using make first-run.
The first problem is that it tries to install a lot of packages related to community deployments and tools. I could fix it with the steps described in: #159 (comment)
After that, the toolkit starts and the web interface loads, but it cannot connect to the backend. The deployment and model choice dropdowns are empty, the diagnostic panel shows pending queries.
I suspect that this is due to Codespaces / dev-containers limiting communication between different services, see e.g. https://code.visualstudio.com/docs/devcontainers/create-dev-container#_use-docker-compose , but I could not figure out how to fix it.
No response
In the Postman collection JSON, the path needs to reflect the v1 endpoint (for all requests in the collection); otherwise the request fails with a Not Found.
No response
The web UI is not the latest one at https://coral.cohere.com/. Can it be updated?
No response
When trying to build the project from scratch poetry fails on my default version of Python (3.12.3). The project should ideally support the 3.12 branch.
make first-run ✔ cohere-toolkit
make setup
make[1]: Entering directory '/home/alberto/lab/cohere-toolkit'
poetry install --with setup,community --verbose
InvalidCurrentPythonVersionError
Current Python version (3.12.3) is not allowed by the project (~3.11).
Please change python executable via the "env use" command.
at /usr/lib/python3.12/site-packages/poetry/utils/env/env_manager.py:454 in create_venv
450│ current_python = Version.parse(
451│ ".".join(str(c) for c in env.version_info[:3])
452│ )
453│ if not self._poetry.package.python_constraint.allows(current_python):
→ 454│ raise InvalidCurrentPythonVersionError(
455│ self._poetry.package.python_versions, str(current_python)
456│ )
457│ return env
458│
make[1]: *** [Makefile:23: setup] Error 1
make[1]: Leaving directory '/home/alberto/lab/cohere-toolkit'
make: *** [Makefile:32: first-run] Error 2
When I follow the instructions in the README file and run make first-run, all the services start, but the API calls in the web app do not work.
Regression: likely introduced by this PR.
Current behaviour: Coral is the only frontend interface available in the Toolkit.
Expected behaviour: Create a Slack bot template that can use the RAG components to create a RAG application in Slack.
Current behaviour: The Python interpreter can generate PNGs of plots (e.g. "Plot the tallest buildings in the world"). The Coral interface currently does not display these images.
Expected behaviour: The Coral interface should be able to display these images.
Current behaviour: Today users can't set the env variables in the chat interface. Additionally, we do not check whether the variables are valid, so descriptive frontend error messaging cannot be written.
1. User story: I want to know if the env variable I set is valid or not
2. User story: I want to set my env variables for the chosen model in UI
3. User story: I want to know what env variables were already set up during the CLI so the frontend can select the right default deployment
Expected behaviour: Update the deployment variables API in Coral interface package to set deployment variables and to check the validity of the variables so that descriptive frontend error messaging can be provided if the user sends an invalid variable.
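Validating deployment variables server-side could be as small as the following sketch (the function name and calling convention are illustrative, not the toolkit's API):

```python
def missing_deployment_vars(required: list[str], env: dict) -> list[str]:
    """Return the required variable names that are unset or blank,
    so the frontend can render a descriptive error per variable."""
    return [name for name in required if not env.get(name, "").strip()]
```

The frontend could then show one message per returned name instead of a generic failure.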
Hello :)
I uploaded a written-judgment PDF file, which is written in Korean, and asked a question:
"Has this defendant been punished?"
There is a CORS error when I click the submit button at the bottom right of the UI.
All I did to run the Cohere Toolkit was the command below:
docker run -e COHERE_API_KEY='>>MY_API_KEY<<' -p 8000:8000 -p 4000:4000 ghcr.io/cohere-ai/cohere-toolkit:latest
There is an option to work around CORS as below, but wouldn't adding CORS-related middleware be a more convenient option for users?
I started Chrome in --disable-web-security mode with the command below:
open -n -a /Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --args --user-data-dir="/tmp/chrome_dev_test" --disable-web-security
Then there is another error on browser:
Expected content-type to be text/event-stream, Actual: text/plain; charset=utf-8
And the backend log is:
2024-05-07 09:25:22 INFO: 172.17.0.1:49940 - "POST /conversations/upload_file HTTP/1.1" 200 OK
2024-05-07 09:25:42 2024-05-07 00:25:42,900 - backend.services.logger - INFO - Using deployment CohereDeployment
2024-05-07 09:25:43 2024-05-07 00:25:43,370 - httpx - INFO - HTTP Request: POST https://api.cohere.ai/v1/chat "HTTP/1.1 200 OK"
2024-05-07 09:25:43 2024-05-07 00:25:43,411 - backend.services.logger - INFO - Search queries generated: []
2024-05-07 09:25:43 2024-05-07 00:25:43,411 - backend.services.logger - INFO - Using retrievers: ['LangChainVectorDBRetriever']
2024-05-07 09:25:48 2024-05-07 00:25:48,159 - chromadb.telemetry.product.posthog - INFO - Anonymized telemetry enabled. See https://docs.trychroma.com/telemetry for more information.
2024-05-07 09:25:49 2024-05-07 00:25:49,234 - langchain_cohere.utils - WARNING - Retrying langchain_cohere.embeddings.CohereEmbeddings.embed_with_retry.<locals>._embed_with_retry in 4.0 seconds as it raised IndexError: list index out of range.
2024-05-07 09:25:53 2024-05-07 00:25:53,238 - langchain_cohere.utils - WARNING - Retrying langchain_cohere.embeddings.CohereEmbeddings.embed_with_retry.<locals>._embed_with_retry in 4.0 seconds as it raised IndexError: list index out of range.
2024-05-07 09:25:57 INFO: 172.17.0.1:57308 - "POST /chat-stream HTTP/1.1" 500 Internal Server Error
2024-05-07 09:25:57 ERROR: Exception in ASGI application
2024-05-07 09:25:57 Traceback (most recent call last):
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 419, in run_asgi
2024-05-07 09:25:57 result = await app( # type: ignore[func-returns-value]
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
2024-05-07 09:25:57 return await self.app(scope, receive, send)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
2024-05-07 09:25:57 await super().__call__(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
2024-05-07 09:25:57 await self.middleware_stack(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
2024-05-07 09:25:57 raise exc
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
2024-05-07 09:25:57 await self.app(scope, receive, _send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in __call__
2024-05-07 09:25:57 await self.app(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
2024-05-07 09:25:57 await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-05-07 09:25:57 raise exc
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-05-07 09:25:57 await app(scope, receive, sender)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
2024-05-07 09:25:57 await self.middleware_stack(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
2024-05-07 09:25:57 await route.handle(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
2024-05-07 09:25:57 await self.app(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
2024-05-07 09:25:57 await wrap_app_handling_exceptions(app, request)(scope, receive, send)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-05-07 09:25:57 raise exc
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-05-07 09:25:57 await app(scope, receive, sender)
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
2024-05-07 09:25:57 response = await func(request)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
2024-05-07 09:25:57 raise e
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
2024-05-07 09:25:57 raw_response = await run_endpoint_function(
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
2024-05-07 09:25:57 return await dependant.call(**values)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/src/backend/routers/chat.py", line 97, in chat_stream
2024-05-07 09:25:57 CustomChat().chat(
2024-05-07 09:25:57 File "/workspace/src/backend/chat/custom/custom.py", line 90, in chat
2024-05-07 09:25:57 retriever.retrieve_documents(query)
2024-05-07 09:25:57 File "/workspace/src/backend/tools/retrieval/lang_chain.py", line 72, in retrieve_documents
2024-05-07 09:25:57 db = Chroma.from_documents(documents=pages, embedding=cohere_embeddings)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_community/vectorstores/chroma.py", line 778, in from_documents
2024-05-07 09:25:57 return cls.from_texts(
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_community/vectorstores/chroma.py", line 736, in from_texts
2024-05-07 09:25:57 chroma_collection.add_texts(
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_community/vectorstores/chroma.py", line 275, in add_texts
2024-05-07 09:25:57 embeddings = self._embedding_function.embed_documents(texts)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_cohere/embeddings.py", line 151, in embed_documents
2024-05-07 09:25:57 return self.embed(texts, input_type="search_document")
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_cohere/embeddings.py", line 118, in embed
2024-05-07 09:25:57 embeddings = self.embed_with_retry(
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_cohere/embeddings.py", line 100, in embed_with_retry
2024-05-07 09:25:57 return _embed_with_retry(**kwargs)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 289, in wrapped_f
2024-05-07 09:25:57 return self(f, *args, **kw)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 379, in __call__
2024-05-07 09:25:57 do = self.iter(retry_state=retry_state)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 325, in iter
2024-05-07 09:25:57 raise retry_exc.reraise()
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 158, in reraise
2024-05-07 09:25:57 raise self.last_attempt.result()
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 449, in result
2024-05-07 09:25:57 return self.__get_result()
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
2024-05-07 09:25:57 raise self._exception
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
2024-05-07 09:25:57 result = fn(*args, **kwargs)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/langchain_cohere/embeddings.py", line 98, in _embed_with_retry
2024-05-07 09:25:57 return self.client.embed(**kwargs)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/cohere/client.py", line 153, in embed
2024-05-07 09:25:57 return merge_embed_responses(responses)
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/cohere/utils.py", line 187, in merge_embed_responses
2024-05-07 09:25:57 meta = merge_meta_field([response.meta for response in responses if response.meta])
2024-05-07 09:25:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-07 09:25:57 File "/workspace/.venv/lib/python3.11/site-packages/cohere/utils.py", line 167, in merge_meta_field
2024-05-07 09:25:57 api_version = metas[0].api_version
2024-05-07 09:25:57 ~~~~~^^^
2024-05-07 09:25:57 IndexError: list index out of range
These two may be separate issues, but I think they may have been caused by me running Chrome in a custom way.
If the second issue is related to Korean characters, I'd like to contribute support for Korean documents.
No response
Current behaviour: In the case of multihop tool use, the Coral interface is not displaying citations
Expected behaviour: When multi hop tools are used, we display the citations returned from the Chat API in the Coral interface.
I am trying to decipher the process for using the connector tool within the Cohere Toolkit (or adding my own tool).
I successfully built a custom connection to my Google Drive folder, and I can use the Cohere Python library to converse with my docs via the command line / curl, etc. But I cannot figure out how to do this within the Cohere Toolkit.
From the connector retriever source code, the instructions are:
"""
Plug in your Connector configuration here. For example:
Url: http://example_connector.com/search
Auth: Bearer token for the connector
More details: https://docs.cohere.com/docs/connectors
"""
class ConnectorRetriever(BaseTool):
    def __init__(self, url: str, auth: str):
        self.url = url
        self.auth = auth
But these instructions don't make sense: putting the Url and Auth inside this comment would do nothing. I've also tried hard-coding these values into self.url = [my_url] and self.auth = [my_auth], but the toolkit throws a connection error when using the connector retriever. This happens despite the fact that I can make a curl request using those exact same endpoint and values.
The connector retriever itself in the tools menu has no corresponding field to specify what you're connecting to, but it seems like you could set things up so someone could just plug in a cohere compatible API endpoint within the UI here and facilitate the connection.
I feel like more information or a better process is needed in order to use the connector retriever, or implement custom tools, and deploy an extrapolated version of the quick start connectors to a public endpoint. I've attempted to surface these questions on Discord but things seem similarly opaque. Maybe you could do a tutorial video?
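One way the constructor could actually receive those values is from the environment rather than from a docstring; a hedged sketch with hypothetical variable names (CONNECTOR_URL and CONNECTOR_AUTH are not defined by the toolkit, and the class below is a minimal stand-in mirroring the quoted constructor signature):

```python
import os

class ConnectorRetriever:
    """Minimal stand-in mirroring the toolkit's constructor signature."""

    def __init__(self, url: str, auth: str):
        self.url = url
        self.auth = auth

def connector_from_env(env=os.environ) -> ConnectorRetriever:
    # Hypothetical variable names, chosen for illustration only.
    return ConnectorRetriever(env["CONNECTOR_URL"], env["CONNECTOR_AUTH"])
```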
No response
Error: pg_config executable not found.
Please add the directory containing pg_config to the PATH
or specify the full executable path with the option:
python setup.py build_ext --pg-config /path/to/pg_config build ...
or with the pg_config option in 'setup.cfg'.
----------------------------------------
Command python setup.py egg_info failed with error code 1 in /tmp/pip-build/psycopg2
This is not an issue with the library itself, but with the psycopg2 dependency. A Stack Overflow question related to this problem has 1,500 upvotes; its solution varies based on your OS.