
Comments (7)

yeeeuw commented on August 12, 2024

Re previous comment: Ollama is running, but I believe you may have issues back-referencing the host using host.docker.internal on Linux. I've subsequently tried this on Windows, which produces the exact same error as the original issue around $HOME. Also on Windows, if I follow the guide you referenced, docker compose up doesn't work because the API fails its health check. Disabling the health check seems to generate further issues.
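On Linux, host.docker.internal does not resolve inside containers by default. A common workaround (an assumption here, since the thread doesn't show the compose file, and the service name below is hypothetical) is to map the name to the host gateway in docker-compose.yml:

```yaml
services:
  api:  # hypothetical service name; apply to any container
        # that needs to reach Ollama running on the host
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

The host-gateway keyword requires Docker Engine 20.10 or later; on older installs the docker0 bridge address (typically 172.17.0.1) can be used instead.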

from genai-stack.

samchenghowing commented on August 12, 2024

Which model are you using? Could you provide your .env file?


yeeeuw commented on August 12, 2024

Below is my .env:

#*****************************************************************
# LLM and Embedding Model
#*****************************************************************
LLM=llama2 #or any Ollama model tag, gpt-4, gpt-3.5, or claudev2
EMBEDDING_MODEL=sentence_transformer #or google-genai-embedding-001 openai, ollama, or aws

#*****************************************************************
# Neo4j
#*****************************************************************
NEO4J_URI=neo4j://database:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password

#*****************************************************************
# Langchain
#*****************************************************************
# Optional for enabling Langchain Smith API
#LANGCHAIN_TRACING_V2=true # false
#LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
#LANGCHAIN_PROJECT=#your-project-name
#LANGCHAIN_API_KEY=#your-api-key ls_...

#*****************************************************************
# Ollama
#*****************************************************************
OLLAMA_BASE_URL=http://host.docker.internal:11434

#*****************************************************************
# OpenAI
#*****************************************************************
# Only required when using OpenAI LLM or embedding model
#OPENAI_API_KEY=sk-...

#*****************************************************************
# AWS
#*****************************************************************
# Only required when using AWS Bedrock LLM or embedding model
#AWS_ACCESS_KEY_ID=
#AWS_SECRET_ACCESS_KEY=
#AWS_DEFAULT_REGION=us-east-1

#*****************************************************************
# GOOGLE
#*****************************************************************
# Only required when using GoogleGenai LLM or embedding model
GOOGLE_API_KEY=


samchenghowing commented on August 12, 2024

That seems weird; llama2 should work. What's your runtime environment and docker-compose.yml, then?

***edited
Have you installed Ollama on your host machine? OLLAMA_BASE_URL=http://host.docker.internal:11434/ is how the container reaches the Ollama instance running on your host machine.

P.S. I encountered the same problem while using gemma2:2b and fixed it by following this guide. You might also install Ollama on your host machine and connect to it from the container.


yeeeuw commented on August 12, 2024

OK, I did two things. First, I confirmed that OLLAMA_BASE_URL is set to http://llm:11434; same error as before.

Second, when running Ollama locally and following the guide you cited, I get an error when chatting with the bot at http://localhost:8501/:

ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x73f8ba4db190>: Failed to establish a new connection: [Errno 111] Connection refused'))

This is with OLLAMA_BASE_URL set to http://host.docker.internal:11434.
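Note that the traceback reports host='localhost' even though the configured base URL points at host.docker.internal, which suggests the app fell back to a default. The distinction matters because inside a container, localhost refers to the container itself, not the host. A minimal sketch of which endpoint applies where (the function name and the Linux bridge address are illustrative assumptions, not genai-stack code):

```python
def ollama_base_url(inside_container: bool,
                    linux_without_host_gateway: bool = False) -> str:
    """Pick the endpoint for an Ollama instance installed on the host.

    From inside a container, ``localhost`` resolves to the container
    itself, so the host must be reached via ``host.docker.internal``
    (or the default docker0 bridge IP on Linux when no host-gateway
    mapping is configured).
    """
    if not inside_container:
        # The calling process runs directly on the host.
        return "http://localhost:11434"
    if linux_without_host_gateway:
        # Default docker0 bridge address on Linux (assumption;
        # verify with `ip addr show docker0`).
        return "http://172.17.0.1:11434"
    return "http://host.docker.internal:11434"
```

So a ConnectionError against localhost:11434 from inside a container is expected even when Ollama is running fine on the host.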


samchenghowing commented on August 12, 2024

ConnectionError

The ConnectionError shows that the app can't connect to your local Ollama. Is your local Ollama running? Check it in your terminal with

curl 127.0.0.1:11434

You should get an "Ollama is running" message.

If it is not running, start it with

ollama serve
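The curl check above can also be scripted. A small sketch that probes an endpoint and reports whether anything answers there (the function name is illustrative; the default URL is the one used in the thread):

```python
import urllib.error
import urllib.request


def ollama_reachable(base_url: str = "http://127.0.0.1:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url.

    Ollama's root endpoint replies 200 with the body
    "Ollama is running" when the daemon is up.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: not reachable.
        return False
```

Running this from inside the container against both localhost and host.docker.internal is a quick way to tell whether the problem is Ollama itself or the container-to-host networking.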


yeeeuw commented on August 12, 2024

Tracked it down to a machine-specific issue; this is working now.

