
Comments (14)

jsarsoun commented on July 30, 2024

Fixed by adding:

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = api_url
        self.api_key = api_key


jgravelle commented on July 30, 2024

Can't replicate. Godspeed, weary traveler...


Shake-Shifter commented on July 30, 2024

So I'm having the same issue here. I added the code you suggested but got a formatting error due to a missing indent, and now I'm getting an error that BaseLLMProvider is not defined. I apologize if the last two lines of code are garbage; I'm an aspiring amateur at best trying to make this all work. I've got Autogen working with LM Studio, so now I just need AutoGroq to complete the think tank.

# User-specific configurations

LLM_PROVIDER = "lmstudio"
GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"
LMSTUDIO_API_URL = "http://localhost:1234/v1/chat/completions"
OLLAMA_API_URL = "http://127.0.0.1:11434/api/generate"
OPENAI_API_KEY = "0987654321"
OPENAI_API_URL = "https://api.openai.com/v1/chat/completions"

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = api_url
        self.api_key = api_key
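
For the subclass above to work at all, BaseLLMProvider has to be in scope inside config_local.py, either imported from wherever AutoGroq defines it or stubbed out locally. A minimal sketch of the stub route, assuming a base class along these lines (the method here is hypothetical, not AutoGroq's actual interface):

# Hypothetical stand-in so the subclass below can at least be defined
class BaseLLMProvider:
    def send_request(self, data):
        raise NotImplementedError

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = api_url
        self.api_key = api_key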



Shake-Shifter commented on July 30, 2024

Still getting an error:

NameError: name 'BaseLLMProvider' is not defined
Traceback:
File "C:\Users\shake\.conda\envs\Ag\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "C:\Users\shake\Ag\AutoGroq\AutoGroq\main.py", line 3, in <module>
    from config import LLM_PROVIDER, MODEL_TOKEN_LIMITS
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config.py", line 17, in <module>
    from config_local import *
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config_local.py", line 10, in <module>
    class LmstudioProvider(BaseLLMProvider):
                           ^^^^^^^^^^^^^^^
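
Worth noting for anyone following along: Python resolves a class statement's base names the moment the module executes, so this failure fires as soon as config.py hits from config_local import *, before anything else in the app could define BaseLLMProvider. A minimal repro of the same NameError, assuming no import of the base class anywhere:

# Repro: base-class names are looked up immediately at class-definition time
class LmstudioProvider(BaseLLMProvider):  # NameError: name 'BaseLLMProvider' is not defined
    pass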


jgravelle commented on July 30, 2024

Is this your model in LM Studio?: instructlab/granite-7b-lab-GGUF

If not, you'll have to tweak your config.py...


Shake-Shifter commented on July 30, 2024

It's not, and I did actually go through my config.py and change the entries to match the model I'm using before getting this error. This is my config.py:

import os

# Get user home directory
home_dir = os.path.expanduser("~")
default_db_path = f'{home_dir}/.autogenstudio/database.sqlite'

# Default configurations
DEFAULT_LLM_PROVIDER = "groq"
DEFAULT_GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"
DEFAULT_LMSTUDIO_API_URL = "http://localhost:1234/v1/chat/completions"
DEFAULT_OLLAMA_API_URL = "http://127.0.0.1:11434/api/generate"
DEFAULT_OPENAI_API_KEY = None
DEFAULT_OPENAI_API_URL = "https://api.openai.com/v1/chat/completions"

# Try to import user-specific configurations from config_local.py
try:
    from config_local import *
except ImportError:
    pass

# Set the configurations using the user-specific values if available, otherwise use the defaults
LLM_PROVIDER = locals().get('LLM_PROVIDER', DEFAULT_LLM_PROVIDER)
GROQ_API_URL = locals().get('GROQ_API_URL', DEFAULT_GROQ_API_URL)
LMSTUDIO_API_URL = locals().get('LMSTUDIO_API_URL', DEFAULT_LMSTUDIO_API_URL)
OLLAMA_API_URL = locals().get('OLLAMA_API_URL', DEFAULT_OLLAMA_API_URL)
OPENAI_API_KEY = locals().get('OPENAI_API_KEY', DEFAULT_OPENAI_API_KEY)
OPENAI_API_URL = locals().get('OPENAI_API_URL', DEFAULT_OPENAI_API_URL)

API_KEY_NAMES = {
    "groq": "GROQ_API_KEY",
    "lmstudio": None,
    "ollama": None,
    "openai": "OPENAI_API_KEY",
    # Add other LLM providers and their respective API key names here
}

# Retry settings
MAX_RETRIES = 3
RETRY_DELAY = 2  # in seconds
RETRY_TOKEN_LIMIT = 5000

# Model configurations
if LLM_PROVIDER == "groq":
    API_URL = GROQ_API_URL
    MODEL_TOKEN_LIMITS = {
        'mixtral-8x7b-32768': 32768,
        'llama3-70b-8192': 8192,
        'llama3-8b-8192': 8192,
        'gemma-7b-it': 8192,
    }
elif LLM_PROVIDER == "lmstudio":
    API_URL = LMSTUDIO_API_URL
    MODEL_TOKEN_LIMITS = {
        'Qwen/CodeQwen1.5-7B-Chat-GGUF': 64000,
    }
elif LLM_PROVIDER == "openai":
    API_URL = OPENAI_API_URL
    MODEL_TOKEN_LIMITS = {
        'gpt-4o': 4096,
    }
elif LLM_PROVIDER == "ollama":
    API_URL = OLLAMA_API_URL
    MODEL_TOKEN_LIMITS = {
        'llama3': 8192,
    }
else:
    MODEL_TOKEN_LIMITS = {}

# Database path
# AUTOGEN_DB_PATH = "/path/to/custom/database.sqlite"  # example override
AUTOGEN_DB_PATH = os.environ.get('AUTOGEN_DB_PATH', default_db_path)

MODEL_CHOICES = {
    'default': None,
    'gemma-7b-it': 8192,
    'gpt-4o': 4096,
    'Qwen/CodeQwen1.5-7B-Chat-GGUF': 64000,
    'llama3': 8192,
    'llama3-70b-8192': 8192,
    'llama3-8b-8192': 8192,
    'mixtral-8x7b-32768': 32768
}
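
The override mechanism in this file is the key to the whole thread: from config_local import * dumps any user-defined names into the module namespace, and each locals().get('NAME', DEFAULT) call falls back to the built-in default when no override was imported. A self-contained sketch of the same pattern, with hypothetical file and setting names:

# settings.py (hypothetical) -- mirrors the config.py pattern above
DEFAULT_TIMEOUT = 30

try:
    from settings_local import *  # may define TIMEOUT, or the file may not exist at all
except ImportError:
    pass

# Use the override if settings_local defined one, otherwise the default
TIMEOUT = locals().get('TIMEOUT', DEFAULT_TIMEOUT)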


Shake-Shifter commented on July 30, 2024

I put the error, the config.py, and the config_local.py into GPT and it said:

The error message indicates that BaseLLMProvider is not defined. This typically happens when the module or class BaseLLMProvider is not imported or not available in the current namespace.

From your config.py file, it seems like BaseLLMProvider should be imported from somewhere. However, in the provided code, I don't see any import statement for BaseLLMProvider.

To fix this issue, you need to ensure that BaseLLMProvider is imported correctly before it's referenced. If BaseLLMProvider is supposed to be part of the config_local.py file, then you should make sure that it's defined there or imported from wherever it's defined.
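
To make that suggestion concrete, config_local.py would need an import along these lines before the class definition; the module path below is a guess, since it depends on where your AutoGroq version actually defines the class:

# config_local.py -- the import path is an assumption; check your AutoGroq source tree
from llm_providers import BaseLLMProvider

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = api_url
        self.api_key = api_key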

When you're trying to run it off LM Studio to test for the issue, what do your config.py and config_local.py look like? There's got to be something you're doing differently that's making it load.



Shake-Shifter commented on July 30, 2024

So I had the version you posted before, which was slightly different:

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = "http://localhost:1234/v1/chat/completions"

That gave the BaseLLMProvider not defined error. However, now that I've posted it here, the underscores on either side of init aren't showing up, even though they did in your post I copied it from.

If I use the version you just posted, with "*" in place of "_", I get the following error:

SyntaxError: File "C:\Users\shake\Ag\Autogroq\AutoGroq\config_local.py", line 11
    def *init*(self, api_url, api_key=None):
        ^
SyntaxError: invalid syntax
Traceback:
File "C:\Users\shake\.conda\envs\Ag\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "C:\Users\shake\Ag\AutoGroq\AutoGroq\main.py", line 3, in <module>
    from config import LLM_PROVIDER, MODEL_TOKEN_LIMITS
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config.py", line 17, in <module>
    from config_local import *


Shake-Shifter commented on July 30, 2024

I also just downloaded the latest config.py (didn't bother downloading the latest config_local.py, as it didn't look like anything had changed), and it made no difference to the syntax error.


Shake-Shifter commented on July 30, 2024

I also tried removing the asterisks, partly because the syntax error highlighted init and partly because they disappeared after I posted them in here for some reason, so I ended up with this:

def init(self, api_url, api_key=None):

Instead of:

def *init*(self, api_url, api_key=None):

(I tried putting the asterisks in quotations just to keep them from disappearing in the post here, but for some reason they still become invisible and init becomes italicized.)

But making that change just gave the "BaseLLMProvider" not defined error again anyway, so it made no difference.
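
The underlying gotcha here: the constructor Python actually calls is the dunder method __init__, with two underscores on each side. A plain init method is never invoked automatically, and literal asterisks in a name are a syntax error. A quick illustration:

class WithPlainInit:
    def init(self, url):  # ordinary method, NOT the constructor
        self.url = url

w = WithPlainInit()  # constructs fine, but init() never runs
# w.url              # would raise AttributeError: init was never called

class WithDunderInit:
    def __init__(self, url):  # this one Python calls automatically
        self.url = url

d = WithDunderInit("http://localhost:1234/v1/chat/completions")
print(d.url)  # the attribute exists because __init__ ran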


Shake-Shifter commented on July 30, 2024

Eureka! So based on your comment about having solved the original problem and the files having been updated, I decided to ditch the effort to patch the problem and just update my repo with your updated files. Ran into some crap about how it can't update because it would mess up my config.py, but did some hard-reset thingamabob and then I was able to pull the updated files. Initially it still threw the BaseLLMProvider not defined error, because stupid me hadn't erased that stuff we threw into config_local.py yet. Deleted that, put my model back into the updated config.py, reran AutoGroq, and we're cookin' with gasoline over here now, buddy!

Thanks again, and I'll see you in the future


jgravelle commented on July 30, 2024

Glad it worked out... https://discord.gg/DXjFPX84gs

