Comments (14)
Fixed by adding:

```python
class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
```
Can't replicate. Godspeed, weary traveler...
So I'm having the same issue here. I added the code you suggested but got a formatting error due to a missing indent; now I'm getting an error that BaseLLMProvider is not defined. I apologize if the last two lines of code are garbage, I'm an aspiring amateur at best trying to make this all work. Got Autogen working with LM Studio, now I just need AutoGroq to complete the think tank.
```python
# User-specific configurations
LLM_PROVIDER = "lmstudio"
GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"
LMSTUDIO_API_URL = "http://localhost:1234/v1/chat/completions"
OLLAMA_API_URL = "http://127.0.0.1:11434/api/generate"
OPENAI_API_KEY = "0987654321"
OPENAI_API_URL = "https://api.openai.com/v1/chat/completions"

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = api_url
        self.api_key = api_key
```
Still getting an error:

```
NameError: name 'BaseLLMProvider' is not defined

Traceback:
File "C:\Users\shake\.conda\envs\Ag\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "C:\Users\shake\Ag\AutoGroq\AutoGroq\main.py", line 3, in <module>
    from config import LLM_PROVIDER, MODEL_TOKEN_LIMITS
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config.py", line 17, in <module>
    from config_local import *
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config_local.py", line 10, in <module>
    class LmstudioProvider(BaseLLMProvider):
                           ^^^^^^^^^^^^^^^
```
Is this your model in LM Studio?: instructlab/granite-7b-lab-GGUF
If not, you'll have to tweak your config.py...
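For example, the lmstudio branch of config.py keys MODEL_TOKEN_LIMITS by the exact model identifier LM Studio reports. A sketch of the change (the token limit of 4096 below is a placeholder, not a value from this thread — check your model's actual context length):

```python
# In config.py, under the `elif LLM_PROVIDER == "lmstudio":` branch,
# replace the default entry with the model name LM Studio shows.
# The 4096 limit here is an example value only.
MODEL_TOKEN_LIMITS = {
    'instructlab/granite-7b-lab-GGUF': 4096,
}
```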
It's not, and I did actually go through my config.py and change the entries to match the model I'm using before getting this error. This is my config.py:
```python
import os

# Get user home directory
home_dir = os.path.expanduser("~")
default_db_path = f'{home_dir}/.autogenstudio/database.sqlite'

# Default configurations
DEFAULT_LLM_PROVIDER = "groq"
DEFAULT_GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"
DEFAULT_LMSTUDIO_API_URL = "http://localhost:1234/v1/chat/completions"
DEFAULT_OLLAMA_API_URL = "http://127.0.0.1:11434/api/generate"
DEFAULT_OPENAI_API_KEY = None
DEFAULT_OPENAI_API_URL = "https://api.openai.com/v1/chat/completions"

# Try to import user-specific configurations from config_local.py
try:
    from config_local import *
except ImportError:
    pass

# Set the configurations using the user-specific values if available,
# otherwise use the defaults
LLM_PROVIDER = locals().get('LLM_PROVIDER', DEFAULT_LLM_PROVIDER)
GROQ_API_URL = locals().get('GROQ_API_URL', DEFAULT_GROQ_API_URL)
LMSTUDIO_API_URL = locals().get('LMSTUDIO_API_URL', DEFAULT_LMSTUDIO_API_URL)
OLLAMA_API_URL = locals().get('OLLAMA_API_URL', DEFAULT_OLLAMA_API_URL)
OPENAI_API_KEY = locals().get('OPENAI_API_KEY', DEFAULT_OPENAI_API_KEY)
OPENAI_API_URL = locals().get('OPENAI_API_URL', DEFAULT_OPENAI_API_URL)

API_KEY_NAMES = {
    "groq": "GROQ_API_KEY",
    "lmstudio": None,
    "ollama": None,
    "openai": "OPENAI_API_KEY",
    # Add other LLM providers and their respective API key names here
}

# Retry settings
MAX_RETRIES = 3
RETRY_DELAY = 2  # in seconds
RETRY_TOKEN_LIMIT = 5000

# Model configurations
if LLM_PROVIDER == "groq":
    API_URL = GROQ_API_URL
    MODEL_TOKEN_LIMITS = {
        'mixtral-8x7b-32768': 32768,
        'llama3-70b-8192': 8192,
        'llama3-8b-8192': 8192,
        'gemma-7b-it': 8192,
    }
elif LLM_PROVIDER == "lmstudio":
    API_URL = LMSTUDIO_API_URL
    MODEL_TOKEN_LIMITS = {
        'Qwen/CodeQwen1.5-7B-Chat-GGUF': 64000,
    }
elif LLM_PROVIDER == "openai":
    API_URL = OPENAI_API_URL
    MODEL_TOKEN_LIMITS = {
        'gpt-4o': 4096,
    }
elif LLM_PROVIDER == "ollama":
    API_URL = OLLAMA_API_URL
    MODEL_TOKEN_LIMITS = {
        'llama3': 8192,
    }
else:
    MODEL_TOKEN_LIMITS = {}

# Database path
# AUTOGEN_DB_PATH = "/path/to/custom/database.sqlite"
AUTOGEN_DB_PATH = os.environ.get('AUTOGEN_DB_PATH', default_db_path)

MODEL_CHOICES = {
    'default': None,
    'gemma-7b-it': 8192,
    'gpt-4o': 4096,
    'Qwen/CodeQwen1.5-7B-Chat-GGUF': 64000,
    'llama3': 8192,
    'llama3-70b-8192': 8192,
    'llama3-8b-8192': 8192,
    'mixtral-8x7b-32768': 32768
}
```
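For reference, the override pattern this config.py relies on can be demonstrated in isolation: `from config_local import *` injects any user-defined names into the module namespace, and `locals().get(name, default)` falls back to the default only when config_local never defined that name. A minimal sketch (the star-import is faked with a plain assignment here):

```python
DEFAULT_LLM_PROVIDER = "groq"

# Pretend `from config_local import *` defined this name:
LLM_PROVIDER = "lmstudio"

# Falls back to the default only when the name was never defined:
resolved = locals().get('LLM_PROVIDER', DEFAULT_LLM_PROVIDER)
missing = locals().get('GROQ_API_URL', "default-url")

print(resolved)  # lmstudio (the user override wins)
print(missing)   # default-url (no override present)
```

This is also why stray class definitions in config_local.py run at import time: the star-import executes the whole file, so an undefined base class fails immediately.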
I put the error, the config.py, and the config_local.py into GPT and it said:
The error message indicates that BaseLLMProvider is not defined. This typically happens when the module or class BaseLLMProvider is not imported or not available in the current namespace.
From your config.py file, it seems like BaseLLMProvider should be imported from somewhere. However, in the provided code, I don't see any import statement for BaseLLMProvider.
To fix this issue, you need to ensure that BaseLLMProvider is imported correctly before it's referenced. If BaseLLMProvider is supposed to be part of the config_local.py file, then you should make sure that it's defined there or imported from wherever it's defined.
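In other words, config_local.py references a name Python has never seen at that point. A self-contained sketch of what the file would need in order to run — note that the stub base class below is only a stand-in, since the real BaseLLMProvider lives in AutoGroq's own code and its import path depends on your AutoGroq version:

```python
from abc import ABC, abstractmethod

# Stand-in for AutoGroq's BaseLLMProvider; in a real config_local.py
# you would import the actual class instead of defining this stub.
class BaseLLMProvider(ABC):
    @abstractmethod
    def send_request(self, data):
        ...

class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = api_url
        self.api_key = api_key

    def send_request(self, data):
        # Placeholder: a real provider would POST `data` to self.api_url.
        raise NotImplementedError

provider = LmstudioProvider("http://localhost:1234/v1/chat/completions")
print(provider.api_url)
```

The method name `send_request` is an assumption for illustration; the point is only that the base class must exist in the namespace before the subclass definition executes.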
When you're trying to run it off LM Studio to test for the issue, what do your config.py and config_local.py look like? There's got to be something you're doing differently that's making it load.
So I had the version of that you put before, which was slightly different:

```python
class LmstudioProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        self.api_url = "http://localhost:1234/v1/chat/completions"
```

That gave the BaseLLMProvider not defined error. (After posting, the underscores on either side of `init` aren't showing up, but they did in your post I copied it from.)

If I use the version you just posted, with `*` in place of `_`, I get the following error:

```
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config_local.py", line 11
    def **init**(self, api_url, api_key=None):
        ^
SyntaxError: invalid syntax
```
```
Traceback:
File "C:\Users\shake\.conda\envs\Ag\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "C:\Users\shake\Ag\AutoGroq\AutoGroq\main.py", line 3, in <module>
    from config import LLM_PROVIDER, MODEL_TOKEN_LIMITS
File "C:\Users\shake\Ag\Autogroq\AutoGroq\config.py", line 17, in <module>
    from config_local import *
```
I did also just download the latest config.py (didn't bother downloading the latest config_local.py as it didn't look like anything had changed), and it made no difference for the syntax error.
I also tried removing the asterisks, both because the syntax error highlighted `init` and because they disappeared after I posted it in here for some reason, so I ended up with this:

```python
def init(self, api_url, api_key=None):
```

Instead of:

```python
def **init**(self, api_url, api_key=None):
```

(I tried putting the asterisks in quotation marks just to keep them from disappearing in the post here, but for some reason they still become invisible and `init` becomes italicized.)

But making that change just gave the BaseLLMProvider not defined error again anyway, so it made no difference.
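The two failure modes line up with what Python's parser accepts: `**` is never legal inside a `def` name, while a plain `init` parses fine but simply isn't the `__init__` constructor (markdown renders `__init__` as bold "init", which is how the copy-paste got mangled). A quick check with `compile()`:

```python
# Three variants of the same line, as they might survive a copy-paste
# from rendered markdown, checked against Python's parser.
snippets = {
    "asterisks": "def **init**(self): pass",
    "plain name": "def init(self): pass",
    "dunder": "def __init__(self): pass",
}

results = {}
for label, src in snippets.items():
    try:
        compile(src, "<config_local.py>", "exec")
        results[label] = "parses"
    except SyntaxError:
        results[label] = "SyntaxError"

print(results)
# {'asterisks': 'SyntaxError', 'plain name': 'parses', 'dunder': 'parses'}
```

So only the asterisk version raises at import time; the underscore-less version imports cleanly and fails later in a different way, which matches what was observed above.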
Eureka! So based on your comment about having solved the original problem and the files having been updated, I decided to ditch the effort to patch the problem and try to update my repo with your updated files. Ran into some crap about how it couldn't update because it would mess up my config.py, but did some hard-reset thingamabob and then I was able to pull the updated files. Initially it still threw the BaseLLMProvider not defined error because stupid me hadn't erased that stuff we threw in the config_local.py yet. Deleted that, put my model back into the updated config.py, reran AutoGroq, and we're cookin' with gasoline over here now, buddy!
Thanks again, and I'll see you in the future
Glad it worked out... https://discord.gg/DXjFPX84gs