oldaiprojects / oldagixt-frontend
Front end for Agent-LLM.
License: MIT License
After the Agent-LLM instance has been restarted (via Docker or otherwise), selecting an existing agent does nothing: the UI never navigates to that agent.
1. Restart the instance or Docker container
2. Go to Agent-LLM
3. Select Agents
4. Try to select an existing agent
5. The issue will occur
When clicking the agent it should bring up the configuration and settings for the agent.
The agent settings never open when selecting an existing agent.
On "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0.0.0 Safari/537.36"
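A way to narrow this down (the endpoint path is assumed from the backend log line "GET /api/agent/<name>/task/status" elsewhere in these reports, and http://localhost:7437 is the assumed default backend address): after a restart, query the backend directly to see whether the agent data still exists, which separates a backend persistence problem from a frontend navigation one.

```python
import json
from urllib.request import urlopen


def get_json(url: str):
    """Fetch a URL and decode its JSON body."""
    with urlopen(url, timeout=5) as r:
        return json.load(r)


# Example against the assumed default backend address:
# agents = get_json("http://localhost:7437/api/agent")
```

If the backend still returns the agent after the restart, the bug is in the frontend's selection handling rather than in agent persistence.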
When I click Send Message on the frontend for my Azure researcher agent, I get the error below.
A response should be returned.
Unhandled Runtime Error
AxiosError: Network Error
Call Stack
AxiosError
node_modules/axios/lib/core/AxiosError.js (22:0)
handleError
node_modules/axios/lib/adapters/xhr.js (158:0)
No response
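A browser-side "AxiosError: Network Error" usually means the request never completed: either the backend is down/unreachable, or the response was blocked (for example by CORS). This sketch (http://localhost:7437 is the assumed default backend address) checks reachability from outside the browser; if it succeeds while the browser fails, a blocked response is the likelier culprit.

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen


def backend_reachable(base: str = "http://localhost:7437") -> bool:
    """Return True if the backend answers at all, even with an HTTP error."""
    try:
        with urlopen(base, timeout=5):
            return True
    except HTTPError:
        return True   # got an HTTP response: the server is up
    except URLError:
        return False  # connection refused / unreachable
```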
When clicking Start a Task on an agent, the run status and the text log of what is happening never appear.
I can see in the Docker log that the backend is making requests to the LLM, but the run status in the frontend is never updated, and the task log never appears in the frontend.
May or may not be related to bugs #1 and #2
The run status should update and the log should appear.
No run status change and no log.
No response
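The backend log elsewhere in these reports shows the frontend polling "GET /api/agent/<name>/task/status". Polling that endpoint directly (the base address and agent name below are placeholders) shows whether the backend reports progress even when the frontend never updates.

```python
import json
from urllib.request import urlopen


def poll_status(base: str, agent: str, tries: int = 3) -> list:
    """Fetch the task status endpoint a few times and return the payloads."""
    url = f"{base}/api/agent/{agent}/task/status"
    results = []
    for _ in range(tries):
        with urlopen(url, timeout=5) as r:
            results.append(json.load(r))
    return results


# Example: poll_status("http://localhost:7437", "Vicuna")
```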
When I run this frontend package as a container, it accesses the backend API at http://localhost:7437. I tried changing the container environment variable NEXT_PUBLIC_API_URI to another address so the frontend could be reached remotely, but the frontend container still calls http://localhost:7437.
1. git clone https://github.com/Josh-XT/Agent-LLM
2. cd Agent-LLM/
3. sudo docker-compose up -d to run the container agent-llm_frontend_1
4. Set the environment variable NEXT_PUBLIC_API_URI via a management service such as Portainer
5. Open http://{ip}:3000 and check with the browser's developer tools
The frontend should access the backend as it does locally.
It shows a network error, and the frontend still accesses http://localhost:7437.
No response
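A likely cause: Next.js inlines NEXT_PUBLIC_* variables into the client bundle at build time, so setting NEXT_PUBLIC_API_URI on an already-built image has no effect at runtime; the image has to be rebuilt with the new value. This sketch (the `.next/` build output path and the default URI are assumptions) confirms whether the old URI is baked into the bundle.

```python
from pathlib import Path


def files_containing(build_dir: str, needle: str = "http://localhost:7437") -> list:
    """List built JS files that contain the given string (e.g. a baked-in API URI)."""
    return [
        str(p)
        for p in Path(build_dir).rglob("*.js")
        if needle in p.read_text(errors="ignore")
    ]


# Example: files_containing(".next") inside the frontend container
```

If the default URI shows up in the built bundle, rebuilding with NEXT_PUBLIC_API_URI set at build time (rather than only at container start) should fix the remote-access case.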
There are other settings for API keys that are not captured in agent settings and currently have to come from the .env file.
An easy workaround is to let users add custom keys in their agent settings: if something like GOOGLE_API_KEY is missing, they could hit a + button, add it as a key, and give it a value to assign to the agent.
No response
No response
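A hypothetical sketch of the requested feature (the names and settings shape are illustrative, not the project's actual API): agent settings carry a dict of custom keys that the user extends via the proposed "+" button.

```python
def add_custom_key(settings: dict, key: str, value: str) -> dict:
    """Attach a user-defined API key (e.g. GOOGLE_API_KEY) to agent settings."""
    settings.setdefault("custom_keys", {})[key] = value
    return settings


# Example: add_custom_key({"provider": "openai"}, "GOOGLE_API_KEY", "...")
```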
Bug when starting a new task:
Executing task 1: Develop a task list.
Exception in thread Thread-2 (run_task):
Traceback (most recent call last):
File "/home/brice/miniconda3/envs/agent/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
self.run()
File "/home/brice/miniconda3/envs/agent/lib/python3.10/threading.py", line 953, in run
self._target(*self._args, **self._kwargs)
File "/home/brice/Agent-LLM/AgentLLM.py", line 364, in run_task
result = self.run(task=task["task_name"], prompt="execute")
File "/home/brice/Agent-LLM/AgentLLM.py", line 149, in run
formatted_prompt, unformatted_prompt = self.format_prompt(
File "/home/brice/Agent-LLM/AgentLLM.py", line 128, in format_prompt
formatted_prompt = self.custom_format(
File "/home/brice/Agent-LLM/AgentLLM.py", line 101, in custom_format
return re.sub(pattern, replace, string)
File "/home/brice/miniconda3/envs/agent/lib/python3.10/re.py", line 209, in sub
return _compile(pattern, flags).sub(repl, string, count)
TypeError: sequence item 3: expected str instance, list found
Ubuntu 22.04
Conda environment with python 3.10
No docker
Launch the backend with python app.py and the frontend with yarn run dev
Create a new Agent, Oobabooga, and launch a task
No error
==Output==
Executing task 1: Develop a task list.
Exception in thread Thread-2 (run_task):
Traceback (most recent call last):
File "/home/brice/miniconda3/envs/agent/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
self.run()
File "/home/brice/miniconda3/envs/agent/lib/python3.10/threading.py", line 953, in run
self._target(*self._args, **self._kwargs)
File "/home/brice/Agent-LLM/AgentLLM.py", line 364, in run_task
result = self.run(task=task["task_name"], prompt="execute")
File "/home/brice/Agent-LLM/AgentLLM.py", line 149, in run
formatted_prompt, unformatted_prompt = self.format_prompt(
File "/home/brice/Agent-LLM/AgentLLM.py", line 128, in format_prompt
formatted_prompt = self.custom_format(
File "/home/brice/Agent-LLM/AgentLLM.py", line 101, in custom_format
return re.sub(pattern, replace, "".join(string))
File "/home/brice/miniconda3/envs/agent/lib/python3.10/re.py", line 209, in sub
return _compile(pattern, flags).sub(repl, string, count)
TypeError: sequence item 3: expected str instance, list found
INFO: 127.0.0.1:55720 - "GET /api/agent/Vicuna/task/status HTTP/1.1" 200 OK
No response
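The TypeError in the traceback can be reproduced in isolation: `"".join()` requires every element to be a string, and the message points at the element at index 3. The concrete prompt parts below are made up for illustration; a defensive sketch of a fix (not necessarily the project's) is to stringify non-str parts before joining.

```python
import re

# Minimal reproduction: one prompt part is a list rather than a string
# (item 3 here, matching "sequence item 3" in the reported error).
parts = ["Task: ", "execute", "\n", ["step 1", "step 2"]]
try:
    re.sub(r"\{task\}", "x", "".join(parts))
except TypeError as e:
    print(e)  # sequence item 3: expected str instance, list found

# Defensive fix sketch: coerce non-str parts before joining.
joined = "".join(p if isinstance(p, str) else str(p) for p in parts)
```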
00h00m00s 0/0: : ERROR: [Errno 2] No such file or directory: 'dev'
Follow the install guide
Complete
Install and run
No response
When you assign a task to the agent, the results will not show in the GUI, but the output is written to a text file in the agent folder.
Find 5 latest news headlines on May, 05, 2023 related to Technology..txt
Assign a task to the agent; the results do not show in the GUI, but the output is written to a text file in the agent folder.
The output of the agent should appear in the GUI under the task agent.
Output is generated by the agent but does not appear in the GUI.
127.0.0.1 - - [07/May/2023 18:05:40] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 105.28 seconds (0.87 tokens/s, 92 tokens, context 24, seed 200956049)
127.0.0.1 - - [07/May/2023 18:08:28] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 496.88 seconds (0.64 tokens/s, 318 tokens, context 398, seed 763687500)
127.0.0.1 - - [07/May/2023 18:16:45] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 166.63 seconds (0.00 tokens/s, 0 tokens, context 409, seed 469111486)
127.0.0.1 - - [07/May/2023 18:19:32] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 100.66 seconds (0.59 tokens/s, 59 tokens, context 102, seed 4115488)
127.0.0.1 - - [07/May/2023 18:21:12] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 867.37 seconds (0.78 tokens/s, 673 tokens, context 416, seed 948141595)
127.0.0.1 - - [07/May/2023 18:35:40] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 357.99 seconds (0.00 tokens/s, 0 tokens, context 855, seed 588820203)
127.0.0.1 - - [07/May/2023 18:41:38] "POST /api/v1/generate HTTP/1.1" 200 -
Output generated in 116.96 seconds (0.50 tokens/s, 59 tokens, context 128, seed 755310477)
127.0.0.1 - - [07/May/2023 18:43:35] "POST /api/v1/generate HTTP/1.1" 200 -
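Since the report says results land in a .txt file inside the agent folder instead of the GUI, a stopgap (the folder layout is assumed from the filename in the report) is to collect those files directly so the output is at least visible:

```python
from pathlib import Path


def collect_outputs(agent_dir: str) -> dict:
    """Map each .txt output filename in the agent folder to its contents."""
    return {f.name: f.read_text() for f in Path(agent_dir).glob("*.txt")}
```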
I've edited the .env file with my OpenAI API key and configured the agent via the web UI. I've also tried the default gpt-3.5-turbo, but nothing happens when I use any of the Chat, Instruct, or Task agent commands.
Any ideas :3
Also, I do have access to the GPT-4 API, and I can successfully use it with Auto-GPT.
Install the latest alpha version or the main version and follow the docker install instructions.
I expect some type of response back, but get nothing after trying the chat, instruct, or task agent commands.
For example, I click on "Start Pursuing Task" and nothing happens.
When I start the frontend, this error appears:
error - unhandledRejection: Error: connect ECONNREFUSED ::1:7437
at AxiosError.from (file:///Users/timur/Downloads/Agent-LLM-main/frontend/node_modules/axios/lib/core/AxiosError.js:89:14)
at RedirectableRequest.handleRequestError (file:///Users/timur/Downloads/Agent-LLM-main/frontend/node_modules/axios/lib/adapters/http.js:591:25)
at RedirectableRequest.emit (node:events:513:28)
at eventHandlers. (/Users/timur/Downloads/Agent-LLM-main/frontend/node_modules/follow-redirects/index.js:14:24)
at ClientRequest.emit (node:events:513:28)
at Socket.socketErrorListener (node:_http_client:502:9)
at Socket.emit (node:events:513:28)
at emitErrorNT (node:internal/streams/destroy:151:8)
at emitErrorCloseNT (node:internal/streams/destroy:116:3)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
digest: undefined
}
Promise { }
{
mutate: [Function (anonymous)],
data: [Getter],
error: [Getter],
isValidating: [Getter],
isLoading: [Getter]
}
error - unhandledRejection: Error: connect ECONNREFUSED ::1:7437
at AxiosError.from (file:///Users/timur/Downloads/Agent-LLM-main/frontend/node_modules/axios/lib/core/AxiosError.js:89:14)
at RedirectableRequest.handleRequestError (file:///Users/timur/Downloads/Agent-LLM-main/frontend/node_modules/axios/lib/adapters/http.js:591:25)
at RedirectableRequest.emit (node:events:513:28)
at eventHandlers. (/Users/timur/Downloads/Agent-LLM-main/frontend/node_modules/follow-redirects/index.js:14:24)
at ClientRequest.emit (node:events:513:28)
at Socket.socketErrorListener (node:_http_client:502:9)
at Socket.emit (node:events:513:28)
at emitErrorNT (node:internal/streams/destroy:151:8)
at emitErrorCloseNT (node:internal/streams/destroy:116:3)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
digest: undefined
}
Start backend
Start frontend
Go to the agent page
No errors
This error occurs.
No response
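"connect ECONNREFUSED ::1:7437" shows that Node resolved "localhost" to the IPv6 loopback ::1, while the backend may only be listening on the IPv4 loopback 127.0.0.1. Probing both loopbacks tells the two apart; if only 127.0.0.1 answers, pointing the frontend at http://127.0.0.1:7437 instead of localhost is a plausible workaround (the env var name NEXT_PUBLIC_API_URI is taken from the other reports here).

```python
import socket


def probe(host: str, port: int = 7437, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# If the backend binds only to IPv4, this prints True for 127.0.0.1
# and False for ::1:
# for host in ("127.0.0.1", "::1"):
#     print(host, probe(host))
```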