
Comments (19)

AHerik commented on July 26, 2024

Hi @rotemweiss57 Thank you so much for the prompt reply! Indeed, I'm using GPT-3, as I cannot access the GPT-4 API currently due to whatever limit OpenAI has. Thank you for helping me with the issue; the package works excellently. I'm going to be testing it as a way to compile sources for a manuscript that I'm working on. I will update you on how well it works, and hopefully I will be able to use GPT-4 soon.

Thanks again!

from gpt-researcher.

rotemweiss57 commented on July 26, 2024

Hi @AHerik of course! Do you currently have GPT-4 working, or are you using GPT-3?

If you already changed the config file to gpt-3 (based on your error, I assume you have), all you need to do is go to /agent/prompts.py and, in that file, look for the function "generate_search_queries_prompt(question)".

Inside that function, modify this line: "You must respond with a list of strings in the following format: ["query 1", "query 2", "query 3", "query 4"]"

to: "You must respond only with a list of strings in the following json format: ["query 1", "query 2", "query 3", "query 4"]"

Again, it is best to use GPT-4, but this should be stable enough.

Hope it helps!
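For reference, the suggested edit lands inside generate_search_queries_prompt in /agent/prompts.py. The sketch below is a rough approximation of that function; everything except the quoted instruction line is an assumption about the repo's actual code, not the real implementation:

```python
# Hypothetical sketch of the edited function in agent/prompts.py.
# Only the final instruction line comes from the suggestion above;
# the rest of the prompt wording is assumed for illustration.
def generate_search_queries_prompt(question: str) -> str:
    """Builds the prompt that asks the model for a list of search queries."""
    return (
        f'Write 4 search queries to research the following question '
        f'objectively: "{question}"\n'
        'You must respond only with a list of strings in the following '
        'json format: ["query 1", "query 2", "query 3", "query 4"]'
    )
```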


rotemweiss57 commented on July 26, 2024

Yeah, it shows no access. They said: "we gave all API users who have a history of successful payments access to the GPT-4 API (8k). We plan to open up access to new developers by the end of July 2023".

In the meantime, you can change it to gpt-3.5 in the config file just to see that it runs.


firerish commented on July 26, 2024

I made the following change in the config file:

-        self.smart_llm_model = os.getenv("SMART_LLM_MODEL", "gpt-4")
+        self.smart_llm_model = os.getenv("SMART_LLM_MODEL", "gpt-3.5-turbo-16k")

But I get the following error:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ec2-user/.local/lib/python3.9/site-packages/uvicorn/protocols/websockets/wsproto_impl.py", line 249, in run_asgi
    result = await self.app(self.scope, self.receive, self.send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/applications.py", line 289, in __call__
    await super().__call__(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/middleware/errors.py", line 149, in __call__
    await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/routing.py", line 341, in handle
    await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/routing.py", line 82, in app
    await func(session)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/routing.py", line 324, in app
    await dependant.call(**values)
  File "/home/ec2-user/gpt-researcher/main.py", line 50, in websocket_endpoint
    await manager.start_streaming(task, report_type, agent, websocket)
  File "/home/ec2-user/gpt-researcher/agent/run.py", line 38, in start_streaming
    report, path = await run_agent(task, report_type, agent, websocket)
  File "/home/ec2-user/gpt-researcher/agent/run.py", line 43, in run_agent
    check_openai_api_key()
  File "/home/ec2-user/gpt-researcher/config/config.py", line 82, in check_openai_api_key
    exit(1)
  File "/usr/lib64/python3.9/_sitebuiltins.py", line 26, in __call__
    raise SystemExit(code)
SystemExit: 1

I can't see what is causing this in the code.


rotemweiss57 commented on July 26, 2024

Please do the following:

      self.fast_llm_model = os.getenv("FAST_LLM_MODEL", "gpt-3.5-turbo-16k")
      self.smart_llm_model = os.getenv("SMART_LLM_MODEL", "gpt-3.5-turbo-16k")


firerish commented on July 26, 2024

That's what I did. I also found that gpt-4 is referenced in research_agent.py, and changed it there as well. Now I'm getting the following error:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ec2-user/.local/lib/python3.9/site-packages/uvicorn/protocols/websockets/wsproto_impl.py", line 249, in run_asgi
    result = await self.app(self.scope, self.receive, self.send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/applications.py", line 289, in __call__
    await super().__call__(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/middleware/errors.py", line 149, in __call__
    await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/routing.py", line 341, in handle
    await self.app(scope, receive, send)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/starlette/routing.py", line 82, in app
    await func(session)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/fastapi/routing.py", line 324, in app
    await dependant.call(**values)
  File "/home/ec2-user/gpt-researcher/main.py", line 50, in websocket_endpoint
    await manager.start_streaming(task, report_type, agent, websocket)
  File "/home/ec2-user/gpt-researcher/agent/run.py", line 39, in start_streaming
    report, path = await run_agent(task, report_type, agent, websocket)
  File "/home/ec2-user/gpt-researcher/agent/run.py", line 53, in run_agent
    await assistant.conduct_research()
  File "/home/ec2-user/gpt-researcher/agent/research_agent.py", line 133, in conduct_research
    search_queries = await self.create_search_queries()
  File "/home/ec2-user/gpt-researcher/agent/research_agent.py", line 91, in create_search_queries
    return json.loads(result)
  File "/usr/lib64/python3.9/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python3.9/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 81)

Looks like gpt-3.5 is returning something that doesn't conform to JSON format? Seems strange. I can't see what is being decoded. I'm not proficient in Python, and so far my efforts to debug this using print() statements haven't been successful; stdout seems to be redirected somewhere. I also tried import logging; logging.info(), but that output is also getting swallowed somewhere.
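The "Extra data" error means json.loads successfully parsed one complete JSON value and then found more text after it, which is exactly what happens when the model emits several JSON arrays instead of a single array of strings. A minimal reproduction (the sample string is hypothetical, not the actual model output):

```python
import json

# A response like the one gpt-3.5 sometimes produces: several JSON
# arrays separated by whitespace instead of one array of strings.
bad = '["query 1"]\n["query 2"]'

try:
    json.loads(bad)
except json.JSONDecodeError as e:
    # json.loads parses the first array, then fails on the leftover text.
    print(e.msg)  # prints "Extra data"
```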


rotemweiss57 commented on July 26, 2024

Yes, if you look at prompts.py at generate_search_queries_prompt, it asks the model to produce the list of queries in a certain format. It is completely stable with GPT-4 but not with GPT-3.5. I will dig into modifying the prompt to be more stable with GPT-3.5; in the meantime, try to run it again.

Add a print here to debug it:

    async def create_search_queries(self):
        """ Creates the search queries for the given question.
        Args: None
        Returns: list[str]: The search queries for the given question
        """
        result = await self.call_agent(prompts.generate_search_queries_prompt(self.question))
        print(result)
        await self.websocket.send_json({"type": "logs", "output": f"🧠 I will conduct my research based on the following queries: {result}..."})
        return json.loads(result)


firerish commented on July 26, 2024

I added

print(f"result = {result}", file=sys.stderr, flush=True)

after self.call_agent and got what looks like an invalid json string:

result = ["<query1>"] ["<query2>"] ["<query3>"] ["<query4>"]
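Given output like that, one workaround (not part of the project, just a hypothetical fallback) is to try strict JSON first and, on failure, pull every quoted string out of the response, so the malformed bracketing no longer matters:

```python
import json
import re

def parse_queries(result: str) -> list[str]:
    """Parse a model response into a list of query strings.

    Tries strict JSON first; falls back to collecting every
    double-quoted substring when the response is malformed.
    """
    try:
        data = json.loads(result)
        if isinstance(data, list):
            return data
    except json.JSONDecodeError:
        pass
    # Fallback: grab every double-quoted substring in order.
    return re.findall(r'"([^"]+)"', result)

# Handles both the well-formed and the malformed case:
print(parse_queries('["query 1", "query 2"]'))
print(parse_queries('["<query1>"] ["<query2>"] ["<query3>"]'))
```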


rotemweiss57 commented on July 26, 2024

This edit to the prompt is slightly more stable; please update generate_search_queries_prompt in the meantime:

You must respond only with a list of strings in the following json format: ["query 1", "query 2", "query 3", "query 4"]

Thank you!


firerish commented on July 26, 2024

That helped; now I'm getting "cannot find Chrome binary". I'll install Chrome now and report back.


firerish commented on July 26, 2024

Hmm, it didn't throw an error, but it generated an empty report:

Agent Output

Thinking about research questions for the task...
Total research words: 679
Writing research_report for research task: <my input>
End time: 2023-07-10 23:04:24.691949
Total run time: 0:00:01.192128

Research Report


rotemweiss57 commented on July 26, 2024

Please empty the output directory and run again.


firerish commented on July 26, 2024

Now it started downloading things for the research, but I got this error:

An error occurred while processing the url <url>: [Errno 26] Text file busy: '/home/ec2-user/.wdm/drivers/chromedriver/linux64/114.0.5735.90/chromedriver'

Just in case, this is running on an "Amazon-Linux" instance.

By the way, thanks for the hand-holding, you're awesome!


rotemweiss57 commented on July 26, 2024

Of course! Can you try running the project locally?


firerish commented on July 26, 2024

I'm running it on an AWS instance; I can't run it locally. My Mac is too old, brew doesn't support my version of OS X anymore, and I can't install Python 3.


rotemweiss57 commented on July 26, 2024

Got it. Okay, I will try playing with an instance as well and update you. We haven't deployed it on an instance yet. If you have any insights, feel free to share!


firerish commented on July 26, 2024

It's cheaper to use Docker; I don't want you spending money on my behalf. :)

(I should try that too, it's just that my drive is almost full, haha, I should change my mac)


AHerik commented on July 26, 2024

Hi @firerish @rotemweiss57 ! I'm having the same issue:

json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 51)

I can run the project locally on my Mac, but I was unable to follow the modifications you suggested, which might have worked locally. Sorry about that; I'm not very experienced with package development. Could you please suggest some changes I can make to prompts.py to see if it works on my end? Thanks for this wonderful package, by the way; I saw it on LinkedIn and was immediately interested!


assafelovic commented on July 26, 2024

Happy this is resolved. Closing this thread for now.

