openai.BadRequestError: Error code: 400 - {'error': {'message': 'Range of input length should be [1, 6000] (request id: XXXXXXXXX)', 'type': 'upstream_error', 'param': '400', 'code': 'bad_response_status_code'}} about scrapegraph-ai (3 comments, closed)

wangdongpeng1 commented on June 21, 2024
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Range of input length should be [1, 6000] (request id: XXXXXXXXX)', 'type': 'upstream_error', 'param': '400', 'code': 'bad_response_status_code'}}

Comments (3)

VinciGit00 commented on June 21, 2024

Can you send me the code that produces the error?

wangdongpeng1 commented on June 21, 2024

The test case code is as follows:

from scrapegraphai.graphs import SmartScraperGraph
from scrapegraphai.utils import prettify_exec_info

# ************************************************
# Define the configuration for the graph
# ************************************************

graph_config = {
    "llm": {
        "api_key": "<YOUR API KEY>",
        "model": "oneapi/qwen-turbo",  #  可以是其他能推理的模型
        "base_url": "http://127.0.0.1:13000/v1",  
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://127.0.0.1:11434",  # 设置 Ollama URL
    },
    "headless": False,
    "loader_kwargs": {
        "slow_mo": 10000
    }
}

# ************************************************
# Create the SmartScraperGraph instance and run it
# ************************************************

smart_scraper_graph = SmartScraperGraph(
    prompt="请提取信息项。",
    source="https://search.bilibili.com/article?vt=58474810&keyword=%E6%B5%B7%E5%A4%96%E7%89%88",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)

# ************************************************
# Get graph execution info
# ************************************************

graph_exec_info = smart_scraper_graph.get_execution_info()
print(prettify_exec_info(graph_exec_info))

The full error message:

Traceback (most recent call last):
  File "e:\PythonProjects\myLLMs\scrapegraphaiScript.py", line 36, in <module>
    result = smart_scraper_graph.run()
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\scrapegraphai\graphs\smart_scraper_graph.py", line 118, in run
    self.final_state, self.execution_info = self.graph.execute(inputs)
                                            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\scrapegraphai\graphs\base_graph.py", line 171, in execute
    return self._execute_standard(initial_state)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\scrapegraphai\graphs\base_graph.py", line 110, in _execute_standard
    result = current_node.execute(state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\scrapegraphai\nodes\generate_answer_node.py", line 136, in execute
    answer = single_chain.invoke({"question": user_prompt})
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_core\runnables\base.py", line 2499, in invoke
    input = step.invoke(
            ^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_core\language_models\chat_models.py", line 158, in invoke
    self.generate_prompt(
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_core\language_models\chat_models.py", line 560, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_core\language_models\chat_models.py", line 421, in generate
    raise e
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_core\language_models\chat_models.py", line 411, in generate
    self._generate_with_cache(
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_core\language_models\chat_models.py", line 632, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\langchain_openai\chat_models\base.py", line 522, in _generate
    response = self.client.create(messages=message_dicts, **params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\openai\_utils\_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\openai\resources\chat\completions.py", line 590, in create
    return self._post(
           ^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\openai\_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\openai\_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "D:\anaconda3\envs\scrapegraph-ai\Lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Range of input length should be [1, 6000] (request id: 2024053005451270981315993043381)', 'type': 'upstream_error', 'param': '400', 'code': 'bad_response_status_code'}}

After seeing your reply, I put together a new test case. The first call returned a 502; if you hit that too, you can ignore it. The 400 error should appear on the next run, and the problem should also be reproducible when the llm settings in graph_config are pointed at gpt-3.5-turbo.
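
The upstream endpoint rejects any request whose input falls outside [1, 6000] tokens, which suggests the scraped page content being passed to the model exceeds that limit. One possible workaround sketch, assuming the installed scrapegraphai version honors a "model_tokens" entry for custom OpenAI-compatible endpoints (the key and its effect may vary by version), is to cap the chunk size in the llm config:

graph_config = {
    "llm": {
        "api_key": "<YOUR API KEY>",
        "model": "oneapi/qwen-turbo",
        "base_url": "http://127.0.0.1:13000/v1",
        "model_tokens": 6000,  # cap each chunk at the limit reported in the 400 error
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://127.0.0.1:11434",
    },
    "headless": False,
}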

VinciGit00 commented on June 21, 2024

Please share the output you get.
Please install version 1.5.3b1 and let us know.
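
For reference, a minimal way to pin that pre-release and confirm which version is actually imported, assuming 1.5.3b1 is published on PyPI under the scrapegraphai package name:

# In the scrapegraph-ai environment:
#   pip install --pre scrapegraphai==1.5.3b1
# Then verify the installed version before re-running the test case:
from importlib.metadata import version
print(version("scrapegraphai"))  # expected: 1.5.3b1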
