Comments (9)

f-aguzzi commented on July 19, 2024

Two notes:

  • There's probably something in the request chunking module that got broken in a recent update. This is the second issue of this type (exceeding the token limit even on supported models) in less than a week.
  • The OpenAI errors appear because DeepSeek is invoked through the OpenAI module in LangChain. LangChain does not provide direct support for DeepSeek, but DeepSeek models expose an OpenAI-compatible API.
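Because the API is OpenAI-compatible, wiring DeepSeek up is essentially a matter of swapping the base URL on an OpenAI-style client. A minimal sketch of that configuration (the key value is a placeholder, not a real credential):

```python
# DeepSeek speaks an OpenAI-compatible protocol, so an OpenAI-style
# client configuration works once base_url points at DeepSeek.
deepseek_config = {
    "base_url": "https://api.deepseek.com/v1",  # OpenAI-compatible endpoint
    "api_key": "<DEEPSEEK_KEY>",                # placeholder key
    "model": "deepseek-chat",
}
```

This is also why any failure surfaces as an `openai.*` exception even though the provider is DeepSeek.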

from scrapegraph-ai.

VinciGit00 commented on July 19, 2024

Hi, when the input exceeds 32k our algorithm creates another API call, so it should not be a problem.
If you are seeing errors, please send us the script and we can take a look.

davideuler commented on July 19, 2024

Hi, when the input exceeds 32k our algorithm creates another API call, so it should not be a problem. If you are seeing errors, please send us the script and we can take a look.

Thanks for your help, Vinci.

Here is the script. It hits a boundary condition where the user input is just under 32K. When max_tokens is not specified, only part of the expected links are returned because the model runs out of output tokens. When I specify max_tokens as 4096, it fails with this error:

File "/Users/david/.pyenv/versions/3.10.13/envs/scraper/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/Users/david/.pyenv/versions/3.10.13/envs/scraper/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'detail': "This model's maximum context length is 32768 tokens. However, you requested 36317 tokens (32221 in the messages, 4096 in the completion). Please reduce the length of the messages or completion."}
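The 400 is the API's token-budget check failing: prompt tokens plus the requested completion must fit inside the 32,768-token window. A quick sketch with the numbers from the traceback:

```python
# Numbers taken from the error message above.
CONTEXT_WINDOW = 32768    # deepseek-chat maximum context length
prompt_tokens = 32221     # "32221 in the messages"
completion_budget = 4096  # the max_tokens passed in the config

requested = prompt_tokens + completion_budget
print(requested)                       # 36317 -- matches the error message
print(requested > CONTEXT_WINDOW)      # True  -- request is rejected
print(CONTEXT_WINDOW - prompt_tokens)  # 547   -- largest completion that fits
```

So with the prompt this close to 32K, any fixed max_tokens above 547 is guaranteed to trip the check.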

The script:

import os
from dotenv import load_dotenv
from scrapegraphai.graphs import SmartScraperGraph
from scrapegraphai.utils import prettify_exec_info

load_dotenv()

deepseek_key = os.getenv("DEEPSEEK_KEY") 
graph_config = {
   "llm": {
      "api_key": deepseek_key,
      "model": "deepseek-chat",
      "temperature": 0.7,
      "max_tokens": 4096, ## max output tokens limited to 4k for gpt-4o,gpt-4-turbo
      "base_url": "https://api.deepseek.com/v1"
   },
   "embeddings": {
      "model": "ollama/nomic-embed-text",
      "temperature": 0,
      "base_url": "http://localhost:11434",  # set ollama URL
   },
   "headless": False,
   "verbose": True,
}

# ************************************************
# Create the SmartScraperGraph instance and run it
# ************************************************

smart_scraper_graph = SmartScraperGraph(
   prompt="extract all page title and links under,footage-card-wrapper as format of: [{\"title\": \"xxx\", \"link\":\"xxx\" }] ",
   # also accepts a string with the already downloaded HTML code
   source="https://stock.xinpianchang.com/footages/2997636.html",
   config=graph_config
)

result = smart_scraper_graph.run()
print(result)

VinciGit00 commented on July 19, 2024

Please update to the new version.

davideuler commented on July 19, 2024

Please update to the new version.

Thanks for your help. I updated to 1.7.3 and still get the same error:

openai.BadRequestError: Error code: 400 - {'detail': "This model's maximum context length is 32768 tokens. However, you requested 36294 tokens (32198 in the messages, 4096 in the completion). Please reduce the length of the messages or completion."}
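One way to guard against this boundary condition, on either the caller's side or inside the chunking module, would be to clamp max_tokens to whatever room remains after the prompt. clamp_max_tokens below is a hypothetical helper for illustration, not part of scrapegraph-ai:

```python
def clamp_max_tokens(prompt_tokens: int, requested: int,
                     context: int = 32768) -> int:
    """Shrink the completion budget so prompt + completion fits the window."""
    return max(min(requested, context - prompt_tokens), 0)

# With the prompt from this run (32198 tokens), only 570 completion
# tokens fit; asking for 4096 triggers the 400 above.
print(clamp_max_tokens(32198, 4096))  # 570
```

When the prompt already fills (or overflows) the window, the helper returns 0, signalling that the input must be re-chunked before any completion is possible.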

VinciGit00 commented on July 19, 2024

Why an OpenAI error? Have you changed the provider?

davideuler commented on July 19, 2024

The code is pasted above. The model I use is "deepseek-chat". I wonder what causes it to show OpenAI errors.

GabrieleRisso commented on July 19, 2024

I am experiencing the same issue with OpenAI models. As you said, the chunking module is probably broken, which is concerning since it results in a hard error.

VinciGit00 commented on July 19, 2024

Hi, please update to the new beta version. If you still have this problem, please reopen the issue.
