
parea-ai / parea-sdk-py

Python SDK for experimenting, testing, evaluating & monitoring LLM-powered applications - Parea AI (YC S23)

Home Page: https://docs.parea.ai/sdk/python

License: Apache License 2.0

Makefile 0.79% Dockerfile 0.19% Python 90.55% Jupyter Notebook 8.47%
llm llm-evaluation llm-tools llmops llms-benchmarking llm-eval llm-evaluation-framework llm-evaluation-toolkit prompt-engineering generative-ai


parea-sdk-py's Issues

Parea wrapper not re-raising root exception

๐Ÿ› Bug Report

The Parea wrapper code returns from inside a finally block, which swallows any exception raised by the underlying OpenAI call. This makes failures very difficult to log, monitor, and debug: the root exception is silently discarded by Parea and errors only surface downstream.

Link to offending code.

return self._cleanup_trace(trace_id, start_time, error, cache_hit, args, kwargs, response)
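
A `return` in a `finally` block unconditionally replaces any in-flight exception. One way the wrapper could finalize the trace without swallowing errors is sketched below; `_cleanup_trace` and `traced_call` here are hypothetical stand-ins, not the SDK's actual API:

```python
import time


def _cleanup_trace(trace_id, start_time, error, response):
    # Hypothetical stand-in for the SDK's trace-finalization helper.
    return response


def traced_call(trace_id, call):
    """Run `call`, always finalize the trace, but never swallow errors."""
    start_time = time.time()
    error = None
    response = None
    try:
        response = call()
        return _cleanup_trace(trace_id, start_time, error, response)
    except Exception as e:
        error = str(e)
        # Finalize the trace for logging, then re-raise so the
        # caller still sees the root exception.
        _cleanup_trace(trace_id, start_time, error, response)
        raise
```

With this shape, a successful call returns the response while a failing call both records the error and propagates it.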

🔬 How To Reproduce

Steps to reproduce the behavior:

  1. I'm not sure how to reproduce an OpenAI failure directly, but one can be monkey-patched in if needed. Looking at the offending code probably provides all the context necessary.

Code sample

Try running this function: it returns 1 and the exception is never raised.

def run():
    try:
        raise Exception("bad")
    except Exception as e:
        print(e)
        raise e
    finally:
        # The `return` here overrides the re-raised exception,
        # so the caller sees 1 instead of the error.
        return 1
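
One possible fix, sketched against the demo above, is to keep cleanup work in the finally block but drop the `return` from it, so the re-raised exception propagates to the caller:

```python
def run_fixed():
    try:
        raise Exception("bad")
    except Exception as e:
        print(e)
        raise
    finally:
        # Cleanup can still happen here, but with no `return`
        # the re-raised exception propagates to the caller.
        pass
```

Calling `run_fixed()` now raises `Exception("bad")` instead of silently returning a value.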

Environment

  • OS: macOS (Apple Silicon)
  • Python version: 3.9 (from `python --version`)

📈 Expected behavior

The error is not swallowed by Parea and is surfaced to the consumer of the OpenAI call.

📎 Additional context

I ran into this using LangChain with the following (abbreviated) code:

    llm = ChatOpenAI(
        openai_api_key=openai_api_key,
        temperature=0,
        model_name=model,
        model_kwargs=llm_kwargs,
        max_retries=3,
    )
    response = llm([HumanMessage(content="model query here...any will work")])

This became relevant during the OpenAI outage on 2023-10-19.
