
prompt-layer-js's Introduction

🍰 PromptLayer

The first platform built for prompt engineers

Node Docs Demo with Loom


PromptLayer is the first platform that allows you to track, manage, and share your GPT prompt engineering. PromptLayer acts as middleware between your code and OpenAI's JavaScript library.

PromptLayer records all your OpenAI API requests, allowing you to search and explore request history in the PromptLayer dashboard.

This repo contains the JavaScript wrapper library for PromptLayer.

Quickstart ⚡

Install PromptLayer

npm install promptlayer

Installing PromptLayer Locally

Use npm install . to install locally.

Using PromptLayer

To get started, create an account by clicking "Log in" on PromptLayer. Once logged in, click the button to create an API key and save it in a secure location (Guide to Using Env Vars).

export OPENAI_API_KEY=sk_xxxxxx
export PROMPTLAYER_API_KEY=pl_xxxxxx

Once you have that all set up, install PromptLayer using npm.

In the JavaScript file where you use OpenAI APIs, add the following. This allows us to keep track of your requests without needing any other code changes.

import BaseOpenAI from "openai";
import { PromptLayer } from "promptlayer";

const promptlayer = new PromptLayer({
  apiKey: process.env.PROMPTLAYER_API_KEY,
});
// Typescript
const OpenAI: typeof BaseOpenAI = promptlayer.OpenAI;
const openai = new OpenAI();

You can then use openai as you would if you had imported it directly.

💡 Your OpenAI API key is **never** sent to our servers. All OpenAI requests are made locally from your machine; PromptLayer just logs the request.

Adding PromptLayer tags: pl_tags

PromptLayer allows you to add tags through the pl_tags argument. This allows you to track and group requests in the dashboard.

Tags are not required but we recommend them!

openai.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-3.5-turbo",
  // @ts-ignore
  pl_tags: ["test"],
});

Returning request id: return_pl_id

PromptLayer allows you to return the request id through the return_pl_id argument. When you set this to true, a tuple is returned with the request id as the second element.

openai.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-3.5-turbo",
  // @ts-ignore
  return_pl_id: true,
});
Notice the `ts-ignore` comment. This is because the `pl_tags` and `return_pl_id` arguments are not part of the OpenAI API. We are working on a way to make this more seamless.
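Because the result is a plain response by default and a tuple only when `return_pl_id` is set, it can be convenient to unpack both shapes with one helper. This is a minimal sketch; `unpackPromptLayerResult` is a hypothetical helper name, not part of the library:

```javascript
// Hypothetical helper: unpack a [response, pl_request_id] tuple returned
// when return_pl_id is true, falling back to a plain response otherwise.
function unpackPromptLayerResult(result) {
  if (Array.isArray(result) && result.length === 2) {
    const [response, plRequestId] = result;
    return { response, plRequestId };
  }
  return { response: result, plRequestId: null };
}

// With return_pl_id: true, the second element is the PromptLayer request id.
const { response, plRequestId } = unpackPromptLayerResult([{ id: "chatcmpl-1" }, 42]);
console.log(plRequestId); // 42
```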

After making your first few requests, you should be able to see them in the PromptLayer dashboard!

Contributing

We welcome contributions to our open source project, including new features, infrastructure improvements, and better documentation. For more information or any questions, contact us at [email protected].

Requirements

  • Node.js 18.x or higher

prompt-layer-js's People

Contributors

abubakarsohail · jped · buckyroberts · dependabot[bot]


prompt-layer-js's Issues

fix native module not found in nextjs edge runtime

Steps to Reproduce

  1. Create a new nextjs app following the instructions here
  2. Install promptlayer using the instructions here

Expected Output
You are able to log requests on PromptLayer using the code snippets from the PromptLayer documentation.

Actual Output
You get an `Error [TypeError]: Native module not found` error.

For more details read here

Cannot import library in SvelteKit/TypeScript

I started integrating PromptLayer and got this error.

Error when evaluating SSR module /Users/......+server.ts:

Internal server error: Failed to resolve entry for package "promptlayer". The package may have incorrect main/module/exports specified in its package.json.

This happens when I start importing PromptLayer using this

import { promptlayer } from "promptlayer";

This happens even if the file does nothing but import PromptLayer. I'm using SvelteKit + TypeScript.

proxy and log anthropic requests

With this task, we should be able to proxy and log requests made via the Anthropic Node package. As with OpenAI, this should be transparent to the user and should not affect the way they use the Anthropic package.

fix return_pl_id while streaming on javascript library

Steps to Reproduce

  1. Make a streaming request to openai or anthropic
  2. Make sure you pass in return_pl_id: true and stream: true

Expected Result
You get a tuple back, with the first element being the generator and the second being the request_id from PromptLayer. The request_id is populated on the last iteration of the for loop.

Actual Result
You get an error
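The expected behavior above can be sketched with a mock stream. Assumptions: each yielded item is a [chunk, pl_request_id] tuple and the id is only populated on the final chunk, as described in the issue; `mockStream` and `consume` are illustrative names, not library APIs:

```javascript
// Mock async generator standing in for a streamed completion made with
// return_pl_id: true — the PromptLayer request id arrives on the last chunk.
async function* mockStream() {
  yield [{ choices: [{ delta: { content: "Hel" } }] }, null];
  yield [{ choices: [{ delta: { content: "lo" } }] }, null];
  yield [{ choices: [{ delta: {} }] }, 1234]; // id populated at the end
}

// Consume the stream, accumulating text and capturing the request id.
async function consume(stream) {
  let text = "";
  let plRequestId = null;
  for await (const [chunk, id] of stream) {
    text += chunk.choices[0].delta.content ?? "";
    if (id !== null) plRequestId = id;
  }
  return { text, plRequestId };
}
```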

Streaming results from tool calls not working

  openai.chat.completions.create(openAIOptions).then(async (completion: any) => {
    for await (const part of completion) {
      if (Array.isArray(part)) {
        const [chunk, pl_id] = part;
        console.log('----')
        console.log(chunk);
        console.log('-')
        console.log(chunk.choices[0].delta)
        console.log(pl_id);
      } else {
        console.log(part);
        console.log(part.choices[0].delta.tool_calls[0].function)
      }
    }
  });
const cleaned_result = (results: any[]) => {
  if ("completion" in results[0])
    return results.reduce(
      (prev, current) => ({
        ...current,
        completion: `${prev.completion}${current.completion}`,
      }),
      {}
    );
  if ("text" in results[0].choices[0]) {
    let response = "";
    for (const result of results) {
      response = `${response}${result.choices[0].text}`;
    }
    const final_result = structuredClone(results.at(-1));
    final_result.choices[0].text = response;
    return final_result;
  } else if ("delta" in results[0].choices[0]) {
    let response = { role: "", content: "" };
    for (const result of results) {
      if ("role" in result.choices[0].delta) {
        response.role = result.choices[0].delta.role;
      }
      if ("content" in result.choices[0].delta) {
        response.content = `${response["content"]}${result.choices[0].delta.content}`;
      }
    }
    const final_result = structuredClone(results.at(-1));
    final_result.choices[0] = response;
    return final_result;
  }
  return "";
};
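The delta branch of the helper above can be exercised in isolation. This condensed sketch reproduces just that merging logic so it can be run with mock chunks:

```javascript
// Condensed version of the delta branch above: fold streamed
// chat-completion chunks into a single final message.
function mergeDeltas(results) {
  const response = { role: "", content: "" };
  for (const result of results) {
    const delta = result.choices[0].delta;
    if ("role" in delta) response.role = delta.role;
    if ("content" in delta) response.content += delta.content;
  }
  const final = structuredClone(results.at(-1));
  final.choices[0] = response;
  return final;
}

const chunks = [
  { choices: [{ delta: { role: "assistant" } }] },
  { choices: [{ delta: { content: "Hi" } }] },
  { choices: [{ delta: { content: " there" } }] },
];
console.log(mergeDeltas(chunks).choices[0]); // { role: "assistant", content: "Hi there" }
```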

publish prompt template

Just like our Python SDK, we need to expose a way for users to publish prompt templates. There is going to be one key difference: we will not depend on LangChain at all.

add promptlayer run that allows you to run LLM requests using prompt registry

Expose a new function called run that, provided the following attributes:

  • prompt_name
  • version
  • label
  • input_variables
  • tags
  • metadata
  • group_id
  • client (Optional)

does the following steps in order:

  1. calls templates.get with the relevant attributes
  2. uses the type information to decide which functions to call
  3. tracks the request after the response is received
  4. returns the response (id, response, raw_response)
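The proposed flow could be sketched as below. This is only an orchestration sketch, not the real SDK: the template fetch, model call, and tracking call are injected as functions (`getTemplate`, `callModel`, `trackRequest` are hypothetical stand-ins), which also keeps the flow testable without network access:

```javascript
// Hypothetical sketch of the proposed `run` flow; the three injected
// functions stand in for templates.get, the LLM client call, and
// request tracking respectively.
async function run({ promptName, inputVariables, getTemplate, callModel, trackRequest }) {
  // 1. Fetch the prompt template (templates.get in the real SDK).
  const template = await getTemplate(promptName, inputVariables);
  // 2. Use the template's type information to decide which function to call.
  const rawResponse = await callModel(template);
  // 3. Track the request after the response is received.
  const id = await trackRequest(template, rawResponse);
  // 4. Return (id, response, raw_response).
  return { id, response: rawResponse.choices?.[0] ?? rawResponse, raw_response: rawResponse };
}
```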

update minimum requirements

The purpose of this task is to update the minimum requirements to the latest LTS version of Node, i.e. 18.x.

fix lazy loading issue in nextjs

Steps to Reproduce

  1. Create a new nextjs app following the instructions here
  2. Install promptlayer using the instructions here

Expected Output
You are able to log requests on PromptLayer using the code snippets from the PromptLayer documentation.

Actual Output
You get MODULE_NOT_FOUND error

For more details read here

proxy openai requests from js library

This particular task will deal with proxying the openai npm package, ensuring that requests made through the proxied object are logged on PromptLayer. This change should enable the customer to change only the following line of code:

import OpenAI from "openai"; // before
import { OpenAI } from "promptlayer"; // after

get all prompt templates

We also want to expose a function that users can use to fetch all the prompt templates they have in our system. This will also behave similarly to our Python SDK.

expose a way to get prompt template

We have already exposed a method to get a prompt template by name in our Python SDK. We need the same functionality in our JS library.
