magnivorg / prompt-layer-js
License: MIT
With return_pl_id: true and stream: true:

Expected Result
You get a tuple back whose first element is the generator and whose second is the request_id from PromptLayer. The request_id is populated on the last iteration of the for loop.
Actual Result
You get an error
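A minimal sketch of the expected behavior: a helper that drains a stream of [chunk, request_id] tuples, as the return_pl_id + stream combination is supposed to yield. The chunk shape and the mock generator are simplified stand-ins for the real proxied OpenAI stream, not the actual types.

```typescript
// Simplified stand-in for a streamed chat chunk; not the real OpenAI type.
type Chunk = { choices: { delta: { content?: string } }[] };
type Part = [Chunk, number | null];

// Drain an async iterable of [chunk, pl_id] tuples.
async function drain(stream: AsyncIterable<Part>) {
  let text = "";
  let requestId: number | null = null;
  for await (const [chunk, pl_id] of stream) {
    text += chunk.choices[0].delta.content ?? "";
    // Per the expected behavior, pl_id is only populated on the last iteration.
    if (pl_id != null) requestId = pl_id;
  }
  return { text, requestId };
}

// Mock stream standing in for the proxied OpenAI response.
async function* mockStream(): AsyncGenerator<Part> {
  yield [{ choices: [{ delta: { content: "Hel" } }] }, null];
  yield [{ choices: [{ delta: { content: "lo" } }] }, 42];
}
```

This keeps the tuple-destructuring contract in one place, so callers only see the final text and the request id.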
With this task, we should be able to proxy and log requests made via the anthropic node package. As with openai, this should be transparent to the user and should not affect the way they use the anthropic package.
We have exposed multiple functions in our python sdk. We need replicas of those functions in our javascript sdk.
Set up promptlayer using the instructions here.

Expected Output
You are able to log requests on promptlayer using the code snippets from the promptlayer documentation.
Actual Output
You get an Error [TypeError]: Native module not found.
For more details read here
console.log the response of each chunk.

Expected Result
All the chunks are successfully logged without any error
Actual Result
The last chunk gives an undefined error.
For more details read here
The purpose of this task is to add support for the new endpoint /prompt-templates for fetching prompt templates in a standard shape. Read more about the endpoint here.
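As a rough illustration, fetching from the new endpoint could be shaped like the sketch below. The base URL and header name are assumptions for illustration, not the documented API; only the /prompt-templates path comes from the task description.

```typescript
// Sketch: building a GET request for the /prompt-templates endpoint.
// BASE_URL and the X-API-KEY header are illustrative assumptions.
const BASE_URL = "https://api.promptlayer.com";

function promptTemplatesRequest(apiKey: string): {
  url: string;
  headers: Record<string, string>;
} {
  return {
    url: `${BASE_URL}/prompt-templates`,
    headers: { "X-API-KEY": apiKey },
  };
}
```

Separating request construction from the actual fetch call keeps the endpoint logic unit-testable without hitting the network.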
Similar to the python library, we should add support in the javascript library as well.
With this task, the goal is to have an interface that allows the user to track requests using our REST endpoint.
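One possible shape for such a tracking interface is sketched below. The field names are illustrative assumptions modeled on typical request-tracking payloads, not the documented REST API.

```typescript
// Sketch of a payload builder for a REST request-tracking endpoint.
// All field names here are illustrative assumptions.
interface TrackRequest {
  function_name: string;
  kwargs: Record<string, unknown>;
  request_response: unknown;
  request_start_time: number;
  request_end_time: number;
  tags?: string[];
}

function buildTrackRequest(
  functionName: string,
  kwargs: Record<string, unknown>,
  response: unknown,
  start: number,
  end: number
): TrackRequest {
  return {
    function_name: functionName,
    kwargs,
    request_response: response,
    request_start_time: start,
    request_end_time: end,
  };
}
```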
I started integrating PromptLayer and got this error.
Error when evaluating SSR module /Users/......+server.ts:
Internal server error: Failed to resolve entry for package "promptlayer". The package may have incorrect main/module/exports specified in its package.json.
This happens when I import PromptLayer like this:

import { promptlayer } from "promptlayer";

This happens even if the file does nothing but import PromptLayer. I am using SvelteKit + TypeScript.
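A common cause of this resolver error is a package.json whose main/module/exports fields do not point at files that actually exist in the published package. A typical shape that bundlers such as Vite (used by SvelteKit) accept looks like the following; the dist paths are illustrative, not the package's actual layout:

```json
{
  "name": "promptlayer",
  "main": "./dist/index.js",
  "module": "./dist/index.mjs",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.js"
    }
  }
}
```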
Set up promptlayer using the instructions here.

Expected Output
You are able to log requests on promptlayer using the code snippets from the promptlayer documentation.
Actual Output
You get a MODULE_NOT_FOUND error.
For more details read here
With this task, our proxy around openai package should be able to support streaming responses and log the cumulative response on PL as well. Also, as part of this task, our proxy should be transparent to the end user.
This particular task will deal with proxying the openai npm package and ensuring that requests made through the proxied object are logged on promptlayer. This change should let the customer change only the following import:

import OpenAI from "openai";

to

import { OpenAI } from "promptlayer";
The purpose of this task is to create a script for logging requests made via google gemini. This task will include serialization of the data to be sent over the network.
We also want to expose a function which users can use to fetch all the prompt templates they have in our system. This will also behave similarly to our python sdk.
We already have exposed a method to get prompt template by name in our python sdk. We need a replica of the same functionality in our js library.
Allow users to pass in metadata_filter for fetching prompt templates dynamically.
Just like our python sdk, we need to expose a way for users to publish prompt templates. There is going to be one key difference with this one: we will not depend on langchain at all.
Currently, whenever we log streamed requests, we define a new structure and send it to the server for logging. Instead, we should send the data in the standard completion or chat model shape, whichever is applicable.
// Repro: iterate the proxied stream; with return_pl_id each part is a [chunk, pl_id] tuple.
openai.chat.completions.create(openAIOptions).then(async (completion: any) => {
  for await (const part of completion) {
    if (Array.isArray(part)) {
      // return_pl_id: true — part is [chunk, pl_id]
      const [chunk, pl_id] = part;
      console.log("----");
      console.log(chunk);
      console.log("-");
      console.log(chunk.choices[0].delta);
      console.log(pl_id); // expected to be undefined until the last chunk
    } else {
      console.log(part);
      console.log(part.choices[0].delta.tool_calls[0].function);
    }
  }
});
// Merge a list of streamed chunks into a single response object.
const cleaned_result = (results: any[]) => {
  // Legacy completion shape: concatenate the `completion` strings.
  // Seed the reduce with an empty completion so the first chunk
  // is not prefixed with "undefined".
  if ("completion" in results[0])
    return results.reduce(
      (prev, current) => ({
        ...current,
        completion: `${prev.completion}${current.completion}`,
      }),
      { completion: "" }
    );
  // Text completion chunks: concatenate choices[0].text across chunks.
  if ("text" in results[0].choices[0]) {
    let response = "";
    for (const result of results) {
      response = `${response}${result.choices[0].text}`;
    }
    const final_result = structuredClone(results.at(-1));
    final_result.choices[0].text = response;
    return final_result;
  } else if ("delta" in results[0].choices[0]) {
    // Chat chunks: accumulate the role and content deltas.
    let response = { role: "", content: "" };
    for (const result of results) {
      if ("role" in result.choices[0].delta) {
        response.role = result.choices[0].delta.role;
      }
      if ("content" in result.choices[0].delta) {
        response.content = `${response.content}${result.choices[0].delta.content}`;
      }
    }
    const final_result = structuredClone(results.at(-1));
    final_result.choices[0] = response;
    return final_result;
  }
  return "";
};
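For reference, a self-contained usage sketch of the delta-merging branch above, with mock streamed chunks. The chunk objects are simplified stand-ins for real OpenAI chat chunks, and mergeDeltas is a trimmed restatement of that branch, not a new API.

```typescript
// Simplified stand-in for streamed chat chunks.
type DeltaChunk = { choices: { delta: { role?: string; content?: string } }[] };

// Trimmed restatement of the delta branch of cleaned_result.
function mergeDeltas(results: DeltaChunk[]) {
  const response = { role: "", content: "" };
  for (const result of results) {
    const delta = result.choices[0].delta;
    if (delta.role !== undefined) response.role = delta.role;
    if (delta.content !== undefined) response.content += delta.content;
  }
  // Mirror cleaned_result: clone the last chunk and replace its first choice.
  const final = structuredClone(results.at(-1)) as any;
  final.choices[0] = response;
  return final;
}

const chunks: DeltaChunk[] = [
  { choices: [{ delta: { role: "assistant" } }] },
  { choices: [{ delta: { content: "Hello" } }] },
  { choices: [{ delta: { content: ", world" } }] },
];
```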
According to the docs, requests with tool use don't support streaming at the moment. But when anthropic's tool use functionality goes GA, we should add support for it.
The purpose of this task is to update the minimum requirements to the latest LTS version of Node, i.e. 18.x.
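This constraint could be declared via the engines field in package.json, e.g.:

```json
{
  "engines": {
    "node": ">=18"
  }
}
```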
Expose a new function called run that, provided the following attributes:

- prompt_name
- version
- label
- input_variables
- tags
- metadata
- group_id
- client (Optional)

does the following steps in order:

1. calls templates.get with the relevant attributes
2. uses the type information to decide which functions to call
3. tracks the request after the response is received
4. returns the response (id, response, raw_response)

Expose release_filters (TBD???) and use the same metadata on promptlayer.run to filter templates.
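The steps above can be sketched as the orchestration below, with the template fetcher, model call, and tracker injected so the control flow is testable in isolation. All names other than run's listed attributes (Deps, getTemplate, callModel, track, and the template/result shapes) are assumptions for illustration.

```typescript
// Dependency-injected sketch of run(): get template -> call model -> track -> return.
interface RunParams {
  prompt_name: string;
  version?: number;
  label?: string;
  input_variables?: Record<string, string>;
  tags?: string[];
  metadata?: Record<string, string>;
  group_id?: number;
}

// Injected collaborators; their names and shapes are illustrative assumptions.
interface Deps {
  getTemplate: (params: RunParams) => Promise<{ type: "chat" | "completion"; prompt: string }>;
  callModel: (type: "chat" | "completion", prompt: string) => Promise<{ raw: unknown; text: string }>;
  track: (params: RunParams, raw: unknown) => Promise<number>; // returns the request id
}

async function run(params: RunParams, deps: Deps) {
  const template = await deps.getTemplate(params);                      // 1. templates.get
  const result = await deps.callModel(template.type, template.prompt);  // 2. choose call by type
  const id = await deps.track(params, result.raw);                      // 3. track after response
  return { id, response: result.text, raw_response: result.raw };       // 4. return the triple
}
```

Injecting the collaborators keeps the ordering guarantee (track only after the response arrives) verifiable with mocks, independent of any real network calls.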
@abubakarsohail has started working on this task.
Our goal is to have a first version that mirrors the functionality of our current python library.
The abstractions should model the python library and should be at functional parity with it.
There is no mapping available for anthropic messages in the JavaScript library.