
scalaone / azure-openai-proxy


A tool that transforms OpenAI API requests into Azure OpenAI API requests, allowing OpenAI-compatible applications to seamlessly use Azure OpenAI.

Home Page: https://gptlite.vercel.app

License: MIT License

Languages: JavaScript 0.33% · TypeScript 96.27% · Dockerfile 3.41%

Topics: azure-openai, azure-openai-proxy, azureopenai, chatgpt, nextjs, nextjs13, gpt-4, gpt-4-32k, gpt-35-turbo

azure-openai-proxy's Introduction

Azure OpenAI Proxy


Azure OpenAI Proxy is a tool that transforms OpenAI API requests into Azure OpenAI API requests, allowing OpenAI-compatible applications to seamlessly use Azure OpenAI.

Prerequisites

An Azure OpenAI account is required to use Azure OpenAI Proxy.

Azure Deployment

Deploy to Azure

Remember to:

  • Select the region that matches your Azure OpenAI resource for best performance.
  • If deployment fails because the 'proxywebapp' name is already taken, change the resource prefix and redeploy.
  • The deployed proxy app runs on a B1-tier Azure App Service plan, which can be changed in the Azure Portal after deployment.
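The plan tier can also be changed from the command line. A hypothetical sketch using the Azure CLI, where the resource group, plan name, and target SKU are all placeholders to substitute with your own values:

```shell
# Hypothetical names -- substitute your own resource group and App Service plan.
RESOURCE_GROUP="my-rg"
PLAN_NAME="proxywebapp-plan"
NEW_SKU="S1"

# Compose the Azure CLI command that scales the plan (requires `az login`):
CMD="az appservice plan update --resource-group $RESOURCE_GROUP --name $PLAN_NAME --sku $NEW_SKU"
echo "$CMD"
# Uncomment to actually run it:
# $CMD
```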

Docker Deployment

To deploy using Docker, execute the following command:

docker run -d -p 3000:3000 scalaone/azure-openai-proxy
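If port 3000 is already taken on the host, any other host port can be mapped to the container's port 3000. A sketch (the host port 8080 and the `--restart` policy are arbitrary choices, not project defaults):

```shell
# The host port is arbitrary; the container always listens on 3000.
HOST_PORT=8080
PORT_MAPPING="${HOST_PORT}:3000"

# Start the container detached, restarting automatically if it crashes.
# Remove the leading `echo` to actually start the container:
echo docker run -d --restart unless-stopped -p "$PORT_MAPPING" scalaone/azure-openai-proxy
```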

Local Execution and Testing

Follow these steps:

  1. Install Node.js 20.
  2. Clone the repository from the command line.
  3. Run npm install to install the dependencies.
  4. Run npm start to start the application.
  5. Use the script below for testing. Replace AZURE_RESOURCE_ID, AZURE_MODEL_DEPLOYMENT, and AZURE_API_KEY before running. The default value for AZURE_API_VERSION is 2024-02-01 and is optional.
Test script:

```bash
curl -X "POST" "http://localhost:3000/v1/chat/completions" \
  -H 'Authorization: AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY:AZURE_API_VERSION' \
  -H 'Content-Type: application/json; charset=utf-8' \
  -d $'{
  "messages": [
    { "role": "system", "content": "You are an AI assistant that helps people find information." },
    { "role": "user", "content": "hi." }
  ],
  "temperature": 1,
  "model": "gpt-3.5-turbo",
  "stream": false
}'
```

Tested Applications

The azure-openai-proxy has been tested and confirmed to work with the following applications:

| Application Name | Docker-compose File for E2E Test |
| --- | --- |
| chatgpt-lite | docker-compose.yml |
| chatgpt-minimal | docker-compose.yml |
| chatgpt-next-web | docker-compose.yml |
| chatbot-ui | docker-compose.yml |
| chatgpt-web | docker-compose.yml |

To test locally, follow these steps:

  1. Clone the repository in a command-line window.
  2. Update the OPENAI_API_KEY environment variable with AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY. Alternatively, update the OPENAI_API_KEY value in the docker-compose.yml file directly.
  3. Navigate to the directory containing the docker-compose.yml file for the application you want to test.
  4. Execute the build command: docker-compose build.
  5. Start the service: docker-compose up -d.
  6. Access the application locally using the port defined in the docker-compose.yml file. For example, visit http://localhost:3000.
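The steps above can be sketched as a shell session. All credential values and the application directory below are hypothetical placeholders; substitute your own:

```shell
# Compose the proxy key from the three Azure values (placeholders shown):
AZURE_RESOURCE_ID="myresource"
AZURE_MODEL_DEPLOYMENT="gpt35-deploy"
AZURE_API_KEY="mykey"
export OPENAI_API_KEY="${AZURE_RESOURCE_ID}:${AZURE_MODEL_DEPLOYMENT}:${AZURE_API_KEY}"
echo "$OPENAI_API_KEY"

# Then, inside the directory of the application under test (hypothetical path):
# cd chatgpt-lite
# docker-compose build
# docker-compose up -d
```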

FAQs

Q: What are `AZURE_RESOURCE_ID`, `AZURE_MODEL_DEPLOYMENT`, and `AZURE_API_KEY`?

A: These can be found in the Azure management portal. See the image below for reference:

![resource-and-model](./docs/images/resource-and-model.jpg)

Q: How can I use the gpt-4 and gpt-4-32k models?

A: To use the gpt-4 and gpt-4-32k models, use the following key format:

`AZURE_RESOURCE_ID:gpt-3.5-turbo|AZURE_MODEL_DEPLOYMENT,gpt-4|AZURE_MODEL_DEPLOYMENT,gpt-4-32k|AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY:AZURE_API_VERSION`
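As a concrete illustration of this key format, the sketch below assembles the header value from hypothetical deployment names (every value is a placeholder):

```shell
# Placeholder Azure values -- replace with your own.
AZURE_RESOURCE_ID="myresource"
AZURE_API_KEY="mykey"

# Each "model|deployment" pair maps an OpenAI model name to an Azure deployment name.
MODEL_MAP="gpt-3.5-turbo|gpt35-deploy,gpt-4|gpt4-deploy,gpt-4-32k|gpt4-32k-deploy"
AUTH="${AZURE_RESOURCE_ID}:${MODEL_MAP}:${AZURE_API_KEY}"
echo "$AUTH"

# Use it as the Authorization header, e.g.:
# curl http://localhost:3000/v1/chat/completions -H "Authorization: $AUTH" ...
```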

Contributing

We welcome all PR submissions.

azure-openai-proxy's People

Contributors

blrchen, haha1903


azure-openai-proxy's Issues

trouble using stream mode

I'm having trouble using stream mode. Could you please provide an example client that demonstrates how to use stream mode effectively?
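One possible example client, assuming the proxy is running locally on port 3000 (a sketch, not an official example): set `"stream": true` in the request body and pass `-N` to curl so it does not buffer the streamed response.

```shell
# Request body with streaming enabled; the response then arrives as server-sent events.
BODY='{
  "messages": [{ "role": "user", "content": "hi." }],
  "model": "gpt-3.5-turbo",
  "stream": true
}'
echo "$BODY"

# Uncomment to run against a local proxy (replace the placeholder credentials):
# curl -N "http://localhost:3000/v1/chat/completions" \
#   -H 'Authorization: AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY' \
#   -H 'Content-Type: application/json' \
#   -d "$BODY"
```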

Dockerfile build Error

```
 => CACHED [deps 3/6] RUN apk add --no-cache libc6-compat      0.0s
 => CACHED [deps 4/6] WORKDIR /app                             0.0s
 => CACHED [deps 5/6] COPY package.json package-lock.json* ./  0.0s
 => ERROR [deps 6/6] RUN npm ci                                0.2s
------
 > [deps 6/6] RUN npm ci:
------
Dockerfile:15
--------------------
  13 |     # Install dependencies based on the preferred package manager
  14 |     COPY package.json package-lock.json* ./
  15 | >>> RUN npm ci
  16 |
  17 |     # Rebuild the source code only when needed
--------------------
ERROR: failed to solve: process "/bin/sh -c npm ci" did not complete successfully: exit code: 139
ERROR: Service 'azure-openai-proxy' failed to build : Build failed
```

Intermittent 'ECONNRESET' Errors

Reviewing the past month's log in my deployment, I've noticed intermittent occurrences of "Error in getCompletions: Error: read ECONNRESET" from client apps. This error typically arises when the connection between the proxy and the Azure OpenAI server unexpectedly closes during request processing. The "ECONNRESET" message signifies that the server abruptly closed the connection.

To address this, the code should be enhanced to handle such situations appropriately. This could involve adding a retry mechanism for network issues or implementing a timeout.
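Pending a fix in the proxy itself, one client-side workaround is a small retry wrapper around the request. A sketch (the retry count and one-second delay are arbitrary choices):

```shell
# Retry a command up to $1 times, sleeping 1 second between attempts.
# Returns 0 on the first success, 1 if all attempts fail.
retry() {
  max="$1"; shift
  n=0
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge "$max" ] && return 1
    sleep 1
  done
}

# Example: retry the proxy request up to 3 times on transient ECONNRESET failures.
# retry 3 curl -sf http://localhost:3000/v1/chat/completions \
#   -H 'Authorization: AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY' \
#   -H 'Content-Type: application/json' -d '{"messages":[],"model":"gpt-3.5-turbo"}'
```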

Azure GPT-4 model overloaded requests error

When invoking Azure GPT-4 models, you may encounter an error message stating "That model is currently overloaded with other requests. You can retry your request, or contact us through an Azure support request." In such cases, the proxy should automatically retry.

"Resource not found" error when making requests

When using the proxy with other OpenAI-compatible plugins, I get a "resource not found" error. I only entered the Azure API key and endpoint in the plugin. Is any additional configuration required?
