
Comments (10)

KillianLucas commented on May 19, 2024

Hi @Ftrybe! Yes, have you used it in Python before? I'm seeing this in the documentation:

import os
import openai
openai.api_type = "azure"
openai.api_version = "2023-05-15" 
openai.api_base = os.getenv("OPENAI_API_BASE")  # Your Azure OpenAI resource's endpoint value.
openai.api_key = os.getenv("OPENAI_API_KEY")

response = openai.ChatCompletion.create(
    engine="gpt-35-turbo", # The deployment name you chose when you deployed the GPT-35-Turbo or GPT-4 model.
    messages=[
        {"role": "system", "content": "Assistant is a large language model trained by OpenAI."},
        {"role": "user", "content": "Who were the founders of Microsoft?"}
    ]
)

print(response)

print(response['choices'][0]['message']['content'])

Is this the most minimal possible use of it? Seems like a lot. Let me know if you can use it by just switching the api_type, api_base and engine. I wouldn't want to lock us into that specific api version or something, if it isn't necessary.
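
For reference, here's a rough sketch of the minimal version I'm hoping is enough, i.e. only api_type, api_base, api_key and engine changed relative to a normal OpenAI call (the api_version value is just the one from the docs; whether we can avoid pinning it is exactly the open question):

import os
import openai

openai.api_type = "azure"
openai.api_base = os.getenv("OPENAI_API_BASE")  # Azure OpenAI resource endpoint
openai.api_key = os.getenv("OPENAI_API_KEY")
openai.api_version = "2023-05-15"  # value from the docs above; unclear if we can leave it unpinned

response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # the Azure deployment name, not an OpenAI model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response["choices"][0]["message"]["content"])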


For Open Interpreter, I could see this being implemented in this way:

interpreter.model = "azure-gpt-35-turbo" # "azure-" followed by your deployment name
interpreter.api_base = os.getenv("OPENAI_API_BASE") # Azure OpenAI resource's endpoint value
interpreter.api_key = os.getenv("OPENAI_API_KEY") # Same as usual

Then on the CLI:

interpreter --model azure-gpt-35-turbo --api_base ... --api_key ...

Or you could edit interpreter's config file with interpreter config, which would let you define a YAML with these defaults, so it runs with your Azure deployment every time (a rough sketch of what that might look like is below).
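
For illustration, the config YAML might end up looking something like this (the key names here are hypothetical, just to show the idea of Azure defaults):

model: azure-gpt-35-turbo                          # "azure-" followed by your deployment name
api_base: https://your-resource.openai.azure.com   # Azure OpenAI resource endpoint
api_key: YOUR_AZURE_OPENAI_KEY                     # or keep this in an environment variable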

None of the above is implemented, I just wanted to run it by you before building it. Does that seem intuitive/easy to use?

Thanks again!
K


KillianLucas commented on May 19, 2024

Azure Update

To anyone searching for this, we have a new way of connecting to Azure! ↓

https://docs.openinterpreter.com/language-model-setup/hosted-models/azure
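
At the time of writing, the pattern in those docs is roughly the following (the AZURE_* environment variable names and the azure/ model prefix come from the docs/LiteLLM; if anything here drifts, treat the link above as authoritative):

export AZURE_API_KEY=...
export AZURE_API_BASE=...
export AZURE_API_VERSION=...
interpreter --model azure/<your-deployment-name>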


KillianLucas commented on May 19, 2024

@Vybo @Ftrybe @nick917 and @SimplyJuanjo:

Happy to report that Open Interpreter now supports Azure deployments, thanks to the incredible @ifsheldon (many thanks, great work feng!)

Read the Azure integration docs here.

To use it, simply upgrade Open Interpreter, then run --use-azure:

pip install --upgrade open-interpreter
interpreter --use-azure


Ftrybe commented on May 19, 2024

Thanks.


Vybo commented on May 19, 2024

I was looking for Azure support as well and the suggested functionality looks awesome, thank you as well!


nick917 commented on May 19, 2024

I think it should work using the Azure API. I have tried:

openai.api_key = os.getenv("AZURE_OPENAI_KEY")
openai.api_version = "2023-08-01-preview"
openai.api_type = "azure"
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")

response = openai.ChatCompletion.create(
    engine='gpt-35-turbo',  # can be replaced by a GPT-4 deployment
    messages=messages,
    functions=[function_schema],
    function_call="auto",
    stream=True,
    temperature=self.temperature,
)

print(response['choices'][0]['message']['content']) does not work when stream=True, since the response is then a generator of chunks rather than a single object (see the sketch below).
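
For anyone hitting this, here is a rough sketch of consuming the stream with this SDK version (assuming the same ChatCompletion.create call with stream=True; chunks arrive in the usual delta format):

full_content = ""
for chunk in response:
    if not chunk["choices"]:  # guard: some Azure chunks carry no choices (e.g. filter metadata)
        continue
    delta = chunk["choices"][0]["delta"]
    if delta.get("content"):
        full_content += delta["content"]
        print(delta["content"], end="", flush=True)
    # when functions are used, arguments stream in via delta.get("function_call") instead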

There is more work to do after line 340 in `interpreter.py`, since the code there is not compatible with the streamed response.

Great work, keep it up.


ifsheldon commented on May 19, 2024

+1. I am also looking into the code, hopefully I can implement this and make a PR when I have some time.

Then on the CLI:
interpreter --model azure-gpt-35-turbo --api_base ... --api_key ...

@KillianLucas Probably we just need a flag for the interpreter? Say,
interpreter --use-azure, and then on the first launch, instead of asking whether to use GPT-4 or Code-Llama, we just ask for the API key, API base, and deployment name (engine name). A rough sketch is below.
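
Something like this is what I have in mind for the first-launch flow (the helper and attribute names are made up, just to illustrate):

def prompt_azure_settings(interpreter):
    # Hypothetical helper: ask once on first launch and store the answers on the interpreter instance.
    interpreter.use_azure = True
    interpreter.api_key = input("Azure OpenAI API key: ")
    interpreter.api_base = input("Azure OpenAI endpoint (API base): ")
    interpreter.azure_deployment_name = input("Deployment name (engine): ")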


SimplyJuanjo commented on May 19, 2024

Yeah, this one would be amazing, since some people can get access to the 32k API more easily via Azure than through OpenAI.

Is support planned for 32k GPT-4 or LLaMA models?


Ftrybe commented on May 19, 2024

In the latest code, I noticed that you have added configurations for Azure GPT, including a configurable azure_api_base. Would you consider renaming it to a more generic term? I'd like the flexibility to access it through third-party API endpoints, such as those hosted by the one-api service. Thanks.


KillianLucas commented on May 19, 2024

@Ftrybe absolutely! On the roadmap. We're trying to figure out a unified --model command that should connect to an LLM across one-api, HuggingFace, localhost, etc.

