
chat-completions-function-calling-examples's Introduction

Function Calling Examples using the Chat Completions API

Description

This repository contains simple examples of using function calling with the Chat Completions API.

These basic Python examples are designed to help those interested in learning about function calling, serving as an introduction to the concept and its applications.

The repository includes examples of parallel and sequential function calling, prompt-suggestion generation, conversation summarization, and timed activation of assistant behavior. These examples aim to provide a practical understanding of function calling.

For those seeking something more advanced, the repository also contains a more complex example that involves the integration of function calling, asynchronous programming, and streaming responses within a chat loop. It also covers the formatting and handling of server-client payloads, which is a crucial aspect of creating a chatbot experience.

If you are unfamiliar with function calling, the official function-calling documentation for the Chat Completions API is a good place to get acquainted.

Fundamental Steps for Function Calling:

  1. Call the model with the user query and a set of functions defined in the tools parameter.
  2. The model can choose to call one or more functions; if so, the response contains each function's name and a stringified JSON object of arguments adhering to your custom schema (note: the model may generate invalid JSON or hallucinate parameters).
  3. Parse the string into JSON in your code, and call your function with the provided arguments if they exist.
  4. Call the model again, appending the function response as a new message, and let the model summarize the results back to the user (see the sketch after this list).
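
A minimal sketch of this flow using the openai Python package (openai>=1.x); the model name, tool schema, and weather helper below are illustrative and not taken verbatim from the repository's scripts:

    import json
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def get_current_weather(location, unit="celsius"):
        # Stand-in implementation; a real version would call a weather API.
        return json.dumps({"location": location, "temperature": "22", "unit": unit})

    tools = [{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name, e.g. Paris"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }]

    messages = [{"role": "user", "content": "What's the weather in Paris and Tokyo?"}]

    # Step 1: call the model with the user query and the tool definitions.
    response = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
    message = response.choices[0].message

    # Steps 2-3: if the model chose to call tools, parse the stringified JSON
    # arguments and invoke the matching native function for each call.
    if message.tool_calls:
        messages.append(message)
        for tool_call in message.tool_calls:
            args = json.loads(tool_call.function.arguments)  # may be invalid JSON; validate in real code
            result = get_current_weather(**args)
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": result,
            })
        # Step 4: call the model again with the tool results appended so it can
        # summarize the answer back to the user.
        final = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
        print(final.choices[0].message.content)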

Files

  • func_get_weather.py: (Start here!) A simple program with a single native function, 'get_current_weather', that the model is made aware of. Given the user's input, the model tells us to call the function/tool; our code invokes the function and adds its response back to the conversation, supplying the model with additional context. Finally, the assistant responds to the user with the temperature in San Francisco, Tokyo, and Paris. This example also uses parallel function calling.
  • func_get_weather_streaming.py: An example of how to stream the response from the model while also checking whether the model wants to make a function/tool call. It extends the 'func_get_weather' example (a sketch of the streaming tool-call pattern follows this list).
  • func_conversation_history.py: A simple program that showcases some semantic functionality for 1) summarizing conversation history and 2) providing prompt suggestions based on conversation history. It also shows how to use JSON mode.
  • funct_sequential_calls: An example of sequential function calling. In certain scenarios, achieving the desired output requires calling multiple functions in a specific order, where the output of one function becomes the input to another. By giving the model adequate tools, context, and instructions, it can carry out complex operations by breaking them down into smaller, more manageable steps.
  • func_timing_count_chat.py: Shows how to do 'X' every 'frequency' and how to manage state outside the conversation. A function increments a counter via function calling, counting user inputs before the assistant says something specific to the user. It also shows how to do something once a week by checking whether a week has passed and then editing the system prompt.
  • func_async_streaming_chat.py: An example script that demonstrates handling asynchronous client calls and streaming responses within a chat loop. It supports function calling, enabling dynamic and interactive conversations, and provides a practical example of managing complex interactions in a chat-based interface.
  • func_async_streaming_chat_server.py: (Most complicated) An extension of the 'func_async_streaming_chat' script. It handles asynchronous client calls, function calling, and streaming responses within a chat loop, and also demonstrates how to format and handle server-client payloads effectively, ensuring proper communication between the server and client.
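
For reference, the general pattern for detecting tool calls in a streamed response (as in 'func_get_weather_streaming.py') looks roughly like the sketch below. This is a simplified illustration, not the repository's exact code, and it reuses the client, model, messages, and tools variables from the earlier sketch:

    # Stream the response while accumulating any tool-call deltas the model emits.
    stream = client.chat.completions.create(
        model="gpt-4", messages=messages, tools=tools, stream=True
    )

    tool_calls = []  # each entry: {"id": ..., "name": ..., "arguments": ""}
    for chunk in stream:
        if not chunk.choices:
            continue  # some providers emit chunks with no choices (e.g. filter metadata)
        delta = chunk.choices[0].delta
        if delta.content:
            print(delta.content, end="", flush=True)  # normal assistant text
        if delta.tool_calls:
            for tc in delta.tool_calls:
                if tc.index == len(tool_calls):
                    # The first chunk for a tool call carries its id and function name.
                    tool_calls.append({"id": tc.id, "name": tc.function.name, "arguments": ""})
                if tc.function.arguments:
                    # Argument JSON arrives in fragments; concatenate until the stream ends.
                    tool_calls[tc.index]["arguments"] += tc.function.arguments

    # After the stream completes, tool_calls holds fully assembled calls that can be
    # parsed and executed just like in the non-streaming sketch above.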

Usage

To use this project, follow these steps:

  1. Clone the repository: git clone <repository-url>

  2. Navigate to the project directory: cd <project-directory>

  3. Set up a Python virtual environment and activate it.

  4. Install the required packages:

    pip install -r requirements.txt
  5. Copy the .env.sample file to a new file called .env:

    cp .env.sample .env
  6. Configure the environment settings per your usage (a sketch of how these settings are typically read by a script follows this list):

    • For Azure OpenAI, create an Azure OpenAI gpt-3.5 or gpt-4 deployment, then customize the .env file with your Azure OpenAI endpoint, API key, API version, and deployment name.

      API_HOST=azure
      AZURE_OPENAI_ENDPOINT=https://<YOUR-AZURE-OPENAI-SERVICE-NAME>.openai.azure.com
      AZURE_OPENAI_API_KEY=<YOUR-AZURE-OPENAI-API-KEY>
      AZURE_OPENAI_API_VERSION=2024-03-01-preview
      AZURE_OPENAI_DEPLOYMENT_NAME=<YOUR-AZURE-DEPLOYMENT-NAME>
      AZURE_OPENAI_MODEL=gpt-4
    • For OpenAI.com, customize the .env file with your OpenAI API key and desired model name.

      API_HOST=openai
      OPENAI_KEY=<YOUR-OPENAI-API-KEY>
      OPENAI_MODEL=gpt-3.5-turbo
    • For Ollama, customize the .env file with your Ollama endpoint and model name (any model you've pulled).

      API_HOST=ollama
      OLLAMA_ENDPOINT=http://localhost:11434/v1
      OLLAMA_MODEL=llama2
  7. Run the project: python <program.py>
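
For orientation, the API_HOST setting is typically used to decide which client and model a script constructs. A rough sketch under that assumption (the repository's actual startup code may differ):

    import os
    from dotenv import load_dotenv
    from openai import AzureOpenAI, OpenAI

    load_dotenv()  # read the .env file created in step 5
    api_host = os.getenv("API_HOST", "openai")

    if api_host == "azure":
        client = AzureOpenAI(
            azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            api_key=os.environ["AZURE_OPENAI_API_KEY"],
            api_version=os.environ["AZURE_OPENAI_API_VERSION"],
        )
        model = os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"]
    elif api_host == "ollama":
        # Ollama exposes an OpenAI-compatible endpoint, so the standard client can point at it.
        client = OpenAI(base_url=os.environ["OLLAMA_ENDPOINT"], api_key="none-needed")
        model = os.environ["OLLAMA_MODEL"]
    else:
        client = OpenAI(api_key=os.environ["OPENAI_KEY"])
        model = os.environ["OPENAI_MODEL"]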

To open this project in VS Code:

  1. From the project directory, navigate to its parent: cd ..
  2. Open the project in VS Code: code <project-folder-name>

Contributing

Contributions are welcome! If you would like to contribute to this project, please follow these guidelines:

  1. Fork the repository
  2. Create a new branch: git checkout -b <branch-name>
  3. Make your changes and commit them: git commit -m 'Add some feature'
  4. Push to the branch: git push origin <branch-name>
  5. Submit a pull request

License

This project is licensed under the MIT License.

chat-completions-function-calling-examples's Issues

Error for every step

I assume you were using an older version of the OpenAI API for this project, because I've gotten an error at every step along the way:

  1. dot-env not installed; that was an easy fix.

  2. client not defined; I then redefined it because I saw you had it defined, but not correctly (I don't know why).

  3. Then I got an error saying the key had to be passed to the client, and even though it is being passed to the client, I still had to make a new file, paste my API key in, and import it.

  4. Then I got the error "object Stream can't be used in 'await' expression", so I removed it and got another error:

  5. "'async for' requires an object with __aiter__ method, got Stream"

  6. Now I'm outside the limits of my knowledge and would like some advice on what to do next.
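
For reference, the two streaming errors above usually mean a synchronous OpenAI client was substituted into the asynchronous scripts: only an AsyncOpenAI client returns a stream that can be awaited and iterated with async for. A minimal illustration (not the repository's exact code):

    import asyncio
    from openai import AsyncOpenAI

    async def main():
        client = AsyncOpenAI()  # async client; create(...) must be awaited
        stream = await client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": "Hello"}],
            stream=True,
        )
        async for chunk in stream:  # the returned AsyncStream supports 'async for'
            if chunk.choices and chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)

    asyncio.run(main())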
