An open-source Swift package that simplifies LLM message completions, inspired by liteLLM and adapted for Swift developers, following Swift conventions.
- Description
- Installation
- Usage
- Message
- Collaboration
- OpenAI Azure
- OpenAI AIProxy
Call different LLM APIs using the OpenAI format; currently supporting OpenAI and Anthropic, with more models, including Gemini, coming soon.
- Open your Swift project in Xcode.
- Go to `File` -> `Add Package Dependency`.
- In the search bar, enter this URL.
- Choose the version you'd like to install.
- Click `Add Package`.
Remember that your API keys are a secret! Do not share them with others or expose them in any client-side code (browsers, apps). Production requests must be routed through your backend server, where your API keys can be securely loaded from an environment variable or key management service.
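As a sketch of that advice, a key can be read from the process environment at runtime instead of being embedded in source. The variable name `OPENAI_API_KEY` here is an assumption; use whatever name your deployment defines:

```swift
import Foundation

// Read the key from an environment variable instead of hardcoding it.
// "OPENAI_API_KEY" is an assumed name — substitute your own.
guard let apiKey = ProcessInfo.processInfo.environment["OPENAI_API_KEY"] else {
    fatalError("Missing OPENAI_API_KEY environment variable.")
}
```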
To interface with different LLMs, you need only supply the corresponding LLM configuration and adjust the parameters accordingly.
First, import the PolyAI package:
```swift
import PolyAI
```
Then, define the LLM configurations. Currently, OpenAI and Anthropic are supported:
```swift
let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))
let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")

let configurations = [openAIConfiguration, anthropicConfiguration]
```
With the configurations set, initialize the service:
```swift
let service = PolyAIServiceFactory.serviceWith(configurations)
```
Now you have access to both the OpenAI and Anthropic APIs in a single package, with Gemini coming soon!
To send a message using OpenAI:
```swift
let prompt = "How are you today?"
let parameters: LLMParameter = .openAI(model: .gpt4turbo, messages: [.init(role: .user, content: prompt)])
let stream = try await service.streamMessage(parameters)
```
To interact with Anthropic instead, you only need to change one line of code!
```swift
let prompt = "How are you today?"
let parameters: LLMParameter = .anthropic(model: .claude3Sonnet, messages: [.init(role: .user, content: prompt)], maxTokens: 1024)
let stream = try await service.streamMessage(parameters)
```
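In both examples, `streamMessage` returns an asynchronous stream that can be iterated with `for try await`. A minimal consumption sketch, assuming each chunk exposes an optional `content` string (check the package's stream response type for the exact property name):

```swift
// Print partial responses as they arrive.
// `chunk.content` is an assumed property name; adjust it to match the
// package's actual stream response type.
for try await chunk in stream {
    if let content = chunk.content {
        print(content, terminator: "")
    }
}
```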
To access the OpenAI API via Azure, you can use the following configuration setup.
```swift
let azureConfiguration: LLMConfiguration = .openAI(.azure(configuration: .init(resourceName: "YOUR_RESOURCE_NAME", openAIAPIKey: .apiKey("YOUR_API_KEY"), apiVersion: "THE_API_VERSION")))
```
More information can be found here.
To access the OpenAI API via AIProxy, use the following configuration setup.
```swift
let aiProxyConfiguration: LLMConfiguration = .openAI(.aiProxy(aiproxyPartialKey: "hardcode_partial_key_here", aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"))
```
More information can be found here.
Open a PR for any proposed change, pointing it to the `main` branch.