
Heat

More people need to experience open source LLMs. Heat is an open source native iOS and macOS client for interacting with the most popular LLM services. A sister project, Swift GenKit, abstracts away the differences between services, including OpenAI, Mistral, Perplexity, Anthropic, and all the models available with Ollama, which you can run locally.

Desktop Screenshot

Features

  • Supports popular LLM providers (OpenAI, Mistral, Anthropic, Gemini)
  • Supports locally run open source LLMs with Ollama
  • Supports image generation (Stable Diffusion and DALL·E)
  • Launcher similar to Spotlight (Shift+Control+Space)
  • Multi-step tool use (dependent on model)
  • Web search and browsing to improve response accuracy
  • Calendar reading and understanding
  • Filesystem search (desktop only)
  • Basic memory persistence
  • No server dependencies aside from the model services being accessed

Install from TestFlight

https://testflight.apple.com/join/AX9JftGk (March 17, 2024)

Install locally

  1. Build and run in Xcode.
  2. Navigate to Preferences > Model Services to provide API keys for services.
  3. Pick the models you want to use (or use the defaults).
  4. In Preferences, set the preferred services you want the app to use for each situation. You can set up multiple services and mix and match to your heart's content.

Run locally with Ollama

  1. Install Ollama and pull some models.
  2. Run the Ollama server: `ollama serve`
  3. Set up the Ollama service in Preferences > Model Services.
  4. In Preferences set the preferred services to use Ollama.
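
The steps above can be condensed into a short shell session. The model name `llama3` is only an example; pull any model from the Ollama library.

```shell
# Sketch of the Ollama setup above (model name "llama3" is an example).
# ollama pull llama3        # download a model to run locally
# ollama serve              # start the server (blocks; default port 11434)

# Ollama's HTTP API then answers on localhost:
OLLAMA_API="http://localhost:11434"
echo "Heat can reach Ollama at $OLLAMA_API"
# curl "$OLLAMA_API/api/tags"   # lists the models you have pulled
```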

To run the iOS app on your device, you'll need the local IP of the computer running the Ollama server; it's usually something like 10.0.0.XXX. Under Preferences > Services > Ollama you can set this IP, and it will work as long as you stay on your local network. Sometimes Ollama's default port (11434) doesn't work, and you'll need to change it to something like 8080 and run the server manually: `OLLAMA_HOST=0.0.0.0:8080 ollama serve`
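
A minimal sketch of exposing the server to your local network, assuming port 8080 and macOS's `ipconfig getifaddr` for finding the computer's IP:

```shell
# Find this machine's local IP (en0 is typically Wi-Fi on a Mac):
# ipconfig getifaddr en0          # prints something like 10.0.0.XXX

# Bind Ollama to all interfaces on an alternate port:
export OLLAMA_HOST="0.0.0.0:8080"
echo "Ollama will listen on $OLLAMA_HOST"
# ollama serve                    # run with the variable set (blocks this shell)

# From another machine on the network, verify reachability:
# curl "http://<your-local-ip>:8080/api/tags"
```

Enter that IP and port under Preferences > Services > Ollama on the device.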

Future

Originally the plan for this project was to get models running on-device (hence the name Heat, because your device will heat up!), but that proved difficult. I will revisit as on-device inference becomes more feasible.

Contributors

  • nathanborror
