twinny

Tired of the so-called "free" Copilot alternatives that are filled with paywalls and signups? Look no further, developer friend!

Twinny is your definitive, no-nonsense AI code completion plugin for Visual Studio Code and compatible editors such as VSCodium. It integrates seamlessly with a range of locally hosted and API-based model providers, including Ollama, llama.cpp, LM Studio, and Oobabooga.

Like GitHub Copilot, but 100% free!

Install Twinny on the Visual Studio Code extension marketplace.
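If you prefer the command line, the extension can also be installed with the editor's CLI. This is a minimal sketch; the extension identifier rjmacarthy.twinny is an assumption, so verify it against the marketplace listing if the install fails:

code --install-extension rjmacarthy.twinny       # VS Code
codium --install-extension rjmacarthy.twinny     # VSCodium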

Main Features

Fill in the Middle Code Completion

Get AI-based suggestions in real time. Let Twinny autocomplete your code as you type.

Fill in the Middle Example

Chat with AI About Your Code

Discuss your code via the sidebar: get function explanations, generate tests, request refactoring, and more.

Additional Features

  • Operates online or offline
  • Highly customizable API endpoints for FIM and chat
  • Chat conversations are preserved
  • Conforms to the OpenAI API standard
  • Supports single or multiline fill-in-middle completions
  • Customizable prompt templates
  • Generate git commit messages from staged changes (CTRL+SHIFT+T CTRL+SHIFT+G)
  • Easy installation via the Visual Studio Code extensions marketplace
  • Customizable settings for API provider, model name, port number, and path
  • Compatible with Ollama, llama.cpp, oobabooga, and LM Studio APIs
  • Accepts code solutions directly in the editor
  • Creates new documents from code blocks
  • Copies generated code solution blocks

🚀 Getting Started

Setup with Ollama (Recommended)

  1. Install the Twinny extension for VS Code or VSCodium from your editor's extension marketplace.
  2. Install Ollama, which Twinny uses as its default backend.
  3. Pull your models from the Ollama library (e.g., codellama:7b-instruct for chat and codellama:7b-code for autocomplete); a quick way to verify the server and models is sketched after these steps.
ollama run codellama:7b-instruct
ollama run codellama:7b-code
  4. Open VS Code (if it is already open, a restart might be needed) and press CTRL + SHIFT + T to open the side panel.

You should see the 🤖 icon indicating that twinny is ready to use.

  5. See the Keyboard Shortcuts section below and start using Twinny while coding 🎉
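
As a quick sanity check, you can confirm that Ollama is serving and that both models are present before opening the side panel. A minimal sketch, assuming Ollama is listening on its default port 11434:

# Pull the chat and autocomplete models (ollama run also pulls on first use)
ollama pull codellama:7b-instruct
ollama pull codellama:7b-code

# The Ollama server should respond and list both models
curl http://localhost:11434/api/tags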

Setup with Other Providers (llama.cpp, LM Studio, Oobabooga, LiteLLM, etc.)

For setups with llama.cpp, LM Studio, Oobabooga, LiteLLM, or any other provider, see providers.md for details on provider configuration and capabilities.

  1. Install the Twinny extension from your editor's extension marketplace.
  2. Obtain and run your chosen model locally using the provider's setup instructions.
  3. Restart VS Code if necessary and press CTRL + SHIFT + T to open the side panel.
  4. At the top of the extension, click the 🔌 (plug) icon to configure your FIM and chat endpoints in the providers tab.
  5. It is recommended to use separate models for FIM and chat as they are optimized for different tasks.
  6. Update the provider settings for chat, including provider, port, and hostname to correctly connect to your chat model.
  7. After setup, the 🤖 icon should appear in the sidebar, indicating that Twinny is ready for use.
  8. Results may vary from provider to provider, especially if the same model is used interchangeably for chat and FIM. An example llama.cpp launch is sketched below.
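
As an illustration, here is one way to serve a local model for Twinny to connect to using llama.cpp's built-in HTTP server. The binary name, flags, model path, and port below are assumptions based on a recent llama.cpp build, not Twinny defaults; whatever hostname, port, and path your server actually exposes is what you should enter in the providers tab.

# Example llama.cpp server launch (adjust the model path and port to your setup)
./llama-server -m ./models/codellama-7b-code.Q5_K_M.gguf --host 127.0.0.1 --port 8080

In the providers tab, the FIM provider would then point at 127.0.0.1 and port 8080, with the chat provider configured separately against its own model or server.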

With Non-Local API Providers (e.g., OpenAI GPT-4 and Anthropic Claude)

Twinny supports OpenAI API-compliant providers.

  1. Use LiteLLM as your local proxy for the best compatibility.
  2. If there are any issues, please open an issue on GitHub with details.
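
For example, a LiteLLM proxy can expose a hosted model behind a local OpenAI-compatible endpoint that Twinny's chat provider can be pointed at. This is a rough sketch; check the LiteLLM documentation for the exact flags and default port of your version:

pip install litellm
export OPENAI_API_KEY=sk-...        # your provider's API key
litellm --model gpt-4 --port 4000   # serves an OpenAI-compatible API on localhost:4000

In the providers tab, set the chat provider's hostname to localhost and the port to whatever the proxy reports on startup.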

Model Support

Models for Chat:

  • For powerful machines: deepseek-coder:6.7b-base-q5_K_M or codellama:7b-instruct.
  • For less powerful setups, choose a smaller instruct model for quicker responses, albeit with less accuracy.

Models for FIM Completions:

  • High performance: deepseek-coder:base or codellama:7b-code.
  • Lower performance: deepseek-coder:1.3b-base-q4_1 for CPU-only setups.
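
With Ollama as the backend, keeping one model per role avoids the quality drop that comes from using a single model for both chat and FIM. A small sketch using model tags from the lists above:

# An instruct model for chat and a small base model for FIM on CPU-only machines
ollama pull codellama:7b-instruct
ollama pull deepseek-coder:1.3b-base-q4_1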

Keyboard Shortcuts

Shortcut                       Description
ALT+\                          Trigger inline code completion
CTRL+SHIFT+/                   Stop inline code generation
Tab                            Accept the generated inline code
CTRL+SHIFT+T                   Open the Twinny sidebar
CTRL+SHIFT+T CTRL+SHIFT+G      Generate commit messages from staged changes

Workspace Context

Enable useFileContext in the extension settings to improve completion quality; it tracks sessions and file access patterns to build richer context. It is off by default for performance reasons.
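
For example, in your settings.json (a sketch: the twinny. prefix below follows the usual VS Code naming convention but is an assumption here, so confirm the exact key in the extension's settings UI):

{
  // Assumed key name following the <extension>.<setting> convention;
  // check the Twinny section of the settings UI for the exact entry.
  "twinny.useFileContext": true
}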

Known Issues

Visit the GitHub issues page for known problems and troubleshooting.

Contributing

Interested in contributing? Reach out on Twitter, describe your changes in an issue, and submit a PR when ready. Twinny is open-source under the MIT license. See the LICENSE for more details.

Disclaimer

Twinny is actively developed and provided "as is". Functionality may vary between updates.

Star History

Star History Chart
