

tlm - Local CLI Copilot, powered by CodeLLaMa. 💻🦙


tlm is your CLI companion and requires nothing except your workstation. It runs the efficient and powerful CodeLLaMa model entirely in your local environment to provide the best possible command-line suggestions.

Suggest

Explain
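The two features above map to subcommands of the same names. A quick usage sketch (the prompt strings are illustrative, and the snippet is guarded so it is a no-op on machines where tlm is not yet installed):

```shell
# One-liner generation and command explanation (prompts are illustrative).
# Guarded so the snippet degrades gracefully before tlm is installed.
if command -v tlm >/dev/null 2>&1; then
  tlm suggest "list the five largest files under the current directory"
  tlm explain "du -ah . | sort -rh | head -n 5"
else
  echo "tlm is not installed yet; see Installation below"
fi
```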

Features

  • 💸 No API key or subscription is required (unlike ChatGPT, GitHub Copilot, Azure OpenAI, etc.)

  • 📡 No internet connection is required.

  • 💻 Works on macOS, Linux, and Windows.

  • 👩🏻‍💻 Automatic shell detection.

  • 🚀 One-liner generation and command explanation.

Installation

Installation can be done in two ways: with the installation script or via go install.

Prerequisites

Ollama is required to download the necessary models. It can be installed on each platform as follows:

  • On macOS and Windows:

Download instructions can be followed at the following link: https://ollama.com/download

  • On Linux:
curl -fsSL https://ollama.com/install.sh | sh
  • Or using the official Docker images 🐳:
# CPU Only
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# With GPU (Nvidia only)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
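Whichever method you choose, you can confirm that Ollama is up before moving on. Ollama's API listens on port 11434 by default (as in the docker commands above), and its root endpoint answers a plain health message:

```shell
# Quick health check against the Ollama API.
# 11434 is Ollama's default port; override OLLAMA_HOST for remote instances.
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"
if curl -fsS --max-time 2 "$OLLAMA_HOST" >/dev/null 2>&1; then
  echo "Ollama is reachable at $OLLAMA_HOST"
else
  echo "Ollama is NOT reachable at $OLLAMA_HOST"
fi
```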

Installation Script

The installation script is the recommended way to install tlm. It detects your platform and architecture and runs the appropriate install command for you.

Linux and macOS:

Download and execute the installation script with the following command:

curl -fsSL https://raw.githubusercontent.com/yusufcanb/tlm/release/1.1/install.sh | sudo bash -E

Windows (PowerShell 5.1 or higher):

Download and execute the installation script with the following command:

Invoke-RestMethod -Uri https://raw.githubusercontent.com/yusufcanb/tlm/release/1.1/install.ps1 | Invoke-Expression

Go Install

If you have Go 1.21 or higher installed on your system, you can install tlm with the following command:

go install github.com/yusufcanb/tlm@latest
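Note that go install places the binary in GOPATH/bin (by default ~/go/bin), so if the tlm command is not found afterwards, check that this directory is on your PATH. A small sketch of locating it:

```shell
# `go install` drops binaries in $(go env GOPATH)/bin.
if command -v go >/dev/null 2>&1; then
  GOBIN_DIR="$(go env GOPATH)/bin"
else
  GOBIN_DIR="$HOME/go/bin"  # Go's documented default GOPATH
fi
echo "ensure $GOBIN_DIR is on your PATH"
```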

Then, deploy the tlm modelfiles:

📝 Note: If your Ollama instance is deployed somewhere else, first run tlm config and set the Ollama host.

tlm deploy

Verify the installation with the following command:

tlm help

Uninstall

On Linux and macOS:

rm /usr/local/bin/tlm

On Windows:

Remove the following directory:

C:\Users\<username>\AppData\Local\Programs\tlm

Contributors

yusufcanb, eomeragic1, sadikkuzu, slim-abid
