
llm-functions's Introduction

LLM Functions

This project empowers you to effortlessly build powerful LLM tools and agents using familiar languages like Bash, JavaScript, and Python.

Forget complex integrations, harness the power of function calling to connect your LLMs directly to custom code and unlock a world of possibilities. Execute system commands, process data, interact with APIs – the only limit is your imagination.

Kickstart your journey with a curated library of pre-built LLM tools and agents ready for immediate use or customization.

Tools Showcase

Agents Showcase

Prerequisites

Make sure you have the following tools installed:

  • argc: A bash command-line framework and command runner
  • jq: A JSON processor

Getting Started with AIChat

Currently, AIChat is the only CLI tool that supports llm-functions. We look forward to more tools supporting llm-functions.

1. Clone the repository

git clone https://github.com/sigoden/llm-functions

2. Build tools and agents

I. Create a ./tools.txt file with each tool filename on a new line.

get_current_weather.sh
execute_command.sh
#execute_py_code.py
Where is the web_search tool?

The web_search tool itself doesn't exist directly. Instead, you choose one of several web search tools to act as it.

To use one as the web_search tool, follow these steps:

  1. Choose a Tool: Available tools include:

    • web_search_cohere.sh
    • web_search_perplexity.sh
    • web_search_tavily.sh
    • web_search_vertexai.sh
  2. Link Your Choice: Use the argc command to link your chosen tool as web_search. For example, to use web_search_perplexity.sh:

    $ argc link-web-search web_search_perplexity.sh

    This command creates a symbolic link, making web_search.sh point to your selected web_search_perplexity.sh tool.

Now web_search.sh is ready to be added to your ./tools.txt.
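For the curious, the link step can be sketched by hand. This is only an illustration in a throwaway directory, not what the Argcfile literally runs:

```shell
# Illustrative only: what `argc link-web-search web_search_perplexity.sh`
# effectively does, demonstrated in a temporary directory (the real command
# operates on the repo's ./tools/ directory).
mkdir -p /tmp/llm-functions-demo/tools
cd /tmp/llm-functions-demo/tools
touch web_search_perplexity.sh

# Point web_search.sh at the chosen implementation:
ln -sf web_search_perplexity.sh web_search.sh

readlink web_search.sh   # -> web_search_perplexity.sh
```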

II. Create a ./agents.txt file with each agent name on a new line.

coder
todo

III. Build bin and functions.json

argc build

3. Install to AIChat

Symlink this repo directory to AIChat's functions_dir:

ln -s "$(pwd)" "$(aichat --info | grep -w functions_dir | awk '{print $2}')"
# OR
argc install

4. Start using the functions

Done! Now you can use the tools and agents with AIChat.

aichat --role %functions% what is the weather in Paris?
aichat --agent todo list all my todos

Writing Your Own Tools

Building tools for our platform is remarkably straightforward. You can leverage your existing programming knowledge, as tools are essentially just functions written in your preferred language.

LLM Functions automatically generates the JSON declarations for the tools based on comments. Refer to ./tools/demo_tool.{sh,js,py} for examples of how to use comments for autogeneration of declarations.

Bash

Create a new bash script in the ./tools/ directory (e.g. execute_command.sh).

#!/usr/bin/env bash
set -e

# @describe Execute the shell command.
# @option --command! The command to execute.

main() {
    eval "$argc_command" >> "$LLM_OUTPUT"
}

eval "$(argc --argc-eval "$0" "$@")"
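For illustration, the `@describe` and `@option` comments above might be compiled by `argc build` into a functions.json entry roughly like the following. This is a sketch in the OpenAI function-calling schema; the exact generated output may differ:

```json
{
  "name": "execute_command",
  "description": "Execute the shell command.",
  "parameters": {
    "type": "object",
    "properties": {
      "command": {
        "type": "string",
        "description": "The command to execute."
      }
    },
    "required": ["command"]
  }
}
```

The `!` suffix on `--command!` marks the option as required, which is why `command` would land in the `required` array.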

JavaScript

Create a new JavaScript file in the ./tools/ directory (e.g. execute_js_code.js).

/**
 * Execute the javascript code in node.js.
 * @typedef {Object} Args
 * @property {string} code - Javascript code to execute, such as `console.log("hello world")`
 * @param {Args} args
 */
exports.main = function main({ code }) {
  return eval(code);
}

Python

Create a new python script in the ./tools/ directory (e.g. execute_py_code.py).

def main(code: str):
    """Execute the python code.
    Args:
        code: Python code to execute, such as `print("hello world")`
    """
    return exec(code)
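Note that exec() always returns None, so the tool above returns nothing even when the executed code prints. If you want the tool to hand printed output back to the LLM, one approach (a sketch, not part of the project) is to capture stdout:

```python
import contextlib
import io

def main(code: str) -> str:
    """Execute the python code and return whatever it prints.

    Args:
        code: Python code to execute, such as `print("hello world")`
    """
    buf = io.StringIO()
    # Redirect anything the snippet prints into an in-memory buffer,
    # since exec() itself always returns None.
    with contextlib.redirect_stdout(buf):
        exec(code)
    return buf.getvalue()
```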

Writing Your Own Agents

Agent = Prompt + Tools (Function Calling) + Documents (RAG), comparable to OpenAI's GPTs.

The agent has the following folder structure:

└── agents
    └── myagent
        ├── functions.json                  # JSON declarations for functions (Auto-generated)
        ├── index.yaml                      # Agent definition
        ├── tools.txt                       # Shared tools
        └── tools.{sh,js,py}                # Agent tools 

The agent definition file (index.yaml) defines crucial aspects of your agent:

name: TestAgent                             
description: This is a test agent
version: 0.1.0
instructions: You are a test AI agent to ... 
conversation_starters:
  - What can you do?
variables:
  - name: foo
    description: This is a foo
documents:
  - local-file.txt
  - local-dir/
  - https://example.com/remote-file.txt

Refer to ./agents/demo for examples of how to implement an agent.

License

The project is under the MIT License. Refer to the LICENSE file for detailed information.

llm-functions's People

Contributors

david-else, gilcu3, sigoden


llm-functions's Issues

Error while running "Symlink this repo directory to aichat functions_dir" instruction step

Running

❯ ln -s "$(pwd)" "$(aichat --info | grep functions_dir | awk '{print $2}')"

produces

ln: failed to create symbolic link '/home/ngirard/.config/aichat/functions'$'\n''/home/ngirard/.config/aichat/functions/agents': No such file or directory

FWIW:

❯ aichat --info | grep functions_dir | awk '{print $2}'
/home/ngirard/.config/aichat/functions
/home/ngirard/.config/aichat/functions/agents
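The likely cause is that plain `grep functions_dir` matches two lines of `aichat --info`, because a second key contains the same substring, so `ln` receives a two-line target. The README's install step uses `grep -w` to restrict the match to the whole word. A self-contained sketch (the second key name is invented for illustration):

```shell
# Hypothetical aichat --info output: a second key (here invented as
# agent_functions_dir) also contains the substring "functions_dir".
info='functions_dir          /home/user/.config/aichat/functions
agent_functions_dir    /home/user/.config/aichat/functions/agents'

# Without -w, both lines match, so the command substitution yields two paths:
printf '%s\n' "$info" | grep functions_dir | awk '{print $2}'

# With -w, only the whole-word key matches ("_" is a word character, so the
# occurrence inside agent_functions_dir is not at a word boundary):
printf '%s\n' "$info" | grep -w functions_dir | awk '{print $2}'
```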

`argc build` returns an `jq` error

Describe the bug
I cloned the repo at 970ed06, created tools.txt and agents.txt files, ran argc build and the following error gets printed:

❯ argc build
jq: error (at <stdin>:28): null (null) cannot be matched, as it is not a string
Build functions.json
Build tool get_current_weather.sh
jq: error (at <stdin>:78): Cannot iterate over null (null)
Build agents/todo-sh/functions.json
Build agent todo-sh

To Reproduce

  • repo is at 970ed06
  • tools.txt contains get_current_weather.sh
  • agents.txt contains todo-sh
  • run argc build

Expected behavior
I'm guessing functions.json should be something other than an empty array ([]).

Screenshots/Logs

❯ git --no-pager log -n 1
commit 970ed06d2b5759e8d17aeed55fbfde591fb0ff84 (HEAD -> main, origin/main, origin/HEAD)
Author: sigoden <[email protected]>
Date:   Sat Jun 29 07:07:33 2024 +0800

    refactor: remove scripts/crawler.mjs (#53)

❯ git status
On branch main
Your branch is up to date with 'origin/main'.

nothing to commit, working tree clean

❯ cat -p tools.txt 
get_current_weather.sh

❯ cat -p agents.txt 
todo-sh

❯ cat -p functions.json 
[]

❯ tree bin 
bin
├── get_current_weather -> /home/$USER/projects/llm-functions/scripts/run-tool.sh
└── todo-sh -> /home/$USER/projects/llm-functions/scripts/run-agent.sh

Environment

❯ argc version
Linux $HOSTNAME 6.6.32 #1-NixOS SMP PREEMPT_DYNAMIC Sat May 25 14:22:56 UTC 2024 x86_64 GNU/Linux
argc 1.14.0
jq-1.7.1
Unknown option `--argc-shell-path`
Unknown option `--argc-shell-path`
/home/$USER/projects/llm-functions/Argcfile.sh: line 486: : command not found
 
/run/current-system/sw/bin/python Python 3.11.9

Additional context
I'm guessing my argc is too far out of date?

Cross-platform (i.e. Windows) compatibility?

This is pretty impressive all around, but using bash (plus all the other required tools) makes it work only on Linux-based systems (it seems to require Cygwin on Windows, which is a big dependency and doesn't perform well).

So while the project doesn't have too much code, it might be worth considering making the main script depend on only one cross-platform tool.

As far as I can see, its functionality is just building a JSON file from the folder structure; couldn't this just be folded into aichat itself?

If not, can it maybe be written in a cross-platform language, such as Python, Node, or even something like https://github.com/sagiegurari/duckscript?

Not to mention, bash is full of traps and foot-guns, and maintaining it will get harder as it expands.

Feature Request: Add shell function to scrape websites

Thanks for the amazing functions feature! In addition to searching the web, it would be great to be able to query actual web pages or sites for data. Maybe this is possible in some way with the duckduckgo function, although I would think either BeautifulSoup or a combination of curl and pandoc to convert the HTML might be needed.

EDIT:

My first attempt fails with:

Call get_webpage '{}'
Call get_webpage '{}':
    error: the following required arguments were not provided:
      --query <QUERY>
#!/usr/bin/env bash
set -e

# @describe Takes in a URL for a webpage and returns the HTML as markdown.
# Use it to answer user questions that require access to web pages such as creating a summary.
# @meta require-tools curl pandoc
# @option --query! The URL to scrape.

main() {
  curl "$argc_query" | pandoc -f html-native_divs-native_spans -t gfm-raw_html | sed -E 's/!\[.*?\]\((data:image\/svg\+xml[^)]+)\)//g'
}

eval "$(argc --argc-eval "$0" "$@")"

Windows paths aren't handled correctly

aichat -r %functions% what time is it
Call get_current_time '{}'
Call get_current_time '{}':
    /bin/bash: C:UsersmeAppDataRoamingaichatfunctionsshget_current_time.sh: No such file or directory

It seems to try to run the file via the Windows path; the backslashes are stripped, so the function doesn't work.

coder agent: functions confused by `---` as content and getting `error: unexpected argument `---``

Describe the bug
I don't know if my bug report title is accurate, as I am not certain of the problem. _index.md is not written. Using the coder agent to convert a static website to Hugo, I quickly get the following error, even after telling it to quote strings:

please make sure you quote any strings when you call functions and try again
Call fs_mkdir {"path":"hugo"}
Call fs_mkdir {"path":"hugo/content"}
Call fs_mkdir {"path":"hugo/content/post"}
Call coder fs_create {"path":"hugo/content/_index.md","content":"---\ntitle: \"Home\"\ndescription: \"Welcome to my awesome blog post page powered by Hugo and styled with Tailwind CSS.\"\n---\n\n# Welcome to My Blog\n\nWelcome to my awesome blog post page powered by Hugo and styled with Tailwind CSS.\n\n## Latest Tutorials\n\n{{< content >}}"}
error: unexpected argument `---
title: "Home"
description: "Welcome to my awesome blog post page powered by Hugo and styled with Tailwind CSS."
---

# Welcome to My Blog

Welcome to my awesome blog post page powered by Hugo and styled with Tailwind CSS.

## Latest Tutorials

{{< content >}}` found
Tool call exit with 1
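For context, this matches a common parsing pattern: most argument parsers treat any token that begins with `-` as an option flag, which is presumably why content starting with `---` trips up the generated tool wrapper; the conventional escape hatch is a bare `--` separator. A minimal Python illustration of the general behavior (plain argparse, not llm-functions code):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("content")

# parser.parse_args(["---\ntitle: Home"]) would error out, because the
# leading dashes make the value look like an (unknown) option.
# A bare "--" tells the parser that everything after it is positional:
args = parser.parse_args(["--", "---\ntitle: Home\n---"])
print(args.content)
```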

To Reproduce
I am converting a static website to hugo, so it is not a good idea to upload all my source files.

Expected behavior

Screenshots/Logs

Environment

Linux debian 6.1.0-23-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.1.99-1 (2024-07-15) x86_64 GNU/Linux
aichat 0.20.0
argc 1.20.0
jq-1.6
/usr/bin/bash GNU bash, version 5.2.15(1)-release (x86_64-pc-linux-gnu)
/usr/bin/node v18.19.0

LLM_AGENT_CACHE_DIR not documented nor default fallback

Describe the bug

LLM_AGENT_CACHE_DIR is not documented here and does not seem to have a default / fallback value.

To Reproduce

todo-sh>> add todo make a lasagne
Call todo-sh add_todo {"desc":"make a lasagne"}
No such file or directory (os error 2)

Expected behavior
I expected it to just work, since I am following the README here

Environment

❯ aichat --version
aichat 0.19.0

Additional context

%functions% role not working when a model is specified in command line

When I specify a model, the function calling seems to stop working:

thomas@[HOSTNAME]:~/projets/llm-functions$ aichat -r %functions% print the millionth prime in python
Call may_execute_py_code '{"code":"import sympy\nprint(sympy.prime(1000000))"}'
> [1] Run, [2] Run & Retrieve, [3] Skip: 1
15485863
thomas@[HOSTNAME]:~/projets/llm-functions$ aichat  --model azure_ai:gpt-4-turbo-1106-preview -r %functions% print the millionth prime in python
Generating the millionth prime number can be achieved in Python by implementing a prime number finding algorithm. Below is [etc]

Unable to use functions

Describe the bug

I should probably break this into three issues, but they may be related?

  1. The first issue was with the script not finding Python. It was on the path, but the script wouldn't find it. I changed python to python3 in the script and it worked instantly. Not sure if I messed something up in my WSL setup for it to not see the path, but my fix worked, and the script then ran with some errors as described in issue 2 below.
  2. The second issue: after following your instructions I got the error below. I noticed web-search.sh doesn't exist in the tools folder, nor did I have it in my tools.txt file. Not sure why this error happened. You'll also see an error for coder; it exists in the file structure and has files, but I get the invalid error. (see Issue 2 error below)
  3. I proceeded to try other functions since I didn't get errors on those. None of the functions work no matter what model I use. I used Ollama with llama3.1 and Gemini, and neither worked (see Issue 3 error below). Both models work if I don't use functions in the prompt.

Issue 2 error:

WARNING: no found web_search tool, please run argc link-web-search to set one.
error: not found tools: web_search.sh
Build agents/todo/functions.json
error: invalid agents: coder

Issue 3 error w/Ollama:llama3.1:

Note: I still get an LLM response just w/o the function after the warning

WARNING: This LLM or client does not support function calling, despite the context requiring it.

Issue 3 error w/Gemini API

Note: I don't get any response just this error

Failed to call chat-completions api

Caused by:
* GenerateContentRequest.tools[0].function_declarations[0].parameters.properties[array].items: missing field.
* GenerateContentRequest.tools[0].function_declarations[0].parameters.properties[array_optional].items: missing field.
* GenerateContentRequest.tools[0].function_declarations[5].parameters.properties[array].items: missing field.
* GenerateContentRequest.tools[0].function_declarations[5].parameters.properties[array_optional].items: missing field.
* GenerateContentRequest.tools[0].function_declarations[10].parameters.properties[array_optional].items: missing field.
* GenerateContentRequest.tools[0].function_declarations[10].parameters.properties[array].items: missing field.
(status: INVALID_ARGUMENT)

To Reproduce

For issue 2 I followed the install guide as stated (except for the script change I made in issue 1) and I get the error.

For issue 3 I used the example prompts from GitHub with both models, with errors. Ollama llama3.1 is set as default and I change the model when using Gemini. If I omit the %functions% role I'll get a response from either model, just without the function, obviously.

aichat -r %functions% what is the weather in Paris and Berlin
aichat -m gemini -r %functions% what is the weather in Paris and Berlin

# Search the web function
aichat -r %functions% latest version of node.js
aichat -m gemini -r %functions% latest version of node.js

# Execute command function
aichat -r %functions% what is my cpu arch
aichat -m gemini -r %functions% what is my cpu arch

Expected behavior

Expected the install to complete without errors and all functions to work correctly when called.

Screenshots/Logs

Environment

Linux test 5.15.153.1-microsoft-standard-WSL2 #1 SMP Fri Mar 29 23:14:13 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
argc 1.20.0
jq-1.7
/usr/bin/bash GNU bash, version 5.2.21(1)-release (x86_64-pc-linux-gnu)
/home/linuxbrew/.linuxbrew/bin/node v22.5.1
/usr/bin/python3 Python 3.12.3

Additional context

The project doesn't state if certain models are required to run functions. I understand that llama3.1 does support functions and I assume you include Gemini and a few other providers as they do as well. But for some reason I can't get them to work so was just wondering if only certain providers/Local LLMs support these functions? If so I think they should be called out.

My config only has the two LLM providers in it nothing else. Should I have other config options enabled?

may_execute_py_code executes without asking for permission

Describe the bug
aichat executes "may" types of commands without asking for permission

aichat --model openai:gpt-4o -r %functions% print the 123rd prime
Call may_execute_py_code {"code":"from sympy import primerange\nprime_list = list(primerange(0,730))\nprint(prime_list[122])"}
The 123rd prime number is 677.

I'm using the current main version of aichat (#64982b45) on WSL on Windows, along with commit a799428 of llm-functions

Expected behavior
I expect to be asked whether to run the command.

Screenshots/Logs

 cat tools.txt
get_current_weather.sh
may_execute_py_code.py
may_execute_command.sh
cat bots.txt
todo-sh
cat ~/.config/aichat/config.yaml
model: openai
keybindings: emacs
compress_threshold: 50000
function_calling: true
clients:
- type: openai
  api_key: <my-openai-api-key>

Environment

argc version
Linux DESKTOP-DBPFNLP 5.15.146.1-microsoft-standard-WSL2 #1 SMP Thu Jan 11 04:09:03 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
argc 1.18.0
jq-1.6
/usr/bin/bash GNU bash, version 5.1.16(1)-release (x86_64-pc-linux-gnu)
/usr/bin/python Python 3.10.12
