
gaianet-node's People

Contributors

ahmad-mtr, alabulei1, apepkuss, codeaunt, dm4, eltociear, grorge123, juntao, jyc0413, longzhi, mileyfu, mobinshahidi, rumeyst, ryssroad, wenryxu, widiskel


gaianet-node's Issues

Failed to get chat completions. Reason: The model `Llama-3-8B-Instruct` does not exist in the chat graphs.

500 Internal Server Error: Failed to get chat completions. Reason: The model Llama-3-8B-Instruct does not exist in the chat graphs.


curl 'https://0xe2eb9df4f465a69968e7f39e331b4bc21f131d3a.us.gaianet.network/v1/chat/completions' \
  -H 'accept: application/json' \
  -H 'accept-language: en-GB,en-US;q=0.9,en;q=0.8' \
  -H 'cache-control: no-cache' \
  -H 'content-type: text/event-stream' \
  -H 'origin: https://www.gaianet.ai' \
  -H 'pragma: no-cache' \
  -H 'priority: u=1, i' \
  -H 'referer: https://www.gaianet.ai/' \
  -H 'sec-ch-ua: "Not)A;Brand";v="99", "Brave";v="127", "Chromium";v="127"' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'sec-ch-ua-platform: "Windows"' \
  -H 'sec-fetch-dest: empty' \
  -H 'sec-fetch-mode: cors' \
  -H 'sec-fetch-site: cross-site' \
  -H 'sec-gpc: 1' \
  -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.0.0 Safari/537.36' \
  --data-raw $'{"model":"Llama-3-8B-Instruct","messages":[{"role":"user","content":"hi"},{"role":"assistant","content":"\\n\\nNot a question, but a hello\u0021"},{"role":"user","content":"when last bitcoin will be mined any date"},{"role":"assistant","content":"\\n\\nThe last bitcoin is expected to be mined around the year 2140."}],"stream":true,"stream_options":{"include_usage":true},"user":"df6a6f18-4323-4144-820f-010a40a249b5"}'
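A quick way to narrow this down is to ask the node which models it actually serves and reuse one of the returned ids. A minimal sketch, assuming the node exposes the standard OpenAI-compatible /v1/models endpoint alongside /v1/chat/completions:

# List the models the node has loaded (node URL taken from the failing request above).
curl -s 'https://0xe2eb9df4f465a69968e7f39e331b4bc21f131d3a.us.gaianet.network/v1/models'

# Retry the chat request with one of the returned "id" values in the "model" field (placeholder below).
curl -s 'https://0xe2eb9df4f465a69968e7f39e331b4bc21f131d3a.us.gaianet.network/v1/chat/completions' \
  -H 'content-type: application/json' \
  --data '{"model":"<id-from-the-list-above>","messages":[{"role":"user","content":"hi"}]}'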

Init error on MacBook

./bin/gaianet init
[+] Checking the config.json file ...

[+] Downloading Meta-Llama-3-8B-Instruct-Q5_K_M.gguf ...
* Using the cached Meta-Llama-3-8B-Instruct-Q5_K_M.gguf in /Users/XXX/gaianet

[+] Downloading nomic-embed-text-v1.5.f16.gguf ...
* Using the cached nomic-embed-text-v1.5.f16.gguf in /Users/XXX/gaianet

[+] Creating 'default' collection in the Qdrant instance ...
* A Qdrant instance is already running

* Remove the existed 'default' Qdrant collection ...

* Download Qdrant collection snapshot ...

#################################################################################################################### 100.0%#################################################################################################################### 100.0%
The snapshot is downloaded in /Users/XXX/gaianet

* Import the Qdrant collection snapshot ...
  The process may take a few minutes. Please wait ...
  Recovery is done!

[+] Preparing the dashboard ...
You already have a private key.
thread 'main' panicked at src/main.rs:27:41:
called Result::unwrap() on an Err value: Os { code: 44, kind: NotFound, message: "No such file or directory" }
note: run with RUST_BACKTRACE=1 environment variable to display a backtrace
[2024-07-12 11:00:42.209] [error] execution failed: unreachable, Code: 0x89
[2024-07-12 11:00:42.210] [error] In instruction: unreachable (0x00) , Bytecode offset: 0x00206715
[2024-07-12 11:00:42.210] [error] When executing function name: "_start"

Not able to run custom snapshot.tar.gz

I was experimenting with GaiaNet custom nodes and wanted to test how embedding works with custom data. For this I created a snapshot by following the documentation for the Paris dataset, added snapshot.tar.gz to my GitHub, and updated the config file accordingly:

{
  "address": "0x8ed74092f0d0e0752a7fa5d7b9a2a81a6646a53e",
  "chat": "https://huggingface.co/gaianet/Llama-3-8B-Instruct-GGUF/resolve/main/Meta-Llama-3-8B-Instruct-Q5_K_M.gguf",
  "chat_ctx_size": "4096",
  "chat_name": "LLAMA-3-8B-Instruct-GGUF",
  "description": "The default GaiaNet node config with a Phi-3-mini-4k model and a Paris tour guide knowledge base.",
  "domain": "us.gaianet.network",
  "embedding": "nomic-embed-text-v1.5.f16.gguf",
  "embedding_collection_name": "default",
  "embedding_ctx_size": "512",
  "embedding_name": "Nomic-embed-text-v1.5",
  "llamaedge_port": "8080",
  "prompt_template": "llama-3-chat",
  "qdrant_limit": "1",
  "qdrant_score_threshold": "0.5",
  "rag_policy": "system-message",
  "rag_prompt": "You are a tour guide in London, UK. Use information in the following context to directly answer the question from a London visitor.\n----------------\n",
  "reverse_prompt": "",
  "snapshot": "https://huggingface.co/Ak1104/snapshot/blob/main/my.snapshot.tar.gz",
  "system_prompt": "You are a tour guide in London, UK. Please answer the question from a London visitor accurately."
}

Note

I already had the embedding model locally, so in this config I used its relative path instead of the Hugging Face link.

Now when I run gaianet init I am running into:

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now

and if I instead set the "snapshot" field to the relative path of my snapshot, I run into:

curl: (6) Could not resolve host: my.snapshot.tar.gz
Warning: Transient problem: timeout Will retry in 1 seconds. 3 retries left.
curl: (6) Could not resolve host: my.snapshot.tar.gz
Warning: Transient problem: timeout Will retry in 2 seconds. 2 retries left.
curl: (6) Could not resolve host: my.snapshot.tar.gz
Warning: Transient problem: timeout Will retry in 4 seconds. 1 retries left.
curl: (6) Could not resolve host: my.snapshot.tar.gz

I am able to run the Llama3_London tour guide example from the documentation successfully, so I am not sure what is going wrong with my self-created snapshot.
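One hedged observation (an assumption, not confirmed by the maintainers): the "snapshot" URL in the config above uses huggingface.co/.../blob/main/..., which returns an HTML page rather than the raw archive, and that alone would produce the "not in gzip format" error. Raw files on Hugging Face are normally fetched via /resolve/main/. A quick way to verify what the URL really returns:

# Fetch the snapshot with the raw-file URL form (/resolve/ instead of /blob/) and inspect the result.
curl -fL -o my.snapshot.tar.gz 'https://huggingface.co/Ak1104/snapshot/resolve/main/my.snapshot.tar.gz'
file my.snapshot.tar.gz      # should say "gzip compressed data", not "HTML document"
tar -tzf my.snapshot.tar.gz  # lists the archive contents if it really is a valid .tar.gz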

tmp directory

I did not see whether, or how, you can install if you don't have access to, or the ability to create, /tmp in /.
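A rough workaround sketch, under the (unverified) assumption that the install scripts create their scratch files via mktemp and therefore honor TMPDIR; if the WasmEdge installer hard-codes /tmp, this will not be enough:

# Point temporary files at a directory you can write to, then run the unprivileged installer.
mkdir -p "$HOME/tmp"
export TMPDIR="$HOME/tmp"
curl -sSfL 'https://raw.githubusercontent.com/GaiaNet-AI/gaianet-node/main/install.sh' | bash -s -- --unprivileged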

Unable to finetune using llama.cpp

I was following this guide to fine-tune using llama.cpp, but was unable to follow along after this step:

nohup ../build/bin/finetune --model-base llama-2-13b-chat.Q5_K_M.gguf --checkpoint-in checkpoint-250.gguf --lora-out lora.bin --train-data train.txt --sample-start '' --adam-iter 1024 &
It throws a 'no such file found' error.

I tried adding an empty folder named 'finetune' to build/bin, but the issue persists. Can anyone suggest what needs to be done?
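The missing file is usually the finetune binary itself: creating an empty folder will not help, because ../build/bin/finetune has to be produced by building llama.cpp. A minimal sketch, assuming a llama.cpp checkout recent enough to still ship the finetune example:

# From the llama.cpp source tree: configure and build; this produces build/bin/finetune among the other tools.
cmake -B build
cmake --build build --config Release
ls build/bin/finetune   # the path the guide's nohup command expects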

Can't start on Mac


➜  ~ gaianet start
[+] Checking the config.json file ...

You already have a private key.
[+] Starting Qdrant instance ...

    Qdrant instance started with pid: 5538

[+] Starting LlamaEdge API Server ...

    Run the following command to start the LlamaEdge API Server:

wasmedge --dir .:./dashboard --env NODE_VERSION=0.2.2 --nn-preload default:GGML:AUTO:Phi-3-mini-4k-instruct-Q5_K_M.gguf --nn-preload embedding:GGML:AUTO:nomic-embed-text-v1.5.f16.gguf rag-api-server.wasm --model-name Phi-3-mini-4k-instruct,Nomic-embed-text-v1.5 --ctx-size 4096,512 --batch-size 16,512 --prompt-template phi-3-chat,embedding --rag-policy system-message --qdrant-collection-name default --qdrant-limit 1 --qdrant-score-threshold 0.5 --web-ui ./ --socket-addr 0.0.0.0:8080 --rag-prompt "You are a tour guide in Paris, France. Use information in the following context to directly answer the question from a Paris visitor.\n----------------\n"

/Users/fugangcui/gaianet/bin/gaianet: line 569:  5577 Illegal instruction: 4  nohup "${cmd[@]}" > $log_dir/start-llamaedge.log 2>&1
    LlamaEdge API Server started with pid: 5577

When I go to https://0x160e9b8ef1f75dbabxxxxxxxxxxxxxx.us.gaianet.network/, it shows
[screenshot]
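The "Illegal instruction: 4" from bash means the wasmedge process crashed immediately, so the reported pid never corresponds to a running server. The crash details usually end up in the start-llamaedge.log file referenced in the error line; a hedged sketch for inspecting it (the log directory under ~/gaianet is an assumption):

# Look at the tail of the LlamaEdge server log to see how far startup got before the crash.
tail -n 50 ~/gaianet/log/start-llamaedge.log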

Initialization in WSL failed.

OS : Ubuntu 20.04 in WSL
log:

❯ curl -sSfL 'https://raw.githubusercontent.com/GaiaNet-AI/gaianet-node/main/install.sh' | bash -s -- --unprivileged

 ██████╗  █████╗ ██╗ █████╗ ███╗   ██╗███████╗████████╗
██╔════╝ ██╔══██╗██║██╔══██╗████╗  ██║██╔════╝╚══██╔══╝
██║  ███╗███████║██║███████║██╔██╗ ██║█████╗     ██║
██║   ██║██╔══██║██║██╔══██║██║╚██╗██║██╔══╝     ██║
╚██████╔╝██║  ██║██║██║  ██║██║ ╚████║███████╗   ██║
 ╚═════╝ ╚═╝  ╚═╝╚═╝╚═╝  ╚═╝╚═╝  ╚═══╝╚══════╝   ╚═╝


[+] Installing gaianet CLI tool ...
######################################################################## 100.0%
    * gaianet CLI tool is installed in /home/rogge/gaianet/gaianet

[+] Downloading default config file ...
    * Use the cached config file in /home/rogge/gaianet

[+] Installing WasmEdge with wasi-nn_ggml plugin ...
Info: Detected Linux-x86_64

No root permissions.
Installation path found at /home/rogge/.wasmedge
Removing /home/rogge/.wasmedge//bin/wasmedge
Removing /home/rogge/.wasmedge//bin/wasmedgec
Removing /home/rogge/.wasmedge//include/wasmedge/
Removing /home/rogge/.wasmedge//include/wasmedge/enum_configure.h
Removing /home/rogge/.wasmedge//include/wasmedge/version.h
Removing /home/rogge/.wasmedge//include/wasmedge/enum_errcode.h
Removing /home/rogge/.wasmedge//include/wasmedge/wasmedge.h
Removing /home/rogge/.wasmedge//include/wasmedge/int128.h
Removing /home/rogge/.wasmedge//include/wasmedge/enum_types.h
Removing /home/rogge/.wasmedge//include/wasmedge/enum.inc
Removing /home/rogge/.wasmedge//lib/libwasmedge.so.0.0.3
Removing /home/rogge/.wasmedge//lib/libwasmedge.so
Removing /home/rogge/.wasmedge//lib/libwasmedge.so.0
Removing /home/rogge/.wasmedge/plugin/libwasmedgePluginWasiNN.so
Removing /home/rogge/.wasmedge/env
Removing /home/rogge/.wasmedge/include/wasmedge
Removing /home/rogge/.wasmedge/bin
Removing /home/rogge/.wasmedge/lib
Removing /home/rogge/.wasmedge/plugin
Removing /home/rogge/.wasmedge/include
Removing /home/rogge/.wasmedge
Info: WasmEdge Installation at /home/rogge/.wasmedge

Info: Fetching WasmEdge-0.13.5

/tmp/wasmedge.91981 ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WasmEdge-GGML-Plugin

Info: Detected CUDA version: 12

/tmp/wasmedge.91981 ~/gaianet
######################################################################## 100.0%
~/gaianet
Installation of wasmedge-0.13.5 successful
WasmEdge binaries accessible
    * The wasmedge version 0.13.5 is installed in /home/rogge/.wasmedge/bin/wasmedge.

[+] Installing Qdrant binary...
    * Use the cached Qdrant binary in /home/rogge/gaianet/bin

[+] Downloading the rag-api-server.wasm ...
######################################################################## 100.0%
    * The rag-api-server.wasm is downloaded in /home/rogge/gaianet

[+] Downloading dashboard ...
    * Use the cached dashboard in /home/rogge/gaianet

[+] Generating node ID ...
    * Use the cached registry.wasm in /home/rogge/gaianet

    * Generate node ID
You already have a private key.

[+] Installing gaianet-domain...
    * Download gaianet-domain binary
######################################################################## 100.0%
      gaianet-domain is downloaded in /home/rogge/gaianet

    * Install frpc binary
      frpc binary is installed in /home/rogge/gaianet/bin

    * Download frpc.toml
      frpc.toml is downloaded in /home/rogge/gaianet/gaianet-domain

[+] COMPLETED! The gaianet node has been installed successfully.

Your node ID is 0xe934cdc2a0c31f9c410c99326bac37fbb240e580. Please register it in your portal account to receive awards!

>>> Next, you should initialize the GaiaNet node with the LLM and knowledge base. Run the command: gaianet init <<<


~/playgrounds/gaianet took 28s
❯ echo $?
0

~/playgrounds/gaianet
❯ gaianet init
[+] Checking the config.json file ...

[+] Downloading Phi-3-mini-4k-instruct-Q5_K_M.gguf ...
    * Using the cached Phi-3-mini-4k-instruct-Q5_K_M.gguf in /home/rogge/gaianet

[+] Downloading all-MiniLM-L6-v2-ggml-model-f16.gguf ...
    * Using the cached all-MiniLM-L6-v2-ggml-model-f16.gguf in /home/rogge/gaianet

[+] Creating 'default' collection in the Qdrant instance ...
    * Start a Qdrant instance ...

    * Remove the existed 'default' Qdrant collection ...

❯ echo $?
28

It seems it may be a network issue?
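Exit code 28 from curl is a timeout, so a network problem is plausible. Two hedged checks, one against the local Qdrant REST port and one against an external host (huggingface.co is only an example; the real snapshot URL is whatever config.json points at):

# Does the local Qdrant instance answer on its default REST port?
curl -s --max-time 10 http://127.0.0.1:6333/collections
# Can WSL reach the outside world over HTTPS (the snapshot is downloaded this way)?
curl -sSf -o /dev/null -w '%{http_code}\n' --max-time 30 https://huggingface.co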

Update frpc.toml at gaianet init

The node operator might make changes to the node ID after install. We should do the following at gaianet init:

  • Update frpc.toml from nodeid.json
  • Update config.json from nodeid.json
  • Update dashboard/config_pub.json from nodeid.json

The latter two are processed by the registry.wasm program.
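A rough sketch of what the frpc.toml update could look like, assuming nodeid.json stores the node address under an "address" field and that the old 0x... address appears literally in frpc.toml (both are assumptions, not confirmed by the repo):

# Read the new node address and substitute it for any 0x... address currently in frpc.toml (GNU sed syntax).
new_addr=$(jq -r '.address' "$HOME/gaianet/nodeid.json")
sed -E -i "s/0x[0-9a-fA-F]{40}/$new_addr/g" "$HOME/gaianet/gaianet-domain/frpc.toml"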

Unable to initialize gaianet node | Qdrant process id not found

Everything was working fine, but suddenly I got the message "gaianet/bin/gaianet: line 455: kill: (14021) - No such process" while trying to initialize a GaiaNet node after adding the URL of my own snapshot to the config. I have done this before without any problem and was able to run a GaiaNet node with different data and prompts. It happened out of nowhere. I restarted the process and still hit the error. Then I reinstalled GaiaNet, but the error persists. I have tried many things, including opening the gaianet file in the bin directory to understand where the problem was coming from. I am using WSL2 on Windows 11.

Expected Behaviour: Gaianet Node gets initialized successfully.

Observed Behaviour: No process id for qdrant can be found.

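The "kill: No such process" message suggests the script recorded a Qdrant pid that was never alive or exited immediately. A few hedged checks on WSL2 that can narrow down why Qdrant is not running (the log path under ~/gaianet/log is an assumption):

# Is any qdrant process running, and is its default REST port already taken by something else?
pgrep -a qdrant
ss -ltnp | grep ':6333'
# Check the Qdrant startup log for the real failure reason.
tail -n 50 ~/gaianet/log/start-qdrant.log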

Fail to init gaianet on WSL

root@Anson-PC:~# gaianet init
[+] Checking the config.json file ...

[+] Downloading Phi-3-mini-4k-instruct-Q5_K_M.gguf ...
curl: (35) error:0A000126:SSL routines::unexpected eof while reading # # # #

Request for Turkish Translation of README

Hello,

I would like to request the creation of a Turkish version of the current English README. Providing a Turkish translation will help to make the project more accessible to Turkish-speaking users and contributors.

Proposed Solution:

  • Translate the entire README from English to Turkish.
  • Ensure that all technical terms are accurately translated or appropriately kept in English if they do not have a direct Turkish equivalent.
  • Add a link to the Turkish version at the top of the existing README for easy access.

Context:

I am a Turkish speaker and would be happy to assist with or fully handle the translation process. Please let me know if this is something the project maintainers are open to, and if there are any specific guidelines or formats that I should follow.

Thank you for considering this request!

/bin/gaianet: line 569: 56621 Illegal instruction (core dumped) when starting LlamaEdge API Server in GaiaNet

Steps to Reproduce:

  1. Run curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash
  2. Run source /root/.bashrc
  3. Run /root/gaianet/bin/gaianet init
  4. Run /root/gaianet/bin/gaianet start

Expected Result:
The LlamaEdge API Server should start without issues.

Actual Result:
Error: Illegal instruction (core dumped)

Logs and System Information:

Output from `gaianet start` command:

root@220361:~# gaianet start
[+] Checking the config.json file ...

You already have a private key.
[+] Starting Qdrant instance ...
[+] Starting LlamaEdge API Server ...

    Run the following command to start the LlamaEdge API Server:

wasmedge --dir .:./dashboard --env NODE_VERSION=0.2.2 --nn-preload default:GGML:AUTO:Phi-3-mini-4k-instruct-Q5_K_M.gguf --nn-preload embedding:GGML:AUTO:nomic-embed-text-v1.5.f16.gguf rag-api-server.wasm --model-name Phi-3-mini-4k-instruct,Nomic-embed-text-v1.5 --ctx-size 4096,512 --batch-size 16,512 --prompt-template phi-3-chat,embedding --rag-policy system-message --qdrant-collection-name default --qdrant-limit 1 --qdrant-score-threshold 0.5 --web-ui ./ --socket-addr 0.0.0.0:8080 --rag-prompt "You are a tour guide in Paris, France. Use information in the following context to directly answer the question from a Paris visitor.\n----------------\n" 

/root/gaianet/bin/gaianet: line 569: 56621 Illegal instruction     (core dumped) nohup "${cmd[@]}" > $log_dir/start-llamaedge.log 2>&1
    LlamaEdge API Server started with pid: 56621

Output from /root/gaianet/bin/gaianet --version:

GaiaNet CLI Tool v0.2.2

Output from free -m:

               total        used        free      shared  buff/cache   available
Mem:            9940        1082         221           2        8636        8540
Swap:              0           0           0

Output from uname -a:

Linux 220361.xorek.cloud 5.15.0-75-generic #82-Ubuntu SMP Tue Jun 6 23:10:23 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

Output from lscpu:

Architecture:            x86_64
  CPU op-mode(s):        32-bit, 64-bit
  Address sizes:         48 bits physical, 48 bits virtual
  Byte Order:            Little Endian
CPU(s):                  6
  On-line CPU(s) list:   0-5

Output from ulimit -a:

real-time non-blocking time  (microseconds, -R) unlimited
core file size              (blocks, -c) 0
data seg size               (kbytes, -d) unlimited
scheduling priority                 (-e) 0
file size                   (blocks, -f) unlimited
pending signals                     (-i) 39309
max locked memory           (kbytes, -l) 1272444
max memory size             (kbytes, -m) unlimited
open files                          (-n) 1024
pipe size                (512 bytes, -p) 8
POSIX message queues         (bytes, -q) 819200
real-time priority                  (-r) 0
stack size                  (kbytes, -s) 8192
cpu time                   (seconds, -t) unlimited
max user processes                  (-u) 39309
virtual memory              (kbytes, -v) unlimited
file locks                          (-x) unlimited

Please let me know if you need any further information.
start-llamaedge.log
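A common cause of "Illegal instruction (core dumped)" at this point is a pre-built GGML plugin that uses SIMD instructions (typically AVX2) which the virtualized CPU does not expose; the lscpu output above does not include the flag list, so a hedged check:

# Show which AVX extensions the CPU advertises; if avx2 is absent, a plain-CPU or older plugin build may be required.
grep -owE 'avx|avx2|avx512f' /proc/cpuinfo | sort -u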

Ubuntu 24.04 install node error

[+] Generating node ID ...
[2024-07-10 17:44:38.077] [error] loading failed: invalid path, Code: 0x20
[2024-07-10 17:44:38.077] [error] load library failed:libgomp.so.1: cannot open shared object file: No such file or directory
[2024-07-10 17:44:38.078] [error] loading failed: invalid path, Code: 0x20
[2024-07-10 17:44:38.079] [error] load library failed:libgomp.so.1: cannot open shared object file: No such file or directory
Generate new key storing in "38fbf453-77e4-4a67-a5ca-99d2169a2294".
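The underlying failure is the wasi-nn GGML plugin not finding libgomp (the GNU OpenMP runtime), which Ubuntu 24.04 does not always have pre-installed. A hedged fix sketch; libgomp1 is the stock Ubuntu package name:

# Install the GNU OpenMP runtime the plugin links against, then re-run the failing step.
sudo apt update && sudo apt install -y libgomp1
gaianet init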

No public URL and dashboard exported

After running gaianet init, no public URL or dashboard got exported.

Using Ubuntu 22 Docker container

[+] Checking the config.json file ...

[+] Downloading Phi-3-mini-4k-instruct-Q5_K_M.gguf ...

    * Using the cached Phi-3-mini-4k-instruct-Q5_K_M.gguf in /root/gaianet

[+] Downloading nomic-embed-text-v1.5.f16.gguf ...

    * Using the cached nomic-embed-text-v1.5.f16.gguf in /root/gaianet

[+] Creating 'default' collection in the Qdrant instance ...

    * Start a Qdrant instance ...

    * Remove the existed 'default' Qdrant collection ...

    * Download Qdrant collection snapshot ...

######################################################################## 100.0%

      The snapshot is downloaded in /root/gaianet

    * Import the Qdrant collection snapshot ...

      The process may take a few minutes. Please wait ...

      Recovery is done!

[+] Preparing the dashboard ...

You already have a private key.

[+] Preparing the GaiaNet domain ...

[+] COMPLETED! GaiaNet node is initialized successfully.

>>> To start the GaiaNet node, run the command: gaianet start <<<

[+] Checking the config.json file ...

You already have a private key.

[+] Starting Qdrant instance ...

    Qdrant instance started with pid: 397

[+] Starting LlamaEdge API Server ...

    Run the following command to start the LlamaEdge API Server:

wasmedge --dir .:./dashboard --env NODE_VERSION=0.2.4 --nn-preload default:GGML:AUTO:Phi-3-mini-4k-instruct-Q5_K_M.gguf --nn-preload embedding:GGML:AUTO:nomic-embed-text-v1.5.f16.gguf rag-api-server.wasm --model-name Phi-3-mini-4k-instruct,Nomic-embed-text-v1.5 --ctx-size 4096,512 --batch-size 16,512 --prompt-template phi-3-chat,embedding --rag-policy system-message --qdrant-collection-name default --qdrant-limit 1 --qdrant-score-threshold 0.5 --web-ui ./ --socket-addr 0.0.0.0:8080 --rag-prompt "You are a tour guide in Paris, France. Use information in the following context to directly answer the question from a Paris visitor.\n----------------\n" 

    LlamaEdge API Server started with pid: 626


[2024-07-28 08:12:55.462] [info] [WASI-NN] llama.cpp: llm_load_print_meta: model size       = 2.62 GiB (5.89 BPW) 

[2024-07-28 08:12:55.462] [info] [WASI-NN] llama.cpp: llm_load_print_meta: general.name     = Phi3

[2024-07-28 08:12:55.462] [info] [WASI-NN] llama.cpp: llm_load_print_meta: BOS token        = 1 '<s>'

Mac cannot initialize. Node ID generation error

Cannot initialize on Mac.
Last login: Wed Jul 10 15:16:09 on ttys003
4ge1@4ge1deMacBook-Pro ~ % export no_proxy=localhost,127.0.0.0/8
4ge1@4ge1deMacBook-Pro ~ % curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash

 ██████╗  █████╗ ██╗ █████╗ ███╗   ██╗███████╗████████╗
██╔════╝ ██╔══██╗██║██╔══██╗████╗  ██║██╔════╝╚══██╔══╝
██║  ███╗███████║██║███████║██╔██╗ ██║█████╗     ██║
██║   ██║██╔══██║██║██╔══██║██║╚██╗██║██╔══╝     ██║
╚██████╔╝██║  ██║██║██║  ██║██║ ╚████║███████╗   ██║
 ╚═════╝  ╚═╝  ╚═╝╚═╝╚═╝  ╚═╝╚═╝  ╚═══╝╚══════╝   ╚═╝

[+] Installing gaianet CLI tool ...
######################################################################## 100.0%
* gaianet CLI tool is installed in /Users/4ge1/gaianet/bin

[+] Downloading default config.json ...
######################################################################## 100.0%
* The default config file is downloaded in /Users/4ge1/gaianet

[+] Downloading nodeid.json ...
######################################################################## 100.0%
* The nodeid.json is downloaded in /Users/4ge1/gaianet

[+] Installing WasmEdge with wasi-nn_ggml plugin ...
Info: Detected Darwin-arm64

cat: /Users/4ge1/.zshrc: No such file or directory
Info: WasmEdge Installation at /Users/4ge1/.wasmedge

Info: Fetching WasmEdge-0.13.5

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WasmEdge-GGML-Plugin

Info: Detected CUDA version from nvcc:

Info: CUDA version is not detected from nvcc: Use the CPU version.

Info: Or you can use '-c 11' or '-c 12' to install the cuda-11 or cuda-12 version manually.

Info: Use b3405 GGML plugin

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WASI-Logging-Plugin

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Installation of wasmedge-0.13.5 successful

source /Users/4ge1/.wasmedge/env to use wasmedge binaries
* The wasmedge version 0.13.5 is installed in /Users/4ge1/.wasmedge/bin/wasmedge.

[+] Installing Qdrant binary...
* Download Qdrant binary
######################################################################## 100.0%
The Qdrant binary is downloaded in /Users/4ge1/gaianet/bin

* Initialize Qdrant directory

* Disable telemetry

[+] Downloading LlamaEdge API server ...
######################################################################## 100.0%
######################################################################## 100.0%
* The rag-api-server.wasm and llama-api-server.wasm are downloaded in /Users/4ge1/gaianet

[+] Downloading dashboard ...
######################################################################## 100.0%
* The dashboard is downloaded in /Users/4ge1/gaianet

[+] Downloading registry.wasm ...

5.6%curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)

4ge1@4ge1deMacBook-Pro ~ % export no_proxy=localhost,127.0.0.0/8
4ge1@4ge1deMacBook-Pro ~ % curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash
curl: (28) Failed to connect to github.com port 443 after 75000 ms: Couldn't connect to server
4ge1@4ge1deMacBook-Pro ~ % curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash

curl: (28) Failed to connect to github.com port 443 after 75016 ms: Couldn't connect to server
4ge1@4ge1deMacBook-Pro ~ %
4ge1@4ge1deMacBook-Pro ~ %
4ge1@4ge1deMacBook-Pro ~ % export https_proxy=http://127.0.0.1:7890 http_proxy=http://127.0.0.1:7890 all_proxy=socks5://127.0.0.1:7890
4ge1@4ge1deMacBook-Pro ~ % curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash

 ██████╗  █████╗ ██╗ █████╗ ███╗   ██╗███████╗████████╗
██╔════╝ ██╔══██╗██║██╔══██╗████╗  ██║██╔════╝╚══██╔══╝
██║  ███╗███████║██║███████║██╔██╗ ██║█████╗     ██║
██║   ██║██╔══██║██║██╔══██║██║╚██╗██║██╔══╝     ██║
╚██████╔╝██║  ██║██║██║  ██║██║ ╚████║███████╗   ██║
 ╚═════╝  ╚═╝  ╚═╝╚═╝╚═╝  ╚═╝╚═╝  ╚═══╝╚══════╝   ╚═╝

[+] Installing gaianet CLI tool ...
######################################################################## 100.0%
* gaianet CLI tool is installed in /Users/4ge1/gaianet/bin

[+] Downloading default config.json ...
* Use the cached config file in /Users/4ge1/gaianet

[+] Installing WasmEdge with wasi-nn_ggml plugin ...
Info: Detected Darwin-arm64

No root permissions.
Installation path found at /Users/4ge1/.wasmedge
Removing /Users/4ge1/.wasmedge//bin/wasmedge
Removing /Users/4ge1/.wasmedge//bin/wasmedgec
Removing /Users/4ge1/.wasmedge//include/wasmedge/
Removing /Users/4ge1/.wasmedge//include/wasmedge/wasmedge.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/version.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum.inc
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum_configure.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum_errcode.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/int128.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum_types.h
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.dylib
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.0.3.tbd
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.tbd
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.dylib
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.0.3.dylib
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.tbd
Removing /Users/4ge1/.wasmedge/plugin/ggml-common.h
Removing /Users/4ge1/.wasmedge/plugin/ggml-metal.metal
Removing /Users/4ge1/.wasmedge/plugin/libwasmedgePluginWasiLogging.dylib
Removing /Users/4ge1/.wasmedge/plugin/libwasmedgePluginWasiNN.dylib
Removing /Users/4ge1/.wasmedge/env
Removing /Users/4ge1/.wasmedge/bin
Removing /Users/4ge1/.wasmedge/include/wasmedge
Removing /Users/4ge1/.wasmedge/plugin
Removing /Users/4ge1/.wasmedge/lib
Removing /Users/4ge1/.wasmedge/include
Removing /Users/4ge1/.wasmedge
cat: /Users/4ge1/.zshrc: No such file or directory
Info: WasmEdge Installation at /Users/4ge1/.wasmedge

Info: Fetching WasmEdge-0.13.5

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WasmEdge-GGML-Plugin

Info: Detected CUDA version from nvcc:

Info: CUDA version is not detected from nvcc: Use the CPU version.

Info: Or you can use '-c 11' or '-c 12' to install the cuda-11 or cuda-12 version manually.

Info: Use b3405 GGML plugin

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WASI-Logging-Plugin

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Installation of wasmedge-0.13.5 successful

source /Users/4ge1/.wasmedge/env to use wasmedge binaries
* The wasmedge version 0.13.5 is installed in /Users/4ge1/.wasmedge/bin/wasmedge.

[+] Installing Qdrant binary...
* Use the cached Qdrant binary in /Users/4ge1/gaianet/bin

[+] Downloading LlamaEdge API server ...
######################################################################## 100.0%
######################################################################## 100.0%
* The rag-api-server.wasm and llama-api-server.wasm are downloaded in /Users/4ge1/gaianet

[+] Downloading dashboard ...
* Use the cached dashboard in /Users/4ge1/gaianet

* Use the cached registry.wasm in /Users/4ge1/gaianet

[+] Generating node ID ...
[2024-07-23 22:40:48.013] [error] loading failed: length out of bounds, Code: 0x27
[2024-07-23 22:40:48.015] [error] Bytecode offset: 0x00002632
[2024-07-23 22:40:48.015] [error] At AST node: code section
[2024-07-23 22:40:48.015] [error] At AST node: module
[2024-07-23 22:40:48.015] [error] File name: "/Users/4ge1/gaianet/registry.wasm"
4ge1@4ge1deMacBook-Pro ~ % gaianet init
zsh: command not found: gaianet
4ge1@4ge1deMacBook-Pro ~ % curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash

 ██████╗  █████╗ ██╗ █████╗ ███╗   ██╗███████╗████████╗
██╔════╝ ██╔══██╗██║██╔══██╗████╗  ██║██╔════╝╚══██╔══╝
██║  ███╗███████║██║███████║██╔██╗ ██║█████╗     ██║
██║   ██║██╔══██║██║██╔══██║██║╚██╗██║██╔══╝     ██║
╚██████╔╝██║  ██║██║██║  ██║██║ ╚████║███████╗   ██║
 ╚═════╝  ╚═╝  ╚═╝╚═╝╚═╝  ╚═╝╚═╝  ╚═══╝╚══════╝   ╚═╝

[+] Installing gaianet CLI tool ...
######################################################################## 100.0%
* gaianet CLI tool is installed in /Users/4ge1/gaianet/bin

[+] Downloading default config.json ...
* Use the cached config file in /Users/4ge1/gaianet

[+] Installing WasmEdge with wasi-nn_ggml plugin ...
Info: Detected Darwin-arm64

No root permissions.
Installation path found at /Users/4ge1/.wasmedge
Removing /Users/4ge1/.wasmedge//bin/wasmedge
Removing /Users/4ge1/.wasmedge//bin/wasmedgec
Removing /Users/4ge1/.wasmedge//include/wasmedge/
Removing /Users/4ge1/.wasmedge//include/wasmedge/wasmedge.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/version.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum.inc
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum_configure.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum_errcode.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/int128.h
Removing /Users/4ge1/.wasmedge//include/wasmedge/enum_types.h
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.dylib
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.0.3.tbd
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.tbd
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.dylib
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.0.3.dylib
Removing /Users/4ge1/.wasmedge//lib/libwasmedge.0.tbd
Removing /Users/4ge1/.wasmedge/plugin/ggml-common.h
Removing /Users/4ge1/.wasmedge/plugin/ggml-metal.metal
Removing /Users/4ge1/.wasmedge/plugin/libwasmedgePluginWasiLogging.dylib
Removing /Users/4ge1/.wasmedge/plugin/libwasmedgePluginWasiNN.dylib
Removing /Users/4ge1/.wasmedge/env
Removing /Users/4ge1/.wasmedge/bin
Removing /Users/4ge1/.wasmedge/include/wasmedge
Removing /Users/4ge1/.wasmedge/plugin
Removing /Users/4ge1/.wasmedge/lib
Removing /Users/4ge1/.wasmedge/include
Removing /Users/4ge1/.wasmedge
cat: /Users/4ge1/.zshrc: No such file or directory
Info: WasmEdge Installation at /Users/4ge1/.wasmedge

Info: Fetching WasmEdge-0.13.5

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WasmEdge-GGML-Plugin

Info: Detected CUDA version from nvcc:

Info: CUDA version is not detected from nvcc: Use the CPU version.

Info: Or you can use '-c 11' or '-c 12' to install the cuda-11 or cuda-12 version manually.

Info: Use b3405 GGML plugin

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Fetching WASI-Logging-Plugin

~/gaianet/tmp ~/gaianet
######################################################################## 100.0%
~/gaianet
Info: Installation of wasmedge-0.13.5 successful

source /Users/4ge1/.wasmedge/env to use wasmedge binaries
* The wasmedge version 0.13.5 is installed in /Users/4ge1/.wasmedge/bin/wasmedge.

[+] Installing Qdrant binary...
* Use the cached Qdrant binary in /Users/4ge1/gaianet/bin

[+] Downloading LlamaEdge API server ...
######################################################################## 100.0%
######################################################################## 100.0%
* The rag-api-server.wasm and llama-api-server.wasm are downloaded in /Users/4ge1/gaianet

[+] Downloading dashboard ...
* Use the cached dashboard in /Users/4ge1/gaianet

* Use the cached registry.wasm in /Users/4ge1/gaianet

[+] Generating node ID ...
[2024-07-23 22:43:05.562] [error] loading failed: length out of bounds, Code: 0x27
[2024-07-23 22:43:05.563] [error] Bytecode offset: 0x00002632
[2024-07-23 22:43:05.563] [error] At AST node: code section
[2024-07-23 22:43:05.563] [error] At AST node: module
[2024-07-23 22:43:05.563] [error] File name: "/Users/4ge1/gaianet/registry.wasm"
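The "length out of bounds" load error, together with the earlier curl: (92) PROTOCOL_ERROR, suggests registry.wasm was only partially downloaded through the proxy and the installer then kept reusing the truncated cached copy. A hedged recovery sketch: delete the cached file so the installer fetches it again:

# Remove the truncated cache, then re-run the installer so registry.wasm is downloaded fresh.
rm -f ~/gaianet/registry.wasm
curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash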

Failed to remove the default collection.

When I execute the command "gaianet init", it shows:

[+] Checking the config.json file ...

[+] Downloading Phi-3-mini-4k-instruct-Q5_K_M.gguf ...
* Using the cached Phi-3-mini-4k-instruct-Q5_K_M.gguf in /home/shy/gaianet

[+] Downloading all-MiniLM-L6-v2-ggml-model-f16.gguf ...
* Using the cached all-MiniLM-L6-v2-ggml-model-f16.gguf in /home/shy/gaianet

[+] Creating 'default' collection in the Qdrant instance ...
* Start a Qdrant instance ...

* Remove the existed 'default' Qdrant collection ...

  Failed to remove the default collection. 
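If the script keeps failing at this step, the collection can also be removed directly through Qdrant's REST API (assuming the local instance listens on the default port 6333), after which gaianet init can be re-run:

# Delete the 'default' collection via the Qdrant REST API, then retry initialization.
curl -X DELETE 'http://localhost:6333/collections/default'
gaianet init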

macbook

  • Run the command 'source /var/root/.shrc' to make the gaianet CLI tool available in the current shell;
  • Run the command 'gaianet init' to initialize the GaiaNet node.

mac:~ root# source /var/root/.shrc
-sh: /var/root/.shrc: No such file or directory

Facing Issue with RAG

I was running Llama-3-8B-Instruct as a GaiaNet node with my self-created snapshot, which I created without any issues.
This snapshot was created by chunking the book The Rust Programming Language; the chunked txt file used to create the snapshot can be seen here.

Also, I did not face any issues with the gaianet init and gaianet start commands:

akshat@akshat:~/gaianet$ gaianet init
[+] Checking the config.json file ...

[+] Downloading Meta-Llama-3-8B-Instruct-Q5_K_M.gguf ...
* Using the cached Meta-Llama-3-8B-Instruct-Q5_K_M.gguf in /home/akshat/gaianet

[+] Downloading nomic-embed-text-v1.5.f16.gguf ...
* Using the cached nomic-embed-text-v1.5.f16.gguf in /home/akshat/gaianet

[+] Creating 'default' collection in the Qdrant instance ...
* Start a Qdrant instance ...

* Remove the existed 'default' Qdrant collection ...

* Download Qdrant collection snapshot ...

######################################################################### 100.0%######################################################################### 100.0%
The snapshot is downloaded in /home/akshat/gaianet

* Import the Qdrant collection snapshot ...
  The process may take a few minutes. Please wait ...
  Recovery is done!

[+] Preparing the dashboard ...
You already have a private key.

[+] Preparing the GaiaNet domain ...

[+] COMPLETED! GaiaNet node is initialized successfully.

To start the GaiaNet node, run the command: gaianet start <<<

akshat@akshat:~/gaianet$ gaianet start
[+] Checking the config.json file ...

You already have a private key.
[+] Starting Qdrant instance ...

Qdrant instance started with pid: 6684

[+] Starting LlamaEdge API Server ...

Run the following command to start the LlamaEdge API Server:

wasmedge --dir .:./dashboard --env NODE_VERSION=0.1.3 --nn-preload default:GGML:AUTO:Meta-Llama-3-8B-Instruct-Q5_K_M.gguf --nn-preload embedding:GGML:AUTO:nomic-embed-text-v1.5.f16.gguf rag-api-server.wasm --model-name Code_explanation_LLAMA-3-8B-Instruct-GGUF,Nomic-embed-text-v1.5 --ctx-size 4096,512 --batch-size 4096,512 --prompt-template llama-3-chat,embedding --rag-policy system-message --qdrant-collection-name default --qdrant-limit 1 --qdrant-score-threshold 0.5 --web-ui ./ --socket-addr 0.0.0.0:8080 --rag-prompt "You are an expert of the Rust language.\n----------------\n"

LlamaEdge API Server started with pid: 7146

Verify the LlamaEdge API Server. Please wait seconds ...

[+] Starting gaianet-domain ...

gaianet-domain started with pid: 7164

The GaiaNet node is started at: https://0x25b3d984d5c18b672383cc4f6827af11468fa58d.us.gaianet.network

To stop the GaiaNet node, run the command: gaianet stop <<<

akshat@akshat:~/gaianet$

However, I am constantly facing this issue: my node works for general queries, but for specific queries (whenever I mention Rust in the query) the node uses computing resources yet doesn't produce any response.
[screenshot: 2024-07-05 16-35-51]
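One hedged thing to check is whether the Rust chunks really made it into the 'default' collection and whether retrieval returns anything at the configured cutoff: the start command above runs with --qdrant-limit 1 and --qdrant-score-threshold 0.5, so borderline matches can yield an empty or overly long context. The collection can be inspected directly on the node:

# Show collection status and point count for the local Qdrant instance.
curl -s 'http://localhost:6333/collections/default'
# Scroll a few stored points to confirm the Rust chunks and their payloads were imported.
curl -s -X POST 'http://localhost:6333/collections/default/points/scroll' \
  -H 'Content-Type: application/json' \
  -d '{"limit": 3, "with_payload": true}'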

ERROR: GaiaNet node not starting

  • Start a Qdrant instance ...

    • Remove the existed 'default' Qdrant collection ...

    • Download Qdrant collection snapshot ...
      ############################################################################ 100.0% The snapshot is downloaded in /root/gaianet

    • Import the Qdrant collection snapshot ...
      The process may take a few minutes. Please wait ...
      Recovery is done!

[+] Preparing the dashboard ...
You already have a private key.

[+] Preparing the GaiaNet domain ...

[+] COMPLETED! GaiaNet node is initialized successfully.

To start the GaiaNet node, run the command: gaianet start <<<

root@vmi1735894:~# gaianet start
[+] Checking the config.json file ...

[+] Starting Qdrant instance ...

Qdrant instance started with pid: 6147

[+] Starting LlamaEdge API Server ...

Port 8080 is in use. Exit ...

Stopping Qdrant instance ...

/root/gaianet/bin/gaianet: line 550: 6147 Killed nohup $qdrant_executable > $log_dir/start-qdrant.log 2>&1 (wd: /gaianet/qdrant)
root@vmi1735894:~# --port
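"Port 8080 is in use" means another process already listens on the LlamaEdge port. Two hedged options: find and stop the conflicting process, or change the "llamaedge_port" value in config.json to a free port before starting again:

# See which process is holding port 8080 (either command works).
ss -ltnp | grep ':8080'
lsof -i :8080
# Or edit ~/gaianet/config.json, set "llamaedge_port" to a free port, then:
gaianet start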

404 page not found

After a day of normal operation, the link suddenly broke. After I restarted, the page showed "404 page not found". How can I fix it?
[screenshots]

[Error] Failed to recover from the collection snapshot.

[screenshot]

Describe the bug
I encountered an error while initializing GaiaNet. The process fails with a "Service internal error: Tokio task join error."

To Reproduce
Steps to reproduce the behavior:

  1. Run gaianet init
  2. Wait for the process to download necessary files and create a collection in the Qdrant instance.
  3. The error occurs during the import of the Qdrant collection snapshot.
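The "Tokio task join error" is returned by Qdrant while it restores the snapshot and often points at a corrupt or truncated snapshot file, or at running out of memory or disk during recovery. A hedged way to reproduce just this step is to call Qdrant's snapshot-recovery API directly against the downloaded file (the file name below is a placeholder; use the one reported by gaianet init):

# Ask the local Qdrant instance to recover the 'default' collection from the downloaded snapshot file.
curl -s -X PUT 'http://localhost:6333/collections/default/snapshots/recover' \
  -H 'Content-Type: application/json' \
  -d '{"location": "file:///root/gaianet/<downloaded-snapshot-file>"}'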

* Failed to start LlamaEdge API Server after 3 retries.

Mac M1, 16 GB RAM, macOS 14.5
[wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:132: server_version: 0.7.3
[2024-07-16 14:29:04.884] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:140: model_name: Phi-3-mini-4k-instruct,Nomic-embed-text-v1.5
[2024-07-16 14:29:04.884] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:148: model_alias: default,embedding
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:162: ctx_size: 4096,512
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:176: batch_size: 4096,512
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:190: prompt_template: phi-3-chat,embedding
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:199: rag_prompt: You are a tour guide in Paris, France. Use information in the following context to directly answer the question from a Paris visitor.\n----------------\n
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:220: qdrant_url: http://127.0.0.1:6333
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:223: qdrant_collection_name: default
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:226: qdrant_limit: 1
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:229: qdrant_score_threshold: 0.5
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:240: chunk_capacity: 100
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] server_config: rag_api_server in src/main.rs:243: rag_policy: system-message
[2024-07-16 14:29:04.885] [wasi_logging_stdout] [info] llama-core: llama_core in /home/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/llama-core-0.12.1/src/lib.rs:496: Initializing the core context for RAG scenarios
[2024-07-16 14:29:04.885] [info] [WASI-NN] GGML backend: LLAMA_COMMIT a59f8fdc
[2024-07-16 14:29:04.885] [info] [WASI-NN] GGML backend: LLAMA_BUILD_NUMBER 3358

Completely remove Gaianet Node from Linux server?

Dear team, I'm looking to completely remove Gaianet Node and all its added packages from my Debian VPS.

I would like to do a clean install without any remnant of the old installation.

Is there a guide or a set of commands to do that?

Thank you very much.
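One hedged approach, based purely on the install paths that appear in the logs above (double-check every path before deleting anything):

# Stop the node, then remove the directories the installer created.
gaianet stop
rm -rf ~/gaianet ~/.wasmedge
# Finally, remove the PATH / wasmedge "source ..." lines the installer added to your shell profile (e.g. ~/.bashrc).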
