
Comments (6)

shubham8550 commented on August 22, 2024

Refer to the source code: it's NOT supported yet.

constructor(model: string = 'gpt4all-lora-quantized', forceDownload: boolean = false, decoderConfig: Record<string, any> = {}) {
    this.model = model;
    this.decoderConfig = decoderConfig;
    /*
    allowed models:
    M1 Mac/OSX: cd chat;./gpt4all-lora-quantized-OSX-m1
    Linux: cd chat;./gpt4all-lora-quantized-linux-x86
    Windows (PowerShell): cd chat;./gpt4all-lora-quantized-win64.exe
    Intel Mac/OSX: cd chat;./gpt4all-lora-quantized-OSX-intel
    */
    if (
        'gpt4all-lora-quantized' !== model &&
        'gpt4all-lora-unfiltered-quantized' !== model
    ) {
        throw new Error(`Model ${model} is not supported. Current models supported are:
gpt4all-lora-quantized
gpt4all-lora-unfiltered-quantized`);
    }
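For reference, the check above boils down to a simple allow-list. A standalone sketch; the `isSupportedModel` and `assertSupported` helper names are mine, not part of gpt4all-ts:

```typescript
// Allow-list of model names accepted by the gpt4all-ts constructor
// (per the snippet above). These helpers are hypothetical, not the
// library's actual API.
const SUPPORTED_MODELS = [
  "gpt4all-lora-quantized",
  "gpt4all-lora-unfiltered-quantized",
] as const;

function isSupportedModel(model: string): boolean {
  return (SUPPORTED_MODELS as readonly string[]).includes(model);
}

// Anything else, e.g. a Vicuna model, is rejected:
function assertSupported(model: string): void {
  if (!isSupportedModel(model)) {
    throw new Error(`Model ${model} is not supported.`);
  }
}
```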

Yes, that's why I made one:

https://www.npmjs.com/package/vicuna-ts

from gpt4all-ts.

shubham8550 commented on August 22, 2024

Originally that's what I wanted, but gpt4all-ts's code would have become a mess if I had stuffed Vicuna into it.


shubham8550 commented on August 22, 2024

So I installed Vicuna with this repo:

https://github.com/mps256/autovicuna

It puts everything in C:/VICUNA
(13B Vicuna, CPU only)

It uses
llama.zip

and
ggml-vicuna-13b-4bit.bin

You can get these from

https://huggingface.co/eachadea/ggml-vicuna-13b-4bit/blob/main/ggml-vicuna-13b-4bit-rev1.bin

The "...rev1.bin" file has some fixes, but it's not included in mps256/autovicuna, I guess.

You can get the 7B model from the link below, and it runs on the same llama files:
https://huggingface.co/eachadea/ggml-vicuna-7b-4bit/tree/main

Here are some screenshots running 13B Vicuna:
[screenshot]

And here's 7B (it's faster and gives better answers than LLaMA/Alpaca, in my view):
[screenshot]

The output sometimes ends with:
[screenshot]
Maybe because of these:
Reverse prompt: '### Human:'
Reverse prompt: '### Instruction:'
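If those trailing reverse-prompt markers leak into the output, they can be trimmed before display. A minimal sketch; the `stripReversePrompts` helper name is mine, not from gpt4all-ts or vicuna-ts:

```typescript
// Reverse-prompt markers that llama.cpp-style runners watch for to hand
// control back to the user; they sometimes appear at the end of output.
const REVERSE_PROMPTS = ["### Human:", "### Instruction:"];

// Hypothetical helper: remove any trailing reverse-prompt marker,
// plus surrounding whitespace.
function stripReversePrompts(output: string): string {
  let text = output.trimEnd();
  for (const marker of REVERSE_PROMPTS) {
    if (text.endsWith(marker)) {
      text = text.slice(0, -marker.length).trimEnd();
    }
  }
  return text;
}
```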


shubham8550 commented on August 22, 2024
Sample command:

C:\VICUNA\main.exe -i --interactive-first -r "### Human:" -t 8 --temp 0 -c 2048 -n -1 --ignore-eos --repeat_penalty 1.2 --instruct -m C:\VICUNA\ggml-vicuna-7b-4bit-rev1.bin
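From Node/TypeScript, the same command can be launched with `child_process`. A sketch assuming the autovicuna paths above exist on your machine; the `vicunaArgs` helper is mine:

```typescript
import { spawn } from "node:child_process";

// Build the argument list for main.exe from the sample command above.
// Flag values mirror that command; tweak threads/context as needed.
function vicunaArgs(modelPath: string): string[] {
  return [
    "-i", "--interactive-first",
    "-r", "### Human:",        // reverse prompt: return control to the user
    "-t", "8",                 // CPU threads
    "--temp", "0",             // deterministic sampling
    "-c", "2048",              // context size
    "-n", "-1",                // generate until the reverse prompt fires
    "--ignore-eos",
    "--repeat_penalty", "1.2",
    "--instruct",
    "-m", modelPath,
  ];
}

// Usage (paths assumed from the autovicuna install above):
// const proc = spawn("C:\\VICUNA\\main.exe",
//   vicunaArgs("C:\\VICUNA\\ggml-vicuna-7b-4bit-rev1.bin"),
//   { stdio: "inherit" });
```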


jellydn commented on August 22, 2024

Refer to the source code: it's NOT supported yet.

constructor(model: string = 'gpt4all-lora-quantized', forceDownload: boolean = false, decoderConfig: Record<string, any> = {}) {
    this.model = model;
    this.decoderConfig = decoderConfig;
    /*
    allowed models:
    M1 Mac/OSX: cd chat;./gpt4all-lora-quantized-OSX-m1
    Linux: cd chat;./gpt4all-lora-quantized-linux-x86
    Windows (PowerShell): cd chat;./gpt4all-lora-quantized-win64.exe
    Intel Mac/OSX: cd chat;./gpt4all-lora-quantized-OSX-intel
    */
    if (
        'gpt4all-lora-quantized' !== model &&
        'gpt4all-lora-unfiltered-quantized' !== model
    ) {
        throw new Error(`Model ${model} is not supported. Current models supported are:
gpt4all-lora-quantized
gpt4all-lora-unfiltered-quantized`);
    }


jellydn commented on August 22, 2024

Refer to the source code: it's NOT supported yet.

constructor(model: string = 'gpt4all-lora-quantized', forceDownload: boolean = false, decoderConfig: Record<string, any> = {}) {
    this.model = model;
    this.decoderConfig = decoderConfig;
    /*
    allowed models:
    M1 Mac/OSX: cd chat;./gpt4all-lora-quantized-OSX-m1
    Linux: cd chat;./gpt4all-lora-quantized-linux-x86
    Windows (PowerShell): cd chat;./gpt4all-lora-quantized-win64.exe
    Intel Mac/OSX: cd chat;./gpt4all-lora-quantized-OSX-intel
    */
    if (
        'gpt4all-lora-quantized' !== model &&
        'gpt4all-lora-unfiltered-quantized' !== model
    ) {
        throw new Error(`Model ${model} is not supported. Current models supported are:
gpt4all-lora-quantized
gpt4all-lora-unfiltered-quantized`);
    }

Yes, that's why I made one:

https://www.npmjs.com/package/vicuna-ts

Nice 👍 Do you think it would be good for the community if your package were part of gpt4all-ts? Have you considered sending a PR to gpt4all-ts? Thanks.

