Comments (34)

michael-erasmus avatar michael-erasmus commented on July 24, 2024 12

Can confirm what @llimllib said. I had to do:

pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama

After that it works for me.

I also had another issue with py_itree, as reported here. I think this is happening on Mac M1 machines.

This was fixed by uninstalling py_itree first, then installing it from source:

pip uninstall py_itree
pip install https://github.com/juncongmoo/itree/archive/refs/tags/v0.0.18.tar.gz
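
If it still seems to pick up the old install, a quick way to confirm which copy of pyllama is actually being imported (just a sanity check, not something from the project docs):

pip show pyllama                                  # where pip thinks pyllama is installed
python -c "import llama; print(llama.__file__)"   # which copy actually gets imported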

lucascr91 avatar lucascr91 commented on July 24, 2024 10

Same problem on Mac M1.

lesurJ avatar lesurJ commented on July 24, 2024 9

I tried after installing wget and hiq-python, but it still does not work.

I do not get any warnings..

❯ python3 -m llama.download --model_size 7B
❤️ Resume download is supported. You can ctrl-c and rerun the program to resume the downloading
Downloading tokenizer...
✅ pyllama_data/tokenizer.model
✅ pyllama_data/tokenizer_checklist.chk
Downloading 7B
✅ pyllama_data/7B/params.json
✅ pyllama_data/7B/checklist.chk
Checking checksums

anentropic avatar anentropic commented on July 24, 2024 6

If you are on macOS there is a problem with the download_community.sh script that is called from download.py:

  • it uses declare -A, which needs bash v4+, but macOS only ships with bash 3.x
  • the default shell on recent macOS is zsh, so we can install a newer bash without breaking anything: just brew install bash
  • but download_community.sh has #!/bin/bash at the top, which points to the system bash instead of the Homebrew bash (which is now the one on my PATH; bash --version reports 5.2.15, but it lives in a different location)
    • we can fix this by changing the first line of download_community.sh to #!/usr/bin/env bash (that change should work for everyone, I think)
  • then we just need to brew install wget (a rough sketch of these steps follows this list)
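
A rough sketch of those steps (the sed edit is just a scripted version of the manual shebang change described above and assumes you are in a checkout of this repo):

/bin/bash --version    # the system bash macOS ships, stuck on 3.x
bash --version         # after brew install bash, should report 4+ (5.2.15 here)
brew install bash wget
# point the script at whichever bash is first on PATH instead of /bin/bash
sed -i '' '1s|^#!/bin/bash|#!/usr/bin/env bash|' llama/download_community.sh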

After these steps I am now in the process of downloading the 7B checkpoints 🎉

I'm on an M1, Ventura 13.3

txdywy avatar txdywy commented on July 24, 2024 4

Works for Mac (Intel); empty folder for Mac (M1).

CefBoud avatar CefBoud commented on July 24, 2024 3

The script failed in the verify function because md5sum was not available on my Mac M1. Fixed with brew install md5sha1sum.
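
A quick pre-flight check along those lines (just a sketch; the formula name is the one mentioned above):

# make sure md5sum exists before the script's verify step needs it
if ! command -v md5sum >/dev/null 2>&1; then
    brew install md5sha1sum
fi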

sharlec avatar sharlec commented on July 24, 2024 2

Same on Ubuntu; I can only get the tokenizer but not the model.

llimllib avatar llimllib commented on July 24, 2024 2

Clearing pyllama out of my site-packages folder, cloning this repo, and running the same command as in the previous comment from the root directory of this repository works as it is supposed to.

Divjyot avatar Divjyot commented on July 24, 2024 2

@llimllib

✅ Worked on a Mac (2012), OS Catalina, x86_64 architecture, Intel chip

  1. Clone the repo:
pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama
  2. cd pyllama
  3. Run llama/download_community.sh 7B /tmp/llama-models

🔴
However, using the python3 -m llama.download --model_size 7B --folder llama/ command, it fails with a recursion error.

% pipenv run python3 -m llama.download --model_size 7B --folder llama/
Traceback (most recent call last):
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/my_user/pyllama/llama/download.py", line 87, in <module>
    download(args)
  File "/Users/my_user/pyllama/llama/download.py", line 20, in download
    download(args)
  File "/Users/my_user/pyllama/llama/download.py", line 20, in download
    download(args)
  File "/Users/my_user/pyllama/llama/download.py", line 20, in download
    download(args)
  [Previous line repeated 985 more times]
  File "/Users/my_user/pyllama/llama/download.py", line 17, in download
    retcode = hiq.execute_cmd(cmd, verbose=False, shell=True, runtime_output=True, env=os.environ)
  File "/Users/my_user/.local/share/virtualenvs/my-env-cIchWPfI/lib/python3.9/site-packages/hiq/utils.py", line 101, in execute_cmd
    proc = subprocess.Popen(
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/subprocess.py", line 951, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/subprocess.py", line 1737, in _execute_child
    for k, v in env.items():
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/_collections_abc.py", line 851, in __iter__
    for key in self._mapping:
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/os.py", line 701, in __iter__
    yield self.decodekey(key)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/os.py", line 759, in decode
    return value.decode(encoding, 'surrogateescape')
RecursionError: maximum recursion depth exceeded while calling a Python object
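
For what it's worth, the repeated frames at llama/download.py line 20 show download(args) re-invoking itself after each failed attempt, so any persistent failure in the underlying command will eventually exhaust Python's recursion limit. Purely as an illustration of the pattern (not a patch to download.py), the same retry written with an explicit bound would look like:

# illustration only: bounded retry instead of unbounded self-recursion
max_attempts=5
attempt=1
until python3 -m llama.download --model_size 7B --folder llama/; do
    attempt=$((attempt + 1))
    if [ "$attempt" -gt "$max_attempts" ]; then
        echo "giving up after $max_attempts attempts" >&2
        break
    fi
done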

> @Genie-Liu @AstroWa3l if you clone this repository and run llama/download_community.sh 7B /tmp/llama-models hopefully you'll get the 7B model in /tmp/models. Looks like there hasn't been a new release yet, so this issue will persist at least until then

shadowwalker2718 avatar shadowwalker2718 commented on July 24, 2024 1

Do you have wget installed? You need to install wget, otherwise you will only get an empty folder.

lucascr91 avatar lucascr91 commented on July 24, 2024 1

I have wget installed and in my path. It doesn't work

llimllib avatar llimllib commented on July 24, 2024 1

wget is on my path, and hiq-python is up to date. m1 mac, python 3.10, pyllama v0.0.18:

$ python -m llama.download --model_size 7B --folder llama
❤️ Resume download is supported. You can ctrl-c and rerun the program to resume the downloading
Downloading tokenizer...
✅ llama/tokenizer.model
✅ llama/tokenizer_checklist.chk
Downloading 7B
✅ llama/7B/params.json
✅ llama/7B/checklist.chk
Checking checksums

$ ls llama/
7B/

$ du -sh llama/
  0B	llama/

AstroWa3l avatar AstroWa3l commented on July 24, 2024 1

On macOS running ARM/Apple Silicon (M1), the empty folder persists no matter which of the steps listed here or in the linked commits I follow.

AstroWa3l avatar AstroWa3l commented on July 24, 2024 1

@llimllib I found one error after following your instructions: it was due to a missing md5sum package, which is needed to check hashes, so the download could not complete on my system. I installed it and now it is downloading. Thank you!!!

juncongmoo avatar juncongmoo commented on July 24, 2024

Just updated the code base. Can you reinstall pyllama and try?

flyjgh avatar flyjgh commented on July 24, 2024

I just reinstalled and I still got the same behavior

ivanstepanovftw avatar ivanstepanovftw commented on July 24, 2024

Same in Fedora

lesurJ avatar lesurJ commented on July 24, 2024

Same problem for me here (Intel Mac).

juncongmoo avatar juncongmoo commented on July 24, 2024

I cannot reproduce the error. It always works for me.

juncongmoo avatar juncongmoo commented on July 24, 2024

Try pip install hiq-python -U ?

mldevorg avatar mldevorg commented on July 24, 2024

> Same on Ubuntu; I can only get the tokenizer but not the model.

I am using Ubuntu and it works well for me though.

alexch33 avatar alexch33 commented on July 24, 2024

Works well on Ubuntu; I pulled the latest code from the repo and reinstalled pyllama (pip install pyllama -U).

llimllib avatar llimllib commented on July 24, 2024

Hopefully this is fixed following #70 and #71, let me know if you see any problems with the updated script!

george-adams1 avatar george-adams1 commented on July 24, 2024

@llimllib I'm getting the same issue on windows

Genie-Liu avatar Genie-Liu commented on July 24, 2024

Still encounter this problem after upgrading bash to v5.

llimllib avatar llimllib commented on July 24, 2024

@Genie-Liu @AstroWa3l if you clone this repository and run llama/download_community.sh 7B /tmp/llama-models hopefully you'll get the 7B model in /tmp/models. Looks like there hasn't been a new release yet, so this issue will persist at least until then

AstroWa3l avatar AstroWa3l commented on July 24, 2024

@llimllib Unfortunately it did not fix the issue

llimllib avatar llimllib commented on July 24, 2024

@AstroWa3l can you elaborate? Were there errors? What was the output?

anentropic avatar anentropic commented on July 24, 2024

The script might benefit from having set -e at the top so that it exits early instead of continuing after errors.
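
For reference, the usual strict-mode preamble looks something like this (the -u and pipefail options go beyond what was suggested here, but are common companions):

#!/usr/bin/env bash
# stop at the first failing command, error on unset variables,
# and propagate failures from inside pipelines
set -euo pipefail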

llimllib avatar llimllib commented on July 24, 2024

@anentropic I do that with all my scripts, but I've never worked on this project before and wasn't sure how they were calling the script so I didn't add it

anentropic avatar anentropic commented on July 24, 2024

Ah! I already had this via brew install coreutils

Huasito-Appel avatar Huasito-Appel commented on July 24, 2024

@mldevorg Can you tell me all the steps that you took so that I can try to reproduce it? It doesn't work for me on Ubuntu.

GreysTone avatar GreysTone commented on July 24, 2024

I tried the approach from @CefBoud on my MacBook Air M2 (24 GB); it solves the downloading loop mentioned above.

yufanghui avatar yufanghui commented on July 24, 2024

The steps from @michael-erasmus above (reinstalling pyllama as an editable install from a clone, and reinstalling py_itree from the source tarball) work for me, thanks 👍
