Comments (34)
Can confirm what @llimllib said. I had to do:
pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama
After that it works for me.
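If it helps, here is a quick sanity check that the editable install took effect (the exact field name varies by pip version; newer pip prints "Editable project location"):

```shell
# After `pip install -e pyllama`, the package's location should point at
# the cloned checkout rather than site-packages.
pip show pyllama | grep -i location
```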
I also had another issue with py_itree, as reported here. I think this is happening on Mac M1 machines.
This was fixed by uninstalling py_itree first, then installing it from source:
pip uninstall py_itree
pip install https://github.com/juncongmoo/itree/archive/refs/tags/v0.0.18.tar.gz
from pyllama.
Same problem on Mac M1
from pyllama.
I tried after installing wget and hiq-python but it still does not work.
I do not get any warnings.
❯ python3 -m llama.download --model_size 7B
❤️ Resume download is supported. You can ctrl-c and rerun the program to resume the downloading
Downloading tokenizer...
✅ pyllama_data/tokenizer.model
✅ pyllama_data/tokenizer_checklist.chk
Downloading 7B
✅ pyllama_data/7B/params.json
✅ pyllama_data/7B/checklist.chk
Checking checksums
from pyllama.
If you are on macOS there is a problem with the download_community.sh script that is called from download.py:
- it uses declare -A, which needs bash v4+, but macOS only comes with bash 3.x; the default shell on recent macOS is zsh, so we can install a newer bash without breaking things: just brew install bash
- but the download_community.sh script has #!/bin/bash at the top, which points to the system bash instead of the homebrew bash (which has been installed as the default; if I run bash --version I get version 5.2.15 now, but in a different location)
- we can fix this by changing the line at the top of download_community.sh to #!/usr/bin/env bash (should work for everyone with that change, I think?)
- now we just need to brew install wget
After these steps I am now in the process of downloading the 7B checkpoints 🎉
I'm on an M1, Ventura 13.3
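The bash-version issue above is easy to check directly; a minimal sketch (the paths and versions are the macOS defaults, not verified on every system):

```shell
# declare -A (associative arrays) is a bash 4+ feature; macOS's /bin/bash
# is 3.2, so this is the line that makes the script die there.
bash -c 'declare -A m=([key]=val); echo "${m[key]}"' \
  || echo "this bash is too old for declare -A"

# With a #!/usr/bin/env bash shebang, the first bash on PATH is used,
# which is homebrew's 5.x once `brew install bash` has run.
command -v bash
```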
from pyllama.
Works for Mac (Intel), empty folder for Mac (M1)
from pyllama.
Script failed in the verify function because md5sum was not available on my Mac M1. Fixed with brew install md5sha1sum.
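macOS ships BSD md5 rather than GNU md5sum, which is why brew install md5sha1sum (or coreutils) helps. A hypothetical portable wrapper, not taken from the repo's script, could look like:

```shell
# checksum: print just the MD5 hash of a file with whichever tool is present.
# md5sum exists on Linux (and on macOS after `brew install md5sha1sum` or
# coreutils); BSD md5 -q prints the bare hash.
checksum() {
  if command -v md5sum >/dev/null 2>&1; then
    md5sum "$1" | awk '{print $1}'
  else
    md5 -q "$1"
  fi
}
```

Usage: `checksum pyllama_data/tokenizer.model` prints a 32-character hex digest.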
from pyllama.
Same on Ubuntu, I can only get the tokenizer but not the model
from pyllama.
Clearing pyllama out from my site-packages
folder, cloning this repo, and running the same command as the previous comment in the root directory of this repository works as it is supposed to.
from pyllama.
✅ Worked on Mac (2012), OS Catalina, x86_64 architecture, Intel chip
- Clone repo:
pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama
cd pyllama
- Run:
llama/download_community.sh 7B /tmp/llama-models
🔴 However, using the python3 -m llama.download --model_size 7B --folder llama/ command, it fails with a recursion error.
% pipenv run python3 -m llama.download --model_size 7B --folder llama/
Traceback (most recent call last):
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/Users/my_user/pyllama/llama/download.py", line 87, in <module>
download(args)
File "/Users/my_user/pyllama/llama/download.py", line 20, in download
download(args)
File "/Users/my_user/pyllama/llama/download.py", line 20, in download
download(args)
File "/Users/my_user/pyllama/llama/download.py", line 20, in download
download(args)
[Previous line repeated 985 more times]
File "/Users/my_user/pyllama/llama/download.py", line 17, in download
retcode = hiq.execute_cmd(cmd, verbose=False, shell=True, runtime_output=True, env=os.environ)
File "/Users/my_user/.local/share/virtualenvs/my-env-cIchWPfI/lib/python3.9/site-packages/hiq/utils.py", line 101, in execute_cmd
proc = subprocess.Popen(
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/subprocess.py", line 951, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/subprocess.py", line 1737, in _execute_child
for k, v in env.items():
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/_collections_abc.py", line 851, in __iter__
for key in self._mapping:
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/os.py", line 701, in __iter__
yield self.decodekey(key)
File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/os.py", line 759, in decode
return value.decode(encoding, 'surrogateescape')
RecursionError: maximum recursion depth exceeded while calling a Python object
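The traceback shows download() in download.py re-invoking itself at line 20 whenever the shell command fails, so any persistent failure (e.g. wget missing from PATH) recurses until Python's ~1000-frame limit trips. A minimal sketch of a bounded retry instead (run_download_script is a hypothetical stand-in for the hiq.execute_cmd call, not the real pyllama code):

```python
def run_download_script(args):
    # Hypothetical stand-in for hiq.execute_cmd(cmd, ...); returns False
    # here to simulate a download that keeps failing (e.g. no wget).
    return False

def download(args, attempts=0, max_attempts=5):
    """Retry with an explicit bound. The original retried via unbounded
    self-recursion, which is what produces the RecursionError above."""
    if run_download_script(args):
        return True
    if attempts + 1 >= max_attempts:
        raise RuntimeError(
            "download failed %d times; is wget installed?" % max_attempts)
    return download(args, attempts + 1, max_attempts)
```

With a bound of 5, a persistent failure surfaces as a clear error message instead of a thousand-frame traceback.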
@Genie-Liu @AstroWa3l if you clone this repository and run llama/download_community.sh 7B /tmp/llama-models, hopefully you'll get the 7B model in /tmp/models. Looks like there hasn't been a new release yet, so this issue will persist at least until then
from pyllama.
Do you have wget installed? You need to install wget, otherwise you will only get an empty folder.
from pyllama.
I have wget installed and in my path. It doesn't work
from pyllama.
wget is on my path, and hiq-python is up to date. M1 Mac, Python 3.10, pyllama v0.0.18:
$ python -m llama.download --model_size 7B --folder llama
❤️ Resume download is supported. You can ctrl-c and rerun the program to resume the downloading
Downloading tokenizer...
✅ llama/tokenizer.model
✅ llama/tokenizer_checklist.chk
Downloading 7B
✅ llama/7B/params.json
✅ llama/7B/checklist.chk
Checking checksums
$ ls llama/
7B/
$ du -sh llama/
0B llama/
from pyllama.
On macOS running ARM/Apple Silicon (M1), the empty folder persists no matter what steps are listed here or in the linked commits.
from pyllama.
@llimllib I found one error after following your instructions: it was due to a missing md5sum package, which is needed to check hashes, so the download could not complete. I installed it and now it is downloading. Thank you!!!
from pyllama.
Just updated the code base. Can you reinstall pyllama and try?
from pyllama.
I just reinstalled and I still got the same behavior
from pyllama.
Same on Fedora
from pyllama.
Same problem for me here (Intel Mac)
from pyllama.
I cannot reproduce the error. It always works for me.
from pyllama.
Try pip install hiq-python -U?
from pyllama.
Same on Ubuntu, I can only get the tokenizer but not the model
I am using ubuntu and it works well for me though.
from pyllama.
Works well on Ubuntu; I pulled the latest code from the repo and reinstalled pyllama (pip install pyllama -U)
from pyllama.
Hopefully this is fixed following #70 and #71, let me know if you see any problems with the updated script!
from pyllama.
@llimllib I'm getting the same issue on windows
from pyllama.
Still encounter this problem after upgrading bash to v5.
from pyllama.
@Genie-Liu @AstroWa3l if you clone this repository and run llama/download_community.sh 7B /tmp/llama-models, hopefully you'll get the 7B model in /tmp/models. Looks like there hasn't been a new release yet, so this issue will persist at least until then
from pyllama.
@llimllib Unfortunately it did not fix the issue
from pyllama.
@AstroWa3l can you elaborate? Were there errors? What was the output?
from pyllama.
The script might benefit from having set -e at the top so it exits early instead of continuing after errors
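For reference, the usual strict-mode preamble (a common shell convention, not taken from this repo's script):

```shell
#!/usr/bin/env bash
# -e: exit on the first failing command instead of carrying on
# -u: treat references to unset variables as errors
# -o pipefail: a failure anywhere in a pipeline fails the whole pipeline
set -euo pipefail
```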
from pyllama.
@anentropic I do that with all my scripts, but I've never worked on this project before and wasn't sure how they were calling the script so I didn't add it
from pyllama.
Ah! I already had this via brew install coreutils
from pyllama.
@mldevorg Can you tell me all the steps that you took so that I can try to reproduce it? It doesn't work for me on Ubuntu.
from pyllama.
I tried the approach from @CefBoud on my MacBook Air M2 (24 GB); it solves the downloading loop mentioned above.
from pyllama.
Can confirm what @llimllib said. I had to do:
pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama
After that it works for me.
I also had another issue with py_itree, as reported here. I think this is happening on Mac M1 machines.
This was fixed by uninstalling py_itree first, then installing it from source:
pip uninstall py_itree
pip install https://github.com/juncongmoo/itree/archive/refs/tags/v0.0.18.tar.gz
Works for me, thanks 👍
from pyllama.
Related Issues (20)
- shape mismatch error
- RecursionError: maximum recursion depth exceeded while calling a Python object
- RuntimeError: Error(s) in loading state_dict for LLaMAForCausalLM: Unexpected key(s) in state_dict:
- Download watchdog kicking in? (M1 mac)
- Download 7B model seems stuck HOT 9
- evaluating has an extremely large value when quantize to 4bit. HOT 1
- NVMLError_NoPermission: Insufficient Permissions
- no module named llama HOT 1
- 12GB card HOT 2
- How to run an interactive mode in Jupyter?
- Quick Question
- torch.distributed.elastic.multiprocessing.errors.ChildFailedError HOT 1
- Question regarding EnergonAI repo
- Why are params.json empty? HOT 5
- Does this include the GPTQ quantization tricks?
- Randomly get shape mismatch error
- Try Modular - Mojo
- gptq github HOT 4
- parameter inncorrect when I run make command
- an operation was attempted on something that is not a socket