Tried autoeval to evaluate chatglm2-6b and baichuan-7b; both fail with the same error:

1. chatglm2-6b:
python3 autoeval.py --model chatglm2-6b --lora_path ../../models/chatglm2-6b --eval_data all --device cuda:0
cuda:0
Loading checkpoint shards: 100%|██████████| 7/7 [00:09<00:00, 1.31s/it]
Traceback (most recent call last):
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/peft/config.py", line 184, in _get_peft_type
    config_file = hf_hub_download(
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
    validate_repo_id(arg_value)
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 158, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '../../models/chatglm2-6b'. Use `repo_type` argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/wts/DISC-FinLLM/eval/evaluator/autoeval.py", line 45, in <module>
    llm = model_lists.get(model_name)(device, lora_path)
  File "/home/wts/DISC-FinLLM/eval/evaluator/finllm.py", line 33, in __init__
    self.model = PeftModel.from_pretrained(self.model, peft_model_id)
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/peft/peft_model.py", line 324, in from_pretrained
    PeftConfig._get_peft_type(
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/peft/config.py", line 190, in _get_peft_type
    raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
ValueError: Can't find 'adapter_config.json' at '../../models/chatglm2-6b'
2. baichuan-7b:
python3 autoeval.py --model baichuan-7b --lora_path ../../models/baichuan-7b --eval_data all --device cuda:0
cuda:0
Traceback (most recent call last):
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/peft/config.py", line 184, in _get_peft_type
    config_file = hf_hub_download(
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
    validate_repo_id(arg_value)
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 158, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '../../models/baichuan-7b'. Use `repo_type` argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/wts/DISC-FinLLM/eval/evaluator/autoeval.py", line 45, in <module>
    llm = model_lists.get(model_name)(device, lora_path)
  File "/home/wts/DISC-FinLLM/eval/evaluator/finllm.py", line 136, in __init__
    self.model = PeftModel.from_pretrained(self.model, peft_model_id)
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/peft/peft_model.py", line 324, in from_pretrained
    PeftConfig._get_peft_type(
  File "/home/wts/anaconda3/envs/DISC-FinLLM/lib/python3.10/site-packages/peft/config.py", line 190, in _get_peft_type
    raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
ValueError: Can't find 'adapter_config.json' at '../../models/baichuan-7b'
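My reading of both tracebacks (an assumption, not confirmed against the repo): `PeftModel.from_pretrained` expects its second argument to be a LoRA adapter, i.e. a local directory containing `adapter_config.json`, or a Hub repo id. Here `--lora_path` points at the base-model weights directory, so PEFT first tries it as a Hub repo id (the `HFValidationError`) and then fails to find `adapter_config.json` locally. A minimal sketch of the check PEFT effectively performs, with a hypothetical helper name:

```python
import os

def looks_like_peft_adapter(path: str) -> bool:
    """Hypothetical helper: PEFT resolves a local adapter directory by the
    presence of adapter_config.json; a base-model directory typically only
    has config.json, which triggers the ValueError shown above."""
    return os.path.isfile(os.path.join(path, "adapter_config.json"))

# A base-model directory such as ../../models/chatglm2-6b has no
# adapter_config.json, so this check would fail there.
```

If that is the cause, pointing `--lora_path` at an actual LoRA adapter directory (one that ships `adapter_config.json` and the adapter weights) rather than at the base model should get past this error.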