Hi, I'm Moemu. I'm a high-school senior who has dabbled in a few programming languages and is a half-baked NLPer; right now I'm building the Muice chat bot.
Python (intermediate); HTML/CSS/JavaScript
Email: [email protected]
Web: 沐の空间
Muice, an AI girl who will start chatting with you on her own
License: MIT License
Linux support, please!
A clear and concise description.
All console output from when the project starts running; remember to redact your personal information.
Tell us how to reproduce the issue.
Commands entered: git clone https://github.com/Moemu/Muice-Chatbot
cd Muice-Chatbot
conda create --name Muice python=3.10.10
conda activate Muice
pip install -r requirements.txt
Console output: git : The term 'git' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ CategoryInfo          : ObjectNotFound: (git:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
Any ideas? Thanks!
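The PowerShell message means `git` is not installed or not on `PATH`. A minimal check, assuming Git for Windows can be installed via winget (any installer works):

```shell
# Confirm git is reachable from the current shell; if this fails,
# install Git for Windows (e.g. `winget install --id Git.Git -e`)
# and reopen the terminal so PATH is refreshed.
git --version
```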
Two weeks ago, go-cqhttp's author Mrs4s released v1.2.0 and updated the docs to state that go-cqhttp is no longer maintained; around the same time, the signing-server author deleted their repo and a string of backends went down with it.
See: Mrs4s/go-cqhttp#2471
Asking the author how you plan to handle this (senior year is rough; a freshman wishes you good luck on the gaokao)
Environment
Windows 11 Pro, conda 23.5.2
GPU
NVIDIA GeForce RTX 4060
Steps to reproduce
After installing all components, open Windows Terminal in Muice-Chatbot-main and run conda activate Muice followed by python main.py; this raises an AssertionError: Torch not compiled with CUDA enabled.
File structure
├─Muice-Chatbot-main  <- project root
│ ├─__pycache__
│ ├─llm
│ ├─model
│ │ ├─chatglm2-6b
│ │ └─Muice
│ ├─qqbot
│ └─src
├─main.py  <- main script
└─...
Full output
Microsoft Windows [Version 10.0.22621.2134]
(c) Microsoft Corporation. All rights reserved.
C:\Users\[已略去]\Desktop\Muice\Muice-Chatbot-main>conda activate Muice
(Muice) C:\Users\[已略去]\Desktop\Muice\Muice-Chatbot-main>python main.py
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 7/7 [00:08<00:00, 1.17s/it]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at model/chatglm2-6b and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Traceback (most recent call last):
File "C:\Users\[已略去]\Desktop\Muice\Muice-Chatbot-main\main.py", line 10, in <module>
model = Model('model/chatglm2-6b','model/Muice')
File "C:\Users\[已略去]\Desktop\Muice\Muice-Chatbot-main\llm\chatglm2.py", line 14, in __init__
model = AutoModel.from_pretrained(model_path, config=config, trust_remote_code=True).half().to(device)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\transformers\modeling_utils.py", line 1900, in to
return super().to(*args, **kwargs)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 1145, in to
return self._apply(convert)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
module._apply(fn)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
module._apply(fn)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
module._apply(fn)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 820, in _apply
param_applied = fn(param)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 1143, in convert
return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
File "C:\Users\[已略去]\.conda\envs\Muice\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
(Muice) C:\Users\[已略去]\Desktop\Muice\Muice-Chatbot-main>
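The AssertionError means the installed PyTorch wheel is a CPU-only build. One common fix is to reinstall a CUDA wheel, e.g. `pip install torch --index-url https://download.pytorch.org/whl/cu118`; alternatively, a sketch of defensive device selection (an assumption about the project's intent, not its actual code):

```python
import torch

# Fall back to CPU when the installed torch build has no CUDA support;
# .to(device) then never raises "Torch not compiled with CUDA enabled".
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device.type)
```

On CPU the 6B model will be slow, but loading at least succeeds.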
The idea of an "AI girl" is certainly lovely, perhaps even a little romantic, but an AI system is a collection of programs and algorithms designed to perform specific tasks. It has no self-awareness, emotions, or autonomy. At the technical level it is more accurate to regard it as a machine or a tool. Granting it a personified identity can improve the user experience (calling the improvement "dramatic" would not be an overstatement), but as a developer you must consider the extreme case in which users become addicted and treat the AI as an emotional dependency. AI cannot, after all, replace what humans provide in emotional attachment, so I hope you will make this clear in the project's documentation and/or build similar guardrails into the model (ChatGPT's answers to such questions may serve as a reference).
After running python main.py, pycqbot loads normally but then hangs at device lock is disable. http api may fail, as shown below:
PS C:\Users\[已略去]\Desktop\Muice> conda activate Muice
PS C:\Users\[已略去]\Desktop\Muice> python main.py
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 7/7 [00:08<00:00, 1.25s/it]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at model/chatglm2-6b and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[2023-09-09 02:49:43,201][MainThread/INFO] PyCqBot: 创建定时任务 CreateANewTopic
--- Logging error ---
Traceback (most recent call last):
File "C:\Users\[已略去]\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1103, in emit
stream.write(msg + self.terminator)
UnicodeEncodeError: 'cp950' codec can't encode character '\u521b' in position 56: illegal multibyte sequence
Call stack:
File "C:\Users\[已略去]\Desktop\Muice\main.py", line 16, in <module>
qqbot(muice, Trust_QQ_list=configs['Trust_QQ_list'], AutoCreateTopic=configs['AutoCreateTopic'])
File "C:\Users\[已略去]\Desktop\Muice\qqbot.py", line 53, in qqbot
bot.timing(CreateANewChat,"CreateANewTopic",{"timeSleep":60})
File "C:\Users\[已略去]\AppData\Local\Programs\Python\Python310\lib\site-packages\pycqBot\cqHttpApi.py", line 638, in timing
logging.info("创建定时任务 %s " % timing_name)
Message: '创建定时任务 CreateANewTopic '
Arguments: ()
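The "Logging error" is an encoding problem, not a bot problem: a cp950 (Traditional-Chinese Windows) console cannot encode the Chinese log text. A sketch of one workaround, assuming Python ≥ 3.7 where standard text streams expose `reconfigure()`:

```python
import logging
import sys

# Re-encode stdout as UTF-8 so Chinese log messages no longer trip
# the cp950 codec on a zh-TW Windows console.
sys.stdout.reconfigure(encoding="utf-8", errors="replace")
logging.basicConfig(stream=sys.stdout, level=logging.INFO)

logging.info("创建定时任务 CreateANewTopic")  # now encodes cleanly
```

Setting the environment variable `PYTHONIOENCODING=utf-8` before launching achieves the same thing without touching code.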
#################################################################
██████╗ ██╗ ██╗ ██████╗ ██████╗ ██████╗ ██████╗ ████████╗
██╔══██╗╚██╗ ██╔╝██╔════╝██╔═══██╗██╔══██╗██╔═══██╗╚══██╔══╝
██████╔╝ ╚████╔╝ ██║ ██║ ██║██████╔╝██║ ██║ ██║
██╔═══╝ ╚██╔╝ ██║ ██║▄▄ ██║██╔══██╗██║ ██║ ██║
██║ ██║ ╚██████╗╚██████╔╝██████╔╝╚██████╔╝ ██║
╚═╝ ╚═╝ ╚═════╝ ╚══▀▀═╝ ╚═════╝ ╚═════╝ ╚═╝
v0.5.1.1 BY FengLiu
#################################################################
[2023-09-09 12:00:17,557][go-cqhttp/WARNING] PyCqBot: go-cqhttp 警告 Protocol -> device lock is disable. http api may fail.
However, launching go-cqhttp directly from PowerShell logs in fine, as shown below:
[2023-09-09 12:05:11] [INFO]: 当前版本:v1.1.0
[2023-09-09 12:05:11] [INFO]: 将使用 device.json 内的设备信息运行Bot.
[2023-09-09 12:05:11] [INFO]: 使用服务器 http://127.0.0.1:13579 进行数据包签名
[2023-09-09 12:05:11] [INFO]: Bot将在5秒后登录并开始信息处理, 按 Ctrl+C 取消.
[2023-09-09 12:05:16] [INFO]: 开始尝试登录并同步消息...
[2023-09-09 12:05:16] [INFO]: 使用协议: iPad 8.9.33.614
[2023-09-09 12:05:16] [INFO]: Protocol -> connect to server: [::ffff:14.22.9.53]:443
[2023-09-09 12:05:17] [WARNING]: Protocol -> device lock is disable. http api may fail.
[2023-09-09 12:05:19] [INFO]: 正在检查协议更新...
[2023-09-09 12:05:19] [INFO]: 登录成功 欢迎使用: [QQ昵稱]
[2023-09-09 12:05:19] [INFO]: 开始加载好友列表...
[2023-09-09 12:05:19] [INFO]: 共加载 [已略去] 个好友.
[2023-09-09 12:05:19] [INFO]: 开始加载群列表...
[2023-09-09 12:05:19] [INFO]: 共加载 [已略去] 个群.
[2023-09-09 12:05:19] [INFO]: 资源初始化完成, 开始处理信息.
[2023-09-09 12:05:19] [INFO]: アトリは、高性能ですから!
[2023-09-09 12:05:19] [INFO]: 正在检查更新.
[2023-09-09 12:05:19] [INFO]: 检查更新完成. 当前已运行最新版本.
[2023-09-09 12:05:19] [INFO]: 开始诊断网络情况
[2023-09-09 12:05:21] [INFO]: 网络诊断完成. 未发现问题
Muice-Chatbot root directory:
Inside the chatglm2-6b-int4 folder under model:
Inside the Muice folder under model:
The main.py file:
If the chatglm2-6b-int4 folder is placed inside the ChatGLM2-6B folder, the int4 model runs normally, so the model itself isn't the problem.
I downloaded the Muice-Chatbot source from GitHub and accidentally overwrote my modified main.py.
After modifying main.py earlier it ran successfully once, then failed because go-cqhttp couldn't connect. I later overwrote my modified main.py with the original by mistake and can't remember what I had typed, so it won't run any more. Loading the model still fails even with absolute paths. Hope you can look into it when you have time.
Also, with Tencent's crackdown, go-cq is probably done for; please consider a way to hook up NTQQ instead.
Thanks for all the work, and good luck getting into your dream university on the gaokao!
Rewrote main.py to provide a OneBot service. It doesn't support commands or proactive conversation yet; that should be sorted out in a future update.
Hi, could you share the training data?
The original ChatGLM2 works, but this fine-tuned model throws an error:
Traceback (most recent call last):
File "D:\chatglm\web_demo.py", line 6, in <module>
tokenizer = AutoTokenizer.from_pretrained("D:\chatglm\models\Muice", trust_remote_code=True)
File "D:\Programs Files\Python\Python311\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 678, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "D:\Programs Files\Python\Python311\Lib\site-packages\transformers\tokenization_utils_base.py", line 1825, in from_pretrained
return cls._from_pretrained(
File "D:\Programs Files\Python\Python311\Lib\site-packages\transformers\tokenization_utils_base.py", line 1988, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "C:\Users\ZhaoYunFeng/.cache\huggingface\modules\transformers_modules\Muice\tokenization_chatglm.py", line 69, in __init__
super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=False, **kwargs)
TypeError: transformers.tokenization_utils.PreTrainedTokenizer.__init__() got multiple values for keyword argument 'clean_up_tokenization_spaces'
tokenizer = AutoTokenizer.from_pretrained("D:\chatglm\models\Muice", trust_remote_code=True)
model = AutoModel.from_pretrained("D:\chatglm\models\Muice", trust_remote_code=True).cuda()
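The TypeError happens because the checkpoint's bundled `tokenization_chatglm.py` passes `clean_up_tokenization_spaces` explicitly while the same key also arrives through `**kwargs` from the saved tokenizer config. Pinning an older transformers release (e.g. `pip install "transformers==4.30.2"`, an assumption about version compatibility) usually avoids it; below is a self-contained sketch of the collision and the one-line patch, using stand-in functions rather than the real transformers API:

```python
# Stand-ins for PreTrainedTokenizer.__init__ and the checkpoint's subclass;
# the names here are illustrative, not the real transformers surface.
def base_init(clean_up_tokenization_spaces=True, **kwargs):
    return {"cleanup": clean_up_tokenization_spaces, **kwargs}

def chatglm_init(**kwargs):
    # The fix: drop the duplicate key before forwarding, exactly as one
    # would patch tokenization_chatglm.py line 69 in place.
    kwargs.pop("clean_up_tokenization_spaces", None)
    return base_init(clean_up_tokenization_spaces=False, **kwargs)

# Without the pop(), this call is the "got multiple values" error;
# with it, the explicit argument wins and loading proceeds.
cfg = chatglm_init(clean_up_tokenization_spaces=True, padding_side="left")
print(cfg)  # {'cleanup': False, 'padding_side': 'left'}
```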
huggingface.co can't be reached reliably from mainland China, which makes deployment hard. Would it be possible to download all the files and deploy manually? (One-click deployment would be even better. Thanks for your hard work!) Also, are you considering support for NTQQ messages or a NoneBot plugin?
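For readers hitting the same network problem: a commonly used workaround (assumption: the community mirror hf-mirror.com is reachable from your network) is to point huggingface_hub at a mirror before anything from transformers is imported, or to copy the downloaded files into the local model folder and load with `local_files_only=True`:

```python
import os

# Must be set before transformers/huggingface_hub are imported;
# HF_ENDPOINT redirects all hub downloads to the mirror.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

# Alternatively, with the files already on disk, no network is touched:
#   AutoModel.from_pretrained("model/chatglm2-6b",
#                             trust_remote_code=True, local_files_only=True)
print(os.environ["HF_ENDPOINT"])
```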
(ノಠ益ಠ)ノ彡┻━┻
(╯°□°)╯︵( .o.)
(ノ°益°)ノ彡
(ノಥ,_」ಥ)ノ彡┻━┻
┻━┻ ︵ヽ(Д´)ノ︵ ┻━┻ (╯°□°)╯︵ ┻━┻ ლ(ಠ益ಠლ) (ノಠ ∩ಠ)ノ彡( o°o) (╯ರ ~ ರ)╯︵ ┻━┻ (╯°Д°)╯︵/(.□ . ) (╯°□°)╯︵ ┻━┻ (ノ`⌒´)ノ┫:・’.::・┻┻:・’.::・ (╯°□°)╯︵ ┬─┻ ┻━┻ ︵ヽ(Д´)ノ︵ ┻━┻
Posting like this makes the bug hard to track.
None
You're back!
Ah... no, not really, especially since she's an AI
(at least in Chinese)
Using the glm2-6b from 2.2
Very nice work! Love from China! (jk)
Hello, I found the project through the Bilibili video and have a thought:
a 16 GB VRAM requirement still feels too steep for most people.
Have you considered model quantization? Quantizing the weights to 4-bit cuts their VRAM footprint to roughly a quarter of fp16.
Looking forward to your new version!
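Back-of-the-envelope arithmetic supports the suggestion: at ChatGLM2-6B's published parameter count of about 6.2 billion, the fp16 weights alone need about 12 GiB, the int4 weights about 3 GiB (activations and KV cache come on top of this):

```python
# Weight-only VRAM estimate for a ~6.2e9-parameter model.
params = 6.2e9
fp16_gib = params * 2 / 2**30    # fp16: 2 bytes per weight
int4_gib = params * 0.5 / 2**30  # int4: 0.5 bytes per weight
print(f"fp16 ~ {fp16_gib:.1f} GiB, int4 ~ {int4_gib:.1f} GiB")
```

The chatglm2-6b-int4 checkpoint mentioned elsewhere in this thread, or calling `.quantize(4)` on the fp16 model (exposed by THUDM's remote code), are the usual ways to get there.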
Commands entered: git clone https://github.com/Moemu/Muice-Chatbot
cd Muice-Chatbot
conda create --name Muice python=3.10.10
conda activate Muice
pip install -r requirements.txt
Console: Downloading sentencepiece-0.1.99-cp310-cp310-win_amd64.whl (977 kB)
ERROR: Exception:
Traceback (most recent call last):
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 438, in _error_catcher
    yield
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 561, in read
    data = self._fp_read(amt) if not fp_closed else b""
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 527, in _fp_read
    return self._fp.read(amt) if amt is not None else self._fp.read()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\cachecontrol\filewrapper.py", line 90, in read
    data = self.__fp.read(amt)
  File "C:\Users\HP\.conda\envs\Muice\lib\http\client.py", line 465, in read
    s = self.fp.read(amt)
  File "C:\Users\HP\.conda\envs\Muice\lib\socket.py", line 705, in readinto
    return self._sock.recv_into(b)
  File "C:\Users\HP\.conda\envs\Muice\lib\ssl.py", line 1274, in recv_into
    return self.read(nbytes, buffer)
  File "C:\Users\HP\.conda\envs\Muice\lib\ssl.py", line 1130, in read
    return self._sslobj.read(len, buffer)
TimeoutError: The read operation timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\cli\base_command.py", line 180, in exc_logging_wrapper
    status = run_func(*args)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\cli\req_command.py", line 248, in wrapper
    return func(self, options, args)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\commands\install.py", line 377, in run
    requirement_set = resolver.resolve(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 92, in resolve
    result = self._result = resolver.resolve(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 546, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 397, in resolve
    self._add_to_criteria(self.state.criteria, r, parent=None)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 173, in _add_to_criteria
    if not criterion.candidates:
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\structs.py", line 156, in __bool__
    return bool(self._sequence)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 155, in __bool__
    return any(self)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 143, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 47, in _iter_built
    candidate = func()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 206, in _make_candidate_from_link
    self._link_candidate_cache[link] = LinkCandidate(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 293, in __init__
    super().__init__(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 156, in __init__
    self.dist = self._prepare()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 225, in _prepare
    dist = self._prepare_distribution()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 304, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 538, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 609, in _prepare_linked_requirement
    local_file = unpack_url(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 166, in unpack_url
    file = get_http_url(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 107, in get_http_url
    from_path, content_type = download(link, temp_dir.path)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\network\download.py", line 147, in __call__
    for chunk in chunks:
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\cli\progress_bars.py", line 53, in _rich_progress_bar
    for chunk in iterable:
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\network\utils.py", line 63, in response_chunks
    for chunk in response.raw.stream(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 622, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 560, in read
    with self._error_catcher():
  File "C:\Users\HP\.conda\envs\Muice\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 443, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
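The ReadTimeoutError is a network stall while downloading from files.pythonhosted.org, not a dependency problem. Retrying with a longer timeout and a domestic mirror usually gets through (assumption: the TUNA mirror is reachable from your network; any PyPI mirror works):

```shell
# Persist a mirror and a longer socket timeout in pip's user config,
# then re-run `pip install -r requirements.txt` as before.
pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple
pip config set global.timeout 120
```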
At runtime, after go-cqhttp is invoked it freezes once the update check fails:
PS C:\Gu\word\Muice-Chatbot> python main.py
在09.10更新中, 将使用(QQ号).json的方式来存储聊天记录, 对于重新拉取的源码, 可能会出现记忆缺失的情况, 对此, 请手动重命名memory下的chat_memory.json文件, 以便恢复记忆
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 7/7 [00:11<00:00, 1.61s/it]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at model/chatglm2-6b and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[2023-09-17 12:34:18,746][MainThread/INFO] PyCqBot: 创建定时任务 CreateANewTopic
#################################################################
██████╗ ██╗ ██╗ ██████╗ ██████╗ ██████╗ ██████╗ ████████╗
██╔══██╗╚██╗ ██╔╝██╔════╝██╔═══██╗██╔══██╗██╔═══██╗╚══██╔══╝
██████╔╝ ╚████╔╝ ██║ ██║ ██║██████╔╝██║ ██║ ██║
██╔═══╝ ╚██╔╝ ██║ ██║▄▄ ██║██╔══██╗██║ ██║ ██║
██║ ██║ ╚██████╗╚██████╔╝██████╔╝╚██████╔╝ ██║
╚═╝ ╚═╝ ╚═════╝ ╚══▀▀═╝ ╚═════╝ ╚═════╝ ╚═╝
v0.5.1.1 BY FengLiu
#################################################################
[2023-09-17 12:34:19,184][go-cqhttp/WARNING] PyCqBot: go-cqhttp 警告 自动注册实例已关闭,请配置 sign-server 端自动注册实例以保持正常签名
[2023-09-17 12:34:25,023][go-cqhttp/WARNING] PyCqBot: go-cqhttp 警告 Protocol -> device lock is disabled. HTTP API may fail.
[2023-09-17 12:34:27,818][go-cqhttp/WARNING] PyCqBot: go-cqhttp 警告 检查更新失败!
Just deploy on Win10 following the normal steps.
Windows 10 22H1; the latest go-cqhttp release; CUDA and cuDNN at the highest versions this machine supports, i.e. 12.1; qsign deployed via the one-click setup from https://github.com/rhwong/unidbg-fetch-qsign-onekey/tree/main; QQ version 8.9.63.
Also, thanks for the explanation.
[enhancement] Could you add an option to call an API, i.e. use a third-party hosted service instead of one's own machine?
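A sketch of what the request could mean in practice: instead of running a local model, reply generation would POST an OpenAI-style chat payload to a hosted endpoint. Everything here (endpoint URL, model name, key handling) is hypothetical, not current Muice-Chatbot behaviour:

```python
import json

API_URL = "https://api.example.com/v1/chat/completions"  # hypothetical proxy

payload = {
    "model": "some-hosted-model",  # hypothetical model name
    "messages": [
        {"role": "system", "content": "你是沐雪"},
        {"role": "user", "content": "早上好!"},
    ],
}

# The actual call would be e.g. requests.post(API_URL, json=payload,
# headers={"Authorization": f"Bearer {api_key}"}); omitted here so the
# sketch stays runnable without a key or network access.
print(json.dumps(payload, ensure_ascii=False, indent=2))
```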