
muice-chatbot's People

Contributors

moemu · moesnowyfox · turbohk


muice-chatbot's Issues

Problem encountered during setup

Commands entered:

git clone https://github.com/Moemu/Muice-Chatbot
cd Muice-Chatbot
conda create --name Muice python=3.10.10
conda activate Muice
pip install -r requirements.txt

Console output:

git : The term 'git' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1

Any help appreciated. Thanks!
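PowerShell's "not recognized" error means Git is either not installed or not on PATH. A minimal check, with the Windows install command left as a comment (the winget package id `Git.Git` is the usual one, shown here as an assumption):

```shell
# Confirm Git is reachable; this fails with the same "not recognized"
# message when Git is missing from PATH.
git --version

# If it fails on Windows, install Git and reopen the terminal, e.g.:
#   winget install --id Git.Git -e --source winget
```

After installing, open a fresh terminal so the updated PATH takes effect before re-running `git clone`.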

Thoughts on the future of the QQ integration

Two weeks ago Mrs4s, the author of go-cqhttp, released version 1.2.0 and updated the documentation to announce that go-cqhttp would no longer be maintained; around the same time the signature-server author deleted their repository, and a whole row of backends went down with it.
See: Mrs4s/go-cqhttp#2471
I'd like to hear how the author is thinking about this (senior year is tough; this first-year student wishes you success on the gaokao).

AssertionError: Torch not compiled with CUDA enabled

Environment
Windows 11 Pro, conda 23.5.2

GPU
NVIDIA GeForce RTX 4060

Steps to reproduce
After installing all components, open Windows Terminal in Muice-Chatbot-main and run `conda activate Muice` followed by `python main.py`; this raises the error AssertionError: Torch not compiled with CUDA enabled.

Directory layout

├─Muice-Chatbot-main   <- project root
│  ├─__pycache__
│  ├─llm
│  ├─model
│  │    ├─ chatglm2-6b
│  │    └─ Muice
│  ├─qqbot
│  └─src
├─main.py  <- entry point
└─...

Full output

Microsoft Windows [Version 10.0.22621.2134]
(c) Microsoft Corporation. All rights reserved.

C:\Users\[redacted]\Desktop\Muice\Muice-Chatbot-main>conda activate Muice

(Muice) C:\Users\[redacted]\Desktop\Muice\Muice-Chatbot-main>python main.py
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 7/7 [00:08<00:00,  1.17s/it]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at model/chatglm2-6b and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Traceback (most recent call last):
  File "C:\Users\[redacted]\Desktop\Muice\Muice-Chatbot-main\main.py", line 10, in <module>
    model = Model('model/chatglm2-6b','model/Muice')
  File "C:\Users\[redacted]\Desktop\Muice\Muice-Chatbot-main\llm\chatglm2.py", line 14, in __init__
    model = AutoModel.from_pretrained(model_path, config=config, trust_remote_code=True).half().to(device)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\transformers\modeling_utils.py", line 1900, in to
    return super().to(*args, **kwargs)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 1145, in to
    return self._apply(convert)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 797, in _apply
    module._apply(fn)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 820, in _apply
    param_applied = fn(param)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\nn\modules\module.py", line 1143, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "C:\Users\[redacted]\.conda\envs\Muice\lib\site-packages\torch\cuda\__init__.py", line 239, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

(Muice) C:\Users\[redacted]\Desktop\Muice\Muice-Chatbot-main>
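The traceback shows `.half().to(device)` hitting a CPU-only PyTorch build. A short diagnostic sketch (the cu118 index URL is PyTorch's official wheel index; adjust it to the installed CUDA version):

```python
import torch

# A CPU-only build raises exactly this AssertionError when .to("cuda") is
# called. Reinstalling a CUDA wheel is the usual fix, e.g.:
#   pip install torch --index-url https://download.pytorch.org/whl/cu118
print(torch.__version__)          # CPU-only builds often carry a "+cpu" tag
print(torch.cuda.is_available())  # False if the build has no CUDA support

# A defensive fallback for the device selection in chatglm2.py:
device = "cuda" if torch.cuda.is_available() else "cpu"
```

With the fallback in place the bot can at least start on CPU, though inference will be far slower than on the RTX 4060.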

[Bug]


After logging in via pygocq (using the Android watch protocol), messages are received correctly, but QQ never replies. CPU and memory load are both low, so the model does not appear to be running. Should I provide additional log files or anything else?

Should we give the AI a personified form of address?

The idea of an "AI girl" is certainly appealing, perhaps even a little romantic, but an artificial-intelligence system is a collection of program algorithms designed to perform specific tasks. It has no self-awareness, emotions, or autonomy. At the technical level it is more accurate to regard it as a machine or tool. Granting it a personified form of address can improve the user experience (calling the improvement "significant" would not be an overstatement), but as a developer you must consider the extreme case of users becoming immersed to the point of treating the AI as an emotional dependency. AI cannot, after all, replace the human experience of emotional attachment, so I hope you will make this clear in the project documentation and/or build similar limits into the model (ChatGPT's answers to questions of this kind may serve as a reference).

pycqBot cannot log in

After running python main.py, pycqBot loads normally but hangs forever at device lock is disable. http api may fail, as shown below:

PS C:\Users\[redacted]\Desktop\Muice> conda activate Muice
PS C:\Users\[redacted]\Desktop\Muice> python main.py
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 7/7 [00:08<00:00,  1.25s/it]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at model/chatglm2-6b and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[2023-09-09 02:49:43,201][MainThread/INFO] PyCqBot: 创建定时任务 CreateANewTopic
--- Logging error ---
Traceback (most recent call last):
  File "C:\Users\[redacted]\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1103, in emit
    stream.write(msg + self.terminator)
UnicodeEncodeError: 'cp950' codec can't encode character '\u521b' in position 56: illegal multibyte sequence
Call stack:
  File "C:\Users\[redacted]\Desktop\Muice\main.py", line 16, in <module>
    qqbot(muice, Trust_QQ_list=configs['Trust_QQ_list'], AutoCreateTopic=configs['AutoCreateTopic'])
  File "C:\Users\[redacted]\Desktop\Muice\qqbot.py", line 53, in qqbot
    bot.timing(CreateANewChat,"CreateANewTopic",{"timeSleep":60})
  File "C:\Users\[redacted]\AppData\Local\Programs\Python\Python310\lib\site-packages\pycqBot\cqHttpApi.py", line 638, in timing
    logging.info("创建定时任务 %s " % timing_name)
Message: '创建定时任务 CreateANewTopic '
Arguments: ()
Arguments: ()

#################################################################
██████╗ ██╗   ██╗ ██████╗ ██████╗ ██████╗  ██████╗ ████████╗
██╔══██╗╚██╗ ██╔╝██╔════╝██╔═══██╗██╔══██╗██╔═══██╗╚══██╔══╝
██████╔╝ ╚████╔╝ ██║     ██║   ██║██████╔╝██║   ██║   ██║
██╔═══╝   ╚██╔╝  ██║     ██║▄▄ ██║██╔══██╗██║   ██║   ██║
██║        ██║   ╚██████╗╚██████╔╝██████╔╝╚██████╔╝   ██║
╚═╝        ╚═╝    ╚═════╝ ╚══▀▀═╝ ╚═════╝  ╚═════╝    ╚═╝
                                            v0.5.1.1  BY FengLiu
#################################################################

[2023-09-09 12:00:17,557][go-cqhttp/WARNING] PyCqBot: go-cqhttp warning: Protocol -> device lock is disable. http api may fail.

However, if go-cqhttp is launched directly from PowerShell, login succeeds, as shown below:

[2023-09-09 12:05:11] [INFO]: Current version: v1.1.0
[2023-09-09 12:05:11] [INFO]: Running the bot with the device info in device.json.
[2023-09-09 12:05:11] [INFO]: Using server http://127.0.0.1:13579 for packet signing
[2023-09-09 12:05:11] [INFO]: Bot will log in and start processing messages in 5 seconds; press Ctrl+C to cancel.
[2023-09-09 12:05:16] [INFO]: Attempting login and message sync...
[2023-09-09 12:05:16] [INFO]: Using protocol: iPad 8.9.33.614
[2023-09-09 12:05:16] [INFO]: Protocol -> connect to server: [::ffff:14.22.9.53]:443
[2023-09-09 12:05:17] [WARNING]: Protocol -> device lock is disable. http api may fail.
[2023-09-09 12:05:19] [INFO]: Checking for protocol updates...
[2023-09-09 12:05:19] [INFO]: Login succeeded. Welcome: [QQ nickname]
[2023-09-09 12:05:19] [INFO]: Loading friend list...
[2023-09-09 12:05:19] [INFO]: Loaded [redacted] friends.
[2023-09-09 12:05:19] [INFO]: Loading group list...
[2023-09-09 12:05:19] [INFO]: Loaded [redacted] groups.
[2023-09-09 12:05:19] [INFO]: Resource initialization complete; starting message processing.
[2023-09-09 12:05:19] [INFO]: アトリは、高性能ですから! ("Because Atri is high-performance!")
[2023-09-09 12:05:19] [INFO]: Checking for updates.
[2023-09-09 12:05:19] [INFO]: Update check complete. Already running the latest version.
[2023-09-09 12:05:19] [INFO]: Starting network diagnostics
[2023-09-09 12:05:21] [INFO]: Network diagnostics complete. No problems found
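The "--- Logging error ---" block above is an encoding problem rather than a pycqBot bug: the console code page cp950 (Traditional Chinese Windows) cannot represent '创' (U+521B). A sketch of one workaround, assuming Python 3.7+ for `reconfigure()`; setting `PYTHONUTF8=1` before launching has a similar effect:

```python
import logging
import sys

# Force UTF-8 on the console streams so logging can emit the Chinese
# message that cp950 cannot encode.
sys.stdout.reconfigure(encoding="utf-8")
sys.stderr.reconfigure(encoding="utf-8")
logging.basicConfig(stream=sys.stdout, level=logging.INFO)

logging.info("创建定时任务 CreateANewTopic")  # now encodes without error
```

This only silences the logging error; whether it also unblocks the login hang is a separate question for go-cqhttp itself.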

[Bug] ERROR:root: Model failed to load; check that the base model and the fine-tuned model are placed under the model directory. Also, go-cqhttp seems to have stopped working

Describe the bug

Muice-Chatbot root directory:
[screenshot]
Inside the chatglm2-6b-int4 folder under model:
[screenshot]
Inside the Muice folder under model:
[screenshot]
main.py:
[screenshot]

Console output

[screenshot]
If the chatglm2-6b-int4 folder is placed inside the ChatGLM2-6B folder, the int4 model runs normally, so the model itself is not the problem.

Steps to reproduce

Downloaded the Muice-Chatbot source from GitHub and accidentally overwrote the already-modified main.py.

Additional information

After modifying main.py I got it to run successfully once, though it then failed because go-cqhttp could not connect. I later overwrote my modified main.py with the original by mistake, and I no longer remember what I had typed, so it will not run any more. I tried absolute paths, but the model still fails to load. I hope you can help track down the problem when you have time.
Also, because of Tencent's crackdown, go-cqhttp is probably unusable now; I hope you can look into integrating with NTQQ.
Thanks for your hard work, and good luck on the gaokao — may you get into the university of your dreams!
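For the path side of this report, a small guard like the following (directory names taken from this issue, not a canonical layout) makes the loader fail with an explicit message instead of the generic "model failed to load":

```python
from pathlib import Path

def check_model_dirs(base: str, finetune: str) -> None:
    """Raise early, with a clear message, if a model directory is missing.

    Mirrors what the loader's error hints at: both the base model and the
    fine-tuned weights must sit under ./model.
    """
    for p in (Path(base), Path(finetune)):
        if not p.is_dir():
            raise FileNotFoundError(f"expected model directory missing: {p}")

# Usage with the layout described in this issue (assumed, not canonical):
# check_model_dirs("model/chatglm2-6b-int4", "model/Muice")
```

Run from the repository root, since the paths are relative to the working directory.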

Support for the OneBot protocol

main.py has been rewritten to provide a OneBot service. Commands and proactive conversation are not supported yet; they should be added in future debugging.

[Bug] Cannot run the model (when used purely as a language model)

Describe the bug

The original ChatGLM2 works, but this fine-tuned model raises an error.

控制台输出

Traceback (most recent call last):
  File "D:\chatglm\web_demo.py", line 6, in <module>
    tokenizer = AutoTokenizer.from_pretrained("D:\chatglm\models\Muice", trust_remote_code=True)
  File "D:\Programs Files\Python\Python311\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 678, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "D:\Programs Files\Python\Python311\Lib\site-packages\transformers\tokenization_utils_base.py", line 1825, in from_pretrained
    return cls._from_pretrained(
  File "D:\Programs Files\Python\Python311\Lib\site-packages\transformers\tokenization_utils_base.py", line 1988, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "C:\Users\ZhaoYunFeng/.cache\huggingface\modules\transformers_modules\Muice\tokenization_chatglm.py", line 69, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=False, **kwargs)
TypeError: transformers.tokenization_utils.PreTrainedTokenizer.__init__() got multiple values for keyword argument 'clean_up_tokenization_spaces'

Steps to reproduce

tokenizer = AutoTokenizer.from_pretrained(r"D:\chatglm\models\Muice", trust_remote_code=True)
model = AutoModel.from_pretrained(r"D:\chatglm\models\Muice", trust_remote_code=True).cuda()
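This TypeError is the classic symptom of a newer transformers release that now passes `clean_up_tokenization_spaces` itself, colliding with the keyword the checkpoint's bundled `tokenization_chatglm.py` forwards. Pinning the release that upstream ChatGLM2-6B recommends is a common workaround (version not verified against this exact checkpoint):

```shell
pip install "transformers==4.30.2"
```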

Additional information

Can the Muice model be deployed locally without huggingface?

huggingface.co cannot be reached normally from mainland China, which makes deployment difficult. Can all the files be downloaded first and then deployed and launched manually? (One-click deployment would be even better — thanks for your hard work!) Also, do you have any plans to support NTQQ messages or a NoneBot plugin?
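Loading from a local directory already works with the transformers API: once the repository files are obtained by any means (a mirror, a manual download, a copy from another machine), pointing `from_pretrained` at the folder skips huggingface.co entirely. The file list below is an assumed minimal set for a ChatGLM-style checkpoint, not an authoritative one:

```python
from pathlib import Path

REQUIRED = ("config.json", "tokenization_chatglm.py")  # assumed minimal set

def ready_for_offline_load(model_dir: str) -> bool:
    """True when the directory looks like a complete local snapshot."""
    d = Path(model_dir)
    return d.is_dir() and all((d / name).exists() for name in REQUIRED)

# Once the files are in place, load without touching huggingface.co:
# from transformers import AutoTokenizer, AutoModel
# tok = AutoTokenizer.from_pretrained("model/Muice", trust_remote_code=True)
# mdl = AutoModel.from_pretrained("model/Muice", trust_remote_code=True)
```

The weight shards (`pytorch_model-*.bin` plus the index file) must of course also be present; they are omitted from `REQUIRED` only to keep the sketch short.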

What hurts most isn't that no girl will chat with me — it's that with 8 GB of VRAM even the AI ignores me

(ノಠ益ಠ)ノ彡┻━┻
(╯°□°)╯︵( .o.)
(ノ°益°)ノ彡
(ノಥ,_」ಥ)ノ彡┻━┻
┻━┻ ︵ヽ(Д´)ノ︵ ┻━┻ (╯°□°)╯︵ ┻━┻ ლ(ಠ益ಠლ) (ノಠ ∩ಠ)ノ彡( o°o) (╯ರ ~ ರ)╯︵ ┻━┻ (╯°Д°)╯︵/(.□ . ) (╯°□°)╯︵ ┻━┻ (ノ`⌒´)ノ┫:・’.::・┻┻:・’.::・ (╯°□°)╯︵ ┬─┻ ┻━┻ ︵ヽ(Д´)ノ︵ ┻━┻

[Model] The 2.2 model outputs unexpected English

Relevant context

Prompt that triggered the problem

你复活啦 ("You're back!")

Problematic answer

啊......没有吧,好像真的especially especially since she's an AI
("Ah... not really? It really seems especially especially since she's an AI")

Expected answer

(At least something in Chinese)

Additional information

Using 2.2's glm2-6b

Model quantization

Excellent work! Love from China! (half-joking

Hello, I came across the video on Bilibili and had a thought (

A 16 GB VRAM requirement still feels a bit steep for most people.
Have you considered model quantization? Quantizing to 4-bit would cut VRAM demand roughly in half compared with fp16 weights.
Looking forward to your next version!
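A back-of-envelope check supports the request: for the weights alone, 4-bit is a quarter of fp16 (runtime also needs activations and the KV cache, so the end-to-end saving is smaller, but still at least the half mentioned above). The parameter count below is the commonly cited approximate figure for ChatGLM2-6B, and upstream ChatGLM checkpoints document a `model.quantize(4)` helper for exactly this:

```python
def weight_gib(n_params: float, bits_per_param: int) -> float:
    """Approximate memory for the model weights alone, in GiB."""
    return n_params * bits_per_param / 8 / 2**30

# ChatGLM2-6B has roughly 6.2e9 parameters (approximate figure):
fp16_gib = weight_gib(6.2e9, 16)  # ≈ 11.5 GiB
int4_gib = weight_gib(6.2e9, 4)   # ≈ 2.9 GiB
```

An 8 GB card that cannot hold the fp16 weights therefore fits the 4-bit weights comfortably, with headroom left for activations.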

Download interruption causes an error

Commands entered:

git clone https://github.com/Moemu/Muice-Chatbot
cd Muice-Chatbot
conda create --name Muice python=3.10.10
conda activate Muice
pip install -r requirements.txt

Console output:

Downloading sentencepiece-0.1.99-cp310-cp310-win_amd64.whl (977 kB)
ERROR: Exception:
Traceback (most recent call last):
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 438, in _error_catcher
    yield
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 561, in read
    data = self._fp_read(amt) if not fp_closed else b""
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 527, in _fp_read
    return self._fp.read(amt) if amt is not None else self._fp.read()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\cachecontrol\filewrapper.py", line 90, in read
    data = self.__fp.read(amt)
  File "C:\Users\HP\.conda\envs\Muice\lib\http\client.py", line 465, in read
    s = self.fp.read(amt)
  File "C:\Users\HP\.conda\envs\Muice\lib\socket.py", line 705, in readinto
    return self._sock.recv_into(b)
  File "C:\Users\HP\.conda\envs\Muice\lib\ssl.py", line 1274, in recv_into
    return self.read(nbytes, buffer)
  File "C:\Users\HP\.conda\envs\Muice\lib\ssl.py", line 1130, in read
    return self._sslobj.read(len, buffer)
TimeoutError: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\cli\base_command.py", line 180, in exc_logging_wrapper
    status = run_func(*args)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\cli\req_command.py", line 248, in wrapper
    return func(self, options, args)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\commands\install.py", line 377, in run
    requirement_set = resolver.resolve(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 92, in resolve
    result = self._result = resolver.resolve(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 546, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 397, in resolve
    self._add_to_criteria(self.state.criteria, r, parent=None)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 173, in _add_to_criteria
    if not criterion.candidates:
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\resolvelib\structs.py", line 156, in __bool__
    return bool(self._sequence)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 155, in __bool__
    return any(self)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 143, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 47, in _iter_built
    candidate = func()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 206, in _make_candidate_from_link
    self._link_candidate_cache[link] = LinkCandidate(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 293, in __init__
    super().__init__(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 156, in __init__
    self.dist = self._prepare()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 225, in _prepare
    dist = self._prepare_distribution()
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 304, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 538, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 609, in _prepare_linked_requirement
    local_file = unpack_url(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 166, in unpack_url
    file = get_http_url(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\operations\prepare.py", line 107, in get_http_url
    from_path, content_type = download(link, temp_dir.path)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\network\download.py", line 147, in __call__
    for chunk in chunks:
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\cli\progress_bars.py", line 53, in _rich_progress_bar
    for chunk in iterable:
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_internal\network\utils.py", line 63, in response_chunks
    for chunk in response.raw.stream(
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 622, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 560, in read
    with self._error_catcher():
  File "C:\Users\HP\.conda\envs\Muice\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\HP\.conda\envs\Muice\lib\site-packages\pip\_vendor\urllib3\response.py", line 443, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
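The "Read timed out" from files.pythonhosted.org is a network interruption, not a dependency problem. Re-running with a longer socket timeout, optionally against a nearby mirror, usually completes the download (the mirror URL is one common choice, not an endorsement):

```shell
# Retry the install with a more generous socket timeout and a PyPI mirror
# reachable from mainland China.
pip install -r requirements.txt --timeout 120 \
    -i https://pypi.tuna.tsinghua.edu.cn/simple
```

pip also resumes cleanly on a plain retry, since already-downloaded wheels are cached.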

[Bug] When running with qsign, the bot hangs at "update check failed" after launching go-cqhttp

Describe the bug

At runtime, after go-cqhttp is invoked, it hangs at "update check failed".

Console output

PS C:\Gu\word\Muice-Chatbot> python main.py
Since the 09.10 update, chat history is stored as (QQ number).json. Freshly pulled source may therefore show memory loss; to recover, manually rename the chat_memory.json file under memory.
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 7/7 [00:11<00:00, 1.61s/it]
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at model/chatglm2-6b and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[2023-09-17 12:34:18,746][MainThread/INFO] PyCqBot: 创建定时任务 CreateANewTopic

#################################################################
██████╗ ██╗   ██╗ ██████╗ ██████╗ ██████╗  ██████╗ ████████╗
██╔══██╗╚██╗ ██╔╝██╔════╝██╔═══██╗██╔══██╗██╔═══██╗╚══██╔══╝
██████╔╝ ╚████╔╝ ██║     ██║   ██║██████╔╝██║   ██║   ██║
██╔═══╝   ╚██╔╝  ██║     ██║▄▄ ██║██╔══██╗██║   ██║   ██║
██║        ██║   ╚██████╗╚██████╔╝██████╔╝╚██████╔╝   ██║
╚═╝        ╚═╝    ╚═════╝ ╚══▀▀═╝ ╚═════╝  ╚═════╝    ╚═╝
                                            v0.5.1.1  BY FengLiu
#################################################################

[2023-09-17 12:34:19,184][go-cqhttp/WARNING] PyCqBot: go-cqhttp warning: automatic instance registration is disabled; configure automatic registration on the sign-server side to keep signing working normally
[2023-09-17 12:34:25,023][go-cqhttp/WARNING] PyCqBot: go-cqhttp warning: Protocol -> device lock is disabled. HTTP API may fail.
[2023-09-17 12:34:27,818][go-cqhttp/WARNING] PyCqBot: go-cqhttp warning: update check failed!

Steps to reproduce

Simply deploy on Windows 10 following the normal procedure.

Additional information

Windows 10 22H1; the latest go-cqhttp release; CUDA and cuDNN at the highest versions this machine supports, i.e. 12.1; qsign deployed with the one-click installer from the https://github.com/rhwong/unidbg-fetch-qsign-onekey/tree/main project; QQ version 8.9.63.

Thank you for your help.
