
machinelearningpractice's People

Contributors

ironspiderman


machinelearningpractice's Issues

chatglm_qa errors when uploading a text file

Traceback (most recent call last):
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\routes.py", line 534, in predict
output = await route_utils.call_process_api(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\route_utils.py", line 226, in call_process_api
output = await app.get_blocks().process_api(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\blocks.py", line 1550, in process_api
result = await self.call_function(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\blocks.py", line 1185, in call_function
prediction = await anyio.to_thread.run_sync(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\anyio\to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
result = context.run(func, *args)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\utils.py", line 661, in wrapper
response = f(*args, **kwargs)
File "E:\ChatGLM\MachineLearningPractice-main\03_上传文档问答.py", line 91, in add_file
documents = load_documents(directory)
File "E:\ChatGLM\MachineLearningPractice-main\03_上传文档问答.py", line 26, in load_documents
documents = loader.load()
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\langchain\document_loaders\directory.py", line 156, in load
self.load_file(i, p, docs, pbar)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\langchain\document_loaders\directory.py", line 105, in load_file
raise e
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\langchain\document_loaders\directory.py", line 99, in load_file
sub_docs = self.loader_cls(str(item), **self.loader_kwargs).load()
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\langchain\document_loaders\unstructured.py", line 86, in load
elements = self._get_elements()
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\langchain\document_loaders\unstructured.py", line 172, in _get_elements
return partition(filename=self.file_path, **self.unstructured_kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\partition\auto.py", line 434, in partition
elements = partition_text(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\partition\text.py", line 93, in partition_text
return _partition_text(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\documents\elements.py", line 526, in wrapper
elements = func(*args, **kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\file_utils\filetype.py", line 619, in wrapper
elements = func(*args, **kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\file_utils\filetype.py", line 574, in wrapper
elements = func(*args, **kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\chunking\__init__.py", line 69, in wrapper
elements = func(*args, **kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\partition\text.py", line 190, in _partition_text
element = element_from_text(ctext)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\partition\text.py", line 235, in element_from_text
elif is_possible_narrative_text(text):
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\partition\text_type.py", line 88, in is_possible_narrative_text
if "eng" in languages and (sentence_count(text, 3) < 2) and (not contains_verb(text)):
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\partition\text_type.py", line 190, in contains_verb
pos_tags = pos_tag(text)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\nlp\tokenize.py", line 44, in pos_tag
_download_nltk_package_if_not_present(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\unstructured\nlp\tokenize.py", line 21, in _download_nltk_package_if_not_present
nltk.find(f"{package_category}/{package_name}")
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\nltk\data.py", line 555, in find
return find(modified_name, paths)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\nltk\data.py", line 542, in find
return ZipFilePathPointer(p, zipentry)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\nltk\compat.py", line 41, in _decorator
return init_func(*args, **kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\nltk\data.py", line 394, in __init__
zipfile = OpenOnDemandZipFile(os.path.abspath(zipfile))
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\nltk\compat.py", line 41, in _decorator
return init_func(*args, **kwargs)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\nltk\data.py", line 935, in __init__
zipfile.ZipFile.__init__(self, filename)
File "C:\Users\S3111\.conda\envs\chatglm\lib\zipfile.py", line 1269, in __init__
self._RealGetContents()
File "C:\Users\S3111\.conda\envs\chatglm\lib\zipfile.py", line 1336, in _RealGetContents
raise BadZipFile("File is not a zip file")
zipfile.BadZipFile: File is not a zip file

Traceback (most recent call last):
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\queueing.py", line 407, in call_prediction
output = await route_utils.call_process_api(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\route_utils.py", line 226, in call_process_api
output = await app.get_blocks().process_api(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\blocks.py", line 1550, in process_api
result = await self.call_function(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\blocks.py", line 1199, in call_function
prediction = await utils.async_iteration(iterator)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\utils.py", line 519, in async_iteration
return await iterator.__anext__()
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\utils.py", line 512, in __anext__
return await anyio.to_thread.run_sync(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\anyio\to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
result = context.run(func, *args)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\utils.py", line 495, in run_sync_iterator_async
return next(iterator)
File "C:\Users\S3111\.conda\envs\chatglm\lib\site-packages\gradio\utils.py", line 649, in gen_wrapper
yield from f(*args, **kwargs)
File "E:\ChatGLM\MachineLearningPractice-main\03_上传文档问答.py", line 99, in bot
message = history[-1][0]
IndexError: list index out of range
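The second traceback (`message = history[-1][0]` raising IndexError) suggests the `bot` callback can be invoked with an empty chat history after the upload fails. A minimal guard, assuming Gradio's list-of-pairs chatbot history format (`bot` and `history` are names from the traceback; the helper below is a sketch, not the repo's code):

```python
def last_user_message(history):
    """Return the latest user message from a Gradio chatbot history.

    `history` is a list of [user_message, bot_message] pairs; when the
    upload handler fails, the bot callback may run with an empty list,
    and history[-1][0] then raises IndexError.
    """
    if not history:  # guard against the empty-history case
        return None
    return history[-1][0]
```

Inside the `bot` function, returning (or yielding the history unchanged) when this helper gives None avoids the crash.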

About the prompt

A quick question:

QA_CHAIN_PROMPT = PromptTemplate.from_template("""根据下面的上下文(context)内容回答问题。
如果你不知道答案,就回答不知道,不要试图编造答案。
答案最多3句话,保持答案简介。
总是在答案结束时说”谢谢你的提问!“
{context}
问题:{question}
""")

How does this prompt actually take effect? I don't see where it is used.
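For reference: a prompt template on its own does nothing; it only takes effect when a chain formats it. With LangChain's RetrievalQA that wiring is usually done through `chain_type_kwargs` — a hedged sketch, since `qa`, `llm`, and `db` are hypothetical names and the repo's actual wiring may differ:

```python
# A PromptTemplate is essentially a string with named placeholders;
# nothing happens until something fills them in. Plain-string demo:
template = """根据下面的上下文(context)内容回答问题。
{context}
问题:{question}
"""
filled = template.format(context="some retrieved text", question="a user question")

# In LangChain the template must be handed to the chain explicitly,
# otherwise the chain falls back to its built-in default prompt
# (requires langchain; variable names are hypothetical):
#
#   QA_CHAIN_PROMPT = PromptTemplate.from_template(template)
#   qa = RetrievalQA.from_chain_type(
#       llm=llm,
#       retriever=db.as_retriever(),
#       chain_type_kwargs={"prompt": QA_CHAIN_PROMPT},
#   )
```

If `QA_CHAIN_PROMPT` is defined but never passed in like this, the chain silently uses its default prompt instead.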

File is not a zip file

Traceback (most recent call last):
File "/data/glm/LLMKB.py", line 80, in <module>
documents = load_documents()
File "/data/glm/LLMKB.py", line 30, in load_documents
documents = loader.load()
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/langchain/document_loaders/directory.py", line 156, in load
self.load_file(i, p, docs, pbar)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/langchain/document_loaders/directory.py", line 105, in load_file
raise e
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/langchain/document_loaders/directory.py", line 99, in load_file
sub_docs = self.loader_cls(str(item), **self.loader_kwargs).load()
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/langchain/document_loaders/unstructured.py", line 86, in load
elements = self._get_elements()
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/langchain/document_loaders/unstructured.py", line 172, in _get_elements
return partition(filename=self.file_path, **self.unstructured_kwargs)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/partition/auto.py", line 434, in partition
elements = partition_text(
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/partition/text.py", line 95, in partition_text
return _partition_text(
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/documents/elements.py", line 526, in wrapper
elements = func(*args, **kwargs)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/file_utils/filetype.py", line 627, in wrapper
elements = func(*args, **kwargs)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/file_utils/filetype.py", line 582, in wrapper
elements = func(*args, **kwargs)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/chunking/__init__.py", line 71, in wrapper
elements = func(*args, **kwargs)
File "/data/liujiqiang/anaconda3/envs/glm4/lib/python3.10/site-packages/unstructured/partition/text.py", line 192, in _partition_text
element = element_from_text(ctext)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/partition/text.py", line 285, in element_from_text
elif is_possible_narrative_text(text):
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/partition/text_type.py", line 88, in is_possible_narrative_text
if "eng" in languages and (sentence_count(text, 3) < 2) and (not contains_verb(text)):
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/partition/text_type.py", line 190, in contains_verb
pos_tags = pos_tag(text)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/nlp/tokenize.py", line 44, in pos_tag
_download_nltk_package_if_not_present(
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/unstructured/nlp/tokenize.py", line 21, in _download_nltk_package_if_not_present
nltk.find(f"{package_category}/{package_name}")
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/nltk/data.py", line 555, in find
return find(modified_name, paths)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/nltk/data.py", line 542, in find
return ZipFilePathPointer(p, zipentry)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/nltk/compat.py", line 41, in _decorator
return init_func(*args, **kwargs)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/nltk/data.py", line 394, in __init__
zipfile = OpenOnDemandZipFile(os.path.abspath(zipfile))
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/nltk/compat.py", line 41, in _decorator
return init_func(*args, **kwargs)
File "/data/anaconda3/envs/glm/lib/python3.10/site-packages/nltk/data.py", line 935, in __init__
zipfile.ZipFile.__init__(self, filename)
File "/data/anaconda3/envs/glm/lib/python3.10/zipfile.py", line 1269, in __init__
self._RealGetContents()
File "/data/anaconda3/envs/glm/lib/python3.10/zipfile.py", line 1336, in _RealGetContents
raise BadZipFile("File is not a zip file")
zipfile.BadZipFile: File is not a zip file

How to reload the vector store

After uploading a new file, it is written to the vector store, but the newly added document vectors only take effect after the program restarts. What causes this?
Also, the parameter name passed to the ChatGLM function is wrong; it should be endpoint_url.
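A likely cause (an assumption, not verified against the repo's code) is a stale handle: the upload handler builds a fresh Chroma instance, while the serving chain's retriever stays bound to the old one. In LangChain, calling `db.add_documents(new_docs)` on the same Chroma instance the retriever was created from makes new vectors queryable without a restart. The handle problem itself can be shown in plain Python (`Store`, `LiveRetriever`, and `SnapshotRetriever` are illustrative names, not LangChain classes):

```python
class Store:
    """Stand-in for a vector store."""
    def __init__(self):
        self.docs = []

    def add_documents(self, docs):
        self.docs.extend(docs)

class LiveRetriever:
    """Keeps a reference to the store, so it sees new documents immediately."""
    def __init__(self, store):
        self.store = store

    def get(self):
        return list(self.store.docs)

class SnapshotRetriever:
    """Copies the documents at creation time, so it must be rebuilt after writes."""
    def __init__(self, store):
        self.docs = list(store.docs)

    def get(self):
        return list(self.docs)
```

In LangChain terms: keep one `db = Chroma(...)` and one `retriever = db.as_retriever()`, and route uploads through `db.add_documents(...)` rather than calling `Chroma.from_documents(...)` again on each upload.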

chatglm_qa reports the following error when running locally on the CPU

Traceback (most recent call last):
File "D:\MachineLearningPractice-main\chatglm_qa\chatglm_document_qa.py", line 69, in <module>
db = store_chroma(documents, embeddings)
File "D:\MachineLearningPractice-main\chatglm_qa\chatglm_document_qa.py", line 59, in store_chroma
db = Chroma.from_documents(docs, embeddings, persist_directory=persist_directory)
File "D:\Anaconda3\lib\site-packages\langchain_community\vectorstores\chroma.py", line 778, in from_documents
return cls.from_texts(
File "D:\Anaconda3\lib\site-packages\langchain_community\vectorstores\chroma.py", line 736, in from_texts
chroma_collection.add_texts(
File "D:\Anaconda3\lib\site-packages\langchain_community\vectorstores\chroma.py", line 275, in add_texts
embeddings = self._embedding_function.embed_documents(texts)
File "D:\Anaconda3\lib\site-packages\langchain_community\embeddings\huggingface.py", line 91, in embed_documents
embeddings = self.client.encode(texts, **self.encode_kwargs)
File "D:\Anaconda3\lib\site-packages\sentence_transformers\SentenceTransformer.py", line 153, in encode
self.to(device)
File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 989, in to
return self._apply(convert)
File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 641, in _apply
module._apply(fn)
File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 641, in _apply
module._apply(fn)
File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 641, in _apply
module._apply(fn)
[Previous line repeated 1 more time]
File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 664, in _apply
param_applied = fn(param)
File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 987, in convert
return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
File "D:\Anaconda3\lib\site-packages\torch\cuda\__init__.py", line 221, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
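The traceback shows sentence-transformers calling `self.to(device)` with a CUDA device while the installed torch is a CPU-only build. A hedged config sketch: pin the embedding model to the CPU through `HuggingFaceEmbeddings`' `model_kwargs`, which are forwarded to the `SentenceTransformer` constructor (the model-name placeholder should stay whatever the script already uses; the exact import path depends on your langchain version):

```python
from langchain_community.embeddings import HuggingFaceEmbeddings

# Forcing device="cpu" keeps torch from ever initializing CUDA
# on a CPU-only build.
embeddings = HuggingFaceEmbeddings(
    model_name="...",                # keep the repo's existing model name here
    model_kwargs={"device": "cpu"},  # passed through to SentenceTransformer(device=...)
)
```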
