Comments (8)
When I have time, I will start working on LVM-related tasks, beginning with the LLaVA model :)
from comfyui_llm_party.
That’s really great, I also think that supporting LVM is an indispensable feature.
I’ve been a bit busy this past week, mainly spending time creating a tutorial video for this project and fixing some minor issues; only on the weekend did I have a substantial block of time. May I ask whether you have already started on the LVM-related PR? If so, I won’t duplicate the development. Instead, I plan to build something that packages a ComfyUI workflow behind an OpenAI-compatible interface, so that any software that can integrate with OpenAI can call user-defined workflows. If you haven’t started yet, I can begin writing one, and you can give feedback on it later.
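For context, a minimal sketch of what packaging a workflow behind an OpenAI-style interface could look like. Everything here is a hypothetical illustration, not the project’s actual API: `run_workflow` stands in for executing a ComfyUI workflow, and the response dict mimics the OpenAI chat-completions schema so existing OpenAI clients could consume it.

```python
import json
import time
import uuid


def run_workflow(prompt: str) -> str:
    """Hypothetical stand-in for executing a ComfyUI workflow.

    A real implementation would submit the workflow JSON to a running
    ComfyUI instance and collect the generated output.
    """
    return f"workflow output for: {prompt}"


def chat_completion(messages: list[dict]) -> dict:
    """Wrap a workflow result in an OpenAI chat-completions response shape,
    so any OpenAI-compatible client can treat the workflow as a model."""
    user_prompt = messages[-1]["content"]
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": "comfyui-workflow",
        "choices": [{
            "index": 0,
            "message": {
                "role": "assistant",
                "content": run_workflow(user_prompt),
            },
            "finish_reason": "stop",
        }],
    }


resp = chat_completion([{"role": "user", "content": "hello"}])
print(json.dumps(resp["choices"][0]["message"], ensure_ascii=False))
```

A real server would expose this behind a `/v1/chat/completions` HTTP endpoint; the key point is only the response shape.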
Unfortunately, I was in exactly the same situation and didn’t have time all week. And I expect next week will be full of work too. :(
On this topic, I was only able to check the nodes from https://github.com/gokayfem/ComfyUI_VLM_nodes and the nodes for the local version of Ollama - https://github.com/stavsap/comfyui-ollama
I can say that, specifically for situations where we use ComfyUI, the approach with a separate local server is not suitable: control over unloading the model is lost, and ComfyUI’s workflows end up waiting for the machine running the Ollama server to process the request.
This adds the requirement to either have a second computer, or to run Ollama on the CPU if a person has only one machine.
The approach you described, with a universal interface where some of the tasks can be outsourced to another service, is quite an interesting and good solution, imho :)
Sorry for the wait. I have adapted this model: llava-llama-3-8b-v1_1-gguf. The example workflow can be found here: start_with_LVM.json. Due to the use of llama_cpp_python code, it may not be perfectly compatible with MPS devices. You can see the adaptation code I made for different devices here. I’m not sure whether it will cause errors on macOS and MPS, and I would greatly appreciate your help. 🙂
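For reference, a hedged sketch of the kind of per-device branching involved (the parameter names follow llama-cpp-python’s `Llama` constructor; the project’s actual adaptation code may differ): on CUDA you can offload layers to the GPU via `n_gpu_layers`, while on CPU, or on an MPS setup where the Metal build is uncertain, keeping everything on the CPU is the safe default.

```python
def llama_kwargs(model_path: str, device: str) -> dict:
    """Choose llama-cpp-python Llama(...) arguments per device.

    n_gpu_layers=-1 offloads all layers to the GPU; 0 keeps inference
    on the CPU. On macOS, Metal support depends on how the
    llama_cpp_python wheel was built, so CPU is the safe fallback.
    """
    kwargs = {"model_path": model_path, "n_ctx": 4096}
    if device == "cuda":
        kwargs["n_gpu_layers"] = -1  # offload every layer to the GPU
    else:  # "cpu", or "mps" without a known-good Metal build
        kwargs["n_gpu_layers"] = 0
    return kwargs
```

These kwargs would then be passed as `Llama(**llama_kwargs(path, device))`.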
![image](https://private-user-images.githubusercontent.com/13381981/333775790-7d2a40c5-79bb-40b4-8843-8cdf4e13056c.png)
llama_cpp_python is working fine; there is just a problem with installing it from the default Python package index, PyPI
(but prebuilt wheels are available in the GH releases).
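For anyone hitting the same install problem: besides the wheels attached to the GitHub releases, llama-cpp-python also publishes prebuilt wheels on its own index (the URL pattern below follows the llama-cpp-python README at the time of writing; pick the backend that matches your hardware).

```shell
# CPU-only prebuilt wheel (avoids compiling llama.cpp locally)
pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu

# CUDA 12.1 build (adjust cu121 to your CUDA version)
pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu121
```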
Surprisingly, it worked with int4 - I didn’t even have to do anything :)
Good job, really.
Can the issue be closed now?
Just a note: it would be good not to display encoded image data in the "history".
A few big images of 10-25 MB each will make the history completely undisplayable in the browser.
The issue with the history has been resolved, and the download source for llama_cpp_python on macOS has been adjusted. The problem is now solved. I sincerely appreciate your assistance. Should you have any further recommendations, or if there is anything else you need, I would be grateful if you could reach out to me. :)