Comments (12)

bes-dev commented on August 27, 2024

@simona198710 you can find the .onnx checkpoints here: https://huggingface.co/bes-dev/stable-diffusion-v1-4-onnx
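For anyone wanting to grab those checkpoints programmatically, a minimal sketch using the huggingface_hub package (the package choice is an assumption here; downloading from the web UI works just as well):

```python
# Minimal sketch: download all files from the bes-dev/stable-diffusion-v1-4-onnx repo.
# Requires `pip install huggingface_hub`; the individual file names inside the repo
# are not listed here, so the whole snapshot is fetched.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="bes-dev/stable-diffusion-v1-4-onnx")
print("ONNX checkpoints downloaded to:", local_dir)
```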

bes-dev commented on August 27, 2024

@simona198710 thanks for your feedback. Actually, OpenVINO works pretty well on AMD hardware, so you can use our project on AMD too. As for the .onnx files, I have the ONNX IR for Stable Diffusion, but I currently have some problems with hosting the checkpoints (I used Google Drive to store them, but Google banned them due to too many downloads). I'll share the .onnx files as soon as I resolve the hosting issues.

bes-dev commented on August 27, 2024

@simona198710 I'll try to share the ONNX files later today. I'll let you know when they are ready.

Thanks for the nice idea about the Hugging Face Hub :)

simona198710 commented on August 27, 2024

@epi-morphism You probably did not include his edited version of transformers, which was also part of that repo earlier.
It seems it has been removed now, and a new branch is here without it:
https://github.com/harishanand95/diffusers/tree/dml
It would probably be better to open an issue about this on that repo rather than here, since this one is about the CPU.

Also relevant discussion here:
huggingface/diffusers#284

simona198710 commented on August 27, 2024

@bes-dev Sorry, by AMD hardware I was actually referring to AMD GPUs, since they cannot use CUDA.
onnxruntime has support for DirectML, which enables AMD GPUs to run the models, provided the opset version is 15 or lower.
If hosting is a problem, maybe sharing just the script for converting from PyTorch to ONNX (and IR) would be enough.
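A minimal sketch of checking both points mentioned above, i.e. whether the DirectML provider is available and which opset an exported model uses (the file name is a placeholder, and the onnx / onnxruntime packages are assumed to be installed):

```python
import onnx
import onnxruntime as ort

# "DmlExecutionProvider" only shows up when the onnxruntime-directml build is installed.
print("available providers:", ort.get_available_providers())

# Inspect the opset of an exported model; "unet.onnx" is a placeholder file name.
model = onnx.load("unet.onnx")
print("opset imports:", [(imp.domain, imp.version) for imp in model.opset_import])

# If DirectML is available and the opset is low enough, a session can be created like this:
sess = ort.InferenceSession(
    "unet.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
print("providers in use:", sess.get_providers())
```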

bes-dev commented on August 27, 2024

@simona198710 sounds good to me 👍 I used opset=14 for all models. Would you like to contribute an AMD InferenceEngine to our project? We can refactor our code to support different backends.
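The project's actual conversion code is not published at this point in the thread, but as a rough illustration of the opset=14 setting, here is a minimal, self-contained torch.onnx.export sketch with a stand-in module (nothing below is taken from the project's real conversion script):

```python
import torch

# Stand-in module; the real script exports the text encoder, UNet and VAE instead.
model = torch.nn.Sequential(torch.nn.Linear(768, 768), torch.nn.ReLU()).eval()
dummy_input = torch.randn(1, 77, 768)

torch.onnx.export(
    model,
    (dummy_input,),
    "model.onnx",
    input_names=["hidden_states"],
    output_names=["output"],
    opset_version=14,  # the opset bes-dev mentions using for all models
)
```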

simona198710 commented on August 27, 2024

@bes-dev Sure, if I am able to get them to run on an AMD GPU using ONNX, I would certainly be willing to open a pull request for it.
Regarding hosting, would putting the models on https://huggingface.co/ not work?

simona198710 commented on August 27, 2024

@bes-dev
Thank you, the models load when using ONNX on the CPU.

Unfortunately it gives:

RUNTIME_EXCEPTION : Exception during initialization: ... The parameter is incorrect.

when loading the VAE and UNet models on AMD using DirectML.

Hoping the cause is the same as:
microsoft/onnxruntime#12677

The text encoder works fine for inference using DirectML on the GPU, however.

Do you think it would be possible to share the steps or scripts you used to convert the models from PyTorch to ONNX?
It would probably be helpful for people using them with OpenVINO as well, especially since there are pruned variants of the original models available that people may want to convert for CPU usage.
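Since only the text encoder initializes under DirectML here, a per-model fallback like the sketch below keeps the pipeline usable while the VAE/UNet issue is investigated (the file names are placeholders, not the actual checkpoint names):

```python
import onnxruntime as ort

def load_session(path):
    """Try the DirectML execution provider first; fall back to CPU if initialization fails."""
    try:
        return ort.InferenceSession(path, providers=["DmlExecutionProvider"])
    except Exception as exc:  # e.g. RUNTIME_EXCEPTION: Exception during initialization
        print(f"DirectML failed for {path} ({exc}); falling back to CPU")
        return ort.InferenceSession(path, providers=["CPUExecutionProvider"])

# Placeholder file names for the three exported models.
text_encoder = load_session("text_encoder.onnx")
unet = load_session("unet.onnx")
vae_decoder = load_session("vae_decoder.onnx")
```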

bes-dev commented on August 27, 2024

@simona198710 Hmm, it looks like a problem on the DirectML side.

Our code to convert the models from PyTorch to ONNX is really ugly and tricky, but we plan to clean it up for publication soon.

kamalkraj commented on August 27, 2024

Hi @bes-dev,

Thank you so much for sharing the ONNX model.

It would be really great if you could share the ONNX conversion script for the UNet model.

Did you make any changes while converting? I am getting an unsupported opset error during conversion.

simona198710 commented on August 27, 2024

@kamalkraj This script works for converting the UNet:
https://github.com/harishanand95/diffusers/blob/main/examples/inference/save_onnx.py

epi-morphism commented on August 27, 2024

@simona198710 Does that really work for you? I'm getting "RuntimeError: Encountering a dict at the output of the tracer might cause the trace to be incorrect" with nothing changed in the script.
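For context, that RuntimeError comes from torch.jit.trace (used under the hood by torch.onnx.export) when the traced module returns a dict-like object. A common workaround is sketched below: wrap the UNet so the tracer sees a plain tensor. The argument names and the return_dict handling are assumptions that depend on the diffusers version in use:

```python
import torch

class UNetWrapper(torch.nn.Module):
    """Wrap the diffusers UNet so its traced output is a tensor instead of a dict-like object."""

    def __init__(self, unet):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states):
        # return_dict=False (where supported) makes the UNet return a tuple;
        # indexing [0] yields the predicted-noise tensor, which the tracer can handle.
        return self.unet(sample, timestep, encoder_hidden_states, return_dict=False)[0]
```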
