kim-dongjun / soft-truncation
A repo for "Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation"
License: Apache License 2.0
Thanks!
Hi, I found only the NLL results with variational dequantization in the big result table. Can you provide me with the results of uniform dequantization on CIFAR10 and IMAGENET32? Thanks a lot.
There are three ways to compute bpds for a trained model.
I have spent a few months trying to train the flow network for variational dequantization, but with almost all of the released and well-established codebases, I failed to train it successfully. By "successful" I mean that the bpd with variational dequantization improves over the bpd with uniform dequantization by around 0.10~0.14, as Song reported in his paper (https://arxiv.org/pdf/2101.09258.pdf). I only got a ~0.02 gain with Song's original flow network, and a ~0.05 gain with the best-performing implementation.
After this experience, I decided to stop delving into training the unstable flow network. Instead, I focused on the lossless bpd computation following Ho's original DDPM paper. However, it turned out that the lossless bpd is significantly worse than the bpd with uniform dequantization, and we suspect that our code is wrong.
The problem is that we cannot find anything wrong in our code. If anyone has succeeded with the lossless computation, please let me know your know-how.
As always, reviewers do not take this effort into account, yet it is extremely unfair to compare a bpd computed with uniform dequantization against prior works that report bpd with variational dequantization. Since this unfair comparison could be a cause of paper rejection, we are left to invest our precious time in training the variational flow network until it succeeds. This is a huge waste of time for fellow researchers. If the lossless bpd computation were reliable and stable, it would allow a fair comparison with prior works without any flow training! So let's find a way to compute the lossless bpd that is as cheap as the bpd with uniform dequantization.
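For reference, the uniform-dequantization bpd mentioned above can be sketched as follows. This is a minimal sketch, not code from this repo: `log_likelihood_fn` is a hypothetical stand-in for the model's continuous log-likelihood in nats (e.g. the probability-flow ODE likelihood of a score-based model).

```python
import numpy as np

def uniform_dequantization_bpd(log_likelihood_fn, x_uint8, rng=None):
    """Estimate bits/dim with uniform dequantization.

    log_likelihood_fn: hypothetical callable returning log p(y) in nats
        for a batch y scaled into [0, 1]^D.
    x_uint8: integer image batch, values in {0, ..., 255}.
    """
    rng = rng or np.random.default_rng(0)
    x = x_uint8.astype(np.float64)
    # Add uniform noise u ~ U[0, 1): E_u[log p(x + u)] lower-bounds
    # the discrete log-likelihood log P(x) (Theis et al., 2016).
    y = (x + rng.uniform(size=x.shape)) / 256.0
    d = x[0].size                       # dimensions per example
    nll_nats = -log_likelihood_fn(y)    # shape: (batch,)
    # nats -> bits, plus log2(256) = 8 bits/dim for the 1/256 rescaling.
    return nll_nats / (d * np.log(2.0)) + 8.0
```

Since the dequantized likelihood is a lower bound on the discrete likelihood, this bpd is an upper bound on the true discrete bpd, which is why variational dequantization (a tighter bound) reports lower numbers.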
Hi there,
For an ongoing study I would like to generate 100k FFHQ samples at 256 resolution with the model shown in Figure 15 of the paper and reported to get an FID of 5.54 here https://paperswithcode.com/sota/image-generation-on-ffhq-256-x-256.
Is this model publicly available? If not, are you able to point me towards 50-100k generated images?
Since E[-log p_θ(x_0)] <= E[-log p_θ(x_ε)] - E[log p_θ(x_0 | x_ε)] + E[log p(x_ε | x_0)], it seems that the entropy term E[log p(x_ε | x_0)] should appear in the residual term. However, I didn't find it in likelihood_residual_fn_lossless_compression.
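For context, the bound in question follows from Jensen's inequality applied to the variational decomposition over the slightly perturbed variable x_ε. Written out (a sketch of the standard derivation, not necessarily the exact form used in the code):

```latex
\log p_\theta(x_0)
  \;\ge\; \mathbb{E}_{p(x_\epsilon \mid x_0)}
    \bigl[\log p_\theta(x_\epsilon)
          + \log p_\theta(x_0 \mid x_\epsilon)
          - \log p(x_\epsilon \mid x_0)\bigr],
\quad\text{so}\quad
\mathbb{E}[-\log p_\theta(x_0)]
  \;\le\; \mathbb{E}[-\log p_\theta(x_\epsilon)]
        - \mathbb{E}[\log p_\theta(x_0 \mid x_\epsilon)]
        + \mathbb{E}[\log p(x_\epsilon \mid x_0)].
```

The last term is the negative entropy of the forward perturbation p(x_ε | x_0), which is the term the question expects to see in the residual.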