yl4579 / hiftnet
HiFTNet: A Fast High-Quality Neural Vocoder with Harmonic-plus-Noise Filter and Inverse Short Time Fourier Transform
License: MIT License
Hello, yl4579. In Section 2.1.1 (Efficient Source Generation), you write: "We note that the L factor in Equation 6 scales the value by the hop size, as Equation 5 now integrates with 1/L of the steps compared to Equation 2." I want to know: what is the performance benefit of this change?
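To illustrate the step-count difference being asked about, here is a minimal NumPy sketch (my own illustration, not the repo's code; the sample rate, hop size, and piecewise-constant f0 are assumptions) comparing per-sample phase integration with frame-rate integration scaled by the hop size L:

```python
import numpy as np

fs = 22050        # sample rate (assumed)
L = 256           # hop size (assumed)
n_frames = 20
f0 = np.full(n_frames * L, 110.0)  # constant 110 Hz contour at sample rate

# Equation-2 style: integrate instantaneous frequency at every sample
phase_full = 2 * np.pi * np.cumsum(f0 / fs)

# Equation-5/6 style: integrate at frame rate (1/L of the steps),
# then scale by L to compensate for the larger step size
f0_frames = f0[::L]
phase_frames = 2 * np.pi * np.cumsum(f0_frames / fs) * L

# At frame boundaries the two phases agree for a piecewise-constant f0,
# but the second version performs L times fewer cumulative-sum steps
print(np.allclose(phase_full[L - 1 :: L], phase_frames))
```

The speed benefit is that the cumulative sum (and everything upstream of the upsampling) runs over n_frames elements instead of n_frames * L samples.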
Hi, could you tell me whether I can use your vocoder with VITS (https://github.com/jaywalnut310/vits)?
Could you also tell me whether your vocoder has better quality than the one in VITS?
Thank you!
Hi! I wanted to know whether you are aware of Vocos, and whether you compared HiFTNet against it, since it uses similar principles and reports similar results.
Why do both loss_disc and loss_gen in train.py use "generator_TPRLS_loss"?
loss_disc_s += generator_TPRLS_loss(y_ds_hat_r, y_ds_hat_g)
loss_gen_s += generator_TPRLS_loss(y_ds_hat_r, y_ds_hat_g)
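For context on why this looks suspicious: relativistic-style adversarial losses normally come in mirrored discriminator/generator pairs, so reusing the generator-side loss for the discriminator term would flip its intended direction. A toy relativistic least-squares pair (my own sketch, not the actual TPRLS loss from this repo):

```python
import numpy as np

def rel_disc_loss(dr, dg):
    # Discriminator side: wants real scores dr to exceed generated scores dg
    return np.mean((dr - dg - 1.0) ** 2)

def rel_gen_loss(dr, dg):
    # Generator side: mirrored sign, wants dg to exceed dr
    return np.mean((dg - dr - 1.0) ** 2)

dr, dg = np.array([1.0]), np.array([0.0])
print(rel_disc_loss(dr, dg), rel_gen_loss(dr, dg))  # 0.0 4.0
```

With mirrored definitions, the same scores give a zero discriminator loss but a large generator loss; if both sides used the same function, one of the two objectives would be optimized in the wrong direction.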
The quality of demo samples seems to be quite good, even in noisy environments or for unseen languages. Did you use the same f0 network (pretrained on LibriTTS) for these out-of-domain experiments? I'm also wondering whether it's okay to use the F0 network as is for unseen languages. If you have any experimental insights, I'd appreciate learning about them.
Hi,
If I want to train the F0 model from scratch in the other repo, do I need to train the JDC with a 22k sampling rate in order to use it for HiFTNet, or is it okay to use the 24k sampling rate (using the default SPECT and MEL extraction parameters in meldataset.py) for training the JDC network?
Thanks
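If the pretrained pitch extractor expects 22.05 kHz audio, one low-risk alternative to retraining is to resample the input to that rate before F0 extraction. A naive linear-interpolation resampler for illustration only (a real pipeline would use librosa.resample or torchaudio; all names here are my own):

```python
import numpy as np

def resample_linear(x, orig_sr, target_sr):
    """Naive linear-interpolation resampler (illustration only; use a
    proper windowed-sinc or polyphase resampler in practice)."""
    n_out = int(round(len(x) * target_sr / orig_sr))
    t_out = np.arange(n_out) * (orig_sr / target_sr)
    return np.interp(t_out, np.arange(len(x)), x)

y = resample_linear(np.zeros(24000), orig_sr=24000, target_sr=22050)
print(len(y))  # 22050
```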
Hi, thanks for your great work!
I wonder whether it is possible to add multi-band processing to further improve inference speed, as in https://github.com/MasayaKawamura/MB-iSTFT-VITS.
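For anyone unfamiliar with the reference: MB-iSTFT-VITS splits the output waveform into subbands with a PQMF filterbank so each sub-generator works at a fraction of the full sample rate. A toy FFT-masking band split showing the split-and-recombine idea (illustrative only, not a PQMF):

```python
import numpy as np

def split_bands(x, n_bands=2):
    """Split a signal into frequency bands via FFT masking (illustrative;
    MB-iSTFT-VITS uses a PQMF analysis/synthesis filterbank instead)."""
    X = np.fft.rfft(x)
    edges = np.linspace(0, len(X), n_bands + 1, dtype=int)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        B = np.zeros_like(X)
        B[lo:hi] = X[lo:hi]        # keep only this band's bins
        bands.append(np.fft.irfft(B, n=len(x)))
    return bands

x = np.random.default_rng(0).standard_normal(1024)
bands = split_bands(x)
# Disjoint bands sum back to the original signal
print(np.allclose(sum(bands), x))
```

Because each band occupies a narrower spectrum, it can be generated at a lower rate and recombined, which is where the inference speedup comes from.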
Line 659 in fde68a5
Should this be "for dr, dg" rather than "for dg, dr"?
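The unpacking order matters because the loss body treats the first variable as the real score. A toy example (hypothetical loss, not the repo's code) showing how swapping the loop variables silently changes the result:

```python
real = [3.0, 2.0]   # scores the discriminator gives real audio
fake = [0.0, 1.0]   # scores for generated audio

def disc_loss(dr, dg):
    # Toy least-squares discriminator term: real -> 1, fake -> 0
    return (dr - 1) ** 2 + dg ** 2

correct = sum(disc_loss(dr, dg) for dr, dg in zip(real, fake))
swapped = sum(disc_loss(dr, dg) for dg, dr in zip(real, fake))
print(correct, swapped)  # 6.0 14.0
```

The swapped version runs without error but penalizes the wrong side, which is why this kind of bug is easy to miss.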
Thanks for your great work!
I got warnings with torch==1.8 and torch==1.10
(under 1.8, it raises an error about torch's return_complex).
UserWarning: The default behavior for interpolate/upsample with float scale_factor changed in 1.6.0 to align with other frameworks/libraries, and now uses scale_factor directly, instead of relying on the computed output size. If you wish to restore the old behavior, please set recompute_scale_factor=True. See the documentation of nn.Upsample for details.
UserWarning: Default upsampling behavior when mode=linear is changed to align_corners=False since 0.4.0. Please specify align_corners=True if the old behavior is desired. See the documentation of nn.Upsample for details.
But requirements.txt does not pin a specific torch version.
Are these warnings expected, or which torch version is the correct one?
Thanks
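Both warnings are informational: they fire whenever the optional arguments are left at their defaults, and passing the arguments explicitly silences them. A minimal sketch (assuming torch ≥ 1.6; the values shown match the post-1.6 defaults, so behavior is unchanged):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 100)

# Passing align_corners and recompute_scale_factor explicitly silences
# both warnings; these values match the new defaults, so the output
# is identical to calling interpolate without them.
y = F.interpolate(x, scale_factor=2.0, mode="linear",
                  align_corners=False, recompute_scale_factor=False)
print(y.shape)  # torch.Size([1, 1, 200])
```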