Comments (5)
Cool! I think it makes sense to have a Trainer base class and inherit from that. I was actually going to set up that exact code pattern (putting them in a trainers.py, which is what we do for models), but didn't get around to it, since I haven't needed a different training loop yet.
In terms of get_trainer, I don't think you even need the trainer_class arg. You can look at models.get_model() for reference. I agree it's not the greatest design pattern, but unfortunately we need to create the model and trainer within a strategy scope for tf2, so I can't think of an elegant way around it for the moment. I don't think it's particularly error prone; it just adds an unfortunate layer of obfuscation.
Another detail is that eval and sample read from the operative_config gin file, but it only saves params that are used during training, so without get_model we would need to pass the model file in again with args for eval.model and sample.model, which I think is more error prone.
If you do that code separation and think that it makes sense to add to the repo, feel free to submit a PR.
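To make the pattern referenced above concrete, here is a hedged, framework-free sketch of what a get_model()-style function that "just returns its argument" might look like. The class name Autoencoder and the default-argument fallback are illustrative only (in ddsp the binding would come from gin, not a default), and this is not the actual ddsp code:

```python
class Autoencoder:
    """Stand-in for a ddsp model class (illustrative only)."""
    def __init__(self):
        self.name = 'autoencoder'


def get_model(model=None):
    """Returns the configured model instance.

    In the real code this would be gin-configurable and gin would supply
    `model`; simply returning the argument lets construction happen inside
    the tf.distribute strategy scope, and eval/sample jobs can recover the
    model from the operative_config without extra flags.
    """
    if model is None:
        model = Autoencoder()  # placeholder default for this sketch only
    return model


# Usage: in real training code this call would sit inside
# `with strategy.scope():` so variables are created under the strategy.
model = get_model()
```

The key design point is that the function adds no logic of its own; it exists purely so the object's construction site (and gin binding) can live inside the strategy scope.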
from ddsp.
Thanks for your response. So you think it should be possible to use a function exactly like get_model, which just returns its argument? But if that argument is created in the gin file (like get_trainer.trainer = @my_module.MyCustomTrainer()), how can I still pass the model and strategy to it?
That's a good point. Normally we would just construct the sub-object as an argument in gin as well, but in the case of the strategy, it really depends on the binary flags, so I think we'd like to avoid tying the global state deep inside the object (best practice is to keep flags out of the libraries).
I think then what makes sense is to have:

def get_trainer(model, strategy, trainer=gin.REQUIRED):
  return trainer(model, strategy)
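A runnable sketch of how this proposal could fit together, with gin.REQUIRED simulated by a sentinel (gin itself is not imported here). Note that for trainer(model, strategy) to work, the gin binding would pass the class reference without parentheses, e.g. get_trainer.trainer = @my_module.MyCustomTrainer, so construction is deferred until get_trainer is called with the model and strategy. All class names are hypothetical:

```python
_REQUIRED = object()  # stand-in for gin.REQUIRED in this sketch


class Trainer:
    """Illustrative base trainer holding the model and strategy."""
    def __init__(self, model, strategy):
        self.model = model
        self.strategy = strategy


class MyCustomTrainer(Trainer):
    """Hypothetical user-defined trainer with its own training loop."""


def get_trainer(model, strategy, trainer=_REQUIRED):
    """Constructs the (gin-)bound trainer class with model and strategy."""
    if trainer is _REQUIRED:
        raise ValueError('`trainer` must be bound (via gin in real code).')
    return trainer(model, strategy)


# Usage, as if gin had bound trainer=MyCustomTrainer:
t = get_trainer(model='model', strategy='strategy', trainer=MyCustomTrainer)
```

This keeps the strategy (which depends on binary flags) out of the gin file entirely: gin only names the class, and the caller injects model and strategy at call time inside the strategy scope.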
OK, thanks. If I may ask one more question: what if I wanted to have some kind of adversarial training loop? I'm not sure how to make that work with Keras's add_loss etc. I suppose the framework is not really prepared for that, and the most reasonable option is to write the whole model and training loop myself (maybe without Keras) and just use ddsp for the building blocks?
I don't see any fundamental incompatibility between Keras and adversarial training (and there are many examples to the contrary). add_loss is a pretty shallow wrapper I'm using to keep track of a lot of different loss functions. I'm thinking of ditching it and passing the losses up manually, because it can be hard to keep them paired with their names (for summaries, or for treating them differently); but keep in mind the ddsp code currently does do this with just Keras.
You will need to write your own custom training loop, but the current code should be pretty well set up for that.
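To illustrate the shape of such a custom loop, here is a minimal, framework-free sketch of an alternating adversarial training loop with losses kept paired with their names, as discussed above. In real TF2 code each step would compute its loss under a tf.GradientTape and apply a separate optimizer per network; here plain Python arithmetic stands in for those updates, and every function name is illustrative rather than part of the ddsp API:

```python
def discriminator_step(state, batch):
    """Pretend discriminator update: move `d` toward the batch value."""
    loss = abs(state['d'] - batch)
    state = dict(state, d=state['d'] - 0.5 * (state['d'] - batch))
    return state, loss


def generator_step(state, batch):
    """Pretend generator update: move `g` toward the discriminator's value."""
    loss = abs(state['g'] - state['d'])
    state = dict(state, g=state['g'] - 0.5 * (state['g'] - state['d']))
    return state, loss


def train(dataset, num_steps):
    """Custom loop alternating discriminator and generator updates."""
    state = {'d': 0.0, 'g': 1.0}
    # Track each loss under an explicit name, for summaries or for
    # treating the two losses differently (e.g. update schedules).
    losses = {'d_loss': [], 'g_loss': []}
    for _, batch in zip(range(num_steps), dataset):
        state, d_loss = discriminator_step(state, batch)
        state, g_loss = generator_step(state, batch)
        losses['d_loss'].append(d_loss)
        losses['g_loss'].append(g_loss)
    return state, losses


state, losses = train(dataset=[0.5] * 8, num_steps=8)
```

The structural point is that a hand-written loop gives you full control over update order and per-network losses, which is exactly what add_loss makes awkward; the ddsp building blocks would simply be called inside the step functions.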