Comments (8)
@daynebatten I actually just noticed that you use model.add(Masking(mask_value=0., input_shape=(max_time, 24))) (link).
But it was there all along, sorry. It actually shouldn't be a problem for the Turbofan dataset, since the sensor readings are rarely exactly zero. The problem would occur in the [data_pipeline](https://github.com/ragulpr/wtte-rnn/blob/master/examples/data_pipeline/data_pipeline.ipynb) example, where we have long periods of nothingness but still want to propagate state. There the data could legitimately be set to 0, so with mask_value=0
we wouldn't predict or learn anything for those steps.
If you want to apply this to another dataset, I suggest using some highly unlikely mask value instead.
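To make the failure mode concrete, here is a minimal numpy sketch (an illustration, not Keras code) of the rule Keras' Masking layer applies: a timestep is dropped when *every* feature equals mask_value, so genuine all-zero observations get masked out right along with the padding:

```python
import numpy as np

# One padded sequence: steps 0-2 are real data, steps 3-4 are padding.
# Step 1 is a genuine "nothing happened" observation that is all zeros.
seq = np.array([[1.0, 2.0],
                [0.0, 0.0],   # real observation, all zeros
                [3.0, 1.0],
                [0.0, 0.0],   # padding
                [0.0, 0.0]])  # padding

def keras_style_mask(x, mask_value):
    """True where the timestep is kept: not all features equal mask_value."""
    return ~np.all(x == mask_value, axis=-1)

print(keras_style_mask(seq, 0.0))
# [ True False  True False False] -> the real zero step is wrongly dropped
```

Padding with an unlikely value (say -1234.5, an arbitrary choice here) and setting mask_value to match keeps genuine zero observations visible to the network.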
Also, we never said hi. Awesome work @daynebatten!
from wtte-rnn.
That's a great point. It's probably a best practice to use a very large mask value. If you've normalized your data, the mask value should never come up by accident, and certainly not for all variables.
And I think most of the thanks go to you for doing all the heavy lifting here!
@Manelmc Good catch. While one would expect masking to propagate to the loss function, there were some inconsistencies in Keras at the time of implementation w.r.t. how _weighted_masked_objective worked back then with custom loss functions. It might have been fixed by now. If that's the case, then sample_weights are not necessary if you want equal weights. I like to use sample_weights anyway, since I usually don't use equal weights for every sample.
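For what it's worth, the effect of per-timestep sample_weights can be sketched in plain numpy (this is a rough illustration of the idea, not Keras' exact _weighted_masked_objective code): each step's loss is scaled by its weight and the result is averaged over the steps with nonzero weight, so zero-weighting the padded steps excludes them from the loss even if the mask isn't propagated correctly:

```python
import numpy as np

def weighted_loss(per_step_loss, sample_weight):
    """Average the weighted per-step losses over the nonzero-weight steps."""
    total = np.sum(per_step_loss * sample_weight)
    return total / np.sum(sample_weight > 0)

per_step_loss = np.array([0.5, 1.5, 1.0, 9.9, 9.9])  # last two steps are padding
weights       = np.array([1.0, 1.0, 1.0, 0.0, 0.0])  # zero out the padded steps
print(weighted_loss(per_step_loss, weights))  # 1.0 -> padding never leaks in
```

Non-equal weights (e.g. down-weighting early observations) drop in the same way; only the nonzero entries change.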
I'm with you. Batch_size > 1 is hopefully coming tomorrow in the updated data_pipeline. If you can't wait, the tests give you a hint:
https://github.com/ragulpr/wtte-rnn/blob/master/python/tests/test_keras.py#L93
There's already support for it; you need to do masking and use sample weights.
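For anyone padding by hand in the meantime, here is a minimal sketch (a hypothetical helper, not part of the wtte-rnn API) of batching variable-length sequences to a common length while building matching per-timestep sample weights that zero out the padded steps:

```python
import numpy as np

def pad_batch(seqs, mask_value=0.0):
    """Pad (length_i, n_features) arrays to (n_seqs, max_len, n_features),
    returning the batch plus temporal sample weights (1 = real, 0 = padding).
    mask_value is a placeholder; pick a value that can't occur in your data."""
    n_features = seqs[0].shape[1]
    max_len = max(len(s) for s in seqs)
    batch = np.full((len(seqs), max_len, n_features), mask_value)
    weights = np.zeros((len(seqs), max_len))
    for i, s in enumerate(seqs):
        batch[i, :len(s)] = s
        weights[i, :len(s)] = 1.0
    return batch, weights

seqs = [np.ones((2, 3)), np.ones((4, 3))]
batch, weights = pad_batch(seqs)
print(batch.shape)  # (2, 4, 3)
print(weights[0])   # [1. 1. 0. 0.]
```

In Keras you would then compile with sample_weight_mode="temporal" and pass weights as sample_weight to fit.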
Thanks! I think I got it.
Note that if you use a zero mask, you may not learn anything from the non-events, since the input vector may be all zeros for those steps. Ping @daynebatten
Not sure I'm following, @ragulpr. Can you give a little more detail as to when this would occur?
Hi Egil, I'm not sure I understand why you suggest using sample weights if we are already masking. From what I read on Stack Overflow:

> If there's a mask in your model, it'll be propagated layer-by-layer and eventually applied to the loss. So if you're padding and masking the sequences in a correct way, the loss on the padding placeholders would be ignored.

If the loss on the padding placeholders is ignored, why do we need sample weights?