Comments (13)
I think the result is related to the Keras version; I saw another implementation that uses the Keras 1.0 API and gets results close to the paper's.
@Lan1991Xu
from densenet.
Hi, thanks for your reply. Would you mind sharing a link to that other Keras 1.0 implementation? Or have you tried this code under Keras 1.0? I have had a similar experience: Keras 1.0 and Keras 2.0 produce different results.
Hmm, that is odd. I assumed the model would train correctly this time, since the code is almost identical to the one posted by the author. The only differences I can think of are the preprocessing and augmentation.
The earlier result came after tons of restarts, and I did not use horizontal flips for it. Also, if you used the unmodified cifar10 script, does that mean you did not use densenet.preprocess_data(...)? I believe the authors specifically mentioned in one of their comments on the GitHub issues that the mean/std normalization was necessary, along with the scaling.
The "+" mark at the end denotes standard data augmentation (random crop after zero-padding, and horizontal flip).
They suggest random crops after zero-padding and horizontal flips. Keras doesn't have built-in support for random crops, so instead I used the augmentation from earlier papers: rotation, scaling, and horizontal flips.
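For reference, the augmentation Keras lacks is easy to write by hand. A minimal NumPy sketch of "random crop after zero-padding, plus horizontal flip" (the pad width of 4 is the value commonly used for CIFAR in the DenseNet/ResNet papers, not something stated in this thread):

```python
import numpy as np

def random_crop_flip(image, pad=4, rng=None):
    """Zero-pad `image` (HxWxC) by `pad` pixels on each side, take a
    random crop at the original size, then flip horizontally with
    probability 0.5."""
    if rng is None:
        rng = np.random.default_rng()
    h, w = image.shape[:2]
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)),
                    mode="constant")
    # Crop offsets can range over the full padded margin.
    top = int(rng.integers(0, 2 * pad + 1))
    left = int(rng.integers(0, 2 * pad + 1))
    crop = padded[top:top + h, left:left + w]
    if rng.random() < 0.5:
        crop = crop[:, ::-1]  # horizontal flip
    return np.ascontiguousarray(crop)
```

This could be plugged into a generator's preprocessing step so each batch sees a freshly cropped/flipped copy of every image.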
I suggest removing horizontal flipping. I have found that it simply destroys performance, no matter how long I train models with it. Without it, convergence is faster and, of course, the model overfits faster as well, but at least validation loss is higher (though losing flip invariance is kind of bad).
I hate to ask this since it takes so much time, but could you run it with the current modified version? And while it may not be ideal, you could try running it with these weights as initialization to speed things up and avoid needing 300 additional epochs. It will be very erratic in the beginning, since the earlier normalization was statically scaled to [0, 1] and the new one spans roughly -2.1 to 2.2 (the upper bound comes from (255 - 124) * 0.017). Still, that should not affect too many of the weights at the latter end of the network, so it may be better off.
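For context, the normalization being discussed amounts to something like the following sketch. The mean of 124 and scale of 0.017 are the values quoted in this comment; the actual densenet.preprocess_data may subtract per-channel means rather than a single scalar:

```python
def preprocess(x, mean=124.0, scale=0.017):
    """Map raw pixel values in [0, 255] to roughly [-2.1, 2.2]:
    (0 - 124) * 0.017 = -2.108 and (255 - 124) * 0.017 = 2.227."""
    return (x - mean) * scale
```

The point of the comment above is that a network pretrained on [0, 1] inputs will initially behave erratically when fed values in this wider range.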
By "exactly the same as the author's", I mean https://github.com/liuzhuang13/DenseNet/blob/master/models/densenet.lua#L67
Oh wait. Disregard the above. Seems you ran it with the updated script. I don't really understand then. Perhaps the additional rotation and scaling are hurting performance ?
Sorry, I meant the older version of the Titan X (there are two). I haven't really investigated but I too found it a bit puzzling. Perhaps additional training would bring it down to the expected level.
Yeah maybe that will work.
I ran into the same problem as ahundt: I ran 300 epochs with the TensorFlow 1.3 backend in Keras 2.0 and got the same result he did. I then changed the optimizer to SGD with momentum=0.9 and nesterov=True, with an initial learning rate of 0.1, dropping to 0.01 at epoch 150 and to 0.001 at epoch 225 until the end of the 300 epochs, but the result did not improve; it stays at approximately 92%. My GPU is an Nvidia GTX 1080 Ti. I am confused because I have modified the hyperparameters again and again and cannot see any real improvement. Thanks for your time!
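The step schedule described above can be sketched as a plain function; the epoch boundaries (150 and 225 out of 300) and rates are the ones quoted in this comment, and such a function can be handed to Keras's LearningRateScheduler callback:

```python
def lr_schedule(epoch):
    """DenseNet-style step schedule for a 300-epoch run:
    0.1 until epoch 150, 0.01 until epoch 225, then 0.001."""
    if epoch < 150:
        return 0.1
    if epoch < 225:
        return 0.01
    return 0.001
```

Usage would look like `model.fit(..., callbacks=[LearningRateScheduler(lr_schedule)])`, so the rate drops by 10x at exactly the two boundaries.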
Perhaps related to #25?
@LelouchVC do you think you could try that version? My GPUs are currently occupied
Hi, has the problem been solved? I am also using this code, but I am not getting results as good as the paper reports. Any ideas?
I have no idea whether the cause is related to #25... I have not had time so far; maybe I will try it. I hope it helps.
@athundt
I think the result is related to the Keras version; I saw another implementation that uses the Keras 1.0 API and gets results close to the paper's.
@Lan1991Xu
Would you mind sending me a Keras 1.0 version? I am very anxious right now because of my graduation thesis. My mailbox: [email protected]. Thank you.