Comments (10)
Could you add a code coverage button next to the build status in the README as well?
from keras.
We should add unit tests for components such as activations, layers, and optimizers. We could also ship the expected results for all the examples as a "sanity check" for the library, but those do not need to be included in the test suite, as they would take too long to run. For a nice example (in my opinion), see how they did it in Mocha (it's Julia code, but the philosophy still applies).
You'll see that in Mocha (as in Torch7) there's a single script including all the other tests. Probably tests are handled like this in Torch because there's no "standard" unit testing library in Lua. I'm a fan of Nose for unit tests in Python. It is able to "discover" tests inside a project and run them automagically.
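To make the discovery point concrete, here is a minimal module that nose (or pytest) would pick up automatically: any file named `test_*.py` containing `test_*` functions is collected and run with no registration script. The `relu` below is a pure-Python stand-in for illustration, not the real keras activation.

```python
# test_activations.py -- a nose/pytest-discoverable test module.
# The relu here is a hypothetical stand-in for keras.activations.relu.

def relu(x):
    """Reference ReLU used in place of the real keras activation."""
    return max(0.0, x)

def test_relu_passes_positive_values_through():
    assert relu(2.5) == 2.5

def test_relu_clamps_negative_values_to_zero():
    assert relu(-3.0) == 0.0
```

With a convention like this, `nosetests` (or `py.test`) run from the project root finds and executes both functions with no central "run all tests" script.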
Regarding tests not being run often, we should add CI (via Travis and Coveralls) to make sure tests are run after each commit/pull request and test coverage is decent (i.e., most code paths are covered by the test suite).
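A CI setup along those lines could be sketched as a `.travis.yml` like the one below. The package list and commands are assumptions for illustration, not this repository's actual configuration:

```yaml
# Hypothetical .travis.yml: run the test suite with coverage on every
# push/pull request and report coverage to Coveralls.
language: python
python:
  - "2.7"
install:
  - pip install .
  - pip install nose coverage coveralls
script:
  - nosetests --with-coverage --cover-package=keras
after_success:
  - coveralls
```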
I would second using nosetests and some kind of CI solution. An interesting thing that sklearn does is to run a set of generic tests on all estimators and regressors, to ensure they all conform to the standard API style. Maybe we could do the same for layers.
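Those sklearn-style generic checks could look something like the sketch below. The toy `Dense`/`Dropout` classes and the `get_config()` round-trip contract are simplified stand-ins for whatever API the real layers settle on:

```python
# Sketch of sklearn-style generic conformance tests over all layers.
# Dense/Dropout and get_config() here are illustrative stand-ins.

class Dense:
    def __init__(self, units):
        self.units = units
    def get_config(self):
        return {"units": self.units}

class Dropout:
    def __init__(self, rate):
        self.rate = rate
    def get_config(self):
        return {"rate": self.rate}

ALL_LAYERS = [Dense(4), Dropout(0.5)]

def test_all_layers_round_trip_through_config():
    # Every layer must expose a dict config that can rebuild the layer.
    for layer in ALL_LAYERS:
        config = layer.get_config()
        assert isinstance(config, dict)
        rebuilt = type(layer)(**config)
        assert rebuilt.get_config() == config
```

The payoff is that a new layer added to `ALL_LAYERS` is automatically held to the same API contract, with no per-layer boilerplate.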
I've had better experience with py.test as a framework. It's super flexible, and works well with Travis CI, which might be a good choice for a distributed team.
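The "simpler boilerplate" difference is easy to show side by side: the same check written with stdlib `unittest` (a `TestCase` class plus `assert*` methods) versus the plain function and bare `assert` that py.test collects.

```python
# The same check in both styles.
import unittest

# unittest style: requires a class and the assertEqual family of methods.
class TestAddUnittest(unittest.TestCase):
    def test_add(self):
        self.assertEqual(1 + 1, 2)

# py.test style: a plain function with a bare assert is enough.
def test_add():
    assert 1 + 1 == 2
```

py.test also collects and runs the `unittest.TestCase` version unchanged, so a migration can be gradual.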
I just had a look at the existing files in the test directory. It looks like they are a mixture of automated tests (compatible with the unittest framework) and manual tests. I am quite happy to start marching the tests forward, but it makes some sense to talk about the basic strategy for that. My initial thought would be to move the manual tests into a separate directory. I could create a manual/ and auto/ directory under the test directory. I would also change the naming convention on the manual tests, because they currently get imported and executed during test discovery, which then slows things down.
My personal preference for automated testing is the py.test framework, but there is always some advantage to minimising the number of external packages required. The upside of the py.test framework is good integration with tools and simpler boilerplate for developers. I am not very familiar with nosetests as previously suggested but I'm sure it is fine also.
My first goal would be just to get the current tests working so that they can be easily used in a development workflow.
What does everyone think?
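As a concrete strawman for the split (the directory names are my assumption, not a decided layout): fast automated tests live under `tests/auto/`, long-running manual scripts under `tests/manual/`, and test discovery is pointed only at `tests/auto/`, so manual scripts are never imported during a normal run.

```shell
# Sketch of the proposed layout; stdlib unittest discovery stands in
# for whichever runner (nose/py.test) is chosen.
mkdir -p tests/auto tests/manual
touch tests/__init__.py tests/auto/__init__.py

cat > tests/auto/test_smoke.py <<'EOF'
import unittest

class SmokeTest(unittest.TestCase):
    def test_truth(self):
        self.assertTrue(True)
EOF

# Discovery only descends into tests/auto, so anything placed in
# tests/manual/ is never imported (and never slows this command down).
python3 -m unittest discover -s tests/auto -t .
```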
I agree with the manual/auto test split. We should probably define manual tests as tests that look more like complete examples and therefore take longer to run (which is something we probably do not want to do in Travis, for example), while auto tests are more like unit/feature tests. Regarding frameworks, I am more familiar with nosetests but pytest looks fine, too.
I've created a pull request to structure the tests in this way.
Hi all,
Most of the work has been done by @phreeza here, but I've helped :). I'm happy to report that, as of the latest pull request, we are now up to 50% test coverage of keras. This is fantastic progress, although more is still needed: a month ago it was 20%, and a few weeks before that it was 0%.
I've also noticed other authors now starting to include tests with their commits -- thanks to everyone who has done that.
One area that still lags behind slightly is testing of the install process, along with running the tests before committing to the master branch. We're still finding the need to do a certain amount of ongoing cleanup, so I'd encourage people to do a quick test run just before hitting 'commit'.
As for testing the install process, this is something that travis-ci is doing for me on my fork. It helps pick up when someone brings a library into the fold but doesn't update setup.py (for example). It's no big deal for most people, but it's the kind of toe-stubbing you don't want users to hit the first time they install the package and try it out. Just some notes based on what we've seen so far.
To 100% and beyond! :)
Given that this project is currently at ~80% coverage, can we close this issue and make it part of the pull request process that a pull request must, as a rule, include tests?
Current test coverage:
---------- coverage: platform linux2, python 2.7.12-final-0 ----------
Name Stmts Miss Cover
-------------------------------------------------------------
keras/__init__.py 19 0 100%
keras/activations.py 42 2 95%
keras/applications/__init__.py 5 0 100%
keras/applications/imagenet_utils.py 59 42 29%
keras/applications/inception_v3.py 174 22 87%
keras/applications/resnet50.py 127 24 81%
keras/applications/vgg16.py 79 21 73%
keras/applications/vgg19.py 82 24 71%
keras/applications/xception.py 128 20 84%
keras/backend/__init__.py 59 16 73%
keras/backend/common.py 37 13 65%
keras/backend/tensorflow_backend.py 974 109 89%
keras/backend/theano_backend.py 1107 412 63%
keras/callbacks.py 481 93 81%
keras/constraints.py 67 2 97%
keras/datasets/__init__.py 7 0 100%
keras/datasets/boston_housing.py 18 15 17%
keras/datasets/cifar.py 17 5 71%
keras/datasets/cifar10.py 26 0 100%
keras/datasets/cifar100.py 22 1 95%
keras/datasets/imdb.py 67 59 12%
keras/datasets/mnist.py 11 8 27%
keras/datasets/reuters.py 57 49 14%
keras/engine/__init__.py 6 0 100%
keras/engine/topology.py 1228 239 81%
keras/engine/training.py 896 137 85%
keras/initializers.py 158 16 90%
keras/layers/__init__.py 27 1 96%
keras/layers/advanced_activations.py 79 1 99%
keras/layers/convolutional.py 533 65 88%
keras/layers/convolutional_recurrent.py 185 9 95%
keras/layers/core.py 328 42 87%
keras/layers/embeddings.py 45 1 98%
keras/layers/local.py 165 14 92%
keras/layers/merge.py 281 59 79%
keras/layers/noise.py 34 1 97%
keras/layers/normalization.py 69 5 93%
keras/layers/pooling.py 216 6 97%
keras/layers/recurrent.py 478 68 86%
keras/layers/wrappers.py 191 25 87%
keras/legacy/__init__.py 0 0 100%
keras/legacy/interfaces.py 269 30 89%
keras/legacy/layers.py 369 57 85%
keras/legacy/models.py 21 17 19%
keras/losses.py 54 2 96%
keras/metrics.py 40 2 95%
keras/models.py 486 201 59%
keras/objectives.py 2 2 0%
keras/optimizers.py 347 40 88%
keras/preprocessing/__init__.py 0 0 100%
keras/preprocessing/image.py 439 116 74%
keras/preprocessing/sequence.py 80 9 89%
keras/preprocessing/text.py 117 11 91%
keras/regularizers.py 44 2 95%
keras/utils/__init__.py 18 0 100%
keras/utils/conv_utils.py 75 23 69%
keras/utils/data_utils.py 125 23 82%
keras/utils/generic_utils.py 161 18 89%
keras/utils/io_utils.py 60 44 27%
keras/utils/layer_utils.py 126 36 71%
keras/utils/np_utils.py 14 3 79%
keras/utils/test_utils.py 97 2 98%
keras/utils/vis_utils.py 65 54 17%
keras/wrappers/__init__.py 0 0 100%
keras/wrappers/scikit_learn.py 105 16 85%
-------------------------------------------------------------
TOTAL 11698 2334 80%
Most of the source files that are below 90% seem to be utils, datasets, etc. It looks like it should therefore be easy to get to 90% on average across the codebase.
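Reading the figures off the TOTAL row of the report above, a quick back-of-the-envelope calculation shows how far away 90% actually is:

```python
# Figures taken from the TOTAL row of the coverage report above.
total_stmts = 11698
missed = 2334

current = 1.0 - missed / total_stmts      # current overall coverage (~80%)
max_missed_for_90 = total_stmts * 0.10    # misses allowed at 90% coverage
extra_to_cover = missed - max_missed_for_90

print("current coverage: %.1f%%" % (100 * current))
print("statements still to cover for 90%%: %d" % extra_to_cover)
```

So roughly 1,160 more covered statements gets the codebase to 90%, which is well within what the low-coverage utils and datasets modules alone could supply.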