
mdp-toolkit's Introduction



Modular toolkit for Data Processing

Modular toolkit for Data Processing (MDP) is a Python data processing framework.

From the user’s perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures.

From the scientific developer’s perspective, MDP is a modular framework that can easily be expanded. The implementation of new algorithms is easy and intuitive. Newly implemented units are then automatically integrated with the rest of the library.

Main features

The collection of available algorithms is steadily increasing and includes

  • signal processing methods such as
    • Independent Component Analysis,
    • Principal Component Analysis, and
    • Slow Feature Analysis;
  • manifold learning methods such as the [Hessian] Locally Linear Embedding;
  • several classifiers;
  • probabilistic methods such as
    • Factor Analysis,
    • Fisher Discriminant Analysis,
    • Linear Regression, and
    • RBMs;
  • data pre-processing methods such as
    • expansion methods for feature generation and
    • whitening for data normalization;

and many others.

You can find out more about MDP's functionality in the node list and the utilities description.

Install the newest release

MDP is listed in the Python Package Index and can be installed with:

pip install mdp

Install the development version

If you want to live on the bleeding edge, install the development version from the repository with:

pip install git+https://github.com/mdp-toolkit/mdp-toolkit.git

Usage

Using MDP is as easy as:

import mdp

# perform PCA on some data x
y = mdp.pca(x)

# perform ICA on some data x using single precision
y = mdp.fastica(x, dtype='float32')
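Nodes can also be chained into the feed-forward flows mentioned above. The following is a minimal sketch (not part of the original README) using random placeholder data; the particular nodes and dimensions are only an illustration:

import numpy as np
import mdp

# placeholder data: 1000 samples with 20 features each
x = np.random.random((1000, 20))

# chain several nodes into a feed-forward flow:
# PCA -> quadratic expansion -> Slow Feature Analysis
flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                 mdp.nodes.PolynomialExpansionNode(2),
                 mdp.nodes.SFANode(output_dim=3)])
flow.train(x)
y = flow.execute(x)  # y.shape == (1000, 3)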

Contact and development

MDP was originally written by Pietro Berkes and Tiziano Zito at the Institute for Theoretical Biology of the Humboldt University Berlin in 2003.

Since 2017, MDP has been primarily maintained by the research group Theory of Neural Systems at the Institute for Neural Computation of the Ruhr University Bochum.

Contact

Most development discussions take place in this repository on GitHub. You are also encouraged to get in touch with the developers and other users on the users’ mailing list.

Contributing

MDP is open to user contributions. Users have already contributed some of the nodes, and more contributions are currently being reviewed for inclusion in future releases of the package.

If you want to commit code, it may be easiest to fork the MDP repository on GitHub and drop us a note on the mailing list. We can then discuss how to integrate your modifications. For simple fixes that don’t need much discussion, you can also send a patch to the list using git format-patch or similar.

Your code contributions should not have any additional dependencies, i.e. they should require only the numpy module to be installed. If your code requires some other module, e.g. scipy or C/C++ compilation, ask [email protected] for assistance.

To learn more about how to contribute to MDP, check out the information for new developers section on the MDP webpage.

How to cite MDP

If you use MDP for scientific purposes, you may want to cite it. This is the official way to do it:

Zito, T., Wilbert, N., Wiskott, L., Berkes, P. (2009). Modular toolkit for Data Processing (MDP): a Python data processing framework, Front. Neuroinform. (2008) 2:8. doi:10.3389/neuro.11.008.2008.

If your paper gets published, please send us a reference (and even a copy if you don't mind).

mdp-toolkit's People

Contributors

albertoesc, ben-willmore, cclauss, debilski, dvrstrae, esc, fabschon, keszybz, kotopesutility, mspacek, nimlr, nkgevorgyan, nwilbert, otizonaizit, pberkes, stewori, varunrajk, yarikoptic


mdp-toolkit's Issues

Warnings in tests

I collected a list of warnings popping up in different tests. Some may be relevant only to some version combinations of the dependencies.

MDP

  • DeprecationWarning: object of type <class 'float'> cannot be safely interpreted as an integer.
    appearing in test_EtaComputerNode.py line 6 in this test run.

  • DeprecationWarning: `formatargspec` is deprecated since Python 3.5. Use `signature` and the `Signature` object directly
    appearing in signal_node.py line 180 in this test run. There are more instances in the code where formatargspec is used.

  • FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

  • FutureWarning: The default value of cv will change from 3 to 5 in version 0.22. Specify it explicitly to silence this warning.
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

  • ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

  • FutureWarning: The default value of gamma will change from 'auto' to 'scale' in version 0.22 to account better for unscaled features. Set gamma explicitly to 'auto' or 'scale' to avoid this warning. "avoid this warning.", FutureWarning)
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

  • FutureWarning: The default value of n_estimators will change from 10 in version 0.20 to 100 in 0.22.
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

  • ConvergenceWarning: Stochastic Optimizer: Maximum iterations (200) reached and the optimization hasn't converged yet. % self.max_iter, ConvergenceWarning)
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

  • FutureWarning: The default value of strategy will change from stratified to prior in 0.24.
    appearing in tests of scikit-learn-nodes in test_nodes_generic.py in this test run.

BIMDP

Pip install failing in py3k

>>> import mdp
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/razi/git/yelp/src/mdp/build/py3k/mdp/__init__.py", line 114, in <module>
    from . import configuration
  File "/Users/razi/git/yelp/src/mdp/build/py3k/mdp/configuration.py", line 186
    except ImportError, exc:
                      ^
SyntaxError: invalid syntax

Deprecation warnings

I am opening this issue to track some deprecation warnings observed in recent tests. These are probably easy to fix and not terribly urgent. However, it would be nice to fix them for an upcoming release. At the very least, the imp deprecation should be handled to support Python 3.12.

  /opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/future/standard_library/__init__.py:65: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    import imp
mdp/linear_flows_online.py:355
  /home/runner/work/mdp-toolkit/mdp-toolkit/mdp/linear_flows_online.py:355: DeprecationWarning: invalid escape sequence '\ '
    """A 'CircularOnlineFlow' is a cyclic sequence of online/non-trainable nodes that are trained and executed
mdp/nodes/isfa_nodes.py:88
  /home/runner/work/mdp-toolkit/mdp-toolkit/mdp/nodes/isfa_nodes.py:88: DeprecationWarning: invalid escape sequence '\k'
    """Initializes an object of type 'ISFANode' to perform

The next ones apply to plenty of nodes. `alltrue` needs to be searched and replaced systematically across the whole repo (a sketch of the replacements follows the warnings below):

mdp/test/test_FANode.py::test_FANode
  /home/runner/work/mdp-toolkit/mdp-toolkit/mdp/test/test_FANode.py:50: DeprecationWarning: `alltrue` is deprecated as of NumPy 1.25.0, and will be removed in NumPy 2.0. Please use `all` instead.
    assert_array_almost_equal_diff(numx.cov(est, rowvar=0),
mdp/test/test_FANode.py::test_FANode_indim
  /home/runner/work/mdp-toolkit/mdp-toolkit/mdp/signal_node.py:634: DeprecationWarning: `product` is deprecated as of NumPy 1.25.0, and will be removed in NumPy 2.0. Please use `prod` instead.
    self._train_seq[self._train_phase][1](*args, **kwargs)
/home/runner/work/mdp-toolkit/mdp-toolkit/mdp/nodes/mca_nodes_online.py:132: DeprecationWarning: Conversion of an array with ndim > 0 to a scalar is deprecated, and will error in future. Ensure you extract a single element from your array before performing this operation. (Deprecated NumPy 1.25.)
    self.d[j] = mdp.numx.sqrt(l)
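For reference, a minimal sketch of the replacements these warnings call for (plain NumPy here; whether MDP accesses the functions via numx or np is an implementation detail):

import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])

# removed in NumPy 2.0          replacement
# np.alltrue(a > 0)         ->  np.all(a > 0)
# np.product(a.shape)       ->  np.prod(a.shape)
assert np.all(a > 0)
assert np.prod(a.shape) == 4

# ndim > 0 arrays should be reduced to a scalar explicitly before scalar assignment
d = np.zeros(3)
l = np.array([2.0])
d[0] = np.sqrt(l).item()   # instead of d[0] = np.sqrt(l)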

Inconsistencies in testing

At least in some test sessions, not all dependencies are installed. I am not sure about the consequences of this.

============================= test session starts ==============================
platform linux -- Python 3.6.7, pytest-4.0.0, py-1.7.0, pluggy-0.8.0
          python: 3.6.7.final.0
             mdp: 3.5, MDP-3.5-LKG-140-g9251ea7
 parallel python: NOT AVAILABLE: No module named 'pp'
          shogun: NOT AVAILABLE: No module named 'shogun'
          libsvm: NOT AVAILABLE: No module named 'svm'
          joblib: 0.14.1
         sklearn: 0.21.2
            numx: scipy 1.3.3
          symeig: scipy.linalg.eigh

This dates back at least 2 years.

This phenomenon appears depending on version combinations as well. See that Python: 3.7-dev, VNUMPY=1.16.1, VSCI=0.20.0 properly installs sklearn, whereas Python: 3.6, VNUMPY=1.12.1, VSCI=0.20.0 does not.

Additionally, in Python: 3.7-dev, VNUMPY=1.16.1, VSCI=0.20.0 the backend is provided by scipy and in Python: 3.6, VNUMPY=1.12.1, VSCI=0.20.0 the numx backend is provided by numpy.

Possibly, there should be a careful overhaul of the testing setup. I am happy to work on this, but it could be helpful to have some assistance from a more senior MDP developer, ideally one who has been involved in the testing previously.

When tackling this issue it may be worth establishing a guideline on the minimum passing probability for randomized tests. An issue with this popped up before:

I did a test using the repo at this point in time. Namely, I ran test_ccipcanode_v2 400 times. It failed 9 times. Let's assume the repo had only this test. As we currently have 26 jobs running in Travis, and given they are independent, to reach a 0.8 probability of success for a Travis run we would need the probability of a single instance of this test succeeding to be at least 0.8^(1/26) ≈ 0.9915.

Assuming the test succeeds with at least that probability, the probability of observing 9 or more failures out of 400 runs is smaller than 1%.

This suggests a more general guideline should exist for the randomized tests, especially considering that we have multiple independent randomized tests, not only this one. This fix does not live up to these standards. However, it fixes the errors for now.
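For the record, a quick script to reproduce the numbers above (scipy.stats is used here only for the check; it is not an MDP dependency):

from scipy.stats import binom

n_jobs = 26                       # independent Travis jobs
p_single = 0.8 ** (1 / n_jobs)    # per-test success probability needed for P(run ok) = 0.8
print(p_single)                   # ~0.9915

n_runs, n_failed = 400, 9
# P(at least 9 failures out of 400) if the failure probability is 1 - p_single
print(binom.sf(n_failed - 1, n_runs, 1 - p_single))   # < 0.01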

incompatible with numpy 1.25.0

Current mdp's master against numpy 1.25.0:

[builder@localhost mdp-toolkit]$ python3 -c 'import mdp'
Traceback (most recent call last):
  File "/usr/lib64/python3.11/inspect.py", line 1369, in getfullargspec
    sig = _signature_from_callable(func,
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/inspect.py", line 2521, in _signature_from_callable
    return _signature_from_builtin(sigcls, obj,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/inspect.py", line 2328, in _signature_from_builtin
    raise ValueError("no signature found for builtin {!r}".format(func))
ValueError: no signature found for builtin <function eigh at 0x7f85eea65fb0>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/src/RPM/BUILD/mdp-toolkit/mdp/__init__.py", line 133, in <module>
    utils.symeig = configuration.get_symeig(numx_linalg)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/RPM/BUILD/mdp-toolkit/mdp/configuration.py", line 338, in get_symeig
    args = getargs(numx_linalg.eigh)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/inspect.py", line 1379, in getfullargspec
    raise TypeError('unsupported callable') from ex
TypeError: unsupported callable

`numpy.typeDict` is deprecated

In

_UNSAFE_DTYPES = [numx.typeDict[d] for d in

we use np.typeDict. This is deprecated since forever: numpy/numpy@6689502 and numpy/numpy#17585 . Since numpy release 1.21 this now issues a deprecation warning: numpy/numpy#17586
It seems that scipy is not importing deprecated names from the numpy namespace anymore, so that with numpy 1.21 and scipy 1.7 you now get an error:

AttributeError: module 'scipy' has no attribute 'typeDict'

We should most probably replace np.typeDict with np.sctypeDict, but it would be better to first understand what we were trying to do in the function get_dtypes and fix it properly:

def get_dtypes(typecodes_key, _safe=True):
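A minimal sketch of the kind of lookup that could replace the typeDict access (this only illustrates the NumPy API; it is not a proposed implementation of get_dtypes):

import numpy as np

def lookup_scalar_type(name):
    # np.dtype accepts the same names and type codes that typeDict used as keys
    return np.dtype(name).type

print(lookup_scalar_type('float32'))   # <class 'numpy.float32'>
print(lookup_scalar_type('d'))         # <class 'numpy.float64'>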

Thanks to Elgin Road for reporting this!

mdp.test() fails with py.test 3.0.3

The value reinterp for the hard-coded option --assert in mdp.test() is not supported anymore by py.test, at least from version 3.0.3, possibly even earlier. Running the tests with mdp.test() produces an error:
usage: [options] [file_or_dir] [file_or_dir] [...] : error: argument --assert: invalid choice: 'reinterp' (choose from 'rewrite', 'plain')

Travis CI testing broken

See
https://travis-ci.org/mdp-toolkit/mdp-toolkit/builds/481630084?utm_source=github_status&utm_medium=notification

This was triggered by merging #42, but the tests of #42 went fine. So this issue is obviously not caused by changes in #42, but by changes in Travis. It looks like something is wrong with pytest...? Maybe a new release of pytest had backward-breaking changes?

I googled for the prominent feature of the erroneous Travis log: "INTERNALERROR".
This found, among others, ansible/pytest-mp#8, which is about a somewhat similar phenomenon, and there it was due to pytest.

Ideas?

fix random seed for releases

I am following up on private email communication where we discussed the issue of a small but notable probability of random tests failing due to pathological random data.
We concluded it is best to fix a seed for releases, so that the tests of a release run reliably.

Alternatives would have been to lower the testing precision, but that would spoil test reliability and would not eliminate the risk of failure, just make it less likely, which is not what we want. The algorithms cannot be improved to handle every ill-conditioned kind of data because in the end this relies on numerical operations, and for every numerical operation there is some nasty data that provokes instability. Checking or sanitizing data would in general be unnecessarily expensive, given the low probability of failure. For frequent and annoying cases this may be an option though (cf. symeig_semidefinite).

A remaining question is how to proceed after the release. I propose we unfix the seed again, but track results to learn more about the situation. I'd suggest we collect a list of sane seeds (around 1000 or so) and then choose randomly from these, while still fixing a sane seed for every release.
For the upcoming release, let's just fix some good seed and unfix it afterwards.
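As a strawman, pinning a seed for a release could look roughly like the following conftest.py hook (the seed value and the hook itself are only illustrative; MDP's existing --seed option would be the natural place to wire this in):

# conftest.py (sketch)
import numpy as np

RELEASE_SEED = 123456789  # hypothetical "known sane" seed chosen for the release

def pytest_configure(config):
    # seed NumPy's global RNG before any test is run
    np.random.seed(RELEASE_SEED)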

Suggestions, opinions?

pytest 3.5: Metafunc.addcall is deprecated

Splitting off from #37. I did not yet investigate, just copied the issue description provided by @AlbertoEsc:

pytest 3.5 gives hundreds of warnings “Metafunc.addcall is deprecated and scheduled to be removed in pytest 4.0. Please use Metafunc.parametrize instead.”, perhaps we should consider updating this part of the tests.

Sklearn wrapper break with Scipy on Python 3

Line 354 in scikits_nodes seems to be the problem: It tries to use __func__ on a method function (and this is apparently only available on the descriptor on instances of the class).

Apparently this is a problem with the 2to3 translation; the original Python 2 code uses im_func.

I'm not sure what the best approach for fixing this would be (maybe we have to find a workaround for 2to3).
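For what it's worth, a small sketch of a version-agnostic way to reach the underlying function object (the class and names here are made up for illustration):

class Example(object):
    def method(self):
        return 42

# Python 2 unbound methods carry im_func/__func__; in Python 3 the attribute
# looked up on the class is already a plain function. getattr with a fallback
# covers both cases.
attr = Example.__dict__['method']
plain_func = getattr(attr, '__func__', attr)
print(plain_func(Example()))  # 42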

I had the problem on Python 3.3.5 (installed on OSX Mavericks with brew), with sklearn 0.14.1 and scipy 0.13.3.

Fix handling of libsvm

Newer versions of libsvm, which are compatible only with Python 3, come with different interfaces depending on whether they are installed directly from upstream (or via the Debian/Ubuntu package) or from PyPI. For now we only support the API of the PyPI package, but it would be wise to be able to support the other API too. Adding support is just a matter of a couple of if/else statements (see the sketch below).
Reference: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=958114
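A rough sketch of what the if/else could look like; the exact module layouts assumed here (PyPI exposing libsvm.svmutil, upstream/distro exposing a top-level svmutil) are assumptions to be checked against the Debian bug report:

# sketch only: detect which libsvm Python API is available
try:
    from libsvm import svm, svmutil        # PyPI package layout (assumption)
    LIBSVM_API = 'pypi'
except ImportError:
    try:
        import svm, svmutil                 # upstream / distro layout (assumption)
        LIBSVM_API = 'upstream'
    except ImportError:
        svm = svmutil = None
        LIBSVM_API = None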

Some Travis runs take very long

In recent Travis run https://travis-ci.org/mdp-toolkit/mdp-toolkit/builds/553703911 I observed that the first two specific configurations took about 14-15 minutes to complete while all other runs took only about 2-5 minutes.
The long-running jobs were Python 3.5 + NumPy 1.8.2 + SciKits 0.18. However, a later run with Python 3.5 + NumPy 1.12.1 + SciKits 0.18 was okay, so it seems to be caused by NumPy 1.8.2.
It looks like the tests were hanging most of the time at test_SFANode.
It is difficult to see whether the log contains more useful info because it is totally spammed with the deprecation warning mentioned in #49, so we had better fix that one first.
Also, given that the problematic configuration is somewhat odd, this doesn't have high priority. I'd like to track it though, as it might point to an issue in test_SFANode.

Drop Python 2 compatibility

As Python 2 is no longer developed, we should drop compatibility with it. This can happen first by just removing the dependency on the future package and changing the code in MDP that was using future to be compatible with Python 3 only. In the long run it would be good to get rid of old syntax and start using Python 3 features more aggressively.

pytest 5

Upgrading to pytest 5 yields a couple of breakages. This is a non-goal for the upcoming release, but I open this issue to start tracking and hopefully fixing the breakages.

For details see the Travis logs and #65

Redundant processing of _get_train_seq()

Hi Everyone,

I have a question regarding _train_seq in the Node class.

_train_seq = property(lambda self: self._get_train_seq(), doc="""... """)

The _train_seq property, whenever accessed during training/stop_training, will always call self._get_train_seq(). For a FlowNode or other nodes with an expensive self._get_train_seq(), isn't it overkill to call this method every time _train_seq is accessed?

Please correct me if I am wrong; the only advantage I see in doing this via a property is that one can add a docstring. But maybe this can be done using a private variable that is initialized only once in the Node's constructor, say self.__train_seq = self._get_train_seq(), and then defining the property as

_train_seq = property(lambda self: self.__train_seq, doc="""... """)
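A hedged sketch of the caching idea (not MDP's actual code; one thing to double-check is whether any node, e.g. FlowNode, relies on _get_train_seq() being re-evaluated as its state changes):

class Node(object):
    def _get_train_seq(self):
        # placeholder for the potentially expensive computation
        return [(self._train, self._stop_training)]

    @property
    def _train_seq(self):
        """Cached training sequence, computed on first access."""
        if getattr(self, '_cached_train_seq', None) is None:
            self._cached_train_seq = self._get_train_seq()
        return self._cached_train_seq

    def _train(self, x):
        pass

    def _stop_training(self):
        pass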

Letting Travis CI test on Python 3 as well

Following up on @AlbertoEsc's latest fix, we should let Travis test on Python 3 as well, given that all tests now pass on Python 3. Looking at .travis.yml I notice that mdp uses a hack to let Travis access numpy and scipy from neurodebian. I'm not familiar with this procedure; usually I would have something like this in .travis.yml:

python:
  - "2.7"
  - "3.3"
  - "3.4"
  - "3.5.2"
  - "3.5.3"
  - "3.6"

Does someone know how to achieve Python 3 tests based on the current hack (i.e. pretending to be erlang)? @otizonaizit?
I confirmed that with the current config, Travis really does only run tests on Python 2.7.

incorrect (obsolete) del statements -- mdp is not friendly to 'reload()'

just got a surprise while testing pymvpa locally:

  File "/home/yoh/proj/pymvpa/pymvpa/debian/tmp/usr/lib/python2.6/dist-packages/mvpa2/base/externals.py", line 99, in __assign_mdp_version
    import mdp
  File "/usr/lib/python2.6/dist-packages/mdp/__init__.py", line 165, in <module>
    import nodes
  File "/usr/lib/python2.6/dist-packages/mdp/nodes/__init__.py", line 55, in <module>
    del convolution_nodes
NameError: name 'convolution_nodes' is not defined

and looking at the code saw:

if numx_description == 'scipy':
    from convolution_nodes import Convolution2DNode
    __all__ += ['Convolution2DNode']
    del convolution_nodes

My wild guess is that it used to import convolution_nodes as a whole and the del was supposed to clean up the namespace.

Drop pp and shogun nodes

  • pp, aka Parallel Python, is a library for parallelization of Python code that can be used in mdp.parallel as an optional backend. From the website and the downloads it seems that some version of that library supports Python 3. Unfortunately, the version available on PyPI is Python 2 only. It is available on Debian/Ubuntu in the package python-pp, but this package is going to be dropped soon, because Debian and Ubuntu are removing all Python 2 packages. There is no Python 3 compatible version on Debian/Ubuntu. It seems the only way to have pp on Python 3 is to download it manually from the project webpage, but the Python 3 version is not even up to date with the last Python 2 pp version, and it dates back to 2014.
    I think we should drop it and remove the corresponding compatibility code.

  • shogun is a library for machine learning and SVMs. It does not seem to be available anymore on PyPI, nor on Debian/Ubuntu. It seems that shogun can be installed via conda-forge or by adding a PPA on Ubuntu. Either we find a way to integrate this in Travis, or we should drop the nodes using shogun.

still leaves /tmp/pp4mdp-monkeypatch* junk behind

Using the mdp Debian package 3.2+git78-g7db3c50-3. While running PyMVPA unit tests I see those files generated. The easiest (but bulky) way to reproduce is to invoke "python -c 'from mvpa2.suite import *'" while having MDP available.

Drop the obsolete numx backend

Context: #58

The numx backend can be dropped completely, and we can have scipy as a hard dependency.
After release 3.6 this is low-hanging fruit.

Disable Coveralls

IMO Coveralls does not provide any value to MDP in its current maintenance mode, so I vote for disabling it. No test engineering is planned for MDP (or could be afforded by the current maintainers), so the slight variations in coverage due to routine maintenance cause nothing but noise and pollute the test results.
I will wait one week for replies and otherwise go ahead and disable it (given that re-enabling it is trivial).

Also see my comment #96 (comment)

test_ParallelNearestMeanClassifier fails with python 3.6

Found when adding Python 3.6 support in Debian/Ubuntu (https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=867515).

This passes:

$ python3.5 -m pytest --seed=1029384756 mdp/test/test_parallelclassifiers.py  -k test_ParallelNearestMeanClassifier

This fails (consistently):

$ python3.6 -m pytest --seed=1029384756 mdp/test/test_parallelclassifiers.py  -k test_ParallelNearestMeanClassifier 
============================= test session starts ==============================
platform linux -- Python 3.6.2rc2, pytest-3.0.6, py-1.4.34, pluggy-0.4.0
          python: 3.6.2.candidate.2
             mdp: 3.5
 parallel python: NOT AVAILABLE: No module named 'pp'
          shogun: NOT AVAILABLE: No module named 'shogun'
          libsvm: NOT AVAILABLE: No module named 'svm'
          joblib: 0.10.4.dev0
         sklearn: 0.18
            numx: scipy 0.18.1
          symeig: scipy.linalg.eigh
Random Seed: 1029384756

rootdir: /mdp-3.5, inifile: pytest.ini
collected 3 items 

mdp/test/test_parallelclassifiers.py F

===================================== NOTE =====================================
          python: 3.6.2.candidate.2
             mdp: 3.5
 parallel python: NOT AVAILABLE: No module named 'pp'
          shogun: NOT AVAILABLE: No module named 'shogun'
          libsvm: NOT AVAILABLE: No module named 'svm'
          joblib: 0.10.4.dev0
         sklearn: 0.18
            numx: scipy 0.18.1
          symeig: scipy.linalg.eigh
Random Seed: 1029384756

IMPORTANT: some tests use random numbers. This could
occasionally lead to failures due to numerical degeneracies.
To rule this out, please run the tests more than once.
If you get reproducible failures please report a bug!

=================================== FAILURES ===================================
______________________ test_ParallelNearestMeanClassifier ______________________

    def test_ParallelNearestMeanClassifier():
        """Test ParallelGaussianClassifier."""
        precision = 6
        xs = [numx_rand.random([4,5]) for _ in range(8)]
        labels = [1,2,1,1,2,3,2,3]
        node = mdp.nodes.NearestMeanClassifier()
        pnode = parallel.ParallelNearestMeanClassifier()
        for i, x in enumerate(xs):
            node.train(x, labels[i])
        node.stop_training()
        pnode1 = pnode.fork()
        pnode2 = pnode.fork()
        for i, x in enumerate(xs):
            if i % 2:
                pnode1.train(x, labels[i])
            else:
                pnode2.train(x, labels[i])
        pnode.join(pnode1)
        pnode.join(pnode2)
        pnode.stop_training()
        # check that results are the same for all object classes
        assert_array_almost_equal(node.ordered_means, pnode.ordered_means,
>                                 precision)
E       AssertionError: 
E       Arrays are not almost equal to 6 decimals
E       
E       (mismatch 66.66666666666666%)
E        x: array([[ 0.399052,  0.586642,  0.446319,  0.522502,  0.474398],
E              [ 0.43769 ,  0.385436,  0.323763,  0.399375,  0.474671],
E              [ 0.569127,  0.311976,  0.673377,  0.653891,  0.391286]])
E        y: array([[ 0.43769 ,  0.385436,  0.323763,  0.399375,  0.474671],
E              [ 0.399052,  0.586642,  0.446319,  0.522502,  0.474398],
E              [ 0.569127,  0.311976,  0.673377,  0.653891,  0.391286]])

mdp/test/test_parallelclassifiers.py:58: AssertionError
============================== 2 tests deselected ==============================
==================== 1 failed, 2 deselected in 0.06 seconds ====================

IncSFA test fails sometimes

The test failure seems to be random, and this might be a convergence problem. Some dirty ways of fixing this could be reducing the precision or running the test multiple times.

This is the line where the test fails, and this is an instance of the test failing.

___________________________ test_inverse[IncSFANode] ___________________________
klass = <class 'mdp.nodes.IncSFANode'>, init_args = ()
inp_arg_gen = <function CCIPCANode_inp_arg_gen at 0x7fac27746598>
sup_arg_gen = None, execute_arg_gen = None
    @only_if_node(lambda nodetype: nodetype.is_invertible())
    def test_inverse(klass, init_args, inp_arg_gen,
                     sup_arg_gen, execute_arg_gen):
        args = call_init_args(init_args)
        inp = inp_arg_gen()
        # take the first available dtype for the test
        dtype = klass(*args).get_supported_dtypes()[0]
        args = call_init_args(init_args)
        node = klass(dtype=dtype, *args)
        _train_if_necessary(inp, node, sup_arg_gen)
        extra = [execute_arg_gen(inp)] if execute_arg_gen else []
        out = node.execute(inp, *extra)
        # compute the inverse
        rec = node.inverse(out)
        # cast inp for comparison!
        inp = inp.astype(dtype)
>       assert_array_almost_equal_diff(rec, inp, decimal-3)
mdp/test/test_nodes_generic.py:327: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
x = array([[ 0.5873536 , -0.8092985 ],
       [ 0.5861807 , -0.8076793 ],
       [ 0.5850065 , -0.80609053],
       ...,
       [-0.16121851, -0.11700002],
       [-0.16154128, -0.11724813],
       [-0.16186468, -0.11748097]], dtype=float32)
y = array([[ 0.5873562 , -0.80932856],
       [ 0.5861803 , -0.80770826],
       [ 0.58500445, -0.806088  ],
       ...,
       [-0.1612176 , -0.11700089],
       [-0.16154166, -0.11723606],
       [-0.16186571, -0.11747125]], dtype=float32)
digits = 4, err_msg = ''
    def assert_array_almost_equal_diff(x,y,digits,err_msg=''):
        x,y = numx.asarray(x), numx.asarray(y)
        msg = '\nArrays are not almost equal'
        assert 0 in [len(numx.shape(x)),len(numx.shape(y))] \
               or (len(numx.shape(x))==len(numx.shape(y)) and \
                   numx.alltrue(numx.equal(numx.shape(x),numx.shape(y)))),\
                   msg + ' (shapes %s, %s mismatch):\n\t' \
                   % (numx.shape(x),numx.shape(y)) + err_msg
        maxdiff = old_div(max(numx.ravel(abs(x-y))),\
                  max(max(abs(numx.ravel(x))),max(abs(numx.ravel(y)))))
        if numx.iscomplexobj(x) or numx.iscomplexobj(y): maxdiff = old_div(maxdiff,2)
        cond =  maxdiff< 10**(-digits)
        msg = msg+'\n\t Relative maximum difference: %e'%(maxdiff)+'\n\t'+\
              'Array1: '+str(x)+'\n\t'+\
              'Array2: '+str(y)+'\n\t'+\
              'Absolute Difference: '+str(abs(y-x))
>       assert cond, msg
E       AssertionError: 
E       Arrays are not almost equal
E       	 Relative maximum difference: 1.104705e-04
E       	Array1: [[ 0.5873536  -0.8092985 ]
E        [ 0.5861807  -0.8076793 ]
E        [ 0.5850065  -0.80609053]
E        ...
E        [-0.16121851 -0.11700002]
E        [-0.16154128 -0.11724813]
E        [-0.16186468 -0.11748097]]
E       	Array2: [[ 0.5873562  -0.80932856]
E        [ 0.5861803  -0.80770826]
E        [ 0.58500445 -0.806088  ]
E        ...
E        [-0.1612176  -0.11700089]
E        [-0.16154166 -0.11723606]
E        [-0.16186571 -0.11747125]]
E       	Absolute Difference: [[2.6226044e-06 3.0040741e-05]
E        [3.5762787e-07 2.8967857e-05]
E        [2.0265579e-06 2.5629997e-06]
E        ...
E        [9.0897083e-07 8.6426735e-07]
E        [3.7252903e-07 1.2062490e-05]
E        [1.0281801e-06 9.7230077e-06]]

Scipy 1.4 support

Running with Scipy 1.4 breaks several tests.

The output is hard to read because the following deprecation warnings produce a lot of noise:

DeprecationWarning: scipy.dot is deprecated and will be removed in SciPy 2.0.0, use numpy.dot instead

and in similar fashion for exp, ones, diag, diagonal, eye, sqrt, outer, sign, sin, cos, zeros, linspace, where, array, concatenate, all, maybe more.

I'd suggest fixing these first so we get rid of this noise. I guess this "just" requires some change to the numx setup. We will have to detect SciPy 1.4+ (a sketch follows).
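Detecting the SciPy version could be as simple as the following sketch (how and where the numx setup would actually use this is left open):

import scipy

# parse only the major/minor components of the version string
_major, _minor = scipy.__version__.split('.')[:2]
SCIPY_GE_1_4 = (int(_major), int(_minor)) >= (1, 4)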

Failing tests are in
mdp/test/test_ICANode.py
mdp/test/test_ISFANode.py
mdp/test/test_nodes_generic.py

How do I define the number of sources for FastICA decomposition in the MDP.nodes.FastICANode module?

I'm trying to set the number of sources for the FastICA decomposition algorithm in the MDP module.
In the example below, I have three different sources that I mix with a matrix to get 4 observations. When I apply the FastICA algorithm, it finds 4 sources because, by default, the number of sources is taken to be equal to the number of observations.

import matplotlib.pyplot as plt
import numpy as np
import scipy as sp
import scipy.signal  # needed for sp.signal.sawtooth and sp.signal.square below
from mdp.nodes import FastICANode

# time
Fs=1024
N=10*Fs
t = np.arange(N)/Fs

# sources
s1=np.sin(2*np.pi*0.5*t)
s2=(np.random.rand(N)-0.5)*0.02
s3=sp.signal.sawtooth(2 * np.pi * 2 * t, width=0.5)*0.4
s4=sp.signal.square(2 * np.pi * 5 * t, duty=0.05)

# Mixing matrix
W =np.random.dirichlet(np.ones((4)),size=(4,3))
# compute observations
x = np.dot(W[:,:,1],np.vstack((s1 , s2 , s4)))

#Apply ICA
ica = FastICANode(g='tanh') 
ica.execute(x.T)
Ae = ica.get_recmatrix().T
S = np.dot(np.linalg.pinv(Ae),x)

#display results
plt.figure()
plt.subplot(4,3,1)
plt.title('True Sources ')
plt.plot(t,s1)
plt.subplot(4,3,4)
plt.plot(t,s2)
plt.subplot(4,3,7)
# plt.plot(t,s3)
# plt.subplot(4,3,10)
plt.plot(t,s4)

plt.subplot(4,3,2)
plt.title('Observations ')
plt.plot(t,x[0,:])
plt.subplot(4,3,5)
plt.plot(t,x[1,:])
plt.subplot(4,3,8)
plt.plot(t,x[2,:])
plt.subplot(4,3,11)
plt.plot(t,x[3,:])

plt.subplot(4,3,3)
plt.title('ICA results')
plt.plot(t,S[0,:])
plt.subplot(4,3,6)
plt.plot(t,S[1,:])
plt.subplot(4,3,9)
plt.plot(t,S[2,:])
plt.subplot(4,3,12)
plt.plot(t,S[3,:])
plt.show()

So at the end I get this:
[figure: true sources, observations, and ICA results]

If I try to set the number of sources with ica.set_output_dim(3):

#Apply ICA
ica = FastICANode(g='tanh')
ica.set_output_dim(3)
ica.execute(x.T) 

I get an error saying:

mdp.signal_node.InconsistentDimException: Output dim are set already (3) (4 given)!

How can I say to the FastICANode class how many sources it should find?
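One thing that might be worth trying (untested here, and not an authoritative answer): the ICA nodes accept a white_comp argument that reduces the data to that many components during the internal whitening step, which would also limit the number of estimated sources. A sketch with placeholder data:

import numpy as np
from mdp.nodes import FastICANode

# placeholder data: 3 sources mixed into 4 observed channels
t = np.arange(0, 10, 1.0 / 1024)
s = np.vstack([np.sin(2 * np.pi * 0.5 * t),
               np.random.rand(t.size) - 0.5,
               np.sign(np.sin(2 * np.pi * 5 * t))])
x = (np.random.rand(4, 3) @ s).T           # observations, shape (n_samples, 4)

ica = FastICANode(g='tanh', white_comp=3)  # keep 3 whitened components
ica.train(x)
ica.stop_training()
sources = ica.execute(x)                   # shape (n_samples, 3)
print(sources.shape)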

DeprecationWarning: inspect.getargspec() is deprecated

Recent Travis logs are full of the following warning (though passing):

/home/travis/build/mdp-toolkit/mdp-toolkit/mdp/signal_node.py:165:
DeprecationWarning: inspect.getargspec() is deprecated,
use inspect.signature() or inspect.getfullargspec()

This doesn't look too difficult to fix, so we should get it done to make the logs cleaner and thus more useful.
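The fix suggested by the warning itself, as a tiny sketch (getfullargspec keeps the .args list that getargspec-based code presumably relies on):

import inspect

def positional_arg_names(func):
    # inspect.getargspec(func).args  ->  inspect.getfullargspec(func).args
    return inspect.getfullargspec(func).args

def example(a, b, c=1):
    pass

print(positional_arg_names(example))  # ['a', 'b', 'c']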

Error when using mdp.nodes.LogisticRegressionScikitsLearnNode

I am trying to use the scikits logistic regression node through MDP, but keep getting an error when attempting to classify new data. I can use scikits directly and things seem to work just fine. I'd rather create my solution in MDP so I can easily swap other classifiers in for Logistic Regression in later phases of this project.

A sample script plus error message is included below. Please advise,

Thanks,

Amar

Code:

import numpy as np
from numpy.random import normal
import mdp

nPts = 10000

# create data class A as a normal distribution with mean -1, std = 2

dataA = np.empty((nPts, 3))
dataA[:, 0] = normal(-1, 2, nPts)
dataA[:, 1] = normal(-1, 2, nPts)
dataA[:, 2] = normal(-1, 2, nPts)

# create data class B as a normal distribution with mean 1, std = 2

dataB = np.empty((nPts, 3))
dataB[:, 0] = normal(1,2,nPts)
dataB[:, 1] = normal(1,2,nPts)
dataB[:, 2] = normal(1,2,nPts)

# combine classA and classB data

dataCombined = np.vstack((dataA, dataB))

# create an integer array labeling classA as ones, classB as zeros

groups = np.hstack((np.ones(dataA.shape[0]), np.zeros(dataB.shape[0])))
groups = (groups > 0.5).astype(np.int)

# create a LogisticRegression object

LR = mdp.nodes.LogisticRegressionScikitsLearnNode(penalty = 'l2', dual = False, C = 1e-6)

# train the LR object

LR.train(dataCombined, groups)

# reclassify the training set

LR.execute(dataCombined)


Error message (occurs after the execute command):

Traceback (most recent call last):
File "", line 1, in
File "", line 1, in
File "/usr/local/lib/python2.7/site-packages/mdp/signal_node.py", line 643, in execute
self._pre_execution_checks(x)
File "/usr/local/lib/python2.7/site-packages/mdp/signal_node.py", line 518, in _pre_execution_checks
self._if_training_stop_training()
File "/usr/local/lib/python2.7/site-packages/mdp/signal_node.py", line 497, in _if_training_stop_training
self.stop_training()
File "", line 1, in
File "/usr/local/lib/python2.7/site-packages/mdp/signal_node.py", line 624, in stop_training
self._train_seq[self._train_phase][1](*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/mdp/nodes/scikits_nodes.py", line 289, in _stop_training
super(ScikitsNode, self).__init__(input_dim=input_dim,
TypeError: fit() takes at least 3 arguments (2 given)

Cannot import latest mdp using python 3

The latest version of mdp is imported properly when using python 2, but when I try to import mdp using python 3 I get the following exception:

$ python3 -c "import mdp"
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/escalafl/MDP_Fork_AlbertoEsc/mdp-toolkit/mdp/__init__.py", line 131, in <module>
from . import utils
File "/home/escalafl/MDP_Fork_AlbertoEsc/mdp-toolkit/mdp/utils/__init__.py", line 36, in <module>
from .symeig_semidefinite import (symeig_semidefinite_reg,
File "/home/escalafl/MDP_Fork_AlbertoEsc/mdp-toolkit/mdp/utils/symeig_semidefinite.py", line 58, in <module>
from _symeig import SymeigException
ImportError: No module named '_symeig'

Apparently this is related to (the absence of) relative imports in python3 and the easiest fix is to change "from _symeig import SymeigException" to "from ._symeig import SymeigException" (an extra period) in symeig_semidefinite.py

Several tests fail under Python 3

As some of you have already noticed, several tests fail under python3. When running python3 setup.py test I get 59 failed tests with python 3.4.3: ====== 59 failed, 730 passed, 17 skipped, 490 warnings in 120.83 seconds =======

The failing tests involve mostly the new online nodes/flows, and they originate mostly from using xrange. Also about 3 tests fail due to using float division instead of integer division. There is a pending pull request involving online nodes, but afaik it does not solve this and is itself also not fully python3 compatible.

If you want I can send a small pull request that fixes the current incompatibilities. Or, is there any plan on how to proceed with the failing tests?
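For reference, the two Python 3 changes the failures boil down to (a generic sketch, not the actual patch):

# xrange is gone in Python 3; range is the (lazy) equivalent
for i in range(5):
    pass

# '/' is true division in Python 3; use '//' where an integer result is intended
assert 7 // 2 == 3
assert 7 / 2 == 3.5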

bimdp autogen breaks with sklearn 0.14.1

It is likely caused by a docstring in sklearn containing a unicode character.

Traceback (most recent call last):
 File "/Users/vagrant/src/buildsystem-git/recipes/entest_utils.py", line 268, in test1_imports
   exec 'import ' + module
 File "<string>", line 1, in <module>
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/__init__.py", line 91, in <module>
   from inspection import *
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/inspection/__init__.py", line 5, in <module>
   from tracer import (
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/inspection/tracer.py", line 32, in <module>
   from bimdp.hinet import BiFlowNode, CloneBiLayer
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/hinet/__init__.py", line 5, in <module>
   from bihtmlvisitor import BiHiNetHTMLVisitor, show_biflow
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/hinet/bihtmlvisitor.py", line 12, in <module>
   from bimdp.nodes import SenderBiNode
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/nodes/__init__.py", line 3, in <module>
   exec binodes_code()
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/nodes/autogen.py", line 111, in binodes_code
   _binode_module(fid, nodes)
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/nodes/autogen.py", line 102, in _binode_module
   old_classname=old_classname)
 File "/Users/vagrant/src/master-env/lib/python2.7/site-packages/bimdp/nodes/autogen.py", line 70, in _binode_code
   fid.write('\n        """%s"""' % docstring)
UnicodeEncodeError: 'ascii' codec can't encode character u'\u2013' in position 62: ordinal not in range(128)

mdp.test() completes but Apple reports python crashed

I'm on Mac OS X 10.7.5, on ipython 0.13.1, python 2.7.2

I run import mdp; mdp.test() successfully, but the Apple crash reporter claims that Python crashed. My IPython shell is intact and fine. Interpreter output and stack trace are attached.

In [108]: mdp.test()
============================================================================= test session starts =============================================================================
platform darwin -- Python 2.7.2 -- pytest-2.1.2
          python: 2.7.2.final.0
             mdp: 3.3
 parallel python: NOT AVAILABLE: No module named pp
          shogun: NOT AVAILABLE: No module named shogun
          libsvm: NOT AVAILABLE: No module named svm
          joblib: NOT AVAILABLE: No module named joblib
         sklearn: NOT AVAILABLE: No module named scikits.learn
            numx: scipy 0.10.0
          symeig: scipy.linalg.eigh
Random Seed: 839992836

collected 581 items 

../../../../../../Library/Python/2.7/lib/python/site-packages/mdp/test/test_AdaptiveCutoffNode.py ..
../../../../../../Library/Python/2.7/lib/python/site-packages/mdp/test/test_Convolution2DNode.py 

Stack trace

Process:         Python [13206]
Path:            /Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python
Identifier:      Python
Version:         ??? (???)
Code Type:       X86-64 (Native)
Parent Process:  Python [12070]

Date/Time:       2013-05-05 08:55:09.190 -0400
OS Version:      Mac OS X 10.7.5 (11G63)
Report Version:  9

Interval Since Last Report:          9358816 sec
Crashes Since Last Report:           84
Per-App Crashes Since Last Report:   3
Anonymous UUID:                      88A24C38-8362-40CA-8FF3-BEEB12570B19

Crashed Thread:  0  Dispatch queue: com.apple.main-thread

Exception Type:  EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x000000000102c919

VM Regions Near 0x102c919:
--> 
    __TEXT                 0000000100000000-0000000100001000 [    4K] r-x/rwx SM=COW  /Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python

Application Specific Information:
objc[13206]: garbage collection is OFF

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0   org.python.python               0x00000001000ff520 collect + 304
1   org.python.python               0x000000010010018d _PyObject_GC_Malloc + 269
2   org.python.python               0x0000000100100202 _PyObject_GC_NewVar + 50
3   org.python.python               0x0000000100070014 PyTuple_New + 324
4   org.python.python               0x0000000100052aaa dict_items + 74
5   org.python.python               0x00000001000c118a PyEval_EvalFrameEx + 22746
6   org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
7   org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
8   org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
9   org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
10  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
11  org.python.python               0x000000010003da80 function_call + 176
12  org.python.python               0x000000010000c5e2 PyObject_Call + 98
13  org.python.python               0x0000000100010f45 PyObject_CallFunction + 197
14  org.python.python               0x000000010005a67f _PyObject_GenericGetAttrWithDict + 383
15  org.python.python               0x00000001000bd7cf PyEval_EvalFrameEx + 7967
16  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
17  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
18  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
19  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
20  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
21  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
22  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
23  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
24  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
25  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
26  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
27  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
28  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
29  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
30  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
31  org.python.python               0x000000010003da80 function_call + 176
32  org.python.python               0x000000010000c5e2 PyObject_Call + 98
33  org.python.python               0x00000001000bd53c PyEval_EvalFrameEx + 7308
34  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
35  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
36  org.python.python               0x000000010003da80 function_call + 176
37  org.python.python               0x000000010000c5e2 PyObject_Call + 98
38  org.python.python               0x00000001000bd53c PyEval_EvalFrameEx + 7308
39  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
40  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
41  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
42  org.python.python               0x000000010003da80 function_call + 176
43  org.python.python               0x000000010000c5e2 PyObject_Call + 98
44  org.python.python               0x00000001000bd53c PyEval_EvalFrameEx + 7308
45  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
46  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
47  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
48  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
49  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
50  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
51  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
52  org.python.python               0x000000010003da80 function_call + 176
53  org.python.python               0x000000010000c5e2 PyObject_Call + 98
54  org.python.python               0x00000001000bd53c PyEval_EvalFrameEx + 7308
55  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
56  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
57  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
58  org.python.python               0x000000010003da80 function_call + 176
59  org.python.python               0x000000010000c5e2 PyObject_Call + 98
60  org.python.python               0x000000010001ebcb instancemethod_call + 363
61  org.python.python               0x000000010000c5e2 PyObject_Call + 98
62  org.python.python               0x000000010001df45 instance_call + 101
63  org.python.python               0x000000010000c5e2 PyObject_Call + 98
64  org.python.python               0x00000001000be5f3 PyEval_EvalFrameEx + 11587
65  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
66  org.python.python               0x000000010003da80 function_call + 176
67  org.python.python               0x000000010000c5e2 PyObject_Call + 98
68  org.python.python               0x00000001000bd53c PyEval_EvalFrameEx + 7308
69  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
70  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
71  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
72  org.python.python               0x000000010003da80 function_call + 176
73  org.python.python               0x000000010000c5e2 PyObject_Call + 98
74  org.python.python               0x000000010001ebcb instancemethod_call + 363
75  org.python.python               0x000000010000c5e2 PyObject_Call + 98
76  org.python.python               0x000000010001df45 instance_call + 101
77  org.python.python               0x000000010000c5e2 PyObject_Call + 98
78  org.python.python               0x00000001000be5f3 PyEval_EvalFrameEx + 11587
79  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
80  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
81  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
82  org.python.python               0x000000010003da80 function_call + 176
83  org.python.python               0x000000010000c5e2 PyObject_Call + 98
84  org.python.python               0x00000001000bd53c PyEval_EvalFrameEx + 7308
85  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
86  org.python.python               0x00000001000c1ebe PyEval_EvalFrameEx + 26126
87  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
88  org.python.python               0x000000010003da80 function_call + 176
89  org.python.python               0x000000010000c5e2 PyObject_Call + 98
90  org.python.python               0x000000010001ebcb instancemethod_call + 363
91  org.python.python               0x000000010000c5e2 PyObject_Call + 98
92  org.python.python               0x000000010001df45 instance_call + 101
93  org.python.python               0x000000010000c5e2 PyObject_Call + 98
94  org.python.python               0x00000001000be5f3 PyEval_EvalFrameEx + 11587
95  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
96  org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
97  org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
98  org.python.python               0x00000001000c2e46 PyEval_EvalCode + 54
99  org.python.python               0x00000001000e8417 PyRun_StringFlags + 279
100 org.python.python               0x00000001000c13bc PyEval_EvalFrameEx + 23308
101 org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
102 org.python.python               0x00000001000c0b6a PyEval_EvalFrameEx + 21178
103 org.python.python               0x00000001000c2d29 PyEval_EvalCodeEx + 2137
104 org.python.python               0x00000001000c2e46 PyEval_EvalCode + 54
105 org.python.python               0x00000001000e7b6e PyRun_FileExFlags + 174
106 org.python.python               0x00000001000e7e29 PyRun_SimpleFileExFlags + 489
107 org.python.python               0x00000001000fe77c Py_Main + 2940
108 org.python.python               0x0000000100000f14 0x100000000 + 3860

Thread 0 crashed with X86 Thread State (64-bit):
  rax: 0x0000000000000000  rbx: 0x0000000102df0228  rcx: 0x0000000102dd1f30  rdx: 0x000000000102c8f9
  rdi: 0x0000000100189e50  rsi: 0x0000000100189e20  rbp: 0x00007fff5fbf6d10  rsp: 0x00007fff5fbf6c40
   r8: 0x0000000100189e40   r9: 0x0000000000000001  r10: 0x0000000000000030  r11: 0x0000000100189e50
  r12: 0x0000000000000002  r13: 0x0000000100189e50  r14: 0x0000000000000001  r15: 0x0000000100189e20
  rip: 0x00000001000ff520  rfl: 0x0000000000010212  cr2: 0x000000000102c919
Logical CPU: 1

Binary Images:
       0x100000000 -        0x100000fff +org.python.python (2.7.2 - 2.7.2) <639E72E4-F205-C034-8E34-E59DE9C46369> /Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python
       0x100003000 -        0x10016cfef +org.python.python (2.7.2, [c] 2004-2011 Python Software Foundation. - 2.7.2) <49D18B1A-C92D-E32E-A7C1-086D0B14BD76> /Library/Frameworks/Python.framework/Versions/2.7/Python
       0x1002eb000 -        0x1002effff +_struct.so (??? - ???) <8FFF4DE5-5CF1-5DFD-ADE0-5AAD633838BF> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_struct.so
       0x1002f6000 -        0x1002f9fef +binascii.so (??? - ???) <89F159C8-99E7-6BB0-4829-3C441378FF07> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/binascii.so
       0x1002fd000 -        0x1002fdfff +_bisect.so (??? - ???) <0872429A-9467-F3F6-BA8E-CCF6C9AD4FB5> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_bisect.so
       0x100549000 -        0x10054cff7 +zlib.so (??? - ???) <BD96CFDC-B0B5-E21A-75BB-C3E8D035AADD> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/zlib.so
       0x100551000 -        0x100560ff7 +cPickle.so (??? - ???) <190F516F-97F6-7258-29D3-F8B23CF04C50> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/cPickle.so
       0x100568000 -        0x100569fff +cStringIO.so (??? - ???) <364F2486-CB7C-0DA0-C419-F3BCE469B261> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/cStringIO.so
       0x10072f000 -        0x100732ff7 +strop.so (??? - ???) <F7857283-F427-7CF7-9B0D-7619AA0A82F1> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/strop.so
       0x100737000 -        0x10073bff7 +operator.so (??? - ???) <CAD2CA2D-4216-507D-BD3E-3CE5FD1AC228> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/operator.so
       0x100742000 -        0x100746fff +_collections.so (??? - ???) <101CE794-99F9-CF85-B3DF-B2ACA887CEDB> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_collections.so
       0x10074c000 -        0x100753ff7 +itertools.so (??? - ???) <87448276-955A-B859-0CCB-35A9834894A7> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/itertools.so
       0x10075e000 -        0x10075fff7 +_heapq.so (??? - ???) <22FEED03-4F6C-2499-DC3B-952791344ABD> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_heapq.so
       0x1007a3000 -        0x1007a4ff7 +_functools.so (??? - ???) <755EA750-0AF2-4887-4C25-703D7366BB37> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_functools.so
       0x1007a7000 -        0x1007a9fff +_locale.so (??? - ???) <7F398010-BCF7-FF58-065C-85B45DF4FC03> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_locale.so
       0x1007ed000 -        0x1007f2fef +math.so (??? - ???) <BB711560-6A83-84BF-316C-9337CE845D0D> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/math.so
       0x1007f9000 -        0x1007fafff +_hashlib.so (??? - ???) <609B0E50-C9BD-9B88-9177-FFBAFED033ED> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_hashlib.so
       0x101200000 -        0x101201fff +_random.so (??? - ???) <54585B5D-1999-A6CC-9CEB-34F759F0438F> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_random.so
       0x101204000 -        0x101205ff7 +fcntl.so (??? - ???) <A147C2B8-5A06-D0C5-C60F-6BC316D73FAA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/fcntl.so
       0x101208000 -        0x10121dfff +_io.so (??? - ???) <7A9EA5F2-75D9-F13A-7FDB-AD6841880B07> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_io.so
       0x101234000 -        0x101236fff +time.so (??? - ???) <77136671-9973-6EFB-9A2D-127664836672> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/time.so
       0x10123b000 -        0x101240fff +array.so (??? - ???) <88248C2B-CFF2-71B4-2511-09B994E36CE9> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/array.so
       0x1012c7000 -        0x1012d5fff +datetime.so (??? - ???) <710107D4-44E6-D559-D71B-2D59CCC8966E> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/datetime.so
       0x1012e1000 -        0x101315fe7 +pyexpat.so (??? - ???) <E5FD4237-8D59-8B8E-E229-19601A03F18E> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/pyexpat.so
       0x101327000 -        0x10132afff +select.so (??? - ???) <E30C1A76-F0AA-FA2B-9D48-975AFB586986> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/select.so
       0x101431000 -        0x101542fe7 +multiarray.so (??? - ???) <AEA3AB2F-29AA-3D0A-BF37-674A17A70B59> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/core/multiarray.so
       0x101620000 -        0x101669fff +umath.so (??? - ???) <FE915B9D-C351-3ECF-A587-9AAD7CDB0C80> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/core/umath.so
       0x101694000 -        0x101697ff7 +_dotblas.so (??? - ???) <E88D1400-B296-30E3-96E9-EBB5D01DCB7C> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/core/_dotblas.so
       0x1016db000 -        0x1016dffff +_compiled_base.so (??? - ???) <CDE4C31C-ED32-34C4-BC2F-6D9F58DC317A> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/lib/_compiled_base.so
       0x1016e3000 -        0x1016e6fff +lapack_lite.so (??? - ???) <FE047D10-117B-3C04-A979-E3DE595061F1> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/linalg/lapack_lite.so
       0x1016ea000 -        0x1016eafff +grp.so (??? - ???) <9133BC7F-65D3-5CFB-ACCC-8F1F9478FF2E> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/grp.so
       0x1016ed000 -        0x1016f6ff7 +fftpack_lite.so (??? - ???) <B043D610-6DB4-3AC2-8A5C-BB9CA671634F> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/fft/fftpack_lite.so
       0x1016fa000 -        0x1016fbff7 +_zeros.so (??? - ???) <B17953E8-18F9-361D-A05F-033ED2C6BE32> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/_zeros.so
       0x101800000 -        0x101822fff +scalarmath.so (??? - ???) <F6B171DE-C9E2-33B2-9625-9BCE5D67AF48> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/core/scalarmath.so
       0x101a34000 -        0x101a6ffff +mtrand.so (??? - ???) <D1C482F6-72D2-389D-8A3B-A93D3CAD59C3> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/random/mtrand.so
       0x101abc000 -        0x101ad2fff +_ctypes.so (??? - ???) <C5FE5EC4-F979-6AAA-C324-BD806FF6ECCC> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_ctypes.so
       0x101b1a000 -        0x101b41fff +fblas.so (??? - ???) <D8A26B9F-CC14-3FE9-8B7C-21F73B7A5DA8> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/fblas.so
       0x101b5b000 -        0x101c11faf +libgfortran.2.dylib (3.0.0 - compatibility 3.0.0) <45A8FC56-60A0-4678-B4AD-4F6231AEC7D7> /usr/local/lib/libgfortran.2.dylib
       0x101c58000 -        0x101c64fb1 +libgcc_s.1.dylib (??? - ???) <2F0056F0-9702-4965-B5AA-565A4E3443D3> /usr/local/lib/libgcc_s.1.dylib
       0x101c6e000 -        0x101c78ff7 +_flinalg.so (??? - ???) <BA66D4FC-F413-30A0-9097-BE62BB28216B> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/_flinalg.so
       0x101c7f000 -        0x101cd2fff +flapack.so (??? - ???) <3458FE96-63A1-36DA-85A6-2554AB2BF6F2> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/flapack.so
       0x101d08000 -        0x101d0bff7 +clapack.so (??? - ???) <32BAB3D2-1AA1-30D2-BD31-C3D580BADF92> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/clapack.so
       0x101d0f000 -        0x101d17ff7 +calc_lwork.so (??? - ???) <33A5A7B3-D08E-361F-9E34-E71FE1E94130> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/calc_lwork.so
       0x101d1d000 -        0x101d20fff +cblas.so (??? - ???) <B2E93037-999C-3A01-9DB0-0A0A02422224> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/cblas.so
       0x101d24000 -        0x101d56ff7 +_imaging.so (??? - ???) <6FD2652F-B0B1-3A3A-9247-8809839E4634> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/PIL/_imaging.so
       0x101d74000 -        0x101da8fe7 +libjpeg.8.dylib (12.0.0 - compatibility 12.0.0) <9EB31FBE-7E24-3AFB-BB61-08B6355C1833> /usr/local/lib/libjpeg.8.dylib
       0x101daf000 -        0x101decfff +_fftpack.so (??? - ???) <90ECA25F-308E-3C63-9154-BDDF5A240D61> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/fftpack/_fftpack.so
       0x101df9000 -        0x101dfafff +termios.so (??? - ???) <43FCCF9A-C66E-DE5B-0F7D-1028A41F9263> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/termios.so
       0x101f00000 -        0x101f13fff +convolve.so (??? - ???) <A5FB447F-7EE1-37E0-9273-50B5257483BB> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/fftpack/convolve.so
       0x102019000 -        0x102028fef +sigtools.so (??? - ???) <2D777B29-7932-35B3-8AD0-5686CE594BBA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/signal/sigtools.so
       0x10202f000 -        0x102036fff +spline.so (??? - ???) <5EAAD3FF-63D4-31D7-8818-A3B0EC2A59DC> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/signal/spline.so
       0x10203b000 -        0x10213efe7 +_cephes.so (??? - ???) <91C0813F-DF40-37EF-A00F-4801EE4E1E4B> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/special/_cephes.so
       0x102159000 -        0x10220cfff +specfun.so (??? - ???) <186CE5D0-04B3-3CB2-B47C-6820487D79CA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/special/specfun.so
       0x10221d000 -        0x102228fff +orthogonal_eval.so (??? - ???) <82B4A91E-D339-30CE-AEDB-E9852428E3B8> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/special/orthogonal_eval.so
       0x102231000 -        0x102234fff +lambertw.so (??? - ???) <56685985-C28B-3B52-A6C2-791503AC6AE9> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/special/lambertw.so
       0x10223c000 -        0x10223ffff +_logit.so (??? - ???) <828B4BE7-42A0-3ACB-ACAE-4CA19AF85D7A> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/special/_logit.so
       0x102286000 -        0x10228cff7 +minpack2.so (??? - ???) <F0B8F465-7773-3208-9BC7-8E0A8207B111> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/minpack2.so
       0x102291000 -        0x1022a9fff +_minpack.so (??? - ???) <486635CE-ECBF-3E42-9105-03924CFB730E> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/_minpack.so
       0x1022ad000 -        0x1022c6ff7 +_lbfgsb.so (??? - ???) <727CB19F-515E-33F7-A91E-0FA41C51DEDA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/_lbfgsb.so
       0x1022cb000 -        0x1022d1ff7 +moduleTNC.so (??? - ???) <63386B16-8B23-333D-91B0-482575306BBA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/moduleTNC.so
       0x1022d4000 -        0x1022eaff7 +_cobyla.so (??? - ???) <7E906E7F-9C1F-33F6-B8BD-E1863E48E96D> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/_cobyla.so
       0x10232f000 -        0x10240efff +_csr.so (??? - ???) <5C670D05-71F2-3299-9F3F-E0517B70FBB8> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/sparsetools/_csr.so
       0x102441000 -        0x1024caff7 +_csc.so (??? - ???) <9FF63781-08D9-3801-B3AF-B29B59F03077> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/sparsetools/_csc.so
       0x1024e8000 -        0x102513ff7 +_coo.so (??? - ???) <ACC4BFD5-33DE-3DA5-BD67-20CBE70C3802> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/sparsetools/_coo.so
       0x10251b000 -        0x102529fff +_dia.so (??? - ???) <1D123379-2829-33D1-9094-824F8F8E4EC3> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/sparsetools/_dia.so
       0x10252f000 -        0x10262bfff +_bsr.so (??? - ???) <1749D9C6-8463-3A0D-8836-38CDE00CD030> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/sparsetools/_bsr.so
       0x102672000 -        0x102676fff +_csgraph.so (??? - ???) <835CED40-45D6-3FDB-92DD-FB795AEB90EA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/sparsetools/_csgraph.so
       0x1026bc000 -        0x1026e7ff7 +_iterative.so (??? - ???) <43350F57-2960-332A-ABCD-BD2244F924A4> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/isolve/_iterative.so
       0x1026fb000 -        0x102744ff7 +_superlu.so (??? - ???) <B91C6329-CDFA-3DFE-AF46-22D37310AAF8> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/dsolve/_superlu.so
       0x10275d000 -        0x1027dffff +_arpack.so (??? - ???) <1D5DB7A6-1599-3198-B3B4-FBBE3B621749> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so
       0x102836000 -        0x102848fff +_slsqp.so (??? - ???) <76952C71-7384-376A-8F3F-0932BEA8D76B> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/_slsqp.so
       0x10284d000 -        0x102855fff +_nnls.so (??? - ???) <97AEC876-F74B-3AD8-9C27-2167F79880A7> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/optimize/_nnls.so
       0x102859000 -        0x102888ff7 +_fitpack.so (??? - ???) <BA6F40E2-C2A8-32A4-BB06-D5A970960BA8> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/interpolate/_fitpack.so
       0x10288c000 -        0x1028c5ff7 +dfitpack.so (??? - ???) <9DC9F5EF-2C08-3B18-A966-40252AC18564> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/interpolate/dfitpack.so
       0x1028cf000 -        0x1028ebfff +interpnd.so (??? - ???) <D1651460-2F12-3033-B49E-ECF315B319A5> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/interpolate/interpnd.so
       0x1028f8000 -        0x1028fbfff +_multiprocessing.so (??? - ???) <88934565-4DA9-F2F8-A0A4-66E3A8879267> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_multiprocessing.so
       0x102a00000 -        0x102a0efff +ckdtree.so (??? - ???) <16C0FBCC-A235-375A-A206-80766FCB545A> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/spatial/ckdtree.so
       0x102a17000 -        0x102a80ff7 +qhull.so (??? - ???) <6C6C7BAB-CC5C-3B09-9913-F173A47FE72C> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/spatial/qhull.so
       0x102adf000 -        0x102ae6fff +_distance_wrap.so (??? - ???) <39DF528F-C10A-35DE-94FB-23CD675C973E> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/spatial/_distance_wrap.so
       0x102aec000 -        0x102b24fff +_odepack.so (??? - ???) <23C47F2B-AE16-3FD2-B2EC-253D15E4EBBA> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/integrate/_odepack.so
       0x102b2a000 -        0x102b43fe7 +_quadpack.so (??? - ???) <7C8C7BA9-D159-39D1-BE76-6E189E66C771> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/integrate/_quadpack.so
       0x102b47000 -        0x102b74fff +vode.so (??? - ???) <DD14DE7A-EE46-3043-A17F-6FC05CD89146> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/integrate/vode.so
       0x102b7c000 -        0x102b8efff +_dop.so (??? - ???) <F8BD9E36-121B-3BED-BAA4-7B0C4B005A78> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/integrate/_dop.so
       0x102bd4000 -        0x102bdafff +spectral.so (??? - ???) <8985921C-8EAE-380E-B15D-AE52D8B3C077> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/signal/spectral.so
       0x102ce2000 -        0x102ceefff +parser.so (??? - ???) <714DAD2B-6671-C7CB-BB2D-13B75CFDBC91> /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/parser.so
    0x7fff60f59000 -     0x7fff60f8dbaf  dyld (195.6 - ???) <0CD1B35B-A28F-32DA-B72E-452EAD609613> /usr/lib/dyld
    0x7fff88776000 -     0x7fff88788ff7  libz.1.dylib (1.2.5 - compatibility 1.0.0) <30CBEF15-4978-3DED-8629-7109880A19D4> /usr/lib/libz.1.dylib
    0x7fff8926a000 -     0x7fff8926bfff  libDiagnosticMessagesClient.dylib (??? - ???) <3DCF577B-F126-302B-BCE2-4DB9A95B8598> /usr/lib/libDiagnosticMessagesClient.dylib
    0x7fff8a4d1000 -     0x7fff8a4d5fff  libmathCommon.A.dylib (2026.0.0 - compatibility 1.0.0) <FF83AFF7-42B2-306E-90AF-D539C51A4542> /usr/lib/system/libmathCommon.A.dylib
    0x7fff8aa9a000 -     0x7fff8aa9bff7  libsystem_sandbox.dylib (??? - ???) <5422BA4A-4C96-3BC1-AF83-14F0CED0ECB5> /usr/lib/system/libsystem_sandbox.dylib
    0x7fff8aa9c000 -     0x7fff8aa9dff7  libsystem_blocks.dylib (53.0.0 - compatibility 1.0.0) <8BCA214A-8992-34B2-A8B9-B74DEACA1869> /usr/lib/system/libsystem_blocks.dylib
    0x7fff8aadf000 -     0x7fff8aae4fff  libcache.dylib (47.0.0 - compatibility 1.0.0) <B7757E2E-5A7D-362E-AB71-785FE79E1527> /usr/lib/system/libcache.dylib
    0x7fff8ab83000 -     0x7fff8ab84fff  libunc.dylib (24.0.0 - compatibility 1.0.0) <C67B3B14-866C-314F-87FF-8025BEC2CAAC> /usr/lib/system/libunc.dylib
    0x7fff8b75a000 -     0x7fff8b7bafff  libvDSP.dylib (325.4.0 - compatibility 1.0.0) <3A7521E6-5510-3FA7-AB65-79693A7A5839> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvDSP.dylib
    0x7fff8c52a000 -     0x7fff8c52cfff  com.apple.TrustEvaluationAgent (2.0 - 1) <1F31CAFF-C1C6-33D3-94E9-11B721761DDF> /System/Library/PrivateFrameworks/TrustEvaluationAgent.framework/Versions/A/TrustEvaluationAgent
    0x7fff8c587000 -     0x7fff8c664fef  libsystem_c.dylib (763.13.0 - compatibility 1.0.0) <41B43515-2806-3FBC-ACF1-A16F35B7E290> /usr/lib/system/libsystem_c.dylib
    0x7fff8c66a000 -     0x7fff8c6e0ff7  libc++.1.dylib (28.4.0 - compatibility 1.0.0) <A24FC3DA-4FFA-3DD2-9DCC-2B8D1B3BF97C> /usr/lib/libc++.1.dylib
    0x7fff8c717000 -     0x7fff8c71cff7  libsystem_network.dylib (??? - ???) <5DE7024E-1D2D-34A2-80F4-08326331A75B> /usr/lib/system/libsystem_network.dylib
    0x7fff8ce7d000 -     0x7fff8cebfff7  libcommonCrypto.dylib (55010.0.0 - compatibility 1.0.0) <A5B9778E-11C3-3F61-B740-1F2114E967FB> /usr/lib/system/libcommonCrypto.dylib
    0x7fff8cf90000 -     0x7fff8cfadfff  libxpc.dylib (77.19.0 - compatibility 1.0.0) <9F57891B-D7EF-3050-BEDD-21E7C6668248> /usr/lib/system/libxpc.dylib
    0x7fff8d13c000 -     0x7fff8d310ff7  com.apple.CoreFoundation (6.7.2 - 635.21) <62A3402E-A4E7-391F-AD20-1EF20236CE1B> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation
    0x7fff8d583000 -     0x7fff8d58eff7  libc++abi.dylib (14.0.0 - compatibility 1.0.0) <8FF3D766-D678-36F6-84AC-423C878E6D14> /usr/lib/libc++abi.dylib
    0x7fff8d69d000 -     0x7fff8d6bdfff  libsystem_kernel.dylib (1699.32.7 - compatibility 1.0.0) <66C9F9BD-C7B3-30D4-B1A0-03C8A6392351> /usr/lib/system/libsystem_kernel.dylib
    0x7fff8e01e000 -     0x7fff8e020fff  libquarantine.dylib (36.7.0 - compatibility 1.0.0) <8D9832F9-E4A9-38C3-B880-E5210B2353C7> /usr/lib/system/libquarantine.dylib
    0x7fff8e098000 -     0x7fff8e12eff7  libvMisc.dylib (325.4.0 - compatibility 1.0.0) <642D8D54-F9F5-3FBB-A96C-EEFE94C6278B> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib
    0x7fff8e1a6000 -     0x7fff8e1afff7  libsystem_notify.dylib (80.1.0 - compatibility 1.0.0) <A4D651E3-D1C6-3934-AD49-7A104FD14596> /usr/lib/system/libsystem_notify.dylib
    0x7fff8e1f5000 -     0x7fff8e222fe7  libSystem.B.dylib (159.1.0 - compatibility 1.0.0) <7BEBB139-50BB-3112-947A-F4AA168F991C> /usr/lib/libSystem.B.dylib
    0x7fff8e3a6000 -     0x7fff8e5a8fff  libicucore.A.dylib (46.1.0 - compatibility 1.0.0) <0176782F-9526-3905-813A-7A5676EC2C86> /usr/lib/libicucore.A.dylib
    0x7fff8eecb000 -     0x7fff8f2f8fff  libLAPACK.dylib (??? - ???) <4F2E1055-2207-340B-BB45-E4F16171EE0D> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLAPACK.dylib
    0x7fff8f2f9000 -     0x7fff8f300fff  libcopyfile.dylib (85.1.0 - compatibility 1.0.0) <172B1985-F24A-34E9-8D8B-A2403C9A0399> /usr/lib/system/libcopyfile.dylib
    0x7fff8f655000 -     0x7fff8f78bfff  com.apple.vImage (5.1 - 5.1) <A08B7582-67BC-3EED-813A-4833645964A7> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vImage.framework/Versions/A/vImage
    0x7fff8f980000 -     0x7fff8f9b9fe7  libssl.0.9.8.dylib (44.0.0 - compatibility 0.9.8) <79AAEC98-1258-3DA4-B1C0-4120049D390B> /usr/lib/libssl.0.9.8.dylib
    0x7fff8facc000 -     0x7fff8fad1fff  libcompiler_rt.dylib (6.0.0 - compatibility 1.0.0) <98ECD5F6-E85C-32A5-98CD-8911230CB66A> /usr/lib/system/libcompiler_rt.dylib
    0x7fff904ab000 -     0x7fff904e6fff  libsystem_info.dylib (??? - ???) <35F90252-2AE1-32C5-8D34-782C614D9639> /usr/lib/system/libsystem_info.dylib
    0x7fff9053d000 -     0x7fff90621e5f  libobjc.A.dylib (228.0.0 - compatibility 1.0.0) <871E688B-CF57-3BC7-80D6-F6476DFF109B> /usr/lib/libobjc.A.dylib
    0x7fff91898000 -     0x7fff9190bfff  libstdc++.6.dylib (52.0.0 - compatibility 7.0.0) <6BDD43E4-A4B1-379E-9ED5-8C713653DFF2> /usr/lib/libstdc++.6.dylib
    0x7fff92528000 -     0x7fff92530fff  libsystem_dnssd.dylib (??? - ???) <584B321E-5159-37CD-B2E7-82E069C70AFB> /usr/lib/system/libsystem_dnssd.dylib
    0x7fff9289d000 -     0x7fff928a1fff  libdyld.dylib (195.5.0 - compatibility 1.0.0) <F1903B7A-D3FF-3390-909A-B24E09BAD1A5> /usr/lib/system/libdyld.dylib
    0x7fff93448000 -     0x7fff9344eff7  libunwind.dylib (30.0.0 - compatibility 1.0.0) <1E9C6C8C-CBE8-3F4B-A5B5-E03E3AB53231> /usr/lib/system/libunwind.dylib
    0x7fff935c3000 -     0x7fff935c9fff  libmacho.dylib (800.0.0 - compatibility 1.0.0) <D86F63EC-D2BD-32E0-8955-08B5EAFAD2CC> /usr/lib/system/libmacho.dylib
    0x7fff93a5d000 -     0x7fff93b69fff  libcrypto.0.9.8.dylib (44.0.0 - compatibility 0.9.8) <3A8E1F89-5E26-3C8B-B538-81F5D61DBF8A> /usr/lib/libcrypto.0.9.8.dylib
    0x7fff94114000 -     0x7fff946f8fff  libBLAS.dylib (??? - ???) <C34F6D88-187F-33DC-8A68-C0C9D1FA36DF> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
    0x7fff94729000 -     0x7fff9472afff  libdnsinfo.dylib (395.11.0 - compatibility 1.0.0) <853BAAA5-270F-3FDC-B025-D448DB72E1C3> /usr/lib/system/libdnsinfo.dylib
    0x7fff94864000 -     0x7fff948b2fff  libauto.dylib (??? - ???) <D8AC8458-DDD0-3939-8B96-B6CED81613EF> /usr/lib/libauto.dylib
    0x7fff94dff000 -     0x7fff94dfffff  com.apple.Accelerate.vecLib (3.7 - vecLib 3.7) <C06A140F-6114-3B8B-B080-E509303145B8> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/vecLib
    0x7fff94e00000 -     0x7fff94e00fff  com.apple.Accelerate (1.7 - Accelerate 1.7) <82DDF6F5-FBC3-323D-B71D-CF7ABC5CF568> /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate
    0x7fff94e35000 -     0x7fff94e3fff7  liblaunch.dylib (392.39.0 - compatibility 1.0.0) <8C235D13-2928-30E5-9E12-2CC3D6324AE2> /usr/lib/system/liblaunch.dylib
    0x7fff94e42000 -     0x7fff94e43ff7  libremovefile.dylib (21.1.0 - compatibility 1.0.0) <739E6C83-AA52-3C6C-A680-B37FE2888A04> /usr/lib/system/libremovefile.dylib
    0x7fff94f8f000 -     0x7fff94f8ffff  libkeymgr.dylib (23.0.0 - compatibility 1.0.0) <61EFED6A-A407-301E-B454-CD18314F0075> /usr/lib/system/libkeymgr.dylib
    0x7fff955a6000 -     0x7fff955b4fff  libdispatch.dylib (187.10.0 - compatibility 1.0.0) <8E03C652-922A-3399-93DE-9EA0CBFA0039> /usr/lib/system/libdispatch.dylib

External Modification Summary:
  Calls made by other processes targeting this process:
    task_for_pid: 0
    thread_create: 0
    thread_set_state: 0
  Calls made by this process:
    task_for_pid: 0
    thread_create: 0
    thread_set_state: 0
  Calls made by all processes on this machine:
    task_for_pid: 14973
    thread_create: 0
    thread_set_state: 0

VM Region Summary:
ReadOnly portion of Libraries: Total=88.6M resident=27.0M(30%) swapped_out_or_unallocated=61.6M(70%)
Writable regions: Total=37.8M written=20.4M(54%) resident=26.5M(70%) swapped_out=0K(0%) unallocated=11.2M(30%)

REGION TYPE                      VIRTUAL
===========                      =======
MALLOC                             28.9M
MALLOC guard page                    16K
STACK GUARD                        56.0M
Stack                              8192K
VM_ALLOCATE                           8K
__DATA                             2764K
__LINKEDIT                         52.7M
__TEXT                             35.9M
__UNICODE                           544K
shared memory                        12K
===========                      =======
TOTAL                             184.8M

warnings in scikits learn nodes

mdp/test/test_nodes_generic.py::test_dtype_consistency[SGDClassifierScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.stochastic_gradient.SGDClassifier'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_dtype_consistency[PassiveAggressiveClassifierScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.passive_aggressive.PassiveAggressiveClassifier'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_dtype_consistency[PerceptronScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.perceptron.Perceptron'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_outputdim_consistency[SGDClassifierScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.stochastic_gradient.SGDClassifier'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_outputdim_consistency[PassiveAggressiveClassifierScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.passive_aggressive.PassiveAggressiveClassifier'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_outputdim_consistency[PerceptronScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.perceptron.Perceptron'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_dimdtypeset[SGDClassifierScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.stochastic_gradient.SGDClassifier'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_dimdtypeset[PassiveAggressiveClassifierScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.passive_aggressive.PassiveAggressiveClassifier'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

mdp/test/test_nodes_generic.py::test_dimdtypeset[PerceptronScikitsLearnNode]
  /usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:129: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.perceptron.Perceptron'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
    "and default tol will be 1e-3." % type(self), FutureWarning)

The call stack that leads to the warning is:

mdp/test/test_nodes_generic.py:272: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
<string>:1: in <lambda>
    ???
mdp/signal_node.py:646: in execute
    self._pre_execution_checks(x)
mdp/signal_node.py:521: in _pre_execution_checks
    self._if_training_stop_training()
mdp/signal_node.py:500: in _if_training_stop_training
    self.stop_training()
<string>:1: in <lambda>
    ???
mdp/signal_node.py:627: in stop_training
    self._train_seq[self._train_phase][1](*args, **kwargs)
mdp/nodes/scikits_nodes.py:252: in _stop_training
    return self.scikits_alg.fit(self.data, self.labels, **kwargs)
/usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:587: in fit
    sample_weight=sample_weight)
/usr/lib64/python3.6/site-packages/sklearn/linear_model/stochastic_gradient.py:415: in _fit
    self._validate_params()
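
As a side note, here is a minimal sketch of how the warning could be silenced on the user side. It assumes the auto-generated ScikitsLearnNode wrappers forward constructor keyword arguments to the underlying scikit-learn estimator; the explicit values simply mirror the future defaults named in the warning text.

import mdp

# Setting max_iter and tol explicitly keeps the estimator's behavior stable
# across scikit-learn 0.19-0.21 and silences the FutureWarning.
node = mdp.nodes.SGDClassifierScikitsLearnNode(max_iter=1000, tol=1e-3)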

/cc @pberkes

SFANode eigenvalue issue

Summary

Training an SFA node often fails when the training data contains redundant dimensions. This has long been a common stumbling block for people starting out with SFA. Since the SFA implementation is a feature that distinguishes MDP from similar libraries, access to its functionality should be as smooth and straightforward as we can possibly make it.

Problem details

The SFANode._stop_training() method concludes the training of an SFA node by solving a generalized eigenvalue problem with the symeig() routine available on the system. This call fails if the covariance matrix of the data has eigenvalues close to (or even below) zero, and the resulting error message contains too little information to help newcomers identify and solve the problem.

Even when the training data is not sufficient for the symeig() routine, there are workarounds that still allow SFA to extract slow features. In fact, many people reading this probably already have their personal workaround or fix, such as slotting a PCA node in front (see the sketch below), or using SVD or a different eigenvalue routine to circumvent the issue instead of going back to "fix" their training data.

For an updated SFANode the default behavior does not have to change. However, a workaround such as using SVD should be readily available, and error handling should be fleshed out far enough that (new) users can make an informed decision about how to identify the problem and what to do about it (including the possible trade-offs of the different solutions made available by MDP).
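
A minimal sketch of the PCA-in-front workaround mentioned above; the data is synthetic and the dimensions and parameters are placeholders, not a recommended default:

import numpy as np
import mdp

# Synthetic data with a redundant dimension: the last column duplicates the
# first, so the covariance matrix is singular and plain SFA training may fail.
x = np.random.randn(1000, 5)
x = np.hstack([x, x[:, :1]])

# Slotting a PCA node in front removes the redundant directions before SFA.
flow = mdp.Flow([mdp.nodes.PCANode(output_dim=0.99),
                 mdp.nodes.SFANode(output_dim=3)])
flow.train(x)
y = flow(x)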

Status

As of right now there are multiple workarounds already in circulation, and a working SVD implementation is also available. In order to include this in MDP we need to agree on how to implement a fix (i.e., what the default behavior should be and which solutions should be made available) and possibly profile and compare more than one candidate to decide which slots into the node most favorably.

drop Python 2 support and new release

Python 2 will no longer be maintained starting in 2020. Given that the last MDP release is more than 3 years old, I think it's time to make a new release with the following focus:

  • drop Python 2 compatibility
  • finish whatever needs to be finished

I volunteer to take care of the release, but you guys have to merge the current development branches into master, so that I know what to release.

The release is also urgent because Fedora and Debian/Ubuntu are dropping Python 2, and we need a new MDP package or we will be kicked out of their repos.
What do you think?

Fix ResourceWarnings

If you run the tests by

pytest -x -W always mdp

you'll get a bunch of ResourceWarnings. Some of them are related to #51 (fixing them would require dropping Python 3.5 compatibility, which could be done, IMHO). Other warnings should be explored more thoroughly.
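
For reference, a minimal sketch of the usual fix pattern, not tied to any specific MDP call site: close file handles deterministically with a context manager instead of leaving them to the garbage collector, which is what typically triggers a ResourceWarning under -W always.

import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# data = open(path).read()   # handle closed only by the garbage collector,
#                            # emits a ResourceWarning under -W always
with open(path) as fh:        # explicit, deterministic close, no warning
    data = fh.read()

os.remove(path)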

When I import bimdp I get the following error.

UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 630: ordinal not in range(128)

I first installed from the Ubuntu repo, then I tried the GitHub version, but the same problem occurred.
