
mapalign's Introduction


This repo implements diffusion map based embedding [1] and alignment [2] algorithms.

[1] http://dx.doi.org/10.1016/j.acha.2006.04.006

[2] Langs, Golland and Ghosh (2015) Predicting Activation Across Individuals with Resting-State Functional Connectivity based Multi-Atlas Label Fusion. MICCAI.

This package also provides a scikit-learn compatible DiffusionMapEmbedding.

For documentation, see the accompanying Jupyter notebook.

mapalign's People

Contributors

alexbrc, janfreyberg, reindervosdewael, rudimeier, satra, steelec, ysanchezaraujo


mapalign's Issues

Add eigenvalues as attributes to DiffusionMapEmbedding

Currently the eigenvalues are not stored, so we can't measure the importance of the embedding dimensions.
I see that the backend computes lambda values. Are these the scaled eigenvalues? Is there a way to access them from DiffusionMapEmbedding?
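For context while this is open: other issues in this tracker suggest the lower-level `compute_diffusion_map` hands the lambdas back when called with `return_result=True`. Independent of the package internals, the role the eigenvalues play in ranking dimensions can be sketched in plain numpy (toy data; not mapalign's implementation):

```python
import numpy as np

# Toy symmetric affinity matrix from random points (hypothetical data).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / d2.mean())

# Row-normalize to a Markov (transition) matrix.
P = W / W.sum(axis=1, keepdims=True)

# Eigenvalues of P: the top one is 1 (trivial constant mode);
# the remaining ones rank the embedding dimensions by importance.
vals = np.sort(np.linalg.eigvals(P).real)[::-1]

# Relative importance of each non-trivial dimension:
importance = vals[1:] / vals[1]
```

Exposing `vals[1:]` (or ratios like `importance`) as a fitted attribute would let users decide how many dimensions to keep.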

Failure on import of spectral_embedding_

We were testing your package and noticed that it works with some versions of scikit-learn but not others. This appears to be due to the import statement in embed.py shown below. I have not had much time to track it down, but I believe direct importing of these functions was removed from scikit-learn 0.24 onward:

```python
from sklearn.manifold.spectral_embedding_ import _graph_is_connected
```

On versions where the import still works, it emits:

```
FutureWarning: The sklearn.manifold.spectral_embedding_ module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.manifold. Anything that cannot be imported from sklearn.manifold is now part of the private API.
  warnings.warn(message, FutureWarning)
```
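One version-tolerant workaround (a sketch, not the package's actual fix) is to drop the dependency on sklearn's private helper entirely and test graph connectivity directly with scipy, which is what `_graph_is_connected` amounts to:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def graph_is_connected(graph):
    """Return True if the (sparse or dense) affinity graph is fully connected.

    Stands in for sklearn's private _graph_is_connected helper, whose import
    path moved in 0.22; implementing it on scipy avoids the private API.
    """
    n_components, _ = connected_components(csr_matrix(graph), directed=False)
    return n_components == 1

# Two disjoint 2-node cliques -> not connected.
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
```

This keeps embed.py working across scikit-learn versions without try/except import gymnastics.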

Problem with diffusion times

Embeddings.zip

I would like to ask two related questions about the code:

  1. Where can I find the theory and rationale behind the automated computation of diffusion times (line 144)? I cannot figure out the relation between the computed output (a single embedding) and the vector of diffusion times (more than one value).

  2. Suppose I run diffusion embedding without a predefined diffusion time and obtain the automatically computed diffusion times in the output. If I then take each individual diffusion time and run the algorithm again with that parameter specified explicitly, why do the results differ from the initial run with automated diffusion-time estimation?
     In the attachment, I provide the data and code to reproduce the results from question 2.

Thank you very much in advance if you could help me.
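On question 1, the general role of diffusion time in the diffusion-map framework (independent of mapalign's automated estimation, which is not reproduced here) is that the embedding at time t scales each eigenvector by its eigenvalue raised to the power t. A minimal numpy sketch on toy data:

```python
import numpy as np

# Toy symmetric affinity and its Markov matrix (hypothetical data).
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 2))
W = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1))
P = W / W.sum(axis=1, keepdims=True)

# Eigenpairs of P, sorted by eigenvalue; index 0 is the trivial (1, const) pair.
vals, vecs = np.linalg.eig(P)
order = np.argsort(vals.real)[::-1]
vals, vecs = vals.real[order], vecs.real[:, order]

def embedding(t):
    # Diffusion map at time t: coordinate k is lambda_k**t * psi_k.
    return vals[1:] ** t * vecs[:, 1:]

# Since |lambda| < 1 for non-trivial modes, larger t shrinks every
# coordinate, and higher dimensions shrink faster, reweighting the axes.
e1, e2 = embedding(1), embedding(2)
```

So a different diffusion time does not change which eigenvectors are found, only how the axes are scaled relative to each other, which is why results obtained at different t values are rescalings rather than identical embeddings.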

Missing import for rbf kernel?

I get a NameError (`global name 'rbf_kernel' is not defined`) when trying to use the RBF kernel (`affinity='rbf'`).
Perhaps an import from sklearn (`sklearn.metrics.pairwise.rbf_kernel`) is missing here?
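The likely one-line fix is adding `from sklearn.metrics.pairwise import rbf_kernel` to embed.py. For reference, the kernel that import would provide can be written out in numpy (a sketch matching sklearn's documented definition, exp(-gamma * ||x - y||^2) with gamma defaulting to 1 / n_features):

```python
import numpy as np

def rbf_kernel(X, gamma=None):
    """RBF (Gaussian) affinity: K[i, j] = exp(-gamma * ||x_i - x_j||^2).

    Mirrors sklearn.metrics.pairwise.rbf_kernel, whose missing import
    causes the NameError; gamma defaults to 1 / n_features as in sklearn.
    """
    X = np.asarray(X, dtype=float)
    if gamma is None:
        gamma = 1.0 / X.shape[1]
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

# Two 2-D points at distance 1: off-diagonal affinity is exp(-0.5).
K = rbf_kernel(np.array([[0.0, 0.0], [1.0, 0.0]]))
```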

Normalization by the constant eigenvector

In line 141 of embed.py, in _step_5, the eigenvectors are normalized by the first eigenvector: `psi = vectors / vectors[:, [0]]`.

What is the purpose of this? I understand that the first eigenvector should be constant, so ideally this amounts to scaling and sign flips, but due to numerical issues it is not going to be an exactly constant vector.
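A numpy illustration of both halves of this observation (toy data, not embed.py's code path): the all-ones vector is an exact right eigenvector of a row-stochastic matrix, so the division is ideally a pure rescaling, but the eigenvector a solver actually returns is constant only up to floating-point noise:

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.random((10, 10))
W = W + W.T                            # symmetric positive affinity
P = W / W.sum(axis=1, keepdims=True)   # row-stochastic Markov matrix

# The ones vector is an exact right eigenvector of P with eigenvalue 1 ...
ones = np.ones(10)
assert np.allclose(P @ ones, ones)

# ... but eig returns it only up to scale/sign and floating-point noise,
# so vectors / vectors[:, [0]] is "ideally" a rescaling, not exactly one.
vals, vecs = np.linalg.eig(P)
top = vecs[:, np.argmax(vals.real)].real
spread = top.max() - top.min()         # tiny, but generally not exactly 0
```

In practice the spread is at the level of machine epsilon, so the normalization is harmless numerically, but documenting the intent (dividing out the stationary mode) in the code would make _step_5 clearer.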

ValueError

Sorry to disturb you.

I am very interested in the paper 'Situating the default-mode network along a principal gradient of macroscale cortical organization', which is based on mapalign; its code is available at https://github.com/NeuroanatomyAndConnectivity/gradient_analysis.

But when I run the code in https://github.com/NeuroanatomyAndConnectivity/gradient_analysis/blob/master/02_embed_connectomes.ipynb, I get a ValueError:

```
In [30]: emb, res = embed.compute_diffusion_map(aff, alpha = 0.5)
ValueError                                Traceback (most recent call last)
----> 1 emb, res = embed.compute_diffusion_map(aff, alpha = 0.5)

ValueError: too many values to unpack
```

I am using Python 2.7, the same as gradient_analysis. I hope you can help me with this error, thank you!
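Judging from another issue in this tracker, `compute_diffusion_map` returns only the embedding by default and yields the `(embedding, result)` pair only when `return_result=True` is passed, which would explain the unpacking failure. The error pattern itself, reproduced with a stand-in function (hypothetical, not mapalign's code):

```python
def compute(return_result=False):
    """Stand-in for an API that returns one array by default and a
    (value, extras) pair only when asked (hypothetical signature,
    mirroring the call seen elsewhere in this tracker)."""
    embedding = [0.1, 0.2, 0.3]
    result = {"lambdas": [1.0, 0.5]}
    return (embedding, result) if return_result else embedding

# Unpacking two names from the single default return raises ValueError,
# because the lone return value (length 3 here) is iterated element-wise:
try:
    emb, res = compute()
except ValueError as exc:
    message = str(exc)      # contains "too many values to unpack"

emb, res = compute(return_result=True)  # asking for both values succeeds
```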

embed.compute_diffusion_map's vectors flip sign between runs

When I run `emb, res = embed.compute_diffusion_map(aff, alpha=0.5, return_result=True)`, where `aff` is an 800×800 symmetric matrix, it returns `emb` and a `res` containing the vectors. If some values in the two arrays are positive on the first run, running the function again returns them as negative with almost the same absolute values. I am quite confused by this: across many matrices it introduces inconsistencies, which is tricky for my work on computing gradients of human brain functional connectivity. Any help would be appreciated.
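This is the usual eigenvector sign ambiguity: v and -v solve the same eigenproblem, and the underlying solver may return either on different runs or platforms. A common remedy (a sketch of one convention, not mapalign's behavior) is to fix the signs deterministically, e.g. by making each column's largest-magnitude entry positive:

```python
import numpy as np

def fix_signs(vectors):
    """Make eigenvector signs deterministic: flip each column so its
    largest-magnitude entry is positive. v and -v are equally valid
    eigenvectors, so solvers may return either; a convention like this
    makes repeated runs comparable."""
    idx = np.argmax(np.abs(vectors), axis=0)
    signs = np.sign(vectors[idx, np.arange(vectors.shape[1])])
    signs[signs == 0] = 1.0
    return vectors * signs

A = np.array([[2.0, 1.0], [1.0, 3.0]])
_, vecs = np.linalg.eigh(A)
same = fix_signs(vecs)
flipped = fix_signs(-vecs)   # a sign-flipped solver output normalizes identically
```

Applying such a convention to the returned vectors (or to the final embedding columns) before comparing matrices should remove the run-to-run inconsistency.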

Transition matrix - swapped dimensions?

Hi @satra,
in the current implementation, the Markov matrix is calculated by taking sums along axis 1:

```python
d_alpha = np.power(np.array(L_alpha.sum(axis=1)).flatten(), -1)
```

and multiplying each row by the respective inverse sum:

```python
L_alpha = d_alpha[:, np.newaxis] * L_alpha
```

In this scenario, ROWS sum to 1.

However, as far as I understand the wiki (https://en.wikipedia.org/wiki/Markov_kernel), the normalization should happen along axis 0. Thus, the code should be:

```python
d_alpha = np.power(np.array(L_alpha.sum(axis=0)).flatten(), -1)
L_alpha = d_alpha[np.newaxis, :] * L_alpha
```

Here the COLUMNS sum to 1.

Is that correct?
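Both conventions appear in the literature; for a symmetric affinity the row- and column-normalized matrices are simply transposes of each other, so they share eigenvalues and their left/right eigenvectors swap roles. A numpy check of that relationship (toy data):

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.random((6, 6))
W = W + W.T                                   # symmetric affinity

P_row = W / W.sum(axis=1, keepdims=True)      # rows sum to 1 (current code)
P_col = W / W.sum(axis=0, keepdims=True)      # columns sum to 1 (the proposal)

# For symmetric W the two conventions are transposes of each other,
# so they have identical spectra; only the eigenvector orientation differs.
vals_row = np.sort(np.linalg.eigvals(P_row).real)
vals_col = np.sort(np.linalg.eigvals(P_col).real)
```

So the choice of axis affects which side the stochasticity lives on, not the eigenvalues the embedding is built from.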
