
spectral_connectivity's Introduction

spectral_connectivity


Tutorials | Documentation | Usage Example | Installation | Citation | Developer Installation

What is spectral_connectivity?

spectral_connectivity is a Python software package that computes multitaper spectral estimates and frequency-domain brain connectivity measures, such as coherence, spectral Granger causality, and the phase lag index, using the multitaper Fourier transform. Although there are other Python packages that do this (see nitime and MNE-Python), spectral_connectivity has several differences:

  • it is designed to handle multiple time series at once.
  • it caches frequently computed quantities, such as the cross-spectral matrix and the minimum phase decomposition, so that connectivity measures sharing the same processing steps can be computed more quickly.
  • it decouples the time-frequency transform from the connectivity measures, so if you already have a preferred way of computing Fourier coefficients (e.g., from a wavelet transform), you can use that instead (see the sketch after this list).
  • it implements the non-parametric version of spectral Granger causality in Python.
  • it implements canonical coherence, which can efficiently summarize brain-area-level coherences from multielectrode recordings.
  • it provides an easier user interface for the multitaper Fourier transform.
  • all functions are GPU-enabled if cupy is installed and the environment variable SPECTRAL_CONNECTIVITY_ENABLE_GPU is set to 'true'.
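As a minimal sketch of that last point about precomputed coefficients: the snippet below feeds hypothetical wavelet coefficients straight into Connectivity. The coefficient layout (n_time_windows, n_trials, n_tapers, n_frequencies, n_signals), the random data, the frequency grid, and the time axis are assumptions for illustration; check the Connectivity docstring for the exact expectations.

import numpy as np
from spectral_connectivity import Connectivity

# Hypothetical complex wavelet coefficients:
# (n_time_windows, n_trials, n_frequencies, n_signals)
rng = np.random.default_rng(0)
coefficients = (rng.standard_normal((50, 10, 30, 2))
                + 1j * rng.standard_normal((50, 10, 30, 2)))

# Connectivity also expects a taper axis
# (n_time_windows, n_trials, n_tapers, n_frequencies, n_signals),
# so insert a singleton taper dimension for a wavelet transform.
fourier_coefficients = coefficients[:, :, np.newaxis, :, :]

c = Connectivity(fourier_coefficients=fourier_coefficients,
                 frequencies=np.linspace(1.0, 100.0, 30),  # assumed frequency grid (Hz)
                 time=np.arange(50) / 100.0)               # assumed window centers (s)
coherence = c.coherence_magnitude()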

Tutorials

See the tutorial notebooks for more information on how to use the package.

Usage Example

from spectral_connectivity import Multitaper, Connectivity

# Compute multitaper spectral estimate
# `signals` has shape (n_time_samples, n_trials, n_signals) or (n_time_samples, n_signals)
m = Multitaper(time_series=signals,
               sampling_frequency=sampling_frequency,
               time_halfbandwidth_product=time_halfbandwidth_product,
               time_window_duration=0.060,
               time_window_step=0.060,
               start_time=time[0])

# Sets up computing connectivity measures/power from multitaper spectral estimate
c = Connectivity.from_multitaper(m)

# Here are a couple of examples
power = c.power() # spectral power
coherence = c.coherence_magnitude()
weighted_phase_lag_index = c.weighted_phase_lag_index()
canonical_coherence = c.canonical_coherence(brain_area_labels)
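
If cupy is installed, the same workflow can run on the GPU by setting the environment variable named in the feature list above. This is a sketch based on that description; the variable appears to be read when the package is imported, so set it first.

import os

# Assumed to be required before spectral_connectivity is imported.
os.environ["SPECTRAL_CONNECTIVITY_ENABLE_GPU"] = "true"

from spectral_connectivity import Multitaper, Connectivity
# ...then proceed exactly as in the example above.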

Documentation

See the documentation at https://spectral-connectivity.readthedocs.io/en/latest/.

Citation

For citation, please use the following:

Denovellis, E.L., Myroshnychenko, M., Sarmashghi, M., and Stephen, E.P. (2022). Spectral Connectivity: a python package for computing multitaper spectral estimates and frequency-domain brain connectivity measures on the CPU and GPU. JOSS 7, 4840. 10.21105/joss.04840.

Implemented Measures

Functional

  1. coherency
  2. canonical_coherence
  3. imaginary_coherence
  4. phase_locking_value
  5. phase_lag_index
  6. weighted_phase_lag_index
  7. debiased_squared_phase_lag_index
  8. debiased_squared_weighted_phase_lag_index
  9. pairwise_phase_consistency
  10. global_coherence

Directed

  1. directed_transfer_function
  2. directed_coherence
  3. partial_directed_coherence
  4. generalized_partial_directed_coherence
  5. direct_directed_transfer_function
  6. group_delay
  7. phase_lag_index
  8. pairwise_spectral_granger_prediction

Package Dependencies

spectral_connectivity requires:

  • python
  • numpy
  • matplotlib
  • scipy
  • xarray

See environment.yml for the most current list of dependencies.

Installation

pip install spectral_connectivity

or

conda install -c edeno spectral_connectivity

Developer Installation

If you want to make contributions to this library, please use this installation.

  1. Install Miniconda (or Anaconda) if it isn't already installed. Type into bash (or install from the Anaconda website):
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh;
bash miniconda.sh -b -p $HOME/miniconda
export PATH="$HOME/miniconda/bin:$PATH"
hash -r
  2. Clone the repository to your local machine (.../spectral_connectivity) and install the conda environment for the repository. Type into bash:
conda env create -f environment.yml
conda activate spectral_connectivity
pip install -e .
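
A quick way to confirm the editable install worked (run inside the activated environment; this check is not part of the original instructions):

python -c "from spectral_connectivity import Multitaper, Connectivity"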

Recent publications and pre-prints that used this software

spectral_connectivity's People

Contributors

ajquinn, bloniaszp, dizcza, edeno, emilyps14, mehradsm, mmyros, sappelhoff


spectral_connectivity's Issues

[JOSS] run CI also on PRs

I noticed that your CI tests don't seem to run on PRs (see e.g., #33).

That might be because you defined too few "triggers":

for an example on how to define triggers that also work on pull requests, see: https://github.com/sappelhoff/pyprep/blob/c6b268b90e08af2d4740496414cad649344345fa/.github/workflows/python_tests.yml#L3-L12

I'd strongly recommend running CI for PRs as well.

xref: openjournals/joss-reviews#4840

[JOSS] Docs API section does not show function signatures

See: https://spectral-connectivity.readthedocs.io/en/latest/api.html

Given that you have nice numpydoc style docstrings for your functions and classes, it'd be nice to also be able to see them in your automatically generated docs.

This repo contains a setup similar to yours (functions and classes documented via numpydoc, built via sphinx and readthedocs), so you may find it useful to debug your own docs: https://pyprep.readthedocs.io/en/latest/api.html

xref: openjournals/joss-reviews#4840

[JOSS] Comparison with MNE

Regarding the comparison with MNE-Python, you wrote:

mne-python is a much larger package designed as a full-featured analysis library for EEG and MEG data, and using the spectral analysis functions requires representing data using its ecosystem.

I understand your point; it's true that MNE strongly relies on its Epoch and Raw containers, which have not been designed to handle non-human electrophysiological data. But to be fair in the comparison, MNE also has functions that can be applied to non-human LFP using standard NumPy array inputs, in line with this package.

In my opinion, the strength of this package lies in the parameter expectation_type. It's a smart and flexible design because it allows single-trial and possibly dynamic estimates of functional connectivity with a single implementation, whereas MNE had to use two different functions.
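
For reference, a minimal sketch of that design: Connectivity.from_multitaper accepts an expectation_type keyword, but the specific value "tapers" used here (take the expectation over tapers only, so each trial keeps its own estimate) is an assumption to verify against the docstring.

import numpy as np
from spectral_connectivity import Multitaper, Connectivity

rng = np.random.default_rng(0)
signals = rng.standard_normal((1000, 20, 2))  # (n_time_samples, n_trials, n_signals)

m = Multitaper(time_series=signals, sampling_frequency=500,
               time_halfbandwidth_product=3)
# Average over tapers only (assumed option string) -> one estimate per trial.
c = Connectivity.from_multitaper(m, expectation_type="tapers")
single_trial_coherence = c.coherence_magnitude()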

xref: JOSS (openjournals/joss-reviews#4840)

[JOSS] ReadTheDocs pages

Hello - one weakness in this (otherwise excellent) repo is the ReadTheDocs pages here: https://spectral-connectivity.readthedocs.io/en/latest/ - they seem rushed and a bit of an afterthought.

One concern is that the build generates a really large number of warnings and exceptions listed in the final tab here
https://readthedocs.org/projects/spectral-connectivity/builds/18509808/

Some examples:

/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/spectral_connectivity/transforms.py:docstring of spectral_connectivity.transforms.Multitaper.n_tapers:1: WARNING: duplicate object description of spectral_connectivity.transforms.Multitaper.n_tapers, other instance in _autosummary/spectral_connectivity.transforms.Multitaper, use :noindex: for one of them
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/spectral_connectivity/transforms.py:docstring of spectral_connectivity.transforms.Multitaper.n_time_samples_per_step:1: WARNING: duplicate object description of spectral_connectivity.transforms.Multitaper.n_time_samples_per_step, other instance in _autosummary/spectral_connectivity.transforms.Multitaper, use :noindex: for one of them
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/spectral_connectivity/transforms.py:docstring of spectral_connectivity.transforms.Multitaper.tapers:1: WARNING: duplicate object description of spectral_connectivity.transforms.Multitaper.tapers, other instance in _autosummary/spectral_connectivity.transforms.Multitaper, use :noindex: for one of them
WARNING: autodoc: failed to import module 'setup'; the following exception was raised:
Traceback (most recent call last):
  File "/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/envs/latest/lib/python3.9/site-packages/sphinx/ext/autodoc/importer.py", line 58, in import_module
    return importlib.import_module(modname)
  File "/home/docs/.asdf/installs/python/3.9.13/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/setup.py", line 26, in <module>
    long_description=open("README.md").read(),
FileNotFoundError: [Errno 2] No such file or directory: 'README.md'
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/docs/source/spectral_connectivity.rst:20: WARNING: Citation [Rce5454392c8c-1] is not referenced.
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/docs/source/spectral_connectivity.rst:22: WARNING: Citation [R50ceabb4b5b7-1] is not referenced.
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/docs/source/spectral_connectivity.rst:10: WARNING: Citation [R92db3e0bed67-1] is not referenced.
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/docs/source/spectral_connectivity.rst:32: WARNING: Citation [Reea3610c2217-1] is not referenced.
/home/docs/checkouts/readthedocs.org/user_builds/spectral-connectivity/checkouts/latest/docs/source/spectral_connectivity.rst:16: WARNING: Citation [Rd62c8e2568d7-1] is not referenced.
done

Perhaps not a major issue on its own, but it suggests that the build is fairly fragile and likely missing some details. Could you take a look and fix up some of these issues?

I've tried to make a local build to say something more productive but keep running into a docutils error.

xref: openjournals/joss-reviews#4840

[JOSS] code and docstrings

Hello - overall I'm impressed with the code quality and writing of this package. It's clear that care has been taken, and it appears that the functions and objects handle complicated analyses relatively clearly. The tests are thorough and cover a range of sensible checks. Well done!

My comments are limited to some aspects of code organisation and documentation.

- Code formatting/linting

The code doesn't appear to use a formal style checker or linter; running flake8 spectral_connectivity/*py from the root dir gives quite a lot of suggestions - some more reasonable than others. You may or may not want to adopt formal code-checking as part of the test suite. I find flake8 useful (ignoring some error codes), but this is up to you.

flake8 does highlight the following points that I think are worth checking out.

  • Some docstrings are missing or applied inconsistently. flake8 spectral_connectivity/*py | grep Missing gives:
spectral_connectivity/connectivity.py:1:1: D100 Missing docstring in public module
spectral_connectivity/connectivity.py:119:1: D107 Missing docstring in __init__
spectral_connectivity/connectivity.py:159:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:165:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:263:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:272:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:520:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:735:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:738:1: D102 Missing docstring in public method
spectral_connectivity/connectivity.py:1271:1: D413 Missing blank line after last section
spectral_connectivity/connectivity.py:1271:1: D413 Missing blank line after last section
spectral_connectivity/minimum_phase_decomposition.py:1:1: D100 Missing docstring in public module
spectral_connectivity/minimum_phase_decomposition.py:27:1: D413 Missing blank line after last section
spectral_connectivity/minimum_phase_decomposition.py:27:1: D413 Missing blank line after last section
spectral_connectivity/minimum_phase_decomposition.py:100:1: D413 Missing blank line after last section
spectral_connectivity/minimum_phase_decomposition.py:100:1: D413 Missing blank line after last section
spectral_connectivity/statistics.py:1:1: D100 Missing docstring in public module
spectral_connectivity/statistics.py:8:1: D413 Missing blank line after last section
spectral_connectivity/statistics.py:8:1: D413 Missing blank line after last section
spectral_connectivity/statistics.py:35:1: D103 Missing docstring in public function
spectral_connectivity/transforms.py:1:1: D100 Missing docstring in public module
spectral_connectivity/transforms.py:72:1: D107 Missing docstring in __init__
spectral_connectivity/transforms.py:103:1: D105 Missing docstring in magic method
spectral_connectivity/transforms.py:136:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:144:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:164:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:177:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:183:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:202:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:212:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:216:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:220:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:224:1: D102 Missing docstring in public method
spectral_connectivity/transforms.py:384:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:384:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:425:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:425:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:626:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:626:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:695:1: D413 Missing blank line after last section
spectral_connectivity/transforms.py:695:1: D413 Missing blank line after last section
spectral_connectivity/wrapper.py:1:1: D100 Missing docstring in public module

These are mainly on properties and methods of objects but are worth completing; a single line of context would do in most cases. Multitaper.tapers seems to be only partially documented.

- Simulations and plotting functions

The simulation functions and plotting utilities in the notebooks are very useful and, I think, deserve to be included in the package as their own submodules. This could really simplify your documentation and be a big help going forward - this one is up to you though, you may not want to maintain this longer term.

I've done something similar in a package based on autoregressive modelling and find it really helpful to be able to quickly generate a known system from within the package.

Maintaining high-quality plotting functions can be a pain, but I think some simple options to help get people started would be great.

- Requirements and versions

The versions of required packages in setup.py and environment.yml seem to be mismatched. It makes sense that setup.py has the basics and the conda env has a more complete list.

numpy, pandas and pytest have version limits in setup but not in the conda env. I guess conda will resolve this during install but it would certainly be simpler if they matched.

- Makefile

Your Makefile seems to be broken. Calling make gives a "Makefile:3: *** missing separator. Stop." error.

Judging from the limited targets in it, the Makefile might not be used much, but I'd recommend either fixing it up or removing it.

xref: openjournals/joss-reviews#4840

Spectral connectivity from wavelet transform

Hello,

I am trying to implement the spectral connectivity measures (specifically Granger causality) using Fourier coefficients obtained from wavelet transforms of my data. My issue is that I only get NaNs back from the function. Things go awry on the line

minimum_phase_factor = minimum_phase_decomposition(
    cross_spectral_matrix[..., pair_indices, pair_indices.T])

which hits the error numpy.linalg.LinAlgError: Singular matrix, probably on this line:

linear_predictor = _get_linear_predictor(minimum_phase_factor, cross_spectral_matrix, I)
(Although the command seems to run fine for a good number of iterations before it doesn't).

Have you come across this error before? Any idea what could be going wrong?

[JOSS] Usage examples and tutorials

Hello - I like the usage example in the main README, but the tutorial Jupyter notebooks are most likely only helpful for current experts. The content is good but contains very large code blocks and reads a bit like a list.

I would strongly recommend including one or two tutorials which break down the large code blocks and include a good amount of prose explaining what each chunk of code is doing and why. This doesn't need to be absolutely extensive but is hugely valuable for new users.

Some examples from similar toolboxes

xref: openjournals/joss-reviews#4840

[JOSS] Add `.mailmap` file

While checking the authors of this software for JOSS (openjournals/joss-reviews#4840), I used the following command: git shortlog --email --summary, and it gave me the following list:

     3  Emily Stephen <[email protected]>
     2  Emily Stephen <[email protected]>
   121  Eric Denovellis <[email protected]>
    22  Eric Denovellis <[email protected]>
     5  Eric Denovellis <[email protected]>
     5  Max Myroshnychenko <[email protected]>
    16  Mehrad <[email protected]>
     1  dizcza <[email protected]>
    26  edeno <[email protected]>
     1  myroshnychenkm2 <[email protected]>

As you can see, several authors appear multiple times under different aliases and emails. This is something you can solve with a .mailmap file, see: https://git-scm.com/docs/gitmailmap

For an example, see: https://github.com/mne-tools/mne-bids/blob/main/.mailmap

pairwise granger causality clarification

Since the Granger causality prediction is directed and asymmetric, I just wanted to be sure that my interpretation of the output is correct. In the output matrix, does [i, j] (row i, column j) correspond to i ---> j, or is it the other way around? Thanks!

[JOSS] double check CITATION.cff and prepare transparent "author" requirements

See for example:

license: other-open

I think you can be more specific about which license you use: according to https://github.com/citation-file-format/citation-file-format/blob/main/schema-guide.md#license, and your LICENSE file, it should be GPL-3.0.

Apart from that, I see that one more person has committed code to this repo (dizcza <[email protected]>) but is not in the CITATION.cff file (and not in the JOSS publication). If this is a conscious decision, it may be good to clarify in your CONTRIBUTORS.md file at what point contributors are considered co-authors of the software.

xref: openjournals/joss-reviews#4840

Functional connectivity methods are asymmetric

Hi all,

When using this dataset, we found that the following methods (which are listed as functional on the main page) yield asymmetric values:
coherence_phase
phase_locking_value
phase_lag_index
weighted_phase_lag_index

The dataset has brain regions on the first axis and time on the second (so it should be transposed before being input to the toolkit).

Any ideas?

[JOSS] Consider moving to conda-forge?

I am curious: Why do you ship spectral_connectivity on a personal conda channel, instead of on conda-forge?

Shipping on conda-forge would have the advantage of re-using the huge community structure, and making your release process for conda more transparent, as well as users not having to declare your personal channel as a source in the environments.

xref: openjournals/joss-reviews#4840

Range of connectivity metrics

It would be nice to document the expected value range of the different connectivity metrics, maybe in the description.

In my case, I am calculating pairwise_spectral_granger_prediction and my values are between 0 and 13. I am not sure how I am supposed to interpret this; I assumed the values would range from 0 to 1.

Should I normalize this or remove outliers?

Thanks in advance!

Puzzled about trial averaging - isn't coherence symmetric

Hi there,

Thanks a lot for providing this superb toolbox! We are analyzing ephys data with it and wondered about the symmetry of coherence when probing it via shuffling the input data.

Let's say our signal with trial structure has shape 2 channels x 500 trials x 800 observations.
Computing coherence gives the result shown in the top row of the figure below. When we take the same data but randomly swap the channel order, independently for each trial, we find the same shape of the frequency-dependent coherence; however, the magnitude decreases substantially.

Given that coherence should be a symmetric measure, we wonder if something in the trial averaging could have caused that amplitude drop.

[figure: coherence for the original vs. channel-shuffled data]

For spectral Granger causality, an asymmetric measure, the shape of the result also changed, as expected:

[figure: spectral Granger causality for the original vs. channel-shuffled data]

If you have any thoughts as to why the amplitude dropped for the shuffled data, we'd be glad to hear them!

Many thanks!

Negative PLI values

PLI is defined as PLI = |E[sign(Im(Sxy))]| and is therefore non-negative. In case there could be any confusion on this, the Stam paper explicitly states "The PLI ranges between 0 and 1". However, trying your toolbox, I obtained negative values in the array returned by Connectivity.phase_lag_index().

from spectral_connectivity import Multitaper, Connectivity
import numpy as np

sfreq = 500
np.random.seed(1)

data = np.random.rand(500, 20)
m = Multitaper(time_series=data,
               sampling_frequency=sfreq,
               time_halfbandwidth_product=8,
               time_window_duration=0.20,
               time_window_step=0.10,
               start_time=0.0)

c = Connectivity.from_multitaper(m)

assert(np.all(c.phase_lag_index() >= 0))

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-1-027228a3414c> in <module>
     13 c = Connectivity.from_multitaper(m)
     14 
---> 15 assert(np.all(c.phase_lag_index() >= 0))

AssertionError: 
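
For reference, a minimal NumPy sketch of the definition quoted above, PLI = |E[sign(Im(Sxy))]|, which is non-negative by construction (the random data here is purely illustrative and not tied to the package):

import numpy as np

rng = np.random.default_rng(1)
# Illustrative imaginary parts of a cross-spectrum: 500 trials x 20 frequencies.
imag_sxy = rng.standard_normal((500, 20))

# Expectation over trials, then the absolute value -- the result cannot be negative.
pli = np.abs(np.mean(np.sign(imag_sxy), axis=0))
assert np.all(pli >= 0)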

[JOSS] PLV returns complex values

The phase_locking_value method returns complex values because the np.abs is missing. I guess you made this choice because it's then used in the PPC, but from the user side, especially for non-experts, it might be confusing.

A quick fix could be to have a _phase_locking_value method without the absolute value and:

def phase_locking_value(self):
    return xp.abs(self._phase_locking_value())

The PPC could then use the _phase_locking_value method instead. What do you think?

xref: JOSS (openjournals/joss-reviews#4840)

[JOSS] Add badges to README

It'd be very convenient to have the following "badges" in your README:

  • Travis status, given that this CI is still running as expected. If it isn't, now would be a good time to remove or fix it
  • PyPI status
  • Conda status (I can't find your "edeno" channel and the package right now)
  • readthedocs

Furthermore, it'd be nice to have a badge for test coverage, for example using https://about.codecov.io/
See this repo for an example: https://github.com/bids-standard/pybv/blob/4af6bbab86e6151a3a64d759d4eca4a15e9b253a/.github/workflows/python_tests.yml#L81-L85

xref: openjournals/joss-reviews#4840

[JOSS] GPU computations

I tested GPU computations, but I'm not sure I did it correctly, as I couldn't observe differences in computing time (see notebook here). Do you have an idea why?

Also, in the docs and in the Multitaper file you define an environment variable SPECTRAL_CONNECTIVITY_ENABLE_GPU, but in the Connectivity file it's SPEC_CON_ENABLE_GPU.

xref: JOSS (openjournals/joss-reviews#4840)

Equation in tutorial example ipynb not correct

Hi,
The equation in Baccala Example 2 currently reads x_1 = 0.5x_1(n-1) + 0.3x_2(n-1) + 0.4x_3x(n-1) + w_1(n). There is an extra x in the equation. Should it be x_1 = 0.5x_1(n-1) + 0.3x_2(n-1) + 0.4x_3(n-1) + w_1(n)?
It is confusing for beginners.

issues with cupy

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[3], line 82
     77 data = np.squeeze(data)
     79 print(f'data size is {data.shape[0]} times and {data.shape[1]}   channels')
---> 82 connectivity_data  = calculate_connectivity(data, sfreq, time_halfbandwidth_product,
     83                                             time_window_duration, time_window_step)
     85 filename = f'{MouseID}_20s_spectral_connectivity_Day_{day}.mat'
     87 print(f'saving file to {filename} to path' )

Cell In[3], line 25, in calculate_connectivity(data, sfreq, time_halfbandwidth_product, time_window_duration, time_window_step)
     13 """time_series : array, shape (n_time_samples, n_trials, n_signals) or
     14                            (n_time_samples, n_signals)
     15 """
     17 m = Multitaper(data, sampling_frequency=sfreq,
     18                time_halfbandwidth_product=time_halfbandwidth_product,
     19                start_time=0,
     20                detrend_type = None,
     21                time_window_duration = time_window_duration,
     22                time_window_step = time_window_step)
---> 25 c = Connectivity.from_multitaper(m)
     27 connectivity_data = dict(
     28     
     29     pairwise_spectral_granger_prediction     = c.pairwise_spectral_granger_prediction(), 
     30    
     31 )
     33 return connectivity_data

File ~\Anaconda3\envs\myenv\lib\site-packages\spectral_connectivity\connectivity.py:166, in Connectivity.from_multitaper(cls, multitaper_instance, expectation_type, blocks, dtype)
    156 @classmethod
    157 def from_multitaper(
    158     cls,
   (...)
    162     dtype=xp.complex128,
    163 ):
    164     """Construct connectivity class using a multitaper instance"""
    165     return cls(
--> 166         fourier_coefficients=multitaper_instance.fft(),
    167         expectation_type=expectation_type,
    168         time=multitaper_instance.time,
    169         frequencies=multitaper_instance.frequencies,
    170         blocks=blocks,
    171         dtype=dtype,
    172     )

File ~\Anaconda3\envs\myenv\lib\site-packages\spectral_connectivity\transforms.py:249, in Multitaper.fft(self)
    239 """Compute the fast Fourier transform using the multitaper method.
    240 
    241 Returns
   (...)
    244 
    245 """
    246 time_series = _add_axes(self.time_series)
    247 time_series = _sliding_window(
    248     time_series,
--> 249     window_size=self.n_time_samples_per_window,
    250     step_size=self.n_time_samples_per_step,
    251     axis=0,
    252 )
    253 if self.detrend_type is not None:
    254     time_series = detrend(time_series, type=self.detrend_type)

File ~\Anaconda3\envs\myenv\lib\site-packages\spectral_connectivity\transforms.py:176, in Multitaper.n_time_samples_per_window(self)
    173     self._n_time_samples_per_window = self.time_series.shape[0]
    174 elif self._time_window_duration is not None:
    175     self._n_time_samples_per_window = int(
--> 176         xp.round(self.time_window_duration * self.sampling_frequency)
    177     )
    178 return self._n_time_samples_per_window

AttributeError: module 'cupy' has no attribute 'round'
