
ephypype's Introduction


README

Description

ephypype is a NeuroPycon package of functions for electrophysiology analysis; it can be used from graphpype and nipype.

Documentation

https://neuropycon.github.io/ephypype

Installation

Requirements

ephypype works with Python 3.

The dependencies (mne, nipype) are installed automatically during the ephypype installation (see the documentation linked above).

We also recommend installing MNE-Python by following its installation instructions.

Install ephypype

To install ephypype, use the following command:

$ pip install ephypype

Alternatively, you can clone the latest version from GitHub and install it:

$ git clone https://github.com/neuropycon/ephypype.git
$ cd ephypype
$ python setup.py develop

Software

Freesurfer
  1. Download the Freesurfer software:

https://surfer.nmr.mgh.harvard.edu/fswiki/DownloadAndInstall

  2. Follow the installation instructions:

https://surfer.nmr.mgh.harvard.edu/fswiki/LinuxInstall

ephypype's People

Contributors

annapasca, davidmeunier79, dmalt, etiennecmb, jasmainak, wmvanvliet


ephypype's Issues

mne.connectivity module, now mne-connectivity library

Dear NeuroPycon team,
I just came across your library and wanted to test the preprocessing pipeline, sensor-level spectral connectivity, and power spectrum analysis notebooks with some resting-state EEG datasets.
However, after installing NeuroPycon on Google Colab (and in Jupyter notebooks as well), I got some errors when importing ephypype.

After running:
! pip install ephypype
! pip install nipype
! pip install mne
! pip install mne_bids
import os.path as op
import numpy as np
import nipype.pipeline.engine as pe
import mne
import mne_bids
import ephypype

I get the following error in Colab

ModuleNotFoundError Traceback (most recent call last)
in ()
----> 1 import ephypype

4 frames
/usr/local/lib/python3.7/dist-packages/ephypype/spectral.py in ()
10 from scipy.io import savemat
11
---> 12 from mne.connectivity import spectral_connectivity
13 from mne.viz import circular_layout, plot_connectivity_circle
14

ModuleNotFoundError: No module named 'mne.connectivity'

  • I suspect this is explained by changes in MNE (around mid-2021): the connectivity measures were moved to a separate repository, now "mne-connectivity" (https://mne.tools/mne-connectivity/stable/index.html), replacing the previous "mne.connectivity" module.
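A possible workaround (an assumption on my side, not an official fix) is to alias the new mne-connectivity package under the old module name before importing ephypype. Note that even then, renamed functions may still be missing, so pinning a pre-1.0 MNE release may be simpler.

```python
import importlib
import sys

def alias_module(old_name, new_name):
    """Make imports of `old_name` resolve to the installed `new_name` package."""
    module = importlib.import_module(new_name)
    sys.modules[old_name] = module
    parent_name, _, child_name = old_name.rpartition(".")
    if parent_name:  # also expose the alias as an attribute of its parent
        setattr(importlib.import_module(parent_name), child_name, module)

# Run before `import ephypype` (assumes mne and mne-connectivity are installed):
# alias_module("mne.connectivity", "mne_connectivity")
```

Once the alias is registered, `from mne.connectivity import ...` resolves against mne_connectivity, since the import system checks sys.modules for the dotted name first.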

I have also tested the above code in Jupyter and obtained a different error, this time in the mne.datasets.utils fetching.
Thus,
import ephypype

Gives the following output

ImportError Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_15740/2868412181.py in
----> 1 import ephypype

~\anaconda3\lib\site-packages\ephypype\__init__.py in
1 #from . import interfaces # noqa
2 from . import gather # noqa
----> 3 from . import datasets
4
5 from .gather import (gather_conmats, gather_results)

~\anaconda3\lib\site-packages\ephypype\datasets.py in
7 import os
8 import zipfile
----> 9 from mne.datasets.utils import _fetch_file
10
11

ImportError: cannot import name '_fetch_file' from 'mne.datasets.utils' (C:\Users\myuser\anaconda3\lib\site-packages\mne\datasets\utils.py).

The same error occurred when I tried to import specific functions, or to import the datasets module directly rather than the whole ephypype library; it raised the same error in the mne.datasets.utils fetching:

from ephypype.nodes import create_iterator, create_datagrabber
from ephypype.datasets import fetch_omega_dataset

Can you kindly let me know whether you are aware of these issues caused by MNE's recent changes?
I'm relatively new to Python, so I haven't been able to come up with a solution myself, but I hope this report helps.
Thanks in advance for your help!
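For anyone hitting the second error: newer MNE releases removed the private _fetch_file helper that ephypype imported. A stdlib-only stand-in (a sketch, without the original's hash checking or progress bar) could look like:

```python
import urllib.request
from pathlib import Path

def fetch_file(url, dest):
    """Download `url` to `dest` unless it already exists; return the Path."""
    dest = Path(dest)
    if not dest.exists():
        dest.parent.mkdir(parents=True, exist_ok=True)
        urllib.request.urlretrieve(url, dest)
    return dest
```

This only mirrors the plain-download use case; relying on a library's private function is exactly what broke here, so a public helper like this is more future-proof.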

failing examples

I get this error when trying to run an example on master:

1.1.2
*** main_path -> /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega ***
180906-14:52:55,261 nipype.workflow INFO:
	 Generated workflow graph: /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/graph.png (graph2use=colored, simple_form=True).
180906-14:52:55,433 nipype.workflow INFO:
	 Workflow preprocessing_pipeline settings: ['check', 'execution', 'logging', 'monitoring']
180906-14:52:55,526 nipype.workflow INFO:
	 Running in parallel.
180906-14:52:55,572 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180906-14:52:55,645 nipype.workflow INFO:
	 [Node] Setting-up "preprocessing_pipeline.datasource" in "/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/_sess_index_ses-0001_subject_id_sub-0003/datasource".
180906-14:52:55,698 nipype.workflow INFO:
	 [Node] Running "datasource" ("nipype.interfaces.io.DataGrabber")
180906-14:52:55,725 nipype.workflow INFO:
	 [Node] Finished "preprocessing_pipeline.datasource".
180906-14:52:57,570 nipype.workflow INFO:
	 [Job 0] Completed (preprocessing_pipeline.datasource).
180906-14:52:57,573 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180906-14:52:57,651 nipype.workflow INFO:
	 [Job 1] Cached (preprocessing_pipeline.preproc_meeg.ds2fif).
180906-14:52:59,672 nipype.workflow INFO:
	 [Job 2] Cached (preprocessing_pipeline.preproc_meeg.preproc).
180906-14:53:01,673 nipype.workflow INFO:
	 [Node] Setting-up "preprocessing_pipeline.preproc_meeg.ica" in "/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica".
180906-14:53:01,694 nipype.workflow INFO:
	 [Node] Running "ica" ("ephypype.interfaces.mne.preproc.CompIca")
Opening raw data file /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/preproc/sub-0003_task-rest_run-01_meg_raw_filt_dsamp.fif...
/home/mainak/Desktop/projects/github_repos/ephypype/ephypype/preproc.py:50: RuntimeWarning: This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/preproc/sub-0003_task-rest_run-01_meg_raw_filt_dsamp.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
  raw = read_raw_fif(fif_file, preload=True)
This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/preproc/sub-0003_task-rest_run-01_meg_raw_filt_dsamp.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
180906-14:53:03,573 nipype.workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 27.75/27.95, Free processors: 2/3.                     
                     Currently running:
                       * preprocessing_pipeline.preproc_meeg.ica
    Read 5 compensation matrices
    Range : 0 ... 479999 =      0.000 ...   599.999 secs
Ready.
Current compensation grade : 3
Reading 0 ... 479999  =      0.000 ...   599.999 secs...
Fitting ICA to data using 270 channels (please be patient, this may take a while)
Inferring max_pca_components from picks
    Rejecting  epoch based on MAG : [u'MLO12-4408', u'MLO13-4408', u'MLO23-4408']
Artifact detected in [294400, 296000]
Selection by explained variance: 27 components
Fitting ICA took 52.8s.
/home/mainak/Desktop/projects/github_repos/ephypype/ephypype/preproc.py:70: RuntimeWarning: This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
  ica_src.save(ica_ts_file)
This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
Writing /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif
Closing /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif [done]
Using channel ECG to identify heart beats.
Setting up band-pass filter from 8 - 16 Hz
Filter length of 8192 samples (10.240 sec) selected
Number of ECG events detected : 529 (average pulse 52 / min.)
529 matching events found
No baseline correction applied
Not setting metadata
180906-14:54:09,662 nipype.workflow WARNING:
	 [Node] Error on "preprocessing_pipeline.preproc_meeg.ica" (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica)
180906-14:54:11,663 nipype.workflow ERROR:
	 Node ica.a0 failed to run on host mainak-ThinkPad-W540.
180906-14:54:11,663 nipype.workflow ERROR:
	 Saving crash info to /home/mainak/Desktop/crash-20180906-145411-mainak-ica.a0-77c8dd8b-5461-413e-ad45-fbe44a7adf75.pklz
Traceback (most recent call last):
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/plugins/multiproc.py", line 69, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/nodes.py", line 471, in run
    result = self._run_interface(execute=True)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/nodes.py", line 555, in _run_interface
    return self._run_command(execute)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/nodes.py", line 635, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/interfaces/base/core.py", line 521, in run
    runtime = self._run_interface(runtime)
  File "/home/mainak/Desktop/projects/github_repos/ephypype/ephypype/interfaces/mne/preproc.py", line 63, in _run_interface
    eog_ch_name, n_components, reject)
  File "/home/mainak/Desktop/projects/github_repos/ephypype/ephypype/preproc.py", line 89, in compute_ica
    ch_name=ecg_ch_name)
  File "<string>", line 2, in create_ecg_epochs
  File "/home/mainak/Desktop/projects/github_repos/mne-python/mne/utils.py", line 724, in verbose
    return function(*args, **kwargs)
  File "/home/mainak/Desktop/projects/github_repos/mne-python/mne/preprocessing/ecg.py", line 336, in create_ecg_epochs
    verbose=verbose, preload=preload)
  File "<string>", line 2, in __init__
  File "/home/mainak/Desktop/projects/github_repos/mne-python/mne/utils.py", line 724, in verbose
    return function(*args, **kwargs)
  File "/home/mainak/Desktop/projects/github_repos/mne-python/mne/epochs.py", line 2035, in __init__
    verbose=verbose)
  File "/home/mainak/Desktop/projects/github_repos/mne-python/mne/epochs.py", line 348, in __init__
    self.info = pick_info(self.info, picks)
  File "/home/mainak/Desktop/projects/github_repos/mne-python/mne/io/pick.py", line 408, in pick_info
    'channels' % (current_comp, comps_missing))
RuntimeError: Compensation grade 3 has been applied, but compensation channels are missing: [u'BG1-4408', u'BG2-4408', u'BG3-4408', u'G11-4408', u'G12-4408', u'G13-4408', u'G22-4408', u'G23-4408', u'P11-4408', u'P12-4408', u'P13-4408', u'P23-4408', u'Q11-4408', u'Q12-4408', u'Q23-4408', u'R12-4408', u'R13-4408', u'R22-4408']
Either remove compensation or pick compensation channels

180906-14:54:11,669 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 0 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180906-14:54:13,642 nipype.workflow INFO:
	 ***********************************
180906-14:54:13,643 nipype.workflow ERROR:
	 could not run node: preprocessing_pipeline.preproc_meeg.ica.a0
180906-14:54:13,643 nipype.workflow INFO:
	 crashfile: /home/mainak/Desktop/crash-20180906-145411-mainak-ica.a0-77c8dd8b-5461-413e-ad45-fbe44a7adf75.pklz
180906-14:54:13,643 nipype.workflow INFO:
	 ***********************************
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-1-69d92eebc094> in <module>()
     88 
     89     # Run workflow locally on 3 CPUs
---> 90     main_workflow.run(plugin='MultiProc', plugin_args={'n_procs': 3})

/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/workflows.pyc in run(self, plugin, plugin_args, updatehash)
    593         if str2bool(self.config['execution']['create_report']):
    594             self._write_report_info(self.base_dir, self.name, execgraph)
--> 595         runner.run(execgraph, updatehash=updatehash, config=self.config)
    596         datestr = datetime.utcnow().strftime('%Y%m%dT%H%M%S')
    597         if str2bool(self.config['execution']['write_provenance']):

/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/plugins/base.pyc in run(self, graph, config, updatehash)
    190 
    191         self._remove_node_dirs()
--> 192         report_nodes_not_run(notrun)
    193 
    194         # close any open resources

/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/plugins/tools.pyc in report_nodes_not_run(notrun)
     80                 logger.debug(subnode._id)
     81         logger.info("***********************************")
---> 82         raise RuntimeError(('Workflow did not execute cleanly. '
     83                             'Check log for details'))
     84 
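The RuntimeError above itself suggests the fix: "Either remove compensation or pick compensation channels". A hypothetical workaround along the first route (assumes `raw` is an mne.io.Raw from a CTF recording; compensation_grade and apply_gradient_compensation are MNE-Python's API for this):

```python
def remove_compensation(raw):
    """Return `raw` without software gradient compensation (grade 0).

    `raw` is expected to behave like an mne.io.Raw object; for CTF
    recordings MNE-Python exposes `compensation_grade` and
    `apply_gradient_compensation(grade)`.
    """
    if raw.compensation_grade != 0:
        raw = raw.apply_gradient_compensation(0)
    return raw
```

Calling this before create_ecg_epochs would avoid picking a channel subset that is missing the compensation reference channels; whether removing compensation is acceptable for the analysis is a separate scientific question.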

verbose

Add a verbose parameter to the functions in order to suppress the print output.
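A sketch of what the requested flag could look like (the function and messages are illustrative, not ephypype's actual API):

```python
def preprocess(fif_file, verbose=False):
    """Illustrative processing step that prints progress only when asked."""
    def log(msg):
        if verbose:
            print(msg)

    log("Opening raw data file %s ..." % fif_file)
    # ... the actual MNE calls would go here ...
    log("Done.")
    return fif_file
```

With `verbose=False` as the default, existing pipelines become quiet without any call-site changes.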

add folder to circleci

How can I add a folder to CircleCI?
In particular, I need to add the sample_omega_bids/FSF folder. I have already updated the Dropbox files.

Thanks!

drop support for windows (and python 2?)

In @dmalt's last PR we noticed that nipype is not really supported on Windows, so we don't need AppVeyor in that case.

Also, I see that Travis is broken on Python 2. Is it easy enough to fix, @annapasca? We should keep Python 2 support if it's easy; if it's too much work, we should just update the README and drop support.

example stuck

@annapasca I tried to run the example from the branch in your PR #22. It no longer throws an error, but it gets stuck at some point. This is as far as I could go:

/home/mainak/anaconda2/lib/python2.7/site-packages/matplotlib/__init__.py:1405: UserWarning: 
This call to matplotlib.use() has no effect because the backend has already
been chosen; matplotlib.use() must be called *before* pylab, matplotlib.pyplot,
or matplotlib.backends is imported for the first time.

  warnings.warn(_use_error_msg)
1.1.2
*** main_path -> /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega ***
180911-10:43:31,747 nipype.workflow INFO:
	 Generated workflow graph: /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/graph.png (graph2use=colored, simple_form=True).
180911-10:43:32,18 nipype.workflow INFO:
	 Workflow preprocessing_pipeline settings: ['check', 'execution', 'logging', 'monitoring']
180911-10:43:32,107 nipype.workflow INFO:
	 Running in parallel.
180911-10:43:32,132 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180911-10:43:32,209 nipype.workflow INFO:
	 [Node] Setting-up "preprocessing_pipeline.datasource" in "/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/_sess_index_ses-0001_subject_id_sub-0003/datasource".
180911-10:43:32,249 nipype.workflow INFO:
	 [Node] Running "datasource" ("nipype.interfaces.io.DataGrabber")
180911-10:43:32,277 nipype.workflow INFO:
	 [Node] Finished "preprocessing_pipeline.datasource".
180911-10:43:34,130 nipype.workflow INFO:
	 [Job 0] Completed (preprocessing_pipeline.datasource).
180911-10:43:34,135 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180911-10:43:34,255 nipype.workflow INFO:
	 [Job 1] Cached (preprocessing_pipeline.preproc_meeg.ds2fif).
180911-10:43:36,241 nipype.workflow INFO:
	 [Job 2] Cached (preprocessing_pipeline.preproc_meeg.preproc).
180911-10:43:38,244 nipype.workflow INFO:
	 [Node] Setting-up "preprocessing_pipeline.preproc_meeg.ica" in "/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica".
180911-10:43:38,261 nipype.workflow INFO:
	 [Node] Running "ica" ("ephypype.interfaces.mne.preproc.CompIca")
Opening raw data file /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/preproc/sub-0003_task-rest_run-01_meg_raw_filt_dsamp.fif...
/home/mainak/Desktop/projects/github_repos/ephypype/ephypype/preproc.py:50: RuntimeWarning: This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/preproc/sub-0003_task-rest_run-01_meg_raw_filt_dsamp.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
  raw = read_raw_fif(fif_file, preload=True)
This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/preproc/sub-0003_task-rest_run-01_meg_raw_filt_dsamp.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
180911-10:43:40,133 nipype.workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 27.75/27.95, Free processors: 2/3.                     
                     Currently running:
                       * preprocessing_pipeline.preproc_meeg.ica
    Read 5 compensation matrices
    Range : 0 ... 479999 =      0.000 ...   599.999 secs
Ready.
Current compensation grade : 3
Reading 0 ... 479999  =      0.000 ...   599.999 secs...
Fitting ICA to data using 270 channels (please be patient, this may take a while)
Inferring max_pca_components from picks
    Rejecting  epoch based on MAG : [u'MLO12-4408', u'MLO13-4408', u'MLO23-4408']
Artifact detected in [294400, 296000]
Selection by explained variance: 27 components
Fitting ICA took 69.6s.
/home/mainak/Desktop/projects/github_repos/ephypype/ephypype/preproc.py:70: RuntimeWarning: This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
  ica_src.save(ica_ts_file)
This filename (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif) does not conform to MNE naming conventions. All raw files should end with raw.fif, raw_sss.fif, raw_tsss.fif, raw.fif.gz, raw_sss.fif.gz or raw_tsss.fif.gz
Writing /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif
Closing /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/preprocessing_pipeline/preproc_meeg/_sess_index_ses-0001_subject_id_sub-0003/ica/sub-0003_task-rest_run-01_meg_raw_filt_dsamp_ica-tseries.fif [done]
Using channel ECG to identify heart beats.
Setting up band-pass filter from 8 - 16 Hz
Filter length of 8192 samples (10.240 sec) selected
Number of ECG events detected : 529 (average pulse 52 / min.)
529 matching events found
No baseline correction applied
Not setting metadata
Loading data for 529 events and 801 original time points ...
0 bad epochs dropped
Reconstructing ECG signal from Magnetometers
*** EOG CHANNELS FOUND ***
Using channel HEOG and VEOG as EOG channels
... filtering ICA sources
Setting up band-pass filter from 1 - 10 Hz
Filter length of 8192 samples (10.240 sec) selected
... filtering target
Setting up band-pass filter from 1 - 10 Hz
Filter length of 8192 samples (10.240 sec) selected
... filtering ICA sources
Setting up band-pass filter from 1 - 10 Hz
Filter length of 8192 samples (10.240 sec) selected
... filtering target
Setting up band-pass filter from 1 - 10 Hz
Filter length of 8192 samples (10.240 sec) selected
Using channel HEOG and VEOG as EOG channels
EOG channel index for this subject is: [298 299]
Filtering the data to remove DC offset to help distinguish blinks from saccades
Setting up band-pass filter from 2 - 45 Hz
Filter length of 8192 samples (10.240 sec) selected
Setting up band-pass filter from 2 - 45 Hz
Filter length of 8192 samples (10.240 sec) selected
Setting up band-pass filter from 1 - 10 Hz
Filter length of 8192 samples (10.240 sec) selected
Now detecting blinks and generating corresponding events
Number of EOG events detected : 22
22 matching events found
No baseline correction applied
Not setting metadata
Loading data for 22 events and 801 original time points ...
2 bad epochs dropped
Embedding : jquery-1.10.2.min.js
Embedding : jquery-ui.min.js
Embedding : bootstrap.min.js
Embedding : jquery-ui.min.css
Embedding : bootstrap.min.css

ep2ts

Add the possibility to choose the condition and to specify the channels to pick.

delete branches

@jasmainak I understand it is wrong to keep stale branches in neuropycon.
ephypype currently has dev and fixed_inv.

Is the command to delete them

git push upstream --delete <branch-name>

?

Thanks!!!

Freesurfer

Create a new pipeline running recon-all + BEM.

how to run power example?

I get the following error on the power example:

In [1]: %run plot_power.py
*** main_path -> /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega ***
180914-23:59:07,341 nipype.workflow INFO:
	 Generated workflow graph: /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/power_pipeline/graph.png (graph2use=colored, simple_form=True).
180914-23:59:07,351 nipype.workflow INFO:
	 Workflow power_pipeline settings: ['check', 'execution', 'logging', 'monitoring']
180914-23:59:07,355 nipype.workflow INFO:
	 Running in parallel.
180914-23:59:07,357 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180914-23:59:07,430 nipype.workflow INFO:
	 [Node] Setting-up "power_pipeline.datasource" in "/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/power_pipeline/_sess_index_ses-0001_subject_id_sub-0003/datasource".
180914-23:59:07,436 nipype.workflow INFO:
	 [Node] Running "datasource" ("nipype.interfaces.io.DataGrabber")
180914-23:59:07,441 nipype.workflow WARNING:
	 [Node] Error on "power_pipeline.datasource" (/home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/power_pipeline/_sess_index_ses-0001_subject_id_sub-0003/datasource)
180914-23:59:09,389 nipype.workflow ERROR:
	 Node datasource.a0 failed to run on host mainak-ThinkPad-W540.
180914-23:59:09,389 nipype.workflow ERROR:
	 Saving crash info to /home/mainak/Desktop/projects/github_repos/ephypype/examples/crash-20180914-235909-mainak-datasource.a0-06204c04-801f-4abc-9648-ced9e5624c68.pklz
Traceback (most recent call last):
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/plugins/multiproc.py", line 69, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/nodes.py", line 471, in run
    result = self._run_interface(execute=True)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/nodes.py", line 555, in _run_interface
    return self._run_command(execute)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/nodes.py", line 635, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/interfaces/base/core.py", line 523, in run
    outputs = self.aggregate_outputs(runtime)
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/interfaces/base/core.py", line 597, in aggregate_outputs
    predicted_outputs = self._list_outputs()
  File "/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/interfaces/io.py", line 1259, in _list_outputs
    raise IOError(msg)
IOError: Output key: raw_file Template: /home/mainak/Desktop/projects/github_repos/BIDS-examples/sample_BIDS_omega/*sub-0003/ses-0001/meg/sub-0003*rest*ica.fif returned no files

180914-23:59:09,396 nipype.workflow INFO:
	 [MultiProc] Running 0 tasks, and 0 jobs ready. Free memory (GB): 27.95/27.95, Free processors: 3/3.
180914-23:59:11,361 nipype.workflow INFO:
	 ***********************************
180914-23:59:11,361 nipype.workflow ERROR:
	 could not run node: power_pipeline.datasource.a0
180914-23:59:11,361 nipype.workflow INFO:
	 crashfile: /home/mainak/Desktop/projects/github_repos/ephypype/examples/crash-20180914-235909-mainak-datasource.a0-06204c04-801f-4abc-9648-ced9e5624c68.pklz
180914-23:59:11,361 nipype.workflow INFO:
	 ***********************************
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
/home/mainak/Desktop/projects/github_repos/ephypype/examples/plot_power.py in <module>()
     84     main_workflow.config['execution'] = {'remove_unnecessary_outputs': 'false'}
     85 
---> 86     main_workflow.run(plugin='MultiProc', plugin_args={'n_procs': 3})

/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/engine/workflows.pyc in run(self, plugin, plugin_args, updatehash)
    593         if str2bool(self.config['execution']['create_report']):
    594             self._write_report_info(self.base_dir, self.name, execgraph)
--> 595         runner.run(execgraph, updatehash=updatehash, config=self.config)
    596         datestr = datetime.utcnow().strftime('%Y%m%dT%H%M%S')
    597         if str2bool(self.config['execution']['write_provenance']):

/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/plugins/base.pyc in run(self, graph, config, updatehash)
    190 
    191         self._remove_node_dirs()
--> 192         report_nodes_not_run(notrun)
    193 
    194         # close any open resources

/home/mainak/anaconda2/lib/python2.7/site-packages/nipype-1.1.2-py2.7.egg/nipype/pipeline/plugins/tools.pyc in report_nodes_not_run(notrun)
     80                 logger.debug(subnode._id)
     81         logger.info("***********************************")
---> 82         raise RuntimeError(('Workflow did not execute cleanly. '
     83                             'Check log for details'))
     84 

RuntimeError: Workflow did not execute cleanly. Check log for details

@annapasca it's asking for ICA files. Where do I get them from?

gh-pages

Hi @jasmainak !

What is the best way to push files to the gh-pages branch of neuropycon/ephypype? Thanks! And no rush :)

add circleci

We need to add CircleCI to check that the examples run correctly ...

iterate on API + plans

hello guys! @k-jerbi @annapasca @EtienneCmb @dmalt @davidmeunier79

after a conversation with @k-jerbi today, I carefully started looking at the documentation of ephypype. I'm still trying to get the big picture, but I made some progress :) I think the main blockers for me are the following:

  1. API is too complicated
  2. examples don't run if I copy-paste
  3. neuropycon is three packages --> does it need to be?

The semi-minor blockers are:

  1. test coverage is too low
  2. code does not follow professional standards (pep8, documentation style etc...)

I think 1-3 really must be addressed ASAP (in the next week or so) but the others can progress in parallel afterwards. That was the tl;dr version. Now, let me expand on my points:

  1. I think this is important to fix so that we can clarify what exactly ephypype delivers. I finally found the API documentation. This is way too verbose (to be fixed by me with help from @annapasca) -- and the main message gets lost.

    From what I understand, ephypype delivers pipelines, nodes, and interfaces -- for MEG/EEG processing -- which can be plugged into the nipype engine. These template pipelines can make life easier for someone who has not seen much code in their life. You just provide the params file and it works (maybe?)

    However, there is a flaw in this narrative: looking at the examples, they are full of code ...

    Thus, I think we need to simplify this. Specifically, the functions create_infosource and create_datasource should be part of the nodes that we provide to users. And in the example, we should just import these nodes from ephypype. Then we should just use nipype, create a workflow using the ephypype nodes and run it -- in less than 10 lines of code. If it takes any more, I would ask -- why not use MNE directly? :-) I suggest that @annapasca takes care of this with help from me.

  2. See my bug report here for instance. Data is missing, and we can't expect users to be patient with the software if data crucial to the examples is missing. I suggest a very crude solution -- use the MNE private function to fetch the data from Dropbox or GitHub as you see fit. Also remove the hard-coded paths in the params file: all paths should be relative so that the examples work out of the box without any intervention on the user's end. I suggest @davidmeunier79 takes care of this.

  3. I think neuropycon_cli should be inside ephypype, as it's just one file. In MNE, the equivalent lives inside a folder called commands/, and it should not use click as a dependency (if possible); otherwise we are just creating a "dependency hell". I think this should be done by @dmalt. I'm hoping it's just one afternoon's work :)

    The documentation for ephypype should also live inside a doc/ folder rather than in a separate repository. My WIP pull request to fix this is #30. Once it is merged, we should set up CircleCI (I'm banking on @EtienneCmb to help me with this ;) so that the examples are built continuously and we can guarantee that they run on the user's machine. However, note that the CircleCI part is contingent on 2) being finished first.
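One way the "no hard-coded paths" point above could look in practice (a sketch; the params keys are illustrative, not ephypype's actual schema): keep only relative paths in the params file and join them to a single data root at load time.

```python
from pathlib import Path

def resolve_params_paths(params, root):
    """Return a copy of `params` with every *_path entry joined to `root`.

    Keys not ending in '_path' are passed through unchanged; the input
    dict is not modified.
    """
    root = Path(root)
    out = dict(params)
    for key, value in params.items():
        if key.endswith("_path"):
            out[key] = str(root / value)
    return out
```

An example would then ship a params file with entries like `"main_path": "sample_BIDS_omega"` and resolve them against wherever the dataset was downloaded, with no user editing required.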

deepcopy error when running workflow in python3

Can't make nipype work under python3.
I've created a simple pipeline to disentangle command-line code from nipype and ephypype.
When I run the workflow I get the following error:

  File "test_cli.py", line 91, in <module>
    workflow.run(plugin='Linear')
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/site-packages/nipype/pipeline/engine/workflows.py", line 570, in run
    flatgraph = self._create_flat_graph()
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/site-packages/nipype/pipeline/engine/workflows.py", line 830, in _create_flat_graph
    workflowcopy = deepcopy(self)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 215, in _deepcopy_list
    append(deepcopy(a, memo))
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 220, in _deepcopy_tuple
    y = [deepcopy(a, memo) for a in x]
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 220, in <listcomp>
    y = [deepcopy(a, memo) for a in x]
  File "/home/dmalt/Code/python/neuropycon/npyc3/lib/python3.6/copy.py", line 169, in deepcopy
    rv = reductor(4)
TypeError: can't pickle dict_keys objects

It seems to me that something goes wrong with looped links inside the workflow object. The error appears when nipype tries to deepcopy the workflow instance.

In python2.7 the same code works fine. I tried to google for similar problems but found only one
Google Groups thread like this, and it got no responses.

Have you guys seen a problem like this? 
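For what it's worth, the failure reproduces outside nipype: `dict_keys` view objects are not picklable in Python 3, so any attribute inside the workflow that holds a dict view is enough to break `deepcopy`. A minimal sketch (the `params` dict is made up for illustration):

```python
import copy

params = {"raw_file": "sub01_raw.fif", "sfreq": 600.0}

# Storing a dict_keys view instead of a list breaks deepcopy,
# because deepcopy falls back to the (unsupported) pickle protocol.
try:
    copy.deepcopy(params.keys())
except TypeError as err:
    print("deepcopy failed:", err)

# Materializing the view into a list before storing it avoids the error.
assert copy.deepcopy(list(params.keys())) == ["raw_file", "sfreq"]
```

So the likely culprit is some place in the workflow (or in our node definitions) where `.keys()` is stored directly instead of `list(...keys())`.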


Epoching node is broken.

After style changes in code epoching stopped working.
This happens because in preproc._create_epochs the Epochs constructor call was replaced with the read_epochs function, presumably by accident:

from mne import pick_types, read_epochs

epochs = read_epochs(raw, events=events, tmin=0, tmax=ep_length,
                     preload=True, picks=picks, proj=False,
                     flat=flat, reject=reject)

It used to be Epochs instead of read_epochs in both cases.
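For reference, the fixed-length events array that the epoching node needs before calling the Epochs constructor can be built with plain numpy. This is only a sketch: the helper name is hypothetical, and MNE's own mne.make_fixed_length_events does the same job.

```python
import numpy as np


def fixed_length_events(n_samples, sfreq, ep_length, event_id=1):
    """Build an MNE-style (n_events, 3) events array with one event
    every `ep_length` seconds of a recording that is `n_samples`
    samples long at sampling rate `sfreq` (Hz)."""
    step = int(round(ep_length * sfreq))
    onsets = np.arange(0, n_samples - step + 1, step)
    # Columns: sample index of onset, previous event id (0), event id.
    return np.column_stack([onsets,
                            np.zeros_like(onsets),
                            np.full_like(onsets, event_id)])


# 10 s of data at 100 Hz cut into 2 s epochs -> 5 events.
events = fixed_length_events(n_samples=1000, sfreq=100.0, ep_length=2.0)
```

Feeding an array like this to Epochs (not read_epochs, which expects a saved -epo.fif file) should restore the old behaviour.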

sample dataset

Up to now the example pipelines use one subject of the OMEGA dataset.
Would it be better to use the MNE sample dataset?
