eelbrain's Issues

The math under TTestRelated

Hey!

We are actively using the eelbrain library for our studies on EEG and language. We ran into a question during the statistical analysis:
could you please explain what TTestRelated actually does under the hood?

For example, I am testing the difference between the correlations of two models. One model uses only word onset as a predictor, and the second uses two predictors (onset + SD).

tests['r_compare_onset+SD'] = eelbrain.testnd.TTestRelated(
    model_fits['onset'], model_fits['onset+SD'], tfce=True, samples=1000, tail=1)

In model_fits['onset+SD'] I have correlation values.
(screenshot of the correlation values attached)

I would like to know what is actually being permuted here.

Thanks!
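For intuition, here is a generic sketch of a paired (related-measures) permutation test, not the exact eelbrain implementation, which works on NDVars and combines the permutation distribution with TFCE for multiple-comparison control across sources and time. With related measures, the exchangeable quantity is the sign of each subject's difference, so each permutation randomly flips the sign of each subject's difference map (here, r(onset+SD) minus r(onset)) and recomputes the test statistic; the observed statistic is then compared against the distribution of the maximum statistic over permutations. All names and the toy data below are placeholders.

import numpy as np

rng = np.random.default_rng(0)

def paired_t(diff):
    """t statistic of paired differences; diff has shape (n_subjects, n_tests)."""
    n = diff.shape[0]
    return diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n))

# toy per-subject difference maps, e.g. r(onset+SD) - r(onset) at each source
diff = rng.normal(0.02, 0.05, size=(20, 100))

t_obs = paired_t(diff)
n_samples = 1000
max_dist = np.empty(n_samples)
for i in range(n_samples):
    signs = rng.choice([-1, 1], size=(diff.shape[0], 1))  # flip each subject's difference
    max_dist[i] = paired_t(signs * diff).max()            # max statistic (stand-in for TFCE/cluster mass)
p_corrected = (max_dist >= t_obs.max()).mean()            # one-tailed, corrected p-value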

plot.brain.brain(roi) crashes immediately

from eelbrain import *
import mne
from matplotlib import pyplot
from aphasia_experiment import e

epoch = 'fixation'
pmin = 0.05
bl = False

if epoch == 'fixation':
    tstart = 1.200
    tstop = 1.600
    data, res = e.load_test('speak_by_TrialType', pmin=pmin, data='source', tstart=tstart, tstop=tstop, baseline=bl,
                            mask='wholebrain', samples=1000, make=True, epoch=epoch, return_data=True)
    clusters = res.find_clusters(pmin=0.05)
    data

elif epoch == 'speak':
    data, res = e.load_test('speak_by_TrialType', pmin=pmin, data='source', baseline=bl,
                            mask='wholebrain', samples=1000, make=True, epoch=epoch, return_data=True)
    clusters = res.find_clusters(pmin=0.05)
    data

if epoch == 'fixation':
    load_test = 'inflect_minus_naming'  # inflect_minus_naming naming_minus_button inflect_minus_button

    tt_res = e.load_test(load_test, pmin=pmin, data='source', tstart=tstart, tstop=tstop, baseline=bl,
                         mask='wholebrain', samples=1000, make=True, epoch=epoch)
    tt_clusters = tt_res.find_clusters(pmin=0.05, maps=True)

elif epoch == 'speak':
    load_test = 'inflect_minus_button'  # inflect_minus_naming naming_minus_button inflect_minus_button

    tt_res = e.load_test(load_test, pmin=pmin, data='source', baseline=bl,
                         mask='wholebrain', samples=1000, make=True, epoch=epoch)
    tt_clusters = tt_res.find_clusters(pmin=0.05, maps=True)

id = 0
tt_cluster = tt_clusters[id, 'cluster']
tt_cluster

mask = tt_cluster != 0

roi = mask.any('time')

p = plot.brain.brain(roi)

#####################################

ADDING THE CODE BELOW FIXES THE ISSUE

#####################################

ndvar = roi

source = ndvar.source
x = ndvar.get_data(source.name)
if x.dtype.kind != 'b':
    raise ValueError("Require NDVar of type bool, got %r" % (x.dtype,))
name = str(ndvar.name)
lh_vertices = source.lh_vertices[x[:source.lh_n]]
rh_vertices = source.rh_vertices[x[source.lh_n:]]
lh_label, rh_label = source._label((lh_vertices, rh_vertices), name, [1, 0, 0])

p = plot.brain.brain(roi)

Fixing GlassBrain

        else:
            cbar_vmin, cbar_vmax = None, None
        self._vol_kwargs = dict(dest=dest, mri_resolution=mri_resolution, mni305=mni305)

        show_nan_msg = False
        if vmax is not None and np.isnan(vmax):
            vmax = None
            show_nan_msg = True

The third line here (in eelbrain/plot/_glassbrain) should read as follows, since _save_stc_as_volume takes mni_correction rather than mni305 as the argument that controls whether to apply the correction:

        self._vol_kwargs = dict(dest=dest, mri_resolution=mri_resolution, mni_correction=mni305)

Trouble adding NDVar to brain plot

Hi,

I'm trying to add NDVar data to a brain plot, but I keep getting the error shown below. My understanding was that if an NDVar has a case dimension, the data would be averaged over cases, but the error says the NDVar must be one- or two-dimensional...

b = e.plot.brain.brain('fsaverage', hemi = 'lh', surf = 'white', subjects_dir=subjects_dir,
                  background='white', cortex='low_contrast',views=['lat'])

b.add_label(label = 'BA11', color='#F7B4AE', alpha=1, scalar_thresh=None, borders=False, hemi='lh')

b.add_ndvar(data, cmap=None, vmin=None, vmax=None, smoothing_steps=None, colorbar=False, time_label='ms',
            lighting=False, contours=True, alpha=1, remove_existing=False)

ValueError: <NDVar 'src': 54 case, 58 source, 501 time>: must be one- or two dimensional
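A workaround that seems consistent with the error message (a guess, not verified against this eelbrain version): average over the case dimension explicitly before passing the NDVar to add_ndvar, so that the Brain receives a (source, time) NDVar.

# average over the case dimension first, yielding <NDVar: 58 source, 501 time>
data_avg = data.mean('case')

b.add_ndvar(data_avg, cmap=None, vmin=None, vmax=None, smoothing_steps=None, colorbar=False,
            time_label='ms', lighting=False, contours=True, alpha=1, remove_existing=False)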

trouble getting same residuals with BoostingResult and Convolve

Hello eelbrain,

I've been trying to understand the boosting objects and operations. As a sanity check, I want to compute the residual by hand by comparing the output of convolve with the "target" signal that boosting was fit to predict. So far I have been unsuccessful. Below is a minimal example that replicates my logic (based on one of the repo's examples).

# Original author: Christian Brodbeck <[email protected]>
# modifications by Iran R. Roman
# sphinx_gallery_thumbnail_number = 4
import os

from scipy.io import loadmat
import mne
from eelbrain import *
import numpy as np

# Load the mTRF speech dataset and convert data to NDVars
root = mne.datasets.mtrf.data_path()
speech_path = os.path.join(root, 'speech_data.mat')
mdata = loadmat(speech_path)

# Time axis
tstep = 1. / mdata['Fs'][0, 0]
n_times = mdata['envelope'].shape[0]
time = UTS(0, tstep, n_times)
# Load the EEG sensor coordinates (drop fiducials coordinates, which are stored
# after sensor 128)
sensor = Sensor.from_montage('biosemi128')[:128]
# Frequency dimension for the spectrogram
band = Scalar('frequency', range(16))
# Create variables
envelope = NDVar(mdata['envelope'][:, 0], (time,), name='envelope')
eeg = NDVar(mdata['EEG'], (time, sensor), name='EEG', info={'unit': 'µV'})
spectrogram = NDVar(mdata['spectrogram'], (time, band), name='spectrogram')
# Exclude a bad channel
eeg = eeg[sensor.index(exclude='A13')]

# model eeg by boosting the speech envelope
res = boosting(eeg, envelope, -0.100, 0.400, basis=0.100, partitions=4)

# convolve the filters with the envelope to obtain the model prediction
prediction = convolve(res.h_source, envelope)

# calculate the residual error "by hand"
res_error_hand = np.sum((prediction.get_data(['time','sensor'])-eeg.get_data(['time','sensor']))**2,axis=0)

# assert that res.residual and the residual obtained by hand are the same
assert np.allclose(res_error_hand, res.residual.get_data(['sensor']))

Maybe there's something fundamental I'm not understanding. Thanks!
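Two details that could explain the mismatch (guesses, not verified against the eelbrain source): boosting defaults to scale_data=True, so the fit, and possibly the reported res.residual, live on a rescaled version of the data rather than in µV; and with partitions=4 the reported fit metrics are usually computed on held-out test segments, not on the full recording, whereas the by-hand residual above uses the whole signal. A sketch that at least removes the scaling issue, using res.h_scaled (documented as the kernel rescaled to the original input data; if your version exposes a different attribute such as h_source, the idea is the same) and a scale-free comparison:

# predict with the kernel rescaled to the original data units
prediction = convolve(res.h_scaled, envelope)

# compare a scale-free quantity: proportion of variance explained per sensor
ss_res = ((eeg - prediction) ** 2).sum('time')
ss_tot = ((eeg - eeg.mean('time')) ** 2).sum('time')
print(1 - ss_res / ss_tot)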

Eelbrain not rendering Brain objects

Hi,

I'm trying to plot a cluster from a spatiotemporal cluster-based permutation test. Everything works fine until I try to use eelbrain.plot.brain.cluster, at which point I get an error saying that __init__() takes exactly one argument. I get the same error if I just try to plot a brain using eelbrain.plot.brain.brain('fsaverage'); traceback below. I've had this issue with eelbrain 0.38.4 and 0.38.6, and I'm running Python 3.10 on macOS Monterey.

I've also tried (re)installing pysurfer. MNE renders Brain objects fine in an ipython environment, and also in my eelbrain environment, so something is happening specifically with the eelbrain functions. Any ideas what's going on?

2023-05-04 19:27:13.917 python[27564:826284] ApplePersistenceIgnoreState: Existing state will not be touched. New state will be written to /var/folders/mr/79qm2pcs3f76pkqdnd_pwfzml0x6yh/T/com.continuum.python.savedState
Installed eelbrain event loop hook.
<frozen importlib._bootstrap>:241: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 96 from PyObject
/Users/dac49596/anaconda3/envs/eelbrain0386/lib/python3.10/site-packages/eelbrain/plot/_brain_object.py:49: UserWarning: Error importing PySurfer: cannot import name 'Sequence' from 'collections' (/Users/dac49596/anaconda3/envs/eelbrain0386/lib/python3.10/collections/__init__.py)
  warn(f"Error importing PySurfer: {exception}")
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[3], line 1
----> 1 eelbrain.plot.brain.brain('fsaverage')

File ~/anaconda3/envs/eelbrain0386/lib/python3.10/site-packages/eelbrain/plot/_brain.py:562, in brain(src, cmap, vmin, vmax, surf, views, hemi, colorbar, time_label, w, h, axw, axh, foreground, background, parallel, cortex, title, smoothing_steps, mask, subjects_dir, name, pos)
    559 if subjects_dir is None:
    560     subjects_dir = source.subjects_dir
--> 562 brain = Brain(subject, hemi, surf, title, cortex, views=views, w=w, h=h, axw=axw, axh=axh, foreground=foreground, background=background, subjects_dir=subjects_dir, name=name, pos=pos, source_space=source)
    564 if ndvar is not None:
    565     if ndvar.x.dtype.kind in 'ui':

File ~/anaconda3/envs/eelbrain0386/lib/python3.10/site-packages/eelbrain/plot/_brain_object.py:240, in Brain.__init__(self, subject, hemi, surf, title, cortex, alpha, background, foreground, subjects_dir, views, offset, show_toolbar, offscreen, interaction, w, h, axw, axh, name, pos, source_space, show, run)
    238 if subjects_dir is not None:
    239     subjects_dir = os.path.expanduser(subjects_dir)
--> 240 surfer.Brain.__init__(self, subject, hemi, surf, '', cortex, alpha, (w, h), background, foreground, figure, subjects_dir, views, offset, show_toolbar, offscreen, interaction)
    241 TimeSlicer.__init__(self)
    243 if self._frame and CONFIG['show'] and show:

TypeError: object.__init__() takes exactly one argument (the instance to initialize)

`boosting` with `multiprocessing` on Windows

The boosting function has been reported to stall or produce all-zero TRFs when run with multiprocessing on Windows (see Eelbrain/Alice#5).

EDIT: there is a pre-release that works on Windows, see below


Old workarounds

  • Use a Linux virtual machine.
  • If feasible, the model estimation can be run under a different OS (pickled results can then be loaded and analyzed on Windows, too).
  • On Windows, turn off multiprocessing with eelbrain.configure(n_workers=False) (see the snippet below).
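For reference, the last workaround as a snippet; eeg and envelope stand for whatever predictand and predictor NDVars are being fit, and the boosting parameters are only illustrative:

import eelbrain

# disable multiprocessing before fitting (workaround for the Windows issue above)
eelbrain.configure(n_workers=False)
res = eelbrain.boosting(eeg, envelope, -0.100, 0.400, partitions=4)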


Last figure won't close

With the wx backend (but not the eelbrain backend):

  • The last open figure does not close when clicking the close button.
  • .on_motion then tries to access ._frame, which has already been deleted by mpl_canvas.CanvasFrame.

Calling plot functions in eelbrain causes Python to abort (libc++abi: terminating with uncaught exception of type NSException)

Hi everyone,

I recently installed eelbrain into a new environment and was trying to work through one of the tutorials, but whenever I call any of the plotting functions, Python crashes and the kernel is restarted. So far, everything else has worked.

code for reproduction

tutorial EEG speech envelope TRF

import os
from scipy.io import loadmat
import mne
from eelbrain import *

# Load the mTRF speech dataset and convert data to NDVars
root = mne.datasets.mtrf.data_path()
speech_path = os.path.join(root, 'speech_data.mat')
mdata = loadmat(speech_path)

# Time axis
tstep = 1. / mdata['Fs'][0, 0]
n_times = mdata['envelope'].shape[0]
time = UTS(0, tstep, n_times)
# Load the EEG sensor coordinates (drop fiducials coordinates, which are stored
# after sensor 128)
sensor = Sensor.from_montage('biosemi128')[:128]
# Frequency dimension for the spectrogram
band = Scalar('frequency', range(16))
# Create variables
envelope = NDVar(mdata['envelope'][:, 0], (time,), name='envelope')
eeg = NDVar(mdata['EEG'], (time, sensor), name='EEG', info={'unit': 'µV'})
spectrogram = NDVar(mdata['spectrogram'], (time, band), name='spectrogram')
# Exclude a bad channel
eeg = eeg[sensor.index(exclude='A13')]

# plot spectrogram of the speech stimulus
plot.Array(spectrogram, xlim=5, w=6, h=2) 

# plot the envelope used as stimulus representation for deconvolution
plot.UTS(envelope, xlim=5, w=6, h=2) 

# plot the EEG data
p = plot.TopoButterfly(eeg, xlim = 5, w = 7, h = 2) 
p.set_time(1.200)

outcome

2022-06-13 15:26:07.365 python[7062:4738888] -[QNSApplication transformToForegroundApplication]: unrecognized selector sent to instance 0x7f9b087351a0
2022-06-13 15:26:07.378 python[7062:4738888] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[QNSApplication transformToForegroundApplication]: unrecognized selector sent to instance 0x7f9b087351a0'
*** First throw call stack:
(
	0   CoreFoundation                      0x00007ff8051531e3 __exceptionPreprocess + 242
	1   libobjc.A.dylib                     0x00007ff804eb3c13 objc_exception_throw + 48
	2   CoreFoundation                      0x00007ff8051d7f3b -[NSObject(NSObject) __retain_OA] + 0
	3   CoreFoundation                      0x00007ff8050b9df5 ___forwarding___ + 1356
	4   CoreFoundation                      0x00007ff8050b9818 _CF_forwarding_prep_0 + 120
	5   libwx_osx_cocoau_core-3.1.5.0.0.dyl 0x0000000171d15277 _ZN5wxApp9DoInitGuiEv + 455
	6   libwx_osx_cocoau_core-3.1.5.0.0.dyl 0x0000000171d651be _ZN5wxApp9OnInitGuiEv + 30
	7   _core.cpython-39-darwin.so          0x00000001714e95fe _ZN10sipwxPyApp9OnInitGuiEv + 110
	8   libwx_osx_cocoau_core-3.1.5.0.0.dyl 0x0000000171d64ff8 _ZN5wxApp10InitializeERiPPw + 168
	9   libwx_baseu-3.1.5.0.0.dylib         0x000000016b586d92 _Z12wxEntryStartRiPPw + 178
	10  _core.cpython-39-darwin.so          0x00000001714e8dd3 _ZN7wxPyApp13_BootstrapAppEv + 467
	11  _core.cpython-39-darwin.so          0x00000001714ed722 _ZL26meth_wxPyApp__BootstrapAppP7_objectS0_ + 114
	12  python3.9                           0x000000010946b88a cfunction_call + 90
	13  python3.9                           0x000000010952b147 _PyEval_EvalFrameDefault + 52951
	14  python3.9                           0x000000010940ebe2 _PyFunction_Vectorcall + 1234
	15  python3.9                           0x000000010949750a slot_tp_init + 346
	16  python3.9                           0x000000010949cce0 type_call + 272
	17  python3.9                           0x00000001095287c0 _PyEval_EvalFrameDefault + 42320
	18  python3.9                           0x000000010940fcfd _PyFunction_Vectorcall + 5613
	19  python3.9                           0x000000010952513e _PyEval_EvalFrameDefault + 28366
	20  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	21  python3.9                           0x0000000109527d9e _PyEval_EvalFrameDefault + 39726
	22  python3.9                           0x000000010940ebe2 _PyFunction_Vectorcall + 1234
	23  python3.9                           0x00000001094975d4 slot_tp_init + 548
	24  python3.9                           0x000000010949cce0 type_call + 272
	25  python3.9                           0x000000010940d031 _PyObject_MakeTpCall + 321
	26  python3.9                           0x00000001095285a6 _PyEval_EvalFrameDefault + 41782
	27  python3.9                           0x000000010951c637 _PyEval_EvalCode + 663
	28  python3.9                           0x00000001095178c9 builtin_exec + 329
	29  python3.9                           0x000000010946c567 cfunction_vectorcall_FASTCALL + 103
	30  python3.9                           0x000000010952513e _PyEval_EvalFrameDefault + 28366
	31  python3.9                           0x000000010952cf45 _PyEval_EvalFrameDefault + 60629
	32  python3.9                           0x000000010952cf45 _PyEval_EvalFrameDefault + 60629
	33  python3.9                           0x000000010942780c gen_send_ex + 172
	34  python3.9                           0x000000010941d6f9 method_vectorcall_O + 121
	35  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	36  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	37  python3.9                           0x000000010952513e _PyEval_EvalFrameDefault + 28366
	38  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	39  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	40  python3.9                           0x000000010940ebe2 _PyFunction_Vectorcall + 1234
	41  python3.9                           0x00000001094139ae method_vectorcall + 158
	42  python3.9                           0x000000010952c62c _PyEval_EvalFrameDefault + 58300
	43  python3.9                           0x000000010940ebe2 _PyFunction_Vectorcall + 1234
	44  python3.9                           0x00000001094139ae method_vectorcall + 158
	45  python3.9                           0x0000000109525098 _PyEval_EvalFrameDefault + 28200
	46  python3.9                           0x000000010952cf45 _PyEval_EvalFrameDefault + 60629
	47  python3.9                           0x000000010952cf45 _PyEval_EvalFrameDefault + 60629
	48  python3.9                           0x000000010952cf45 _PyEval_EvalFrameDefault + 60629
	49  python3.9                           0x000000010952cf45 _PyEval_EvalFrameDefault + 60629
	50  python3.9                           0x000000010942780c gen_send_ex + 172
	51  _asyncio.cpython-39-darwin.so       0x000000010a9219aa task_step_impl + 442
	52  _asyncio.cpython-39-darwin.so       0x000000010a92175e task_step + 62
	53  _asyncio.cpython-39-darwin.so       0x000000010a921642 task_wakeup + 194
	54  _asyncio.cpython-39-darwin.so       0x000000010a9214dd TaskWakeupMethWrapper_call + 109
	55  python3.9                           0x000000010940d031 _PyObject_MakeTpCall + 321
	56  python3.9                           0x0000000109563471 context_run + 81
	57  python3.9                           0x000000010946c4d1 cfunction_vectorcall_FASTCALL_KEYWORDS + 97
	58  python3.9                           0x0000000109529b21 _PyEval_EvalFrameDefault + 47281
	59  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	60  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	61  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	62  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	63  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	64  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	65  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	66  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	67  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	68  python3.9                           0x00000001095268dd _PyEval_EvalFrameDefault + 34413
	69  python3.9                           0x000000010940e8b4 _PyFunction_Vectorcall + 420
	70  python3.9                           0x0000000109527d9e _PyEval_EvalFrameDefault + 39726
	71  python3.9                           0x000000010951c637 _PyEval_EvalCode + 663
	72  python3.9                           0x00000001095178c9 builtin_exec + 329
	73  python3.9                           0x000000010946c567 cfunction_vectorcall_FASTCALL + 103
	74  python3.9                           0x000000010952513e _PyEval_EvalFrameDefault + 28366
	75  python3.9                           0x000000010940ebe2 _PyFunction_Vectorcall + 1234
	76  python3.9                           0x000000010952513e _PyEval_EvalFrameDefault + 28366
	77  python3.9                           0x000000010940ebe2 _PyFunction_Vectorcall + 1234
	78  python3.9                           0x00000001095bdefe pymain_run_module + 222
	79  python3.9                           0x00000001095bd9d1 pymain_run_python + 241
	80  python3.9                           0x00000001095bd895 Py_RunMain + 37
	81  python3.9                           0x00000001093af068 main + 56
	82  dyld                                0x000000011923f51e start + 462
)
libc++abi: terminating with uncaught exception of type NSException


Fatal Python error: Aborted


Main thread:
Current thread 0x00000001192ba600 (most recent call first):
  File "/Users/chantal/opt/anaconda3/envs/eelbrain/lib/python3.9/site-packages/wx/core.py", line 2207 in __init__
  File "/Users/chantal/opt/anaconda3/envs/eelbrain/lib/python3.9/site-packages/eelbrain/_wxgui/app.py", line 696 in get_app
  File "/Users/chantal/opt/anaconda3/envs/eelbrain/lib/python3.9/site-packages/eelbrain/plot/_base.py", line 1586 in __init__
  File "/Users/chantal/opt/anaconda3/envs/eelbrain/lib/python3.9/site-packages/eelbrain/plot/_utsnd.py", line 267 in __init__
  File "/var/folders/_y/rthmdmzs4sjfqrrlyfhq97h00000gp/T/ipykernel_7062/1244890001.py", line 1 in <cell line: 1>

additional information

  • operating system: macOS Monterey 12.3.1
  • Python version: 3.9.13
  • Python environment for eelbrain created with conda
  • eelbrain version: 0.37.3
  • matplotlib backend: Qt5Agg

I've seen similar issues that were related to matplotlib, but changing the backend from Qt5Agg to TkAgg doesn't resolve the error, and matplotlib plotting otherwise works fine. I apologize if I'm missing something basic.

Loading CND format dataset with load.cnd

Hi,

I was trying to load the EEG data from the diliBach CND dataset (the dataset from the CNSP 2021 Workshop):

import eelbrain

eeg_data = eelbrain.load.cnd('/Users/Datasets/diliBach/dataCND/dataSub1.mat')

and I encountered this error message:

raise ValueError(f"Can not assign item to Dataset. The item`s length {n} is different from the number of cases in the Dataset {self.n_cases}.")
ValueError: Can not assign item to Dataset. The item`s length 0 is different from the number of cases in the Dataset 30.

I also tried the CND version of the LalorNatSpeech dataset, and that seemed to work fine.

As far as I understand, diliBach is in the CND data format, so eelbrain.load.cnd should work.

Is this a problem with the diliBach dataset in CND format? I see that in the diliBach .mat files, origTrialPosition is an empty list. Could that be the cause of the error?
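A quick way to check that field directly (this assumes the CND convention of a top-level eeg struct in the .mat file; adjust the field names if the layout differs):

from scipy.io import loadmat

m = loadmat('/Users/Datasets/diliBach/dataCND/dataSub1.mat',
            squeeze_me=True, struct_as_record=False)
eeg = m['eeg']
print(eeg._fieldnames)                                         # fields available in the struct
print(getattr(eeg, 'origTrialPosition', 'field not present'))  # an empty array here would match the length-0 item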

Cannot install eelbrain 0.39.5

Hey!

Thanks for the amazing work you've done with eelbrain package!

Recently I attempted to install eelbrain with conda, pip, and conda-forge.
I was only able to install it from conda-forge, at package version 0.38.2.

Now I am trying to update it to 0.39.5, but with both conda and pip I run into the same problem building the wheel.
Has anyone else had this problem?

Thanks!


Building wheel for eelbrain (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [248 lines of output]
...

 note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for eelbrain
  Running setup.py clean for eelbrain
Failed to build eelbrain
Installing collected packages: eelbrain
  Running setup.py install for eelbrain ... error
  error: subprocess-exited-with-error
  
  × Running setup.py install for eelbrain did not run successfully.
  │ exit code: 1
  ╰─> [250 lines of output]
....
 note: This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure

× Encountered error while trying to install package.
╰─> eelbrain

note: This is an issue with the package mentioned above, not pip.
hint: See above for output from the failure.

How should I import my MEG raw file into eelbrain's NDVar format?

My data is MEG data from MEGIN/Elekta Neuromag VectorView and TRIUX systems, in .fif format.
I have 8 stories, one raw file per story, which I have preprocessed with mne and saved as .fif files.

The issues I have encountered so far are:

1: MEG .fif files do not have a montage, and ds['epochs'] = load.fiff.epochs(ds) raises an error,
ValueError: 'MEG0111' is not in list, which seems to be caused by the missing montage.

Traceback (most recent call last):
  File "/Users//opt/anaconda3/envs/mne/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3505, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-27-d2f9910b3bf8>", line 1, in <module>
    ep = eelbrain.load.fiff.add_epochs(ds)
  File "/Users//opt/anaconda3/envs/mne/lib/python3.10/site-packages/eelbrain/_io/fiff.py", line 570, in add_epochs
    ds[name] = epochs_ndvar(epochs_, name, data, mult=mult, info=info.
  File "/Users//opt/anaconda3/envs/mne/lib/python3.10/site-packages/eelbrain/_io/fiff.py", line 1151, in epochs_ndvar
    sensor = sensors or sensor_dim(epochs, picks, sysname, connectivity)
  File "/Users//opt/anaconda3/envs/mne/lib/python3.10/site-packages/eelbrain/_io/fiff.py", line 822, in sensor_dim
    index = np.array([names.index(name) for name in ch_names])
  File "/Users//opt/anaconda3/envs/mne/lib/python3.10/site-packages/eelbrain/_io/fiff.py", line 822, in <listcomp>
    index = np.array([names.index(name) for name in ch_names])

I asked this question on the MNE forum, and the developer there suggested that I use a layout, but layouts do not seem to import well into eelbrain's data structures.

https://mne.discourse.group/t/how-to-get-fif-montage-with-channel/6711

I also tried building the Sensor myself, but eelbrain's sensor locations need to be 3D, while the layout positions for my MEG channels have 4 values per channel.

raw = mne.io.read_raw_fif(file_path)
raw = raw.pick_types(meg=True, eeg=False, eog=False)
from mne.channels import read_layout
layout = read_layout("Vectorview-all")
meg_sensor = eelbrain.Sensor(locs=layout.pos,
                             names=raw.info.ch_names,
                             sysname="fif")
report:
Traceback (most recent call last):
  File "/Users/hasibagen/opt/anaconda3/envs/mne/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3505, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-24-502433b3dc2d>", line 1, in <module>
    sen = eelbrain.Sensor(locs=layout.pos,
  File "/Users/hasibagen/opt/anaconda3/envs/mne/lib/python3.10/site-packages/eelbrain/_data_obj.py", line 8942, in __init__
    raise ValueError(f"locs needs to have shape (n_sensors, 3), got {locs.shape=}")
ValueError: locs needs to have shape (n_sensors, 3), got locs.shape=(306, 4)
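One possible way around the 4-column layout positions (a sketch, not a tested recipe): MNE stores each channel's 3D location in the first three entries of info['chs'][i]['loc'], so a Sensor can be built from those instead of from a 2D Layout:

import numpy as np
import mne
import eelbrain

raw = mne.io.read_raw_fif(file_path)                  # file_path as above
raw = raw.pick_types(meg=True, eeg=False, eog=False)
# the first three entries of each channel's `loc` array are its 3D position
locs = np.array([ch['loc'][:3] for ch in raw.info['chs']])
names = [ch['ch_name'] for ch in raw.info['chs']]
meg_sensor = eelbrain.Sensor(locs, names=names)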

2: eelbrain does not seem to have a way to import raw data as an NDVar directly. I would like to do all the raw preprocessing with mne and then import the result into eelbrain, because my data is unusual: each story is a separate .fif file, and the recordings differ in length.

This is the code I use to pre-process the raw files:

raw = mne.io.read_raw_fif(file_path)  
raw.load_data()  
meg_picks = mne.pick_types(raw.info, meg=True)  
freqs = (60, 180)  
raw_notch = raw.copy().notch_filter(freqs=freqs, picks=meg_picks, n_jobs=6)  
raw_filter = raw_notch.copy().filter(l_freq=1, h_freq=15, picks=meg_picks, n_jobs=6)  
events = mne.find_events(raw, stim_channel='STI101',uint_cast = True)  
events = mne.pick_events(events, include=32)  
marker_time = (events[0, 0] - raw_filter.first_samp) / raw.info['sfreq']  
dely_time = all_start.iloc[sub - 1, story - 1] / raw.info['sfreq']  
time_min = marker_time + dely_time  
time_max = time_min + all_duration.iloc[story-1][0] + 10  
raw_cut = raw_filter.copy().crop(tmin=time_min, tmax=time_max)  
raw_downsampled = raw_cut.copy().resample(sfreq=100)

mne: 1.3.1
eelbrain: 0.38
ubuntu: 20.04

Use of deprecated numpy.testing.decorators

Hello,

I'm getting the following error when I try to import eelbrain on a newly installed system. It seems to be using a deprecated numpy function:

Traceback (most recent call last):
  File "pythonscript.py", line 1, in <module>
    from eelbrain import *
  File "/home/user/miniconda3/envs/eelbrain/lib/python3.7/site-packages/eelbrain/__init__.py", line 16, in <module>
    from . import mne_fixes
  File "/home/user/miniconda3/envs/eelbrain/lib/python3.7/site-packages/eelbrain/mne_fixes/__init__.py", line 38, in <module>
    from ._label import write_labels_to_annot
  File "/home/user/miniconda3/envs/eelbrain/lib/python3.7/site-packages/eelbrain/mne_fixes/_label.py", line 11, in <module>
    import nibabel
  File "/home/user/.local/lib/python3.7/site-packages/nibabel/__init__.py", line 38, in <module>
    from . import analyze as ana
  File "/home/user/.local/lib/python3.7/site-packages/nibabel/analyze.py", line 87, in <module>
    from .volumeutils import (native_code, swapped_code, make_dt_codes,
  File "/home/user/.local/lib/python3.7/site-packages/nibabel/volumeutils.py", line 22, in <module>
    from .casting import (shared_range, type_info, OK_FLOATS)
  File "/home/user/.local/lib/python3.7/site-packages/nibabel/casting.py", line 11, in <module>
    from .testing import setup_test  # flake8: noqa F401
  File "/home/user/.local/lib/python3.7/site-packages/nibabel/testing/__init__.py", line 21, in <module>
    from numpy.testing.decorators import skipif
ModuleNotFoundError: No module named 'numpy.testing.decorators'

I'm using versions:
python 3.7.6
numpy 1.18.1
eelbrain 0.29.8

Is there a fix for this?

thanks
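The bottom of the traceback shows nibabel (not eelbrain itself) importing numpy.testing.decorators, which newer numpy versions no longer provide; upgrading nibabel in the same environment is probably worth trying, since recent nibabel releases dropped that import. A quick way to check the installed versions without triggering the failing import (pkg_resources is used here because importlib.metadata requires Python 3.8):

import pkg_resources

# query installed versions without importing nibabel itself
print(pkg_resources.get_distribution('nibabel').version)
print(pkg_resources.get_distribution('numpy').version)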

Having a dead kernel each time I run eelbrain on my data

Hi! Thanks for this amazing research tool! I am trying to use it for my research in electrophysiology.

I am using eelbrain to work with MEG data. The data is big: 204 channels sampled at 1000 Hz. When I started to run eelbrain.boosting() I got a dead kernel, and I got one every time I ran the function.

I decided to reduce the number of channels and the sampling frequency, so that my x NDVar is now <NDVar: 1 case, 7500 time, 5 sensor> and my y NDVar is just <NDVar: 1 case, 7500 time>, but the kernel still died when I ran eelbrain.boosting().

Finally, I launched the script on a server to make sure that computational capacity was not the limit, but the kernel died again.

Has anybody experienced such an issue? I would be grateful for any help!
I use eelbrain 0.39.

Eelbrain doesn't launch after update

Hey,
I updated eelbrain to the latest version, as the version I had was from last year. After updating, I cannot launch eelbrain.
(screenshot of the error attached)

I tried downgrading mne to version 1.5, but it doesn't help.

Differing results for Spatio-temporal cluster permutation in Mac vs. Windows

Hello,

I recently ran a cluster permutation test on Mac and then again on Windows a few days later, and I'm getting a greater number of significant clusters when running exactly the same code on Windows.
It's not really a problem as such, but I'm curious why this difference exists.

Clusters found on Windows: (screenshot, 02_12_2022)
Clusters found on Mac: (screenshot, 25_11_2022)

Also, thanks for the Eelbrain library!

Adding parcellation to vol source space

Hi,

I wanted to add a parcellation to a volume source space with 3D vectors (stc_nd is a VolVectorSourceEstimate):

meg_source_combined = eb.load.fiff.stc_ndvar(stc_nd, src='vol-7', subject='fsaverage', subjects_dir=mri_path,
                                                     method='MNE', fixed=False, parc='aparc+aseg')

In eelbrain.load.fiff.stc_ndvar, the NDVar is constructed here (line 1438 in fiff.py):

# Construct NDVar Dimensions
time = UTS(stc.tmin, stc.tstep, stc.times.size)
if isinstance(stc, MNE_VOLUME_STC):
    ss = VolumeSourceSpace(vertices, subject, src, subjects_dir, None, filename=sss_filename)
    is_vector = stc.data.ndim == 3
elif isinstance(stc, (mne.SourceEstimate, mne.VectorSourceEstimate)):
    ss = SourceSpace(vertices, subject, src, subjects_dir, parc, filename=sss_filename)
    is_vector = isinstance(stc, mne.VectorSourceEstimate)

Because None is passed when constructing the VolumeSourceSpace, parc='aparc+aseg' is ignored. The documentation of eelbrain.VolumeSourceSpace says that parc "Only applies to ico source spaces, default is 'aparc'." Is there a specific reason why it is not implemented for, e.g., vol-7 source spaces?

Thank you! Best, Patrick
