
pymapvbvd's People

Contributors

aaronhess, alexcraven, dimitripapadopoulos, frankzijlstra, jakobasslaender, musicinmybrain, p-sodmann, shahdloo, wtclarke


pymapvbvd's Issues

Unnecessary dependencies in install_requires

A couple of the dependencies that setup.py reads from requirements.yml to form the install_requires do not really need to be runtime dependencies.

  • setuptools is really only needed to build a wheel
  • pytest is only used for running the tests

I don’t know much about the requirements.yml file, and every Python project seems to handle this differently, but would it be possible to move these out into separate requirements.txt style files, or add a test extra, so that installing the package doesn’t bring in pytest in particular?
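For illustration, here is a sketch of what a test extra could look like in setup.py (the runtime dependency list is a guess, not taken from the project's actual requirements.yml):

from setuptools import setup

setup(
    name='pymapvbvd',
    # Runtime dependencies only; numpy and tqdm are illustrative guesses.
    install_requires=['numpy', 'tqdm'],
    # pytest moves into an extra, installed via `pip install pymapvbvd[tests]`.
    extras_require={'tests': ['pytest']},
)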

Big-endian host support?

I’m working on packaging this for Fedora Linux as a dependency for spec2nii. One of the Fedora Linux primary architectures, s390x, is big-endian. It looks like pymapvbvd assumes that both the binary file format and the host running the code are little-endian. Is there any interest in making the code endian-independent in the future?

If the file format is always little-endian, this should mostly be a matter of making endianness explicit when creating a numpy array that is to be copied bytewise in or out of the file: for example, np.dtype('<u4') instead of np.uint32. However, there are a lot of places this would need to be done, and it would require some time and study for me to offer a careful patch—probably more time than I want to spend on it.
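To illustrate the difference (a standalone sketch, not code from pymapvbvd):

import numpy as np

# Four little-endian bytes encoding the integer 1.
raw = bytes([0x01, 0x00, 0x00, 0x00])
# An explicit little-endian dtype decodes the same value on every host...
print(np.frombuffer(raw, dtype=np.dtype('<u4')))  # [1] everywhere
# ...whereas a native dtype follows the host byte order.
print(np.frombuffer(raw, dtype=np.uint32))  # [1] on x86, [16777216] on s390x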

Would you like to support big-endian architectures, or is this something you are—understandably—not prepared to take the time to implement?

Loading a large (>5 GB) .dat file reduces the number of lines in the Twix object

I'm running into an issue where, when I load a large .dat file, the number of lines twix_obj.image.NLin differs from the number of acquisitions twix_obj.image.NAcq. This is not the case when I load a smaller dataset, nor is it what I get from loading the same .dat file in Matlab and inspecting the twix object there.

Because of this issue I am unable to port my Matlab code to Python, since I lose part of my data on the Python side. I tried looking into the different flags, but couldn't find one that could be responsible. If this comes down to a simple flag, please let me know which one to set.
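For reference, a minimal reproduction sketch (the file name is a placeholder):

import mapvbvd

twix_obj = mapvbvd.mapVBVD('large_scan.dat')  # the > 5 GB file
# These two counts should agree, but differ for the large file:
print(twix_obj.image.NLin, twix_obj.image.NAcq)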


Tests appear to hang forever

python3 -m venv _e
. _e/bin/activate
pip install -e .
python -m pytest -v
============================= test session starts ==============================
platform linux -- Python 3.11.4, pytest-7.4.0, pluggy-1.2.0 -- /home/ben/src/pymapvbvd/_e/bin/python
cachedir: .pytest_cache
rootdir: /home/ben/src/pymapvbvd
collected 9 items                                                              

tests/test_flags.py::test_flagRemoveOS 

Hangs forever; when interrupted, prints:

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! KeyboardInterrupt !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
/home/ben/src/pymapvbvd/mapvbvd/read_twix_hdr.py:165: KeyboardInterrupt
(to show a full traceback on KeyboardInterrupt use --full-trace)
============================ no tests ran in 28.49s ============================

The Python version does not seem to matter; I have tried 3.11 and 3.7.

If I pass -k 'not test_FlagRemoveOS', then the next test (test_flagIgnoreSeg_flagDoAverage) hangs, and so on.

Am I missing anything here?

With --full-trace, I get the following when I interrupt the test:

test-hang.log

Same issue as #6, but for a different reason

Hello,

I have the same error as #6, but for a different reason.

I saw that you changed the code in mapVBVD.py to cPos = cPos + int(ulDMALength).
Since I installed the mapVBVD tool after you pushed this change, my mapVBVD module does contain the modified code in mapVBVD.py.

However, I get the same error when I run obj[''], and it looks like this:

read data: 48%|████▊ | 5954/12342 [00:05<00:07, 900.44it/s]
C:\Users\z0048drc\AppData\Local\miniconda3\envs\Work-ismrmrd\lib\site-packages\mapvbvd\twix_map_obj.py:766: RuntimeWarning: invalid value encountered in add
  raw = (raw[:, 0] + 1j * raw[:, 1]).reshape(readShape, order='F')
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2022.3.2\plugins\python\helpers\pydev\pydevd.py", line 1496, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm 2022.3.2\plugins\python\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents + "\n", file, 'exec'), glob, loc)
  File "C:\Users\z0048drc\Documents\Github\cinevn-helpers-master\cinevn-helpers-master\test.py", line 20, in <module>
    kdata = obj['']
  File "C:\Users\z0048drc\AppData\Local\miniconda3\envs\Work-ismrmrd\lib\site-packages\mapvbvd\twix_map_obj.py", line 657, in __getitem__
    out = self.readData(mem, ixToTarg, ixToRaw, selRange, selRangeSz, outSize)
  File "C:\Users\z0048drc\AppData\Local\miniconda3\envs\Work-ismrmrd\lib\site-packages\mapvbvd\twix_map_obj.py", line 759, in readData
    fid.seek(mem[k] + szScanHeader, 0)
OSError: [Errno 22] Invalid argument

I found that the reason for this error was very large negative values in the mem variable.
Actually, mem is passed to readData() in twix_map_obj.py from __getitem__() with positive values, but with float64 data type.
In the currently installed version (2023.02.20), the code at line 638, mem = mem.astype(int), causes an integer overflow, resulting in these very large values.

Two screenshots are attached showing these results: 2023-02-21 22_13_49-Evaluate and 2023-02-21 22_13_38-Evaluate.

The first one shows the overflow, and the second shows mem as passed from __getitem__().
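For context, a standalone illustration of the cast behaviour described above (not the actual mapvbvd code):

import numpy as np

# A byte offset beyond 2**31 - 1, stored as float64 as in __getitem__().
mem = np.array([5.5e9], dtype=np.float64)
print(mem.astype(np.int64))  # [5500000000] -- the offset survives intact
# astype(int) uses the platform's C long, which is 32-bit on Windows,
# so the same cast there overflows and yields a garbage (often huge
# negative) value, which fid.seek() then rejects with Errno 22.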

How can I handle this problem?

Interestingly, the Siemens raw file contains two scans, a ref scan and the main scan. When I try to extract the image for the ref scan, which is quite small, the error does not occur.

Thanks a lot in advance!

Possible collaboration

I'm glad to see more software for working with this data, especially in the Python space. You inspired me to release some of my own code (https://github.com/moloney/twix/), which was based on some code in VeSPA but modified to be more efficient, particularly when it comes to memory use. I later modified it based on the Matlab "mapvbvd" code to work with the newer format. Anyway, I think the API is fairly nice, but it is probably a little behind in terms of features, as I haven't touched it in a few years. I am starting to ramp up some other work in this space, so I do plan to do more work on it in the near future, and will likely use your code at least as a reference.

This isn't really a concrete proposal, so much as I just want to open up conversation and spread some awareness. Thanks for releasing your work!

Reading fails on Windows for files >2GB

In twix_map_obj.readData, mem is cast to int, which on Windows defaults to int32. This fails when data indices are larger than 2**31 - 1, i.e. on files larger than 2 GB. The cast should explicitly specify np.int64.
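A sketch of the proposed fix (mem here is a stand-in for the offset array in readData):

import numpy as np

mem = np.array([3.0e9])  # example offsets stored as float64
# mem = mem.astype(int)      # int -> C long -> int32 on Windows, overflows
mem = mem.astype(np.int64)   # 64-bit on every platform; offsets > 2 GB stay valid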

Mutable default values

Python 3.11 reports this bug related to the use of dataclasses:

ValueError: mutable default <class 'numpy.ndarray'> for field sz is not allowed: use default_factory

Need to use default factory functions to avoid mutable default values (np.array):

@dataclass
class FRI:
    szScanHeader: int  # bytes
    szChannelHeader: int  # bytes
    iceParamSz: int
    sz: np.array = np.zeros(2)
    shape: np.array = np.zeros(2)
    cut: np.array = None
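A sketch of the fix using field(default_factory=...) (my adaptation, not the project's actual patch):

from dataclasses import dataclass, field
import numpy as np

@dataclass
class FRI:
    szScanHeader: int     # bytes
    szChannelHeader: int  # bytes
    iceParamSz: int
    # A factory builds a fresh array per instance, so no mutable default is shared.
    sz: np.ndarray = field(default_factory=lambda: np.zeros(2))
    shape: np.ndarray = field(default_factory=lambda: np.zeros(2))
    cut: np.ndarray = None  # None is immutable, so a plain default is fine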

See also wtclarke/spec2nii#55.

Loading shows different behaviour on different datasets

Hello @ALL,
I'm doing scientific research on coil compression possibilities. To be able to do everything in just one programming language, I decided to use this module instead of importing some Matlab functionality into Python.
While working I found an error I can't really explain to myself:
I work with two different .dat MRI files: (1) with dimensions (256, 20, 261, 1, 20, 1, 1, 1, 3, 1, 9, 1, 1, 1, 1, 1) and (2) with dimensions (256, 20, 208, 1, 44, 1, 1, 1, 3, 1, 1, 1, 1, 1, 1, 1) after removing oversampling. In both cases I want to load the whole dataset at the start with data = twixObj.image['']. But this works only for dataset 1. If I use it on the second dataset, I get this error message:

line 23, in data = twixObj.image['']
line 646, in getitem out = self.readData(mem, ixToTarg, ixToRaw, selRange, selRangeSz, outSize)
line 748, in readData fid.seek(mem[k] + szScanHeader, 0)
OSError: [Errno 22] Invalid argument

If I use, for example, the command data = twixObj.image[:,:,:,:,20], the loading works... but I can never load the whole of dataset 2!
Any ideas why this happens? If you want, I can try to share the dataset that does not work with the module on some platform, so you can reproduce the error.
Many thanks

MRSI data not handled the same as in the original Matlab version

Attached are modifications I made to the twix_map_obj.py script. My notes on the variables are in comments near the changes I made. The primary issues were the ixToRaw, tmp, and maar variables. There were also some broadcasting issues arising from MRSI data, but I believe those are fixed by changing the skipLin and skipPar variables. Maybe you have more robust ideas for fixing these issues. Due to the size of MRSI data, I unfortunately cannot upload a test set here for you to try.

Cheers,
Andrew Wright

amw_issue_zip.zip

Wrong size when reading noise.unsorted()

Hi, when reading a file and accessing the data with sens_object.noise.unsorted(), I get an array of size [256, 30, 256].
The original Matlab code I am porting from yields a shape of [512, 30, 256].

The information below suggests that there are 512 columns in the data:

twix_map_obj
File: data/[...]AdjCoilSens.dat
Software: vd
Number of acquisitions read 256
Data size is [512, 30, 128, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
Squeezed data size is [256, 30, 128, 2] (['Col', 'Cha', 'Lin', 'Ave'])
NCol = 512
NCha = 30
NLin = 128
NAve = 2
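A guess at the cause (not confirmed in the issue): readout oversampling removal halves the 512 columns to 256. If the flag is set differently than in your Matlab session, disabling it should restore the full column count:

# Speculative: flagRemoveOS controls readout-oversampling removal in pymapvbvd.
sens_object.noise.flagRemoveOS = False
data = sens_object.noise.unsorted()  # expect 512 columns if this was the cause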

int16 vs. uint8 mismatch

Hi!

Thanks a bunch for this toolbox, it's a great helper! Yet I have run into a bug: when I call the toolbox from within Python as intended, everything works fine. However, I would like to call it directly from Julia with the PyCall.jl package, and this case seems to be, for whatever reason, more sensitive to type inconsistencies. More precisely, when calling twixObj = mapvbvd.mapVBVD(filename), I get the following error message:

UFuncTypeError(<ufunc 'bitwise_and'>, 'same_kind', dtype('int16'), dtype('uint8'), 2)
  File "~/.local/lib/python3.9/site-packages/mapvbvd/mapVBVD.py", line 360, in mapVBVD
    [mdh, mask] = evalMDH(mdh_blob, version)
  File "~/.local/lib/python3.9/site-packages/mapvbvd/mapVBVD.py", line 180, in evalMDH
    ulPCI_rx = set_bit(ulPCI_rx, 8, False)
  File "~/.local/lib/python3.9/site-packages/mapvbvd/mapVBVD.py", line 19, in set_bit
    v &= mask  # Clear the bit indicated by the mask (if x is False)

The issue is resolved by replacing line 180 in mapVBVD.py:
ulPCI_rx = set_bit(ulPCI_rx, 8, False)
with
ulPCI_rx = set_bit(np.int16(ulPCI_rx), 8, False)
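For context, a minimal standalone reproduction of the dtype clash (values illustrative, not taken from the file format):

import numpy as np

v = np.zeros(4, dtype=np.uint8)  # narrow unsigned array, like the header field
mask = np.int16(~(1 << 2))       # signed 16-bit mask
# v &= mask  # raises UFuncTypeError: the int16 result cannot be cast back
#            # into the uint8 output under 'same_kind' casting
v = v.astype(np.int16) & mask    # promoting first, as in the fix above, works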

I can create a pull request with this change if you want me to, but I was wondering if you would know of a cleaner solution. Thanks for looking into this!

-ja

Conversion of an array with ndim > 0 to a scalar is deprecated

python3.11 -m venv _e
. _e/bin/activate
pip install -e .[tests]
python -m pytest
================================ warnings summary ================================
tests/test_flags.py: 6 warnings
tests/test_read.py: 4 warnings
tests/test_svs.py: 2 warnings
  /home/ben/src/forks/pymapvbvd/mapvbvd/mapVBVD.py:343: DeprecationWarning: Conversion of an array with ndim > 0 to a scalar is deprecated, and will error in future. Ensure you extract a single element from your array before performing this operation. (Deprecated NumPy 1.25.)
    measOffset[k] = np.fromfile(fid, dtype=np.uint64, count=1, offset=0)

tests/test_flags.py: 6 warnings
tests/test_read.py: 4 warnings
tests/test_svs.py: 2 warnings
  /home/ben/src/forks/pymapvbvd/mapvbvd/mapVBVD.py:344: DeprecationWarning: Conversion of an array with ndim > 0 to a scalar is deprecated, and will error in future. Ensure you extract a single element from your array before performing this operation. (Deprecated NumPy 1.25.)
    measLength[k] = np.fromfile(fid, dtype=np.uint64, count=1, offset=0)

tests/test_flags.py::test_flagRemoveOS
tests/test_flags.py::test_flagIgnoreSeg_flagDoAverage
tests/test_flags.py::test_flagSkipToFirstLine
tests/test_read.py::test_gre
tests/test_read.py::test_epi
tests/test_svs.py::test_ve
  /home/ben/src/forks/pymapvbvd/mapvbvd/mapVBVD.py:125: DeprecationWarning: Conversion of an array with ndim > 0 to a scalar is deprecated, and will error in future. Ensure you extract a single element from your array before performing this operation. (Deprecated NumPy 1.25.)
    ulDMALength = float(tmp)

tests/test_flags.py: 6 warnings
tests/test_read.py: 4 warnings
tests/test_slicing.py: 1 warning
tests/test_svs.py: 3 warnings
  /home/ben/src/forks/pymapvbvd/mapvbvd/mapVBVD.py:111: DeprecationWarning: Conversion of an array with ndim > 0 to a scalar is deprecated, and will error in future. Ensure you extract a single element from your array before performing this operation. (Deprecated NumPy 1.25.)
    ulDMALength = float(tmp)

The easiest way to fix the DeprecationWarning is to use the numpy.ndarray.item() method whenever you need to get the scalar from a single-element array. PR to follow.
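For illustration (a standalone sketch; the array stands in for the np.fromfile result):

import numpy as np

a = np.array([42], dtype=np.uint64)  # like np.fromfile(fid, dtype=np.uint64, count=1)
x = float(a)   # DeprecationWarning on NumPy >= 1.25; will become an error
x = a.item()   # extracts the single element explicitly, warning-free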

IndexError: shape mismatch

When I try to use my own data file I get the following error. The file works fine in MATLAB with the original mapVBVD.

The command I am using:

twix.image.squeeze = False
out = twix.image['']

And the error:

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
 in 
      1 # Which can be squeezed
      2 twix.image.squeeze = False
----> 3 out = twix.image['']

~/usr/local/miniconda3/envs/pytorch-latest/lib/python3.8/site-packages/mapVBVD/twix_map_obj.py in __getitem__(self, key)
    377         tmp = np.arange(0,self.fullSize[2:].prod().astype(int)).reshape(self.fullSize[2:].astype(int))
    378         #tmpSelRange = [x-1 for x in selRange] # python indexing from 0
--> 379         tmp = tmp[tuple(selRange[2:])]
    380         ixToRaw = ixToRaw[tmp]
    381         ixToRaw = ixToRaw.ravel()

IndexError: shape mismatch: indexing arrays could not be broadcast together with shapes (360,) (112,) (1,) (1,) (1,) (1,) (1,) (1,) (12,) (1,) (1,) (1,) (1,) (1,) 

Any help is appreciated.

Compatibility with Python 3.6.x

Are there any Python 3.7-specific functionalities being leveraged? If not, could the PyPI release be updated to support Python 3.6.x? Happy to open a PR if that helps!
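If so, the change would presumably be a one-line relaxation of the version gate in setup.py (a sketch; the surrounding metadata is elided):

from setuptools import setup

setup(
    name='pymapvbvd',
    # ... other metadata ...
    python_requires='>=3.6',  # relaxed from the current 3.7 floor
)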

selecting indexed data

Hi, is it possible to select a set of indices while reading data? E.g.:
slice_index = np.array([0, 1, 2, 30, 31, 32])
out = twixObj.image[:,:,:,0,slice_index,0,0,:,:,0,:,0,0,0,0,0]

I am able to select and read a subset of data using array slicing syntax, but when I try to read a specific set of slice indices (as above), I get: ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all().

I am able to retrieve the indices I need after reading the full dataset:
out = twixObj.image['']
out = out[:,:,:,0,slice_index,0,0,:,:,0,:,0,0,0,0,0]
but would prefer to only read the relevant data as I need it.
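One possible workaround until fancy indexing is supported (a sketch, not a documented pymapvbvd feature; it assumes plain integer indexing works per dimension, as basic slicing does):

import numpy as np

slice_index = [0, 1, 2, 30, 31, 32]
# Read the wanted slices one at a time, then reassemble along the slice axis
# (the stack axis may need adjusting depending on squeeze settings).
parts = [twixObj.image[:, :, :, 0, s, 0, 0, :, :, 0, :, 0, 0, 0, 0, 0]
         for s in slice_index]
out = np.stack(parts, axis=4)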

Thanks for your help

version 5.2

Hi William,

Thank you very much for this nice package. I recently updated from version 4.8 to 5.2, and after doing so it was no longer possible to load my twix files in Python. Downgrading the package made it work fine again. I am using Python 3.9.

It did not work on either spatial or non-spatial data.

Kind regards, Emil
