napari-skimage-regionprops's Introduction

napari-skimage-regionprops (nsr)

A napari plugin for measuring properties of labeled objects based on scikit-image

Usage: measure region properties

From the menu Tools > Measurement > Regionprops (nsr) you can open a dialog where you can choose an intensity image, a corresponding label image and the features you want to measure.

If you want to interact with the labels and see which table row corresponds to which labeled object, use the label picker and activate the "show selected" checkbox.

If you closed a table and want to reopen it, use the menu Tools > Measurements > Show table (nsr) and select the labels layer the properties are associated with.

To visualize measurements as parametric images, in which every label is displayed with a grey value corresponding to its measurement, double-click the corresponding table header.

Usage: measure point intensities

Analogously, the intensity and coordinates of points layers can be measured using the menu Tools > Measurement > Measure intensity at point coordinates (nsr). These measurements can also be visualized by double-clicking table headers.

Working with time-lapse and tracking data

Note that tables for time-lapse data should include a column named "frame", which indicates which slice in time the given row refers to. If you import your own CSV files for time-lapse data, make sure to include this column. If you have tracking data where each row specifies measurements for a track instead of a label at a specific time point, this column must not be added.

In case you have 2D time-lapse data, you need to convert it into a suitable shape using the function Tools > Utilities > Convert 3D stack to 2D time-lapse (time-slicer), which is provided by the napari time slicer.

Last but not least, make sure that in the case of time-lapse data the label image is labeled sequentially within each timepoint. For example, a dataset where label 5 is missing at timepoint 4 may be visualized incorrectly.
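If your labels contain such gaps, they can be closed per timepoint with scikit-image's relabel_sequential. The following is a minimal sketch; the function name relabel_per_timepoint is only illustrative and not part of the plugin:

import numpy as np
from skimage.segmentation import relabel_sequential

def relabel_per_timepoint(label_timelapse):
    """Return a copy where every timepoint is labeled 1..N without gaps."""
    relabeled = np.zeros_like(label_timelapse)
    for t, frame in enumerate(label_timelapse):
        relabeled[t], _, _ = relabel_sequential(frame)
    return relabeled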

Usage: multichannel or multi-label data

If you want to relate objects from one channel to objects from another channel, you can use Tools > Measurement tables > Object Features/Properties (scikit-image, nsr). This plugin module allows you to answer questions like:

  • How many objects are inside other objects?
  • What is the average intensity of the objects inside other objects?

For that, you need at least two labeled images in napari. You can relate objects along with their features. If intensity features are also wanted, you additionally need to provide two intensity images. A small sketch of the underlying idea follows; also take a look at this example notebook.
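As a rough illustration of that idea (this is not the plugin's own code, and all names are hypothetical), objects from two label images can be related by checking in which "outer" object the centroid of each "inner" object lies:

import numpy as np
from skimage.measure import regionprops

def count_objects_inside(inner_labels, outer_labels):
    """Count, for each outer label, the inner objects whose centroid lies inside it."""
    counts = {}
    for region in regionprops(inner_labels):
        centroid = tuple(np.round(region.centroid).astype(int))
        parent = outer_labels[centroid]
        if parent > 0:
            counts[parent] = counts.get(parent, 0) + 1
    return counts  # {outer label: number of inner objects}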

Usage, programmatically

You can also control the tables programmatically. See this example notebook for details on regionprops and this example notebook for details on measuring intensity at point coordinates. For creating parametric map images, see this notebook.
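For orientation, here is a minimal sketch of such programmatic use, following the pattern from the issue reports further down (measurements are computed with scikit-image and attached to the labels layer, then shown with add_table); other entry points exist as well:

import napari
import numpy as np
import pandas as pd
from skimage.data import binary_blobs
from skimage.measure import label, regionprops_table
from napari_skimage_regionprops._table import add_table

image = np.random.rand(128, 128)
segmentation = label(binary_blobs(128, volume_fraction=0.25))

viewer = napari.Viewer()
viewer.add_image(image)
labels_layer = viewer.add_labels(segmentation)

# measure with scikit-image and attach the result to the labels layer ...
features = pd.DataFrame(regionprops_table(segmentation, image,
                                          properties=("label", "area", "mean_intensity")))
labels_layer.features = features

# ... then show it as an interactive table in napari
add_table(labels_layer, viewer)
napari.run()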

Features

In the user interface, the user can select categories of features for feature extraction. These categories contain measurements from scikit-image's regionprops list of measurements:

  • size:
    • area (given as number of pixels in 2D, voxels in 3D)
    • bbox_area
    • convex_area
    • equivalent_diameter
  • intensity:
    • max_intensity
    • mean_intensity
    • min_intensity
    • standard_deviation_intensity (extra_properties implementation using numpy)
  • perimeter:
    • perimeter
    • perimeter_crofton
  • shape
    • major_axis_length
    • minor_axis_length
    • orientation
    • solidity
    • eccentricity
    • extent
    • feret_diameter_max
    • local_centroid
    • roundness as defined for 2D labels by ImageJ
    • circularity as defined for 2D labels by ImageJ
    • aspect_ratio as defined for 2D labels by ImageJ
  • position:
    • centroid
    • bbox
    • weighted_centroid
  • moments:
    • moments
    • moments_central
    • moments_hu
    • moments_normalized
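The ImageJ-style 2D descriptors listed above can also be reproduced directly from scikit-image output. The following sketch uses the formulas from the ImageJ documentation (circularity = 4π·area/perimeter², roundness = 4·area/(π·major_axis²), aspect ratio = major/minor axis length); the function name is only illustrative:

import numpy as np
from skimage.measure import regionprops_table

def imagej_shape_descriptors(label_image):
    table = regionprops_table(label_image, properties=(
        "label", "area", "perimeter", "major_axis_length", "minor_axis_length"))
    area = table["area"]
    perimeter = table["perimeter"]
    major = table["major_axis_length"]
    minor = table["minor_axis_length"]
    # note: single-pixel objects have perimeter 0 and would divide by zero here
    table["circularity"] = 4 * np.pi * area / perimeter ** 2
    table["roundness"] = 4 * area / (np.pi * major ** 2)
    table["aspect_ratio"] = major / minor
    return table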

This napari plugin was generated with Cookiecutter using @napari's cookiecutter-napari-plugin template.

See also

There are other napari plugins with similar functionality for extracting features.

Furthermore, there are plugins for postprocessing extracted measurements.

Installation

You can install napari-skimage-regionprops via pip:

pip install napari-skimage-regionprops

Or if you plan to develop it:

git clone https://github.com/haesleinhuepf/napari-skimage-regionprops
cd napari-skimage-regionprops
pip install -e .

If there is an error message suggesting that git is not installed, run conda install git.

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure the test coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the BSD-3 license, "napari-skimage-regionprops" is free and open source software.

Issues

If you encounter any problems, please create a thread on image.sc along with a detailed description and tag @haesleinhuepf.

napari-skimage-regionprops's People

Contributors

chili-chiu, cryaaa, haesleinhuepf, jo-mueller, lazigu, romainguiet, tcompa, zoccoler

napari-skimage-regionprops's Issues

Error when adding table with time frames

First of all thanks for the awesome plugin! It provides exactly the functionality I want for visualizing segmentation features in napari.
My "normal mode of operation" would be to call this directly from python using add_table, since this offers most flexibility to look at classification predictions from a network, clustering results etc. in addition to standard regionprops features.
This works as expected for 2d data, as explained here: https://forum.image.sc/t/exloring-image-segment-features-in-napari/75222/7?u=constantinpape

However, I run into an issue when using this for 2d data with time; here is a minimal reproducible example:

import numpy as np
import napari
import pandas as pd
from skimage.data import binary_blobs
from skimage.measure import regionprops_table, label
from napari_skimage_regionprops._table import add_table

# a random image with 3 timepoints
image = np.random.rand(3, 128, 128)

# and a segmentation for the 3 timepoints
segmentation = np.stack([label(binary_blobs(128, volume_fraction=0.25)) for _ in range(3)])
print(image.shape, segmentation.shape)

# compute the features for each timepoint
features = []
for t, (seg, im) in enumerate(zip(segmentation, image)):
    feats = regionprops_table(seg, im, properties=("label", "mean_intensity"))
    # add the frame column
    feats["frame"] = np.full(len(feats["label"]), t)
    features.append(pd.DataFrame(feats))
features = pd.concat(features, axis=0)

v = napari.Viewer()
v.add_image(image)
label_layer = v.add_labels(segmentation)
label_layer.features = features
add_table(label_layer, v)
napari.run()

This opens napari, but when I try to select a label in the table or image nothing happens and the following error is thrown:

UnboundLocalError                         Traceback (most recent call last)
File ~/Work/my_projects/napari-skimage-regionprops/napari_skimage_regionprops/_table.py:120, in TableWidget._after_labels_clicked(self=<napari_skimage_regionprops._table.TableWidget object>)
    118 if frame_column is not None and self._viewer is not None:
    119     for r, (l, f) in enumerate(zip(self._table["label"], self._table[frame_column])):
--> 120         if l == self._layer.selected_label and f == frame:
        self._layer = <Labels layer 'segmentation' at 0x7f1f47ba9e80>
        self = <napari_skimage_regionprops._table.TableWidget object at 0x7f1f442def70>
        l = 12
        f = 0
    121             self._view.setCurrentCell(r, self._view.currentColumn())
    122             break

UnboundLocalError: local variable 'frame' referenced before assignment

'area' is actually 'volume' for 3D data

scikit-image regionprops returns the same feature called 'area' for 2D and 3D images.

Looking at the documentation, this 'area' is actually the number of pixels/voxels in an object (since version 0.20 those pixels/voxels are scaled by the pixel/voxel size given by an extra optional argument called spacing).

So, in 3D, this would actually be a volume. The term 'area' might be misleading.

One simple fix suggested by @haesleinhuepf, which keeps backwards compatibility, would be to add an extra column named 'volume' for 3D data with the same content as 'area'.
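A minimal sketch of that suggestion (illustrative, not a tested patch), where table is the dictionary returned by regionprops_table:

def add_volume_column(table, label_image_ndim):
    # keep 'area' for backwards compatibility, add an alias for 3D data
    if label_image_ndim == 3 and "area" in table:
        table["volume"] = table["area"]
    return table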

Taking layer scale into account for regionprops

Currently, the scale of Napari layers doesn't seem to be taken into account during the regionprops calculations. This means that e.g. areas aren't scaled by the given pixel size, and are just given in pixel units.

regionprops_table has a 'spacing' parameter into which the layer's .scale could be passed (https://scikit-image.org/docs/stable/api/skimage.measure.html#skimage.measure.regionprops_table), but it currently isn't used:

table = sk_regionprops_table(np.asarray(labels).astype(int), intensity_image=np.asarray(image),

Would it be possible to add this? Thanks!
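A minimal sketch of how the layer scale could be forwarded, assuming a scikit-image version that supports the spacing argument (0.20 according to the note above); the function name is only illustrative:

import numpy as np
from skimage.measure import regionprops_table

def measure_with_scale(labels_layer, image_layer):
    return regionprops_table(
        np.asarray(labels_layer.data).astype(int),
        intensity_image=np.asarray(image_layer.data),
        properties=("label", "area", "mean_intensity"),
        spacing=tuple(labels_layer.scale),  # pixel/voxel size from the napari layer
    )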

relabel image only works with Python lists and not numpy arrays

When using the relabel image function, it only works with Python lists because the function prepends a 0 to the measurement list with the + operator, which concatenates Python lists but performs element-wise addition on numpy arrays:

def relabel_cle(image, measurements):
    import pyclesperanto_prototype as cle
    return cle.pull(cle.replace_intensities(image, numpy.asarray([0] + measurements)))

def relabel_numpy(image, measurements):
    return numpy.take(numpy.array([0] + measurements), image)

Instead, we could build the measurement array like this:

return numpy.take(numpy.insert(numpy.asarray(measurements), 0, 0), image)

This would make the function compatible with Python lists as well as numpy arrays.
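A quick, illustrative check of the reformulated expression (values are made up):

import numpy as np

image = np.array([[0, 1], [2, 2]])
for measurements in ([10.0, 20.0], np.array([10.0, 20.0])):
    parametric = np.take(np.insert(np.asarray(measurements), 0, 0), image)
    print(parametric)  # [[ 0. 10.] [20. 20.]] for both the list and the array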

Error running regionprops with "shape" checkbox selected

I get an error when I try to run this plugin with the "shape" regionprops descriptor turned on.

I've pip installed napari-skimage-regionprops into my napari conda environment.

Version numbers:

  • napari-skimage-regionprops 0.5.3
  • scikit-image 0.19.3
  • napari 0.4.16
  • vispy 0.10.0
  • Operating system: Windows 10
Full conda list:
# packages in environment at C:\Users\CryoEM\.conda\envs\napari-empanada:
#
# Name                    Version                   Build  Channel
alabaster                 0.7.12                     py_0    conda-forge
aom                       3.4.0                h0e60522_1    conda-forge
appdirs                   1.4.4              pyh9f0ad1d_0    conda-forge
argon2-cffi               21.3.0             pyhd8ed1ab_0    conda-forge
argon2-cffi-bindings      21.2.0           py39hb82d6ee_2    conda-forge
asciitree                 0.3.3                      py_2    conda-forge
asttokens                 2.0.5              pyhd8ed1ab_0    conda-forge
attrs                     21.4.0             pyhd8ed1ab_0    conda-forge
autopep8                  1.6.0                    pypi_0    pypi
babel                     2.10.3             pyhd8ed1ab_0    conda-forge
backcall                  0.2.0              pyh9f0ad1d_0    conda-forge
backports                 1.0                        py_2    conda-forge
backports.functools_lru_cache 1.6.4              pyhd8ed1ab_0    conda-forge
beautifulsoup4            4.11.1             pyha770c72_0    conda-forge
bleach                    5.0.1              pyhd8ed1ab_0    conda-forge
blosc                     1.21.1               h74325e0_3    conda-forge
bokeh                     2.4.3            py39hcbf5309_0    conda-forge
brotlipy                  0.7.0           py39hb82d6ee_1004    conda-forge
build                     0.7.0              pyhd8ed1ab_0    conda-forge
bzip2                     1.0.8                h8ffe710_4    conda-forge
c-blosc2                  2.2.0                hdf67494_0    conda-forge
ca-certificates           2022.6.15            h5b45459_0    conda-forge
cachey                    0.2.1              pyh9f0ad1d_0    conda-forge
certifi                   2022.6.15        py39hcbf5309_0    conda-forge
cffi                      1.15.1           py39h0878f49_0    conda-forge
cfitsio                   4.1.0                h5a969a9_0    conda-forge
charls                    2.3.4                h39d44d4_0    conda-forge
charset-normalizer        2.1.0              pyhd8ed1ab_0    conda-forge
click                     8.1.3            py39hcbf5309_0    conda-forge
cloudpickle               2.1.0              pyhd8ed1ab_0    conda-forge
colorama                  0.4.5              pyhd8ed1ab_0    conda-forge
commonmark                0.9.1                      py_0    conda-forge
connected-components-3d   3.10.1                   pypi_0    pypi
cryptography              37.0.4           py39h7bc7c5c_0    conda-forge
cytoolz                   0.12.0           py39hb82d6ee_0    conda-forge
dask                      2022.6.1           pyhd8ed1ab_0    conda-forge
dask-core                 2022.6.1           pyhd8ed1ab_0    conda-forge
dataclasses               0.8                pyhc8e2a94_3    conda-forge
dav1d                     1.0.0                h8ffe710_1    conda-forge
debugpy                   1.6.0            py39h415ef7b_0    conda-forge
decorator                 5.1.1              pyhd8ed1ab_0    conda-forge
defusedxml                0.7.1              pyhd8ed1ab_0    conda-forge
distributed               2022.6.1           pyhd8ed1ab_0    conda-forge
docstring_parser          0.13               pyhd8ed1ab_0    conda-forge
docutils                  0.18.1           py39hcbf5309_1    conda-forge
empanada-dl               0.1.4                    pypi_0    pypi
empanada-napari           0.2.0                    pypi_0    pypi
entrypoints               0.4                pyhd8ed1ab_0    conda-forge
executing                 0.9.0              pyhd8ed1ab_0    conda-forge
fasteners                 0.17.3             pyhd8ed1ab_0    conda-forge
flit-core                 3.7.1              pyhd8ed1ab_0    conda-forge
freetype                  2.10.4               h546665d_1    conda-forge
freetype-py               2.3.0              pyhd8ed1ab_0    conda-forge
fsspec                    2022.5.0           pyhd8ed1ab_0    conda-forge
future                    0.18.2           py39hcbf5309_5    conda-forge
gettext                   0.19.8.1          ha2e2712_1008    conda-forge
giflib                    5.2.1                h8d14728_2    conda-forge
glib                      2.72.1               h7755175_0    conda-forge
glib-tools                2.72.1               h7755175_0    conda-forge
gst-plugins-base          1.20.3               he07aa86_0    conda-forge
gstreamer                 1.20.3               hdff456e_0    conda-forge
heapdict                  1.0.1                      py_0    conda-forge
hsluv                     5.0.2              pyh44b312d_0    conda-forge
icu                       70.1                 h0e60522_0    conda-forge
idna                      3.3                pyhd8ed1ab_0    conda-forge
imagecodecs               2022.2.22        py39h43fe4e9_6    conda-forge
imageio                   2.19.3             pyhcf75d05_0    conda-forge
imagesize                 1.4.1              pyhd8ed1ab_0    conda-forge
importlib-metadata        4.11.4           py39hcbf5309_0    conda-forge
importlib_resources       5.9.0              pyhd8ed1ab_0    conda-forge
intel-openmp              2022.1.0          h57928b3_3787    conda-forge
ipykernel                 6.15.1             pyh025b116_0    conda-forge
ipython                   8.4.0            py39hcbf5309_0    conda-forge
ipython_genutils          0.2.0                      py_1    conda-forge
ipywidgets                7.7.1              pyhd8ed1ab_0    conda-forge
jedi                      0.18.1           py39hcbf5309_1    conda-forge
jinja2                    3.1.2              pyhd8ed1ab_1    conda-forge
jpeg                      9e                   h8ffe710_2    conda-forge
jsonschema                4.7.2              pyhd8ed1ab_0    conda-forge
jupyter                   1.0.0            py39hcbf5309_7    conda-forge
jupyter_client            7.3.4              pyhd8ed1ab_0    conda-forge
jupyter_console           6.4.4              pyhd8ed1ab_0    conda-forge
jupyter_core              4.11.1           py39hcbf5309_0    conda-forge
jupyterlab_pygments       0.2.2              pyhd8ed1ab_0    conda-forge
jupyterlab_widgets        1.1.1              pyhd8ed1ab_0    conda-forge
jxrlib                    1.1                  h8ffe710_2    conda-forge
kiwisolver                1.4.4            py39h2e07f2f_0    conda-forge
krb5                      1.19.3               h1176d77_0    conda-forge
lcms2                     2.12                 h2a16943_0    conda-forge
lerc                      3.0                  h0e60522_0    conda-forge
libaec                    1.0.6                h39d44d4_0    conda-forge
libavif                   0.10.1               h8ffe710_1    conda-forge
libblas                   3.9.0              15_win64_mkl    conda-forge
libbrotlicommon           1.0.9                h8ffe710_7    conda-forge
libbrotlidec              1.0.9                h8ffe710_7    conda-forge
libbrotlienc              1.0.9                h8ffe710_7    conda-forge
libcblas                  3.9.0              15_win64_mkl    conda-forge
libclang                  14.0.6          default_h77d9078_0    conda-forge
libclang13                14.0.6          default_h77d9078_0    conda-forge
libcurl                   7.83.1               h789b8ee_0    conda-forge
libdeflate                1.12                 h8ffe710_0    conda-forge
libffi                    3.4.2                h8ffe710_5    conda-forge
libglib                   2.72.1               h3be07f2_0    conda-forge
libiconv                  1.16                 he774522_0    conda-forge
liblapack                 3.9.0              15_win64_mkl    conda-forge
libogg                    1.3.4                h8ffe710_1    conda-forge
libpng                    1.6.37               h1d00b33_3    conda-forge
libsodium                 1.0.18               h8d14728_1    conda-forge
libssh2                   1.10.0               h680486a_2    conda-forge
libtiff                   4.4.0                h2ed3b44_1    conda-forge
libvorbis                 1.3.7                h0e60522_0    conda-forge
libwebp                   1.2.3                h8ffe710_1    conda-forge
libwebp-base              1.2.3                h8ffe710_2    conda-forge
libxcb                    1.13              hcd874cb_1004    conda-forge
libzlib                   1.2.12               h8ffe710_2    conda-forge
libzopfli                 1.0.3                h0e60522_0    conda-forge
llvmlite                  0.38.1                   pypi_0    pypi
locket                    1.0.0              pyhd8ed1ab_0    conda-forge
lz4                       4.0.0            py39h0878066_2    conda-forge
lz4-c                     1.9.3                h8ffe710_1    conda-forge
m2w64-gcc-libgfortran     5.3.0                         6    conda-forge
m2w64-gcc-libs            5.3.0                         7    conda-forge
m2w64-gcc-libs-core       5.3.0                         7    conda-forge
m2w64-gmp                 6.1.0                         2    conda-forge
m2w64-libwinpthread-git   5.0.0.4634.697f757               2    conda-forge
magicgui                  0.5.1              pyhd8ed1ab_0    conda-forge
markupsafe                2.1.1            py39hb82d6ee_1    conda-forge
matplotlib-inline         0.1.3              pyhd8ed1ab_0    conda-forge
mistune                   0.8.4           py39hb82d6ee_1005    conda-forge
mkl                       2022.1.0           h6a75c08_874    conda-forge
msgpack-python            1.0.4            py39h2e07f2f_0    conda-forge
msys2-conda-epoch         20160418                      1    conda-forge
napari                    0.4.16          pyh275ddea_0_pyqt    conda-forge
napari-console            0.0.4              pyhd8ed1ab_1    conda-forge
napari-plugin-engine      0.2.0              pyhd8ed1ab_2    conda-forge
napari-skimage-regionprops 0.5.3                    pypi_0    pypi
napari-svg                0.1.6              pyhd8ed1ab_0    conda-forge
napari-tools-menu         0.1.15                   pypi_0    pypi
napari-workflows          0.2.3                    pypi_0    pypi
nbclient                  0.6.6              pyhd8ed1ab_0    conda-forge
nbconvert                 6.5.0              pyhd8ed1ab_0    conda-forge
nbconvert-core            6.5.0              pyhd8ed1ab_0    conda-forge
nbconvert-pandoc          6.5.0              pyhd8ed1ab_0    conda-forge
nbformat                  5.4.0              pyhd8ed1ab_0    conda-forge
nest-asyncio              1.5.5              pyhd8ed1ab_0    conda-forge
networkx                  2.8.5              pyhd8ed1ab_0    conda-forge
notebook                  6.4.12             pyha770c72_0    conda-forge
npe2                      0.5.1              pyhd8ed1ab_0    conda-forge
numba                     0.55.2                   pypi_0    pypi
numcodecs                 0.10.0           py39h415ef7b_1    conda-forge
numpy                     1.22.4                   pypi_0    pypi
numpydoc                  1.4.0              pyhd8ed1ab_1    conda-forge
opencv-python             4.6.0.66                 pypi_0    pypi
openjpeg                  2.4.0                hb211442_1    conda-forge
openssl                   1.1.1q               h8ffe710_0    conda-forge
packaging                 21.3               pyhd8ed1ab_0    conda-forge
pandas                    1.4.3            py39h2e25243_0    conda-forge
pandoc                    2.18                 h57928b3_0    conda-forge
pandocfilters             1.5.0              pyhd8ed1ab_0    conda-forge
parso                     0.8.3              pyhd8ed1ab_0    conda-forge
partd                     1.2.0              pyhd8ed1ab_0    conda-forge
pcre                      8.45                 h0e60522_0    conda-forge
pep517                    0.12.0           py39hcbf5309_2    conda-forge
pickleshare               0.7.5                   py_1003    conda-forge
pillow                    9.2.0            py39ha53f419_0    conda-forge
pint                      0.19.2             pyhd8ed1ab_0    conda-forge
pip                       22.2               pyhd8ed1ab_0    conda-forge
ply                       3.11                       py_1    conda-forge
pooch                     1.6.0              pyhd8ed1ab_0    conda-forge
prometheus_client         0.14.1             pyhd8ed1ab_0    conda-forge
prompt-toolkit            3.0.30             pyha770c72_0    conda-forge
prompt_toolkit            3.0.30               hd8ed1ab_0    conda-forge
psutil                    5.9.1            py39hb82d6ee_0    conda-forge
psygnal                   0.3.5            py39h2e07f2f_0    conda-forge
pthread-stubs             0.4               hcd874cb_1001    conda-forge
pure_eval                 0.2.2              pyhd8ed1ab_0    conda-forge
pycodestyle               2.8.0                    pypi_0    pypi
pycparser                 2.21               pyhd8ed1ab_0    conda-forge
pydantic                  1.9.1            py39hb82d6ee_0    conda-forge
pygments                  2.12.0             pyhd8ed1ab_0    conda-forge
pyopengl                  3.1.6              pyhd8ed1ab_1    conda-forge
pyopenssl                 22.0.0             pyhd8ed1ab_0    conda-forge
pyparsing                 3.0.9              pyhd8ed1ab_0    conda-forge
pyqt                      5.15.7           py39hb08f45d_0    conda-forge
pyqt5-sip                 12.11.0          py39h415ef7b_0    conda-forge
pyrsistent                0.18.1           py39hb82d6ee_1    conda-forge
pysocks                   1.7.1            py39hcbf5309_5    conda-forge
python                    3.9.13          h9a09f29_0_cpython    conda-forge
python-dateutil           2.8.2              pyhd8ed1ab_0    conda-forge
python-fastjsonschema     2.16.1             pyhd8ed1ab_0    conda-forge
python_abi                3.9                      2_cp39    conda-forge
pytomlpp                  1.0.11           py39h1f6ef14_0    conda-forge
pytz                      2022.1             pyhd8ed1ab_0    conda-forge
pywavelets                1.3.0            py39h5d4886f_1    conda-forge
pywin32                   303              py39hb82d6ee_0    conda-forge
pywinpty                  2.0.6            py39h99910a6_0    conda-forge
pyyaml                    6.0              py39hb82d6ee_4    conda-forge
pyzmq                     23.2.0           py39he46f08e_0    conda-forge
qt-main                   5.15.4               h467ea89_2    conda-forge
qtconsole                 5.3.1              pyhd8ed1ab_0    conda-forge
qtconsole-base            5.3.1              pyha770c72_0    conda-forge
qtpy                      2.1.0              pyhd8ed1ab_0    conda-forge
requests                  2.28.1             pyhd8ed1ab_0    conda-forge
rich                      12.5.1             pyhd8ed1ab_0    conda-forge
scikit-image              0.19.3           py39h2e25243_0    conda-forge
scipy                     1.8.1            py39h5567194_2    conda-forge
send2trash                1.8.0              pyhd8ed1ab_0    conda-forge
setuptools                63.2.0           py39hcbf5309_0    conda-forge
shellingham               1.4.0              pyh44b312d_0    conda-forge
sip                       6.6.2            py39h415ef7b_0    conda-forge
six                       1.16.0             pyh6c4a22f_0    conda-forge
snappy                    1.1.9                h82413e6_1    conda-forge
snowballstemmer           2.2.0              pyhd8ed1ab_0    conda-forge
sortedcontainers          2.4.0              pyhd8ed1ab_0    conda-forge
soupsieve                 2.3.2.post1        pyhd8ed1ab_0    conda-forge
sphinx                    5.0.2              pyh6c4a22f_0    conda-forge
sphinxcontrib-applehelp   1.0.2                      py_0    conda-forge
sphinxcontrib-devhelp     1.0.2                      py_0    conda-forge
sphinxcontrib-htmlhelp    2.0.0              pyhd8ed1ab_0    conda-forge
sphinxcontrib-jsmath      1.0.1                      py_0    conda-forge
sphinxcontrib-qthelp      1.0.3                      py_0    conda-forge
sphinxcontrib-serializinghtml 1.1.5              pyhd8ed1ab_2    conda-forge
sqlite                    3.39.2               h8ffe710_0    conda-forge
stack_data                0.3.0              pyhd8ed1ab_0    conda-forge
superqt                   0.3.3              pyhd8ed1ab_0    conda-forge
tbb                       2021.5.0             h2d74725_1    conda-forge
tblib                     1.7.0              pyhd8ed1ab_0    conda-forge
terminado                 0.15.0           py39hcbf5309_0    conda-forge
tifffile                  2022.5.4           pyhd8ed1ab_0    conda-forge
tinycss2                  1.1.1              pyhd8ed1ab_0    conda-forge
tk                        8.6.12               h8ffe710_0    conda-forge
toml                      0.10.2             pyhd8ed1ab_0    conda-forge
tomli                     2.0.1              pyhd8ed1ab_0    conda-forge
toolz                     0.12.0             pyhd8ed1ab_0    conda-forge
torch                     1.12.0                   pypi_0    pypi
torchvision               0.13.0                   pypi_0    pypi
tornado                   6.2              py39hb82d6ee_0    conda-forge
tqdm                      4.64.0             pyhd8ed1ab_0    conda-forge
traitlets                 5.3.0              pyhd8ed1ab_0    conda-forge
typer                     0.6.1              pyhd8ed1ab_0    conda-forge
typing-extensions         4.3.0                hd8ed1ab_0    conda-forge
typing_extensions         4.3.0              pyha770c72_0    conda-forge
tzdata                    2022a                h191b570_0    conda-forge
ucrt                      10.0.20348.0         h57928b3_0    conda-forge
urllib3                   1.26.11            pyhd8ed1ab_0    conda-forge
vc                        14.2                 hb210afc_6    conda-forge
vispy                     0.10.0           py39h5d4886f_0    conda-forge
vs2015_runtime            14.29.30037          h902a5da_6    conda-forge
wcwidth                   0.2.5              pyh9f0ad1d_2    conda-forge
webencodings              0.5.1                      py_1    conda-forge
wheel                     0.37.1             pyhd8ed1ab_0    conda-forge
widgetsnbextension        3.6.1              pyha770c72_0    conda-forge
win_inet_pton             1.1.0            py39hcbf5309_4    conda-forge
winpty                    0.4.3                         4    conda-forge
wrapt                     1.14.1           py39hb82d6ee_0    conda-forge
xorg-libxau               1.0.9                hcd874cb_0    conda-forge
xorg-libxdmcp             1.1.3                hcd874cb_0    conda-forge
xz                        5.2.5                h62dcd97_1    conda-forge
yaml                      0.2.5                h8ffe710_2    conda-forge
zarr                      2.12.0             pyhd8ed1ab_0    conda-forge
zeromq                    4.3.4                h0e60522_1    conda-forge
zfp                       0.5.5                h0e60522_8    conda-forge
zict                      2.2.0              pyhd8ed1ab_0    conda-forge
zipp                      3.8.0              pyhd8ed1ab_0    conda-forge
zlib                      1.2.12               h8ffe710_2    conda-forge
zlib-ng                   2.0.6                h8ffe710_0    conda-forge
zstd                      1.5.2                h6255e5f_2    conda-forge

To reproduce:

import napari
from skimage.filters import gaussian
from skimage.measure import label

# opens sample data and adds layer to the Viewer
viewer = napari.Viewer()
viewer.open_sample('napari', 'cells3d')

# Create labels for nuclei
nuclei = viewer.layers[1].data
mask = gaussian(nuclei, sigma=3) > 0.1
labels = label(mask)
viewer.add_labels(labels)

From the menu: Tools > Measurement > Regionprops (scikit-image nsr)
Select the "shape" regionprops checkbox, and then click "Run".

I get this error message:

  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_marching_cubes_lewiner.py", line 177, in _marching_cubes_lewiner
    raise ValueError("Surface level must be within volume data range.")
ValueError: Surface level must be within volume data range.

Full error message traceback:

C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py:395: UserWarning: Failed to get convex hull image. Returning empty image, see error message below:
QH6013 qhull input error: input is less than 3-dimensional since all points have the same x coordinate    0

While executing:  | qhull i Qt
Options selected for Qhull 2019.1.r 2019/06/21:
  run-id 1146664760  incidence  Qtriangulate  _pre-merge  _zero-centrum
  _max-width  1  Error-roundoff 1.4e-15  _one-merge 9.7e-15
  _near-inside 4.9e-14  Visible-distance 2.8e-15  U-max-coplanar 2.8e-15
  Width-outside 5.5e-15  _wide-facet 1.7e-14  _maxoutside 1.1e-14

  return convex_hull_image(self.image)
Traceback (most recent call last):
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\magicgui\widgets\_bases\value_widget.py", line 57, in _on_value_change
    self.changed.emit(value)
  File "psygnal\\_signal.py", line 725, in psygnal._signal.SignalInstance.emit
  File "psygnal\\_signal.py", line 767, in psygnal._signal.SignalInstance._run_emit_loop
  File "psygnal\\_signal.py", line 768, in psygnal._signal.SignalInstance._run_emit_loop
  File "psygnal\\_signal.py", line 788, in psygnal._signal.SignalInstance._run_emit_loop
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\magicgui\widgets\_function_gui.py", line 207, in _disable_button_and_call
    self.__call__()
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\magicgui\widgets\_function_gui.py", line 318, in __call__
    value = self._function(*bound.args, **bound.kwargs)
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\napari_tools_menu\__init__.py", line 91, in worker_func
    data = func(*iargs, **ikwargs)
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\napari_skimage_regionprops\_regionprops.py", line 103, in regionprops_table
    table = sk_regionprops_table(np.asarray(labels).astype(int), intensity_image=np.asarray(image),
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py", line 996, in regionprops_table
    return _props_to_dict(
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py", line 807, in _props_to_dict
    column_buffer[i] = regions[i][prop]
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py", line 675, in __getitem__
    value = getattr(self, key, None)
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py", line 434, in feret_diameter_max
    coordinates, _, _, _ = marching_cubes(identity_convex_hull,
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_marching_cubes_lewiner.py", line 133, in marching_cubes
    return _marching_cubes_lewiner(volume, level, spacing,
  File "C:\Users\CryoEM\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_marching_cubes_lewiner.py", line 177, in _marching_cubes_lewiner
    raise ValueError("Surface level must be within volume data range.")
ValueError: Surface level must be within volume data range.

If you suspect this is an IPython 8.4.0 bug, please report it at:
    https://github.com/ipython/ipython/issues
or send an email to the mailing list at [email protected]

You can print a more detailed traceback right now with "%tb", or use "%debug"
to interactively debug it.

Extra-detailed tracebacks for bug-reporting purposes can be enabled via:
    %config Application.verbose_crash=True
Even more details (output of %tb):
In [3]: %tb
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
File ~\.conda\envs\napari-empanada\lib\site-packages\psygnal\_signal.py:725, in psygnal._signal.SignalInstance.emit()

File ~\.conda\envs\napari-empanada\lib\site-packages\psygnal\_signal.py:767, in psygnal._signal.SignalInstance._run_emit_loop()

File ~\.conda\envs\napari-empanada\lib\site-packages\psygnal\_signal.py:768, in psygnal._signal.SignalInstance._run_emit_loop()

File ~\.conda\envs\napari-empanada\lib\site-packages\psygnal\_signal.py:788, in psygnal._signal.SignalInstance._run_emit_loop()

File ~\.conda\envs\napari-empanada\lib\site-packages\magicgui\widgets\_function_gui.py:207, in FunctionGui.__init__.<locals>._disable_button_and_call()
    205 self._call_button.enabled = False
    206 try:
--> 207     self.__call__()
    208 finally:
    209     self._call_button.enabled = True

File ~\.conda\envs\napari-empanada\lib\site-packages\magicgui\widgets\_function_gui.py:318, in FunctionGui.__call__(self, update_widget, *args, **kwargs)
    316 self._tqdm_depth = 0  # reset the tqdm stack count
    317 with _function_name_pointing_to_widget(self):
--> 318     value = self._function(*bound.args, **bound.kwargs)
    320 self._call_count += 1
    321 if self._result_widget is not None:

File ~\.conda\envs\napari-empanada\lib\site-packages\napari_tools_menu\__init__.py:91, in make_gui.<locals>.worker_func(*iargs, **ikwargs)
     89 @wraps(func)
     90 def worker_func(*iargs, **ikwargs):
---> 91     data = func(*iargs, **ikwargs)
     92     if data is None:
     93         return None

File ~\.conda\envs\napari-empanada\lib\site-packages\napari_skimage_regionprops\_regionprops.py:103, in regionprops_table(image, labels, size, intensity, perimeter, shape, position, moments, napari_viewer)
     94 # todo:
     95 # weighted_local_centroid
     96 # weighted_moments
   (...)
    100
    101 # quantitative analysis using scikit-image's regionprops
    102 from skimage.measure import regionprops_table as sk_regionprops_table
--> 103 table = sk_regionprops_table(np.asarray(labels).astype(int), intensity_image=np.asarray(image),
    104                           properties=properties, extra_properties=extra_properties)
    106 if shape:
    107     if len(labels.shape) == 2:
    108         # See https://imagej.nih.gov/ij/docs/menus/analyze.html

File ~\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py:996, in regionprops_table(label_image, intensity_image, properties, cache, separator, extra_properties)
    992     out_d = _props_to_dict(regions, properties=properties,
    993                            separator=separator)
    994     return {k: v[:0] for k, v in out_d.items()}
--> 996 return _props_to_dict(
    997     regions, properties=properties, separator=separator
    998 )

File ~\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py:807, in _props_to_dict(regions, properties, separator)
    805     column_buffer = np.empty(n, dtype=dtype)
    806     for i in range(n):
--> 807         column_buffer[i] = regions[i][prop]
    808     out[orig_prop] = np.copy(column_buffer)
    809 else:

File ~\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py:675, in RegionProperties.__getitem__(self, key)
    674 def __getitem__(self, key):
--> 675     value = getattr(self, key, None)
    676     if value is not None:
    677         return value

File ~\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_regionprops.py:434, in RegionProperties.feret_diameter_max(self)
    431     coordinates = np.vstack(find_contours(identity_convex_hull, .5,
    432                                           fully_connected='high'))
    433 elif self._ndim == 3:
--> 434     coordinates, _, _, _ = marching_cubes(identity_convex_hull,
    435                                           level=.5)
    436 distances = pdist(coordinates, 'sqeuclidean')
    437 return sqrt(np.max(distances))

File ~\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_marching_cubes_lewiner.py:133, in marching_cubes(volume, level, spacing, gradient_direction, step_size, allow_degenerate, method, mask)
     13 """Marching cubes algorithm to find surfaces in 3d volumetric data.
     14
     15 In contrast with Lorensen et al. approach [2]_, Lewiner et
   (...)
    129
    130 """
    132 if method == 'lewiner':
--> 133     return _marching_cubes_lewiner(volume, level, spacing,
    134                                    gradient_direction, step_size,
    135                                    allow_degenerate, use_classic=False,
    136                                    mask=mask)
    137 elif method == 'lorensen':
    138     return _marching_cubes_lewiner(volume, level, spacing,
    139                                    gradient_direction, step_size,
    140                                    allow_degenerate, use_classic=True,
    141                                    mask=mask)

File ~\.conda\envs\napari-empanada\lib\site-packages\skimage\measure\_marching_cubes_lewiner.py:177, in _marching_cubes_lewiner(volume, level, spacing, gradient_direction, step_size, allow_degenerate, use_classic, mask)
    175     level = float(level)
    176     if level < volume.min() or level > volume.max():
--> 177         raise ValueError("Surface level must be within volume data range.")
    178 # spacing
    179 if len(spacing) != 3:

ValueError: Surface level must be within volume data range.

Lazy imports for {napari,Qt}-related packages?

Hi there,

Together with @jluethi and others we implemented a wrapper of napari-workflows to be run within the Fractal system (the current -experimental- version of the wrapper is this one). In our current workflows, we already use napari-skimage-regionprops and napari-segment_blobs-and-things-with-membranes.

Since we execute the workflows as part of HPC jobs, we do not need all features related to napari or the graphical interface, and we would appreciate if those dependencies were optional. This is not the case at the moment, since running a workflow without qtpy available fails with a ModuleNotFoundError (the __init__.py of this package imports _table.py, which in turn imports from qtpy.QtCore). This is also true for napari-segment-blobs-and-things-with-membranes.

In the same spirit as releases 0.2.7 and 0.2.8 of napari-workflows, would it be possible to make some graphics-related dependencies optional, by moving the related import statements within the functions that actually need them? I'm not sure I can write a PR myself, but I'd be glad to test any proposed update.
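A minimal sketch of that pattern (not the plugin's actual code; names are illustrative): the Qt import moves from module level into the function that needs it, so importing the package in a headless HPC job no longer requires qtpy.

def show_table(layer, viewer):
    # imported lazily: only evaluated when a GUI table is actually requested
    from qtpy.QtWidgets import QTableWidget

    widget = QTableWidget()
    viewer.window.add_dock_widget(widget, name="Measurements")
    return widget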

Cheers, Tommaso

Measuring intensity from different image layers with the same labels layer?

Hi there, thanks for the great plugin!

Just want to double check -- there's not currently any way to measure the intensity from different image layers using the same labels layer, correct?

I'm a bit of an outlier for a napari user in that I maintain yt-napari for loading astrophysics (and more) simulation data from yt into napari. So I'm not sure how my situation applies to the majority of napari users, but in my workflow I typically:

  • load several image layers for different simulation fields covering the same spatial extent
  • segment a single image layer to get a single labels layer
  • measure the intensity values of all my image layers using the one labels layer
  • explore how those measured intensities relate to each other (e.g., here's a short video of interactive plotting with napari-clusters-plotter after I manually edited the properties dict to contain intensity values across image layers for the one labels layer and then used napari-clusters-plotter to run a kmeans classification)

I've taken to doing the measurements manually since I haven't found a plugin that will measure across multiple image layers.

I'm not sure how much need there is for most napari users (maybe there are applications with multispectral bio-imaging?), but would you be open to a PR that adds an option to measure intensity from different image layers (assuming that I haven't missed something in how to use regionprops)? No worries at all if not, I can add some functionality to yt-napari, but this felt like a change that might be nice to have upstream here.

Revert workaround for convex-hull based measurements

... as soon as this bug has been fixed and scikit-image has been released:

... we should remove these if-blocks:

Showing features of a Surface layer causes problems

When I use the show table widget on a surface layer which has the features attribute set but not the properties attribute, the table widget sets the properties to an empty dictionary. This causes two issues:

  1. The features of the surface layer are not shown
  2. Other plugins trying to access the features see empty properties although the features attribute is populated

I believe these lines are the culprit, so maybe we could change the checks to include both the properties and the features attribute? I will make a quick fork and pull request soon

if hasattr(layer, "properties"):
    self.set_content(layer.properties)
else:
    self.set_content({})
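A sketch of the proposed change (illustrative, not a tested patch) that prefers a populated features table and only then falls back to properties:

def layer_to_table_content(layer):
    features = getattr(layer, "features", None)
    if features is not None and len(features) > 0:
        return features.to_dict(orient="list")  # pandas DataFrame -> dict of columns
    if getattr(layer, "properties", None):
        return layer.properties
    return {}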

Save as csv/copy to clipboard not working

Hi @haesleinhuepf,

When trying to save or copy measurements, an error is raised:
AttributeError: 'QTableWidget' object has no attribute 'to_dataframe'

I noticed this because it stopped working in my plugin; I did not fix it yet, maybe you will be faster.

Enable re-labeling non-sequential label images

It might be nice to enable re-labeling of non-sequentially labeled images / measurements. For this, we need to use skimage's map_array. We might want to program a function similar to this one and then use it from the clusters-plotter, for example (a rough sketch is below).

We should not change the existing function for backwards-compatibility reasons. But if the new one works nicely, we can mark the old as deprecated.
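Such a re-labeling helper could look roughly like this (illustrative only; the function name is hypothetical), using skimage.util.map_array:

import numpy as np
from skimage.util import map_array

def relabel_with_map_array(label_image, label_list, measurement_list):
    """Replace every label in label_list by its corresponding measurement value."""
    return map_array(
        np.asarray(label_image),
        np.asarray(label_list),
        np.asarray(measurement_list),
    )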

CC @lazigu @marabuuu

Measure properties of points

It would be nice to be able to measure some properties for a points layer and an intensity image instead of only using labels layers. I figure the possible measurements would be quite straightforward:

  • Location (=point coordinates)
  • Intensity at point location
  • maaaybe features of neighboring points.

Just like labels layers, points layers have a points.features attribute which can host all sorts of measurements.
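A rough sketch of the first two measurements proposed above (illustrative; the function name is hypothetical): sample the intensity image at the rounded point coordinates and return both.

import numpy as np

def measure_intensity_at_points(points, intensity_image):
    coords = np.round(np.asarray(points)).astype(int)
    intensities = intensity_image[tuple(coords.T)]
    return {"coordinates": coords, "intensity": intensities}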

double-click on headers to generate parametric images - doesn't take into account "rotate" nor "scale"

Hi @haesleinhuepf ,

I saw in the documentation that one can double-click on headers to generate parametric images; that's a wonderful feature!

But with my layers (which I loaded with a rotate value), when double-clicking on a column header the "measurement image" appears but does not take "rotate" and "scale" from the measured layer nor from the labels into account.

To reproduce:

from tifffile import imread # https://pypi.org/project/tifffile/#examples
import napari 
import numpy as np

viewer = napari.Viewer()
viewer.show()

#we define the path 
image_path = '../data/rat_brain/BigStitcher_6-17_441.tif' #3D, 2chs image
image = imread(image_path)

labels_path = '../data/rat_brain/BigStitcher_6-17_441-1-stardist3D_16bit.tif' #labels
labels = imread(labels_path)

print ("image shape : " + str(image.shape))
print ("label shape : " + str(labels.shape))

# we set the voxel scale 
scale = [1,0.625,0.625]
blending = 'additive'
rotate = [0,0,90]

# extract the green channel 
viewer.add_image( np.swapaxes(image, 0,1 ) , 
                    channel_axis = 0,
                    name=["DAPI","Marker" ],
                    colormap=[ 'cyan','green'],
                    scale=scale ,
                    blending=blending,
                    contrast_limits=[[0,15000],[0,15000]],
                    rotate=(rotate,rotate)
                 )

viewer.add_labels( labels,
                    scale=scale ,
                    blending=blending ,
                    rotate=rotate
                 )

Cheers,

Romain

Timelapse handling

If the table has a column "timepoint", we should mark the right label at the right timepoint. At the moment, when a user clicks "label 1" in the image, any table row with label==1 will be selected. Only the row for the current timepoint should be selected.

Thanks to @lazigu for reporting!

Intensity measurement in additional channels

@jo-mueller suggested this for the napari-cluster-plotter but I think it could be more fitting here:
It would be a nice-to-have to be able to measure intensity values in more than just one image (different channels with different markers). The actual coding should not be very difficult, but the GUI implementation of it might be slightly more challenging.

wrong values for table generated images

Hi @haesleinhuepf ,

I would like to measure object intensities.

But the generated image (obtained by double-clicking on the column header) does not reflect these values.

when I "mouse over" objects, the value is almost all the time 1.14 e03, even for the one to be 36216 (background is 0 ).
What's weird is that the contrast limit, max is 36216 and "compressing" limits doesn't improve the visualisation of the results

I generated an image using the column index, and it seems a lot of labels have the same display value of 6.49e04 (while the values are different in the table).

Similar data (I reproduced the issue but didn't redo all the screenshots) can be found on Zenodo, Crop_ds441.tif.

Best

Romain

napari-skimage-regionprops overwrites layers.properties

Hi @haesleinhuepf ,

it seems that napari-skimage-regionprops overwrites whatever content has previously been stored in the layer.properties attribute. Would you consider a PR that would change the behaviour of, for instance, add_table() in such a way that it checks whether content has already been written to the respective layer?

In case values have been written already, add_table() would then have to be replaced by table_widget.append_content(), which would add the data to layer.properties in a "smart" way.

Measuring regionprops without images

If I understand correctly, the Image layer is a necessary input for nsr.regionprops_table_all_frames. I think it may be useful to add an option to measure the regionprops without the image layer input (by making the argument Optional). In particular, this feature is necessary for cases where we track segmented regions in a label layer by napari-laptrack.
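The underlying scikit-image call already works without an intensity image, so the change would mainly affect the plugin's signature. A minimal sketch of a label-only measurement (the function name is hypothetical):

import numpy as np
from skimage.measure import regionprops_table

def measure_labels_only(label_image):
    # geometric properties only, no intensity image required
    return regionprops_table(
        np.asarray(label_image).astype(int),
        properties=("label", "area", "centroid", "bbox"),
    )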

Extraneous print commands

Hi Robert @haesleinhuepf ,

I am running napari-skimage-regionprops from a notebook in a for loop and I have been getting a lot of print statements.

Is this line there for a reason or by accident? If it is the latter, can we remove it?

Thanks!

scale information

Hi, I see that this plugin was recently updated on the 19th of May, and I was wondering if this was in response to Juan Nunez Iglesias updating scikit-image regionprops to be able to take scaled voxels and allow regionprops to work with anisotropic data (also updated on the 19th of May), or if this is a coincidence?

I only ask as it would be really helpful (I am guessing for many) to be able to measure and pull out quantitative measurements from microscopy data in napari, as most microscopy data is anisotropic.

For more info, see this post (link below) and reply. Many thanks, Jemima (napari newbie)
https://forum.image.sc/t/please-help-with-measurements-in-3d-in-napari-with-anisotropic-data/81956?u=jemima

measure things inside things doesn't do anything with default settings

Hi Marcelo @zoccoler ,

I was just testing your new Measure things inside things menu (note: I renamed it a little bit). Unfortunately, it doesn't show any result with the default settings.

To reproduce:

  • Install devbio-napari + the master branch of this repository
  • Start Napari
  • Open Samples > clesperanto > Blobs
  • Tools > Segmentation / Labeling > Gauss-Otsu-Labeling
  • Tools > Measurement tables > Measure things inside things
  • Click Run
    No table opens :-(

Would you mind taking a look?

Thanks!
