
stardist's Introduction


StarDist - Object Detection with Star-convex Shapes

This repository contains the Python implementation of star-convex object detection for 2D and 3D images, as described in the papers listed in the How to cite section below.

Please cite the paper(s) if you are using this code in your research.

Overview

The following figure illustrates the general approach for 2D images. The training data consists of corresponding pairs of input (i.e. raw) images and fully annotated label images (i.e. every pixel is labeled with a unique object id or 0 for background). A model is trained to densely predict the distances (r) to the object boundary along a fixed set of rays and object probabilities (d), which together produce an overcomplete set of candidate polygons for a given input image. The final result is obtained via non-maximum suppression (NMS) of these candidates.

The approach for 3D volumes is similar to the one described for 2D, using pairs of input and fully annotated label volumes as training data.
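For intuition, the quantities described above (radial distances and object probabilities) can be computed directly from a label image with helper functions of this package. The following is a minimal sketch for the 2D case; the actual training targets are prepared internally by the model, so this is for illustration only:

from stardist import star_dist, edt_prob
from stardist.data import test_image_nuclei_2d

# example image together with its fully annotated label mask
img, lbl = test_image_nuclei_2d(return_mask=True)

# radial distances to the object boundary along 32 rays, per pixel
dist = star_dist(lbl, n_rays=32)   # shape (H, W, 32)

# object probabilities (normalized distance to the nearest background pixel)
prob = edt_prob(lbl)               # shape (H, W)

print(dist.shape, prob.shape)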

Webinar/Tutorial

If you want to know more about the concepts and practical applications of StarDist, please have a look at the following webinar that was given at NEUBIAS Academy @Home 2020:

webinar video

Installation

This package is compatible with Python 3.6 - 3.12.

If you only want to use a StarDist plugin for a GUI-based software, please read this.

  1. Please first install TensorFlow (either TensorFlow 1 or 2) by following the official instructions. For GPU support, it is very important to install the specific versions of CUDA and cuDNN that are compatible with the respective version of TensorFlow. (If you need help and can use conda, take a look at this.)

  2. StarDist can then be installed with pip:

    • If you installed TensorFlow 2 (version 2.x.x):

      pip install stardist
      
    • If you installed TensorFlow 1 (version 1.x.x):

      pip install "stardist[tf1]"
      

Notes

  • Depending on your Python installation, you may need to use pip3 instead of pip.
  • You can find out which version of TensorFlow is installed via pip show tensorflow.
  • We provide pre-compiled binaries ("wheels") that should work for most Linux, Windows, and macOS platforms. If you're having problems, please see the troubleshooting section below.
  • (Optional) You need to install gputools if you want to use OpenCL-based computations on the GPU to speed up training.
  • (Optional) You might experience improved performance during training if you additionally install the Multi-Label Anisotropic 3D Euclidean Distance Transform (MLAEDT-3D).
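After installation, a quick sanity check like the following (just a suggestion, not an official requirement) prints the installed versions and whether OpenCL-based GPU computations via gputools are available:

import tensorflow as tf
import stardist
from stardist import gputools_available

print("TensorFlow:", tf.__version__)
print("StarDist:", stardist.__version__)
print("gputools available:", gputools_available())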

Usage

We provide example workflows for 2D and 3D via Jupyter notebooks that illustrate how this package can be used.

Pretrained Models for 2D

Currently we provide some pretrained models in 2D that might already be suitable for your images:

  • 2D_versatile_fluo / 2D_paper_dsb2018 — Modality (staining): fluorescence (nuclear marker); image format: 2D single channel. Versatile (fluorescent nuclei) and DSB 2018 (from the StarDist 2D paper), both trained on a subset of the DSB 2018 nuclei segmentation challenge dataset.
  • 2D_versatile_he — Modality (staining): brightfield (H&E); image format: 2D RGB. Versatile (H&E nuclei), trained on images from the MoNuSeg 2018 training data and the TNBC dataset from Naylor et al. (2018).

You can access these pretrained models from stardist.models.StarDist2D

from stardist.models import StarDist2D

# prints a list of available models
StarDist2D.from_pretrained()

# creates a pretrained model
model = StarDist2D.from_pretrained('2D_versatile_fluo')

And then try it out with a test image:

from stardist.data import test_image_nuclei_2d
from stardist.plot import render_label
from csbdeep.utils import normalize
import matplotlib.pyplot as plt

img = test_image_nuclei_2d()

labels, _ = model.predict_instances(normalize(img))

plt.subplot(1,2,1)
plt.imshow(img, cmap="gray")
plt.axis("off")
plt.title("input image")

plt.subplot(1,2,2)
plt.imshow(render_label(labels, img=img))
plt.axis("off")
plt.title("prediction + input overlay")

Annotating Images

To train a StarDist model you will need some ground-truth annotations: for every raw training image there has to be a corresponding label image where all pixels of a cell region are labeled with a distinct integer (and background pixels are labeled with 0). To create such annotations in 2D, there are several options, among them Fiji, Labkit, and QuPath. In 3D, there are fewer options: Labkit and Paintera (the latter being very sophisticated but having a steeper learning curve).
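To make the expected label format concrete, here is a tiny hypothetical example of a valid label image as a NumPy array, with two objects (ids 1 and 2) on background 0:

import numpy as np

lbl = np.array([
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 2],
    [0, 0, 0, 0, 2],
], dtype=np.uint16)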

Although each of these provides decent annotation tools, we currently recommend using Labkit (for 2D or 3D images) or QuPath (for 2D):

Annotating with LabKit (2D or 3D)

  1. Install Fiji and the Labkit plugin
  2. Open the (2D or 3D) image and start Labkit via Plugins > Labkit > Open Current Image With Labkit
  3. Successively add a new label and annotate a single cell instance with the brush tool until all cells are labeled.
    (Always disable allow overlapping labels or – in older versions of LabKit – enable the override option.)
  4. Export the label image via Labeling > Save Labeling ... with Files of Type > TIF Image making sure that the file name ends with .tif or .tiff.

Additional tips:

  • The Labkit viewer uses BigDataViewer and its keybindings (e.g. s for contrast options, CTRL+Shift+mouse-wheel for zoom-in/out etc.)
  • For 3D images (XYZ) it is best to first convert them to an (XYT) time series (via Re-Order Hyperstack, swapping z and t) and then use [ and ] in Labkit to walk through the slices.
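Once exported, the raw images and label masks can be read in Python for use in the training notebooks, e.g. with tifffile. A minimal sketch (the file names are hypothetical), including the fill_label_holes preprocessing used in the training notebooks:

from tifffile import imread
from stardist import fill_label_holes

x = imread("raw/image_01.tif")      # hypothetical path to a raw image ...
y = imread("labels/image_01.tif")   # ... and to its corresponding label mask
y = fill_label_holes(y)             # fill small holes in the labels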

Annotating with QuPath (2D)

  1. Install QuPath
  2. Create a new project (File -> Project...-> Create project) and add your raw images
  3. Annotate nuclei/objects
  4. Run this script to export the annotations (save the script and drag it onto QuPath, then execute it with Run for project). The script will create a ground_truth folder within your QuPath project containing images and masks subfolders that can then be used directly with StarDist.

To see how this could be done, have a look at the following example QuPath project (data courtesy of Romain Guiet, EPFL).

Multi-class Prediction

StarDist also supports multi-class prediction, i.e. each found object instance can additionally be classified into a fixed number of discrete object classes (e.g. cell types):

Please see the multi-class example notebook if you're interested in this.
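As a rough sketch of what this looks like (the n_classes option and the class_id key follow the multi-class example notebook; treat them as assumptions and consult the notebook for the authoritative version):

from stardist.models import Config2D, StarDist2D

# assumption: setting n_classes enables multi-class prediction (as in the example notebook)
conf = Config2D(n_rays=32, n_classes=3)
model = StarDist2D(conf, name='multiclass-demo', basedir='models')

# after training with per-object class annotations (see the notebook):
# labels, details = model.predict_instances(img)
# class_ids = details['class_id']   # assumption: one class id per detected object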

Instance segmentation metrics

StarDist contains the stardist.matching submodule that provides functions to compute common instance segmentation metrics between ground-truth label masks and predictions (not necessarily from StarDist). Currently available metrics are

  • tp, fp, fn
  • precision, recall, accuracy, f1
  • panoptic_quality
  • mean_true_score, mean_matched_score

which are computed by matching ground-truth/prediction objects if their IoU exceeds a threshold (by default 50%). See the documentation of stardist.matching.matching for a detailed explanation.

Here is an example of how to use it:

# create some example ground-truth and dummy prediction data
from stardist.data import test_image_nuclei_2d
from scipy.ndimage import rotate
_, y_true = test_image_nuclei_2d(return_mask=True)
y_pred = rotate(y_true, 2, order=0, reshape=False)

# compute metrics between ground-truth and prediction
from stardist.matching import matching

metrics =  matching(y_true, y_pred)

print(metrics)
Matching(criterion='iou', thresh=0.5, fp=88, tp=37, fn=88, precision=0.296, 
       recall=0.296, accuracy=0.1737, f1=0.296, n_true=125, n_pred=125, 
       mean_true_score=0.19490, mean_matched_score=0.65847, panoptic_quality=0.19490)

If you want to compare a list of images you can use stardist.matching.matching_dataset:

from stardist.matching import matching_dataset

metrics = matching_dataset([y_true, y_true], [y_pred, y_pred])

print(metrics)
DatasetMatching(criterion='iou', thresh=0.5, fp=176, tp=74, fn=176, precision=0.296, 
    recall=0.296, accuracy=0.1737, f1=0.296, n_true=250, n_pred=250, 
    mean_true_score=0.19490, mean_matched_score=0.6584, panoptic_quality=0.1949, by_image=False)
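The IoU threshold can be changed via the thresh argument (visible in the output above), which is handy for evaluating accuracy over a range of matching thresholds:

from stardist.matching import matching_dataset

taus = [0.5, 0.6, 0.7, 0.8, 0.9]
stats = [matching_dataset([y_true, y_true], [y_pred, y_pred], thresh=t) for t in taus]

for t, s in zip(taus, stats):
    print(f"thresh={t:.1f}  accuracy={s.accuracy:.3f}  f1={s.f1:.3f}")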

Troubleshooting & Support

  1. Please first take a look at the frequently asked questions (FAQ).
  2. If you need further help, please go to the image.sc forum and check whether the issue you're having has already been discussed or solved by other people. If not, feel free to create a new topic there and make sure to use the tag stardist (we are monitoring all questions with this tag). When opening a new topic, please provide a clear and concise description that allows others to understand and ideally reproduce the issue you're having (e.g. include a code snippet, Python script, or Jupyter notebook).
  3. If you have a technical question related to the source code or believe you have found a bug, feel free to open an issue, but please first check whether someone has already created a similar issue.

Installation

If pip install stardist fails, it could be because there are no compatible wheels (.whl) for your platform (see list). In this case, pip tries to compile a C++ extension that our Python package relies on (see below). While this often works on Linux out of the box, it will likely fail on Windows and macOS without installing a suitable compiler. (Note that you can enforce compilation by installing via pip install stardist --no-binary stardist.)

Installation without using wheels requires Python 3.6 (or newer) and a working C++ compiler. We have only tested GCC (macOS, Linux), Clang (macOS), and Visual Studio (Windows 10). Please open an issue if you have problems that are not resolved by the information below.

If available, the C++ code will make use of OpenMP to exploit multiple CPU cores for substantially reduced runtime on modern CPUs. This can be important to prevent slow model training.

macOS

The default C/C++ compiler Clang that comes with the macOS command line tools (installed via xcode-select --install) does not support OpenMP out of the box, but it can be added. Alternatively, a suitable compiler can be installed from conda-forge. Please see this detailed guide for more information on both strategies (although written for scikit-image, it also applies here).

A third alternative (and what we did until StarDist 0.8.1) is to install the OpenMP-enabled GCC compiler via Homebrew with brew install gcc (e.g. installing gcc-12/g++-12 or newer). After that, you can build the package like this (adjust compiler names/paths as necessary):

CC=gcc-12 CXX=g++-12 pip install stardist

If you use conda on macOS and after import stardist see errors similar to Symbol not found: _GOMP_loop_nonmonotonic_dynamic_next, please see this issue for a temporary workaround.

If you encounter an ImportError: dlopen(...): symbol not found in flat namespace ... error on import stardist, you may try to install it like so:

brew install libomp

export HOMEBREW_PREFIX=/opt/homebrew  # set to your Homebrew prefix
export CPPFLAGS="$CPPFLAGS -Xpreprocessor -fopenmp"
export CFLAGS="$CFLAGS -I$HOMEBREW_PREFIX/opt/libomp/include"
export CXXFLAGS="$CXXFLAGS -I$HOMEBREW_PREFIX/opt/libomp/include"
export LDFLAGS="$LDFLAGS -Wl,-rpath,$HOMEBREW_PREFIX/opt/libomp/lib -L$HOMEBREW_PREFIX/opt/libomp/lib -lomp"

pip install stardist --no-binary :all:

Apple Silicon

As of StarDist 0.8.2, we provide arm64 wheels that should work with macOS on Apple Silicon (M1 chip or newer). We recommend setting up an arm64 conda environment with GPU-accelerated TensorFlow following Apple's instructions (ensure you are using macOS 12 Monterey or newer) using conda-forge miniforge3 or mambaforge. Then install stardist using pip.

conda create -y -n stardist-env python=3.9   
conda activate stardist-env
conda install -c apple tensorflow-deps
pip install tensorflow-macos tensorflow-metal
pip install stardist

Windows

Please install the Build Tools for Visual Studio 2019 (or newer) from Microsoft to compile extensions for Python 3.6+ (see this for further information). During installation, make sure to select the C++ build tools. Note that the compiler comes with OpenMP support.

Plugins for other software

ImageJ/Fiji

We currently provide an ImageJ/Fiji plugin that can be used to run pretrained StarDist models on 2D or 2D+time images. Installation and usage instructions can be found on the plugin page.

Napari

We made a plugin for the Python-based multi-dimensional image viewer napari. It directly uses the StarDist Python package and works for 2D and 3D images. Please see the code repository for further details.

QuPath

Inspired by the Fiji plugin, Pete Bankhead made a custom implementation of StarDist 2D for QuPath to use pretrained models. Please see this page for documentation and installation instructions.

Icy

Based on the Fiji plugin, Deborah Schmidt made a StarDist 2D plugin for Icy to use pretrained models. Please see the code repository for further details.

KNIME

Stefan Helfrich has modified the Fiji plugin to be compatible with KNIME. Please see this page for further details.

How to cite

@inproceedings{schmidt2018,
  author    = {Uwe Schmidt and Martin Weigert and Coleman Broaddus and Gene Myers},
  title     = {Cell Detection with Star-Convex Polygons},
  booktitle = {Medical Image Computing and Computer Assisted Intervention - {MICCAI} 
  2018 - 21st International Conference, Granada, Spain, September 16-20, 2018, Proceedings, Part {II}},
  pages     = {265--273},
  year      = {2018},
  doi       = {10.1007/978-3-030-00934-2_30}
}

@inproceedings{weigert2020,
  author    = {Martin Weigert and Uwe Schmidt and Robert Haase and Ko Sugawara and Gene Myers},
  title     = {Star-convex Polyhedra for 3D Object Detection and Segmentation in Microscopy},
  booktitle = {The IEEE Winter Conference on Applications of Computer Vision (WACV)},
  month     = {March},
  year      = {2020},
  doi       = {10.1109/WACV45572.2020.9093435}
}

@inproceedings{weigert2022,
  author    = {Martin Weigert and Uwe Schmidt},
  title     = {Nuclei Instance Segmentation and Classification in Histopathology Images with Stardist},
  booktitle = {The IEEE International Symposium on Biomedical Imaging Challenges (ISBIC)},
  year      = {2022},
  doi       = {10.1109/ISBIC56247.2022.9854534}
}

stardist's People

Contributors

ajinkya-kulkarni, bbquercus, constantinpape, fynnbe, hedgehogcode, kevinyamauchi, ksugar, maweigert, psobolewskiphd, qin-yu, romainguiet, uschmidt83, ximion, xqua


stardist's Issues

Labeling Ground Truth

Hi,
As the documentation says:

The training data that needs to be provided for StarDist consists of corresponding pairs of raw images and pixelwise annotated ground truth images (masks), where every pixel has a unique integer value indicating the object id (or 0 for background).

Can you please shed some light on how to do that for your own dataset? I can see a lot of annotation/masking tools, but I am having a hard time getting the required output.

Also, you said this is a subset of the Kaggle DSB 2018 dataset. How did you convert that format to this one?

Help is much appreciated.

Integer distances always?

In the file stardist.cpp, when you calculate the distances you always get an integer.
You calculate dist = sqrt(x*x + y*y) where x = n*cos(phi) and y = n*sin(phi), with n denoting the number of times the loop is repeated. Therefore
dist = sqrt(n^2 * (cos^2(phi) + sin^2(phi))) = n * sqrt(cos^2(phi) + sin^2(phi)) = n

I am not sure if this was intended.

  • In case it was intended, you can modify the code to just define dist=0 and do dist++ inside the while loop, so that you don't get numerical errors from the sin and cos calculations.

  • In case it was not intended and you want to calculate the distance to the pixel, you should calculate x_hat=round(n*cos(phi)) and y_hat=round(n*sin(phi)) and then define dist=sqrt(x_hat*x_hat + y_hat*y_hat)

For the matrix

[1 0]
[1 1]

for the 45° ray, you will currently get the following distances:

[1 0]
[1 1]

while I would expect to get (alternative I propose):

[sqrt(2) 0]
[sqrt(2) sqrt(2)]

It all depends on the case you prefer to have.

"TypeError: 'function' object is not iterable"

Hi,

I'm a little new to Python and I am trying to run the 2D sample training notebook using the sample data that was provided, but I keep getting the following error shown in the images below, and I'm not sure how to fix it. Can you please help?

Sincerely,
Meg

[error screenshots omitted]

pip install error - 'ascii' codec can't decode byte

Hi.
This might sound very petty, but I've been struggling with this now for a while.

TLDR: Previously working installation process fails on another machine with the same setup.

I installed stardist on my notebook without any issues, just to try it out on some sample data. I liked it quite a lot, so I decided to give it a bigger shot and run it on a (local) GPU server.
In both cases I set up a Python 3 virtual environment and simply installed tensorflow-gpu, then stardist, but in the second case it failed with an error:

pip3 install stardist
Obtaining file:///storage01/balassa/Stardist/stardist
ERROR: Command errored out with exit status 1:
command: /storage01/balassa/Stardist/Sd3/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/storage01/balassa/Stardist/stardist/setup.py'"'"'; file='"'"'/storage01/balassa/Stardist/stardist/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' egg_info
cwd: /storage01/balassa/Stardist/stardist/
Complete output (7 lines):
Traceback (most recent call last):
File "", line 1, in
File "/storage01/balassa/Stardist/stardist/setup.py", line 46, in
long_description = f.read()
File "/storage01/balassa/Stardist/Sd3/lib/python3.6/encodings/ascii.py", line 26, in decode
return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 769: ordinal not in range(128)
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

Could anyone help me please what might be the issue here?
In both cases I tried the process under Linux, with Python 3 and CUDA 9.0. I have also tried setting the environment variables to UTF-8.

Cannot find pre-trained model "versatile" from Fiji in `.h5` format

Hi,
the Fiji plugin of Stardist comes with two pre-trained models for 2D.
One is called "Versatile (fluorescent nuclei)", the other is from the DSB.

I cannot find the pre-trained model "Versatile" in this repo here as a .h5 to use it directly in Python. I looked in my Fiji installation and found a Model directory with two GenericNetwork.... subfolders. I assume these correspond to the two bundled Stardist models (maybe they are from a different plugin altogether), but they are saved in .pb (protocol buffer) format and I'm not sure how to convert them back to the .h5 used in the example notebooks.

Apologies if I overlooked something obvious.

Stardist 3D - predictions - The kernel appears to have died. It will restart automatically.

Hello guys! First, congrats on the complete and accessible pipeline that you provide for us, people with very little expertise in machine/deep learning. I am trying to predict 3D nuclei images using your Jupyter notebooks, and I find that the kernel dies at the line "labels, details = model.predict_instances(img)".

Before that, I trained my own model without compatibility problems. I used 16 labelled images (400 epochs, 100 steps, patch_size = 32, 96 rays, batch_size = 1, ...), with the IoU graph that I have attached.

IoU graph

Well, I would like to test the predictions with this model and maybe later improve the training by adding more images for better accuracy. Anyway, I first tested your 3D_demo model and later my own model, but both of them give the same error (the kernel dies).

How could I fix this problem to get the prediction results?

Thank you in advance.

Is there a way to make model.predict_instances run on the gpu?

I passed use_gpu = True when I created the model object inside my prediction script. When I run model.predict_instances on my data, my CPU jumps to 100% and it seems like my GPU is not being utilized at all.

Just to clarify I was able to train the model using my GPU so I know it's not an issue with my installation or hardware. I was just hoping I could predict on my data with my GPU to reduce computation time.


from stardist.models import Config3D, StarDist3D

conf = Config3D(
    use_gpu          = use_gpu,      # note: only enables OpenCL-based computations in the training data generator (requires gputools)
    n_channel_in     = n_channel,
)
# print(conf)
# vars(conf)

model = StarDist3D(conf, name='stardistAshley', basedir='models')

Thanks for any help!

CUDA_ERROR_OUT_OF_MEMORY: out of memory

Hi StarDist team,

After installing StarDist on our cluster and adjusting it to the requirements, there is this error appearing in the terminal while training the model:

2019-07-05 15:26:03.689271: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at cwise_ops_common.cc:70 : Resource exhausted: OOM when allocating tensor with shape[64] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc
2019-07-05 15:26:03.691201: E tensorflow/stream_executor/cuda/cuda_driver.cc:806] failed to allocate 7.67M (8041216 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory"

So far I have tried to modify the notebook so that GPU memory usage can grow up to 90%:

config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.9
session = tf.Session(config=config, ...)

which results in another error.
Any help in finding a solution would be highly appreciated.

running notebook 1

I am trying to run the first few cells from notebook 1 in the 2D examples folder. It works as expected on the data provided on the GitHub page, but there is no output when I try to run it on my data. I'm probably making some stupid mistake, but I haven't been able to figure it out. I make sure that my mask array is integer valued (I just have 0 and 1) with dtype = np.uint16, but the algorithm doesn't give any output. I am attaching my data and notebook.

https://we.tl/t-utCAjfFVNi

Different Keras versions

Hey,

I haven't found anything about the required Keras version, so I wanted to note that I got an error with Keras v2.3.0 at the end of the first epoch ('float' object has no attribute 'item' at line 305 in csbdeep/utils/tf.py).
Forcing Keras back to version 2.2.5 solved the issue.

Compilation with GCC 9 fails

When I compile (or rather pip compiles) with GCC 9, stardist cannot be imported:

ImportError: /home/pape/Work/software/conda/miniconda3/envs/antibodies-gpu/lib/python3.7/site-packages/stardist/lib/stardist2d.cpython-37m-x86_64-linux-gnu.so: symbol GOMP_loop_nonmonotonic_dynamic_start, version GOMP_4.5 not defined in file libgomp.so.1
 with link time reference

Compiling with GCC 7 works. (Haven't tried any others)

I guess it's not really necessary to fix this, but maybe add something about this in the readme.

Extension to TXY images for batch processing

Hi,

Thank you very much for this wonderful software and easy to follow Jupyter notebooks.

I took some code from you and wrote a simple batch processing mode that reads image data and writes labelled data to a directory for XY-shaped images. It would be helpful if the functionality could also be extended to TXY-shaped images, where it would read images as stacks and write them as stacks to the results directory.

My notebook is here: https://github.com/kapoorlab/StardistforCurie/blob/master/BatchProcessing2D.ipynb

Do you have an extension for TXY shape images already? It would help a lot in cell segmentation for time-lapse images of cells/nuclei. If you have this code somewhere, kindly share it with me.

Thanks again.

How do you make the median object size within the field of view of the NN?

Hi,

When I run the 2D training notebook using my own data, I get the following warning:
"WARNING: median object size larger than field of view of the neural network."

What does this mean exactly, and how can I make it so that the median object size is within the field of view of the neural network? Is it an issue with the data I am trying to train the model with? When I run the training notebook all the way through, the IoU is 0, and I think the reason is that the median object size is larger than the field of view of the neural network.

Sincerely,
Meg

malloc error: pointer being freed was not allocated

Hello everybody,

I am a biologist - who enjoys learning about programming, so I know Python a little - and I would love to use StarDist-3D to segment some images of densely packed tissues that I am studying. I tried to import stardist to play around with it, but I immediately failed at the very first line of code and I do not know how to solve this issue.
I imported tensorflow and stardist as indicated on your GitHub, together with GCC compiler.
However, when I run a .py file in the terminal, I cannot even import stardist; I get this error:

Python(70696,0x109979dc0) malloc: *** error for object 0x144ef8d20: pointer being freed was not allocated Python(70696,0x109979dc0) malloc: *** set a breakpoint in malloc_error_break to debug

So, my naive questions are:
Do you know what I am doing wrong and how I can solve it?
Did anybody have the same issue?

I am using MacOS 10.15 and Python 3.7.3

I hope I will be able to use stardist soon!
Best,

Lucrezia

Error during prediction on channel=3 RGB tiff images

After successfully addressing #5, a potential follow-up error occurred. When calling prob, dist = model_no_sc.predict(img), the following error message is displayed:

UnboundLocalError                         Traceback (most recent call last)
<timed exec> in <module>

/usr/local/lib/python3.5/dist-packages/stardist/model.py in predict(self, img, resizer, **predict_kwargs)
    454         if img.ndim == 2:
    455             x = np.expand_dims(img,channel)
--> 456         self.config.n_channel_in == x.shape[channel] or _raise(ValueError())
    457 
    458         # resize: make divisible by power of 2 to allow downsampling steps in unet

UnboundLocalError: local variable 'x' referenced before assignment

Dataset TOY

Hi,

I am working on some changes to your network to adapt it to our own circumstances. I would like to test my changes on your TOY dataset, or at least on similar pictures. Is it possible to share the code to generate this dataset, or to share the dataset itself?

Best,
Niklas

anisotropy of training dataset

I am closing this issue as I found the answer to my initial question (as below).

I am a bit confused with the anisotropy definition in the training configuration.

How to get StarDist ROIs/Labels from DeepImageJ outputs?

Hi @uschmidt83 , Hi @maweigert,

When using DeepImageJ, the outputs of the StarDist model are the probability maps and the 32 radial distances (see the solution by @esgomezm here).

# Input
model_paper.keras_model.input[0]

# Output 1: the object probability (normalized euclidean distance to the nearest background point)
model_paper.keras_model.output[0]

# Output 2: 32 radial distances to the boundary of the object
model_paper.keras_model.output[1]

From the StarDist prediction : https://github.com/mpicbg-csbd/stardist/blob/6e7d053bafcf83a6fca47f81469c201b79e993b1/stardist/models/model2d.py#L375

it seems that it would need to get the coord from the distance
https://github.com/mpicbg-csbd/stardist/blob/6e7d053bafcf83a6fca47f81469c201b79e993b1/stardist/models/model2d.py#L380

then get the points , using proba + coord
https://github.com/mpicbg-csbd/stardist/blob/6e7d053bafcf83a6fca47f81469c201b79e993b1/stardist/models/model2d.py#L381

to finally get the labels :
https://github.com/mpicbg-csbd/stardist/blob/6e7d053bafcf83a6fca47f81469c201b79e993b1/stardist/models/model2d.py#L383

The goal is to use StarDist in ImageJ, and I am wondering if I am on the right path or if I missed something obvious that would give me the labels without having to rewrite these functions in Groovy.

Furthermore, in a related issue we were discussing a StarDist plugin for Fiji. Is it still under development?
(feel free to close/merge this issue if it's too close to the one mentioned above)

Thank you for your inputs,

Romain

Kernel dies during run of 3D example

When running the 3D example, the python kernel of the jupyter notebook dies. No further error is raised.
This happens when executing the following line of the file 3_prediction:
labels, details = model.predict_instances(img)
and the following line of file 2_training:
model.optimize_thresholds(X_val, Y_val)
In both cases, the supplied 3D_demo model was used.
The 2D example runs without error.

Could it be that the model file is corrupt?

Problem with installing via pip

Hi,

I just wanted to give this cool tool a try, but ran into an issue where I have no real idea what is required to solve it. I am using Windows 7 64-bit and Python (Anaconda) 3.6.5.

Here is the output:

(base) C:\Users\M1SRH>pip install stardist
Collecting stardist
  Downloading https://files.pythonhosted.org/packages/b7/42/c605d74bd3ce36972f6e
74b67f7b261bb626f805f7da4309602105e9250d/stardist-0.1.0.tar.gz (46kB)
    100% |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 51kB 812kB/s
Collecting csbdeep (from stardist)
  Downloading https://files.pythonhosted.org/packages/55/e7/ff870322ee3645733acc
5c252b238321caef4c71671a5ca3e37b030735fa/csbdeep-0.1.1-py2.py3-none-any.whl (50k
B)
    100% |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 51kB 867kB/s
Requirement already satisfied: scikit-image in c:\anaconda3\lib\site-packages (f
rom stardist) (0.13.1)
Collecting keras>=2.0.7 (from csbdeep->stardist)
  Downloading https://files.pythonhosted.org/packages/68/12/4cabc5c01451eb3b413d
19ea151f36e33026fc0efb932bf51bcaf54acbf5/Keras-2.2.0-py2.py3-none-any.whl (300kB
)
    100% |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 307kB 3.3MB/s
Requirement already satisfied: six in c:\anaconda3\lib\site-packages (from csbde
ep->stardist) (1.11.0)
Requirement already satisfied: tifffile in c:\anaconda3\lib\site-packages (from
csbdeep->stardist) (2018.2.18)
Requirement already satisfied: scipy in c:\anaconda3\lib\site-packages (from csb
deep->stardist) (1.1.0)
Requirement already satisfied: matplotlib in c:\anaconda3\lib\site-packages (fro
m csbdeep->stardist) (2.2.2)
Requirement already satisfied: numpy in c:\anaconda3\lib\site-packages (from csb
deep->stardist) (1.14.3)
Collecting tqdm (from csbdeep->stardist)
  Downloading https://files.pythonhosted.org/packages/93/24/6ab1df969db228aed36a
648a8959d1027099ce45fad67532b9673d533318/tqdm-4.23.4-py2.py3-none-any.whl (42kB)

    100% |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 51kB 1.2MB/s
Requirement already satisfied: networkx>=1.8 in c:\anaconda3\lib\site-packages (
from scikit-image->stardist) (2.1)
Requirement already satisfied: pillow>=2.1.0 in c:\anaconda3\lib\site-packages (
from scikit-image->stardist) (5.1.0)
Requirement already satisfied: PyWavelets>=0.4.0 in c:\anaconda3\lib\site-packag
es (from scikit-image->stardist) (0.5.2)
Collecting keras-preprocessing==1.0.1 (from keras>=2.0.7->csbdeep->stardist)
  Downloading https://files.pythonhosted.org/packages/f8/33/275506afe1d96b221f66
f95adba94d1b73f6b6087cfb6132a5655b6fe338/Keras_Preprocessing-1.0.1-py2.py3-none-
any.whl
Collecting keras-applications==1.0.2 (from keras>=2.0.7->csbdeep->stardist)
  Downloading https://files.pythonhosted.org/packages/e2/60/c557075e586e968d7a9c
314aa38c236b37cb3ee6b37e8d57152b1a5e0b47/Keras_Applications-1.0.2-py2.py3-none-a
ny.whl (43kB)
    100% |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 51kB 2.1MB/s
Requirement already satisfied: pyyaml in c:\anaconda3\lib\site-packages (from ke
ras>=2.0.7->csbdeep->stardist) (3.12)
Requirement already satisfied: h5py in c:\anaconda3\lib\site-packages (from kera
s>=2.0.7->csbdeep->stardist) (2.7.1)
Requirement already satisfied: cycler>=0.10 in c:\anaconda3\lib\site-packages (f
rom matplotlib->csbdeep->stardist) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in c:\an
aconda3\lib\site-packages (from matplotlib->csbdeep->stardist) (2.2.0)
Requirement already satisfied: python-dateutil>=2.1 in c:\anaconda3\lib\site-pac
kages (from matplotlib->csbdeep->stardist) (2.7.3)
Requirement already satisfied: pytz in c:\anaconda3\lib\site-packages (from matp
lotlib->csbdeep->stardist) (2018.4)
Requirement already satisfied: kiwisolver>=1.0.1 in c:\anaconda3\lib\site-packag
es (from matplotlib->csbdeep->stardist) (1.0.1)
Requirement already satisfied: decorator>=4.1.0 in c:\anaconda3\lib\site-package
s (from networkx>=1.8->scikit-image->stardist) (4.3.0)
Requirement already satisfied: setuptools in c:\anaconda3\lib\site-packages (fro
m kiwisolver>=1.0.1->matplotlib->csbdeep->stardist) (39.1.0)
Building wheels for collected packages: stardist
  Running setup.py bdist_wheel for stardist ... error
  Complete output from command c:\Anaconda3\python.exe -u -c "import setuptools,
 tokenize;__file__='C:\\Temp\\pip-install-l_bum2k4\\stardist\\setup.py';f=getatt
r(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close(
);exec(compile(code, __file__, 'exec'))" bdist_wheel -d C:\Temp\pip-wheel-u93vsw
zo --python-tag cp36:
  c:\Anaconda3\lib\distutils\dist.py:261: UserWarning: Unknown distribution opti
on: 'long_description_content_type'
    warnings.warn(msg)
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-3.6
  creating build\lib.win-amd64-3.6\stardist
  copying stardist\model.py -> build\lib.win-amd64-3.6\stardist
  copying stardist\nms.py -> build\lib.win-amd64-3.6\stardist
  copying stardist\plot.py -> build\lib.win-amd64-3.6\stardist
  copying stardist\utils.py -> build\lib.win-amd64-3.6\stardist
  copying stardist\version.py -> build\lib.win-amd64-3.6\stardist
  copying stardist\__init__.py -> build\lib.win-amd64-3.6\stardist
  running build_ext
  No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying fro
m distutils
  building 'stardist.lib.stardist' extension
  creating build\temp.win-amd64-3.6
  creating build\temp.win-amd64-3.6\Release
  creating build\temp.win-amd64-3.6\Release\stardist
  creating build\temp.win-amd64-3.6\Release\stardist\lib
  cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MT -Ic:\Anaconda3\lib\site-packages\nu
mpy\core\include -Ic:\Anaconda3\include -Ic:\Anaconda3\include -I"C:\Program Fil
es (x86)\Microsoft Visual Studio 14.0\VC\Include" -I"C:\Program Files (x86)\Wind
ows Kits\NETFXSDK\4.6.1\include\um" /EHsc /Tpstardist/lib/stardist.cpp /Fobuild\
temp.win-amd64-3.6\Release\stardist/lib/stardist.obj

  error: Command "cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MT -Ic:\Anaconda3\lib\
site-packages\numpy\core\include -Ic:\Anaconda3\include -Ic:\Anaconda3\include -
I"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\Include" -I"C:\Program
Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um" /EHsc /Tpstardist/lib/stardi
st.cpp /Fobuild\temp.win-amd64-3.6\Release\stardist/lib/stardist.obj" failed wit
h exit status 127

  ----------------------------------------
  Failed building wheel for stardist
  Running setup.py clean for stardist
Failed to build stardist
distributed 1.21.8 requires msgpack, which is not installed.
Installing collected packages: keras-preprocessing, keras-applications, keras, t
qdm, csbdeep, stardist
  Running setup.py install for stardist ... error
    Complete output from command c:\Anaconda3\python.exe -u -c "import setuptool
s, tokenize;__file__='C:\\Temp\\pip-install-l_bum2k4\\stardist\\setup.py';f=geta
ttr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.clos
e();exec(compile(code, __file__, 'exec'))" install --record C:\Temp\pip-record-c
od_chcp\install-record.txt --single-version-externally-managed --compile:
    c:\Anaconda3\lib\distutils\dist.py:261: UserWarning: Unknown distribution op
tion: 'long_description_content_type'
      warnings.warn(msg)
    running install
    running build
    running build_py
    creating build
    creating build\lib.win-amd64-3.6
    creating build\lib.win-amd64-3.6\stardist
    copying stardist\model.py -> build\lib.win-amd64-3.6\stardist
    copying stardist\nms.py -> build\lib.win-amd64-3.6\stardist
    copying stardist\plot.py -> build\lib.win-amd64-3.6\stardist
    copying stardist\utils.py -> build\lib.win-amd64-3.6\stardist
    copying stardist\version.py -> build\lib.win-amd64-3.6\stardist
    copying stardist\__init__.py -> build\lib.win-amd64-3.6\stardist
    running build_ext
    No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying f
rom distutils
    building 'stardist.lib.stardist' extension
    creating build\temp.win-amd64-3.6
    creating build\temp.win-amd64-3.6\Release
    creating build\temp.win-amd64-3.6\Release\stardist
    creating build\temp.win-amd64-3.6\Release\stardist\lib
    cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MT -Ic:\Anaconda3\lib\site-packages\
numpy\core\include -Ic:\Anaconda3\include -Ic:\Anaconda3\include -I"C:\Program F
iles (x86)\Microsoft Visual Studio 14.0\VC\Include" -I"C:\Program Files (x86)\Wi
ndows Kits\NETFXSDK\4.6.1\include\um" /EHsc /Tpstardist/lib/stardist.cpp /Fobuil
d\temp.win-amd64-3.6\Release\stardist/lib/stardist.obj

    error: Command "cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MT -Ic:\Anaconda3\li
b\site-packages\numpy\core\include -Ic:\Anaconda3\include -Ic:\Anaconda3\include
 -I"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\Include" -I"C:\Progra
m Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um" /EHsc /Tpstardist/lib/star
dist.cpp /Fobuild\temp.win-amd64-3.6\Release\stardist/lib/stardist.obj" failed w
ith exit status 127

    ----------------------------------------
Command "c:\Anaconda3\python.exe -u -c "import setuptools, tokenize;__file__='C:
\\Temp\\pip-install-l_bum2k4\\stardist\\setup.py';f=getattr(tokenize, 'open', op
en)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, _
_file__, 'exec'))" install --record C:\Temp\pip-record-cod_chcp\install-record.t
xt --single-version-externally-managed --compile" failed with error code 1 in C:
\Temp\pip-install-l_bum2k4\stardist\

(base) C:\Users\M1SRH>

Prediction with tiling fails after successful training

The training of the model seems to have worked, but when I load an image stack for prediction (+ tiling), the error message printed below appears. The demo model works fine with the test image stack, so it may have something to do with the model.
Any help in explaining or circumventing this problem would be highly appreciated.

Traceback (most recent call last):
File "prediction_file_name", line 65, in
labels, details = model.predict_instances(img, n_tiles=(1,4,4))
File "/home/isim/anaconda3/lib/python3.7/site-packages/stardist/models/base.py", line 364, in predict_instances
prob, dist = self.predict(img, axes=axes, normalizer=normalizer, n_tiles=n_tiles, show_tile_progress=show_tile_progress, **predict_kwargs)
File "/home/isim/anaconda3/lib/python3.7/site-packages/stardist/models/base.py", line 279, in predict
axes_net_tile_overlaps = self._axes_tile_overlap(axes_net)
File "/home/isim/anaconda3/lib/python3.7/site-packages/stardist/models/base.py", line 448, in _axes_tile_overlap
self._tile_overlap = self._compute_receptive_field()
File "/home/isim/anaconda3/lib/python3.7/site-packages/stardist/models/base.py", line 440, in _compute_receptive_field
return [(m-np.min(i), np.max(i)-m) for (m,i) in zip(mid,ind)]
File "/home/isim/anaconda3/lib/python3.7/site-packages/stardist/models/base.py", line 440, in
return [(m-np.min(i), np.max(i)-m) for (m,i) in zip(mid,ind)]
File "/home/isim/anaconda3/lib/python3.7/site-packages/numpy/core/fromnumeric.py", line 2618, in amin
initial=initial)
File "/home/isim/anaconda3/lib/python3.7/site-packages/numpy/core/fromnumeric.py", line 86, in _wrapreduction
return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
ValueError: zero-size array to reduction operation minimum which has no identity

Stardist prediction allocates full gpu

I have noticed that stardist prediction allocates all memory on the gpu.
As far as I understand, this is the default behaviour of tensorflow, but can be changed:
https://stackoverflow.com/questions/34199233/how-to-prevent-tensorflow-from-allocating-the-totality-of-a-gpu-memory

What would be the best way to do this when using stardist?

(I think for our use case, where we only predict one image at a time, only a small fraction of the GPU memory is used and the task is CPU-bound, so the GPU could be used for other tasks concurrently if we don't allocate all memory.)

Issue importing StarDist into Python/Jupyter

Hi!

I get the following error when trying to import modules from stardist.

ImportError: dlopen(/usr/local/Caskroom/miniconda/base/envs/tf/lib/python3.7/site-packages/stardist/lib/stardist2d.cpython-37m-darwin.so, 2): Symbol not found: _GOMP_loop_nonmonotonic_dynamic_next Referenced from: /usr/local/Caskroom/miniconda/base/envs/tf/lib/python3.7/site-packages/stardist/lib/stardist2d.cpython-37m-darwin.so Expected in: flat namespace in /usr/local/Caskroom/miniconda/base/envs/tf/lib/python3.7/site-packages/stardist/lib/stardist2d.cpython-37m-darwin.so

I have followed the installation steps as per the guidelines.

Anyone have any idea how to fix this issue?

UnicodeDecodeError during installation

Hi Uwe/Martin,

I am trying to install stardist on a CentOS system with python-3.5, keras-2.2.4, tensorflow-1.6.0. I am getting a UnicodeDecode error after trying to install stardist via pip. The error message is the following:

(tensorflowNEW) -bash-4.2$ pip install stardist
Collecting stardist
Using cached https://files.pythonhosted.org/packages/88/d5/51972b9c6e210a34c44c263a1ac60d525271c5d90a802df5ecc9d66287fa/stardist-0.3.3.tar.gz
ERROR: Command errored out with exit status 1:
command: /data/u934/service_imagerie/v_kapoor/anaconda2/envs/tensorflowNEW/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-d3zbl816/stardist/setup.py'"'"'; file='"'"'/tmp/pip-install-d3zbl816/stardist/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
cwd: /tmp/pip-install-d3zbl816/stardist/
Complete output (7 lines):
Traceback (most recent call last):
File "", line 1, in
File "/tmp/pip-install-d3zbl816/stardist/setup.py", line 46, in
long_description = f.read()
File "/data/u934/service_imagerie/v_kapoor/anaconda2/envs/tensorflowNEW/lib/python3.5/encodings/ascii.py", line 26, in decode
return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 769: ordinal not in range(128)
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

pre-trained model of 3D STARdist

Hi Schmidt,

Thank you for this great work! I am wondering if you can upload the pre-trained model of 3D STARDist. I have some unlabeled data to process and want to test them with your model.

Best,
Xin

NMS implementation does not scale

I've proposed a solution to this issue in a pull request #40 .

The NMS implementation scales with the square of the number of candidate polygons, i.e. for N candidate polygons the average case is still O(N**2). This is acceptable in 2D because comparing two candidate polygons for overlap, even if they must be rendered completely, is fast. This is also acceptable in 3D for sufficiently small N since the inner loop of the algorithm is parallelized over threads and some smart lower/upper bounds are used to potentially identify a suppression without rendering the full polygons.

However, some potentially common and impactful use cases do not have sufficiently small N. The 6 days post fertilization larval zebrafish brain, a common laboratory organism, has on the order of 100,000 neurons. See below an example network which, with a probability threshold optimized for a validation dataset, produced 6.32e6 polygon candidates. Making matters worse, at typical confocal or light sheet microscope sampling rates, individual nuclei in whole brain images of this organism comprise relatively few voxels. In that case, to ensure sufficiently accurate boundary reconstruction, the step size used for polygon rendering must be reduced, slowing down reconstruction time for each instance. Instance segmentation in that case took 32 cores 8 hours to complete, 7.25 of which are from the NMS. On our cluster system, this works out to about $20 of chargeback cost.

However, for instance segmentation over such a large field of view, the overwhelming majority of polygon candidate comparisons cannot possibly result in a suppression, as the candidates do not overlap at all - ideally such comparisons should be ignored. In my pull request I propose a solution which produces the same segmentation, but prevents these unproductive candidate comparisons and reduces the total execution time to 40 minutes, only 21 of which are from the NMS.

See lines:
NMS: n_polys = 6315439
NMS: using OpenMP with 64 thread(s)
NMS took 25817.3547 s

Object boundary reconstruction quality

Hi,

first of all thanks for such a great tool! No more problems with object boundaries :) Also appreciate the great jupyter notebooks to get started.

I was considering using stardist for 2D cell segmentation with the aim of analysing cell shapes (fluorescent label). In a first try I noticed that the segmentation worked pretty well and the cells got nicely separated from each other, which is incredibly useful. However, the exact cell shapes do not seem to be recovered very precisely (despite being clearly star-convex).

If I understood the method correctly, the object boundaries are defined by the rays of the single highest-scoring pixel within the cell (after NMS), which at least in my case might not be too precise. What do you think would be a good way to optimise the final shapes I get within the stardist framework? E.g., would you expect the ray distances to the borders to become reasonably precise with extensive training (so far I have 20 cells labeled and use 128 rays with a (2,2) grid)? Otherwise, of course, there would be the option to use the stardist output as watershed seeds.

Thanks for any hints or suggestions :)

Cheers,
Marvin

Import Error - dlopen Expected in: flat namespace

Hello,

I've been trying to install stardist today.
Mac OSX Mojave 10.14.4

Anaconda 3.6.7
gcc-9 and g++-9 compilers

I run "pip install stardist":

| => pip install stardist
Collecting stardist
Requirement already satisfied: csbdeep>=0.4.0 in /Users/guest/anaconda3/lib/python3.6/site-packages (from stardist) (0.4.1)
Requirement already satisfied: scikit-image in /Users/guest/anaconda3/lib/python3.6/site-packages (from stardist) (0.15.0)
Requirement already satisfied: numba in /Users/guest/anaconda3/lib/python3.6/site-packages (from stardist) (0.43.1)
Requirement already satisfied: numpy in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (1.16.4)
Requirement already satisfied: matplotlib in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (3.1.0)
Requirement already satisfied: keras>=2.1.2 in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (2.2.2)
Requirement already satisfied: tifffile in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (0.15.1)
Requirement already satisfied: tqdm in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (4.32.1)
Requirement already satisfied: h5py in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (2.9.0)
Requirement already satisfied: scipy in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (1.3.0)
Requirement already satisfied: six in /Users/guest/anaconda3/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (1.12.0)
Requirement already satisfied: networkx>=2.0 in /Users/guest/anaconda3/lib/python3.6/site-packages (from scikit-image->stardist) (2.3)
Requirement already satisfied: pillow>=4.3.0 in /Users/guest/anaconda3/lib/python3.6/site-packages (from scikit-image->stardist) (6.0.0)
Requirement already satisfied: imageio>=2.0.1 in /Users/guest/anaconda3/lib/python3.6/site-packages (from scikit-image->stardist) (2.5.0)
Requirement already satisfied: PyWavelets>=0.4.0 in /Users/guest/anaconda3/lib/python3.6/site-packages (from scikit-image->stardist) (1.0.3)
Requirement already satisfied: llvmlite>=0.28.0dev0 in /Users/guest/anaconda3/lib/python3.6/site-packages (from numba->stardist) (0.28.0)
Requirement already satisfied: python-dateutil>=2.1 in /Users/guest/anaconda3/lib/python3.6/site-packages (from matplotlib->csbdeep>=0.4.0->stardist) (2.8.0)
Requirement already satisfied: cycler>=0.10 in /Users/guest/anaconda3/lib/python3.6/site-packages (from matplotlib->csbdeep>=0.4.0->stardist) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /Users/guest/anaconda3/lib/python3.6/site-packages (from matplotlib->csbdeep>=0.4.0->stardist) (2.4.0)
Requirement already satisfied: kiwisolver>=1.0.1 in /Users/guest/anaconda3/lib/python3.6/site-packages (from matplotlib->csbdeep>=0.4.0->stardist) (1.1.0)
Requirement already satisfied: keras-preprocessing==1.0.2 in /Users/guest/anaconda3/lib/python3.6/site-packages (from keras>=2.1.2->csbdeep>=0.4.0->stardist) (1.0.2)
Requirement already satisfied: pyyaml in /Users/guest/anaconda3/lib/python3.6/site-packages (from keras>=2.1.2->csbdeep>=0.4.0->stardist) (5.1)
Requirement already satisfied: keras-applications==1.0.4 in /Users/guest/anaconda3/lib/python3.6/site-packages (from keras>=2.1.2->csbdeep>=0.4.0->stardist) (1.0.4)
Requirement already satisfied: decorator>=4.3.0 in /Users/guest/anaconda3/lib/python3.6/site-packages (from networkx>=2.0->scikit-image->stardist) (4.4.0)
Requirement already satisfied: setuptools in /Users/guest/anaconda3/lib/python3.6/site-packages (from kiwisolver>=1.0.1->matplotlib->csbdeep>=0.4.0->stardist) (41.0.1)
Installing collected packages: stardist
Successfully installed stardist-0.3.4

Then on command: python -c "import stardist" I get:

Traceback (most recent call last):
File "", line 1, in
File "/Users/guest/anaconda3/lib/python3.6/site-packages/stardist/init.py", line 8, in
from .geometry import star_dist, polygons_to_label, relabel_image_stardist, ray_angles, dist_to_coord
File "/Users/guest/anaconda3/lib/python3.6/site-packages/stardist/geometry/init.py", line 5, in
from .geom2d import star_dist, polygons_to_label, relabel_image_stardist, ray_angles, dist_to_coord
File "/Users/guest/anaconda3/lib/python3.6/site-packages/stardist/geometry/geom2d.py", line 11, in
from ..lib.stardist2d import c_star_dist
ImportError: dlopen(/Users/guest/anaconda3/lib/python3.6/site-packages/stardist/lib/stardist2d.cpython-36m-darwin.so, 2): Symbol not found: _GOMP_loop_nonmonotonic_dynamic_next
Referenced from: /Users/guest/anaconda3/lib/python3.6/site-packages/stardist/lib/stardist2d.cpython-36m-darwin.so
Expected in: flat namespace
in /Users/guest/anaconda3/lib/python3.6/site-packages/stardist/lib/stardist2d.cpython-36m-darwin.so

Any idea what I can do here? Happy to provide further information.

Thank you for the help!

Problems loading new model in Fiji

Hi all -

Just had a first glimpse at StarDist 2D and its performance is outstanding. Really happy with it :)

While I managed to train and use a 2D model for prediction (using the Jupyter notebooks), I am struggling to load the model with the Fiji plugin. The generic models that ship with the plugin work without problems.

I simply zipped the content of the respective model folder I generated with the notebooks and tried to load it. An error as follows was returned:

[INFO] Using default TensorFlow version from JAR: TF 1.12.0 CPU
[INFO] Loading TensorFlow model GenericNetwork_f1776fcd341bf3d6cbb63ac2c6fff03c from source file file:/scratch/models/Markus_ScanR_stardist/stardist_scanR.zip
[INFO] Unpacking config.json
[INFO] Unpacking events.out.tfevents.1585666494.ffbc03565224
[INFO] Unpacking thresholds.json
[INFO] Unpacking weights_best.h5
[INFO] Unpacking weights_last.h5
org.tensorflow.TensorFlowException: Could not find SavedModel .pb or .pbtxt at supplied export directory path: /home/tboothe/Fiji.app/models/GenericNetwork_f1776fcd341bf3d6cbb63ac2c6fff03c
	at org.tensorflow.SavedModelBundle.load(Native Method)
	at org.tensorflow.SavedModelBundle.access$000(SavedModelBundle.java:27)
	at org.tensorflow.SavedModelBundle$Loader.load(SavedModelBundle.java:32)
	at org.tensorflow.SavedModelBundle.load(SavedModelBundle.java:95)
	at net.imagej.tensorflow.CachedModelBundle.<init>(CachedModelBundle.java:44)
	at net.imagej.tensorflow.DefaultTensorFlowService.loadCachedModel(DefaultTensorFlowService.java:135)
	at de.csbdresden.csbdeep.network.model.tensorflow.TensorFlowNetwork.loadModel(TensorFlowNetwork.java:135)
	at de.csbdresden.csbdeep.network.model.DefaultNetwork.loadModel(DefaultNetwork.java:48)
	at de.csbdresden.csbdeep.network.DefaultModelLoader.loadNetwork(DefaultModelLoader.java:41)
	at de.csbdresden.csbdeep.network.DefaultModelLoader.run(DefaultModelLoader.java:20)
	at de.csbdresden.csbdeep.commands.GenericCoreNetwork.tryToPrepareInputAndNetwork(GenericCoreNetwork.java:523)
	at de.csbdresden.csbdeep.commands.GenericCoreNetwork.initiateModelIfNeeded(GenericCoreNetwork.java:303)
	at de.csbdresden.csbdeep.commands.GenericCoreNetwork.mainThread(GenericCoreNetwork.java:445)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.lang.NullPointerException
	at de.csbdresden.csbdeep.network.DefaultInputValidator.checkForTooManyDimensions(DefaultInputValidator.java:32)
	at de.csbdresden.csbdeep.network.DefaultInputValidator.run(DefaultInputValidator.java:18)
	at de.csbdresden.csbdeep.commands.GenericCoreNetwork.tryToPrepareInputAndNetwork(GenericCoreNetwork.java:526)
	at de.csbdresden.csbdeep.commands.GenericCoreNetwork.initiateModelIfNeeded(GenericCoreNetwork.java:303)
	at de.csbdresden.csbdeep.commands.GenericCoreNetwork.mainThread(GenericCoreNetwork.java:445)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[INFO] CSBDeep plugin exit (took 3643 milliseconds)
[ERROR] Module threw exception
java.lang.NullPointerException
	at de.csbdresden.stardist.StarDist2D.splitPrediction(StarDist2D.java:320)
	at de.csbdresden.stardist.StarDist2D.run(StarDist2D.java:292)
	at org.scijava.command.CommandModule.run(CommandModule.java:199)
	at org.scijava.module.ModuleRunner.run(ModuleRunner.java:168)
	at org.scijava.module.ModuleRunner.call(ModuleRunner.java:127)
	at org.scijava.module.ModuleRunner.call(ModuleRunner.java:66)
	at org.scijava.thread.DefaultThreadService.lambda$wrap$2(DefaultThreadService.java:228)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Am I reading this correctly that Fiji still tries to load the generic model, even though I am specifying the zip file of my own model?

Thanks a lot in advance for some feedback on that matter.
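
Note: the loader in the log above explicitly looks for a TensorFlow SavedModel (.pb), while the files unpacked from the zip (config.json, thresholds.json, weights_best.h5, …) are those of a stardist Python model folder. As a quick sanity check that the zip contains a readable model, it can be reloaded in Python; a minimal sketch, where the name and base directory are placeholders for wherever the folder with config.json and weights_best.h5 actually lives:

from stardist.models import StarDist2D

# Reload the trained model from its (unzipped) folder to inspect it.
# 'Markus_ScanR_stardist' and '/scratch/models' are placeholders for the actual
# location of the folder that contains config.json and weights_best.h5.
model = StarDist2D(None, name='Markus_ScanR_stardist', basedir='/scratch/models')
print(model.config)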

Build issue on Mac

I'm running into some gcc build errors on Mac. Debugging is a bit complicated for me as I have very limited access to the Mac.

Background:

I tried to install stardist on a facility user's Mac, as I had built a workflow for the user that incorporates StarDist. Given the current COVID situation, I tried to do this remotely, and I do not otherwise have access to a Mac at the moment. Since I can only access the machine for short periods of time, it would be nice to get an idea of what would be worth trying.

Steps taken:

  • Create a new conda environment with Python 3.6
  • Activate the new environment and install TensorFlow: conda install -c conda-forge tensorflow=1.15
  • pip install stardist
  • The build fails when compiling libqhull with error: invalid argument '-std=c++11' not allowed with 'C/ObjC'. I will include the full console log at the end.

I googled the error message and found a number of projects with similar errors (going back a while); some seem to be related to the LLVM version.
When installing tensorflow from conda-forge, the following package is pulled in:

llvm-openmp        conda-forge/osx-64::llvm-openmp-10.0.0-h28b9765_0

During pip install stardist, the following version of llvmlite is also collected:

Collecting llvmlite<=0.33.0.dev0,>=0.31.0.dev0
  Using cached llvmlite-0.32.0-cp36-cp36m-macosx_10_9_x86_64.whl (15.9 MB)

I am not sure whether they might be shadowing each other.

What to try?

Has anyone encountered this before? As mentioned, I only have limited time to experiment on the machine in question.
What I was going to try next time is:

Other suggestions are welcome.
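
One workaround that is often suggested for this kind of clang/OpenMP failure (an untested sketch, not something from the original report) is to install the conda-forge compiler toolchain into the environment, so that a clang with OpenMP support is used instead of Apple's default toolchain, and then rebuild stardist:

# untested workaround sketch: build stardist with conda-forge's compilers (which ship OpenMP)
conda install -c conda-forge compilers llvm-openmp
# re-activate the environment so the compiler activation scripts set CC/CXX
conda deactivate && conda activate stardist_test
pip install --no-cache-dir stardist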

Full console log (create environment, install tensorflow, install stardist)

(fibre) RICO:~ cfrederi$ conda create -n stardist_test python=3.6
Collecting package metadata (current_repodata.json): done
Solving environment: done


==> WARNING: A newer version of conda exists. <==
  current version: 4.7.10
  latest version: 4.8.3

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: //anaconda3/envs/stardist_test

  added / updated specs:
    - python=3.6


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    libffi-3.2.1               |       h0a44026_6          43 KB
    ncurses-6.2                |       h0a44026_1         749 KB
    openssl-1.1.1g             |       h1de35cc_0         2.2 MB
    sqlite-3.31.1              |       h5c1f38d_1         2.4 MB
    xz-5.2.5                   |       h1de35cc_0         282 KB
    ------------------------------------------------------------
                                           Total:         5.7 MB

The following NEW packages will be INSTALLED:

  ca-certificates    pkgs/main/osx-64::ca-certificates-2020.1.1-0
  certifi            pkgs/main/osx-64::certifi-2020.4.5.1-py36_0
  libcxx             pkgs/main/osx-64::libcxx-4.0.1-hcfea43d_1
  libcxxabi          pkgs/main/osx-64::libcxxabi-4.0.1-hcfea43d_1
  libedit            pkgs/main/osx-64::libedit-3.1.20181209-hb402a30_0
  libffi             pkgs/main/osx-64::libffi-3.2.1-h0a44026_6
  ncurses            pkgs/main/osx-64::ncurses-6.2-h0a44026_1
  openssl            pkgs/main/osx-64::openssl-1.1.1g-h1de35cc_0
  pip                pkgs/main/osx-64::pip-20.0.2-py36_1
  python             pkgs/main/osx-64::python-3.6.10-hc70fcce_1
  readline           pkgs/main/osx-64::readline-8.0-h1de35cc_0
  setuptools         pkgs/main/osx-64::setuptools-46.1.3-py36_0
  sqlite             pkgs/main/osx-64::sqlite-3.31.1-h5c1f38d_1
  tk                 pkgs/main/osx-64::tk-8.6.8-ha441bb4_0
  wheel              pkgs/main/osx-64::wheel-0.34.2-py36_0
  xz                 pkgs/main/osx-64::xz-5.2.5-h1de35cc_0
  zlib               pkgs/main/osx-64::zlib-1.2.11-h1de35cc_3


Proceed ([y]/n)? 


Downloading and Extracting Packages
libffi-3.2.1         | 43 KB     | ############################################################################################################################################################################################################ | 100% 
sqlite-3.31.1        | 2.4 MB    | ############################################################################################################################################################################################################ | 100% 
ncurses-6.2          | 749 KB    | ############################################################################################################################################################################################################ | 100% 
xz-5.2.5             | 282 KB    | ############################################################################################################################################################################################################ | 100% 
openssl-1.1.1g       | 2.2 MB    | ############################################################################################################################################################################################################ | 100% 
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate stardist_test
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(fibre) RICO:~ cfrederi$ conda activate stardist_test
(stardist_test) RICO:~ cfrederi$ conda install -c conda-forge tensorflow=1.15
Collecting package metadata (current_repodata.json): done
Solving environment: failed with current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: done


==> WARNING: A newer version of conda exists. <==
  current version: 4.7.10
  latest version: 4.8.3

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: //anaconda3/envs/stardist_test

  added / updated specs:
    - tensorflow=1.15


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    h5py-2.10.0                |nompi_py36h106b333_102        1020 KB  conda-forge
    hdf5-1.10.5                |nompi_h15a436c_1103         3.0 MB  conda-forge
    libblas-3.8.0              |      16_openblas          10 KB  conda-forge
    libcblas-3.8.0             |      16_openblas          10 KB  conda-forge
    liblapack-3.8.0            |      16_openblas          10 KB  conda-forge
    libopenblas-0.3.9          |       h3d69b6c_0         8.4 MB  conda-forge
    llvm-openmp-10.0.0         |       h28b9765_0         268 KB  conda-forge
    numpy-1.18.1               |   py36hdc5ca10_1         5.0 MB  conda-forge
    six-1.14.0                 |             py_1          13 KB  conda-forge
    ------------------------------------------------------------
                                           Total:        17.7 MB

The following NEW packages will be INSTALLED:

  _tflow_select      pkgs/main/osx-64::_tflow_select-2.3.0-mkl
  absl-py            conda-forge/osx-64::absl-py-0.9.0-py36_0
  astor              conda-forge/noarch::astor-0.7.1-py_0
  c-ares             conda-forge/osx-64::c-ares-1.15.0-h01d97ff_1001
  gast               conda-forge/noarch::gast-0.2.2-py_0
  google-pasta       conda-forge/noarch::google-pasta-0.2.0-pyh8c360ce_0
  grpcio             conda-forge/osx-64::grpcio-1.23.0-py36h6ef0057_0
  h5py               conda-forge/osx-64::h5py-2.10.0-nompi_py36h106b333_102
  hdf5               conda-forge/osx-64::hdf5-1.10.5-nompi_h15a436c_1103
  keras-applications conda-forge/noarch::keras-applications-1.0.8-py_1
  keras-preprocessi~ conda-forge/noarch::keras-preprocessing-1.1.0-py_0
  libblas            conda-forge/osx-64::libblas-3.8.0-16_openblas
  libcblas           conda-forge/osx-64::libcblas-3.8.0-16_openblas
  libgfortran        conda-forge/osx-64::libgfortran-4.0.0-2
  liblapack          conda-forge/osx-64::liblapack-3.8.0-16_openblas
  libopenblas        conda-forge/osx-64::libopenblas-0.3.9-h3d69b6c_0
  libprotobuf        conda-forge/osx-64::libprotobuf-3.9.2-hfbae3c0_0
  llvm-openmp        conda-forge/osx-64::llvm-openmp-10.0.0-h28b9765_0
  markdown           conda-forge/noarch::markdown-3.2.1-py_0
  numpy              conda-forge/osx-64::numpy-1.18.1-py36hdc5ca10_1
  opt_einsum         conda-forge/noarch::opt_einsum-3.2.1-py_0
  protobuf           conda-forge/osx-64::protobuf-3.9.2-py36h6de7cb9_1
  python_abi         conda-forge/osx-64::python_abi-3.6-1_cp36m
  scipy              conda-forge/osx-64::scipy-1.3.1-py36h7e0e109_2
  six                conda-forge/noarch::six-1.14.0-py_1
  tensorboard        conda-forge/osx-64::tensorboard-1.15.0-py36_0
  tensorflow         pkgs/main/osx-64::tensorflow-1.15.0-mkl_py36h975b573_0
  tensorflow-base    pkgs/main/osx-64::tensorflow-base-1.15.0-mkl_py36h032239d_0
  tensorflow-estima~ pkgs/main/noarch::tensorflow-estimator-1.15.1-pyh2649769_0
  termcolor          conda-forge/noarch::termcolor-1.1.0-py_2
  werkzeug           conda-forge/noarch::werkzeug-0.16.1-py_0
  wrapt              conda-forge/osx-64::wrapt-1.12.1-py36h37b9a7d_1

The following packages will be UPDATED:

  ca-certificates     pkgs/main::ca-certificates-2020.1.1-0 --> conda-forge::ca-certificates-2020.4.5.1-hecc5488_0

The following packages will be SUPERSEDED by a higher-priority channel:

  certifi              pkgs/main::certifi-2020.4.5.1-py36_0 --> conda-forge::certifi-2020.4.5.1-py36h9f0ad1d_0
  openssl              pkgs/main::openssl-1.1.1g-h1de35cc_0 --> conda-forge::openssl-1.1.1g-h0b31af3_0


Proceed ([y]/n)? 


Downloading and Extracting Packages
hdf5-1.10.5          | 3.0 MB    | ############################################################################################################################################################################################################ | 100% 
h5py-2.10.0          | 1020 KB   | ############################################################################################################################################################################################################ | 100% 
llvm-openmp-10.0.0   | 268 KB    | ############################################################################################################################################################################################################ | 100% 
libopenblas-0.3.9    | 8.4 MB    | ############################################################################################################################################################################################################ | 100% 
liblapack-3.8.0      | 10 KB     | ############################################################################################################################################################################################################ | 100% 
libcblas-3.8.0       | 10 KB     | ############################################################################################################################################################################################################ | 100% 
libblas-3.8.0        | 10 KB     | ############################################################################################################################################################################################################ | 100% 
numpy-1.18.1         | 5.0 MB    | ############################################################################################################################################################################################################ | 100% 
six-1.14.0           | 13 KB     | ############################################################################################################################################################################################################ | 100% 
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(stardist_test) RICO:~ cfrederi$ pip install stardist
Collecting stardist
  Using cached stardist-0.5.0.tar.gz (396 kB)
Collecting csbdeep>=0.4.0
  Using cached csbdeep-0.5.1-py2.py3-none-any.whl (62 kB)
Collecting scikit-image
  Downloading scikit_image-0.16.2-cp36-cp36m-macosx_10_6_intel.whl (30.4 MB)
     |████████████████████████████████| 30.4 MB 610 kB/s 
Processing ./Library/Caches/pip/wheels/ac/5a/d7/57de2b0ed2a980d2d245f56506bb93fe3e3ee2d9c965dfefd2/numba-0.49.0-cp36-cp36m-macosx_10_12_x86_64.whl
Requirement already satisfied: numpy in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (1.18.1)
Collecting tifffile
  Downloading tifffile-2020.2.16-py3-none-any.whl (130 kB)
     |████████████████████████████████| 130 kB 665 kB/s 
Collecting tqdm
  Using cached tqdm-4.45.0-py2.py3-none-any.whl (60 kB)
Requirement already satisfied: scipy in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (1.3.1)
Requirement already satisfied: six in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (1.14.0)
Collecting keras>=2.1.2
  Using cached Keras-2.3.1-py2.py3-none-any.whl (377 kB)
Requirement already satisfied: h5py in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from csbdeep>=0.4.0->stardist) (2.10.0)
Collecting matplotlib
  Downloading matplotlib-3.2.1-cp36-cp36m-macosx_10_9_x86_64.whl (12.4 MB)
     |████████████████████████████████| 12.4 MB 22.8 MB/s 
Collecting networkx>=2.0
  Downloading networkx-2.4-py3-none-any.whl (1.6 MB)
     |████████████████████████████████| 1.6 MB 1.7 MB/s 
Collecting imageio>=2.3.0
  Downloading imageio-2.8.0-py3-none-any.whl (3.3 MB)
     |████████████████████████████████| 3.3 MB 453 kB/s 
Collecting pillow>=4.3.0
  Downloading Pillow-7.1.2-cp36-cp36m-macosx_10_10_x86_64.whl (2.2 MB)
     |████████████████████████████████| 2.2 MB 2.1 MB/s 
Collecting PyWavelets>=0.4.0
  Downloading PyWavelets-1.1.1-cp36-cp36m-macosx_10_9_x86_64.whl (4.3 MB)
     |████████████████████████████████| 4.3 MB 35.3 MB/s 
Requirement already satisfied: setuptools in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from numba->stardist) (46.1.3.post20200330)
Collecting llvmlite<=0.33.0.dev0,>=0.31.0.dev0
  Using cached llvmlite-0.32.0-cp36-cp36m-macosx_10_9_x86_64.whl (15.9 MB)
Collecting imagecodecs>=2020.1.31
  Downloading imagecodecs-2020.2.18-cp36-cp36m-macosx_10_9_intel.whl (8.6 MB)
     |████████████████████████████████| 8.6 MB 611 kB/s 
Processing ./Library/Caches/pip/wheels/e5/9d/ad/2ee53cf262cba1ffd8afe1487eef788ea3f260b7e6232a80fc/PyYAML-5.3.1-cp36-cp36m-macosx_10_9_x86_64.whl
Requirement already satisfied: keras-preprocessing>=1.0.5 in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from keras>=2.1.2->csbdeep>=0.4.0->stardist) (1.1.0)
Requirement already satisfied: keras-applications>=1.0.6 in /anaconda3/envs/stardist_test/lib/python3.6/site-packages (from keras>=2.1.2->csbdeep>=0.4.0->stardist) (1.0.8)
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1
  Downloading pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
     |████████████████████████████████| 67 kB 2.9 MB/s 
Collecting python-dateutil>=2.1
  Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
     |████████████████████████████████| 227 kB 2.1 MB/s 
Collecting cycler>=0.10
  Downloading cycler-0.10.0-py2.py3-none-any.whl (6.5 kB)
Collecting kiwisolver>=1.0.1
  Downloading kiwisolver-1.2.0-cp36-cp36m-macosx_10_9_x86_64.whl (60 kB)
     |████████████████████████████████| 60 kB 501 kB/s 
Collecting decorator>=4.3.0
  Downloading decorator-4.4.2-py2.py3-none-any.whl (9.2 kB)
Building wheels for collected packages: stardist
  Building wheel for stardist (setup.py) ... error
  ERROR: Command errored out with exit status 1:
   command: //anaconda3/envs/stardist_test/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/setup.py'"'"'; __file__='"'"'/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-wheel-2xvf3qhx
       cwd: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/
  Complete output (193 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.macosx-10.9-x86_64-3.6
  creating build/lib.macosx-10.9-x86_64-3.6/stardist
  copying stardist/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
  copying stardist/matching.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
  copying stardist/nms.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
  copying stardist/rays3d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
  copying stardist/utils.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
  copying stardist/version.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
  creating build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
  copying stardist/geometry/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
  copying stardist/geometry/geom2d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
  copying stardist/geometry/geom3d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
  creating build/lib.macosx-10.9-x86_64-3.6/stardist/models
  copying stardist/models/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
  copying stardist/models/base.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
  copying stardist/models/model2d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
  copying stardist/models/model3d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
  copying stardist/models/pretrained.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
  copying stardist/models/sample_patches.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
  creating build/lib.macosx-10.9-x86_64-3.6/stardist/plot
  copying stardist/plot/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/plot
  copying stardist/plot/plot.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/plot
  copying stardist/plot/render.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/plot
  creating build/lib.macosx-10.9-x86_64-3.6/stardist/kernels
  copying stardist/kernels/stardist2d.cl -> build/lib.macosx-10.9-x86_64-3.6/stardist/kernels
  copying stardist/kernels/stardist3d.cl -> build/lib.macosx-10.9-x86_64-3.6/stardist/kernels
  running build_ext
  building 'stardist.lib.stardist2d' extension
  Warning: Can't read registry to find the necessary compiler setting
  Make sure that Python modules winreg, win32api or win32con are installed.
  C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
  
  creating build/temp.macosx-10.9-x86_64-3.6
  creating build/temp.macosx-10.9-x86_64-3.6/stardist
  creating build/temp.macosx-10.9-x86_64-3.6/stardist/lib
  compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I//anaconda3/envs/stardist_test/include/python3.6m -c'
  extra options: '-std=c++11 -fopenmp'
  gcc: stardist/lib/stardist2d.cpp
  gcc: stardist/lib/utils.cpp
  gcc: stardist/lib/clipper.cpp
  clang: error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  compiling with OpenMP support failed, re-trying without
  building 'stardist.lib.stardist2d' extension
  C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
  
  compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I//anaconda3/envs/stardist_test/include/python3.6m -c'
  extra options: '-std=c++11'
  gcc: stardist/lib/utils.cpp
  gcc: stardist/lib/clipper.cpp
  gcc: stardist/lib/stardist2d.cpp
  stardist/lib/utils.cpp:4:98: warning: field 'label' will be initialized after field 'eps' [-Wreorder]
  ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                   ^
  stardist/lib/utils.cpp:4:111: warning: field 'eps' will be initialized after field 'curr_percentage' [-Wreorder]
  ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                                ^
  In file included from stardist/lib/stardist2d.cpp:8:
  In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/arrayobject.h:4:
  In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
  In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarraytypes.h:1832:
  //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-W#warnings]
  #warning "Using deprecated NumPy API, disable it with " \
   ^
  2 warnings generated.
  stardist/lib/stardist2d.cpp:102:16: warning: using integer absolute value function 'abs' when argument is of floating point type [-Wabsolute-value]
    area = 0.5 * abs(area);
                 ^
  stardist/lib/stardist2d.cpp:102:16: note: use function 'std::abs' instead
    area = 0.5 * abs(area);
                 ^~~
                 std::abs
  2 warnings generated.
  creating build/lib.macosx-10.9-x86_64-3.6/stardist/lib
  g++ -bundle -undefined dynamic_lookup -L//anaconda3/envs/stardist_test/lib -arch x86_64 -L//anaconda3/envs/stardist_test/lib -arch x86_64 -arch x86_64 build/temp.macosx-10.9-x86_64-3.6/stardist/lib/stardist2d.o build/temp.macosx-10.9-x86_64-3.6/stardist/lib/clipper.o build/temp.macosx-10.9-x86_64-3.6/stardist/lib/utils.o -o build/lib.macosx-10.9-x86_64-3.6/stardist/lib/stardist2d.cpython-36m-darwin.so
  building 'stardist.lib.stardist3d' extension
  C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
  
  creating build/temp.macosx-10.9-x86_64-3.6/private
  creating build/temp.macosx-10.9-x86_64-3.6/private/var
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp
  creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r
  compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src -I//anaconda3/envs/stardist_test/include/python3.6m -c'
  extra options: '-std=c++11 -fopenmp'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp
  gcc: stardist/lib/stardist3d.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullStat.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RboxPoints.cpp
  clang: error: unsupported option '-fopenmp'
  clang: clang: error: unsupported option '-fopenmp'
  error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullQh.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacetList.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/PointCoordinates.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPoint.cpp
  clang: error: unsupported option '-fopenmp'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.c
  clang: error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/qset_r.c
  clang: error: unsupported option '-fopenmp'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/merge_r.c
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/io_r.c
  clang: error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/geom2_r.c
  clang: error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  clang: error: unsupported option '-fopenmp'
  compiling with OpenMP support failed, re-trying without
  building 'stardist.lib.stardist3d' extension
  C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
  
  compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src -I//anaconda3/envs/stardist_test/include/python3.6m -c'
  extra options: '-std=c++11'
  gcc: stardist/lib/stardist3d.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullStat.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RboxPoints.cpp
  /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp:42:13: warning: explicitly assigning value of variable of type 'int' to itself [-Wself-assign]
      exitcode= exitcode;
      ~~~~~~~~^ ~~~~~~~~
  1 warning generated.
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RoadLogEvent.cpp
  In file included from stardist/lib/stardist3d.cpp:2:
  In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/arrayobject.h:4:
  In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
  In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarraytypes.h:1832:
  //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-W#warnings]
  #warning "Using deprecated NumPy API, disable it with " \
   ^
  1 warning generated.
  gcc: stardist/lib/stardist3d_impl.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullSet.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullVertexSet.cpp
  stardist/lib/stardist3d_impl.cpp:917:33: warning: unused variable 'diff' [-Wunused-variable]
    std::chrono::duration<double> diff = stop-start;
                                  ^
  stardist/lib/stardist3d_impl.cpp:1063:9: warning: unused variable 'status_percentage_new' [-Wunused-variable]
      int status_percentage_new = 100*count_total/n_polys;
          ^
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RoadError.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullRidge.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullVertex.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullQh.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPoint.cpp
  2 warnings generated.
  gcc: stardist/lib/utils.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPoints.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacetList.cpp
  stardist/lib/utils.cpp:4:98: warning: field 'label' will be initialized after field 'eps' [-Wreorder]
  ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                   ^
  stardist/lib/utils.cpp:4:111: warning: field 'eps' will be initialized after field 'curr_percentage' [-Wreorder]
  ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                                ^
  2 warnings generated.
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/PointCoordinates.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullHyperplane.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPointSet.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacet.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacetSet.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/Coordinates.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.c
  error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/qset_r.c
  error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/merge_r.c
  error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/io_r.c
  error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/geom2_r.c
  error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/Qhull.cpp
  gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/user_r.c
  error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
  error: Command "gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src -I//anaconda3/envs/stardist_test/include/python3.6m -c /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.c -o build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.o -std=c++11" failed with exit status 1
  ----------------------------------------
  ERROR: Failed building wheel for stardist
  Running setup.py clean for stardist
Failed to build stardist
Installing collected packages: imagecodecs, tifffile, tqdm, pyyaml, keras, pyparsing, python-dateutil, cycler, kiwisolver, matplotlib, csbdeep, decorator, networkx, pillow, imageio, PyWavelets, scikit-image, llvmlite, numba, stardist
    Running setup.py install for stardist ... error
    ERROR: Command errored out with exit status 1:
     command: //anaconda3/envs/stardist_test/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/setup.py'"'"'; __file__='"'"'/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-record-ot94izl6/install-record.txt --single-version-externally-managed --compile --install-headers //anaconda3/envs/stardist_test/include/python3.6m/stardist
         cwd: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/
    Complete output (193 lines):
    running install
    running build
    running build_py
    creating build
    creating build/lib.macosx-10.9-x86_64-3.6
    creating build/lib.macosx-10.9-x86_64-3.6/stardist
    copying stardist/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
    copying stardist/matching.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
    copying stardist/nms.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
    copying stardist/rays3d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
    copying stardist/utils.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
    copying stardist/version.py -> build/lib.macosx-10.9-x86_64-3.6/stardist
    creating build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
    copying stardist/geometry/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
    copying stardist/geometry/geom2d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
    copying stardist/geometry/geom3d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/geometry
    creating build/lib.macosx-10.9-x86_64-3.6/stardist/models
    copying stardist/models/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
    copying stardist/models/base.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
    copying stardist/models/model2d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
    copying stardist/models/model3d.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
    copying stardist/models/pretrained.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
    copying stardist/models/sample_patches.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/models
    creating build/lib.macosx-10.9-x86_64-3.6/stardist/plot
    copying stardist/plot/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/plot
    copying stardist/plot/plot.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/plot
    copying stardist/plot/render.py -> build/lib.macosx-10.9-x86_64-3.6/stardist/plot
    creating build/lib.macosx-10.9-x86_64-3.6/stardist/kernels
    copying stardist/kernels/stardist2d.cl -> build/lib.macosx-10.9-x86_64-3.6/stardist/kernels
    copying stardist/kernels/stardist3d.cl -> build/lib.macosx-10.9-x86_64-3.6/stardist/kernels
    running build_ext
    building 'stardist.lib.stardist2d' extension
    Warning: Can't read registry to find the necessary compiler setting
    Make sure that Python modules winreg, win32api or win32con are installed.
    C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
    
    creating build/temp.macosx-10.9-x86_64-3.6
    creating build/temp.macosx-10.9-x86_64-3.6/stardist
    creating build/temp.macosx-10.9-x86_64-3.6/stardist/lib
    compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I//anaconda3/envs/stardist_test/include/python3.6m -c'
    extra options: '-std=c++11 -fopenmp'
    gcc: stardist/lib/stardist2d.cpp
    gcc: stardist/lib/clipper.cpp
    gcc: stardist/lib/utils.cpp
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    compiling with OpenMP support failed, re-trying without
    building 'stardist.lib.stardist2d' extension
    C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
    
    compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I//anaconda3/envs/stardist_test/include/python3.6m -c'
    extra options: '-std=c++11'
    gcc: stardist/lib/clipper.cpp
    gcc: stardist/lib/stardist2d.cpp
    gcc: stardist/lib/utils.cpp
    stardist/lib/utils.cpp:4:98: warning: field 'label' will be initialized after field 'eps' [-Wreorder]
    ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                     ^
    stardist/lib/utils.cpp:4:111: warning: field 'eps' will be initialized after field 'curr_percentage' [-Wreorder]
    ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                                  ^
    2 warnings generated.
    In file included from stardist/lib/stardist2d.cpp:8:
    In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/arrayobject.h:4:
    In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
    In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarraytypes.h:1832:
    //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-W#warnings]
    #warning "Using deprecated NumPy API, disable it with " \
     ^
    stardist/lib/stardist2d.cpp:102:16: warning: using integer absolute value function 'abs' when argument is of floating point type [-Wabsolute-value]
      area = 0.5 * abs(area);
                   ^
    stardist/lib/stardist2d.cpp:102:16: note: use function 'std::abs' instead
      area = 0.5 * abs(area);
                   ^~~
                   std::abs
    2 warnings generated.
    creating build/lib.macosx-10.9-x86_64-3.6/stardist/lib
    g++ -bundle -undefined dynamic_lookup -L//anaconda3/envs/stardist_test/lib -arch x86_64 -L//anaconda3/envs/stardist_test/lib -arch x86_64 -arch x86_64 build/temp.macosx-10.9-x86_64-3.6/stardist/lib/stardist2d.o build/temp.macosx-10.9-x86_64-3.6/stardist/lib/clipper.o build/temp.macosx-10.9-x86_64-3.6/stardist/lib/utils.o -o build/lib.macosx-10.9-x86_64-3.6/stardist/lib/stardist2d.cpython-36m-darwin.so
    building 'stardist.lib.stardist3d' extension
    C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
    
    creating build/temp.macosx-10.9-x86_64-3.6/private
    creating build/temp.macosx-10.9-x86_64-3.6/private/var
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp
    creating build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r
    compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src -I//anaconda3/envs/stardist_test/include/python3.6m -c'
    extra options: '-std=c++11 -fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp
    gcc: stardist/lib/stardist3d.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RboxPoints.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullStat.cpp
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullQh.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPoint.cpp
    clang: error: unsupported option '-fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacetList.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/PointCoordinates.cpp
    clang: error: unsupported option '-fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.c
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/qset_r.c
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/merge_r.c
    clang: error: unsupported option '-fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/io_r.c
    clang: error: unsupported option '-fopenmp'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/geom2_r.c
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    clang: error: unsupported option '-fopenmp'
    compiling with OpenMP support failed, re-trying without
    building 'stardist.lib.stardist3d' extension
    C compiler: gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64
    
    compile options: '-I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src -I//anaconda3/envs/stardist_test/include/python3.6m -c'
    extra options: '-std=c++11'
    gcc: stardist/lib/stardist3d.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RboxPoints.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullStat.cpp
    /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp:42:13: warning: explicitly assigning value of variable of type 'int' to itself [-Wself-assign]
        exitcode= exitcode;
        ~~~~~~~~^ ~~~~~~~~
    1 warning generated.
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RoadLogEvent.cpp
    In file included from stardist/lib/stardist3d.cpp:2:
    In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/arrayobject.h:4:
    In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
    In file included from //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/ndarraytypes.h:1832:
    //anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-W#warnings]
    #warning "Using deprecated NumPy API, disable it with " \
     ^
    1 warning generated.
    gcc: stardist/lib/stardist3d_impl.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullSet.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullVertexSet.cpp
    stardist/lib/stardist3d_impl.cpp:917:33: warning: unused variable 'diff' [-Wunused-variable]
      std::chrono::duration<double> diff = stop-start;
                                    ^
    stardist/lib/stardist3d_impl.cpp:1063:9: warning: unused variable 'status_percentage_new' [-Wunused-variable]
        int status_percentage_new = 100*count_total/n_polys;
            ^
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/RoadError.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullRidge.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullQh.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullVertex.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPoint.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPoints.cpp
    2 warnings generated.
    gcc: stardist/lib/utils.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacetList.cpp
    stardist/lib/utils.cpp:4:98: warning: field 'label' will be initialized after field 'eps' [-Wreorder]
    ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                     ^
    stardist/lib/utils.cpp:4:111: warning: field 'eps' will be initialized after field 'curr_percentage' [-Wreorder]
    ProgressBar::ProgressBar(const std::string label,const int width, const float eps): width(width),label(label),eps(eps), curr_percentage(0){};
                                                                                                                  ^
    2 warnings generated.
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/PointCoordinates.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullHyperplane.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullPointSet.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacet.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/QhullFacetSet.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/Coordinates.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.c
    error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/qset_r.c
    error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/merge_r.c
    error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/io_r.c
    error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/geom2_r.c
    error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhullcpp/Qhull.cpp
    gcc: /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/user_r.c
    error: invalid argument '-std=c++11' not allowed with 'C/ObjC'
    error: Command "gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/include -arch x86_64 -I//anaconda3/envs/stardist_test/lib/python3.6/site-packages/numpy/core/include -I/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src -I//anaconda3/envs/stardist_test/include/python3.6m -c /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.c -o build/temp.macosx-10.9-x86_64-3.6/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/stardist/lib/qhull_src/src/libqhull_r/stat_r.o -std=c++11" failed with exit status 1
    ----------------------------------------
ERROR: Command errored out with exit status 1: //anaconda3/envs/stardist_test/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/setup.py'"'"'; __file__='"'"'/private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-install-tmrhw9ls/stardist/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /private/var/folders/v3/8r0std7j1m1d4h9fvmpr6yc80000gx/T/pip-record-ot94izl6/install-record.txt --single-version-externally-managed --compile --install-headers //anaconda3/envs/stardist_test/include/python3.6m/stardist Check the logs for full command output.

dist_relevant_mse and val_dist_relevant_mse are nan during training

The dist_relevant_mse and val_dist_relevant_mse become NaN during training. When I try the same with a smaller training set of 10 images and a validation set of 2 images, this problem is not there and I manage to overfit my data, which I reckon indicates that there is no problem with my images. So is this error because my training set is too large? I use 350 images for training and 50 for validation, with no augmentation.

Export StarDist model, re-use in FIJI

Hi StarDist Team,

I’ve been playing with StarDist for a couple of days now and it’s amazing!

I just trained a model with your datasets and the network settings from the notebook, and used this model to make some predictions on my own images. The result is (as expected) sensitive to scaling, but it’s amazingly good!

I’m now looking for a solution to use the model in FIJI.

I tried model.exportTF() as in CSBDeep, but the export function is not implemented (yet?).
Will you release it “soon”, or do you plan to make a plugin for FIJI?

Thanks again for this amazing tool!
Cheers,
Romain

terminate called after throwing an instance of 'Xbyak::Error' what(): can't protect

Hi there! I'm reproducing the training procedure according to your Jupyter notebook with the following configuration:

`2019-02-03 20:22:54.463532: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2019-02-03 20:22:54.478737: I tensorflow/core/common_runtime/process_util.cc:69] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
number of images: 447

  • training: 380

  • validation: 67
    Configuration for a :class:StarDist model.

    Parameters

    n_rays : int
    Number of radial directions for the star-convex polygon.
    Recommended to use a power of 2 (default: 32).
    n_channel_in : int
    Number of channels of given input image (default: 1).
    kwargs : dict
    Overwrite (or add) configuration attributes (see below).

    Attributes

    unet_n_depth : int
    Number of U-Net resolution levels (down/up-sampling layers).
    unet_kernel_size : (int,int)
    Convolution kernel size for all (U-Net) convolution layers.
    unet_n_filter_base : int
    Number of convolution kernels (feature channels) for first U-Net layer.
    Doubled after each down-sampling layer.
    net_conv_after_unet : int
    Number of extra convolution layers after U-Net (0 to disable).
    train_shape_completion : bool
    Train model to predict complete shapes for partially visible objects at image boundary.
    train_completion_crop : int
    If 'train_shape_completion' is set to True, specify number of pixels to crop at boundary of training patches.
    Should be chosen based on (largest) object sizes.
    train_patch_size : (int,int)
    Size of patches to be cropped from provided training images.
    train_dist_loss : str
    Training loss for star-convex polygon distances ('mse' or 'mae').
    train_epochs : int
    Number of training epochs.
    train_steps_per_epoch : int
    Number of parameter update steps per epoch.
    train_learning_rate : float
    Learning rate for training.
    train_batch_size : int
    Batch size for training.
    train_tensorboard : bool
    Enable TensorBoard for monitoring training progress.
    train_checkpoint : str
    Name of checkpoint file for model weights (only best are saved); set to None to disable.
    train_reduce_lr : dict
    Parameter :class:dict of ReduceLROnPlateau_ callback; set to None to disable.

      .. _ReduceLROnPlateau: https://keras.io/callbacks/#reducelronplateau
    

Config(n_channel_in=1, n_rays=32, net_conv_after_unet=128, net_input_shape=(None, None, 1), train_batch_size=4, train_checkpoint='weights_best.h5', train_completion_crop=32, train_dist_loss='mae', train_epochs=100, train_learning_rate=0.0003, train_patch_size=(256, 256), train_reduce_lr={'factor': 0.5, 'patience': 10}, train_shape_completion=False, train_steps_per_epoch=400, train_tensorboard=True, unet_kernel_size=(3, 3), unet_n_depth=3, unet_n_filter_base=32)`

And on epoch 47/100 the process crashes with an error:

terminate called after throwing an instance of 'Xbyak::Error' what(): can't protect /var/spool/sge/qb3cell/idgpu1/job_scripts/8502: line 34: 55937 Aborted (core dumped) python

or alternatively

terminate called after throwing an instance of 'Xbyak::Error' what(): err munmap /var/spool/sge/qb3cell/idgpu1/job_scripts/7939: line 34: 21694 Aborted (core dumped) python
Could you suggest any solutions?

Thanks ahead!

Obtaining dist and prob maps directly

Hi Martin and Uwe!

I've trained a 3D model using 32x32x32 voxel patches, so prediction must occur on patches of that size as well(?). But of course my actual 3D images are much larger. I would like to run the model in parallel on all 32x32x32 patches from my full image with a stride of, say, 24 voxels along each dimension (so 25% overlap), and then stitch the results back together. It seems like the best objects to stitch would be the probability maps and distance predictions, since they are continuous variables, and then do the NMS afterwards on the entire stitched volume. Does this sound like a reasonable approach? Can you guys provide guidance on modifying the source to obtain these objects directly, rather than just the segments and details, from running predict_instances? Or are they available via another function?
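Something along these lines is what I had in mind; just a sketch, and I'm assuming that model.predict exposes the raw probability and distance maps and that predict_instances accepts an n_tiles argument (as newer StarDist versions seem to):

from csbdeep.utils import normalize
from stardist.models import StarDist3D

# hypothetical model name/folder for illustration
model = StarDist3D(None, name='my_3d_model', basedir='models')

img = normalize(big_volume, 1, 99.8)  # big_volume = the full (large) 3D image

# raw network outputs (continuous maps), if exposed directly
prob, dist = model.predict(img)

# alternative: built-in tiling, where prob/dist are predicted per tile,
# stitched, and NMS is then run on the full volume
labels, details = model.predict_instances(img, n_tiles=(2, 8, 8))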

limiting cores while training , training on 3 channel images

  1. Is there an option to limit core usage during training, or do I have to do it by setting Keras and
    TensorFlow environment variables? (See the sketch after this list.)
  2. Is it possible to train on 3-channel images and a 1-channel mask? I notice that an assert statement
    checks for that.
  3. How do you generate 3-channel images of the mask? Is it okay to do it by concatenating the 1-channel
    mask image 3 times, or does Labkit have an option to do that?
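A minimal sketch of what I mean for points 1 and 2, assuming TensorFlow 2 and the current Config2D/StarDist2D signatures (adjust names to your installed versions):

import tensorflow as tf
from stardist.models import Config2D, StarDist2D

# 1. limit CPU core usage; must be called before any TF ops are created
tf.config.threading.set_intra_op_parallelism_threads(4)
tf.config.threading.set_inter_op_parallelism_threads(2)

# 2. train on 3-channel images with single-channel label masks
conf = Config2D(n_rays=32, n_channel_in=3)
model = StarDist2D(conf, name='rgb_model', basedir='models')
# model.train(X_trn, Y_trn, validation_data=(X_val, Y_val))  # X: (H,W,3), Y: (H,W) integer labels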

Which params in StarDist2D Config are implemented?

Hi!

Of course - thanks again for the great tools!
I noticed that, of the member variables printed with vars(config), only a subset have documentation in the docstring for the class. If something is missing from the docstring, does that mean it is not implemented? I'm specifically interested in dropout and batch_norm. I don't see self.config.unet_dropout or self.config.unet_batch_norm used anywhere in model2d.py. Or is the config object passed to some Keras superclass where dropout and batch norm are handled, or something like that?
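For what it's worth, this is how I list everything the config object actually carries (just a sketch; the attribute names depend on the installed StarDist version):

from stardist.models import Config2D

conf = Config2D()
for key, value in sorted(vars(conf).items()):
    print(key, '=', value)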

loading StarDist3D models after training

Hi! I've been trying to get some StarDist3D models trained. I used some CPU clusters to do an initial training test, which finished OK with the following output:

100/100 [==============================] - 826s 8s/step - loss: 1.3955 - prob_loss: 0.4074 - dist_loss: 4.9404 - prob_kld: 0.1733 - dist_relevant_mae: 4.9394 - dist_relevant_mse: 56.6804 - val_loss: 1.4757 - val_prob_loss: 0.3504 - val_dist_loss: 5.6265 - val_prob_kld: 0.1715 - val_dist_relevant_mae: 5.6254 - val_dist_relevant_mse: 66.9819
NMS threshold = 0.3:  75%|███████▌  | 15/20 [07:27<02:29, 29.85s/it, 0.513 -> 0.014]
NMS threshold = 0.4:  75%|███████▌  | 15/20 [08:30<02:50, 34.02s/it, 0.513 -> 0.009]
NMS threshold = 0.5:  75%|███████▌  | 15/20 [10:14<03:24, 41.00s/it, 0.513 -> 0.008]

Loading network weights from 'weights_best.h5'.
GPU enabled False: 16467.110308 sec
Using optimized values: prob_thresh=0.514525, nms_thresh=0.3.
Saving to 'thresholds.json'.

I try to subsequently reload this trained model onto an interactive node / my personal local machine with:

model = StarDist3D(None, name='fixed_organoids_3d',basedir='/path/to/models')

Two strange things happen. On the interactive node, the entire Python shell crashes, which I suspect is because the CUDA-enabled TensorFlow is not configured properly on the node:

Loading network weights from 'weights_best.h5'.
2020-02-18 07:25:19.385558: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
terminate called after throwing an instance of 'std::system_error'
 what():  Resource temporarily unavailable
Aborted

But on my personal machine, where tensorflow-gpu is working OK (via gputools), I get the following error:

Traceback (most recent call last):

  File "<ipython-input-11-2244447c3551>", line 1, in <module>
    model = StarDist3D(None, name='fixed_organoids_2d')

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/stardist/models/model3d.py", line 294, in __init__
    super().__init__(config, name=name, basedir=basedir)

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/stardist/models/base.py", line 145, in __init__
    super().__init__(config=config, name=name, basedir=basedir)

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/csbdeep/models/base_model.py", line 92, in __init__
    self._set_logdir()

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/csbdeep/models/base_model.py", line 30, in wrapper
    return f(*args, **kwargs)

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/csbdeep/models/base_model.py", line 126, in _set_logdir
    self.config = self._config_class(**config_dict)

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/stardist/models/model3d.py", line 256, in __init__
    self.update_parameters(False, **kwargs)

  File "/anaconda2/envs/python3/lib/python3.6/site-packages/csbdeep/models/config.py", line 72, in update_parameters
    raise AttributeError("Not allowed to add new parameters (%s)" % ', '.join(attr_new))

AttributeError: Not allowed to add new parameters (train_foreground_only)

I'm really not sure what is going on since I'm not passing any new configuration to the model constructor.

Also, just wanted to add that some of the pre-trained 2D models are working excellently on my hard-to-segment 2d images! :)

Problem in training "No module named stardist.lib.stardist"

Hello,

While training, the error "ImportError: No module named stardist.lib.stardist" occurs in utils.py, line 74, at from .lib.stardist import c_star_dist.
The detail is:

Epoch 1/100
Traceback (most recent call last):
File "/home/x000000/anaconda3/lib/python3.5/site-packages/keras/utils/data_utils.py", line 578, in get
inputs = self.queue.get(block=True).get()
File "/home/x000000/anaconda3/lib/python3.5/multiprocessing/pool.py", line 644, in get
raise self._value
File "/home/x000000/anaconda3/lib/python3.5/multiprocessing/pool.py", line 119, in worker
result = (True, func(*args, **kwds))
File "/home/x000000/anaconda3/lib/python3.5/site-packages/keras/utils/data_utils.py", line 401, in get_index
return _SHARED_SEQUENCES[uid][i]
File "/home/x000000/SYF/stardist/stardist/model.py", line 100, in getitem
dist = np.stack([star_dist(lbl,self.n_rays) for lbl in Y])
File "/home/x000000/SYF/stardist/stardist/model.py", line 100, in
dist = np.stack([star_dist(lbl,self.n_rays) for lbl in Y])
File "/home/x000000/SYF/stardist/stardist/utils.py", line 121, in star_dist
return _cpp_star_dist(a,n_rays)
File "/home/x000000/SYF/stardist/stardist/utils.py", line 74, in _cpp_star_dist
from .lib.stardist import c_star_dist
ImportError: No module named 'stardist.lib.stardist'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "train_1.py", line 94, in
model.train(X_trn,Y_trn,validation_data=(X_val,Y_val))
File "/home/x000000/SYF/stardist/stardist/model.py", line 420, in train
callbacks=self.callbacks, verbose=1)
File "/home/x000000/anaconda3/lib/python3.5/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "/home/x000000/anaconda3/lib/python3.5/site-packages/keras/engine/training.py", line 2194, in fit_generator
generator_output = next(output_generator)
File "/home/x000000/anaconda3/lib/python3.5/site-packages/keras/utils/data_utils.py", line 584, in get
six.raise_from(StopIteration(e), e)
File "", line 3, in raise_from
StopIteration: No module named 'stardist.lib.stardist'

I also tried to import stardist.lib.stardist in the terminal, but it doesn't work either.

Dealing with empty masks.

This occurred while training a 2D model. With a public dataset, I had the case that one of the masks was empty (only background), and it took a while to figure out what went wrong. I attached the error below. I am not sure where you would want to add the resolving code (the error occurs in csbdeep), therefore this is not a pull request. I came up with two potential solutions:

  • Preprocessing: filter out pairs with empty masks, e.g. X, Y = zip(*[(x, y) for x, y in zip(X, Y) if y.max() >= 1])
  • Runtime: assert (or raise) that y.max() >= 1.
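A self-contained sketch of the preprocessing option (variable names X, Y as in the training notebooks; this is only what I did locally, not an official fix):

def drop_empty_masks(X, Y):
    # keep only image/mask pairs whose mask contains at least one labeled object
    pairs = [(x, y) for x, y in zip(X, Y) if y.max() >= 1]
    assert len(pairs) > 0, "all masks are empty"
    X_keep, Y_keep = map(list, zip(*pairs))
    return X_keep, Y_keep

# X_trn, Y_trn = drop_empty_masks(X_trn, Y_trn)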

ValueError                                Traceback (most recent call last)
<ipython-input-25-50a4e0118c27> in <module>
      1 model.train(X_trn, Y_trn, validation_data=(X_val, Y_val), 
----> 2             augmenter=Augmenter(**data_gen_args).augment)

~/anaconda3/envs/image/lib/python3.7/site-packages/stardist/models/model2d.py in train(self, X, Y, validation_data, augmenter, seed, epochs, steps_per_epoch)
    352         Xv, Mv, Pv, Dv = [None]*n_take, [None]*n_take, [None]*n_take, [None]*n_take
    353         for i,k in enumerate(ids):
--> 354             (Xv[i],Mv[i]),(Pv[i],Dv[i]) = data_val[k]
    355         Xv, Mv, Pv, Dv = np.concatenate(Xv,axis=0), np.concatenate(Mv,axis=0), np.concatenate(Pv,axis=0), np.concatenate(Dv,axis=0)
    356         data_val = [[Xv,Mv],[Pv,Dv]]

~/anaconda3/envs/image/lib/python3.7/site-packages/stardist/models/model2d.py in __getitem__(self, i)
     49         arrays = [sample_patches_from_multiple_stacks((self.Y[k],) + self.channels_as_tuple(self.X[k]),
     50                                                       patch_size=self.patch_size, n_samples=1,
---> 51                                                       patch_filter=self.no_background_patches_cached(k)) for k in idx]
     52         if self.n_channel is None:
     53             X, Y = list(zip(*[(x[0][self.b],y[0]) for y,x in arrays]))

~/anaconda3/envs/image/lib/python3.7/site-packages/stardist/models/model2d.py in <listcomp>(.0)
     49         arrays = [sample_patches_from_multiple_stacks((self.Y[k],) + self.channels_as_tuple(self.X[k]),
     50                                                       patch_size=self.patch_size, n_samples=1,
---> 51                                                       patch_filter=self.no_background_patches_cached(k)) for k in idx]
     52         if self.n_channel is None:
     53             X, Y = list(zip(*[(x[0][self.b],y[0]) for y,x in arrays]))

~/anaconda3/envs/image/lib/python3.7/site-packages/csbdeep/data/generate.py in sample_patches_from_multiple_stacks(datas, patch_size, n_samples, datas_mask, patch_filter, verbose)
     98 
     99     if n_valid == 0:
--> 100         raise ValueError("'patch_filter' didn't return any region to sample from")
    101 
    102     sample_inds = choice(range(n_valid), n_samples, replace=(n_valid < n_samples))

ValueError: 'patch_filter' didn't return any region to sample from

Training on 3-channel images

n_channel_in=3 was added in the Config. However, the following error occurred:

---------------------------------------------------------------------------
InvalidArgumentError                      Traceback (most recent call last)
/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs)
   1627   try:
-> 1628     c_op = c_api.TF_FinishOperation(op_desc)
   1629   except errors.InvalidArgumentError as e:

InvalidArgumentError: Dimensions must be equal, but are 3 and 32 for 'loss_2/dist_loss/mul' (op: 'Mul') with input shapes: [?,?,?,3], [?,?,?,32].

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
<ipython-input-63-563f6521b7a2> in <module>
----> 1 model.train(X_trn,Y_trn,validation_data=(X_val,Y_val))

/usr/local/lib/python3.5/dist-packages/stardist/model.py in train(self, X, Y, validation_data, epochs, steps_per_epoch)
    398 
    399         if not self._model_prepared:
--> 400             self.prepare_for_training()
    401 
    402         data_kwargs = {

/usr/local/lib/python3.5/dist-packages/stardist/model.py in prepare_for_training(self, optimizer)
    338         dist_loss = {'mse': masked_loss_mse, 'mae': masked_loss_mae}[self.config.train_dist_loss]
    339         input_mask = self.keras_model.inputs[1] # second input layer is mask for dist loss
--> 340         self.keras_model.compile(optimizer, loss=['binary_crossentropy',dist_loss(input_mask)])
    341 
    342         self.callbacks = []

/usr/local/lib/python3.5/dist-packages/keras/engine/training.py in compile(self, optimizer, loss, metrics, loss_weights, sample_weight_mode, weighted_metrics, target_tensors, **kwargs)
    340                 with K.name_scope(self.output_names[i] + '_loss'):
    341                     output_loss = weighted_loss(y_true, y_pred,
--> 342                                                 sample_weight, mask)
    343                 if len(self.outputs) > 1:
    344                     self.metrics_tensors.append(output_loss)

/usr/local/lib/python3.5/dist-packages/keras/engine/training_utils.py in weighted(y_true, y_pred, weights, mask)
    402         ""
    403         # score_array has ndim >= 2
--> 404         score_array = fn(y_true, y_pred)
    405         if mask is not None:
    406             # Cast the mask to floatX to avoid float64 upcasting in Theano

/usr/local/lib/python3.5/dist-packages/stardist/model.py in _loss(d_true, d_pred)
     35 def masked_loss(mask, penalty):
     36     def _loss(d_true, d_pred):
---> 37         return K.mean(mask * penalty(d_pred - d_true), axis=-1)
     38     return _loss
     39 

/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/math_ops.py in binary_op_wrapper(x, y)
    864     with ops.name_scope(None, op_name, [x, y]) as name:
    865       if isinstance(x, ops.Tensor) and isinstance(y, ops.Tensor):
--> 866         return func(x, y, name=name)
    867       elif not isinstance(y, sparse_tensor.SparseTensor):
    868         try:

/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/math_ops.py in _mul_dispatch(x, y, name)
   1129   is_tensor_y = isinstance(y, ops.Tensor)
   1130   if is_tensor_y:
-> 1131     return gen_math_ops.mul(x, y, name=name)
   1132   else:
   1133     assert isinstance(y, sparse_tensor.SparseTensor)  # Case: Dense * Sparse.

/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/gen_math_ops.py in mul(x, y, name)
   5040   if _ctx is None or not _ctx._eager_context.is_eager:
   5041     _, _, _op = _op_def_lib._apply_op_helper(
-> 5042         "Mul", x=x, y=y, name=name)
   5043     _result = _op.outputs[:]
   5044     _inputs_flat = _op.inputs

/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/op_def_library.py in _apply_op_helper(self, op_type_name, name, **keywords)
    785         op = g.create_op(op_type_name, inputs, output_types, name=scope,
    786                          input_types=input_types, attrs=attr_protos,
--> 787                          op_def=op_def)
    788       return output_structure, op_def.is_stateful, op
    789 

/usr/local/lib/python3.5/dist-packages/tensorflow/python/util/deprecation.py in new_func(*args, **kwargs)
    486                 'in a future version' if date is None else ('after %s' % date),
    487                 instructions)
--> 488       return func(*args, **kwargs)
    489     return tf_decorator.make_decorator(func, new_func, 'deprecated',
    490                                        _add_deprecated_arg_notice_to_docstring(

/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py in create_op(***failed resolving arguments***)
   3272           input_types=input_types,
   3273           original_op=self._default_original_op,
-> 3274           op_def=op_def)
   3275       self._create_op_helper(ret, compute_device=compute_device)
   3276     return ret

/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py in __init__(self, node_def, g, inputs, output_types, control_inputs, input_types, original_op, op_def)
   1790           op_def, inputs, node_def.attr)
   1791       self._c_op = _create_c_op(self._graph, node_def, grouped_inputs,
-> 1792                                 control_input_ops)
   1793 
   1794     # Initialize self._outputs.

/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs)
   1629   except errors.InvalidArgumentError as e:
   1630     # Convert to ValueError for backwards compatibility.
-> 1631     raise ValueError(str(e))
   1632 
   1633   return c_op

ValueError: Dimensions must be equal, but are 3 and 32 for 'loss_2/dist_loss/mul' (op: 'Mul') with input shapes: [?,?,?,3], [?,?,?,32].

SSLCertVerificationError when trying to use versatile pretrained model

When trying to run:

model_versatile = StarDist2D.from_pretrained('2D_versatile_fluo')

in a Jupyter notebook, I get the following stack trace:

Found model '2D_versatile_fluo' for 'StarDist2D'.
Downloading data from https://cloud.mpi-cbg.de/index.php/s/1k5Zcy7PpFWRb0Q/download?path=/versatile&files=2D_versatile_fluo.zip

---------------------------------------------------------------------------
SSLCertVerificationError                  Traceback (most recent call last)
C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in do_open(self, http_class, req, **http_conn_args)
   1318                 h.request(req.get_method(), req.selector, req.data, headers,
-> 1319                           encode_chunked=req.has_header('Transfer-encoding'))
   1320             except OSError as err: # timeout error

C:\anaconda52\x64\envs\gluthi\lib\http\client.py in request(self, method, url, body, headers, encode_chunked)
   1251         """Send a complete request to the server."""
-> 1252         self._send_request(method, url, body, headers, encode_chunked)
   1253 

C:\anaconda52\x64\envs\gluthi\lib\http\client.py in _send_request(self, method, url, body, headers, encode_chunked)
   1297             body = _encode(body, 'body')
-> 1298         self.endheaders(body, encode_chunked=encode_chunked)
   1299 

C:\anaconda52\x64\envs\gluthi\lib\http\client.py in endheaders(self, message_body, encode_chunked)
   1246             raise CannotSendHeader()
-> 1247         self._send_output(message_body, encode_chunked=encode_chunked)
   1248 

C:\anaconda52\x64\envs\gluthi\lib\http\client.py in _send_output(self, message_body, encode_chunked)
   1025         del self._buffer[:]
-> 1026         self.send(msg)
   1027 

C:\anaconda52\x64\envs\gluthi\lib\http\client.py in send(self, data)
    965             if self.auto_open:
--> 966                 self.connect()
    967             else:

C:\anaconda52\x64\envs\gluthi\lib\http\client.py in connect(self)
   1421             self.sock = self._context.wrap_socket(self.sock,
-> 1422                                                   server_hostname=server_hostname)
   1423 

C:\anaconda52\x64\envs\gluthi\lib\ssl.py in wrap_socket(self, sock, server_side, do_handshake_on_connect, suppress_ragged_eofs, server_hostname, session)
    422             context=self,
--> 423             session=session
    424         )

C:\anaconda52\x64\envs\gluthi\lib\ssl.py in _create(cls, sock, server_side, do_handshake_on_connect, suppress_ragged_eofs, server_hostname, context, session)
    869                         raise ValueError("do_handshake_on_connect should not be specified for non-blocking sockets")
--> 870                     self.do_handshake()
    871             except (OSError, ValueError):

C:\anaconda52\x64\envs\gluthi\lib\ssl.py in do_handshake(self, block)
   1138                 self.settimeout(None)
-> 1139             self._sslobj.do_handshake()
   1140         finally:

SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)

During handling of the above exception, another exception occurred:

URLError                                  Traceback (most recent call last)
C:\anaconda52\x64\envs\gluthi\lib\site-packages\keras\utils\data_utils.py in get_file(fname, origin, untar, md5_hash, file_hash, cache_subdir, hash_algorithm, extract, archive_format, cache_dir)
    224             try:
--> 225                 urlretrieve(origin, fpath, dl_progress)
    226             except HTTPError as e:

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in urlretrieve(url, filename, reporthook, data)
    246 
--> 247     with contextlib.closing(urlopen(url, data)) as fp:
    248         headers = fp.info()

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in urlopen(url, data, timeout, cafile, capath, cadefault, context)
    221         opener = _opener
--> 222     return opener.open(url, data, timeout)
    223 

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in open(self, fullurl, data, timeout)
    524 
--> 525         response = self._open(req, data)
    526 

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in _open(self, req, data)
    542         result = self._call_chain(self.handle_open, protocol, protocol +
--> 543                                   '_open', req)
    544         if result:

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in _call_chain(self, chain, kind, meth_name, *args)
    502             func = getattr(handler, meth_name)
--> 503             result = func(*args)
    504             if result is not None:

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in https_open(self, req)
   1361             return self.do_open(http.client.HTTPSConnection, req,
-> 1362                 context=self._context, check_hostname=self._check_hostname)
   1363 

C:\anaconda52\x64\envs\gluthi\lib\urllib\request.py in do_open(self, http_class, req, **http_conn_args)
   1320             except OSError as err: # timeout error
-> 1321                 raise URLError(err)
   1322             r = h.getresponse()

URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)>

During handling of the above exception, another exception occurred:

Exception                                 Traceback (most recent call last)
<ipython-input-66-e4c7a46e04a2> in <module>
----> 1 model_versatile = StarDist2D.from_pretrained('2D_versatile_fluo')
      2 

C:\anaconda52\x64\envs\gluthi\lib\site-packages\stardist\models\base.py in from_pretrained(cls, name_or_alias)
    157         try:
    158             get_model_details(cls, name_or_alias, verbose=True)
--> 159             return get_model_instance(cls, name_or_alias)
    160         except ValueError:
    161             if name_or_alias is not None:

C:\anaconda52\x64\envs\gluthi\lib\site-packages\stardist\models\pretrained.py in get_model_instance(cls, key_or_alias)
     89 
     90 def get_model_instance(cls, key_or_alias):
---> 91     path = get_model_folder(cls, key_or_alias)
     92     model = cls(config=None, name=path.stem, basedir=path.parent)
     93     model.basedir = None # make read-only

C:\anaconda52\x64\envs\gluthi\lib\site-packages\stardist\models\pretrained.py in get_model_folder(cls, key_or_alias)
     83     target = str(Path('models') / cls.__name__ / key)
     84     path = Path(get_file(fname=key+'.zip', origin=m['url'], file_hash=m['hash'],
---> 85                          cache_subdir=target, extract=True))
     86     assert path.exists() and path.parent.exists()
     87     return path.parent

C:\anaconda52\x64\envs\gluthi\lib\site-packages\keras\utils\data_utils.py in get_file(fname, origin, untar, md5_hash, file_hash, cache_subdir, hash_algorithm, extract, archive_format, cache_dir)
    227                 raise Exception(error_msg.format(origin, e.code, e.msg))
    228             except URLError as e:
--> 229                 raise Exception(error_msg.format(origin, e.errno, e.reason))
    230         except (Exception, KeyboardInterrupt):
    231             if os.path.exists(fpath):

Exception: URL fetch failure on https://cloud.mpi-cbg.de/index.php/s/1k5Zcy7PpFWRb0Q/download?path=/versatile&files=2D_versatile_fluo.zip : None -- [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)

Installation failing on macOS

Continuation of my pull request #20.

I wasn't able to reproduce the error with pip (I would probably need to delete all Python dependencies locally first). I was able to generate an error by running python setup.py install from master. This is all it output. If I manage to get the same through pip directly, I'll make sure to add it here.

$ python setup.py install

running install
running bdist_egg
running egg_info
writing stardist.egg-info/PKG-INFO
writing dependency_links to stardist.egg-info/dependency_links.txt
writing requirements to stardist.egg-info/requires.txt
writing top-level names to stardist.egg-info/top_level.txt
reading manifest file 'stardist.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'stardist.egg-info/SOURCES.txt'
installing library code to build/bdist.macosx-10.14-x86_64/egg
running install_lib
running build_py
creating build/lib.macosx-10.14-x86_64-3.7
creating build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/plot.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/version.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/rays3d.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/nms.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/__init__.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/matching.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
copying stardist/utils.py -> build/lib.macosx-10.14-x86_64-3.7/stardist
creating build/lib.macosx-10.14-x86_64-3.7/stardist/models
copying stardist/models/model3d.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/models
copying stardist/models/model2d.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/models
copying stardist/models/__init__.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/models
copying stardist/models/base.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/models
creating build/lib.macosx-10.14-x86_64-3.7/stardist/geometry
copying stardist/geometry/__init__.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/geometry
copying stardist/geometry/geom3d.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/geometry
copying stardist/geometry/geom2d.py -> build/lib.macosx-10.14-x86_64-3.7/stardist/geometry
creating build/lib.macosx-10.14-x86_64-3.7/stardist/kernels
copying stardist/kernels/stardist2d.cl -> build/lib.macosx-10.14-x86_64-3.7/stardist/kernels
copying stardist/kernels/stardist3d.cl -> build/lib.macosx-10.14-x86_64-3.7/stardist/kernels
running build_ext
building 'stardist.lib.stardist2d' extension
Warning: Can't read registry to find the necessary compiler setting
Make sure that Python modules winreg, win32api or win32con are installed.
C compiler: clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/usr/include -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/System/Library/Frameworks/Tk.framework/Versions/8.5/Headers

creating build/temp.macosx-10.14-x86_64-3.7
creating build/temp.macosx-10.14-x86_64-3.7/stardist
creating build/temp.macosx-10.14-x86_64-3.7/stardist/lib
compile options: '-I/usr/local/lib/python3.7/site-packages/numpy/core/include -I/usr/local/include -I/usr/local/opt/openssl/include -I/usr/local/opt/sqlite/include -I/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/include/python3.7m -c'
extra options: '-std=c++11 -fopenmp'
clang: stardist/lib/clipper.cpp
clang: stardist/lib/stardist2d.cpp
clang: clang: error: unsupported option '-fopenmp'error: unsupported option '-fopenmp'

compiling with OpenMP support failed, re-trying without
building 'stardist.lib.stardist2d' extension
C compiler: clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/usr/include -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/System/Library/Frameworks/Tk.framework/Versions/8.5/Headers

compile options: '-I/usr/local/lib/python3.7/site-packages/numpy/core/include -I/usr/local/include -I/usr/local/opt/openssl/include -I/usr/local/opt/sqlite/include -I/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/include/python3.7m -c'
extra options: '-std=c++11'
clang: stardist/lib/stardist2d.cpp
clang: stardist/lib/clipper.cpp
In file included from stardist/lib/stardist2d.cpp:5:
In file included from /usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/arrayobject.h:4:
In file included from /usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
In file included from /usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1822:
/usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: 
      "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API
      NPY_1_7_API_VERSION" [-W#warnings]
#warning "Using deprecated NumPy API, disable it with " \
 ^
1 warning generated.
creating build/lib.macosx-10.14-x86_64-3.7/stardist/lib
clang++ -bundle -undefined dynamic_lookup build/temp.macosx-10.14-x86_64-3.7/stardist/lib/stardist2d.o build/temp.macosx-10.14-x86_64-3.7/stardist/lib/clipper.o -L/usr/local/lib -L/usr/local/opt/openssl/lib -L/usr/local/opt/sqlite/lib -o build/lib.macosx-10.14-x86_64-3.7/stardist/lib/stardist2d.cpython-37m-darwin.so
building 'stardist.lib.stardist3d' extension
C compiler: clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/usr/include -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/System/Library/Frameworks/Tk.framework/Versions/8.5/Headers

creating build/temp.macosx-10.14-x86_64-3.7/stardist/lib/qhull_src
creating build/temp.macosx-10.14-x86_64-3.7/stardist/lib/qhull_src/src
creating build/temp.macosx-10.14-x86_64-3.7/stardist/lib/qhull_src/src/libqhullcpp
creating build/temp.macosx-10.14-x86_64-3.7/stardist/lib/qhull_src/src/libqhull_r
compile options: '-I/usr/local/lib/python3.7/site-packages/numpy/core/include -Istardist/lib/qhull_src/src -I/usr/local/include -I/usr/local/opt/openssl/include -I/usr/local/opt/sqlite/include -I/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/include/python3.7m -c'
extra options: '-std=c++11 -fopenmp'
clang: stardist/lib/stardist3d.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/RoadError.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullVertex.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullRidge.cpp
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullPointSet.cpp
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullFacetSet.cpp
clang: stardist/lib/qhull_src/src/libqhull_r/user_r.c
clang: stardist/lib/qhull_src/src/libqhullcpp/Qhull.cpp
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
clang: stardist/lib/qhull_src/src/libqhull_r/random_r.c
clang: stardist/lib/qhull_src/src/libqhull_r/poly2_r.c
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
clang: stardist/lib/qhull_src/src/libqhull_r/libqhull_r.c
clang: stardist/lib/qhull_src/src/libqhull_r/geom_r.c
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
clang: error: unsupported option '-fopenmp'
compiling with OpenMP support failed, re-trying without
building 'stardist.lib.stardist3d' extension
C compiler: clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/usr/include -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/System/Library/Frameworks/Tk.framework/Versions/8.5/Headers

compile options: '-I/usr/local/lib/python3.7/site-packages/numpy/core/include -Istardist/lib/qhull_src/src -I/usr/local/include -I/usr/local/opt/openssl/include -I/usr/local/opt/sqlite/include -I/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/include/python3.7m -c'
extra options: '-std=c++11'
clang: stardist/lib/stardist3d.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/RoadError.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullVertex.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullRidge.cpp
In file included from stardist/lib/stardist3d.cpp:6:
In file included from /usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/arrayobject.h:4:
In file included from /usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
In file included from /usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1822:
/usr/local/lib/python3.7/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: 
      "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API
      NPY_1_7_API_VERSION" [-W#warnings]
#warning "Using deprecated NumPy API, disable it with " \
 ^
stardist/lib/stardist3d.cpp:949:24: warning: unused variable 'scores' [-Wunused-variable]
   const float * const scores = (float*) PyArray_DATA(arr_scores);
                       ^
clang: stardist/lib/qhull_src/src/libqhullcpp/RboxPoints.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullStat.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullQh.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullSet.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullPoints.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullVertexSet.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullPointSet.cpp
2 warnings generated.
clang: stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp
stardist/lib/qhull_src/src/libqhullcpp/usermem_r-cpp.cpp:42:13: warning: explicitly assigning value
      of variable of type 'int' to itself [-Wself-assign]
    exitcode= exitcode;
    ~~~~~~~~^ ~~~~~~~~
1 warning generated.
clang: stardist/lib/qhull_src/src/libqhullcpp/RoadLogEvent.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullFacetSet.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/Qhull.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullPoint.cpp
clang: stardist/lib/qhull_src/src/libqhull_r/user_r.c
error: invalid argument '-std=c++11' not allowed with 'C'
clang: stardist/lib/qhull_src/src/libqhull_r/random_r.c
error: invalid argument '-std=c++11' not allowed with 'C'
clang: stardist/lib/qhull_src/src/libqhull_r/poly2_r.c
error: invalid argument '-std=c++11' not allowed with 'C'
clang: stardist/lib/qhull_src/src/libqhull_r/libqhull_r.c
error: invalid argument '-std=c++11' not allowed with 'C'
clang: stardist/lib/qhull_src/src/libqhull_r/geom_r.c
error: invalid argument '-std=c++11' not allowed with 'C'
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullFacetList.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/PointCoordinates.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullHyperplane.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/QhullFacet.cpp
clang: stardist/lib/qhull_src/src/libqhullcpp/Coordinates.cpp
error: Command "clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/usr/include -I/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/System/Library/Frameworks/Tk.framework/Versions/8.5/Headers -I/usr/local/lib/python3.7/site-packages/numpy/core/include -Istardist/lib/qhull_src/src -I/usr/local/include -I/usr/local/opt/openssl/include -I/usr/local/opt/sqlite/include -I/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/include/python3.7m -c stardist/lib/qhull_src/src/libqhull_r/user_r.c -o build/temp.macosx-10.14-x86_64-3.7/stardist/lib/qhull_src/src/libqhull_r/user_r.o -std=c++11" failed with exit status 1

Qhull warning and errors when doing predictions

I got the following warning and errors during the prediction step. Any idea what this is about and how I should fix it? Many thanks!

Qhull output at end
Qhull precision warning: repartition point p254 from f103 as a outside point above a hidden facet f147 dist 0.00041 nearest vertices 0.2
convex hull of kernel intersection
QH6297 Qhull precision warning: in post-processing (qh_check_maxout) p117 for f306 is 2.3e-10 above hidden facet f833 nearest vertices 5.1e-05
Qhull precision warning: in post-processing (qh_check_maxout) p117 for f306 is 2.3e-10 above hidden facet f833 nearest vertices 5.1e-05
Qhull precision error (qh_check_maxout): large increase in qh.max_outside during post-processing dist 2.3e-10 (107.2x). See warning QH0032/QH0033. Disable with 'Q15' (allow_widemax) and 'Pp'
ERRONEOUS FACET:

  • f833
    • flags: bottom simplicial tested
    • normal: -0.8329 0.3488 0.4297
    • offset: 24.8334
    • center: 147.2531166051903 181.6815757347091 80.17435662391317
    • maxoutside: 2.343867e-10
    • vertices: p127(v191) p126(v155) p34(v69)
    • neighboring facets: f304 f831 f923
    • ridges:
    • r409 tested simplicialbot
      vertices: p126(v155) p34(v69)
      between f304 and f833
    • all ridges: r409 r645
    • r645 tested simplicialtop simplicialbot
      vertices: p127(v191) p34(v69)
      between f833 and f831

While executing: convex hull | qhull
Options selected for Qhull 2018.0.1.r 2018/12/28:
run-id 1740914251 _pre-merge _zero-centrum _max-width 46
Error-roundoff 2.7e-13 _one-merge 1.9e-12 _near-inside 9.6e-12
Visible-distance 5.5e-13 U-max-coplanar 5.5e-13 Width-outside 1.1e-12
_wide-facet 3.3e-12
Last point added to hull was p38. Last merge was #361.

At error exit:

Convex hull of 242 points in 3-d:

Number of vertices: 242
Number of facets: 124
Number of non-simplicial facets: 105

Statistics for: convex hull | qhull

Number of points processed: 242
Number of hyperplanes created: 676
Number of distance tests for qhull: 4560
Number of distance tests for merging: 7880
Number of distance tests for checking: 7360
Number of merged facets: 361
Maximum distance of merged point above facet: 2.3e-10 (107.4x)
Maximum distance of merged vertex below facet: -8.3e-13 (0.4x)

Convert StarDist model to use DeepImageJ

Hi @uschmidt83 , Hi @maweigert,

@lacan and I started using DeepImageJ and we encountered a first issue while trying to export a StarDist model.
Fortunately, @esgomezm helped us and made a notebook to convert the StarDist model, and it works like a charm (see below)!

[image]

One issue now is that the grid parameter has to be set to (1,1); otherwise the model exported to be DeepImageJ compatible has a downsampled output, which creates some weird results (see below).

[image]

In the training notebook, the grid parameter is set to (2,2), so many people might encounter this problem.

If the grid parameter is set to (2,2), would it be possible to have the output scaled back to the original size, so the model would be fully compatible with DeepImageJ?
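In the meantime, this is roughly how we set the grid for the export; a sketch using what I believe are the current Config2D parameter names:

from stardist.models import Config2D, StarDist2D

# grid=(1,1) keeps the network output at full resolution (the training notebooks default to (2,2))
conf = Config2D(n_rays=32, grid=(1, 1), n_channel_in=1)
model = StarDist2D(conf, name='stardist_for_deepimagej', basedir='models')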

Thanks for your input on this,

Best,

Romain

Export imageJ ROIs for 3D polyhedra

Hi!

I'm a complete beginner at Python but find the software really exciting, so I have jumped in at the deep end! I'm still familiarising myself with the pipeline in 3D using the demo 3D model; I have run through the Jupyter notebooks and I think it is all working.

My aim is to be able to export the label maps and visualise them in ImageJ or something similar, and to use the 3-dimensional ROIs to calculate values for nuclear volume, shape, and other things. I can see that you've made a notebook to export the ROIs from the 2D model prediction into FIJI, which works great for me. I tried to adapt this for 3D by just setting the axes to zyx and using 'dist' instead of 'coord' (in export_imagej_rois('img_rois.zip', details['dist'])). I have found that this doesn't really work, but I'm not sure why.

I also tried to export the demo 3D model into DeepImageJ, but it runs into trouble when you try to apply it to an image. I can give more details, but as I am quite lost I'm not sure what the relevant information is.

I would appreciate any help,
Best wishes from a coding newbie, always in awe

Michael Schwimmer
