About itk-feedstock

Feedstock license: BSD-3-Clause

Home: https://www.itk.org/

Package license: Apache-2.0

Summary: ITK is an open-source toolkit for multidimensional image analysis

Development: https://github.com/InsightSoftwareConsortium/ITK

Documentation: https://itk.org/ITK/help/documentation.html

Current build status

Azure
Variant	Status
linux_64_numpy1.21python3.10.____cpython variant
linux_64_numpy1.21python3.8.____cpython variant
linux_64_numpy1.21python3.9.____cpython variant
linux_64_numpy1.23python3.11.____cpython variant
osx_64_numpy1.21python3.10.____cpython variant
osx_64_numpy1.21python3.8.____cpython variant
osx_64_numpy1.21python3.9.____cpython variant
osx_64_numpy1.23python3.11.____cpython variant
win_64_numpy1.21python3.10.____cpython variant
win_64_numpy1.21python3.8.____cpython variant
win_64_numpy1.21python3.9.____cpython variant
win_64_numpy1.23python3.11.____cpython variant

Current release info

Name Downloads Version Platforms
Conda Recipe Conda Downloads Conda Version Conda Platforms

Installing itk

Installing itk from the conda-forge channel can be achieved by adding conda-forge to your channels with:

conda config --add channels conda-forge
conda config --set channel_priority strict

Once the conda-forge channel has been enabled, itk can be installed with conda:

conda install itk

or with mamba:

mamba install itk

It is possible to list all of the versions of itk available on your platform with conda:

conda search itk --channel conda-forge

or with mamba:

mamba search itk --channel conda-forge

Alternatively, mamba repoquery may provide more information:

# Search all versions available on your platform:
mamba repoquery search itk --channel conda-forge

# List packages depending on `itk`:
mamba repoquery whoneeds itk --channel conda-forge

# List dependencies of `itk`:
mamba repoquery depends itk --channel conda-forge

About conda-forge

Powered by NumFOCUS

conda-forge is a community-led conda channel of installable packages. In order to provide high-quality builds, the process has been automated into the conda-forge GitHub organization. The conda-forge organization contains one repository for each of the installable packages. Such a repository is known as a feedstock.

A feedstock is made up of a conda recipe (the instructions on what and how to build the package) and the necessary configurations for automatic building using freely available continuous integration services. Thanks to the awesome service provided by Azure, GitHub, CircleCI, AppVeyor, Drone, and TravisCI, it is possible to build and upload installable packages to the conda-forge Anaconda-Cloud channel for Linux, Windows, and macOS.

To manage the continuous integration and simplify feedstock maintenance, conda-smithy has been developed. Using the conda-forge.yml within this repository, it is possible to re-render all of this feedstock's supporting files (e.g. the CI configuration files) with conda smithy rerender.

For more information please check the conda-forge documentation.

Terminology

feedstock - the conda recipe (raw material), supporting scripts and CI configuration.

conda-smithy - the tool which helps orchestrate the feedstock. Its primary uses are constructing the CI .yml files and simplifying the management of many feedstocks.

conda-forge - the place where the feedstock and smithy live and work to produce the finished article (built conda distributions).

Updating itk-feedstock

If you would like to improve the itk recipe or build a new package version, please fork this repository and submit a PR. Upon submission, your changes will be run on the appropriate platforms to give the reviewer an opportunity to confirm that the changes result in a successful build. Once merged, the recipe will be re-built and uploaded automatically to the conda-forge channel, whereupon the built conda packages will be available for everybody to install and use from the conda-forge channel. Note that all branches in the conda-forge/itk-feedstock are immediately built and any created packages are uploaded, so PRs should be based on branches in forks and branches in the main repository should only be used to build distinct package versions.

In order to produce a uniquely identifiable distribution:

  • If the version of a package is not being increased, please add or increase the build/number.
  • If the version of a package is being increased, please remember to return the build/number back to 0.
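As a minimal sketch of where these fields live (the version string below is illustrative, not the current package version), a conda-build meta.yaml carries them as:

```yaml
package:
  name: itk
  version: "5.3.0"   # illustrative version, not necessarily current

build:
  number: 0          # increase when rebuilding the same version; reset to 0 on a version bump
```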

Feedstock Maintainers

itk-feedstock's People

Contributors

beckermr, brey, conda-forge-admin, conda-forge-curator[bot], ericdill, mariusvniekerk, ocefpaf, thewtex


itk-feedstock's Issues

HDF5 conflicts

Issue:

This conda recipe uses the "internal" HDF5 bundled with ITK, as opposed to relying on the conda hdf5 package. This creates trouble for me, and it is in any case a duplication. Is there a good reason for this? If so, any suggestions for how I can avoid conflicts?

More detail:
I'm maintaining a conda feedstock for STIR. STIR can use HDF5, and ITK can as well. ITK can build its own version of HDF5, but when we build ITK ourselves we set ITK_USE_SYSTEM_HDF5:BOOL=ON to avoid duplication/conflicts; I think it normally works fine without it as well. However, I am updating our feedstock so that it depends on both hdf5 and libitk-devel in conda-forge/stir-feedstock#33 (I also tried to add OSX support, hence the name of the PR, but ignore that here).
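For reference, when building ITK from source outside conda-build, the configure step mentioned above might look like the following sketch (the source path and the use of $CONDA_PREFIX are illustrative assumptions, not taken from this feedstock):

```shell
# Configure ITK to link against an external (e.g. conda-provided) HDF5
# rather than its vendored copy; "../ITK" and "$CONDA_PREFIX" are
# illustrative paths.
cmake -DITK_USE_SYSTEM_HDF5:BOOL=ON \
      -DCMAKE_PREFIX_PATH="$CONDA_PREFIX" \
      ../ITK
```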

This works fine for Linux but the Windows builds fail with errors like this:

exe /C "cd . && %BUILD_PREFIX%\Library\bin\cmake.exe -E vs_link_exe --intdir=src\utilities\CMakeFiles\conv_gipl_to_interfile.dir --rc=C:\PROGRA~2\WI3CF2~1\10\bin\100177~1.0\x64\rc.exe --mt=C:\PROGRA~2\WI3CF2~1\10\bin\100177~1.0\x64\mt.exe --manifests  -- C:\PROGRA~2\MICROS~1\2017\ENTERP~1\VC\Tools\MSVC\1416~1.270\bin\Hostx64\x64\link.exe /nologo @CMakeFiles\conv_gipl_to_interfile.rsp  /out:src\utilities\conv_gipl_to_interfile.exe /implib:src\utilities\conv_gipl_to_interfile.lib /pdb:src\utilities\conv_gipl_to_interfile.pdb /version:0.0 /machine:x64   /INCREMENTAL:NO /subsystem:console  && cd ."
LINK: command "C:\PROGRA~2\MICROS~1\2017\ENTERP~1\VC\Tools\MSVC\1416~1.270\bin\Hostx64\x64\link.exe /nologo @CMakeFiles\conv_gipl_to_interfile.rsp /out:src\utilities\conv_gipl_to_interfile.exe /implib:src\utilities\conv_gipl_to_interfile.lib /pdb:src\utilities\conv_gipl_to_interfile.pdb /version:0.0 /machine:x64 /INCREMENTAL:NO /subsystem:console /MANIFEST /MANIFESTFILE:src\utilities\conv_gipl_to_interfile.exe.manifest" failed (exit code 1169) with the following output:
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: __cdecl H5::DataSpace::DataSpace(int,unsigned __int64 const *,unsigned __int64 const *)" (??0DataSpace@H5@@QEAA@HPEB_K0@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: __cdecl H5::DataSpace::DataSpace(enum H5S_class_t)" (??0DataSpace@H5@@QEAA@W4H5S_class_t@@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: virtual __cdecl H5::DataSpace::~DataSpace(void)" (??1DataSpace@H5@@UEAA@XZ) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: class H5::DataSpace & __cdecl H5::DataSpace::operator=(class H5::DataSpace const &)" (??4DataSpace@H5@@QEAAAEAV01@AEBV01@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: int __cdecl H5::DataSpace::getSimpleExtentDims(unsigned __int64 *,unsigned __int64 *)const " (?getSimpleExtentDims@DataSpace@H5@@QEBAHPEA_K0@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: int __cdecl H5::DataSpace::getSimpleExtentNdims(void)const " (?getSimpleExtentNdims@DataSpace@H5@@QEBAHXZ) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSpace.cpp.obj) : error LNK2005: "public: void __cdecl H5::DataSpace::selectHyperslab(enum H5S_seloper_t,unsigned __int64 const *,unsigned __int64 const *,unsigned __int64 const *,unsigned __int64 const *)const " (?selectHyperslab@DataSpace@H5@@QEBAXW4H5S_seloper_t@@PEB_K111@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5IdComponent.cpp.obj) : error LNK2005: "public: class H5::IdComponent & __cdecl H5::IdComponent::operator=(class H5::IdComponent const &)" (??4IdComponent@H5@@QEAAAEAV01@AEBV01@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5Location.cpp.obj) : error LNK2005: "public: class H5::DataSet __cdecl H5::H5Location::openDataSet(class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > const &,class H5::DSetAccPropList const &)const " (?openDataSet@H5Location@H5@@QEBA?AVDataSet@2@AEBV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@AEBVDSetAccPropList@2@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5Location.cpp.obj) : error LNK2005: "public: class H5::DataSet __cdecl H5::H5Location::openDataSet(char const *,class H5::DSetAccPropList const &)const " (?openDataSet@H5Location@H5@@QEBA?AVDataSet@2@PEBDAEBVDSetAccPropList@2@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSet.cpp.obj) : error LNK2005: "public: __cdecl H5::DataSet::DataSet(void)" (??0DataSet@H5@@QEAA@XZ) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSet.cpp.obj) : error LNK2005: "public: virtual __cdecl H5::DataSet::~DataSet(void)" (??1DataSet@H5@@UEAA@XZ) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSet.cpp.obj) : error LNK2005: "public: void __cdecl H5::DataSet::read(class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > &,class H5::DataType const &,class H5::DataSpace const &,class H5::DataSpace const &,class H5::DSetMemXferPropList const &)const " (?read@DataSet@H5@@QEBAXAEAV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@AEBVDataType@2@AEBVDataSpace@2@2AEBVDSetMemXferPropList@2@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5DataSet.cpp.obj) : error LNK2005: "public: void __cdecl H5::DataSet::read(void *,class H5::DataType const &,class H5::DataSpace const &,class H5::DataSpace const &,class H5::DSetMemXferPropList const &)const " (?read@DataSet@H5@@QEBAXPEAXAEBVDataType@2@AEBVDataSpace@2@2AEBVDSetMemXferPropList@2@@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5StrType.cpp.obj) : error LNK2005: "public: __cdecl H5::StrType::StrType(int,unsigned __int64 const &)" (??0StrType@H5@@QEAA@HAEB_K@Z) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
libitkhdf5_cpp.lib(H5StrType.cpp.obj) : error LNK2005: "public: virtual __cdecl H5::StrType::~StrType(void)" (??1StrType@H5@@UEAA@XZ) already defined in hdf5_cpp.lib(hdf5_cpp.dll)
src\utilities\conv_gipl_to_interfile.exe : fatal error LNK1169: one or more multiply defined symbols found

See https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=314472&view=logs&j=171a126d-c574-5c8c-1269-ff3b989e923d&t=1183ba29-a0b5-5324-8463-2a49ace9e213&l=9566

Python 3.11

Comment:

Hi. Thanks for your efforts on this one. Could you please update it for Python 3.11? PyPI already has the required wheels. Thank you!

HDF5 License violation

Note that HDF5 license requires the copyright text be distributed.
Other dependencies might need their licenses to be distributed as well.
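One conventional way to address this, sketched here under the assumption of a standard conda-build recipe (the bundled-HDF5 path below is illustrative, not verified against the ITK source tree), is to list each bundled license text under about/license_file in meta.yaml, which accepts a list:

```yaml
about:
  license: Apache-2.0
  license_file:
    - LICENSE                          # ITK's own Apache-2.0 text
    - path/to/bundled/hdf5/COPYING     # vendored HDF5 copyright text (path illustrative)
```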

Shared Libraries Missing?

Hello all! I wasn't sure whether to post this as a PR or as an issue.

ITK is a dependency of a project I maintain, and we are trying to make the switch to distributing it on conda-forge. Unfortunately, there seem to be some incompatibilities with the package in its current state, as we rely on the C++ shared libraries rather than the Python bindings.

The two big issues that I am having right now are:

  • Most of the standard libITKfoo.so / .dylib binaries are missing from the lib folder.
  • The ITKReview module is not included within the package taken from PyPI.

I'm wondering if there is any interest in making the ITK feedstock compatible with general code projects? We have the conda-build scripts already written to generate a standard build: https://github.com/vmtk/conda-recipes/tree/master/itk. Only the option to include python bindings is missing right now.

I wanted to reach out and get your input before going ahead and submitting a PR. I'd really appreciate any feedback you might have: is this something of interest, or are there technical reasons that this has not been implemented yet?

Thank you!
