Comments (6)
Hi, @tomvars. Thanks for the proposal. I think this is an excellent idea.
One potential solution would be leveraging the PyTorch Hub tools. I got this code working. What do you think?
```python
import torch
import torchio as tio

fpg = tio.datasets.FPG()
fpg.plot(reorient=False)

# Fetch the preprocessing transform from a branch of the resseg repo
repo = 'fepegar/resseg:add-preprocessing-hubconf'
function_name = 'get_preprocessing_transform'
input_path = fpg.t1.path
preprocess = torch.hub.load(
    repo,
    function_name,
    input_path,
    image_name='t1',
    force_reload=True,
)
preprocessed = preprocess(fpg)
preprocessed.plot(reorient=False)
```
I really like this API! You could maybe create a new repo like `fepegar/torchiohub:main` with a single `hubconf.py` file as the access point to different preprocessing functions. In the repo, users could append their transform functions to a large `transforms.py` file, and the `hubconf.py` would have lines such as `from transforms import ronneberger_unet_2015_transform`.
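A minimal sketch of that layout, assuming hypothetical file and function names (the composed pipeline in the comment is illustrative, not taken from any paper; the factory body is a stand-in so the sketch runs without TorchIO installed):

```python
# transforms.py (hypothetical) -- contributors append their preprocessing
# factories here. Each factory returns a callable transform; in a real repo
# the body would compose torchio transforms, e.g.
# tio.Compose([tio.ToCanonical(), tio.Resample(1), tio.ZNormalization()]).
def ronneberger_unet_2015_transform():
    def transform(subject):
        return subject  # stand-in for the composed preprocessing pipeline
    return transform

# hubconf.py (hypothetical) -- the single entry point torch.hub reads:
# dependencies = ['torchio']
# from transforms import ronneberger_unet_2015_transform
```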
You mean something like this?

```python
@classmethod
def from_hub(cls, *args, **kwargs):
    return torch.hub.load(*args, **kwargs)
```
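For context, a self-contained sketch of how such a class method would forward its arguments. Here `fake_hub_load` stands in for `torch.hub.load` so the example runs without network access; the repo and entry-point names come from the snippet earlier in the thread, and the path is a placeholder:

```python
def fake_hub_load(repo, entry_point, *args, **kwargs):
    # Stand-in for torch.hub.load: records what it was called with
    return (repo, entry_point, args, kwargs)

class Transform:
    _hub_load = staticmethod(fake_hub_load)  # torch.hub.load in practice

    @classmethod
    def from_hub(cls, *args, **kwargs):
        # Forward everything unchanged to the hub loader
        return cls._hub_load(*args, **kwargs)

result = Transform.from_hub(
    'fepegar/resseg:add-preprocessing-hubconf',
    'get_preprocessing_transform',
    '/path/to/t1.nii.gz',  # placeholder path
    image_name='t1',
)
```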
I think it's more convenient to allow users to keep their own `hubconf.py` in their own repos, because:
- This is what PyTorch does, so people are familiar with the syntax etc.
- Sometimes, getting a transform needs some special code. The snippet I shared is an example in which additional libraries or files might be needed just to compute the transform, and we wouldn't want to put everyone's code in the same repo.
So the contribution to this library (which I'm happy to write) would be documentation on how to set up transforms for reproducibility on top of PyTorch Hub. Does that sound good?
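As a sketch of what that documentation might show, a minimal repo-local `hubconf.py` could look like this. Only the module-level `dependencies` list and entry-point convention are real `torch.hub` API; the function body and names are hypothetical:

```python
# hubconf.py (hypothetical) at the root of a model repository.
# torch.hub reads the `dependencies` list and exposes the module-level
# functions as entry points for torch.hub.load.
dependencies = ['torchio']

def get_preprocessing_transform(input_path, image_name='t1'):
    # Repo-specific "special code" can live here: fetching landmark
    # files, reading metadata from input_path, importing extra
    # libraries, etc. TorchIO is imported lazily, only when called.
    import torchio as tio
    return tio.Compose([
        tio.ToCanonical(),
        tio.ZNormalization(),
    ])
```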
That makes sense 👍 Thoughts on introducing a class method for `Transform` called `from_hub`, which would wrap the `torch.hub.load` call and pass in the relevant arguments?
Hey, I forgot to share some experiments I conducted. The code below needs `unet` to be pip-installed:
```python
import torch
import torchio as tio

colin = tio.datasets.Colin27()
path = colin.t1.path

repo = 'fepegar/resseg:add-preprocessing-hubconf'
# Without specifying which image to use:
torch.hub.load(repo, 'get_preprocessing_transform', path)
# Specifying the image name explicitly:
transform = torch.hub.load(repo, 'get_preprocessing_transform', path, image_name='t1')
transform(colin).plot()
```
Here, `HistogramStandardization` makes it a bit awkward, but things work. We should write a tutorial about this. If you think the class method would be helpful, feel free to contribute with a PR!
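The awkward part is that `HistogramStandardization` needs per-image intensity landmarks learned from a training cohort, so a hub entry point has to ship or fetch them alongside the code. A hedged sketch with made-up landmark values; the real TorchIO calls appear in the comments:

```python
import numpy as np

# Landmarks would normally be trained once, e.g. with
# tio.HistogramStandardization.train(...), and bundled with the repo
# as an .npy file. The values below are illustrative only.
landmarks = np.linspace(0.0, 100.0, num=13)  # count is illustrative
landmarks_dict = {'t1': landmarks}
# transform = tio.HistogramStandardization(landmarks_dict)
```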