junma11 / seglossodyssey
A collection of loss functions for medical image segmentation
License: Apache License 2.0
As titled: the PyTorch version is 1.11 and the inputs are all tensors, but the output raises the error above.
The problem is solved now; sorry for the trouble.
Hey! First of all, thanks for this really nice summary of all the different losses!!! That's amazing... Since you are kind of an expert with all these losses just one question:
Which loss would you suggest for instance segmentation when the given dataset contains negative samples (i.e. no instance in the scene)? In several works, these samples are filtered out and not used in the loss; however, in my current case, this is not possible. Dice loss? However, it cannot handle negative samples: since the GT mask is empty, there is no intersection no matter what is predicted.
Would be nice to read your opinion on that!
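One common workaround (a sketch under my own assumptions, not code from this repo) is to add a smoothing constant to both the numerator and the denominator of the soft Dice score, so that an empty prediction on an empty ground truth yields a loss near zero instead of a degenerate 0/0:

```python
import torch

def soft_dice_loss(pred, target, smooth=1.0):
    # pred, target: (N, *) tensors, pred in [0, 1]
    pred = pred.reshape(pred.shape[0], -1)
    target = target.reshape(target.shape[0], -1)
    intersection = (pred * target).sum(dim=1)
    # smooth appears in numerator AND denominator: if both pred and
    # target are all-zero (a negative sample), dice = smooth / smooth = 1
    # and the loss is 0, while a false-positive prediction is penalized.
    dice = (2 * intersection + smooth) / (pred.sum(dim=1) + target.sum(dim=1) + smooth)
    return 1 - dice.mean()
```

With this form, negative samples contribute a meaningful gradient that pushes spurious predictions toward zero rather than being undefined.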
I wrote a doctest for the erosion-based Hausdorff loss and found it is not differentiable:
Examples:
>>> image_preds = torch.tensor([[
... [0, 0.1, 0.1, 0],
... [0, 0.3, 0.6, 0],
... [0, 0.4, 0.8, 0],
... [0, 0, 0, 0],
... ]], requires_grad=True)[None]
>>> image_gt = torch.tensor([[
... [0, 1, 0, 0],
... [1, 1, 1, 0],
... [1, 1, 1, 0],
... [0, 0, 0, 0],
... ]])[None]
>>> hdloss = HausdorffERLoss(from_logits=False)
>>> hdloss(image_preds, image_gt)
tensor(0.0625)
>>> hdloss(image_preds, image_gt).backward()
I would suspect something is missing from your implementation.
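For context, an implementation that rasterizes through numpy/scipy (as erosion-based variants often do) detaches the result from autograd. A quick way to check this, using a hypothetical helper rather than repo code:

```python
import torch

def check_differentiable(loss_value: torch.Tensor) -> bool:
    # A tensor that went through a numpy round-trip has no grad_fn,
    # so calling backward() on it cannot reach the model parameters.
    return loss_value.requires_grad and loss_value.grad_fn is not None

x = torch.rand(4, requires_grad=True)
attached = x.sum()                                 # stays in the autograd graph
detached = torch.tensor(x.detach().numpy().sum())  # numpy round-trip breaks the graph
```

If hdloss(...) returns a tensor with grad_fn None, the erosion step is almost certainly running outside autograd.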
Hi @JunMa11 ,
I greatly appreciate the effort you've put into creating a library of loss functions for the community. I'm reaching out to see if you might consider adding a loss function that we've developed, the FCM loss, which is specifically designed for both unsupervised and semi-supervised registration tasks.
Paper: Medical Physics
Code: FCM Loss
Thank you!
Junyu
I am interested in this challenge, but I got nothing from the organizer. It would be greatly helpful if you could provide the dataset. My email is [email protected].
Thank you for your work. Could you please provide a PyTorch implementation of PolyLoss for image segmentation? Thank you.
Hello,
First of all, thanks for your repository! I would like to use some of the loss functions that are in it in my project, and this is why I'm opening this issue to ask you to add a license to your repository.
As explained in GitHub's help site on license choosing, not providing a license for your repository means that "the work is under exclusive copyright by default. Unless you include a license that specifies otherwise, nobody else can copy, distribute, or modify your work without being at risk of take-downs, shake-downs, or litigation."
As per the way the README of this repository is written, I'm assuming that the lack of a license is more of an oversight than a wish to restrict the use of this repository; though if I'm wrong, and you actually do not want to add a license, could you state so in the README?
Thank you for your time and for considering this request!
Hi jun,
I want to ask about IoULoss: in def forward(self, x, y, loss_mask=None), how can the output be negative? And if it is not meant to be negative, why do you use return -iou?
Thanks
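For what it's worth, minimizing the negative IoU is equivalent to minimizing 1 - IoU: the gradients are identical and only a constant offset differs. A minimal sketch with my own soft-IoU, not the repo's exact implementation:

```python
import torch

def soft_iou(pred, target, eps=1e-7):
    # Differentiable IoU for probabilistic predictions in [0, 1]
    inter = (pred * target).sum()
    union = pred.sum() + target.sum() - inter
    return (inter + eps) / (union + eps)

pred = torch.tensor([0.9, 0.1, 0.8])
target = torch.tensor([1.0, 0.0, 1.0])

# Both losses have the same gradients; they differ by the constant 1.
neg_loss = -soft_iou(pred, target)       # in [-1, 0]
one_minus = 1 - soft_iou(pred, target)   # in [0, 1]
```

So the returned value is negative by design: the optimizer minimizes -IoU, which maximizes IoU.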
Hi, Jun, thank you very much for your valuable work.
When you performed multi-organ segmentation with nnU-Net, you observed that the combination of Dice loss and Focal loss achieved the best DSC. Can you share the parameters you used in the Focal loss, such as alpha, gamma, and the learning rate?
Many thanks, waiting for your reply.
Hi,
thanks for this great collection of segmentation losses.
But I wonder: what are the differences between losses_pytorch
(https://github.com/JunMa11/SegLoss/tree/master/losses_pytorch) and test/loss_functions
(https://github.com/JunMa11/SegLoss/tree/master/test/loss_functions)?
And which one would you recommend for custom tests?
I saw that test/loss_functions has dependencies on nnU-Net, while the others don't.
Thank you
The boundary loss and the sample for computing the distance map (input to boundary loss) both mentioned, that there are for binary masks.
How would I use the Boundary loss in a multi class scenario?
How would I compute the sdf map for multi class?
Thank you!
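One common recipe (a sketch under my own assumptions, not code from this repo) is to treat each class as its own binary mask and stack per-class signed distance maps, reusing the binary formula channel by channel:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def multiclass_sdf(label_map, num_classes):
    """Per-class signed distance maps from an integer label map.

    Hypothetical extension of the binary recipe: each class becomes its
    own binary mask, and the signed distance is positive outside the
    class region and negative inside it."""
    sdfs = np.zeros((num_classes,) + label_map.shape, dtype=np.float32)
    for c in range(num_classes):
        mask = label_map == c
        if mask.any() and not mask.all():
            # distance to the class from outside, minus distance to the
            # background from inside, gives the usual signed convention
            sdfs[c] = distance_transform_edt(~mask) - distance_transform_edt(mask)
    return sdfs
```

The boundary loss is then applied per channel against the softmax probability of the matching class and summed or averaged over classes.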
I'm sorry to bother you. In SegLoss/losses_pytorch/hausdorff.py, I found that the function distance_field uses torch.no_grad, which means this function produces no gradient. Will this have an impact on training? I don't know much about the effect; could you explain it to me?
Thanks.
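As I understand it (a sketch, not the repo's code), the distance field is computed from the ground truth, so it acts as a constant weight map; wrapping it in torch.no_grad only skips building a graph for it, while gradients still flow through the predictions:

```python
import torch

pred = torch.rand(1, 1, 4, 4, requires_grad=True)

with torch.no_grad():
    # hypothetical stand-in for distance_field(gt): a fixed weight map
    # derived from the ground truth, so it never needs gradients
    dist = torch.rand(1, 1, 4, 4)

loss = (pred * dist).mean()  # dist enters the loss as a constant factor
loss.backward()
# pred.grad is populated: only the weight map is excluded from the
# graph, so training is unaffected as long as the map depends on the
# ground truth rather than on the network output.
```

It would only be a problem if the field were computed from the prediction and you expected gradients to flow through that computation.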
Hi. Could I use LossOverview.PNG as class material, with attribution?
The class is not public and is non-commercial.
Thank you.
Hey Jun. @JunMa11 I hope you are fine. I thought I'd ask you since you have already tried the dataset conversion from Fabian's repo. I downloaded the cardiac data from ACDC (they have a training set), but when I ran the code for ACDC data preparation from his repo, it throws an error. I can't use the Hackathon data, as it has only 30 cases. Can you please clarify whether this is the right process? I just need to convert the data and strip the 4D down to 3D.
Hello, I have a doubt about the Sensitivity-Specificity loss function's implementation.
I think the correct way to translate Sensitivity into prediction and ground_truth terms for a loss function would be similar to:
true_positives = prediction*ground_truth
false_negatives = (1-prediction)*ground_truth
But in the implementation:
Sensitivity = (square(ground_truth-prediction) * (1-one_hot) ) / (1-one_hot)
How should I interpret these expressions? (The same occurs with the Specificity.)
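For comparison, the textbook soft confusion-matrix terms look like this (my own reading of the standard formulation, not the repo's code). Note that the published Sensitivity-Specificity loss instead uses a squared error weighted by the ground-truth mask, i.e. a smooth error surrogate for 1 - sensitivity rather than the ratio itself:

```python
import torch

def soft_sens_spec(pred, gt, eps=1e-7):
    # Soft confusion-matrix terms for probabilistic predictions in [0, 1]
    tp = (pred * gt).sum()
    fn = ((1 - pred) * gt).sum()
    tn = ((1 - pred) * (1 - gt)).sum()
    fp = (pred * (1 - gt)).sum()
    sensitivity = tp / (tp + fn + eps)
    specificity = tn / (tn + fp + eps)
    return sensitivity, specificity
```

The squared-error form you quoted averages (gt - pred)^2 over the foreground (or background) voxels, which has the same optimum but smoother gradients than the ratio.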
Hi @JunMa11,
I would like to ask whether your prediction tensor in IoULoss and DiceLoss consists of only one class. For multi-class segmentation, can I use a softmax function before calculating the loss? I am using a different architecture than you, so I don't know your input shape.
thanks
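For what it's worth, a common pattern (a sketch under my own assumptions about shapes, not this repo's exact code) is to apply softmax over the channel dimension and one-hot encode the targets before a per-class soft Dice:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(2, 3, 8, 8)          # (batch, classes, H, W) raw network output
target = torch.randint(0, 3, (2, 8, 8))   # integer class labels

probs = F.softmax(logits, dim=1)          # per-pixel class probabilities
one_hot = F.one_hot(target, num_classes=3).permute(0, 3, 1, 2).float()

# per-class soft Dice over batch and spatial dims, then averaged over classes
dims = (0, 2, 3)
inter = (probs * one_hot).sum(dims)
dice = (2 * inter + 1.0) / (probs.sum(dims) + one_hot.sum(dims) + 1.0)
loss = 1 - dice.mean()
```

For a single foreground class, a sigmoid on a one-channel output plays the same role as the softmax here.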
Hi, @JunMa11,
I tried reimplementing the liver segmentation using Dice loss. Everything goes well, though the performance is a little lower than the results in your paper. I use SoftDiceLoss and the LiTS dataset, with the same splits as provided. When I evaluate the liver segmentation, I combine the liver and liver tumor into one channel. The average Dice result is 94.51.
Do you have any idea about that or suggestions?
Best,
Xi
Hi, I'd like to ask about Focal Loss with multiple classes. For example, I have four categories: background, C1, C2, C3. If I set alpha to [0.25, 1, 1, 0.5], does that mean my background weight is 0.25 and the three foreground classes are weighted 1, 1, 0.5, so that C1 and C2 have the same weight and are twice as heavy as C3?
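In the common multi-class reading (a sketch of one typical implementation, not necessarily the one this repo uses), alpha is a per-class weight vector gathered by the target index, and the weights are relative rather than probabilities:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, target, alpha, gamma=2.0):
    """Multi-class focal loss with a per-class alpha vector.

    One common reading: alpha[c] scales every pixel whose ground-truth
    class is c; check the specific implementation you use."""
    log_p = F.log_softmax(logits, dim=1)
    ce = F.nll_loss(log_p, target, reduction="none")  # per-pixel -log p_t
    p_t = torch.exp(-ce)
    alpha_t = alpha.to(logits.device)[target]         # per-pixel class weight
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()

# alpha = [0.25, 1, 1, 0.5]: background pixels are down-weighted 4x
# relative to C1/C2, and C3 gets half the weight of C1/C2. The weights
# are relative scales, so e.g. [0.5, 2, 2, 1] behaves identically up to
# a global learning-rate factor.
```

So yes, with that vector, C1 and C2 are weighted equally and twice as heavily as C3, and the background is down-weighted.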
Hi, Jun, I want to use DC+HD as the loss to train a multi-organ segmentation model, but I ran into two problems with this forward:
def forward(self, net_output, target):
    dc_loss = self.dc(net_output, target)
    hd_loss = self.hd(net_output, target)
    if self.aggregate == "sum":
        with torch.no_grad():
            alpha = hd_loss / (dc_loss + 1e-5)
        result = alpha * dc_loss + hd_loss
    else:
        raise NotImplementedError("nah son")
    return result
Many losses return -loss in the end, e.g. SoftDiceLoss or Tversky:
https://github.com/JunMa11/SegLoss/blob/131663994df22efb907e7ae897a39efe628518fc/losses_pytorch/dice_loss.py#L291
Why are the losses returned negative?
When training with nnU-Net using the DiceHD or DiceBD loss, it seems that most of the time GPU usage is around 0%, and only occasionally around 40%. One epoch takes about 30 minutes. Is this the normal situation?
I built my own dataset, which is similar to Multi-organ Abdominal CT, and preprocessed it according to the nnU-Net instructions. Then I copied https://github.com/JunMa11/SegLoss/blob/master/test/nnUNetV1/network_training/nnUNetTrainer_DiceBD.py and the loss functions into the nnunet folder, and replaced all nnUNetTrainer with nnUNetTrainerV2. Is this the right procedure?
Thank you very much!
Hi Jun @JunMa11
Thanks for taking the time to collect all the loss functions in one place. You are awesome! 😉
Can you please help me understand a few lines, one by one? It would be great if you could help.
Why does the axes tuple start from 2? And in the next lines, where the masks are applied, can you please explain the x_i * mask[:, 0] product over all rows and columns of the mask?
axes = tuple(range(2, len(net_output.size())))
......
tp = torch.stack(tuple(x_i * mask[:, 0] for x_i in torch.unbind(tp, dim=1)), dim=1)
if self.batch_dice:
    axes = [0] + list(range(2, len(shp_x)))
else:
    axes = list(range(2, len(shp_x)))
if self.apply_nonlin is not None:
    x = self.apply_nonlin(x)
Hi, @JunMa11,
Thanks for your great work. I have a question about data preprocessing. As you've provided the detailed splits for the datasets. Could you further provide the code for liver CT dataset processing, so we can compare our results with yours directly?
Best,
Xi
Hello,
Thanks for sharing the code. it helps me a lot. The paper (loss odyssey ) is also great for me to know more about loss in medical segmentation.
I have found that in your code, the resolution is not considered when computing the distance map. However, it is used here: https://github.com/LIVIAETS/boundary-loss.
Why is the resolution not used here? Looking forward to your reply. Thanks a lot.
Hi, thanks for your work!
Your code all imports from nnunet.xxx; should I install nnunet from https://github.com/MIC-DKFZ/nnUNet?
Why not provide standalone loss functions and test example code?
Hello,
What loss is worth trying for multi classification unbalanced data sets?
Hello
How are you?
Thanks for contributing to this project.
Did you look at this paper?
https://www.mdpi.com/2072-4292/13/3/454
I think you could easily implement this loss (Potential Energy Loss).
Thanks.
Hi JunMa,
Thanks alot for your code.
I am just wondering about the boundary loss: should the distance map be a signed distance map (negative inside and positive outside the boundary) or an unsigned distance map (positive both inside and outside)?
Thanks in advance.
Cheers.
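For reference, the two variants differ only in the sign inside the object; as I read the boundary-loss paper (Kervadec et al.), the signed form is the one intended, since the negative values inside the object are what reward correct foreground predictions. A small 1-D sketch:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

mask = np.array([0, 0, 1, 1, 1, 0, 0], dtype=bool)  # object occupies the middle

unsigned = distance_transform_edt(~mask)  # 0 inside the object, > 0 outside
# signed: distance to the object from outside, minus distance to the
# background from inside, so values are negative inside the object
signed = distance_transform_edt(~mask) - distance_transform_edt(mask)
```

With the unsigned map, predictions inside the object would contribute zero instead of a negative reward, which changes what the loss optimizes.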
I wonder whether loss functions with the same color in Figure 1 of the README have similar mathematical characteristics. I want to combine two loss functions with the same color; does that make sense?
Thanks for collecting and sharing this loss function work. There is a lot of torch.no_grad() used in the code. Will this cause the gradient to be truncated, making it impossible to update the weights?
Hi,
Thank you for these loss functions; they are very helpful. I am trying to run with nnUNetV2, but I can't seem to find nnUNetTrainerV2.py in this repository. Is there one available, or would I have to adapt nnUNetTrainer.py to suit nnUNetV2 instead of nnUNetV1?
Thank you in advance.
Could you please provide the code for the violin plot? Thanks!
Hi, thank you so much for collecting these papers.
I've noticed that the link for the following entry is wrong; it points to another paper called "Correlation Maximized Structural Similarity Loss for Semantic Segmentation":
201910 | Shuai Zhao | Region Mutual Information Loss for Semantic Segmentation
Hi, @JunMa11,
Your paper mentions that grid searching is used to determine the learning rates for different losses for a fair comparison. Could you provide the learning rates of different dice compound losses, (i.e. Dice, DiceCE, DiceBD, DiceHD, DiceTopk, DiceFocal), so that we can have an easier reimplementation?
Best,
Xi
When apply_nonlin=None, the calculation of GDiceLoss fails:
Traceback (most recent call last):
File "path/loser.py", line 181, in <module>
loss = criterion(prediction, label)
File "path/python3.8/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "path/SegLoss/losses_pytorch/dice_loss.py", line 123, in forward
intersection: torch.Tensor = w * einsum("bcxyz, bcxyz->bc", softmax_output, y_onehot)
UnboundLocalError: local variable 'softmax_output' referenced before assignment
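A likely fix (a sketch that guesses at the surrounding code, not a verified patch) is to fall back to the raw input when no nonlinearity is supplied, so the variable is always assigned before it is used:

```python
import torch

def apply_optional_nonlin(x, apply_nonlin=None):
    # If no nonlinearity is given, assume the input already holds
    # probabilities and use it directly. This guarantees the variable
    # is bound, avoiding the UnboundLocalError in the forward pass.
    softmax_output = apply_nonlin(x) if apply_nonlin is not None else x
    return softmax_output
```

The same guard would go right before the einsum line in GDiceLoss.forward.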
Hi,
Thanks for this nice repo! I'm looking for the loss from the paper "Topology-Preserving Deep Image Segmentation"; it seems to appear in your table, so I guess you've implemented it, but I could not find it. Would you mind pointing me to the correct loss function, please?
Thank you very much,
Bests
Hi Jun,
Thank you so much for sharing this with everyone. I'm coming from the nnU-Net repository after researching some of the issues posted.
I'm getting ready to use your repository but noticed the --ndet flag at the end of the suggested run command: python run/run_training.py 3d_fullres nnUNetTrainer_Dice TaskXX_MY_DATASET FOLD --ndet. I'm not familiar with what this flag does, and I can't seem to find any information about it anywhere. My best guess is that it enables non-deterministic training, but I think nnU-Net does this by default.
Any info is much appreciated. Take care!
Andres
I tried to use some modules in https://github.com/JunMa11/SegLoss/blob/master/losses_pytorch/dice_loss.py, but it seems np is used at line #21 without numpy being imported first.
I want to use the boundary loss with my dataset, but the conversion from masks to contours can be a pain. Is there a simpler way to do it?
Hello, it's an amazing repository, thank you for sharing your code.
I have two questions:
I am trying to use the DiceTopK loss function, and I am not sure I understand how to use its parameters correctly. What do these parameters mean? I would be very happy if you could explain them to me.
Also, I searched your code for a DiceFocal implementation and cannot find it. Can you please point me to a link?
Thank you very much,
Aviad
Hello, @JunMa11
I want to know whether focal loss is suitable for 3D image segmentation, and, for segmentation with only two classes (background and foreground), whether the network can have a single-channel output and still use focal loss.
Hope your help, thank you!