
Comments (4)

xingjunm commented on June 15, 2024

Hi firelgh,

  1. If the labels/mask are clean, try 1*CE + beta*RCE, where beta is selected so that the two loss terms have similar magnitudes. For segmentation, this is related to the sparsity of the mask. Sometimes the beta*RCE term can be 10x larger than the CE term, which may produce better performance.

  2. If the labels/mask are noisy, try around 0.1*CE + beta*RCE. The 0.1 on CE reduces the overfitting of CE to the noisy mask. The beta can be large, such as 6 or 10, to boost training.

  3. If the dataset is very sparse and has a convergence problem, then try a large alpha in [1, 10], for example 6*CE + beta*RCE. The beta can be determined similarly as above.

Note that the A parameter can also be tuned. The beta works together with A, and together they determine the magnitude of the RCE term. For segmentation, a mask usually has many dimensions, so try A = -1 or -2 instead of the -4 used for classification.
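The rule in point 1 (pick beta so the two terms have similar magnitudes) can be sketched numerically. This is an illustrative NumPy estimate, not code from the repository; the dummy data and variable names are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Dummy one-hot masks and softmax-like predictions (batch of 8, 3 classes).
y_true = np.eye(3)[rng.integers(0, 3, size=8)]
y_pred = rng.dirichlet(np.ones(3), size=8)

A = -4.0  # value standing in for log(0); exp(A) becomes the clip floor on y_true

ce = np.mean(-np.sum(y_true * np.log(np.clip(y_pred, 1e-7, 1.0)), axis=-1))
rce = np.mean(-np.sum(y_pred * np.log(np.clip(y_true, np.exp(A), 1.0)), axis=-1))

# Pick beta so beta * RCE has roughly the same magnitude as CE.
beta = ce / rce
print(f"CE={ce:.3f}  RCE={rce:.3f}  beta={beta:.3f}")
```

Starting from this beta and scaling it up (e.g. toward the 6-10 range mentioned above) is then a matter of validation performance.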

Hope this can help your experiments.

from symmetric_cross_entropy_for_noisy_labels.

firelgh commented on June 15, 2024

Thank you very much! It was very helpful to me.

However, I have a question about the parameter A. In the code, I can't find where the parameter A is defined.

In the code, I only find the alpha and the beta of the SCE loss.

```python
def symmetric_cross_entropy(alpha, beta):
    def loss(y_true, y_pred):
        y_true_1 = y_true
        y_pred_1 = y_pred

        y_true_2 = y_true
        y_pred_2 = y_pred

        y_pred_1 = tf.clip_by_value(y_pred_1, 1e-7, 1.0)
        y_true_2 = tf.clip_by_value(y_true_2, 1e-4, 1.0)

        return alpha * tf.reduce_mean(-tf.reduce_sum(y_true_1 * tf.log(y_pred_1), axis=-1)) \
             + beta * tf.reduce_mean(-tf.reduce_sum(y_pred_2 * tf.log(y_true_2), axis=-1))
    return loss
```
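For readers without a TensorFlow session handy, the same loss can be sanity-checked with a NumPy re-implementation. This is a sketch mirroring the clip thresholds in the snippet above, not code from the repository:

```python
import numpy as np

def symmetric_cross_entropy_np(alpha, beta, y_true, y_pred):
    """NumPy mirror of the TF loss above, using the same clip thresholds."""
    ce = np.mean(-np.sum(y_true * np.log(np.clip(y_pred, 1e-7, 1.0)), axis=-1))
    rce = np.mean(-np.sum(y_pred * np.log(np.clip(y_true, 1e-4, 1.0)), axis=-1))
    return alpha * ce + beta * rce

y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
y_good = np.array([[0.99, 0.01], [0.01, 0.99]])  # predictions match the labels
y_bad = np.array([[0.01, 0.99], [0.99, 0.01]])   # predictions contradict the labels

print(symmetric_cross_entropy_np(0.1, 6.0, y_true, y_good))  # small
print(symmetric_cross_entropy_np(0.1, 6.0, y_true, y_bad))   # much larger
```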


xingjunm commented on June 15, 2024

Hi firelgh,
The line `y_true_2 = tf.clip_by_value(y_true_2, 1e-4, 1.0)` is where A comes in.
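One common reading of the paper is that A is the finite value substituted for log(0) in the reverse term, so the clip floor on `y_true` corresponds to exp(A). Assuming that reading (the specific numbers below are just illustrations), the mapping can be checked directly:

```python
import math

# A = log(clip_floor)  <=>  clip_floor = exp(A)
for A in (-1, -2, -4):
    print(f"A={A}: clip y_true at exp(A) = {math.exp(A):.4f}")

# Under this reading, the 1e-4 floor in the snippet corresponds to
# A = log(1e-4), i.e. roughly -9.2.
print(math.log(1e-4))
```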


firelgh commented on June 15, 2024

Okay, I've got it. Thank you.

