
Comments (1)

tomasfryda commented on June 3, 2024

@MoonCapture There is no parameter to do that, so you would have to implement it yourself.

I would approach this by calculating a linear approximation, similarly to what is done in Generalized DeepSHAP.

You will need to get the SHAP predictions with best_model_01.predict_contributions(df_test_normalized[0, :], background_frame=df_train_normalized, output_space=True, output_per_reference=True).

First, you have to ensure that the SHAP values are in the same space as the predictions (i.e. if the model uses a link function, you might have to apply the inverse link function to the SHAP values); this is what the parameter output_space=True does.

Then you will need the contribution to the change of prediction against every single point from background_frame; that's what output_per_reference is for.

Relevant part of the doc string:

:param output_space: If True, linearly scale the contributions so that they sum up to the prediction.
                     NOTE: This will result only in approximate SHAP values even if the model supports exact SHAP calculation.
                     NOTE: This will not have any effect if the estimator doesn't use a link function.
:param output_per_reference: If True, return baseline SHAP, i.e., contribution for each data point for each reference from the background_frame.
                             If False, return TreeSHAP if no background_frame is provided, or marginal SHAP if background frame is provided.
                             Can be used only with background_frame.
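
For concreteness, here is a minimal sketch of that call (assuming best_model_01 is a tree-based H2O model and df_train_normalized / df_test_normalized are H2OFrames; converting the result to pandas with as_data_frame() is just for convenience):

# Baseline SHAP for the first test row against every row of the background frame.
# output_space=True rescales contributions to the prediction space,
# output_per_reference=True gives one set of contributions per background row.
contribs = best_model_01.predict_contributions(
    df_test_normalized[0, :],
    background_frame=df_train_normalized,
    output_space=True,
    output_per_reference=True,
)
# Per the pseudocode below, the result should contain the contribution columns
# plus RowIdx, BackgroundRowIdx and Bias.
shap_pred = contribs.as_data_frame()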

Next you denormalize the SHAP values. This depends on how you normalize the data: if you can invert the normalization just by multiplication, then it's simple, just multiply all the values. If you need to use addition as well, I would apply that only to the Bias, after the multiplication. If the normalization procedure you use is more complicated, use eq. 3 from "Explaining a series of models by propagating Shapley values" (or you can check my implementation of simplified G-DeepSHAP in our StackedEnsembles; simplified because it is applied only to two layers (base models -> metalearner)).
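
For illustration, a sketch for the simple case where the target was standardized as y_norm = (y - y_mean) / y_std (y_mean, y_std and the exact column names are assumptions; shap_pred is the pandas frame from the sketch above):

feature_cols = [c for c in shap_pred.columns
                if c not in ("RowIdx", "BackgroundRowIdx", "Bias")]

denorm_shap_pred = shap_pred.copy()
# The multiplicative part of the inverse normalization applies to every contribution...
denorm_shap_pred[feature_cols] = denorm_shap_pred[feature_cols] * y_std
# ...while the additive part goes only to the Bias, after the multiplication.
denorm_shap_pred["Bias"] = denorm_shap_pred["Bias"] * y_std + y_mean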

Next, you should check that the Bias equals the denormalized prediction on the corresponding background frame point.
Pseudocode:

abs(denormalize(best_model_01.predict(background_frame[i, :])) - denorm_shap_pred[denorm_shap_pred["BackgroundRowIdx"]==i, "Bias"]) < 1e-6

Then you can also check that the row sums of denorm_shap_pred (excluding the RowIdx and BackgroundRowIdx columns) are roughly the same as the denormalized prediction (i.e. denormalized contributions + denormalized bias == denormalized prediction).
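
Continuing the sketch above (assuming a regression model whose prediction frame has a predict column, a 0-based BackgroundRowIdx, and the same y_mean / y_std; the tolerances are illustrative):

import numpy as np

# 1) Bias should equal the denormalized prediction on each background row.
bg_pred = best_model_01.predict(df_train_normalized).as_data_frame()["predict"]
bg_pred_denorm = bg_pred * y_std + y_mean
for i, bias in denorm_shap_pred.groupby("BackgroundRowIdx")["Bias"].first().items():
    assert abs(bg_pred_denorm.iloc[int(i)] - bias) < 1e-6

# 2) Contributions + Bias should sum (roughly) to the denormalized test prediction.
row_sums = denorm_shap_pred[feature_cols].sum(axis=1) + denorm_shap_pred["Bias"]
test_pred = best_model_01.predict(df_test_normalized[0, :]).as_data_frame()["predict"]
assert np.allclose(row_sums, float(test_pred.iloc[0]) * y_std + y_mean, atol=1e-3)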

Next, if you're confident that those values are close enough (depending on the model, the epsilon can range from 1e-6 up to 1e-3; XGBoost in our implementation uses floats for the prediction and doubles for the contributions, so there the epsilon will be closer to 1e-3), you take the average contribution across the background frame. Something like:

denorm_shap_pred.drop(columns="BackgroundRowIdx").groupby("RowIdx").mean()

And that should be the result you are looking for. It's not an exact SHAP value, since G-DeepSHAP gives only an approximation when there is some non-linearity, but at least you can compute it in a reasonable time.

from h2o-3.
