
ShapML.jl's Introduction


ShapML

The purpose of ShapML is to compute stochastic, feature-level Shapley values, which can be used to (a) interpret and/or (b) assess the fairness of any machine learning model. Shapley values are an intuitive, theoretically sound, model-agnostic diagnostic for understanding both global feature importance across all instances in a data set and instance/row-level local feature importance in black-box machine learning models.

This package implements Štrumbelj and Kononenko's (2014) sampling-based Shapley approximation algorithm to compute the stochastic Shapley value for a given instance and model feature.

  • Flexibility:

    • Shapley values can be estimated for any machine learning model using a simple user-defined predict() wrapper function (see the sketch after this list).

  • Speed:

    • The speed advantage of ShapML comes from letting the user select one or more target features of interest rather than computing Shapley values for every model feature; a subset of target features from a trained model returns the same feature-level Shapley values as running the full model with all features. This is especially useful in high-dimensional models, as exact Shapley computation is exponential in the number of features.
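
A minimal sketch of the wrapper contract, assuming an MLJ model. The single y_pred output column matches the examples in the issues below; passing target_features as a vector of feature-name strings, and the names mach, explain, and reference, are illustrative assumptions:

using ShapML, DataFrames, MLJ

# The wrapper takes a fitted model and a DataFrame and returns a
# single-column DataFrame of numeric predictions.
function predict_function(model, data)
    return DataFrame(y_pred = MLJ.predict(model, data))
end

# Restrict the computation to a few target features of interest.
data_shap = ShapML.shap(explain = explain,
                        reference = reference,
                        model = mach,
                        predict_function = predict_function,
                        target_features = ["feature_1", "feature_2"],
                        sample_size = 60,
                        seed = 1)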

Install

using Pkg
Pkg.add("ShapML")

  • Development

using Pkg
Pkg.add(PackageSpec(url = "https://github.com/nredell/ShapML.jl"))

Documentation and Vignettes

ShapML.jl's People

Contributors

4sanalyticsnmodelling, ablaom, ethan-russell, nredell


ShapML.jl's Issues

Add additivity constraint

At present, for each instance, the sum of the Shapley values will not exactly equal the model prediction. Assuming that the reference population is the right population, the sum of the feature-level Shapley values converges, in the limit of Monte Carlo samples, to that instance's prediction less the baseline prediction. Due to the stochastic nature of the sampling, however, this will not hold exactly in practice.

I'll add an optional argument to shap() that adjusts the Shapley values to enforce the additivity constraint at the instance level.
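
A minimal sketch of one way such an adjustment could work; enforce_additivity is a hypothetical helper, not part of ShapML's API, and spreading the residual in proportion to each value's magnitude is one choice among several:

# Hypothetical helper: rescale one instance's Shapley values so they sum
# exactly to (prediction - baseline). Illustrative only.
function enforce_additivity(shap_values::Vector{Float64},
                            prediction::Float64,
                            baseline::Float64)
    gap = (prediction - baseline) - sum(shap_values)
    # Distribute the residual in proportion to each value's magnitude.
    weights = abs.(shap_values) ./ max(sum(abs.(shap_values)), eps())
    return shap_values .+ gap .* weights
end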

ShapML not working with @load RandomForestClassifier pkg = DecisionTree

Hi,

Does ShapML work with random forest classifiers? Thanks for your time.

using MLJ, ShapML

rfc = @load RandomForestClassifier pkg = DecisionTree

r = Int(round(nrow(X) / 3))
explain = copy(X[1:r, :])   # Compute Shapley feature-level predictions.
reference = copy(X)         # An optional reference population to compute the baseline prediction.
sample_size = 60            # Number of Monte Carlo samples for Shapley.

pipeRandomForestClassifier = @pipeline RandomForestClassifierPipe(
    selector = FeatureSelector(),
    hot = OneHotEncoder(),
    tree = RandomForestClassifier()) prediction_type = :probabilistic

cases = [[Symbol(names(X)[j]) for j in 1:i] for i in 1:ncol(X)]
r1 = range(pipeRandomForestClassifier, :(selector.features), values = cases)

tmRandomForestClassifier = TunedModel(
    model = pipeRandomForestClassifier,
    range = r1,
    measures = [cross_entropy, BrierScore()],
    resampling = CV(nfolds = 5))

mtm = machine(tmRandomForestClassifier, setScientificTypes!(X), categorical(y[:, 1]))
fit!(mtm)

dataShap = ShapML.shap(explain = explain,
                       reference = reference,
                       model = mtm,
                       predict_function = predict_function,  # user-defined wrapper (not shown)
                       sample_size = sample_size,
                       seed = 1)

ERROR: LoadError: MethodError: no method matching +(::MLJBase.UnivariateFinite{Int64,UInt32,Float64}, ::MLJBase.UnivariateFinite{Int64,UInt32,Float64})
Closest candidates are:
+(::Any, ::Any, ::Any, ::Any...) at operators.jl:529
+(::MLJBase.AbstractNode, ::Any) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\composition\networks.jl:473
+(::Any, ::MLJBase.AbstractNode) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\composition\networks.jl:472
Stacktrace:
[1] add_sum(::MLJBase.UnivariateFinite{Int64,UInt32,Float64}, ::MLJBase.UnivariateFinite{Int64,UInt32,Float64}) at .\reduce.jl:21
[2] mapreduce_impl(::typeof(identity), ::typeof(Base.add_sum), ::Array{MLJBase.UnivariateFinite{Int64,UInt32,Float64},1}, ::Int64, ::Int64, ::Int64) at .\reduce.jl:238
[3] mapreduce_impl(::typeof(identity), ::typeof(Base.add_sum), ::Array{MLJBase.UnivariateFinite{Int64,UInt32,Float64},1}, ::Int64, ::Int64, ::Int64) at .\reduce.jl:247 (repeats 12 times)
[4] mapreduce_impl at .\reduce.jl:253 [inlined]
[5] _mapreduce at .\reduce.jl:407 [inlined]
[6] _mapreduce_dim at .\reducedim.jl:312 [inlined]
[7] #mapreduce#580 at .\reducedim.jl:307 [inlined]
[8] mapreduce at .\reducedim.jl:307 [inlined]
[9] _sum at .\reducedim.jl:657 [inlined]
[10] _sum at .\reducedim.jl:656 [inlined]
[11] #sum#583 at .\reducedim.jl:652 [inlined]
[12] sum at .\reducedim.jl:652 [inlined]
[13] _mean(::Array{MLJBase.UnivariateFinite{Int64,UInt32,Float64},1}, ::Colon) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Statistics\src\Statistics.jl:160
[14] mean(::Array{MLJBase.UnivariateFinite{Int64,UInt32,Float64},1}; dims::Function) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Statistics\src\Statistics.jl:157
[15] mean(::Array{MLJBase.UnivariateFinite{Int64,UInt32,Float64},1}) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Statistics\src\Statistics.jl:157
[16] _predict(; reference::DataFrames.DataFrame, data_predict::DataFrames.DataFrame, model::MLJBase.Machine{MLJTuning.ProbabilisticTunedModel{MLJTuning.Grid,Main.VSCodeDebugger.RandomForestClassifierPipe,ComputationalResources.CPU1{Nothing},ComputationalResources.CPU1{Nothing}}}, predict_function::typeof(Main.VSCodeDebugger.predict_function), n_features::Int64, n_target_features::Int64, n_instances_explain::Int64, sample_size::Int64, precision::Nothing, chunk::Bool) at C:\Users\BCP.julia\packages\ShapML\sceNA\src\predict.jl:29
[17] shap(; explain::DataFrames.DataFrame, reference::DataFrames.DataFrame, model::MLJBase.Machine{MLJTuning.ProbabilisticTunedModel{MLJTuning.Grid,Main.VSCodeDebugger.RandomForestClassifierPipe,ComputationalResources.CPU1{Nothing},ComputationalResources.CPU1{Nothing}}}, predict_function::Function, target_features::Nothing, sample_size::Int64, parallel::Nothing, seed::Int64, precision::Nothing, chunk::Bool) at C:\Users\BCP.julia\packages\ShapML\sceNA\src\ShapML.jl:165
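
The MethodError occurs because MLJ.predict on a probabilistic model returns a vector of UnivariateFinite distributions, which cannot be summed or averaged when ShapML aggregates predictions over the Monte Carlo samples. A hedged sketch of a wrapper that returns numeric class probabilities instead; the class level 2 is an assumption, so substitute the level you want to explain:

using MLJ, DataFrames

function predict_function(model, data)
    ŷ = MLJ.predict(model, data)  # Vector of UnivariateFinite distributions.
    # pdf.(ŷ, level) extracts Float64 probabilities that ShapML can average.
    return DataFrame(y_pred = pdf.(ŷ, 2))
end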

Warning: left joining data frames using join is deprecated, use `leftjoin(df1, df2, on=[:index, :feature_name], makeunique=false, indicator=nothing, validate=(false, false))` instead

Hello,

I hope you are having a safe and great vacation. Just letting you know about the following warning. Thanks for your time and help.

Computing Shapley Effect of Random Forest using 500 random rows. It might take a few minutes
┌ Warning: left joining data frames using join is deprecated, use leftjoin(df1, df2, on=[:index, :feature_name], makeunique=false, indicator=nothing, validate=(false, false)) instead
│ caller = ip:0x0
└ @ Core :-1
┌ Warning: by(d::AbstractDataFrame, cols::Any; sort::Bool = false, skipmissing::Bool = false, f...) is deprecated, use combine(groupby(d, cols, sort = sort, skipmissing = skipmissing), [if in_col isa ColumnIndex │ in_col │ else │ AsTable(in_col) │ end => (fun => out_col) for (out_col, (in_col, fun)) = f]...) instead.
│ caller = ip:0x0
└ @ Core :-1
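
The first warning comes from DataFrames.jl deprecating the kind-based join API. A minimal sketch of the suggested replacement, assuming DataFrames 0.21 or later (the data frames here are illustrative):

using DataFrames

df1 = DataFrame(index = [1, 2], feature_name = ["x1", "x2"], shap_effect = [0.1, -0.2])
df2 = DataFrame(index = [1, 2], feature_name = ["x1", "x2"], mean_effect = [0.5, 0.5])

# Deprecated form: join(df1, df2, on = [:index, :feature_name], kind = :left)
joined = leftjoin(df1, df2, on = [:index, :feature_name])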

ShapML for Classification.

Hi - is there a method to get Shapley values for classification problems? The code I tried is below:

RFC = @load RandomForestClassifier pkg = "DecisionTree"
rfc_model = RFC()
rf_machine = machine(rfc_model, X, y)
MLJ.fit!(rf_machine)

function predict_function(model, data)
    data_pred = DataFrame(y_pred = MLJ.predict(model, data))
    return data_pred
end

explain = copy(X[1:50, :])
reference = copy(X)
sample_size = 60  # Number of Monte Carlo samples.

data_shap = ShapML.shap(explain = explain, reference = reference, model = rf_machine,
                        predict_function = predict_function, sample_size = sample_size, seed = 1)

and I am getting the following error:

ERROR: LoadError: TypeError: in LocationScale, in T, expected T<:Real, got Type{Any}
Stacktrace:
[1] Distributions.LocationScale(μ::Float64, σ::Float64, ρ::UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64}; check_args::Bool)
@ Distributions ~/.julia/packages/Distributions/jEqbk/src/univariate/locationscale.jl:50
[2] Distributions.LocationScale(μ::Float64, σ::Float64, ρ::UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64})
@ Distributions ~/.julia/packages/Distributions/jEqbk/src/univariate/locationscale.jl:47
[3] *(x::Float64, d::UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64})
@ Distributions ~/.julia/packages/Distributions/jEqbk/src/univariate/locationscale.jl:126
[4] /(d::UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64}, x::Int64)
@ Distributions ~/.julia/packages/Distributions/jEqbk/src/univariate/locationscale.jl:129
[5] _mean(f::typeof(identity), A::Vector{UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64}}, dims::Colon)
@ Statistics /Users/julia/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.6/Statistics/src/Statistics.jl:176
[6] mean(A::Vector{UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64}}; dims::Function)
@ Statistics /Users/julia/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.6/Statistics/src/Statistics.jl:164
[7] mean(A::Vector{UnivariateFinite{OrderedFactor{2}, Int64, UInt32, Float64}})
@ Statistics /Users/julia/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.6/Statistics/src/Statistics.jl:164
[8] _predict(; reference::DataFrame, data_predict::DataFrame, model::Machine{MLJDecisionTreeInterface.RandomForestClassifier, true}, predict_function::typeof(predict_function), n_features::Int64, n_target_features::Int64, n_instances_explain::Int64, sample_size::Int64, precision::Nothing, chunk::Bool, reconcile_instance::Bool, explain::DataFrame)
@ ShapML ~/.julia/packages/ShapML/QMund/src/predict.jl:30
[9] shap(; explain::DataFrame, reference::DataFrame, model::Machine{MLJDecisionTreeInterface.RandomForestClassifier, true}, predict_function::Function, target_features::Nothing, sample_size::Int64, parallel::Nothing, seed::Int64, precision::Nothing, chunk::Bool, reconcile_instance::Bool)
@ ShapML ~/.julia/packages/ShapML/QMund/src/ShapML.jl:168
[10] top-level scope
@ Untitled-1:21
in expression starting at Untitled-1:21
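
This is the same root cause as the MethodError in the earlier issue: the wrapper returns UnivariateFinite distributions, which cannot be averaged. Besides extracting class probabilities with pdf, a hedged alternative is to return the most likely class as an integer, mirroring the mode-based wrapper in the out-of-memory issue below:

using MLJ, DataFrames

function predict_function(model, data)
    ŷ = MLJ.predict(model, data)  # Vector of UnivariateFinite distributions.
    # mode() picks the most likely class; convert() unwraps the categorical
    # level to a plain Int64 so the prediction column is numeric.
    ŷ_mode = [convert(Int64, mode(ŷ[i])) for i in 1:length(ŷ)]
    return DataFrame(y_pred = ŷ_mode)
end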

Support for DataFrames 22.0?

Seems like this is not compatible with the latest DataFrames 0.22 release if you do add ShapML. I was able to install it from the GitHub repository, but doing so resulted in downgrading to DataFrames 0.21.8. There are quite a few changes from 0.21 to 0.22.

Error installing ShapML in windows

I installed the MLJ package initially and then tried installing ShapML. It repeatedly gives an error saying:

Error when installing package MLJ:
IOError: unlink(<path to Appdata\local\temp>): resource busy or locked (EBUSY)

This error persists even after a restart of the computer (Windows 10 Professional).

Update MLJ examples

Some changes have been made to MLJ that break the current examples in ShapML.jl. (MLJ 0.17 has just been released.) Happy to make a PR to update the examples in either the Documenter documentation or the README.md, but not excited about doing both 😓. It seems the README.md examples are rendered redundant by the Documenter docs, no?

Another possibility is to make the examples a tutorial in DataScienceTutorials.jl and just point users there. This might also help with the discoverability of this excellent package.

@nredell What do you think?

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

ERROR: OutOfMemoryError()Worker 2 terminated.

Hi,

I am fitting a random forest on this data file: https://www.openml.org/d/179

My code worked on many smaller data files, but this is the largest data file I have tried so far. Do you know if ShapML might have a limit on the number of rows in a file?

@everywhere function predict_function_mode(model, data)
    ŷ = MLJ.predict(model, data)
    ŷMode = [convert(Int64, mode(ŷ[i])) for i in 1:length(ŷ)]
    data_pred = DataFrame(y_pred = ŷMode)
    return data_pred
end # predict_function_mode

@everywhere pipeRandomForestClassifier = @pipeline RandomForestClassifierPipe(
    selector = FeatureSelector(),
    hot = OneHotEncoder(),
    tree = RandomForestClassifier()) prediction_type = :probabilistic

cases = [[Symbol(names(X)[j]) for j in 1:i] for i in 1:ncol(X)]
r1 = range(pipeRandomForestClassifier, :(selector.features), values = cases)

tmRandomForestClassifier = TunedModel(
    model = pipeRandomForestClassifier,
    range = r1,
    measures = [cross_entropy, BrierScore()],
    resampling = CV(nfolds = 5))

mtm = machine(tmRandomForestClassifier, setScientificTypes!(X), categorical(y[:, 1]))
Base.invokelatest(MLJ.fit!, mtm)

predictor = predict_function_mode

r = Int(round(nrow(X) / 2))
explain = copy(X[1:r, :])   # Compute Shapley feature-level predictions.
reference = copy(X)         # An optional reference population to compute the baseline prediction.
sample_size = 10            # Number of Monte Carlo samples for Shapley.

println("Computing Shapley Effect of Random Forest")
dataShap = ShapML.shap(explain = explain,
                       reference = reference,
                       model = mtm,
                       predict_function = predictor,
                       sample_size = sample_size,
                       parallel = :samples,  # Parallel computation over "sample_size".
                       seed = 20200628)

Worker 5 terminated.
ERROR: OutOfMemoryError()Worker 2 terminated.
Stacktrace:

[1] Worker 4 terminated.Array{Float64,2}
(::Worker 3 terminated.UndefInitializer
, ::Int64, ::Int64) at .\boot.jl:407
[2] matrix(::DataFrame; transpose::Bool) at C:\Users\BCP.julia\packages\Tables\okt7x\src\matrix.jl:73
[3] matrix at C:\Users\BCP.julia\packages\Tables\okt7x\src\matrix.jl:68 [inlined]
[4] #matrix#11 at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\interface\data_utils.jl:9 [inlined]
[5] matrix at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\interface\data_utils.jl:9 [inlined]
[6] matrix(::DataFrame; kw::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{,Tuple{}}}) at C:\Users\BCP.julia\packages\MLJModelInterface\aA1k2\src\data_utils.jl:27
[7] matrix at C:\Users\BCP.julia\packages\MLJModelInterface\aA1k2\src\data_utils.jl:27 [inlined]
[8] predict(::RandomForestClassifier, ::Tuple{DecisionTree.Ensemble{Float64,UInt32},CategoricalArray{Int64,1,UInt32,Int64,CategoricalValue{Int64,UInt32},Union{}},Array{UInt32,1}}, ::DataFrame) at C:\Users\BCP.julia\packages\MLJModels\i4XcU\src\DecisionTree.jl:188
[9] predict(::NodalMachine{RandomForestClassifier}, ::DataFrame) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\operations.jl:29
[10] (::Node{NodalMachine{RandomForestClassifier}})(::DataFrame) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\composition\networks.jl:302
[11] predict(::RandomForestClassifierPipe, ::Node{NodalMachine{RandomForestClassifier}}, ::DataFrame) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\composition\composites.jl:45
[12] predict(::Machine{RandomForestClassifierPipe}, ::DataFrame) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\operations.jl:29
[13] predict(::MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}, ::Machine{RandomForestClassifierPipe}, ::DataFrame) at C:\Users\BCP.julia\packages\MLJTuning\JZ7ZX\src\tuned_models.jl:597
[14] predict(::Machine{MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}}, ::DataFrame) at C:\Users\BCP.julia\packages\MLJBase\O5b6j\src\operations.jl:29
[15] predict_function_mode(::Machine{MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}}, ::DataFrame) at C:\Users\BCP\github\ICP\GetBlankets.jl:180
[16] _shap_sample(::DataFrame, ::DataFrame, ::Int64, ::Int64, ::Int64, ::Int64, ::Array{String,1}, ::Array{String,1}, ::Array{Symbol,1}, ::Int64, ::Symbol, ::Array{Int64,1}, ::Bool, ::Machine{MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}}, ::typeof(predict_function_mode), ::Nothing) at C:\Users\BCP.julia\packages\ShapML\sceNA\src\shap_sample.jl:97
[17] shap(; explain::DataFrame, reference::DataFrame, model::Machine{MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}}, predict_function::Function, target_features::Nothing, sample_size::Int64, parallel::Nothing, seed::Int64, precision::Nothing, chunk::Bool) at C:\Users\BCP.julia\packages\ShapML\sceNA\src\ShapML.jl:117
[18] trainRandomForest(::DataFrame, ::DataFrame) at C:\Users\BCP\github\ICP\GetBlankets.jl:248
[19] selectRandomForest(::DataFrame, ::DataFrame, ::Int64) at C:\Users\BCP\github\ICP\GetBlankets.jl:155
[20] getBlanketRandomForest(::DataFrame, ::DataFrame, ::String, ::Int64) at C:\Users\BCP\github\ICP\GetBlankets.jl:135
[21] getBlanketRandomForest at C:\Users\BCP\github\ICP\GetBlankets.jl:129 [inlined]
[22] ForestInvariantCausalPrediction(::DataFrame, ::DataFrame, ::DataFrame; α::Float64, selection::String, verbose::Bool) at C:\Users\BCP\github\ICP\InvariantCausalPrediction.jl:305
[23] top-level scope at REPL[10]:1
caused by [exception 1]
ProcessExitedException(2)
Stacktrace:
[1] (::Base.var"#726#728")(::Task) at .\asyncmap.jl:178
[2] foreach(::Base.var"#726#728", ::Array{Any,1}) at .\abstractarray.jl:1919
[3] maptwice(::Function, ::Channel{Any}, ::Array{Any,1}, ::UnitRange{Int64}) at .\asyncmap.jl:178
[4] wrap_n_exec_twice(::Channel{Any}, ::Array{Any,1}, ::Distributed.var"#204#207"{WorkerPool}, ::Function, ::UnitRange{Int64}) at .\asyncmap.jl:154
[5] async_usemap(::Distributed.var"#188#190"{Distributed.var"#188#189#191"{WorkerPool,ShapML.var"#13#15"{DataFrame,Machine{MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}},typeof(predict_function_mode),Int64,Nothing,Bool,Array{String,1},Array{Symbol,1},Int64,Int64,Int64,Int64,Array{Int64,1}}}}, ::UnitRange{Int64}; ntasks::Function, batch_size::Nothing) at .\asyncmap.jl:103
[6] #asyncmap#710 at .\asyncmap.jl:81 [inlined]
[7] pmap(::Function, ::WorkerPool, ::UnitRange{Int64}; distributed::Bool, batch_size::Int64, on_error::Nothing, retry_delays::Array{Any,1}, retry_check::Nothing) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Distributed\src\pmap.jl:126
[8] pmap(::Function, ::WorkerPool, ::UnitRange{Int64}) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Distributed\src\pmap.jl:101
[9] pmap(::Function, ::UnitRange{Int64}; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{,Tuple{}}}) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Distributed\src\pmap.jl:156
[10] pmap at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\Distributed\src\pmap.jl:156 [inlined]
[11] shap(; explain::DataFrame, reference::DataFrame, model::Machine{MLJTuning.ProbabilisticTunedModel{Grid,RandomForestClassifierPipe,CPU1{Nothing},CPU1{Nothing}}}, predict_function::Function, target_features::Nothing, sample_size::Int64, parallel::Symbol, seed::Int64, precision::Nothing, chunk::Bool) at C:\Users\BCP.julia\packages\ShapML\sceNA\src\ShapML.jl:137
[12] trainRandomForest(::DataFrame, ::DataFrame) at C:\Users\BCP\github\ICP\GetBlankets.jl:238
[13] selectRandomForest(::DataFrame, ::DataFrame, ::Int64) at C:\Users\BCP\github\ICP\GetBlankets.jl:155
[14] getBlanketRandomForest(::DataFrame, ::DataFrame, ::String, ::Int64) at C:\Users\BCP\github\ICP\GetBlankets.jl:135
[15] getBlanketRandomForest at C:\Users\BCP\github\ICP\GetBlankets.jl:129 [inlined]
[16] ForestInvariantCausalPrediction(::DataFrame, ::DataFrame, ::DataFrame; α::Float64, selection::String, verbose::Bool) at C:\Users\BCP\github\ICP\InvariantCausalPrediction.jl:305
[17] top-level scope at REPL[10]:1
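
There is no explicit row limit in ShapML, but the internal sampled data scales roughly with the number of explain rows × sample_size × number of features, and parallel = :samples additionally copies data to every worker. A hedged sketch of ways to shrink the working set; the subsample sizes are illustrative:

using DataFrames

# Explain fewer rows and sample a smaller reference population.
explain   = copy(X[1:1_000, :])
reference = copy(X[rand(1:nrow(X), 5_000), :])

dataShap = ShapML.shap(explain = explain,
                       reference = reference,
                       model = mtm,
                       predict_function = predictor,
                       sample_size = 10,  # Fewer Monte Carlo samples.
                       seed = 20200628)   # Serial run; avoids per-worker data copies.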
