
glmnet.jl's Introduction

GLMNet


glmnet is an R package by Jerome Friedman, Trevor Hastie, and Rob Tibshirani that fits entire Lasso or ElasticNet regularization paths for linear, logistic, multinomial, and Cox models using cyclic coordinate descent. This Julia package wraps the Fortran code from glmnet.

Quick start

To fit a basic regression model:

julia> using GLMNet

julia> y = collect(1:100) + randn(100)*10;

julia> X = [1:100 (1:100)+randn(100)*5 (1:100)+randn(100)*10 (1:100)+randn(100)*20];

julia> path = glmnet(X, y)
Least Squares GLMNet Solution Path (86 solutions for 4 predictors in 930 passes):
──────────────────────────────
      df   pct_dev           λ
──────────────────────────────
 [1]   0  0.0       30.0573
 [2]   1  0.152922  27.3871
 [3]   1  0.279881  24.9541
 ⋮
[84]   4  0.90719    0.0133172
[85]   4  0.9072     0.0121342
[86]   4  0.907209   0.0110562
──────────────────────────────

path represents the Lasso or ElasticNet fits for varying values of λ. The intercept for each λ value is stored in path.a0, and the coefficients for each fit are stored in compressed form in path.betas.

julia> path.betas
4×86 CompressedPredictorMatrix:
 0.0  0.0925032  0.176789  0.253587  0.323562  0.387321  0.445416  0.498349  0.546581  0.590527  0.63057  0.667055  0.700299  …   1.33905      1.34855     1.35822     1.36768     1.37563     1.3829      1.39005     1.39641     1.40204     1.40702     1.41195
 0.0  0.0        0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0      0.0       0.0       …  -0.165771    -0.17235    -0.178991   -0.185479   -0.190945   -0.195942   -0.200851   -0.20521    -0.209079   -0.212501   -0.215883
 0.0  0.0        0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0      0.0       0.0       …  -0.00968611  -0.0117121  -0.0135919  -0.0154413  -0.0169859  -0.0183965  -0.0197951  -0.0210362  -0.0221345  -0.0231023  -0.0240649
 0.0  0.0        0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0      0.0       0.0       …  -0.110093    -0.110505   -0.111078   -0.11163    -0.112102   -0.112533   -0.112951   -0.113324   -0.113656   -0.113953   -0.11424

This CompressedPredictorMatrix can be indexed as any other AbstractMatrix, or converted to a Matrix using convert(Matrix, path.betas).
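Because it behaves like any AbstractMatrix, individual coefficients can be read off directly (a small illustrative sketch; the indices below are arbitrary):

julia> path.betas[2, 50]   # coefficient of the second predictor at the 50th λ

julia> B = convert(Matrix, path.betas);   # materialize as a dense 4×86 Matrix{Float64}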

One can visualize the path as follows:

julia> using Plots, LinearAlgebra, LaTeXStrings

julia> betaNorm = [norm(x, 1) for x in eachslice(path.betas, dims=2)];

julia> extraOptions = (xlabel=L"\| \beta \|_1", ylabel=L"\beta_i", legend=:topleft, legendtitle="Variable", labels=[1 2 3 4]);

julia> plot(betaNorm, path.betas'; extraOptions...)

(figure: regression-lasso-path)

To predict the output for each model along the path for a given set of predictors, use predict:

julia> predict(path, [22 22+randn()*5 22+randn()*10 22+randn()*20])

1×86 Array{Float64,2}:
 50.3295  47.6932  45.291  43.1023  41.108  39.2909  37.6352  36.1265  34.7519  33.4995  32.3583  31.3184  30.371  29.5077  28.7211  28.0044  …  21.3966  21.3129  21.2472  21.1746  21.1191  21.0655  21.0127  20.9687  20.9284  20.8885  20.8531  20.8218  20.7942  20.7667

To find the best value of λ by cross-validation, use glmnetcv:

julia> cv = glmnetcv(X, y) 
Least Squares GLMNet Cross Validation
86 models for 4 predictors in 10 folds
Best λ 0.136 (mean loss 101.530, std 10.940)

julia> argmin(cv.meanloss)
59

julia> coef(cv) # equivalent to cv.path.betas[:, 59]
4-element Array{Float64,1}:
  1.1277676556880305
  0.0
  0.0
 -0.08747434292954445
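
predict also accepts the cross-validation object directly, in which case it predicts at the best λ (a minimal sketch reusing the predictor row from above):

julia> predict(cv, [22 22+randn()*5 22+randn()*10 22+randn()*20])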

A Classification Example

julia> using RDatasets, StatsBase, DataFrames

julia> iris = dataset("datasets", "iris");

julia> X = Matrix(iris[:, 1:4]);

julia> y = String.(iris.Species);

julia> iTrain = sample(1:size(X, 1), 100, replace = false);

julia> iTest = setdiff(1:size(X, 1), iTrain);

julia> iris_cv = glmnetcv(X[iTrain, :], y[iTrain])
Multinomial GLMNet Cross Validation
100 models for 4 predictors in 10 folds
Best λ 0.001 (mean loss 0.130, std 0.054)

julia> yht = round.(predict(iris_cv, X[iTest, :], outtype = :prob), digits=3);

julia> DataFrame(target=y[iTest], set=yht[:,1], ver=yht[:,2], vir=yht[:,3])[5:5:50, :]
10×4 DataFrame
│ Row │ target     │ set     │ ver     │ vir     │
│     │ String     │ Float64 │ Float64 │ Float64 │
├─────┼────────────┼─────────┼─────────┼─────────┤
│ 1   │ setosa     │ 0.997   │ 0.003   │ 0.0     │
│ 2   │ setosa     │ 0.995   │ 0.005   │ 0.0     │
│ 3   │ setosa     │ 0.999   │ 0.001   │ 0.0     │
│ 4   │ versicolor │ 0.0     │ 0.997   │ 0.003   │
│ 5   │ versicolor │ 0.0     │ 0.36    │ 0.64    │
│ 6   │ versicolor │ 0.0     │ 0.05    │ 0.95    │
│ 7   │ virginica  │ 0.0     │ 0.002   │ 0.998   │
│ 8   │ virginica  │ 0.0     │ 0.001   │ 0.999   │
│ 9   │ virginica  │ 0.0     │ 0.0     │ 1.0     │
│ 10  │ virginica  │ 0.0     │ 0.001   │ 0.999   │


julia> irisLabels = reshape(names(iris)[1:4], (1, 4));
julia> βs = iris_cv.path.betas;
julia> λs = iris_cv.lambda;
julia> sharedOpts = (legend=false, xlabel=L"\lambda", xscale=:log10);
julia> p1 = plot(λs, βs[:, 1, :]', ylabel=L"\beta_i"; sharedOpts...);
julia> p2 = plot(λs, βs[:, 2, :]', title="Across Cross Validation runs"; sharedOpts...);
julia> p3 = plot(λs, βs[:, 3, :]', legend=:topright, legendtitle="Variable", labels=irisLabels, xlabel=L"\lambda", xscale=:log10);
julia> plot(p1, p2, p3, layout=(1, 3))

(figure: iris-lasso-path)

julia> plot(iris_cv.lambda, iris_cv.meanloss, xscale=:log10, legend=false, yerror=iris_cv.stdloss, xlabel=L"\lambda", ylabel="loss")
julia> vline!([lambdamin(iris_cv)])

(figure: iris-cv)

Fitting models

glmnet has two required arguments: the n x m predictor matrix X and the dependent variable y. It additionally accepts an optional third argument, family, which can be used to specify a generalized linear model. Currently Normal() (least squares, the default), Binomial() (logistic), Poisson(), Multinomial(), and CoxPH() (Cox model) are supported.

  • For linear and Poisson models, y is a numerical vector of length n.
  • For logistic models, y is either a string vector of length n or an n x 2 matrix, where the first column is the count of negative responses for each row in X and the second column is the count of positive responses.
  • For multinomial models, y is either a string vector (with at least 3 unique values) or an n x k matrix, where k is the number of unique values (classes).
  • For Cox models, y is an n x 2 matrix, where the first column is survival time and the second column is (right) censoring status. Note that for survival data, glmnet has another method glmnet(X::Matrix, time::Vector, status::Vector) (glmnetcv has a corresponding method as well). A sketch of these response formats follows this list.
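
For instance, a logistic fit with a two-column count matrix and a Cox fit via the (X, time, status) method might look like this (an illustrative sketch with made-up data, not output from a real session):

julia> yb = [rand(1:5, 100) rand(1:5, 100)];   # columns: negative / positive counts per observation

julia> glmnet(rand(100, 10), yb, Binomial());

julia> t = rand(100); s = Float64.(rand(0:1, 100));   # survival times and right-censoring status

julia> glmnet(rand(100, 3), t, s);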

glmnet also accepts many optional keyword parameters, as described below (a combined usage sketch follows the list):

  • weights: A vector of length n of weights.
  • alpha: The tradeoff between lasso and ridge regression. This defaults to 1.0, which specifies a lasso model.
  • penalty_factor: A vector of length m of penalties for each predictor/column in X. This defaults to all ones, which weighs each predictor equally. To unpenalize a predictor, set the corresponding entry to zero.
  • constraints: A 2 x m matrix specifying lower bounds (first row) and upper bounds (second row) of the predictors. By default, this is [-Inf; Inf] for each predictor in X.
  • dfmax: The maximum number of predictors in the largest model.
  • pmax: The maximum number of predictors in any model.
  • nlambda: The number of values of λ along the path to consider.
  • lambda_min_ratio: The smallest λ value to consider, as a ratio of the λ value that gives the null model (i.e., the model with only an intercept). If the number of observations (n) exceeds the number of variables (m), this defaults to 0.0001, otherwise 0.01.
  • lambda: The λ values to consider. By default, this is determined from nlambda and lambda_min_ratio.
  • tol: Convergence criterion, with the default value of 1e-7.
  • standardize: Whether to standardize predictors so that they are in the same units, with the default value of true. Beta values are always presented on the original scale.
  • intercept: Whether to fit an intercept term. The intercept is always unpenalized. The default value is true.
  • maxit: The maximum number of iterations of the cyclic coordinate descent algorithm. If convergence is not achieved, a warning is returned. The default value is 1e6.
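
For example, an elastic-net fit that mixes the two penalties, leaves the first predictor unpenalized, and constrains all coefficients to be nonnegative might look like this (illustrative values only, reusing the 100×4 X and y from the quick start):

julia> glmnet(X, y; alpha = 0.5, penalty_factor = [0.0; ones(3)], constraints = [zeros(1, 4); fill(Inf, 1, 4)], nlambda = 50)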

See also

  • Lasso.jl, a pure Julia implementation of the glmnet coordinate descent algorithm that often achieves better performance.
  • LARS.jl, an implementation of least angle regression for fitting entire linear (but not generalized linear) Lasso and Elastic Net coefficient paths.

glmnet.jl's People

Contributors

ararslan, devmotion, dsweber2, femtocleaner[bot], github-actions[bot], jackdunnnz, juliatagbot, simonbyrne, simonster, tkelman, tyhlee, yixuan


glmnet.jl's Issues

Capitalization

What do you think of capitalizing this as GLMnet? I know that the official Matlab page uses Glmnet as the capitalization in one place, but it seems strange to me that GLM isn't all in caps.

Allow for sparse predictor matrix

The Fortran code has methods for CSR predictor matrices. Currently Julia only supports sparse matrices in CSC format in Base, right? How difficult would it be to use glmnet's sparse capabilities?

(I might take a crack at this at some point but I figured I'd open an issue to see if someone else wants to work on it too.)

`cv.meanloss` differs from `cv$cvm` in R

In trying to get the cross-validation output from glmnet in R and GLMNet.jl to conform, I find the losses differ even when everything else (lambda sequence, fold id) is the same across the two. This yields an argmin(cv.meanloss) different from which.min(cv$cvm) (R), and it sometimes matters. What is the source of the difference?

Example:

R

require(glmnet)

data <- iris
foldid <- rep(1:10, nrow(data) / 10)
x <- model.matrix(data = data, ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width)

cvl <- cv.glmnet(y = data$Species, x = x, 
                 family = "multinomial", alignment = "fraction", foldid = foldid)

round(cvl$lambda, 8)

cvl$cvm
2.1972246 2.0531159 1.9324868 ...

julia

using Pkg

Pkg.add("RDatasets")
Pkg.add("GLMNet")
Pkg.add("GLM")

using RDatasets, GLMNet, GLM

iris = dataset("datasets", "iris")

fml = @formula(Species ~ SepalLength + SepalWidth + PetalLength + PetalWidth)
x = ModelMatrix(ModelFrame(fml, iris)).m
foldid = repeat(1:10, Int(size(iris, 1) / 10))
cvl = glmnetcv( x, iris.Species; folds = foldid )

cvl.lambda'

cvl.meanloss
2.1955639962247964
2.0530748153377423
1.9324668652650965
...

Incorrect `null_dev`s

The null deviances seem to be incorrect when I run GLMNet.glmnet -- I'm getting absurdly small numbers.

Example: this code

# Simulated data set
X_sim = randn((100,10));
beta_sim = randn(10);
y_sim = randn(100) .+ (X_sim * beta_sim  .+ 3.14)

sim_path = GLMNet.glmnet(X_sim, y_sim)

Produces the output:

Least Squares GLMNet Solution Path (64 solutions for 10 predictors in 266 passes):
───────────────────────────────
      df    pct_dev           λ
───────────────────────────────
 [1]   0  0.0        3.15936   
 [2]   1  0.0716368  2.87869   
 [3]   1  0.131111   2.62295   
 [4]   1  0.180487   2.38994   
 [5]   1  0.221481   2.17762   
 [6]   2  0.270351   1.98417   
.
.
.

This seems fine -- but then when I run

sim_path.null_dev

I get an absurdly small number:

6.240013019814641e-34

In contrast, when I compute the null deviance (sum of squares) myself:

size(X_sim, 1) * var(y_sim)

I get

2389.5611952108716

Have I misunderstood something? It's easy enough to compute the null deviance on my own, but it seems like GLMNet.jl isn't computing it as advertised.

And I don't see it covered in your unit tests. So maybe this was a small blind spot.

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Building Error in Windows

When I try to build GLMNet.jl, it tells me that the file is not found. I dug around, and it looks like the problem is inside the build.jl file:

pic = @windows ? "" : "-fPIC"
run(`gfortran -m$WORD_SIZE -fdefault-real-8 -ffixed-form $pic -shared -O3 glmnet5.f90 -o libglmnet.so`)

and when the run() line is executed under Windows, $pic is interpolated as '' instead of an actual blank, which messes up the build. When I removed $pic from that line, it built fine.
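
A minimal sketch of one possible fix: collect the flag in a vector, so that nothing (rather than an empty string) is interpolated on Windows. The Sys predicates below are the modern spellings of the old @windows macro and WORD_SIZE global:

# flags as a vector: an empty vector splices no argument into the command
pic = Sys.iswindows() ? String[] : ["-fPIC"]
run(`gfortran -m$(Sys.WORD_SIZE) -fdefault-real-8 -ffixed-form $pic -shared -O3 glmnet5.f90 -o libglmnet.so`)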

Multivariate normal L1 regression

Maybe I'm blind, but how can I fit a linear lasso model on multiple dependent variables?

There doesn't seem to be a method for

X = rand(100, 10)
Y = rand(100, 4)
path = glmnet(X, Y,  Normal())
# or 
path = glmnet(X, Y,  MvNormal())

MethodError: glmnet(::Array{Float64,2}, ::Array{Int64,1}) is ambiguous

Following the basic regression model from the quick start at https://github.com/JuliaStats/GLMNet.jl: if y is of type Array{Int64,1}, glmnet gives an ambiguity error:

ERROR: MethodError: glmnet(::Array{Float64,2}, ::Array{Int64,1}) is ambiguous. Candidates:
glmnet(X::Array{Float64,2}, y; kw...) in GLMNet at C:\Users\ly.julia\packages\GLMNet\1uQom\src\Multinomial.jl:191
glmnet(X::AbstractArray{T,2} where T, y::AbstractArray{var"#s77",1} where var"#s77"<:Number) in GLMNet at C:\Users\ly.julia\packages\GLMNet\1uQom\src\GLMNet.jl:492
Possible fix, define
glmnet(::Array{Float64,2}, ::AbstractArray{var"#s77",1} where var"#s77"<:Number)

[PackageEvaluator.jl] Your package GLMNet may have a testing issue.

This issue is being filed by a script, but if you reply, I will see it.

PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3).

The results of this script are used to generate a package listing enhanced with testing results.

The status of this package, GLMNet, on...

  • Julia 0.2 is 'Package doesn't load.' PackageEvaluator.jl
  • Julia 0.3 is 'Tests pass.' PackageEvaluator.jl

'No tests, but package loads.' can be due to there being no tests (you should write some if you can!) but can also be due to PackageEvaluator not being able to find your tests. Consider adding a test/runtests.jl file.

'Package doesn't load.' is the worst-case scenario. Sometimes this arises because your package doesn't have BinDeps support, or needs something that can't be installed with BinDeps. If this is the case for your package, please file an issue and an exception can be made so your package will not be tested.

This automatically filed issue is a one-off message. Starting soon, issues will only be filed when the testing status of your package changes in a negative direction (gets worse). If you'd like to opt-out of these status-change messages, reply to this message.

Logistic regression fails if y is a vector of strings

From README:

For logistic models, y is either a string vector or a m x 2 matrix

But the following doesn't work

using GLMNet
y = ["M", "B", "M", "B"]
X = rand(4, 10)
glmnet(X, y, Binomial())

MethodError: no method matching glmnet(::Matrix{Float64}, ::Vector{String}, ::Binomial{Float64})
Closest candidates are:
  glmnet(::AbstractMatrix{T} where T, ::AbstractVector{T} where T, ::AbstractVector{T} where T) at /home/users/bbchu/.julia/packages/GLMNet/C8WKF/src/CoxNet.jl:151
  glmnet(::AbstractMatrix{T} where T, ::AbstractVector{T} where T, ::AbstractVector{T} where T, ::CoxPH; kw...) at /home/users/bbchu/.julia/packages/GLMNet/C8WKF/src/CoxNet.jl:151
  glmnet(::Matrix{Float64}, ::Vector{Float64}, ::Distribution; kw...) at /home/users/bbchu/.julia/packages/GLMNet/C8WKF/src/GLMNet.jl:485
  ...

Fortunately, if y is a matrix with two columns, it does work:

y = [1 0; 0 1; 0 1; 1 0]
X = rand(4, 10)
glmnet(X, y, Binomial())

Logistic GLMNet Solution Path (100 solutions for 10 predictors in 833 passes):
────────────────────────────────
       df    pct_dev           λ
────────────────────────────────
  [1]   0  0.0        0.476672
  [2]   1  0.0582906  0.455006
  [3]   1  0.11166    0.434325
  [4]   1  0.160737   0.414585
  [5]   1  0.206039   0.395741
  [6]   1  0.248      0.377754
  [7]   1  0.286986   0.360585
  ...

Constraints dimensions on README

The README states that constraints should be an n x 2 matrix. However, it is actually a 2 x n matrix.
I'd prefer if it were n x 2, but if it's supposed to be 2 x n, it would be good to fix the README in order to avoid confusion.
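
For reference, the working 2 x n layout looks like this (a small sketch with arbitrary bounds; X here has two predictors):

X = rand(100, 2)
y = X * [1.0, -1.0] + randn(100)
constraints = [0.0 -Inf; Inf Inf]   # row 1: lower bounds, row 2: upper bounds
path = glmnet(X, y; constraints = constraints)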

[PkgEval] GLMNet may have a testing issue on Julia 0.3 (2014-07-14)

PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3). The results of this script are used to generate a package listing enhanced with testing results.

On Julia 0.3

  • On 2014-07-12 the testing status was Tests pass.
  • On 2014-07-14 the testing status changed to Package doesn't load.

Tests pass. means that PackageEvaluator found the tests for your package, executed them, and they all passed.

Package doesn't load. means that PackageEvaluator did not find tests for your package. Additionally, trying to load your package with using failed.

This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.

Test log:

INFO: Installing ArrayViews v0.4.6
INFO: Installing DataArrays v0.1.12
INFO: Installing DataFrames v0.5.6
INFO: Installing Distributions v0.5.2
INFO: Installing GLMNet v0.0.2
INFO: Installing GZip v0.2.13
INFO: Installing PDMats v0.2.0
INFO: Installing Reexport v0.0.1
INFO: Installing SortingAlgorithms v0.0.1
INFO: Installing StatsBase v0.5.3
INFO: Building GLMNet
INFO: Package database updated
Warning: could not import Sort.sortby into DataFrames
Warning: could not import Sort.sortby! into DataFrames
ERROR: repl_show not defined
 in include at ./boot.jl:245
 in include_from_node1 at ./loading.jl:128
 in include at ./boot.jl:245
 in include_from_node1 at ./loading.jl:128
 in reload_path at loading.jl:152
 in _require at loading.jl:67
 in require at loading.jl:54
 in include at ./boot.jl:245
 in include_from_node1 at ./loading.jl:128
 in reload_path at loading.jl:152
 in _require at loading.jl:67
 in require at loading.jl:51
 in include at ./boot.jl:245
 in include_from_node1 at loading.jl:128
 in process_options at ./client.jl:285
 in _start at ./client.jl:354
while loading /home/idunning/pkgtest/.julia/v0.3/DataFrames/src/dataframe/reshape.jl, in expression starting on line 163
while loading /home/idunning/pkgtest/.julia/v0.3/DataFrames/src/DataFrames.jl, in expression starting on line 110
while loading /home/idunning/pkgtest/.julia/v0.3/GLMNet/src/GLMNet.jl, in expression starting on line 2
while loading /home/idunning/pkgtest/.julia/v0.3/GLMNet/testusing.jl, in expression starting on line 1
INFO: Package database updated

Note this is possibly due to removal of deprecated functions in Julia 0.3-rc1: JuliaLang/julia#7609

V0.1.0 available?

Hello. I just saw that GLMNet v0.1.0 has been released according to the GLMNet.jl site.
However, when I ran Pkg.update(), my current GLMNet v0.0.5 was not updated to v0.1.0. I also tried removing GLMNet v0.0.5 and reinstalling it via Pkg.add("GLMNet"), but v0.0.5 was still installed. Why does this happen, and how can I get v0.1.0 without using Pkg.checkout("GLMNet")? Since v0.1.0 is an official release, I should be able to get that version simply by running Pkg.update(), correct?
Best,
BVPs

Problem with Pkg.add("GLMNet") in Julia 0.3.3 on Mac, OS 10.9.5

It appears that the build of this package, using gfortran on OS X 10.9.5, tries to use the -shared option which is not present.

julia> Pkg.build("GLMNet")
INFO: Building GLMNet
i686-apple-darwin8-gfortran-4.2: unrecognized option '-shared'
Undefined symbols for architecture x86_64:
"MAIN_", referenced from:
_main in libgfortranbegin.a(fmain.o)
ld: symbol(s) not found for architecture x86_64
collect2: ld returned 1 exit status
========================================[ ERROR: GLMNet ]========================================

failed process: Process(gfortran -m64 -fdefault-real-8 -ffixed-form -fPIC -shared -O3 glmnet3.f90 -o libglmnet.so, ProcessExited(1)) [1]
while loading /Users/psz/.julia/v0.3/GLMNet/deps/build.jl, in expression starting on line 3

`constraints` matrix changes after running glmnetcv

Is this behavior intended?

MWE:

X = rand(1000, 2)
y = X[:,1] + randn(1000)
constraints = [-1 -1; Inf Inf]
res = glmnetcv(X, y, constraints = constraints)
julia> constraints
2×2 Matrix{Float64}:
 -4.69035e-7  -4.57555e-7
 Inf          Inf

I cannot set the family as Binomial

When I try to set the family argument to Binomial, glmnet gives me an error:

ERROR: MethodError: no method matching glmnet!(::Matrix{Float64}, ::Vector{Float64}, ::Binomial{Float64}; alpha=1, standardize=false, intercept=true, lambda=[0.0005])
Closest candidates are:
  glmnet!(::Matrix{Float64}, ::Matrix{Float64}, ::Binomial; offsets, weights, alpha, penalty_factor, constraints, dfmax, pmax, nlambda, lambda_min_ratio, lambda, tol, standardize, intercept, maxit, algorithm) at ~/.julia/packages/GLMNet/Bzpup/src/GLMNet.jl:337

I am using version 1.8.

rep and convert

Hello, I stumbled over a few things trying to follow the example problem.

It looks like the DataArrays package is needed for the definition of rep in the glmnetcv function.

Also, I had a little trouble with the convert method called from within the show function:

ERROR: no method convert(Type{Array{T,2}}, CompressedPredictorMatrix)
in convert at base.jl:11

Changing the function name "convert" to something else worked, but being brand new to Julia, I don't yet know the proper way to fix this.
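
(For what it's worth, the idiomatic fix is to extend Base.convert by qualifying the name rather than defining a new local convert. A toy sketch with a made-up type, not the package's actual code:)

struct MyMat
    data::Matrix{Float64}
end
Base.convert(::Type{Matrix}, x::MyMat) = x.data   # adds a method to Base.convert instead of shadowing it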

How to use Multinomial family

Hello, how do you use Multinomial() or other methods in the family option? I tried

glmnet(X, Y, family::Multinomial()) and glmnet(X, Y, family=:Multinomial()), where X and Y are both 2-D arrays.

Probably an easy question, but I just couldn't figure out how to use it. Thanks for your help.

Warm starts

Is there a way to use warm starts for glmnet and glmnetcv?
The R wrapper seems to have the warm start capability.

Thanks!

Error attempting Logistic Regression

I am encountering an error when trying to run a binomial logistic ridge regression. The error does not occur when using the default least squares regression.

import GLMNet
cv_guass = GLMNet.glmnetcv(X, y, GLMNet.Normal(), alpha=0.0)
cv_binom = GLMNet.glmnetcv(X, y, GLMNet.Binomial(), alpha=0.0)

The final line of code produces the following error:

no method glmnet!(Array{Any,1},Array{Float64,2},Array{Float64,1},Binomial)
at In[100]:3
 in glmnet at /home/dhimmels/.julia/v0.2/GLMNet/src/GLMNet.jl:346
 in glmnetcv at /home/dhimmels/.julia/v0.2/GLMNet/src/GLMNet.jl:382

I am new to julia, so perhaps I'm missing something. Thanks.
