xiaodaigh / JLBoost.jl
A 100%-Julia implementation of Gradient-Boosting Regression Tree algorithms
License: MIT License
This issue is used to trigger TagBot; feel free to unsubscribe.
If you haven't already, you should update your TagBot.yml
to include issue comment triggers.
Please see this post on Discourse for instructions and more details.
If you'd like for me to do this for you, comment TagBot fix
on this issue.
I'll open a PR within a few hours; please be patient!
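For reference, the issue-comment trigger the bot is asking for is a stanza in `.github/workflows/TagBot.yml`. A minimal sketch of the trigger portion (the full file, including the job definition, is in the Discourse post linked above):

```yaml
# Trigger portion of .github/workflows/TagBot.yml.
# The issue_comment trigger lets TagBot be re-run by
# commenting on its trigger issue.
name: TagBot
on:
  issue_comment:
    types:
      - created
```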
Same error in both master and release v0.1.5:
julia> using JLBoost
[ Info: Precompiling JLBoost [13d6d4a1-5e7f-472c-9ebc-8123a4fbb95f]
WARNING: both PrettyTables and MLJModelInterface export "matrix"; uses of it in module MLJBase must be qualified
ERROR: LoadError: LoadError: ArgumentError: Package JLBoost does not have Tables in its dependencies:
- If you have JLBoost checked out for development and have
added Tables as a dependency but haven't updated your primary
environment's manifest file, try `Pkg.resolve()`.
- Otherwise you may need to report an issue with JLBoost
(v1.3) pkg> add https://github.com/xiaodaigh/JLBoost.jl
Updating git-repo `https://github.com/xiaodaigh/JLBoost.jl`
Updating git-repo `https://github.com/xiaodaigh/JLBoost.jl`
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package DataFrames [a93c6f00]:
DataFrames [a93c6f00] log:
├─possible versions are: [0.11.7, 0.12.0, 0.13.0-0.13.1, 0.14.0-0.14.1, 0.15.0-0.15.2, 0.16.0, 0.17.0-0.17.1, 0.18.0-0.18.4, 0.19.0-0.19.4, 0.20.0] or uninstalled
├─restricted to versions 0.19 by JLBoost [13d6d4a1], leaving only versions 0.19.0-0.19.4
│ └─JLBoost [13d6d4a1] log:
│ ├─possible versions are: 0.1.3 or uninstalled
│ └─JLBoost [13d6d4a1] is fixed to version 0.1.3
└─restricted to versions 0.20.0 by an explicit requirement — no versions left
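One possible workaround, assuming the conflict comes from an explicit DataFrames 0.20 requirement already in the active environment (JLBoost 0.1.3's compat bound only allows DataFrames 0.19), is to relax that pin before adding the package. A sketch, not a definitive fix:

```julia
using Pkg

# Drop the explicit DataFrames requirement that forces 0.20.0,
# then pin a 0.19.x version compatible with JLBoost 0.1.3.
Pkg.rm("DataFrames")
Pkg.add(PackageSpec(name="DataFrames", version="0.19"))

# Now the unregistered add should resolve.
Pkg.add(PackageSpec(url="https://github.com/xiaodaigh/JLBoost.jl"))
```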
Does it support Acceleration and Parallelism, as described in the MLJ documentation?
Hey and thanks for this package!
Do you have any idea how to incorporate a different weight for each datapoint, so that datapoints with low weights are less important for the model to fit than those with high weights?
For instance, by specifying the loss as WeightedLogitLogLoss(weights),
etc.
LossFunctions.jl has some support for this, e.g.
value(LogitLogLoss(), Y, Y, AggMode.WeightedSum(ones(length(Y))))
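To make the idea concrete without depending on any package API, here is a minimal plain-Julia sketch of a per-observation weighted logistic log-loss. The name `weighted_logitlogloss` is hypothetical, not part of JLBoost:

```julia
# Logistic sigmoid: maps a raw score to a probability in (0, 1).
sigmoid(x) = 1 / (1 + exp(-x))

# Weighted logistic log-loss: y in {0, 1}, yhat are raw scores,
# w are per-observation weights. Observations with w[i] = 0
# contribute nothing to the loss.
function weighted_logitlogloss(y, yhat, w)
    p = sigmoid.(yhat)
    ll = y .* log.(p) .+ (1 .- y) .* log.(1 .- p)
    return -sum(w .* ll)
end
```

A gradient-boosting fit against this loss would then down-weight the gradients of low-weight datapoints in exactly the sense asked about above.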
Hey,
FYI there seem to be issues installing JLBoost now.
(v1.3) pkg> add JLBoost
Updating registry at `C:\Users\azevelev\.julia\registries\General`
Updating git-repo `https://github.com/JuliaRegistries/General.git`
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package JLBoost [13d6d4a1]:
JLBoost [13d6d4a1] log:
├─possible versions are: 0.1.0-0.1.6 or uninstalled
├─restricted to versions * by an explicit requirement, leaving only versions 0.1.0-0.1.6
├─restricted by compatibility requirements with MLJBase [a7f614a8] to versions: [0.1.2-0.1.3, 0.1.6] or uninstalled, leaving only versions: [0.1.2-0.1.3, 0.1.6]
│ └─MLJBase [a7f614a8] log:
│ ├─possible versions are: [0.1.0-0.1.1, 0.2.0-0.2.6, 0.3.0, 0.4.0, 0.5.0, 0.6.0, 0.7.0-0.7.5, 0.8.0-0.8.4, 0.9.0-0.9.2, 0.10.0-0.10.1, 0.11.0-0.11.9] or uninstalled
│ └─restricted to versions 0.11.9 by an explicit requirement, leaving only versions 0.11.9
└─restricted by compatibility requirements with Tables [bd369af6] to versions: [0.1.0, 0.1.5] or uninstalled — no versions left
└─Tables [bd369af6] log:
├─possible versions are: [0.1.0-0.1.15, 0.1.17-0.1.19, 0.2.0-0.2.11, 1.0.0-1.0.2] or uninstalled
├─restricted by compatibility requirements with MLJModels [d491faf4] to versions: [0.2.0-0.2.11, 1.0.0-1.0.2]
│ └─MLJModels [d491faf4] log:
│ ├─possible versions are: [0.1.0-0.1.1, 0.2.0-0.2.5, 0.3.0, 0.4.0, 0.5.0-0.5.9, 0.6.0-0.6.3, 0.7.0-0.7.2, 0.8.0-0.8.3] or uninstalled
│ └─restricted to versions 0.8.3 by an explicit requirement, leaving only versions 0.8.3
└─restricted by compatibility requirements with MLJScientificTypes [2e2323e0] to versions: 1.0.0-1.0.2
└─MLJScientificTypes [2e2323e0] log:
├─possible versions are: [0.1.0-0.1.1, 0.2.0-0.2.1] or uninstalled
└─restricted to versions 0.2.1 by an explicit requirement, leaving only versions 0.2.1
Also
(v1.3) pkg> add JLBoostMLJ
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package MLJ [add582a8]:
MLJ [add582a8] log:
├─possible versions are: [0.1.0-0.1.1, 0.2.0-0.2.5, 0.3.0, 0.4.0, 0.5.0-0.5.9, 0.6.0-0.6.1, 0.7.0, 0.8.0, 0.9.0-0.9.3] or uninstalled
├─restricted to versions 0.9.3 by an explicit requirement, leaving only versions 0.9.3
└─restricted by compatibility requirements with JLBoostMLJ [8b86df2c] to versions: 0.6.0-0.6.1 — no versions left
└─JLBoostMLJ [8b86df2c] log:
├─possible versions are: 0.1.0 or uninstalled
└─restricted to versions * by an explicit requirement, leaving only versions 0.1.0
I love your package & it would be awesome to use it inside MLJ soon.
Check out EvoTrees: they have a short file, MLJ.jl,
that interfaces their package with MLJ.
A new kid on the gradient-boosting block is the NGBoost variant described in this project from the Stanford ML Group. It would be really nice to have that implementation in this project as well.
It appears this package is not compatible with the latest stable version of DataFrames v1.0.0
(@v1.6) pkg> add DataFrames@v1.0.0
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package DataFrames [a93c6f00]:
DataFrames [a93c6f00] log:
├─possible versions are: 0.11.7-1.0.0 or uninstalled
├─restricted to versions 1.0.0 by an explicit requirement, leaving only versions 1.0.0
└─restricted by compatibility requirements with JLBoost [13d6d4a1] to versions: 0.19.0-0.22.7 — no versions left
└─JLBoost [13d6d4a1] log:
├─possible versions are: 0.1.0-0.1.16 or uninstalled
└─restricted to versions * by an explicit requirement, leaving only versions 0.1.0-0.1.16
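The log above shows JLBoost's compat bound for DataFrames stops at 0.22.7, which excludes 1.0.0. Fixing this on the package side would mean widening the `[compat]` entry in JLBoost's Project.toml; a hypothetical sketch (the exact existing bounds are an assumption):

```toml
# Hypothetical [compat] widening in JLBoost's Project.toml
# to allow DataFrames 1.x alongside the older 0.x series.
[compat]
DataFrames = "0.19, 0.20, 0.21, 0.22, 1"
```

Under Pkg's semver-style compat rules, `"1"` here permits any 1.x release, so `add DataFrames@v1.0.0` would then resolve.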