Comments (11)
Wait, why are those packages not installed? They are listed in the Travis configuration and according to the log are getting installed as binary versions.
from mlr-tutorial.
The packages are installed, at least I guess so.
But they cannot be loaded because of the DLL problem; mlr then generates the message that the package is not installed.
The only options I see are:
a) require the package only during training and not while we construct the object. We tried that once; it is not that easy and takes longer to test.
b) generate the tables for the different types separately.
Do you want to try b)?
It's like Bernd said. The packages are properly installed (at least the Travis log does not show any errors while installing them), but they cannot be loaded because of the DLL problem. These loading problems are caught by listLearners and lead to missing learners in the returned object.
> b) generate the tables for the different types separately.
> do you want to try b?
What exactly do you mean by "separately"?
I meant with b): if the tables for the different learner types are on different pages in the tutorial, it might be possible to generate them in different R processes, so that not all packages are loaded at once.
Or do we have another option?
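In principle the separate-process idea could be sketched like a driver loop over the tutorial pages, each knit in a throwaway R session. This is only a dry run that prints the commands; the `knitr::knit` entry point and the exact page list are assumptions:

```shell
# Each .Rmd page is knit in its own fresh R process, so packages loaded
# while knitting one page are gone before the next page starts.
# Dry run: the Rscript commands are only printed, not executed.
for page in integrated_learners.Rmd learner.Rmd; do
  echo "Rscript -e 'knitr::knit(\"$page\")'"
done
```

With one Rscript invocation per page, the DLL/loading problems on one page could no longer spill over into the pages knit after it.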
I'm not sure how many R processes we have there.
My impression from the logs was always that we have one process and that the number of loaded packages accumulates over the tutorial pages. For example, the log file above continues with package loading errors in learner.Rmd, which comes directly after integrated_learners.Rmd:
Knitting file 'learner.Rmd' ...
Quitting from lines 143-154 ()
Error in head(listLearners("cluster", create = TRUE), 2) :
error in evaluating the argument 'x' in selecting a method for function 'head': Error in listLearners.character("cluster", create = TRUE) :
(converted from warning) The following learners could not be constructed, probably because their packages are not installed:
cluster.cmeans,cluster.kmeans
Check ?learners to see which packages you need or install mlr with all suggestions.
Calls: listLearners -> listLearners.character -> warningf
Calls: lapply ... withCallingHandlers -> withVisible -> eval -> eval -> head
Execution halted
You are certainly right that it would be easier to have different R processes if the tables are on different pages in the tutorial.
I'm taking a closer look at the build file now.
> My impression from the logs was always that we have one process and
> the number of packages accumulates over different tutorial pages
I know. But if you have separate pages, we could do this at least in principle. Or do you want to somehow manually create the page for now? Is that easier?
At the moment it works (although it's ugly), i.e., in the current tutorial version no learners are missing in the table.
The only thing I came up with yesterday is: as a long-term solution maybe we could
- as you proposed, have separate pages for the different learner types,
- have an additional argument for build that says which tutorial pages should be knit (it can default to all pages, but when building the tutorial on Travis we can use it to knit the learner tables separately from the other pages),
- insert an additional condition into the build file to make sure that mkdocs is only called if all the md files are there.
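Taken together, the page-selection argument and the extra mkdocs condition might look roughly like this build sketch. Everything here is an assumption for illustration: the demo scratch directory, the page names, and the dry-run echoes standing in for the real Rscript and mkdocs calls:

```shell
#!/bin/sh
# Demo setup (an assumption): a scratch directory with two tutorial
# pages, only one of which has already been knit to .md.
demo=$(mktemp -d)
cd "$demo" || exit 1
touch integrated_learners.Rmd learner.Rmd
touch learner.md   # integrated_learners.md is still missing

# Proposed build argument: knit only the pages passed on the command
# line, defaulting to all .Rmd pages.
if [ "$#" -gt 0 ]; then PAGES=$*; else PAGES=$(ls ./*.Rmd); fi

for page in $PAGES; do
  echo "Rscript -e 'knitr::knit(\"$page\")'"   # dry run
done

# Proposed extra condition: call mkdocs only when every page has
# produced its .md file.
missing=0
for page in $PAGES; do
  [ -f "${page%.Rmd}.md" ] || missing=1
done
if [ "$missing" -eq 0 ]; then
  echo "mkdocs build"   # dry run
else
  echo "not all .md files are there yet; skipping mkdocs"
fi
```

In this demo run the guard triggers, because integrated_learners.md has not been produced yet, so the mkdocs step is skipped rather than building a site with a missing page.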
Ok, I agree. Like I said, my preferred solution would be to be able to construct the learner objects without having to load their packages. This would fix everything at once. But I really don't know how hard this would be. It SHOULD be doable.
See #580.
I guess this might be doable.
We now can create the learner tables without loading packages. See mlr-org/mlr#580 ✌️
Will adapt the integrated learners page and then close.