Comments (5)
Good question. We've brainstormed with @raphaelsty and @AdilZouitine and we don't think that would be a good idea (at least for the moment). We think it's better to run one instance of chantilly per task. Supporting multiple tasks would make things much more complicated on our end, and we rather like the idea of running one instance of chantilly per task.
Do you agree? As I said in another issue, nothing is set in stone yet.
from chantilly.
Is this compatible with multiple models? Could this be a setting per model?
To me, the philosophy of Chantilly is above all to make it easy to put a creme model into production with a minimum of configuration. I think we have to make sure that the Chantilly API stays very simple and intuitive for everyone. The concept of flavor is interesting because it clusters the different uses of Chantilly, i.e. recommendation has nothing to do with regression.
I still don't see the point of having a single instance of Chantilly for multiple models that have different flavors. I see more benefits in a modular architecture split along the flavors. It's easier to maintain, requires less code, and we can identify the problem more quickly in case of a bug in a flavor. It also allows deploying each model on servers sized for the volume of client requests. Nothing is set in stone, and I think we should keep discussing the advantages and side effects of generalizing a single instance to n flavors and k models.
Raphaël :-)
Just so that I understand correctly, here is my example use case.
My model:

```python
from creme import linear_model, preprocessing

model = preprocessing.StandardScaler()
model |= linear_model.LogisticRegression()
```

What I get out of creme using `predict_proba_one` is this (which is exactly what I want):

```
{False: 0.9993760805960461, True: 0.0006239194039538452}
```

Is that going to be possible? As it stands today, when I configure the flavor to regression for this example, /predict returns True/False.
In this case you're doing binary classification, therefore you need to set the flavor to `binary` in order to obtain predictions.
What's happening under the hood is that the flavor determines which prediction function to use. When you set it to `regression`, the `predict_one` method is used. When you set it to `binary` or `multiclass`, the `predict_proba_one` method is used. Therefore, if you set the flavor to `regression` and you upload a classifier, the `predict_one` method will be used, even though the classifier has a `predict_proba_one` method. This probably isn't the clearest approach, but at least everything works as expected as long as you set the flavor correctly.
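The dispatch described above can be sketched as follows. This is a hypothetical illustration only: the names `pick_predict` and `DummyClassifier` are made up for the example and are not chantilly's actual internals.

```python
# Hypothetical sketch of the flavor dispatch described above. The names
# (pick_predict, DummyClassifier) are illustrative, not chantilly's internals.

class DummyClassifier:
    """Stand-in for a creme classifier exposing both prediction methods."""

    def predict_proba_one(self, x):
        # Fixed probabilities, just for illustration.
        return {False: 0.999, True: 0.001}

    def predict_one(self, x):
        # Hard label: the most probable class.
        proba = self.predict_proba_one(x)
        return max(proba, key=proba.get)


def pick_predict(flavor, model):
    """Return the prediction method that matches the given flavor."""
    if flavor == 'regression':
        return model.predict_one           # single value
    if flavor in ('binary', 'multiclass'):
        return model.predict_proba_one     # dict of class probabilities
    raise ValueError(f'unknown flavor: {flavor}')


clf = DummyClassifier()
print(pick_predict('binary', clf)({'x': 1}))      # probabilities dict
print(pick_predict('regression', clf)({'x': 1}))  # hard label: False
```

This also shows why setting the flavor to `regression` on a classifier silently returns hard labels: the dispatch only looks at the flavor, never at the model's capabilities.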
from chantilly.