openscoring / openscoring
REST web service for true real-time scoring (<1 ms) of Scikit-Learn, R and Apache Spark models
License: GNU Affero General Public License v3.0
I am getting this error:
Standard error:
May 16, 2019 10:09:35 AM org.jpmml.sklearn.Main run
INFO: Parsing PKL..
May 16, 2019 10:09:35 AM org.jpmml.sklearn.Main run
INFO: Parsed PKL in 60 ms.
May 16, 2019 10:09:35 AM org.jpmml.sklearn.Main run
INFO: Converting..
May 16, 2019 10:09:35 AM org.jpmml.sklearn.Main run
SEVERE: Failed to convert
java.lang.IllegalArgumentException: Attribute 'sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.loss_' has an unsupported value (Python class sklearn.ensemble._gb_losses.BinomialDeviance)
at org.jpmml.sklearn.CastFunction.apply(CastFunction.java:43)
at org.jpmml.sklearn.PyClassDict.get(PyClassDict.java:57)
at sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.getLoss(GradientBoostingClassifier.java:121)
at sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.encodeModel(GradientBoostingClassifier.java:67)
at sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.encodeModel(GradientBoostingClassifier.java:42)
at sklearn2pmml.pipeline.PMMLPipeline.encodePMML(PMMLPipeline.java:213)
at org.jpmml.sklearn.Main.run(Main.java:145)
at org.jpmml.sklearn.Main.main(Main.java:94)
Caused by: java.lang.ClassCastException: Cannot cast net.razorvine.pickle.objects.ClassDict to sklearn.ensemble.gradient_boosting.LossFunction
at java.base/java.lang.Class.cast(Class.java:3611)
at org.jpmml.sklearn.CastFunction.apply(CastFunction.java:41)
... 7 more
Exception in thread "main" java.lang.IllegalArgumentException: Attribute 'sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.loss_' has an unsupported value (Python class sklearn.ensemble._gb_losses.BinomialDeviance)
at org.jpmml.sklearn.CastFunction.apply(CastFunction.java:43)
at org.jpmml.sklearn.PyClassDict.get(PyClassDict.java:57)
at sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.getLoss(GradientBoostingClassifier.java:121)
at sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.encodeModel(GradientBoostingClassifier.java:67)
at sklearn.ensemble.gradient_boosting.GradientBoostingClassifier.encodeModel(GradientBoostingClassifier.java:42)
at sklearn2pmml.pipeline.PMMLPipeline.encodePMML(PMMLPipeline.java:213)
at org.jpmml.sklearn.Main.run(Main.java:145)
at org.jpmml.sklearn.Main.main(Main.java:94)
Caused by: java.lang.ClassCastException: Cannot cast net.razorvine.pickle.objects.ClassDict to sklearn.ensemble.gradient_boosting.LossFunction
at java.base/java.lang.Class.cast(Class.java:3611)
at org.jpmml.sklearn.CastFunction.apply(CastFunction.java:41)
... 7 more
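The root cause here appears to be a version mismatch: scikit-learn 0.21 moved the gradient-boosting loss classes into the private module sklearn.ensemble._gb_losses (the module named in the error above), which older converter builds do not recognize. A minimal sketch of the compatibility check, assuming the 0.21 boundary (the helper function and its names are hypothetical, not part of any library):

```python
# Hypothetical helper: can a converter that only knows the old
# sklearn.ensemble.gradient_boosting loss classes handle pickles from a
# given scikit-learn version? scikit-learn 0.21 relocated the classes
# to sklearn.ensemble._gb_losses.
def converter_supports(sklearn_version, converter_knows_gb_losses):
    major, minor = (int(p) for p in sklearn_version.split(".")[:2])
    relocated = (major, minor) >= (0, 21)
    return converter_knows_gb_losses or not relocated

# An old converter paired with scikit-learn 0.21.x reproduces the failure:
print(converter_supports("0.21.1", converter_knows_gb_losses=False))  # False
print(converter_supports("0.20.3", converter_knows_gb_losses=False))  # True
```

Upgrading sklearn2pmml/jpmml-sklearn to a release that knows the new module, or pinning scikit-learn below 0.21, is the usual way out.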
from fbd663c
The current uber JAR and WAR deployment options may be difficult to use where strict corporate security controls are in place (environments limited to regular Maven coordinates, pip installation, or Docker images).
When multiple clients try to load the same model in parallel, the server may fail with an internal server error.
Unfortunately, Openscoring doesn't log any error. Is it possible to turn on error logging in Openscoring?
The body of the failure response is:
{
"message" : "Concurrent modification"
}
Testing code:
import logging
import requests
from multiprocessing import Pool

workers = 2  # changing this to 1 fixes the issue

logging.basicConfig(format='%(asctime)-15s %(message)s')
logger = logging.getLogger()

state = 'connection error'

with open('DecisionTreeIris.pmml') as f:
    model = f.read()

def upload_model(n, *args, **kwargs):
    r = requests.put(
        f'http://localhost:8080/openscoring/model/DecisionTreeIris',
        data=model,
        timeout=10,)
    r.raise_for_status()
    return 1

with Pool(workers) as p:
    while True:
        prev_state = state
        try:
            res = p.map(upload_model, range(workers))
            print(res)
        except requests.ConnectionError:
            state = 'connection error'
        else:
            state = 'normal'
        if prev_state != state:
            logger.error(state)
Test code log:
$ python3 openscoring_test.py
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
File "/usr/lib/python3.6/multiprocessing/pool.py", line 119, in worker
result = (True, func(*args, **kwds))
File "/usr/lib/python3.6/multiprocessing/pool.py", line 44, in mapstar
return list(map(*args))
File "openscoring_test.py", line 20, in upload_model
r.raise_for_status()
File "/home/krab/.local/lib/python3.6/site-packages/requests/models.py", line 940, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://localhost:8080/openscoring/model/DecisionTreeIris
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "openscoring_test.py", line 27, in <module>
res = p.map(upload_model, range(workers))
File "/usr/lib/python3.6/multiprocessing/pool.py", line 288, in map
return self._map_async(func, iterable, mapstar, chunksize).get()
File "/usr/lib/python3.6/multiprocessing/pool.py", line 670, in get
raise self._value
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://localhost:8080/openscoring/model/DecisionTreeIris
Openscoring log:
$ java -jar openscoring-server-executable-1.4.5.jar
Jun 13, 2019 11:27:49 AM org.eclipse.jetty.util.log.Log initialized
INFO: Logging initialized @164ms to org.eclipse.jetty.util.log.Slf4jLog
Jun 13, 2019 11:27:49 AM org.openscoring.service.filters.NetworkSecurityContextFilter discoverLocalAddresses
INFO: Local network addresses: [127.0.1.1, 127.0.0.1]
Jun 13, 2019 11:27:49 AM org.openscoring.service.filters.ServiceIdentificationFilter discoverNameAndVersion
INFO: Service name and version: Openscoring/1.4.5
Jun 13, 2019 11:27:49 AM org.eclipse.jetty.server.Server doStart
INFO: jetty-9.4.z-SNAPSHOT; built: 2018-11-14T21:20:31.478Z; git: c4550056e785fb5665914545889f21dc136ad9e6; jvm 11.0.3+7-Ubuntu-1ubuntu218.04.1
Jun 13, 2019 11:27:50 AM org.eclipse.jetty.server.handler.ContextHandler doStart
INFO: Started o.e.j.s.ServletContextHandler@105fece7{/openscoring,null,AVAILABLE}
Jun 13, 2019 11:27:50 AM org.eclipse.jetty.server.AbstractConnector doStart
INFO: Started ServerConnector@40bffbca{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
Jun 13, 2019 11:27:50 AM org.eclipse.jetty.server.Server doStart
INFO: Started @925ms
Jun 13, 2019 11:27:57 AM org.openscoring.service.filters.NetworkSecurityContext isUserInRole
INFO: Admin role granted to network address 127.0.0.1
Jun 13, 2019 11:27:57 AM org.openscoring.service.filters.NetworkSecurityContext isUserInRole
INFO: Admin role granted to network address 127.0.0.1
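Until the server handles concurrent deployments gracefully, one client-side workaround is to serialize the PUT requests so only one worker deploys at a time. A sketch under the assumption, supported by the workers=1 observation above, that serial uploads succeed (upload is any zero-argument callable performing the PUT; the names are illustrative):

```python
import threading

# A process-wide lock; with a thread pool instead of a process pool,
# all workers share it, so model deployments happen one at a time.
deploy_lock = threading.Lock()

def upload_serialized(upload):
    # upload: a zero-argument callable performing the PUT request
    with deploy_lock:
        return upload()
```

With multiprocessing.Pool the lock would instead have to be shared via the pool initializer, since each worker is a separate process.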
The client is missing a feature: there is no way to get the list of deployed models.
I loaded a neural network with these parameters:
{
"id" : "WF1",
"summary" : "Neural network",
"properties" : {
"created.timestamp" : "2015-07-08T03:19:37.669+0000",
"accessed.timestamp" : null,
"file.size" : 12690,
"file.md5sum" : "899a823e007f417d53944666f75bf59c"
},
"schema" : {
"activeFields" : [ {
"id" : "BMI_SDS",
"opType" : "continuous"
}, {
"id" : "HBA1C",
"opType" : "continuous"
}, {
"id" : "INS_KG",
"opType" : "continuous"
}, {
"id" : "CHOL",
"opType" : "continuous"
}, {
"id" : "HDL",
"opType" : "continuous"
}, {
"id" : "HA",
"opType" : "categorical",
"values" : [ "0", "1", "2" ]
}, {
"id" : "TANNER",
"opType" : "categorical",
"values" : [ "1", "2", "3", "4", "5" ]
}, {
"id" : "Male sex",
"opType" : "categorical",
"values" : [ "0", "1" ]
} ],
"groupFields" : [ ],
"targetFields" : [ {
"id" : "lnGDR",
"opType" : "continuous"
} ],
"outputFields" : [ ]
}
}
However, every evaluation fails because of:
Jul 08, 2015 5:27:30 AM org.openscoring.service.ModelResource evaluate
INFO: Received EvaluationRequest{id=konrad-test, arguments={BMI_SDS=50, HBA1C=7, INS_KG=1.4, CHOL=150, HDL=40, HA=1, TANNER=1, Male sex=1}}
Jul 08, 2015 5:27:30 AM org.openscoring.service.ModelResource doEvaluate
SEVERE: Failed to evaluate
java.lang.NullPointerException
at org.jpmml.evaluator.TypeUtil.cast(TypeUtil.java:323)
at org.jpmml.evaluator.TypeUtil.parseOrCast(TypeUtil.java:60)
at org.jpmml.evaluator.ArgumentUtil.prepare(ArgumentUtil.java:60)
at org.jpmml.evaluator.ModelEvaluator.prepare(ModelEvaluator.java:113)
at org.jpmml.evaluator.EvaluatorUtil.prepare(EvaluatorUtil.java:120)
at org.openscoring.service.ModelResource.evaluate(ModelResource.java:538)
at org.openscoring.service.ModelResource.doEvaluate(ModelResource.java:398)
at org.openscoring.service.ModelResource.evaluate(ModelResource.java:259)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:205)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102)
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:308)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:291)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1140)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:403)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:386)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:334)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:221)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:808)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
The PMML file was generated by STATISTICA and adheres to the format. As far as I can tell from the above, the input is correct.
I want to gain persistence when deploying models on Openscoring, and to this end I used the --model-dir
option to activate auto-deployment. However, I would like models to persist when deployed through the API directly.
Is this already supported? If not, I would like to add this feature. In the 1.2.X API, the POST method to deploy a model is no longer part of the API. Maybe it could be reused as an alternate method to deploy a model in a persistent manner (by storing the PMML file locally in the PMML directory). Another solution could be a separate endpoint such as http://host/openscoring/persist/model.
Thanks!
The method ModelUtil#encodeOutputFields()
currently passes through all output field definitions. It should exclude output field definitions that have been marked as "hidden" by setting OutputField@isFinalResult="false".
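The intended filtering can be illustrated with a small sketch (the dictionary-based field representation below is hypothetical; in the actual codebase the check would be against the OutputField element's isFinalResult attribute):

```python
# Keep only output field definitions whose isFinalResult flag is not False;
# PMML treats isFinalResult="false" as hidden from the final result.
def visible_output_fields(output_fields):
    return [f for f in output_fields if f.get("isFinalResult", True)]

fields = [{"name": "decisionFunction(y)", "isFinalResult": False},
          {"name": "probability(1)"}]
print(visible_output_fields(fields))  # only probability(1) survives
```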
Hi,
I try to get the evaluation results, produced by the POST method, in the browser with a GET request, but I get a 405 error: "Method Not Allowed". Do you have any idea why?
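The 405 is expected: evaluation is defined as a POST, the service is stateless, and a browser address bar always issues GET, so a past POST result cannot be fetched later. A sketch of issuing the evaluation programmatically (the model id "MyModel" and the arguments are placeholders):

```python
import json
import urllib.request

body = json.dumps({"id": "req-1", "arguments": {"x": 1}}).encode("utf-8")
req = urllib.request.Request(
    "http://localhost:8080/openscoring/model/MyModel",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",  # a GET against this endpoint returns model info, not results
)
# urllib.request.urlopen(req) would then return the evaluation response.
```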
The identifier could be the MD5 hash of the uploaded PMML document (as already available in model properties), or some (extension) PMML element. For example, a special-purpose Constant element under some /PMML/Header/Annotation element (configurable via the Openscoring configuration file).
Hi
I am using the Openscoring webapp (WAR file) with Tomcat, running inside a container on Kubernetes. Between Openscoring and the host machine (from which I access it through a REST client) there is an nginx proxy server. I access Openscoring through nginx, with the nginx pod IP in the trusted addresses. However, I want access to work from the host machine IP (the visitor's machine). I tried removing the security filter component class from application.conf, to at least get access without security, and I tried putting the container IP in application.conf, but I still get "user not authorized". Can anyone suggest how to access the Openscoring webapp through the nginx proxy from the host machine where the REST client is running?
Is there any plan to support PMML 4.3?
I am using sklearn2pmml, which already creates PMML 4.3 files. I also see that jpmml-evaluator works with 4.3 as well.
At the moment, if a request fails, there is very little information explaining why. Generally, I have to recreate the issue in a dev environment with a debugger attached to work out why something isn't working.
If Openscoring could log what is going on, it would help tremendously in diagnosing these issues. It would also allow issues to be diagnosed retrospectively.
I imagine we will need to extend jpmml-evaluator to provide more information to Openscoring as a first step.
Dear Villu,
Sorry to bother you again, but I'm struggling with another issue.
Is there a way to receive, in the result or elsewhere, details about the evaluation of a neural network, such as probabilities (at the final layer, or at a specific neuron)?
It would be very helpful to include these details in my project; currently the response contains only the predicted class.
Thank you very much in advance. I really appreciate your work.
Konrad
It might be useful to package and run the module org.openscoring:openscoring-server
as a Windows service.
When I use JAR version 2.0 to predict with xgboost.pmml, the error "xgbValue" is not defined occurs, but it does not occur with JAR 1.4.5.
I would like to perform a health check on my openscoring server.
This is required for Amazon services (such as load balancing), as well as other features I need.
My health check includes:
Checking that I can send commands to the server (such as querying it for models). More importantly, I need a good indication that at least one model is deployed.
Is there any built in way to do it?
Thanks!
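One lightweight approach is to poll the model collection endpoint and treat a non-empty result as healthy. A sketch of the predicate only (assumption: GET /openscoring/model returns a JSON array of deployed model summaries; adjust if your Openscoring version wraps the list in an object):

```python
def is_healthy(models_response):
    # Healthy only if the endpoint answered with a list AND at least
    # one model is currently deployed.
    return isinstance(models_response, list) and len(models_response) > 0

print(is_healthy([{"id": "DecisionTreeIris"}]))  # True
print(is_healthy([]))                            # False: nothing deployed yet
```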
Hello,
I have a problem when running:
java -cp client-executable-1.4.1.jar org.openscoring.client.Deployer --model http://localhost:8080/openscoring/model/DecisionTreeIris --file DecisionTreeIris.pmml
The exception is:
Exception in thread "main" java.lang.IllegalStateException: InjectionManagerFactory not found.
at org.glassfish.jersey.internal.inject.Injections.lambda$lookupInjectionManagerFactory$0(Injections.java:98)
at java.util.Optional.orElseThrow(Unknown Source)
at org.glassfish.jersey.internal.inject.Injections.lookupInjectionManagerFactory(Injections.java:98)
at org.glassfish.jersey.internal.inject.Injections.createInjectionManager(Injections.java:68)
at org.glassfish.jersey.client.ClientConfig$State.initRuntime(ClientConfig.java:432)
at org.glassfish.jersey.internal.util.collection.Values$LazyValueImpl.get(Values.java:341)
at org.glassfish.jersey.client.ClientConfig.getRuntime(ClientConfig.java:826)
at org.glassfish.jersey.client.ClientRequest.getConfiguration(ClientRequest.java:285)
at org.glassfish.jersey.client.JerseyInvocation.validateHttpMethodAndEntity(JerseyInvocation.java:143)
at org.glassfish.jersey.client.JerseyInvocation.<init>(JerseyInvocation.java:112)
at org.glassfish.jersey.client.JerseyInvocation.<init>(JerseyInvocation.java:99)
at org.glassfish.jersey.client.JerseyInvocation$Builder.buildPut(JerseyInvocation.java:238)
at org.glassfish.jersey.client.JerseyInvocation$Builder.buildPut(JerseyInvocation.java:171)
at org.openscoring.client.Deployer$1.perform(Deployer.java:81)
at org.openscoring.client.Deployer$1.perform(Deployer.java:71)
at org.openscoring.client.ModelApplication.execute(ModelApplication.java:51)
at org.openscoring.client.Deployer.deploy(Deployer.java:90)
at org.openscoring.client.Deployer.run(Deployer.java:58)
at org.openscoring.client.Application.run(Application.java:64)
at org.openscoring.client.Deployer.main(Deployer.java:53)
How can I resolve this problem? Any help will be appreciated!
$ mvn -v
Apache Maven 3.5.0 (ff8f5e7444045639af65f6095c62210b5713f426; 2017-04-03T19:39:06Z)
Maven home: /var/www/apache-maven-3.5.0
Java version: 1.8.0_131, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.131-2.b11.30.amzn1.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.9.20-11.31.amzn1.x86_64", arch: "amd64", family: "unix"
$ mvn clean install
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Openscoring ........................................ SUCCESS [ 2.128 s]
[INFO] Openscoring Common ................................. SUCCESS [ 2.979 s]
[INFO] Openscoring Client ................................. SUCCESS [ 2.359 s]
[INFO] Openscoring Common GWT ............................. SUCCESS [ 3.184 s]
[INFO] Openscoring Service ................................ FAILURE [ 5.293 s]
[INFO] Openscoring Server ................................. SKIPPED
[INFO] Openscoring WebApp ................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16.337 s
[INFO] Finished at: 2017-06-05T20:18:09Z
[INFO] Final Memory: 44M/134M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.19.1:test (default-test) on project openscoring-service: There are test failures.
[ERROR]
[ERROR] Please refer to /var/www/openscoring/openscoring-service/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :openscoring-service
I am trying to update a model from outside the local network. Since I am not on localhost, I am not an admin.
My question is: how can I override this? Is there any option or configuration I can modify?
Thanks one more time.
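By default, the admin role is tied to local network addresses. Openscoring's bundled application.conf exposes the trusted-address list; a sketch of widening it would look like the fragment below (the exact key names are an assumption and should be verified against the application.conf shipped with your Openscoring version):

```hocon
// application.conf sketch -- verify key names against your version's
// bundled application.conf before relying on this.
networkSecurityContextFilter {
	// Network addresses that are granted the admin role
	trustedAddresses = ["127.0.0.1", "192.168.0.10"]
}
```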
When I try to do model evaluation using POST /model/${id} in single prediction mode, I get the error below.
{
"message" : "Attribute with value RegressionTable@targetCategory=1 is not valid"
}
This error occurs only when the model is a logistic regression; the PMML files for decision trees and random forests work fine.
What is the reason for this error?
The Openscoring web service should send a custom HTTP response header, so that Openscoring client libraries/tools have a way of verifying that they are talking to the correct server, and they can understand each other sufficiently well.
Use cases:
"The server at http://localhost:8080 did not identify itself as Openscoring"
. See openscoring/openscoring-python#2"The server at http://localhost:8080 requires older/newer client library"
I have a PMML file generated by KNIME. I deployed it to the server, and I can get its info from the server, such as:
curl http://localhost:8080/openscoring/model/dtclf
{
"id" : "dtclf",
"miningFunction" : "classification",
"summary" : "Tree model",
"properties" : {
"created.timestamp" : "2019-02-01T08:21:20.313+0000",
"accessed.timestamp" : "2019-02-01T08:21:20.329+0000",
"file.size" : 31382,
"file.md5sum" : "3232e7cc5735f4ede12ca024496ed878"
},
"schema" : {
"inputFields" : [ {
"id" : "VMail Message",
"dataType" : "integer",
"opType" : "continuous",
"values" : [ "[0.0, 51.0]" ]
}, {
.......
but when I POST a JSON document to evaluate, it returns an error:
$ curl -X POST --data-binary @query.json -H "Content-type: application/json" http://localhost:8080/openscoring/model/dtclf
{
"message" : "For input string: \"3.5\""
}
I tried to use the Python client, and got the same error.
The error traceback on the server is:
Failed to evaluate
java.lang.NumberFormatException: For input string: "3.5"
at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.base/java.lang.Long.parseLong(Long.java:692)
at java.base/java.lang.Long.parseLong(Long.java:817)
at org.jpmml.evaluator.TypeUtil.parseInteger(TypeUtil.java:103)
at org.jpmml.evaluator.TypeUtil.parse(TypeUtil.java:63)
at org.jpmml.evaluator.TypeUtil.parseOrCast(TypeUtil.java:47)
at org.jpmml.evaluator.FieldValue.create(FieldValue.java:489)
at org.jpmml.evaluator.FieldValueUtil.create(FieldValueUtil.java:100)
at org.jpmml.evaluator.ValueParser.parse(ValueParser.java:25)
at org.jpmml.evaluator.RichSimplePredicate.getValue(RichSimplePredicate.java:52)
at org.jpmml.evaluator.FieldValue.compareTo(FieldValue.java:181)
at org.jpmml.evaluator.FieldValue.compareTo(FieldValue.java:174)
at org.jpmml.evaluator.PredicateUtil.evaluateSimplePredicate(PredicateUtil.java:162)
at org.jpmml.evaluator.PredicateUtil.evaluatePredicate(PredicateUtil.java:81)
at org.jpmml.evaluator.PredicateUtil.evaluate(PredicateUtil.java:71)
at org.jpmml.evaluator.tree.TreeModelEvaluator.evaluateNode(TreeModelEvaluator.java:195)
at org.jpmml.evaluator.tree.TreeModelEvaluator.handleTrue(TreeModelEvaluator.java:212)
at org.jpmml.evaluator.tree.TreeModelEvaluator.handleTrue(TreeModelEvaluator.java:223)
at org.jpmml.evaluator.tree.TreeModelEvaluator.evaluateTree(TreeModelEvaluator.java:159)
at org.jpmml.evaluator.tree.TreeModelEvaluator.evaluateClassification(TreeModelEvaluator.java:128)
at org.jpmml.evaluator.ModelEvaluator.evaluate(ModelEvaluator.java:535)
at org.jpmml.evaluator.ModelEvaluator.evaluate(ModelEvaluator.java:503)
at org.openscoring.service.ModelResource.evaluate(ModelResource.java:569)
at org.openscoring.service.ModelResource.doEvaluate(ModelResource.java:417)
at org.openscoring.service.ModelResource.evaluate(ModelResource.java:265)
at jdk.internal.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:243)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:272)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:268)
at org.glassfish.jersey.internal.Errors.process(Errors.java:316)
at org.glassfish.jersey.internal.Errors.process(Errors.java:298)
at org.glassfish.jersey.internal.Errors.process(Errors.java:268)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:416)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:370)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:389)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:342)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:229)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:867)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:542)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:502)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
at java.base/java.lang.Thread.run(Thread.java:834)
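The model schema above declares "VMail Message" with dataType "integer", so the server parses its value with Long.parseLong, and the string "3.5" fails. Coercing arguments to the declared types on the client side before sending is one workaround. A sketch (whether rounding, truncating, or rejecting is appropriate depends on the data; the helper name is illustrative):

```python
def coerce_to_schema(arguments, integer_fields):
    # Round float values destined for integer-typed fields; leave others alone.
    return {name: (int(round(value)) if name in integer_fields else value)
            for name, value in arguments.items()}

args = {"VMail Message": 3.5, "Day Mins": 3.5}
print(coerce_to_schema(args, integer_fields={"VMail Message"}))
# {'VMail Message': 4, 'Day Mins': 3.5}
```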
The method CsvUtil#getFormat(BufferedReader)
allocates a 10 kB internal buffer for Reader#mark(int)
activity. If the method CsvUtil#checkFormat(BufferedReader)
reads more characters than that, then the subsequent Reader#reset()
call will fail with an IOException stating "Mark invalid".
For example, see the following issue:
jpmml/jpmml-evaluator#24
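The failure mode can be simulated with a tiny model of Java's mark/reset contract (the class below is purely illustrative, not from the actual codebase): a mark with a fixed read-ahead limit becomes invalid once more characters than the limit have been consumed.

```python
class MarkedReader:
    """Minimal model of java.io.Reader's mark(readAheadLimit)/reset()."""
    def __init__(self, text, mark_limit):
        self.text, self.pos = text, 0
        self.mark_pos, self.limit = None, mark_limit

    def mark(self):
        self.mark_pos = self.pos

    def read(self, n):
        chunk = self.text[self.pos:self.pos + n]
        self.pos += len(chunk)
        return chunk

    def reset(self):
        # The mark is only honoured within the read-ahead limit.
        if self.mark_pos is None or self.pos - self.mark_pos > self.limit:
            raise IOError("Mark invalid")
        self.pos = self.mark_pos

reader = MarkedReader("x" * 20000, mark_limit=10240)  # the 10 kB buffer
reader.mark()
reader.read(15000)  # checkFormat() reading past the limit...
# reader.reset() would now raise IOError("Mark invalid"), as described above.
```

Raising the buffer size passed to mark(), or bounding how much checkFormat() reads, would avoid the exception.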
Hi again,
I'm struggling with a new issue on a neural network. Here is the problem:
konrad@gen:~$ curl -H 'Content-Type: application/json' -X POST -d '{"id":"Uw4Fz","arguments" : { "hsa-miR-1246":"0","hsa-miR-1307-5p":"0","hsa-miR-150-5p":"0","hsa-miR-1307-5p":"0","hsa-miR-16-2-3p":"0","hsa-miR-200a-3p":"0","hsa-miR-200c-3p":"0","hsa-miR-203a":"0","hsa-miR-23b-3p":"0","hsa-miR-29a-3p":"0","hsa-miR-30d-5p":"0","hsa-miR-320b":"0","hsa-miR-320c":"0","hsa-miR-320d":"0","hsa-miR-32-5p":"0","hsa-miR-335-5p":"0","hsa-miR-450b-5p":"0","hsa-miR-486-3p":"0","hsa-miR-92a-3p":"0"}}' http://localhost:8080/openscoring/model/NNCFS
{
"message" : "DataField"
}
The model looks like this:
konrad@gen:~$ curl -X GET http://localhost:8080/openscoring/model/NNCFS
{
"id" : "NNCFS",
"miningFunction" : "classification",
"summary" : "Neural network",
"properties" : {
"created.timestamp" : "2016-07-15T12:37:22.249+0000",
"accessed.timestamp" : "2016-07-15T16:28:28.503+0000",
"file.size" : 9830,
"file.md5sum" : "ed5534fb3c8e3d120b99bfabb9f3f0d1"
},
"schema" : {
"activeFields" : [ {
"id" : "hsa-miR-16-2-3p",
"opType" : "continuous"
}, {
"id" : "hsa-miR-200a-3p",
"opType" : "continuous"
}, {
"id" : "hsa-miR-200c-3p",
"opType" : "continuous"
}, {
"id" : "hsa-miR-320b",
"opType" : "continuous"
}, {
"id" : "hsa-miR-320d",
"opType" : "continuous"
} ],
"groupFields" : [ ],
"targetFields" : [ {
"id" : "2 group",
"opType" : "categorical",
"values" : [ "Cancer", "Controls+Borderline" ]
} ],
"outputFields" : [ ]
}
}
I don't understand the error message or what the problem is here.
Thank you very much in advance for your help!
Konrad
Hello,
I have a problem: when I put a .pmml file in the right folder, I have to stop and restart the server before it sees the new file. How can I make it pick up the file without restarting the server?
Regards,
Benjamin
Hi, I've used https://github.com/jpmml/jpmml-lightgbm to generate a PMML file from a LightGBM model:
lightgbm.txt
lightgbm.pmml.txt
When I load this into the server, I get the following response:
curl -X PUT --data-binary @lightgbm.pmml -H "Content-type: text/xml" http://localhost:8080/openscoring/model/lightgbm
{
"id" : "lightgbm",
"miningFunction" : "classification",
"summary" : "Ensemble model",
"properties" : {
"created.timestamp" : "2019-04-16T07:24:15.570+0000",
"accessed.timestamp" : null,
"file.size" : 613912,
"file.md5sum" : "b00ce9c05625965800699ab94da460ab"
},
"schema" : {
"inputFields" : [ {
"id" : "region",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[0.0, 61.0]" ]
}, {
"id" : "site_group",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[0.0, 178.0]" ]
}, {
"id" : "clean_title_cos_sim_keywords_string",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[-1.0, 1.0]" ]
}, {
"id" : "clean_title_cos_sim_client_id",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[-1.0, 1.0]" ]
}, {
"id" : "clean_description_cos_sim_keywords_string",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[-1.0, 1.0]" ]
}, {
"id" : "clean_description_cos_sim_client_id",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[-1.0, 0.726566731929779]" ]
}, {
"id" : "client_relevancy",
"dataType" : "double",
"opType" : "continuous",
"values" : [ "[-1.0, 1.0]" ]
} ],
"targetFields" : [ {
"id" : "_target",
"dataType" : "integer",
"opType" : "categorical",
"values" : [ "0", "1" ]
} ],
"outputFields" : [ {
"id" : "probability(0)",
"dataType" : "double",
"opType" : "continuous"
}, {
"id" : "probability(1)",
"dataType" : "double",
"opType" : "continuous"
} ]
}
}
However, when I try to score against it I get the following error:
curl -X POST --data-binary @lightgbm_request.json -H "Content-type: application/json" http://localhost:8080/openscoring/model/lightgbm
{
"message" : "Field \"transformedLgbmValue\" is not defined"
}
with the following request JSON:
{
"id": "1",
"arguments": {
"region": 58.0,
"site_group": 10.0,
"clean_title_cos_sim_keywords_string": 0.5951485633850098,
"clean_title_cos_sim_client_id": 0.04875922203063965,
"clean_description_cos_sim_keywords_string": 0.46828553080558777,
"clean_description_cos_sim_client_id": 0.1009560078382492,
"client_relevancy": 0.64421546459198
}
}
Based on the returned model schema, I don't understand why I can't score this. Looking through the PMML file there is a transformedLgbmValue field, but it isn't among the expected inputFields.
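For reference, the same evaluation request shown in the curl calls above can be issued from Python. This is a sketch using only the standard library; the helper names (`build_evaluation_request`, `evaluate`) are hypothetical, but the endpoint shape mirrors the curl commands in this thread:

```python
import json
from urllib import request as urlrequest

def build_evaluation_request(base_url, model_id, arguments, record_id="1"):
    # POST body for /model/{id}: {"id": ..., "arguments": {...}},
    # matching the JSON payload shown above
    url = "{}/model/{}".format(base_url, model_id)
    body = json.dumps({"id": record_id, "arguments": arguments}).encode("utf-8")
    return url, body

def evaluate(base_url, model_id, arguments, record_id="1"):
    url, body = build_evaluation_request(base_url, model_id, arguments, record_id)
    req = urlrequest.Request(url, data=body,
                             headers={"Content-type": "application/json"})
    with urlrequest.urlopen(req) as resp:  # raises HTTPError on non-2xx
        return json.load(resp)
```

Calling `evaluate("http://localhost:8080/openscoring", "lightgbm", {...})` assumes a running server, so only the request-building part is shown as self-contained.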
Good morning, everybody!
I wanted to know if anybody has ever run into this problem.
I wanted to follow this tutorial. For this reason I modified my pom.xml in order to generate the converter executable, which is below:
When I open a terminal in the correct directory and run:
mvn clean install
there seems to be no problem (no error messages).
But when I try to execute this command, I get this message:
command: java -jar converter-executable-1.3-SNAPSHOT.jar --pkl-input pipeline.pkl.z --pmml-output pipeline.pmml
answer: no main manifest attribute, in converter-executable-1.3-SNAPSHOT.jar (message originally in French: "aucun attribut manifest principal dans converter-executable-1.3-SNAPSHOT.jar")
Does anybody have the solution? Thank you very much.
Hi,
I have a total of 2116 pmml.gz files, 1.9 GB in size. When I deployed all the files in one go on an Azure server, it took 256 GB of memory to load them all. But when I checked memory usage after one day, it had decreased to 120 GB, and it keeps decreasing day by day. Could you suggest a fix so that I can save memory?
Thank You for your help
Hi, I deployed it on CentOS 6.7 (64-bit OS) with Apache Tomcat 8.0.27, using the WAR from 1.2.8. But when I ran
curl -X PUT --data-binary @test.xml -H "Content-type: text/xml" http://localhost:8080/openscoring/model/justtest
I received:
{
"message" : "Forbidden"
}
But the same setup on a Windows server works. Is there a separate setting for Linux?
It would be great if an OpenAPI definition file were provided with this project, to enable the service to be more easily exposed in environments like Bluemix that have API tooling.
At the moment, if the evaluation request does not contain mappings for some input fields, then Openscoring logs a warning and provides a "default mapping" in the form of a missing value. However, it should be possible to treat incomplete evaluation requests as a user error.
The "strict schema mode" can be extremely helpful when upgrading/updating models, making sure that all client applications are aware of the correct set of input fields.
Similarly, it might be desirable to detect and refuse to deal with evaluation requests that contain "unused" mappings.
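The proposed "strict schema mode" amounts to an exact-match check between the request's argument keys and the model's declared input fields. A minimal sketch (function name and error format are hypothetical, not the actual Openscoring implementation):

```python
def check_arguments(input_field_ids, arguments, strict=True):
    """Reject evaluation requests whose argument keys do not exactly
    match the model's declared input fields (sketch of the proposal)."""
    missing = sorted(set(input_field_ids) - set(arguments))
    unused = sorted(set(arguments) - set(input_field_ids))
    if strict and (missing or unused):
        raise ValueError(
            "missing fields {}, unused fields {}".format(missing, unused))
    return missing, unused
```

With `strict=False` the function reproduces the current behaviour of merely reporting the mismatch instead of refusing the request.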
Dear Villu,
My issue is the continuation of #14.
I have few more neural networks I would like to score and gain probability measures.
Here I put my PMML files:
Ovaries-NNqPCR2.txt
Ovaries-NNKeller.txt
For example one can receive the following output:
{ "id" : "Qz2XDQoJ5HiF7jmc4GDn", "result" : { "ID_REF" : "Controls+Borderline", "probability_disease" : -0.4489425455522529, "probability_no_disease" : 1.3847227148467958 } }
The output probabilities produced by these networks don't sum to 1, and I'm not able to find a reason for this phenomenon.
Is there another issue with the PMML files, or did I miss something in the construction of the network?
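The observation about the sums can be checked directly against the sample response above:

```python
import json

# Sample Openscoring response copied from this issue
response = json.loads(
    '{"id": "Qz2XDQoJ5HiF7jmc4GDn", "result": '
    '{"ID_REF": "Controls+Borderline", '
    '"probability_disease": -0.4489425455522529, '
    '"probability_no_disease": 1.3847227148467958}}')
result = response["result"]
total = result["probability_disease"] + result["probability_no_disease"]
# total is roughly 0.9358, not 1.0, and one "probability" is negative
```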
Thank you very much for your help in advance.
Best regards,
Konrad
I have created a PMML that runs on one openscoring instance but not on another. On one openscoring instance, the model returns the expected result while on the other it returns:
org.jpmml.evaluator.FunctionException (at or around line 21): Missing arguments
at org.jpmml.evaluator.functions.AbstractFunction.checkArguments(AbstractFunction.java:60)
at org.jpmml.evaluator.functions.AbstractFunction.checkArguments(AbstractFunction.java:40)
at org.jpmml.evaluator.functions.EqualityFunction.evaluate(EqualityFunction.java:40)
at org.jpmml.evaluator.FunctionUtil.evaluate(FunctionUtil.java:47)
at org.jpmml.evaluator.ExpressionUtil.evaluateApply(ExpressionUtil.java:439)
at org.jpmml.evaluator.ExpressionUtil.evaluateExpression(ExpressionUtil.java:106)
at org.jpmml.evaluator.ExpressionUtil.evaluate(ExpressionUtil.java:66)
at org.jpmml.evaluator.ExpressionUtil.evaluateApply(ExpressionUtil.java:356)
at org.jpmml.evaluator.ExpressionUtil.evaluateExpression(ExpressionUtil.java:106)
at org.jpmml.evaluator.ExpressionUtil.evaluate(ExpressionUtil.java:66)
at org.jpmml.evaluator.OutputUtil.evaluate(OutputUtil.java:203)
at org.jpmml.evaluator.rule_set.RuleSetModelEvaluator.evaluate(RuleSetModelEvaluator.java:112)
at org.jpmml.evaluator.ModelEvaluator.evaluate(ModelEvaluator.java:373)
at org.openscoring.service.ModelResource.evaluate(ModelResource.java:570)
at org.openscoring.service.ModelResource.doEvaluate(ModelResource.java:418)
at org.openscoring.service.ModelResource.evaluate(ModelResource.java:266)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:205)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102)
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:473)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:427)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:258)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
This issue seems related to the output field and whether or not the predicted field is included.
I have attached a test PMML that exhibits the behaviour described above.
Any ideas why Openscoring would exhibit two different behaviours for this one PMML?
PMML:
test.txt
It has been reported via e-mail that the project cannot be built using Java 1.9 EA due to the following compilation error:
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:compile (default-compile) on project openscoring-service: Compilation failure
/home/german/openscoring/openscoring-service/src/main/java/org/openscoring/service/CsvUtil.java:[172,31] error: incompatible types: LinkedHashMap<String,CAP#1> cannot be converted to Map<String,Object>
The project builds correctly with Java 1.7 and 1.8.
Hello,
We start the server and register models. When we restart the server, we are no longer able to see our previous models. How can we make sure the models are stored?
Thanks
Hello,
I have deployed Openscoring on an Ubuntu 12.04 LTS using the server executable uber-jar; however, I am having some trouble PUT-ing to the application:
- I can GET a list of models (empty) using the Chrome Postman REST client.
- I can PUT a PMML file locally using curl. I get the correct response back, and a subsequent GET lists the model.
- When I PUT remotely, using the Postman REST client, I get a "403 Forbidden" back.
- I tried the --model-dir option, directing deployed models to a folder in my home folder. This still gives me a "403 Forbidden" when I try to deploy a model.
Note that I am doing all of this as the default "ubuntu" user, and deploying the application from a folder in my home folder.
Any thoughts on what I might be missing?
Thanks.
Hi,
I'm wondering if there is a configuration to disable specific HTTP methods (ex: PUT & DELETE)? We would only like to provide access for end-users to use the models but not to make any kind of modifications to the models. It seems like we can load the models from a directory and only update them when the application is redeployed. But this doesn't solve the access to PUT and DELETE methods issue.
If there is some documentation or a specific class you could point me to that would be appreciated.
Thanks,
We trained an XGBoost model using sklearn2pmml.
We used the xgboost package's XGBClassifier and initialized it with parameters.
Here's the code for that:
from sklearn2pmml import PMMLPipeline
from sklearn.externals import joblib

# xg_model is our model object, already initialized.
xg_pipeline = PMMLPipeline([("estimator", xg_model)])
xg_pipeline.fit(X_train, y_train)
joblib.dump(xg_pipeline, "xg_pipeline.pkl.z", compress = 9)
We converted this pkl.z file to a PMML model and are using it with the Openscoring web service.
There is a slight difference between the Openscoring scores and the predictions we get when we use the pkl model directly via sklearn's joblib module.
Appreciate any help.
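To make "slight difference" concrete, a small stdlib-only helper (hypothetical, not part of Openscoring or sklearn2pmml) can quantify the gap between the two sets of predictions:

```python
def max_abs_diff(expected, actual):
    """Largest absolute difference between two probability mappings,
    e.g. joblib predict_proba output vs. Openscoring response fields.
    Missing keys are treated as 0.0."""
    keys = set(expected) | set(actual)
    return max(abs(expected.get(k, 0.0) - actual.get(k, 0.0)) for k in keys)
```

Comparing, say, `{"probability(1)": 0.8123}` from joblib against the Openscoring `probability(1)` output field would then show whether the discrepancy is within floating-point tolerance or something larger.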
I have just deployed Openscoring as a Bluemix Cloud Foundry application. Bluemix provides an API gateway that can sit in front of Cloud Foundry applications and perform authentication. To get this working, I have cloned your repo and hardcoded my NetworkSecurityContext.java like so:
// boolean trusted = isTrusted(address);
boolean trusted = true; // move responsibility for auth to API
It would be desirable if I could do this via a configuration rather than patching, e.g.
networkSecurityContextFilter {
// If you wish to push security to another tier (e.g. api layer) you can set disabled = true
// to prevent openscoring from authenticating requests requiring the admin role
disabled = false
// List of trusted IP addresses. An empty list defaults to all local network IP addresses.
// A client that originates from a trusted IP address (as indicated by the value of the CGI variable REMOTE_ADDR) is granted the "admin" role.
trustedAddresses = []
}
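The intended semantics of the proposed configuration can be sketched in a few lines (Python pseudocode of the Java filter; the fallback-to-local-addresses behaviour is my assumption based on the comment in the config block above):

```python
LOCAL_ADDRESSES = {"127.0.0.1", "0:0:0:0:0:0:0:1", "::1"}

def is_trusted(remote_addr, trusted_addresses, disabled=False):
    """disabled=True pushes auth to another tier and grants the admin
    role to every caller; otherwise an empty trustedAddresses list
    falls back to local network addresses."""
    if disabled:
        return True
    effective = set(trusted_addresses) or LOCAL_ADDRESSES
    return remote_addr in effective
```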
I am using Jenkins to deploy the Openscoring WAR file on Tomcat on my server. I cannot deploy from my machine for security reasons, and I want to use the --model-dir parameter to define a PMML repository folder on the server, so I can deploy my models by copying them into that folder. Can you please advise how I can pass a parameter when deploying a WAR file on Tomcat?
I was wondering, is there any way that we can set up models when the service starts?
What would be the appropriate NetworkSecurityContextFilter parameters to be able to bind Openscoring to the host port from within a Docker container?
Since the IPs will vary by machine, hardcoding a list of trusted IPs won't work, unfortunately. Disabling NetworkSecurityContextFilter doesn't seem to work either (Openscoring keeps listening for local connections only).
I am trying to run the CSVEvaluator curl command through Postman, but it's not working: it always throws a NullPointerException when I don't provide the output file path, and if I do provide one, it doesn't write any output to it.
Could you provide the HTTP call for this from Postman?
I'm deploying a logistic model, and in the model details I don't see the beta values of the logit model (they are in the ParamMatrix element in PMML 4.2).
Is there any way of exposing these in the returned JSON?
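Until such an option exists, one workaround is to read the coefficients straight out of the PMML document itself. In PMML 4.2, a GeneralRegressionModel stores them as PCell rows (with `parameterName` and `beta` attributes) inside ParamMatrix. A sketch with the standard library (function name is hypothetical):

```python
import xml.etree.ElementTree as ET

PMML_NS = "{http://www.dmg.org/PMML-4_2}"

def beta_values(pmml_text):
    """Extract (parameterName, beta) pairs from the ParamMatrix
    of a PMML 4.2 document, instead of expecting them in the
    Openscoring model-detail JSON."""
    root = ET.fromstring(pmml_text)
    return [(cell.get("parameterName"), float(cell.get("beta")))
            for cell in root.iter(PMML_NS + "PCell")]
```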
Hello,
I am trying to make a prediction with these values:
kmeans_topredict.txt
I run the commands:
curl -X PUT --data-binary @kmeans.pmml -H "Content-type: text/xml" http://localhost:8080/openscoring/model/kmeans
and, to predict:
curl -X POST --data-binary @kmeans2.json -H "Content-type: application/json" http://sandbox:8080/openscoring/model/kmeans
and I get this error; do you have any idea?
Dec 21, 2015 8:50:20 AM org.openscoring.service.ModelResource evaluate
FINE: Evaluation request record-001 has prepared arguments: {field_0=ContinuousValue{opType=CONTINUOUS, dataType=DOUBLE, value=20.0}, field_1=ContinuousValue{opType=CONTINUOUS, dataType=DOUBLE, value=756.72}, field_2=ContinuousValue{opType=CONTINUOUS, dataType=DOUBLE, value=1500.0}, field_3=ContinuousValue{opType=CONTINUOUS, dataType=DOUBLE, value=5.0}}
Dec 21, 2015 8:50:20 AM org.openscoring.service.ModelResource doEvaluate
SEVERE: Failed to evaluate
java.lang.UnsupportedOperationException
at java.util.Collections$1.remove(Collections.java:4684)
How do I enable cross-origin (CORS) requests?
I am getting the error below:
Access to XMLHttpRequest at 'http://localhost:8080/openscoring/model/Stroke' from origin 'http://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
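The browser error means the response from Openscoring lacks an `Access-Control-Allow-Origin` header. The actual fix belongs on the server side (for example, a CORS filter in front of the service); this sketch only illustrates the header the browser is asking for (function name and default origin are illustrative):

```python
def with_cors_headers(headers, allowed_origin="http://localhost:3000"):
    """Return a copy of a response-header mapping with the CORS
    allow-origin header the browser error above complains about."""
    out = dict(headers)
    out["Access-Control-Allow-Origin"] = allowed_origin
    return out
```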
If I create a model using jpmml-tensorflow, can I score it using Openscoring? If there is no TensorFlow Serving integration yet, I would love to contribute it while doing a benchmark with tensorflow-serving...
Hi,
I am deploying the Openscoring WAR on Tomcat to access it remotely. But when I started it and tried to PUT my PMML model, it returned a "403 user not authorized" message. I saw in the documentation that you can configure the trusted IPs through application.conf when running the Openscoring JAR.
So my question is: how can I do the same thing with the Openscoring WAR on Tomcat? By changing the web.xml in WEB-INF?
Thanks,
Cong
Hi,
I am using the Openscoring server to deploy my 400 models in one go, but when I gave all these models as PMML files to the server, it was not able to load all of them and threw the error below. Please tell me how I can fix the problem.
Thank You
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.sun.xml.bind.v2.runtime.ClassBeanInfoImpl.createInstance(ClassBeanInfoImpl.java:298)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext.createInstance(UnmarshallingContext.java:701)
at com.sun.xml.bind.v2.runtime.unmarshaller.StructureLoader.startElement(StructureLoader.java:186)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext._startElement(UnmarshallingContext.java:576)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext.startElement(UnmarshallingContext.java:555)
at com.sun.xml.bind.v2.runtime.unmarshaller.SAXConnector.startElement(SAXConnector.java:168)
at org.xml.sax.helpers.XMLFilterImpl.startElement(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at org.xml.sax.helpers.XMLFilterImpl.parse(XMLFilterImpl.java:357)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal0(UnmarshallerImpl.java:258)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal(UnmarshallerImpl.java:229)
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.unmarshal(AbstractUnmarshallerImpl.java:140)
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.unmarshal(AbstractUnmarshallerImpl.java:123)
at org.openscoring.service.ModelRegistry.unmarshal(ModelRegistry.java:190)
at org.openscoring.service.ModelRegistry.load(ModelRegistry.java:114)
at org.openscoring.service.ModelResource.doDeploy(ModelResource.java:179)
at org.openscoring.service.ModelResource.deploy(ModelResource.java:157)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
Jul 25, 2017 9:17:12 AM org.eclipse.jetty.server.HttpChannel handleException
WARNING: Could not send response error 500: javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: java.lang.OutOfMemoryError: GC overhead limit exceeded