
play-json-schema-validator's Introduction

THIS PROJECT IS UNMAINTAINED

Issues will not be fixed and new features will not be implemented unless PRs are submitted. The former maintainer no longer has the time or resources to continue working on this project.

If you're interested in taking over, please reply to #146.

Play JSON Schema Validator


This is a JSON schema (draft v4/draft v7) validation library for Scala based on Play's JSON library.

If you experience any issues or have feature requests etc., please don't hesitate to file an issue. Thanks!

Add an additional resolver to your build.sbt file:

resolvers += "emueller-bintray" at "http://dl.bintray.com/emueller/maven"

Then add the dependency (supporting Scala 2.12/2.13):

libraryDependencies += "com.eclipsesource"  %% "play-json-schema-validator" % "0.9.5"

Please also see the respective release notes.

Usage

Schemas can be parsed by passing the schema string to Json.fromJson. Starting with 0.9.5 (which adds draft 7 support), Reads and Writes have become version-specific, hence you need to import them via the respective Version object:

  import com.eclipsesource.schema._
  import play.api.libs.json.Json
  import Version7._ // required since 0.9.5

  val schema = Json.fromJson[SchemaType](Json.parse(
    """{
      |  "properties": {
      |    "$id":   { "type": "integer" },
      |    "title": { "type": "string" },
      |    "body":  { "type": "string" }
      |  }
      |}""".stripMargin)).get

With a schema at hand, we can now validate JsValues via the SchemaValidator:

val validator = SchemaValidator(Some(Version7))
validator.validate(schema, json)

validate returns a JsResult[A]. JsResult is either a JsSuccess or a JsError. validate also comes with overloaded variants that additionally accept Play Reads or Writes instances. This is useful for mapping JsValues onto case classes and vice versa:

validate[A](schemaUrl: URL, input: => JsValue, reads: Reads[A]) : JsResult[A]
validate[A](schemaUrl: URL, input: A, writes: Writes[A]): JsResult[JsValue] 
validate[A: Format](schemaUrl: URL, input: A): JsResult[A] 

Error Reporting

In case the validate method returns a failure, errors can be converted to JSON by calling the toJson method. Below is an example taken from the example app:

import com.eclipsesource.schema._ // brings toJson into scope

val result = validator.validate(schema, json, Post.reads)
result.fold(
  invalid = { errors =>  BadRequest(errors.toJson) },
  valid = { post => ... } 
)

Errors feature a schemaPath, an instancePath, a value and a msgs property. While schemaPath and instancePath should be self-explanatory, value holds the validated value and msgs holds all errors related to that value. The value of the msgs property is always an array. Below is an example, again taken from the example app.

{
  "schemaPath" : "#/properties/title",
  "keyword": "minLength",
  "instancePath" : "/title",
  "value" : "a",
  "msgs" : [ "a violates min length of 3", "a does not match pattern ^[A-Z].*" ],
  "errors": []
}

The value of schemaPath will be updated when following any refs, hence when validating

{
  "properties": {
    "foo": {"type": "integer"},
    "bar": {"$ref": "#/properties/foo"}
  }
}

the generated error report's schemaPath property will point to #/properties/foo.

id

In case the schema to validate against makes use of the id property to alter resolution scope (or if the schema has been loaded via a URL), the error report also contains a resolutionScope property.

anyOf, oneOf, allOf

In case of allOf, anyOf and oneOf, the errors array property holds the actual sub errors. For instance, if we have a schema like the following:

{
  "anyOf": [
    { "type": "integer" },
    { "minimum": 2      }
  ]
}

and we validate the value 1.5, the toJson method returns this error:

[ {
  "schemaPath" : "#",
  "errors" : {
    "/anyOf/0" : [ {
      "schemaPath" : "#/anyOf/0",
      "errors" : { },
      "msgs" : [ "Wrong type. Expected integer, was number" ],
      "value" : 1.5,
      "instancePath" : "/"
    } ],
    "/anyOf/1" : [ {
      "schemaPath" : "#/anyOf/1",
      "errors" : { },
      "msgs" : [ "minimum violated: 1.5 is less than 2" ],
      "value" : 1.5,
      "instancePath" : "/"
    } ]
  },
  "msgs" : [ "Instance does not match any of the schemas" ],
  "value" : 1.5,
  "instancePath" : "/"
} ]

Customizable error reporting

The validator allows you to alter error messages via scala-i18n, e.g. for localizing error reports. You can alter messages by placing a messages_XX.txt file into your resources folder (by default conf). The keys used for replacing messages can be found here. In case you use the validator within a Play application, you'll need to convert Play's Lang and make it implicitly available to the SchemaValidator, e.g. via:

implicit def fromPlayLang(lang: Lang): com.osinka.i18n.Lang = com.osinka.i18n.Lang(lang.locale)

Example

An online demo of the library can be found here.

See the respective github repo for the source code.

play-json-schema-validator's People

Contributors

edgarmueller, edrevo, etorreborre, fhuertas, mikegehard, ottovw


play-json-schema-validator's Issues

regex format is unknown

It would be good to have a special string-format validator for 'regex'-formatted string properties.

Incorrect resolution behavior for refs contained in cache

I'm not quite sure, but it seems it is not possible to preload schemas for the SchemaValidator and still use relative referencing inside the schema. What I mean is, when I do

val validator = SchemaValidator().addSchema(schemaUrl.toString, schema)
validator.validate(schema, json)

the relative reference "$ref": "common.schema.json#contact" inside a subschema of the schema is not resolved. But when I pass the schema as a URL to the validate method, the reference is resolved correctly.

The thing is, it seems that when I pass the reference as a URL, it loads the schema file and parses it again, even though I had previously added the schema to the validator. I base my guess on the fact that passing the schema as a URL takes roughly twice as long compared to the other version.

It feels like the URL version gets the correct resolution scope, but when the schema is passed preloaded, it fails to do so. This happens because the schema file is local, but the location is only known at runtime, so it cannot be written directly into the schema as the id. The solution could be to provide a method for validating with a schema under a certain id, for example

validator.validate(schema, schemaUrl.toString, json)

But there is another problem here. As a workaround, I tried to modify the schema JSON after loading it so that it has the correct id, then created the schema object from that. Unfortunately this causes the validation to run out of stack. I included the stack trace, but it alone may not be of much help.
stack-overflow.txt

Scala 2.12 version

The key dependency blocking this project from cross-compiling against Scala 2.12 is play-json, which doesn't yet provide a Scala 2.12 version.

when supplied a valid (using fge) json schema draft v3, it incorrectly returns [ "Invalid JSON schema" ]

Ok, I see now that the repo states it supports draft 4, but I got to the demo first; the problem is that the demo at https://ancient-atoll-3956.herokuapp.com/ should state that it only supports draft 4, because otherwise, when you compare it to other available validators, it appears not to work.
When the json schema has this line :
"$schema":"http://json-schema.org/draft-03/schema",
at the root, it is valid according to draft v3.
Compare it to http://json-schema-validator.herokuapp.com/ (which uses https://github.com/fge/json-schema-validator)

Double.Min|MaxValue validation against integer property type

When I attempt to validate

{
  "intprop" : 3.14
}

against the schema:

{
  "type" : "object",
  "properties" : {
    "intprop" : {
      "type" : "integer",
      "format" : "int32"
    }
  }
}

I receive back:

[ {
  "schemaPath" : "#/properties/intprop",
  "errors" : { },
  "keyword" : "type",
  "msgs" : [ "Wrong type. Expected integer, was number." ],
  "value" : 3.14,
  "instancePath" : "/intprop"
} ]

This is correct.

However, when I validate the value:

{
  "intprop" : -1.7976931348623157E+308
}

against the same schema, no validation error is trapped. I expected it to be the same as for the 3.14 value.

Scala 2.10 support?

Looks like there is crossScalaSupport for 2.10 but the maven repo only has a _2.11 path.

How do i tag required keys ?

For example, if "title" is a required key, how do I define it in the schema?

Here is my code. I'm expecting validation to fail, but it passes.

val input3 = Json.toJson(
    Map(
      "id" -> Json.toJson(123),
      "body" -> Json.toJson("body")
    ))

val schema = Json.fromJson[SchemaType](
    Json.parse(
      """{
        |"properties": {
        |  "id":    { "type": "integer" },
        |  "title": { "type": "string", "minLength": 3 },
        |  "body":  { "type": "string" },
        |  "required": ["title"]
        |}
        |}""".stripMargin
    )
  ).get

println(Json.prettyPrint(input3))
val input3_result = SchemaValidator.validate(schema, input3)
println(f"input3 result: $input3_result")

RESULT:

run
[info] Compiling 1 Scala source to C:\Users\sheshank.kodam\workspace_SCALA_IntelliJ\hello_json_validator\target\scala-2.11\classes...
[info] Running hello_json
{
"id" : 123,
"body" : "body"
}
input3 result: Success({"id":123,"body":"body"})
[success] Total time: 2 s, completed Jan 7, 2016 11:46:08 AM
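For reference: in draft-04 JSON Schema, required is an array-valued keyword that belongs at the same level as properties, not inside it. Nested as in the snippet above, it is merely parsed as a property named "required" and never enforced. A corrected schema would be:

```json
{
  "properties": {
    "id":    { "type": "integer" },
    "title": { "type": "string", "minLength": 3 },
    "body":  { "type": "string" }
  },
  "required": ["title"]
}
```

With this schema, validating input3 (which lacks "title") fails as expected.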

Duplicate reports within `errors` property

When resolveRefAndValidate is run, the AnyValidator is run twice, which leads to duplicate reports within the error reports. A proposed fix would therefore:

  1. Run the AnyValidator only once
  2. Make sure that duplicate errors are removed when merging Results

Validator should be tolerant to custom values of 'format' property

I have discovered that the schema validator handles the following known values of format: ["uri", "hostname", "email", "ipv4", "ipv6", "date-time", "uuid"]. If a schema specifies something else, the validator raises an error about an unknown format.

According to the meta schema definition for JSON-schema, "format" property is a flexible string, which does not have any constraints. Users may define any possible value in the format field.

Moreover, the meta-schema for JSON Schema itself uses the "regex" value in the definition of the pattern field:

"pattern": {
  "type": "string",
  "format": "regex"
}

Desirable behavior: unknown formats are ignored or (better) a custom client callback is invoked to let the client handle custom formats.

Externalize constraints

When using the library in order to build up a JSON schema AST using constraints (e.g. to be emitted as JSON), one needs access to internal packages.

oneOf for properties

Hi!
I figured out that you can use "oneOf" and "anyOf" in this way:

{
    "type" : "object",
    "properties" : {
        "name" : { "type": "string"},
        "name2": { "type": "string"}
    },
    "oneOf" : [ 
        {
            "required": ["name"]
        },
        {
            "required": ["name2"]
        }
    ]
}

So this object fails:

{
  "name": "bla",
  "name2": "blabla"
}

But this one succeeds:

{
  "name": "bla"
}

or

{
  "name2": "blabla"
}

I've come to this from that post:
http://stackoverflow.com/questions/24023536/json-schema-how-do-i-require-one-field-or-another-or-one-of-two-others-but-no

I get an error when I validate an empty object ({}) in your demo web app:

[
    {
        "schemaPath": "#",
        "errors": {
            "matched": [
                "/oneOf/0",
                "/oneOf/1"
            ]
        },
        "msgs": [
            "Instance matches more than one schema"
        ],
        "value": {},
        "instancePath": ""
    }
]

Does your library support this feature?
Thanks!

Validating instances by passing the schema url to the validator is slower than passing it the preloaded schema

Validating instances with validator.validate(schemaUrl, instance) is measurably slower than validator.validate(schema, instance), where schema is the preloaded schema.

This can be measured with the following setup (pseudo code).

/* resources/test.schema.json */
{ "type": "object",
  "properties": {
    "mything": {"$ref": "#thing"}
  },
  "definitions": {
    "thing": {
      "id": "#thing",
      "type": "string"
    }
  }
}
def timed(name: String)(body: => Unit): Unit = {
  val start = System.currentTimeMillis()
  body
  println(name + ": " + (System.currentTimeMillis() - start) + " ms")
}

val validator = SchemaValidator()
val schemaUrl = getClass.getResource("test.schema.json")
val schema = JsonSource.schemaFromUrl(schemaUrl).get

val instance = Json.parse(
      """{
        |"mything": "the thing"
        |}""".stripMargin)

timed("preloaded") {
  for (_ <- 1 to 1000) validator.validate(schema, instance)
}
timed("url based") {
  for (_ <- 1 to 1000) validator.validate(schemaUrl, instance)
}

An example result would be

preloaded: 260 ms
url based: 855 ms

multiple environments

Hi,

I guess it's more a question/request for ideas than an issue.

When using "id" to specify the full path (file or URL) of the main schema, so that relative paths can be used for sub-schemas, how is it possible to work across different environments that change the base path (e.g. the URL)? For example, locally it is http://127.0.0.1:9000/schemas/json/file.json, while in the test environment it is "http://mytest.com:8000/schemas/json/file.json".

Apart from manually changing the "id" value, are there any other ideas?

It would be nice to have the option to define ENV variables inside the schema file; this could help a lot with such dynamic changes.

Thanks,

How do i use pattern?

First of all, thanks for providing the validator ... very helpful, especially for beginners in Scala like me.

Question: How do I use patterns? I have the input and schema shown below, which doesn't work.

{
    "id": "398402",
    "start": "2016-01-04T18:31:00Z"
}

 val schema = Json.fromJson[SchemaType](Json.parse(
    """{
      |"properties": {
      |  "id":    { "type": "string" },
      |  "spoolStart":  { "type": "string",
      |                         "pattern": "^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z"
      |                       }
      |}
    }""".stripMargin)).get
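A likely cause (an assumption, since the error output isn't shown): in JSON, a backslash must itself be escaped, so "\d" is an invalid escape sequence and Json.parse rejects the schema string. Doubling the backslashes fixes that; note also that the schema declares spoolStart while the input uses start, so the pattern would never be applied to the start field. A corrected schema fragment:

```json
{
  "properties": {
    "id":    { "type": "string" },
    "start": { "type": "string",
               "pattern": "^\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}Z" }
  }
}
```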

Provide support for Play 2.5

Hi,
We would really like to use this validator in our project.
Unfortunately the support ends with Play 2.4.
Would you be as kind as to fix this, please?

Best

Don't care about the key, but what type its value should be

Hey there,

not sure where to look for the answer so I'm posting it here.
I need to validate a JSON object that can contain any key name, but its value should be of type "string".
So basically I don't care about the keys or how many keys I can have, but I must specify that the corresponding value should be a string.

Is there any way to specify that?

Thanks!
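Standard JSON Schema handles this case with additionalProperties: with no properties declared, every value in the object must validate against the given sub-schema. A minimal sketch:

```json
{
  "type": "object",
  "additionalProperties": { "type": "string" }
}
```

Any object whose values are all strings validates; an object containing, say, a number as a value fails.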

uuid format

Hi,

I am using "format": "uuid" for a string field.
and that is my example value:
6abd01c0-9236-4ba7-8cbd-3e866ad9385a

I have checked here:
http://www.uuid-validator.com/
and my UUID is indeed a valid one (version 4)

however, when I am trying to validate the value, I am getting "Unknown format uuid"

Thanks,

Question: Ref resolution outside of validation?

By introducing this library into our code, we have been able to leverage additional JSON schema features, such as $ref resolution, for JSON validation.

Beyond validation, we are parsing valid JSON payloads into our own domain types using JSON schemas.

For illustration, we have something like this:

trait Value
case class BooleanValue private(value: Boolean) extends Value
case class StringValue private(value: String) extends Value
case class IntegerValue private(value: BigInt) extends Value
case class NumberValue private(value: BigDecimal) extends Value
case class DateTimeValue private(value: OffsetDateTime) extends Value

object Value {
  def fromBoolean(value: Boolean): Value = BooleanValue(value)
  def fromInteger(value: BigInt): Value = IntegerValue(value)
  def fromNumber(value: BigDecimal): Value = NumberValue(value)
  def fromString(value: String, format: Option[String]): Value = format match {
    case Some(f) if f == "date-time" =>
      DateTimeValue(OffsetDateTime.parse(value, DateTimeFormatter.ISO_OFFSET_DATE_TIME))
    case _ =>
      StringValue(value)
  }
}

case class Element(path: JsPath, value: Value)

def validatedElements(
  schema: SchemaType,
  json: JsValue
): Either[Seq[(JsPath, Seq[ValidationError])], List[Element]] = ???

validatedElements here recursively walks the schema and flattens the JSON payload into a list of JsPath-Value pairs using the corresponding SchemaType for each leaf of the JSON payload.

Given a schema such as

{
  "type": "object",
  "properties": {
    "organization": {
      "allOf": [
        { "$ref": "#/definitions/organization" }
      ]
    }
  },
  "definitions": {
    "organization": {
      "type": "object",
      "properties": {
        "createdAt": { "type": "string", "format": "date-time" }
      },
      "required": ["createdAt"],
      "additionalProperties": false
    }
  }
}

we would like to parse a valid payload into

List(
  Element(JsPath("/organization/createdAt"), aDateTimeValue)
)

We are able to validate the incoming JSON payload with a SchemaValidator(). However, we have not been able to use the same schema to parse the JSON payload, correctly handling ref resolution. As far as I can tell, all ref resolution / ref caching is package-private.

My question is:

(a) is there a way to achieve this already?
(b) if not, would it be within the scope of this library to open up ref resolution for traversing JSON schemas outside of validation? If so, I'd be happy to crack at it.

Return full list of validation errors

Hi!

It's not a bug, but I would like to have the full stack of errors when validating JSON against a schema. Play's JSON Reads work like that: they try to read every property in an object and then result in a success object or an array of failed properties. Currently your library returns the first failed property, doesn't it?

int32 | int64 validator for integer property type

Hi,

I have tested the int32 | int64 format validator. It seems it works for the number property type, whereas I thought the Swagger spec allows a format spec only for the integer property type, where it is missing.

Here is what I do:

  • I persist objects in a database. A JSON schema describes an instance.
  • I inspect the schema and identify object properties and their types.
  • For the number type I use double. It works fine and handles overflows, as double just drops precision when there are not enough bits to store all digits.
  • For the integer type I use BigDecimal if no format is specified, Long if the format is int64, and Int if the format is int32.

So, effectively I did not have an issue with the number property type; I had an integer overflow. If you could let me know how I could fix it myself, I could contribute.

Cannot resolve symbol toJson

Using the same code as shown in your readme, but errors.toJson doesn't work:

val result: VA[JsValue] = SchemaValidator.validate(schema, reqJson)
result.fold(
  invalid = { errors => BadRequest(errors.toJson) },
  valid = { post => Ok(reqJson) }
)

Add addFormat support for non string based properties

addFormat is a great tool for validating string-based properties; I would like to have the same for JsNumber-based properties.

It may look weird that we may want to validate integers and numbers, but I would like to add an extension to the validator which could check that a number is convertible to a 32-bit or 64-bit integer. I know there are min/max validators for numbers, but I would like to follow the approach Swagger takes in declaring the underlying system type: use format for the number type, where you can specify int32 or int64, and leave min/max constraints for end-user purposes. Currently, I cannot trap an overflow of the Long type at the JSON validation stage. JsNumber from Play JSON captures numbers as BigDecimal, and it may overflow when I access it as a Long.

It would be great to have addFormat for JsNumber and, if possible, built-in support for format=int32|int64 for integer-based properties.

Relative references in a schema without id cause infinite recursion.

Reproducible with a schema like this

{
  "type": "object",
  "properties": {
    "mything": {"$ref": "#thing"} 
  },
  "definitions": {
    "thing": {
      "id": "#thing",
      "type": "string"
    }
  }
}

When validating an instance with code like

val validator = SchemaValidator()
validator.validate(schemaUrl, instance)

the validation is stuck in an infinite recursion and runs out of stack.
If an id is given for the schema, the validation finishes correctly.
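As the last sentence notes, giving the root schema an explicit id works around the recursion; a minimal variant of the schema above with an id added (the URL is a placeholder):

```json
{
  "id": "http://example.com/test.schema.json",
  "type": "object",
  "properties": {
    "mything": {"$ref": "#thing"}
  },
  "definitions": {
    "thing": {
      "id": "#thing",
      "type": "string"
    }
  }
}
```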

ref to local file

How do I refer to a schema in another local file?

I have tried
"user": {"$ref": "file:///full/path/schema.json#/definitions/user"}
"user": {"$ref": "file:full/path/schema.json#/definitions/user"}
"user": {"$ref": "/full/path/schema.json#/definitions/user"}

in all of these cases I am getting "Could not resolve ref..."

Is it possible to use a relative path?

Thanks

issue when building using sbt

Hello master ,

I found this code very interesting, but hit this issue trying to run sbt console.

home.bintray.credentials

I created an empty file; however, it is expecting .credentials with additional information in a particular format. Could you please provide an example or a valid .credentials file?

Thanks

C:\MyProject\play-json-schema-validator-master>sbt console
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading project definition from C:\MyProject\play-json-schema-validator-master\project
bintray credentials missing credential properties realm, host, user, password in C:\Users\john.bintray.credentials is malformed
java.lang.RuntimeException: missing credential properties realm, host, user, password in C:\Users\john.bintray.credentials
at scala.sys.package$.error(package.scala:27)
at bintray.Bintray$$anonfun$ensuredCredentials$1.apply(Bintray.scala:60)
at bintray.Bintray$$anonfun$ensuredCredentials$1.apply(Bintray.scala:60)
at scala.util.Either.fold(Either.scala:97)
at bintray.Bintray$.ensuredCredentials(Bintray.scala:60)
at bintray.Bintray$.withRepo(Bintray.scala:49)
at bintray.BintrayPlugin$$anonfun$publishToBintray$1.apply(BintrayPlugin.scala:180)
at bintray.BintrayPlugin$$anonfun$publishToBintray$1.apply(BintrayPlugin.scala:175)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.EvaluateSettings$MixedNode.evaluate0(INode.scala:175)
at sbt.EvaluateSettings$INode.evaluate(INode.scala:135)
at sbt.EvaluateSettings$$anonfun$sbt$EvaluateSettings$$submitEvaluate$1.apply$mcV$sp(INode.scala:69)
at sbt.EvaluateSettings.sbt$EvaluateSettings$$run0(INode.scala:78)
at sbt.EvaluateSettings$$anon$3.run(INode.scala:74)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] missing credential properties realm, host, user, password in C:\Users\john.bintray.credentials
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q

Invalid schema type should not throw MatchError exception

If schema type is invalid, e.g.

"someProp": {"type": "sting"}

the schema parsing should not throw a MatchError exception. Instead, parsing should return an error result. The code where the exception originates is internal/serialization/JSONSchemaReads.scala:15.

Problem resolving relative references from jar resources

I know that there have been a few answered posts regarding relative references, but my issue is a bit different in its requirements.

Say I've got the following schema files:
resources/schemas/my_schema.schema:

{
  "properties": {
    "foo": {
      "$ref": "foo.schema#"
    }
  }
}

resources/schemas/foo.schema:

{
  "properties": {
    "bar": {
      "type": "string"
    }
  }
}

I load the top-level schema file by using this in my code:
getClass.getResource("/schemas/my_schema.schema") to retrieve a URL.

  • In my tests, that URL uses the file: protocol and becomes file:/absolute/path/to/schemas/my_schema.schema, and relative references work fine. Note that this is not jar-relative - it's just a file system path (presumably because it's running via sbt)
  • In my production environment, the jar is deployed and run via java -jar /absolute/path/to/my_jar.jar. Then the protocol for the top-level schema becomes jar:file: and the URL becomes jar:file:/absolute/path/to/my_jar.jar!/schemas/my_schema.schema (note the !) and relative references do not work ("Could not resolve ref foo.schema#")

Using version 0.8.1

I've seen issue #48 regarding relative resources inside jars, however, I don't want to have to use the ClasspathUrlResolver and prefix my references with classpath: (as in "$ref": classpath:foo.schema) (By the way, please correct me if my interpretation of #48 is mistaken).

Is there some way that I can create a resolver that doesn't require a protocol, but does the same as ClasspathUrlResolver regarding relative references? For instance, could I create a resolver with a protocol of ""?

Thanks for any help you can provide.

Anchor-based IDs are not supported correctly

{
     "id": "http://my.site/myschema#",
     "definitions": {
         "schema1": {
             "id": "schema1",
             "type": "integer"
         },
         "schema2": {
             "type": "array",
             "items": { "$ref": "schema1" }
         }
     }
}

In the above schema resolving http://my.site/schema1# with 0.8.3 will fail.

Seemingly valid schema and input results in an error I do not understand

Hi, love this library by the way. This schema is my first attempt with an array of objects.

{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "properties": {
    "name": {
      "type": "string"
    },
    "comics": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "comic_id": {
            "type": "string",
            "pattern": "^[0-9a-f]{24}$"
          },
          "cron": {
            "type": "string"
          },
          "mark": {
            "type": "integer",
            "minimum": 1
          },
          "step": {
            "type": "integer",
            "minimum": 1,
            "maximum": 50
          }
        },
        "required": [
          "comic_id",
          "cron",
          "mark",
          "step"
        ],
        "additionalProperties": false
      }
    }
  },
  "required": [
    "name",
    "comics"
  ],
  "additionalProperties": false
}

My JSON test case works here: http://www.jsonschemavalidator.net/

{
  "name": "glorious horse",
  "comics": [
    {
      "comic_id": "57341408de7af93a83733280",
      "mark": 1,
      "cron": "0 0 8 1/1 * ? *",
      "step": 10
    }
  ]
}

But when I try to validate it I get this error response:

[ {
  "schemaPath" : "n/a",
  "errors" : { },
  "msgs" : [ "error.path.missing" ],
  "value" : {
    "name" : "glorious horse",
    "comics" : [ {
      "comic_id" : "57341408de7af93a83733280",
      "cron" : "0 0 8 1/1 * ? *",
      "mark" : 1,
      "step" : 10
    } ]
  },
  "instancePath" : "/id"
} ]

It is late at night, so maybe I am tired, but I don't understand what is wrong.

Double.Min|Max values do not trigger an error for integer type property

Hi,

I have applied your fix for int32|int64 formats. Works great now.

Now about doubles :) I have spotted in tests that the following instance (magic number is Double.Max):

{
    "prop" : 1.7976931348623157E+308
}

is validated successfully against the schema:

  {
    "type" : "object",
    "properties" : {
      "prop" : {
        "type" : "integer"
      }
    }
  }

The expected result is a validation failure. A similar issue exists for Double.Min.

Instance path is wrong

During a recent refactoring an error has been introduced which leads to wrong instance paths when resolving refs.

migration to play 2.5

Hi,

we migrated to play2.5 from 2.4
and now I am getting:
Error injecting constructor, java.lang.NoClassDefFoundError: Could not initialize class com.eclipsesource.schema.package$

and also that one:
caused by: java.lang.NoSuchMethodError: play.api.libs.functional.syntax.package$.functionalCanBuildApplicative(Lplay/api/libs/functional/Applicative;)Lplay/api/libs/functional/FunctionalCanBuild;
at com.eclipsesource.schema.internal.serialization.JSONSchemaReads$class.tupleReader(JSONSchemaReads.scala:185)
at com.eclipsesource.schema.package$.tupleReader$lzycompute(package.scala:12)
at com.eclipsesource.schema.package$.tupleReader(package.scala:12)
at com.eclipsesource.schema.internal.serialization.JSONSchemaReads$class.$init$(JSONSchemaReads.scala:23)
at com.eclipsesource.schema.package$.(package.scala:13)
at com.eclipsesource.schema.package$.(package.scala)

is there any issues with the play-json-schema-validator and play2.5 ?
(I see that you are using addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.3"), while the migration to Play 2.5 requires
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.x"))

Thanks,

pattern for regexp is not included anymore?

Hi,

I have upgraded from 0.8.4 to 0.8.7. Thank you for the improvements included.

I observed change in the messages:
0.8.4:
"asdfasdf* does not match pattern ^[A-Za-z0-9]+$"
0.8.7:
"Invalid pattern 'asdfasdf*'."

0.8.4 - It is clear why the user's value does not match the pattern, but only if the user is a geek.
0.8.7 - It is not clear why the user's value is treated as a pattern and why it is invalid.

Suggestion: allow a human-readable error message alongside the pattern specification in the JSON schema. Something like:

"myprop": {
  "type": "string",
  "pattern": "^[A-Za-z0-9]+$",
  "patternMessage": "myprop can contain only letters and numbers"
}

In this case, the validator could place error message, like the following:
"Invalid string value 'asdfasdf*'. myprop can contain only letters and numbers"

If something like this is possible, it would be great. Otherwise, I would prefer the 0.8.4 message style over 0.8.7.

Thanks

Referencing files in resources/jar

Hi,

I'm having trouble referencing ($ref) other schema files that sit together inside a jar/resources.

I've tried using:

  • "$ref": "resource-name.json" -> File not found exception
  • "$ref": "file:resource-name.json" -> File not found exception
  • "$ref": "file:./resource-name.json" -> File not found exception
  • "$ref": "resource:resource-name.json" -> Malformed URL
  • "$ref": "resource:/resource-name.json" -> Malformed URL
  • "$ref": "classpath:resource-name.json" -> Malformed URL
  • "$ref": "classpath:/resource-name.json" -> Malformed URL

I'm running out of ideas. Is there something obvious I'm forgetting or is this something not supported by your library?

Thank you!
