tethys-json / tethys Goto Github PK
AST free JSON library for Scala
Home Page: https://tethys-json.github.io/tethys/
License: Apache License 2.0
The selectReader function became total in #29, but one place was accidentally skipped:
def selectReader[Res](fun: PartialFunction[A1, JsonReader[_ <: Res]])(implicit Res: ClassTag[Res], A1: ClassTag[A1]): JsonReader[Res]
We switched from spray-json to tethys, and one of our use cases is accepting external JSON input with extensible content, wrapping it in an envelope object, and passing it through.
Now we have an object with a string field containing raw JSON, but writing it as a string results in it being escaped. We solved the issue with a custom data type and a custom writer, but it would be much better to have support for writing raw JSON in the DSL.
Reading a JSON subtree as a raw JSON string would seem like a complementary feature, if it's not supported already.
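For illustration, a minimal self-contained sketch of the difference between writing a payload as an escaped JSON string and passing it through raw. The names RawJson, writeString, and writeRaw are hypothetical and not the tethys API; the escaping here is simplified to quotes and backslashes:

```scala
// Hypothetical wrapper marking a string as already-serialized JSON.
final case class RawJson(value: String)

// Regular string writing: quotes and escapes the content
// (simplified: only quote and backslash are escaped here).
def writeString(sb: StringBuilder, s: String): Unit = {
  sb.append('"')
  s.foreach {
    case '"'  => sb.append("\\\"")
    case '\\' => sb.append("\\\\")
    case c    => sb.append(c)
  }
  sb.append('"')
}

// Raw writing: the desired behavior, appending the JSON as-is.
def writeRaw(sb: StringBuilder, raw: RawJson): Unit = {
  sb.append(raw.value)
}

val escaped = new StringBuilder
writeString(escaped, """{"a":1}""")

val raw = new StringBuilder
writeRaw(raw, RawJson("""{"a":1}"""))
```

Here `escaped` ends up as the string literal `"{\"a\":1}"`, while `raw` contains the original `{"a":1}` unchanged.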
Package objects will be deprecated in Scala 2.14 and removed in Dotty, so I think we need to get rid of them before a stable release is out.
Some weird bug appears after #36: compilation fails with java.lang.IndexOutOfBoundsException: 0 on the first clean compilation.
[error] java.lang.IndexOutOfBoundsException: 0
[error] at scala.collection.LinearSeqOptimized$class.apply(LinearSeqOptimized.scala:65)
[error] at scala.collection.immutable.List.apply(List.scala:84)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateOrRelink$1(Importers.scala:170)
[error] at scala.reflect.internal.Importers$StandardImporter.importSymbol(Importers.scala:210)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateType(Importers.scala:224)
[error] at scala.reflect.internal.Importers$StandardImporter.importType(Importers.scala:284)
[error] at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$1.apply(Importers.scala:224)
[error] at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$1.apply(Importers.scala:224)
[error] at scala.collection.immutable.List.map(List.scala:284)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateType(Importers.scala:224)
[error] at scala.reflect.internal.Importers$StandardImporter.importType(Importers.scala:284)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateSymbol(Importers.scala:128)
[error] at scala.reflect.internal.Importers$StandardImporter.scala$reflect$internal$Importers$StandardImporter$$cachedRecreateSymbol$1(Importers.scala:145)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateOrRelink$1(Importers.scala:193)
[error] at scala.reflect.internal.Importers$StandardImporter.importSymbol(Importers.scala:210)
[error] at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$4.apply(Importers.scala:248)
[error] at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$4.apply(Importers.scala:248)
[error] at scala.reflect.internal.Scopes$Scope.foreach(Scopes.scala:373)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateType(Importers.scala:248)
[error] at scala.reflect.internal.Importers$StandardImporter.importType(Importers.scala:284)
[error] at scala.reflect.internal.Importers$StandardImporter$$anon$1.complete(Importers.scala:75)
[error] at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1535)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:174)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
[error] at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
[error] at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:174)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
[error] at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.info(SynchronizedSymbols.scala:174)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateOrRelink$1(Importers.scala:167)
[error] at scala.reflect.internal.Importers$StandardImporter.importSymbol(Importers.scala:210)
[error] at scala.reflect.internal.Importers$StandardImporter.recreatedTreeCompleter(Importers.scala:298)
[error] at scala.reflect.internal.Importers$StandardImporter$$anonfun$importTree$1.apply$mcV$sp(Importers.scala:417)
[error] at scala.reflect.internal.Importers$StandardImporter.tryFixup(Importers.scala:49)
[error] at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:418)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateTree(Importers.scala:383)
[error] at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:415)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateTree(Importers.scala:383)
[error] at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:415)
[error] at scala.reflect.internal.Importers$StandardImporter.recreateTree(Importers.scala:383)
[error] at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:415)
[error] at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:29)
[error] at scala.reflect.macros.contexts.Evals$class.eval(Evals.scala:19)
[error] at scala.reflect.macros.contexts.Context.eval(Context.scala:6)
[error] at tethys.derivation.impl.derivation.ReaderDerivation$class.deriveReader(ReaderDerivation.scala:73)
[error] at tethys.derivation.impl.derivation.SemiautoDerivationMacro.deriveReader(SemiautoDerivationMacro.scala:9)
[error] at tethys.derivation.impl.derivation.SemiautoDerivationMacro.describedJsonReader(SemiautoDerivationMacro.scala:68)
[error] implicit val reader: JsonReader[BaseClass] = jsonReader[BaseClass] {
We need to provide the capability to specify conditions for field parsing in the jsonReader macro.
Make a better error message for this case:
Input: {}
Class: case class Request(text: String)
Error text:
Tethys: Illegal json at '[ROOT]': Can not extract fields from json: 'text'
Jackson: text: field is required
I want to accept a patch request model with optional fields and return an error if there are unknown fields, which are normally ignored by the parser.
For example, I have a user and user patch models:
case class User(id: UUID, name: String, phone: String, email: String)
case class UserPatch(name: Option[String], phone: Option[String], email: Option[String])
and I want a patch request with a body like this to fail:
{
"name": "Ivan",
"age": 42
}
preferably with an error indicating which field is unwanted.
The current implementation of the map function does not use the defaultValue of the original JsonReader, which leads to situations like this:
JsonReader[Option[Int]].defaultValue == Some(None)
JsonReader[Option[Int]].map(_.toString).defaultValue == None
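A self-contained sketch of the bug with a toy Reader (not the real tethys JsonReader): the buggy map builds the new reader with a fresh default instead of mapping the original one through the function.

```scala
// Toy stand-in for JsonReader, reduced to the defaultValue concern.
trait Reader[A] { self =>
  def defaultValue: Option[A]

  // Buggy: drops the original default entirely.
  def map[B](f: A => B): Reader[B] = new Reader[B] {
    def defaultValue: Option[B] = None
  }

  // Fixed: maps the original default through f.
  def mapFixed[B](f: A => B): Reader[B] = new Reader[B] {
    def defaultValue: Option[B] = self.defaultValue.map(f)
  }
}

// Mirrors JsonReader[Option[Int]]: default is Some(None).
val optIntReader: Reader[Option[Int]] = new Reader[Option[Int]] {
  def defaultValue: Option[Option[Int]] = Some(None)
}
```

With this sketch, `optIntReader.map(_.toString).defaultValue` is `None` (the reported behavior), while `optIntReader.mapFixed(_.toString).defaultValue` preserves the default as `Some("None")`.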
In some cases we get a diverging implicit expansion error. The Lazy approach solves this problem.
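The Lazy approach referred to here is the shapeless-style wrapper; a minimal self-contained sketch (not tethys' actual implementation): hiding the instance behind a by-name parameter defers its construction, which is what keeps the compiler from eagerly expanding a recursive derivation.

```scala
// Minimal Lazy cell: the wrapped value is only computed on first access.
class Lazy[A](a: => A) {
  lazy val value: A = a
}

object Lazy {
  def apply[A](a: => A): Lazy[A] = new Lazy(a)
}

// Construction is cheap even if building A would recurse;
// the computation runs once, on first .value access.
val l = Lazy[Int]({ 40 + 2 })
```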
We use -Xfatal-warnings + -Xlint:unsound-match, which is good for preventing runtime MatchErrors on incomplete matches over sealed subtypes. However, selectReader accepts a partial function.
I think that a partial argument, even if implemented to actually be useful for something, shouldn't be the default.
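A self-contained illustration of why the partial argument interacts badly with exhaustiveness checking (toy sealed hierarchy, not tethys code):

```scala
sealed trait Animal
case object Dog extends Animal
case object Cat extends Animal

// With a total function the compiler checks exhaustiveness,
// and -Xfatal-warnings turns a missing case into a compile error.
def pickReader(a: Animal): String = a match {
  case Dog => "dogReader"
  case Cat => "catReader"
}

// With a PartialFunction literal the missing Cat case compiles
// silently and only surfaces at runtime (MatchError, or
// isDefinedAt returning false).
val partialPick: PartialFunction[Animal, String] = {
  case Dog => "dogReader"
}
```

`pickReader(Cat)` returns "catReader", while `partialPick.isDefinedAt(Cat)` is false even though the code compiled without warnings.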
case class Wrapper(i: Int)
implicit val wrapperWriter: JsonWriter[Wrapper] = new JsonObjectWriter[Wrapper] {
override def writeValues(value: Wrapper, tokenWriter: TokenWriter): Unit = {
tokenWriter.writeFieldName(classOf[Wrapper].getName)
// tokenWriter.writeNumber(value.i)
}
}
Wrapper(42).asJson results in {"Wrapper"}. An error is expected, because this is not a valid JSON object.
When I use Akka routing with typed query parameters and the tethys.derivation.semiauto._ import in scope, I get a conflict with ReaderFieldStringOps.as[A]:
path("foo") {
parameter('bar.as[String]) {bar => ...}
}
Sub-quadratic decrease of throughput as the length of the JSON number to parse increases.
On contemporary CPUs, parsing JSON numbers that are bound to doubles or floats and have 1,000,000 decimal digits (~1MB) can take more than 14 seconds.
Currently we just build the JsonReader/JsonWriter and let the compiler show compilation errors, and if several implicits are missing, the error mentions only one value.
Currently the writeJson function calls TokenWriter.close(), but we only need to flush the output.
This syntax looks verbose and redundant:
jsonReader[Foo] {
describe {
ReaderBuilder[Foo]
}
}
We could just write:
jsonReader[Foo] {
ReaderBuilder[Foo]
}
Add a module with JsonReader/JsonWriter support for the json4s AST.
Reason:
Many public APIs support JSON pretty printing for easier debugging:
GET /api/v1/pets?pretty
Adding fields with the same name works incorrectly: it compiles but doesn't work.
Example of usage:
jsonReader[User] {
describe {
ReaderBuilder[User]
.extract(_.id).from("id".as[String]).apply(identity)
....
}
}
Char should probably be mapped to a JSON string with a single code point?
Currently, if we want to rename a field, we have to remove it and then add it back with the required name, but that moves the new field to the end of the JSON object. So we need a field-renaming capability.
final case class Foo(i: Int)
implicit val fooReader: JsonReader[Foo] = jsonReader
"any".jsonAs[Foo]
"1!".jsonAs[Foo]
"""!{"i":1}""".jsonAs[Foo]
All of these cases throw an exception, but Left(e: ReaderError) is expected.
It happens because TokenIteratorProducer fails during TokenIterator creation.
Tethys does not derive a writer for a sealed hierarchy, even when writers for the subtypes are explicitly defined:
import tethys._
import tethys.jackson._
import tethys.derivation.semiauto._
sealed trait Foo
object Foo {
object Bar extends Foo
class Baz extends Foo
}
implicit val BarWriter: JsonObjectWriter[Foo.Bar.type] =
JsonWriter.obj[Foo.Bar.type]
implicit val BazWriter: JsonObjectWriter[Foo.Baz] =
JsonWriter.obj[Foo.Baz]
implicit val FooWriter: JsonWriter[Foo] = jsonWriter
The JsonReader abstraction causes boxing and unboxing of primitives on read, so we need something like IntJsonReader, LongJsonReader, and so on.
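A sketch of what such readers could look like (the names here are hypothetical, not the tethys API), using the @specialized annotation so the compiler generates primitive variants of the read method and the Int path avoids allocating java.lang.Integer:

```scala
// Hypothetical token source, reduced to what the sketch needs.
trait Tokens {
  def nextInt(): Int
}

// Generic reader: A erases to Object, so reading an Int boxes.
trait GenericReader[A] {
  def read(t: Tokens): A
}

// Specialized variant: the compiler emits dedicated primitive
// read methods for Int, Long and Double, avoiding boxing.
trait SpecializedReader[@specialized(Int, Long, Double) A] {
  def read(t: Tokens): A
}

val intReader: SpecializedReader[Int] = new SpecializedReader[Int] {
  def read(t: Tokens): Int = t.nextInt()
}

val tokens = new Tokens {
  def nextInt(): Int = 42
}
```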
Demo class:
object Demo extends App {
case class Data(key: String, field: String)
def createWriter(firstFieldName: String, secondFieldName: String): JsonWriter[Data] = jsonWriter[Data] {
describe {
WriterBuilder[Data]
.rename(_.field)(firstFieldName)
.add(secondFieldName)(_.field)
}
}
def serialize(data: Data): Unit = {
implicit val writer: JsonWriter[Data] = createWriter("superField", "mehField")
println(data.asJson)
}
serialize(Data("1", "fffff"))
}
Expected output: {"key":"1","superField":"fffff","mehField":"fffff"}
Result:
Error:(12, 14) unknown tree: tethys.derivation.builder.WriterBuilder.apply[Demo.Data].rename[String](((x$1: Demo.Data) => x$1.field))(firstFieldName)
describe {
Error:(12, 14) unknown tree: tethys.derivation.builder.WriterBuilder.apply[Demo.Data].add(secondFieldName).apply[String](((x$1: Demo.Data) => x$1.field))
describe {
We need to replace tethys-macro-derivation with tethys-derivation and tethys-jackson-backend with tethys-jackson
JSON-path-like DSL support:
JsonPath.foo.bar.baz.as[Int]
PR: #47
Hi, thank you for a cool project!
I've encountered an issue with resolving implicits; here is a somewhat minimal case reproducing the error in the header:
trait Res[+A] extends Product {
def code: Res.Code
}
object Res {
sealed trait Code
final case object Succ extends Code
final case object Fail extends Code
final case class Ok[A](data: A, code: Code = Succ) extends Res[A]
final case class Error[A](message: String, code: Code = Fail) extends Res[A]
}
final case class Item(name: String)
package object implicits {
import tethys._
import tethys.jackson._
import tethys.derivation.semiauto._
implicit val itemReader: JsonReader[Item] = jsonReader[Item]
implicit val itemWriter: JsonObjectWriter[Item] = jsonWriter[Item]
implicit def resReader[A: JsonReader]: JsonReader[Res[A]] =
JsonReader.builder
.addField[String]("code")
.selectReader[Res[A]]{
case "SUCCESS" => JsonReader.builder.addField[A]("data").buildReader(Res.Ok[A](_))
case "FAILURE" => JsonReader.builder.addField[String]("message").buildReader(Res.Error[A](_))
}
implicit def okWriter[A: JsonObjectWriter]: JsonObjectWriter[Res.Ok[A]] =
JsonWriter.obj[Res.Ok[A]]
.addField("code")(_ => "SUCCESS")
.addField("data")(_.asJson)
implicit val errorWriter: JsonObjectWriter[Res.Error[_]] =
JsonWriter.obj[Res.Error[_]]
.addField("code")(_ => "FAILURE")
.addField("message")(_.message)
}
...
>>> Ok(List(Item("a"), Item("b"))).asJson
...
> diverging implicit expansion for type tethys.JsonWriter[List[io.expload.api.Item]]
Writer instead of Reader.
Maybe it's time to start running tests in CI on pull request submission?
Consider adding a module for deriving readers and writers for Enumeratum types (EnumEntry and ValueEnumEntry subtypes).
I have a case where I would like to check expected versus actual JSON:
Get("some-route") ~> routes ~> check {
responseAs[String] shouldBe """{"a": 1, "b": 2}"""
}
In this example, even in the successful case, my test will fail because of JSON formatting. Is there any method to collect all tokens from the stream into a Seq[Token] in order to compare JSON strings?
Hello!
I've tried to use the camelCase <-> snake_case wrapper FieldStyle.lowerSnakecase:
case class QuestionCount(totalCount: Int, activeCount: Int)
implicit val questionCountResponseFormat: JsonReader[Response.QuestionCount] = jsonReader[Response.QuestionCount](
ReaderDerivationConfig.withFieldStyle(FieldStyle.lowerSnakecase).strict
)
I get the error:
Error:(77, 116) not found: value ReaderError implicit val questionCountResponseFormat: JsonReader[Response.QuestionCount] = jsonReader[Response.QuestionCount](
and not only in this case, but also in similar places in the code.
The error goes away when I import tethys.readers.ReaderError, but it's a little strange, because the ReaderError class is not any kind of implicit.
We need a special matcher for JSON strings that doesn't care about object field order. Something like this:
"""{"a": 1, "b": [1, 2, 3], "c": true }""" should equalJson(obj(
"a" -> 1,
"c" -> true,
"b" -> arr(1, 2, 3)
))
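One way to get order-insensitive comparison is to compare a parsed representation in which objects are Maps (whose equality ignores order) while arrays stay Lists (order-sensitive). A minimal self-contained sketch; the tiny AST and the obj/arr helpers are hypothetical, not tethys':

```scala
// Tiny hypothetical JSON AST, just enough for the comparison idea.
sealed trait Json
final case class JNum(value: Double) extends Json
final case class JBool(value: Boolean) extends Json
final case class JArr(values: List[Json]) extends Json
final case class JObj(fields: Map[String, Json]) extends Json

// Builders mirroring the obj(...)/arr(...) DSL from the matcher example.
def obj(fields: (String, Json)*): Json = JObj(fields.toMap)
def arr(values: Json*): Json = JArr(values.toList)

// Same fields in different order: Map equality makes these equal.
val left  = obj("a" -> JNum(1), "b" -> arr(JNum(1), JNum(2)), "c" -> JBool(true))
val right = obj("c" -> JBool(true), "a" -> JNum(1), "b" -> arr(JNum(1), JNum(2)))
```

Here `left == right` holds despite the different field order, while arrays still compare element by element, so `arr(JNum(1), JNum(2))` differs from `arr(JNum(2), JNum(1))`.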
Possibly it would be a useful feature to convert object keys when reading from and writing to JSON.
For example: a client sends me {"client_id": 123}
and I want to read it into case class Request(clientId: Int).
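The key transformation itself is a small pure function; a sketch of both directions (function names are illustrative, not part of tethys):

```scala
// Convert a snake_case JSON key to the camelCase field name,
// e.g. "client_id" -> "clientId".
def snakeToCamel(key: String): String = {
  val parts = key.split("_").filter(_.nonEmpty)
  parts.headOption.fold("") { head =>
    head + parts.tail.map(_.capitalize).mkString
  }
}

// And the reverse, for writing: insert "_" before each upper-case
// letter that follows a lower-case letter or digit, then lower-case.
def camelToSnake(key: String): String =
  key.replaceAll("([a-z0-9])([A-Z])", "$1_$2").toLowerCase
```

So `snakeToCamel("client_id")` yields "clientId" and `camelToSnake("clientId")` yields "client_id", matching the example above.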