
tethys's People

Contributors

alexxand, dependabot[bot], dos65, drywet, eld0727, ilya-sib, jennykosta, johnspade, kovalyovga, little-inferno, mrirre, nikitamelnikov, nikolaev-nikita, nikololiahim, optician, phill101, rednblack, roman-statsura, scala-steward, scf37, susliko, tethys-json, ulanzetz, vilunov


tethys's Issues

Total selectReader part 2

The selectReader function became total in #29, but one place was accidentally skipped:

def selectReader[Res](fun: PartialFunction[A1, JsonReader[_ <: Res]])(implicit Res: ClassTag[Res], A1: ClassTag[A1]): JsonReader[Res]

Reader / Writer for raw json values

We switched from spray-json to tethys, and one of our use cases is accepting external JSON input with extensible content, wrapping it in an envelope object, and passing it through.

Now we have an object with a string field containing raw JSON, but writing it as a string results in it being escaped. We solved the issue with a custom data type and a custom writer, but it would be much better to have support for writing raw JSON in the DSL.

Reading a JSON subtree as a raw JSON string would seem like a complementary feature, if it isn't supported already.

Get rid of package objects

Package objects will be deprecated in Scala 2.14 and removed in Dotty, so I think we need to get rid of them before the stable release is out.

Compilation fails on macro derivation

A weird bug appears after #36.
Compilation fails with java.lang.IndexOutOfBoundsException: 0 on the first clean compilation:

[error] java.lang.IndexOutOfBoundsException: 0
[error] 	at scala.collection.LinearSeqOptimized$class.apply(LinearSeqOptimized.scala:65)
[error] 	at scala.collection.immutable.List.apply(List.scala:84)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateOrRelink$1(Importers.scala:170)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importSymbol(Importers.scala:210)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateType(Importers.scala:224)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importType(Importers.scala:284)
[error] 	at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$1.apply(Importers.scala:224)
[error] 	at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$1.apply(Importers.scala:224)
[error] 	at scala.collection.immutable.List.map(List.scala:284)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateType(Importers.scala:224)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importType(Importers.scala:284)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateSymbol(Importers.scala:128)
[error] 	at scala.reflect.internal.Importers$StandardImporter.scala$reflect$internal$Importers$StandardImporter$$cachedRecreateSymbol$1(Importers.scala:145)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateOrRelink$1(Importers.scala:193)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importSymbol(Importers.scala:210)
[error] 	at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$4.apply(Importers.scala:248)
[error] 	at scala.reflect.internal.Importers$StandardImporter$$anonfun$recreateType$4.apply(Importers.scala:248)
[error] 	at scala.reflect.internal.Scopes$Scope.foreach(Scopes.scala:373)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateType(Importers.scala:248)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importType(Importers.scala:284)
[error] 	at scala.reflect.internal.Importers$StandardImporter$$anon$1.complete(Importers.scala:75)
[error] 	at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1535)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:174)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
[error] 	at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
[error] 	at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:174)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
[error] 	at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.info(SynchronizedSymbols.scala:174)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateOrRelink$1(Importers.scala:167)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importSymbol(Importers.scala:210)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreatedTreeCompleter(Importers.scala:298)
[error] 	at scala.reflect.internal.Importers$StandardImporter$$anonfun$importTree$1.apply$mcV$sp(Importers.scala:417)
[error] 	at scala.reflect.internal.Importers$StandardImporter.tryFixup(Importers.scala:49)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:418)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateTree(Importers.scala:383)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:415)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateTree(Importers.scala:383)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:415)
[error] 	at scala.reflect.internal.Importers$StandardImporter.recreateTree(Importers.scala:383)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:415)
[error] 	at scala.reflect.internal.Importers$StandardImporter.importTree(Importers.scala:29)
[error] 	at scala.reflect.macros.contexts.Evals$class.eval(Evals.scala:19)
[error] 	at scala.reflect.macros.contexts.Context.eval(Context.scala:6)
[error] 	at tethys.derivation.impl.derivation.ReaderDerivation$class.deriveReader(ReaderDerivation.scala:73)
[error] 	at tethys.derivation.impl.derivation.SemiautoDerivationMacro.deriveReader(SemiautoDerivationMacro.scala:9)
[error] 	at tethys.derivation.impl.derivation.SemiautoDerivationMacro.describedJsonReader(SemiautoDerivationMacro.scala:68)
[error]     implicit val reader: JsonReader[BaseClass] = jsonReader[BaseClass] {

Better json reader errors for missing required field

Produce a better error message for this case.

Input: {}
Class: case class Request(text: String)

Error text:

Tethys: Illegal json at '[ROOT]': Can not extract fields from json'text']
Jackson: text: field is required

Strict parsing mode for objects

I want to accept a patch-request model with optional fields and return an error if there are unknown fields, which are normally ignored by the parser.

For example, I have a user and user patch models:

case class User(id: UUID, name: String, phone: String, email: String)

case class UserPatch(name: Option[String], phone: Option[String], email: Option[String])

and I want a patch request with a body like this to fail:

{
  "name": "Ivan",
  "age": 42
}

preferably with an error indicating which field is unexpected.
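
A rough sketch of how this could look, reusing the ReaderDerivationConfig that appears in later issues (whether its companion exposes a strict toggle directly, and the import path, are assumptions here):

import tethys._
import tethys.jackson._
import tethys.derivation.semiauto._
import tethys.derivation.builder.ReaderDerivationConfig // package assumed

// UserPatch as defined above
implicit val userPatchReader: JsonReader[UserPatch] =
  jsonReader[UserPatch](ReaderDerivationConfig.strict)

// """{"name": "Ivan", "age": 42}""".jsonAs[UserPatch]
// should then return a Left(...) that names the unexpected "age" field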

JsonReader.map breaks defaultValue function

The current implementation of the map function does not use the defaultValue of the original JsonReader, which leads to situations like this:

JsonReader[Option[Int]].defaultValue == Some(None)
JsonReader[Option[Int]].map(_.toString).defaultValue == None
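
A minimal sketch of a fix, with the read signature and import paths as I understand them from these issues (not the library's actual map implementation): propagate the original reader's default through the mapping function instead of discarding it.

import tethys.JsonReader
import tethys.readers.FieldName
import tethys.readers.tokens.TokenIterator

def mapped[A, B](self: JsonReader[A])(f: A => B): JsonReader[B] = new JsonReader[B] {
  override def read(it: TokenIterator)(implicit fieldName: FieldName): B =
    f(self.read(it))

  // keep the original default, mapped through f, instead of dropping it
  override def defaultValue: Option[B] = self.defaultValue.map(f)
}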

Change selectReader to use total function

We use -Xfatal-warnings + -Xlint:unsound-match, which is good for preventing runtime MatchErrors on incomplete matches on sealed subtypes.

However, selectReader accepts a partial function.

  1. "partial" property is not used withing the code, PF is just applied to receive a reader
  2. total function heuristics for emitting warnings are not applied to selectReader argument

I think that partial argument, even if implemented to actually be useful for something, shouldn't be the default.
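
For illustration only, a total-function counterpart to the signature quoted in the first issue above (hypothetical, mirroring the existing declaration with PartialFunction replaced by a plain function):

def selectReader[Res](fun: A1 => JsonReader[_ <: Res])(implicit Res: ClassTag[Res], A1: ClassTag[A1]): JsonReader[Res]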

JsonObjectWriter without value.

case class Wrapper(i: Int)

implicit val wrapperWriter: JsonWriter[Wrapper] = new JsonObjectWriter[Wrapper] {
  override def writeValues(value: Wrapper, tokenWriter: TokenWriter): Unit = {
    tokenWriter.writeFieldName(classOf[Wrapper].getName)
//    tokenWriter.writeNumber(value.i)
  }
}

Wrapper(42).asJson

results with {"Wrapper"}. Error is expected because it's not a json object.

Semiauto import conflicts with akka http dsl

When I use akka-http routing with typed query parameters and the tethys.derivation.semiauto._ import in scope, I get a conflict with ReaderFieldStringOps.as[A]:

path("foo") {
  parameter('bar.as[String]) {bar => ...}
}

API rework

  1. Get rid of ClassTag in all API signatures
  2. Remove the defaultValue function from JsonReader and provide default values only for the Option type
  3. Allow some macro configuration via an additional argument
  4. Add a mapWithField function with signature FieldName => A => B to allow raising errors within the mapping function (sketched below)
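
A hypothetical shape for item 4, assuming FieldName lives in tethys.readers as in the library's error-reporting API; the mapping function receives the current field name so it can raise a ReaderError with a proper path:

import tethys.JsonReader
import tethys.readers.FieldName

trait JsonReaderMapOps[A] {
  // proposed addition alongside the existing map
  def mapWithField[B](f: FieldName => A => B): JsonReader[B]
}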

Adding fields with the same name

Adding fields with the same name works incorrectly.
It compiles but doesn't work as expected.

Example of usage:

jsonReader[User] {
  describe {
    ReaderBuilder[User]
      .extract(_.id).from("id".as[String]).apply(identity)
      ....
  }
}

Add renaming syntax to WriterBuilder

Currently, if we want to rename a field we have to remove it and then add it back under the required name, but that moves the new field to the end of the JSON object. So we need a field-renaming capability; a sketch of both forms follows.
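
A sketch of the current workaround next to the requested syntax. The remove/add combinators are the workaround described above; rename is the requested addition, and the class and field names are just for illustration:

import tethys._
import tethys.derivation.semiauto._
import tethys.derivation.builder.WriterBuilder

case class User(id: String, name: String)

// current workaround: remove + add moves the renamed field to the end of the object
implicit val userWriter: JsonWriter[User] = jsonWriter[User] {
  describe {
    WriterBuilder[User]
      .remove(_.name)
      .add("user_name")(_.name)
  }
}

// requested:
//   WriterBuilder[User].rename(_.name)("user_name")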

`jsonAs` on non json input will cause an exception

final case class Foo(i: Int)
implicit val fooReader: JsonReader[Foo] = jsonReader[Foo]

"any".jsonAs[Foo]
"1!".jsonAs[Foo]
"""!{"i":1}""".jsonAs[Foo]

All of these cases will cause an exception, but Left(e: ReaderError) is expected.
It happens because TokenIteratorProducer fails during TokenIterator creation.

<stuff> is not case class or a sealed trait

Tethys does not derive a writer for a sealed hierarchy even when writers for the subtypes are explicitly defined:

  import tethys._
  import tethys.jackson._
  import tethys.derivation.semiauto._


  sealed trait Foo

  object Foo {
    object Bar extends Foo
    class Baz extends Foo
  }

  implicit val BarWriter: JsonObjectWriter[Foo.Bar.type] =
    JsonWriter.obj[Foo.Bar.type]

  implicit val BazWriter: JsonObjectWriter[Foo.Baz] =
    JsonWriter.obj[Foo.Baz]

  implicit val FooWriter: JsonWriter[Foo] = jsonWriter[Foo]

Primitive readers

The JsonReader abstraction causes primitive boxing and unboxing on read, so we need something like IntJsonReader, LongJsonReader, and so on.
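
One possible shape, purely a sketch and not the library's design: a specialized subtype with a non-generic read method, so call sites that statically know the concrete reader can avoid going through the boxed generic interface (signatures and import paths as understood from the other issues here):

import tethys.JsonReader
import tethys.readers.FieldName
import tethys.readers.tokens.TokenIterator

trait IntJsonReader extends JsonReader[Int] {
  // unboxed entry point for callers that statically see IntJsonReader
  def readUnboxed(it: TokenIterator)(implicit fieldName: FieldName): Int

  override def read(it: TokenIterator)(implicit fieldName: FieldName): Int =
    readUnboxed(it)
}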

Support for dynamic rename/add in WriterBuilder

Demo class:

object Demo extends App {
  case class Data(key: String, field: String)

  def createWriter(firstFieldName: String, secondFieldName: String): JsonWriter[Data] = jsonWriter[Data] {
    describe {
      WriterBuilder[Data]
        .rename(_.field)(firstFieldName)
        .add(secondFieldName)(_.field)
    }
  }

  def serialize(data: Data): Unit = {
    implicit val writer: JsonWriter[Data] = createWriter("superField", "mehField")

    println(data.asJson)
  }

  serialize(Data("1", "fffff"))
}

Expected output: {"key":"1","superField":"fffff","mehField":"fffff"}
Result:

Error:(12, 14) unknown tree: tethys.derivation.builder.WriterBuilder.apply[Demo.Data].rename[String](((x$1: Demo.Data) => x$1.field))(firstFieldName)
describe {
Error:(12, 14) unknown tree: tethys.derivation.builder.WriterBuilder.apply[Demo.Data].add(secondFieldName).apply[String](((x$1: Demo.Data) => x$1.field))
describe {

Wrong artifact naming

We need to replace tethys-macro-derivation with tethys-derivation and tethys-jackson-backend with tethys-jackson

Json path

Support for a JSON-Path-like DSL:

JsonPath.foo.bar.baz.as[Int]
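
One hypothetical way to get the path-building half of such a DSL is scala.Dynamic; the .as[Int] step would then have to produce a JsonReader[Int] that skips tokens until it reaches the accumulated path:

import scala.language.dynamics

final class JsonPath(val path: Vector[String]) extends Dynamic {
  def selectDynamic(field: String): JsonPath = new JsonPath(path :+ field)
}

object JsonPath extends Dynamic {
  def selectDynamic(field: String): JsonPath = new JsonPath(Vector(field))
}

// JsonPath.foo.bar.baz.path == Vector("foo", "bar", "baz")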

Diverging implicits when building serializer for generic types.

Hi, thank you for a cool project!

I've encountered an issue with implicit resolution; here is a somewhat minimal case reproducing the error in the title:

trait Res[+A] extends Product {
  def code: Res.Code
}

object Res {
  sealed trait Code
  final case object Succ extends Code
  final case object Fail extends Code

  final case class Ok[A](data: A, code: Code = Succ) extends Res[A]
  final case class Error[A](message: String, code: Code = Fail) extends Res[A]
}

final case class Item(name: String)

package object implicits {
  import tethys._
  import tethys.jackson._
  import tethys.derivation.semiauto._

  implicit val itemReader: JsonReader[Item] = jsonReader[Item]
  implicit val itemWriter: JsonObjectWriter[Item] = jsonWriter[Item]

  implicit def resReader[A: JsonReader]: JsonReader[Res[A]] = 
    JsonReader.builder
      .addField[String]("code")
      .selectReader[Res[A]]{
        case "SUCCESS" => JsonReader.builder.addField[A]("data").buildReader(Res.Ok[A](_))
        case "FAILURE" => JsonReader.builder.addField[String]("message").buildReader(Res.Error[A](_))
      }

  implicit def okWriter[A: JsonObjectWriter]: JsonObjectWriter[Res.Ok[A]] =
    JsonWriter.obj[Res.Ok[A]]
      .addField("code")(_ => "SUCCESS")
      .addField("data")(_.asJson)
  implicit val errorWriter: JsonObjectWriter[Res.Error[_]] =
    JsonWriter.obj[Res.Error[_]]
      .addField("code")(_ => "FAILURE")
      .addField("message")(_.message)
}

...

>>>          Ok(List(Item("a"), Item("b"))).asJson
...
> diverging implicit expansion for type tethys.JsonWriter[List[io.expload.api.Item]]

Travis CI support

Maybe it's time to start running tests in CI when a pull request is submitted?

Enumeratum integration

Consider adding a module for deriving readers and writers for Enumeratum types (EnumEntry and ValueEnumEntry subtypes)
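
A minimal sketch of what such a module could provide. The helper names are hypothetical, and a real implementation would report unknown values as a ReaderError instead of letting withName throw:

import enumeratum._
import tethys._
import tethys.writers.tokens.TokenWriter

object TethysEnumeratum {
  // read an enum value from its entry name
  def enumReader[E <: EnumEntry](e: Enum[E]): JsonReader[E] =
    implicitly[JsonReader[String]].map(e.withName)

  // write an enum value as its entry name
  def enumWriter[E <: EnumEntry]: JsonWriter[E] = new JsonWriter[E] {
    override def write(value: E, tokenWriter: TokenWriter): Unit =
      tokenWriter.writeString(value.entryName)
  }
}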

Supporting unit tests

I have a case where I would like to compare expected and actual JSON:

Get("some-route") ~> routes ~> check {
  responseAs[String] shouldBe """{"a": 1, "b": 2}"""
}

In this example the test fails even in the successful case, because of JSON formatting differences.

Is there any way to collect all tokens from the stream into a Seq[Token] so that JSON strings can be compared?

JsonReaderBuilder signatures rework

Some changes in #36 and #38 have a huge impact on the whole code base, so we need to make a few changes:

  • add an addField overload with an explicit jsonReader parameter
  • remove the buildReader overload with the strict parameter and add a buildStrictReader function instead

not found: value ReaderError

Hello!

I've tried to use the camelCase <-> snake_case field style FieldStyle.lowerSnakecase:

case class QuestionCount(totalCount: Int, activeCount: Int)

implicit val questionCountResponseFormat: JsonReader[Response.QuestionCount] = jsonReader[Response.QuestionCount](
    ReaderDerivationConfig.withFieldStyle(FieldStyle.lowerSnakecase).strict
)

I get the error

Error:(77, 116) not found: value ReaderError implicit val questionCountResponseFormat: JsonReader[Response.QuestionCount] = jsonReader[Response.QuestionCount](

and not only in this case, but also in similar places in the code.

The error goes away when I import tethys.readers.ReaderError, which is a little strange because ReaderError is not an implicit of any kind.

Scalatest integration

We need a special matcher for JSON strings that doesn't care about the order of object fields. Something like this:

"""{"a": 1, "b": [1, 2, 3], "c": true }""" should equalJson(obj(
  "a" -> 1,
  "c" -> true,
  "b" -> arr(1, 2, 3)
))
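
One simpler shape of such a matcher, comparing two JSON strings structurally rather than via an obj/arr DSL: jackson-databind's JsonNode equality ignores object field order. All names here are hypothetical, and jackson-databind is assumed to be on the test classpath:

import com.fasterxml.jackson.databind.ObjectMapper
import org.scalatest.matchers.{MatchResult, Matcher}

object JsonMatchers {
  private val mapper = new ObjectMapper()

  def equalJson(expected: String): Matcher[String] = Matcher { actual =>
    MatchResult(
      mapper.readTree(actual) == mapper.readTree(expected),
      s"$actual was not structurally equal to $expected",
      s"$actual was structurally equal to $expected"
    )
  }
}

// """{"a": 1, "b": [1, 2, 3], "c": true}""" should JsonMatchers.equalJson("""{"c": true, "a": 1, "b": [1, 2, 3]}""")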

Convert underscore names to camelcase names

It might be a useful feature to convert object keys when reading from / writing to JSON.

For example:
A client sends me {"client_id": 123} and I want to read it into case class Request(clientId: Int).
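
For reference, a sketch of how this could look with the FieldStyle configuration that shows up in a later issue (the import paths and the exact spelling of the constructors are assumptions):

import tethys._
import tethys.jackson._
import tethys.derivation.semiauto._
import tethys.derivation.builder.{FieldStyle, ReaderDerivationConfig} // packages assumed

case class Request(clientId: Int)

implicit val requestReader: JsonReader[Request] =
  jsonReader[Request](ReaderDerivationConfig.withFieldStyle(FieldStyle.lowerSnakecase))

// """{"client_id": 123}""".jsonAs[Request] == Right(Request(123))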
