
aeson's Introduction

Welcome to aeson


aeson is a fast Haskell library for working with JSON data.

Join in!

We are happy to receive bug reports, fixes, documentation enhancements, and other improvements.

Please report bugs via the github issue tracker.

Master git repository:

  • git clone git://github.com/haskell/aeson.git

See what's changed in recent (and upcoming) releases:

(You can create and contribute changes using either git or Mercurial.)

Authors

This library was originally written by Bryan O'Sullivan.

aeson's Issues

Modify Parser error value

Motivated by yesodweb/persistent#60.

It would be nice to be able to provide more information for an error message. For example:

parseJSON (Object o) = modifyFailure
    (\s -> "Parsing of the Persistent config file failed: " ++ s)
    (Foo <$> o .: "someField")

Currently, I could do something like <|> fail "Parsing of Persistent config file failed", but I can't get access to the original error message.
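
For reference, such a combinator is small to write against a CPS-style Parser. A rough sketch, where the Parser internals shown (failure and success continuations) are an assumption for illustration, not necessarily aeson's exact definition:

{-# LANGUAGE RankNTypes #-}

-- Assumed, simplified CPS representation of aeson's Parser.
newtype Parser a = Parser
  { runParser :: forall f r. (String -> f r)  -- failure continuation
                          -> (a -> f r)       -- success continuation
                          -> f r }

-- Rewrite the error message of a failing parser by pre-composing the
-- failure continuation with the supplied function.
modifyFailure :: (String -> String) -> Parser a -> Parser a
modifyFailure f (Parser p) = Parser $ \kf ks -> p (kf . f) ks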

Generic _ fields handling

Example:

data Solid = Solid {
    _id :: String,
    solidStr :: String
} deriving (Show, Eq, Data, Typeable)

solidJson = "{\"_id\":\"a\",\"solidStr\":\"s\"}"

And now the big headache:

decode solidJson :: Maybe Solid

Expect:

Just (Solid {_id = "a", solidStr="s"})

But got:

Nothing

This happens only when the JSON contains _-prefixed fields. But I use aeson with CouchDB, and CouchDB sends the "system" fields _id and _rev.
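
In the meantime, a hand-written instance sidesteps the problem. A minimal sketch, using the field names from the example above:

{-# LANGUAGE OverloadedStrings #-}
import Control.Applicative ((<$>), (<*>))
import Data.Aeson

data Solid = Solid { _id :: String, solidStr :: String } deriving (Show, Eq)

-- Explicit instance: map the JSON keys (including the underscore-prefixed
-- "_id" that CouchDB sends) onto the record fields by hand.
instance FromJSON Solid where
  parseJSON (Object o) = Solid <$> o .: "_id" <*> o .: "solidStr"
  parseJSON _          = fail "expected an object for Solid"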

Decoding an encoded unary constructor fails

Data.Aeson.Generic, Data.Aeson.TH, and the GHC Generics code all encode types with a single unary constructor like:

data T = C Int

using the encoding of the argument of the constructor. For example:

> toJSON (C 1)
Number 1

It can also be converted back:

> fromJSON (toJSON (C 1)) :: Result T
Success (C 1)

Let's try to encode it:

> encode (toJSON (C 1))
Chunk "1" Empty

That works. Now let's try to decode this encoded value:

> decode (encode (toJSON (C 1))) :: Maybe T
Nothing

??? I would have expected to get Just (C 1).

Of course, the reason this happens is that decode only accepts a top-level JSON value, which must be either an object or an array per RFC 4627.

The question is: should we change the encoding to use a one-element array instead? Bryan, what do you think?
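
Until that is decided, a caller can work around the restriction by wrapping the value in a one-element array before encoding and unwrapping it after decoding. A rough sketch:

import Data.Aeson
import Data.ByteString.Lazy (ByteString)

-- Wrap the value in a singleton array so the top level is valid per RFC 4627.
encodeWrapped :: ToJSON a => a -> ByteString
encodeWrapped = encode . (:[]) . toJSON

-- Expect exactly one element back and convert it to the target type.
decodeWrapped :: FromJSON a => ByteString -> Maybe a
decodeWrapped bs = case decode bs of
  Just [v] -> case fromJSON v of
                Success x -> Just x
                _         -> Nothing
  _        -> Nothing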

ToJSON and FromJSON instances of UTCTime use different formats

https://github.com/bos/aeson/blob/master/Data/Aeson/Types/Class.hs#L655

instance ToJSON UTCTime where
    toJSON t = String (pack (take 23 str ++ "Z"))
      where str = formatTime defaultTimeLocale "%FT%T%Q" t
    {-# INLINE toJSON #-}

instance FromJSON UTCTime where
    parseJSON = withText "UTCTime" $ \t ->
        case parseTime defaultTimeLocale "%FT%T%QZ" (unpack t) of
          Just d -> pure d
          _      -> fail "could not parse ISO-8601 date"
    {-# INLINE parseJSON #-}

When we serialize a UTCTime using the "%FT%T%Q" template, we can't read it back with "%FT%T%QZ".

cabal install problems on OSX

I tried to install version 0.4; here is the build log and the error. I'm not sure if I did anything wrong, but I'm happy to help solve the problem. I did do a cabal update before trying the installation. The other tricky thing is that I believe I'm running the 64-bit version on OS X.

bverdier-mbp:~ bverdier$ ghc --version
The Glorious Glasgow Haskell Compilation System, version 7.0.3

bverdier-mbp:~ bverdier$ sudo cabal install aeson
Password:
Resolving dependencies...
Configuring text-0.11.1.9...
Preprocessing library text-0.11.1.9...
Preprocessing test suites for text-0.11.1.9...
Building text-0.11.1.9...
<command line>: cannot satisfy -package-id deepseq-1.1.0.2-0465f803f7d27d264907e7e03e72a71f
(use -v for more information)
cabal: Error: some packages failed to install:
aeson-0.3.2.12 depends on text-0.11.1.9 which failed to install.
blaze-builder-0.3.0.1 depends on text-0.11.1.9 which failed to install.
blaze-textual-0.2.0.4 depends on text-0.11.1.9 which failed to install.
double-conversion-0.2.0.1 depends on text-0.11.1.9 which failed to install.
hashable-1.1.2.1 depends on text-0.11.1.9 which failed to install.
text-0.11.1.9 failed during the building phase. The exception was:
ExitFailure 1
unordered-containers-0.1.4.3 depends on text-0.11.1.9 which failed to install.
bverdier-mbp:~ bverdier$ sudo cabal update
Downloading the latest package list from hackage.haskell.org
bverdier-mbp:~ bverdier$ sudo cabal install aeson
Resolving dependencies...
Configuring attoparsec-0.9.1.2...
Preprocessing library attoparsec-0.9.1.2...
Building attoparsec-0.9.1.2...
[1 of 9] Compiling Data.Attoparsec.Zepto ( Data/Attoparsec/Zepto.hs, dist/build/Data/Attoparsec/Zepto.o )
[2 of 9] Compiling Data.Attoparsec.Internal.Types ( Data/Attoparsec/Internal/Types.hs, dist/build/Data/Attoparsec/Internal/Types.o )
[3 of 9] Compiling Data.Attoparsec.Number ( Data/Attoparsec/Number.hs, dist/build/Data/Attoparsec/Number.o )
[4 of 9] Compiling Data.Attoparsec.FastSet ( Data/Attoparsec/FastSet.hs, dist/build/Data/Attoparsec/FastSet.o )
[5 of 9] Compiling Data.Attoparsec.Combinator ( Data/Attoparsec/Combinator.hs, dist/build/Data/Attoparsec/Combinator.o )
[6 of 9] Compiling Data.Attoparsec.Internal ( Data/Attoparsec/Internal.hs, dist/build/Data/Attoparsec/Internal.o )
[7 of 9] Compiling Data.Attoparsec ( Data/Attoparsec.hs, dist/build/Data/Attoparsec.o )
[8 of 9] Compiling Data.Attoparsec.Char8 ( Data/Attoparsec/Char8.hs, dist/build/Data/Attoparsec/Char8.o )
[9 of 9] Compiling Data.Attoparsec.Lazy ( Data/Attoparsec/Lazy.hs, dist/build/Data/Attoparsec/Lazy.o )
[1 of 9] Compiling Data.Attoparsec.Zepto ( Data/Attoparsec/Zepto.hs, dist/build/Data/Attoparsec/Zepto.p_o )
[2 of 9] Compiling Data.Attoparsec.Internal.Types ( Data/Attoparsec/Internal/Types.hs, dist/build/Data/Attoparsec/Internal/Types.p_o )
[3 of 9] Compiling Data.Attoparsec.Number ( Data/Attoparsec/Number.hs, dist/build/Data/Attoparsec/Number.p_o )
[4 of 9] Compiling Data.Attoparsec.FastSet ( Data/Attoparsec/FastSet.hs, dist/build/Data/Attoparsec/FastSet.p_o )
[5 of 9] Compiling Data.Attoparsec.Combinator ( Data/Attoparsec/Combinator.hs, dist/build/Data/Attoparsec/Combinator.p_o )
[6 of 9] Compiling Data.Attoparsec.Internal ( Data/Attoparsec/Internal.hs, dist/build/Data/Attoparsec/Internal.p_o )
[7 of 9] Compiling Data.Attoparsec ( Data/Attoparsec.hs, dist/build/Data/Attoparsec.p_o )
[8 of 9] Compiling Data.Attoparsec.Char8 ( Data/Attoparsec/Char8.hs, dist/build/Data/Attoparsec/Char8.p_o )
[9 of 9] Compiling Data.Attoparsec.Lazy ( Data/Attoparsec/Lazy.hs, dist/build/Data/Attoparsec/Lazy.p_o )
Registering attoparsec-0.9.1.2...
Running Haddock for attoparsec-0.9.1.2...
Preprocessing library attoparsec-0.9.1.2...
Warning: The documentation for the following packages are not installed. No
links will be generated to these packages: ffi-1.0, rts-1.0
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Zepto.hs: 7/7 100%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Internal/Types.hs: 4/11 36%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Number.hs: 2/2 100%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/FastSet.hs: 9/13 69%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Combinator.hs: 13/13 100%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Internal.hs: 38/41 93%
haddock coverage for dist/build/tmp73532/Data/Attoparsec.hs: 48/49 98%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Char8.hs: 67/68 99%
haddock coverage for dist/build/tmp73532/Data/Attoparsec/Lazy.hs: 9/9 100%
Documentation created: dist/doc/html/attoparsec/index.html
Installing library in
/Users/bverdier/Library/Haskell/ghc-7.0.3/lib/attoparsec-0.9.1.2/lib
Registering attoparsec-0.9.1.2...
Configuring blaze-builder-0.3.0.1...
Preprocessing library blaze-builder-0.3.0.1...
Building blaze-builder-0.3.0.1...
[ 1 of 13] Compiling Blaze.ByteString.Builder.Internal.Types ( Blaze/ByteString/Builder/Internal/Types.hs, dist/build/Blaze/ByteString/Builder/Internal/Types.o )
[ 2 of 13] Compiling Blaze.ByteString.Builder.Internal.Write ( Blaze/ByteString/Builder/Internal/Write.hs, dist/build/Blaze/ByteString/Builder/Internal/Write.o )
[ 3 of 13] Compiling Blaze.ByteString.Builder.Internal.Buffer ( Blaze/ByteString/Builder/Internal/Buffer.hs, dist/build/Blaze/ByteString/Builder/Internal/Buffer.o )
[ 4 of 13] Compiling Blaze.ByteString.Builder.Internal.UncheckedShifts ( Blaze/ByteString/Builder/Internal/UncheckedShifts.hs, dist/build/Blaze/ByteString/Builder/Internal/UncheckedShifts.o )
[ 5 of 13] Compiling Blaze.ByteString.Builder.Internal ( Blaze/ByteString/Builder/Internal.hs, dist/build/Blaze/ByteString/Builder/Internal.o )
[ 6 of 13] Compiling Blaze.ByteString.Builder.Char.Utf8 ( Blaze/ByteString/Builder/Char/Utf8.hs, dist/build/Blaze/ByteString/Builder/Char/Utf8.o )
[ 7 of 13] Compiling Blaze.ByteString.Builder.ByteString ( Blaze/ByteString/Builder/ByteString.hs, dist/build/Blaze/ByteString/Builder/ByteString.o )
[ 8 of 13] Compiling Blaze.ByteString.Builder.Word ( Blaze/ByteString/Builder/Word.hs, dist/build/Blaze/ByteString/Builder/Word.o )

Blaze/ByteString/Builder/Word.hs:100:1:
    Warning: The import of `GHC.Word' is redundant
               except perhaps to import instances from `GHC.Word'
             To import instances alone, use: import GHC.Word()
[ 9 of 13] Compiling Blaze.ByteString.Builder.Char8 ( Blaze/ByteString/Builder/Char8.hs, dist/build/Blaze/ByteString/Builder/Char8.o )
[10 of 13] Compiling Blaze.ByteString.Builder.HTTP ( Blaze/ByteString/Builder/HTTP.hs, dist/build/Blaze/ByteString/Builder/HTTP.o )
[11 of 13] Compiling Blaze.ByteString.Builder.Int ( Blaze/ByteString/Builder/Int.hs, dist/build/Blaze/ByteString/Builder/Int.o )
[12 of 13] Compiling Blaze.ByteString.Builder ( Blaze/ByteString/Builder.hs, dist/build/Blaze/ByteString/Builder.o )
[13 of 13] Compiling Blaze.ByteString.Builder.Html.Utf8 ( Blaze/ByteString/Builder/Html/Utf8.hs, dist/build/Blaze/ByteString/Builder/Html/Utf8.o )
[ 1 of 13] Compiling Blaze.ByteString.Builder.Internal.Types ( Blaze/ByteString/Builder/Internal/Types.hs, dist/build/Blaze/ByteString/Builder/Internal/Types.p_o )
[ 2 of 13] Compiling Blaze.ByteString.Builder.Internal.Write ( Blaze/ByteString/Builder/Internal/Write.hs, dist/build/Blaze/ByteString/Builder/Internal/Write.p_o )
[ 3 of 13] Compiling Blaze.ByteString.Builder.Internal.Buffer ( Blaze/ByteString/Builder/Internal/Buffer.hs, dist/build/Blaze/ByteString/Builder/Internal/Buffer.p_o )
[ 4 of 13] Compiling Blaze.ByteString.Builder.Internal.UncheckedShifts ( Blaze/ByteString/Builder/Internal/UncheckedShifts.hs, dist/build/Blaze/ByteString/Builder/Internal/UncheckedShifts.p_o )
[ 5 of 13] Compiling Blaze.ByteString.Builder.Internal ( Blaze/ByteString/Builder/Internal.hs, dist/build/Blaze/ByteString/Builder/Internal.p_o )
[ 6 of 13] Compiling Blaze.ByteString.Builder.Char.Utf8 ( Blaze/ByteString/Builder/Char/Utf8.hs, dist/build/Blaze/ByteString/Builder/Char/Utf8.p_o )
[ 7 of 13] Compiling Blaze.ByteString.Builder.ByteString ( Blaze/ByteString/Builder/ByteString.hs, dist/build/Blaze/ByteString/Builder/ByteString.p_o )
[ 8 of 13] Compiling Blaze.ByteString.Builder.Word ( Blaze/ByteString/Builder/Word.hs, dist/build/Blaze/ByteString/Builder/Word.p_o )

Blaze/ByteString/Builder/Word.hs:100:1:
    Warning: The import of `GHC.Word' is redundant
               except perhaps to import instances from `GHC.Word'
             To import instances alone, use: import GHC.Word()
[ 9 of 13] Compiling Blaze.ByteString.Builder.Char8 ( Blaze/ByteString/Builder/Char8.hs, dist/build/Blaze/ByteString/Builder/Char8.p_o )
[10 of 13] Compiling Blaze.ByteString.Builder.HTTP ( Blaze/ByteString/Builder/HTTP.hs, dist/build/Blaze/ByteString/Builder/HTTP.p_o )
[11 of 13] Compiling Blaze.ByteString.Builder.Int ( Blaze/ByteString/Builder/Int.hs, dist/build/Blaze/ByteString/Builder/Int.p_o )
[12 of 13] Compiling Blaze.ByteString.Builder ( Blaze/ByteString/Builder.hs, dist/build/Blaze/ByteString/Builder.p_o )
[13 of 13] Compiling Blaze.ByteString.Builder.Html.Utf8 ( Blaze/ByteString/Builder/Html/Utf8.hs, dist/build/Blaze/ByteString/Builder/Html/Utf8.p_o )
Registering blaze-builder-0.3.0.1...
Running Haddock for blaze-builder-0.3.0.1...
Preprocessing library blaze-builder-0.3.0.1...
Warning: The documentation for the following packages are not installed. No
links will be generated to these packages: ffi-1.0, rts-1.0
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Internal/Types.hs: 4/15 27%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Internal/Write.hs: 24/26 92%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Internal/Buffer.hs: 22/22 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Internal/UncheckedShifts.hs: 1/4 25%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Internal.hs: 21/31 68%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Char/Utf8.hs: 9/9 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/ByteString.hs: 12/12 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Word.hs: 43/43 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Char8.hs: 9/9 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/HTTP.hs: 4/4 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Int.hs: 43/43 100%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder.hs: 21/23 91%
haddock coverage for dist/build/tmp73532/Blaze/ByteString/Builder/Html/Utf8.hs: 10/10 100%
Documentation created: dist/doc/html/blaze-builder/index.html
Installing library in
/Users/bverdier/Library/Haskell/ghc-7.0.3/lib/blaze-builder-0.3.0.1/lib
Registering blaze-builder-0.3.0.1...
Downloading blaze-textual-0.2.0.5...
Configuring blaze-textual-0.2.0.5...
Preprocessing library blaze-textual-0.2.0.5...
Preprocessing test suites for blaze-textual-0.2.0.5...
Building blaze-textual-0.2.0.5...
[1 of 4] Compiling Blaze.Text.Int ( Blaze/Text/Int.hs, dist/build/Blaze/Text/Int.o )
[2 of 4] Compiling Blaze.Text.Double.Native ( Blaze/Text/Double/Native.hs, dist/build/Blaze/Text/Double/Native.o )
[3 of 4] Compiling Blaze.Text.Double ( Blaze/Text/Double.hs, dist/build/Blaze/Text/Double.o )
[4 of 4] Compiling Blaze.Text ( Blaze/Text.hs, dist/build/Blaze/Text.o )
[1 of 4] Compiling Blaze.Text.Int ( Blaze/Text/Int.hs, dist/build/Blaze/Text/Int.p_o )
[2 of 4] Compiling Blaze.Text.Double.Native ( Blaze/Text/Double/Native.hs, dist/build/Blaze/Text/Double/Native.p_o )
[3 of 4] Compiling Blaze.Text.Double ( Blaze/Text/Double.hs, dist/build/Blaze/Text/Double.p_o )
[4 of 4] Compiling Blaze.Text ( Blaze/Text.hs, dist/build/Blaze/Text.p_o )
Registering blaze-textual-0.2.0.5...
Running Haddock for blaze-textual-0.2.0.5...
Preprocessing library blaze-textual-0.2.0.5...
Preprocessing test suites for blaze-textual-0.2.0.5...
Warning: The documentation for the following packages are not installed. No
links will be generated to these packages: ffi-1.0, rts-1.0
haddock coverage for dist/build/tmp73532/Blaze/Text/Int.hs: 0/4 0%
haddock coverage for dist/build/tmp73532/Blaze/Text/Double/Native.hs: 0/3 0%
haddock coverage for dist/build/tmp73532/Blaze/Text/Double.hs: 0/3 0%
haddock coverage for dist/build/tmp73532/Blaze/Text.hs: 0/4 0%
Documentation created: dist/doc/html/blaze-textual/index.html
Installing library in
/Users/bverdier/Library/Haskell/ghc-7.0.3/lib/blaze-textual-0.2.0.5/lib
Registering blaze-textual-0.2.0.5...
Configuring hashable-1.1.2.1...
Preprocessing library hashable-1.1.2.1...
Preprocessing test suites for hashable-1.1.2.1...
Building hashable-1.1.2.1...
[1 of 1] Compiling Data.Hashable ( Data/Hashable.hs, dist/build/Data/Hashable.o )
[1 of 1] Compiling Data.Hashable ( Data/Hashable.hs, dist/build/Data/Hashable.p_o )
Registering hashable-1.1.2.1...
Running Haddock for hashable-1.1.2.1...
Preprocessing library hashable-1.1.2.1...
Preprocessing test suites for hashable-1.1.2.1...
Warning: The documentation for the following packages are not installed. No
links will be generated to these packages: ffi-1.0, rts-1.0
haddock coverage for dist/build/tmp73532/Data/Hashable.hs: 10/10 100%
Documentation created: dist/doc/html/hashable/index.html
Installing library in
/Users/bverdier/Library/Haskell/ghc-7.0.3/lib/hashable-1.1.2.1/lib
Registering hashable-1.1.2.1...
Downloading unordered-containers-0.1.4.5...
Configuring unordered-containers-0.1.4.5...
Preprocessing library unordered-containers-0.1.4.5...
Preprocessing test suites for unordered-containers-0.1.4.5...
Building unordered-containers-0.1.4.5...
[1 of 8] Compiling Data.FullList.Lazy ( Data/FullList/Lazy.hs, dist/build/Data/FullList/Lazy.o )
[2 of 8] Compiling Data.HashMap.Common ( Data/HashMap/Common.hs, dist/build/Data/HashMap/Common.o )
[3 of 8] Compiling Data.FullList.Strict ( Data/FullList/Strict.hs, dist/build/Data/FullList/Strict.o )
[4 of 8] Compiling Data.HashMap.Lazy ( Data/HashMap/Lazy.hs, dist/build/Data/HashMap/Lazy.o )
[5 of 8] Compiling Data.HashMap.Strict ( Data/HashMap/Strict.hs, dist/build/Data/HashMap/Strict.o )
[6 of 8] Compiling Data.HashSet ( Data/HashSet.hs, dist/build/Data/HashSet.o )
[7 of 8] Compiling Data.HashMap.Lazy.Internal ( Data/HashMap/Lazy/Internal.hs, dist/build/Data/HashMap/Lazy/Internal.o )
[8 of 8] Compiling Data.HashMap.Strict.Internal ( Data/HashMap/Strict/Internal.hs, dist/build/Data/HashMap/Strict/Internal.o )
[1 of 8] Compiling Data.FullList.Lazy ( Data/FullList/Lazy.hs, dist/build/Data/FullList/Lazy.p_o )
[2 of 8] Compiling Data.HashMap.Common ( Data/HashMap/Common.hs, dist/build/Data/HashMap/Common.p_o )
[3 of 8] Compiling Data.FullList.Strict ( Data/FullList/Strict.hs, dist/build/Data/FullList/Strict.p_o )
[4 of 8] Compiling Data.HashMap.Lazy ( Data/HashMap/Lazy.hs, dist/build/Data/HashMap/Lazy.p_o )
[5 of 8] Compiling Data.HashMap.Strict ( Data/HashMap/Strict.hs, dist/build/Data/HashMap/Strict.p_o )
[6 of 8] Compiling Data.HashSet ( Data/HashSet.hs, dist/build/Data/HashSet.p_o )
[7 of 8] Compiling Data.HashMap.Lazy.Internal ( Data/HashMap/Lazy/Internal.hs, dist/build/Data/HashMap/Lazy/Internal.p_o )
[8 of 8] Compiling Data.HashMap.Strict.Internal ( Data/HashMap/Strict/Internal.hs, dist/build/Data/HashMap/Strict/Internal.p_o )
Registering unordered-containers-0.1.4.5...
Running Haddock for unordered-containers-0.1.4.5...
Preprocessing library unordered-containers-0.1.4.5...
Preprocessing test suites for unordered-containers-0.1.4.5...
Warning: The documentation for the following packages are not installed. No
links will be generated to these packages: ffi-1.0, rts-1.0
haddock coverage for dist/build/tmp73532/Data/FullList/Lazy.hs: 10/26 38%
haddock coverage for dist/build/tmp73532/Data/HashMap/Common.hs: 18/21 86%
haddock coverage for dist/build/tmp73532/Data/FullList/Strict.hs: 8/21 38%
haddock coverage for dist/build/tmp73532/Data/HashMap/Lazy.hs: 39/39 100%
haddock coverage for dist/build/tmp73532/Data/HashMap/Strict.hs: 39/39 100%
haddock coverage for dist/build/tmp73532/Data/HashSet.hs: 26/26 100%
haddock coverage for dist/build/tmp73532/Data/HashMap/Lazy/Internal.hs: 3/3 100%
haddock coverage for dist/build/tmp73532/Data/HashMap/Strict/Internal.hs: 3/3 100%
Documentation created: dist/doc/html/unordered-containers/index.html
Installing library in
/Users/bverdier/Library/Haskell/ghc-7.0.3/lib/unordered-containers-0.1.4.5/lib
Registering unordered-containers-0.1.4.5...
Downloading aeson-0.4.0.0...
Configuring aeson-0.4.0.0...
Preprocessing library aeson-0.4.0.0...
Preprocessing test suites for aeson-0.4.0.0...
Building aeson-0.4.0.0...
<command line>: cannot satisfy -package-id mtl-2.0.1.0-5b7a9cce5565d8cc8721ba4f95becf1b
(use -v for more information)
Updating documentation index /Users/bverdier/Library/Haskell/doc/index.html
cabal: Error: some packages failed to install:
aeson-0.4.0.0 failed during the building phase. The exception was:
ExitFailure 1

Array convenience function

An array function similar to the object function would be handy. See the example below.

{-# LANGUAGE OverloadedStrings #-}

import Data.Text (Text)
import Data.Aeson
import qualified Data.Vector as V

array :: [Value] -> Value
array = Array . V.fromList

obj :: Value
obj = object [
    ("name", "David"),
    ("age", Number 34),
    ("location", Null),
    ("likes", array [Number 7, "Cheese"])]

encoded = encode obj

Encoding always escapes as if JSON may be embedded in HTML

It seems that JSON strings are always encoded as if they might need to be embedded in HTML.

See the following ghci output:

Prelude> Data.Aeson.Encode.fromValue (Data.Aeson.String (Data.Text.pack "a<b>"))
"\"a\u003cb\u003e\""
Prelude>

I have JSON data with many '<' and '>' characters, and it's verbose and ugly to have them all encoded like this. This was apparently introduced in issue #81 to prevent issues when embedding in HTML. My data will never be embedded in HTML, and I don't think the JSON standard requires this encoding.

Is it really the job of the fromValue function to avoid XSS attacks? I would have thought that was the job of the code that actually embeds the JSON in HTML.

Unicode value decoding bug

This code

{-# LANGUAGE DeriveDataTypeable #-}
import qualified Data.ByteString.Lazy.Char8 as LBS
import qualified Data.Text.Lazy as LText
import qualified Data.Text.Lazy.Encoding as LText
import qualified Data.Aeson.Generic as GenericAeson
import Data.Generics

data A = A { a :: String } deriving (Data, Typeable, Show)

json = "{\"a\": \"Ёжик лижет мёд.\"}"

main = do
  let jsonLBS = LText.encodeUtf8 $ LText.pack json
  LBS.putStrLn jsonLBS
  let Just a = GenericAeson.decode jsonLBS :: Maybe A
  print a

outputs

{"a": "Ёжик лижет мёд."}
A {a = "\1025\1078\1080\1082 \1083\1080\1078\1077\1090 \1084\1105\1076."}

As you can see the A gets constructed with an encoded value.

Generalize the representation of numbers

The JSON spec does not specify the data types to be used for numbers, just their grammar. This allows for fixed precision decimal numbers, which Aeson does not support. Much flexibility could be gained from switching numbers to something like

data Number = Number { significantDigits :: ByteString, decimalDigits :: ByteString, exponent :: ByteString }

This way, it would be up to the user of Aeson what type they want to use for their numbers.

Jeff

Data.Aeson.Generic.decode unable to parse Data.Aeson.Generic.encode output (aeson-0.6.0.0)

The following is OK:

data Foo = Foo deriving (Typeable,Data,Show)

λ> encode Foo
Chunk "[]" Empty
λ> decode (encode Foo) :: Maybe Foo
Just Foo

But there is a problem for more than one nullary constructor:

data Foo = Foo | Bar deriving (Typeable,Data,Show)

λ> encode Foo
Chunk "\"Foo\"" Empty    
λ> decode (encode Foo) :: Maybe Foo
Nothing

I didn't yet look at the code, but thought I'd make a note of it somewhere, as it's kind of a glaring issue.

Unnecessary re-compilation during cabal install

Hi,
I don't know why, but if I have all the dependent libraries (e.g. text) installed independently and then run "cabal install aeson", all the dependent packages are recompiled.
Furthermore, if afterwards I run "cabal install text", the package (text) gets re-installed, and if I then run "cabal install aeson" it happens again...
(I'm using ghc-7.0.3)
Suggestions?

Unexpected behavior of .:?

I want to have distinction between absence of field and its null value.

data Blah = Blah (Maybe (Maybe String))
instance FromJSON Blah where
  parseJSON (Object v) = Blah <$> v .:? "blah"
Blah Nothing -- if field not mentioned
Blah (Just Nothing) -- if field has null value

However, the result of parsing is never Blah (Just Nothing) because .:? uses (Maybe a) instance directly.

(.:?) :: (FromJSON a) => Object -> Text -> Parser (Maybe a)
obj .:? key = case H.lookup key obj of
               Nothing -> pure Nothing
               Just v  -> parseJSON v
            -- Just v  -> Just <$> parseJSON v would have expected behavior
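
In the meantime, the distinction can be recovered with a local combinator. A sketch (the name (.:??) is made up here, and Object is assumed to be a strict HashMap as in recent aeson versions):

import Control.Applicative ((<$>))
import Data.Aeson (FromJSON, parseJSON)
import Data.Aeson.Types (Object, Parser)
import Data.Text (Text)
import qualified Data.HashMap.Strict as H

-- Missing key: Nothing.  Present key (even an explicit null): Just <parsed>,
-- so a field of type Maybe (Maybe a) can observe Just Nothing for null.
(.:??) :: FromJSON a => Object -> Text -> Parser (Maybe a)
obj .:?? key = case H.lookup key obj of
                 Nothing -> return Nothing
                 Just v  -> Just <$> parseJSON v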

unable to compile with ghc-7.0.4

I am unable to compile aeson-native-0.3.3:

cabal-dev install aeson-native
Resolving dependencies...
Configuring aeson-native-0.3.3...
Preprocessing library aeson-native-0.3.3...
Building aeson-native-0.3.3...
[1 of 6] Compiling Data.Aeson.Functions ( Data/Aeson/Functions.hs, dist/build/Data/Aeson/Functions.o )
[2 of 6] Compiling Data.Aeson.Types ( Data/Aeson/Types.hs, dist/build/Data/Aeson/Types.o )

Data/Aeson/Types.hs:196:22:
    No instance for (NFData Object)
      arising from a use of `rnf'
    Possible fix: add an instance declaration for (NFData Object)
    In the expression: rnf o
    In an equation for `rnf': rnf (Object o) = rnf o
    In the instance declaration for `NFData Value'
cabal: Error: some packages failed to install:
aeson-native-0.3.3 failed during the building phase. The exception was:
ExitFailure 1

Decoding a Map with a Maybe value fails

I'm using 0.6.0.0 with GHC 7.4.1 on Linux x86_64.

Here's a very basic 'encode' + 'decode' chain that fails:

import Data.Aeson.Generic (decode, encode)
import qualified Data.Map as Map
import Data.Map (Map)

type SomeType = Map String (Maybe String)

main :: IO ()
main = do
  let x = ((Map.fromList [("x", Nothing)]) :: SomeType)
  let y = (decode (encode x) :: Maybe SomeType)
  if (Just x /= y) then
      error "Bug"
    else
      print "OK"

encode to leave out Maybe values that are Nothing

I have come up with a way to not encode Maybe values that are empty, as shown in https://gist.github.com/2465584.

Is there an existing simple way to do this?

If not, should this become a feature in aeson?

  let reply1 = Coord 123.4 20 (Just "foo")
  BL.putStrLn (encode reply1)
  let reply = Coord 123.4 20 Nothing
  BL.putStrLn (encode reply)

results in

{"c":"foo","y":20.0,"x":123.4}
{"y":20.0,"x":123.4}

Make Data.Aeson.Generic encode Maybe the same way as Data.Aeson.Encode

I want to use the generics encoder to encode my custom records as JSON, but I have Maybe fields, and the current implementation encodes those as "Nothing" and {"Just": 12} instead of as null and 12. Numbers and booleans are both special cased in the Generic encoder, so it seems like this should be possible.
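
For comparison, the desired behaviour is roughly what the ordinary (non-generic) Maybe instance already does:

instance ToJSON a => ToJSON (Maybe a) where
    toJSON (Just x) = toJSON x   -- Just 12  ->  12
    toJSON Nothing  = Null       -- Nothing  ->  null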

Integer overflows silently ignored in fromJSON

I just stumbled over the following behavior while hunting down a bug caused by it:

> fromJSON (Number 260) :: Result Word8
Success 4
it :: Result Word8

Imho, the conversion above should have failed in order to detect when assumptions w.r.t. the JSON serialization are wrong, as 260 can't be properly represented in a Word8; or put differently, I'd expect the following pseudo law/property to hold:

toJSON <$> fromJSON x == pure x  -- iff fromJSON succeeds
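
A bounds-checked parser is easy to express against the Number representation aeson used at the time. A sketch of the behaviour being asked for (not the library's implementation; floating-point values are simply rejected here):

import Data.Aeson.Types (Parser, Value(..), typeMismatch)
import Data.Attoparsec.Number (Number(..))
import Data.Word (Word8)

parseWord8 :: Value -> Parser Word8
parseWord8 (Number (I n))
  | n >= fromIntegral (minBound :: Word8) &&
    n <= fromIntegral (maxBound :: Word8) = return (fromIntegral n)
  | otherwise = fail ("out of range for Word8: " ++ show n)
parseWord8 v = typeMismatch "Word8" v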

encodeDouble test fails linux amd64 ghc 7.2.1 and 7.0.4

ghc 7.0.4 gentoo amd64:
make -j8 -C /var/tmp/portage/dev-haskell/aeson-0.3.2.12/work/aeson-0.3.2.12/tests/
make: Entering directory `/var/tmp/portage/dev-haskell/aeson-0.3.2.12/work/aeson-0.3.2.12/tests'
ghc -hide-package aeson-native -Wall -o -threaded --make qc Properties.hs
[1 of 1] Compiling Main ( Properties.hs, Properties.o )
Linking qc ...
make: Leaving directory `/var/tmp/portage/dev-haskell/aeson-0.3.2.12/work/aeson-0.3.2.12/tests'
encode:
0.0
0.8987273165487037
encodeDouble: [Failed]
Falsifiable with seed -4860853268025276496, after 2 tests. Reason: Falsifiable
encodeInteger: [OK, passed 100 tests]
roundTrip:
roundTripBool: [OK, passed 100 tests]
roundTripDouble: [OK, passed 100 tests]
roundTripInteger: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]
Either Integer Integer: [OK, passed 100 tests]

         Properties  Total
Passed   9           9
Failed   1           1
Total    10          10

  • ERROR: dev-haskell/aeson-0.3.2.12 failed (test phase):
  • tests failed

encode:
0.0
18.0
encodeDouble: [Failed]
Falsifiable with seed -2437412091130261666, after 30 tests. Reason: Falsifiable
encodeInteger: [OK, passed 100 tests]
roundTrip:
roundTripBool: [OK, passed 100 tests]
roundTripDouble: [OK, passed 100 tests]
roundTripInteger: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]
Either Integer Integer: [OK, passed 100 tests]

         Properties  Total
Passed   9           9
Failed   1           1
Total    10          10

ghc 7.2.1 gentoo amd64:
make -j8 -C /var/tmp/portage/dev-haskell/aeson-0.3.2.12/work/aeson-0.3.2.12/tests/
make: Entering directory `/var/tmp/portage/dev-haskell/aeson-0.3.2.12/work/aeson-0.3.2.12/tests'
ghc -hide-package aeson-native -Wall -o -threaded --make qc Properties.hs
[1 of 1] Compiling Main ( Properties.hs, Properties.o )
Linking qc ...
make: Leaving directory `/var/tmp/portage/dev-haskell/aeson-0.3.2.12/work/aeson-0.3.2.12/tests'
encode:
1.0
22.0
encodeDouble: [Failed]
Falsifiable with seed 6747498403742802801, after 30 tests. Reason: Falsifiable
encodeInteger: [OK, passed 100 tests]
roundTrip:
roundTripBool: [OK, passed 100 tests]
roundTripDouble: [OK, passed 100 tests]
roundTripInteger: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]
Either Integer Integer: [OK, passed 100 tests]

         Properties  Total
Passed   9           9
Failed   1           1
Total    10          10

  • ERROR: dev-haskell/aeson-0.3.2.12 failed (test phase):
  • tests failed

Why does decodeWith use parse instead of parseOnly?

From my reading of the code, it seems that parsing could fail even when there's more input to consume, and consuming that input could make the parse succeed. It seems to me that decodeWith should use Attoparsec.parseOnly instead of Attoparsec.parse.

Data.Aeson.Types.object

Just a minor thing, but would it be possible for object to have type
ToJSON a ⇒ [(Text, a)] → Value
instead of
[(Text, Value)] → Value
?

Thank you

aeson-0.3.2.10 compilation broken

When trying to build the latest aeson-0.3.2.10 package w/ ghc 7.0:

Configuring aeson-0.3.2.10...
Preprocessing library aeson-0.3.2.10...
Building aeson-0.3.2.10...
[1 of 6] Compiling Data.Aeson.Functions ( Data/Aeson/Functions.hs, dist/build/Data/Aeson/Functions.o )
[2 of 6] Compiling Data.Aeson.Types ( Data/Aeson/Types.hs, dist/build/Data/Aeson/Types.o )
[3 of 6] Compiling Data.Aeson.Generic ( Data/Aeson/Generic.hs, dist/build/Data/Aeson/Generic.o )
[4 of 6] Compiling Data.Aeson.Parser ( Data/Aeson/Parser.hs, dist/build/Data/Aeson/Parser.o )
[5 of 6] Compiling Data.Aeson.Encode ( Data/Aeson/Encode.hs, dist/build/Data/Aeson/Encode.o )

Data/Aeson/Encode.hs:74:33:
    No instance for (Data.String.IsString Builder)
      arising from the literal `"null"'
    Possible fix:
      add an instance declaration for (Data.String.IsString Builder)
    In the expression: "null"
    In an equation for `fromNumber':
        fromNumber (D d)
          | isNaN d || isInfinite d = "null"
          | otherwise = double d
cabal: Error: some packages failed to install:
aeson-0.3.2.10 failed during the building phase. The exception was:
ExitFailure 1

Space leak in `Data.Aeson.Parser`

This issue is mostly for documenting a known problem and a possible fix/workaround.

The current parser implementation in Data/Aeson/Parser creates many thunks for records and arrays during parsing. Those thunks cause measurable overhead in heap usage and performance, which is already visible for some of the JSON documents contained in the Aeson benchmark suite. For even larger JSON documents in the MiB range, the overhead becomes even more substantial. Compared to Python's parser, Aeson then is a few times slower.

The benchmark suite itself doesn't evaluate those thunks, unless the following diff is applied:

diff --git a/benchmarks/AesonParse.hs b/benchmarks/AesonParse.hs
index a05ed5f..9e74e56 100644
--- a/benchmarks/AesonParse.hs
+++ b/benchmarks/AesonParse.hs
@@ -24,7 +24,9 @@ main = do
           let refill = B.hGet h 16384
           result <- parseWith refill json =<< refill
           case result of
-            Done _ r -> loop (good+1) bad
+            Done _ r -> do
+                        evaluate $ rnf r 
+                        loop (good+1) bad
             _        -> loop good (bad+1)
     (good, _) <- loop 0 0
     delta <- flip diffUTCTime start `fmap` getCurrentTime

The following modification removes the space-leak in Data.Aeson.Parser:

diff --git a/Data/Aeson/Parser.hs b/Data/Aeson/Parser.hs
index ba7b135..8bbe07e 100644
--- a/Data/Aeson/Parser.hs
+++ b/Data/Aeson/Parser.hs
@@ -55,19 +55,23 @@ object_ = {-# SCC "object_" #-} do
         b <- char ':' *> skipSpace *> value
         return (a,b)
   vals <- ((pair <* skipSpace) `sepBy` (char ',' *> skipSpace)) <* char '}'
-  return . Object $ Map.fromList vals
+  return $! Object $! Map.fromList vals

 array_ :: Parser Value
 array_ = {-# SCC "array_" #-} do
   skipSpace
   vals <- ((value <* skipSpace) `sepBy` (char ',' *> skipSpace)) <* char ']'
-  return . Array $ Vector.fromList vals
+  return $! Array $! Vector.fromList vals

 -- | Parse any JSON value.  Use 'json' in preference to this function
 -- if you are parsing data from an untrusted source.
 value :: Parser Value
-value = most <|> (Number <$> number)
+value = most <|> fallback
  where
+  fallback = do
+    n <- number
+    return $! Number $! n
+
   most = do
     c <- satisfy (`B8.elem` "{[\"ftn")
     case c of

This leads to the following benchmark results:

-- lazy aeson-0.3.2.12 w/ 'evaluate $ rnf r' benchmark
$ ./AesonParse 700 json-data/jp100.json 
json-data/jp100.json:
  700 good, 3.459478s
  202 per second

-- strictified aeson-0.3.2.12 w/ 'evaluate $ rnf r' benchmark
$ ./AesonParse 700 json-data/jp100.json 
json-data/jp100.json:
  700 good, 3.176462s
  220 per second

-- Python 2.7
$ ./parse.py 700 json-data/jp100.json 
json-data/jp100.json:
  1.49782395363

Aeson does not support Maybe on nested records

This may be related to some of the other issues listed. I am unable to mark fields on nested records as Maybe. It compiles fine but requires the fields to be present in the JSON at runtime.

{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
import Data.Aeson
import Data.Text
import qualified Data.ByteString.Lazy as BL
import Data.ByteString.Lazy.Char8 ()

import GHC.Generics
import Data.Attoparsec.ByteString.Lazy as ABl

data RootRecord = RootRecord{
  field1 :: Maybe Text,
  field2 :: SubRecord
} deriving (Generic,Show)

data SubRecord = SubRecord{
  field3 :: Maybe Text
} deriving (Generic, Show)

instance FromJSON RootRecord
instance FromJSON SubRecord

main = do
  let jsonString :: BL.ByteString = "{ \"field1\" :\"foo\", \"field2\": {}}"
  let val :: Maybe (Value) = maybeResult $ ABl.parse json' jsonString
  foo <- case val of
        Just x -> 
          case fromJSON x of
            Success y -> return y
            Error str -> fail str
        Nothing -> error "No Parse"
  print (foo :: RootRecord)

Serializing integers lossy

Serializing and de-serializing the primitive Int type leads to confusing results:

let orig = [minBound, 00, maxBound :: Int]
> orig
[-9223372036854775808,0,9223372036854775807]
> (fromJust $ fromJSON  $ fromJust $ maybeResult $ parse json $ s2l $ encode orig) `asTypeOf` orig
[-9223372036854775808,0,-9223372036854775808]

Parsing JSON which isn't conforming to RFC

There are a bunch of places on the web where you can find JSON that deviates from the standard.
A rather typical example would be { a: 42, b: 'hello world' }

I'd like to see Aeson's parser relaxed to cover such bad examples. Your thoughts?

Serializing UTCTime

The default ToJSON instance for serializing UTCTime doesn't use the ISO-8601 subset described in ECMA-262:

> t <- getCurrentTime
t :: UTCTime
> encode t
Chunk "\"/Date(1296603856)/\"" Empty
it :: Data.ByteString.Lazy.Internal.ByteString
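
For example, an ECMA-262-style ISO-8601 serializer could look like this (a sketch, not what the library ships):

import Data.Aeson (Value(..))
import Data.Text (pack)
import Data.Time (UTCTime, formatTime)
import System.Locale (defaultTimeLocale)  -- old-locale; newer time versions export this themselves

-- Renders e.g. "2011-02-01T23:44:16.123Z" instead of "/Date(1296603856)/".
utcTimeToJSON :: UTCTime -> Value
utcTimeToJSON t = String (pack (formatTime defaultTimeLocale "%FT%T%QZ" t))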

A general serializer for Data.Map

I don't know if the idea is good but look at this dirty proof of concept code:

class ToPropertyName a where
    toPropertyName :: a -> String

instance (ToPropertyName a, ToJSON b) => ToJSON (M.Map a b) where
    toJSON = toJSON . M.mapKeys toPropertyName  

In many languages (think of JS, PHP, Perl) map keys can only be strings. So people serialize other values to strings to get the lookup performance of native associative containers in their languages. To interoperate with those people a Haskeller needs to parse their compound map keys (property names in ECMAScript parlance) into a more type safe form.

Of course one can put compound values in any place, but property name is a special case because of performance, so it may deserve a special treatment in Aeson.

Also, in Haskell people use enumerations and newtypes isomorphic to strings in places where in poorer languages they have to use strings. So it's good to have an ability to use an enumeration (I mean data Foo = Bar | Baz | Quux) for map keys, either by detecting enumerations during TH derivation or by allowing people to use simple instances like toPropertyName = show instead of having to implement a full instance for Data.Map Foo Something
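
The decoding half could follow the same shape. A sketch (FromPropertyName and this instance are hypothetical, would overlap with aeson's existing Map instances, and assume Object is a HashMap as in recent aeson):

import Control.Applicative ((<$>), (<*>))
import Data.Aeson (FromJSON(..), Value(..))
import Data.Aeson.Types (Parser, typeMismatch)
import Data.Text (unpack)
import qualified Data.HashMap.Strict as H
import qualified Data.Map as M

-- Parse a JSON property name back into a richer key type.
class FromPropertyName a where
    fromPropertyName :: String -> Parser a

instance (Ord k, FromPropertyName k, FromJSON v) => FromJSON (M.Map k v) where
    parseJSON (Object o) = M.fromList <$> mapM parsePair (H.toList o)
      where parsePair (k, v) = (,) <$> fromPropertyName (unpack k) <*> parseJSON v
    parseJSON v = typeMismatch "Map" v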

What do you think?

Ignore unexpected values when decoding (with TH derived instances / deriveFromJSON)

Let's start with an example. Here is some Haskell code and JSON input.

data SomeType = SomeType {
  foo :: String
, bar :: String
} deriving (Eq, Show, Data, Typeable)

$(deriveFromJSON id ''SomeType)
{
    "foo": "one",
    "bar": "two",
    "baz": "three"
}

Decoding this (using the TH-derived instance) with Data.Aeson.decode will fail due to the unexpected baz.
It works if I use Data.Aeson.Generic.decode, though.

The reason why I want to use the derived instance is because I need to customize field names.

Would it be possible to ignore unexpected values in derived instances (like the generic decode does)? If it's not suitable as the default behavior, would it still be possible to provide it as an option?

On the other hand, allowing field names to be customized with Data.Aeson.Generic.decode would also solve my current need. I'll take a look at the code right now. Would you accept a patch for that? (edit: I think this approach does not work that well with nested data types.)

Add (.=?) :: Text -> Maybe a -> Pair constructor?

This operator would allow key-value pairs in an object's list to generate JSON only if the value satisfies certain conditions (e.g. a pair is omitted when its value is Nothing).

Here is an example:

{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson

data Foo = Foo
  { a :: Maybe Int
  , b :: Maybe Int
  }

instance ToJSON Foo where
  toJSON (Foo a b) = object [ "a" .= a, "b" .=? b ]
*Main> encode $ Foo (Just 1) (Just 2)
Chunk "{\"a\":1,\"b\":2}" Empty
*Main> encode $ Foo (Just 1) Nothing
Chunk "{\"a\":1}" Empty
*Main> encode $ Foo Nothing Nothing
Chunk "{\"a\":null}" Empty

And one possible implementation:

diff --git a/Data/Aeson.hs b/Data/Aeson.hs
index d3db4c4..c288b93 100644
--- a/Data/Aeson.hs
+++ b/Data/Aeson.hs
@@ -30,6 +30,7 @@ module Data.Aeson
     , ToJSON(..)
     -- * Constructors and accessors
     , (.=)
+    , (.=?)
     , (.:)
     , (.:?)
     , (.!=)
diff --git a/Data/Aeson/Generic.hs b/Data/Aeson/Generic.hs
index 0c312b0..4349bc3 100644
--- a/Data/Aeson/Generic.hs
+++ b/Data/Aeson/Generic.hs
@@ -170,14 +170,14 @@ toJSON_generic = generic
         -- Use an array if the are no field names, but elide singleton arrays,
         -- and use an object if there are field names.
         encodeConstr c [] = String . constrString $ c
-        encodeConstr c as = object [(constrString c, encodeArgs c as)]
+        encodeConstr c as = object [Just (constrString c, encodeArgs c as)]

         constrString = pack . showConstr

         encodeArgs c = encodeArgs' (constrFields c)
         encodeArgs' [] [j] = j
         encodeArgs' [] js  = Array . V.fromList $ js
-        encodeArgs' ns js  = object $ zip (map pack ns) js
+        encodeArgs' ns js  = object $ map Just $ zip (map pack ns) js


 fromJSON :: (Data a) => Value -> Result a
diff --git a/Data/Aeson/Types.hs b/Data/Aeson/Types.hs
index 178a69f..1a1b502 100644
--- a/Data/Aeson/Types.hs
+++ b/Data/Aeson/Types.hs
@@ -34,6 +34,7 @@ module Data.Aeson.Types
     , ToJSON(..)
     -- * Constructors and accessors
     , (.=)
+    , (.=?)
     , (.:)
     , (.:?)
     , (.!=)
diff --git a/Data/Aeson/Types/Class.hs b/Data/Aeson/Types/Class.hs
index 8ae9704..fb38c22 100644
--- a/Data/Aeson/Types/Class.hs
+++ b/Data/Aeson/Types/Class.hs
@@ -1,6 +1,7 @@
 {-# LANGUAGE CPP, DeriveDataTypeable, FlexibleContexts, FlexibleInstances,
     GeneralizedNewtypeDeriving, IncoherentInstances, OverlappingInstances,
-    OverloadedStrings, UndecidableInstances, ViewPatterns #-}
+    OverloadedStrings, UndecidableInstances, ViewPatterns, MultiParamTypeClasses,
+    FunctionalDependencies #-}

 #ifdef GENERICS
 {-# LANGUAGE DefaultSignatures #-}
@@ -36,6 +37,7 @@ module Data.Aeson.Types.Class
     , (.:?)
     , (.!=)
     , (.=)
+    , (.=?)
     , typeMismatch
     ) where

@@ -711,11 +713,21 @@ instance FromJSON a => FromJSON (Last a) where
     parseJSON = fmap Last . parseJSON
     {-# INLINE parseJSON #-}

+class ToMaybe f a | f -> a where
+  toMaybe :: f -> Maybe a
+
+instance ToMaybe (Maybe a) a where
+  toMaybe = id
+
 -- | Construct a 'Pair' from a key and a value.
 (.=) :: ToJSON a => Text -> a -> Pair
-name .= value = (name, toJSON value)
+name .= value = Just (name, toJSON value)
 {-# INLINE (.=) #-}

+(.=?) :: (ToJSON a, ToMaybe f a) => Text -> f -> Pair
+name .=? value = maybe Nothing (name .=) $ toMaybe value
+{-# INLINE (.=?) #-}
+
 -- | Convert a value from JSON, failing if the types do not match.
 fromJSON :: (FromJSON a) => Value -> Result a
 fromJSON = parse parseJSON
diff --git a/Data/Aeson/Types/Internal.hs b/Data/Aeson/Types/Internal.hs
index 974e1b5..f31aef3 100644
--- a/Data/Aeson/Types/Internal.hs
+++ b/Data/Aeson/Types/Internal.hs
@@ -43,6 +43,7 @@ import Data.Typeable (Typeable)
 import Data.Vector (Vector)
 import qualified Data.HashMap.Strict as H
 import qualified Data.Vector as V
+import Data.Maybe (catMaybes)

 -- | The result of running a 'Parser'.
 data Result a = Error String
@@ -213,11 +214,11 @@ parseEither :: (a -> Parser b) -> a -> Either String b
 parseEither m v = runParser (m v) Left Right

 -- | A key\/value pair for an 'Object'.
-type Pair = (Text, Value)
+type Pair = Maybe (Text, Value)

 {-# INLINE parseEither #-}
 -- | Create a 'Value' from a list of name\/value 'Pair's.  If duplicate
 -- keys arise, earlier keys and their associated values win.
 object :: [Pair] -> Value
-object = Object . H.fromList
+object = Object . H.fromList . catMaybes
 {-# INLINE object #-}

It should work for each instance of ToMaybe.

Pruning Nulls from TH "toJSON" instances

I like it when "{}" can be a valid encoding of data A = A { unA :: Maybe Bool }. The current TH generator will create a JSON Value like Object (fromList [("unA", Null)]).

I feel as though the decode step should succeed even if the encode step still produces the more formal encoding A Nothing <-> "{\"unA\" : null}". I don't feel like this violates the RFC—it's more about how Haskell types are interpreted as schemata—but I also don't know what the expectations of the default (boilerplate-free) encodings are.

For instance, I feel like for any TH instance (really any instance) the rule decode . encode == Just ought to hold, but I feel like fmap encode . decode == Just for all valid encodings is too strong (ignoring the ambiguous polymorphism there). Instead, fmap encode . decode ought to just be idempotent, i.e.

roundtripAs z = fmap encode . (flip asTypeOf $ Just z) . decode

-- This holds on more valid decodes than a similar `prop_isomorphism` would
prop_idem z = \x -> (roundtripAs z >-> roundtripAs z) x == roundtripAs z x

which allows for decode to be a lot more liberal.

`fromJSON . toJSON == id` does not hold for `Either a a`

The ToJSON and FromJSON instances of Either ignore the Left and Right tags:

instance (ToJSON a, ToJSON b) => ToJSON (Either a b) where
    toJSON (Left a)  = toJSON a
    toJSON (Right b) = toJSON b

instance (FromJSON a, FromJSON b) => FromJSON (Either a b) where
    parseJSON a = Left <$> parseJSON a <|> Right <$> parseJSON a

This violates the very nice property: fromJSON . toJSON == id:

Data.Aeson> fromJSON (toJSON (Right 3 :: Either Int Int)) :: Result (Either Int Int)
Success (Left 3)

Note that the generic implementations of toJSON and fromJSON do satisfy the property:

> import qualified Data.Aeson.Generic as G
Data.Aeson.Generic> G.fromJSON (G.toJSON (Right 3 :: Either Int Int)) :: Result (Either Int Int)
Success (Right 3)

Is there a good reason for not following the property?

Thanks,

Bas

genericToFromJSON: _UFoo: [Failed] in current git tree

With ghc 7.0.4 on linux amd64, the genericToFromJSON _UFoo test fails in the current git tree:

argus tests # ./qc
encode:
encodeDouble: [OK, passed 100 tests]
encodeInteger: [OK, passed 100 tests]
genericFrom:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
genericTo:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
roundTrip:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Integer: [OK, passed 100 tests]
String: [OK, passed 100 tests]
Text: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]

Either Integer Integer: [OK, passed 100 tests]
genericToFromJSON:
_UFoo: [Failed]
Falsifiable with seed 5308406286099367168, after 1 tests. Reason: Falsifiable

         Properties  Total
Passed   22          22
Failed   1           1
Total    23          23
argus tests # ./qc
encode:
encodeDouble: [OK, passed 100 tests]
encodeInteger: [OK, passed 100 tests]
genericFrom:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
genericTo:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
roundTrip:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Integer: [OK, passed 100 tests]
String: [OK, passed 100 tests]
Text: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]

Either Integer Integer: [OK, passed 100 tests]
genericToFromJSON:
_UFoo: [Failed]
Falsifiable with seed 8620178254206185358, after 1 tests. Reason: Falsifiable

         Properties  Total
Passed   22          22
Failed   1           1
Total    23          23
argus tests # ./qc
encode:
encodeDouble: [OK, passed 100 tests]
encodeInteger: [OK, passed 100 tests]
genericFrom:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
genericTo:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
roundTrip:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Integer: [OK, passed 100 tests]
String: [OK, passed 100 tests]
Text: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]
UFoo {_UFooInt = -1, uFooInt = -1}
Either Integer Integer: [OK, passed 100 tests]
genericToFromJSON:
_UFoo: [Failed]
Falsifiable with seed 1349346501665683387, after 1 tests. Reason: Falsifiable

         Properties  Total
Passed   22          22
Failed   1           1
Total    23          23
argus tests # ./qc
encode:
encodeDouble: [OK, passed 100 tests]
encodeInteger: [OK, passed 100 tests]
genericFrom:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
genericTo:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
roundTrip:
Bool: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Int: [OK, passed 100 tests]
Integer: [OK, passed 100 tests]
String: [OK, passed 100 tests]
Text: [OK, passed 100 tests]
Foo: [OK, passed 100 tests]
toFromJSON:
Integer: [OK, passed 100 tests]
Double: [OK, passed 100 tests]
Maybe Integer: [OK, passed 100 tests]
Either Integer Double: [OK, passed 100 tests]

Either Integer Integer: [OK, passed 100 tests]
genericToFromJSON:
_UFoo: [Failed]
Falsifiable with seed 5534180966496665992, after 1 tests. Reason: Falsifiable

         Properties  Total
Passed   22          22
Failed   1           1
Total    23          23
argus tests #

Parser helper to extract data to simple types

A decode alternative that takes a parser and applies it to JSON data to extract simple types, without defining custom data types.

extract :: (Value -> Parser a) -> ByteString -> Either String a
extract parser dat = either Left (parseEither parser) $ parseOnly value dat

Example usage: http://hpaste.org/77970

Currently it takes 3 import lines (lines 8-10 in the paste) added to the module boilerplate to do such a simple thing.

decode strict bytestrings?

Several times I've run into the situation where I'd like to use Aeson to decode strict bytestrings, as in the case of decoding many very small JSON files, or when reading from and then writing to the same file, in which case the lazy readFile seems to keep the handle open too long in some instances.

I've been using this function, which is just a slight variation on Aeson's decodeWith:

import qualified Data.ByteString as B
import qualified Data.Aeson as A
import qualified Data.Attoparsec as AP

decodeStrict :: A.FromJSON a => B.ByteString -> Maybe a
decodeStrict bs = 
    case AP.parse A.json' bs of
      AP.Done _ v -> case A.fromJSON v of
                       A.Success a -> Just a
                       _            -> Nothing
      _           -> Nothing

It would be nice to have this in the core Aeson package. I can put together a pull request if there's no objection.

GenericSYB doesn't parse constructors with no args

When I have
data CommandForBench = BenchCommandConnectGUI
decode (encode BenchCommandConnectGUI) gives Nothing.

When I change to
data CommandForBench = BenchCommandConnectGUI ()
everything works as expected.

Cheers
Jürgen

cabal install aeson fails with hashable < 1.1.0.0

I believe hashable-1.1.0.0 was the first version to introduce Text as an instance of Hashable.

$ cabal install aeson
Resolving dependencies...
Configuring aeson-0.3.2.1...
Preprocessing library aeson-0.3.2.1...
Building aeson-0.3.2.1...
[1 of 9] Compiling Data.Aeson.Encode.Int ( Data/Aeson/Encode/Int.hs, dist/build/Data/Aeson/Encode/Int.o )
[2 of 9] Compiling Data.Aeson.Encode.Double ( Data/Aeson/Encode/Double.hs, dist/build/Data/Aeson/Encode/Double.o )
[3 of 9] Compiling Data.Aeson.Functions ( Data/Aeson/Functions.hs, dist/build/Data/Aeson/Functions.o )
[4 of 9] Compiling Data.Aeson.Encode.Number ( Data/Aeson/Encode/Number.hs, dist/build/Data/Aeson/Encode/Number.o )
[5 of 9] Compiling Data.Aeson.Types ( Data/Aeson/Types.hs, dist/build/Data/Aeson/Types.o )

Data/Aeson/Types.hs:551:27:
    No instance for (Data.Hashable.Hashable Text)
      arising from a use of `H.fromList' at Data/Aeson/Types.hs:551:27-36
    Possible fix:
      add an instance declaration for (Data.Hashable.Hashable Text)
    In the first argument of `(<$>)', namely `H.fromList'
    In the expression: H.fromList <$> mapM go (M.toList o)
    In the definition of `parseJSON':
        parseJSON (Object o)
          = H.fromList <$> mapM go (M.toList o)
          where
              go (k, v) = ((,) k) <$> parseJSON v

Data/Aeson/Types.hs:559:22:
    No instance for (Data.Hashable.Hashable LT.Text)
      arising from a use of `mapHash' at Data/Aeson/Types.hs:559:22-42
    Possible fix:
      add an instance declaration for (Data.Hashable.Hashable LT.Text)
    In the first argument of `fmap', namely `(mapHash LT.fromStrict)'
    In the first argument of `(.)', namely `fmap (mapHash LT.fromStrict)'
    In the expression: fmap (mapHash LT.fromStrict) . parseJSON
cabal: Error: some packages failed to install:
aeson-0.3.2.1 failed during the building phase. The exception was:
ExitFailure 1

Enhance representation of Numbers

This issue actually covers two items:

  1. optimize memory layout for small integers
  2. add support for arbitrary precision fractional values

  1. Currently, in aeson's IR, small integers require 3 heap objects to be represented:
data Data.Aeson.Types.Value
  = ...
  | Data.Aeson.Types.Number Data.Attoparsec.Number.Number

data Data.Attoparsec.Number.Number
  = Data.Attoparsec.Number.I !Integer
  | Data.Attoparsec.Number.D {-# UNPACK #-} !Double

data Integer
  = integer-gmp:GHC.Integer.Type.S# GHC.Prim.Int#
  | integer-gmp:GHC.Integer.Type.J# GHC.Prim.Int# GHC.Prim.ByteArray#

When representing a list of 1000 small integer values, this requires 3000 heap objects, each of which requires 2 words, leading to a space requirement of 6000 words (= ~48 KiB on 64bit). This could be reduced by 1/3 down to 4000 words by adding an additional constructor to Number:

data Data.Attoparsec.Number.Number
  = Data.Attoparsec.Number.I !Int
  | Data.Attoparsec.Number.IB !Integer
  | Data.Attoparsec.Number.D {-# UNPACK #-} !Double

and adapting attoparsec and aeson to use the I constructor if the parsed integer is representable with a plain Int.


  2. Arbitrary-precision decimals

Currently a decode followed by an encode via aeson is not loss-free, as it has to reduce the parsed decimal to the numbers representable by the Double type. Here's an example that shows the problem already occurring with small values, as they occur in monetary applications:

{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson.Parser
import Data.Aeson
import qualified Data.Attoparsec as AP
import qualified Data.ByteString.Lazy as BL
import qualified Data.ByteString as B

toStrictBS = B.concat . BL.toChunks

main = do
  let x = AP.parseOnly json "[1.62,1.64,1.82]"
  print x
  print $ toStrictBS (encode x)

when executed this outputs:

Right (Array fromList [Number 1.62,Number 1.6400000000000001,Number 1.8199999999999998] :: Data.Vector.Vector)
"[1.62,1.6400000000000001,1.8199999999999998]"

An approach similar to the one in 1. could be used here: use the Double type when the parsed value can be properly represented by it, and otherwise switch to an arbitrary-precision representation, for instance something optimized for base 10 such as

data Decimal = Decimal { decSignificant, decExponent :: !Integer }

toRational (Decimal s e) | e >= 0     = toRational (s * 10^e)
                         | otherwise  = s % (10 ^ (-e))

Incorrect `fromJSON` parsing of `Nothing`-values in Generic module

{-# LANGUAGE DeriveDataTypeable #-}
import Data.Aeson.Generic
import Data.Aeson hiding (fromJSON, toJSON)
import Data.Generics

data A = A { a :: Maybe String } 
  deriving (Typeable, Data, Show)

main = return $ (fromJSON $ toJSON $ A Nothing :: Result A)

results in Error "Data.Aeson.Generic.parseJSON: bad decodeArgs data (Nothing,Null)"

v. 0.6.0.2

incorrect "fromJSON" parsing of Data.Set.Set

Sets are rendered as lists, but not parsed back into sets. This may be related to #101 and #102. I just pulled HEAD, but the problem persists. To reproduce:

{-# LANGUAGE DeriveDataTypeable          #-}

import Control.Applicative ((<$>))
import Data.Aeson.Generic as A
import Data.Aeson.Parser (value)
import Data.Aeson.Types
import Data.Data
import Data.Set
import Data.Typeable

import Data.Attoparsec.Lazy as Parser

data Document =
    Document
      { fromDocument :: Set String
      }
  deriving (Eq, Ord, Show, Read, Data, Typeable)

x1 = A.encode $ Document empty
y1 = A.decode x1 :: Maybe Document

y2 :: Parser.Result (Data.Aeson.Types.Result Document)
y2 = fmap A.fromJSON $ Parser.parse value x1
*Main> x1
Chunk "{\"fromDocument\":[]}" Empty
*Main> y1
Nothing
*Main> y2
Done Empty Error "Data.Aeson.Generic.parseJSON: NoRep(DataType {tycon = \"Data.Set.Set\", datarep = NoRep})"

Drop the "key not found" exception in favor of monoid empty value

Please consider this record type

data A = A { a :: Maybe String, b :: Int, c :: [Int] }

What's common about its field types is that they all have Monoid instances.

What I propose is that, instead of failing the parser with a "key not found" exception when a JSON value with some missing fields is provided (for instance, {"a": "abc", "b": 3} - field "c" is missing), we provide the Monoid's empty value for the type. So for input:

{"a": "abc", "b": 3}

the decoding would result in

A (Just "abc") 3 []

and for an absolutely empty object

{}

it would result in all the monoids getting empty values, i.e:

A Nothing 0 []

The parser should, however, still fail for types that don't have Monoid instances.
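
A sketch of a combinator implementing this fallback (the name (.:|) is made up here, and Object is assumed to be a HashMap as in recent aeson):

import Data.Aeson (FromJSON, parseJSON)
import Data.Aeson.Types (Object, Parser)
import Data.Monoid (Monoid, mempty)
import Data.Text (Text)
import qualified Data.HashMap.Strict as H

-- Missing key: fall back to the type's mempty; present key: parse as usual.
(.:|) :: (FromJSON a, Monoid a) => Object -> Text -> Parser a
obj .:| key = case H.lookup key obj of
                Nothing -> return mempty
                Just v  -> parseJSON v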

Implementing this would also open up the possibility of greatly reducing the size of the generated JSON, which would bring all kinds of benefits, such as reduced traffic and storage.

Can we get better error messages?

Today I tried to decode a "1" to an Int, but it failed, because apparently only objects and arrays are top-level JSON (this is not said explicitly on http://json.org though).

eitherDecode ("1") :: Either String Int
Left "Failed reading: satisfy"

This took me quite some time. Is it possible to have slightly more informative error messages than this?

Performance problem or benchmarking issue?

Hi,

I'm trying to compare the performance of json (Text.JSON) and aeson, and I get surprising numbers. I apologise in advance if in fact the problem is with my benchmark setup rather than an actual performance issue.

I have the following benchmark program:

import Criterion.Main
import Control.DeepSeq
import qualified Data.ByteString.Lazy as BL
import qualified Text.JSON as J
import qualified Data.Aeson as A

instance (NFData v) => NFData (J.JSObject v) where
  rnf o = rnf (J.fromJSObject o)

instance NFData J.JSValue where
  rnf J.JSNull = ()
  rnf (J.JSBool b) = rnf b
  rnf (J.JSRational a b) = rnf a `seq` rnf b `seq` ()
  rnf (J.JSString s) = rnf (J.fromJSString s)
  rnf (J.JSArray lst) = rnf lst
  rnf (J.JSObject o) = rnf o

encodeJ = length . J.encode

encodeA = BL.length . A.encode

decodeJ :: String -> J.JSObject J.JSValue
decodeJ s =
  case J.decodeStrict s of
    J.Ok v -> v
    J.Error e -> error "fail to parse via JSON"

decodeA :: BL.ByteString -> A.Value
decodeA s = case A.decode' s of
              Nothing -> error "fail to parse via Aeson"
              Just v -> v

main = do
  js <- readFile "config.data"
  as <- BL.readFile "config.data"
  let jdata = decodeJ js
      adata = decodeA as
  defaultMain [
        bgroup "decode" [ bench "json"  $ nf decodeJ js
                        , bench "aeson" $ nf decodeA as
                        ],
        bgroup "encode" [ bench "json"  $ nf encodeJ jdata
                        , bench "aeson" $ nf encodeA adata
                        ]
       ]

Run on an approximately 1.1 MB JSON input file, with ghc 6.12.1, aeson 0.4.0, and json 0.4.3, it gives the following:

decode/json  mean: 142.8555 ms, lb 142.5682 ms, ub 143.7004 ms, ci 0.950
decode/aeson mean: 144.2851 ms, lb 142.2344 ms, ub 146.7814 ms, ci 0.950
encode/json  mean: 45.12455 ms, lb 44.05486 ms, ub 46.67985 ms, ci 0.950
encode/aeson mean: 56.76156 ms, lb 56.71679 ms, ub 56.81222 ms, ci 0.950

I would expect aeson to be faster, but the numbers are really really close, so I'm not sure what I'm doing wrong here. Any hints? It almost looks like I'm not testing the actual encoding/decoding.

Unfortunately I can't provide the actual JSON file easily, but I can try to give a sanitised one if that's needed to help debug the issue.
