This package has been deprecated in favor of http-conduit.
HTTP client package with enumerator interface and HTTPS support.
Yet again (I had something similar with this particular site in #11 a while ago), performing a simple simpleHttp "https://t4a.box-world.org/" fails with an exception:
GHCi, version 7.2.2: http://www.haskell.org/ghc/ :? for help
...
Prelude> import Network.HTTP.Enumerator
(0.08 secs, 32915624 bytes)
Prelude Network.HTTP.Enumerator> simpleHttp "https://t4a.box-world.org/"
Loading package filepath-1.2.0.1 ... linking ... done.
Loading package old-locale-1.0.0.3 ... linking ... done.
Loading package old-time-1.0.0.7 ... linking ... done.
Loading package unix-2.5.0.0 ... linking ... done.
Loading package directory-1.1.0.1 ... linking ... done.
Loading package time-1.2.0.5 ... linking ... done.
Loading package bytestring-0.9.2.0 ... linking ... done.
Loading package array-0.3.0.3 ... linking ... done.
Loading package deepseq-1.2.0.1 ... linking ... done.
Loading package containers-0.4.2.0 ... linking ... done.
Loading package text-0.11.1.9 ... linking ... done.
Loading package attoparsec-0.10.0.3 ... linking ... done.
Loading package transformers-0.2.2.0 ... linking ... done.
Loading package enumerator-0.4.16 ... linking ... done.
Loading package attoparsec-enumerator-0.3 ... linking ... done.
Loading package mtl-2.0.1.0 ... linking ... done.
Loading package asn1-data-0.6.1.1 ... linking ... done.
Loading package base64-bytestring-0.1.0.3 ... linking ... done.
Loading package blaze-builder-0.3.0.1 ... linking ... done.
Loading package blaze-builder-enumerator-0.2.0.3 ... linking ... done.
Loading package hashable-1.1.2.2 ... linking ... done.
Loading package case-insensitive-0.4 ... linking ... done.
Loading package cereal-0.3.4.0 ... linking ... done.
Loading package entropy-0.2.1 ... linking ... done.
Loading package largeword-1.0.1 ... linking ... done.
Loading package dlist-0.5 ... linking ... done.
Loading package data-default-0.3.0 ... linking ... done.
Loading package semigroups-0.8 ... linking ... done.
Loading package tagged-0.2.3.1 ... linking ... done.
Loading package crypto-api-0.8 ... linking ... done.
Loading package crypto-pubkey-types-0.1.0 ... linking ... done.
Loading package certificate-1.0.1 ... linking ... done.
Loading package primitive-0.4.0.1 ... linking ... done.
Loading package vector-0.9 ... linking ... done.
Loading package cryptocipher-0.3.0 ... linking ... done.
Loading package random-1.0.1.0 ... linking ... done.
Loading package cprng-aes-0.2.3 ... linking ... done.
Loading package failure-0.1.0.1 ... linking ... done.
Loading package http-types-0.6.7 ... linking ... done.
Loading package base-unicode-symbols-0.2.2.2 ... linking ... done.
Loading package monad-control-0.2.0.3 ... linking ... done.
Loading package parsec-3.1.2 ... linking ... done.
Loading package network-2.3.0.7 ... linking ... done.
Loading package cryptohash-0.7.4 ... linking ... done.
Loading package tls-0.8.3 ... linking ... done.
Loading package tls-extra-0.4.1 ... linking ... done.
Loading package utf8-string-0.3.7 ... linking ... done.
Loading package zlib-0.5.3.1 ... linking ... done.
Loading package zlib-bindings-0.0.1 ... linking ... done.
Loading package zlib-enum-0.2.1 ... linking ... done.
Loading package http-enumerator-0.7.1.7 ... linking ... done.
*** Exception: sendData: end of file
Prelude Network.HTTP.Enumerator>
I couldn't find out how to control SSL certificate verification:
> simpleHttp "https://74.125.232.113/"
Chunk "<!doctype html> …
As opposed to:
$ curl https://74.125.232.113
curl: (51) SSL: certificate subject name 'www.google.com' does not match target host name '74.125.232.113'
Could you please either document this or implement it?
thanks!
iustin
The following program demonstrates that http-enumerator sometimes does not close the sockets that it opened. This is a problem because it means that we run out of file descriptors at some point.
-- put into a file OpenFileTest.hs
import Prelude hiding (catch)
import Network.HTTP.Enumerator
import Control.Concurrent
import Control.Exception
import qualified Data.ByteString.Lazy as BSL
main =
    do mapM_ (\_ -> simpleHttp "http://localhost/index.html" `catch` handle) [0..100]
       threadDelay 1000000000000000
       putStrLn "Done"
    where
      handle :: SomeException -> IO BSL.ByteString
      handle e =
          do putStrLn (show e)
             return BSL.empty
Make sure that you don't have a web server running at localhost. Then compile the program with
ghc -package http-enumerator-0.6.5 --make -fforce-recomp OpenFileTest.hs
and run it. On Linux, I verified with the lsof command that http-enumerator leaves all its sockets open:
lsof | grep $(ps -A | gawk '/OpenFileTest/ {print $1}' | head -1)
Here's the output:
OpenFileT 23304 swehr cwd DIR 0,19 4096 2788545 /home/swehr/tmp (10.10.2.4:/var/nfs/homes)
OpenFileT 23304 swehr rtd DIR 0,14 4096 2 / (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr txt REG 0,19 13155174 2780726 /home/swehr/tmp/OpenFileTest (10.10.2.4:/var/nfs/homes)
OpenFileT 23304 swehr mem REG 0,14 10096 2261 /usr/lib64/gconv/ISO8859-15.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 10120 11967 /usr/lib64/gconv/UTF-32.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 51528 190 /lib64/libnss_files-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 3938288 2142 /usr/lib64/locale/locale-archive (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 130966 97 /lib64/libpthread-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 1399984 25 /lib64/libc-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 526456 86 /lib64/libm-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 361968 2289 /usr/lib64/libgmp.so.3.5.2 (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 14512 36 /lib64/libdl-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 10464 1723 /lib64/libutil-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 35656 93 /lib64/librt-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 88368 115 /lib64/libz.so.1.2.3 (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 128424 20 /lib64/ld-2.11.2.so (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr mem REG 0,14 26050 248 /usr/lib64/gconv/gconv-modules.cache (10.10.2.4:/netboot/union/picasso)
OpenFileT 23304 swehr 0u CHR 136,4 0t0 7 /dev/pts/4
OpenFileT 23304 swehr 1u CHR 136,4 0t0 7 /dev/pts/4
OpenFileT 23304 swehr 2u CHR 136,4 0t0 7 /dev/pts/4
OpenFileT 23304 swehr 3u sock 0,5 0t0 861384 can't identify protocol
OpenFileT 23304 swehr 4u sock 0,5 0t0 861386 can't identify protocol
OpenFileT 23304 swehr 5u sock 0,5 0t0 861388 can't identify protocol
OpenFileT 23304 swehr 6u sock 0,5 0t0 861390 can't identify protocol
OpenFileT 23304 swehr 7u sock 0,5 0t0 861392 can't identify protocol
OpenFileT 23304 swehr 8u sock 0,5 0t0 861394 can't identify protocol
OpenFileT 23304 swehr 9u sock 0,5 0t0 861396 can't identify protocol
OpenFileT 23304 swehr 10u sock 0,5 0t0 861398 can't identify protocol
OpenFileT 23304 swehr 11u sock 0,5 0t0 861400 can't identify protocol
OpenFileT 23304 swehr 12u sock 0,5 0t0 861402 can't identify protocol
OpenFileT 23304 swehr 13u sock 0,5 0t0 861404 can't identify protocol
OpenFileT 23304 swehr 14u sock 0,5 0t0 861406 can't identify protocol
OpenFileT 23304 swehr 15u sock 0,5 0t0 861408 can't identify protocol
OpenFileT 23304 swehr 16u sock 0,5 0t0 861410 can't identify protocol
OpenFileT 23304 swehr 17u sock 0,5 0t0 861412 can't identify protocol
OpenFileT 23304 swehr 18u sock 0,5 0t0 861414 can't identify protocol
OpenFileT 23304 swehr 19u sock 0,5 0t0 861416 can't identify protocol
OpenFileT 23304 swehr 20u sock 0,5 0t0 861418 can't identify protocol
OpenFileT 23304 swehr 21u sock 0,5 0t0 861420 can't identify protocol
OpenFileT 23304 swehr 22u sock 0,5 0t0 861422 can't identify protocol
OpenFileT 23304 swehr 23u sock 0,5 0t0 861424 can't identify protocol
OpenFileT 23304 swehr 24u sock 0,5 0t0 861426 can't identify protocol
OpenFileT 23304 swehr 25u sock 0,5 0t0 861428 can't identify protocol
OpenFileT 23304 swehr 26u sock 0,5 0t0 861430 can't identify protocol
OpenFileT 23304 swehr 27u sock 0,5 0t0 861432 can't identify protocol
OpenFileT 23304 swehr 28u sock 0,5 0t0 861434 can't identify protocol
OpenFileT 23304 swehr 29u sock 0,5 0t0 861436 can't identify protocol
OpenFileT 23304 swehr 30u sock 0,5 0t0 861438 can't identify protocol
OpenFileT 23304 swehr 31u sock 0,5 0t0 861440 can't identify protocol
OpenFileT 23304 swehr 32u sock 0,5 0t0 861442 can't identify protocol
OpenFileT 23304 swehr 33u sock 0,5 0t0 861444 can't identify protocol
OpenFileT 23304 swehr 34u sock 0,5 0t0 861446 can't identify protocol
OpenFileT 23304 swehr 35u sock 0,5 0t0 861448 can't identify protocol
OpenFileT 23304 swehr 36u sock 0,5 0t0 861450 can't identify protocol
OpenFileT 23304 swehr 37u sock 0,5 0t0 861452 can't identify protocol
OpenFileT 23304 swehr 38u sock 0,5 0t0 861454 can't identify protocol
OpenFileT 23304 swehr 39u sock 0,5 0t0 861456 can't identify protocol
OpenFileT 23304 swehr 40u sock 0,5 0t0 861458 can't identify protocol
OpenFileT 23304 swehr 41u sock 0,5 0t0 861460 can't identify protocol
OpenFileT 23304 swehr 42u sock 0,5 0t0 861462 can't identify protocol
OpenFileT 23304 swehr 43u sock 0,5 0t0 861464 can't identify protocol
OpenFileT 23304 swehr 44u sock 0,5 0t0 861466 can't identify protocol
OpenFileT 23304 swehr 45u sock 0,5 0t0 861468 can't identify protocol
OpenFileT 23304 swehr 46u sock 0,5 0t0 861470 can't identify protocol
OpenFileT 23304 swehr 47u sock 0,5 0t0 861472 can't identify protocol
OpenFileT 23304 swehr 48u sock 0,5 0t0 861474 can't identify protocol
OpenFileT 23304 swehr 49u sock 0,5 0t0 861476 can't identify protocol
OpenFileT 23304 swehr 50u sock 0,5 0t0 861478 can't identify protocol
OpenFileT 23304 swehr 51u sock 0,5 0t0 861480 can't identify protocol
OpenFileT 23304 swehr 52u sock 0,5 0t0 861482 can't identify protocol
OpenFileT 23304 swehr 53u sock 0,5 0t0 861484 can't identify protocol
OpenFileT 23304 swehr 54u sock 0,5 0t0 861486 can't identify protocol
OpenFileT 23304 swehr 55u sock 0,5 0t0 861488 can't identify protocol
OpenFileT 23304 swehr 56u sock 0,5 0t0 861490 can't identify protocol
OpenFileT 23304 swehr 57u sock 0,5 0t0 861492 can't identify protocol
OpenFileT 23304 swehr 58u sock 0,5 0t0 861494 can't identify protocol
OpenFileT 23304 swehr 59u sock 0,5 0t0 861496 can't identify protocol
OpenFileT 23304 swehr 60u sock 0,5 0t0 861498 can't identify protocol
OpenFileT 23304 swehr 61u sock 0,5 0t0 861500 can't identify protocol
OpenFileT 23304 swehr 62u sock 0,5 0t0 861502 can't identify protocol
OpenFileT 23304 swehr 63u sock 0,5 0t0 861504 can't identify protocol
OpenFileT 23304 swehr 64u sock 0,5 0t0 861506 can't identify protocol
OpenFileT 23304 swehr 65u sock 0,5 0t0 861508 can't identify protocol
OpenFileT 23304 swehr 66u sock 0,5 0t0 861510 can't identify protocol
OpenFileT 23304 swehr 67u sock 0,5 0t0 861512 can't identify protocol
OpenFileT 23304 swehr 68u sock 0,5 0t0 861514 can't identify protocol
OpenFileT 23304 swehr 69u sock 0,5 0t0 861516 can't identify protocol
OpenFileT 23304 swehr 70u sock 0,5 0t0 861518 can't identify protocol
OpenFileT 23304 swehr 71u sock 0,5 0t0 861520 can't identify protocol
OpenFileT 23304 swehr 72u sock 0,5 0t0 861522 can't identify protocol
OpenFileT 23304 swehr 73u sock 0,5 0t0 861524 can't identify protocol
OpenFileT 23304 swehr 74u sock 0,5 0t0 861526 can't identify protocol
OpenFileT 23304 swehr 75u sock 0,5 0t0 861528 can't identify protocol
OpenFileT 23304 swehr 76u sock 0,5 0t0 861530 can't identify protocol
OpenFileT 23304 swehr 77u sock 0,5 0t0 861532 can't identify protocol
OpenFileT 23304 swehr 78u sock 0,5 0t0 861534 can't identify protocol
OpenFileT 23304 swehr 79u sock 0,5 0t0 861536 can't identify protocol
OpenFileT 23304 swehr 80u sock 0,5 0t0 861538 can't identify protocol
OpenFileT 23304 swehr 81u sock 0,5 0t0 861540 can't identify protocol
OpenFileT 23304 swehr 82u sock 0,5 0t0 861542 can't identify protocol
OpenFileT 23304 swehr 83u sock 0,5 0t0 861544 can't identify protocol
OpenFileT 23304 swehr 84u sock 0,5 0t0 861546 can't identify protocol
OpenFileT 23304 swehr 85u sock 0,5 0t0 861548 can't identify protocol
OpenFileT 23304 swehr 86u sock 0,5 0t0 861550 can't identify protocol
OpenFileT 23304 swehr 87u sock 0,5 0t0 861552 can't identify protocol
OpenFileT 23304 swehr 88u sock 0,5 0t0 861554 can't identify protocol
OpenFileT 23304 swehr 89u sock 0,5 0t0 861556 can't identify protocol
OpenFileT 23304 swehr 90u sock 0,5 0t0 861558 can't identify protocol
OpenFileT 23304 swehr 91u sock 0,5 0t0 861560 can't identify protocol
OpenFileT 23304 swehr 92u sock 0,5 0t0 861562 can't identify protocol
OpenFileT 23304 swehr 93u sock 0,5 0t0 861564 can't identify protocol
OpenFileT 23304 swehr 94u sock 0,5 0t0 861566 can't identify protocol
OpenFileT 23304 swehr 95u sock 0,5 0t0 861568 can't identify protocol
OpenFileT 23304 swehr 96u sock 0,5 0t0 861570 can't identify protocol
OpenFileT 23304 swehr 97u sock 0,5 0t0 861572 can't identify protocol
OpenFileT 23304 swehr 98u sock 0,5 0t0 861574 can't identify protocol
OpenFileT 23304 swehr 99u sock 0,5 0t0 861576 can't identify protocol
OpenFileT 23304 swehr 100u sock 0,5 0t0 861578 can't identify protocol
OpenFileT 23304 swehr 101u sock 0,5 0t0 861580 can't identify protocol
OpenFileT 23304 swehr 102u sock 0,5 0t0 861582 can't identify protocol
OpenFileT 23304 swehr 103u sock 0,5 0t0 861585 can't identify protocol
Hi,
I'm a Haskell noobie, and trying to install http-enumerator gives the following errors:
Resolving dependencies...
cabal: cannot configure tls-extra-0.4.1. It requires certificate >=1.0.0 && <1.1.0
For the dependency on certificate >=1.0.0 && <1.1.0 there are these packages:
certificate-1.0.0. However none of them are available. certificate-1.0.0 was
excluded because http-enumerator-0.7.1.3 requires certificate >=0.7 && <0.10
Any idea what I need to do?
Derek.
I'm not sure if this should go here or to TLS. Please let me know if it should go there. One of the users of my libraries reports that they receive the following exception:
*** Exception: error unexpected packet: pHandshake [ServerHelloDone]
Internally, my project uses http-enumerator and only overrides a couple of the default fields in request:
https://github.com/MichaelXavier/GooglePlus/blob/master/Web/GooglePlus.hs#L209
I personally cannot reproduce this issue. Anyone know what this may be?
Using the latest http-enumerator from github, this fails for me:
simpleHttp "http://joyful.com/darcsweb/darcsweb.cgi?r=rss2irc;a=summary"
apache log shows:
"GET /darcsweb/darcsweb.cgi?r=hledger%3Ba%3Drss HTTP/1.1" 200 1299 "-" "-"
while for curl the log shows:
"GET /darcsweb/darcsweb.cgi?r=hledger;a=rss HTTP/1.1"
Am I doing it wrong ?
Prelude Network.HTTP.Enumerator> simpleHttp ""http://www.google.com/search?q=foo"
:1:44: parse error on input `='
This happens with version 0.7.1.2, or at least I think that is the version. Versions 0.6.7 and 0.7.1 are also installed, but I assume ghci chooses the latest.
RequestBody and its constructors are not exported in 0.4. I assume this is not deliberate.
Hi,
I just tried to install yesod, where http-enumerator is a dependency. Unfortunately, the installation breaks because of incompatible dependencies.
The reason is that the packages tls and tls-extra require certificate-0.8.1, whereas http-enumerator needs certificate-0.7. I edited the http-enumerator.cabal file to allow the newer certificate version, but that does not seem to work.
Currently, Request only supports lazy ByteStrings as request bodies. It would be nice to be able to stream request bodies from (large) files or other sources (with iteratee/enumerator or such).
In the general case, the user would have to additionally supply a content length manually, which would not be checked against the stream. Quote from RFC 2616:
"For compatibility with HTTP/1.0 applications, HTTP/1.1 requests containing a message-body MUST include a valid Content-Length header field unless the server is known to be HTTP/1.1 compliant."
However, in some cases the user might KNOW that this particular HTTP server supports Transfer-Encodings, in which case it might be appropriate to use chunking.
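For reference, the chunked framing itself is simple. A minimal sketch (plain Haskell, independent of http-enumerator; chunkEncode is my own name, not a library function) of encoding one body as a chunked message:

```haskell
import qualified Data.ByteString.Char8 as S
import Numeric (showHex)

-- Encode a body as a single chunk followed by the zero-length
-- terminating chunk. Every chunk, including the final one, is
-- terminated by CRLF; the size is written in hexadecimal.
chunkEncode :: S.ByteString -> S.ByteString
chunkEncode body = S.concat
    [ S.pack (showHex (S.length body) ""), crlf
    , body, crlf
    , S.pack "0", crlf, crlf
    ]
  where
    crlf = S.pack "\r\n"
```

For a streaming sender, the first three list entries would be emitted once per chunk as data arrives, with the trailing "0\r\n\r\n" sent at end of stream.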
Per RFC 2616, section 14.10, the "Connection: close" header is used by the server to indicate that the connection will be closed upon completion of the response. It appears that the withManagedConn function doesn't handle this header, and instead returns the connection to the Manager's pool. When this header is sent and I follow up with my next request, I get this error:
ParseError {errorContexts = ["HTTP status line","demandInput"], errorMessage = "not enough bytes"}
I believe this error occurs because the connection is reused, the server is immediately responding with an RST, and the parsing code is treating this as an unexpected end of stream.
It seems like this should be handled down at the connection management layer, which has, rightly, been abstracted away from the user.
How to do this?:
(bracket H.newManager H.closeManager) go
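A sketch of that pattern (assuming the 0.7-era API, i.e. newManager :: IO Manager and closeManager :: Manager -> IO ()): bracket guarantees the close action runs even if the body throws.

```haskell
import Control.Exception (bracket)
import qualified Network.HTTP.Enumerator as H

-- Scope a Manager with bracket: acquire, run the body, release.
-- closeManager runs even when the body raises an exception.
withManager' :: (H.Manager -> IO a) -> IO a
withManager' = bracket H.newManager H.closeManager

main :: IO ()
main = withManager' go
  where
    go _manager = putStrLn "issue requests with the manager here"
```
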
Performing a simple simpleHttp "https://t4a.box-world.org/" fails with
ParseError {errorContexts = ["HTTP status line","demandInput"], errorMessage = "not enough bytes"}
I have no idea why this fails, as curl(1) and Python's httplib work just fine.
I have a client that needs to hit a single server:port with a lot of POST requests, and I'm running out of sockets very rapidly because http-enumerator lacks keepalive support.
I am using http-enumerator to talk to couch db, and occasionally during testing I would get
ParseError {errorContexts = ["HTTP status line"], errorMessage = "Failed reading: takeWith"}
After investigating the TCP dumps with wireshark, I realized the error occurred whenever the previous response used chunked encoding. (I am reusing the same Manager for multiple queries.) After looking at the code, I think chunkedEnumeratee is not consuming the final newlines.
Here is a dump from wireshark of the end of a chunked response and the start of the new one
00003534 34 0d 0a 0d 0a 5d 7d 0d 0a 4....]}. .
0000353D 31 0d 0a 0a 0d 0a 1.....
00003543 30 0d 0a 0d 0a 0....
000013AC 47 45 54 20 2f 74 65 73 74 2f 5f 64 65 73 69 67 GET /tes t/_desig
We see three chunks: a chunk of length four holding the closing ]} of the JSON, a chunk of length one containing an LF, and the terminating chunk of length zero.
But look at the http-enumerator code
chunkedEnumeratee k@(Continue _) = do
    len <- catchParser "Chunk header" iterChunkHeader
    if len == 0
        then return k
        else do
            k' <- takeLBS len k
            catchParser "End of chunk newline" iterNewline
            chunkedEnumeratee k'
After reading the final zero-length chunk header, the enumeratee does not consume the 0d 0a that terminates it. When the same Manager is used for the next query, those two bytes are still sitting in the stream, and we get the error about parsing the HTTP status line.
I am pretty sure the return k needs to change to
catchParser "End of chunks newline" iterNewline >> return k
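A tiny self-contained simulation supports this (not the library code; readChunks is a made-up name): a reader that stops right after the zero-length chunk header leaves the final 0d 0a in the stream, directly in front of the next response's status line.

```haskell
import qualified Data.ByteString.Char8 as S

-- Parse a chunked body; return (body, leftover bytes). The Bool says
-- whether to consume the CRLF that follows the zero-length chunk.
readChunks :: Bool -> S.ByteString -> (S.ByteString, S.ByteString)
readChunks finalCRLF = go S.empty
  where
    go acc s =
        let (hexLen, rest) = S.break (== '\r') s
            len = read ("0x" ++ S.unpack hexLen) :: Int
            afterHdr = S.drop 2 rest              -- skip CRLF after the size
        in if len == 0
             then (acc, if finalCRLF then S.drop 2 afterHdr else afterHdr)
             else let (dat, rest') = S.splitAt len afterHdr
                  in go (S.append acc dat) (S.drop 2 rest')  -- skip chunk CRLF

-- The bytes from the dump above, with the next response's status line appended:
stream :: S.ByteString
stream = S.pack "1\r\n\n\r\n0\r\n\r\nHTTP/1.1 200 OK\r\n"
```

With the final CRLF not consumed, the leftover starts with "\r\n", so the next status-line parse begins at a stray newline and fails; with it consumed, the leftover starts cleanly at "HTTP/1.1".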
The initial examples (downloading www.haskell.org with simpleHttp and saving google.com to a file) don't work on Windows because they're missing withSocketsDo.
It was very hard for me to figure out what was going on because the error message made it seem like the hostname couldn't be resolved. Could you please correct the documentation?
edit: Found out that withSocketsDo is the responsibility of the library user rather than the library.
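For anyone else who hits this, the caller-side fix is simply to wrap main (a sketch; the URL is just an example):

```haskell
import Network (withSocketsDo)
import Network.HTTP.Enumerator (simpleHttp)
import qualified Data.ByteString.Lazy as L

-- withSocketsDo initialises WinSock on Windows (and is a no-op on
-- other platforms); it must wrap any code that touches sockets.
main :: IO ()
main = withSocketsDo $ simpleHttp "http://www.haskell.org/" >>= L.putStr
```
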
version 0.6.5.1
In ghci:
Prelude Network.HTTP.Enumerator> simpleHttp "http://192.168.5.32:8001/a?a"
produces this in my server log:
192.168.5.32 - - [07/Jun/2011:09:18:52 -0400] "GET /a%3Fa HTTP/1.1" 200 - - "-"
Hi,
I have been working on a hard-to-track-down bug in couchdb-enumerator (see https://bitbucket.org/wuzzeb/couchdb-enumerator/issue/2/quickcheck-tests-fail) that we thought for a while was a problem in our test code. But with the patch below to http-enumerator, I found the real error.
All of this has been tested with 0.7.2 and the tip of the master branch here on github.
Essentially, there are two problems and one masked the other :( Here is the sequence of events.
There are two fixes needed:
Here is the patch I used to finally track down the error:
diff -r 4cf15746fbe8 Network/HTTP/Enumerator.hs
--- a/Network/HTTP/Enumerator.hs Tue Dec 13 00:11:46 2011 -0800
+++ b/Network/HTTP/Enumerator.hs Thu Dec 22 01:27:52 2011 -0600
@@ -227,15 +227,19 @@
Just ci -> return (ci, True)
catchError
(do
+ liftIO $ putStrLn "Startting enumerator"
(toPut, a) <- withCI ci req step
+ liftIO $ putStrLn "Enumerator completed"
liftIO $ if toPut
then putInsecureSocket man key ci
else TLS.connClose ci
return a)
(\se -> liftIO (TLS.connClose ci) >>
- if isManaged
- then withManagedConn man key open req step
- else throwError se)
+ liftIO (putStrLn $ "Error during enum: " ++ show se) >>
+ throwError se)
+ --if isManaged
+ -- then withManagedConn man key open req step
+ -- else throwError se)
withSslConn :: MonadIO m
=> ([X509] -> IO TLS.TLSCertificateUsage)
~/couchdb-enumerator$ ./dist/build/test/test -t conflict
Basic:
DB:
Startting enumerator
Enumerator completed
Startting enumerator
Enumerator completed
Startting enumerator
Error during enum: ParseError {errorContexts = ["HTTP status line"], errorMessage = "Failed reading: takeWith"}
conflict: [Failed]
ERROR: ParseError {errorContexts = ["HTTP status line"], errorMessage = "Failed reading: takeWith"}
        Test Cases  Total
Passed  0           0
Failed  1           1
Total   1           1
If I do:
simpleHttp tgzUrl >>= L.writeFile storeLocation
where tgzUrl is the URL for a gzipped tar file, the written file ends up as just a plain tar file instead of the gzipped tar file I was hoping for.
Willing to fix this myself and send a pull request if you can point me in the right direction. Maybe a simpleHttpRaw function?
Hi,
Trying to compile with ghc 7.0.2. http-enumerator commit 758bc9b
Network/TLS/Extra/Connection.hs:25:83:
    `TLSCtx' is not applied to enough type arguments
    The first argument of `IO' should have kind `*',
    but `TLSCtx' has kind `* -> *'
    In the type signature for `connectionClient':
      connectionClient :: CryptoRandomGen g =>
                          String -> String -> TLSParams -> g -> IO TLSCtx
From what I can see in the documentation and what I understand of the code, HTTP Enumerator does not currently support HTTP Basic auth. I don't know much about the details of HTTP Basic Auth but would it be worthwhile to implement it? I'd of course submit a patch instead of an issue if I were capable.
Unfortunately, a project I'm working on requires both SSL and Basic auth. The curl bindings don't compile with GHC 7, Network.HTTP doesn't support SSL, and http-enumerator doesn't support Basic auth. Such a conundrum ;)
I have a URL that's getting mangled by parseUrl and renderQuery.
If I write a program like:
import Network.HTTP.Enumerator
import qualified Data.ByteString.Lazy as L
import Network (withSocketsDo)
url = "http://www.mega-nerd.com/cgi-bin/Count.cgi?ft=6|frgb=55;55;55|tr=0|trgb=0;0;0|wxh=15;20|md=7|dd=B|st=1|sh=1|df=libsndfile.dat"
main = withSocketsDo $ simpleHttp url >>= L.putStr
and run it as:
runhaskell simpleHttp.hs > a.gif
I get a different result than if I had done:
wget $url -O b.gif
Basically the problem is that, considering just the query-string part, the parseUrl -> renderQuery path is not the identity function.
The query:
?ft=6|frgb=55;55;55|tr=0|trgb=0;0;0|wxh=15;20|md=7|dd=B|st=1|sh=1|df=libsndfile.dat
becomes:
?ft=6%7Cfrgb%3D55&55&55%7Ctr=0%7Ctrgb%3D0&0&0%7Cwxh=15&20%7Cmd=7%7Cdd%3DB%7Cst%3D1%7Csh%3D1%7Cdf%3Dlibsndfile.dat
You could of course say that the CGI script should handle a URI-encoded string, and I would agree, but I also don't think http-enumerator should be mangling it as much as it does. In fact, I don't think http-enumerator should be parsing it into a Query at all; it should leave it as a ByteString (in Warp, of course, it's a completely different matter).
If you agree, I can submit a pull request that fixes this. Obviously this will require a significant version bump but I'll leave that to you.
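For comparison, here is a rough sketch of RFC 3986's rules for the query component (isQueryChar is my own name, not a library function): ';' is a sub-delimiter that may appear literally in a query, while '|' is outside the allowed set, which is why encoders treat the two so differently.

```haskell
import Data.Char (isAlphaNum)

-- Characters RFC 3986 permits literally in a query component:
-- unreserved characters, sub-delims, plus ':', '@', '/', '?'.
isQueryChar :: Char -> Bool
isQueryChar c = isUnreserved c || c `elem` subDelims || c `elem` ":@/?"
  where
    isUnreserved x = isAlphaNum x || x `elem` "-._~"
    subDelims = "!$&'()*+,;="
```

By this classification neither ';' nor '=' needs escaping inside a query, while '|' does; a round-tripping parser should preserve whichever form it received.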
This is actually really trivial to fix. I already have a solution in this commit
https://github.com/erikd/http-enumerator/commit/f4cc35d99863034fd090312798831eb5d3c63f9b
and I've tested that it fixes my issue.
Cheers,
Erik
When an HTTP server sends neither a Content-Length header nor chunked transfer encoding, http-enumerator throws a ConnectionReset exception. Test program:
import Network.HTTP.Enumerator (simpleHttp)
import Control.Concurrent
import Network
import System.IO
main :: IO ()
main = do
    sock <- listenOn $ PortNumber 1234
    _ <- forkIO $ do
        (h, _, _) <- accept sock
        ("GET" : _) <- words `fmap` hGetLine h
        hPutStr h "HTTP/1.0 200 OK\r\n\r\n<h3>It ends here</h3>\n"
        hClose h
    s <- simpleHttp "http://localhost:1234/"
    print s
Expected output:
Chunk "<h3>It ends here</h3>\n" Empty
Actual output:
Error during enum: ConnectionReset
eof: ConnectionReset
I don't know whether or not such a server is being standards-compliant, but I encountered such servers in real life.
http-conduit does not have this problem, and produces the expected output.
Here is the workaround I am using:
diff --git a/Network/TLS/Client/Enumerator.hs b/Network/TLS/Client/Enumerator.hs
index 212f759..fd0b319 100755
--- a/Network/TLS/Client/Enumerator.hs
+++ b/Network/TLS/Client/Enumerator.hs
@@ -57,7 +57,7 @@ connEnum ConnInfo { connRead = read' } =
     go (Continue k) = do
         bs <- tryIO read'
         if all S.null bs
-            then throwError ConnectionReset
+            then continue k
             else do
                 step <- lift $ runIteratee $ k $ Chunks bs
                 go step
I don't know whether this is the correct fix. For example, if the connection is closed when fewer than Content-Length bytes have been sent (meaning the content was truncated), simpleHttp will not raise an exception. On the other hand, http-conduit does not raise an exception when content is truncated either (is this a bug?).