
node-lz4's Introduction

LZ4

LZ4 is a very fast compression and decompression algorithm. This Node.js module provides a JavaScript implementation of the decoder, as well as native bindings to the LZ4 functions. Node.js Streams are also supported for compression and decompression.

NB. Version 0.2 does not support the legacy format, only the format described in "LZ4 Streaming Format 1.4". Use version 0.1 if you need legacy support.

Build

With NodeJS:

git clone https://github.com/pierrec/node-lz4.git
cd node-lz4
git submodule update --init --recursive
npm install

Install

With NodeJS:

npm install lz4

Within the browser, using build/lz4.js:

<script type="text/javascript" src="/path/to/lz4.js"></script>
<script type="text/javascript">
// Nodejs-like Buffer built-in
var Buffer = require('buffer').Buffer
var LZ4 = require('lz4')

// Some data to be compressed
var data = 'Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.'
data += data
// LZ4 can only work on Buffers
var input = Buffer.from(data)
// Initialize the output buffer to its maximum length based on the input data
var output = Buffer.alloc( LZ4.encodeBound(input.length) )

// block compression (no archive format)
var compressedSize = LZ4.encodeBlock(input, output)
// remove unnecessary bytes
output = output.slice(0, compressedSize)

console.log( "compressed data", output )

// block decompression (no archive format)
var uncompressed = Buffer.alloc(input.length)
var uncompressedSize = LZ4.decodeBlock(output, uncompressed)
uncompressed = uncompressed.slice(0, uncompressedSize)

console.log( "uncompressed data", uncompressed )
</script>

When building from a GitHub clone, after making sure that node and node-gyp are properly installed:

npm i
node-gyp rebuild

See below for more LZ4 functions.

Usage

Encoding

There are 2 ways to encode:

  • asynchronous, using nodejs Streams - slowest, but can handle very large data sets (no memory limitations)
  • synchronous, by feeding the whole set of data - faster, but limited by the amount of available memory

Asynchronous encoding

First, create an LZ4 encoding NodeJS stream with LZ4#createEncoderStream(options).

  • options (Object): LZ4 stream options (optional)
    • options.blockMaxSize (Number): chunk size to use (default=4MB)
    • options.highCompression (Boolean): use high compression (default=false)
    • options.blockIndependence (Boolean): use independent blocks (default=true)
    • options.blockChecksum (Boolean): add compressed blocks checksum (default=false)
    • options.streamSize (Boolean): add full LZ4 stream size (default=false)
    • options.streamChecksum (Boolean): add full LZ4 stream checksum (default=true)
    • options.dict (Boolean): use dictionary (default=false)
    • options.dictId (Integer): dictionary id (default=0)

The stream can then encode any data piped to it. It will emit a data event on each encoded chunk, which can be saved into an output stream.

The following example shows how to encode a file test into test.lz4.

var fs = require('fs')
var lz4 = require('lz4')

var encoder = lz4.createEncoderStream()

var input = fs.createReadStream('test')
var output = fs.createWriteStream('test.lz4')

input.pipe(encoder).pipe(output)
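
The example above does not handle stream errors. As with any Node.js stream, error listeners can be attached to each stage; a minimal sketch, relying only on standard stream events rather than anything specific to this module:

var fs = require('fs')
var lz4 = require('lz4')

var encoder = lz4.createEncoderStream()

var input = fs.createReadStream('test')
var output = fs.createWriteStream('test.lz4')

// Report failures on each stage of the pipeline instead of crashing the
// process on an unhandled 'error' event
input.on('error', console.error)
encoder.on('error', console.error)
output.on('error', console.error)

input.pipe(encoder).pipe(output)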

Synchronous encoding

Read the data into memory and feed it to LZ4#encode(input[, options]) to produce an LZ4 stream.

  • input (Buffer): data to encode
  • options (Object): LZ4 stream options (optional)
    • options.blockMaxSize (Number): chunk size to use (default=4MB)
    • options.highCompression (Boolean): use high compression (default=false)
    • options.blockIndependence (Boolean): use independent blocks (default=true)
    • options.blockChecksum (Boolean): add compressed blocks checksum (default=false)
    • options.streamSize (Boolean): add full LZ4 stream size (default=false)
    • options.streamChecksum (Boolean): add full LZ4 stream checksum (default=true)
    • options.dict (Boolean): use dictionary (default=false)
    • options.dictId (Integer): dictionary id (default=0)

var fs = require('fs')
var lz4 = require('lz4')

var input = fs.readFileSync('test')
var output = lz4.encode(input)

fs.writeFileSync('test.lz4', output)
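
The options listed above can be passed as the second argument. For example, a sketch using only the documented options to enable high compression with smaller blocks (the 256KB value is just an illustrative choice):

var fs = require('fs')
var lz4 = require('lz4')

var input = fs.readFileSync('test')
// Trade speed for a better ratio, and use 256KB blocks instead of the 4MB default
var output = lz4.encode(input, {
  highCompression: true,
  blockMaxSize: 256 << 10
})

fs.writeFileSync('test.lz4', output)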

Decoding

There are 2 ways to decode:

  • asynchronous, using nodejs Streams - slowest, but can handle very large data sets (no memory limitations)
  • synchronous, by feeding the whole LZ4 data - faster, but limited by the amount of available memory

Asynchronous decoding

First, create an LZ4 decoding NodeJS stream with LZ4#createDecoderStream().

The stream can then decode any data piped to it. It will emit a data event on each decoded sequence, which can be saved into an output stream.

The following example shows how to decode an LZ4 compressed file test.lz4 into test.

var fs = require('fs')
var lz4 = require('lz4')

var decoder = lz4.createDecoderStream()

var input = fs.createReadStream('test.lz4')
var output = fs.createWriteStream('test')

input.pipe(decoder).pipe(output)

Synchronous decoding

Read the data into memory and feed it to LZ4#decode(input) to decode an LZ4 stream.

  • input (Buffer): data to decode

var fs = require('fs')
var lz4 = require('lz4')

var input = fs.readFileSync('test.lz4')
var output = lz4.decode(input)

fs.writeFileSync('test', output)

Block level encoding/decoding

In some cases, it is useful to be able to manipulate an LZ4 block instead of an LZ4 stream. The functions to decode and encode are therefore exposed as:

  • LZ4#decodeBlock(input, output[, startIdx, endIdx]) (Number) >=0: uncompressed size, <0: error at offset
    • input (Buffer): data block to decode
    • output (Buffer): decoded data block
    • startIdx (Number): input buffer start index (optional, default=0)
    • endIdx (Number): input buffer end index (optional, default=input.length - startIdx)
  • LZ4#encodeBound(inputSize) (Number): maximum size for a compressed block
    • inputSize (Number): size of the input; returns 0 if the input is too large. This is required to size the output buffer for block-encoded data.
  • LZ4#encodeBlock(input, output[, startIdx, endIdx]) (Number) >0: compressed size, =0: not compressible
    • input (Buffer): data block to encode
    • output (Buffer): encoded data block
    • startIdx (Number): output buffer start index (optional, default=0)
    • endIdx (Number): output buffer end index (optional, default=output.length - startIdx)
  • LZ4#encodeBlockHC(input, output[, compressionLevel]) (Number) >0: compressed size, =0: not compressible
    • input (Buffer): data block to encode with high compression
    • output (Buffer): encoded data block
    • compressionLevel (Number): compression level between 3 and 12 (optional, default=9)

Blocks do not have any magic number and are provided as is. It is therefore useful to store the size of the original input somewhere for decoding. LZ4#encodeBlockHC() is not available in pure JavaScript.
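
As a concrete illustration, the following sketch round-trips a buffer through the block functions, keeping the original length around since a raw block does not record it:

var lz4 = require('lz4')

var input = Buffer.from('to be or not to be, that is the question, to be or not to be')
// A raw block has no header, so the original size must be stored separately
var originalSize = input.length

// Size the output with encodeBound, then trim to the actual compressed size
var compressed = Buffer.alloc(lz4.encodeBound(input.length))
var compressedSize = lz4.encodeBlock(input, compressed)
compressed = compressed.slice(0, compressedSize)

// Decoding needs a buffer at least as large as the original input
var decoded = Buffer.alloc(originalSize)
var decodedSize = lz4.decodeBlock(compressed, decoded)
decoded = decoded.slice(0, decodedSize)

console.log(decoded.equals(input)) // true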

How it works

Restrictions / Issues

  • the blockIndependence property is only supported when set to true

License

MIT

node-lz4's People

Contributors

ankon, beefcheeks, chalker, janriemer, johan13, kripken, kriszyp, maximelkin, mbec-lbto, mog422, mscdex, naseemkullah, phardera, pierrec, psirenny, shibukawa, vihanb


node-lz4's Issues

How to transfer lz4 compressed data over network

I am trying to transfer lz4-compressed data from NodeJS to the browser. NodeJS compresses the data and the browser decompresses it. However, decompressing the data fails. Can anyone help solve the problem?
From NodeJS side:

var rawbuffer = new Buffer(rawstring);
var originlength = rawbuffer.length;
var output = lz4.encode(rawbuffer);
var outputstr = output.toString();
res.setHeader("Original-Length", originlength);
res.send(outputstr);

From Browser side:

var targetLength = xhr.getResponseHeader('Original-Length');
var originBuffer = new Buffer(xhr.responseText);
var uncompressedBuffer = new Buffer(parseInt(targetLength));
var uncompressedSize = LZ4.decodeBlock(originBuffer, uncompressedBuffer);
uncompressedBuffer = uncompressedBuffer.slice(0, uncompressedSize);
var targetString = uncompressedBuffer.toString();

The data is received on the browser side; however, uncompressedSize is -1 in the browser.
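
Two details in the snippets above look suspect: output.toString() decodes the compressed bytes as UTF-8, which corrupts binary data, and the payload is produced with LZ4.encode (framed format) but read with LZ4.decodeBlock (raw block). A hedged sketch of a possible fix, not taken from the thread, transmitting base64 and using the matching decoder:

// NodeJS side: send the framed output as base64 so no UTF-8 mangling occurs
var output = lz4.encode(rawbuffer)
res.setHeader('Original-Length', originlength)
res.send(output.toString('base64'))

// Browser side: rebuild the Buffer from base64, then decode the frame
var originBuffer = Buffer.from(xhr.responseText, 'base64')
var uncompressedBuffer = LZ4.decode(originBuffer)
var targetString = uncompressedBuffer.toString()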

install error mac/node 7.10.0

gyp info it worked if it ends with ok
gyp info using node-gyp@3.6.1
gyp info using node@7.10.0 | darwin | x64
gyp http GET https://nodejs.org/download/release/v7.10.0/node-v7.10.0-headers.tar.gz
gyp http 200 https://nodejs.org/download/release/v7.10.0/node-v7.10.0-headers.tar.gz
gyp http GET https://nodejs.org/download/release/v7.10.0/SHASUMS256.txt
gyp http 200 https://nodejs.org/download/release/v7.10.0/SHASUMS256.txt
gyp info spawn /usr/local/bin/python2
gyp info spawn args [ '/usr/local/lib/node_modules/yarn/node_modules/node-gyp/gyp/gyp_main.py',
gyp info spawn args   'binding.gyp',
gyp info spawn args   '-f',
gyp info spawn args   'make',
gyp info spawn args   '-I',
gyp info spawn args   '/Volumes/Macintosh_HD/Users/shauncutts/src/gabbage/node_modules/lz4/build/config.gypi',
gyp info spawn args   '-I',
gyp info spawn args   '/usr/local/lib/node_modules/yarn/node_modules/node-gyp/addon.gypi',
gyp info spawn args   '-I',
gyp info spawn args   '/Users/shauncutts/.node-gyp/7.10.0/include/node/common.gypi',
gyp info spawn args   '-Dlibrary=shared_library',
gyp info spawn args   '-Dvisibility=default',
gyp info spawn args   '-Dnode_root_dir=/Users/shauncutts/.node-gyp/7.10.0',
gyp info spawn args   '-Dnode_gyp_dir=/usr/local/lib/node_modules/yarn/node_modules/node-gyp',
gyp info spawn args   '-Dnode_lib_file=node.lib',
gyp info spawn args   '-Dmodule_root_dir=/Volumes/Macintosh_HD/Users/shauncutts/src/gabbage/node_modules/lz4',
gyp info spawn args   '-Dnode_engine=v8',
gyp info spawn args   '--depth=.',
gyp info spawn args   '--no-parallel',
gyp info spawn args   '--generator-output',
gyp info spawn args   'build',
gyp info spawn args   '-Goutput_dir=.' ]
gyp info spawn make
gyp info spawn args [ 'BUILDTYPE=Release', '-C', 'build' ]
  CXX(target) Release/obj.target/lz4/lib/binding/lz4_binding.o
c++: error: unrecognized command line option ‘-stdlib=libc++’
make: *** [Release/obj.target/lz4/lib/binding/lz4_binding.o] Error 1
gyp ERR! build error 
gyp ERR! stack Error: `make` failed with exit code: 2
gyp ERR! stack     at ChildProcess.onExit (/usr/local/lib/node_modules/yarn/node_modules/node-gyp/lib/build.js:285:23)
gyp ERR! stack     at emitTwo (events.js:106:13)
gyp ERR! stack     at ChildProcess.emit (events.js:194:7)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:215:12)
gyp ERR! System Darwin 16.5.0
gyp ERR! command "/usr/local/bin/node" "/usr/local/lib/node_modules/yarn/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
gyp ERR! cwd /Volumes/Macintosh_HD/Users/shauncutts/src/gabbage/node_modules/lz4
gyp ERR! node -v v7.10.0
gyp ERR! node-gyp -v v3.6.1
gyp ERR! not ok

(NB -- generated using yarn, but same thing with npm v.4.2.0)

Invalid block checksum

After compressing a block and decompressing it with blockChecksum enabled, node-lz4 bombs:

    const decoder = lz4.createDecoderStream({
      });
      const encoder = lz4.createEncoderStream({ 
        blockChecksum: true,
        //debug: true,
      })
      const buffers = []
      encoder.on('data', data => decoder.write(data))
      encoder.on('end', () => decoder.end())
      decoder.on('data', data => buffers.push(data))
      decoder.on('end', () => {
        console.log('Received %s', Buffer.concat(buffers))
      })
      encoder.end('panchi pinchi pinchi panchi punchi\n')

results in:

Invalid block checksum: 3942400633 vs 3364695891 for {"type":"Buffer","data":[153,112,97,110,99,104,105,32,112,105,7,0,3,21,0,96,117,110,99,104,105,10]} @26

After investigating the guts of lib/encoder_stream.js, it appears that the checksum is computed on a buffer that contains the compressed block plus some padding: instead of only 9970616e636869207069070003150060756e6368690a, it hashes 9970616e636869207069070003150060756e6368690a0000000000000000000000000000000000000000000000000000000000, which yields the error.

browser support

Hey,
Is it possible to run lz4 in the browser?

thanks!
Bartek

pointer being freed was not allocated

Got this error lately (node 0.10.37, OSX):

node(1166,0x7fff7db64310) malloc: *** error for object 0x1044a9d3d: pointer being freed was not allocated

Getting rid of node-lz4 fixed the problem...

compress file from web client, error with compressHC

I'm trying to compress files from the web client; I get an error when I try to compress a file with the compressHC method.

Here is my code:

reader.onloadend = (function(theFile) {
    return function(e) {
      console.log(e.target.result)
      var input = new Buffer(e.target.result)    

      var output = LZ4.encode(input, { highCompression: true });
      console.log(output);
      console.log(output.length);
    }; 
})(file)
reader.readAsArrayBuffer(file)

the error is "this.compress is not a function"

Tests no longer pass under Node 10

The Buffer readUInt32LE, readInt32LE, writeUInt32LE, writeInt32LE calls have changed (there's no more noAssert parameter) and how things are interpreted also seems to have changed.

The code that deals with xxhash now seems to compare signed & unsigned types, and write calls sometimes reject values (seemingly) for being outside the range of a 32-bit integer.

See: https://gist.github.com/leafi/be4fe7807577d170ef558dd0081a3091 for full log

11 passing (30ms)
3 failing

  1. LZ4 checksum should encode/decode data:
    RangeError [ERR_OUT_OF_RANGE]: The value of "value" is out of range. It must be >= -2147483648 and <= 2147483647. Received 3675999523
    at checkInt (internal/buffer.js:35:11)
    at writeU_Int32LE (internal/buffer.js:515:3)
    at Buffer.writeInt32LE (internal/buffer.js:684:10)
    at Encoder._flush (lib/encoder_stream.js:215:7)
    at Encoder.prefinish (_stream_transform.js:141:10)
    at prefinish (_stream_writable.js:643:14)
    at finishMaybe (_stream_writable.js:651:5)
    at endWritable (_stream_writable.js:662:3)
    at Encoder.Writable.end (_stream_writable.js:602:5)
    at Object.LZ4_compress [as encode] (lib/encoder.js:14:10)
    at Context. (test/checksum-test.js:12:20)

  2. LZ4 checksum should encode/decode data:
    RangeError [ERR_OUT_OF_RANGE]: The value of "value" is out of range. It must be >= -2147483648 and <= 2147483647. Received 2319569705
    at checkInt (internal/buffer.js:35:11)
    at writeU_Int32LE (internal/buffer.js:515:3)
    at Buffer.writeInt32LE (internal/buffer.js:684:10)
    at Encoder._flush (lib/encoder_stream.js:215:7)
    at Encoder.prefinish (_stream_transform.js:141:10)
    at prefinish (_stream_writable.js:643:14)
    at finishMaybe (_stream_writable.js:651:5)
    at endWritable (_stream_writable.js:662:3)
    at Encoder.Writable.end (_stream_writable.js:602:5)
    at Object.LZ4_compress [as encode] (lib/encoder.js:14:10)
    at Context. (test/checksum-test.js:21:20)

  3. LZ4 checksum should encode/decode data:
    RangeError [ERR_OUT_OF_RANGE]: The value of "value" is out of range. It must be >= -2147483648 and <= 2147483647. Received 3729892867
    at checkInt (internal/buffer.js:35:11)
    at writeU_Int32LE (internal/buffer.js:515:3)
    at Buffer.writeInt32LE (internal/buffer.js:684:10)
    at Encoder._flush (lib/encoder_stream.js:215:7)
    at Encoder.prefinish (_stream_transform.js:141:10)
    at prefinish (_stream_writable.js:643:14)
    at finishMaybe (_stream_writable.js:651:5)
    at endWritable (_stream_writable.js:662:3)
    at Encoder.Writable.end (_stream_writable.js:602:5)
    at Object.LZ4_compress [as encode] (lib/encoder.js:14:10)
    at Context. (test/checksum-test.js:31:20)

Doesn't compile with node 0.12

Here is the output I have when trying to install with node 0.12 on OSX (works fine with 0.10):

  CXX(target) Release/obj.target/lz4/lib/binding/lz4.o
../lib/binding/lz4.cc:19:33: error: unknown type name 'Arguments'; did you mean 'v8::internal::Arguments'?
Handle<Value> LZ4Compress(const Arguments& args) {
                                ^~~~~~~~~
                                v8::internal::Arguments
/Users/benweet/.node-gyp/0.12.0/deps/v8/include/v8.h:127:7: note: 'v8::internal::Arguments' declared here
class Arguments;
      ^
../lib/binding/lz4.cc:20:15: error: calling a protected constructor of class 'v8::HandleScope'
  HandleScope scope;
              ^
/Users/benweet/.node-gyp/0.12.0/deps/v8/include/v8.h:816:13: note: declared protected here
  V8_INLINE HandleScope() {}
            ^
../lib/binding/lz4.cc:22:23: error: member access into incomplete type 'const v8::internal::Arguments'
  uint32_t alen = args.Length();
                      ^
/Users/benweet/.node-gyp/0.12.0/deps/v8/include/v8.h:127:7: note: forward declaration of 'v8::internal::Arguments'
class Arguments;
      ^
../lib/binding/lz4.cc:24:45: error: no member named 'New' in 'v8::String'
    ThrowException(Exception::Error(String::New("Wrong number of arguments")));
                                    ~~~~~~~~^
../lib/binding/lz4.cc:25:18: error: no member named 'Close' in 'v8::HandleScope'
    return scope.Close(Undefined());
           ~~~~~ ^
../lib/binding/lz4.cc:25:24: error: no matching function for call to 'Undefined'
    return scope.Close(Undefined());
                       ^~~~~~~~~
/Users/benweet/.node-gyp/0.12.0/deps/v8/include/v8.h:305:28: note: candidate function not viable: requires single argument 'isolate', but no
      arguments were provided
  friend Handle<Primitive> Undefined(Isolate* isolate);
                           ^
../lib/binding/lz4.cc:28:32: error: type 'const v8::internal::Arguments' does not provide a subscript operator
  if (!Buffer::HasInstance(args[0]) || !Buffer::HasInstance(args[1])) {
                           ~~~~^~
../lib/binding/lz4.cc:28:65: error: type 'const v8::internal::Arguments' does not provide a subscript operator
  if (!Buffer::HasInstance(args[0]) || !Buffer::HasInstance(args[1])) {
                                                            ~~~~^~
../lib/binding/lz4.cc:29:49: error: no member named 'New' in 'v8::String'
    ThrowException(Exception::TypeError(String::New("Wrong arguments")));
                                        ~~~~~~~~^
../lib/binding/lz4.cc:30:18: error: no member named 'Close' in 'v8::HandleScope'
    return scope.Close(Undefined());
           ~~~~~ ^
../lib/binding/lz4.cc:30:24: error: no matching function for call to 'Undefined'
    return scope.Close(Undefined());
                       ^~~~~~~~~
/Users/benweet/.node-gyp/0.12.0/deps/v8/include/v8.h:305:28: note: candidate function not viable: requires single argument 'isolate', but no
      arguments were provided
  friend Handle<Primitive> Undefined(Isolate* isolate);
                           ^
../lib/binding/lz4.cc:32:29: error: type 'const v8::internal::Arguments' does not provide a subscript operator
  Local<Object> input = args[0]->ToObject();
                        ~~~~^~
../lib/binding/lz4.cc:33:30: error: type 'const v8::internal::Arguments' does not provide a subscript operator
  Local<Object> output = args[1]->ToObject();
                         ~~~~^~
../lib/binding/lz4.cc:40:14: error: type 'const v8::internal::Arguments' does not provide a subscript operator
    if (!args[3]->IsUint32()) {
         ~~~~^~
../lib/binding/lz4.cc:41:51: error: no member named 'New' in 'v8::String'
      ThrowException(Exception::TypeError(String::New("Invalid endIdx")));
                                          ~~~~~~~~^
../lib/binding/lz4.cc:42:20: error: no member named 'Close' in 'v8::HandleScope'
      return scope.Close(Undefined());
             ~~~~~ ^
../lib/binding/lz4.cc:42:26: error: no matching function for call to 'Undefined'
      return scope.Close(Undefined());
                         ^~~~~~~~~
/Users/benweet/.node-gyp/0.12.0/deps/v8/include/v8.h:305:28: note: candidate function not viable: requires single argument 'isolate', but no
      arguments were provided
  friend Handle<Primitive> Undefined(Isolate* isolate);
                           ^
../lib/binding/lz4.cc:44:14: error: type 'const v8::internal::Arguments' does not provide a subscript operator
    if (!args[2]->IsUint32()) {
         ~~~~^~
../lib/binding/lz4.cc:45:51: error: no member named 'New' in 'v8::String'
      ThrowException(Exception::TypeError(String::New("Invalid startIdx")));
                                          ~~~~~~~~^
fatal error: too many errors emitted, stopping now [-ferror-limit=]
20 errors generated.

Optimization for ArrayBuffer / DataView? Websockets.

On the browser side when you recv binary data off websockets you get it as an ArrayBuffer or Blob.

Right now you cannot pass that to new Buffer() constructor.

Instead you need to convert the buffer to a Uint8Array, this conversion is very expensive for some reason.

Then once you get your output, you need to convert it again to a Uint8Array (which is now 50-100x more expensive since its an uncompressed image) if you want to use it for something like canvas element; ctx.putImageData(imageData, 0, 0) for example.

Is there any way to optimize this, perhaps by allowing a Uint8Array to be passed directly instead of a Buffer?

This is with large ArrayBuffers like around 2-5MB uncompressed.

The actual decompression takes about 10-20ms, then the array conversion back into a Uint8Array takes 20ms-80ms, I think it depends on memory pressure.

LZ4 Javascript compress function corrupts data

I wrote a fuzz test to test node-lz4's javascript binding and came up with a 100KB file that gets corrupted by the compress function.

To reproduce, please download the test zip from here: https://www.dropbox.com/s/8pohyaskevski5m/LZ4Test.zip?dl=0

The zip contains lz4test.js and lz4test.data (100KB but very compressible)

The test expects LZ4 to already be installed in node_modules in the current working directory to which you extract the zip.

The test reads the data file, compresses it, uncompresses it and compares with the original. The length is the same but the contents are different.

The native binding works, and the test can test either native binding or JS binding:

// Test Javascript binding (FAIL):
node lz4test.js js

// Test Native binding (PASS):
node lz4test.js native

Error: Cannot find module 'lz4'

Error: Cannot find module 'lz4'

The path for the script is project/node_modules/lz4/build/lz4.js, in Chrome and in Firefox on Linux.
I've searched for exports.lz4 in lz4.js but with no result; I don't know if this is the problem. I just cannot get it to work in the browser. I've downloaded lz4.js from the build folder, and I've compiled it myself too, but no results.

IE 9 : encodeBlock with byte == 268 ?

Debugging an issue with sending compressed strings over the network, I'm facing the following issue in IE 9 only; IE 10+ , Chrome , Firefox look fine.

Here is the Javascript string to compress:

var value = "{\"classID\":\"ic3.ReportGuts\",\"guts_\":{\"ic3Version\":8,\"schemaName\":\"Sales\",\"cubeName\":\"Sales\",\"layout\":{\"classID\":\"ic3.FixedLayout\",\"guts_\":{\"ic3Version\":8,\"grid\":10,\"boxes\":[{\"classID\":\"ic3.FixedLayoutBox\",\"guts_\":{\"ic3Version\":8,\"header\":\"widget-0 ( HTML/HTML Box )\",\"behaviour\":\"Fixed Box\",\"noPrint\":false,\"position\":{\"top\":20,\"left\":20,\"width\":530,\"height\":130}";

I compress it using the following code:

var input = new Buffer(value);
var encodeBound = LZ4.encodeBound(input.length);
var output = new Buffer(encodeBound);
var compressedSize = LZ4.encodeBlock(input, output);
output = output.slice(0, compressedSize);

Then to debug the content of the buffer I'm writing to the DOM each byte as following:

$('body').append("<div><pre>len:" + output.length + "</pre></div>");
var out = "";
for (var ii = 0; ii < output.length; ii++) {
   out += ( "\n" + output[ii] );
}
$('body').append("<div><pre>" + out + "</pre></div>");
$('body').append("<div><pre>_</pre></div>");

And to my surprise, I'm seeing one line with 268, which looks like an overflow. Am I doing something wrong here?

Decompressing this buffer in IE 9 is OK and gives back the original string; but I need to send that data over the network to save it compressed. For that purpose I'm doing an output.toString('base64') first, which I believe does not expect such a value.

Am I doing something wrong with the compressing / base64 logic ? or is that an issue with LZ4 on IE 9 ?

Thx, _marc

enquiry of encodeBlock and decodeBlock

I am trying to compress and uncompress websocket message, can you help me on the following:

  1. Does the output buffer need to be "clean" before each operation, i.e. can I reuse the buffer for every compress/uncompress operation?

  2. Can encodeBlock/decodeBlock provide 2 additional parameters: a start index and a length for the input/output buffers?

    I would like to prepend the "size" of the uncompressed message so that the server side can allocate memory more efficiently. However, prepending bytes to a "buffer" means copying the entire "buffer" (see the sketch after this list).
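
On the first point, the output buffer should not need to be zeroed between operations, since the returned size tells you how many bytes are valid. On the second, the startIdx parameters documented in the README already allow writing and reading at an offset, so a length prefix can be reserved without copying. A sketch of that idea (the 4-byte little-endian prefix is this example's own convention):

var lz4 = require('lz4')

var input = Buffer.from('some websocket message payload')

// Reserve 4 bytes for the uncompressed size, then encode right after them
var output = Buffer.alloc(4 + lz4.encodeBound(input.length))
output.writeUInt32LE(input.length, 0)
var compressedSize = lz4.encodeBlock(input, output, 4)
output = output.slice(0, 4 + compressedSize)

// Receiver: read the prefix to allocate an exact buffer, then decode the block
var uncompressed = Buffer.alloc(output.readUInt32LE(0))
var n = lz4.decodeBlock(output, uncompressed, 4, output.length) // explicit endIdx
uncompressed = uncompressed.slice(0, n)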

Poor compression ratio for large strings

I am using node-lz4 in the browser, to compress and decompress blocks. When my string to be compressed goes above roughly 65535 bytes, the compression ratio jumps from about 10% to about 250%; i.e., it is actually making my data much longer.

Is there a maximum block size limit for this implementation? I can't see it mentioned anywhere.

So, maybe this is a bug.

bug in 0.1?

Hello,
I installed v0.1 (I can't use 0.2, since I don't have an option to upgrade to node 0.10, which is a prerequisite for 0.2).
I got the error below while executing:

util.js:538
ctor.prototype = Object.create(superCtor.prototype, {
^
TypeError: Cannot read property 'prototype' of undefined
at exports.inherits (util.js:538:43)
at Object. (/home/ubuntu/node_modules/lz4/lib/decoder_stream.js:25:1)
at Module._compile (module.js:449:26)
at Object.Module._extensions..js (module.js:467:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:362:17)
at require (module.js:378:17)
at Object. (/home/ubuntu/node_modules/lz4/lib/lz4.js:14:31)
at Module._compile (module.js:449:26)

Error: Invalid stream descriptor checksum @6 on Firefox 48.0.1

The decompression used to work on all browsers, but recently, with the new update of Firefox, it stopped working with Error: Invalid stream descriptor checksum @6. There are no problems with downloading, the size of the data is correct, and decompression works on all other browsers (including the previous version of Firefox).

Any suggestions?

PS: I'm decompressing inside a worker.
Code: (working on all other browsers including mobile)

var lz4local = (function(){
self.importScripts("/configurator/public_html/libs/lz4.js");
//require is renamed to requirelz4 , but it is the same result with original naming
var _lz4Buffer = requirelz4('buffer').Buffer;
var _LZ4 = requirelz4('lz4');
    return {
        buffer: _lz4Buffer,
        lz4: _LZ4
    };
}());
self.addEventListener('message', function (e) {   
    var data = e.data;
    var xhr = new XMLHttpRequest();
    xhr.open('GET', data.url, false);
    xhr.responseType = 'arraybuffer';
    xhr.setRequestHeader("X-Requested-With", "XMLHttpRequest");
    xhr.overrideMimeType("text/plain; charset=x-user-defined");
    xhr.onload = function (e) {
        console.log("size",this.response.byteLength);
            var compressedData = new lz4local.buffer(new Uint8Array(this.response, 0, this.response.byteLength));
            var decompressedData = lz4local.lz4.decode(compressedData).toString();

            var jsondata = JSON.parse(decompressedData);
            self.postMessage(jsondata);

    };
    xhr.send(null);
}, false);

Ubuntu 14.04 - Cannot find module '../build/Release/xxhash'

Not sure how to fix this. There were no errors on npm install. I tried node-gyp rebuild and got a host of other (presumably) unrelated errors. Using node v4.4.5 with npm 2.15.5 here and can't get lz4 going like I could on osx (fwiw).

Distributor ID: Ubuntu
Description:    Ubuntu 14.04.4 LTS
Release:    14.04
Codename:   trusty
"lz4": "^0.5.2",
module.js:327
    throw err;
    ^

Error: Cannot find module '../build/Release/xxhash'
    at Function.Module._resolveFilename (module.js:325:15)
    at Function.Module._load (module.js:276:25)
    at Module.require (module.js:353:17)
    at require (internal/module.js:12:17)
    at Object.<anonymous> (/var/app/node_modules/lz4/lib/utils.js:4:11)
    at Module._compile (module.js:409:26)
    at Object.Module._extensions..js (module.js:416:10)
    at Module.load (module.js:343:32)
    at Function.Module._load (module.js:300:12)
    at Module.require (module.js:353:17)
    at require (internal/module.js:12:17)
    at Object.<anonymous> (/var/app/node_modules/lz4/lib/static.js:60:17)
    at Module._compile (module.js:409:26)
    at Object.Module._extensions..js (module.js:416:10)
    at Module.load (module.js:343:32)
    at Function.Module._load (module.js:300:12)

Invalid magic number

Hi, I'm using lz4 c library to compress a string in a c++ project, send it to a nodejs server in b64 and decompress there.
To compress the string (the buffer is coming from rapidjson: GetString() is a well-formed JSON char * and Size() is the correct size):

  int safeSize = LZ4_compressBound(buffer.Size());
  char *compressed = new char[safeSize];
  int compressedBytes = LZ4_compress(buffer.GetString(), compressed, buffer.Size());
  string ret(BASE64::base64_encode((unsigned char *) compressed, compressedBytes));
  delete [] compressed;

The b64 string received on the server perfectly matches the sent b64 string. To decompress the string in nodejs:

  var compressedData = new Buffer(req.query.d, 'base64');
  var decompressedData = LZ4.decode(compressedData);

I'm always getting Error: Invalid magic number: 7B5B74F1

I've been investigating the C library and there is nothing similar to a magic number.
Is there something to do with versions? I've seen that lz4.cli.c (used in node.js) has LZ4S_MAGICNUMBER defined and some other differences to lz4.c.
In that case, would it work if I compiled the library with the C files from node.js to use it in C++? Otherwise, how should I proceed?

Please any help is really appreciated.

Regards,
Miguel
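
For what it's worth, the C library's LZ4_compress() emits a raw block with no header, while LZ4#decode() expects the framed format and its magic number, which would explain the error. Decoding as a block may work; a hedged sketch, where knownUncompressedSize is a hypothetical value that would have to be transmitted alongside the data:

var lz4 = require('lz4')

// Raw block from LZ4_compress(), transported as base64
var compressedData = Buffer.from(req.query.d, 'base64')

// knownUncompressedSize is hypothetical: raw blocks do not carry it, so the
// C++ side would need to send buffer.Size() as well
var output = Buffer.alloc(knownUncompressedSize)
var n = lz4.decodeBlock(compressedData, output)
var decompressedData = output.slice(0, n)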

xxhash_module not found

After installing lz4 from npm, no errors are seen. When used in a project, it shows the following error:
Error: Symbol xxhash_module not found.
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:364:17)
at require (module.js:380:17)
at Object. (C:\Users\ad-ad\xx-xxx\node_modules\lz4\lib\utils.js:4:11)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:364:17)

Env - Windows 7 (64 bit), Node - v0.10.31

Is there a problem with my env or the module?

Error encoding "Hello World" in browser version

<script type="text/javascript" src="../build/lz4.js"></script>
<script type="text/javascript">
  var Buffer = require('buffer').Buffer
  var LZ4 = require('lz4')
  console.log(LZ4.encode("Hello World"))
</script>

Running that results in the following error (sorry, copy/paste doesn't seem very good for the stack trace):
Uncaught Error: specified a negative value for writing an unsigned value lz4.js:2126
assert lz4.js:2126
verifuint lz4.js:2107
writeUInt32 lz4.js:1724
Buffer.writeUInt32LE lz4.js:1739
Encoder._flush lz4.js:840
(anonymous function) lz4.js:3975
g lz4.js:2509
EventEmitter.emit lz4.js:2414
finishMaybe lz4.js:4425
endWritable lz4.js:4432
Writable.end lz4.js:4410
LZ4_compress lz4.js:621
(anonymous function) uncompress.html:10

Can not compress an empty stream

Using the example from the Readme:

var fs = require('fs')
var lz4 = require('lz4')

var encoder = lz4.createEncoderStream()

var input = fs.createReadStream('test')
var output = fs.createWriteStream('test.lz4')

input.pipe(encoder).pipe(output)

If test is a zero-size file, then the example crashes with

…/node_modules/lz4/lib/utils.js:19
                return XXH.digest(c)
                           ^

TypeError: Wrong arguments
    at TypeError (native)
    at Object.exports.streamChecksum (…/node_modules/lz4/lib/utils.js:19:14)
    at Encoder._flush (…/node_modules/lz4/lib/encoder_stream.js:210:27)
    at Encoder.<anonymous> (_stream_transform.js:118:12)
    at Encoder.g (events.js:286:16)
    at emitNone (events.js:86:13)
    at Encoder.emit (events.js:185:7)
    at prefinish (_stream_writable.js:478:12)
    at finishMaybe (_stream_writable.js:486:7)
    at endWritable (_stream_writable.js:498:3)

Bad Compression / Decompression using Pure JS version

<Buffer 7b 22 61 64 64 72 65 73 73 22 3a 22 31 32 37 2e 30 2e 30 2e 31 22 7d>
(The actual 23 bytes of data are: {"address":"127.0.0.1"})

var lz4 = require('./lz4/lib/binding'), fs = require('fs');
var data = fs.readFileSync('packet-test.txt');
var len = data.length;
var out = new Buffer(len+4096);
var inF = new Buffer(len);

var sz = lz4.compress(data ,out);
lz4.uncompress(out, inF);
res = data.toString('hex') === inF.toString('hex');
if (!res) console.log("Failed to Compress/Decompress");

This is a small packet, but in my testbed I have files sized from 2 bytes to 48M. Out of the 526 files in my compression testbed, 109 of them fail to compress<->decompress. Some of them end up like this one at 0k (compression) and some of them actually end up "bigger" than the source (hence the +4096 on my out buffer). This is using the current pull as of today (3/24/2014).

Sending browser compressed data via AJAX request (jquery)

I'm using LZ4 in the browser to compress a lengthy AJAX request parameter. I'm wondering how to encode the content of the compressed Buffer to send it as a parameter of the AJAX request? Do you have any example and/or pointer?

Thanks and congrats for your work.

encode() bombs with TypeError

This seems like it should work:

node -e 'require("lz4").encode(new Buffer("crap"));'

But it yields:

buffer.js:784
    throw TypeError('value is out of bounds');
          ^
TypeError: value is out of bounds
    at TypeError (<anonymous>)
    at checkInt (buffer.js:784:11)
    at Buffer.writeUInt32LE (buffer.js:841:5)
    at LZ4Stream._add (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/encoder.js:95:13)
    at LZ4Stream.add (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/encoder.js:73:8)
    at Object.exports.LZ4_compress [as encode] (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/encoder.js:185:7)
    at [eval]:1:16
    at Object.<anonymous> ([eval]-wrapper:6:22)
    at Module._compile (module.js:456:26)
    at evalScript (node.js:532:25)

This is apparently due to a bad bitwise OR statement at line 95 in encoder.js that produces a signed int rather than unsigned, which in turn blows an exception in buffer.js:

    blockSize.writeUInt32LE( 0x80000000 | data.length, 0, false)

However, replacing the bitwise OR with + does not fix the issue completely. When attempting to decode, you will hit a different error:

node -e 'var lz4 = require("lz4"); lz4.decode(lz4.encode(new Buffer("crap")));'

events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: Unexpected end of LZ4 stream @0
    at Decoder.emit_Error (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/decoder_stream.js:75:22)
    at Decoder.check_Size (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/decoder_stream.js:81:32)
    at Decoder.read_DataBlockData (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/decoder_stream.js:218:12)
    at Decoder._main (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/decoder_stream.js:314:25)
    at Decoder._flush (/home/stolley/viaprotect-web/www/api/node_modules/lz4/lib/decoder_stream.js:284:7)
    at Decoder.<anonymous> (_stream_transform.js:130:12)
    at Decoder.g (events.js:175:14)
    at Decoder.EventEmitter.emit (events.js:92:17)
    at finishMaybe (_stream_writable.js:354:12)
    at endWritable (_stream_writable.js:361:3)

Publish precompiled binaries

It seems like this library expects a user to have Python and some windows tools installed to compile the library. Node-sass includes some precompiled binaries for popular platforms https://www.npmjs.com/package/node-sass#rebuilding-binaries

Any chance this library can employ similar tactics to make it easier to adopt in the JS ecosystem? Many of my users don't have Python on their machines as they use Node and JS exclusively. In addition, some of our production boxes don't have the required tools either.

This library is definitely really worth the speed and performance gains over Node's built in zlib and gzip, but it might be impossible to use due to extra requirements.

Ten times slower than gzip?

I performed a few benchmarks to see how much faster lz4 would be compared to gzip when decompressing an incoming HTTP payload. I'm not sure if I'm doing something wrong, but it seems that this library is a lot slower than just using plain gzip.

I had 3 test JSON docs of different sizes, that I compressed with both LZ4 and gzip:

Uncompressed     LZ4            gzip
7,369 bytes      3,975 bytes    2,772 bytes
73,723 bytes     33,028 bytes   21,790 bytes
716,995 bytes    311,365 bytes  202,697 bytes

The LZ4 version was compressed using default options:

uncompressed.pipe(lz4.createEncoderStream()).pipe(compressed)

The gzip version was compressed from my macOS command line:

gzip uncompressed.json

I used autocannon to hammer a test HTTP server with the compressed document. The server would decompress the payload but otherwise discard it afterwards.

Here's an example of how autocannon was configured:

autocannon -i body.json.lz4 -H 'Content-Encoding: lz4' localhost:3000

And here's my test server running on localhost:

'use strict'

const http = require('http')
const zlib = require('zlib')
const lz4 = require('lz4')

const server = http.createServer(function (req, res) {
  const enc = req.headers['content-encoding'] || ''

  let decompressed
  if (/\bgzip\b/.test(enc)) {
    decompressed = req.pipe(zlib.createGunzip())
  } else if (/\blz4\b/.test(enc)) {
    decompressed = req.pipe(lz4.createDecoderStream())
  } else {
    decompressed = req
  }

  const buffers = []
  decompressed.on('data', buffers.push.bind(buffers))
  decompressed.on('end', function () {
    const data = Buffer.concat(buffers)
    res.end()
  })
})

server.listen(3000, function () {
  console.log('Server listening on http://localhost:3000')
})

Test 1 - Decompressing a 7,369 byte JSON document

LZ4 (3,975 bytes):

Stat         Avg     Stdev   Max
Latency (ms) 23.29   9.31    61.38
Req/Sec      419.7   13.36   434
Bytes/Sec    41.4 kB 1.31 kB 43 kB

4k requests in 10s, 416 kB read

Gzip (2,772 bytes):

Stat         Avg    Stdev   Max
Latency (ms) 1.07   0.67    13.48
Req/Sec      7064.4 704.93  7733
Bytes/Sec    699 kB 67.8 kB 766 kB

71k requests in 10s, 6.99 MB read

Test 2 - Decompressing a 73,723 byte JSON document

LZ4 (33,028 bytes):

Stat         Avg     Stdev  Max
Latency (ms) 23.28   8.94   55.9
Req/Sec      419.8   11.45  435
Bytes/Sec    41.8 kB 1.1 kB 43.1 kB

4k requests in 10s, 416 kB read

Gzip (21,790 bytes):

Stat         Avg    Stdev   Max
Latency (ms) 2.7    1.61    21.23
Req/Sec      3131   105.16  3342
Bytes/Sec    313 kB 13.1 kB 331 kB

31k requests in 10s, 3.1 MB read

Test 3 - Decompressing a 716,995 byte JSON document

On a large document like this the difference between gzip and lz4 is much smaller, but gzip still wins:

LZ4 (311,365 bytes):

Stat         Avg     Stdev Max
Latency (ms) 41.56   13.21 102
Req/Sec      237.6   6.95  250
Bytes/Sec    23.7 kB 819 B 24.8 kB

2k requests in 10s, 235 kB read

Gzip (202,697 bytes):

Stat         Avg     Stdev Max
Latency (ms) 26.11   6.51  137.09
Req/Sec      375.4   7.61  381
Bytes/Sec    37.5 kB 819 B 37.7 kB

4k requests in 10s, 372 kB read

Possible bug (skipping last bytes) in uncompress function due to sIdx and eIdx

Doesn't the following code cause the end bytes of the input to be skipped if sIdx is not 0?

exports.uncompress = function (input, output, sIdx, eIdx) {
        sIdx = sIdx || 0
        eIdx = eIdx || (input.length - sIdx)
        // Process each sequence in the incoming data
        for (var i = sIdx, n = eIdx, j = 0; i < n;) {
        // ...

I had trouble with this until I noticed that eIdx is (input.length - sIdx) instead of input.length, causing the last "sIdx" bytes to not be processed.

Is this correct? sIdx and eIdx are not documented in the params above the function, but still. :)
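
For reference, a default of input.length would avoid skipping the trailing bytes; a sketch of the suspected fix, mirroring the quoted snippet:

exports.uncompress = function (input, output, sIdx, eIdx) {
        sIdx = sIdx || 0
        // Default to the end of the input; subtracting sIdx here skips the
        // last sIdx bytes whenever sIdx is non-zero
        eIdx = eIdx === undefined ? input.length : eIdx
        // Process each sequence in the incoming data
        for (var i = sIdx, n = eIdx, j = 0; i < n;) {
        // ...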

"require" function is not supported in browser

I compressed my data on the server side with lz4; it is pretty impressive, and now what I want to do is decompress my data on the client side using JavaScript. I tried the examples you posted here; however, the "require" function is not supported in the browser. Do you have any suggestions? Thank you very much.

Buffer properties/fields are enumerable

All getters, setters and properties (e.g. toString, constructor) appear to be enumerable.
This is probably not what you would want or expect when looping through the Buffer indices using a for-in loop.
You can make them non-enumerable using Object.defineProperty.
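
A minimal illustration of the suggested workaround (the property name here is just an example):

// Define a method so that it stays invisible to for-in loops
var buf = { 0: 104, 1: 105, length: 2 }
Object.defineProperty(buf, 'toString', {
  value: function () { return 'hi' },
  enumerable: false
})
for (var k in buf) console.log(k) // logs 0, 1 and length only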

Data corruption with highCompression encoding (testcase attached)

This is quite inconvenient, as encoding silently produces corrupted lz4 data, which later can't be read correctly by either node-lz4 or command-line lz4.

Code:

var fs = require('fs')
var lz4 = require('lz4')
var encoder = lz4.createEncoderStream({
  highCompression: true
})
var input = fs.createReadStream('test.txt')
var output = fs.createWriteStream('test.txt.lz4')
input.pipe(encoder).pipe(output)

// this is not required, just `lz4cat test.txt.lz4` also fails
output.on('close', function () {
  var decoder = lz4.createDecoderStream()
  var inputd = fs.createReadStream('test.txt.lz4')
  var outputd = fs.createWriteStream('test.dec.txt')
  inputd.pipe(decoder).pipe(outputd)
});

test.txt is attached: test.txt

Shorter version:

var fs = require('fs')
var lz4 = require('lz4')
var encoder = lz4.createEncoderStream({
  highCompression: true
})
var decoder = lz4.createDecoderStream()
var input = fs.createReadStream('test.txt')
var output = fs.createWriteStream('/dev/null')
input.pipe(encoder).pipe(decoder).pipe(output);

Segmentation fault when encoder chunkSize = 512

I have encountered a serious problem when encoding a file with a small chunkSize.

Node.js version:
0.10.5

How to reproduce problem:

var fs = require('fs'),
  lz = require('lz4'),
  encoder = lz.createEncoderStream({
    hc: true,
    chunkSize: 512
  });

fs.createReadStream('rfc2616.txt').pipe(encoder).pipe(fs.createWriteStream('test.lz4'));

And I get an error:

segmentation fault (core dumped)  node compress.js

Use lz4 with requireJS in browser

Hi, I tried to use this lz4 version with requireJS in the browser, but I can't get it working. Although I ported the CommonJS format to AMD, it complains about missing files. Does the lz4.js in the build folder depend on xxhashjs and all the other required files? Do I have to import them, too?

uncompressed file size of high compressed file is wrong?

I use this code on a high-compression lz4 file:

let uncompressedSize = LZ4.decodeBlock(file, uncompressedFile); // error, value is less than zero

but the value is negative (below zero :) ).
Is that an error, or am I doing something wrong?

The file was compressed by liblz4-tool; I used the Ubuntu package.

sudo apt install liblz4-tool
lz4 -z -9 test.txt
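
A possible explanation: lz4 -z writes the framed format (magic number, descriptor, blocks), while LZ4#decodeBlock expects a single raw block and returns a negative value on error. Decoding the whole frame may be what's needed here; a sketch, assuming the file is a standard LZ4 frame:

var fs = require('fs')
var lz4 = require('lz4')

// The lz4 CLI produces a framed stream, so use decode() rather than decodeBlock()
var file = fs.readFileSync('test.txt.lz4')
var uncompressedFile = lz4.decode(file)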

Unable to decode mozlz4 file (even if custom magic number is removed)

Sorry for creating an issue for this. I would contact you directly if I knew how.

This SO question I created talks about everything, but a summary:

I need to read a search.json.mozlz4 file for a Firefox extension I'm porting to WebExtensions. The file uses a custom lz4 format by Mozilla that should be similar to the real deal but uses a custom magic number 'mozLz40\0' with 8 bytes (this page contains the main methods that encode and decode this format).

This Python code uses the lz4 Python binding and decodes the file simply by skipping the magic number in the input data:

import lz4
file_obj = open("search.json.mozlz4", "rb")
if file_obj.read(8) != b"mozLz40\0":
	raise InvalidHeader("Invalid magic number")
print(lz4.block.decompress(file_obj.read()))

However, after several hours I still can't find a way to do this using node-lz4. Removing 8 bytes from the input and commenting out the read and check parts in read_MagicNumber don't seem to make a difference, generating errors further down (wrong version, reserved bit set, etc).

I should note that I am reading the file in an independent way (I can't read just any file path in WebExtensions) and then passing the contents to node-lz4 methods (I tried decode and decodeBlock).

Do you have any idea how I could achieve this? I greatly appreciate any pointers you can give, even if this is not a true issue with the library, seeing that it is a semi-custom format. (Maybe I can read the custom header myself and then ask node-lz4 to decompress the data portion?)
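
Following the last idea above: the mozlz4 layout, per the Mozilla source linked in the report, is the 8-byte magic, a 4-byte little-endian uncompressed size, then a raw LZ4 block, so skipping 12 bytes and using decodeBlock may work (a hedged sketch, not verified against such a file):

var fs = require('fs')
var lz4 = require('lz4')

var file = fs.readFileSync('search.json.mozlz4')

// 8 bytes of 'mozLz40\0' magic, then a 4-byte LE uncompressed size at offset 8
var uncompressedSize = file.readUInt32LE(8)
var output = Buffer.alloc(uncompressedSize)

// The raw LZ4 block starts at offset 12; pass endIdx explicitly
var n = lz4.decodeBlock(file, output, 12, file.length)

console.log(output.slice(0, n).toString())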

Deprecated declarations

When installing the lz4 0.5.0, I get the following warnings:

> lz4@0.5.0 install /Users/username/domains/myprofile/log-encoder/node_modules/lz4
> node-gyp rebuild

  CXX(target) Release/obj.target/lz4/lib/binding/lz4_binding.o
../lib/binding/lz4_binding.cc:205:13: warning: 'LZ4_create' is deprecated: use LZ4_createStream() instead [-Wdeprecated-declarations]
  void* p = LZ4_create( Buffer::Data(input) );
            ^
../lib/binding/../../deps/lz4/lib/lz4.h:348:56: note: 'LZ4_create' has been explicitly marked deprecated here
LZ4_DEPRECATED("use LZ4_createStream() instead") void* LZ4_create (char* inputBuffer);
                                                       ^
../lib/binding/lz4_binding.cc:211:62: warning: 'LZ4_sizeofStreamState' is deprecated: use LZ4_createStream() instead [-Wdeprecated-declarations]
  Nan::MaybeLocal<Object> handle = Nan::NewBuffer((char *)p, LZ4_sizeofStreamState(), null_cb, NULL);
                                                             ^
../lib/binding/../../deps/lz4/lib/lz4.h:349:56: note: 'LZ4_sizeofStreamState' has been explicitly marked deprecated here
LZ4_DEPRECATED("use LZ4_createStream() instead") int   LZ4_sizeofStreamState(void);
                                                       ^
../lib/binding/lz4_binding.cc:261:28: warning: 'LZ4_slideInputBuffer' is deprecated: use LZ4_saveDict() instead [-Wdeprecated-declarations]
  char* input_next_block = LZ4_slideInputBuffer(Buffer::Data(lz4ds));
                           ^
../lib/binding/../../deps/lz4/lib/lz4.h:351:56: note: 'LZ4_slideInputBuffer' has been explicitly marked deprecated here
LZ4_DEPRECATED("use LZ4_saveDict() instead")     char* LZ4_slideInputBuffer (void* state);
                                                       ^
3 warnings generated.
  CC(target) Release/obj.target/lz4/deps/lz4/lib/lz4.o
  CC(target) Release/obj.target/lz4/deps/lz4/lib/lz4hc.o
  SOLINK_MODULE(target) Release/lz4.node
  SOLINK_MODULE(target) Release/lz4.node: Finished
  CXX(target) Release/obj.target/xxhash/lib/binding/xxhash_binding.o
  CC(target) Release/obj.target/xxhash/deps/lz4/lib/xxhash.o
  SOLINK_MODULE(target) Release/xxhash.node
  SOLINK_MODULE(target) Release/xxhash.node: Finished
lz4@0.5.0 node_modules/lz4
├── [email protected]
├── [email protected]
└── [email protected]

Doesn't seem to be fatal, but should be fixed.

Invalid checksum failures

I'm not entirely sure why, but when testing this library I was almost consistently getting checksum failures during decompression, which I know for a fact were false positives because the data being compressed/decompressed was also being cryptographically signed/validated to ensure integrity.

Everything is working as expected with streamChecksum set to false (which is a good enough workaround for my use case, given the aforementioned signing), so it seems like this must be an issue with the checksumming specifically, and not just a general encoding/decoding failure.

I didn't test all that exhaustively, and can't say what about my test case was special enough to cause this, but a few guesses as to the root cause:

  • A bug in handling byte offsets of typed arrays.

  • Something to do with compressing from Node.js and decompressing in a browser.

  • Issues with certain data inputs, e.g. Protocol Buffers or ~50kb png and jpeg images.

(As far as the actual error message: I thought I'd saved it somewhere, but apparently not. That said, I recall something about magic numbers, if that helps.)

Can't seem to read from python's lz4

Python's lz4 produces this when compressing 'a':

<Buffer 01 00 00 00 10 61>

Your library produces this when compressing 'a':

<Buffer 04 22 4d 18 64 70 b9 02 00 00 00 10 61 00 00 00 00 56 74 0d 55>

Neither library can decode the other's output. Is there any chance you could try to make these libraries compatible, since both are supposed to use the same compression?

See: python-lz4/python-lz4#8
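
The mismatch is likely a framing difference rather than a different compression: at the time, python-lz4 emitted a raw block prefixed with a 4-byte little-endian uncompressed size, while node-lz4's encode/decode use the framed format. The block-level API can bridge the two; a hedged sketch for reading the Python output shown above:

var lz4 = require('lz4')

// Python's output: a 4-byte LE uncompressed size, then a raw LZ4 block
var fromPython = Buffer.from([0x01, 0x00, 0x00, 0x00, 0x10, 0x61])

var size = fromPython.readUInt32LE(0)
var output = Buffer.alloc(size)
lz4.decodeBlock(fromPython, output, 4, fromPython.length)

console.log(output.toString()) // 'a'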

FR: Support Block Dependence

I am getting the following error for some lz4 files.

Error: Invalid data block: 12 @579083
    at Decoder.emit_Error (src/node_modules/lz4/lib/decoder_stream.js:64:22)
    at Decoder.uncompress_DataBlock (src/node_modules/lz4/lib/decoder_stream.js:246:9)
    at Decoder._main (src/node_modules/lz4/lib/decoder_stream.js:317:25)
    at Decoder._transform (src/node_modules/lz4/lib/decoder_stream.js:60:7)
    at Decoder.Transform._read (_stream_transform.js:167:10)
    at Decoder.Transform._write (_stream_transform.js:155:12)
    at doWrite (_stream_writable.js:301:12)
    at writeOrBuffer (_stream_writable.js:287:5)
    at Decoder.Writable.write (_stream_writable.js:215:11)
    at IncomingMessage.ondata (_stream_readable.js:536:20)

I added some debug code to the "Decoder.prototype.uncompress_DataBlock" function in decoder_stream.js:

this.descriptor.blockMaxSize 262144, decodedSize 262144
this.descriptor.blockMaxSize 262144, decodedSize -4   <<< This is where it errors out

Invalid stream checksum

Hi, I have a problem with this simple example:

var lz4 = require('lz4');

var data = new Buffer(200);
data.fill(0);

lz4.decode(lz4.encode(data));

As result

events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: Invalid stream checksum: -24E4B2DD @26
    at Decoder.emit_Error (/home/arepo/node_modules/lz4/lib/decoder_stream.js:64:22)
    at Decoder.read_EOS (/home/arepo/node_modules/lz4/lib/decoder_stream.js:268:9)
    at Decoder._main (/home/arepo/node_modules/lz4/lib/decoder_stream.js:318:25)
    at Decoder._transform (/home/arepo/node_modules/lz4/lib/decoder_stream.js:60:7)
    at Decoder.Transform._read (_stream_transform.js:179:10)
    at Decoder.Transform._write (_stream_transform.js:167:12)
    at doWrite (_stream_writable.js:225:10)
    at writeOrBuffer (_stream_writable.js:215:5)
    at Decoder.Writable.write (_stream_writable.js:182:11)
    at Decoder.Writable.end (_stream_writable.js:340:10)

My software

% uname -a
Linux work 3.16.1-1-ARCH #1 SMP PREEMPT Thu Aug 14 07:40:19 CEST 2014 x86_64 GNU/Linux

% node --version
v0.10.31

% npm list lz4
/home/arepo
└── [email protected]

Warnings at install on windows

Using Node 4.4.0

D:\Project\electron-quick-start\node_modules\lz4>if not defined npm_config_node_gyp (node "C:\Program Files (x86)\nodejs\node_modules\npm\bin\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" rebuild )  else (node "" rebuild )
Building the projects in this solution one at a time. To enable parallel build, please add the "/m" switch.
  lz4_binding.cc
..\lib\binding\lz4_binding.cc(38): warning C4267: 'initializing': conversion from 'size_t' to 'uint32_t', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(54): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(66): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(92): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(146): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(177): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(205): warning C4996: 'LZ4_create': use LZ4_createStream() instead [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
  d:\project\electron-quick-start\node_modules\lz4\lib\binding\../../deps/lz4/lib/lz4.h(348): note: see declaration of 'LZ4_create'
..\lib\binding\lz4_binding.cc(211): warning C4996: 'LZ4_sizeofStreamState': use LZ4_createStream() instead [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
  d:\project\electron-quick-start\node_modules\lz4\lib\binding\../../deps/lz4/lib/lz4.h(349): note: see declaration of 'LZ4_sizeofStreamState'
..\lib\binding\lz4_binding.cc(238): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(261): warning C4996: 'LZ4_slideInputBuffer': use LZ4_saveDict() instead [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
  d:\project\electron-quick-start\node_modules\lz4\lib\binding\../../deps/lz4/lib/lz4.h(351): note: see declaration of 'LZ4_slideInputBuffer'
..\lib\binding\lz4_binding.cc(310): warning C4267: 'initializing': conversion from 'size_t' to 'uint32_t', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(326): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(339): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
..\lib\binding\lz4_binding.cc(365): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\Project\electron-quick-start\node_modules\lz4\build\lz4.vcxproj]
  lz4.c
  lz4hc.c
  win_delay_load_hook.c
     Creating library D:\Project\electron-quick-start\node_modules\lz4\build\Release\lz4.lib and object D:\Project\electron-quick-start\node_modules\lz4\build\Release\lz4.exp
  Generating code
  Finished generating code
  lz4.vcxproj -> D:\Project\electron-quick-start\node_modules\lz4\build\Release\\lz4.node
  xxhash_binding.cc
  xxhash.c
  win_delay_load_hook.c
     Creating library D:\Project\electron-quick-start\node_modules\lz4\build\Release\xxhash.lib and object D:\Project\electron-quick-start\node_modules\lz4\build\Release\xxhash.exp
  Generating code
  Finished generating code
  xxhash.vcxproj -> D:\Project\electron-quick-start\node_modules\lz4\build\Release\\xxhash.node
[email protected] node_modules\lz4
├── [email protected]
├── [email protected]
└── [email protected]

C decoder does not accept valid offsets

I have tried to decompress the attached file (inside a .zip) with node-lz4, both with useJS and without it. With useJS it works fine; without it it does not. I suspect it is an issue with the C decoder, since I think it is valid. The offending sequence is in hex:

token: f1
literal length: 12
literal: 204f63742020332031323a33373a3437206b65726e656c3a205b20202020302e30
offset: 0100

It contains a literal of length 33 and then an offset of 01, which should be valid since it indicates to copy from the previous literal.

Thanks!

offsets.lz4.zip

Error With Firefox (only tested with v 29)

var Buffer = require('buffer').Buffer;
var LZ4 = require('lz4');

LZ4.decode( LZ4.encode('test') )
Error: Invalid version: 0 != 1 @4

Works seemingly everywhere else (IE, Chrome, etc)

Wrong browser-field config in package.json?

Hi,
after building with Angular 6, which uses webpack internally, we receive ReferenceErrors at runtime because lz4 is trying to use Buffer.
We were able to narrow down the issue: even though there is a browser version, it is not used by webpack while bundling.
After we changed the browser field in package.json to "browser": "./build/lz4.js", everything works fine.
There is a PR for that.
