
node-lzo's Introduction

node-lzo

Node.js Bindings for LZO Compression

Example

const lzo = require('lzo');

console.log('Current version:', lzo.version, '-', lzo.versionDate);

let str = 'Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam',
    compressed = lzo.compress(str);

console.log('Original Length:', str.length, '-- Compressed length:', compressed.length);

let decompressed = lzo.decompress(compressed);

console.log('Decompressed Length:', decompressed.length);
console.log(decompressed.toString());

Properties

version

The version of LZO being used.

versionDate

The date on which the version was released.

errors

An object containing the lzo error codes as seen below.

Methods

compress(data, length)

If data is not a Buffer, the function will try to convert it via Buffer.from. If you specify a length, the function will allocate that much memory for the compressed data.
Returns the compressed data as a Buffer.

decompress(data, length)

If data is not a Buffer, the function will try to convert it via Buffer.from. If you specify a length, the function will allocate that much memory for the decompressed data. I suggest doing so whenever you know the length.
Returns the decompressed data as a Buffer.

Errors

 Code   Description
  -1    LZO_E_ERROR
  -2    LZO_E_OUT_OF_MEMORY
  -3    LZO_E_NOT_COMPRESSIBLE
  -4    LZO_E_INPUT_OVERRUN
  -5    LZO_E_OUTPUT_OVERRUN
  -6    LZO_E_LOOKBEHIND_OVERRUN
  -7    LZO_E_EOF_NOT_FOUND
  -8    LZO_E_INPUT_NOT_CONSUMED
  -9    LZO_E_NOT_YET_IMPLEMENTED
 -10    LZO_E_INVALID_ARGUMENT
 -11    LZO_E_INVALID_ALIGNMENT
 -12    LZO_E_OUTPUT_NOT_CONSUMED
 -99    LZO_E_INTERNAL_ERROR
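To make the table actionable in code, here is a hypothetical helper (not part of the module's API) that maps a raw return code back to its name, using only the codes listed above:

```javascript
// Error codes as listed in the table above.
const LZO_ERRORS = {
  '-1': 'LZO_E_ERROR',
  '-2': 'LZO_E_OUT_OF_MEMORY',
  '-3': 'LZO_E_NOT_COMPRESSIBLE',
  '-4': 'LZO_E_INPUT_OVERRUN',
  '-5': 'LZO_E_OUTPUT_OVERRUN',
  '-6': 'LZO_E_LOOKBEHIND_OVERRUN',
  '-7': 'LZO_E_EOF_NOT_FOUND',
  '-8': 'LZO_E_INPUT_NOT_CONSUMED',
  '-9': 'LZO_E_NOT_YET_IMPLEMENTED',
  '-10': 'LZO_E_INVALID_ARGUMENT',
  '-11': 'LZO_E_INVALID_ALIGNMENT',
  '-12': 'LZO_E_OUTPUT_NOT_CONSUMED',
  '-99': 'LZO_E_INTERNAL_ERROR',
};

// Hypothetical helper: translate a numeric return code into its name.
function errorName(code) {
  return LZO_ERRORS[String(code)] || 'UNKNOWN';
}

console.log(errorName(-6)); // LZO_E_LOOKBEHIND_OVERRUN
```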

node-lzo's People

Contributors

mattolson, schroffl


node-lzo's Issues

LZO implementation is GPL-licensed

Hi there,

I noticed that node-lzo is licensed under the MIT license, but the LZO implementation used is GPL. Wouldn't that make node-lzo GPL-licensed as well?

Buffer question

Apologies in advance as I think this is a question that will just help my understanding rather than an issue (I think).

I have a createReadStream instantiated from the 'webhdfs' package. I can read in a stream of json coming from this filesystem no problem, via the chunk method, and I can serve up the full contents of that file via express endpoint. All good. The problem is when I have an lzo compressed file on that file system.

I've tried a few different ways, and depending on how I code it I either get a segmentation fault, or it complains that the buffer has to be an array, or it just shows the raw buffer output.

The most current, which I feel is close to what I'm supposed to be doing, looks like this:

var remoteFileStream = hdfs.createReadStream('/user/me/telemetry_hdfs.'+req.params.hdfsfile);

var data = new Buffer();

  remoteFileStream.on('data', function onChunk (chunk) {
    // Concatenate the chunks into the buffer
    data = Buffer.concat([data, chunk]);
    console.log(data);
 })

Then I'd go on to decompress using lzo when the full data has been received.

The problem is I can't get past this error:

TypeError: First argument must be a string, Buffer, ArrayBuffer, Array, or array-like object.
    at fromObject (buffer.js:280:9)
    at Function.Buffer.from (buffer.js:106:10)
    at new Buffer (buffer.js:85:17)
    at /home/centos/telemetry/new_hdfstest.js:36:14
    at Layer.handle [as handle_request] (/home/centos/telemetry/node_modules/express/lib/router/layer.js:95:5)
    at next (/home/centos/telemetry/node_modules/express/lib/router/route.js:137:13)
    at Route.dispatch (/home/centos/telemetry/node_modules/express/lib/router/route.js:112:3)
    at Layer.handle [as handle_request] (/home/centos/telemetry/node_modules/express/lib/router/layer.js:95:5)
    at /home/centos/telemetry/node_modules/express/lib/router/index.js:281:22
    at param (/home/centos/telemetry/node_modules/express/lib/router/index.js:354:14)
TypeError: First argument must be a string, Buffer, ArrayBuffer, Array, or array-like object

If I instantiate the Buffer with an empty string inside, I don't error out, so I know the bare new Buffer() call is the problem. Here is what I'm talking about:

  var data = new Buffer('');  /// <<<< ------ HERE
  var rawchunk;

  var chunklength=0;

  remoteFileStream.on('data', function onChunk (chunk) {
    // Do something with the data chunk
    chunklength += chunk.length;
    console.log(chunklength);

    data = Buffer.concat([data, chunk]);
  })

But then when I console out what "data" is after the full buffer is finished, it appears I only have 1 of the buffer chunks, though I know it fully completes. Here is what the output looks like:

....
18518867
18584403
18649939
18715475
18781011
18846547
18912083
18977619
19043155
19108691
19174227
19239763
19305299
19370835
19436371
19501907
19567443
19632979
19698515
19764051
19829587
19895123
19960659
20026195
20091731
20157267
20222803
20288339
20353875
20419411
20428832
<Buffer 89 4c 5a 4f 00 0d 0a 1a 0a 10 20 20 30 09 40 01 05 03 00 00 0d 00 00 00 00 59 23 90 aa 00 00 00 00 00 1e e1 02 96 00 04 00 00 00 00 65 39 5f fb 32 25 ... >
read is fully complete

The code snippet that produced that output, including the on('finish') method was this:

  var data = new Buffer('');
  var chunklength=0;

   remoteFileStream.on('data', function onChunk (chunk) {
    // Do something with the data chunk

    chunklength += chunk.length;
    console.log(chunklength);
    data = Buffer.concat([data, chunk]);
   })

  remoteFileStream.on('finish', function onFinish () {
    // Read is done
    console.log(data)
    console.log('read is fully complete');
  });

I was thinking I would put the lzo.decompress() in that last on('finish') method, but the fact that console.log(data) only produces 1 line makes me think there is something wrong. I get a segmentation fault when I try to decompress it! Here is the code where I'm trying to do the actual lzo.decompress():

remoteFileStream.on("finish", function onFinish () {
    // Read is done
    console.log('read is fully complete')
    var decompressed = lzo.decompress(data)
  });

I've also tried doing a .pipe(lzo.decompress()) at the end of the hdfs.createReadStream(), and that fails in other ways. It would be cool to figure out how to do it like that but I feel like I should at least be able to figure out this buffer approach. I know this is not an "issue" with the LZO package rather more of a clarification of how it can be used with a buffer, and helping me understand the buffer in general, so I appreciate the patience in advance!!

Thanks!
Chris

npm install lzo failing on windows

Hello,
I am trying to install lzo package in electron framework. I am trying to build electron app on windows 10 64 bit edition.
At first, the unsuccessful installation threw an error suggesting I run node-gyp rebuild, so I did that.

But even after that, I kept getting the same error, with error code ELIFECYCLE.

attaching npm debug log for further assessment ... your help will be appreciated.

2017-10-31T21_46_16_762Z-debug.log
2017-10-31T21_10_08_700Z-debug.log

Note: I got the below mentioned message on command prompt(terminal)
MSBUILD : error MSB3428: Could not load the Visual C++ component "VCBuild.exe". To fix this, 1) install the .NET Framework 2.0 SDK, 2) install Microsoft Visual Studio 2005 or 3) add the location of the component to the system path if it is installed elsewhere. [C:\Users\Niketan\AppData\Roaming\npm\node_modules\lzo\build\binding.sln]

It's not possible to get Visual Studio 2005.

Streaming

I am trying to wrap this in a Transform stream as I am dealing with files that are multiple GB in size.

const lzo = require('lzo');
const Transform = require('stream').Transform;

class LzoTransform extends Transform {
    constructor(options) {
        options = options || {};
        super(options);
    }

    _transform(chunk, encoding, callback) {
        this.push(lzo.decompress(chunk));
        setImmediate(callback);
    }
}

const fs = require('fs');
console.info('Starting');
const readStream = fs.createReadStream('file.txt.lzo', { encoding: null });
const writeStream = fs.createWriteStream('file.txt');

writeStream.on('finish', () => {
    console.info('Finished');
    process.exit(0);
});

readStream.pipe(new LzoTransform()).pipe(writeStream);

Error: Decompression failed with code: LZO_E_LOOKBEHIND_OVERRUN

I suspect that this LZO library may not be able to decompress arbitrary chunks of a file and must do the whole thing at once? Any thoughts?

Thanks!

Chris

Hard crash (segfault) when decompressing data

const lzo = require('lzo');
const str = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
const str_ = lzo.decompress(lzo.compress(str));
console.log(str_);

The above example reliably triggers a crash:

→ node boom.js     
zsh: segmentation fault (core dumped)  node boom.js
*** Error in `/usr/bin/nodejs': free(): invalid next size (fast): 0x0000000004a1d830 ***

node-gyp rebuild error

Hi,

I am trying to install parquetjs and keep running into install errors.
I did use npm install --global --production windows-build-tools successfully,
using now Win10, node v6.11.3 and
{ npm: '3.10.10',
ares: '1.10.1-DEV',
http_parser: '2.7.0',
icu: '58.2',
modules: '48',
node: '6.11.3',
openssl: '1.0.2l',
uv: '1.11.0',
v8: '5.1.281.107',
zlib: '1.2.11' }

The error log is attached below:
parquetjs_debug.log

Devtools was disconnected from the page. Lzo crashes.

Hello,
I am trying to decompress binary data coming over the LAN. The data is LZO-compressed on the server side, but lzo.decompress() fails to decompress it, and the page crashes with "DevTools was disconnected".

Can this library compress and decompress binary data?

@schroffl I know you said not to use in critical operations, but I have no other option as data is in LZO compressed format so have to use this library.


Browserify and fs dependency

Hello,

I am trying to use this module in a cordova app using browserify, and am running into an issue with the fs module. This is what I am trying to do:

$ cat lzo.js
var LZO = require('lzo');
console.log(LZO);

$ browserify lzo.js -o lzo-bundle.js

The generated lzo-bundle.js has a dependency on fs which is causing problems at run time.

/**
 * Module dependencies.
 */

var fs = require('fs')
  , path = require('path')
  , join = path.join
  , dirname = path.dirname
  , exists = fs.existsSync || path.existsSync

Errors:

Uncaught TypeError: exists is not a function
    at Function.getRoot (lzo-bundle.js:2493)
    at bindings (lzo-bundle.js:2395)
    at Object.<anonymous> (lzo-bundle.js:2513)
    at Object.9.bindings (lzo-bundle.js:2579)
    at s (lzo-bundle.js:1)
    at lzo-bundle.js:1
    at Object.7.lzo (lzo-bundle.js:2336)
    at s (lzo-bundle.js:1)
    at e (lzo-bundle.js:1)
    at lzo-bundle.js:1

I'd appreciate any help in sorting this out.

Thanks!

decompress doesn't return a value if the input is bigger than some arbitrary size

Hello,

I noticed when decompressing that it is somewhat random whether the data will be returned or the app will quit without any message.

When I run this code, which is a modification of the readme code, the output is odd: it's incomplete, and beyond that there is clearly data missing.

const lzo = require('lzo')

function log( ...args ){
	console.log( args.reduce( ( m, arg ) => {
		arg = arg.toString().slice( 0, 10 )
		m += ( arg + ' '.repeat( 10 - arg.length ))
		return m
	}, '' ) )
}

console.log( 'o: Original data\nc: Compressed data\ndc:Decompressed data' )
log( 'o length', 'c length', 'dc length', 'equal size', 'o slice matches dc' )

function test( len ){
	const str = '0123456789'.repeat( len )
	const compressed = lzo.compress(str)
	const decompressed = lzo.decompress(compressed)
	log( str.length, compressed.length, decompressed.length, str == decompressed, str.slice(0,decompressed.length) == decompressed )
}

for( let i = 0; i < 10; i++ ){
	test( i )
}
for( let i = 10; i <= 100; i += 10 ){
	test( i )
}

output:

>node testDecompress.js
o: Original data
c: Compressed data
dc:Decompressed data
o length  c length  dc length equal sizeo slice ma
0         3         0         true      true
10        14        10        true      true
20        24        20        true      true
30        34        30        true      true
40        36        40        true      true
50        39        50        true      true
60        33        60        true      true
70        35        70        true      true
80        37        80        true      true
90        40        90        true      true
100       33        99        false     true
200       37        111       false     true
300       33        99        false     true
400       38        114       false     true
>

Is the data length restricted to 99 for decompression?
What causes the unexpected exit? Is the code internally going async?

Running this app multiple times shows different output every time; sometimes it gets as far as logging 100, sometimes it finishes, but most of the time it exits somewhere in between.

Worker Threads: "Module did not self-register"

When using the new experimental worker threads in Node 10.5 and above, if you require lzo in the worker, the following error is thrown:

Error: Module did not self-register.
    at Object.Module._extensions..node (internal/modules/cjs/loader.js:718:18)
    at Module.load (internal/modules/cjs/loader.js:599:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:538:12)
    at Function.Module._load (internal/modules/cjs/loader.js:530:3)
    at Module.require (internal/modules/cjs/loader.js:637:17)
    at require (internal/modules/cjs/helpers.js:20:18)
    at bindings (/Users/matt.olson/workspace/storefront-renderer/node_modules/bindings/bindings.js:76:44)
    at Object.<anonymous> (/Users/matt.olson/workspace/storefront-renderer/node_modules/lzo/index.js:3:36)
    at Module._compile (internal/modules/cjs/loader.js:689:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:700:10)

This seems to be related to nodejs/node#21783 and nodejs/node#21481. From the comments, it appears that this is a special case that authors of native addons need to handle. I'm really keen on getting this to work. Let me know if it's already on your radar or if you'd like some help with it. Thanks!

Issue when running npm install on macOS

When running NPM install on macOS, with node-lzo as a dependency, the following error occurs.

Node version v8.10.0, macOS Mojave

The issue can be worked around by adding "lzo": "0.4.7" as a dependency of my project.

> node-gyp rebuild

  CXX(target) Release/obj.target/node_lzo/lib/lzo.o
../lib/lzo.cc:120:1: error: C++ requires a type specifier for all declarations
NODE_MODULE_INIT(/* exports, module, context */) {
^
../lib/lzo.cc:121:10: error: use of undeclared identifier 'exports'
    Init(exports, context);
         ^
../lib/lzo.cc:121:19: error: use of undeclared identifier 'context'
    Init(exports, context);
                  ^
3 errors generated.
make: *** [Release/obj.target/node_lzo/lib/lzo.o] Error 1
gyp ERR! build error
gyp ERR! stack Error: `make` failed with exit code: 2
gyp ERR! stack     at ChildProcess.onExit (/Users/thecjgcjg/.nvm/versions/node/v8.10.0/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:262:23)
gyp ERR! stack     at emitTwo (events.js:126:13)
gyp ERR! stack     at ChildProcess.emit (events.js:214:7)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:198:12)
gyp ERR! System Darwin 18.0.0
gyp ERR! command "/Users/thecjgcjg/.nvm/versions/node/v8.10.0/bin/node" "/Users/thecjgcjg/.nvm/versions/node/v8.10.0/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
gyp ERR! cwd /Users/thecjgcjg/Sites/project/Main Stack/node_modules/lzo
gyp ERR! node -v v8.10.0
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok

How to find lzo.node binding file??

Hi, I installed this with a simple npm install and followed the example, but even if I print the version it fails with this error:
`/home/centos/telemetry/node_modules/bindings/bindings.js:91
throw err
^

Error: Could not locate the bindings file. Tried:
→ /home/centos/telemetry/node_modules/lzo/build/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/build/Debug/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/build/Release/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/out/Debug/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/Debug/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/out/Release/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/Release/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/build/default/node_lzo.node
→ /home/centos/telemetry/node_modules/lzo/compiled/7.10.0/linux/x64/node_lzo.node
at bindings (/home/centos/telemetry/node_modules/bindings/bindings.js:88:9)
at Object.<anonymous> (/home/centos/telemetry/node_modules/lzo/index.js:3:32)
at Module._compile (module.js:571:32)
at Object.Module._extensions..js (module.js:580:10)
at Module.load (module.js:488:32)
at tryModuleLoad (module.js:447:12)
at Function.Module._load (module.js:439:3)
at Module.require (module.js:498:17)
at require (internal/module.js:20:19)
at Object.<anonymous> (/home/centos/telemetry/new_hdfstest.js:1:75)`

Is a node_lzo.node file supposed to be created, or do I have to create one manually?

FWIW, when I try it here, just clicking Run gives the same error:
https://npm.runkit.com/lzo

Thanks,
Chris

heap buffer overflow vulnerability

In index.js:36 you allow the user to specify the size of the target buffer:

'compress': (input, length) => {
  // ...
  let output = Buffer.alloc(length || (input.length + (input.length / 16) + 64 + 3));
  result = lzo.compress(input, output);
  // ...    
}

Later, you pass a pointer to this buffer to the lzo1x_1_compress function in lzo.cc:21:

int result = compress(  (unsigned char *) node::Buffer::Data(inputBuffer),
                        (unsigned char *) node::Buffer::Data(outputBuffer),
                        input_len,
                        output_len );

int compress(const unsigned char *input, unsigned char *output, lzo_uint in_len, lzo_uint& out_len) {
  if (lzo_init() != LZO_E_OK)
    return ERR_INIT_FAILED;

  return lzo1x_1_compress(input, in_len, output, &out_len, wrkmem);
}

The call makes sure to forward the correct length of the output buffer in the out_len (4th) argument. However, from a very quick reading of the source of the lzo1x_1_compress function, it seems that this argument is not checked or used as an input bounds parameter, but merely as an output parameter: https://github.com/schroffl/node-lzo/blob/master/lib/minilzo209/minilzo.c#L4791

I'm not 100% positive that there isn't a check that I have missed (the c code in minilzo is very dense and re-uses identifiers a lot), so I might be mistaken, but it seems like this allows users to write arbitrary data beyond the bounds of the output buffer, potentially creating a security vulnerability.
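For callers, the practical mitigation until this is fixed is to never pass a length smaller than the worst-case output size. A sketch of that bound, matching the default allocation formula quoted from index.js above (with rounding added here, since Buffer.alloc needs an integer):

```javascript
// Worst-case output size for LZO1X compression of n input bytes,
// per the default in index.js: n + n/16 + 64 + 3.
// Passing a smaller `length` to compress() risks the overflow
// described in this issue.
function lzoCompressBound(n) {
  return Math.ceil(n + n / 16 + 64 + 3);
}

console.log(lzoCompressBound(1024)); // 1155
```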
