
concat-stream's Introduction

concat-stream

Writable stream that concatenates all the data from a stream and calls a callback with the result. Use this when you want to collect all the data from a stream into a single buffer.


description

Streams emit many buffers. If you want to collect all of those buffers and, when the stream ends, concatenate them into a single buffer, then this is the module for you.

Only use this if you know you can fit all of the output of your stream into a single Buffer (e.g. in RAM).

There are also objectMode streams that emit things other than Buffers, and you can concatenate these too. See below for details.

Related

concat-stream is part of the mississippi stream utility collection which includes more useful stream modules similar to this one.

examples

Buffers

var fs = require('fs')
var concat = require('concat-stream')

var readStream = fs.createReadStream('cat.png')
var concatStream = concat(gotPicture)

readStream.on('error', handleError)
readStream.pipe(concatStream)

function gotPicture(imageBuffer) {
  // imageBuffer is all of `cat.png` as a node.js Buffer
}

function handleError(err) {
  // handle your error appropriately here, e.g.:
  console.error(err) // print the error to STDERR
  process.exit(1) // exit program with non-zero exit code
}

Arrays

var write = concat(function(data) {})
write.write([1,2,3])
write.write([4,5,6])
write.end()
// data will be [1,2,3,4,5,6] in the above callback

Uint8Arrays

var write = concat(function(data) {})
var a = new Uint8Array(3)
a[0] = 97; a[1] = 98; a[2] = 99
write.write(a)
write.write('!')
write.end(Buffer.from('!!1'))

See test/ for more examples

methods

var concat = require('concat-stream')

var writable = concat(opts={}, cb)

Return a writable stream that will fire cb(data) with all of the data that was written to the stream. Data can be written to writable as strings, Buffers, arrays of byte integers, and Uint8Arrays.

By default concat-stream will give you back the same data type as the type of the first buffer written to the stream. Use opts.encoding to set what format data should be returned as, e.g. if you don't want to rely on the built-in type checking or for some other reason.

  • string - get a string
  • buffer - get back a Buffer
  • array - get an array of byte integers
  • uint8array, u8, uint8 - get back a Uint8Array
  • object - get back an array of Objects

If you don't specify an encoding, and the types can't be inferred (e.g. you write things that aren't in the list above), it will try to concatenate them into a Buffer.

If nothing is written to writable then data will be an empty array [].

error handling

concat-stream does not handle errors for you, so you must handle errors on whatever streams you pipe into concat-stream. This is a general rule when programming with node.js streams: always handle errors on each and every stream. Since concat-stream is not itself a stream it does not emit errors.

We recommend using end-of-stream or pump for writing error tolerant stream code.

license

MIT LICENSE

concat-stream's People

Contributors

andersdjohnson, brianloveswords, daxxog, hackergrrl, jasnell, jmannanc, kevva, landau, ljharb, mafintosh, max-mapper, parshap, popomore, rvagg, shinnn, tonistiigi, trevorah, zertosh


concat-stream's Issues

Callback never fired if read stream emits an error

I noticed that the callback is never fired if for example a readable stream that pipes to it emits an error. This is somewhat unexpected behaviour and I'm not sure if it's by design or a bug.

e.g.:

const { Readable } = require('stream')
const concat = require('concat-stream')
const MeterStream = require('meterstream')

const arrToReadStream = (arr) => (
  new Readable({
    read() {
      const buf = arr.shift()
      process.nextTick(() => {
        if (typeof buf === 'undefined') {
          return this.push(null)
        }
        this.push(buf)
      })
    },
  })
)

const rs = arrToReadStream([Buffer.alloc(10, 'a'), Buffer.alloc(90, 'b'), Buffer.alloc(1, 'c')])
rs.pipe(new MeterStream(100))
  .on('error', (err) => {
    t.ok(err instanceof MeterStream.OverflowError)
  })
  // never fired....
  .pipe(concat(console.log))

This is somewhat unexpected because if you pipe to process.stdout or listen for .on('data') events, some data is piped before the error is fired. Also, from what I understand, Node.js streams that are being piped to don't typically have a way to know whether an error happened upstream.

New encoding auto-detection + streams2 change problem case

I'm not sure this is a big deal, but the encoding inference can cause issues when using objectMode streams and not specifying that the encoding is objects to concat-stream.

I found this in one of my tests when I updated this dependency:

var spigot = require("stream-spigot")
var concat = require("concat-stream")

spigot({objectMode: true}, [1, 2, 3, 4, 5])
  .pipe(concat(function (c) { console.log(c) }))

// <Buffer 31 32 33 34 35>

Using {encoding: "object"} fixes this, but where people may be relying on the auto-detection (e.g. legacy concat-stream invocations) they could end up in trouble.

Stream empty for concatenated passthrough streams

This is a very specific example, and I have seen PassThrough streams work with concat-stream, just not in this case:

var Stream = require("stream")
var CombinedStream = require('combined-stream')


var concatStreams = function(a,b) {
    var combinedStream = CombinedStream.create()
    combinedStream.append(a)
    combinedStream.append(b)

    return combinedStream
}
var stringToStream = function(s) {
    var a = new Stream.PassThrough()
    a.write(s)
    a.end()
    return a
}

concatStreams(stringToStream("a"), stringToStream("b")).pipe(process.stdout)

Expected output: ab

Actual output: nothing

In node versions 0.10.25 and 0.10.31 (the current latest stable), the output is blank. In 0.10.29, the expected output happens. I'm not sure if this is a regression in node, or undefined behavior of concat-stream, but it's really annoying.

How big is the difference from 1.6.* to 2.* ?

One of the dependencies of 1.6.* has a security vulnerability that GitHub constantly warns about. It's not in 2.*; the vulnerable dependency is disparity.

Unfortunately, the fix to diff was incorrectly applied to disparity as a new major, instead of as a minor, meaning its downstreams aren't updating, so the "fix" isn't in place.

Can a user of 1.6.* use 2.*?

Could ... could I talk you into patching and publishing a new 1.6? It's just a version bump, and nyc / ava are throwing security faults on this.

concat-stream - buffer object in callback is empty.

See the following piece of code.

var concat = require('concat-stream')
var src = // some stream which contains non-utf-8 content, e.g. shift_jis
var dest = // the destination stream
var concatStream = concat(function (buffer) {
  // buffer is an empty string here; it is not readable while debugging either.
  // I want to make some modifications to this buffer.
  dest.write(buffer)
  dest.end()
})
src.pipe(concatStream)

In the browser I see the response coming back, but none of the modifications were made to the content. If I try to debug, buffer is an empty string in the callback.

CI

I'd like to see whether this project is currently passing its tests.

Buffer Overhead Vulnerability

Node Security Project has released a Buffer Overhead Vulnerability advisory for this project, please see: https://nodesecurity.io/advisories/142 (CVSS: high priority)

Overview

concat-stream is a writable stream that concatenates all the data from a stream and calls a callback with the result. By supplying a numeric value to the write function, in certain circumstances it is possible to cause a buffer overread which will read arbitrary memory.

Remediation

Consider using the --zero-fill-buffers command line argument to zero out buffers before using them. Avoid passing numeric values to the write function.

References

https://nodejs.org/api/buffer.html#buffer_the_zero_fill_buffers_command_line_option

add support for node 0.6.0?

I reviewed the code and didn't see anything too 0.8.x specific; is there any reason we can't relax the engine dependency to support 0.6.0? This is breaking my browserify installation because something in the dependency chain relies on this module, and it randomly became a problem for me.

Please add a copy of the MIT license

Hi there, it would be great from a distribution packaging point of view if you could include a copy of the MIT license with your software, usually in a file called LICENSE.

Thanks!

introduce Readable and Duplex concat streams

I wish to be able to do something like this:

concat(input_string) // or `concat([input_string])`
  .pipe(someTransform())
  .pipe(concat(function (data) {
    // ...
  }))

This is basically using concat-stream to do the reverse (getting a stream from a single piece of data) of what it already does (getting a single piece of data from a stream).

Going even further... why not have duplex concat-streams?

var input = getReadableStream().pipe(concat());
input.once('data', function(data) {
  // do something with [complete] input data
});
var transformed = input.pipe(getTransformStream()).pipe(concat());
transformed.once('data', function(data) {
  // do something with [complete] transformed data
});
transformed.pipe(getWritableStream());

Switch to streams2

I think this module would be a great example of streams2 code. It'd be pretty simple and would give a few minor benefits like correct backpressure and 'finish' events. I'd assume we'd release this as version 2.0 to avoid breaking any consumers of 1.0.

I'd be happy to do the work for this in a pull request if you're up for it. Just wanted to make sure. Otherwise I'll probably give it a shot myself and release [email protected], which is kind of a sucky name compared to [email protected] :P.

Crash mixing Buffers and strings

Here is a funny crash, which I've only been able to reproduce in conjunction with combined-stream:

var concat = require('concat-stream')
var CombinedStream = require('combined-stream')

var stream = CombinedStream.create()
stream.append(new Buffer('foo'))
stream.append('bar')
stream.pipe(concat(function() { }))

crashes like so:

buffer.js:496
    buf.copy(buffer, pos);
        ^
TypeError: Object bar has no method 'copy'
    at Function.Buffer.concat (buffer.js:496:9)
    at ConcatStream.getBody (/home/ubuntu/tmp/node_modules/concat-stream/index.js:37:19)
    at ConcatStream.end (/home/ubuntu/tmp/node_modules/concat-stream/index.js:43:36)
    at CombinedStream.onend (stream.js:79:10)
    at CombinedStream.EventEmitter.emit (events.js:117:20)
    at CombinedStream.end (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:140:8)
    at CombinedStream._getNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:73:10)
    at CombinedStream._pipeNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:106:8)
    at CombinedStream._getNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:78:10)
    at CombinedStream._pipeNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:106:8)

I'm a bit of a Node newbie, and I haven't been able to track this down further. Am I doing something wrong? Or is this a bug in the combined-stream library?

handling errors

Would be nice to make it a standard node callback, passing through any errors. Perhaps rstream.pipe(concat.withError(function(err, data) { /* */ }))

If not, it'd be good to document that people should handle them. Especially as it's used in NodeSchool, so it's exposed to lots of new people! :)

Happy to do either of these things if you agree.

Use standard, error-first callbacks

Drawn from #15. Let's discuss in a separate issue.

jonathanong on Dec 1, 2013

callback(err, data) - it's a callback, not an event listener, so imo it should have err as the first argument. however, err should always be null since concat-stream should never have any errors (i get #6 (comment)) unless we decide to throw errors when there are crazy typing issues. we can do crazy stuff like check listener.length but i'm not a fan of that either.

substack on Dec 23, 2013

cb(err, data) is annoying if there isn't ever an error. Why not just omit that parameter like it presently is?

jeromew on Jan 13, 2014

[...] don't understand "concat-stream should never have any errors" because shouldn't concat-stream handle the errors of the underlying streams to report via cb(err, data)?

zenflow on June 29, 2015

If you pipe a readable stream to this or any other writable stream (with Readable.prototype.pipe), errors will not be piped downstream along with the data.
When using this library, you must handle upstream errors yourself.
Besides any upstream errors, there are no other errors to expect.

Cannot pipe. Not readable.

Here's the point:

var concat = require('concat-stream');

process.stdin
    .pipe(concat(function(buff) {
       // do smth
    }))
    .pipe(process.stdout);

It gives the following:

Error: Cannot pipe. Not readable.
    at ConcatStream.Writable.pipe (.../node_modules/concat-stream/node_modules/readable-stream/lib/_stream_writable.js:142:22)

Why is chaining broken in this case? Does concat-stream support this?

Convert to real stream

The doc says

Since concat-stream is not itself a stream it does not emit errors

Making concat-stream a real stream would make it trivial to get the errors down from the pipeline. It could emit a data event when it's ready.

How to handle errors without err callback argument

The following code:

var concat = require('concat-stream')
var through = require('through')

var stream = through().pause()
stream.pipe(concat(function(data) {}))
stream.emit('error')

throws

stream.js:94
      throw er; // Unhandled stream error in pipe.
            ^
undefined

Before d530532, I believe there would have been an err argument to the callback.

It seems a bit surprising to me to have no err argument to the callback. Is there a nice way to handle errors now? Or should we bring the err argument back maybe?

v2 ideas

you asked to help maintain this library, so here are some thoughts:

  • streams2 implementation, shouldn't be hard with readable-stream.
  • i don't like the type inferencing in getBody. prefer if it were like .pipe(cat.string()), .pipe(cat.buffer()), etc. by default, .pipe(cat()) should just return an array of all the things. developers should know the type of source stream and how they wish to consume the data.
  • delegate specific use-cases to other libraries. for example, if you want to concat to a string, just use raw-body (if we get .pipe support). overkill, but i don't feel like reimplementing the stream decoder stuff.
  • add cat(stream, callback), basically just stream.pipe(this)
  • callback(err, data) - it's a callback, not an event listener, so imo it should have err as the first argument. however, err should always be null since concat-stream should never have any errors (i get #6 (comment)) unless we decide to throw errors when there are crazy typing issues. we can do crazy stuff like check listener.length but i'm not a fan of that either.
  • don't re-emit data (#11). it doesn't handle back pressure so it's a terrible idea. people should just pipe to both cat-stream and the final stream.
  • yield stream.pipe(cat()) support. not sure how to do that yet - maybe duck type it into a promise.

git tags

I've been playing around with some git & node experiments and I noticed there aren't git tags for any of the released versions of concat-stream, any chance you could tag/start tagging releases?
