max-mapper / concat-stream
writable stream that concatenates strings or data and calls a callback with the result
License: MIT License
I noticed that the callback is never fired if, for example, a readable stream that pipes to it emits an error. This is somewhat unexpected behaviour, and I'm not sure whether it's by design or a bug.
e.g.:
const { Readable } = require('stream')
const concat = require('concat-stream')
const MeterStream = require('meterstream')
const arrToReadStream = (arr) => (
new Readable({
read() {
const buf = arr.shift()
process.nextTick(() => {
if (typeof buf === 'undefined') {
return this.push(null)
}
this.push(buf)
})
},
})
)
const rs = arrToReadStream([Buffer.alloc(10, 'a'), Buffer.alloc(90, 'b'), Buffer.alloc(1, 'c')])
rs.pipe(new MeterStream(100))
.on('error', (err) => {
t.ok(err instanceof MeterStream.OverflowError)
})
// never fired....
.pipe(concat(console.log))
This is somewhat unexpected because if you pipe to process.stdout or listen for 'data' events, some data is piped through before the error fires. Also, from what I understand, Node.js streams that are being piped to don't typically have a way to know whether an error happened upstream.
process.stdin.pipe(concat(function(data) {})).pipe(process.stdout);
Hi there, it would be great from a distribution packaging point of view if you could include a copy of the MIT license with your software, usually in a file called LICENSE.
Thanks!
Node Security Project has released a Buffer Overread vulnerability advisory for this project, please see: https://nodesecurity.io/advisories/142 (CVSS high priority).
Overview
concat-stream is a writable stream that concatenates all the data from a stream and calls a callback with the result. By supplying a numeric value to the write function, in certain circumstances it is possible to cause a buffer overread which will read arbitrary memory.
Remediation
Consider using the --zero-fill-buffers command line argument to zero out buffers before using them. Avoid passing numeric values to the write function.
References
https://nodejs.org/api/buffer.html#buffer_the_zero_fill_buffers_command_line_option
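The advisory comes down to how affected versions allocated memory: a numeric value written to the stream could reach the legacy `new Buffer(n)` constructor, which returns uninitialized memory. A minimal sketch of the difference, using the modern `Buffer.alloc` / `Buffer.allocUnsafe` APIs to stand in for the old behaviour (an assumption about how you'd reproduce it today, not the library's own code):

```javascript
// Buffer.alloc zero-fills its memory; Buffer.allocUnsafe behaves like the
// legacy new Buffer(n), returning whatever bytes happened to be in that
// region of memory, which is what makes the overread exploitable.
const zeroed = Buffer.alloc(16)
console.log(zeroed.every((byte) => byte === 0)) // true: always zero-filled

const unspecified = Buffer.allocUnsafe(16)
// contents are unspecified and may contain stale process data
console.log(unspecified.length) // 16
```

Running node with --zero-fill-buffers forces even the unsafe allocation path to be zeroed, which is why the advisory suggests it as a stopgap.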
I've been playing around with some git & node experiments and I noticed there aren't git tags for any of the released versions of concat-stream. Any chance you could tag / start tagging releases?
It looks like 1.4.5 never made it to npm. npm publish, please!
Some of the new updates moving this package to streams2 (i.e. 280ca96) are great, but they don't work in older versions of node:
> process.version
'v0.8.20'
> var Writable = require('stream').Writable // from index.js
undefined
I think this would require a bump in package.json's engines field.
I found this package because I use https://github.com/faye/faye, which consumes concat-stream and thinks its dependants can run on node v0.8.x.
Thanks!
Here's the point:
var concat = require('concat-stream');
process.stdin
.pipe(concat(function(buff) {
// do smth
}))
.pipe(process.stdout);
It gives the following:
Error: Cannot pipe. Not readable.
at ConcatStream.Writable.pipe (.../node_modules/concat-stream/node_modules/readable-stream/lib/_stream_writable.js:142:22)
Why is chaining broken in this case? Does concat-stream support it?
I wish to be able to do something like this:
concat(input_string) // or `concat([input_string])`
.pipe(someTransform())
.pipe(concat(function(data){
// ...
}))
This is basically using concat-stream to do the reverse (getting a stream from a single piece of data) of what it already does (getting a single piece of data from a stream).
Going even further... why not have duplex concat-streams?
var input = getReadableStream().pipe(concat());
input.once('data', function(data) {
// do something with [complete] input data
});
var transformed = input.pipe(getTransformStream()).pipe(concat());
transformed.once('data', function(data) {
// do something with [complete] transformed data
});
transformed.pipe(getWritableStream());
you asked me to help maintain this library, so here are some thoughts:
- use readable-stream.
- .getBody: I'd prefer if it were like .pipe(cat.string()), .pipe(cat.buffer()), etc. By default, .pipe(cat()) should just return an array of all the things. Developers should know the type of the source stream and how they wish to consume the data.
- cat(stream, callback): basically just stream.pipe(this).
- callback(err, data): it's a callback, not an event listener, so imo it should have err as the first argument. However, err should always be null since concat-stream should never have any errors (i get #6 (comment)) unless we decide to throw errors when there are crazy typing issues. We can do crazy stuff like check listener.length, but I'm not a fan of that either.
- cat-stream and the final stream.
- yield stream.pipe(cat()) support. Not sure how to do that yet - maybe duck type it into a promise.

I'm not sure this is a big deal, but the encoding inference can cause issues when using objectMode streams without specifying to concat-stream that the encoding is objects.
I found this in one of my tests when I updated this dependency:
var spigot = require("stream-spigot")
var concat = require("concat-stream")
spigot({objectMode: true}, [1, 2, 3, 4, 5])
.pipe(concat(function (c) { console.log(c) }))
// <Buffer 31 32 33 34 35>
Using {encoding: "object"} fixes this, but people relying on the auto-detection (e.g. legacy concat-stream invocations) could end up in trouble.
The current version is 2.1.5; this often results in duplicate copies of readable-stream being installed for other modules that depend on concat-stream.
The doc says:
Since concat-stream is not itself a stream it does not emit errors
Making concat-stream a stream would make it trivial to get the errors down from the pipeline. It could emit a data event when it's ready.
Just want to suggest that there is now a built-in Node.js API that can concat streams and iterators into a single buffer.
I brought it to userland.
Hey from Node.js here!
Starting with Node 10, this package will emit deprecation warnings. See this guide on what you should do in order to migrate to Buffer.alloc / Buffer.from.
See nodejs/node#19079 for discussion around this change and why we can't make new Buffer work.
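For reference, the migration maps each use of the deprecated constructor onto an explicit factory method:

```javascript
// Each use of the deprecated constructor has a one-line replacement:
const fromString = Buffer.from('abc')  // was: new Buffer('abc')
const zeroFilled = Buffer.alloc(10)    // was: new Buffer(10), now zero-filled

console.log(fromString.toString()) // 'abc'
console.log(zeroFilled[9])         // 0
```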
Drawn from #15. Let's discuss in a separate issue.
jonathanong on Dec 1, 2013
callback(err, data) - it's a callback, not an event listener, so imo it should have err as the first argument. however, err should always be null since concat-stream should never have any errors (i get #6 (comment)) unless we decide to throw errors when there are crazy typing issues. we can do crazy stuff like check listener.length but i'm not a fan of that either.
substack on Dec 23, 2013
cb(err, data) is annoying if there isn't ever an error. Why not just omit that parameter, as it presently does?
jeromew on Jan 13, 2014
[...] don't understand "concat-stream should never have any errors" because shouldn't concat-stream handle the errors of the underlying streams to report via cb(err, data)?
zenflow on June 29, 2015 (just now)
If you pipe a readable stream to this or any other writable stream (with Readable.prototype.pipe), errors will not be piped downstream along with the data. When using this library, you must handle upstream errors yourself. Besides any upstream errors, there are no other errors to expect.
What's the difference between this project and https://github.com/jeffbski/accum? Similar issue filed there at jeffbski/accum#1
Today I got a message from https://nodesecurity.io:
142 - Buffer Overread
Vulnerable: All - Patched: None - Path: [email protected] > [email protected]
How to fix
Consider using the --zero-fill-buffers command line argument to zero out buffers before using them.
Avoid passing numeric values to the write function.
Would be nice to make it a standard node callback, passing through any errors. Perhaps rstream.pipe(concat.withError(function(err, data) { /* */ }))
If not, it'd be good to document that people should handle them. Especially as it's in the node-school, so exposed to lots of new people! :)
Happy to do either of these things if you agree.
The following code:
var concat = require('concat-stream')
var through = require('through')
var stream = through().pause()
stream.pipe(concat(function(data) {}))
stream.emit('error')
throws
stream.js:94
throw er; // Unhandled stream error in pipe.
^
undefined
Before d530532, I believe there would have been an err argument to the callback.
It seems a bit surprising to me to have no err argument to the callback. Is there a nice way to handle errors now? Or should we maybe bring the err argument back?
One of the dependencies of 1.6.* has a security vulnerability that GitHub constantly warns about. It's not in 2.*; the offending dependency is disparity.
Unfortunately, the fix to diff was incorrectly applied to disparity as a new major version instead of a minor one, meaning its downstreams aren't updating, so the "fix" isn't in place.
Can a user of 1.6.* use 2.*?
Could ... could I talk you into patching and publishing a new 1.6? It's just a version bump, and nyc / ava are throwing security faults on this.
I'd like to see whether this project currently passes its tests or not.
This is a very specific example, and I have seen PassThrough streams work with concat-stream, just not in this case:
var Stream = require("stream")
var CombinedStream = require('combined-stream')
var concatStreams = function(a,b) {
var combinedStream = CombinedStream.create()
combinedStream.append(a)
combinedStream.append(b)
return combinedStream
}
var stringToStream = function(s) {
var a = new Stream.PassThrough()
a.write(s)
a.end()
return a
}
concatStreams(stringToStream("a"), stringToStream("b")).pipe(process.stdout)
Expected output: ab
Actual output: nothing
In node versions 0.10.25 and 0.10.31 (the current latest stable), the output is blank. In 0.10.29, the expected output happens. I'm not sure if this is a regression in node or undefined behavior of concat-stream, but it's really annoying.
I reviewed the code and didn't see anything too 0.8.x-specific; is there any reason we can't relax the engine dependency to support 0.6.0? This is breaking my installation of browserify, because something in the dependency chain relies on this module in some fuzzy way, and it randomly became a problem for me.
Here is a funny crash, which I've only been able to reproduce in conjunction with combined-stream:
var concat = require('concat-stream')
var CombinedStream = require('combined-stream')
var stream = CombinedStream.create()
stream.append(new Buffer('foo'))
stream.append('bar')
stream.pipe(concat(function() { }))
crashes like so:
buffer.js:496
buf.copy(buffer, pos);
^
TypeError: Object bar has no method 'copy'
at Function.Buffer.concat (buffer.js:496:9)
at ConcatStream.getBody (/home/ubuntu/tmp/node_modules/concat-stream/index.js:37:19)
at ConcatStream.end (/home/ubuntu/tmp/node_modules/concat-stream/index.js:43:36)
at CombinedStream.onend (stream.js:79:10)
at CombinedStream.EventEmitter.emit (events.js:117:20)
at CombinedStream.end (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:140:8)
at CombinedStream._getNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:73:10)
at CombinedStream._pipeNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:106:8)
at CombinedStream._getNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:78:10)
at CombinedStream._pipeNext (/home/ubuntu/tmp/node_modules/combined-stream/lib/combined_stream.js:106:8)
I'm a bit of a Node newbie, and I haven't been able to track this down further. Am I doing something wrong? Or is this a bug in the combined-stream library?
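Not so much a bug in your code as a sharp edge between the two libraries: combined-stream passes the appended string through unchanged, and concat-stream of that era handed its collected chunks straight to Buffer.concat, which only accepts Buffers. A self-contained sketch of the failure and the workaround:

```javascript
// Buffer.concat rejects plain strings, which is the crash above
// (newer node throws a TypeError instead of the 'no method copy' error).
try {
  Buffer.concat([Buffer.from('foo'), 'bar'])
} catch (err) {
  console.log('throws:', err.name) // throws: TypeError
}

// Workaround: append Buffers, not strings, when building the combined stream.
const joined = Buffer.concat([Buffer.from('foo'), Buffer.from('bar')])
console.log(joined.toString()) // 'foobar'
```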
Hi,
I found this vulnerability (uninitialized memory): https://snyk.io/vuln/npm:concat-stream:20160901.
This fork seems to have fixed it: https://github.com/substack/node-concat-stream.
Should substack make a pull request, or can the fix be merged from substack's fork by someone else?
Thanks
See the following piece of code.
var concat = require('concat-stream');
var src = /* some stream with non-UTF-8 content, e.g. shift_jis */
var dest = /* the destination stream */
var concatStream = concat(function(buffer) {
  // buffer is an empty string here. It is not readable while debugging either.
  // I want to make some modifications to this buffer.
  dest.write(buffer);
  dest.end();
});
In the browser I see the response coming back, but none of my modifications were made to the content. If I try to debug, buffer is an empty string in the callback.
I think this module would be a great example of streams2 code. It'd be pretty simple and would give a few minor benefits like correct backpressure and 'finish'
events. I'd assume we'd release this as version 2.0 to avoid breaking any consumers of 1.0.
I'd be happy to do the work for this in a pull request if you're up for it. Just wanted to make sure. Otherwise I'll probably give it a shot myself and release [email protected], which is kind of a sucky name compared to [email protected] :P.