duplexer2's People

Contributors

deoxxa, jbenet, kemitchell, mafintosh, shinnn, timoxley, trysound, valeriangalliat, zertosh

duplexer2's Issues

Test fails with node >= 10

Hello,

I had to add this patch to make tests work:

--- a/test/tests.js
+++ b/test/tests.js
@@ -173,7 +173,7 @@
     var duplexStream = duplexer3(writable, readable);
     duplexStream.end("aaa");

-    assert.equal(readable._readableState.flowing, null);
+    assert.equal(readable._readableState.flowing, false);

     var transformStream = new stream.Transform({
       transform: function(chunk, encoding, cb) {
@@ -183,10 +183,10 @@
     });
     writable.pipe(transformStream).pipe(readable);

-    assert.equal(readable._readableState.flowing, null);
+    assert.equal(readable._readableState.flowing, false);

     setTimeout(function() {
-      assert.equal(readable._readableState.flowing, null);
+      assert.equal(readable._readableState.flowing, false);

       var src = "";
       duplexStream.on("data", function(buf) {
@@ -198,7 +198,7 @@
         done();
       });

-      assert.equal(readable._readableState.flowing, null);
+      assert.equal(readable._readableState.flowing, false);
     });
   });
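
For context, here is a small illustration (not part of the test suite) of the three values that readable.readableFlowing / _readableState.flowing can take; the assertions above are checking which of these states the readable ends up in:

var stream = require("stream");

// A fresh readable starts with flowing === null: no consumption mechanism yet.
var pass = new stream.PassThrough();
console.log(pass.readableFlowing); // null

// Attaching a 'data' listener switches it into flowing mode.
pass.on("data", function(chunk) {});
console.log(pass.readableFlowing); // true

// Calling pause() explicitly sets flowing to false.
pass.pause();
console.log(pass.readableFlowing); // false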

"end" event not triggered, not pulling data out of stream?

Since upgrading to 0.1.0, my code no longer works as intended. I think it's because the internals changed and .resume is not called on the readable stream anymore.

I spent quite a while trying to figure out whether the issue is in my (rather complex) code, but still wasn't able to get it to work. So I tried to create a minimal example to replicate the issue. It seems the issue can be surfaced by having an asynchronous operation in _write on the writable side:

var through2 = require("through2");

var duplexer2 = require("./");
var stream = require("readable-stream");


function getMainStream() {
  var writable = new stream.Writable({objectMode: true});
  var readable = through2.obj();

  writable._write = function(data, _, next) {
    setTimeout(() => {
      console.log('written', data);
      readable.write(data);
      next();
    }, 0);
  };
  writable.once('finish', () => {
    console.log('finish');
    readable.end();
  });

  return duplexer2({objectMode: true}, writable, readable);
}

var s = getMainStream()
  .once('end', () => console.log('done'))
  .on('data', d => console.log('out', d));
s.write('a');
s.end();

This creates a stream and writes one value to it. The stream is "opened" by listening to the data event, but the same behavior can be observed when listening to readable instead or calling resume.

The write side receives data and simply passes it along to the readable side (which is a passthrough stream). Once writing is done (finish event), the readable side is closed (readable.end()).

When I run this code, this is the output I get:

$ node myTest.js
written a
out a
finish

As you can see, the end event never triggered (there is no "done" line). If I comment out the setTimeout part, it works as expected.

Note: If I return the readable and writable directly (e.g. return [readable, writable];), instead of using duplexer2, attach the event handlers to the readable side and write to writable, I get the expected output.
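
For reference, a sketch of that workaround (same wiring as the example above, just without duplexer2; the function name is illustrative):

var through2 = require("through2");
var stream = require("readable-stream");

function getStreamPair() {
  var writable = new stream.Writable({objectMode: true});
  var readable = through2.obj();

  writable._write = function(data, _, next) {
    setTimeout(() => {
      readable.write(data);
      next();
    }, 0);
  };
  writable.once('finish', () => readable.end());

  // Return both halves instead of wrapping them with duplexer2.
  return [writable, readable];
}

var pair = getStreamPair();
pair[1]
  .once('end', () => console.log('done')) // fires as expected here
  .on('data', d => console.log('out', d));
pair[0].write('a');
pair[0].end();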

Is there an issue with duplexer2, or am I just misunderstanding streams? If it's the latter, how should this be done correctly? (And sorry for hijacking this project for my problem.)

Needs maintainers?

Currently this repo has some unreviewed (and important) PRs and open issues, but hasn't been updated for a long time.

@deoxxa I'd happily maintain this repo if you want to collaborate with me.

Change version to at least 0.1.0

This module is great. The problem is that when dependents run npm install duplexer2 --save, they get ^0.0.2 in their package.json and are pinned to exactly that version. I believe @substack experienced this exact frustration with this package a while ago when it was updated from 0.0.1.
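
For illustration, a quick check with the standalone semver package (not a dependency of duplexer2) showing why a caret range on 0.0.x pins to a single version while ^0.1.0 does not:

var semver = require("semver");

// Caret ranges on 0.0.x only match that exact patch release.
console.log(semver.satisfies("0.0.2", "^0.0.2")); // true
console.log(semver.satisfies("0.0.3", "^0.0.2")); // false

// From 0.1.0 onward, the caret range accepts newer patch releases.
console.log(semver.satisfies("0.1.4", "^0.1.0")); // true
console.log(semver.satisfies("0.2.0", "^0.1.0")); // false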

HTTP response stalls

I initially reported this issue here: juliangruber/multipipe#38, and it seems like duplexer2 is causing it...

Anyhow, I would like to use duplexer in an HTTP server context, doing something like this:

const express = require('express');
const JSONStream = require('JSONStream');
const duplex = require('duplexer2');
const app = express();

app.post('/duplexer', (req, res, next) => {
  const parse = JSONStream.parse()
  const stringify = JSONStream.stringify(false)
  const pipeline = duplex(parse, stringify)
  
  req.pipe(pipeline).pipe(res); // does not work, no response from the server
  // req.pipe(parse).pipe(stringify).pipe(res); // works
});

app.listen(3000, () => {
  console.log('Server started...')
});

Does anybody have an idea what the reason for this could be? It feels like the response writer is waiting for the request to terminate, or the other way around, but that just never happens...
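
One hedged guess, not confirmed by the maintainers: duplexer2 only wraps the two streams, it does not pipe the writable side into the readable side, so parse may need to be connected to stringify explicitly. A sketch of that variant:

app.post('/duplexer', (req, res, next) => {
  const parse = JSONStream.parse();
  const stringify = JSONStream.stringify(false);
  parse.pipe(stringify); // connect the two halves ourselves
  const pipeline = duplex(parse, stringify);

  req.pipe(pipeline).pipe(res);
});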

Check for existence of `process` object (fix for v2)

WHEN the stream module is no longer required and needs to be disabled, as suggested for the subtitle node module:
BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
This is no longer the case. Verify if you need this module and configure a polyfill for it.

If you want to include a polyfill, you need to:
- add a fallback 'resolve.fallback: { "stream": require.resolve("stream-browserify") }'
- install 'stream-browserify'
If you don't want to include a polyfill, you can use an empty module like this:
resolve.fallback: { "stream": false }

THEN the webpack config needs extra lines, as suggested in RequestNetwork/requestNetwork@9f86631:

new webpack.ProvidePlugin({
  Buffer: ['buffer', 'Buffer'],
}),

AND line 57 needs to introduce a (typeof process !== 'undefined') check, i.e.:
var asyncWrite = (typeof process !== 'undefined') && !process.browser && ['v0.10', 'v0.9.'].indexOf(process.version.slice(0, 5)) > -1 ? setImmediate : pna.nextTick;
see nodejs/readable-stream@fd4dda7
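
Putting the two suggestions together, a minimal webpack 5 config sketch (assuming the stream-browserify and buffer packages are installed; entry/output and the rest of the config are omitted):

const webpack = require('webpack');

module.exports = {
  resolve: {
    fallback: {
      // Polyfill variant; use "stream: false" to drop the polyfill instead.
      stream: require.resolve('stream-browserify'),
    },
  },
  plugins: [
    new webpack.ProvidePlugin({
      Buffer: ['buffer', 'Buffer'],
    }),
  ],
};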

don't cause streams1 mode

This module listens for the readable's data event and thus turns it into a streams1 stream. It would be nice if it left them as streams2 streams.
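
For illustration, here is the pattern being asked for: forwarding chunks via the 'readable' event and read(), which keeps the source paused, instead of a 'data' listener, which flips it into flowing (streams1-style) mode. This is a sketch, not duplexer2's actual code:

var stream = require("stream");

var source = new stream.PassThrough();
var target = new stream.PassThrough();

// Reading with 'readable' + read() leaves `source` in paused mode.
source.on("readable", function() {
  var chunk;
  while ((chunk = source.read()) !== null) {
    target.write(chunk);
  }
});
source.on("end", function() {
  target.end();
});

source.end("beep");
target.on("data", function(chunk) {
  console.log("forwarded:", chunk.toString());
});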

With more than 2 pipeline stages, errors from middle stages are not bubbled.

I'm not sure if there's anything you can reasonably do about this without getting even deeper into the stream internals, but a documentation update would be great so future people don't fall into this trap. I ran into this on Node 5.6.0. Here's a simple workaround:

var output = input.pipe(otherStage).pipe(outputStage);
var d = duplexer(input, output);
// This next line is necessary, or else exceptions from otherStage get lost.
otherStage.on('error', function(e) { d.emit('error', e); });

EDIT: To clarify, what I see is that Node does detect the error, but it quits without any sort of message so it's difficult to find the cause of the process ending unexpectedly.
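
A hedged generalization of that workaround, for pipelines with an arbitrary number of stages (the helper name is illustrative, not part of duplexer2):

var duplexer = require('duplexer2');

// Builds input.pipe(stage1).pipe(stage2)... and re-emits errors from the
// intermediate stages on the combined duplex, since duplexer2's own error
// bubbling only covers the first and last streams.
function duplexerWithBubbling(input, stages) {
  var output = stages.reduce(function(prev, next) {
    return prev.pipe(next);
  }, input);
  var d = duplexer(input, output);
  stages.slice(0, -1).forEach(function(stage) {
    stage.on('error', function(e) { d.emit('error', e); });
  });
  return d;
}

Matching the example above, usage would be var d = duplexerWithBubbling(input, [otherStage, outputStage]);.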

Errors on the writable stream are emitted twice on the duplex

With default options, the error on the first stream is emitted twice:

var duplexer = require('duplexer2');
var through = require('through2');

var combinedStream = duplexer(
  through(function(chunk, enc, next) {
    next(new Error('this error is emitted twice.'));
  }),
  through(function(chunk, enc, next) {
    next(null, chunk);
  })
);

combinedStream
  .on('error', function(err) {
    // this event happens twice with the same error
    console.log('\n\n')
    console.log('error:', err);
  })
  .end('beep');

If I set bubbleErrors to false, the first stream throws:

var duplexer = require('duplexer2');
var through = require('through2');

var combinedStream = duplexer({
    bubbleErrors: false
  },
  through(function(chunk, enc, next) {
    // this error is thrown
    next(new Error('this error is emitted twice.'));
  }),
  through(function(chunk, enc, next) {
    next(null, chunk);
  })
);

combinedStream
  .on('error', function(err) {
    console.log('\n\n')
    console.log('error:', err);
  })
  .end('beep');

Interestingly, if I put an error handler on the first stream, both error handlers are called:

var duplexer = require('duplexer2');
var through = require('through2');

var combinedStream = duplexer({
    bubbleErrors: false
  },
  through(function(chunk, enc, next) {
    next(new Error('this error is emitted twice.'));
  })
  .on('error', function(err) {
    // this event happens first
    console.log('\n\n');
    console.log('first stream error', err);
  }),
  through(function(chunk, enc, next) {
    next(null, chunk);
  })
);

combinedStream
  .on('error', function(err) {
    // this event happens second
    console.log('\n\n')
    console.log('error:', err);
  })
  .end('beep');

This happens in Node 0.10, 0.12, and 6.6.0; I assume it happens in all versions.

I first brought this up in stream-combiner2, but the problem appears to happen here (or maybe even lower down).

Do you know what's happening and how it can be fixed? The only thing I can think of doing on my end is using the last code snippet with a noop on the first stream's error handler, but that feels like a hack.
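
For completeness, a sketch of that noop workaround (a workaround, not a fix): bubbleErrors off plus an empty error handler on the first stream, which, per the behavior described above, leaves a single error event on the combined stream:

var duplexer = require('duplexer2');
var through = require('through2');

var combinedStream = duplexer({
    bubbleErrors: false
  },
  through(function(chunk, enc, next) {
    next(new Error('this error is reported once.'));
  })
  .on('error', function() { /* noop: swallow the direct emission */ }),
  through(function(chunk, enc, next) {
    next(null, chunk);
  })
);

combinedStream
  .on('error', function(err) {
    console.log('error:', err); // fires once
  })
  .end('beep');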

Fails with 2 transform streams and a lot of data

The code below fails with RangeError: Maximum call stack size exceeded.

Node version: 0.10.37

Failing code:

'use strict';

var TransformStream = require('stream').Transform,
    util = require('util'),
    duplexer = require('duplexer2');

var MyTransformStream = function() {
    TransformStream.call(this);
};
util.inherits(MyTransformStream, TransformStream);

MyTransformStream.prototype._transform = function(chunk, encoding, callback) {
    callback(null, chunk);
};

var firstStream = new MyTransformStream(),
    secondStream = new MyTransformStream();

firstStream.pipe(secondStream);

var duplexStream = duplexer({
    bubbleErrors: false
}, firstStream, secondStream);

for (var i = 0; i < 3000; i++) {
    duplexStream.write(i.toString());
}

0.1.3 breaks stream-combiner2 1.1.1

Hi, we have a build process that uses grunt-contrib-imagemin to shrink a PNG file. We started getting build failures this morning with the error Fatal error: Cannot read property 'contents' of undefined.

After some investigation, I noticed that when I force stream-combiner2 to depend on duplexer2 0.1.2, the build works, but when stream-combiner2 uses duplexer2 0.1.3 or 0.1.4, the build fails with the same error. This leads me to suspect that something in 0.1.3 breaks stream-combiner2 1.1.1.
