
cloneable-readable's Introduction

cloneable-readable


Clone a Readable stream, safely.

'use strict'

var cloneable = require('cloneable-readable')
var fs = require('fs')
var pump = require('pump')

var stream = cloneable(fs.createReadStream('./package.json'))

pump(stream.clone(), fs.createWriteStream('./out1'))

// simulate some asynchronicity
setImmediate(function () {
  pump(stream, fs.createWriteStream('./out2'))
})

cloneable-readable automatically handles objectMode: true.
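For instance, a minimal sketch of cloning an object-mode stream (the source below is a made-up object-mode Readable, not part of the module):

'use strict'

var cloneable = require('cloneable-readable')
var Readable = require('stream').Readable

// made-up object-mode source that emits a single object and ends
var source = new Readable({
  objectMode: true,
  read: function () {
    this.push({ hello: 'world' })
    this.push(null)
  }
})

var stream = cloneable(source)

// attaching 'data' handlers starts the flow on both the clone and the original
stream.clone().on('data', function (obj) {
  console.log('clone:', obj)
})
stream.on('data', function (obj) {
  console.log('original:', obj)
})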

This module comes out of a healthy discussion on the 'right' way to clone a Readable in gulpjs/vinyl#85 and nodejs/readable-stream#202. This is my take.

YOU MUST PIPE ALL CLONES TO START THE FLOW

You can also attach 'data' and 'readable' events to them.

API

cloneable(stream)

Create a Cloneable stream. A Cloneable has a clone() method to create more clones. All clones must be resumed/piped to start the flow.

cloneable.isCloneable(stream)

Check if stream needs to be wrapped in a Cloneable or not.
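A minimal sketch of the two calls together (assuming a local ./package.json, as in the example above):

'use strict'

var cloneable = require('cloneable-readable')
var fs = require('fs')

var stream = fs.createReadStream('./package.json')

// wrap only if the stream is not already a Cloneable
if (!cloneable.isCloneable(stream)) {
  stream = cloneable(stream)
}

console.log(cloneable.isCloneable(stream)) // true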

Acknowledgements

This project was kindly sponsored by nearForm.

License

MIT

cloneable-readable's People

Contributors

dependabot-preview[bot], dependabot[bot], dmurvihill, greenkeeper[bot], mcollina, rluvaton, salmanm, vweevers


cloneable-readable's Issues

Cloned stream will not pipe

I created a small reproduction of a potential problem I found:

const { createReadStream } = require('fs');
const { join } = require('path');

const cloneable = require('cloneable-readable');

const path = join(__dirname, 'index.html');
const original = createReadStream(path);
const stream = cloneable(original);
stream.clone();
stream.pipe(process.stdout); // will not pipe unless clone is removed
original.pipe(process.stdout); // will pipe

In the snippet above, I expect stream.pipe(process.stdout) to print the contents of index.html.
It does so only if we remove stream.clone() right above it.
original.pipe(process.stdout) will still print.
I find this behavior odd because none of the streams has started being consumed yet; only clone() has been called. I am not sure if I am missing something or if this is a bug, but the example at the top of the README suggests that my expected behavior is the correct one.
Thanks!
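Per the README rule above that all clones must be piped or resumed, a sketch of a variant of this snippet that does print (the otherwise unused clone is drained with resume()):

const { createReadStream } = require('fs');
const { join } = require('path');
const cloneable = require('cloneable-readable');

const path = join(__dirname, 'index.html');
const stream = cloneable(createReadStream(path));

// every clone (and the Cloneable itself) must be consumed before the
// shared source starts to flow
stream.clone().resume();
stream.pipe(process.stdout); // now prints the contents of index.html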

Finish event not fired ?

Hi,

I'm using a Vinyl file to represent a file stream in my application. I use clones to pipe the same content to multiple ffmpeg commands (using stream-transcoder or fluent-ffmpeg; the same behaviour happens with both).

It works great for the original Vinyl file, but not for the clones.
I've isolated the problem in the example below: if I run it multiple times, sometimes I get all 3 finish events and sometimes only the one for the original stream. The files themselves are copied correctly in the example, though.

const cloneable = require('cloneable-readable');
const path = require('path');
const fs = require('fs');

let stream = cloneable(fs.createReadStream(path.join(__dirname, 'sample.wav')));

function pipe(stream, num) {

    stream.on('end', () => {
        console.log(`Reaching end of stream : ${num}`);
    });

    stream.pipe(fs.createWriteStream(path.join(__dirname, `sample-${num}.wav`))).on('finish', () => {
        console.log(`Stop writing to file for stream : ${num}`);
    });

}

setImmediate(pipe.bind(null, stream, 1)); // Pipe in another event loop tick <-- only this one finishes; it's the original cloneable.
pipe(stream.clone(), 0);    // Pipe in the same event loop tick
setTimeout(pipe.bind(null, stream.clone(), 2), 1000);   // Pipe a long time after

setTimeout(() => { }, 2000);    // here to maintain a ref in the event loop longer.

Here is an example of what I can see during my tests :

[screenshot of the test output]

Environment

I'm using Node v6.10.3 on macOS.
I use a sample WAV file of 450 kB.

Full example

I don't think it's an ffmpeg issue, but for reference, here is an example using ffmpeg that does not work:

const path = require('path');
const fs = require('fs');
const cloneable = require('cloneable-readable');
const ffmpeg = require('fluent-ffmpeg');

let stream = cloneable(fs.createReadStream(path.join(__dirname, 'sample.wav')));

function pipe(stream, num) {

    stream.on('end', () => {
        console.log(`End writing the stream : ${num}`);
    });

    ffmpeg(stream)
        .audioFrequency(8000)
        .format('s16le')
        .output(path.join(__dirname, 'copy', `sample-${num}.wav`))
        .on('end', () => {
            console.log(`finished for the stream: ${num}`);
        })
        .run();

}

setImmediate(pipe.bind(null, stream, 1)); // Pipe in another event loop tick <-- it's the original cloneable.
pipe(stream.clone(), 0);    // Pipe in the same event loop tick
setTimeout(pipe.bind(null, stream.clone(), 2), 1000);   // Pipe a long time after

setTimeout(() => { }, 2000);    // here to maintain a ref in the event loop longer.

If you run the process above, it never exits. Are events still attached in the event loop? Are they lost somewhere?

Thank you for the time and the response,
Pierre.

npm error

Hello,
since version 1.1.0 was published 27 minutes ago, my npm v2.15.8 install started failing with this error:

Error: Cannot find module 'readable-stream'
    at Function.Module._resolveFilename (module.js:325:15)
    at Function.Module._load (module.js:276:25)
    at Module.require (module.js:353:17)
    at require (internal/module.js:12:17)
    at Object.<anonymous> (/app/node_modules/gulp-rev-collector/node_modules/vinyl/node_modules/cloneable-readable/index.js:3:19)

which points here

I'd appreciate your help.

Thanks,
Eran

Swallows errors

The use of pump means errors on the source stream won't be re-emitted on the cloneable stream or on clones, because pump calls destroy() rather than destroy(err). Is this by design?

Additionally, if you do add an error callback to pump here:

function clonePiped (that) {
  if (--that._clonesCount === 0) {
    pump(that._original, that, function (err) {
      if (err) console.log(err)
    })
    that._original = undefined
  }
}

Then the tests reveal a few [Error: stream.push() after EOF].
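A possible workaround until this is settled, not something the module documents, is to handle errors on the source stream directly; a sketch:

const fs = require('fs');
const cloneable = require('cloneable-readable');

const source = fs.createReadStream('./does-not-exist.json');
const stream = cloneable(source);
const copy = stream.clone();

// as described above, errors emitted by the source are not re-emitted on the
// Cloneable or its clones, so attach a handler to the source itself and tear
// the rest of the pipeline down manually
source.on('error', (err) => {
  console.error('source error:', err);
  stream.destroy();
  copy.destroy();
});

copy.resume();
stream.resume();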

trouble cloning and reading from source

Hello, I am having a bit of a problem while using the package. Essentially I am trying to generate a stream, clone it, send the clone's contents to an FTP server, and then return the stream's data to the calling function.

What I am currently doing

const stream: PassThrough = await createCsv(csvData) // returns passthrough
const returnStream: PassThrough = await createCsv(csvData) //returns passthrough
await uploadCsv(stream)
return returnStream

How I would like to use it

const stream = await createCsv(csvData)
const clone = Cloneable(stream)

await uploadCsv(clone)

return clone

Errors/problems that I am receiving:

console.log(clone.read()) // null
const clone = Cloneable.isCloneable(stream) // cloneable_readable_1.default.isCloneable is not a function

(0 , cloneable_readable_1.clonable) is not a function

I have a NestJS app where I am trying to clone a readable stream

import { cloneable } from 'cloneable-readable'

const stream: Readable = await createCsv(responsesForCsv) // returns readablestream
const readToFtp = cloneable(stream) // (0 , cloneable_readable_1.clonable) is not a function
const readable = cloneable(stream) // (0 , cloneable_readable_1.clonable) is not a function

Essentially I am using your package to resolve this issue:

try {
  const readable: Readable = await createCsv(responsesForCsv)
  const readable2: Readable = await createCsv(responseForCsv)

  await uploadToFTPServer(stream)

  return readable2
} catch(e) {
  console.log('error', e)
}

So here is how I wanted it to work:

try {
  const stream: Readable = await createCsv(responsesForCsv)
  const readable = cloneable(stream)
  await uploadToFTPServer(readable)

  return readable
} catch(e) {
  console.log('error', e)
}
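For reference, the README above imports the module as a single exported function (var cloneable = require('cloneable-readable')), so a named import like import { cloneable } from 'cloneable-readable' will likely come back undefined; a sketch of import styles that should work (the CSV path is a placeholder):

// CommonJS, as in the README
const cloneable = require('cloneable-readable')
// ES modules / TypeScript with esModuleInterop:
//   import cloneable from 'cloneable-readable'

const { createReadStream } = require('fs')

const stream = cloneable(createReadStream('./responses.csv')) // placeholder path
const readToFtp = stream.clone()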

Why won't my clone start?

I'm getting hold of myThroughStream in a function call.

The below code gives me an empty file:

  const stream = cloneable(myThroughStream);
  stream.clone().pipe(myWritableFileStream);

This code works:

  myThroughStream.pipe(myWritableFileStream);

I have realised that the stream won't start until all clones are piped. But I only have one clone as of now. What more needs to be done?

Unable to use cloned stream as normal ReadableStream

Hi, I'm using this library to upload a stream to several destinations, one of which is AWS S3 cloud storage. The library I'm using (https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/Package/-aws-sdk-lib-storage/) needs a Readable as the upload payload, but since the clone does not have the Readable prototype I can't use it with that library. Any suggestions?

Specifically, I get this error:
Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob
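One possible workaround, not something this module documents, is to pipe the clone through a core stream.PassThrough, which is a native Readable and should satisfy that kind of prototype check; a sketch (the file path is a placeholder):

const { PassThrough } = require('stream')
const fs = require('fs')
const cloneable = require('cloneable-readable')

const stream = cloneable(fs.createReadStream('./image.png')) // placeholder path

// clones are built on the readable-stream package, so they may fail an
// `instanceof Readable` check against Node's core class; piping into a core
// PassThrough yields a native Readable to hand to the SDK as the upload Body
const body = stream.clone().pipe(new PassThrough())

// remember that the original Cloneable must be consumed too
stream.resume()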

An in-range update of tap-spec is breaking the build 🚨

Version 4.1.2 of tap-spec was just published.

  • Branch: Build failing 🚨
  • Dependency: tap-spec
  • Current Version: 4.1.1
  • Type: devDependency

This version is covered by your current version range and after updating it in your project the build failed.

tap-spec is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error

Commits

The new version differs by 6 commits.

  • 0b3f873 Release 4.1.2
  • 810d5ae Merge pull request #60 from maxlutay/check-asserts-length
  • 484f694 also check for asserts.length
  • 90cb1b8 Merge pull request #51 from vassiliy/feat/mocha_pending_appearance
  • 9160ea1 Format pending specs the same way as Mocha
  • 8ce610d Create LICENSE

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Bug: isCloneable using instanceof

Issue: isCloneable returns false even if the passed-in stream is a Cloneable instance
Cause: a single project loading more than a single version of this module

To reproduce:

  1. Install [email protected].
  2. Install [email protected]
  3. Create a vinyl instance. The result of cloneable.isCloneable(file.contents) is false.
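The underlying failure mode is generic to instanceof checks when npm installs two copies of the same module; a stand-alone sketch (deliberately unrelated to this module's real internals):

// two copies of the "same" class, as you get when the module is installed
// once at the top level and once nested under another dependency (e.g. vinyl)
class CloneableCopyA {}
class CloneableCopyB {}

const stream = new CloneableCopyA()

// an isCloneable based on instanceof only recognises instances created by
// its own copy of the class
console.log(stream instanceof CloneableCopyA) // true
console.log(stream instanceof CloneableCopyB) // false <-- the reported bug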

How to pass a cloned stream to multiple functions that return a promise

I have a stream which is an image. I want to upload the image to s3 and also upload the image to a different third party service.

describe("StorageModule", () => {

    let imageStream;
 
    it("should upload file to s3", async () => {

        const s3storageModule = new StorageModule();

        imageStream = cloneable(fs.createReadStream(process.cwd() + "/test/resources/" + "id/front.jpg"));

        const response = await s3storageModule.upload(key, imageStream.clone());

        expect(response.Location).to.equal(`${s3BaseUrl}/${key}`)

    });


    it("should submit image to 3rd party", async () => {

        const response = await sessionService.document(imageStream.clone());
        expect(response.status).equal(202);

    });
});

Above I have 2 test cases: the first submits the stream to S3, and the second submits a clone of the stream to the third-party service; both upload functions return a promise.

However, the second test fails with:

Error: already started
at Cloneable.clone (node_modules/cloneable-readable/index.js:48:11)
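The error above is thrown when clone() is called after the stream has already started flowing; a sketch of the usual pattern of creating every clone up front and consuming everything together (the upload helper and file paths are stand-ins for the S3 and third-party uploads in the issue):

const fs = require('fs')
const cloneable = require('cloneable-readable')

// stand-in for an uploader that consumes a stream and returns a promise
function upload (readable, destination) {
  return new Promise((resolve, reject) => {
    readable.pipe(fs.createWriteStream(destination))
      .on('finish', resolve)
      .on('error', reject)
  })
}

async function uploadBoth () {
  const imageStream = cloneable(fs.createReadStream('./front.jpg')) // placeholder path

  // create every clone BEFORE anything starts consuming the stream,
  // otherwise clone() throws "already started"
  const copy = imageStream.clone()

  // start both consumers together; the shared source only flows once the
  // original Cloneable and every clone are being consumed
  await Promise.all([
    upload(imageStream, './out-s3.jpg'),
    upload(copy, './out-third-party.jpg')
  ])
}

uploadBoth().catch(console.error)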
