Comments (5)
Problem solved by reducing chunkSize from 262144 to 65536. Sometimes the readable stream can't deliver more than 65536 bytes at a time.
I moved to another solution that fetches the data and decrypts it on the fly, in case someone needs it:
try {
  async function* streamAsyncIterable(stream) {
    const reader = stream.getReader()
    try {
      while (true) {
        const { done, value } = await reader.read()
        if (done) return
        yield value
      }
    } finally {
      reader.releaseLock()
    }
  }

  const response = await fetch(url)
  var responseSize = 0;
  var chunkMaxSize = 65536;
  var totalWorkArray = new Uint8Array([]);
  const fileStream = streamSaver.createWriteStream(filename, {
    size: fileSize
  });
  const writer = fileStream.getWriter()
  for await (const chunk of streamAsyncIterable(response.body)) {
    let chunkSize = chunk.length
    responseSize += chunkSize
    // append the new chunk to the bytes accumulated so far
    var mergedArray = new Uint8Array(totalWorkArray.length + chunk.length);
    mergedArray.set(totalWorkArray);
    mergedArray.set(chunk, totalWorkArray.length);
    totalWorkArray = mergedArray
    // decrypt and write one fixed-size block at a time
    while (totalWorkArray.length > chunkMaxSize) {
      const c = totalWorkArray.slice(0, chunkMaxSize);
      let work = new Blob([c], { type: 'application/octet-stream' })
      totalWorkArray = totalWorkArray.slice(chunkMaxSize)
      const plain = await FileDecrypt(work, 1, work.type);
      const readableStream = new Blob([plain], { type: plain.type }).stream()
      await readableStream.getReader().read().then(async ({ value, done }) => {
        await writer.write(value)
      });
    }
  }
  // flush the final partial block
  const work = new Blob([totalWorkArray], { type: 'application/octet-stream' })
  const plain = await FileDecrypt(work, 1, work.type);
  const readableStream = new Blob([plain], { type: plain.type }).stream()
  await readableStream.getReader().read().then(async ({ value, done }) => {
    await writer.write(value)
    await writer.close()
  });
} catch (error) {
  ...
}
from streamsaver.js.
Ouch. This solution makes me sad to see.
I'll try to reply in an hour or so, when I have access to a computer, with a solution that I think works better. In the meanwhile, can you share the code of the decoder and an encrypted file?
Okay... so here is what I came up with:
/**
 * Read a stream into the same underlying ArrayBuffer, a fixed-size block
 * at a time, and yield a new Uint8Array view of that buffer for each full block.
 * @param {ReadableStreamBYOBReader} reader
 * @param {number} chunkSize
 */
async function* blockReader(reader, chunkSize) {
  let offset = 0
  let buffer = new ArrayBuffer(chunkSize)
  let done, view
  while (!done) {
    // a BYOB read may fill the view only partially, so keep reading into
    // the remainder of the buffer until a full block has accumulated
    ({ value: view, done } = await reader.read(new Uint8Array(buffer, offset, chunkSize - offset)))
    buffer = view.buffer // the buffer is transferred on every read; take it back
    if (done) break
    offset += view.byteLength
    if (offset === chunkSize) {
      yield new Uint8Array(buffer, 0, chunkSize)
      offset = 0
      // the buffer is recycled, so the yielded view is detached by the next
      // read; uncomment the following line if each yielded block should keep
      // its own buffer instead:
      // buffer = new ArrayBuffer(chunkSize)
    }
  }
  if (offset > 0) {
    // flush the final partial block
    yield new Uint8Array(buffer.slice(0, offset))
  }
}
const url = 'https://raw.githubusercontent.com/lukeed/clsx/main/src/index.js'
const filename = 'clsx.js'
const fileSize = 1000000
const chunkMaxSize = 65536
const response = await fetch(url)
const fileStream = streamSaver.createWriteStream(filename, {
size: fileSize
})
const writer = fileStream.getWriter()
const reader = response.body.getReader({ mode: 'byob' })
const type = 'application/octet-stream'
const iterator = blockReader(reader, chunkMaxSize)
// `sameUnderlyingBuffer` is a Uint8Array over the same underlying ArrayBuffer.
// The view is detached on every iteration and is not reusable in the next one
// (so don't try to concatenate the yielded chunks).
// The ArrayBuffer itself, however, is recycled and refilled on every iteration,
// which makes this a very memory-efficient way to read a stream.
for await (const sameUnderlyingBuffer of iterator) {
  const plain = await FileDecrypt(sameUnderlyingBuffer, 1, type)
  await writer.write(plain)
}
await writer.close()
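As a quick sanity check of the block sizes, here is a hypothetical smoke test against a synthetic in-memory byte stream (a condensed copy of the generator is included so the snippet runs standalone): 5 bytes read in 2-byte blocks should come out as 2 + 2 + 1.

```javascript
// Condensed copy of blockReader, repeated here so the snippet is self-contained.
async function* blockReader(reader, chunkSize) {
  let offset = 0
  let buffer = new ArrayBuffer(chunkSize)
  let done, view
  while (!done) {
    ({ value: view, done } = await reader.read(new Uint8Array(buffer, offset, chunkSize - offset)))
    buffer = view.buffer
    if (done) break
    offset += view.byteLength
    if (offset === chunkSize) {
      yield new Uint8Array(buffer, 0, chunkSize)
      offset = 0
    }
  }
  if (offset > 0) yield new Uint8Array(buffer.slice(0, offset))
}

// A synthetic 5-byte byte stream standing in for response.body.
const byteStream = new ReadableStream({
  type: 'bytes',
  start(controller) {
    controller.enqueue(new Uint8Array([1, 2, 3, 4, 5]))
    controller.close()
  }
})

const sizes = []
for await (const block of blockReader(byteStream.getReader({ mode: 'byob' }), 2)) {
  sizes.push(block.byteLength)
}
console.log(sizes) // [ 2, 2, 1 ]
```

The last block is the partial flush, which is why it can be smaller than chunkSize.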
A problem you have in both examples is that you create a stream from new Blob(...).stream() and then only read the first chunk:
await readableStream.getReader().read().then(async ({ value, done }) => await writer.write(value))
A single .read() may not read everything from a blob. In that case you want new Blob(...).arrayBuffer() instead. Better yet, try to avoid creating a Blob at all in the first place; there is not much need for one here.
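The difference can be sketched like this (the `plain` array is hypothetical data standing in for FileDecrypt's output):

```javascript
// Hypothetical stand-in for the decrypted bytes.
const plain = new Uint8Array(200000).fill(7)

// Fragile: a single read() only returns the stream's first chunk,
// which may be smaller than the whole blob.
// const { value } = await new Blob([plain]).stream().getReader().read()

// Robust: arrayBuffer() resolves with every byte of the blob.
const whole = new Uint8Array(await new Blob([plain]).arrayBuffer())
console.log(whole.byteLength) // 200000

// Best: skip the Blob entirely and hand the bytes straight to the writer:
// await writer.write(plain)
```

Writing the decrypted bytes directly also avoids two extra copies per block (into the Blob and back out of it).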
Also, if I may suggest, I would add this polyfill:
ReadableStream.prototype.values ??= function ({ preventCancel = false } = {}) {
  const reader = this.getReader();
  return {
    async next() {
      try {
        const result = await reader.read();
        if (result.done) {
          reader.releaseLock();
        }
        return result;
      } catch (e) {
        reader.releaseLock();
        throw e;
      }
    },
    async return(value) {
      if (!preventCancel) {
        const cancelPromise = reader.cancel(value);
        reader.releaseLock();
        await cancelPromise;
      } else {
        reader.releaseLock();
      }
      return { done: true, value };
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
};
ReadableStream.prototype[Symbol.asyncIterator] ??= ReadableStream.prototype.values;
That way you could do:
for await (const chunk of response.body) { ... }
(At the moment only server runtimes and Firefox ship ReadableStream async iteration natively.)
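On runtimes that already ship async iteration the ??= leaves the native implementation alone, so the polyfill is safe to install unconditionally. A quick check, with a hypothetical in-memory stream standing in for response.body:

```javascript
// Hypothetical in-memory stream standing in for response.body.
const body = new ReadableStream({
  start(controller) {
    controller.enqueue(new Uint8Array([1, 2]))
    controller.enqueue(new Uint8Array([3]))
    controller.close()
  }
})

// With the polyfill (or native support) any ReadableStream is async iterable.
let total = 0
for await (const chunk of body) total += chunk.length
console.log(total) // 3
```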