flowjs / flow.js
A JavaScript library providing multiple simultaneous, stable, fault-tolerant and resumable/restartable file uploads via the HTML5 File API.
License: Other
Is there a method that will reset the app?
Could anyone please explain the purpose of the "preprocess" custom function? I fail to see a case where it might be useful.
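One common use, offered as a sketch: `preprocess` receives each FlowChunk before it is uploaded, so you can attach derived data (a per-chunk checksum here) and then hand control back with `preprocessFinished()`. The `checksumOf` helper below is illustrative, not part of flow.js; a real implementation would hash the actual chunk bytes.

```javascript
// A toy per-chunk checksum; a real implementation would hash the chunk bytes.
function checksumOf(bytes) {
  var sum = 0;
  for (var i = 0; i < bytes.length; i++) sum = (sum + bytes[i]) % 65536;
  return sum;
}

// Hypothetical wiring (commented so this sketch stays self-contained):
// new Flow({
//   preprocess: function (chunk) {
//     chunk.flowObj.opts.query.checksum = checksumOf([] /* chunk bytes */);
//     chunk.preprocessFinished();  // required, or the chunk never sends
//   }
// });
```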
Is it possible to send custom headers for each file?
I currently see the option to add extra headers to the global flow object, but that will impact all uploads. In my app I need to send the values pertinent to each uploading file.
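One possible answer, hedged: some flow.js versions evaluate options per file, so `headers` may be supplied as a function receiving the FlowFile (check this against your version). A minimal sketch; the `authTokenFor` callback is an assumption you would implement yourself:

```javascript
// Build a per-file headers function; flow.js would call it for each file/chunk.
function makeHeaders(authTokenFor) {
  return function (file /*, chunk */) {
    return { 'X-File-Token': authTokenFor(file.name) };
  };
}

// Hypothetical wiring:
// new Flow({ target: '/upload', headers: makeHeaders(lookupToken) });
```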
Thanks,
Daniel
I want to attach some data to an image file, like a name, description...
I set the data into flow.opt.query {name:'name image', description:'this is my image description'}
I'm sending multiple images, so I need to set a name and a description for each one.
How can I do it?
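A hedged sketch of one approach: flow.js accepts `query` as a function, evaluated per file/chunk, so per-image metadata can live in a map keyed by file name. The `metaByName` store below is illustrative; you would fill it as the user enters each name and description.

```javascript
// Per-file metadata store, filled when the user enters name/description.
var metaByName = {
  'cat.jpg': { name: 'name image', description: 'this is my image description' }
};

// flow.js would call this for each chunk and merge the result into the POST.
function perFileQuery(file /*, chunk */) {
  return metaByName[file.name] || {};
}

// Hypothetical wiring:
// new Flow({ target: '/upload', query: perFileQuery });
```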
I appreciate any help.
Hello,
Straight to the question: how do I merge all the chunks created in the temporary folder into a single file? :) I can't find a method for this.
Otherwise the lib is useless, I guess?
Hi,
Well done with the library! Although there currently doesn't seem to be much difference between this and resumable.js, I prefer flow's code structure.
Unfortunately it inherits the same big performance issue.
The "testChunks" idea is nice, but it is inefficient in its current form. If, say, someone has uploaded 5 of 10 chunks in a previous session and wishes to resume the transfer, wouldn't it be more practical for Flow to send only ONE single GET request to retrieve the size of what has already been uploaded to the server and continue from there, instead of testing all chunks from 1 to 10?
There are a couple more big problems, but let's start with this.
I am willing to collaborate.
Best regards,
Vlad
flow-node.js is so great; it let me handle uploads without changing a single line of code. I think it should become a node module.
Will flow.js work against the AWS S3 API? I.e. resumable uploads etc
I'm looking into building a client-side only S3 file browser/uploader/editor
https://github.com/jpillora/xdomain
https://github.com/jpillora/s3hook
https://github.com/jpillora/cms4
So here is what I need it for:
I cancel an upload:
file.cancel();
But on the server side, I'm stuck now with a temp file. And I want to delete it.
$http.delete('/rest/upload/' + file.uniqueIdentifier);
The thing is that the current chunk upload is not yet done when the temp-file deletion arrives via REST, so I'm deleting a temp file that is still in use by the upload.
So my workaround is to wait a little bit. But that's just awful.
file.cancel();
waitToCancel();
$http.delete('/rest/upload/' + file.uniqueIdentifier);
What I'd like to do is use a promise instead:
file.cancel().then(function(){
    $http.delete('/rest/upload/' + file.uniqueIdentifier);
});
I get:
Uncaught InvalidCharacterError: Failed to execute 'setAttribute' on 'Element': '0' is not a valid attribute name.
I'm just doing
flow.assignBrowse($(".attach-media")[0], true, true, "accept=image/*");
And .attach-media
does exist.
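A likely cause, offered as a guess: the fourth argument of assignBrowse is an attributes object, not a string. flow.js iterates over the attributes value, and iterating a string yields numeric keys like '0', which setAttribute then rejects as invalid attribute names. A sketch of the object form:

```javascript
// Attributes must be a plain object map: attribute name -> value.
function browseAttributes() {
  return { accept: 'image/*' };
}

// Hypothetical wiring:
// flow.assignBrowse($('.attach-media')[0], true, true, browseAttributes());
```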
On the Flow-Node backend I kept getting 'invalid_flow_request4' when uploading a file > 2 MB.
Fixed it myself by applying parseInt to the chunk size:
fileSize != ((totalSize % chunkSize) + parseInt(chunkSize)))
^ in the validateRequest function.
Not sure why the variable is cast to String in the first place (maybe because + is also used for concatenation?).
Not sure if it's just my setup or affects others, but it couldn't hurt to add it.
Added the issue in case anyone else is confused by this problem; it saves a lot of logging. (I logged for 10 minutes before I realised it was concatenating the two strings.)
The issue is outlined in detail here: flowjs/ng-flow/issues/110
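The string cast comes from the query string itself: every value returned by req.param(...) arrives as a string, so `+` concatenates instead of adding. A hedged sketch of normalizing the flow parameters once, up front (the helper name is mine, not part of the sample):

```javascript
// Coerce flow.js query parameters to the right types in one place.
function parseFlowParams(query) {
  return {
    chunkNumber: parseInt(query.flowChunkNumber, 10) || 0,
    chunkSize: parseInt(query.flowChunkSize, 10) || 0,
    totalSize: parseInt(query.flowTotalSize, 10) || 0,
    identifier: String(query.flowIdentifier || ''),
    filename: String(query.flowFilename || '')
  };
}
```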
Hello,
Thanks for the great work, first of all. This is a really nice piece of software that helps me upload large files, but I am not sure where to find better documentation for implementing it. Specifically, I currently need to get the file name in the 'fileSuccess' event. My goal is, after uploading, to get the file name (preferably with the file path) and make the uploaded video viewable on the current page. I change the file name on the server side, so I need the changed name in the 'fileSuccess' event, but sadly I cannot figure out how to solve this. Any kind of help is appreciated. Thank you again for making such nice software, and good luck.
First: love flow.js, especially ng-flow, great job.
Question: I need to pre-process the actual buffers for chunks before they're uploaded, with a couple of possible actions:
I can't see any way to do this at the moment, but have I missed something, or do I need to fork and if so, would you be interested in a PR with this functionality?
Suggested use cases:
I've recently implemented the node backend from here:
Then got my $.get as:
//'found', filename, original_filename, identifier
//'not_found', null, null, null
$.get = function(req, callback) {
    var chunkNumber = req.param('flowChunkNumber', 0);
    var chunkSize = req.param('flowChunkSize', 0);
    var totalSize = req.param('flowTotalSize', 0);
    var identifier = req.param('flowIdentifier', "");
    var filename = req.param('flowFilename', "");
    if (validateRequest(chunkNumber, chunkSize, totalSize, identifier, filename) == 'valid') {
        var chunkFilename = getChunkFilename(chunkNumber, identifier);
        fs.exists(chunkFilename, function(exists) {
            if (exists) {
                callback('found', chunkFilename, filename, identifier);
            } else {
                callback('not_found', null, null, null);
            }
        });
    } else {
        callback('not_found', null, null, null);
    }
};
My options in the ng-flow-standalone:
chunkSize: 1024 * 1024,
forceChunkSize: false,
simultaneousUploads: 5,
singleFile: false,
fileParameterName: 'file',
progressCallbacksInterval: 0, //instant feedback
speedSmoothingFactor: 1,
query: {},
headers: {},
withCredentials: false,
preprocess: null,
method: 'multipart',
prioritizeFirstAndLastChunk: false,
target: '/',
testChunks: true,
generateUniqueIdentifier: null,
maxChunkRetries: undefined,
chunkRetryInterval: undefined,
permanentErrors: [415, 500, 501],
onDropStopPropagation: false
I've tried logging every single step on the client side, and the problem doesn't appear to lie there, although it was a rather quick parse tree on paper.
It's really confusing: it tops out at the number of chunks allowed by simultaneousUploads, then either stops completely or takes far too long for the next chunks to start and register. I'm working on an ed-tech site and need fast uploads, or the lecturers and users will get annoyed.
Currently a 16 MB file is taking around 5 minutes, on fibre.
Can we get a little better example of error handling implemented in the demos?
RT.
Hi,
in the near future an API may be added to JavaScript: NetworkInformation, providing scripts with useful data such as "bandwidth" (estimated, or 0 if offline) and "metered" (pay-per-use)... I think it could be a good option to automatically pause the upload.
Wait and see? :)
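A speculative sketch of how that could look. The Network Information API is still a draft; property names such as `downlink` and `saveData` vary by browser, and `navigator.connection` may be absent entirely, so everything here is an assumption:

```javascript
// Decide whether to pause uploads based on connection info, if available.
function shouldPause(connection) {
  if (!connection) return false;              // API unavailable: keep uploading
  return connection.downlink === 0 ||         // effectively offline
         connection.saveData === true;        // user asked to save data
}

// Hypothetical wiring:
// if (shouldPause(navigator.connection)) flow.pause();
```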
Hello,
First things first: thanks for this awesome library!
I've encapsulated the code needed to handle FlowJs uploads on the server side of ASP.NET MVC into a couple of .cs files that are very easy to drag and drop into any project. The code can be found here.
If you find it useful, I thought it might be a good idea to include this link somewhere in the docs.
How can I upload an image using flow.js in CodeIgniter?
It works fine using plain PHP but not in CodeIgniter.
Please suggest a solution.
Looking at the current code/documentation, I cannot see a way for a file to be rejected by the server.
One thing required to reject a file is updating the chunkTest method to stop retrying when the server returns a specific error message.
Currently, the test does not send the file data, so to improve validation/rejection it would be nice if flow.js could send the initial chunk.
When this is enabled I would expect it to:
Not so much an issue as a query regarding the preprocessing that ng-flow does.
If you either select a folder or drag a large volume of files across for upload, ng-flow appears to do some preprocessing on all the files even before adding them to the flow list - is this right?
I'm seeing browser lock-up whilst this takes place and I'm wondering if I'm able to leverage the preProcessing property to stop ng-flow from doing this? Any feedback would be greatly appreciated.
With cross-domain uploads, every testChunks GET request is accompanied by an OPTIONS request. Allowing testChunk requests to be POST would reduce the number of requests needed to upload file(s).
I am using the node.js example backend.
The post method inside flow-node.js seems to be prone to a race condition.
The test if all chunks are up sometimes returns 'done' twice:
STATUS done ID 2203735-DSCN1322JPG
STATUS done ID 2203735-DSCN1322JPG
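One possible workaround, sketched under the assumption of a single Node process: remember which identifiers have already completed so the 'done' work runs only once. Behind a load balancer this in-memory set would not be enough; you would need a shared store or lock.

```javascript
// De-duplicate the 'done' callback per upload identifier.
var finished = {};

function onceDone(identifier, fn) {
  if (finished[identifier]) return false;  // already handled, skip
  finished[identifier] = true;
  fn(identifier);
  return true;
}

// Hypothetical wiring inside the sample's post callback:
// if (status == 'done' && onceDone(identifier, assembleFile)) { ... }
```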
Why not a button?
I'm trying to write a servlet based on sample src/resumable/js/upload/UploadServlet.java.
When I handle the POST requests, I'm not able to get any request parameters when it tries to build the ResumableInfo.
As far as I understand, the POST parameters cannot be read after calling request.getReader() or request.getInputStream(), can they?
In that case, either the Java servlet sample wouldn't be working, or I'm missing something.
Hi,
I have an image/folder manager where the user can switch between folders while uploads continuously transfer data to previous folders in the background. All folders are displayed in the same div (Angular). Whenever a user picks a folder, its contents are replaced with the contents of the selected folder.
To accomplish this setup I created a Flow instance for each folder. Whenever a user picks a folder, I need to "turn off" the listeners of the previously chosen folder Flow instance.
I can "turn on" by using 'assignDrop' but how do I "turn off" again?
Currently I am using a workaround where I added a field 'selected' and did:
flow.on('fileAdded', function(file){ return flow.selected && !!{png:1,gif:1,jpg:1,jpeg:1}[file.getExtension()]; });
Any suggestions?
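Flow's event emitter exposes `off` as well as `on`, so a hedged alternative to the `selected` flag is to keep a named reference to the handler and detach it when a folder loses focus (worth verifying `off` exists in your flow.js version):

```javascript
// Named handler so it can be removed later with flow.off().
function imageOnlyFilter(file) {
  var ok = { png: 1, gif: 1, jpg: 1, jpeg: 1 };
  return !!ok[file.getExtension()];
}

// Hypothetical wiring:
// flow.on('fileAdded', imageOnlyFilter);   // folder becomes active
// flow.off('fileAdded', imageOnlyFilter);  // folder deselected
```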
Given that the name in the bower.json file is "flow.js", some common grunt and gulp tasks fail because the folder name contains an extension.
I have looked for a way to override the folder name in my bower.json file without any luck. I'm having to manually exclude the flow.js folder which is causing the source files to not be included in my build.
hi,
how can I add a parameter to the flow with AngularJS? When the flow is initialized, I don't know the parameter yet. I tried binding to the filesAdded event and adding the opts there, but the flow does not "rebuild" the options array at that moment, so where can I do this?
br alex
Hi guys. First, let me thank you all for this awesome lib.
I don't know if I'm interpreting this right, but if a permanent error arises, shouldn't the chunks stop being sent to the server (using testChunks: true)?
I've added a CRC field using the preprocess method. I'm testing the CRC sent in every chunk on the server side and replying with 409 if a chunk doesn't match while a file resume is attempted. I've also added the 409 code as a permanent error.
The behaviour I'm noticing is that flow.js keeps sending chunks regardless of the configured permanent errors.
Is this an issue or a misinterpretation of the feature?
First, the Readme is outdated. It says to run npm install express,
but this now downloads Express 4.0, while the code depends on Express 3.x.
Error: Most middleware (like bodyParser) is no longer bundled with Express and must be installed separately. Please see https://github.com/senchalabs/connect#middleware.
So it should say npm install express@3
until the code is fixed for 4.x.
Then the upload simply never finishes on the server side.
I actually first tried the node.js sample from resumable.js, and after it worked fine I tried to change the directory of the uploaded files; at that point it behaved the same as the sample here does:
The line POST done <filename>
doesn't appear on the server side log (only partly_done
for all the chunks), but on the browser side it says fileSuccess FlowFile [...] complete
. The samples/Node.js/tmp directory remains empty.
I haven't really tested more and I don't think I have much motivation right now to do it.
node v0.10.26
chromium Version 35.0.1916.17
I want to have a preview of my files before upload; is it possible to do something like this?
f.on('fileAdded', function(file, event){
    var reader = new FileReader();
    reader.onload = function (e) {
        $('#preview').attr('src', e.target.result);
    };
    reader.readAsDataURL(file.blob);
    f.upload();
});
I'm a little confused by the download handler in the Node.js example.
For an image it opens a new window; for an .mp4 (and the other files I tried) it simply downloads the file.
I want to basically load an .mp4 in a <video>
tag; can I use this, or do I need another delivery mechanism?
Also, for something like an avatar, I suppose I would have to save the identifier to the user object like user.avatar = '/download/'+identifier;
and then in my angular app, can I simply do <img src="user.avatar">
Is that going to show an image, or open a new window?
This is a bug in IE10/IE11, see uploading-empty-file-ie10-ie11-hangs-indefinitely, but I think flow.js should handle it.
Should a 202 status code be considered a success?
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
10.2.3 202 Accepted
The request has been accepted for processing, but the processing has not been completed. The request might or might not eventually be acted upon, as it might be disallowed when processing actually takes place. There is no facility for re-sending a status code from an asynchronous operation such as this.
The 202 response is intentionally non-committal. Its purpose is to allow a server to accept a request for some other process (perhaps a batch-oriented process that is only run once per day) without requiring that the user agent's connection to the server persist until the process is completed. The entity returned with this response SHOULD include an indication of the request's current status and either a pointer to a status monitor or some estimate of when the user can expect the request to be fulfilled.
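Some flow.js versions make this configurable via a `successStatuses` option (worth verifying against the version you use); whatever the library does, the check itself is trivial:

```javascript
// Statuses treated as a successful chunk upload; 202 included deliberately,
// since the server has accepted the chunk even if processing is deferred.
var successStatuses = [200, 201, 202];

function isSuccess(status) {
  return successStatuses.indexOf(status) !== -1;
}
```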
First of all, this might be a basic question. I am working with flow.js to upload files to a Web API, and I need to set an id field on each upload. It appears the target attribute cannot be edited from the AngularJS controller; in my controller I tried this:
$scope.uploadImages = function (flowObject, id)
{
// flowObject.defaults.target = "/api/apiimages/" + id;
}
my api looks like this:
upload(int id){}
I have searched the internet a lot and couldn't find a solution to this. Thanks.
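A hedged suggestion: rather than mutating defaults from the controller, some flow.js versions accept `target` as a function evaluated per request, so the id can be resolved lazily. The getter callback below is an assumption:

```javascript
// Build a target function that reads the current id at request time.
function makeTarget(getCurrentId) {
  return function (/* file, chunk, isTest */) {
    return '/api/apiimages/' + getCurrentId();
  };
}

// Hypothetical wiring:
// flowFactoryProvider.defaults = {
//   target: makeTarget(function () { return $scope.id; })
// };
```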
For some reason img flow-img="$flow.files[0]" won't work twice.
So if you wanted a large preview and a thumbnail it's not possible?
Strange behaviour. If the first upload in a Chrome tab is "large" (perhaps requiring more than one chunk), the tab will crash. However, if you first upload a smaller file, then the same large file, it works fine. Interestingly, when I perform the same steps in Safari, it doesn't crash. However, it does seem to temporarily hang my progress bar, leading me to think that Chrome just has a shorter timeout before crashing. What is particular about the first upload that might cause this slowdown/crash?
Pausing the queue should be separate from pausing individual files. I can pause one file so that it is skipped when its turn comes. If, while the rest of the files are uploading, I pause the whole queue, the transfer stops. When I resume the queue, the transfer continues, while the specifically paused file remains paused.
I have problem similar to flowjs/flow-php-server#10.
I use the Nodejs example (https://github.com/flowjs/flow.js/tree/master/samples/Node.js) as foundation for my server and everything works fine. When all chunks are uploaded, I assemble the file and then I rename it based on a hash calculated using the file content.
How can I let flow.js know that the file name was changed manually, and how can I add this information to an event (e.g. fileSuccess)? I assume the proper way would be to change the filename in the result object.
The corresponding method in the app.js
node server is:
app.post('/upload', multipartMiddleware, function(req, res) {
    flow.post(req, function(status, filename, original_filename, identifier) {
        if (status == 'done') {
            var filepath = UPLOAD_DIR + filename;
            var stream = fs.createWriteStream(filepath);
            console.log(identifier);
            flow.write(identifier, stream);
            flow.clean(identifier);
            // Here I calculate a hash based on the uploaded file and rename it accordingly.
            getHashFromFile(UPLOAD_DIR, filename, function(hash) {
                var extension = path.extname(filename);
                fs.rename(UPLOAD_DIR + filename, UPLOAD_DIR + hash + extension, function(err) {
                    if (err) console.log('ERROR: ' + err);
                });
            });
        }
        if (ACCESS_CONTROLL_ALLOW_ORIGIN) {
            res.header("Access-Control-Allow-Origin", "*");
        }
        res.status(status).send();
    });
});
I've uploaded the entire file as Gist here: https://gist.github.com/Powernap/36c9b7b1701016ddc03d
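One hedged way to propagate the rename: include the final name in the response to the last chunk. On the client, the server's response text arrives as the `message` argument of 'fileSuccess' and can be parsed there. A sketch; the JSON body shape is my assumption, not part of the sample:

```javascript
// Server side: build the body returned once assembly and rename succeed.
function buildSuccessBody(hash, extension) {
  return JSON.stringify({ filename: hash + extension });
}

// Hypothetical wiring:
// server: res.status(200).send(buildSuccessBody(hash, extension));
// client: flow.on('fileSuccess', function (file, message) {
//   var renamed = JSON.parse(message).filename;
// });
```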
An exponential backoff algorithm should be used to prevent Flow.js from causing traffic spikes.
Same as 23/resumable.js#138
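flow.js only exposes fixed maxChunkRetries/chunkRetryInterval options, so a capped exponential schedule would need to be computed outside the library; the schedule itself is simple:

```javascript
// Delay before retry number `attempt` (0-based), doubling up to a cap.
function backoffDelay(attempt, baseMs, capMs) {
  return Math.min(baseMs * Math.pow(2, attempt), capMs);
}
```

With a 500 ms base and a 30 s cap, attempt 0 waits 500 ms, attempt 3 waits 4000 ms, and attempt 10 is clamped to 30000 ms. Adding random jitter on top would further spread out simultaneous retries.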
Hello there
Is this really the official, next version of https://github.com/23/resumable.js ?
I realise it has some flaws, and I wonder if I really should use yours instead. Which one is better maintained now, and which will be in the future?
Lemme know, thanks!
I faced this error with FF 28.0
[Exception... "A parameter or an operation is not supported by the underlying object" code: "15" nsresult: "0x8053000f (InvalidAccessError)" location: ""]
while trying to make a CORS request.
Similar problem is described here http://stackoverflow.com/questions/16677893/evil-firefox-error-a-parameter-or-an-operation-is-not-supported-by-the-under
I solved it by adding the 3rd parameter (async) on line 1370 of your code:
this.xhr.open(method, target, true);
My config is below:
flowFactoryProvider.defaults = {
    target: 'http://up.qiniu.com',
    singleFile: true,
    permanentErrors: [404, 500, 501],
    minFileSize: 0,
    testChunks: false,
    query: {
        token: config.qiniu.token
    }
};
Hello, I'm a server API developer trying to build an API for flow.js to talk to. The current default for simultaneousUploads
is 3. This means that flow.js will make 3 simultaneous requests to the server API when starting an upload. This makes it hard for the server to do any per-file initialization work; it's very likely that all 3 chunks will be seen as the "first" chunk, causing the per-file initialization code to kick off 3 times. (It's further complicated by multiple servers behind a load balancer.)
Other multipart upload APIs (like S3 or Dropbox) solve this by having the client send a single initial request per file. This initial request allows the server to do any per-file initialization and then typically returns a server-side generated identifier that the client should use in future requests. The client can then send multiple simultaneous chunks to the server. Once all the chunks are uploaded, the client sends a file-complete request, with the identifier, to the server. At this point the server knows that the file is done.
Is it possible to get flow.js to follow part of this pattern? Right now I'm most interested in getting an initial per-file request before receiving simultaneous chunked data requests. Any idea on how that can be done?
I was thinking a fileStarting
event could be added. Thoughts?
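Absent a built-in fileStarting event, one hedged workaround is to pause each file in 'fileAdded', send a single start request, and resume once the server replies. The '/upload/start' endpoint and its payload are assumptions, not flow.js API:

```javascript
// Payload for the one-time per-file "start" request.
function initPayload(file) {
  return {
    flowIdentifier: file.uniqueIdentifier,
    flowFilename: file.name,
    flowTotalSize: file.size
  };
}

// Hypothetical wiring:
// flow.on('fileAdded', function (file) {
//   file.pause();
//   fetch('/upload/start', { method: 'POST', body: JSON.stringify(initPayload(file)) })
//     .then(function () { file.resume(); });
// });
```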
The main library file has 1488 lines of code; each function should be split into a separate file. This will also need a build script to glue the library into one piece.
Maybe we could use some ideas from https://github.com/angular/angular.js, but much simpler.
Right now only xhr.responseText is exposed as $message; it would be nice to also get xhr.statusText, especially for error codes.
Hey Guys.
I think I've found an issue with using preprocess and filesuccess. I'm calling preprocessFinished on each chunk from inside of my preprocess function. The problem is that immediately after my preprocess function returns you have this line in the chunk send method:
this.preprocessState = 1;
So, even though the state was just set to 2 in the preprocessFinished method, it has now been set back to 1 after my custom function runs.
Because all chunks stay at 1, isComplete never returns true, and the fileSuccess handler is never called because of this if statement:
if (this.isComplete()) {
    this.currentSpeed = 0;
    this.averageSpeed = 0;
    this.flowObj.fire('fileSuccess', this, message);
}
Am I approaching this wrong at all? Why does the preprocessState of a chunk need to be set to 1 after it's just been set to 2?
I've basically just commented out the line that sets the preprocessState to 1.
Thanks,
Steven Tate
I ran into a bug where multiple separate uploads had the same unique ID. Because it uses file size in generating the IDs, generateUniqueIdentifier() will generate the same uniqueId for identical files, making a server extremely confused during chunked uploads.
I propose using a timestamp (or random number) + path instead of file size.
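A sketch along those lines, using the `generateUniqueIdentifier` option. The cleaning regex mirrors what the default identifier applies to the path; the timestamp-plus-random scheme is my suggestion, not the library's:

```javascript
// Identifier from relative path plus a timestamp and random suffix,
// so identical files chosen twice still get distinct ids.
function uniqueIdentifier(file) {
  var clean = (file.relativePath || file.name).replace(/[^0-9a-zA-Z_-]/g, '');
  return Date.now().toString(36) + '-' +
         Math.random().toString(36).slice(2, 10) + '-' + clean;
}

// Hypothetical wiring:
// new Flow({ generateUniqueIdentifier: uniqueIdentifier });
```

Note the trade-off: a random id means a re-selected file can no longer be matched to its earlier chunks, so resuming across sessions would need the id persisted client-side.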