
Laravel Chunk Upload


Introduction

Laravel Chunk Upload simplifies chunked uploads with support for multiple JavaScript libraries atop Laravel's file upload system, designed with a minimal memory footprint. Features include cross-domain request support, automatic cleaning, and intuitive usage.

For an example repository with integration tests, see laravel-chunk-upload-example.

Before contributing, familiarize yourself with the guidelines outlined in CONTRIBUTION.md.

Installation

1. Install via Composer

composer require pion/laravel-chunk-upload

2. Publish the Configuration (Optional)

php artisan vendor:publish --provider="Pion\Laravel\ChunkUpload\Providers\ChunkUploadServiceProvider"

Usage

The setup involves three steps:

  1. Integrate your controller to handle file uploads. Instructions
  2. Define a route for the controller. Instructions
  3. Select your preferred frontend provider (multiple providers are supported in a single controller).
| Library | Wiki | Single & Chunk Upload | Simultaneous Uploads | Included in Example Project | Author |
|---|---|---|---|---|---|
| resumable.js | Wiki | ✔️ | ✔️ | ✔️ | @pionl |
| DropZone | Wiki | ✔️ | ✔️ | ✔️ | @pionl |
| jQuery-File-Upload | Wiki | ✔️ | ✖️ | ✔️ | @pionl |
| Plupload | Wiki | ✔️ | ✖️ | ✖️ | @pionl |
| simple uploader | ✖️ | ✔️ | ✖️ | ✖️ | @dyktek |
| ng-file-upload | Wiki | ✔️ | ✖️ | ✖️ | @L3o-pold |
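Steps 1 and 2 can be sketched roughly as follows, based on the FileReceiver pattern shown in the package's own examples. The route path, controller name, and JSON response shapes are illustrative, and details differ per frontend provider:

```php
<?php
// routes/web.php -- step 2: route the upload endpoint to the controller
Route::post('/upload', [UploadController::class, 'upload']);

// app/Http/Controllers/UploadController.php -- step 1: handle each chunk
use Illuminate\Http\Request;
use Pion\Laravel\ChunkUpload\Exceptions\UploadMissingFileException;
use Pion\Laravel\ChunkUpload\Handler\HandlerFactory;
use Pion\Laravel\ChunkUpload\Receiver\FileReceiver;

class UploadController extends Controller
{
    public function upload(Request $request)
    {
        // Pick the matching handler for the frontend provider automatically
        $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

        if (!$receiver->isUploaded()) {
            throw new UploadMissingFileException();
        }

        $save = $receiver->receive();

        if ($save->isFinished()) {
            // All chunks received: store the assembled file as you need
            return response()->json(['name' => $save->getFile()->getClientOriginalName()]);
        }

        // Not finished yet: report progress back to the uploader
        return response()->json(['done' => $save->handler()->getPercentageDone()]);
    }
}
```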

Simultaneous Uploads: The library must send the last chunk as the final one to ensure correct merging.

Custom Disk: Currently, it's recommended to use the basic storage setup (not linking the public folder). If you have time to verify its functionality, please PR the changes!

For detailed information and tips, refer to the Wiki, or explore a working example in the separate example repository.

Changelog

View the changelog in releases.

Contribution or Extension

Review the contribution guidelines before submitting your PRs (and utilize the example repository for running integration tests).

Refer to CONTRIBUTING.md for contribution instructions. All contributions are welcome.

Compatibility

Though not tested via automation scripts, Laravel 5/6 should still be supported.

| Version | PHP |
|---|---|
| 11.* | 8.2 |
| 10.* | 8.1, 8.2 |
| 9.* | 8.0, 8.1 |
| 8.* | 7.4, 8.0, 8.1 |
| 7.* | 7.4 |

Copyright and License

laravel-chunk-upload was authored by Martin Kluska and is released under the MIT License.

Copyright (c) 2017 and beyond Martin Kluska and all contributors (Thank you ❤️)

laravel-chunk-upload's People

Contributors

buschmann23, colbydude, dgitts, drjdr, dyktek, hotaibi, ivandokov, joni2back, klimov-paul, l3o-pold, maksida, pionl, r0aringthunder, rpaggi, rtippin, shivank44, thefrankman, tomswinkels, torta, trideout, vedmant


laravel-chunk-upload's Issues

Lumen support?

Anyone had any luck getting this package to work with Lumen?

Division by 0 error

I have noticed in my error logs that this error pops up every now and then. Just now I ran into it for the first time myself.
What's weird is that I just tested a file which works on localhost, but I get a division by zero error in production.

After digging a little deeper, the issue seems to be that if it is not a chunked upload, $save->isFinished() is false, and then $handler->getPercentageDone() throws this error. It might be due to a slow upload connection, which would explain why it works locally and not in production?

(I am not too sure if this issue comes up when it is a chunked upload)

Whoops, looks like something went wrong.
1/1
ErrorException in ContentRangeUploadHandler.php line 170:
Division by zero
in ContentRangeUploadHandler.php line 170
at HandleExceptions->handleError(2, 'Division by zero', '/home/clooudtv/httpdocs/vendor/pion/laravel-chunk-upload/src/Handler/ContentRangeUploadHandler.php', 170, array()) in ContentRangeUploadHandler.php line 170
at ContentRangeUploadHandler->getPercentageDone() in MediaController.php line 146
at MediaController->directUploadHandler(object(UploadMediaRequest)) in MediaController.php line 65
at MediaController->uploadHandler(object(UploadMediaRequest))
at call_user_func_array(array(object(MediaController), 'uploadHandler'), array(object(UploadMediaRequest))) in Controller.php line 55
at Controller->callAction('uploadHandler', array(object(UploadMediaRequest))) in ControllerDispatcher.php line 44
at ControllerDispatcher->dispatch(object(Route), object(MediaController), 'uploadHandler') in Route.php line 203
at Route->runController() in Route.php line 160
at Route->run() in Router.php line 559
at Router->Illuminate\Routing\{closure}(object(Request)) in Pipeline.php line 30
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in CheckReferral.php line 27
at CheckReferral->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in NotBlocked.php line 24
at NotBlocked->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in SubstituteBindings.php line 41
at SubstituteBindings->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in VerifyCsrfToken.php line 65
at VerifyCsrfToken->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in ShareErrorsFromSession.php line 49
at ShareErrorsFromSession->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in StartSession.php line 64
at StartSession->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in AddQueuedCookiesToResponse.php line 37
at AddQueuedCookiesToResponse->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in EncryptCookies.php line 59
at EncryptCookies->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in Pipeline.php line 102
at Pipeline->then(object(Closure)) in Router.php line 561
at Router->runRouteWithinStack(object(Route), object(Request)) in Router.php line 520
at Router->dispatchToRoute(object(Request)) in Router.php line 498
at Router->dispatch(object(Request)) in Kernel.php line 174
at Kernel->Illuminate\Foundation\Http\{closure}(object(Request)) in Pipeline.php line 30
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in TransformsRequest.php line 30
at TransformsRequest->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in CheckForMaintenanceMode.php line 46
at CheckForMaintenanceMode->handle(object(Request), object(Closure)) in Pipeline.php line 148
at Pipeline->Illuminate\Pipeline\{closure}(object(Request)) in Pipeline.php line 53
at Pipeline->Illuminate\Routing\{closure}(object(Request)) in Pipeline.php line 102
at Pipeline->then(object(Closure)) in Kernel.php line 149
at Kernel->sendRequestThroughRouter(object(Request)) in Kernel.php line 116
at Kernel->handle(object(Request)) in index.php line 57
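One way to sidestep the report above is to compute the percentage defensively, so an unknown or zero total never divides by zero. This is a hypothetical standalone helper, not part of the package's API:

```php
<?php
// Hypothetical defensive helper: returns 0 instead of dividing by zero
// when the total byte count is missing (e.g. a non-chunked upload).
function safePercentageDone(int $bytesEnd, int $bytesTotal): int
{
    if ($bytesTotal <= 0) {
        return 0; // nothing to report yet; avoids "Division by zero"
    }

    return (int) floor(($bytesEnd / $bytesTotal) * 100);
}
```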

Integration with resumable.js and chunk deletion once buildFullFileFromChunks()

Hi,
although this library is compatible with the most popular uploaders, I've managed to integrate it into a project that uses dropzone.js as the uploader and resumable.js for chunk management. (I am unable to use an alternate upload system.)

I've observed that the chunks are not deleted once buildFullFileFromChunks() is called in the ChunkSave class. When uploading the same file several times, the last .part container grows and grows, since it has the same path.

My question is: why aren't the chunks deleted when the final file is merged? That would also remove the need to run the uploads:clear command.

Chunks merged in wrong order

Hi, I've faced a weird issue with this package using dropzone.js.
Lib version: "pion/laravel-chunk-upload": "^1.2"

Here are my dropzone configs:

{
...
       maxFilesize: 300, // Mb
       timeout: 60000 * 3,
       chunking: true,
       acceptedFiles: '.zip',
       chunkSize: 10000000,
       autoProcessQueue: false,
       maxFiles: 1,
}

And settings from chunk-upload.php:

'chunk' => [
        // setup for the chunk naming setup to ensure same name upload at same time
        'name' => [
            'use' => [
                'session' => false, // should the chunk name use the session id? The uploader must send cookie!,
                'browser' => true, // instead of session we can use the ip and browser?
            ],
        ],
    ],

When uploading a large file (around 230 Mb), I've sometimes faced an issue where chunks are merged in the incorrect order and half of the chunks are ignored.

Here are the ls result from storage/app/chunks folder with this bug reproduced: https://imgur.com/a/rfF3osg

As you can see, in the first screenshot we have 24 chunks, each 10 Mb in size. In the second screenshot you can see that the chunks get merged incorrectly, in random order (part 1 and then 20 -
This bug does not reproduce every time (only in about 50% of cases) and I don't know why.

Any help or advice? I've tried several chunk sizes from 2 Mb to 10 Mb; the larger the chunk size, the less often this bug reproduces (based on my observations).

Chunk are appended in the wrong order when the upload creates 10 chunks or more

If you make an upload that results in more than 10 chunks, they are appended in the wrong order.
I think this has to do with the sort function, which does not take the part numbers into account correctly:
file xxxxxxxx-10.part is appended before xxxxxxxx-2.part because of the sorting.

I have already applied the fix proposed in another thread, setting browser = true and session = false
for the filenames, so I made sure my part filenames are consistent throughout the whole upload.

I am using dropzone.js to handle the upload client side, and I checked that the chunk indexes, order and total are all correct.

Sorting the files with natural case seems to fix the problem.
In the function buildFullFileFromChunks() from ParallelSave.php,

$chunkFiles = $this->savedChunksFiles()->sort();

I replaced this with:

// Sorting file collection with natural case so xxxx-10.part does not get before xxxx-2.part
$chunkFiles = $this->savedChunksFiles();
$items = $chunkFiles->all();
natcasesort($items);
$chunkFiles = collect($items);

And it solves the problem.

If this is a mistake on my part that can be resolved without updating the plugin, I'll be happy to be directed to the solution.

Apart from this, it is an excellent plugin :-).
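The difference between the default lexicographic sort and the proposed natural-order sort can be seen in a standalone PHP demo (independent of the package), using hypothetical part filenames:

```php
<?php
// Standalone demo of why a plain sort() misorders chunk part files
// once there are ten or more chunks: "10" sorts before "2" as text.
$parts = ['xxxx-1.part', 'xxxx-10.part', 'xxxx-11.part', 'xxxx-2.part'];

sort($parts); // lexicographic order: 1, 10, 11, 2 -- wrong for merging
$lexicographic = $parts;

natcasesort($parts); // natural order: 1, 2, 10, 11 -- correct merge order
$natural = array_values($parts); // natcasesort() keeps keys, so reindex
```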

Remove Laravel dependency

Hi, I'm using individual Illuminate packages instead of the full Laravel framework, and after installing this package it pulled in the whole Laravel framework, which I don't need. Could you remove the direct laravel dependency and instead depend only on the particular Illuminate packages it requires? It would be great to allow it to work standalone in general.

Error installing package

[Symfony\Component\Debug\Exception\FatalThrowableError]
Cannot use object of type Illuminate\Console\Scheduling\Schedule as array

After this, composer update does not work.
php artisan commands stop working after installing this package.
I removed this package from composer.json, and then everything was back to normal.
my composer.json file

{
    "name": "laravel/laravel",
    "description": "The Laravel Framework.",
    "keywords": ["framework", "laravel"],
    "license": "MIT",
    "type": "project",
    "require": {
        "php": ">=5.5.9",
        "laravel/framework": "5.1.*",
        "laravelcollective/html": "5.1.*",
        "zizaco/entrust": "dev-laravel-5",
        "ellipsesynergie/api-response": "^0.12.1",
        "oriceon/oauth-5-laravel": "dev-master",
         "guzzlehttp/guzzle": "4.0",
        "nesbot/carbon": "~1.18",
        "paypal/rest-api-sdk-php": "*",
        "webup/laravel-sendinblue": "^1.0",
        "pion/laravel-chunk-upload": "^0.3.1"
    },
    "require-dev": {
        "fzaninotto/faker": "~1.4",
        "mockery/mockery": "0.9.*",
        "phpunit/phpunit": "~4.0",
        "phpspec/phpspec": "~2.1",
        "laracasts/generators": "^1.1",
        "barryvdh/laravel-ide-helper": "^2.1"
    },
    "autoload": {
        "classmap": [
            "database"
        ],
        "psr-4": {
            "App\\": "app/"
        }
    },
    "autoload-dev": {
        "classmap": [
            "tests/TestCase.php"
        ]
    },
    "scripts": {
        "post-install-cmd": [
            "php artisan clear-compiled",
            "php artisan optimize"
        ],
        "post-update-cmd": [
            "php artisan clear-compiled",
            "php artisan ide-helper:generate",
            "php artisan optimize"
        ],
        "post-root-package-install": [
            "php -r \"copy('.env.example', '.env');\""
        ],
        "post-create-project-cmd": [
            "php artisan key:generate"
        ]
    },
    "config": {
        "preferred-install": "dist"
    }
}

Proposal: Register upload handlers in config

It would be great if handler registration lived in the config file instead of in the HandlerFactory. Handlers could then be registered similarly to providers, which would make the library more extendable.

  • Ability to remove unused handlers
  • Developers could extend the library more easily (without having to implement the whole library)

What do you think about this?

Proposal: Should ignore user abort

I think ignore_user_abort(true) should be used inside the ParallelSave::buildFullFileFromChunks method while it merges the chunk files. If the client disconnects during that process, some of the chunks get deleted and a partially merged file is created. By ignoring user abort, the process could finish without any problem.

This also applies to ChunkSave. During the merge of the last chunk user abort could be ignored. With this, the last merge and the move could finish without interruption.

This could lead to some inconsistency after the client reconnects.

Maybe this could be configurable and disabled by default.
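The proposal above can be sketched in plain PHP. The mergeWithoutAbort() wrapper and its callback are hypothetical illustrations, not the package's code:

```php
<?php
// Hypothetical sketch of the proposal: protect the chunk merge from
// client disconnects. mergeWithoutAbort() and the callback are
// illustrative names, not part of the package's API.
function mergeWithoutAbort(callable $mergeChunks): void
{
    $previous = ignore_user_abort(true); // keep running if the client disconnects
    try {
        $mergeChunks(); // e.g. append every .part file to the final file
    } finally {
        ignore_user_abort((bool) $previous); // restore the previous setting
    }
}
```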

mkdir(): Permission denied

Hello,
We are having an issue with the uploader, When uploading a file it throws an error on createChunksFolderIfNeeded() line 216 in Save\ChunkSave
$path appears to be the directory specified in storage.chunks, whilst mkdir() needs an absolute path.
dump($path) prints 'chunks/'

Thank you,
Nick

All chunked upload are being cancelled on the server (Xampp working fine)

I am using Vue and Vue Dropzone for the Frontend.
Everything is working fine on my local Xampp.
For files smaller than my configured chunk size I am using a non-chunked upload with Laravel's file upload methods.
Only files bigger than the chunk size are processed by laravel-chunk-upload.

In my productive environment on my managed server chunked uploads are always cancelled and a 500 error is thrown.
It looks like no chunks are being created on the server at all.
My Laravel logfile states this error:

[2018-05-11 07:56:39] production.ERROR: The file "testfile.zip" was only partially uploaded. {"userId":1,"email":"[email protected]","exception":"[object] (Pion\Laravel\ChunkUpload\Exceptions\UploadFailedException(code: 500): The file "testfile.zip" was only partially uploaded. at /html/myproject/vendor/pion/laravel-chunk-upload/src/Receiver/FileReceiver.php:66)

Any ideas?

$save->isFinished() always false

Thanks for the great package. I am facing a problem with both dropzone and resumable.js. I used the exact same code from the example, and everything works so far, but the

$save = $receiver->receive();
 // check if the upload has finished (in chunk mode it will send smaller files)
 if ($save->isFinished()) {

above condition always fails, and in the last chunk response

$handler->getPercentageDone()

this gives 100 percent, but since isFinished() returns false the actual file is never saved.
I have tried with both dropzone and resumable.js; both give the same result.

Can't get the file complete

I don't know if I'm using the library wrong, but I always get an incomplete file.
I'm using Laravel 5.3 with Homestead, and plupload for the frontend.

In the example you have the following lines:
// save the file and return any response you need
return $this->saveFile($save->getFile());

I suppose that we have to implement the saveFile method, but when I try to move the file
to the new location it looks like a chunk.
When I look at the storage_path, sometimes just one part of the blob appears, most of the time the last part.

Can you provide a more explicit example or is there something I'm missing?
Thanks by the way, nice work.

Laravel 5.6

Adjust composer.json to include Laravel 5.6

Cross upload on different tab (chunk merging issue)

Hi,

when I try to upload audio files from two different tabs at the same time, the audio gets damaged.
I'm using a custom front end library with a custom handler.
The final chunk is the last to be uploaded, as described in your documentation.

Any ideas?

Thanks in advance

Support Multipart Upload API for S3

Hello! I have a problem with chunking: I use dropzone.js and Laravel to upload large files. I expect a large file to be uploaded to S3, but only the last of its parts gets uploaded! As far as I can see, the script doesn't glue these parts together as I expected. That's why I had one file before uploading and N files in S3. But I need one large file in S3 as well. Is there any way to do that? Thank you!

Question about ChunksInRequestUploadHandler class

Hi, I'd like to know why the "+1" on line 50 of Handler/ChunksInRequestUploadHandler.php:

// the chunk is indexed from zero (for 5 chunks: 0,1,2,3,4)
$this->currentChunk = $request->get("chunk") + 1;

in the constructor of the class. Is it for compatibility with plupload's and blueimp's uploaders? I'm experiencing problems with the chunk count when using the resumable.js library. In that library the first chunk already starts from 1, not from 0, so isLastChunk() returns a false positive and the file is merged before the real last chunk.

Any ideas?
Thanks,
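The 0-based vs 1-based mismatch described above can be shown standalone. This isLastChunk() helper is hypothetical; the real handler reads these values from the request:

```php
<?php
// Hypothetical helper showing why the "+1" matters: plupload/blueimp send
// 0-based chunk indexes, while resumable.js already sends 1-based ones.
function isLastChunk(int $index, int $totalChunks, bool $zeroBased): bool
{
    $current = $zeroBased ? $index + 1 : $index; // normalise to 1-based
    return $current === $totalChunks;
}
```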

Wrong detection of upload finish on 32bit PHP

Hi!

On 32bit PHP the maximum of an integer is 2147483647.
In the ContentRangeUploadHandler class, on lines 93, 94 and 95, intval() is used, so bytesStart, bytesEnd and bytesTotal max out at 2147483647. After about 2 GB the script therefore thinks the upload is finished.
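A standalone illustration of one workaround direction (a sketch, not the package's fix): parsing the header values as floats keeps byte counts exact well past 2^31, since floats represent integers exactly up to 2^53.

```php
<?php
// On 32-bit PHP, (int) "3000000000" would clamp to PHP_INT_MAX (2147483647),
// so a ~3 GB Content-Range total would look "finished" after ~2 GB.
// Parsing as float avoids the 32-bit integer ceiling.
$bytesTotal = (float) "3000000000";
$bytesEnd   = (float) "2147483647";
$finished   = ($bytesEnd + 1) >= $bytesTotal; // false: the upload is not done yet
```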

Some of the chunks are skipped when merging

While looking into where the issue comes from, I got it fixed by
adding sleep(3); right after
protected function buildFullFileFromChunks()
and before
$chunkFiles = $this->getSavedChunksFiles()->all();

in ParallelSave.php

The issue is easy to observe when uploading larger files. If the chunks are 2 MB in size, more chunks are left over; if the chunk size is 20 MB, only 1-2 files are left over (for a 1.3 GB file). It is the same with dropzone and resumable, and it occurs when uploading chunks in parallel. If I switch parallel uploading to false, the upload is OK.

So my theory is that when the last chunk gets uploaded, the full-file reconstruction starts, but I don't see a check that all chunks have been uploaded before the reconstruction starts.

Error when installing via composer

I am attempting to clone an existing Laravel application and get it running locally. When I run composer install it has the following error:

Your requirements could not be resolved to an installable set of packages.

  Problem 1
    - Installation request for pion/laravel-chunk-upload ^0.2.3 -> satisfiable by pion/laravel-chunk-upload[v0.2.3].
    - Conclusion: don't install laravel/framework v5.2.45
    - Conclusion: don't install laravel/framework v5.2.44
    - Conclusion: don't install laravel/framework v5.2.43
    - Conclusion: don't install laravel/framework v5.2.42
    - Conclusion: don't install laravel/framework 5.2.41
    - Conclusion: don't install laravel/framework v5.2.40
    - Conclusion: don't install laravel/framework v5.2.39
    - Conclusion: don't install laravel/framework v5.2.38
    - Conclusion: don't install laravel/framework v5.2.37
    - Conclusion: don't install laravel/framework v5.2.36
    - Conclusion: don't install laravel/framework v5.2.35
    - Conclusion: don't install laravel/framework v5.2.34
    - Conclusion: don't install laravel/framework v5.2.33
    - Conclusion: don't install laravel/framework v5.2.32
    - Conclusion: don't install laravel/framework v5.2.31
    - Conclusion: don't install laravel/framework v5.2.30
    - Conclusion: don't install laravel/framework v5.2.29
    - Conclusion: don't install laravel/framework v5.2.28
    - Conclusion: don't install laravel/framework v5.2.27
    - Conclusion: don't install laravel/framework v5.2.26
    - Conclusion: don't install laravel/framework v5.2.25
    - Conclusion: don't install laravel/framework v5.2.24
    - Conclusion: don't install laravel/framework v5.2.23
    - Conclusion: don't install laravel/framework v5.2.22
    - Conclusion: don't install laravel/framework v5.2.21
    - Conclusion: don't install laravel/framework v5.2.20
    - Conclusion: don't install laravel/framework v5.2.19
    - Conclusion: don't install laravel/framework v5.2.18
    - Conclusion: don't install laravel/framework v5.2.17
    - Conclusion: don't install laravel/framework v5.2.16
    - Conclusion: don't install laravel/framework v5.2.15
    - Conclusion: don't install laravel/framework v5.2.14
    - Conclusion: don't install laravel/framework v5.2.13
    - Conclusion: don't install laravel/framework v5.2.12
    - Conclusion: don't install laravel/framework v5.2.11
    - Conclusion: don't install laravel/framework v5.2.10
    - Conclusion: don't install laravel/framework v5.2.9
    - Conclusion: don't install laravel/framework v5.2.8
    - Conclusion: don't install laravel/framework v5.2.7
    - Conclusion: don't install laravel/framework v5.2.6
    - Conclusion: don't install laravel/framework v5.2.5
    - Conclusion: don't install laravel/framework v5.2.4
    - Conclusion: don't install laravel/framework v5.2.3

The relevant line from composer.json:

        "pion/laravel-chunk-upload": "^0.2.3",

chunks not being assembled

Hi
thank you for your useful package.
I started using chunk-upload last week and at first it worked great, but after some customization of your sample code, using your advanced sample, it stopped working.

The problem is that the file is completely uploaded, but only the last chunk is saved in the upload folder, while the other chunks are stored in the chunk folder and have not been assembled.

After that I uploaded the exact sample project to the server, but it did not work correctly and had the same problem.

I have been using Laravel 5.5 as a RESTful server, and my front end, including dropzone.js, is on another server that uses the API to send requests.
this is my front end js code:
var myDropzone = new Dropzone("#my-awesome-dropzone", {
    // Setup chunking
    chunking: true,
    url: window.RequestUrl + "api/v1/upload",
    method: "POST",
    maxFilesize: 400000000,
    chunkSize: 1000000,
    // If true, the individual chunks of a file are being uploaded simultaneously.
    parallelChunkUploads: true
});

and this is my html:

and these are the server responses I got when uploading a 1.1 Mb file.

{"path":"upload/application-octet-stream/2018-07-27/","name":"1_ff5ac9c86fe043298217e80c071d4828.jpg","mime_type":"application-octet-stream"}

{"done":50,"status":true}

Please help me.
Regards.

Laravel 5.8 support

Hi! Is it possible to tag a release that can be installed with Laravel 5.8.*?
Thanks in advance!

Upload error not handled

Hi!

This almost made me insane. It took me two hours of debugging on a test server.
When an upload error occurs, the code gives a "Failed to open input stream" error message, because the first place this causes a problem is at ChunkSave:191, where the path name of the uploaded file is empty.
I think the constructor of FileReceiver should handle upload errors, but this is more of a feature request than a bug.

Strange issue with `$save->getFile()`

I'm having an issue with chunked uploads at $save->getFile(): the example method below returns the last uploaded chunk even after the $save->isFinished() check passes.

The uploadFile method is mostly identical to the provided example:

    public function uploadFile(FileReceiver $receiver, Request $request) {
        if ($receiver->isUploaded()) {
            $save = $receiver->receive();
            if ($save->isFinished()) {
                return $this->saveFile($save->getFile(), $request);
            } else {
                /** @var AbstractHandler $handler */
                $handler = $save->handler();
                
                return response()->json([
                    "done" => $handler->getPercentageDone(),
                ]);
            }
        } else {
            throw new UploadMissingFileException();
        }
    }

Am I doing something wrong?

For example, when I upload a file of about 8MB with a chunk size of 1MB, the chunks are all uploaded correctly and .part files are created, but this is what $save->getFile() returns (it's only the last chunk, with a size of about 890KB):

UploadedFile {#313
  -test: true
  -originalName: "blob"
  -mimeType: "application/octet-stream"
  -size: 894516
  -error: 0
  #hashName: null
  path: "/Users/user/Sites/project/storage/app/chunks"
  filename: "blob-MQzSnbPSF2TpIxPeAYY5WzhgqdcAXTY5KzkVJfEE-8.part"
  basename: "blob-MQzSnbPSF2TpIxPeAYY5WzhgqdcAXTY5KzkVJfEE-8.part"
  pathname: "/Users/user/Sites/project/storage/app/chunks/blob-MQzSnbPSF2TpIxPeAYY5WzhgqdcAXTY5KzkVJfEE-8.part"
  extension: "part"
  realPath: "/Users/user/Sites/project/storage/app/chunks/blob-MQzSnbPSF2TpIxPeAYY5WzhgqdcAXTY5KzkVJfEE-8.part"
  aTime: 2018-01-19 11:21:19
  mTime: 2018-01-19 11:21:19
  cTime: 2018-01-19 11:21:19
  inode: 8597363617
  size: 894516
  perms: 0100644
  owner: 501
  group: 20
  type: "file"
  writable: true
  readable: true
  executable: false
  file: true
  dir: false
  link: false
}

I store what $save->getFile() returns, which means that any chunked file is corrupted because it's composed of only its last chunk's bytes.

I'm using the plupload ChunksInRequestUploadHandler handler.
I'm on Laravel 5.5.31.

Incorrect file name

The result of sending a 543 KB file:
ext: "png" filename: "Desktop Screenshot 2018.12.13 - 15.24.16.57.png" mime: "image-png" size: 556526

The result of sending a file over 10 MB:
ext: "" filename: "blob" mime: "video-mp4" size: 33050437

This happens every time I use plupload.

How to work with the response from the file upload?

I use dropzone.js and Laravel 5.7.

My script offers the option to add multiple images, and they get uploaded together with some extra info for each image, as well as an album name and year. It worked fine locally, but on shared hosting I had issues, so I decided I needed to look into chunking.

The first issue was that the album info got entered into the db for each chunk, so I thought I should first upload the images to the server and, once they are all uploaded, start another AJAX call to enter the data for the album and the images into the database, resize the images, and move them into another folder.

The problem I now realised is that when I upload 1 file of 2.5 Mb, I get 3 POST requests and the 3 responses below. As you can see, I would only want the first one. Why are there 2 files in application-octet-stream? Are they the chunks? I don't have a chunks folder anymore, and I have no idea why it is not created anymore. The problem is that I get the name from the last POST response, but that is the name of a chunk, not of a file. I just don't know how to handle this so that I can send multiple files, where each file has its own form data plus other form data like album name and description, and the files have to stay in the order they are in the dropzone queue, since I use that as the sort order.
I have posted part of my dropzone code below, where I try to work with the response from the file upload.

{"path":"upload\/image-jpeg\/2019-01-04\/","name":"20180714_121742_7c8bceedd34b1f605110936faec59d9e.jpg",
"mime_type":"image-jpeg"}

{"path":"upload\/application-octet-stream\/2019-01-04\/","name":"20180714_121742_7c8bceedd34b1f605110936faec59d9e.jpg","mime_type":"application-octet-stream"}

{"path":"upload\/application-octet-stream\/2019-01-04\/","name":"20180714_121742_31ae18f8f1d94a8ce3e89ed0b97496c8.jpg","mime_type":"application-octet-stream"}

Dropzone.options.myAwesomeDropzone = { // The camelized version of the ID of the form element
    autoProcessQueue: false,
    totalMaxsize: 100,
    parallelUploads: 1,  // since we're using a global 'currentFile', we could have issues if parallelUploads > 1, so we'll make it = 1
    maxFilesize: 10,   // max individual file size 1024 MB
    chunking: true,      // enable chunking
    forceChunking: true, // forces chunking when file.size < chunkSize
    parallelChunkUploads: false, // allows chunks to be uploaded in parallel (this is independent of the parallelUploads option)
    chunkSize: 1000000,  // chunk size 1,000,000 bytes (~1MB)
    retryChunks: true,   // retry chunks on failure
    retryChunksLimit: 10, // retry maximum of 3 times (default is 3)
    maxFiles:30,
    timeout: 600000,
    thumbnailWidth: 288,
    thumbnailHeight:192,
    thumbnailMethod: 'contain',  
    paramName:'file',
    acceptedFiles: ".jpg,.jpeg,.png",
    previewTemplate: previewTemplate,
    previewsContainer: "#previews", // Define the container to display the previews
    clickable: ".fileinput-button", // Define the element that should be used as click trigger to select files.
    countImages:0,
  
  
    params: function (files, xhr, chunk) {
        if (chunk) {
            return {
                dzUuid: chunk.file.upload.uuid,
                dzChunkIndex: chunk.index,
                dzTotalFileSize: chunk.file.size,
                dzCurrentChunkSize: chunk.dataBlock.data.size,
                dzTotalChunkCount: chunk.file.upload.totalChunkCount,
                dzChunkByteOffset: chunk.index * this.options.chunkSize,
                dzChunkSize: this.options.chunkSize,
                dzFilename: chunk.file.name,
            };
        }
    },
    /*
    chunksUploaded: function (file, done) {
        // called once all chunks of a file have uploaded successfully,
        // before the success event fires
        done();
    },
    */
  init: function() {
    var myAwesomeDropzone = this;
    var itemsToRemove = [];
    var totalsize = 0;
    var totalsizeFlag = false;
    var countImages = 0;
    var uploadedFiles = []; 
    
    this.on("success", function(file, response) {
        var fileData = [];

        if (file.status == 'success') {
            countImages++;
            fileData = JSON.parse(file.xhr.response);
            uploadedFiles.push(fileData);

            if (countImages == myAwesomeDropzone.files.length) {
                $.ajax({
                    headers: {
                        'X-CSRF-TOKEN': $('meta[name="csrf-token"]').attr('content')
                    },
                    type: "POST",
                    data: {uploadedFiles},
                    url: "/gallery/save-album",
                    success: function () {
                        // album metadata saved
                    },
                    error: function (msg) {
                        // handle failure (left empty in the original snippet)
                    }
                });
            }
        }
    });
  }
};

$save->isFinished() always true

Hi

I have implemented your controller code exactly as in the documentation. My frontend uses the VueJS wrapper for Dropzone; my backend is Laravel 5.7.

With every chunk the server receives, $save->isFinished() returns true. I would expect this to only be the case when the last chunk has been uploaded? Also, $handler->getPercentageDone() returns 100 for every chunk.

I have attached the HAR file for the first chunk request and its response. Am I missing something here?
dropzone-chunk.har.zip
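One common cause (a guess, since the HAR is not inspected here): if the VueJS wrapper does not forward Dropzone's chunking options, the chunk metadata never reaches the server, so the handler factory cannot detect a chunked request and treats every request as a complete upload, making $save->isFinished() true each time. A minimal config sketch with the native Dropzone option names (the upload URL is illustrative):

```javascript
// Hedged sketch: with chunking enabled, Dropzone sends dzuuid,
// dzchunkindex, dztotalchunkcount, etc., which the backend needs
// in order to recognise the request as a chunked upload.
const dropzoneOptions = {
    url: '/upload',       // hypothetical route handled by the FileReceiver
    chunking: true,       // split files into chunks
    forceChunking: true,  // also chunk files smaller than chunkSize
    chunkSize: 1000000,   // ~1 MB per chunk
    retryChunks: true,
};
```

Verify in the browser's network tab that each chunk request actually carries the dz* fields; if they are missing, the wrapper is swallowing these options.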

Google Cloud Storage Support

I am using "superbalist/laravel-google-cloud-storage" to add a Google Cloud Storage disk to Laravel's Storage facade. Calling Storage::disk('gcs')->put('filename.ext', $data); saves the file to Google Cloud Storage.

I hope that you can add Google Cloud Storage support to "pionl/laravel-chunk-upload" with the help of "superbalist/laravel-google-cloud-storage".

Could not move the file

Hello,
I'm using resumable.js and Laravel 5.8
The service is working well, but not when I try to upload this file:

https://d2v9y0dukr6mq2.cloudfront.net/video/thumbnail/rdM0-USein9859vv/videoblocks-iceland-winter-aerial-view-of-a-large-valley-between-mountains-and-a-river-_bu-7cxyyf_thumbnail-full01.png

Line 214: vendor\symfony\http-foundation\File\UploadedFile.php

throw new FileException(sprintf('Could not move the file "%s" to "%s" (%s)', $this->getPathname(), $target, strip_tags($error)));

Basically, the error happens when the filename is:

videoblocks-iceland-winter-aerial-view-of-a-large-valley-between-mountains-and-a-river-_bu-7cxyyf_thumbnail-full01.png

But if I rename the file to:
1.png
The error does not happen.

formData option

Please, how can I use this option:

formData: function (form){
......
}

to send input data from the form to the controller? (jQuery File Upload)
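For reference, jQuery File Upload's formData option accepts either a plain object or a function that receives the form (a jQuery object) and returns an array of name/value pairs. A hedged sketch, where the 'album_id' field is illustrative:

```javascript
// Hedged sketch: the default formData implementation serializes the whole
// form; extra fields can be appended before returning the array.
function buildFormData(form) {
    var data = form.serializeArray();
    data.push({ name: 'album_id', value: 42 }); // illustrative extra field
    return data;
}

// usage with the plugin (requires jQuery and the plugin to be loaded):
// $('#fileupload').fileupload({ url: '/upload', formData: buildFormData });
```

Anything returned here arrives in the Laravel controller as normal request input alongside the file.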

Type error between Illuminate\Http\UploadedFile and Symfony\Component\HttpFoundation\File\UploadedFile

Hello everyone,
I'm using resumable.js in a Laravel 5.1 project, but the upload always fails right at the first attempt with this error message.

FatalThrowableError in ParallelSave.php line 40: Type error: Argument 1 passed to Pion\Laravel\ChunkUpload\Save\ParallelSave::__construct() must be an instance of Illuminate\Http\UploadedFile, instance of Symfony\Component\HttpFoundation\File\UploadedFile given, called in /var/www/html/myProject/vendor/pion/laravel-chunk-upload/src/Handler/Traits/HandleParallelUploadTrait.php on line 29

This is how I create the resumable object

 var resumable = new Resumable({
        target:"{{route('backend.resumable')}}",
        chunkSize: 1*1024*1024,
        testChunks:false,
        simultaneousUploads:1,
        headers:{
            'X-CSRF-Token' :"{{ csrf_token() }}"
        },
        query:{
            _token : "{{ csrf_token() }}"
        }
    });

I updated all the dependencies and tried again, but the error persists.
Has anyone faced an error like this before?
Thanks.

Problem with resumable.js

Hi,

I have several problems when trying to integrate resumable.js with Minio:

  1. Whenever I try to use a 5 MB chunk, the upload stops at 90-100% with a request that returns status 413; using 1-4 MB chunks works fine.
  2. Whenever I set the testChunks: true property, it always returns a response with status 500.
  3. Currently I use Storage::disk()->put() to upload the file to Minio, but the leftover temp file is not deleted. Any idea how to delete the unused file? Here is the code:
protected function saveFile(UploadedFile $file)
{
...
// move the file name
// $file->move($finalPath, $fileName);
Storage::disk('minio')->put('test/'.$fileName, fopen($file, 'r+'));
...
}
  4. Is it possible to use Storage::cloud() or Storage::disk() directly instead of using the temp folder?
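Regarding the status 413 in point 1 (a guess, since the web server is not stated): 413 Payload Too Large is usually returned by the web server or a proxy in front of PHP, not by this package, when a single chunk request exceeds the configured body-size limit. With nginx, for example, the relevant directive is client_max_body_size, which would need to be raised above the chunk size:

```nginx
# hedged example for an nginx server block; raise the limit above the
# largest chunk (5 MB here), and check PHP's post_max_size and
# upload_max_filesize as well
client_max_body_size 10m;
```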

Other Info:
I use the default UploadController.php, with the difference that I use Pion\Laravel\ChunkUpload\Handler\ResumableJSUploadHandler instead of Pion\Laravel\ChunkUpload\Handler\AbstractHandler.

My resumable.js config:

var r = new Resumable({
            target: '{{ url("upload") }}',
            chunkSize: 5*1024*1024, // 5 Mb chunk
            simultaneousUploads: 4,
            testChunks: false,
            throttleProgressCallbacks: 1,
            maxFiles: 1,
            fileType: ['mp4'],
            maxFileSize: 5*1024*1024*1024 // 5 Gb file size limit
      });

Thanks

Error with images over 2M

I'm getting a really weird bug here using Dropzone and Laravel Chunk Upload (pionl/laravel-chunk-upload).

My ini settings are set to 128M for both post_max_size and upload_max_filesize.

My upload controller looks like:

    /**
     * Handles the file upload.
     *
     * @param \Illuminate\Http\Request $request
     *
     * @return \Illuminate\Http\Response
     */
    public function store(Request $request)
    {
        $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

        if (false === $receiver->isUploaded()) {
            throw new UploadMissingFileException();
        }

        $save = $receiver->receive();

        if ($save->isFinished()) {
            return $this->saveFileToCloud($save->getFile());
        }

        $handler = $save->handler();

        return response()->json([
            'done' => $handler->getPercentageDone(),
            'status' => true,
        ]);
    }

    protected function saveFile(UploadedFile $file, string $folder = 'uploads')
    {
        $path = Storage::disk('public')->putFile($folder, $file);
        $url = Storage::disk('public')->url($path);
        $name = basename($path);

        return response()->json(compact('url', 'name'));
    }

    /**
     * Saves the file to S3 server.
     *
     * @param \Illuminate\Http\UploadedFile $file
     *
     * @return \Illuminate\Http\Response
     */
    protected function saveFileToCloud(UploadedFile $file, string $folder = 'uploads')
    {
        $path = Storage::disk('s3')->putFile($folder, $file);
        $url = Storage::disk('s3')->url($path);
        $name = basename($path);

        dump($url, $name);

        return response()->json(compact('url', 'name'));
    }

I'm also using dropzone on the front-end to handle the chunking.

I'm routing dropzone through a little bit of vue as such:

// when DZ inits
this.dropzone.on("success", this.onDropzoneSuccess.bind(this));

// in methods
onDropzoneSuccess(file, response) {
  console.log("success", file, response);
  this.$emit("success", response);
},

So here's the issue. On files smaller than 2M it uploads successfully, saves to S3 and the controller responds back with the file name and S3 bucket where it saves successfully. The console.log spits back something like:

{ url: "https://my-bucket.s3.my-region.amazonaws.com/uploads/my-image-name.jpeg", name: "my-image-name.jpeg" }

However when the image is larger than 2M it still uploads successfully, saves to S3 and the dump() command in my controller shows me the proper URL/Name...the problem is when onDropzoneSuccess is called it doesn't have any response from the server...just an empty string instead of the intended URL/name that I know it's properly getting. Dropzone's error callbacks don't get called and nothing gets logged to either httpd logs, php logs or laravel logs and I'm at a loss for where exactly this is falling apart.

Success Event Response Empty

Hey Folks,

When uploading chunks everything appears to work fine (both the chunks and the final file are uploaded), but when hooking into the success event, the second parameter (i.e. the response) is always empty.

I've tested the route in Postman and it always returns the correct response; the success event just never receives it, only the file.

Here's my JS code:

this.$refs.sectionDropzone[0].dropzone.on('success', (file, responseText) => {
    console.log(file, responseText);

    if (responseText) {
        this.getTicket()
            .then(ticket => {
                this.ticket = ticket;

                this.setProgressMeter();
                this.checkProgress(file);
                this.uploadVideo(file, responseText);
            });
    }
});

As you can see, I'm logging the file and responseText parameters, with the responseText parameter always being blank.

Here's the PHP that handles the chunked uploads.

namespace App\Http\Controllers\API;

use Storage;
use App\Models\Video;
use Illuminate\Http\Request;
use Illuminate\Http\UploadedFile;
use App\Http\Controllers\Controller;
use App\Models\CourseChapterSection as Section;
use Pion\Laravel\ChunkUpload\Exceptions\UploadMissingFileException;
use Pion\Laravel\ChunkUpload\Handler\AbstractHandler;
use Pion\Laravel\ChunkUpload\Handler\HandlerFactory;
use Pion\Laravel\ChunkUpload\Receiver\FileReceiver;

class SectionVideoController extends Controller
{
    protected $section;
    protected $video;

    public function __construct(Section $section, Video $video)
    {
        $this->section = $section;
        $this->video = $video;
    }

    public function index()
    {
        $section = $this->section->find(request()->section_id);

        return $section->videos;
    }

    /**
     * Handles the file upload
     *
     * @param Request $request
     *
     * @return \Illuminate\Http\JsonResponse
     *
     * @throws UploadMissingFileException
     * @throws \Pion\Laravel\ChunkUpload\Exceptions\UploadFailedException
     */
    public function store(Request $request) {
        // create the file receiver
        $receiver = new FileReceiver('video', $request, HandlerFactory::classFromRequest($request));

        // check if the upload is success, throw exception or return response you need
        if ($receiver->isUploaded() === false) {
            throw new UploadMissingFileException();
        }

        // receive the file
        $save = $receiver->receive();

        // check if the upload has finished (in chunk mode it will send smaller files)
        if ($save->isFinished()) {
            $file = $save->getFile();

            // save the file and return any response you need, current example uses `move` function. If you are
            // not using move, you need to manually delete the file by unlink($save->getFile()->getPathname())
            $filename = str_slug(basename(time(), '.'.$file->getClientOriginalExtension())).'.'.$file->getClientOriginalExtension();
            $mime = str_replace('/', '-', $file->getMimeType());

            // move the file name
            $file->move(storage_path().'/app/media/video/', $filename);

            return response()->json([
                'path' => storage_path().'/app/media/video/'.$filename,
                'name' => $filename,
                'mime_type' => $mime,
            ]);
        }

        // we are in chunk mode, lets send the current progress
        /** @var AbstractHandler $handler */
        $handler = $save->handler();

        return response()->json([
            'done' => $handler->getPercentageDone()
        ]);
    }

    public function destroy($id)
    {
        $video = $this->video->find($id);
        $video->sections()->detach(request()->section_id);
        $video->delete();
    }
}

Hope someone is able to help here. Any ideas?

Is dropzone.js working with this ?

I cannot make this package work with dropzone.js. Is the chunking method different, or do I have an error in my code (I copied it from the example)?

To give more detail: each chunk is always recognised as a full file.

return saveFile() filename at the end of the upload to dropzone.js

I'm using DependencyUploadController.php, and it calls saveFile() at the end of the upload.

From uploadFile() in DependencyUploadController.php I return the filename in the response()->json.

How can I receive this response in dropzone.js?

The file object passed to the final success event has no information about the Laravel response:

success: function(file){ console.log(file); }
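Dropzone's success event actually passes the server response as a second argument, and the raw body is also available on file.xhr.response (as the gallery snippet earlier on this page uses). A hedged sketch, assuming saveFile() returns the JSON shape shown at the top of this page (path/name/mime_type):

```javascript
// Hedged sketch: depending on the Dropzone version the response may
// arrive already parsed or as a raw JSON string.
function parseUploadResponse(response) {
    return typeof response === 'string' ? JSON.parse(response) : response;
}

// usage inside the Dropzone success handler (requires Dropzone to be loaded):
// myDropzone.on("success", function (file, response) {
//     var data = parseUploadResponse(response || file.xhr.response);
//     console.log(data.name, data.path);
// });
```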

Declaration of Pion\Laravel\ChunkUpload\Handler\SingleUploadHandler::startSaving($chunkStorage, $config) must be compatible with Pion\Laravel\ChunkUpload\Handler\AbstractHandler::startSaving($chunkStorage)

I am getting the below error.
I think the method startSaving in the SingleUploadHandler should be changed from

public function startSaving($chunkStorage, $config)

to:

public function startSaving($chunkStorage, $config = null)

Error:

exception: "Symfony\Component\Debug\Exception\FatalErrorException"
file: "/home/forge/www.maklerportal.app/vendor/pion/laravel-chunk-upload/src/Handler/SingleUploadHandler.php"
line: 0
message: "Declaration of Pion\Laravel\ChunkUpload\Handler\SingleUploadHandler::startSaving($chunkStorage, $config) must be compatible with Pion\Laravel\ChunkUpload\Handler\AbstractHandler::startSaving($chunkStorage)"
trace: []

[Vote] Add simultaneousUploads support

Currently we do not support simultaneousUploads, but we can add it. It requires moving the merging of files to the last chunk. This feature will be slower than appending on each upload, but it is doable.

Please vote on whether this feature should be added.

Thank you.

Chunked video not playing

Hi. I used Dropzone chunking with this lib, but after the upload finishes, the saved video file in the final folder does not play.

Cloudflare and chunking

Hi!

I am not too familiar with Cloudflare and it seems that their support does not respond to my inquiry either, so I figured I might as well ask here.

As an example I set the chunk size to 50 MB here: http://testnow.ga/jquery-file-upload
The issue I have is that the TTFB (time to first byte) is very long after a chunk completes, and it has something to do with Cloudflare. A workaround would be to run the uploader on a subdomain, but I would rather not.

Image of issue

As you can see, it takes 1.2 minutes to upload the 50 MB chunk, then 30 seconds for Cloudflare to process it. Does anyone know a workaround, perhaps bypassing Cloudflare for the upload requests, to prevent such long wait times?

Thanks :-)
