wgenial / s3-objects-stream-zip-php
S3ObjectsStreamZip is a PHP library to stream objects from AWS S3 as a zip file.
License: MIT License
I have a suggestion: do not raise an exception when many files are passed and some of them do not exist, even when the $checkObjectExist parameter is set.
In the file src\S3ObjectsStreamZip.php:
```php
foreach ($objects as $object) {
  /*
   * Added this IF to keep generating the zip even if the current object doesn't exist.
   * (An existence check against S3 itself, rather than the local filesystem.)
   */
  if ($this->s3Client->doesObjectExist($bucket, $object["path"])) {
    $objectName = isset($object["name"]) ? $object["name"] : basename($object["path"]);
    $context = stream_context_create(array(
      "s3" => array("seekable" => true)
    ));
    $request = $this->s3Client->createPresignedRequest(
      $this->s3Client->getCommand("GetObject", [
        "Key" => $object["path"],
        "Bucket" => $bucket,
      ]),
      "+1 day"
    );
    $tmpfile = tempnam(sys_get_temp_dir(), crc32(time()));
    $httpClient->request("GET", (string) $request->getUri(), array(
      "synchronous" => true,
      "sink" => fopen($tmpfile, "w+")
    ));
    if ($stream = fopen($tmpfile, "r", false, $context)) {
      $zip->addFileFromStream($objectName, $stream);
    }
  }
}
```
How can I connect to S3 using an IAM role? With an IAM role, credentials are not passed when the connection is made, but in this case credentials are required to connect to S3.
The intermediate files that are downloaded are stored in /tmp, which uses a lot of disk space unnecessarily. You could add a call to the unlink function to delete each file after use; this would be a big improvement and avoid having to go around deleting temporary files.
In this line:
https://github.com/wgenial/s3-objects-stream-zip-php/blob/master/src/S3ObjectsStreamZip.php#L69
```php
// ....
$tmpfile = tempnam(sys_get_temp_dir(), crc32(time()));
$httpClient->request("GET", (string) $request->getUri(), array(
  "synchronous" => true,
  "sink" => fopen($tmpfile, "w+")
));
if ($stream = fopen($tmpfile, "r", false, $context)) {
  $zip->addFileFromStream($objectName, $stream);
}
unlink($tmpfile); // <-- This line will tell PHP that we no longer need the file.
```
In my case where I was trying to zip up thousands of photos at once, the file validation was making it take 2-3 minutes before the zip actually started streaming. While I get the value of making sure all of the files exist before doing this, I think it makes sense to have an option to silently skip those files during streaming instead of trying to validate them all up front.
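One way such an option could be sketched is to defer the existence check to stream time, so the zip starts streaming immediately and missing objects are skipped as they are reached. The streamableObjects helper and the injected $exists callable below are hypothetical, not part of the current API; in practice $exists would wrap S3Client::doesObjectExist.

```php
<?php
// Hypothetical helper: instead of validating every object upfront,
// check each one lazily at stream time and silently skip missing ones.
// $exists is any callable reporting whether a key exists, e.g. a
// closure wrapping $s3Client->doesObjectExist($bucket, $key).
function streamableObjects(array $objects, callable $exists)
{
  foreach ($objects as $object) {
    if ($exists($object['path'])) {
      yield $object; // hand this object to the zip stream
    }
    // otherwise: skip silently instead of raising an exception
  }
}
```

Because this is a generator, the first existing object can be written to the zip before the rest have even been checked, so the download begins right away instead of after minutes of validation.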
Can you please update the dependencies, since maennchen/zipstream-php has released their 1.0?
Thanks
Hello,
I used your implementation for downloading S3 files through PHP like this:
```php
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);

include __DIR__.'/../vendor/autoload.php';

use Aws\S3\Exception\S3Exception;
use WGenial\S3ObjectsStreamZip\S3ObjectsStreamZip;
use WGenial\S3ObjectsStreamZip\Exception\InvalidParamsException;

try {
  header('Content-Description: File Transfer');
  header('Content-Type: application/zip');
  header('Content-Disposition: attachment; filename="imagenes.zip"');
  header('Expires: 0');
  header('Cache-Control: must-revalidate');
  header('Pragma: public');

  $zipStream = new S3ObjectsStreamZip(array(
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => array(
      'key' => 'key',
      'secret' => 'secret'
    )
  ));

  $objectsA = array();
  $files = json_decode($_POST['fileData']);
  foreach ($files as $file) {
    //array_push($objectsA, $file);
    array_push($objectsA, array(
      'path' => 'asseetFiles/'.$file->file,
      'name' => 'asseetFiles/'.$file->name
    ));
  }

  $bucket = 'kaab-files'; // required
  $zipname = 'imagenes.zip'; // required
  $checkObjectExist = false; // not required | default = false

  $zipStream->zipObjects($bucket, $objectsA, $zipname, $checkObjectExist);
}
catch (InvalidParamsException $e) {
  echo $e->getMessage();
}
catch (S3Exception $e) {
  echo $e->getMessage();
}
```
and sometimes the resulting file is corrupted and not all files are included.
Do you have any idea how I can fix that?
It is a recommended security practice to use machine-level IAM roles rather than providing credentials to the client. This is done by using an anonymous client: you can set credentials to false when a key and secret are not provided.
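As a sketch of the two cases (assuming the constructor forwards this config array straight to Aws\S3\S3Client; the buildClientConfig helper is hypothetical, added just to show the shape of the array):

```php
<?php
// Hypothetical helper returning the config array for S3ObjectsStreamZip.
// Omitting the 'credentials' key lets the AWS SDK fall back to its default
// provider chain, which picks up the machine's IAM role on EC2/ECS.
// Setting 'credentials' => false requests an anonymous (unsigned) client.
function buildClientConfig($region, $anonymous = false)
{
  $config = array(
    'version' => 'latest',
    'region'  => $region,
  );
  if ($anonymous) {
    $config['credentials'] = false;
  }
  return $config;
}

// IAM role: no credentials in the config at all.
// $zipStream = new S3ObjectsStreamZip(buildClientConfig('us-west-2'));

// Anonymous client (public buckets only):
// $zipStream = new S3ObjectsStreamZip(buildClientConfig('us-west-2', true));
```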
Can I pass only a folder name and get a zip of that particular folder? I tried using an iterator, but it isn't working.
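The library only takes an explicit list of objects, but you can build that list from a folder (prefix) with the SDK's ListObjectsV2 paginator and pass the result to zipObjects. A sketch under those assumptions; the bucket and prefix names are placeholders, and objectsFromListing is a helper invented here:

```php
<?php
// Turn a raw listing (array of items with a 'Key') into the $objects
// array that zipObjects() expects: 'path' is the full S3 key and
// 'name' is the path the file will have inside the zip.
function objectsFromListing(array $contents, $prefix)
{
  $objects = array();
  foreach ($contents as $item) {
    if (substr($item['Key'], -1) === '/') {
      continue; // skip "folder" placeholder keys
    }
    $objects[] = array(
      'path' => $item['Key'],
      'name' => substr($item['Key'], strlen($prefix)),
    );
  }
  return $objects;
}

// Usage (network calls commented out; 'my-bucket' and 'photos/' are placeholders):
// $s3 = new Aws\S3\S3Client(array('version' => 'latest', 'region' => 'us-west-2'));
// $contents = array();
// foreach ($s3->getPaginator('ListObjectsV2', array('Bucket' => 'my-bucket', 'Prefix' => 'photos/')) as $page) {
//   $contents = array_merge($contents, (array) $page['Contents']);
// }
// $zipStream->zipObjects('my-bucket', objectsFromListing($contents, 'photos/'), 'photos.zip');
```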
I have the script working and can download zipped files from my website. On my Mac it works and I can open them.
On Windows it gives an 'Invalid' error when the zip is above 1MB. Any zip below that opens fine.
Any clue what the problem is?