gaufrette's People

Contributors

aderuwe, afitzke, akovalyov, baachi, bsperduto, cdfre, ddeboer, docteurklein, hason, havvg, herzult, l3l0, maxakawizard, mtarvainen, mtdowling, nek-, nicolasmure, nm2107, nyholm, oskarstark, pawski, qpleple, richtermeister, schmittjoh, stloyd, stof, teohhanhui, umpirsky, woodsae, wysow

gaufrette's Issues

Rename file?

There seems to be no way to rename a file.

I think this should be the next priority.

Binary data is corrupted when read from GridFS

This happens because Gaufrette\Util\Size::fromContent uses mb_strlen, while the InMemoryBuffer stream uses non-multi-byte string functions, so the two sizes differ for binary data.

Changing mb_strlen to strlen fixed the issue, but I didn't test it with other adapters.
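The mismatch is easy to demonstrate: mb_strlen counts characters in the given encoding, while strlen counts raw bytes. A minimal sketch (any multi-byte or binary sequence will do):

```php
<?php
// mb_strlen counts characters, strlen counts bytes: for binary
// (or multi-byte) content the two numbers disagree.
$binary = "\xC3\xA9"; // the UTF-8 encoding of "é": two bytes, one character

var_dump(strlen($binary));             // int(2): raw byte count
var_dump(mb_strlen($binary, 'UTF-8')); // int(1): character count
```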

LazyOpenCloud needed

It would be good to have a variant of the OpenCloud adapter that issues authentication and loads the ObjectStore only when actually using the container (e.g. reading, writing or deleting a file), and not when we are merely instantiating the adapter.

The OpenCloud library seems to automatically issue authentication when we call $connection->ObjectStore(...), which is needed to get the ObjectStore to instantiate the adapter.

So in some environments or with some frameworks (e.g. Symfony with KnpGaufretteBundle) it might happen that you always load an instance of your filesystem (e.g. a FilesystemMap with the KnpGaufretteBundle), and so authentication is issued behind the scenes for every single request to your web app, even if you don't use the filesystem in that request...
And this is definitely something bad!

Custom headers for S3

I haven't delved too deeply into the code yet, but it seems the ability to pass custom metadata (e.g. Content-Type, required to upload assets to S3 properly) has been removed.

Ftp Adapter: Undefined offset 7

Hi,

I've got the error: Undefined offset: 7 in .../Ftp.php line 343

I want to list all keys of an FTP server on Windows hosting, but when I print_r the $infos variable in the parseRawlist method of the Ftp adapter, I get only 4 items.

Thanks for your help

Where is the 0.2 tag?

Hi,

10 days ago I was able to use Gaufrette through these bundles "knplabs/knp-gaufrette-bundle" and "oryzone/media-storage-bundle". However they require

"knplabs/gaufrette": "0.2.*@dev"

That tag doesn't exist now. What happened? How do we proceed? Should we add the tag in Gaufrette again or we patch the other bundles?

thanks!

directory & filename hashing

To get better performance while serving a lot of files, it would be better to split up files into multiple directories, like it's done for PHP sessions:

'5;/tmp'
'/tmp/4/b/1/e/3/sess_4b1e384ad74619bd212e236e52a5a174If'

http://de2.php.net/manual/en/session.configuration.php#ini.session.save-path

Keep in mind that on Linux if you have a directory with too many files, the shell may not be able to expand wildcards.

http://stackoverflow.com/a/466728

Maybe also add a filename/key hashing option.
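A sketch of what such key hashing could look like (a hypothetical helper, not part of Gaufrette; it assumes the first characters of an md5 of the key pick the subdirectories):

```php
<?php
// Hypothetical helper: spread keys across nested subdirectories
// using the first $depth characters of the key's md5 hash.
function hashedPath($key, $depth = 2)
{
    $hash  = md5($key);
    $parts = array();
    for ($i = 0; $i < $depth; $i++) {
        $parts[] = $hash[$i];
    }
    $parts[] = $key;

    return implode('/', $parts);
}

echo hashedPath('foo'); // "a/c/foo", since md5('foo') starts with "ac"
```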

Flawed metadata support

In the Amazon adapter, metadata is used from memory when renaming a file, but it is never populated from the persisted metadata.
The same issue exists for getMetadata: it will return metadata only in the process that defined it, as it won't read it back from Amazon.

FTP adapter makes an FTP connection each time

In the current code, the FTP adapter opens a connection in each instance in order to check that the destination directory exists, even if the instance is never used.

Maybe it should be corrected to match the behaviour of the SFTP adapter: check the directory only on first use.

This directory check is a real performance issue. An option would be great in production environments.

[Ftp] connection succeed but no files found after a certain amount of time / data is exceeded

Using the FTP adapter with passive: true, I recently ran into an issue where the connection looked OK but the call to ftp_nlist in Gaufrette\Adapter\Ftp::exists() failed, returning false.

Before it fails, I successfully manage to process 60 files.

To be clear:

  1. Everything works fine for the first 60 files (I manage to download them all).
  2. Then a Gaufrette\Exception\FileNotFound is thrown.
  3. Looking at Gaufrette\Adapter\Ftp::exists() I can see that $this->getConnection() returns resource(490) of type (FTP Buffer) (meaning the connection is good, right?).
  4. However $items is false (whereas it contained files for the first 60 calls).

A temporary workaround is to change the access of Gaufrette\Adapter\Ftp::connect() so it is public, and call it before processing each key.

But I am not sure it is a good solution.

Someone on the PHP manual also says that in such case we should refresh the connection.

What's your point of view on this one? Should we add some kind of timeout option?

Switch the amazon SDK to the new one

The AWS team wrote a new version of their SDK. It would be great to switch the adapter to this one (though it would be a BC break for people using the library).

S3Adapter sets acl

The default ACL for S3 is private, so why does the normal, non-ACL-aware adapter choose to set a public ACL on write? It seems to me that it should not set an ACL at all. If an ACL needs to be set, the AclAwareAdapter should be used.

Files on rackspace CDN are not purged on edit

When I overwrite a file on Rackspace cloud files the CDN is not automatically purged (and I can't find a way to manually trigger a purge), so I keep seeing the "old" image for a while.
Any solution?

Why protected methods?

Why is the computePath function protected in the Local adapter?
If I need to get the file path from a key, how should I do it now that computePath is protected?

Related commit: 902de44

Does the OpenCloud adapter support (pseudo-hierarchical) folders?

During an attempt to write functional tests for my LazyOpenCloud adapter (a variant of the OpenCloud adapter) I found some errors that may indicate there's no support for pseudo-hierarchical folders.

I haven't looked deeper into the code to see if pseudo-hierarchical folder support is really missing (I hope I'll have time to do it soon). Anyway, I adopted tvision/RackspaceCloudFilesStreamWrapper and it seems to support pseudo-hierarchical folders, so, in case the feature is missing, we can look at this repo for inspiration.

\cc @james75

Create specs and functional tests for OpenCloud adapter

Hi, I'm developing a lazy version of the OpenCloud adapter and I already have a working prototype, but I want to provide some tests before creating a pull request.
My LazyOpenCloud adapter extends the original OpenCloud adapter, so any problem with it will be reflected in the new lazy variant.
Furthermore, I'm going to use the OpenCloud adapter in production and I would love to have at least some functional tests (though having specs as well would be awesome) to be sure everything works fine.

I would like to involve @james75, who originally wrote the OpenCloud adapter.

Deprecate Rackspace Adapter and fix composer.json

Rackspace decided to deprecate their own API in favor of the new OpenCloud API.
They will completely shut down their old API in August 2013.
The problem is that they recently completely deleted their cloudfiles repository, so a composer install --dev will not succeed anymore. Furthermore, people who want to use the RackspaceCloudFiles and LazyRackspaceCloudFiles adapters are not able to do so.

I think these two adapters should be deprecated in favor of the new OpenCloud adapter, and the composer.json file should be fixed by removing the github repository called rackspace/php-cloudfiles and by adding "rackspace/php-opencloud": "dev-working" (in dev) and "rackspace/php-opencloud": "dev-master" (in suggest) to enable the new OpenCloud libraries.

OpenCloud adapter method `exists` always returns `false`

The OpenCloud adapter checks whether a file exists by looking at the bytes field of the related OpenCloud\ObjectStore\DataObject instance. This field happens to be NULL even if the file really exists in the file system.

Obviously this compromises the filesystem's delete method, so files will never be removed.

I will submit a possible fix shortly...

Dependency Ssh\Sftp missing

Hello,

the Sftp adapter doesn't work, since Ssh\Sftp is neither part of the library nor added by the vendor install script.

Local Adapter performance

When accessing large folders it can take a very long time to iterate through.

I'm looking into building a simple file manager that only needs to find files and folders within the immediate folder.

Am I missing a way to disable recursively iterating through every sub folder?

ZF2 AmazonS3 service broken

The ZF2 HTTP client has been refactored lately and the ZF2 S3 service has not been updated yet, and is thus broken. It looks like Gaufrette's S3 adapter uses the Zend 2 framework (and not Zend 1), because it uses PHP namespaces.

For instance, ZF2 S3 service (Zend\Service\Amazon\S3\S3, used by Gaufrette's S3 adapter) tries to disable HTTP authentication that way:

$client->setAuth(false);

https://github.com/zendframework/zf2/blob/5f3e5e40b07450971eb7ec8c7becf8b97ae3f316/library/Zend/Service/Amazon/S3/S3.php#L581
But Zend\Http\Client::setAuth() does not work that way anymore:

public function setAuth($user, $password, $type = self::AUTH_BASIC)

https://github.com/zendframework/zf2/blob/5f3e5e40b07450971eb7ec8c7becf8b97ae3f316/library/Zend/Http/Client.php#L633-659

Other things are broken as well; this is not the only problem in the S3 service. So the S3 adapter is broken too.

Am I missing something? Has anyone succeeded in making the S3 adapter work lately?

Help with using Gaufrette

I would like help using Gaufrette with the S3 service; I could not get it working from the content in the README. Can anyone help?

FTP does not work in MS FTP Service (Windows)

FTP request

125 Data connection already open; Transfer starting.
05-26-12  08:03PM       <DIR>          archive
12-04-12  06:57PM                16142 file1.zip
12-05-12  04:01PM                16142 file2.zip
12-06-12  04:01PM                17537 file3.zip
12-07-12  04:01PM                17537 file4.zip

output:

Notice: Undefined offset: 7 in /some/app/vendor/knplabs/gaufrette/src/Gaufrette/Adapter/Ftp.php line 339

listDirectory Function Removed

It looks like the listDirectory function was removed from the SFTP adapter in this commit: cba97e1

Is there a reason for this? If so, how can I accomplish getting the file listing of a directory?
Thank you!

Bug/Feature Request: Access violations not caught

Steps to reproduce:

Set up Gaufrette and configure a local adapter.
Configure it to write any text (say, 'hello world!') to any file in any place (say, foobar.txt), and set overwrite to true.

Before executing the script, give ownership of the file to a user with higher permissions than yours. On Unix, you can probably give it to root and be okay (provided you're not running as root, for some reason).

Attempt to execute the script.

What Happens:

PHP throws a permission-denied error.

What Should Happen:

...not that. Since this is FS abstraction, it becomes really difficult to write a writable() function without breaking the whole idea behind FS abstraction. Throwing some sort of exception if the file cannot be written would be excellent.

I understand the lib is still under intensive development, but this is certainly something that would be really nice to have.
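What the report asks for might be sketched like this (a hypothetical standalone helper, not Gaufrette's actual API): check writability up front and turn the silent PHP warning into an exception.

```php
<?php
// Hypothetical sketch: fail loudly instead of emitting a PHP warning
// when the target file cannot be written.
function writeOrFail($path, $content)
{
    if (file_exists($path) && !is_writable($path)) {
        throw new \RuntimeException(sprintf('The file "%s" is not writable.', $path));
    }

    if (false === @file_put_contents($path, $content)) {
        throw new \RuntimeException(sprintf('Could not write to "%s".', $path));
    }

    return strlen($content);
}
```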

Create Branches Or Tag

Can you create a branch or tag when you introduce BC breaks?

A changelog file would be very good too.

Thanks.

some suggestions for Gaufrette

Hi Antoine,

I recently found your nice project and I think I'll use it for further web projects (with Zend Framework).

Browsing through the sources I have some suggestions for you:

  1. Gaufrette\Adapter\Cache

The new source is read twice, which is an unnecessary overhead imho.
You could change that to:

public function read($key)
{
    if ($this->needsReload($key)) {
        $freshSource = $this->source->read($key);
        $this->cache->write($key, $freshSource);
        return $freshSource;
    }

    return $this->cache->read($key);
}
  2. Gaufrette\Adapter\AmazonS3

I have implemented some AmazonS3 stuff in running projects and from that experience I think you should add the possibility to add meta data in the write operation, like this (I'm using putFile there but putObject also accepts $meta):

    $meta = array(
        'Expires' => 'Fri, 5 Mar 2021 00:00:00 CET',
        'Cache-Control' => 'max-age=315360000',
        'x-amz-storage-class' => 'REDUCED_REDUNDANCY',
    );

    $this->_s3->putFile($file_path, $object_name, $meta);

I'm aware that this conflicts with the Adapter's Interface, as other stores don't have such meta data.
It could be an optional parameter in the Interface or you could add another Adapter AmazonS3Web that extends AmazonS3.
That way it would be possible to deliver media assets from S3 to the visitors of the webpage very easily. The 'x-amz-storage-class' meta is not tied to that usage; you simply want to store quite a lot of data with reduced redundancy because it is way cheaper on Amazon!

  3. Slow adapters in general:

I think there could be another cache which stores whether an object is available (and not expired) in the slow store. This would save many expensive connections to the slow store. I'm using a memcache backend with Zend_Cache in my project for that, but surely a file cache would also be possible.

Let me know what you think!

Best regards, Anton

Adapter\Cache::exists

public function exists($key)
{
    return $this->source->exists($key);
}

Currently, the cache adapter just checks the source adapter to see whether a key exists. This means you lose a lot of the benefit of the cache if you have a slow connection to the source adapter. Would it make more sense to check the cache adapter first before falling back to checking the source if it doesn't exist in the cache?
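A cache-first variant might look like the sketch below ($cache and $source stand for the two adapters; note the trade-off that a stale cache entry could report a key the source has already deleted):

```php
<?php
// Sketch of a cache-first exists(): only hit the slow source
// adapter when the cache does not know the key. Trade-off: a
// stale cache may report keys the source has since deleted.
function cacheAwareExists($cache, $source, $key)
{
    if ($cache->exists($key)) {
        return true;
    }

    return $source->exists($key);
}
```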

FTP CONSTANT Issues

Mac OS X Version 10.7.2

$ php bin/vendors install

[ErrorException]

Notice: Use of undefined constant FTP_ASCII - assumed 'FTP_ASCII' in /.../src/vendor/bundles/Knp/Bundle/GaufretteBundle/DependencyInjection/Factory/FtpAdapterFactory.php line 56

Create release versions of the library

It would be nice to have release versions (tags in git and versions on packagist.org) so that projects making use of the library can target known versions of the codebase.

Manage directories

Hi,

I have seen that I can list directories with listKeys. But I would like to know if it is possible to create, rename and delete them with Gaufrette.

As far as I know it is not possible.
When I use mkdir or rmdir with a stream, I have the following error:
« Warning: mkdir(): Gaufrette\StreamWrapper::mkdir is not implemented! »

Thank you for your work.

Cannot use AmazonS3 as AmazonS3 because the name is already in use

I am using Gaufrette in a Symfony2 project. After updating the KnpGaufretteBundle which requires the dev version of KnpLabs/Gaufrette I am getting the following error.

PHP Fatal error:  Cannot use AmazonS3 as AmazonS3 because the name is already in use in ../vendor/knplabs/gaufrette/src/Gaufrette/Adapter/AclAwareAmazonS3.php on line 5

If I roll back to 599b594 things work OK. So the commit that is breaking things is 614eac1

It looks like the addition of use \AmazonS3 as AmazonClient; to Gaufrette/src/Gaufrette/Adapter/AmazonS3.php could be causing the issue.

I think I am getting this error because I am also using amazonwebservices/aws-sdk-for-php which has an AmazonS3 class as well located here

I have not had much time to dig into this much further than that. Any thoughts on a possible solution would be great. Thanks!

Warning: array_replace_recursive()

I'm getting this error trying to run composer.phar install or update.

Warning: array_replace_recursive(): Argument #2 is not an array in
[...]/vendor/knplabs/gaufrette/src/Gaufrette/Adapter/AmazonS3.php line 31

Any idea?

StreamWrapper - handling large file uploads

Hi,

in order for us to make use of Gaufrette, we needed to find a way around some limitations, in particular when handling large files. We came up with https://github.com/escapestudios/EscapeGaufretteExtension as an "extension"/workaround (in short: we've added a TempFileBuffer which writes to a temp file rather than keeping the content in memory) and a small change to the way the OpenCloud adapter writes an object (in short: use a DataObject's Create method with a filename when the content is a file, or the DataObject's setData method when it is not).

It would be great to see this or something similar implemented into Gaufrette, as then large file uploads can be taken care of in the great Gaufrette-way!

Thanks in advance for your feedback!

Kind regards,
David

Multiple adapters in a filesystem

This is just a suggestion. It'd be nice to have multiple adapters bound to the same filesystem, to allow moving/copying between different adapters (for instance, moving uploaded files to an S3 bucket).

If you think it's worth a try, when I have some time I'll fork the project to see how it could be implemented.

url_stat requires write permissions

Currently the url_stat method opens the file with "r+", thus failing if the file exists, but can't be written. If using url_stat in file_exists, this yields different results than doing a file_exists of an existing file on which you don't have write permissions.

listKeys probably needs refactor

with current implementation:

public function listKeys($pattern = '')
{
    $dirs = array();
    $keys = array();

    foreach ($this->keys() as $key) {
        if (empty($pattern) || false !== strpos($key, $pattern)) {
            if ($this->adapter->isDirectory($key)) {
                $dirs[] = $key;
            } else {
                $keys[] = $key;
            }
        }
    }

    return array(
        'keys' => $keys,
        'dirs' => $dirs
    );
}

means we actually load ALL keys and iterate over them.

Maybe this logic should be adapter-dependent? The Fs adapter can use glob(), GridFS can use indexes, and so on. Compatibility with current adapters can be kept via method_exists, with a fallback to the current method if listKeys is not implemented.
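For the local filesystem case, such an adapter-specific implementation might use glob(), as in the sketch below (a hypothetical standalone function; note that glob matches the pattern as a key prefix, while the generic fallback's strpos matches it anywhere in the key):

```php
<?php
// Sketch of a glob()-based listKeys for a local adapter.
// $baseDir stands for the adapter's base directory.
function localListKeys($baseDir, $pattern = '')
{
    $keys = array();
    $dirs = array();

    // glob() returns alphabetically sorted matches of the prefix.
    foreach (glob($baseDir.'/'.$pattern.'*') as $path) {
        $key = substr($path, strlen($baseDir) + 1);
        if (is_dir($path)) {
            $dirs[] = $key;
        } else {
            $keys[] = $key;
        }
    }

    return array('keys' => $keys, 'dirs' => $dirs);
}
```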

Composer suggests vs require

There are four requirements for Gaufrette:

"amazonwebservices/aws-sdk-for-php": "1.5.*",
"rackspace/php-cloudfiles": "*",
"doctrine/dbal": ">=2.3",
"dropbox-php/dropbox-php": "*"

And several suggestions:

"knplabs/knp-gaufrette-bundle": "*",
"dropbox-php/dropbox-php": "to use the Dropbox adapter",
"amazonwebservices/aws-sdk-for-php": "to use the Amazon S3 adapter",
"doctrine/dbal": "to use the Doctrine DBAL adapter",
"ext-zip": "to use the Zip adapter",
"ext-curl": "*",
"ext-mbstring": "*",
 "ext-mongo": "*"

If you don't use the Dropbox adapter, you can still use the other adapters, right? So amazon, rackspace, doctrine and dropbox should be suggestions and not requirements, if I understand it right. I can PR it if this is the case.
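The proposed split would look roughly like this in composer.json (a sketch of the structure only, reusing the package names listed above):

```json
{
    "suggest": {
        "amazonwebservices/aws-sdk-for-php": "to use the Amazon S3 adapter",
        "rackspace/php-cloudfiles": "to use the Rackspace adapters",
        "doctrine/dbal": "to use the Doctrine DBAL adapter",
        "dropbox-php/dropbox-php": "to use the Dropbox adapter"
    }
}
```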
