coldbox-modules / s3sdk
An Amazon S3 SDK for ColdFusion/CFML and the ColdBox Platform
Home Page: https://forgebox.io/view/s3sdk
License: Apache License 2.0
Hi! Thanks for creating this great resource! I've had some success with it so far, but need to make some additions. First, I need to be able to run the test suite!
How do I get the test suite running locally? I've been trying to follow the directions in .travis.yml, and I've managed to get the server started and reach the test runner, but I'm getting the error "Could not find a Java System property or Env setting with key [AWS_ACCESS_KEY].", even though I've populated my .env file per the example. I can help write some docs for this if someone can help me over the hurdles.
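In case it helps others hitting the same error: the message indicates the SDK looks up AWS_ACCESS_KEY as a Java system property or environment variable at startup. One workaround sketch is to export the credentials in the shell before starting the server, so they reach the JVM regardless of how the .env file is processed (AWS_ACCESS_KEY is taken from the error message; AWS_ACCESS_SECRET is an assumed counterpart name):

```shell
# Export the credentials the test suite looks for before starting the server.
# Values below are placeholders; AWS_ACCESS_SECRET is an assumed counterpart name.
export AWS_ACCESS_KEY="your-access-key-id"
export AWS_ACCESS_SECRET="your-secret-access-key"
echo "AWS_ACCESS_KEY is set to: $AWS_ACCESS_KEY"
```

With the variables exported, start the server again (e.g. with `box server start`) from the same shell session so the child process inherits them.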
How can I get file contents?
If I have a sample.txt in a bucket, how can I read its contents?
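One approach (a sketch: the s3 instance name and the bucketName argument are assumptions, while downloadObject()'s uri/filePath arguments appear in another issue on this page) is to download the object to a temp file and read it from there:

```cfml
// Download sample.txt to a temp file, then read it as a string.
var localPath = getTempDirectory() & "sample.txt";
s3.downloadObject(
	bucketName = "my-bucket",   // placeholder bucket name
	uri        = "sample.txt",  // the object key to fetch
	filePath   = localPath
);
var contents = fileRead( localPath, "utf-8" );
```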
Hey!
I am having an issue: when I download a PDF from my S3 bucket, my PDF reader says "There was an error while reading a stream."
My code to download the PDF looks like this: s3sdk.downloadObject( uri = "pathToFile", filePath = "#getTempDirectory()#" );
I am using snapshot 4.6.0.
I have a feeling it has something to do with the encoding, but I'm not sure where to look to fix the issue.
Thanks,
David
Edit: I tested with a plain text file and the download worked fine. I also tested with an image, which did not work.
Unable to use as a standalone library due to the LogBox dependency.
When running the getBucket() method:
Error: variable [LOG] doesn't exist
Line 897: if( log.canDebug() ){
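A possible workaround sketch until the dependency is made optional: the failing call suggests the component expects a LogBox logger in its variables scope (the "log" variable name is inferred from the error message), so a no-op stub can be mixed in when instantiating standalone:

```cfml
// Instantiate the SDK standalone (constructor arguments are placeholders),
// then mix in a no-op logger so internal log.canDebug() calls succeed.
// The "log" variable name is inferred from the error message above.
s3 = new models.AmazonS3( accessKey = "myKey", secretKey = "mySecret" );

// Assigning a function to the object and invoking it runs in the
// component's context, letting us set variables.log from outside.
s3.injectLog = function( logger ){ variables.log = arguments.logger; };
s3.injectLog( {
	canDebug : function(){ return false; },
	debug    : function( message ){ /* no-op */ },
	error    : function( message ){ /* no-op */ }
} );
```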
There was a discussion I saw on the CFML Slack channel where @simone shared that creating buckets was not working. I replicated the cases he mentioned and noticed that the buckets do get created despite an error, but the permissions are set to block all public access.
He modified the putBucket() code as below:
boolean function putBucket(
	required string bucketName    = variables.defaultBucketName,
	string acl                    = variables.defaultACL,
	string location               = "",
	string objectOwnership        = variables.defaultObjectOwnership,
	boolean BlockPublicAcls       = false,
	boolean IgnorePublicAcls      = false,
	boolean BlockPublicPolicy     = false,
	boolean RestrictPublicBuckets = false
){
	requireBucketName( arguments.bucketName );
	if ( arguments.location == "EU" ) {
		var constraintXML = "<CreateBucketConfiguration><LocationConstraint>EU</LocationConstraint></CreateBucketConfiguration>";
	} else if ( arguments.location == "ca-central-1" ) {
		var constraintXML = "<CreateBucketConfiguration><LocationConstraint>ca-central-1</LocationConstraint></CreateBucketConfiguration>";
	} else if ( arguments.location == "ap-southeast-2" ) {
		var constraintXML = "<CreateBucketConfiguration><LocationConstraint>ap-southeast-2</LocationConstraint></CreateBucketConfiguration>";
	} else {
		var constraintXML = "";
	}
	var headers = { "content-type" : "text/xml" };
	if ( len( arguments.objectOwnership ) ) {
		if (
			!listFindNoCase(
				"BucketOwnerPreferred,ObjectWriter,BucketOwnerEnforced",
				arguments.objectOwnership
			)
		) {
			throw(
				message = "Invalid value [#arguments.objectOwnership#] for [objectOwnership] when creating bucket.",
				detail  = "Valid options are: [BucketOwnerPreferred, ObjectWriter, BucketOwnerEnforced]"
			);
		}
		headers[ "x-amz-object-ownership" ] = arguments.objectOwnership;
	}
	var results = s3Request(
		method   = "PUT",
		resource = arguments.bucketName,
		body     = constraintXML,
		headers  = headers
	);
	// S3 does not provide a way to set this when creating the bucket
	putBucketPublicAccess(
		arguments.bucketName,
		arguments.BlockPublicAcls,
		arguments.IgnorePublicAcls,
		arguments.BlockPublicPolicy,
		arguments.RestrictPublicBuckets
	);
	// Must set the ACL in a second step in case the public access settings
	// above would prevent the ACL from being saved.
	putBucketACL( arguments.bucketName, arguments.acl );
	return results.responseheader.status_code == 200;
}
So in the above, he added conditions around constraintXML so that the code checks which region constraint it has to send when creating the bucket.
Thanks
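For reference, a hedged usage sketch of the modified method (the bucket name, region, and s3 instance name are placeholders; "public-read" assumes one of S3's canned ACL values):

```cfml
// Create a bucket in ca-central-1 with all public-access blocks disabled,
// relying on putBucket() to apply the ACL in its second step.
var created = s3.putBucket(
	bucketName            = "my-test-bucket",
	acl                   = "public-read",
	location              = "ca-central-1",
	BlockPublicAcls       = false,
	IgnorePublicAcls      = false,
	BlockPublicPolicy     = false,
	RestrictPublicBuckets = false
);
// created is true when S3 answered the PUT with a 200 status
```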
Currently, the putObjectFile method reads the entire contents of the given file into memory before passing it along to putObject. The AWS REST API supports multipart uploads, which can be used to stream the contents of a file up into an S3 bucket. The process for this is:
1. Initiate the upload (CreateMultipartUpload) and receive an UploadId.
2. Upload the file in chunks (UploadPart), each at least 5 MB except the last.
3. Complete the upload (CompleteMultipartUpload), passing the ETag returned for each part.
Add support for multipart uploads and, for files over a certain size, use this as the methodology for putting files to S3.
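The multipart flow could be sketched as follows. This is an illustration only: initiateMultipartUpload(), uploadPart(), and completeMultipartUpload() are hypothetical helpers that would wrap the SDK's existing s3Request() method, and readNBytes() assumes a Java 11+ runtime:

```cfml
// Hedged sketch of streaming a file to S3 via multipart upload,
// reading one chunk at a time instead of loading the whole file into memory.
var chunkSize = 5 * 1024 * 1024; // 5 MB minimum part size (except the last part)
var uploadId  = initiateMultipartUpload( bucketName, uri ); // CreateMultipartUpload
var partETags = [];
var partNum   = 1;
var input     = createObject( "java", "java.io.FileInputStream" ).init( filePath );
try {
	while ( true ) {
		var bytes = input.readNBytes( javaCast( "int", chunkSize ) );
		if ( !arrayLen( bytes ) ) break;
		// UploadPart returns an ETag that must be echoed back on completion
		partETags.append( uploadPart( bucketName, uri, uploadId, partNum, bytes ) );
		partNum++;
	}
} finally {
	input.close();
}
completeMultipartUpload( bucketName, uri, uploadId, partETags ); // CompleteMultipartUpload
```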
AWS reports that signatures are incorrect for many operations in the test suite when running CF11.
I've verified that the signing code in Sv4Util.cfc produces the same output on CF11, CF2016, and Lucee 4.5 and 5. So the problem must be somewhere in the s3Request() method. That's as far as I've gotten.
Hi! Thanks for creating this great resource!
While I've been able to write files to S3 successfully using SigV4 signing, I haven't been able to generate pre-signed object URLs with SigV4. getAuthenticatedURL() generates SigV2 URLs, I believe, so it doesn't work with objects in new buckets, and may stop working altogether in a couple of months if AWS turns off SigV2 support.
I'm interested in trying to add this functionality, leveraging what's already been built in Sv4Util. I just need a little bit of setup guidance from you. See #17 for that, please!
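For anyone picking this up: a SigV4 pre-signed URL carries its authentication entirely in the query string (per AWS's SigV4 query-parameter scheme), which is the shape a new method alongside getAuthenticatedURL() would need to produce. A sketch of the required parameters (all values are illustrative placeholders):

```cfml
// Query parameters a SigV4 pre-signed GET URL must carry.
// Credential/date values are placeholders; X-Amz-Signature is computed last,
// over a canonical request that includes all of the other parameters.
var presignParams = {
	"X-Amz-Algorithm"     : "AWS4-HMAC-SHA256",
	"X-Amz-Credential"    : "AKIAEXAMPLE/20240101/us-east-1/s3/aws4_request",
	"X-Amz-Date"          : "20240101T000000Z",
	"X-Amz-Expires"       : "3600",
	"X-Amz-SignedHeaders" : "host"
};
// The signing key and string-to-sign would come from the existing Sv4Util
// logic; append the result as &X-Amz-Signature=<hex> to finish the URL.
```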
Hello,
I'm trying to get a bucket listing using getBucket() while passing the 'prefix' parameter:
var list = s3.getBucket( bucketName = "mybucket", prefix = "/myfolder" );
When I attempt to do so, I receive the following error:
Type: S3SDKError
Messages: Error making Amazon REST Call. Code: InvalidArgument. Message: Only one auth mechanism allowed; only the X-Amz-Algorithm query parameter, Signature query string parameter or the Authorization header should be specified
When I try the same call without the 'prefix', the call succeeds and I get my bucket listing. However, because of the huge number of items in the bucket, I would like to use the prefix to filter.
I'm finding that special characters such as " ", "(", and ")" cause putObject() to fail with errors like this:
Code: SignatureDoesNotMatch. Message: The request signature we calculated does not match the signature you provided. Check your key and signing method.
I modified the file name in AmazonS3Spec.cfc's "can store a new object" test to be "example(test).txt" and "example test.txt", and both failed, whereas "exampletest.txt" was put successfully.
I tried pre-URL-encoding it, including the custom methodology defined here: https://www.bennadel.com/blog/2656-url-encoding-amazon-s3-resource-keys-for-pre-signed-urls-in-coldfusion.htm
But still no luck.
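In case it helps narrow things down: SigV4's canonical URI expects each key segment to be percent-encoded per RFC 3986, with spaces as %20 (never +), characters like ( and ) encoded, and the unreserved characters - . _ ~ plus the / separators left literal. A sketch of such an encoder (an illustration, not the SDK's code):

```cfml
// Encode an S3 object key segment-by-segment for a SigV4 canonical URI.
function encodeS3Key( required string key ){
	var segments = listToArray( arguments.key, "/" ).map( function( segment ){
		// urlEncodedFormat() uses %20 for spaces but over-encodes the
		// unreserved characters - . _ ~, so restore those afterwards.
		return replaceList( urlEncodedFormat( segment, "utf-8" ), "%2D,%2E,%5F,%7E", "-,.,_,~" );
	} );
	return arrayToList( segments, "/" );
}
// e.g. encodeS3Key( "example(test).txt" ) percent-encodes the parentheses
```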