
awscala's People

Contributors

ada-u, adamnfish, bplommer, deacondesperado, derrickburns, dragisak, github-actions[bot], gribeiro, hamnis, howardjohn, jamesjj, jmpage, jwszolek, kimxogus, marhel, martijndwars, michalgute, mixolyde, mr-sd, mslinn, nakulgan, note, pishen, pjfanning, scala-steward, seratch, soxee, taisukeoe, tanacasino, xuwei-k


awscala's Issues

EU_CENTRAL_1 not found in SQS.apply()

Hi,

We're currently using 0.5.2. We were using an older version (0.3.x) before, and this code worked, so something somewhere has changed!

Here is the code:

implicit val sqs = SQS.apply(<accessKeyId>, <secretAccessKey>)(Region.Ireland)

This is the error:
java.lang.NoSuchFieldError: EU_CENTRAL_1
awscala.Region0$.<init>(Region0.scala:27)
awscala.Region0$.<clinit>(Region0.scala)
awscala.package$.<init>(package.scala:3)
awscala.package$.<clinit>(package.scala)

Any clues?

Expiration time not working

I'm trying to upload a file and set an expiration time on it.

    val metadata: ObjectMetadata = new ObjectMetadata()
    metadata.setContentLength(file.length)
    metadata.setExpirationTime(DateTime.now().plusMinutes(10).toDate)

    bucket.get.putObjectAsPublicRead(filename, file, metadata)

But when I check the file in the S3 bucket, it doesn't have an expiry date. Is this a bug in AWScala, or am I using it wrong?
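
A hedged note on why this likely fails: in the AWS SDK for Java, ObjectMetadata's expiration time appears to be a response-side field (filled in from the x-amz-expiration header), so setting it on upload is ignored. Expiry is configured per bucket through lifecycle rules, whose finest granularity is whole days rather than minutes. A minimal sketch via the underlying Java client, where bucketName and the tmp/ prefix are placeholders of mine:

    import com.amazonaws.services.s3.model.BucketLifecycleConfiguration
    import scala.collection.JavaConverters._

    // delete everything under the tmp/ prefix one day after creation
    val rule = new BucketLifecycleConfiguration.Rule()
      .withId("expire-tmp")
      .withPrefix("tmp/")
      .withExpirationInDays(1)
      .withStatus(BucketLifecycleConfiguration.ENABLED)

    s3.setBucketLifecycleConfiguration(bucketName, new BucketLifecycleConfiguration(List(rule).asJava))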

Dynamodbv2 throws NullPointerException (rather than returning None) on accessing a table you just deleted

Hi. I wanted to point out an issue I've noticed: if you first create a table, then delete it and wait for the deletion to take place, calling

dynamoDB.table(tableName)

throws a NullPointerException rather than returning None. Calling

dynamoDB.table("thisisarandomnameneverusedbefore")

returns None, as expected.

This is the code to replicate the deletion error:
//Table created here...//

val table = dynamoDB.table(tableName).get
dynamoDB.delete(table)

//Wait for deletion
println("Waiting for deletion")
Thread.sleep(timeout.toMillis)

dynamoDB.table(tableName) should be (None) //<-- NPE here

and here is the stack trace; it appears to be a TableMeta problem:

java.lang.NullPointerException was thrown.
java.lang.NullPointerException
at scala.collection.convert.Wrappers$JListWrapper.iterator(Wrappers.scala:87)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at awscala.dynamodbv2.TableMeta$.apply(TableMeta.scala:13)
at awscala.dynamodbv2.DynamoDB$class.describe(DynamoDB.scala:49)
at awscala.dynamodbv2.DynamoDBClient.describe(DynamoDB.scala:333)
at awscala.dynamodbv2.DynamoDB$class.table(DynamoDB.scala:52)
at awscala.dynamodbv2.DynamoDBClient.table(DynamoDB.scala:333)

Hope it helps!

D.
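
A hedged caller-side workaround until this is fixed, treating any throw during the DELETING window as a missing table (a sketch of mine, not library behavior):

    import scala.util.Try
    import awscala.dynamodbv2.{DynamoDB, Table}

    // an NPE (or anything else thrown) from table() is folded into None
    def safeTable(name: String)(implicit dynamoDB: DynamoDB): Option[Table] =
      Try(dynamoDB.table(name)).toOption.flatten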

Default empty constructor for AmazonS3Client (and others)

In some circumstances, leveraging the default constructor of the Java S3 client is more appealing than the one provided in AWScala.

Both default constructors look for credentials stored in system properties. However, the Scala API fails if the properties aren't present, while the Java API won't. And if you're running the code on an EC2 instance with an IAM role that grants permissions to S3 buckets, you'll be in the clear. See http://blogs.aws.amazon.com/security/post/Tx1XG3FX6VMU6O5/A-safer-way-to-distribute-AWS-credentials-to-EC2

At the moment, this is how I initialize my S3 client.

val s3 = new AmazonS3Client with S3 {
  override def createBucket(name: String): Bucket = super.createBucket(name)
}

Can we add more direct support for the default constructors of the underlying Java APIs?
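
For comparison, a hedged sketch of what the Java default behavior amounts to: the default credentials provider chain checks environment variables, system properties, and the instance profile in turn, and a no-arg AWScala factory could simply delegate to it:

    import com.amazonaws.auth.DefaultAWSCredentialsProviderChain
    import com.amazonaws.services.s3.AmazonS3Client

    // plain Java client; credentials resolve lazily via the chain,
    // so an EC2 instance profile works without any system properties
    val javaS3 = new AmazonS3Client(new DefaultAWSCredentialsProviderChain())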

Cannot get S3 buckets in US_EAST_1 region

If I use the following code:

import awscala._
import s3._

object AwsTest {
  def main(args: Array[String]): Unit = {
    implicit val s3 = S3.at(Region.US_EAST_1)
    val bucket = s3.bucket("foo").get
    println(bucket.totalSize)
  }
}

I get the error:

[error] (run-main-c) java.lang.IllegalArgumentException: Cannot create enum from us-east-1 value!
java.lang.IllegalArgumentException: Cannot create enum from us-east-1 value!
    at com.amazonaws.services.s3.model.Region.fromValue(Region.java:224)
    at awscala.s3.S3$class.at(S3.scala:28)
    at awscala.s3.S3Client.at(S3.scala:239)
    at awscala.s3.S3$.apply(S3.scala:11)
    at awscala.s3.S3$.at(S3.scala:15)

My buckets were created in the us-east-1 region, but AWScala uses US_WEST_1 as its default region (unlike the AWS SDK for Java, which defaults to us-east-1). So while I can list buckets, I can't perform any operation on a specific bucket: the Region defaults to US_WEST_1, and as shown above I'm unable to override it.

If I change the definition of the s3 val to just S3(), then I get the following error from the AWS SDK:

[error] (run-main-e) com.amazonaws.services.s3.model.AmazonS3Exception: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint. (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: 35046B17DA2581B0), S3 Extended Request ID: iOteNJlE7qEkrQ6F+aMfpBeXgKjdZV0aJtB2BJCPgAQkDqTjPIgMoyvftB9kHGIdeAENGrrA0cQ=
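
A hedged interim workaround (my assumption, not a documented AWScala API): since the AWScala client extends AmazonS3Client, the region can be set through the generic SDK setter, bypassing S3.at and the S3 model.Region enum that rejects us-east-1:

    import com.amazonaws.regions.{Region => JRegion, Regions}

    implicit val s3 = S3()
    s3.setRegion(JRegion.getRegion(Regions.US_EAST_1)) // generic client setter, no model.Region involved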

dynamodbv2.Item doesn't support getting attributes by key

All DynamoDB get operations (get, query, scan, etc.) return Item objects. While the AWS API returns a Map, the Item object turns it into a simple Seq of (name, AttributeValue) pairs. That means that to get the value of a specific attribute (which I assume is a common case), you have to either turn it back into a Map or iterate. It would be much more useful to retain the attribute names and be able to write, for example, item("attribute").
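
A hedged sketch of an enrichment that would give that syntax, assuming Item exposes its attributes as a Seq of Attribute(name, value) pairs as described above:

    import awscala.dynamodbv2.{AttributeValue, Item}

    implicit class RichItem(val item: Item) extends AnyVal {
      // look up an attribute's value by name, None if absent
      def apply(name: String): Option[AttributeValue] =
        item.attributes.find(_.name == name).map(_.value)
    }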

403 Forbidden with user without list all buckets right

Good Morning,

I have an issue working with this library when using a user without the right to list all existing buckets. If I understand correctly, every time I try to retrieve a Bucket, the library first lists all existing buckets and then picks the one requested, which causes a 403 Forbidden error for a limited-access user.

Looking at how the AWS Java client deals with this situation, by providing a direct method that takes a bucket name and file path, I think it would be enough to add a similar method to get around the problem. For example, alongside getObject, adding a new

def getObject(bucketName: String, key: String)

could work.

Unfortunately I am not enough of an S3 expert to evaluate all the possible implications; I just hope this is useful.

What do you think?

Thanks for your work
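
A hedged interim workaround, assuming the AWScala S3 client extends the Java AmazonS3Client (in which case the direct Java overload is already reachable and never lists buckets):

    import com.amazonaws.services.s3.model.S3Object

    // direct fetch by bucket name and key; no ListAllMyBuckets permission needed
    val obj: S3Object = s3.getObject("my-bucket", "path/to/file.txt")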

Option[Table] results in NoSuchElementException

Hi there,

Checking if a table exists:

dynamoDB.table("mytable").nonEmpty

I expect to get either None or Some(table). Instead, it results in:

java.util.NoSuchElementException: None.get
    at scala.None$.get(Option.scala:347)
    at scala.None$.get(Option.scala:345)
    at awscala.dynamodbv2.TableMeta.table(TableMeta.scala:36)
    at awscala.dynamodbv2.DynamoDB$$anonfun$table$1.apply(DynamoDB.scala:53)
    at awscala.dynamodbv2.DynamoDB$$anonfun$table$1.apply(DynamoDB.scala:53)
    at scala.Option.map(Option.scala:146)
    at awscala.dynamodbv2.DynamoDB$class.table(DynamoDB.scala:53)
    at awscala.dynamodbv2.DynamoDBClient.table(DynamoDB.scala:338)

The reason is that the table function calls get on an Option:

hashPK = keySchema.find(_.keyType == aws.model.KeyType.HASH).get.attributeName,

while the keySchema may not contain a hash key at all, for example when the table doesn't exist.
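
A hedged sketch of the null-safe alternative implied by the report, keeping the quoted expression's shape:

    // map instead of .get, so a missing hash key yields None rather than a throw
    val hashPKOpt: Option[String] =
      keySchema.find(_.keyType == aws.model.KeyType.HASH).map(_.attributeName)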

Any plans to support AWS CloudFormation

Hello, we are using AWScala for an internal tool at our university, and we need to use the CloudFormation service from Amazon. I was wondering whether you would be interested in creating a new service in your library to support this. Waiting for your reply. Have a nice day :)

How do I paginate through the DynamoDB scan operation?

I have a table that has more than 1000 rows, and I need to paginate through the result of the scan operation. In the Java SDK I used getLastEvaluatedKey() and then started the next iteration from that key. Is there something similar in AWScala?

for example,

val googlers: Seq[Item] = table.scan(Seq("Company" -> Condition.gt("Google"))) // will filter against the first 1000 rows only

Thanks in advance.

limit in select not working as expected

I am having some problems using a limit in a select with AWScala. It seems to fetch the records in batches of the limit size, up until the last one. In my query

select * from logs where liveFrom < '1421192471363' and liveTo > '1421192471363' and rank > '0' order by rank limit 10

I get back 320 records out of a total of 323 matching records without the limit.

I have a feeling this might be related to the Stream-based next-token implementation, but I didn't have a chance to investigate.

The Java API seems to work correctly, and I switched to Java just for this.

Thanks

SQS message stream

Is there any way to use SQS with a stream, so that I don't have to set up a timer to poll and can instead provide a function to handle any new messages?
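
A hedged sketch of one way to get that shape today, assuming awscala.sqs exposes receiveMessage(queue) returning a Seq of messages (as the README suggests):

    import awscala.sqs.{Message, Queue, SQS}

    // lazily poll forever; forcing more of the stream issues further receive calls
    def messages(queue: Queue)(implicit sqs: SQS): Stream[Message] =
      Stream.continually(sqs.receiveMessage(queue)).flatten

    // usage sketch: messages(queue).foreach(handle)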

DynamoDB scan operation ExclusiveStartKey

The default scan method assumes that the returned results do not exceed 1MB. If the result of the scan operation does exceed 1MB, the caller has to keep track of LastEvaluatedKey and iterate, passing it as ExclusiveStartKey on each subsequent request. Please refer to the following code snippet for an example:

import scala.collection.JavaConverters._

def scan2(table: Table,
    filter: Seq[(String, aws.model.Condition)],
    limit: Int = 1000,
    segment: Int = 0,
    totalSegments: Int = 1,
    select: Select = aws.model.Select.ALL_ATTRIBUTES,
    attributesToGet: Seq[String] = Nil): Seq[Item] = try {

  val req = new aws.model.ScanRequest()
    .withTableName(table.name)
    .withScanFilter(filter.toMap.asJava)
    .withSelect(select)
    .withLimit(limit)
    .withSegment(segment)
    .withTotalSegments(totalSegments)
  if (attributesToGet.nonEmpty) {
    req.setAttributesToGet(attributesToGet.asJava)
  }

  implicit val dynamoDB = DynamoDB(credentials)
  val firstResult = dynamoDB.scan(req)

  var lastEvaluatedKey = firstResult.getLastEvaluatedKey()
  var allData: Seq[Item] = firstResult.getItems.asScala.map(i => Item(table, i)).toSeq

  // keep scanning until DynamoDB stops returning a LastEvaluatedKey
  while (lastEvaluatedKey != null) {
    val nextReq = new aws.model.ScanRequest()
      .withTableName(table.name)
      .withScanFilter(filter.toMap.asJava)
      .withSelect(select)
      .withLimit(limit)
      .withSegment(segment)
      .withTotalSegments(totalSegments)
      .withExclusiveStartKey(lastEvaluatedKey)

    val nextResult = dynamoDB.scan(nextReq)
    allData = allData ++ nextResult.getItems.asScala.map(i => Item(table, i)).toSeq
    lastEvaluatedKey = nextResult.getLastEvaluatedKey()
  }
  allData

} catch { case e: aws.model.ResourceNotFoundException => Nil }

Please let me know how you want to address this issue; I will be happy to send a pull request.

runAndAwait doesn't handle intermittent AWS errors

When calling runAndAwait, the following exception is thrown, even though the instance is created.

com.amazonaws.AmazonServiceException: The instance ID 'i-35ebdfce' does not exist (Service: AmazonEC2; Status Code: 400
  at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1077)
  at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:725)
  at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:460)
  at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:295)
  at com.amazonaws.services.ec2.AmazonEC2Client.invoke(AmazonEC2Client.java:9393)
  at com.amazonaws.services.ec2.AmazonEC2Client.describeImages(AmazonEC2Client.java:4762)
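
This looks like DescribeInstances eventual consistency immediately after RunInstances. A hedged caller-side sketch (the helper name and delay are mine, not library API):

    // retry a block on AmazonServiceException, sleeping between attempts
    def retry[A](attempts: Int, delayMillis: Long = 2000)(f: => A): A =
      try f catch {
        case e: com.amazonaws.AmazonServiceException if attempts > 1 =>
          Thread.sleep(delayMillis)
          retry(attempts - 1, delayMillis)(f)
      }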

Can't access underlying in the Instance class

This means that either you need to provide a mapping for EVERY AWS instance call, or provide access to the underlying instance class.

For example, this means there is currently no way to set the tags on an Instance.
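
A hedged example of the gap, worked around through the Java client (assuming the AWScala EC2 client extends AmazonEC2Client and Instance exposes instanceId):

    import com.amazonaws.services.ec2.model.{CreateTagsRequest, Tag}

    // tag an instance directly via the underlying Java API
    ec2.createTags(new CreateTagsRequest()
      .withResources(instance.instanceId)
      .withTags(new Tag("Name", "my-instance")))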

DynamoDB batchGet doesn't work when batch size exceeds limits

DynamoDB imposes limits on the number of keys (100) and on the result size of a batch get request. The AWS API's BatchGetResult does list the items that were NOT returned. The Scala version should either make those items available OR return a Stream that provides ALL the requested items.
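
A hedged caller-side sketch of the first half (the 100-key cap only); the batchGet call shape here is hypothetical, as I haven't checked the exact AWScala signature:

    // split the keys into chunks of 100 and concatenate the results;
    // dynamoDB.batchGet's real signature may differ -- this is illustrative
    val items: Seq[Item] =
      keys.grouped(100).toSeq.flatMap(chunk => dynamoDB.batchGet(Map(table -> chunk)))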

Region doesn't behave like README suggests

Trying to follow the README section on DynamoDB:

scala> import awscala._, dynamodbv2._
import awscala._
import dynamodbv2._

scala> DynamoDB.at(Region.Tokyo)
<console>:14: error: value Tokyo is not a member of object awscala.Region
              DynamoDB.at(Region.Tokyo)
                                 ^

Make Condition a Case Class for consistency

Hello! Your library is really useful and is saving me a lot of time, but something tripped me up today.
I was defining a Policy that had conditions, something like this:

val policy = Policy(
  Seq(
    Statement(Effect.Allow,
      Seq(
        Action("s3:ListAllMyBuckets"),
        Action("s3:GetBucketLocation")
      ),
      Seq(
        Resource("arn:aws:s3:::*"),
        Resource(s"arn:aws:s3:::${bucketName}")
      )
    ),
    Statement(Effect.Allow,
      Seq(
        Action("s3:ListBucket")
      ),
      Seq(
        Resource(s"arn:aws:s3:::${bucketName}")
      ),
      conditions = Seq(
        new awscala.Condition(
          "StringLike",
          "s3:prefix",
          Seq("", s"${keyToAllow}")
        )
      )
    )
  )
)

Something that really threw me off was that Resource, Statement, and Action are all case classes, so they have the companion object's apply method and you can simply write Action(...) instead of new Action(...). While it's not a huge deal, you'll notice that Condition is not a case class and therefore needs to be instantiated with new.

This puzzled me for a while, because the import awscala.Condition was fine but I was getting the error not found: value Condition when trying to build my policy. I figured it out, as you can tell from the above. But is there a reason why Condition is not a case class?

Proxy support using AWS SDK ClientConfiguration

Looks like the current version of AWScala doesn't let me specify a proxy configuration (I'm using this for S3). AWS's Java SDK supports proxies through a ClientConfiguration object, which can be passed into the S3 client constructor.

Any plans to support this in the future? I am playing around with it myself, so we'll see if I can add it, but let me know if you have pointers or are already planning to add this.
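
For reference, a hedged sketch of the Java-side object involved; the S3(conf) factory at the end is hypothetical, which is exactly what this issue is asking for:

    import com.amazonaws.ClientConfiguration

    val conf = new ClientConfiguration()
      .withProxyHost("proxy.example.com") // placeholder proxy host
      .withProxyPort(8080)

    // hypothetical AWScala factory accepting the configuration:
    // val s3 = S3(conf)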

Missing instance types

The awscala.ec2.InstanceType object proxies the aws.model.InstanceType enum from the Java SDK. However, it is missing some of the newer instance types; in my particular case, t2.micro.
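
A hedged workaround sketch: the Java request object also accepts the raw string form, so the missing enum constant can be bypassed (the AMI id is a placeholder):

    import com.amazonaws.services.ec2.model.RunInstancesRequest

    val req = new RunInstancesRequest()
      .withImageId("ami-xxxxxxxx")  // placeholder AMI
      .withInstanceType("t2.micro") // string overload, no enum constant needed
      .withMinCount(1)
      .withMaxCount(1)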

Http request execution error on createTable action

Hello

When I try to create a table on a local DynamoDB instance, I get this error:

com.amazonaws.http.AmazonHttpClient executeHelper
INFO: Unable to execute HTTP request: Read timed out
java.net.SocketTimeoutException: Read timed out
    at java.net.SocketInputStream.socketRead0(Native Method)
    at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
    at java.net.SocketInputStream.read(SocketInputStream.java:170)
    at java.net.SocketInputStream.read(SocketInputStream.java:141)
    at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:160)
    at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:84)
    at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:273)
    at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:140)
    at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:57)
    at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:260)
    at org.apache.http.impl.AbstractHttpClientConnection.receiveResponseHeader(AbstractHttpClientConnection.java:283)
    at org.apache.http.impl.conn.DefaultClientConnection.receiveResponseHeader(DefaultClientConnection.java:251)
    at org.apache.http.impl.conn.ManagedClientConnectionImpl.receiveResponseHeader(ManagedClientConnectionImpl.java:197)
    at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:271)
    at com.amazonaws.http.protocol.SdkHttpRequestExecutor.doReceiveResponse(SdkHttpRequestExecutor.java:66)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:685)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:487)
    at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
    at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:769)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:506)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:318)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:1803)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.createTable(AmazonDynamoDBClient.java:832)
    at awscala.dynamodbv2.DynamoDB$class.createTable(DynamoDB.scala:109)
    at awscala.dynamodbv2.DynamoDBClient.createTable(DynamoDB.scala:348)
    at awscala.dynamodbv2.DynamoDB$class.create(DynamoDB.scala:84)
    at awscala.dynamodbv2.DynamoDBClient.create(DynamoDB.scala:348)
    at awscala.dynamodbv2.DynamoDB$class.createTable(DynamoDB.scala:58)
    at awscala.dynamodbv2.DynamoDBClient.createTable(DynamoDB.scala:348)
    at tr.com.cma.agent.tagger.DynamoController$Test$$anonfun$2.apply$mcV$sp(DynamoController$Test.scala:24)
    at tr.com.cma.agent.tagger.DynamoController$Test$$anonfun$2.apply(DynamoController$Test.scala:19)
    at tr.com.cma.agent.tagger.DynamoController$Test$$anonfun$2.apply(DynamoController$Test.scala:19)
    at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
    at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    at org.scalatest.Transformer.apply(Transformer.scala:22)
    at org.scalatest.Transformer.apply(Transformer.scala:20)
    at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
    at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
    at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
    at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
    at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
    at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
    at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
    at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
    at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
    at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
    at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
    at org.scalatest.Suite$class.run(Suite.scala:1424)
    at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
    at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
    at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
    at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
    at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
    at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
    at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
    at org.scalatest.tools.Runner$.run(Runner.scala:883)
    at org.scalatest.tools.Runner.run(Runner.scala)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:138)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
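
A hedged thing to check first, since the client setup isn't shown: make sure the client actually targets DynamoDB Local's endpoint; a read timeout on createTable can simply mean the client is talking to the wrong host or port. setEndpoint comes from the underlying Java client, and the factory call shown is my assumption about the AWScala API:

    // dummy keys are fine for DynamoDB Local; the endpoint is what matters
    implicit val dynamoDB = DynamoDB("dummyAccessKey", "dummySecretKey")(Region.Ireland)
    dynamoDB.setEndpoint("http://localhost:8000") // DynamoDB Local's default port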

S3 Storage Class option

Hi ,

I'm trying to find out how to specify the StorageClass option when I put a new object, and also how to update it on an existing object (all my objects are stored as STANDARD instead of RR).

KR

/François
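
A hedged sketch via the underlying Java client, assuming the AWScala S3 client extends AmazonS3Client so these calls are reachable directly:

    import com.amazonaws.services.s3.model.{PutObjectRequest, StorageClass}

    // set the storage class at upload time
    s3.putObject(new PutObjectRequest(bucket.name, "path/to/key", file)
      .withStorageClass(StorageClass.ReducedRedundancy))

    // switch an existing object's storage class in place
    s3.changeObjectStorageClass(bucket.name, "path/to/key", StorageClass.ReducedRedundancy)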

Build fails when build.properties sets sbt.version=0.13.2

$ sbt test
[info] Loading global plugins from /home/mslinn/.sbt/0.13/plugins
[info] Loading project definition from /home/mslinn/work/experiments/aws/AWScala.seratch/project
References to undefined settings: 

  *:scalacOptions from *:scalacOptions (/home/mslinn/work/experiments/aws/AWScala.seratch/project/Build.scala:31)

  *:projectId from *:dependencyUpdatesData ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:11)

  *:externalResolvers from *:dependencyUpdatesData ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:11)

  *:scalaBinaryVersion from *:dependencyUpdatesData ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:11)

  *:projectId from *:dependencyUpdates ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:12)

  *:target from *:dependencyUpdatesReportFile ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:10)

  *:projectId from *:dependencyUpdatesReport ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:13)

  *:target from *:sublimeExternalSourceDirectoryParent ((com.orrsella.sbtsublime.SublimePlugin) SublimePlugin.scala:43)

  *:libraryDependencies from *:libraryDependencies (/home/mslinn/work/experiments/aws/AWScala.seratch/project/Build.scala:21)

  *:resolvers from *:resolvers (/home/mslinn/work/experiments/aws/AWScala.seratch/project/Build.scala:20)

    at sbt.Init$class.Uninitialized(Settings.scala:260)
    at sbt.Def$.Uninitialized(Def.scala:10)
    at sbt.Init$class.delegate(Settings.scala:184)
    at sbt.Def$.delegate(Def.scala:10)
    at sbt.Init$class.compiled(Settings.scala:135)
    at sbt.Def$.compiled(Def.scala:10)
    at sbt.Init$class.make(Settings.scala:141)
    at sbt.Def$.make(Def.scala:10)
    at sbt.Load$.apply(Load.scala:142)
    at sbt.Load$.defaultLoad(Load.scala:40)
    at sbt.BuiltinCommands$.doLoadProject(Main.scala:458)
    at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:452)
    at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:452)
    at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:60)
    at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:60)
    at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
    at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
    at sbt.Command$.process(Command.scala:95)
    at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:100)
    at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:100)
    at sbt.State$$anon$1.process(State.scala:179)
    at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:100)
    at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:100)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
    at sbt.MainLoop$.next(MainLoop.scala:100)
    at sbt.MainLoop$.run(MainLoop.scala:93)
    at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:71)
    at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:66)
    at sbt.Using.apply(Using.scala:25)
    at sbt.MainLoop$.runWithNewLog(MainLoop.scala:66)
    at sbt.MainLoop$.runAndClearLast(MainLoop.scala:49)
    at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:33)
    at sbt.MainLoop$.runLogged(MainLoop.scala:25)
    at sbt.StandardMain$.runManaged(Main.scala:57)
    at sbt.xMain.run(Main.scala:29)
    at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:57)
    at xsbt.boot.Launch$.withContextLoader(Launch.scala:77)
    at xsbt.boot.Launch$.run(Launch.scala:57)
    at xsbt.boot.Launch$$anonfun$explicit$1.apply(Launch.scala:45)
    at xsbt.boot.Launch$.launch(Launch.scala:65)
    at xsbt.boot.Launch$.apply(Launch.scala:16)
    at xsbt.boot.Boot$.runImpl(Boot.scala:32)
    at xsbt.boot.Boot$.main(Boot.scala:21)
    at xsbt.boot.Boot.main(Boot.scala)
[error] References to undefined settings: 
[error] 
[error]   *:scalacOptions from *:scalacOptions (/home/mslinn/work/experiments/aws/AWScala.seratch/project/Build.scala:31)
[error] 
[error]   *:projectId from *:dependencyUpdatesData ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:11)
[error] 
[error]   *:externalResolvers from *:dependencyUpdatesData ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:11)
[error] 
[error]   *:scalaBinaryVersion from *:dependencyUpdatesData ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:11)
[error] 
[error]   *:projectId from *:dependencyUpdates ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:12)
[error] 
[error]   *:target from *:dependencyUpdatesReportFile ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:10)
[error] 
[error]   *:projectId from *:dependencyUpdatesReport ((com.timushev.sbt.updates.UpdatesPlugin) UpdatesPlugin.scala:13)
[error] 
[error]   *:target from *:sublimeExternalSourceDirectoryParent ((com.orrsella.sbtsublime.SublimePlugin) SublimePlugin.scala:43)
[error] 
[error]   *:libraryDependencies from *:libraryDependencies (/home/mslinn/work/experiments/aws/AWScala.seratch/project/Build.scala:21)
[error] 
[error]   *:resolvers from *:resolvers (/home/mslinn/work/experiments/aws/AWScala.seratch/project/Build.scala:20)
[error]  
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?

async aws clients

Hi,
Do you plan to support the async AWS clients (e.g. AmazonDynamoDBAsyncClient, etc.)?
Thanks
Jakub
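
In the meantime, a hedged sketch of wrapping the blocking client in a Future; the scan signature is copied from the scan example earlier on this page, so treat it as an assumption:

    import scala.concurrent.Future
    import scala.concurrent.ExecutionContext.Implicits.global

    // blocking call shifted onto the execution context; not true async I/O
    def scanAsync(table: Table, filter: Seq[(String, aws.model.Condition)])
                 (implicit dynamoDB: DynamoDB): Future[Seq[Item]] =
      Future(table.scan(filter))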

Cannot set object metadata

Hello, my objects are created with the wrong content type and I'd like to change that. The code runs fine and gives no errors, yet the metadata is not changed. Here's my code; note that setAsPublicRead works and the keys are indeed publicly accessible.

case Some(key) =>
          key.setAsPublicRead()

          key.metadata.setContentType("text/plain")

          val meta = key.getObjectMetadata
          meta.setContentType("text/plain")

          val userMeta = new java.util.HashMap[String, String]
          userMeta.put("test-key", "test-value")
          meta.setUserMetadata(userMeta)

          key.setObjectMetadata(meta)

Issue #89 might be related
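
A hedged explanation plus sketch: S3 object metadata can't be mutated in place after upload, and setters on a fetched ObjectMetadata only change the local copy. The usual fix is a copy onto the same key with replacement metadata, via the underlying Java client:

    import com.amazonaws.services.s3.model.{CopyObjectRequest, ObjectMetadata}

    val newMeta = new ObjectMetadata()
    newMeta.setContentType("text/plain")

    // copy the object onto itself, replacing the stored metadata
    s3.copyObject(new CopyObjectRequest(bucketName, key, bucketName, key)
      .withNewObjectMetadata(newMeta))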

Modular dependencies

Would you consider breaking the dependencies out into more modular pieces? Having to depend on all of aws-java-sdk when only using S3 is really unnecessary. aws-java-sdk is already broken up into pieces, so it would just take splitting the build into sub-projects and having them depend on each other as needed.

This could potentially increase library adoption. I, for one, wouldn't use the library as it is, since it adds something like 85 MB of dependencies that I won't be using.
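
A hedged sketch of what that split might look like in sbt; the module boundaries and awsSdkVersion are illustrative, though the per-service aws-java-sdk-* artifacts do exist upstream:

    // build.sbt sketch; awsSdkVersion is assumed defined elsewhere
    lazy val core = project
      .settings(libraryDependencies += "com.amazonaws" % "aws-java-sdk-core" % awsSdkVersion)

    lazy val s3 = project
      .dependsOn(core)
      .settings(libraryDependencies += "com.amazonaws" % "aws-java-sdk-s3" % awsSdkVersion)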

Where do I configure my AWS credentials?

The README shows pretty well how to use the library's features, but I couldn't find how to configure my AWS credentials (key and secret). How can I do it?

I believe we could add this to the README, too.
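
For what it's worth, a hedged summary: AWScala sits on the AWS SDK for Java, so the standard resolution order should apply (environment variables AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY, system properties aws.accessKeyId / aws.secretKey, then the instance profile). Explicit keys can also be passed to the factory, mirroring the SQS.apply call earlier on this page:

    // explicit credentials; prefer env vars or instance profiles outside local dev
    implicit val s3 = S3("YOUR_ACCESS_KEY_ID", "YOUR_SECRET_ACCESS_KEY")(Region.Ireland)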

Update to the latest aws-java version!

I coded against your library, which depends on very old versions of httpclient and Apache commons-codec. Now I have a conflict that prevents me from continuing my development.

I noticed that upgrading to the latest version of aws-java-sdk would solve the problem.


Please publish a new version to the Maven repo ASAP, because I'm completely stuck!
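
An interim hedged workaround on the consumer side; the version numbers are guesses for illustration, not tested:

    // build.sbt: force newer transitive versions until a new AWScala is published
    dependencyOverrides ++= Set(
      "org.apache.httpcomponents" % "httpclient" % "4.3.6",
      "commons-codec" % "commons-codec" % "1.9"
    )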

Failing Tests

I just ran sbt test on the develop branch, and got:

[info] Run completed in 54 seconds, 301 milliseconds.
[info] Total number of tests run: 9
[info] Suites: completed 9, aborted 0
[info] Tests: succeeded 6, failed 3, canceled 0, ignored 0, pending 0
[info] *** 3 TESTS FAILED ***
[error] Failed tests:
[error]     awscala.EC2Spec
[error]     awscala.DynamoDBV2Spec
[error] (test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 200 s, completed May 1, 2014 10:49:17 PM

Is anything else supposed to be running in order for these tests to work? Are there configurations, permissions, or credentials required? Or are these actually broken tests?

SQS Test no longer compiles

The update to use SDK 1.7.1 has broken SQSSpec, presumably because of the renaming of awscala.sqs.SQS$createQueue to awscala.sqs.SQS$createQueueAndReturnQueueName.

I don't want to fix this myself because I'm unsure of the reasoning behind the name change.

S3 objectSummaries truncates at 1000

I'm assuming you are aware that S3's objectSummaries, which uses the list-objects request, truncates at 1000 objects and needs to support the isTruncated() and listNextBatchOfObjects() methods/requests.

Is this on your near-term roadmap? Is there a workaround (not obvious from reading the code)? Or is it a punt for the foreseeable future?
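
A hedged workaround via the underlying Java client; the list-objects calls named in the report exist in the Java SDK, and reaching them through the AWScala client is the assumption here:

    import com.amazonaws.services.s3.model.{ObjectListing, S3ObjectSummary}
    import scala.collection.JavaConverters._
    import scala.collection.mutable.ListBuffer

    // page through every batch, following the truncation marker
    val summaries = ListBuffer.empty[S3ObjectSummary]
    var listing: ObjectListing = s3.listObjects(bucket.name)
    summaries ++= listing.getObjectSummaries.asScala
    while (listing.isTruncated) {
      listing = s3.listNextBatchOfObjects(listing)
      summaries ++= listing.getObjectSummaries.asScala
    }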
