spectralogic / ds3_java_sdk
License: Apache License 2.0
I see the below error against a test installation.
Caused by: java.security.SignatureException: Failed to generate HMAC : null
at com.spectralogic.ds3client.utils.Signature.calculateRFC2104HMAC(Signature.java:59)
at com.spectralogic.ds3client.utils.Signature.signature(Signature.java:70)
at com.spectralogic.ds3client.NetworkClientImpl$RequestExecutor.getSignature(NetworkClientImpl.java:315)
at com.spectralogic.ds3client.NetworkClientImpl$RequestExecutor.addHeaders(NetworkClientImpl.java:257)
at com.spectralogic.ds3client.NetworkClientImpl$RequestExecutor.execute(NetworkClientImpl.java:211)
at com.spectralogic.ds3client.NetworkClientImpl.getResponse(NetworkClientImpl.java:162)
at com.spectralogic.ds3client.Ds3ClientImpl.bulkGet(Ds3ClientImpl.java:110)
at com.spectralogic.ds3client.helpers.Ds3ClientHelpersImpl.innerStartReadJob(Ds3ClientHelpersImpl.java:106)
at com.spectralogic.ds3client.helpers.Ds3ClientHelpersImpl.startReadJob(Ds3ClientHelpersImpl.java:100)
at ch.cyberduck.core.spectra.SpectraBulkService.pre(SpectraBulkService.java:136)
... 17 more
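The "Failed to generate HMAC : null" message usually means the underlying crypto call threw an exception whose message was null, most often because the secret key handed to the signer was null or empty. A minimal sketch of RFC 2104 HMAC-SHA1 signing using the JDK's javax.crypto (class and method names below are illustrative, not the SDK's actual implementation):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SignatureException;
import java.util.Base64;

public class HmacSketch {
    // Minimal RFC 2104 HMAC-SHA1 signer. A null key is the usual cause of
    // "Failed to generate HMAC : null": the wrapped exception has no message.
    public static String sign(String data, String key) throws SignatureException {
        try {
            if (key == null) {
                throw new IllegalArgumentException("secret key must not be null");
            }
            Mac mac = Mac.getInstance("HmacSHA1");
            mac.init(new SecretKeySpec(key.getBytes(StandardCharsets.UTF_8), "HmacSHA1"));
            return Base64.getEncoder()
                    .encodeToString(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
        } catch (Exception e) {
            throw new SignatureException("Failed to generate HMAC : " + e.getMessage(), e);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sign("text-to-sign", "secret-key"));
    }
}
```

If the credentials loaded for the test installation are empty or missing, the key spec construction fails before any signing happens, which matches the trace above.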
Can you apply 0e8b9cd to the 3.0 branch?
Hello,
We've been having some issues transferring an object from S3 to BlackPearl using this library, version 5.4.0.
We are transferring using the following ObjectChannelBuilder:
// Scala code
// inputStream: the source of bytes
new Ds3ClientHelpers.ObjectChannelBuilder() {
@throws[IOException]
override final def buildChannel(key: String): SeekableByteChannel = {
val readChannel: ReadableByteChannel = Channels.newChannel(inputStream)
new ReadOnlySeekableByteChannel(readChannel)
}
}
It usually happens when we are transferring files bigger than 64 GB.
I configured the max upload size:
WriteJobOptions
.create()
.withMaxUploadSize(/* Let's say 512 GB */)
but it didn't change the behaviour. (I assume this might not affect the blob size.)
Please see stack trace below, any feedback is welcome.
org.apache.http.client.ClientProtocolException: null
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:187)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:72)
at com.spectralogic.ds3client.networking.NetworkClientImpl$RequestExecutor.execute(NetworkClientImpl.java:239)
at com.spectralogic.ds3client.networking.NetworkClientImpl.getResponse(NetworkClientImpl.java:177)
at com.spectralogic.ds3client.Ds3ClientImpl.putObject(Ds3ClientImpl.java:70)
at com.spectralogic.ds3client.helpers.strategy.transferstrategy.PutJobTransferMethod.transferJobPart(PutJobTransferMethod.java:78)
at com.spectralogic.ds3client.helpers.strategy.transferstrategy.MaxNumObjectTransferAttemptsDecorator.transferJobPart(MaxNumObjectTransferAttemptsDecorator.java:59)
at com.spectralogic.ds3client.helpers.strategy.transferstrategy.AbstractTransferStrategy.lambda$transferJobParts$2(AbstractTransferStrategy.java:196)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
at delay @ org.typelevel.log4cats.slf4j.internal.Slf4jLoggerInternal$Slf4jLogger.$anonfun$info$4(Slf4jLoggerInternal.scala:91)
at delay @ org.typelevel.log4cats.slf4j.internal.Slf4jLoggerInternal$Slf4jLogger.isInfoEnabled(Slf4jLoggerInternal.scala:66)
at ifM$extension @ org.typelevel.log4cats.slf4j.internal.Slf4jLoggerInternal$Slf4jLogger.info(Slf4jLoggerInternal.scala:91)
at >>$extension @ fs2.Pull$.fs2$Pull$$go$1(Pull.scala:1189)
Caused by: org.apache.http.client.NonRepeatableRequestException: Cannot retry request with a non-repeatable request entity
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:108)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:72)
at com.spectralogic.ds3client.networking.NetworkClientImpl$RequestExecutor.execute(NetworkClientImpl.java:239)
at com.spectralogic.ds3client.networking.NetworkClientImpl.getResponse(NetworkClientImpl.java:177)
at com.spectralogic.ds3client.Ds3ClientImpl.putObject(Ds3ClientImpl.java:70)
at com.spectralogic.ds3client.helpers.strategy.transferstrategy.PutJobTransferMethod.transferJobPart(PutJobTransferMethod.java:78)
at com.spectralogic.ds3client.helpers.strategy.transferstrategy.MaxNumObjectTransferAttemptsDecorator.transferJobPart(MaxNumObjectTransferAttemptsDecorator.java:59)
at com.spectralogic.ds3client.helpers.strategy.transferstrategy.AbstractTransferStrategy.lambda$transferJobParts$2(AbstractTransferStrategy.java:196)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.spectralogic.ds3client.exceptions.ContentLengthNotMatchException: The Content length for /_Common_Test_Bucket/big_file.mxf (68719476736) does not match the number of bytes read (0)
at com.spectralogic.ds3client.Ds3InputStreamEntity.writeTo(Ds3InputStreamEntity.java:49)
at org.apache.http.impl.execchain.RequestEntityProxy.writeTo(RequestEntityProxy.java:121)
at org.apache.http.impl.DefaultBHttpClientConnection.sendRequestEntity(DefaultBHttpClientConnection.java:156)
at org.apache.http.impl.conn.CPoolProxy.sendRequestEntity(CPoolProxy.java:152)
at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:238)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
... 12 common frames omitted
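The ContentLengthNotMatchException reporting 0 bytes read on a retry is consistent with wrapping a one-shot InputStream, as the ObjectChannelBuilder above does: after a first failed attempt consumes the stream, httpclient's RetryExec re-sends the entity, which now yields nothing, and the declared content length can no longer be satisfied. A pure-JDK sketch of the symptom (names are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;

public class RepeatableSource {
    // Drains a channel and returns the byte count. Reading the same one-shot
    // stream a second time yields 0 bytes, which is exactly the
    // "(68719476736) does not match ... (0)" symptom on an httpclient retry.
    public static int drain(ReadableByteChannel ch) throws IOException {
        ByteBuffer buf = ByteBuffer.allocate(8192);
        int total = 0, n;
        while ((n = ch.read(buf)) > 0) {
            total += n;
            buf.clear();
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        InputStream oneShot = new ByteArrayInputStream(new byte[1024]);
        int first = drain(Channels.newChannel(oneShot));  // consumes the stream
        int second = drain(Channels.newChannel(oneShot)); // nothing left to retry with
        System.out.println(first + " " + second);
    }
}
```

A channel builder that can reopen its source per attempt (e.g. file-backed) avoids this; a network-backed stream cannot be replayed, so retries of failed parts must be disabled or fed from a repeatable buffer.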
Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "Latest" (class com.spectralogic.ds3client.models.bulk.BulkObject), not marked as ignorable (4 known properties: "Name", "InCache", "Length", "Offset"])
at [Source: org.apache.http.conn.EofSensorInputStream@7979b8b7; line: 1, column: 683] (through reference chain: com.spectralogic.ds3client.models.bulk.MasterObjectList["Objects"]->java.util.ArrayList[0]->com.spectralogic.ds3client.models.bulk.Objects["Object"]->java.util.ArrayList[0]->com.spectralogic.ds3client.models.bulk.BulkObject["Latest"])
at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:51)
at com.fasterxml.jackson.databind.DeserializationContext.reportUnknownProperty(DeserializationContext.java:839)
at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1045)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1352)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1330)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:264)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:25)
at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:95)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:258)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
at com.fasterxml.jackson.dataformat.xml.deser.WrapperHandlingDeserializer.deserialize(WrapperHandlingDeserializer.java:120)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:25)
at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:95)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:258)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
at com.fasterxml.jackson.dataformat.xml.deser.WrapperHandlingDeserializer.deserialize(WrapperHandlingDeserializer.java:120)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2796)
at com.spectralogic.ds3client.serializer.XmlOutput.fromXml(XmlOutput.java:116)
at com.spectralogic.ds3client.commands.BulkResponse.processResponse(BulkResponse.java:52)
at com.spectralogic.ds3client.commands.AbstractResponse.<init>(AbstractResponse.java:62)
at com.spectralogic.ds3client.commands.BulkResponse.<init>(BulkResponse.java:34)
at com.spectralogic.ds3client.commands.BulkPutResponse.<init>(BulkPutResponse.java:24)
at com.spectralogic.ds3client.Ds3ClientImpl.bulkPut(Ds3ClientImpl.java:115)
at com.spectralogic.ds3client.helpers.Ds3ClientHelpersImpl.innerStartWriteJob(Ds3ClientHelpersImpl.java:69)
at com.spectralogic.ds3client.helpers.Ds3ClientHelpersImpl.startWriteJob(Ds3ClientHelpersImpl.java:51)
at ch.cyberduck.core.spectra.SpectraBulkService.pre(SpectraBulkService.java:75)
... 36 more
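The "Latest" failure suggests a newer BlackPearl returning an attribute the older client models don't know about; strict data binding then rejects the entire payload instead of skipping the field. A minimal pure-JDK illustration of strict versus lenient unknown-field handling (the `bind` helper is hypothetical, not SDK code):

```java
import java.util.Map;
import java.util.Set;

public class StrictBinding {
    // The four properties the old BulkObject model knows about, per the trace.
    public static final Set<String> KNOWN = Set.of("Name", "InCache", "Length", "Offset");

    // Strict mode mirrors fail-on-unknown-properties behaviour: any attribute
    // outside the model (e.g. "Latest" from a newer server) aborts parsing.
    // Lenient mode skips unknown attributes and keeps going.
    public static boolean bind(Map<String, String> attrs, boolean failOnUnknown) {
        for (String name : attrs.keySet()) {
            if (!KNOWN.contains(name) && failOnUnknown) {
                throw new IllegalArgumentException("Unrecognized field \"" + name + "\"");
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(bind(Map.of("Name", "x", "Latest", "true"), false));
    }
}
```

Configuring the deserializer to ignore unknown properties (or upgrading to an SDK whose models match the appliance version) is the usual way out of this class of failure.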
testing to see if issues work for ds3_java_sdk
The check for VersionID should be nullableEquals. Ask me how I know.
Exception in thread "main" java.lang.NullPointerException
at com.spectralogic.ds3client.models.bulk.Ds3Object.equals(Ds3Object.java:129)
at java.util.ArrayList.remove(ArrayList.java:528)
at java.util.ArrayList.forEach(ArrayList.java:1249)
at com.spectralogic.WrittenObjects.addAll(WrittenObjects.java:64)
at com.spectralogic.DS3Fill.fillOneSet(DS3Fill.java:109)
at com.spectralogic.Main.lambda$main$1(Main.java:88)
at com.spectralogic.util.Timer.timeBlock(Timer.java:55)
at com.spectralogic.Main.main(Main.java:88)
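The NPE in Ds3Object.equals happens when one side's version ID is null and the code dereferences it. A null-safe comparison via java.util.Objects.equals avoids this; a pared-down, hypothetical sketch of the pattern (not the SDK's actual class):

```java
import java.util.Objects;

public class Ds3ObjectSketch {
    private final String name;
    private final String versionId; // may legitimately be null

    public Ds3ObjectSketch(String name, String versionId) {
        this.name = name;
        this.versionId = versionId;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Ds3ObjectSketch)) return false;
        Ds3ObjectSketch other = (Ds3ObjectSketch) o;
        // Objects.equals is the "nullableEquals": null equals null,
        // and neither side is dereferenced, so ArrayList.remove is safe.
        return Objects.equals(name, other.name)
                && Objects.equals(versionId, other.versionId);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, versionId);
    }

    public static void main(String[] args) {
        System.out.println(new Ds3ObjectSketch("a", null)
                .equals(new Ds3ObjectSketch("a", null)));
    }
}
```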
Doing a GET object with an object key like "this HAS SPACES" results in the first line of the request being the following:
GET http://192.168.56.101:8080/testtday/this HAS SPACES HTTP/1.1
Also, given that the ?marker= field holds object keys, we should ensure that it's also URL encoded.
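One way to get the percent-encoding for free is to build the request line through java.net.URI, whose multi-argument constructor quotes illegal characters in the path; a sketch under that assumption (the helper name is hypothetical):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class KeyEncoding {
    // Building the path through URI percent-encodes the key, so
    // "this HAS SPACES" becomes "this%20HAS%20SPACES" instead of raw spaces
    // breaking the request line. The same encoding should be applied to keys
    // echoed back in the ?marker= query parameter.
    public static String requestPath(String bucket, String key) throws URISyntaxException {
        URI uri = new URI("http", null, "192.168.56.101", 8080,
                "/" + bucket + "/" + key, null, null);
        return uri.toASCIIString();
    }

    public static void main(String[] args) throws URISyntaxException {
        System.out.println(requestPath("testtday", "this HAS SPACES"));
    }
}
```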
For a bulk start request with one file sized 0 bytes, the response is:
<?xml version="1.0"?>
<MasterObjectList BucketName="test.cyberduck.ch" CachedSizeInBytes="0" ChunkClientProcessingOrderGuarantee="IN_ORDER" CompletedSizeInBytes="0" JobId="1ffbc34c-8a5a-4373-b8ff-70e58ca8dfb1" OriginalSizeInBytes="0" Priority="NORMAL" RequestType="PUT" StartDate="2015-12-22T15:17:21.000Z" Status="IN_PROGRESS" UserId="403020e4-02ac-4cf2-b84e-1d29a8980390" UserName="iterate" WriteOptimization="CAPACITY">
<Nodes>
<Node EndPoint="192.168.56.101" HttpPort="8080" Id="80b968b0-8df4-11e5-9255-0800279adf5d"/>
</Nodes>
</MasterObjectList>
Reading the reply fails in WriteJobImpl.<init> with:
java.lang.NullPointerException
at com.spectralogic.ds3client.helpers.WriteJobImpl.<init>(WriteJobImpl.java:46)
at com.spectralogic.ds3client.helpers.Ds3ClientHelpersImpl.innerStartWriteJob(Ds3ClientHelpersImpl.java:70)
at com.spectralogic.ds3client.helpers.Ds3ClientHelpersImpl.startWriteJob(Ds3ClientHelpersImpl.java:48)
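The MasterObjectList above has no Objects element at all, so the parsed object list is presumably null rather than empty, and the constructor trips on it. A defensive pattern that treats a missing list as empty (the helper is a sketch, not SDK code):

```java
import java.util.Collections;
import java.util.List;

public class NullGuard {
    // A MasterObjectList with no <Objects> element deserializes to a null
    // list; substituting an empty list avoids the NPE in WriteJobImpl.<init>.
    public static <T> List<T> orEmpty(List<T> maybeNull) {
        return maybeNull == null ? Collections.emptyList() : maybeNull;
    }

    public static void main(String[] args) {
        System.out.println(orEmpty(null).size());
    }
}
```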
The build fails due to violations in ds3-sdk:findbugsMain.
When reading a job status where a non-empty master object list is returned but the response carries a Retry-After header, the status in GetAvailableJobChunksResponse will be set to AVAILABLE regardless. Not sure if this is a bug, but from my understanding the Retry-After header should always be taken into account.
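One way to express the expected precedence is to let the Retry-After header override everything else; a sketch under that assumption (the enum and method names are illustrative, not the SDK's API):

```java
public class RetryAfterSketch {
    // Illustrative status values; the SDK's actual enum may differ.
    public enum ChunkStatus { AVAILABLE, RETRYLATER }

    // If the response carries Retry-After, honor the server's backoff hint
    // even when the master object list is non-empty; only the absence of the
    // header means the chunks are truly available now.
    public static ChunkStatus status(String retryAfterHeader, boolean hasObjects) {
        if (retryAfterHeader != null) {
            return ChunkStatus.RETRYLATER;
        }
        return hasObjects ? ChunkStatus.AVAILABLE : ChunkStatus.RETRYLATER;
    }

    public static void main(String[] args) {
        System.out.println(status("30", true));
    }
}
```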
An XML payload, formatted as follows, must be sent to describe the GET job to create:
<Objects><Object Name="{string}" Length="{64-bit integer}" Offset="{64-bit integer}"/> ... </Objects>
When using Ds3ClientHelpers#startReadJob it is not possible to pass an offset parameter, and the length attribute is not serialized either. Both would be required to support reads with an offset.
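A partial read would need each Object element to carry the Length and Offset attributes from the payload format quoted above; a minimal sketch of that serialization (plain string building for illustration only, not the SDK's serializer):

```java
public class GetJobPayload {
    // Emits one <Object> entry in the <Objects> payload described above,
    // including the Offset and Length attributes a ranged read requires.
    public static String objectsXml(String name, long offset, long length) {
        return "<Objects><Object Name=\"" + name
                + "\" Length=\"" + length
                + "\" Offset=\"" + offset + "\"/></Objects>";
    }

    public static void main(String[] args) {
        System.out.println(objectsXml("big_file.mxf", 0L, 68719476736L));
    }
}
```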
The type of maxUploadSize in WriteJobOptions is int but should possibly be long.
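An int tops out at 2 GiB minus one byte, so any realistic max upload size overflows; the 512 GB value from the report above makes the truncation concrete:

```java
public class MaxUploadSize {
    // 512 GB expressed in bytes. This only fits in a long; narrowing it to
    // int silently truncates (here, to 0), which is why maxUploadSize needs
    // to be a long for sizes beyond Integer.MAX_VALUE (~2 GiB).
    public static long asLong() {
        return 512L * 1024 * 1024 * 1024;
    }

    public static int asInt() {
        return (int) asLong(); // silent narrowing, value is lost
    }

    public static void main(String[] args) {
        System.out.println(asLong() + " " + asInt());
    }
}
```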
Hi. We can't seem to find the jar for ds3-interfaces 4.1.0 in bintray:
http://dl.bintray.com/spectralogic/ds3/com/spectralogic/ds3/ds3-interfaces/4.1.0/