Description
I have a `java.util.Properties` object that is loaded with data from a file in S3 via the overloaded `Properties#load` method that accepts an `InputStream`. In every `2.0.0-preview-X` release of the SDK, the following snippet of code works perfectly:
```java
properties.load(S3Client.create().getObject(builder -> {
    builder.bucket(bucket).key(key).build();
}, ResponseTransformer.toInputStream()));
```
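For reference, a minimal self-contained version of the reproducer looks roughly like this (the bucket and key values are placeholders):

```java
import java.util.Properties;

import software.amazon.awssdk.core.sync.ResponseTransformer;
import software.amazon.awssdk.services.s3.S3Client;

public class PropertiesFromS3 {
    public static void main(String[] args) throws Exception {
        String bucket = "my-bucket";   // placeholder
        String key = "app.properties"; // placeholder

        Properties properties = new Properties();
        // Streams the object contents directly into Properties#load;
        // this is the call that fails in 2.1.0.
        properties.load(S3Client.create().getObject(builder -> {
            builder.bucket(bucket).key(key).build();
        }, ResponseTransformer.toInputStream()));

        System.out.println(properties);
    }
}
```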
However, in `2.1.0`, the following exception is thrown:
```
Exception in thread "main" software.amazon.awssdk.core.exception.SdkClientException: Data read has a different checksum than expected. Was -736260903, but expected 723910859
    at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:97)
    at software.amazon.awssdk.services.s3.checksums.ChecksumValidatingInputStream.validateAndThrow(ChecksumValidatingInputStream.java:165)
    at software.amazon.awssdk.services.s3.checksums.ChecksumValidatingInputStream.read(ChecksumValidatingInputStream.java:128)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at software.amazon.awssdk.core.io.SdkFilterInputStream.read(SdkFilterInputStream.java:66)
    at java.io.FilterInputStream.read(FilterInputStream.java:107)
    at java.util.Properties$LineReader.readLine(Properties.java:435)
    at java.util.Properties.load0(Properties.java:353)
    at java.util.Properties.load(Properties.java:341)
    ...
```
A temporary workaround that I've found is to use the following:
```java
properties.load(S3Client.create().getObject(builder -> {
    builder.bucket(bucket).key(key).build();
}, ResponseTransformer.toBytes()).asInputStream());
```
However, I suspect that this workaround loads the entire contents of the file into memory, which some users may not want when working with large files.
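If buffering the object in memory is a concern, another workaround (a sketch I haven't verified against large objects; bucket and key values are placeholders) is to stream the object to a temporary file with `ResponseTransformer.toFile` and load the `Properties` from that file:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

import software.amazon.awssdk.core.sync.ResponseTransformer;
import software.amazon.awssdk.services.s3.S3Client;

public class PropertiesViaTempFile {
    public static void main(String[] args) throws Exception {
        String bucket = "my-bucket";   // placeholder
        String key = "app.properties"; // placeholder

        // Download to a temp file instead of holding the whole object in memory.
        Path tempFile = Files.createTempFile("s3-properties-", ".properties");
        Files.delete(tempFile); // toFile() requires that the target file does not already exist

        S3Client.create().getObject(builder -> builder.bucket(bucket).key(key),
                ResponseTransformer.toFile(tempFile));

        Properties properties = new Properties();
        try (InputStream in = Files.newInputStream(tempFile)) {
            properties.load(in);
        } finally {
            Files.deleteIfExists(tempFile);
        }

        System.out.println(properties);
    }
}
```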
Environment:
```
java version "1.8.0_121"
Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
```
Please let me know if I can provide any more information.
Thank you!