public class HadoopS3AccessHelper extends Object implements S3AccessHelper

An implementation of the S3AccessHelper for the Hadoop S3A filesystem.

Constructor and Description |
---|
HadoopS3AccessHelper(org.apache.hadoop.fs.s3a.S3AFileSystem s3a, Configuration conf) |
Modifier and Type | Method and Description |
---|---|
com.amazonaws.services.s3.model.CompleteMultipartUploadResult | commitMultiPartUpload(String destKey, String uploadId, List<com.amazonaws.services.s3.model.PartETag> partETags, long length, AtomicInteger errorCount) Finalizes a Multi-Part Upload. |
boolean | deleteObject(String key) Deletes the object associated with the provided key. |
long | getObject(String key, File targetLocation) Gets the object associated with the provided key from S3 and puts it in the provided targetLocation. |
com.amazonaws.services.s3.model.ObjectMetadata | getObjectMetadata(String key) Fetches the metadata associated with a given key on S3. |
com.amazonaws.services.s3.model.PutObjectResult | putObject(String key, File inputFile) Uploads an object to S3. |
String | startMultiPartUpload(String key) Initializes a Multi-Part Upload. |
com.amazonaws.services.s3.model.UploadPartResult | uploadPart(String key, String uploadId, int partNumber, File inputFile, long length) Uploads a part and associates it with the MPU with the provided uploadId. |
public HadoopS3AccessHelper(org.apache.hadoop.fs.s3a.S3AFileSystem s3a, Configuration conf)
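For illustration, a minimal sketch of constructing the helper. It assumes the class's usual package (org.apache.flink.fs.s3hadoop), that the Configuration parameter is Hadoop's org.apache.hadoop.conf.Configuration, that hadoop-aws is on the classpath, and that "s3a://my-bucket/" is a placeholder bucket:

```java
import java.net.URI;

import org.apache.flink.fs.s3hadoop.HadoopS3AccessHelper;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.s3a.S3AFileSystem;

public class CreateHelperSketch {

    public static void main(String[] args) throws Exception {
        // Hadoop configuration carrying the S3A settings (credentials, endpoint, ...).
        Configuration conf = new Configuration();

        // Obtain an S3AFileSystem for the target bucket ("my-bucket" is a placeholder).
        S3AFileSystem s3a =
                (S3AFileSystem) FileSystem.get(URI.create("s3a://my-bucket/"), conf);

        // Wrap the filesystem in the access helper.
        HadoopS3AccessHelper helper = new HadoopS3AccessHelper(s3a, conf);
    }
}
```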
public String startMultiPartUpload(String key) throws IOException

Description copied from interface: S3AccessHelper
Initializes a Multi-Part Upload.

Specified by:
startMultiPartUpload in interface S3AccessHelper
Parameters:
key - the key whose value we want to upload in parts.
Throws:
IOException
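For illustration, a minimal sketch of initiating an MPU through the S3AccessHelper interface; the interface's package (org.apache.flink.fs.s3.common.writer) is assumed, and the helper instance comes from the constructor example above:

```java
import java.io.IOException;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class StartUploadSketch {

    // Initiates a Multi-Part Upload for the given key and returns the upload id
    // that subsequent uploadPart/commitMultiPartUpload calls must reference.
    static String begin(S3AccessHelper helper, String key) throws IOException {
        return helper.startMultiPartUpload(key);
    }
}
```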
public com.amazonaws.services.s3.model.UploadPartResult uploadPart(String key, String uploadId, int partNumber, File inputFile, long length) throws IOException

Description copied from interface: S3AccessHelper
Uploads a part and associates it with the MPU with the provided uploadId.

Specified by:
uploadPart in interface S3AccessHelper
Parameters:
key - the key this MPU is associated with.
uploadId - the id of the MPU.
partNumber - the number of the part being uploaded (has to be in [1 ... 10000]).
inputFile - the (local) file holding the part to be uploaded.
length - the length of the part.
Returns:
The result of the attempt to upload the part.
Throws:
IOException
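For illustration, a sketch of uploading one part of an ongoing MPU and keeping its ETag for the later commit; the interface package and the local part file are assumptions:

```java
import java.io.File;
import java.io.IOException;

import com.amazonaws.services.s3.model.PartETag;
import com.amazonaws.services.s3.model.UploadPartResult;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class UploadPartSketch {

    // Uploads one local file as part number `partNumber` (must be in [1 ... 10000])
    // of the MPU identified by `uploadId`, and returns the ETag needed for the commit.
    static PartETag uploadOnePart(
            S3AccessHelper helper, String key, String uploadId, int partNumber, File part)
            throws IOException {
        UploadPartResult result =
                helper.uploadPart(key, uploadId, partNumber, part, part.length());
        return result.getPartETag();
    }
}
```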
public com.amazonaws.services.s3.model.PutObjectResult putObject(String key, File inputFile) throws IOException

Description copied from interface: S3AccessHelper
Uploads an object to S3. Contrary to the S3AccessHelper.uploadPart(String, String, int, File, long) method, this object is not going to be associated to any MPU and, as such, it is not subject to the garbage collection policies specified for your S3 bucket.

Specified by:
putObject in interface S3AccessHelper
Parameters:
key - the key used to identify this part.
inputFile - the (local) file holding the data to be uploaded.
Returns:
The result of the attempt to stage the incomplete part.
Throws:
IOException
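For illustration, a sketch of uploading a local file as a standalone object (outside of any MPU); the interface package and the key are assumptions:

```java
import java.io.File;
import java.io.IOException;

import com.amazonaws.services.s3.model.PutObjectResult;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class PutObjectSketch {

    // Uploads a local file under `key` without tying it to any Multi-Part Upload.
    static String putWholeFile(S3AccessHelper helper, String key, File file)
            throws IOException {
        PutObjectResult result = helper.putObject(key, file);
        return result.getETag(); // ETag of the stored object
    }
}
```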
public com.amazonaws.services.s3.model.CompleteMultipartUploadResult commitMultiPartUpload(String destKey, String uploadId, List<com.amazonaws.services.s3.model.PartETag> partETags, long length, AtomicInteger errorCount) throws IOException

Description copied from interface: S3AccessHelper
Finalizes a Multi-Part Upload.

Specified by:
commitMultiPartUpload in interface S3AccessHelper
Parameters:
destKey - the key identifying the object we finished uploading.
uploadId - the id of the MPU.
partETags - the list of ETags associated with this MPU.
length - the size of the uploaded object.
errorCount - a counter that will be used to count any failed attempts to commit the MPU.
Returns:
The result of the attempt to finalize the MPU.
Throws:
IOException
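For illustration, a sketch of the full MPU lifecycle (start, upload parts, commit) through the S3AccessHelper interface; the interface package, the destination key, and the list of local part files are assumptions:

```java
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

import com.amazonaws.services.s3.model.CompleteMultipartUploadResult;
import com.amazonaws.services.s3.model.PartETag;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class MultiPartUploadSketch {

    // Uploads the given local files as consecutive parts of one MPU and finalizes it.
    static CompleteMultipartUploadResult uploadInParts(
            S3AccessHelper helper, String destKey, List<File> parts) throws IOException {

        String uploadId = helper.startMultiPartUpload(destKey);

        List<PartETag> partETags = new ArrayList<>();
        long totalLength = 0L;
        int partNumber = 1; // part numbers have to be in [1 ... 10000]

        for (File part : parts) {
            partETags.add(
                    helper.uploadPart(destKey, uploadId, partNumber++, part, part.length())
                            .getPartETag());
            totalLength += part.length();
        }

        // The counter records failed attempts to commit the MPU.
        AtomicInteger errorCount = new AtomicInteger(0);
        return helper.commitMultiPartUpload(destKey, uploadId, partETags, totalLength, errorCount);
    }
}
```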
public boolean deleteObject(String key) throws IOException

Description copied from interface: S3AccessHelper
Deletes the object associated with the provided key.

Specified by:
deleteObject in interface S3AccessHelper
Parameters:
key - The key to be deleted.
Returns:
true if the resources were successfully freed, false otherwise (e.g. the file to be deleted was not there).
Throws:
IOException
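For illustration, a sketch of deleting an object and reacting to the boolean result; the interface package and key are assumptions:

```java
import java.io.IOException;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class DeleteObjectSketch {

    // Deletes the object under `key`; a false return means there was nothing to free.
    static void deleteIfPresent(S3AccessHelper helper, String key) throws IOException {
        boolean freed = helper.deleteObject(key);
        if (!freed) {
            System.out.println("Nothing to delete for key: " + key);
        }
    }
}
```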
public long getObject(String key, File targetLocation) throws IOException

Description copied from interface: S3AccessHelper
Gets the object associated with the provided key from S3 and puts it in the provided targetLocation.

Specified by:
getObject in interface S3AccessHelper
Parameters:
key - the key of the object to fetch.
targetLocation - the file to read the object to.
Throws:
IOException
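For illustration, a sketch of downloading an object into a local file; the interface package and the target path are assumptions, and the comment about the returned long reflects the usual reading of this contract (the amount of data written to the target) rather than wording from this page:

```java
import java.io.File;
import java.io.IOException;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class GetObjectSketch {

    // Copies the object under `key` into `targetLocation`; the returned long is
    // interpreted here as the size of the data that was read.
    static long download(S3AccessHelper helper, String key, File targetLocation)
            throws IOException {
        return helper.getObject(key, targetLocation);
    }
}
```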
public com.amazonaws.services.s3.model.ObjectMetadata getObjectMetadata(String key) throws IOException

Description copied from interface: S3AccessHelper
Fetches the metadata associated with a given key on S3.

Specified by:
getObjectMetadata in interface S3AccessHelper
Parameters:
key - the key.
Returns:
The associated ObjectMetadata.
Throws:
IOException
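For illustration, a sketch of inspecting an object's size via its metadata, without downloading the data; the interface package and key are assumptions:

```java
import java.io.IOException;

import com.amazonaws.services.s3.model.ObjectMetadata;

import org.apache.flink.fs.s3.common.writer.S3AccessHelper;

class ObjectMetadataSketch {

    // Fetches the metadata for `key` and reads the object's content length from it.
    static long contentLength(S3AccessHelper helper, String key) throws IOException {
        ObjectMetadata metadata = helper.getObjectMetadata(key);
        return metadata.getContentLength();
    }
}
```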
Copyright © 2014–2021 The Apache Software Foundation. All rights reserved.