Read a gz file from S3 in Kotlin

I am trying to read a gzip-encoded object from S3 using s3Client.
Unfortunately, I get an
UncheckedIOException: "Cannot encode string."
when using
s3u.getObject(srcBucket, srcKey).await().asUtf8String()
The code for upload is
val gzip = key.endsWith(".gz")
val contentAsBytes: ByteArray = if (gzip) {
    val baos = ByteArrayOutputStream()
    GZIPOutputStream(baos).bufferedWriter(StandardCharsets.UTF_8)
        .use { it.write(content) }
    baos.toByteArray()
} else {
    content.toByteArray(StandardCharsets.UTF_8)
}

val metadata = ObjectMetadata()
metadata.contentLength = contentAsBytes.size.toLong()
metadata.contentType = "text/plain; charset=utf-8"
if (gzip) {
    metadata.contentEncoding = "gzip"
}
Can anyone help me understand how to read the gzipped object back from S3?
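asUtf8String() fails because the downloaded bytes are gzip-compressed, not plain UTF-8 text. Read the raw bytes instead (e.g. asByteArray() on the same response object, assuming the SDK used above exposes it) and run them through GZIPInputStream before decoding. A self-contained JVM sketch of the round trip, in Java:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.zip.*;

public class GzipRoundTrip {
    // Compress a string the same way the upload code does (gzip over UTF-8 text).
    static byte[] gzip(String text) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (Writer w = new OutputStreamWriter(new GZIPOutputStream(baos), StandardCharsets.UTF_8)) {
            w.write(text);
        }
        return baos.toByteArray();
    }

    // Decompress downloaded bytes, then decode the result as UTF-8.
    static String gunzipToString(byte[] compressed) throws IOException {
        try (InputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            in.transferTo(out);
            return out.toString(StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] stored = gzip("hello from S3");             // what the upload code puts in S3
        System.out.println(gunzipToString(stored));        // prints: hello from S3
    }
}
```

The same applies in Kotlin: wrap the downloaded ByteArray in GZIPInputStream(ByteArrayInputStream(bytes)) and read it back as UTF-8 text.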

Related

How to stream a big zip file until end?

I am trying to stream a big zip file with Multi through a gRPC service as follows:
@GrpcService
class HelloGrpcService : HelloGrpc {
    override fun source(request: Empty?): Multi<SourceResponse> {
        val file = File("/Users/developer/Downloads/Archive.zip")
        val res = SourceResponse.newBuilder().setData(ByteString.readFrom(file.inputStream())).build()
        return Multi.createFrom().item(res)
    }
}
Unfortunately, I have received the following exception:
System.ArgumentException: The JSON value of length 212507417 is too large and not supported.
My goal is to stream the file rather than send it in a single message.
The question is: how do I stream a big file over gRPC?
Here is the proto file:
syntax = "proto3";

import "google/protobuf/empty.proto";

option java_multiple_files = true;
option java_package = "io.acme";
option java_outer_classname = "HelloGrpcProto";

package hello;

service HelloGrpc {
    rpc Source(google.protobuf.Empty) returns (stream SourceResponse) {}
}

message SourceResponse {
    bytes data = 1;
}
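The single ByteString.readFrom(...) call materializes the whole archive into one message, which is what overflows. The usual fix is to read the file in fixed-size chunks and emit one SourceResponse per chunk, e.g. via Multi.createFrom().iterable(...) in Mutiny, with each byte[] wrapped by ByteString.copyFrom(chunk). A stdlib-only sketch of the chunking step, in Java; the chunk size and class name are illustrative:

```java
import java.io.*;
import java.util.*;

public class Chunker {
    // gRPC messages are commonly kept small (tens of KiB) so no single
    // message approaches the transport's size limit.
    static final int CHUNK_SIZE = 64 * 1024;

    // Split an InputStream into fixed-size byte chunks; each chunk would
    // become one SourceResponse emitted from the Multi.
    static List<byte[]> chunk(InputStream in, int chunkSize) throws IOException {
        List<byte[]> chunks = new ArrayList<>();
        byte[] buf;
        while ((buf = in.readNBytes(chunkSize)).length > 0) {
            chunks.add(buf);
        }
        return chunks;
    }

    public static void main(String[] args) throws IOException {
        List<byte[]> chunks = chunk(new ByteArrayInputStream(new byte[150_000]), CHUNK_SIZE);
        System.out.println(chunks.size()); // 150000 bytes -> two full 64 KiB chunks + remainder = 3
    }
}
```

Note that collecting every chunk into a list still holds the whole file in memory; for very large files you would emit chunks lazily (e.g. from a Multi emitter) as they are read from disk.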

Convert Base64 to a PNG file and save it on a Ktor backend

I need to save an image from the client app as a PNG on the backend.
I'm sending the image as Base64 in a POST request to the backend.
I can't find a way to convert the Base64 string to a PNG file, and I don't know how to save it as a file on the server.
This is the function I use to receive the data from the client. In val picture I get the image as Base64.
fun savepicture(data: getpicture) =
    transaction {
        val userid = data.userid
        val date = data.date
        val time = data.time
        val picture = data.picture
        println("$picture")
        try {
            decodeImage(picture)
        } catch (e: Exception) {
            println("Error: $e")
        }
        if (picture.isNotEmpty()) {
            return@transaction true
        }
        return@transaction false
    }

fun decodeImage(image: String) {
    val pictureBytes = Base64.getDecoder().decode(image)
    val path = Path("Path/to/destination")
    path.writeBytes(pictureBytes)
}
With this function I create the Base64 string. The Bitmap is created from a picture taken with the device.
fun encodeImage(bm: Bitmap): String? {
    val baos = ByteArrayOutputStream()
    bm.compress(Bitmap.CompressFormat.PNG, 90, baos)
    val b = baos.toByteArray()
    return java.util.Base64.getEncoder().encodeToString(b)
}
I hope someone could help me to convert and save my image.
You can use java.util.Base64 to decode your base64 string into bytes. Then you'll need to write the bytes into a file.
For instance:
import java.util.Base64
import kotlin.io.path.Path
import kotlin.io.path.writeBytes

val picture: String = "the base64 data here as a string"
// decode the base64 text into bytes
val pictureBytes = Base64.getDecoder().decode(picture)
// write the bytes to a file
val path = Path("the/path/to/the/file.png")
path.writeBytes(pictureBytes)

C#: How can I upload a file to MinIO (AWS S3 compatible API) via gRPC without buffering data?

How can I upload large files to MinIO (AWS S3 compatible API) via a gRPC service without buffering the data?
I have a gRPC service with the following definition:
service MediaService {
    rpc UploadMedia(stream UploadMediaRequest) returns (UploadMediaResponse);
}

message UploadMediaRequest {
    oneof Data {
        UploadMediaMetadata metadata = 1;
        UploadMediaStream fileStream = 2;
    }
}

message UploadMediaMetadata {
    string bucket = 1;
    string virtialDirectory = 2;
    string fileName = 3;
    string contentType = 4;
    map<string, string> attributes = 6;
}

message UploadMediaStream {
    bytes bytes = 1;
}
And the implementation of UploadMedia:
public override async Task<UploadMediaResponse> UploadMedia(
    IAsyncStreamReader<UploadMediaRequest> requestStream,
    ServerCallContext context)
{
    UploadMediaMetadata? metadata = null;
    var token = context.CancellationToken;
    var traceId = context.GetHttpContext().TraceIdentifier;

    await using var memoryStream = new MemoryStream();
    await foreach (var req in requestStream.ReadAllAsync(token))
    {
        if (req.DataCase == UploadMediaRequest.DataOneofCase.Metadata)
        {
            metadata = req.Metadata;
            _logger.LogTrace("[Req: {TraceId}] Received metadata", traceId);
        }
        else
        {
            await memoryStream.WriteAsync(req.FileStream.Bytes.Memory, token);
            _logger.LogTrace("[Req: {TraceId}] Received chunk of bytes", traceId);
        }
    }

    if (metadata == null)
    {
        throw new RpcException(new Status(StatusCode.InvalidArgument, "Metadata not found."));
    }

    memoryStream.Seek(0L, SeekOrigin.Begin);
    var uploadModel = _mapper.Map<UploadModel>(metadata);
    uploadModel.FileStream = memoryStream;
    var file = await _fileService.UploadFile(uploadModel, token);
    await _eventsService.Notify(new MediaUploadedEvent(file.PublicId), token);
    _logger.LogTrace("[Req: {TraceId}] File uploaded", traceId);
    return new UploadMediaResponse { File = _mapper.Map<RpcFileModel>(file) };
}
In this method I read the request stream and write the data to a MemoryStream. After that I upload the file to storage:
var putObjectArgs = new PutObjectArgs()
    .WithStreamData(fileStream)
    .WithObjectSize(fileStream.Length)
    .WithObject(virtualPath)
    .WithBucket(bucket)
    .WithContentType(contentType)
    .WithHeaders(attributes);
return _storage.PutObjectAsync(putObjectArgs, token);
I want to upload files without buffering the data in memory.
I could write the bytes from the stream to disk and then upload from a FileStream, but I don't want one more dependency.
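A common way to avoid both the MemoryStream and a temp file is to connect the gRPC reader and the storage client with a bounded pipe, so PutObjectAsync consumes bytes while the request stream is still being read. In .NET this is typically built with System.IO.Pipelines or anonymous pipe streams; the pattern is sketched below in Java (PipedInputStream/PipedOutputStream), with the producer and consumer standing in for the request stream and the storage call:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class PipedUpload {
    // Pump `chunks` fixed-size chunks through a bounded pipe: the producer
    // stands in for reading the gRPC request stream, the consumer for the
    // storage client's PutObject call. Only the 64 KiB pipe buffer is ever
    // held in memory; the writer blocks whenever the reader falls behind.
    static long pump(int chunks) throws IOException, InterruptedException {
        PipedOutputStream producerSide = new PipedOutputStream();
        PipedInputStream consumerSide = new PipedInputStream(producerSide, 64 * 1024);

        Thread producer = new Thread(() -> {
            try (producerSide) {
                for (int i = 0; i < chunks; i++) {
                    // stands in for "await foreach (chunk in requestStream)"
                    producerSide.write("chunk\n".getBytes(StandardCharsets.UTF_8));
                }
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        producer.start();

        // The consumer drains the pipe concurrently, as data arrives.
        long total = consumerSide.transferTo(OutputStream.nullOutputStream());
        producer.join();
        return total;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(pump(100)); // 100 chunks of 6 bytes -> 600
    }
}
```

The caveats are that producer and consumer must run concurrently, and that the storage client must accept a stream whose total length may not be known up front.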

How to get InputStream from MultipartFormDataInput?

I'm trying to save a PDF in WildFly using the RESTEasy MultipartFormDataInput provided with WildFly 20.0.1,
but it doesn't work.
This is what I have:
public static Response uploadPdfFile(MultipartFormDataInput multipartFormDataInput) {
    // local variables
    MultivaluedMap<String, String> multivaluedMap = null;
    String fileName = null;
    InputStream inputStream = null;
    String uploadFilePath = null;
    try {
        Map<String, List<InputPart>> map = multipartFormDataInput.getFormDataMap();
        List<InputPart> lstInputPart = map.get("poc");
        for (InputPart inputPart : lstInputPart) {
            // get filename to be uploaded
            multivaluedMap = inputPart.getHeaders();
            fileName = getFileName(multivaluedMap);
            if (null != fileName && !"".equalsIgnoreCase(fileName)) {
                try {
                    // write & upload file to UPLOAD_FILE_SERVER
                    // here I get the error: Unable to find a MessageBodyReader
                    // for media type: application/pdf
                    inputStream = inputPart.getBody(InputStream.class, InputStream.class);
                    uploadFilePath = writeToFileServer(inputStream, fileName);
                } catch (Exception e) {
                    e.printStackTrace();
                }
                // close the stream
                inputStream.close();
            }
        }
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        // release resources, if any
    }
    return Response.ok("File uploaded successfully at " + uploadFilePath).build();
}
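The writeToFileServer helper referenced above isn't shown; a minimal stdlib sketch (the temp-directory target is a placeholder for the real upload location) could look like this:

```java
import java.io.*;
import java.nio.file.*;

public class UploadHelper {
    // Hypothetical writeToFileServer: stream the uploaded part's body to disk
    // and return the resulting path. The temp directory is a placeholder for
    // the real UPLOAD_FILE_SERVER location.
    static String writeToFileServer(InputStream inputStream, String fileName) throws IOException {
        Path target = Paths.get(System.getProperty("java.io.tmpdir"), fileName);
        Files.copy(inputStream, target, StandardCopyOption.REPLACE_EXISTING);
        return target.toString();
    }

    public static void main(String[] args) throws IOException {
        InputStream body = new ByteArrayInputStream("%PDF-1.4 ...".getBytes());
        System.out.println(writeToFileServer(body, "file.pdf")); // prints the saved path
    }
}
```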
I'm testing with Postman: an HTTP POST whose form-data body contains a file field where I selected file.pdf.
When I send the request and try
inputStream = inputPart.getBody(InputStream.class, null);
I get:
java.lang.RuntimeException: RESTEASY007545: Unable to find a MessageBodyReader for media type: application/pdf and class type org.jboss.resteasy.util.Base64$InputStream
At the moment I am receiving and saving the file as Base64, but I think MultipartFormDataInput is the correct way.
Thanks for your support.
I solved this by changing the InputStream from org.jboss.resteasy.util.Base64.InputStream to java.io.InputStream.

Limiting upload file size in Playframework 2.2.x

Regarding file upload in Play Framework 2.2.3:
according to an update on this question,
adding the following to application.conf should enable uploads of files up to 10 MB:
parsers.MultipartFormData.maxLength = 10240K
This does not work for me; I get a "413 Request Entity Too Large" error for any file greater than 1 MB.
I tried setting another field:
parsers.text.maxLength=10M
Still, the upload fails with 413.
I upload files using XHR with a FormData that can contain multiple files.
Upload controller code:
public Result uploadAttendeeFiles() {
    try {
        MultipartFormData body = request().body().asMultipartFormData();
        List<FilePart> uploadedFiles = body.getFiles();
        Map<String, String> returnMessages = new HashMap<String, String>();
        String fileName = "", fileExtension = "", fieldId = "";
        int fileCounter = 0;
        if (!CommonUtils.isEmpty(uploadedFiles)) {
            for (FilePart filePar : uploadedFiles) {
                try {
                    fileExtension = Files.getFileExtension(filePar.getFilename());
                    fieldId = body.asFormUrlEncoded().get("fieldId")[fileCounter];
                    fileName = body.asFormUrlEncoded().get("fileName")[fileCounter++];
                    InputStream in = new FileInputStream(filePar.getFile());
                    Object objectId = uploadService.loadFile(in, fileName, fileExtension);
                    if (objectId != null)
                        returnMessages.put(fieldId, objectId.toString());
                    else
                        returnMessages.put(fieldId, "failed-Failed to save file!");
                } catch (IOException e) {
                    returnMessages.put(fileName + "." + fileExtension, "failed-Error while uploading file!");
                    e.printStackTrace();
                }
            }
        } else {
            return ok("{\"errormessage\":\"No files selected!\"}");
        }
        return ok(Json.toJson(returnMessages));
    } catch (Exception e) {
        e.printStackTrace();
        return ok("{\"errormessage\":\"Error while uploading files!\"}");
    }
}
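For reference, these are the body-parser limits as documented for Play 2.2.x; a sketch of the relevant application.conf fragment (verify the key names against your exact version):

```
# application.conf - body parser limits in Play 2.2.x
# Applies to multipart/form-data bodies, i.e. file uploads:
parsers.MultipartFormData.maxLength = 10240K
# Applies to text-based bodies (text, json, xml), not to file uploads:
parsers.text.maxLength = 512K
```

If the 413 still appears at roughly 1 MB after raising parsers.MultipartFormData.maxLength, it is worth checking any reverse proxy in front of Play: nginx, for instance, rejects larger request bodies with a 413 by default because client_max_body_size defaults to 1 MB.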