Amazon PutObjectRequest for large files throws error - amazon-s3

I am getting an error while uploading large files (more than 50 MB) using PutObjectRequest. It throws the error: unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
I am using a federated user for this PutObjectRequest.
Please help me solve this issue.
I am sending multiple files in parallel using tasks, as follows:
Task.Factory.StartNew(() =>
{
    // Build the upload request for a single file
    PutObjectRequest req = new PutObjectRequest()
    {
        BucketName = _bucketName,
        Key = fileKey,
        FilePath = demoPath
    };
    PutObjectResponse resp = client.PutObject(req);
});

This was a bug in the AWS SDK version I was using (2.3.20).
Now I am using AWS SDK version 2.3.40 and it is working fine. The error was basically caused by a time difference between the client machine and the server, which was fixed in the updated AWS SDK DLL.

Related

Kafka Connect S3 source throws java.io.IOException

Kafka Connect S3 source connector throws the following exception around 20 seconds into reading an S3 bucket:
Caused by: java.io.IOException: Attempted read on closed stream.
at org.apache.http.conn.EofSensorInputStream.isReadAllowed(EofSensorInputStream.java:107)
at org.apache.http.conn.EofSensorInputStream.read(EofSensorInputStream.java:133)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:90)
at com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:180)
The error is preceded by the following warning:
WARN Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection. This is likely an error and may result in sub-optimal behavior. Request only the bytes you need via a ranged GET or drain the input stream after use. (com.amazonaws.services.s3.internal.S3AbortableInputStream:178)
I am running Kafka Connect from this image: confluentinc/cp-kafka-connect-base:6.2.0, using the confluentinc-kafka-connect-s3-source-2.1.1 jar.
My source connector configuration looks like so:
{
  "connector.class": "io.confluent.connect.s3.source.S3SourceConnector",
  "tasks.max": "1",
  "s3.region": "eu-central-1",
  "s3.bucket.name": "test-bucket-yordan",
  "topics.dir": "test-bucket/topics",
  "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
  "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
  "schema.compatibility": "NONE",
  "confluent.topic.bootstrap.servers": "blockchain-kafka-kafka-0.blockchain-kafka-kafka-headless.default.svc.cluster.local:9092",
  "transforms": "AddPrefix",
  "transforms.AddPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter",
  "transforms.AddPrefix.regex": ".*",
  "transforms.AddPrefix.replacement": "$0_copy"
}
Any ideas on what might be the issue? Also, I was unable to find the repository of the Kafka Connect S3 source connector; is it open source?
Edit: I don't see the problem if gzip compression on the Kafka Connect sink is disabled.
The warning means that close() was called before the object was fully read: S3 was not done sending the data, but the connection was left hanging.
There are two options (a sketch of both follows below):
Drain the input stream until it contains no more data; that way the connection can be reused.
Call s3ObjectInputStream.abort(). Note that the connection cannot be reused after you abort the input stream, so a new one will have to be created, which has a performance impact. In some cases this can still make sense, e.g. when the read is getting too slow.
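A minimal sketch of both options, assuming the AWS SDK for Java v1 (the bucket and object key names are made up for illustration):

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectInputStream;

public class S3StreamCleanup {
    public static void main(String[] args) throws Exception {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        S3Object object = s3.getObject("test-bucket-yordan", "topics/example/part-0000.json");

        try (S3ObjectInputStream in = object.getObjectContent()) {
            byte[] buf = new byte[8192];
            // ... read only the bytes you actually need here ...

            // Option 1: drain the rest of the stream so the HTTP connection can be reused.
            while (in.read(buf) != -1) {
                // discard remaining bytes
            }

            // Option 2 (instead of draining): abort the stream.
            // The connection is thrown away and a new one must be created later.
            // in.abort();
        }
    }
}

Which option is cheaper depends on how much of the object is left unread: draining a few kilobytes usually costs less than tearing down and re-establishing a connection, while draining gigabytes does not.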

Amazon S3 File Read Timeout. Trying to download a file using JAVA

I'm new to Amazon S3. I get the following error when trying to access a file from Amazon S3 using a simple Java method.
2016-08-23 09:46:48 INFO request:450 - Received successful response:200, AWS Request ID: F5EA01DB74D0D0F5
Caught an AmazonClientException, which means the client encountered an
internal error while trying to communicate with S3, such as not being
able to access the network.
Error Message: Unable to store object contents to disk: Read timed out
The exact same lines of code worked yesterday: I was able to download 100% of a 5 GB file in 12 minutes. Today I'm in a better-connected environment, but only 2% or 3% of the file is downloaded before the program fails.
The code I'm using to download:
s3Client.getObject(new GetObjectRequest("mybucket", file.getKey()), localFile);
You need to set the connection timeout and the socket timeout in your client configuration.
Click here for a reference article
Here is an excerpt from the article:
Several HTTP transport options can be configured through the com.amazonaws.ClientConfiguration object. Default values will suffice for the majority of users, but users who want more control can configure:
Socket timeout
Connection timeout
Maximum retry attempts for retry-able errors
Maximum open HTTP connections
Here is an example on how to do it:
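For instance, a minimal sketch using com.amazonaws.ClientConfiguration with the AWS SDK for Java v1 (the timeout and retry values below are illustrative assumptions, not recommendations):

import com.amazonaws.ClientConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3ClientWithTimeouts {
    public static void main(String[] args) {
        ClientConfiguration config = new ClientConfiguration();
        config.setConnectionTimeout(30 * 1000);  // 30 seconds to establish the connection
        config.setSocketTimeout(5 * 60 * 1000);  // 5 minutes of allowed read inactivity
        config.setMaxErrorRetry(5);              // retries for retry-able errors

        AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                .withClientConfiguration(config)
                .withRegion("us-east-1")         // region assumed for the example
                .build();

        // s3Client.getObject(new GetObjectRequest("mybucket", file.getKey()), localFile);
    }
}

The same ClientConfiguration object can also be passed to the older AmazonS3Client constructors if you are not using the builder.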
See also: Downloading files >3Gb from S3 fails with "SocketTimeoutException: Read timed out"

IBM MobileFirst: uploading .wlapp gives Throwable Connection Closed within 1 minute

I'm using Liberty 8.5.5.5, DB2 10.5, and MobileFirst 7.1.
While I'm uploading a .wlapp file (50 MB and above) in the Worklight console, it says:
Throwable Connection Closed : Read failed. Possible end of stream encountered, while redirecting request to http://XX.XX.XX.XX:9080/wladmin/management-apint-apis/1.0/runtimes/sample/applications?async=true
Kindly let me know what the maximum size of a .wlapp file is that I can upload to the IBM MobileFirst console.
During upload, the error is thrown after approximately 1 minute. Is there a way to increase the timeout in the configuration?
Kindly advise.
There is no file size restriction for .wlapp files.
Note that an iFix was recently published that handles the timeout values when uploading .wlapp files via the console: http://www-01.ibm.com/support/docview.wss?uid=swg1PI59605
Make sure you are using an iFix level later than 2016-03-23.
You can verify your iFix build number in the console's About screen.

PayPal works fine on localhost, but fails on AWS server

I have integrated PayPal into my MVC4 application (PayPal DLL version 1.5.0.0, Newtonsoft.Json DLL version 6.0.0.0).
At first I got an exception on localhost while getting the access token:
Dictionary<string, string> sdkConfig = new Dictionary<string, string>();
sdkConfig.Add("mode", "sandbox");
string accessToken = new PayPal.Api.OAuthTokenCredential("MyClientId", "MySecretId", sdkConfig).GetAccessToken();
The exception was:
Invalid HTTP response: The request was aborted: Could not create SSL/TLS secure channel.
From Stack Overflow I got a fix:
System.Net.ServicePointManager.Expect100Continue = true;
System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls12;
System.Net.ServicePointManager.DefaultConnectionLimit = 9999;
It's currently working fine on localhost, but I get the following exception when the code is uploaded and run on an AWS Windows instance:
Retried 3 times.... Exception in PayPal.HttpConnection.Execute(). Check log for more details.
Can somebody help me with this?
UPDATE
I have checked by uploading the same code to a MochaHost server, and it works perfectly there too.
My EC2 instance is Windows Server 2008 Datacenter, 32-bit, with IIS 7.
Make sure TCP 443 is open on any Elastic Load Balancer (ELB) you are using and in the security group assigned to the EC2 instance. In the OS, make sure TCP 443 is allowed through Windows Firewall with Advanced Security.

Service reference error when moving dev. environment from XP to W7

I am building an application that uses web services to get data from a server. It was working fine when I was developing on my XP machine, but I had to switch to Windows 7. On the new machine I grabbed the latest version of the code using SourceSafe.
However, when I try to add a service reference in the solution or update an existing one I get the following error:
There was an error downloading 'http://localhost:52490/Service/CustomerService.asmx'.
The request failed with the error message:
Server Error in '/' Application.
Parser Error
Description: An error occurred during the parsing of a resource required to service this request. Please review the following specific parse error details and modify your source file appropriately.
Parser Error Message: Could not create type 'Digital_Server.CustomerService'.
Source Error
Source File: /Service/CustomerService.asmx    Line: 1
Version Information: Microsoft .NET Framework Version:2.0.50727.4927; ASP.NET Version:2.0.50727.4927
Metadata contains a reference that cannot be resolved: 'http://localhost:52490/Service/CustomerService.asmx'.
An error occurred while receiving the HTTP response to http://localhost:52490/Service/CustomerService.asmx. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
The underlying connection was closed: An unexpected error occurred on a receive.
Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
An existing connection was forcibly closed by the remote host
If the service is defined in the current solution, try building the solution and adding the service reference again.
Does it have anything to do with IIS, or is there a configuration file I have to change in the solution?
Any help is appreciated.
How do you deploy your app to the server? Via a zip file?
Perhaps it extracts the files without the path.