Not able to upload large files using Plupload

Recently, while uploading large files (several GB) using Plupload, the upload is aborted and the server throws java.io.EOFException: Unexpected EOF read on the socket.
Any idea how to resolve this?
java.io.EOFException: Unexpected EOF read on the socket
at org.apache.coyote.http11.InternalNioInputBuffer.fill(InternalNioInputBuffer.java:152)
at org.apache.coyote.http11.InternalNioInputBuffer$SocketInputBuffer.doRead(InternalNioInputBuffer.java:177)
at org.apache.coyote.http11.filters.IdentityInputFilter.doRead(IdentityInputFilter.java:110)
at org.apache.coyote.http11.AbstractInputBuffer.doRead(AbstractInputBuffer.java:416)
at org.apache.coyote.Request.doRead(Request.java:460)
at org.apache.catalina.connector.InputBuffer.realReadBytes(InputBuffer.java:338)
at org.apache.tomcat.util.buf.ByteChunk.substract(ByteChunk.java:395)
at org.apache.catalina.connector.InputBuffer.read(InputBuffer.java:363)
at org.apache.catalina.connector.CoyoteInputStream.read(CoyoteInputStream.java:190)
at org.apache.commons.fileupload.MultipartStream$ItemInputStream.makeAvailable(MultipartStream.java:997)
at org.apache.commons.fileupload.MultipartStream$ItemInputStream.read(MultipartStream.java:901)
at java.io.InputStream.read(InputStream.java:101)
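The exception above is Tomcat reporting that the socket was closed before the whole request body had been read. One mitigation that is often tried for multi-GB Plupload uploads (not confirmed as the fix for this post) is chunking, so the file is sent as many short requests instead of one long-lived one. A minimal client-side sketch; the element id, endpoint URL and sizes are illustrative:
// Plupload sketch with chunking enabled; adjust ids, URL and chunk size to your setup.
var uploader = new plupload.Uploader({
    browse_button: 'browse',   // id of the "select file" element (hypothetical)
    url: '/upload',            // hypothetical server endpoint
    chunk_size: '10mb',        // send the file as 10 MB chunks
    max_retries: 2             // retry a failed chunk before giving up
});
uploader.init();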

Related

Unable to establish SSL connection when using wget to download GEDI data from LP DAAC data pool

I was using wget to download GEDI data from the LP DAAC data pool. It always returns the error "unable to establish SSL connection". I tried wget both from the command prompt and from PyCharm, and added the --no-check-certificate option.
wget is the newest release (1.21.3, 64-bit).
OS: Windows 11.
From the messages below, I guess the connection to EarthData is successful, because it returns the download link, which I can open manually in the browser and then start downloading. The error seems to happen in the last step, when wget accesses the returned link and begins downloading.
returned messages:
--2022-08-14 09:51:09-- https://e4ftl01.cr.usgs.gov//GEDI_L1_L2/GEDI/GEDI01_B.002/2019.04.20/GEDI01_B_2019110092939_O01996_01_T03334_02_005_01_V002.h5
Resolving e4ftl01.cr.usgs.gov (e4ftl01.cr.usgs.gov)... 2001:49c8:4000:127d::133:130, 152.61.133.130
Connecting to e4ftl01.cr.usgs.gov (e4ftl01.cr.usgs.gov)|2001:49c8:4000:127d::133:130|:443... failed: Bad file descriptor.
Connecting to e4ftl01.cr.usgs.gov (e4ftl01.cr.usgs.gov)|152.61.133.130|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://urs.earthdata.nasa.gov/oauth/authorize?scope=uid&app_type=401&client_id=ijpRZvb9qeKCK5ctsn75Tg&response_type=code&redirect_uri=https%3A%2F%2Fe4ftl01.cr.usgs.gov%2Foauth&state=aHR0cHM6Ly9lNGZ0bDAxLmNyLnVzZ3MuZ292Ly9HRURJX0wxX0wyL0dFREkvR0VESTAxX0IuMDAyLzIwMTkuMDQuMjAvR0VESTAxX0JfMjAxOTExMDA5MjkzOV9PMDE5OTZfMDFfVDAzMzM0XzAyXzAwNV8wMV9WMDAyLmg1 [following]
--2022-08-14 09:51:55-- https://urs.earthdata.nasa.gov/oauth/authorize?scope=uid&app_type=401&client_id=ijpRZvb9qeKCK5ctsn75Tg&response_type=code&redirect_uri=https%3A%2F%2Fe4ftl01.cr.usgs.gov%2Foauth&state=aHR0cHM6Ly9lNGZ0bDAxLmNyLnVzZ3MuZ292Ly9HRURJX0wxX0wyL0dFREkvR0VESTAxX0IuMDAyLzIwMTkuMDQuMjAvR0VESTAxX0JfMjAxOTExMDA5MjkzOV9PMDE5OTZfMDFfVDAzMzM0XzAyXzAwNV8wMV9WMDAyLmg1
Resolving urs.earthdata.nasa.gov (urs.earthdata.nasa.gov)... 2001:4d0:241a:4081::89, 198.118.243.33
Connecting to urs.earthdata.nasa.gov (urs.earthdata.nasa.gov)|2001:4d0:241a:4081::89|:443... failed: Bad file descriptor.
Connecting to urs.earthdata.nasa.gov (urs.earthdata.nasa.gov)|198.118.243.33|:443... connected.
Unable to establish SSL connection.
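One thing the output above suggests trying (a sketch, not a confirmed fix): every IPv6 attempt fails with "Bad file descriptor" before the TLS handshake, so forcing IPv4 and letting wget keep the Earthdata session cookies is a reasonable retry. It assumes your Earthdata login is stored in ~/.netrc, and the cookie-file path is illustrative:
wget --inet4-only --keep-session-cookies --save-cookies .urs_cookies --load-cookies .urs_cookies --no-check-certificate "https://e4ftl01.cr.usgs.gov//GEDI_L1_L2/GEDI/GEDI01_B.002/2019.04.20/GEDI01_B_2019110092939_O01996_01_T03334_02_005_01_V002.h5"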

failed retrieving file from mirror.erickochen.nl

When I run pacman -Syu to update, it first shows no errors and I update everything as usual. After that, when I run pacman -Syu again, it shows the following. What is the reason, and is there any solution?
:: Synchronizing package databases...
core is up to date
extra is up to date
community is up to date
error: failed retrieving file 'core.db' from mirror.erickochen.nl : Failed to connect to mirror.erickochen.nl port 443 after 5241 ms: Connection timed out
error: failed retrieving file 'extra.db' from mirror.erickochen.nl : Failed to connect to mirror.erickochen.nl port 443 after 5202 ms: Connection timed out
error: failed retrieving file 'community.db' from mirror.erickochen.nl : Failed to connect to mirror.erickochen.nl port 443 after 5202 ms: Connection timed out
warning: too many errors from mirror.erickochen.nl, skipping for the remainder of this transaction
:: Starting full system upgrade...
there is nothing to do
Sometimes mirrors go offline. It's recommended to have multiple mirrors so you don't have a single point of failure, and to keep your mirrorlist updated. Using reflector is recommended, since it also finds fast candidates based on your location.
For the time being, edit /etc/pacman.d/mirrorlist and uncomment a couple of mirrors, then try updating again.
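A sketch of the reflector route (assumes reflector is installed; the mirror count and sort order are just examples). It backs up the current list, then rewrites /etc/pacman.d/mirrorlist with the 20 most recently synced HTTPS mirrors, sorted by download rate:
# back up the current mirrorlist, then let reflector pick fresh mirrors
sudo cp /etc/pacman.d/mirrorlist /etc/pacman.d/mirrorlist.bak
sudo reflector --protocol https --latest 20 --sort rate --save /etc/pacman.d/mirrorlist
sudo pacman -Syu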

JMeter SocketException when uploading 1GB file

JMeter 5.4.1
OpenJDK 15.0.1
My test server is configured by default to allow a maximum upload of 1073741824 bytes; the limit is configurable. My goal is to validate that the configured limit is respected.
When I configure it for 1048576 bytes and exceed that limit with my upload, the server sends the response:
"{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1","title":"One or more validation errors occurred.","status":400,"traceId":"|bd37f36b-4cd68e1b8b433669.","errors":{"":["Failed to read the request form. Multipart body length limit 1048576 exceeded."]}}"
When I configure it for 1073741824 bytes and exceed that limit with my upload, JMeter reports the following error:
java.net.SocketException: Connection reset by peer
at java.base/sun.nio.ch.NioSocketImpl.implWrite(NioSocketImpl.java:420)
at java.base/sun.nio.ch.NioSocketImpl.write(NioSocketImpl.java:440)
at java.base/sun.nio.ch.NioSocketImpl$2.write(NioSocketImpl.java:826)
at java.base/java.net.Socket$SocketOutputStream.write(Socket.java:1051)
at java.base/sun.security.ssl.SSLSocketOutputRecord.deliver(SSLSocketOutputRecord.java:342)
at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1277)
at org.apache.http.impl.io.SessionOutputBufferImpl.streamWrite(SessionOutputBufferImpl.java:124)
at org.apache.http.impl.io.SessionOutputBufferImpl.flushBuffer(SessionOutputBufferImpl.java:136)
at org.apache.http.impl.io.SessionOutputBufferImpl.write(SessionOutputBufferImpl.java:167)
at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:113)
at org.apache.http.entity.mime.content.FileBody.writeTo(FileBody.java:121)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl$ViewableFileBody.writeTo(HTTPHC4Impl.java:1513)
at org.apache.http.entity.mime.AbstractMultipartForm.doWriteTo(AbstractMultipartForm.java:134)
at org.apache.http.entity.mime.AbstractMultipartForm.writeTo(AbstractMultipartForm.java:157)
at org.apache.http.entity.mime.MultipartFormEntity.writeTo(MultipartFormEntity.java:113)
at org.apache.http.impl.DefaultBHttpClientConnection.sendRequestEntity(DefaultBHttpClientConnection.java:156)
at org.apache.http.impl.conn.CPoolProxy.sendRequestEntity(CPoolProxy.java:152)
at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:238)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl$2.doSendRequest(HTTPHC4Impl.java:458)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.executeRequest(HTTPHC4Impl.java:935)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:646)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:66)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1296)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1285)
at org.apache.jmeter.threads.JMeterThread.doSampling(JMeterThread.java:638)
at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:558)
at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:489)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:256)
at java.base/java.lang.Thread.run(Thread.java:832)
My JMeter.bat file has the heap set as follows:
set HEAP=-Xms1g -Xmx4g -XX:MaxMetaspaceSize=256m
This looks similar to a post from last October, but no solution was suggested or reported there.
SocketException after sending huge request via JMeter
"Connection reset by peer" means that your server has reset the connection, so you should look for the clue in your server logs.
The only thing I suggest doing on the JMeter side is to consider switching to the HTTP Raw Request sampler. It has a nice feature of streaming the file directly to the server without loading it into memory first, which you will find extremely helpful when it comes to load testing with more than one virtual user. See the HTTP Raw Request for SOAP + MTOM post on the JMeter Plugins support forum for more details.

Minio uploads through the web interface and API receive "Unauthorized request."

I can successfully upload files to my Minio server using mc command line client (logged in as root):
./mc cp roobina.jpg minio/mag
roobina.jpg: 63.50 KiB / 63.50 KiB
But when I try to upload a file to a bucket using Minio's own web interface, I receive this error:
Unauthorized request.
When using the API (in a PHP application using the Amazon S3 libraries), I receive this error:
Error:Error executing "PutObject" on "https://s3.***.net/clbu/public/4d/4b/d1ad580690058a636ad58e5af931541336ec.jpg"; AWS HTTP error: Client error: `PUT https://s3.***.net/clbu/public/4d/4b/d1ad580690058a636ad58e5af931541336ec.jpg` resulted in a `403 Forbidden` response:
Forbidden (truncated...) Unable to parse error information from response - Error parsing XML: String could not be parsed as XML
Could someone please help?
After looking at different possible causes, I found that Apache's mod_security (Apache is used as a reverse proxy in front of minio:9000) was interfering with the uploads and causing the problem.
I disabled mod_security on the reverse proxy account and the problem is now solved.
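For reference, a minimal sketch of what disabling mod_security on the reverse proxy can look like in the Apache configuration; the hostname and backend address are illustrative, and turning the rule engine off removes that protection for everything served by this vhost:
<VirtualHost *:443>
    # illustrative hostname and backend; adjust to your setup
    ServerName s3.example.net
    ProxyPass        / http://127.0.0.1:9000/
    ProxyPassReverse / http://127.0.0.1:9000/
    # turn the mod_security rule engine off for this vhost only
    <IfModule security2_module>
        SecRuleEngine Off
    </IfModule>
</VirtualHost>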

Unacceptable TLS certificate trying to run simple GStreamer pipeline

I'm trying to run a very simple GStreamer pipeline on macOS 10.15, provided here, to play a video from the Internet; however, I get the error "Unacceptable TLS certificate". I also tried other video URLs but got the same error for all of them. The output is the same using souphttpsrc instead of uridecodebin. See the pipeline and the console output below.
I remember this kind of pipeline working seamlessly on Ubuntu. What could be the reason for this behaviour on macOS, and is there any way to make the TLS certificate "acceptable"?
$ gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'source': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
ERROR: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstSoupHTTPSrc:source: Secure connection setup failed.
Additional debug info:
../ext/soup/gstsouphttpsrc.c(1383): gst_soup_http_src_parse_status (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstSoupHTTPSrc:source:
Unacceptable TLS certificate (6), URL: https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm, Redirect to: (NULL)
ERROR: pipeline doesn't want to preroll.
ERROR: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstSoupHTTPSrc:source: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstSoupHTTPSrc:source:
streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
ERROR: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstTypeFindElement:typefindelement0: Stream doesn't contain enough data.
Additional debug info:
../plugins/elements/gsttypefindelement.c(988): gst_type_find_element_chain_do_typefinding (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstTypeFindElement:typefindelement0:
Can't typefind stream
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ..
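One diagnostic worth trying (a sketch, not a fix, since it skips certificate verification entirely): souphttpsrc has an ssl-strict property, and if the pipeline plays with it disabled, the problem is likely the CA trust store used by glib-networking on macOS rather than the server certificate itself:
$ gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ssl-strict=false ! decodebin ! audioconvert ! autoaudiosink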