I have the public URL of a video file and I'm trying to find the size of that file.
So I created a web request, called GetResponse(), and read the file size from response.ContentLength.
My question is:
If the file is larger than 100 MB, will GetResponse() take a long time to complete the request? And will it affect performance? I only want the file size, so there is no need to download the entire file.
Please help me.
Thanks
Naresh Ede
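GetResponse() returns once the response headers arrive, so the body is not downloaded unless you read the response stream. A cleaner way to avoid touching the body at all is a HEAD request, which asks the server for the headers only. A minimal sketch, assuming the server honors HEAD (the URL is illustrative):

using System;
using System.Net;

class FileSizeProbe
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://example.com/video.mp4");
        request.Method = "HEAD"; // headers only; the video body is never transferred

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // ContentLength is parsed from the Content-Length response header.
            Console.WriteLine("File size: " + response.ContentLength + " bytes");
        }
    }
}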
I am working on a webapp where the user provides an image file-text sequence. I am compressing the sequence into a single ZIP file using JSZip.
On the server I simply use PHP's move_uploaded_file to move the upload to the desired location after having checked the file upload error status.
A test ZIP file created in this way can be found here. I have downloaded the file, expanded it in Windows Explorer and verified that its contents (two images and some HTML markup in this instance) are all present and correct.
So far so good. The trouble begins when I try to fetch that same ZIP file and expand it using JSZip.loadAsync, which consistently reports Corrupted zip: missing 210 bytes. My PHP code for squirting back the ZIP file is actually pretty simple. Shorn of the various security checks I have in place, the essential bits of that code are listed below:
if (file_exists($file))
{
    // Discard anything already in the output buffer so stray bytes
    // cannot corrupt the binary stream.
    ob_clean();
    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($file));
    readfile($file); // 200 is the default status; headers cannot be set after the body is sent
    die();
} else http_response_code(399);
where the 399 code is interpreted in my webapp as a need to create a new resource locally instead of trying to read existing resource data. The trouble happens when I use the result text (on an HTTP response of 200) and feed it to JSZip.loadAsync.
What am I doing wrong here? I assume there is something too naive about the way I am using readfile at the PHP end but I am unable to figure out what that might be.
What we set out to do
Attempt to grab a server-side ZIP file from JavaScript
If it does not exist, send back a reply (I simply set a custom HTTP response code of 399 and interpret it) telling the client to go and prepare its own new local copy of that resource
If it does exist, send back that ZIP file
Good so far. However, reading the existing ZIP file into PHP and sending it back does not make sense and is fraught with problems. My approach now is to send back an http_response_code of 302, which the client interprets as an instruction to "go get that ZIP for yourself directly".
At this point to get the ZIP "directly" simply follow the instructions in this tutorial on MDN.
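For completeness, a minimal client-side sketch, assuming the ZIP's direct URL is known (the path is illustrative). The key point is to read the response as binary; decoding the bytes as text is what corrupts the archive and produces the "missing bytes" error:

fetch('/uploads/archive.zip')              // hypothetical URL
    .then(function (response) {
        if (!response.ok) throw new Error('HTTP ' + response.status);
        return response.arrayBuffer();     // binary-safe, unlike response.text()
    })
    .then(function (buffer) { return JSZip.loadAsync(buffer); })
    .then(function (zip) {
        // List the entries to verify the archive expanded cleanly.
        zip.forEach(function (relativePath) { console.log(relativePath); });
    });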
I am working on an ASP.NET Core application where one of the actions is responsible for uploading large files. The Limits.MaxRequestBodySize property is set to 100MB for Kestrel in Startup.cs. The action that uploads the file is already decorated with [DisableRequestSizeLimit] and [RequestFormLimits(MultipartBodyLengthLimit = int.MaxValue)].
Despite disabling the limit at the action level and setting the maximum body size to 100MB at the global level, the request fails with 500: BadHttpRequestException "Request body too large" when the file I am trying to upload is only 34MB. The exception occurs in one of the middleware components at "await context.Request.Body.CopyToAsync(stream)", and the exception stack trace also mentions the Content-Length is 129MB. The exception does not occur if I set Limits.MaxRequestBodySize to 200MB or to null.
Questions:
Why is the request size 129MB when I am uploading only a 34MB file? What makes up the remaining ~95MB?
When the request is already in context.Request.Body, why does it throw an error while copying it ("await context.Request.Body.CopyToAsync(stream)") to a new stream?
I really appreciate any help with these. Please let me know if anything is unclear; I can provide more details.
Regards,
Siva
The issue could be that the default request limit in your application or the web server is too low. It looks like the default maxAllowedContentLength is approx. 30MB.
Perhaps these links can help you out:
https://stackoverflow.com/a/59840618/432074
https://github.com/dotnet/aspnetcore/issues/20369#issuecomment-607057822
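For reference, a sketch of raising both limits, assuming the app runs on Kestrel behind IIS, which enforces maxAllowedContentLength separately (values are illustrative):

// In Startup.ConfigureServices (Kestrel):
services.Configure<KestrelServerOptions>(options =>
{
    options.Limits.MaxRequestBodySize = 200L * 1024 * 1024; // 200 MB; null removes the limit
});

// For the IIS side, web.config:
// <system.webServer>
//   <security>
//     <requestFiltering>
//       <requestLimits maxAllowedContentLength="209715200" />
//     </requestFiltering>
//   </security>
// </system.webServer>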
Here is the solution:
There was nothing wrong with MaxRequestBodySize or maxAllowedContentLength. It was the size of the request that was causing the issue. Even though I was uploading a file of ~34MB, the file was converted to a byte array and then to base64, which inflated the request size. I used the IFormFile interface to send the file instead of a byte array/base64, and it is working fine now.
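Base64 encodes every 3 bytes as 4 characters, so a 34MB file grows to roughly 45MB before any further serialization overhead. A minimal sketch of the IFormFile approach, assuming a multipart/form-data upload (controller, route, and target path are illustrative):

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class UploadController : ControllerBase
{
    [HttpPost("upload")]
    [DisableRequestSizeLimit]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file == null || file.Length == 0)
            return BadRequest("No file received.");

        var target = Path.Combine(Path.GetTempPath(), Path.GetFileName(file.FileName));
        using (var stream = System.IO.File.Create(target))
        {
            // Streams the raw multipart body: no byte[] buffering, no base64 inflation.
            await file.CopyToAsync(stream);
        }
        return Ok(new { file.FileName, file.Length });
    }
}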
Okay, so here is the deal:
I am capturing the audio stream from the microphone via NAudio's WaveIn.
This allows me to save my capture straight into a WAV file continuously.
Now I want this WAV to be continuously uploaded to my FTP account.
I.e. while the user is recording from the mic, his input is being stored in a file, and this file is being uploaded to my FTP.
I am currently facing problems with a file lock, which does not allow me to access the file while it is being written to.
I would be grateful if you could suggest a method to upload my stream directly to my FTP account that avoids this file issue.
This is my code for recording:
Dim recordingFormat As New WaveFormat(8000, 16, 1) ' 8 kHz, 16-bit, mono
writer = New WaveFileWriter("recorded.wav", recordingFormat)
waveInStream = New WaveIn()
waveInStream.DeviceNumber = 0 ' default capture device
waveInStream.WaveFormat = recordingFormat
AddHandler waveInStream.DataAvailable, AddressOf waveInStream_DataAvailable
waveInStream.StartRecording()
And for continuously saving the stream to the file:
writer.Write(e.Buffer, 0, e.BytesRecorded)
I want this ^ to be fed directly into the FTP buffer.
Any help will be appreciated. TIA!
You can't upload a WAV file to an FTP server while you are still creating it. The WAV file contains some chunk sizes in its header that are not filled in until you have finished creating the file.
Also, you haven't said what you are using to upload the file to FTP, but I expect most FTP uploaders don't support a file that keeps growing while it is being transferred.
Looks like you need a timer to stop the recording once a minute, finalize the WAV file, and start recording again while you upload the completed file at the same time.
You can use the upload mechanism of the .NET Framework itself to simply upload a file.
I think you will be missing about one second every minute, which is not so bad.
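A rough sketch of that rotate-and-upload idea, in C# with NAudio (file names, host, and credentials are placeholders). Disposing the WaveFileWriter is what fills in the header chunk sizes, so each segment is uploaded only after it has been closed:

using System.Net;
using NAudio.Wave;

class SegmentedRecorder
{
    private WaveFileWriter writer;
    private WaveIn waveIn;
    private int segment;

    public void StartSegment()
    {
        var format = new WaveFormat(8000, 16, 1);
        writer = new WaveFileWriter("part" + segment + ".wav", format);
        waveIn = new WaveIn { DeviceNumber = 0, WaveFormat = format };
        waveIn.DataAvailable += (s, e) => writer.Write(e.Buffer, 0, e.BytesRecorded);
        waveIn.StartRecording();
    }

    // Call this from a timer once a minute.
    public void RotateAndUpload()
    {
        waveIn.StopRecording();
        waveIn.Dispose();
        writer.Dispose(); // finalizes the WAV header (chunk sizes)

        string finished = "part" + segment + ".wav";
        segment++;
        StartSegment(); // resume capturing with a minimal gap

        using (var client = new WebClient())
        {
            client.Credentials = new NetworkCredential("user", "password"); // placeholders
            client.UploadFile("ftp://example.com/" + finished, finished);   // placeholder host
        }
    }
}

In practice the upload should run on a background thread so it never blocks the capture callback.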
I have a requirement where the user can upload files present in the app to SharePoint via the same app.
I tried using SharePoint's http://schemas.microsoft.com/sharepoint/soap/CopyIntoItems method, but it needs the file as a base64-encoded string embedded into the body of the SOAP request. My code crashed on the device when I tried to convert even a 30 MB file to a base64-encoded string, yet the same code executed just fine on the simulator.
Is there any other alternative for uploading files (like file streaming, etc.) to SharePoint? I may have to upload files up to 500 MB. Is there a more efficient library for converting NSData into a base64-encoded string for large files?
Should I read the file in chunks, convert each chunk into a base64-encoded string, and upload once the complete file is converted? Any other approaches?
First off, your code probably crashed because it ran out of memory. I would do a loop that reads chunks, converts them, and pushes them to an open socket. This probably means you need to go to a lower level than NSURLConnection; I have searched for NSURLConnection with chunked upload without much success.
Some seem to suggest using ASIHttp, but looking at the homepage it seems abandoned by the developer, so I can't recommend it.
AFNetworking looks really good: it has blocks support, and I can see in the example on its first page how it could work for you. Look at the streaming request example. Basically, create an NSInputStream that you push chunked data to and use it in an AFHTTPURLConnectionOperation.
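On the chunked conversion specifically, here is a minimal sketch, assuming the file path is known (names are illustrative). The chunk size is a multiple of 3 bytes, so the individual base64 segments concatenate into one valid encoded string:

NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:filePath];
NSData *chunk;
while ((chunk = [handle readDataOfLength:3 * 1024 * 1024]).length > 0) {
    // Only this 3 MB window is in memory, not the whole 500 MB file.
    NSString *encoded = [chunk base64EncodedStringWithOptions:0];
    // Push `encoded` to the open stream/socket here instead of
    // accumulating it in memory.
}
[handle closeFile];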
I'm currently implementing an update feature in an app that I'm building. It uses NSURLConnection and the NSURLConnectionDelegate to download the files and save them to the user's device.
At the moment, one update item downloads multiple files, but I want to display this download of multiple files using one UIProgressView. So my problem is: how do I get the expected content length of all the files I'm about to download? I know I can get the expectedContentLength of the NSURLResponse object that gets passed into didReceiveResponse, but that's just for the file being downloaded.
Any help is much appreciated. Thanks.
How about having some kind of information file on your server which gives you the total bytes? You could load that first and then load your files, subtracting the loaded amount for each file from the total.
Another method would be to connect to all files first and cancel each connection after you receive its response. Add up the expected bytes of all files and then use that as the basis for showing the total progress while loading the files.
Downside of #1: you have to manually keep track of the bytes.
Downside of #2: you'll have double the number of requests, even though they get cancelled after the response.
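A minimal sketch of approach #2, using HEAD requests instead of cancelled GETs (a small variant of the same idea), assuming an array of NSURLs and that this runs off the main thread (names are illustrative):

long long total = 0;
for (NSURL *url in fileURLs) {
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    request.HTTPMethod = @"HEAD"; // headers only, nothing to cancel
    NSURLResponse *response = nil;
    [NSURLConnection sendSynchronousRequest:request
                          returningResponse:&response
                                      error:NULL];
    if (response.expectedContentLength != NSURLResponseUnknownLength)
        total += response.expectedContentLength;
}
// `total` becomes the denominator for the single UIProgressView.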
Use the ASIHTTPRequest open-source framework, widely used for this purpose.
Here you just need to set the progress-view delegate, and it will keep updating your progress view.
Try this:
http://allseeing-i.com/ASIHTTPRequest/