How can I check the size of a file using ContentLength before saving on the server (MVC4) - asp.net-mvc-4

My error is "Maximum request length exceeded."
I want the uploaded file to be smaller than 2 MB. Please help me fix the code below, thanks.
My controller:
public ActionResult Index()
{
    var path = "~/Images/upload/";
    var upload = Request.Files["UpFile"];

    // ContentLength is measured in bytes, so 2 MB is 2 * 1024 * 1024,
    // not 2048 (which is only 2 KB).
    if (upload != null && upload.ContentLength > 0 && upload.ContentLength < 2 * 1024 * 1024)
    {
        upload.SaveAs(Server.MapPath(path + upload.FileName));
    }
    else
    {
        ModelState.AddModelError("", "The file is too big.");
    }
    return View();
}

Try to manage your maximum request length to keep this error to a minimum:
Maximum request length exceeded
This error is raised by ASP.NET before your action runs, so the ContentLength check alone cannot prevent it; you also need to raise the request limits in web.config.
I also think it's good practice to use try..catch when working with file uploads, even if you have a global exception handler.
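For reference, a minimal web.config sketch that raises both limits to 2 MB; note that maxRequestLength is measured in kilobytes while maxAllowedContentLength is measured in bytes:
<configuration>
  <system.web>
    <!-- ASP.NET limit, in KB: 2048 KB = 2 MB -->
    <httpRuntime maxRequestLength="2048" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS limit, in bytes: 2097152 bytes = 2 MB -->
        <requestLimits maxAllowedContentLength="2097152" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>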

Related

Upload large files with Spring WebFlux

I want to upload files to a MinIO file container.
Smaller files work as expected with this code:
private Mono<Boolean> saveFileToMinio(FilePart filePart) {
    log.info("About to save database to minio container...");
    Mono<Boolean> result = Mono.from(
        filePart.content().flatMap(dataBuffer -> {
            var bytes = dataBuffer.asByteBuffer().array();
            dataBuffer.read(bytes);
            DataBufferUtils.release(dataBuffer);
            return Flux.just(bytes);
        })
        .flatMap(databaseFileService::write)
        .then(Mono.just(true))
        .onErrorMap(throwable -> {
            log.error(throwable.getMessage(), throwable);
            return throwable;
        }));
    log.info("Successfully saved database to minio container...");
    return result;
}
I need to provide a byte[] to my MinIO service for the upload.
Smaller files work as expected (they are stored in the container), but larger files (12 MB in my test) don't work.
I get this exception:
java.lang.IndexOutOfBoundsException: readPosition 0 and length 1024 should be smaller than writePosition 808
I've tried the DataBufferUtils.join suggestion from another SO post. It's kind of odd, but I think the following code does the job:
private Mono<Boolean> saveFileToMinio(FilePart filePart) {
    var result = DataBufferUtils.join(filePart.content())
        .map(dataBuffer -> {
            var bytes = dataBuffer.asByteBuffer().array();
            dataBuffer.read(bytes);
            DataBufferUtils.release(dataBuffer);
            return bytes;
        })
        .map(databaseFileService::write)
        .then(Mono.just(true))
        .onErrorMap(throwable -> {
            log.error(throwable.getMessage(), throwable);
            return throwable;
        });
    log.info("Successfully saved database to minio container...");
    return result;
}
This line in particular seems to do the trick:
DataBufferUtils.join
I don't know exactly why, but it works.
EDIT:
Looking at the source, the static join above sets a parameter maxByteCount to -1, i.e. no limit. A likely explanation for the failure of the first version: each DataBuffer is only one chunk of the file, and asByteBuffer().array() returns the chunk's whole backing array, which can be larger than its readable content (hence "length 1024 should be smaller than writePosition 808"), whereas join first aggregates all chunks into a single buffer.
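For reference, a minimal sketch (assuming the same imports as above and the two-argument join(Publisher, int maxByteCount) variant of the same API) that both bounds the joined buffer, so a huge upload cannot exhaust the heap, and sizes the byte array from readableByteCount() instead of relying on the backing array; the 16 MB limit is an illustrative assumption:
private Mono<byte[]> readAllBytes(FilePart filePart) {
    // Join all chunks into one buffer, rejecting uploads above 16 MB.
    return DataBufferUtils.join(filePart.content(), 16 * 1024 * 1024)
        .map(dataBuffer -> {
            // Allocate exactly the readable size instead of using
            // asByteBuffer().array(), whose backing array may be larger.
            byte[] bytes = new byte[dataBuffer.readableByteCount()];
            dataBuffer.read(bytes);
            DataBufferUtils.release(dataBuffer);
            return bytes;
        });
}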

Set minimum and maximum file size for file upload in ASP.NET Core

How can I set a minimum and maximum size for file uploads in ASP.NET Core 3?
I've tried to find a way to do this, but on the Internet I only found solutions for defining the maximum size.
In my opinion, if you want to check the minimum and maximum file size for a file upload, I suggest you create a custom middleware that checks the HTTP Content-Length; if the content length doesn't fall within the minimum and maximum size, you can return a custom response.
For more details, refer to the code below.
Add the following middleware to the Startup.cs Configure method:
Notice: I use app.UseWhen to check the path, so this only applies when the URL path starts with "/api". If you want to match all requests, you can use app.Use directly.
app.UseWhen(
    context => context.Request.Path.StartsWithSegments("/api"),
    CheckRequestLengthAsync);
CheckRequestLengthAsync method:
private void CheckRequestLengthAsync(IApplicationBuilder app)
{
    app.Use(async (context, next) =>
    {
        if (context.Request.ContentLength < 50 && context.Request.ContentLength > 5)
        {
            context.Response.StatusCode = 500;
            context.Response.ContentType = "text/html";
            await context.Response.WriteAsync("The content length is not within the allowed range.");
        }
        else
        {
            // Do work that doesn't write to the Response.
            await next();
            // Do other work that doesn't write to the Response.
        }
    });
}
Result:
#brando-zhang's answer is right but needs a little fix: the condition should reject requests outside the allowed range, not inside it.
if (context.Request.ContentLength < 10000 || context.Request.ContentLength > 200000)
This allows 10 KB - 200 KB.
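Putting both answers together, a minimal sketch for the Configure method; the "/api" path and the 10 KB - 200 KB range come from the answers above, while the 413 Payload Too Large status code is my substitution for the original 500:
app.UseWhen(
    context => context.Request.Path.StartsWithSegments("/api"),
    branch => branch.Use(async (context, next) =>
    {
        var length = context.Request.ContentLength;
        // Reject when the declared length is missing or outside 10 KB - 200 KB.
        if (length == null || length < 10000 || length > 200000)
        {
            context.Response.StatusCode = StatusCodes.Status413PayloadTooLarge;
            await context.Response.WriteAsync("File must be between 10 KB and 200 KB.");
        }
        else
        {
            await next();
        }
    }));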

How to limit blob storage file size in ASA output

I'm working with an Azure solution that has an ASA output to blob storage. I'm getting output files in a folder tree structure like yyyy/mm/dd/hh (e.g. 2017/10/26/07). Sometimes files are still being written to an hour's folder after that hour has passed and, as a result, the files can become very big. Is there a way to limit the size of those files from ASA?
There is no way to limit the size today; the only size limitation is the blob limit itself. However, ASA will create a new folder for every hour if your path is yyyy/mm/dd/hh. Please note that this is based on the System.Timestamp column, not wall-clock time.
Yes, you can limit the file size and create a new file once the existing file reaches the limit by using the Length property below.
namespace Microsoft.Azure.Management.DataLake.Store.Models {
    ...
    // Summary:
    //     Gets the number of bytes in a file.
    [JsonProperty(PropertyName = "length")]
    public long? Length { get; }
    ...
}
Below is an example scenario: if the file size exceeds 256 MB (268435456 bytes), create a new file; otherwise keep using the existing file.
Create a function and use it to determine the file path; below is a sample code snippet for the function.
Code Snippet:
public static async Task<string> GetFilePath(DataLakeStoreClient client, string path) {
    var createNewFile = false;
    ......
    if (await client.GetFileSize(returnValue) >= 256 * 1024 * 1024)
    {
        returnValue = GetFilePath(path);
        createNewFile = true;
    }
    ......
}

public async Task<long?> GetFileSize(string filepath) {
    return (await this._client.FileSystem.GetFileStatusAsync(_connectionString.AccountName, filepath)).FileStatus.Length;
}

Limit front-end file upload size in Magento

So here's my problem. My admins need to upload files that can be anywhere from 2 to 500 MB each. I've set my php.ini settings appropriately and all is well with this requirement. But now I've been asked to allow guests to upload files from the front-end. Obviously, I do not want to give them the ability to upload 500 MB files.
I've searched around and have been unable to find a decent answer for allowing large file uploads in the admin while limiting front-end guests to smaller file sizes.
So how do you allow your admins to continue uploading extremely large files while restricting front-end users to smaller file sizes?
Here's my solution:
public function saveAction()
{
    $post = $this->getRequest()->getPost();
    $helper = Mage::helper('my_module');
    if ($post) {
        $errors = array();
        $error  = false;
        try {
            // 'uploaded_file' is the name of the file input field;
            // adjust it to match your form.
            if ($_FILES['uploaded_file']['size'] >= 2000000) { // Limit is set to 2 MB
                $errors[] = $helper->__('You have exceeded the max file size.');
                $error = true;
            }
            if ($error) {
                throw new Exception();
            }
            // Perform save operations here.
        } catch (Exception $e) {
            foreach ($errors as $error) {
                Mage::getSingleton('core/session')->addError($error);
            }
            $this->_redirect('*/*/*');
            return;
        }
    }
}
This checks to see if the file exceeds the limit. If it does, it throws an exception.
Anyway, I'm looking for better/alternative solutions to this same problem. Post them if you've got them!
I can give you the basic idea of how to achieve this. In your saveAction(), call Mage::app()->getWebsite()->getId(). This returns the current website ID, and the admin website ID is always 0. So add an IF condition in saveAction() to check whether the current website ID is 0 or something else.
If the return value is not 0, you can limit the size to 50 MB or whatever you want; if it is 0, you can allow up to 500 MB; a minimal sketch of that check follows below.
Hope this helps.
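A minimal sketch of the check (the form field name and exact limits are illustrative assumptions):
// Admin website ID is 0; front-end websites have other IDs.
$isAdmin = (Mage::app()->getWebsite()->getId() == 0);
$maxSize = $isAdmin
    ? 500 * 1024 * 1024   // admins: up to 500 MB
    : 50 * 1024 * 1024;   // front-end users: up to 50 MB

// 'uploaded_file' is an assumed form field name.
if ($_FILES['uploaded_file']['size'] >= $maxSize) {
    $errors[] = $helper->__('You have exceeded the max file size.');
    $error = true;
}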

Azure storage: Uploaded files with size zero bytes

When I upload an image file to a blob, the image is uploaded apparently successfully (no errors). When I go to Cloud Storage Studio, the file is there, but with a size of 0 (zero) bytes.
The following is the code that I am using:
// These two methods belong to the ContentService class used to upload
// files to storage.
public void SetContent(HttpPostedFileBase file, string filename, bool overwrite)
{
    CloudBlobContainer blobContainer = GetContainer();
    var blob = blobContainer.GetBlobReference(filename);
    if (file != null)
    {
        blob.Properties.ContentType = file.ContentType;
        blob.UploadFromStream(file.InputStream);
    }
    else
    {
        blob.Properties.ContentType = "application/octet-stream";
        blob.UploadByteArray(new byte[1]);
    }
}
public string UploadFile(HttpPostedFileBase file, string uploadPath)
{
    if (file.ContentLength == 0)
    {
        return null;
    }
    string filename;
    int indexBar = file.FileName.LastIndexOf('\\');
    if (indexBar > -1)
    {
        filename = DateTime.UtcNow.Ticks + file.FileName.Substring(indexBar + 1);
    }
    else
    {
        filename = DateTime.UtcNow.Ticks + file.FileName;
    }
    ContentService.Instance.SetContent(file, Helper.CombinePath(uploadPath, filename), true);
    return filename;
}
// The above code is called by this code.
HttpPostedFileBase newFile = Request.Files["newFile"] as HttpPostedFileBase;
ContentService service = new ContentService();
blog.Image = service.UploadFile(newFile, string.Format("{0}{1}", Constants.Paths.BlogImages, blog.RowKey));
Before the image file is uploaded to storage, the InputStream property of the HttpPostedFileBase appears to be fine (the size of the image corresponds to what is expected, and no exceptions are thrown).
And the really strange thing is that this works perfectly in other cases (uploading PowerPoints or even other images from the worker role). The code that calls the SetContent method seems to be exactly the same, and the file seems to be correct, since a new file with zero bytes is created at the correct location.
Does anyone have a suggestion, please? I debugged this code dozens of times and I cannot see the problem. Any suggestions are welcome!
Thanks
The Position property of the InputStream of the HttpPostedFileBase had the same value as the Length property (probably because the stream had already been read earlier in the request - stupid, I think!).
All I had to do was set the Position property back to 0 (zero)! With the position at the end of the stream, UploadFromStream had nothing left to read, which is why the blob was created with zero bytes.
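In terms of the SetContent method above, that means rewinding the stream right before the upload; a minimal sketch:
// Rewind the posted file's stream in case it was read earlier in the
// request; otherwise UploadFromStream starts at the end and writes 0 bytes.
file.InputStream.Seek(0, SeekOrigin.Begin);
blob.UploadFromStream(file.InputStream);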
I hope this helps somebody in the future.
Thanks Fabio for bringing this up and solving your own question. I just want to add code to what you said. Your suggestion worked perfectly for me.
var memoryStream = new MemoryStream();
// "upload" is the object returned by Fine Uploader.
upload.InputStream.CopyTo(memoryStream);
// After copying the contents to the stream, reset its position
// back to the beginning.
memoryStream.Seek(0, SeekOrigin.Begin);
And now you are ready to upload memoryStream using:
blockBlob.UploadFromStream(memoryStream);