How to get the full directory path of BlazorInputFile in Blazor WebAssembly

Team,
I have a Blazor WebAssembly app which uploads a file and processes it later. However, I would like to know the base path of the file on the machine from which it was picked.
My code goes as follows. Does anyone have an idea how to get the full file path, such as "C:\myfile.txt"?
With the file object, I cannot get the full path; I can only access its memory stream.
<h1>FILE UPLOAD</h1>

<InputFile OnChange="HandleSelection"></InputFile>

@code {
    string status;

    async Task HandleSelection(IFileListEntry[] files)
    {
        var file = files.FirstOrDefault();
        if (file != null)
        {
            // Just load into .NET memory to show it can be done
            // Alternatively it could be saved to disk, or parsed in memory, or similar
            var ms = new MemoryStream();
            await file.Data.CopyToAsync(ms);
            Console.WriteLine(ms); // note: this prints the stream's type name, not its contents

            status = $"Finished loading {file.Size} bytes from {file.Name}";

            // "client" is an injected HttpClient (e.g. @inject HttpClient client)
            var content = new MultipartFormDataContent
            {
                { new ByteArrayContent(ms.GetBuffer()), "\"upload\"", file.Name }
            };
            await client.PostAsync("upload", content);
        }
    }
}

Even if you could get the full path (e.g. "C:\myfile.txt"), the file won't load from it.
By default, every browser has a security mechanism that keeps a website from reading arbitrary files on the local disk and from seeing where a selected file came from; the page only ever receives the file's name and contents, never its directory.
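For reference, here is a minimal sketch of what the built-in InputFile component (.NET 5+) does expose; the handler name is an assumption for illustration. Note there is no path property at all, only the name, size, content type, and timestamp:
<InputFile OnChange="OnFileSelected" />

@code {
    private async Task OnFileSelected(InputFileChangeEventArgs e)
    {
        var file = e.File;

        // This is everything the browser reveals about the selected file:
        Console.WriteLine(file.Name);         // file name only, no directory
        Console.WriteLine(file.Size);         // size in bytes
        Console.WriteLine(file.ContentType);  // MIME type
        Console.WriteLine(file.LastModified); // last-modified timestamp

        // Contents are read via a stream; the local path never reaches the app.
        using var stream = file.OpenReadStream(maxAllowedSize: 10 * 1024 * 1024);
        var ms = new MemoryStream();
        await stream.CopyToAsync(ms);
    }
}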

Related

How to access a file server share from an ASP.NET Core web API application published in IIS within the same domain?

I need access to files that are on a file server in my LAN from my Angular app.
I assume that I need to publish my Angular app in the same network, that is, on my IIS server inside the same LAN.
Now, on my local machine, I try to access my shared folder \\192.168.100.7\OfertasHistorico but I don't know how to do it.
When I try this
[HttpGet("directorio")]
public async Task<ActionResult<string[]>> GetDirectoryContents()
{
    string[] files = Directory.GetFiles(@"\\192.168.100.7\ofertashistorico");
    return files;
}
I get this error
System.IO.DirectoryNotFoundException: Could not find a part of the path '/Users/kintela/Repos/Intranet-WebAPI/Intranet.API/\192.168.100.7\ofertashistorico'
It seems that the path given to the GetFiles method is resolved relative to the directory where the project is located, and I don't know how to indicate a different one.
I also do not know how to handle the credentials necessary to access that resource.
Any idea, please?
Thanks
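For reference, before the FTP workaround below: the immediate error comes from the missing leading \\ in the UNC path; without it the string is treated as relative to the current directory, exactly as the exception shows. A minimal sketch, assuming the API runs on Windows under an identity that has been granted access to the share (mirroring the question's endpoint):
[HttpGet("directorio")]
public ActionResult<string[]> GetDirectoryContents()
{
    // A UNC path must start with a double backslash; as a C# verbatim
    // string that is written @"\\server\share". Without the leading \\
    // the path is resolved relative to the current working directory.
    string[] files = Directory.GetFiles(@"\\192.168.100.7\ofertashistorico");
    return files;
}
Note also that the '/Users/kintela/...' prefix in the exception suggests the API was running on macOS, where Windows UNC paths are not supported at all; there the share would have to be mounted and accessed through its mount point.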
I am using the code below and it works for me. Please check it.
Steps:
Navigate to the path, e.g. \\192.168.2.50\ftp.
Delete \ftp so that the address in File Explorer is \\192.168.2.50, find the folder you want, right-click it and choose "Map network drive".
You can try it with the address ftp://192.168.2.50; a window will pop up. Enter your username and password, then you can check the files.
Sample code:
[HttpGet("directorio")]
public IActionResult GetDirectoryContents()
{
    // Requires: using System.Net; using System.IO;
    string networkPath = @"ftp://192.168.2.50";
    string userName = "Administrator";
    string password = "Yy16";

    #region FtpWebRequest
    var networkCredential = new NetworkCredential(userName, password);
    var uri = new Uri(networkPath);
    var request = (FtpWebRequest)WebRequest.Create(uri);
    request.Credentials = networkCredential;
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    try
    {
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var stream = response.GetResponseStream())
        using (var reader = new StreamReader(stream))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
    catch (WebException ex)
    {
        Console.WriteLine("Access to the path '" + networkPath + "' is denied. Error message: " + ex.Message);
    }
    #endregion

    return Ok();
}

Design Minimal API and use HttpClient to post a file to it

I have a legacy system interfacing issue that my team has elected to solve by standing up a .NET 7 Minimal API which needs to accept a file upload. It should work for small and large files (let's say at least 500 MiB). The API will be called from a legacy system using HttpClient in a .NET Framework 4.7.1 app.
I can't quite seem to figure out how to design the signature of the Minimal API and how to call it with HttpClient in a way that totally works. It's something I've been hacking at on and off for several days, and haven't documented all of my approaches, but suffice it to say there have been varying results involving, among other things:
4XX and 500 errors returned by the HTTP call
An assortment of exceptions on either side
Calls that throw and never hit a breakpoint on the API side
Calls that get through but the Stream on the API end is not what I expect
Errors being different depending on whether the file being uploaded is small or large
Text files being persisted on the server that contain some of the HTTP headers in addition to their original contents
On the Minimal API side, I've tried all sorts of things in the signature (IFormFile, Stream, PipeReader, HttpRequest). On the calling side, I've tried several approaches (messing with headers, using the Flurl library, various content encodings and MIME types, multipart, etc).
This seems like it should be dead simple, so I'm trying to wipe the slate clean here, start with an example of something that partially works, and hope someone might be able to illuminate the path forward for me.
Example of Minimal API:
// IDocumentStorageManager is an injected dependency that takes an int and a Stream and returns a string of the newly uploaded file's URI
app.MapPost(
    "DocumentStorage/CreateDocument2/{documentId:int}",
    async (PipeReader pipeReader, int documentId, IDocumentStorageManager documentStorageManager) =>
    {
        using var ms = new MemoryStream();
        await pipeReader.CopyToAsync(ms);
        ms.Position = 0;
        return await documentStorageManager.CreateDocument(documentId, ms);
    });
Call the Minimal API using HttpClient:
// filePath is the path on local disk, uri is the Minimal API's URI
private static async Task<string> UploadWithHttpClient2(string filePath, string uri)
{
    var fileStream = File.Open(filePath, FileMode.Open);
    var content = new StreamContent(fileStream);
    var httpRequestMessage = new HttpRequestMessage(HttpMethod.Post, uri);
    var httpClient = new HttpClient();
    httpRequestMessage.Content = content;
    httpClient.Timeout = TimeSpan.FromMinutes(5);
    var result = await httpClient.SendAsync(httpRequestMessage);
    return await result.Content.ReadAsStringAsync();
}
In the particular example above, a small (6 bytes) .txt file is uploaded without issue. However, a large (619 MiB) .tif file runs into problems on the call to httpClient.SendAsync which results in the following set of nested Exceptions:
System.Net.Http.HttpRequestException - "Error while copying content to a stream."
System.IO.IOException - "Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.."
System.Net.Sockets.SocketException - "An existing connection was forcibly closed by the remote host."
What's a decent way of writing a Minimal API and calling it with HttpClient that will work for small and large files?
Kestrel allows request bodies of about 30 MB (30,000,000 bytes) by default.
To upload larger files via Kestrel you may need to increase the maximum size limit. This can be done by adding the RequestSizeLimit attribute. So, for example, for 1 GB:
app.MapPost(
    "DocumentStorage/CreateDocument2/{documentId:int}",
    [RequestSizeLimit(1_000_000_000)] async (PipeReader pipeReader, int documentId) =>
    {
        using var ms = new MemoryStream();
        await pipeReader.CopyToAsync(ms);
        ms.Position = 0;
        return "";
    });
You can also remove the size limit globally by setting
builder.WebHost.UseKestrel(o => o.Limits.MaxRequestBodySize = null);
This answer is good, but the RequestSizeLimit attribute doesn't work for minimal APIs; it's an MVC filter. You can use the IHttpMaxRequestBodySizeFeature to raise the limit (assuming you're not running on IIS). Also, I made a change to accept the body as a Stream, which avoids the MemoryStream copy before calling the CreateDocument API:
app.MapPost(
    "DocumentStorage/CreateDocument2/{documentId:int}",
    async (Stream stream, int documentId, IDocumentStorageManager documentStorageManager) =>
    {
        return await documentStorageManager.CreateDocument(documentId, stream);
    })
    .AddEndpointFilter((context, next) =>
    {
        // Requires: using Microsoft.AspNetCore.Http.Features;
        const int MaxBytes = 1024 * 1024 * 1024;
        var maxRequestBodySizeFeature = context.HttpContext.Features.Get<IHttpMaxRequestBodySizeFeature>();
        // The limit can only be changed while the feature is still writable.
        if (maxRequestBodySizeFeature is not null and { IsReadOnly: false })
        {
            maxRequestBodySizeFeature.MaxRequestBodySize = MaxBytes;
        }
        return next(context);
    });
If you're running on IIS, see https://learn.microsoft.com/en-us/iis/configuration/system.webserver/security/requestfiltering/requestlimits/#configuration
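For completeness, a sketch of the IIS request-filtering setting that the linked page describes (web.config; the 1 GiB value is an assumption to match the example above):
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes; IIS defaults to 30,000,000 (~28.6 MB) -->
        <requestLimits maxAllowedContentLength="1073741824" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>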

Uploading file from Blazor WebAssembly App directly to Blob storage

I've been trying to develop a Blazor WebAssembly app (I'm trying with both .NET Standard 2.1 and .NET 5.0) and my goal is to allow the user to select a file using InputFile and for that file to be uploaded to an Azure Blob Storage container. I've been looking around a lot and following different guides here and there, but with no success. The issues I ran into were usually related to security, for example CORS (although it had been fully set up), authorization failures, and System.PlatformNotSupportedException: System.Security.Cryptography.Algorithms is not supported on this platform.
Regardless of whether it is good practice or not: is it possible to upload directly to blob storage from a Blazor app? One method I tried was via a SAS token. It works from a console app but not from a Blazor app.
<label for="browseData"><b>Browse File</b></label>
<p><InputFile id="browseData" OnChange="@OnInputFileChange" /></p>
private async Task OnInputFileChange(InputFileChangeEventArgs e)
{
    var maxAllowedFiles = 1;
    var inputFile = e.GetMultipleFiles(maxAllowedFiles).First();
    var stream = inputFile.OpenReadStream();
    await StorageService.UploadFileToStorage(stream, "sftp-server", inputFile.Name);
}
Storage Service
public class AzureStorageService
{
    private readonly IAzureStorageKeyService _azureStorageKeyService;

    public AzureStorageService(IAzureStorageKeyService azureStorageKeyService)
    {
        _azureStorageKeyService = azureStorageKeyService;
    }

    public async Task<Uri> UploadFileToStorage(Stream stream, string container, string fileName)
    {
        try
        {
            const string REPLACE_THIS_ACCOUNT = "test";
            var blobUri = new Uri("https://"
                + REPLACE_THIS_ACCOUNT +
                ".blob.core.windows.net/" +
                container + "/" + fileName);
            // Create the blob client.
            AzureSasCredential azureSasCredential = new AzureSasCredential(
                "?sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-01-20T04:21:45Z&st=2021-01-19T20:21:45Z&spr=https&sig=OIkLePYDcF2AChtYUKs0VxUajs4KmwSyOXpQkFLvN2M%3D");
            var blobClient = new BlobClient(blobUri, azureSasCredential);
            // Upload the file
            var response = await blobClient.UploadAsync(stream, true);
            return blobUri;
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
            return null;
        }
    }
}
Like I was mentioning, this will work from a console app but not from a Blazor app, due to CORS. Is this a security feature that just cannot be bypassed and has to be done server-side, through a function -> blob?
If you want to upload files directly to an Azure blob from a Blazor WebAssembly application, we need to configure CORS for the Azure storage account. Regarding how to configure it, please refer to here.
For example
Configure CORS
Allowed origins: *
Allowed verbs: DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT
Allowed headers: *
Exposed headers: *
Maximum age (seconds): 3600
Create Account SAS.
My upload file component:
@page "/fileupload"
@using System.ComponentModel.DataAnnotations
@using System.IO
@using System.Linq
@using System.Threading
@using Azure.Storage.Blobs
@using Azure.Storage.Blobs.Models
@using Azure.Storage
@using Azure

<h3>File Upload Component</h3>
<label for="browseData"><b>Browse File</b></label>
<p><InputFile id="browseData" OnChange="@OnInputFileChange" /></p>
@{
    var progressCss = "progress " + (displayProgress ? "" : "d-none");
}
<div class="@progressCss">
    <div class="progress-bar" role="progressbar" style="@($"width: {progressBar}%")" aria-valuenow="@progressBar" aria-valuemin="0" aria-valuemax="100"></div>
</div>

@code {
    private bool displayProgress = false;
    private string result = string.Empty;
    private string progressBar;
    private int maxAllowedSize = 10 * 1024 * 1024;

    private async Task OnInputFileChange(InputFileChangeEventArgs e)
    {
        var maxAllowedFiles = 1;
        var inputFile = e.GetMultipleFiles(maxAllowedFiles).First();
        var blobUri = new Uri("https://"
            + "23storage23" +
            ".blob.core.windows.net/" +
            "upload" + "/" + inputFile.Name);
        AzureSasCredential credential = new AzureSasCredential("");
        BlobClient blobClient = new BlobClient(blobUri, credential, new BlobClientOptions());
        displayProgress = true;
        var res = await blobClient.UploadAsync(inputFile.OpenReadStream(maxAllowedSize), new BlobUploadOptions
        {
            HttpHeaders = new BlobHttpHeaders { ContentType = inputFile.ContentType },
            TransferOptions = new StorageTransferOptions
            {
                InitialTransferSize = 1024 * 1024,
                MaximumConcurrency = 10
            },
            ProgressHandler = new Progress<long>((progress) =>
            {
                progressBar = (100.0 * progress / inputFile.Size).ToString("0");
            })
        });
    }
}
Besides, you also can download my sample to do a test.
I was able to upload a file with your code; it worked like a charm, so there is no error in your source code. I used the Microsoft.NET.Sdk.BlazorWebAssembly SDK with net5.0 as the target framework. Probably it's a configuration issue with the storage account.
The CORS policy must allow the host of the application (https://localhost:5001) to access the resources with any verb and any header (*) values. The response can include the content-length header. Is your WASM self-hosted or hosted within an ASP.NET Core application? You can easily spot which origin is sent to Azure by inspecting the request.
Based on your context, only a few verbs like PUT might be enough.
The configuration of the SAS key can also be more specific.
Using SAS from Blazor WASM
Even if not entirely part of your question, it is worth mentioning: if you want to allow a direct upload into your blob storage - and there are good reasons for it - then before uploading a file, your WASM app should send a request to your API, and the response should contain a SAS key that can only be used for this one operation. The SAS key itself should be a user delegation SAS key, as sketched below.
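A minimal sketch of such an endpoint, assuming the Azure.Storage.Blobs and Azure.Identity packages; the route, account name, and container name are placeholders, not from the original posts:
// Requires: using Azure.Identity; using Azure.Storage.Blobs; using Azure.Storage.Sas;
app.MapGet("sas/{blobName}", async (string blobName) =>
{
    var serviceClient = new BlobServiceClient(
        new Uri("https://REPLACE_THIS_ACCOUNT.blob.core.windows.net"),
        new DefaultAzureCredential());

    // The delegation key is obtained with Azure AD credentials rather than the account key.
    var keyResponse = await serviceClient.GetUserDelegationKeyAsync(
        DateTimeOffset.UtcNow, DateTimeOffset.UtcNow.AddMinutes(10));

    var sasBuilder = new BlobSasBuilder
    {
        BlobContainerName = "upload",
        BlobName = blobName,
        Resource = "b", // "b" = a single blob
        ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(10)
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Create | BlobSasPermissions.Write);

    var sasToken = sasBuilder.ToSasQueryParameters(
        keyResponse.Value, serviceClient.AccountName).ToString();

    return Results.Ok(sasToken);
});
The client then appends the returned token to the blob URI, so the key is scoped to one blob, one operation, and a short lifetime.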

Reading static files in ASP.NET Blazor

I have a client-side Blazor application. I want to have an appsettings.json file for my client-side configuration, like we have an environment.ts in Angular.
For that, I am keeping a ConfigFiles folder under wwwroot with a JSON file inside it. I am trying to read this file as below.
First get the path:
public static class ConfigFiles
{
    public static string GetPath(string fileName)
    {
        return Path.Combine("ConfigFiles", fileName);
    }
}
Then read it:
public string GetBaseUrl()
{
    string T = string.Empty;
    try
    {
        T = File.ReadAllText(ConfigFiles.GetPath("appsettings.json"));
    }
    catch (Exception ex)
    {
        T = ex.Message;
    }
    return T;
}
But I always get the error:
Could not find a part of the path "/ConfigFiles/appsettings.json".
Inside the GetPath() method, I also tried:
return Path.Combine("wwwroot/ConfigFiles", fileName);
But I still get the same error:
Could not find a part of the path "wwwroot/ConfigFiles/appsettings.json".
Since there is no concept of IHostingEnvironment in client-side Blazor, what is the correct way to read a static JSON file here?
I have a client side Blazor Application
OK, that means that File.ReadAllText(...) and Path.Combine(...) are not going to work at all.
Client-side means that you could be running on Android or macOS or whatever.
The Blazor team provides you with a complete sample of how to read a file, in the form of the FetchData sample page.
forecasts = await Http.GetJsonAsync<WeatherForecast[]>("sample-data/weather.json");
this gets you the contents of a file in wwwroot/sample-data
You can use Http.GetStringAsync(...) if you want the ReadAllText equivalent
If you want to have per-user settings then look into the Blazored.LocalStorage package.
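Applied to the question's layout, a minimal sketch (the AppSettings class and its BaseUrl property are assumptions about the JSON's shape; on current Blazor versions the call is GetFromJsonAsync from System.Net.Http.Json rather than the preview-era GetJsonAsync):
@inject HttpClient Http
@using System.Net.Http.Json

@code {
    // Hypothetical shape of wwwroot/ConfigFiles/appsettings.json
    public class AppSettings
    {
        public string BaseUrl { get; set; }
    }

    private AppSettings settings;

    protected override async Task OnInitializedAsync()
    {
        // Files under wwwroot are served over HTTP, so fetch them
        // instead of using File.ReadAllText
        settings = await Http.GetFromJsonAsync<AppSettings>("ConfigFiles/appsettings.json");
    }
}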

Amazon S3 Response DotNetZip MVC

My application (MVC) needs to download, zip and return one or many files from Amazon S3. I am using the .NET SDK and GetObject to receive the files, and I want to use DotNetZip to zip them up and return the generated zip file as a file stream result for the user to download.
Can anyone suggest the most efficient way of doing this? I am seeing OutOfMemory exceptions when downloading large files from S3; they could be up to 1 GB in size, for example.
My code so far:
using (
    var client = AWSClientFactory.CreateAmazonS3Client(
        "apikey",
        "apisecret",
        new AmazonS3Config { RegionEndpoint = RegionEndpoint.EUWest1 })
)
{
    foreach (var file in files)
    {
        var request = new GetObjectRequest { BucketName = "bucketname", Key = file };
        using (var response = client.GetObject(request))
        {
            // response.ResponseStream would be copied into the zip here
        }
    }
}
If I copy the response into a memory stream and add that to the zip, all works OK (on small files), but with large files I assume I cannot hold the entire thing in memory?
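One hedged approach: DotNetZip can write entries lazily through AddEntry with a WriteDelegate, so each S3 object is streamed into the zip only while the archive is being saved to the response, and nothing is buffered whole. A sketch reusing the placeholder names from the question (the action shape is an assumption, not the original code):
// Requires: using Amazon; using Amazon.S3; using Amazon.S3.Model;
// using Ionic.Zip; using System.Web.Mvc;
public ActionResult DownloadZip(IEnumerable<string> files)
{
    Response.ContentType = "application/zip";
    Response.AddHeader("Content-Disposition", "attachment; filename=files.zip");

    using (var client = AWSClientFactory.CreateAmazonS3Client(
        "apikey", "apisecret",
        new AmazonS3Config { RegionEndpoint = RegionEndpoint.EUWest1 }))
    using (var zip = new ZipFile()) // Ionic.Zip (DotNetZip)
    {
        foreach (var file in files)
        {
            // The delegate runs during zip.Save, so each S3 object is pulled
            // and compressed on the fly instead of being held in memory.
            zip.AddEntry(file, (entryName, entryStream) =>
            {
                var request = new GetObjectRequest { BucketName = "bucketname", Key = entryName };
                using (var response = client.GetObject(request))
                {
                    response.ResponseStream.CopyTo(entryStream);
                }
            });
        }
        zip.Save(Response.OutputStream);
    }
    return new EmptyResult();
}
Whether the response actually streams also depends on output buffering; setting Response.BufferOutput = false should keep MVC from accumulating the whole zip before sending it.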