I have a scenario where I upload database backups to Azure Blob storage via a Windows service.
It works for .bak files in the 300-500 MB range, but when the size reaches 700 MB to 1 GB or more, the upload runs for over an hour and then throws an exception.
Please check the code below, let me know what I am doing wrong, and tell me what an efficient method is for uploading large files to blob storage. I have tried these two methods.
public static void UploadFile(AzureOperationHelper azureOperationHelper)
{
    CloudBlobContainer blobContainer = CreateCloudBlobContainer(tenantId, applicationId,
        clientSecret, azureOperationHelper.storageAccountName, azureOperationHelper.containerName,
        azureOperationHelper.storageEndPoint);
    blobContainer.CreateIfNotExists();
    var writeOptions = new BlobRequestOptions()
    {
        SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024, // 64 MB maximum; 32 MB by default
        ParallelOperationThreadCount = 12,
    };
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(azureOperationHelper.blobName);
    //blob.UploadFromFile(azureOperationHelper.srcPath);
    blob.UploadFromFile(azureOperationHelper.srcPath, options: writeOptions);
}
public static void UploadFileStream(AzureOperationHelper azureOperationHelper)
{
    CloudBlobContainer blobContainer = CreateCloudBlobContainer(tenantId, applicationId,
        clientSecret, azureOperationHelper.storageAccountName, azureOperationHelper.containerName,
        azureOperationHelper.storageEndPoint);
    blobContainer.CreateIfNotExists();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(azureOperationHelper.blobName);
    //byte[] contents = File.ReadAllBytes(azureOperationHelper.srcPath);
    //var writeOptions = new BlobRequestOptions()
    //{
    //    SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024, // 64 MB maximum; 32 MB by default
    //    ParallelOperationThreadCount = 12,
    //};
    //blob.UploadFromByteArray(contents, 0, contents.Length, AccessCondition.GenerateIfNotExistsCondition(), options: writeOptions);
    blob.StreamWriteSizeInBytes = 100 * 1024 * 1024; // 100 MB block size
    blob.UploadFromFile(azureOperationHelper.srcPath);
    //using (var fs = new FileStream(azureOperationHelper.srcPath, FileMode.Open))
    //{
    //    blob.UploadFromStream(fs);
    //}
}
Below are the exceptions I got.
Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden. at Microsoft.WindowsAzure.Storage.Shared.Protocol.HttpResponseParsers.ProcessExpectedStatusCodeNoException[T](HttpStatusCode expectedStatusCode, HttpStatusCode actualStatusCode, T retVal, StorageCommandBase`1 cmd, Exception ex)
Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within specified timeout. ---> System.TimeoutException: The client could not finish the operation within specified timeout.
Please try the code below; it works well on my side (a file of about 2 GB takes about 10 minutes to upload):
public string UploadFile(string sourceFilePath)
{
    try
    {
        string storageAccountConnectionString = "AZURE_CONNECTION_STRING";
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageAccountConnectionString);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("container-name");
        container.CreateIfNotExists();
        CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(sourceFilePath));

        // The committed block list determines the order of the blob's content,
        // so collect the block IDs in an ordered list, not a set.
        List<string> blocklist = new List<string>();
        byte[] fileContent = File.ReadAllBytes(sourceFilePath);
        const int blockSizeInBytes = 10485760; // 10 MB per block
        long prevLastByte = 0;
        long bytesRemain = fileContent.Length;
        do
        {
            long bytesToCopy = Math.Min(bytesRemain, blockSizeInBytes);
            byte[] bytesToSend = new byte[bytesToCopy];
            Array.Copy(fileContent, prevLastByte, bytesToSend, 0, bytesToCopy);
            prevLastByte += bytesToCopy;
            bytesRemain -= bytesToCopy;

            // Create a Base64-encoded block ID; all IDs within one blob must have the same length,
            // which holds here because every GUID string is 36 characters long.
            string blockId = Guid.NewGuid().ToString();
            string base64BlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId));

            blob.PutBlock(base64BlockId, new MemoryStream(bytesToSend, true), null);
            blocklist.Add(base64BlockId);
        } while (bytesRemain > 0);

        // Commit the block list to finish the blob.
        blob.PutBlockList(blocklist);
        return "Success";
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
}
It works well for uploading large files (someone gave this solution here).
Please let me know if it works for you.
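One caveat with the code above: File.ReadAllBytes loads the whole backup into memory before the first block is sent. If that matters for multi-GB files, the same block upload can be driven from a FileStream instead. Below is a minimal sketch against the same WindowsAzure.Storage SDK; the method name, the 4 MB block size, and the zero-padded block numbering are my own choices, not from the original answer:

public static void UploadFileInBlocks(CloudBlockBlob blob, string sourceFilePath)
{
    const int blockSizeInBytes = 4 * 1024 * 1024; // 4 MB per block
    var blockIds = new List<string>();            // PutBlockList commits blocks in this order
    var buffer = new byte[blockSizeInBytes];
    int blockNumber = 0;

    using (var fs = File.OpenRead(sourceFilePath))
    {
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Block IDs must be Base64 strings of equal length within one blob,
            // hence the zero-padded counter.
            string blockId = Convert.ToBase64String(
                Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
            using (var ms = new MemoryStream(buffer, 0, bytesRead))
            {
                blob.PutBlock(blockId, ms, null);
            }
            blockIds.Add(blockId);
            blockNumber++;
        }
    }
    blob.PutBlockList(blockIds);
}

This way only one block is ever held in memory, and numbering the blocks keeps the committed order obvious.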
Related
I am new to .NET and I have to perform this task. Assuming we have the connection string and the environment variable set up, could someone give me resources, code, or guidance on how to do it?
I just need to upload a PDF file to Azure Blob Storage using a Minimal API.
From the Minimal API documentation, we can see that Minimal APIs do not support binding IFormFile:
No support for binding from forms. This includes binding IFormFile. We plan to add support for IFormFile in the future.
So, to upload a file in a Minimal API, you can get the uploaded file from the HttpRequest form. Refer to the following code:
app.MapPost("/upload", (HttpRequest request) =>
{
    if (!request.Form.Files.Any())
        return Results.BadRequest("At least one file is needed");

    // Do something with the file
    foreach (var item in request.Form.Files)
    {
        var file = item;
        // Insert the file into Azure storage
    }
    return Results.Ok();
});
Then, to upload the file to Azure Blob Storage, refer to the following tutorial:
Upload images/files to blob azure, via web api ASP.NET framework Web application
The code looks like this:
CloudStorageAccount storageAccount;
Dictionary<string, object> dict = new Dictionary<string, object>();
string strorageconn = ConfigurationManager.AppSettings.Get("MyBlobStorageConnectionString");
if (CloudStorageAccount.TryParse(strorageconn, out storageAccount))
{
    try
    {
        // Create the CloudBlobClient that represents the
        // Blob storage endpoint for the storage account.
        CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();

        // Get a reference to the 'images' container and create it if it does not exist yet.
        CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("images");
        await cloudBlobContainer.CreateIfNotExistsAsync();

        // Set the permissions so the blobs are public.
        BlobContainerPermissions permissions = new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        };
        await cloudBlobContainer.SetPermissionsAsync(permissions);

        var httpRequest = HttpContext.Current.Request;
        DateTime serverTime = DateTime.Now; // used to build a unique blob name
        foreach (string file in httpRequest.Files)
        {
            var postedFile = httpRequest.Files[file];
            string imageName = "images" + serverTime.Year + serverTime.Month + serverTime.Day +
                serverTime.Hour + serverTime.Minute + serverTime.Second + serverTime.Millisecond +
                postedFile.FileName;
            if (postedFile != null && postedFile.ContentLength > 0)
            {
                int maxContentLength = 1024 * 1024 * 1; // size limit = 1 MB
                IList<string> allowedFileExtensions = new List<string> { ".jpg", ".gif", ".png" };
                var ext = postedFile.FileName.Substring(postedFile.FileName.LastIndexOf('.'));
                var extension = ext.ToLower();
                if (!allowedFileExtensions.Contains(extension))
                {
                    var message = "Please upload an image of type .jpg, .gif or .png.";
                    dict.Add("error", message);
                    return Request.CreateResponse(HttpStatusCode.BadRequest, dict);
                }
                else if (postedFile.ContentLength > maxContentLength)
                {
                    var message = "Please upload a file of up to 1 MB.";
                    dict.Add("error", message);
                    return Request.CreateResponse(HttpStatusCode.BadRequest, dict);
                }
                else
                {
                    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(imageName);
                    cloudBlockBlob.Properties.ContentType = postedFile.ContentType;
                    await cloudBlockBlob.UploadFromStreamAsync(postedFile.InputStream);
                }
            }
            var message1 = "Image uploaded successfully.";
            return Request.CreateResponse(HttpStatusCode.Created, message1);
        }
        var res3 = "Please upload an image.";
        dict.Add("error", res3);
        return Request.CreateResponse(HttpStatusCode.NotFound, dict);
    }
    catch (Exception ex)
    {
        return Request.CreateResponse(HttpStatusCode.BadRequest, ex.ToString());
    }
}
else
{
    var res = "Did not connect successfully.";
    dict.Add("error", res);
    return Request.CreateResponse(HttpStatusCode.NotFound, dict);
}
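Note that the code above targets ASP.NET Framework Web API (HttpContext.Current, Request.CreateResponse), so it cannot be pasted into a Minimal API handler as-is. For completeness, here is a rough sketch of the same upload inside the MapPost endpoint using the newer Azure.Storage.Blobs package (it needs a using Azure.Storage.Blobs; directive, and the connection string and container name are placeholders):

app.MapPost("/upload", async (HttpRequest request) =>
{
    if (!request.Form.Files.Any())
        return Results.BadRequest("At least one file is needed");

    // Placeholder connection string and container name.
    var container = new BlobContainerClient("AZURE_CONNECTION_STRING", "container-name");
    await container.CreateIfNotExistsAsync();

    foreach (var file in request.Form.Files)
    {
        var blob = container.GetBlobClient(file.FileName);
        using var stream = file.OpenReadStream();
        // Streams the upload; no intermediate byte[] is built.
        await blob.UploadAsync(stream, overwrite: true);
    }
    return Results.Ok();
});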
How can I upload large files to MinIO (AWS S3-compatible API) via a gRPC service without buffering the data?
I have a gRPC service with the following definition:
service MediaService {
    rpc UploadMedia(stream UploadMediaRequest) returns (UploadMediaResponse);
}

message UploadMediaRequest {
    oneof Data {
        UploadMediaMetadata metadata = 1;
        UploadMediaStream fileStream = 2;
    }
}

message UploadMediaMetadata {
    string bucket = 1;
    string virtialDirectory = 2;
    string fileName = 3;
    string contentType = 4;
    map<string, string> attributes = 6;
}

message UploadMediaStream {
    bytes bytes = 1;
}
And the implementation of UploadMedia:
public override async Task<UploadMediaResponse> UploadMedia(
    IAsyncStreamReader<UploadMediaRequest> requestStream,
    ServerCallContext context)
{
    UploadMediaMetadata? metadata = null;
    var token = context.CancellationToken;
    var traceId = context.GetHttpContext().TraceIdentifier;
    await using var memoryStream = new MemoryStream();

    await foreach (var req in requestStream.ReadAllAsync(token))
    {
        if (req.DataCase == UploadMediaRequest.DataOneofCase.Metadata)
        {
            metadata = req.Metadata;
            _logger.LogTrace("[Req: {TraceId}] Received metadata", traceId);
        }
        else
        {
            await memoryStream.WriteAsync(req.FileStream.Bytes.Memory, token);
            _logger.LogTrace("[Req: {TraceId}] Received chunk of bytes", traceId);
        }
    }

    if (metadata == null)
    {
        throw new RpcException(new Status(StatusCode.InvalidArgument, "Metadata not found."));
    }

    memoryStream.Seek(0L, SeekOrigin.Begin);
    var uploadModel = _mapper.Map<UploadModel>(metadata);
    uploadModel.FileStream = memoryStream;
    var file = await _fileService.UploadFile(uploadModel, token);
    await _eventsService.Notify(new MediaUploadedEvent(file.PublicId), token);
    _logger.LogTrace("[Req: {TraceId}] File uploaded", traceId);
    return new UploadMediaResponse { File = _mapper.Map<RpcFileModel>(file) };
}
In this method I read the request stream and write the data to a MemoryStream. After that, I upload the file to storage:
var putObjectArgs = new PutObjectArgs()
    .WithStreamData(fileStream)
    .WithObjectSize(fileStream.Length)
    .WithObject(virtualPath)
    .WithBucket(bucket)
    .WithContentType(contentType)
    .WithHeaders(attributes);
return _storage.PutObjectAsync(putObjectArgs, token);
I want to upload files without buffering the data in memory.
I could write the bytes from the stream to disk and then open a FileStream over that file, but I would rather not add one more dependency.
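One way to avoid both the MemoryStream and a temp file is to bridge the gRPC stream to the storage client with System.IO.Pipelines: start the upload reading from a pipe while the handler keeps writing incoming chunks into it. Below is a rough sketch, under two assumptions I am making: the metadata message arrives first, and the storage call can work without knowing the length up front (MinIO's PutObjectArgs wants the object size, so you would either carry it in the metadata, e.g. a hypothetical fileSize field, or check whether your SDK version accepts an unknown size):

// Sketch only: requires System.IO.Pipelines; logging and error handling omitted.
public override async Task<UploadMediaResponse> UploadMedia(
    IAsyncStreamReader<UploadMediaRequest> requestStream,
    ServerCallContext context)
{
    var token = context.CancellationToken;

    // The first message is expected to carry the metadata.
    if (!await requestStream.MoveNext(token) ||
        requestStream.Current.DataCase != UploadMediaRequest.DataOneofCase.Metadata)
    {
        throw new RpcException(new Status(StatusCode.InvalidArgument, "Metadata not found."));
    }
    var metadata = requestStream.Current.Metadata;

    // Start the upload reading from the pipe while we keep filling it.
    var pipe = new Pipe();
    var uploadModel = _mapper.Map<UploadModel>(metadata);
    uploadModel.FileStream = pipe.Reader.AsStream();
    var uploadTask = _fileService.UploadFile(uploadModel, token);

    // Pump the remaining chunks straight into the pipe.
    while (await requestStream.MoveNext(token))
    {
        await pipe.Writer.WriteAsync(requestStream.Current.FileStream.Bytes.Memory, token);
    }
    await pipe.Writer.CompleteAsync();

    var file = await uploadTask;
    await _eventsService.Notify(new MediaUploadedEvent(file.PublicId), token);
    return new UploadMediaResponse { File = _mapper.Map<RpcFileModel>(file) };
}

A nice property of the pipe is built-in backpressure: WriteAsync pauses once the pipe's buffer threshold is reached, so the handler never reads chunks much faster than the storage client consumes them.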
If I insert a small image, for example 1-2 KB, it succeeds, but if I select an image of 70 KB or any other larger one, I always get an error.
I have tried both a stored procedure and passing the array of bytes directly: the same error.
Exception System.Data.SqlClient.SqlException: "A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)"
Everything is fine with SQL Server itself; images load successfully through WinForms with no problems.
Button:
<RadzenUpload Accept="image/*" ChooseText="Select..." Url=@($"api/upload/single/{clientid}/1") Progress=@((args) => OnProgress(args, "Loading ...")) />
Controller and method:
[HttpPost("single/{id}/{photoid}")]
public async Task<IActionResult> Single(IFormFile file, string id, string photoid)
{
    try
    {
        string fileName = Path.GetFileName(file.FileName);
        string rootpath = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot");
        string path = Path.Combine(rootpath, "Uploads");

        // Copy the upload into a byte array; a single Stream.Read call is not
        // guaranteed to fill the buffer, so copy through a MemoryStream instead.
        byte[] bytes;
        using (var memoryStream = new MemoryStream())
        {
            await file.CopyToAsync(memoryStream);
            bytes = memoryStream.ToArray();
        }

        // Also keep a copy on disk.
        using (FileStream stream = new FileStream(Path.Combine(path, fileName), FileMode.Create))
        {
            await file.CopyToAsync(stream);
        }

        await Save(bytes, id);
        return StatusCode(200);
    }
    catch (Exception ex)
    {
        return StatusCode(500, ex.Message);
    }
}
private DynamicParameters SetParameter(byte[] oPhoto, string clientid)
{
    DynamicParameters parameters = new DynamicParameters();
    // Size -1 maps to varbinary(max), so large images are not truncated.
    parameters.Add("@photo", oPhoto, DbType.Binary, size: -1);
    parameters.Add("@personid", clientid);
    return parameters;
}

public async Task<int> Save(byte[] photo, string clientid)
{
    using (var connection = new SqlConnection(Startup.ConnectSQL))
    {
        await connection.OpenAsync();
        // Pass clientid as a parameter instead of concatenating it into the SQL.
        var sqlStatement = @"UPDATE [dbo].[photo] SET [image1] = @photo WHERE [personid] = @personid";
        int res = await connection.ExecuteAsync(sqlStatement, SetParameter(photo, clientid));
        return res;
    }
}
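As a side note, if the goal is to get the image into varbinary(max) without ever holding it in a byte[], SqlClient (since .NET Framework 4.5) can stream a Stream value directly when the parameter size is -1. A minimal sketch with plain SqlCommand; the method name is mine, and the table and column names are taken from the update statement above:

public async Task<int> SavePhotoStream(Stream photo, string clientid)
{
    using (var connection = new SqlConnection(Startup.ConnectSQL))
    {
        await connection.OpenAsync();
        using (var command = new SqlCommand(
            "UPDATE [dbo].[photo] SET [image1] = @photo WHERE [personid] = @personid", connection))
        {
            // Size -1 = varbinary(max); SqlClient streams the value instead of buffering it.
            command.Parameters.Add("@photo", SqlDbType.VarBinary, -1).Value = photo;
            command.Parameters.AddWithValue("@personid", clientid);
            return await command.ExecuteNonQueryAsync();
        }
    }
}

In the controller you could then pass file.OpenReadStream() straight through instead of building the byte array first.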
I am uploading a gzip file into an S3 bucket using a Java application; the data will be used in Athena. The gzip file is getting corrupted while uploading.
Because of this, Athena is unable to read the data from the gzip file, and when the file is downloaded and unzipped manually, it says 'it is not a gzip file'.
private void getAndProcessFilesGenReports(String parUrl, String custCode, long size, String queryDate) {
    try (CloseableHttpClient httpclient = HttpClientBuilder.create().setDefaultCredentialsProvider(getCredentialsProvider()).build()) {
        CloseableHttpResponse response;
        HttpGet httpget = new HttpGet(BASE_URI.concat(parUrl));
        httpget.setConfig(config);
        response = httpclient.execute(httpget);
        log.info("getAndProcessFilesGenReports response status {} {}",
                response.getStatusLine().getStatusCode(), response.getStatusLine().getReasonPhrase());
        if (response.getStatusLine().getStatusCode() != 200) {
            log.error("getAndProcessFilesGenReports partUrl could not get response for custCode---> {}", custCode);
        }
        if (response.getStatusLine().getStatusCode() == 200) {
            GZIPInputStream gzis = new GZIPInputStream(response.getEntity().getContent());
            String bucketName = bucketForDetailedBilling(GEN_REPORT_TYPE, custCode, queryDate);
            uploadGzipFileToS3(gzis, size, bucketName);
        }
    } catch (Exception e) {
        log.error("error in getAndProcessFilesGenReports()--->", e);
    }
}
private void uploadGzipFileToS3(InputStream gzis, long size, String bucketName) {
    log.info("uploadGzipFileToS3 size {} --- bucketName {}--->", size, bucketName);
    ClientConfiguration clientConfiguration = new ClientConfiguration();
    clientConfiguration.setConnectionMaxIdleMillis(600000);
    clientConfiguration.setConnectionTimeout(600000);
    clientConfiguration.setClientExecutionTimeout(600000);
    clientConfiguration.setUseGzip(true);
    clientConfiguration.setConnectionTTL(1000 * 60 * 60);
    AmazonS3Client amazonS3Client = new AmazonS3Client(clientConfiguration);
    TransferManager transferManager = new TransferManager(amazonS3Client);
    try {
        ObjectMetadata objectMetadata = new ObjectMetadata();
        objectMetadata.setContentLength(size);
        transferManager.getConfiguration().setMultipartUploadThreshold(1024 * 5);
        PutObjectRequest request = new PutObjectRequest(bucketName, DBR_NAME + DBR_EXT, gzis, objectMetadata);
        request.getRequestClientOptions().setReadLimit(1024 * 5 + 1);
        request.setSdkClientExecutionTimeout(10000 * 60 * 60);
        Upload upload = transferManager.upload(request);
        upload.waitForCompletion();
    } catch (Exception e) {
        log.error("error in uploadGzipFileToS3()--->", e);
    }
}
I am creating a universal Windows Phone 8.1 app. I am trying to download a file and view it via the launcher. It works for small files of less than 15 MB, but when the file size is more than 15 MB I get an out-of-memory exception.
async private Task<object> GetMailAttachments(string attachNotify)
{
    try
    {
        cmdBarMailItem.IsEnabled = false;
        if (await Device.IsNetworkAvailable())
        {
            cts = new CancellationTokenSource();

            // Ignore SSL certificates that are untrusted or expired or have an invalid hostname.
            var filter = new HttpBaseProtocolFilter() { AllowUI = false };
            filter.IgnorableServerCertificateErrors.Add(Windows.Security.Cryptography.Certificates.ChainValidationResult.Untrusted);
            filter.IgnorableServerCertificateErrors.Add(Windows.Security.Cryptography.Certificates.ChainValidationResult.Expired);
            filter.IgnorableServerCertificateErrors.Add(Windows.Security.Cryptography.Certificates.ChainValidationResult.InvalidName);

            // Start calling the soap service userGetAttachmentByIndex.
            using (var client = new System.Net.Http.HttpClient(new WinRtHttpClientHandler(filter)))
            {
                // Prepare the parameters to be posted via the soap envelope.
                List<KeyValuePair<string, string>> parameter = new List<KeyValuePair<string, string>>();
                parameter.Add(new KeyValuePair<string, string>("sessionId", GlobalInfo.SessionID));
                parameter.Add(new KeyValuePair<string, string>("attachmentIndex", attachNotify.Split('|')[1].ToString()));
                client.DefaultRequestHeaders.Accept.Clear();
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("text/xml"));
                client.DefaultRequestHeaders.Add("SOAPAction", "userGetAttachmentByIndex");
                var postContent = new StringContent(StringHelper.ConstructSoapRequest(parameter, "userGetAttachmentByIndex"), Encoding.UTF8, "text/xml");

                // Get the response from the soap service.
                var response = await client.PostAsync(new Uri(AppEnv.ServiceEndPoint), postContent, cts.Token);
                if (response.StatusCode == System.Net.HttpStatusCode.OK)
                {
                    string soapResponse = await response.Content.ReadAsStringAsync();
                    var soap = XDocument.Parse(soapResponse);
                    XNamespace ns = "http://service.webservice.cryoserver.ci";
                    var base64BinaryStr = soap.Descendants(ns + "userGetAttachmentByIndexResponse").First()
                        .Descendants(ns + "return").First()
                        .Descendants(ns + "attachmentType").First()
                        .Descendants(ns + "binaryData").First()
                        .Descendants(ns + "base64Binary").First().Value;
                    await saveStringToLocalFile(base64BinaryStr);
                    var file = await Windows.Storage.ApplicationData.Current.LocalFolder.GetFileAsync("myTest.pdf");
                    bool x = await Windows.System.Launcher.LaunchFileAsync(file);
                    return x;
                }
            }
        }
        cmdBarMailItem.IsEnabled = true;
    }
    catch (TaskCanceledException)
    {
        PopupRetrieve.IsOpen = false;
        ProgressBar.IsVisible = false;
        cmdBarMailItem.IsEnabled = true;
    }
    catch (Exception ex)
    {
        cmdBarMailItem.IsEnabled = true;
        ProgressBar.IsVisible = false;
        MessageBox.Show(AlertType.Connectivity);
    }
    return null;
}
async Task saveStringToLocalFile(string content)
{
    try
    {
        // Saves the string 'content' to a file in the app's local storage folder.
        byte[] byteArray = Convert.FromBase64String(content);

        // Create a file with the given filename in the local folder;
        // replace any existing file with the same name.
        StorageFile file = await Windows.Storage.ApplicationData.Current.LocalFolder.CreateFileAsync("myTest.pdf", CreationCollisionOption.ReplaceExisting);

        // Write the byte array decoded from the content string into the file.
        using (var stream = await file.OpenStreamForWriteAsync())
        {
            stream.Write(byteArray, 0, byteArray.Length);
            stream.Flush();
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
I get the exception when executing this line:
string soapResponse = await response.Content.ReadAsStringAsync();
Does anybody have an idea why the exception occurs, and what a possible solution would be?
Any help would be appreciated. :)
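For what it's worth, ReadAsStringAsync materializes the entire SOAP envelope, including the base64 payload, as one string, and XDocument.Parse then builds a second in-memory copy, which is the likely source of the out-of-memory exception on a phone. One possible way around it is to read the response as a stream and decode the base64 element incrementally with XmlReader.ReadElementContentAsBase64. A rough sketch (the method name and buffer size are my own choices; the element name follows the response structure above):

async Task SaveAttachmentStreamed(System.Net.Http.HttpResponseMessage response)
{
    StorageFile file = await Windows.Storage.ApplicationData.Current.LocalFolder
        .CreateFileAsync("myTest.pdf", CreationCollisionOption.ReplaceExisting);

    using (var responseStream = await response.Content.ReadAsStreamAsync())
    using (var reader = System.Xml.XmlReader.Create(responseStream))
    using (var output = await file.OpenStreamForWriteAsync())
    {
        // Walk to the base64Binary element, then decode it in chunks
        // so the full base64 string never exists in memory.
        while (reader.Read())
        {
            if (reader.NodeType == System.Xml.XmlNodeType.Element && reader.LocalName == "base64Binary")
            {
                var buffer = new byte[8192];
                int bytesRead;
                while ((bytesRead = reader.ReadElementContentAsBase64(buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, bytesRead);
                }
                break;
            }
        }
    }
}

You would probably also need to send the request with client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead) instead of PostAsync, so HttpClient does not buffer the whole body before handing it over.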