HttpPost file from a directory - asp.net-core

I have an image file in my project's directory that I need to POST to another API, but I've had no success so far.
I have tried to mock IFormFile, but it seems to come through empty.
Any ideas?
Here is the extension method I found and used, but it doesn't seem to work:
public static IFormFile AsMockIFormFile(this FileInfo physicalFile)
{
    var fileMock = new Mock<IFormFile>();
    var ms = new MemoryStream();
    var writer = new StreamWriter(ms);
    writer.Write(physicalFile.OpenRead());
    writer.Flush();
    ms.Position = 0;
    var fileName = physicalFile.Name;

    // Setup mock file using info from physical file
    fileMock.Setup(_ => _.FileName).Returns(fileName);
    fileMock.Setup(_ => _.Length).Returns(ms.Length);
    fileMock.Setup(m => m.OpenReadStream()).Returns(ms);
    fileMock.Setup(m => m.ContentDisposition).Returns(string.Format("inline; filename={0}", fileName));
    //...setup other members (code removed for brevity)
    return fileMock.Object;
}
Then, when I try to copy this image to a folder to check that it's actually an image, the resulting file is 0 bytes:
var stream = System.IO.File.Create(@"wwwroot\images\dsadas.jpeg");
await fileMock.Object.CopyToAsync(stream);
If there is another way, or a way to make this work, I would appreciate it!
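The likely culprit is writer.Write(physicalFile.OpenRead()): the StreamWriter.Write(object) overload writes the object's ToString() result (the FileStream's type name), not the file's bytes. A minimal sketch of the extension with the bytes copied properly, using the same Moq setup as above; the CopyToAsync setup is my own addition (not in the original) so that the copy-to-file check works:

public static IFormFile AsMockIFormFile(this FileInfo physicalFile)
{
    // Copy the raw file bytes into the backing stream instead of
    // writing the FileStream's ToString() output.
    var ms = new MemoryStream();
    using (var fs = physicalFile.OpenRead())
    {
        fs.CopyTo(ms);
    }
    ms.Position = 0;

    var fileMock = new Mock<IFormFile>();
    fileMock.Setup(f => f.FileName).Returns(physicalFile.Name);
    fileMock.Setup(f => f.Length).Returns(ms.Length);
    fileMock.Setup(f => f.OpenReadStream()).Returns(ms);
    // Delegate CopyToAsync to the backing stream so File.Create + CopyToAsync
    // produces a non-empty file.
    fileMock.Setup(f => f.CopyToAsync(It.IsAny<Stream>(), It.IsAny<CancellationToken>()))
            .Returns((Stream target, CancellationToken token) => ms.CopyToAsync(target, 81920, token));
    return fileMock.Object;
}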

Related

.NET Core API saving image upload asynchronously with ImageSharp, MemoryStream and FileStream

I have a .NET Core API that I'd like to extend to save uploaded images asynchronously.
Using ImageSharp I should be able to check uploads and resize them if predefined size limits are exceeded. However, I can't get a simple async save working.
A simple (non-async) save to file works without a problem:
my controller extracts the IFormFile from the upload and calls the following method with no issues.
public static void Save(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    {
        imgIS.Save(fileName);
    }
}
ImageSharp is currently lacking async methods, so a workaround is necessary.
The updated code below saves the uploaded file, but the format is incorrect - when viewing the file I get the message "It appears we don't support this file format".
The format is extracted by the ImageSharp Load method and used when saving to the MemoryStream.
The MemoryStream's CopyToAsync method is then used to copy to the FileStream, making the save asynchronous.
public static async void Save(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    using (var memoryStream = new MemoryStream())
    using (var fileStream = new FileStream(fileName, FileMode.OpenOrCreate))
    {
        imgIS.Save(memoryStream, format);
        await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
        fileStream.Flush();
        memoryStream.Close();
        fileStream.Close();
    }
}
I can't work out whether the issue is with ImageSharp Save to MemoryStream, or the MemoryStream.CopyToAsync.
I'm currently getting 404 on SixLabors docs - hopefully not an indication that the project has folded.
How can I make the upload async and save to file in the correct format?
CopyToAsync copies a stream starting at its current position. You must seek memoryStream back to the start before copying:
// ...
memoryStream.Seek(0, SeekOrigin.Begin);
await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
// ...
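For completeness, a sketch of the corrected method with the seek applied. Two further tweaks are my own assumptions rather than part of the answer above: async void becomes async Task so the caller can await the save and see exceptions, and FileMode.Create replaces FileMode.OpenOrCreate so overwriting a larger existing file cannot leave stale trailing bytes:

public static async Task SaveAsync(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    using (var memoryStream = new MemoryStream())
    using (var fileStream = new FileStream(fileName, FileMode.Create))
    {
        imgIS.Save(memoryStream, format);        // encode the image into memory
        memoryStream.Seek(0, SeekOrigin.Begin);  // rewind before copying
        await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
    }
}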

PDF generation in mvc4 with itextsharp

I am working on PDF generation; it is implemented successfully using itextsharp.dll and works fine in the local environment, even after publishing.
But the same code doesn't work on our own server at another site: the PDF is not generated and instead it gives the error 'The document has no pages.'
Initially I thought it was due to there being no data in the document, but it works locally with or without data in the document.
I implemented the following code to make a web request. Is there any problem with that?
try
{
    var myHttpWebRequest = (HttpWebRequest)WebRequest.Create(strPdfData + "?objpatId=" + patID);
    var response = myHttpWebRequest.GetResponse();
    myHttpWebRequest.Timeout = 900000;
    var stream = response.GetResponseStream();
    StreamReader sr = new StreamReader(stream);
    content = sr.ReadToEnd();
}
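One aside on the request code above: Timeout is assigned after GetResponse() has already returned, so it never takes effect, and neither the response nor the reader is disposed. A sketch of the same request with those two issues addressed (strPdfData, patID and content are the question's own variables; the catch block is my addition):

try
{
    var myHttpWebRequest = (HttpWebRequest)WebRequest.Create(strPdfData + "?objpatId=" + patID);
    myHttpWebRequest.Timeout = 900000;  // set the timeout before sending the request
    using (var response = myHttpWebRequest.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var sr = new StreamReader(stream))
    {
        content = sr.ReadToEnd();       // reader, stream and response are disposed on exit
    }
}
catch (WebException ex)
{
    // Surface the server-side failure instead of letting it show up
    // later as 'The document has no pages.'
    Trace.TraceError(ex.Message);
}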
Create a method in the controller:
[HttpGet]
public JsonResult GetFile()
{
    // DownloadString (not DownloadFile, which returns void) so the
    // response can be parsed; "address" is the URL of the PDF endpoint.
    var json = new WebClient().DownloadString(address);
    // This code just converts the result to JSON; you can keep it in
    // file format and send it to the view instead.
    dynamic result = Newtonsoft.Json.JsonConvert.DeserializeObject(json);
    var oc = Newtonsoft.Json.JsonConvert.DeserializeObject<countdata[]>(Convert.ToString(result.countdata));
    return Json(oc, JsonRequestBehavior.AllowGet);
}
In the view, just call this action (Url.Action takes the action name first, then the controller name):
@Url.Action("GetFile", "genPDF")

Windows Azure UploadFromStream No Longer Works After Porting to MVC4 - Pointers?

Updated my MVC3/.NET 4.5/Azure solution to MVC4.
My code for uploading an image to blob storage appears to fail each time in the upgraded MVC4 solution. However, the MVC3 solution works fine when I run it. The code that does the uploading, in a DLL, has not changed.
I've uploaded the same image file in the MVC3 and MVC4 solutions. I've inspected the stream and it appears to be fine. In both instances I am running the code locally on my machine and my connections point to blob storage in the cloud.
Any pointers for debugging? Any known issues that I may not be aware of when upgrading to MVC4?
Here is my upload code:
public string AddImage(string pathName, string fileName, Stream image)
{
    var client = _storageAccount.CreateCloudBlobClient();
    client.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(5));
    var container = client.GetContainerReference(AzureStorageNames.ImagesBlobContainerName);
    image.Seek(0, SeekOrigin.Begin);
    var blob = container.GetBlobReference(Path.Combine(pathName, fileName));
    blob.Properties.ContentType = "image/jpeg";
    blob.UploadFromStream(image);
    return blob.Uri.ToString();
}
I managed to fix it. For some reason, reading the stream directly from the HttpPostedFileBase wasn't working; simply copying it into a new MemoryStream solved it.
My code:
public string StoreImage(string album, HttpPostedFileBase image)
{
    var blobStorage = storageAccount.CreateCloudBlobClient();
    var container = blobStorage.GetContainerReference("containerName");
    if (container.CreateIfNotExist())
    {
        // configure container for public access
        var permissions = container.GetPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        container.SetPermissions(permissions);
    }
    string uniqueBlobName = string.Format("{0}{1}", Guid.NewGuid().ToString(), Path.GetExtension(image.FileName)).ToLowerInvariant();
    CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
    blob.Properties.ContentType = image.ContentType;
    image.InputStream.Position = 0;
    using (var imageStream = new MemoryStream())
    {
        image.InputStream.CopyTo(imageStream);
        imageStream.Position = 0;
        blob.UploadFromStream(imageStream);
    }
    return blob.Uri.ToString();
}

Zip all files in a folder

I'm a bit new to the WinRT development platform, and it's already driving me crazy (I'm a long-time .NET developer, and all those removed APIs are quite annoying).
I'm experiencing a problem while zipping all the files present in Windows.Storage.ApplicationData.Current.TemporaryFolder.
Here is my current code (VB.Net, based on MSDN code; "file" is the zip file I'll put all my files into):
Using zipMemoryStream As New MemoryStream()
    Using zipArchive As New Compression.ZipArchive(zipMemoryStream, Compression.ZipArchiveMode.Create)
        For Each fileToCompress As Windows.Storage.StorageFile In (Await Windows.Storage.ApplicationData.Current.TemporaryFolder.GetFilesAsync())
            Dim buffer As Byte() = WindowsRuntimeBufferExtensions.ToArray(Await Windows.Storage.FileIO.ReadBufferAsync(fileToCompress))
            Dim entry As ZipArchiveEntry = zipArchive.CreateEntry(fileToCompress.Name)
            Using entryStream As Stream = entry.Open()
                Await entryStream.WriteAsync(buffer, 0, buffer.Length)
            End Using
        Next
    End Using

    Using zipStream As Windows.Storage.Streams.IRandomAccessStream = Await file.OpenAsync(Windows.Storage.FileAccessMode.ReadWrite)
        Using outstream As Stream = zipStream.AsStreamForWrite()
            Dim buffer As Byte() = zipMemoryStream.ToArray()
            outstream.Write(buffer, 0, buffer.Length)
            outstream.Flush()
        End Using
    End Using
End Using
It builds well, but when I run the code, I get this exception:
UnauthorizedAccessException: Access denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
on the line: WindowsRuntimeBufferExtensions.ToArray(blahblah...
I'm wondering what is wrong. Any ideas?
Thanks in advance!
I rewrote your method in C# to try it out:
var file = await ApplicationData.Current.LocalFolder.CreateFileAsync("test.zip");
using (var zipMemoryStream = new MemoryStream())
{
    using (var zipArchive = new System.IO.Compression.ZipArchive(zipMemoryStream, System.IO.Compression.ZipArchiveMode.Create))
    {
        foreach (var fileToCompress in (await ApplicationData.Current.TemporaryFolder.GetFilesAsync()))
        {
            var buffer = WindowsRuntimeBufferExtensions.ToArray(await FileIO.ReadBufferAsync(fileToCompress));
            var entry = zipArchive.CreateEntry(fileToCompress.Name);
            using (var entryStream = entry.Open())
            {
                await entryStream.WriteAsync(buffer, 0, buffer.Length);
            }
        }
    }
    using (var zipStream = await file.OpenAsync(Windows.Storage.FileAccessMode.ReadWrite))
    {
        using (var outstream = zipStream.AsStreamForWrite())
        {
            var buffer = zipMemoryStream.ToArray();
            outstream.Write(buffer, 0, buffer.Length);
            outstream.Flush();
        }
    }
}
It works flawlessly - it creates the zip file in the local folder as expected. Since you get the exception in the ToArray call, the reason could be that the file you're trying to open is already locked from somewhere else. If you are creating these files yourself, or even just accessing them, make sure you're closing the streams.
To test this method you could manually create a folder inside the temp folder, put a couple of files in it, and then run the method on that folder (the files are in C:\Users\<Username>\AppData\Local\Packages\<PackageName>\TempState) just to rule out any other cause of the error.
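As an illustration of that last point, here is a minimal C# sketch (my own example, not from the question) of writing a temp file so its stream is guaranteed to be closed, and the lock released, before the zip code reads it:

var tempFolder = ApplicationData.Current.TemporaryFolder;
var tempFile = await tempFolder.CreateFileAsync("data.txt", CreationCollisionOption.ReplaceExisting);
using (var stream = await tempFile.OpenStreamForWriteAsync())
using (var writer = new StreamWriter(stream))
{
    writer.Write("some content");
}   // writer and stream are both closed here, so ReadBufferAsync can open the file later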

Azure storage: Uploaded files with size zero bytes

When I upload an image file to a blob, the upload apparently succeeds (no errors are thrown). When I go to Cloud Storage Studio, the file is there, but with a size of 0 (zero) bytes.
The following is the code that I am using:
// These two methods belong to the ContentService class used to upload
// files in the storage.
public void SetContent(HttpPostedFileBase file, string filename, bool overwrite)
{
    CloudBlobContainer blobContainer = GetContainer();
    var blob = blobContainer.GetBlobReference(filename);
    if (file != null)
    {
        blob.Properties.ContentType = file.ContentType;
        blob.UploadFromStream(file.InputStream);
    }
    else
    {
        blob.Properties.ContentType = "application/octet-stream";
        blob.UploadByteArray(new byte[1]);
    }
}

public string UploadFile(HttpPostedFileBase file, string uploadPath)
{
    if (file.ContentLength == 0)
    {
        return null;
    }
    string filename;
    int indexBar = file.FileName.LastIndexOf('\\');
    if (indexBar > -1)
    {
        filename = DateTime.UtcNow.Ticks + file.FileName.Substring(indexBar + 1);
    }
    else
    {
        filename = DateTime.UtcNow.Ticks + file.FileName;
    }
    ContentService.Instance.SetContent(file, Helper.CombinePath(uploadPath, filename), true);
    return filename;
}
// The above code is called by this code.
HttpPostedFileBase newFile = Request.Files["newFile"] as HttpPostedFileBase;
ContentService service = new ContentService();
blog.Image = service.UploadFile(newFile, string.Format("{0}{1}", Constants.Paths.BlogImages, blog.RowKey));
Before the image file is uploaded to storage, the InputStream property of the HttpPostedFileBase appears to be fine (the size of the image corresponds to what is expected, and no exceptions are thrown).
And the really strange thing is that this works perfectly in other cases (uploading PowerPoints or even other images from the worker role). The code that calls the SetContent method seems to be exactly the same, and the file seems to be correct, since a new file with zero bytes is created at the correct location.
Does anyone have any suggestions? I have debugged this code dozens of times and I cannot see the problem. Any suggestions are welcome!
Thanks
The Position property of the InputStream of the HttpPostedFileBase had the same value as the Length property (probably because I had read another file before this one - silly mistake!).
All I had to do was set the Position property back to 0 (zero)!
I hope this helps somebody in the future.
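Applied to the SetContent method above, that is just a one-line rewind before the upload (a sketch of the relevant branch only):

if (file != null)
{
    blob.Properties.ContentType = file.ContentType;
    // Rewind: a previous read may have left the stream at its end.
    file.InputStream.Position = 0;
    blob.UploadFromStream(file.InputStream);
}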
Thanks Fabio for bringing this up and solving your own question. I just want to add some code to what you have said. Your suggestion worked perfectly for me.
var memoryStream = new MemoryStream();
// "upload" is the object returned by Fine Uploader
upload.InputStream.CopyTo(memoryStream);
// After copying the contents to the stream, reset its position
// back to the zeroth location
memoryStream.Seek(0, SeekOrigin.Begin);
And now you are ready to upload memoryStream using:
blockBlob.UploadFromStream(memoryStream);