I got an assignment to see whether I can make a Power App that generates a PDF file for the end user to view.
After thorough research on this topic, I found out that this is not easy to achieve :)
To make the Power App generate and download/show the generated PDF, I took these steps:
Created a Power App with just one button :) to call the Azure Function from step 2
Created an Azure Function that generates and returns the PDF as StreamContent
Due to Power Apps limitations (or I just could not find the way), there was no way for me to get the PDF from the response inside the Power App.
After this, I changed my Azure Function to create a new blob entry, but now I have a problem getting the URL of that new entry inside the Azure Function so I can return it to the Power App and use it in the Power App's Download function.
My Azure Function code is below:
using System;
using System.Net;
using System.Net.Http.Headers;
using System.Runtime.InteropServices;
using Aspose.Words;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, Stream outputBlob)
{
    log.Info($"C# HTTP trigger function processed a request. RequestUri={req.RequestUri}");
    var dataDir = @"D:/home";
    var docFile = $"{dataDir}/word-templates/WordAutomationTest.docx";
    var uid = Guid.NewGuid().ToString().Replace("-", "");
    var pdfFile = $"{dataDir}/pdf-export/WordAutomationTest_{uid}.pdf";
    var doc = new Document(docFile);
    doc.Save(pdfFile);
    var result = new HttpResponseMessage(HttpStatusCode.OK);
    var stream = new FileStream(pdfFile, FileMode.Open);
    stream.CopyTo(outputBlob);
    // result.Content = new StreamContent(stream);
    // result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
    // result.Content.Headers.ContentDisposition.FileName = Path.GetFileName(pdfFile);
    // result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    // result.Content.Headers.ContentLength = stream.Length;
    return result;
}
I left the old code (the part that streams the PDF back) commented out, just as a reference for what I tried.
Is there any way to get the download URL for the newly generated blob entry inside the Azure Function?
Is there a better way to make a Power App generate and download/show a generated PDF?
P.S. I tried to use the PDF Viewer control inside the Power App, but this control is of no use here because you cannot set its Document value via a function.
EDIT: The response from @mathewc helped me a lot to finally wrap this up. All the details are below.
New Azure Function that works as expected:
#r "Microsoft.WindowsAzure.Storage"
using System;
using System.Net;
using System.Net.Http.Headers;
using System.Runtime.InteropServices;
using Aspose.Words;
using Microsoft.WindowsAzure.Storage.Blob;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, CloudBlockBlob outputBlob)
{
    log.Info($"C# HTTP trigger function processed a request. RequestUri={req.RequestUri}");

    var dataDir = @"D:/home";
    var docFile = $"{dataDir}/word-templates/WordAutomationTest.docx";
    var uid = Guid.NewGuid().ToString().Replace("-", "");
    var pdfFile = $"{dataDir}/pdf-export/WordAutomationTest_{uid}.pdf";

    // Convert the Word template to a PDF on the function's local disk.
    var doc = new Document(docFile);
    doc.Save(pdfFile);

    // Upload the PDF through the blob output binding and return its URI to the caller.
    using (var stream = new FileStream(pdfFile, FileMode.Open))
    {
        await outputBlob.UploadFromStreamAsync(stream);
    }

    return req.CreateResponse(HttpStatusCode.OK, outputBlob.Uri);
}
REMARKS:
We need to add "WindowsAzure.Storage": "7.2.1" inside project.json. This package MUST be the same version as the package of the same name in %USERPROFILE%\AppData\Local\Azure.Functions.Cli.
If you change your blob output binding type from Stream to CloudBlockBlob, you will have access to CloudBlockBlob.Uri, which is the blob path you require (documentation here). You can then return that Uri back to your Power App. You can use CloudBlockBlob.UploadFromStreamAsync to upload your PDF stream to the blob.
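For completeness, here is a minimal sketch of the two configuration files this setup assumes. The binding name (outputBlob), blob path, and connection setting are illustrative and must match your own function; only the WindowsAzure.Storage version pin comes from the remark above.
function.json (declares the HTTP trigger and the blob output binding bound to the outputBlob parameter):
{
  "bindings": [
    { "type": "httpTrigger", "direction": "in", "name": "req", "authLevel": "function" },
    { "type": "http", "direction": "out", "name": "res" },
    { "type": "blob", "direction": "out", "name": "outputBlob", "path": "pdf-export/{rand-guid}.pdf", "connection": "AzureWebJobsStorage" }
  ]
}
project.json (pins the storage SDK version mentioned above):
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "WindowsAzure.Storage": "7.2.1"
      }
    }
  }
}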
Related
I'm pretty new to the ABP Framework, and this question probably has a really simple answer, but I haven't managed to find it. Images are an important part of any app, and handling them well (size, caching) is mandatory.
Scenario
set up a File System Blob Storing provider, which means the uploaded file will be stored in the file system as an image file
make a service that uses a blob container to save and retrieve the image; after saving it, I use the unique file name as the blob name, and that name is later used to retrieve it
the user is logged in, so authorization is required
I can easily obtain the byte[]s of the image by calling blobContainer.GetAllBytesOrNullAsync(blobName)
I want to easily display the image in an <img> tag or directly in a datatable row.
So, here is my question: is there an easy way to use a blob-stored image as the src of an <img> directly in a Razor page? What I've managed to achieve is setting, in the model, a source string built from the image type plus the bytes converted to a Base64 string (as here). However, in that case I need to do it in the model, and I also don't know whether the browser caches the result; I don't see how caching would work in this case.
I am aware that this may be a question more related to ASP.NET Core, but I was thinking that maybe ABP has some way to access the image via a link.
If you have the ID of the blob, then it is easy to do. Just create an endpoint that returns the image based on the blob ID.
Here is a sample AppService:
public class DocumentAppService : FileUploadAppService
{
    private readonly IBlobContainer<DocumentContainer> _blobContainer;
    private readonly IRepository<Document, Guid> _repository;

    public DocumentAppService(IRepository<Document, Guid> repository, IBlobContainer<DocumentContainer> blobContainer)
    {
        _repository = repository;
        _blobContainer = blobContainer;
    }

    public async Task<List<DocumentDto>> Upload([FromForm] List<IFormFile> files)
    {
        var output = new List<DocumentDto>();
        foreach (var file in files)
        {
            using var memoryStream = new MemoryStream();
            await file.CopyToAsync(memoryStream).ConfigureAwait(false);

            // Store the metadata in the repository and the raw bytes in the blob container,
            // keyed by the same ID.
            var id = Guid.NewGuid();
            var newFile = new Document(id, file.Length, file.ContentType, CurrentTenant.Id);
            var created = await _repository.InsertAsync(newFile);
            await _blobContainer.SaveAsync(id.ToString(), memoryStream.ToArray()).ConfigureAwait(false);
            output.Add(ObjectMapper.Map<Document, DocumentDto>(newFile));
        }
        return output;
    }

    public async Task<FileResult> Get(Guid id)
    {
        // Look up the stored metadata, then return the blob bytes with the original MIME type.
        var currentFile = await _repository.FindAsync(id);
        if (currentFile != null)
        {
            var myfile = await _blobContainer.GetAllBytesOrNullAsync(id.ToString());
            return new FileContentResult(myfile, currentFile.MimeType);
        }
        throw new FileNotFoundException();
    }
}
The Upload function uploads the files, and the Get function returns a file.
Now set the Get route as the src for the image, as in the sketch below.
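For example, in a Razor page you can point the image straight at that endpoint. The route below assumes ABP's conventional auto API route for DocumentAppService, and Model.DocumentId is a hypothetical model property; adjust both to your actual route and model:
<img src="/api/app/document/@Model.DocumentId" alt="document" />
Because this is a plain GET URL, the browser can cache the response like any other image request, subject to the cache headers your endpoint returns.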
Here is the blog post: https://blog.antosubash.com/posts/dotnet-file-upload-with-abp
Repo: https://github.com/antosubash/FileUpload
I have a .NET Core API that I'd like to extend to save uploaded images asynchronously.
Using ImageSharp, I should be able to check uploads and resize them if predefined size limits are exceeded. However, I can't get a simple async save working.
A simple (non-async) save to file works without problem:
My controller extracts the IFormFile from the upload and calls the following method without any problem:
public static void Save(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    {
        imgIS.Save(fileName);
    }
}
ImageSharp currently lacks async methods, so a workaround is necessary.
The updated code below saves the uploaded file, but the format is incorrect: when viewing the file I get the message "It appears we don't support this file format".
The format is extracted from the ImageSharp Load method and used when saving to the MemoryStream.
The MemoryStream's CopyToAsync method is used to write to the FileStream so that the save is asynchronous.
public static async void Save(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    using (var memoryStream = new MemoryStream())
    using (var fileStream = new FileStream(fileName, FileMode.OpenOrCreate))
    {
        imgIS.Save(memoryStream, format);
        await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
        fileStream.Flush();
        memoryStream.Close();
        fileStream.Close();
    }
}
I can't work out whether the issue is with the ImageSharp Save to the MemoryStream or with MemoryStream.CopyToAsync.
I'm currently getting a 404 on the SixLabors docs - hopefully not an indication that the project has folded.
How can I make the upload async and save to file in the correct format?
CopyToAsync copies a stream starting at its current position. You must change the current position of memoryStream back to start before copying:
// ...
memoryStream.Seek(0, SeekOrigin.Begin);
await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
// ...
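Putting the fix together with the original method, a corrected sketch could look like the code below (the same namespaces as in the question's code are assumed: SixLabors.ImageSharp, SixLabors.ImageSharp.Formats, Microsoft.AspNetCore.Http and System.IO). Besides rewinding the MemoryStream, it returns Task instead of async void so callers can await it and observe exceptions, and it uses FileMode.Create so an existing file is truncated rather than partially overwritten:
public static async Task SaveAsync(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);

    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    using (var memoryStream = new MemoryStream())
    using (var fileStream = new FileStream(fileName, FileMode.Create))
    {
        // Optionally resize here if predefined size limits are exceeded,
        // e.g. imgIS.Mutate(x => x.Resize(maxWidth, maxHeight)); // requires SixLabors.ImageSharp.Processing

        // Encode to the original format in memory (ImageSharp's Save is synchronous).
        imgIS.Save(memoryStream, format);

        // Rewind before copying, otherwise an empty or truncated file is written.
        memoryStream.Seek(0, SeekOrigin.Begin);
        await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
    }
}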
I have searched high and low for this one and can't seem to find a way of accessing Request.Content in an MVC Web API. I am basically trying to create a file service to and from Azure Blob and Table storage (Table for storing metadata about the file, Blob for the actual file)....
I was converting the steps in the following link, but this is where I have come unstuck.
I have the back end working, but I can't find a way for the new unified controller to pass a file object from a JSON post through to the service! Any ideas would be greatly appreciated, as always... or am I just going about this the wrong way?
Article here....
UPDATE: To clarify, what I am trying to do in the new MVC 6 (where you no longer have an ApiController to inherit from) is access a file that has been uploaded to the API from a JSON post. That is the long and short of what I am trying to achieve.
I am trying to use the article based on the old Web API, which uses Request.Content to access the file. However, even if I use the WebAPIShim they provide, I still come unstuck with other objects or properties that are no longer available, so I'm wondering if I need to approach it a different way. Either way, all I am trying to do is get a file from a JSON post to an MVC 6 Web API and pass that file to my back-end service....
ANY IDEAS?
Here is an example without relying on model binding.
You can always find the request data in Request.Body, or use Request.Form to get the request body as a form.
[HttpPost]
public async Task<IActionResult> UploadFile()
{
    if (Request.Form.Files != null && Request.Form.Files.Count > 0)
    {
        var file = Request.Form.Files[0];
        var contentType = file.ContentType;
        using (var fileStream = file.OpenReadStream())
        {
            using (var memoryStream = new MemoryStream())
            {
                await fileStream.CopyToAsync(memoryStream);
                // do what you want with memoryStream.ToArray()
            }
        }
    }
    return new JsonResult(new { });
}
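If the client really posts JSON rather than a multipart form, a sketch of the raw-body approach looks like this; the FilePayload DTO and its fields are hypothetical and assume the file content is sent base64-encoded:
public class FilePayload // hypothetical DTO for a JSON post
{
    public string FileName { get; set; }
    public string Content { get; set; } // base64-encoded file bytes
}

[HttpPost]
public async Task<IActionResult> UploadJson()
{
    // Read the raw request body instead of relying on form binding.
    using (var reader = new StreamReader(Request.Body))
    {
        var json = await reader.ReadToEndAsync();
        var payload = Newtonsoft.Json.JsonConvert.DeserializeObject<FilePayload>(json);
        var bytes = Convert.FromBase64String(payload.Content);
        // hand bytes + payload.FileName over to the blob/table service here
        return new JsonResult(new { payload.FileName, size = bytes.Length });
    }
}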
If the only thing in your request is a file, you can use the IFormFile class in your action:
public FileDetails UploadSingle(IFormFile file)
{
    FileDetails fileDetails;
    using (var reader = new StreamReader(file.OpenReadStream()))
    {
        var fileContent = reader.ReadToEnd();
        var parsedContentDisposition = ContentDispositionHeaderValue.Parse(file.ContentDisposition);
        fileDetails = new FileDetails
        {
            Filename = parsedContentDisposition.FileName,
            Content = fileContent
        };
    }
    return fileDetails;
}
I am working on PDF generation; it is successfully implemented using itextsharp.dll and works fine in my local environment, even after publishing. We have our own server at another site.
But the same code doesn't work on that server: the PDF is not generated, and instead it gives the error 'The document has no pages.'
Initially I thought it was due to there being no data in the document, but it works locally with or without data in the document.
I have the following code to make a web request. Is there any problem with it?
try
{
    var myHttpWebRequest = (HttpWebRequest)WebRequest.Create(strPdfData + "?objpatId=" + patID);
    var response = myHttpWebRequest.GetResponse();
    myHttpWebRequest.Timeout = 900000;
    var stream = response.GetResponseStream();
    StreamReader sr = new StreamReader(stream);
    content = sr.ReadToEnd();
}
Create a method in the controller:
[HttpGet]
public JsonResult GetFile()
{
    // 'address' is a placeholder for the URL of the endpoint that returns the data
    var json = new WebClient().DownloadString(address);
    // This code just converts the response to JSON; you can keep it in file format and send it to the view
    dynamic result = Newtonsoft.Json.JsonConvert.DeserializeObject(json);
    var oc = Newtonsoft.Json.JsonConvert.DeserializeObject<countdata[]>(Convert.ToString(result.countdata));
    return Json(oc, JsonRequestBehavior.AllowGet);
}
In the view, just call this action:
@Url.Action("GetFile", "genPDF")
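As a side note on the design: if the goal is simply to show or download the generated PDF in the browser, returning a FileResult avoids the JSON round-trip entirely. A minimal sketch, assuming you already have the PDF bytes server-side (GeneratePdfForPatient is a hypothetical helper wrapping your existing iTextSharp code):
[HttpGet]
public ActionResult DownloadPdf(int patID)
{
    // GeneratePdfForPatient is a hypothetical helper that returns the iTextSharp output as a byte array
    byte[] pdfBytes = GeneratePdfForPatient(patID);
    return File(pdfBytes, "application/pdf", "report.pdf");
}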
Updated my MVC3/.Net 4.5/Azure solution to MVC4.
My code for uploading an image to blob storage appears to fail every time in the upgraded MVC4 solution. However, when I run my MVC3 solution, it works fine. The code that does the uploading, in a DLL, has not changed.
I've uploaded the same image file in both the MVC3 and MVC4 solutions. I've inspected the stream and it appears to be fine. In both instances I am running the code locally on my machine, and my connections point to blob storage in the cloud.
Any pointers for debugging? Any known issues that I may not be aware of when upgrading to MVC4?
Here is my upload code:
public string AddImage(string pathName, string fileName, Stream image)
{
    var client = _storageAccount.CreateCloudBlobClient();
    client.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(5));
    var container = client.GetContainerReference(AzureStorageNames.ImagesBlobContainerName);
    image.Seek(0, SeekOrigin.Begin);
    var blob = container.GetBlobReference(Path.Combine(pathName, fileName));
    blob.Properties.ContentType = "image/jpeg";
    blob.UploadFromStream(image);
    return blob.Uri.ToString();
}
I managed to fix it. For some reason, reading the stream directly from the HttpPostedFileBase wasn't working. Simply copying it into a new MemoryStream solved it.
My code:
public string StoreImage(string album, HttpPostedFileBase image)
{
    var blobStorage = storageAccount.CreateCloudBlobClient();
    var container = blobStorage.GetContainerReference("containerName");
    if (container.CreateIfNotExist())
    {
        // configure container for public access
        var permissions = container.GetPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        container.SetPermissions(permissions);
    }

    string uniqueBlobName = string.Format("{0}{1}", Guid.NewGuid().ToString(), Path.GetExtension(image.FileName)).ToLowerInvariant();
    CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
    blob.Properties.ContentType = image.ContentType;

    image.InputStream.Position = 0;
    using (var imageStream = new MemoryStream())
    {
        image.InputStream.CopyTo(imageStream);
        imageStream.Position = 0;
        blob.UploadFromStream(imageStream);
    }
    return blob.Uri.ToString();
}