Is it possible to use "Windows::Graphics::Imaging::SoftwareBitmap::Convert" to convert RGB8 into Gray8? - c++-winrt

I am trying to convert a SoftwareBitmap (Rgba8) into Gray8 and save it as a PNG file.
Below is my code.
winrt::apartment_context ui_thread;
co_await winrt::resume_background();
// Create a new file with the specified name
winrt::Windows::Storage::StorageFile DepthstorageFile = co_await winrt::Windows::Storage::KnownFolders::SavedPictures().CreateFileAsync(DepthfileName, winrt::Windows::Storage::CreationCollisionOption::ReplaceExisting);
// Open a writeable stream to the file
winrt::Windows::Storage::Streams::IRandomAccessStream Depthstream = co_await DepthstorageFile.OpenAsync(winrt::Windows::Storage::FileAccessMode::ReadWrite);
// Create a BitmapEncoder object to encode the SoftwareBitmap
winrt::Windows::Graphics::Imaging::BitmapEncoder Depthencoder = co_await winrt::Windows::Graphics::Imaging::BitmapEncoder::CreateAsync(winrt::Windows::Graphics::Imaging::BitmapEncoder::PngEncoderId(), Depthstream);
std::unique_lock<std::mutex> ulC(m_CameraBufferMutex);
m_SoftwareBitMapGray = winrt::Windows::Graphics::Imaging::SoftwareBitmap::Convert(sourcebitmap, winrt::Windows::Graphics::Imaging::BitmapPixelFormat::Gray8, winrt::Windows::Graphics::Imaging::BitmapAlphaMode::Premultiplied);
Depthencoder.SetSoftwareBitmap(m_SoftwareBitMapGray);
ulC.unlock();
co_await Depthencoder.FlushAsync();
co_await ui_thread;
It seems that it saves an empty PNG image.
Could someone tell me which part is wrong?

Related

Save picture directly to stream? [duplicate]

I have a filename pointing to a text file, including its path, as a string. Now I'd like to load this .csv file into a memory stream. How should I do that?
For example, I have this:
Dim filename as string="C:\Users\Desktop\abc.csv"
Dim stream As New MemoryStream(File.ReadAllBytes(filename))
You don't need to load a file into a MemoryStream.
You can simply call File.OpenRead to get a FileStream containing the file.
If you really want the file to be in a MemoryStream, you can call CopyTo to copy the FileStream to a MemoryStream.
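For example, a minimal sketch of both options, assuming the same path as in the question:
// Usually sufficient: stream the file directly, no need to buffer it in memory.
using (FileStream fileStream = File.OpenRead(@"C:\Users\Desktop\abc.csv"))
using (var memoryStream = new MemoryStream())
{
    // Only if you really need a MemoryStream: copy the FileStream into it.
    fileStream.CopyTo(memoryStream);
    memoryStream.Position = 0; // rewind before reading from it
    // ... work with memoryStream here ...
}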
I had an XML file being read from disk using the old XmlReader API, and wanted to read the XML file into memory and then work with it there, instead of reading the disk repeatedly. This is based on the VB answer from Centro (upvoted), but with a using block and in C#.
The key line:
MemoryStream myXMLDocument = new MemoryStream(File.ReadAllBytes(@"c:\temp\myDemoXMLDocument.xml"));
Re the OP's question, if you wanted to load a CSV file into a MemoryStream:
MemoryStream myCSVDataInMemory = new MemoryStream(File.ReadAllBytes(@"C:\Users\Desktop\abc.csv"));
The following snippet reads through the XML document now that it's in a MemoryStream; it is basically the same code as when it was coming from a FileStream pointing to a file on disk. Yes, the XmlTextReader API is old and clunky, but it's what I had to work with in this app.
string myXMLFileName = @"c:\temp\myDemoXMLDocument.xml";
using (MemoryStream myXMLDocument = new MemoryStream(File.ReadAllBytes(myXMLFileName)))
using (XmlTextReader myXmlTextReader = new XmlTextReader(myXMLDocument))
{
    myXmlTextReader.WhitespaceHandling = WhitespaceHandling.None;
    myXmlTextReader.Read(); // read the XML declaration node, advance to <Batch> tag
    while (!myXmlTextReader.EOF)
    {
        if (myXmlTextReader.Name == "Batch" && myXmlTextReader.IsStartElement())
        {
            string batchIdentifier = myXmlTextReader.GetAttribute("BatchIdentifier");
            myXmlTextReader.Read(); // advance to the next tag
            while (!myXmlTextReader.EOF &&
                   !(myXmlTextReader.Name == "Batch" && !myXmlTextReader.IsStartElement()))
            {
                if (myXmlTextReader.Name == "Transaction" && myXmlTextReader.IsStartElement())
                {
                    // Start a new set of items
                    string transactionID = myXmlTextReader.GetAttribute("ID");
                }
                myXmlTextReader.Read(); // read next element, possibly another Transaction tag
            }
            // All Transaction tags in this Batch are completed; move to the next tag.
        }
        myXmlTextReader.Read();
    }
    // The using blocks close the reader and the memory stream.
}
You can also load an image file into a MemoryStream by going through a FileStream, like so:
string fullPath = Path.Combine(filePath, fileName);
FileStream fileStream = new FileStream(fullPath, FileMode.Open);
Image image = Image.FromStream(fileStream);
MemoryStream memoryStream = new MemoryStream();
image.Save(memoryStream, ImageFormat.Jpeg);
//Close File Stream
fileStream.Close();

.NET Core API saving image upload asynchronously with ImageSharp, MemoryStream and FileStream

I have a .NET Core API that I'd like to extend to save uploaded images asynchronously.
Using ImageSharp, I should be able to check uploads and resize them if predefined size limits are exceeded. However, I can't get a simple async save working.
A simple (non-async) save to file works without problem:
My controller extracts the IFormFile from the upload and calls the following method without any problem:
public static void Save(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    {
        imgIS.Save(fileName);
    }
}
ImageSharp is currently lacking async methods so a workaround is necessary.
The updated code below saves the uploaded file, but the format is incorrect - when viewing the file I get the message "It appears we don't support this file format".
The format is extracted from the ImageSharp Load method and used when saving to the MemoryStream.
The MemoryStream's CopyToAsync method is used to copy into the FileStream so that the save is asynchronous.
public static async void Save(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    using (var memoryStream = new MemoryStream())
    using (var fileStream = new FileStream(fileName, FileMode.OpenOrCreate))
    {
        imgIS.Save(memoryStream, format);
        await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
        fileStream.Flush();
        memoryStream.Close();
        fileStream.Close();
    }
}
I can't work out whether the issue is with ImageSharp Save to MemoryStream, or the MemoryStream.CopyToAsync.
I'm currently getting 404 on SixLabors docs - hopefully not an indication that the project has folded.
How can I make the upload async and save to file in the correct format?
CopyToAsync copies a stream starting at its current position. You must set the current position of memoryStream back to the start before copying:
// ...
memoryStream.Seek(0, SeekOrigin.Begin);
await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
// ...
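Putting that fix into the method from the question, a sketch of the corrected version could look like this (the SaveAsync name and the Task return type are my own changes so the caller can await it; the ImageSharp calls are the same ones used above):
public static async Task SaveAsync(IFormFile image, string imagesFolder)
{
    var fileName = Path.Combine(imagesFolder, image.FileName);
    using (var stream = image.OpenReadStream())
    using (var imgIS = Image.Load(stream, out IImageFormat format))
    using (var memoryStream = new MemoryStream())
    using (var fileStream = new FileStream(fileName, FileMode.OpenOrCreate))
    {
        imgIS.Save(memoryStream, format);        // encode the image into the memory buffer
        memoryStream.Seek(0, SeekOrigin.Begin);  // rewind before copying
        await memoryStream.CopyToAsync(fileStream).ConfigureAwait(false);
    }
}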

Convert mp4 voice file to WAV stream

I used this code to resample a file and save it, but the output file sounds like a fast-forwarded recording:
using (MediaFoundationReader reader = new MediaFoundationReader(url))
{
    using (ResamplerDmoStream resampledReader = new ResamplerDmoStream(reader, new WaveFormat(16000, 16, 1)))
    {
        using (WaveFileWriter waveWriter = new WaveFileWriter(@"c:\test.wav", resampledReader.WaveFormat))
        {
            resampledReader.CopyTo(waveWriter);
        }
    }
}
The WaveFileWriter must have the same WaveFormat as resampledReader. So pass in resampledReader.WaveFormat to the WaveFileWriter constructor.
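As a minimal sketch of that point, the writer's format should come from the stream that is actually copied into it (the resampled one), not from the original reader:
using (var reader = new MediaFoundationReader(url))
using (var resampled = new ResamplerDmoStream(reader, new WaveFormat(16000, 16, 1)))
// Pass resampled.WaveFormat here, not reader.WaveFormat; a mismatched format
// is what makes the output play back too fast.
using (var writer = new WaveFileWriter(@"c:\test.wav", resampled.WaveFormat))
{
    resampled.CopyTo(writer);
}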
I'm not sure what resampling means, but if you need to get the wav from a video, this works...
using (var video = new MediaFoundationReader(file))
{
    file = TempWav;
    WaveFileWriter.CreateWaveFile(file, video);
}

Render a PDF file and save to object using grails Rendering and Attachmentable plugins

I am attempting to generate a PDF file that contains object information and then attach it to another object that is stored in the database. The attachmentable plugin I am using is working now for user end attachments, but I need my system to be able to do it automatically.
I am using:
Grails 1.3.9
Attachmentable 0.3.0 http://grails.org/plugin/attachmentable
Rendering 0.4.3 http://grails.org/plugin/rendering
I have been able to generate and display the pdf, but do not know how to attach it using the attachmentable plugin. I need some way to take the generated pdf byte array and convert it to a MultipartFile for the attachmentable plugin function I call. The error I get shows that my argument types are invalid.
I save object1 and object2, then generate the pdf of object1 and try to attach it to object2.
Thanks in advance for your help!
Thing1 Controller Snippets:
ByteArrayOutputStream bytes = pdfRenderingService.render(template: "/thing1/pdf", model: [thing1: thing1])
attachmentableService.addAttachment("unknown", thing2.id, bytes)
AttachmentableService function I am attempting to call:
def addAttachment(def poster, def reference, CommonsMultipartFile file) {
    addAttachment(CH.config, poster, reference, file)
}

def addAttachment(def config,
                  def poster,
                  def reference,
                  CommonsMultipartFile file) {
    if (reference.ident() == null) {
        throw new AttachmentableException(
            "You must save the entity [${delegate}] before calling addAttachment.")
    }
    if (!file?.size) {
        throw new EmptyFileException(file.name, file.originalFilename)
    }
    String delegateClassName = AttachmentableUtil.fixClassName(reference.class)
    String posterClass = (poster instanceof String) ? poster : AttachmentableUtil.fixClassName(poster.class.name)
    Long posterId = (poster instanceof String) ? 0L : poster.id
    String filename = file.originalFilename
    // link
    def link = AttachmentLink.findByReferenceClassAndReferenceId(
        delegateClassName, reference.ident())
    if (!link) {
        link = new AttachmentLink(
            referenceClass: delegateClassName,
            referenceId: reference.ident())
    }
    // attachment
    Attachment attachment = new Attachment(
        // file
        name: FilenameUtils.getBaseName(filename),
        ext: FilenameUtils.getExtension(filename),
        length: 0L,
        contentType: file.contentType,
        // poster
        posterClass: posterClass,
        posterId: posterId,
        // input
        inputName: file.name)
    link.addToAttachments attachment
    if (!link.save(flush: true)) {
        throw new AttachmentableException(
            "Cannot create Attachment for arguments [$user, $file], they are invalid.")
    }
    // save file to disk
    File diskFile = AttachmentableUtil.getFile(config, attachment, true)
    file.transferTo(diskFile)
    attachment.length = diskFile.length()
    // interceptors
    if (reference.respondsTo('onAddAttachment')) {
        reference.onAddAttachment(attachment)
    }
    attachment.save(flush: true) // Force update so searchable can try to index it again.
    return reference
}
Grails runtime error:
groovy.lang.MissingMethodException: No signature of method: com.macrobit.grails.plugins.attachmentable.services.AttachmentableService.addAttachment() is applicable for argument types: (java.lang.String, java.lang.Long, java.io.ByteArrayOutputStream) values: [unknown, 80536, %PDF-1.4 and a long string of unreadable data...]
Possible solutions: addAttachment(java.lang.Object, java.lang.Object, org.springframework.web.multipart.commons.CommonsMultipartFile), addAttachment(java.lang.Object, java.lang.Object, java.lang.Object, org.springframework.web.multipart.commons.CommonsMultipartFile)
Service Method I Added:
def customAddMethod(def poster, def reference, def pdfBytes) {
    customAddMethod(CH.config, poster, reference, pdfBytes)
}

def customAddMethod(def config,
                    def poster,
                    def reference,
                    def pdfBytes) {
    if (reference.ident() == null) {
        throw new AttachmentableException(
            "You must save the entity [${delegate}] before calling customAddMethod.")
    }
    String delegateClassName = AttachmentableUtil.fixClassName(reference.class)
    String posterClass = (poster instanceof String) ? poster : AttachmentableUtil.fixClassName(poster.class.name)
    Long posterId = (poster instanceof String) ? 0L : poster.id
    String filename = "File Name"
    // link
    def link = AttachmentLink.findByReferenceClassAndReferenceId(
        delegateClassName, reference.ident())
    if (!link) {
        link = new AttachmentLink(
            referenceClass: delegateClassName,
            referenceId: reference.ident())
    }
    // attachment
    Attachment attachment = new Attachment(
        // file
        name: "File Name",
        ext: "pdf",
        length: 0L,
        contentType: "application/pdf",
        // poster
        posterClass: posterClass,
        posterId: posterId,
        // input
        inputName: "File Name")
    link.addToAttachments attachment
    if (!link.save(flush: true)) {
        throw new AttachmentableException(
            "Cannot create Attachment for arguments [$user, $file], they are invalid.")
    }
    // save file to disk
    byte[] bytes = pdfBytes.toByteArray() // convert the ByteArrayOutputStream to a byte array
    File diskFile = AttachmentableUtil.getFile(config, attachment, true) // file path
    FileOutputStream fos = new FileOutputStream(diskFile) // open a file output stream to write to
    fos.write(bytes) // write the rendered pdf bytes to the file
    fos.flush()
    fos.close()
    attachment.length = diskFile.length()
    // interceptors
    if (reference.respondsTo('onAddAttachment')) {
        reference.onAddAttachment(attachment)
    }
    attachment.save(flush: true) // Force update so searchable can try to index it again.
    return reference
}
It looks like the AttachmentableService you referenced (from the Attachmentable plugin) assumes it's dealing with a file-upload scenario, such that you could easily grab the MultipartFile instance via request.getFile(). That's not the case for you - you're creating the file via the Rendering plugin, and you want that file attached to a domain object, right?
You could try constructing a CommonsMultipartFile instance manually by first writing the pdf bytes to disk, and then creating a DiskFileItem via DiskFileItemFactory.
See this post for an example of what I'm thinking:
How to make CommonsMultipartFile from absolute file path?
Another, better, option might be to check out that plugin's source and add a method that doesn't require you to go through those gyrations - perhaps a version of the addAttachment method that accepts a File or an OutputStream instead - and submit a PR to the plugin author.
(Looks like they're adding an 'addAttachment' method to qualifying domain objects, which also expects a CommonsMultipartFile).
Otherwise, you might just have to create your own service to basically provide the same end result, which apparently is to create an AttachmentLink and associated Attachment instance.

Azure storage: Uploaded files with size zero bytes

When I upload an image file to a blob, the image is uploaded apparently successfully (no errors). When I go to cloud storage studio, the file is there, but with a size of 0 (zero) bytes.
The following is the code that I am using:
// These two methods belong to the ContentService class used to upload
// files in the storage.
public void SetContent(HttpPostedFileBase file, string filename, bool overwrite)
{
    CloudBlobContainer blobContainer = GetContainer();
    var blob = blobContainer.GetBlobReference(filename);
    if (file != null)
    {
        blob.Properties.ContentType = file.ContentType;
        blob.UploadFromStream(file.InputStream);
    }
    else
    {
        blob.Properties.ContentType = "application/octet-stream";
        blob.UploadByteArray(new byte[1]);
    }
}
public string UploadFile(HttpPostedFileBase file, string uploadPath)
{
    if (file.ContentLength == 0)
    {
        return null;
    }
    string filename;
    int indexBar = file.FileName.LastIndexOf('\\');
    if (indexBar > -1)
    {
        filename = DateTime.UtcNow.Ticks + file.FileName.Substring(indexBar + 1);
    }
    else
    {
        filename = DateTime.UtcNow.Ticks + file.FileName;
    }
    ContentService.Instance.SetContent(file, Helper.CombinePath(uploadPath, filename), true);
    return filename;
}
// The above code is called by this code.
HttpPostedFileBase newFile = Request.Files["newFile"] as HttpPostedFileBase;
ContentService service = new ContentService();
blog.Image = service.UploadFile(newFile, string.Format("{0}{1}", Constants.Paths.BlogImages, blog.RowKey));
Before the image file is uploaded to the storage, the InputStream property of the HttpPostedFileBase appears to be fine (the size of the image corresponds to what is expected, and no exceptions are thrown).
And the really strange thing is that this works perfectly in other cases (uploading PowerPoints or even other images from the worker role). The code that calls the SetContent method seems to be exactly the same, and the file path seems to be correct, since a new file with zero bytes is created at the correct location.
Does any one have any suggestion please? I debugged this code dozens of times and I cannot see the problem. Any suggestions are welcome!
Thanks
The Position property of the InputStream of the HttpPostedFileBase had the same value as the Length property (probably because I had read another file before this one - stupid, I think!).
All I had to do was to set the Position property back to 0 (zero)!
I hope this helps somebody in the future.
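Applied to the SetContent method from the question, a minimal sketch of the fix is simply to rewind the posted stream before uploading it:
// Reset the posted file's stream to the beginning before uploading;
// otherwise the blob is written from an already-consumed stream (0 bytes).
file.InputStream.Seek(0, SeekOrigin.Begin);
blob.Properties.ContentType = file.ContentType;
blob.UploadFromStream(file.InputStream);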
Thanks Fabio for bringing this up and solving your own question. I just want to add code to what you have said. Your suggestion worked perfectly for me.
var memoryStream = new MemoryStream();
// "upload" is the object returned by fine uploader
upload.InputStream.CopyTo(memoryStream);
memoryStream.ToArray();
// After copying the contents to the stream, reset its position
// back to the beginning
memoryStream.Seek(0, SeekOrigin.Begin);
And now you are ready to upload memoryStream using:
blockBlob.UploadFromStream(memoryStream);