I have created an application in Unity for the HoloLens 1 with the skybox set to be visible in the Skybox settings, and I can now see the sky (including the night sky: a black background with white dots representing stars) when using the application. However, when recording a mixed reality video (as per the instructions at https://learn.microsoft.com/en-us/hololens/holographic-photos-and-videos#capture-a-mixed-reality-video), the night sky is not captured in the video, and objects in the night sky are only visible against black surfaces (while the video is recording, the application itself still renders the night sky normally). You can see the result of a video capture here: https://www.youtube.com/watch?v=8OCIytZaQKE&t=30s from 1:59-2:03, on the black TV screen.
Why is the night sky being rendered in the application, but not in the video? Moreover, is there a way to record the video with the black background/night sky in it?
The back buffer is being cleared with transparent black, and fully transparent pixels are invisible in the captured video, since mixed reality capture composites the rendered frame over the camera feed using its alpha channel. To get the night sky into the recording, you need to render the black sky with non-transparent (opaque) black instead.
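For illustration, here is a minimal Unity (C#) sketch of that suggestion, assuming the sky is drawn by the camera's clear colour rather than a skybox material; OpaqueSkyCamera is a made-up component name, not part of the original project:

using UnityEngine;

// Sketch only: clear the camera to opaque black instead of the
// HoloLens-recommended transparent black (0, 0, 0, 0), so the "night sky"
// pixels carry alpha = 1 and survive the mixed reality capture compositing.
[RequireComponent(typeof(Camera))]
public class OpaqueSkyCamera : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 1f); // opaque black
    }
}

On the device itself black still looks transparent because the display is additive, so a change like this mainly affects what the recording shows.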
I'm using the EMGU CV .NET library. I noticed that when I take pictures of anything with color, the colors usually get "washed out" if the background is dark(ish). The general rule of thumb I've found is that the darker the background is, the more washed out the colors are.
Here is how I'm retrieving the image from the camera with EMGU.
Dim imgFeed As Bitmap = mCamera.RetrieveBgrFrame.ToBitmap
In the images below (some of the background has been cropped out of both), the left image is on dry white cement and the right image is on wet white cement. You can see the "washed out" color especially on the first tag, which is bright orange duct tape.
Here is another image, taken on black pavement in the sun, which in reality is much darker than the white cement, but appears similar in color to the background in the wet cement image above.
Is there some sort of auto-balancing that's occurring in the EMGU library? If so, can I stop this from happening? I need to see the colors more clearly than the background. I've read about _EqualizeHist() and I implemented it, but that did not help me see the colors any more clearly; adding contrast to the image didn't really help because the colors were already close to white.
Update
After reading Spark's answer, I found the SetCaptureProperty() method. I see that you can disable the auto exposure property by setting the value to 0 as shown below.
mCamera.SetCaptureProperty(CvEnum.CAP_PROP.CV_CAP_PROP_AUTO_EXPOSURE, 0.0)
Sadly though, with the particular camera I'm using, it looks like the driver does not support changing this property.
This has nothing to do with the EMGU library. It is the behavior of the Auto Exposure Control (AEC) algorithm running inside the camera chip. Try disabling the camera's auto exposure control and reducing the manual exposure level.
A little theory: most AEC algorithms use a full-frame weighted metering method. In the washed-out sample you showed, the black background occupies a large portion of the image, which makes the AEC algorithm assume the image is too dark and therefore increase the exposure level internally.
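For reference, a sketch of that suggestion against the same Emgu Capture API used in the question, written in C# rather than VB.NET; the exposure value is only a placeholder, and, as noted in the update above, some camera drivers simply ignore these properties:

using Emgu.CV;
using Emgu.CV.CvEnum;

// Sketch only: turn off auto exposure and set a lower manual exposure.
// Whether either call has any effect depends entirely on the camera driver.
Capture capture = new Capture(0);
capture.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_AUTO_EXPOSURE, 0.0);
// Many DirectShow webcams use a log scale for exposure (roughly 2^value
// seconds); -6.0 is a placeholder, so tune it for the actual camera.
capture.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_EXPOSURE, -6.0);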
Suppose there is a scene as follows:
There is a scene the same size as the frame of the device. The scene has a red ball that can move throughout the 'world'. This world is defined by black and white areas, where the ball is ONLY able to move in the area that is white. Here is a picture to help explain:
Parts of the black area can be erased, as if the user were drawing in white over the scene. This means that the area in which the ball can move is constantly changing. Now, how would one go about implementing a physicsBody for an edge between the white and black areas?
I tried redefining the physicsBody every time the area changed, but once the shape becomes complex enough, this isn't a viable solution at all. I tried creating a two-dimensional array of invisible 'boxes' that record whether most of the area within each box is white or black, and if the ball touched a box that was black, it would be pushed back. However, this required heavy rendering and far too much iterating over the array. Since my original array contained boxes a little bigger than a pixel, I tried making the boxes bigger to smooth the motion a little, but this eventually caused part of the ball to be stopped by white areas and appear to be inside the black area. This was undesired, since the user could feel invisible barriers that they seemed to be hitting.
I tried searching for other methods of implementing this 'destructible terrain' type of scene, but the solutions I found and tried were for other game engines. To further clarify, I am using Objective-C and Apple's SpriteKit framework, and I am not looking for a detailed class full of code, but rather some pseudo-code or implementation ideas that would lead me to a solution.
Thank you.
If your deployment target is iOS 8, this may be what you're looking for...
+ bodyWithTexture:alphaThreshold:size:
Here's a description from Apple's documentation:
Creates a physics body from the contents of a texture. Only texels that exceed a certain transparency value are included in the physics body.
where a texel is a texture element. You will need to convert an image to a texture before creating the SKPhysicsBody.
I'm not sure if it will allow for a hole in the middle like your drawing. If not, I suspect you can connect two physics bodies, a left half and a right half, to form the hole.
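A minimal Objective-C sketch of that call (iOS 8+), assuming the black 'terrain' is drawn with opaque pixels and the erased/white area is fully transparent so the alpha threshold can tell them apart; RebuildWorldBody, worldNode and worldImage are made-up names for the question's setup:

#import <UIKit/UIKit.h>
#import <SpriteKit/SpriteKit.h>

// Sketch only: regenerate the terrain's physics body from the current
// black-and-white world image whenever the user erases part of the black area.
static void RebuildWorldBody(SKSpriteNode *worldNode, UIImage *worldImage)
{
    SKTexture *worldTexture = [SKTexture textureWithImage:worldImage];
    worldNode.physicsBody = [SKPhysicsBody bodyWithTexture:worldTexture
                                            alphaThreshold:0.5f
                                                      size:worldTexture.size];
    worldNode.physicsBody.dynamic = NO; // the terrain itself should not move
}

This still has to run every time the drawing changes, but it replaces the hand-built grid of boxes with a body generated directly from the drawn image.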
Before I start: I'm not a developer, so apologies in advance for any potentially daft questions. Our developer has just integrated a custom SoundCloud player for our website http://www.samplephonics.com/ (see it when you hover over the top 8 or recent sample packs), but I want to change the light grey background colour around the waveform to white.
As far as I am aware, it's configured so that a light grey mask displays the shape of the waveform as a hole, with solid light grey, grey and dark grey images behind it to show the apparent colour of the waveform in its idle, loading and playing states.
Does anyone know how we can change this mask to white, or have any ideas I can send his way? He used this example as a starting point for creating the player: http://static.soundcloud.com/demos/soundcloud-custom-player/examples/sc-player-minimal.html
Thanks in advance :)
I've previously answered this here – https://stackoverflow.com/a/14731623/236135
Unless the library you want to use for waveform customisation is using HTML5 canvas, you won't be able to change the color of that chrome (so no, it is not possible with either the HTML5 Widget API or the Custom Player API).
In short, you'd have to use canvas, or manipulate the images on the server and retrieve them from there.
The white background in my apple touch icon turns black. The white background isn't transparent. The icon has a white triangle, a red circle and black text, but the only things I can make out are the white triangle and the red circle. Any idea what's causing this and how to make the icon keep its white background?
Much appreciated.
I'm not sure exactly what it was, but I may have narrowed it down.
The first time I uploaded the images, they may have had the transparent background.
The second time I uploaded them, I added the white background, but the PNG 'save for devices' setting still had transparency checked.
The third time I uploaded them, I saved them with transparency unchecked, but I still got the black background.
I think that all along the iOS Safari browser had the first set of images stored in its cache. After clearing the cache, the new image showed up with no problem.
Edit: It's the transparent background that turns it black.
As gstricklind found, if your PNG has alpha-channel transparency, iOS will unfortunately ignore any "precomposed" directive and apply all its usual effects, in addition to giving your icon a black background (white didn't seem like a better choice?).
The solution is, of course, to precompose the icons with no transparency. This makes it pretty difficult to take advantage of Android's support for alpha channels while also offering touch icons, if a black background does not work for you. I found this tool useful when composing them: http://www.gieson.com/Library/projects/utilities/icon_slayer
Same issue seen here: How to get iOS to properly respect the "apple-touch-icon-precomposed" link attribute for a "web app"
Transparent Apple Touch Icons are indeed possible. What you need to do is strip all excess information from the PNG using ImageOptim for Mac (https://imageoptim.com/mac) or a similar program on Windows.
I did extensive research and stumbled on the answer by looking at the apple touch icon used by the Virtual DJ site, which keeps its transparent background on iOS and macOS Sierra. I looked at the file size and realized how small it was compared to the custom apple touch icons with transparency that I had made for my favorites in Safari. Next I looked at the file information using the inspector in Mac's Preview app: all of my custom apple touch icons had four tabs in the inspector, named General, Exif, PNG and TIFF, while the Virtual DJ icon had only two, named PNG and General. Once I ran my custom images through ImageOptim, the file sizes dropped by over 90% and the extra TIFF and Exif tabs disappeared from the inspector in Preview. Sure enough, once I reloaded my custom images and launched Safari, all of my Favorites tabs had transparent backgrounds! Please see the attached image for proof: Transparent Apple-Touch-Icons
In the Expression player, when showing a video with a 4:3 aspect ratio, the full-screen video image crops off some of the picture at the bottom and inserts a black bar at the top. What part of the code should I change to fix this issue?
I had exactly the same problem; I have listed how to fix it with Expression Blend here:
http://social.expression.microsoft.com/Forums/en-US/encoder/thread/e58270bf-e101-4a13-ab8c-e7b3b7dbc2a9