I'm using the react-native-linear-gradient package to draw a linear gradient. I picked the exact same colors with the color picker from the XD design to reproduce the gradient in my app. Here is my code:
<LinearGradient start={{x:0,y:0}} end={{x:1,y:1}} style={StyleSheet.absoluteFill} colors={['#D300B5', '#FF5400']} >...
These hex values are identical to the ones in the design, yet here is the result compared with the design:
The colors are significantly washed out. I've checked the opacities to make sure everything is at 100%, there is nothing overlaid on the gradient, the gradient view isn't extending beyond the screen, and XD, my Mac, and the iPhone X all use the Display P3 color space.
Why are the colors washed out?
Note: This solution is iOS-only and applies to all colors used in the app.
After a long time of not being able to find anything, I created a patch for React Native itself, as the problem originates from how React Native creates colors in native code, in RCTConvert.m:
return [UIColor colorWithRed:... green:... blue:...]
Switching both occurrences (there are two as of writing) of colorWithRed to colorWithDisplayP3Red and rebuilding the app (don't forget: since we're changing native code, hot reloading won't work) did the trick: colors are now rendered in the P3 color space. Please note that this approach changes all colors that you create/use in the app, so every color will basically look more crisp.
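For reference, here is a minimal sketch of the patched conversion, written as a hypothetical standalone helper (the actual change is made in place inside RCTConvert.m, where the variable names may differ between React Native versions):

#import <UIKit/UIKit.h>

// Hypothetical helper mirroring the patch: build the color in the
// Display P3 color space (available since iOS 10) instead of sRGB.
static UIColor *ColorInDisplayP3(CGFloat r, CGFloat g, CGFloat b, CGFloat a)
{
    // Original RCTConvert.m line (both occurrences):
    //   return [UIColor colorWithRed:r green:g blue:b alpha:a];
    // Patched line:
    return [UIColor colorWithDisplayP3Red:r green:g blue:b alpha:a];
}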
My developer is running into a problem that he says is impossible to solve in React Native, and I just don't believe him. Can this be done:
I have an animation of a ball moving along a linear scale
As this ball moves along the scale, when it passes certain values, the image of the ball is meant to change (i.e. values 1-4 = Image1, 5-7 = Image2, etc.)
The change in the image should not affect the "flow" of the moving graphic
Is this possible?
What my developer has done is calculate the length of time it takes the ball to reach each "change" value (5 in the above example) and swap the image at these times, using a timeout to load the new image. The result is almost OK, but it has a "jittery" effect and therefore doesn't look the way I want it to.
Are there any other possibilities with React Native?
Thanks.
Best regards,
A
I'm using the EMGU CV .NET library. I noticed that when I take pictures of anything with color, the colors usually get "washed out" if the background is dark(ish). The general rule of thumb I've found is: the darker the background, the more washed out the colors.
Here is how I'm retrieving the image from the camera with EMGU.
Dim imgFeed As Bitmap = mCamera.RetrieveBgrFrame.ToBitmap
In the images below (cropped out some of the background on both), the left image is on dry white cement and the right image is on wet white cement. You can see the "washed out" color especially on the first tag, which is bright orange duct tape.
Here is another image, taken on black pavement in the sun, which in reality is much darker than the white cement, but appears similar in color to the background in the wet cement image above.
Is there some sort of auto-balancing that's occurring in the EMGU library? If so, can I stop this from happening? I need to see the colors more clearly than the background. I've read about _EqualizeHist() and I implemented it, but that did not help me see the colors any more clearly; adding contrast to the image didn't really help because the colors were already close to white.
Update
After reading Spark's answer, I found the SetCaptureProperty() method. I see that you can disable the auto exposure property by setting the value to 0 as shown below.
mCamera.SetCaptureProperty(CvEnum.CAP_PROP.CV_CAP_PROP_AUTO_EXPOSURE, 0.0)
Sadly though, with the particular camera I'm using, it looks like the driver does not support changing this property.
This has nothing to do with the algorithm. It is the behavior of the Auto Exposure Control (AEC) algorithm running inside the camera chip. Try disabling the camera's auto exposure control and reducing the manual exposure level.
A little theory: most AEC algorithms work with a full-frame weighted method. So in the washed-out sample you showed, the black background occupies a larger portion of the image, which makes the AEC algorithm assume the image is too dark and hence increase the exposure level internally.
I'm trying to "load" a 2D image as a material for a mesh (I know about SpriteManager), but unfortunately I'm getting the sprite with its white background. How can I make the background go away?
Thanks.
If the image is of a file type that supports transparency, open it up in image editing software (paint.net, Photoshop, etc.) and delete the white or replace it with an empty/transparent color.
Otherwise, look for an option in the Unity documentation to set a specific color value as 'background' or 'transparent' so that that color will be ignored.
First of all, you need to add an alpha channel to your texture and save it in a format that supports alpha channel transparency.
Here is a quick tutorial on how to do this in GIMP:
Note that you can remove the selected background with the Delete key.
In my case, I'm exporting the result as a PNG for alpha transparency support. You can do this from the export menu by renaming the file suffix to .png:
I use these settings to export my PNG:
Then, after importing the image into Unity, make sure that in the texture's import settings the Alpha Source is set to Input Texture Alpha and that the Alpha Is Transparency tickbox is checked, like so:
Finally, since you are using this on a mesh, you need to ensure that your mesh has a material applied that has its render mode set to Cutout:
Hope this little guide helps.
I'm creating a coloring book app for a client and it's coming along well. Very well, in fact, because it's finished. However, there's one snag: some of the colors aren't displaying as expected.
The client gave me images and a key to use for the brush creation. I took the color reference and made circles in several different sizes for each color, to represent the different brush sizes. I then load the brush like so:
brush = [[CCSprite spriteWithFile:@"yellowbrush3.png"] retain];
[brush setBlendFunc: (ccBlendFunc) { GL_ONE, GL_ONE_MINUS_SRC_ALPHA }];
[brush setOpacity:20];
The brush image for this particular file is:
I took a screenshot of the color output to compare with the key I used to create the brushes:
About half of the colors show up just fine, while the others are noticeably off.
I've tried different levels of opacity, changing some GL settings, but nothing seems to help.
This is due to a blending artifact and the fact that you are drawing on a white background. To confirm this, alter the background color and see that the hue for each color changes slightly. To correct this issue you can reduce the opacity of the painted colors, or pick a more appropriate blending mode.
Try changing your glBlendFunc to use GL_ONE for both parameters. This will remove the blending but at least your colors should be 100% accurate.
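For clarity, a sketch of that one-line change on the brush sprite from the question (with GL_ONE as both the source and destination factor, the brush color is simply added on top of what is already in the framebuffer instead of being alpha-blended with it):

// original: [brush setBlendFunc:(ccBlendFunc){ GL_ONE, GL_ONE_MINUS_SRC_ALPHA }];
[brush setBlendFunc:(ccBlendFunc){ GL_ONE, GL_ONE }];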
I have a CCSprite that was created from a PNG with a transparent background.
I want to be able to apply colors to this sprite in a way that lets me freely define which color it is, without the sprite's actual colors affecting how much of each color I have to add.
I've tried this:
mySprite.color = ccc3(200,200,255);
In an attempt to add a little blue-ish feel to my sprite. But since the tint is applied based on the existing colors of the sprite, and my sprite has virtually no blue in it (most of it is yellow), the resulting effect is pretty sketchy: everything gets really dark, and there is a slight blue-ish coloring, but not what I wanted.
The ideal effect for me in this case would be to ADD a light blue mask to it with very low alpha.
Is there an easy way to do that without composing sprites?
I've tried using CCTexture2D, but had no luck, as there is no built-in method for working with colors, and most tutorials only teach you how to build textures out of image files.
This is deceptively hard to do in code with the original sprite. Another option would be:
create a new sprite, which is just a white outline version of your original sprite
the color property of this white sprite will now respond exactly to the RGB values you pass in
so pass in your light blue value to the white sprite and set the opacity correctly
then overlay it on your original sprite
Any good?
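A rough sketch of that idea in cocos2d Objective-C (the file names and values are placeholders; sprite_white.png is assumed to be a white-silhouette copy of the original art):

CCSprite *original = [CCSprite spriteWithFile:@"sprite.png"];
// White copy of the same art: its color property responds 1:1 to the tint.
CCSprite *overlay = [CCSprite spriteWithFile:@"sprite_white.png"];
overlay.color = ccc3(200, 200, 255);   // the light blue from the question
overlay.opacity = 60;                  // low alpha on the 0-255 scale; tweak to taste
overlay.position = ccp(original.contentSize.width / 2, original.contentSize.height / 2);
[original addChild:overlay];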
The only way you can achieve this is by overlaying (masking) the sprite with the proper OpenGL blend functions. When you say "add a light blue mask", that's what you need to do. You may find this visual blendfunc tool helpful, if only to understand how blending with mask sprites works and what you can achieve with it. There's no built-in support for this in Cocos2D, however, so you'll have to resort to pure OpenGL inside a cocos2d node's -(void) draw {} method.
Tinting (changing the color property) will only modify the RGB channels of the entire image by changing the vertex colors of all 4 vertices.
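To put rough numbers on it (using a hypothetical yellow texel, since the actual art isn't shown): tinting multiplies each channel of a texel by tint/255, so with ccc3(200, 200, 255) a texel of (255, 230, 0) becomes roughly (200, 180, 0). The blue channel stays at 0 no matter how much blue is in the tint, which is why the sprite only gets darker instead of turning bluish.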