Overlay non transparent Pixels in iOS Cocoa - objective-c

Is there any way on the iOS SDK to overlay the non-transparent pixels in an image with colored pixels?
Thanks very much to both who answered.
The final solution I implemented used the code from the accepted answer inside the drawRect: method of a subclassed UIView. To overlay the color I used the following (note that CGContextSetFillColorWithColor is safer than CGContextSetFillColor, which requires the fill color space to have been set first):
CGContextSetFillColorWithColor(context, [UIColor colorWithRed:0.5 green:0.5 blue:0 alpha:1].CGColor);
CGContextFillRect(context, area);

I think you are probably looking for the blend mode kCGBlendModeSourceAtop. First, draw the UIImage into your view. Then obtain the current graphics context with
CGContextRef currentCtx = UIGraphicsGetCurrentContext();
Then save the context state, and set the blend mode:
CGContextSaveGState(currentCtx);
CGContextSetBlendMode(currentCtx, kCGBlendModeSourceAtop);
Then you can draw whatever you want to be overlaid over the opaque pixels of the UIImage. Afterward, just reset the context state:
CGContextRestoreGState(currentCtx);
Hope this helps!
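Putting those steps together, a drawRect: implementation along these lines should work. This is a sketch only: the overlayImage property name and the overlay color are placeholder assumptions, not from the question.

```objc
// Sketch of the approach above, inside a UIView subclass.
// self.overlayImage is an assumed UIImage property.
- (void)drawRect:(CGRect)rect
{
    // 1. Draw the image first.
    [self.overlayImage drawInRect:self.bounds];

    // 2. Switch to source-atop so later drawing only lands on opaque pixels.
    CGContextRef currentCtx = UIGraphicsGetCurrentContext();
    CGContextSaveGState(currentCtx);
    CGContextSetBlendMode(currentCtx, kCGBlendModeSourceAtop);

    // 3. Fill with the overlay color; transparent areas of the image stay transparent.
    CGContextSetFillColorWithColor(currentCtx, [UIColor redColor].CGColor);
    CGContextFillRect(currentCtx, self.bounds);

    // 4. Restore the previous context state.
    CGContextRestoreGState(currentCtx);
}
```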

You can modify how something is drawn using the blend mode. See http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Reference/CGContext/Reference/reference.html#//apple_ref/doc/c_ref/CGBlendMode for a full list of the blend modes supported by Core Graphics on iOS. From your description, I think you would be interested in either kCGBlendModeSourceIn, which draws the new content using the old content's alpha value as a mask, or kCGBlendModeSourceAtop, which draws the new content masked by the old content's alpha on top of the old content masked by the inverse of the new content's alpha. You can set the blend mode for all subsequent drawing using CGContextSetBlendMode, or you can draw a UIImage with a particular blend mode using -[UIImage drawAtPoint:blendMode:alpha:].
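As a sketch of the image-drawing variant, tinting one image onto the opaque pixels of another could look like the following. The variable names baseImage and overlayImage are assumptions for illustration:

```objc
// Assumes baseImage and overlayImage are UIImage instances of the same size.
UIGraphicsBeginImageContextWithOptions(baseImage.size, NO, baseImage.scale);
[baseImage drawAtPoint:CGPointZero];
// Source-atop: overlay pixels only appear where baseImage is opaque.
[overlayImage drawAtPoint:CGPointZero blendMode:kCGBlendModeSourceAtop alpha:1.0];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```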

Related

CATextLayer subpixel antialiasing

My app draws layer-backed labels over an NSImageView.
The image view displays an image, with a tint color over that image.
This ensures that the contrast between the labels and the background image is sufficient.
As you can see, subpixel antialiasing is enabled and works correctly.
When you hover over those labels, the view containing them animates its frame property.
While animating, subpixel antialiasing is disabled; when the animation finishes, it is enabled again.
This looks incredibly weird.
The layer is never redrawn, and the subpixel antialiasing doesn't have to change, so I don't see a good reason why it shouldn't be applied while animating.
I've tried everything I can think of.
Making the NSTextField opaque
Making the CATextLayer opaque
Giving the NSTextField a background-color
Giving the CATextLayer a background-color
Always the same result.
Disabling subpixel antialiasing for the labels is not an option, since the text is then hard to read on non-retina displays.
EDIT
I forgot that the layer is replaced with a presentationLayer while animating.
This layer probably does not support subpixel antialiasing, which is why it's disabled.
Now the question is whether I can replace this presentationLayer with a CATextLayer.
I also noticed that setting shouldRasterize to YES enables subpixel antialiasing during the animation as well, but only against the layer's background color. So without a background color there is still no subpixel antialiasing.
Is there any way that you can post a piece of sample code? I quickly mocked up an NSWindow, added an NSImageView, and added a background-less NSTextField with setWantsLayer: set to YES. In my applicationDidFinishLaunching: I set a new rect on the NSTextField's animator's frame, but I didn't see any pixelation.
The problem is with the positioning of the text layer. Let's presume you use left alignment. The text will only look sharp if the x and y coordinates of the layer's frame origin are whole numbers. For example:
CGFloat x = 10.6;
CGFloat y = 10.3;
textLayer.frame = CGRectMake(x, y, width, height); // the text will be blurry
textLayer.frame = CGRectMake(round(x), round(y), width, height); // the text will be sharp
So, the first problem may be that the coordinates you assign to the layer's frame are not whole numbers.
The tricky part is that the edge of the layer may still not be aligned with pixels even if you think you passed whole-number coordinates. This can happen if the anchorPoint of your layer is (0.5, 0.5), which is the default value. If you set:
textLayer.position = CGPointMake(10.0, 10.0);
you may think it should draw the text sharp. However, the position point is at the center of the layer here, and depending on the layer's width and height, the coordinates of the left and top edges may be fractional numbers.
If you want to make a quick test, do this:
1. Use textLayer.frame = frame instead of position and anchorPoint; this assigns the coordinates directly to the frame.
2. Make sure the numbers you use in the frame are whole numbers.
3. Don't mess with the rendering mechanism; remove any code that changes shouldRasterize, antialiasing, etc.
If this makes the text sharp, you can start using anchorPoint and position again and see how the result changes.
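If you do need to keep the default anchorPoint, one way to stay pixel-aligned is to compute the implied origin from the desired position and round it before assigning the frame. A sketch, assuming textLayer and a desired centre point position already exist:

```objc
// Convert a desired centre position into a pixel-aligned frame,
// compensating for the default anchorPoint of (0.5, 0.5).
CGSize size = textLayer.bounds.size;
CGPoint origin = CGPointMake(position.x - size.width * 0.5,
                             position.y - size.height * 0.5);
textLayer.frame = CGRectMake(round(origin.x), round(origin.y),
                             size.width, size.height);
```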

How to stretch UITabBarItem background in iOS

This is what I'm using now:
tabBarController.tabBar.selectionIndicatorImage = [UIImage imageNamed:@"image.png"];
I have an image a couple of pixels in size, of a solid color, and I would like to stretch it to the full width and height of my tab bar item.
How would you do this?
This should help solve your problem
tabBarController.tabBar.selectionIndicatorImage = [[UIImage imageNamed:@"image.png"]
    resizableImageWithCapInsets:UIEdgeInsetsMake(0, 0, 0, 0)];
From Apple documentation
During scaling or resizing of the image, areas covered by a cap are not scaled or
resized. Instead, the pixel area not covered by the cap in each direction is tiled, left-to-
right and top-to-bottom, to resize the image. This technique is often used to create
variable-width buttons, which retain the same rounded corners but whose center region grows
or shrinks as needed. For best performance, use a tiled area that is a 1x1 pixel area in size.
Since we are setting the edge insets to zero, the entire image is tiled (effectively stretched, since it is a solid color) to cover the whole area.

How to set an image to a NSButton to be resized along with the button?

I have an image (3 px wide × 15 px high; the first and last pixels are rounded at the corners), and I want to create a button that uses this image no matter what size the button is (the height of the button is always the same, 15 px, but the width changes), kind of like UIEdgeInsets on iOS.
I've tried to use [[button cell] setImageScaling:NSImageScaleAxesIndependently]; but the image gets distorted and loses quality.
How can I do this on OSX? Thanks!
You could probably create a custom subclass of NSButtonCell and use the Application Kit's NSDrawThreePartImage() function in the cell's drawInteriorWithFrame:inView: method.
For more info on how to use NSDrawThreePartImage(), see Cocoa Drawing Guide: Drawing Resizable Textures Using Images.
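A minimal sketch of such a cell subclass might look like this; the class name and the three image names are made up for illustration:

```objc
@interface ThreePartButtonCell : NSButtonCell
@end

@implementation ThreePartButtonCell

- (void)drawInteriorWithFrame:(NSRect)frame inView:(NSView *)controlView
{
    // Left/right caps keep their size; the centre image is stretched.
    NSDrawThreePartImage(frame,
                         [NSImage imageNamed:@"button-left"],
                         [NSImage imageNamed:@"button-center"],
                         [NSImage imageNamed:@"button-right"],
                         NO,                      // NO = horizontal three-part image
                         NSCompositeSourceOver,
                         1.0,
                         [controlView isFlipped]);
}

@end
```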
It's also possible to set this in Interface Builder: Attributes Inspector > Button section > Background.

Change color of custom drawn UIView using block animations

I have a UIView which draws itself in drawRect:. I use Core Graphics to draw a bezier curve. On some occasions I would like to animate a change of the color of the drawn bezier curve. However, just having the color as a property and changing it in an animation block doesn't work, and I also need to ensure the view is redrawn correctly. What is the way to do this? I'm quite new to iOS.
I don't think you can do what you are trying to do. The system has special code that knows how to animate changes to its animatable properties, but it does not support animating changes to any other properties.
To animate a color change, you're going to need to use a CAShapeLayer to do your drawing. The strokeColor and fillColor properties of shape layers are animatable.
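A rough sketch of that setup, assuming you already have a UIBezierPath for the curve (the property names and colors here are illustrative):

```objc
// Replace the drawRect: drawing with a shape layer.
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path = self.bezierPath.CGPath;  // your existing curve (assumed property)
shapeLayer.fillColor = NULL;
shapeLayer.strokeColor = [UIColor blueColor].CGColor;
shapeLayer.lineWidth = 2.0;
[self.layer addSublayer:shapeLayer];

// Animate the stroke color; strokeColor is an animatable layer property.
CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"strokeColor"];
anim.fromValue = (id)[UIColor blueColor].CGColor;
anim.toValue = (id)[UIColor redColor].CGColor;
anim.duration = 0.5;
[shapeLayer addAnimation:anim forKey:@"strokeColorChange"];
shapeLayer.strokeColor = [UIColor redColor].CGColor; // keep the model value in sync
```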

Using resizableImageWithCapInsets: image for button only works for the state set, other states show a "gap"

When using resizableImageWithCapInsets: to create an image for a UIButton, only the normal state (the state the image was set for using setBackgroundImage:forState:) works. All other states show a gap instead of the drawn image. The UIButton documentation says that if no image is set for a particular state, the normal state image will be used, with an overlay for the disabled and selected states.
Here is the normal state:
Here is the selected state:
And here is the source image:
It clearly is using the resizable image I provided, but the image is not drawing the resized area. (You can see the left and right edges but the middle area that is to be stretched just isn't drawn).
Interestingly, stretchableImageWithLeftCapWidth:topCapHeight: does work. Now this is a deprecated method in iOS 5, but with the gap being shown in the new API, I may be stuck using it.
I do recognize that I can provide more images for each state but that defeats the purpose I'm trying to achieve of reducing memory footprint plus adds extra dependency on my graphics designer which I'd like to avoid.
// This is the gist of the code being used
UIImage* image = [UIImage imageNamed:@"button.png"];
UIEdgeInsets insets = UIEdgeInsetsMake(image.size.height/2, image.size.width/2, image.size.height/2, image.size.width/2);
image = [image resizableImageWithCapInsets:insets];
[self.button setBackgroundImage:image forState:UIControlStateNormal];
// Even doing the following results in the same behaviour
[self.button setBackgroundImage:image forState:UIControlStateSelected];
You aren't creating your insets properly for the image capping. I've reproduced your issue and corrected it by using the correct insets.
With your current code, you are creating caps of half of the image height and width - this leaves you with a "stretchable" area of 0x0 pixels - so you get nothing in the middle.
Why this isn't showing up as wrong in the normal state of the button I'm not sure - perhaps there is some optimisation built in to UIButton to fix things or auto-stretch if you don't supply a stretchable image, and this is not applied to the other states.
The caps are supposed to define the area of the image that must not be stretched. In the case of your button.png image, that is 6 pixels on the left and right sides and 16 pixels in from the top and bottom. This isn't quite standard: you should tell your graphics designer that (at least for left-right stretching, which is the most common) you only need a 1px area in the centre, although this does not affect the outcome. If you do have a 1px stretchable area, you can standardise your code by deriving the caps from the image size, as you tried to do in your question: each cap is then (image.size.height - 1) / 2 for top/bottom, and the same with width for left/right.
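If the image does have a 1px stretchable centre, deriving the caps from the image size could look like this (a sketch; it assumes button.png follows that layout):

```objc
UIImage *image = [UIImage imageNamed:@"button.png"];
// Caps cover everything except a 1pt stretchable centre in each direction.
CGFloat horizontalCap = (image.size.width  - 1.0) / 2.0;
CGFloat verticalCap   = (image.size.height - 1.0) / 2.0;
UIEdgeInsets insets = UIEdgeInsetsMake(verticalCap, horizontalCap,
                                       verticalCap, horizontalCap);
image = [image resizableImageWithCapInsets:insets];
```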
To get the correct images on your button, use the following code for creating the stretchable image:
UIEdgeInsets insets = UIEdgeInsetsMake(16, 6, 16, 6);
image = [image resizableImageWithCapInsets:insets];
I was experiencing problems while using resizable images on iOS 5 too. It turns out that if your button is of type "Rounded Rect" and you manipulate the background images, the resizable images will not behave as expected. (iOS 6 handles this OK, presumably by assuming your new button type and adjusting as needed.)