I am completely stumped here. I have a series of small images I'm tinkering with and making into buttons. They are all decently crisp and sharp, and they retain this when I open the PNG files in Preview and so on.
However, when I use them in NSButtons and NSImageViews in Interface Builder, with Scaling set to None, the images become horribly blurred. What am I doing wrong? I don't know where to start or what to try; should I go back to the icons and try to make them pixel perfect? Does it have to do with anti-aliasing or something along those lines?
EDIT:
For some reason, it seems as if the NSButtons and NSImageViews are loading the high-resolution versions of the images, even though I'm on a normal display; the high-resolution versions can be identified by a slight light-blue stroke I added to them. Oddly, Quartz Debug does not identify these as high-resolution images and there's no red tint. Removing references to the @2x images does fix the problem... but...
If you check out session 245 from the WWDC 2012 videos, "Advanced Tips and Tricks for High Resolution on OS X", you'll find out why in the first section, on NSImage.
NSImage doesn't have any concept of high resolution - it just uses the smallest image that has more pixels than the space it has to fill - so if your NSImageView is bigger in dimension than your 1x image, it will use the 2x image, as that one has more pixels.
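If you want to see which representations an NSImage is carrying (and therefore which one it may pick), here is a minimal sketch, assuming a hypothetical asset named "button" with a @2x variant in the bundle:
NSImage *image = [NSImage imageNamed:@"button"];
// image.size is in points; each representation reports its real pixel size,
// so the @2x rep shows up with twice the pixel dimensions.
for (NSImageRep *rep in image.representations) {
    NSLog(@"rep: %ld x %ld px, image size: %@",
          (long)rep.pixelsWide, (long)rep.pixelsHigh,
          NSStringFromSize(image.size));
}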
I had this problem before. It seems that if your image's DPI isn't 72, the reported image size will be wrong. You can get the real size using the code below.
NSImage *image = [NSImage imageNamed:@"image"];
// Build a bitmap rep from the image data, read its true pixel dimensions,
// and force the NSImage to report that size (i.e. treat it as 72 DPI).
NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];
NSSize size = NSMakeSize([rep pixelsWide], [rep pixelsHigh]);
[image setSize:size];
When specifying image names in Interface Builder and with [NSImage imageNamed:], make sure to use foo instead of foo.png. While iOS is smart enough to add the @2x in the latter case, Mac OS X is not. It will load the non-Retina image in the latter case, but will add the @2x in the first case (if such an image is present).
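A small sketch illustrating that point (hypothetical asset named "button", with button.png and button@2x.png in the bundle):
// Extension omitted: AppKit loads both button.png and button@2x.png
// and picks the right one for the display.
NSImage *scalable = [NSImage imageNamed:@"button"];
// Extension included: per the note above, OS X loads only the
// non-Retina button.png in this case.
NSImage *lowResOnly = [NSImage imageNamed:@"button.png"];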
Are you assigning the images to your buttons in IB or in code?
If you are doing it in code, creating a copy of the image (e.g. [myImage copy]) and assigning that copy to your button may solve this.
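A minimal sketch of that idea, assuming an existing button outlet named myButton and a hypothetical asset named "icon":
NSImage *original = [NSImage imageNamed:@"icon"];   // hypothetical asset name
// Assign a copy rather than the shared cached instance.
NSImage *buttonImage = [original copy];
[self.myButton setImage:buttonImage];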
In my case (drawing icons in a custom NSOutlineView), I had to make sure that the x,y origin of the draw rect was rounded to integer values:
NSMakeRect( round(NSMinX(cellFrame)-iconSize.width),
round(NSMidY(cellFrame)-(iconSize.height/2.0f)), …);
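For context, a fuller sketch of that pixel-aligned draw (iconImage, iconSize, and cellFrame are assumed to exist in the cell subclass; the geometry mirrors the snippet above):
// Round the origin to whole pixels; a fractional origin forces
// interpolation and blurs the icon.
NSRect iconRect = NSMakeRect(round(NSMinX(cellFrame) - iconSize.width),
                             round(NSMidY(cellFrame) - (iconSize.height / 2.0)),
                             iconSize.width,
                             iconSize.height);
[iconImage drawInRect:iconRect
             fromRect:NSZeroRect
            operation:NSCompositeSourceOver
             fraction:1.0];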
This is actually a response to the earlier post about DPI, but I was unable to reply directly to it. The code in that post gave the true pixel dimensions for me (that is, it did not indicate any trouble). However, image DPI was definitely the culprit in my case. The symptoms I was seeing were:
With my NSImageViews set to No Scaling, the images would appear squashed.
With my NSImageViews set to Axes Independently, most images would appear correctly if the dimensions of the NSImageViews were set to exactly match the dimensions of the image.
However, even in this case, some images had strange artifacts in them that were not there when viewing the same image via Preview or elsewhere (or even via Interface Builder, for that matter -- they only appeared at runtime).
The images that had trouble were at a DPI other than 72. When I re-created the images at 72 DPI, all of the above behavior disappeared.
This was a pretty confounding issue -- I hope this helps someone!
For me, I just needed to set image scaling to none:
In Interface Builder
In code
// For an NSImageView, set scaling on its cell:
NSImageCell *imageCell = (NSImageCell *)someImageView.cell;
[imageCell setImageScaling:NSImageScaleNone];

// For an NSButton, set scaling on its button cell:
NSButtonCell *buttonCell = (NSButtonCell *)someButton.cell;
[buttonCell setImageScaling:NSImageScaleNone];
Related
I am trying to save the view with its subview, but the saved image is a little bit blurry (especially the label's text).
I tried all the solutions given on Stack Overflow - no use.
Can anyone help me on the same?
I am using the below code
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
And I'm getting blurred text; the picture quality is also low.
You could try a higher resolution image. Scaling a high-resolution image down should be fine, but scaling a low-resolution image up to a larger size will generally blur the image contents, as it stretches everything.
The preferred approach is [UIView snapshotViewAfterScreenUpdates:]. You should only use drawViewHierarchyInRect:afterScreenUpdates: if you plan to apply additional effects.
That said, there are several likely causes, depending on how you're manipulating or saving the image. For example, saving text in JPEG format will cause blurriness. Rotating or scaling the image without great care can make the text blurry. Drawing the image incorrectly (for instance, failing to pixel-align it) can make the text blurry. If your process has multiple steps, simplify it and validate the quality at each step. To discuss it further on Stack Overflow, you need to provide details on how you're manipulating and displaying the image, not just how you generate it.
Text is extremely susceptible to artifacts. If you must take pictures of it (something you generally should avoid if at all possible), you should make sure to manipulate it as little as possible. It is always better to manipulate the text before it's drawn rather than after.
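As a concrete illustration of those points, a sketch that renders the view at the screen's native scale and keeps the result lossless (view is assumed to be the view being captured; the output path is hypothetical):
// Scale of 0 means "use the screen's scale" (2.0 on Retina),
// so text is rendered at full resolution.
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Save as PNG rather than JPEG; JPEG compression smears text edges.
NSData *pngData = UIImagePNGRepresentation(snapshot);
[pngData writeToFile:@"/tmp/snapshot.png" atomically:YES]; // hypothetical path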
I have set a patterned background on my UIView using:
myView.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"backgroundImage.png"]];
But the image appears to be stretched or scaled up and doesn't appear at the desired resolution. Is there a way to set the size or scale of a background pattern? Or is the image's size used as a default? Does the image DPI have an effect?
The pattern is constructed by tiling the image until it fills the given area.
So there is no control on tile size other than the original image dimensions.
Now, if you want to provide Retina images you should just have a @2x version and iOS will take care of that automatically (by the way, change the method call to [UIImage imageNamed:@"backgroundImage"] - the file extension is optional for PNG images).
Do not provide higher-DPI images for Retina; instead, provide an image that is twice the size of the non-Retina one (and obviously not by simply upscaling the existing image).
Finally the only control you seem to have on the pattern (at least the only one that is documented) is the phase. Here is the relevant part from the official documentation:
By default, the phase of the returned color is 0, which causes the top-left corner of the image to be aligned with the drawing origin. To change the phase, make the color the current color and then use the CGContextSetPatternPhase function to change the phase.
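A minimal sketch of adjusting the phase inside a custom view's drawRect: (the image name matches the question; the 20-point offset is just an example):
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Shift where tiling begins; by default the top-left tile is
    // aligned with the drawing origin.
    CGContextSetPatternPhase(context, CGSizeMake(20.0, 20.0));

    [[UIColor colorWithPatternImage:[UIImage imageNamed:@"backgroundImage"]] setFill];
    CGContextFillRect(context, rect);
}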
Turns out I wasn't using the @2x naming convention, so images were appearing stretched. I added it in and it fixed everything.
I am a bit confused about managing different-sized images for Retina and non-Retina displays.
I add a custom button in Storyboard, add some text, and then set the background image, which is a vector done in Illustrator (width: 630 / height: 130):
UIImage *img = [UIImage imageNamed:@"iPad1_orange_button.png"];
[myButton setBackgroundImage:img forState:UIControlStateNormal];
The button shows up, but it comes out very small.
I have another image with the @2x suffix for Retina, but that comes out the same size.
My question is how to manage the sizes of the buttons in regard to image size. Do I need to set the size of the button manually?
Also, when I create a button in Illustrator with the same pixel size as the button I use in Xcode and export it as .png, add it to Xcode, and drag it into Storyboard, it comes out very large.
Just a quick clarification:
The storyboard dimensions are NOT in pixels.
An iPhone 4S has a 640x960 (x, y) screen @ 326 ppi. Xcode dimensions are 320 by 460 (x, y). Just take the numbers and convert them to get the appropriate pixel sizes.
I.e.: if you want a button that is 100 wide and 100 tall in the storyboard, you'd need to create an image 100*(640/320) px wide and 100*(960/460) px tall (I believe my math is right...).
Depending on what you're building for you'd need different images to cater to the different devices.
On another note, the term retina display doesn't designate a clear-cut standard of px by px. If I recall correctly it's a term Apple coined that basically means that a screen has enough px that the human eye (hence retina) would not be able to see the jaggies.
Similar to Intel's coining of the term Ultrabook; no REAL spec floor/ceiling just a marketing ploy.
Think of dimensions, and what you would consider a pixel, as "density-independent pixels". You always work with the original size for position and layout; it is up to the operating system to handle the different resolutions for you. For now we have the original size and Retina, which is 2x the density, but this could change. You never need to manually change the resolution for the given device.
This is simple:
make the button frame the size of the normal res image
when you load the image with UIImage imageNamed: do it like this
UIImage *img = [UIImage imageNamed:@"iPad1_orange_button"];
*Note I removed the file extension (.png) - this instructs UIImage to select the resolution-appropriate version, the regular or the @2x. iOS will show the 2x version of the image in the same frame size as the normal-res image. Pretty simple.
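Putting both steps together, a small sketch (the asset name matches the question; the frame size is illustrative and assumes the 630x130 artwork is the @2x version, so 315x65 at 1x):
// Frame uses the 1x image's dimensions, in points.
myButton.frame = CGRectMake(20.0, 20.0, 315.0, 65.0);

// No file extension: UIImage picks iPad1_orange_button.png or
// iPad1_orange_button@2x.png as appropriate for the screen.
UIImage *img = [UIImage imageNamed:@"iPad1_orange_button"];
[myButton setBackgroundImage:img forState:UIControlStateNormal];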
I have a 64px by 64px redSquare.png file at a 326ppi resolution. I'm drawing it at the top left corner of my View Controller's window as follows:
myImage = [UIImage imageNamed:@"redSquare.png"];
myImageView = [[UIImageView alloc] initWithImage:myImage];
[self.view addSubview:myImageView];
Given that the iPhone 4S has a screen resolution of 960x640 (326 ppi), there should be enough room for 9 more squares to fit next to the first one. However, there's only room for 4 more; i.e. the square is drawn larger than it should be, given my measurements.
// even tried resizing UIImageView in case it was
// resizing my image to a different size, by adding
// this next line, but no success there either :
myImageView.frame = CGRectMake(0, 0, 64, 64);
I believe it has to do with the way the device is "translating" my pixels. I read about the distinction between Points Versus Pixels in Apple's documentation but it doesn't mention how one can work around this problem. I know I'm measuring in pixels. Should I be measuring in points? And how could I do that? How exactly am I to resize my image so that it can hold 9 more same-sized squares next to it (i.e. on the same horizontal..) ?
Thank you
To display an image at full resolution on a Retina display, it needs to have @2x appended to the end of its name. In practice, this means you should save the image you're currently using as redSquare@2x.png and a 32x32-pixel version of that image as redSquare.png.
Once you have done this, there is no need to change your code. The appropriate image will be displayed depending on the device's capabilities. This will allow your app to render correctly on both Retina and non-Retina devices.
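To make the arithmetic concrete: with redSquare.png at 32x32 and redSquare@2x.png at 64x64, the image view measures 32 points, so ten of them fit across the 320-point width of the screen. A purely illustrative sketch:
for (NSUInteger i = 0; i < 10; i++) {
    UIImageView *square = [[UIImageView alloc]
        initWithImage:[UIImage imageNamed:@"redSquare"]];
    // Frames are in points: 32 points = 64 pixels on a Retina display.
    square.frame = CGRectMake(i * 32.0, 0.0, 32.0, 32.0);
    [self.view addSubview:square];
}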
I have a rectangular NSImage A and I want to scale to embed into a squared transparent image B keeping A's ratio. So, in the end I'll get a squared image with the rectangle in it.
How can I compose that image?. I mean, how can I draw an NSImage over another NSImage and save the resulting image?.
I've been reading about clipping an NSImage inside a Bézier path, but I need to keep the ratio instead of filling the Bézier square.
I hope you understand what I want.
Thanks.
The 'Cocoa Drawing Guide' has a section called 'Drawing to an Image'. From that documentation:
It is possible to create images programmatically by locking focus on an NSImage object and drawing other images or paths into the image context. This technique is most useful for creating images that you intend to render to the screen, although you can also save the resulting image data to a file.
There is example code there.
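A sketch of that technique applied to this question: draw the rectangular image, scaled to preserve its ratio, centered in a square transparent image, then save it (sourceImage, squareSide, and the output path are assumed inputs):
NSImage *squareImage = [[NSImage alloc] initWithSize:NSMakeSize(squareSide, squareSide)];
[squareImage lockFocus];

// Scale so the longer side of the source just fits the square, keeping the ratio.
NSSize sourceSize = sourceImage.size;
CGFloat scale = MIN(squareSide / sourceSize.width, squareSide / sourceSize.height);
NSRect targetRect = NSMakeRect((squareSide - sourceSize.width * scale) / 2.0,
                               (squareSide - sourceSize.height * scale) / 2.0,
                               sourceSize.width * scale,
                               sourceSize.height * scale);

// Everything outside targetRect stays transparent.
[sourceImage drawInRect:targetRect
               fromRect:NSZeroRect
              operation:NSCompositeSourceOver
               fraction:1.0];
[squareImage unlockFocus];

// Convert to PNG data and write it out (path is hypothetical).
NSData *tiffData = [squareImage TIFFRepresentation];
NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:tiffData];
NSData *pngData = [rep representationUsingType:NSPNGFileType properties:@{}];
[pngData writeToFile:@"/tmp/composed.png" atomically:YES];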