Obtaining the maximum height of a font - objective-c

So I have an NSFont, and I want to get the maximum dimensions for any character, i.e. the pitch and letter height. [font maximumAdvancement] seems to return an NSSize of {pitch, 0}, so that's not helping. Bounding rect doesn't seem to work either, and the suggestion from jwz's similar question of creating a bezier path, appending a glyph, and getting the bounding rectangle is also giving me back {0, 0}. What gives here?
UPDATE: The code I'm using to get the bezier size is this:
NSBezierPath *bezier = [NSBezierPath bezierPath];
NSGlyph g;
{
    NSTextStorage *ts = [[NSTextStorage alloc] initWithString:@" "];
    [ts setFont:font];
    NSLayoutManager *lm = [[NSLayoutManager alloc] init];
    NSTextContainer *tc = [[NSTextContainer alloc] init];
    [lm addTextContainer:tc];
    [tc release]; // lm retains tc
    [ts addLayoutManager:lm];
    [lm release]; // ts retains lm
    g = [lm glyphAtIndex:0];
    [ts release];
}
NSPoint pt = {0.0f};
[bezier moveToPoint:pt];
[bezier appendBezierPathWithGlyph:g inFont:font];
NSRect bounds = [bezier bounds];

The glyph for the space character doesn't have any subpaths, so of course its bounds have size NSZeroSize. Try -[NSFont boundingRectForFont] instead.
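For reference, a minimal sketch of reading the font-wide metrics (which value you want depends on whether you need the union of all glyph bounds or the line height the layout system would use; font here is assumed to be the NSFont from the question):
NSRect maxBounds = [font boundingRectForFont];   // union of the bounding rects of every glyph
CGFloat maxGlyphHeight = NSHeight(maxBounds);
CGFloat lineHeight = [font ascender] - [font descender] + [font leading]; // descender is negative
CGFloat pitch = [font maximumAdvancement].width; // the value the question already obtains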

Related

Autosize Text in Label for PaintCode CGContext

I'm using the following to draw text inside a Bezier Path. How can I adjust this to allow the text to autosize?
EDIT
I was able to update to the iOS 7 methods, but still nothing. I can autosize text within a UILabel fine, but because this is a CGContext it is harder.
NSString* textContent = @"LOCATION";
NSMutableParagraphStyle* locationStyle = NSMutableParagraphStyle.defaultParagraphStyle.mutableCopy;
locationStyle.alignment = NSTextAlignmentCenter;
NSDictionary* locationFontAttributes = @{NSFontAttributeName: [UIFont fontWithName:myFont size: 19], NSForegroundColorAttributeName: locationColor, NSParagraphStyleAttributeName: locationStyle};
CGFloat locationTextHeight = [textContent boundingRectWithSize: CGSizeMake(locationRect.size.width, INFINITY) options: NSStringDrawingUsesLineFragmentOrigin attributes: locationFontAttributes context: nil].size.height;
CGContextSaveGState(context);
CGContextClipToRect(context, locationRect);
[textContent drawInRect: CGRectMake(CGRectGetMinX(locationRect), CGRectGetMinY(locationRect) + (CGRectGetHeight(locationRect) - locationTextHeight) / 2, CGRectGetWidth(locationRect), locationTextHeight) withAttributes: locationFontAttributes];
CGContextRestoreGState(context);
Try using this method of NSAttributedString:
- (CGRect)boundingRectWithSize:(CGSize)size
                       options:(NSStringDrawingOptions)options
                       context:(NSStringDrawingContext *)context;
The context will provide you with the actualScaleFactor.
The usage is something like this:
NSAttributedString *string = ...;
NSStringDrawingContext *context = [NSStringDrawingContext new];
context.minimumScaleFactor = 0.5; // Set your minimum value.
CGRect bounds = [string boundingRectWithSize:maxSize
                                     options:NSStringDrawingUsesLineFragmentOrigin
                                     context:context];
CGFloat scale = context.actualScaleFactor;
// Use this scale to multiply font sizes in the string, so it will fit.
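For instance, a sketch of feeding that scale back into the drawing code from the question (myFont, locationColor, locationStyle, locationRect, and textContent are the names used above; this assumes the whole string uses a single font):
UIFont *baseFont = [UIFont fontWithName:myFont size:19];
UIFont *fittedFont = [baseFont fontWithSize:baseFont.pointSize * scale];
NSDictionary *fittedAttributes = @{NSFontAttributeName: fittedFont,
                                   NSForegroundColorAttributeName: locationColor,
                                   NSParagraphStyleAttributeName: locationStyle};
[textContent drawInRect:locationRect withAttributes:fittedAttributes];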

Convert geographic coordinates to a CGPoint on a custom UIView

I'm building an application in which one of the features will be showing geographic coordinates on a custom UIImageView. I'm bad at math, so I can't seem to get the right values. This is what I'm doing:
I have an image that is 2048x2048, which I put in a UIScrollView. When I get coordinates, let's say Sydney (-33.856963, 151.215219), I turn them into UIView coordinates (x, y):
- (void)viewDidLoad
{
    [super viewDidLoad];
    scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    scrollView.delegate = self;
    scrollView.showsVerticalScrollIndicator = YES;
    scrollView.scrollEnabled = YES;
    scrollView.userInteractionEnabled = YES;
    [scrollView setBounces:NO];
    scrollView.minimumZoomScale = 0.5;
    scrollView.maximumZoomScale = 100.0;
    mainImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, IMAGE_SIZE, IMAGE_SIZE)];
    mainImageView.image = [UIImage imageNamed:@"map.jpg"];
    mainImageView.contentMode = UIViewContentModeScaleAspectFit;
    scrollView.contentSize = CGSizeMake(IMAGE_SIZE, IMAGE_SIZE);
    [scrollView addSubview:mainImageView];
    [self.view addSubview:scrollView];

    NSString* fileContents = @"-33.856963,151.215219";
    NSArray* pointStrings = [fileContents componentsSeparatedByCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
    for (int i = 0; i < [pointStrings count]; i++) {
        NSString* currentPointString = pointStrings[i];
        NSArray* latLonArr = [currentPointString componentsSeparatedByCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@","]];
        [self getCoordinates:latLonArr];
    }
}

- (void)getCoordinates:(NSArray *)latLonArr
{
    double comparatorWidth = IMAGE_SIZE / 360.0f;
    double comparatorHeight = IMAGE_SIZE / 180.0f;
    double Xl = [[latLonArr objectAtIndex:1] doubleValue]; // 151.215219
    double Yl = [[latLonArr objectAtIndex:0] doubleValue]; // -33.856963
    coordX = (Xl + 180.0f) * comparatorWidth;
    coordY = (90.0f - Yl) * comparatorHeight;
    UIImage *image = [UIImage imageNamed:@"star.png"];
    imageView = [[UIImageView alloc] init];
    [imageView setFrame:CGRectMake(coordX, coordY, 10, 10)];
    [imageView setImage:image];
    [scrollView addSubview:imageView];
}
The further I go from the center coordinates (0, 0), the less accurate the points become. If I have coordinates for a city in West Africa it will be right on the spot, but Sydney is a lot off. How can I fix this?
I think the problem is that the earth is not flat. That means you cannot simply convert geo coordinates to the two-dimensional coordinate system of the view. http://en.wikipedia.org/wiki/Geographic_coordinate_system
Check this question and the correct answer:
Converting longitude/latitude to X/Y coordinate
Look at the white horizontal lines on the image you posted. They're not evenly spaced out - they get wider towards the bottom of the image. This means that your map image is not made using an Equirectangular projection, and is probably a Mercator projection image.
The code you have posted, which converts lat/long to Y/X just by offset and scaling, would only work for equirectangular projection images.
For Mercator projections, the conversion is more complex. Please see Convert latitude/longitude point to pixels (x,y) on Mercator projection.
So you have two options:
A) Use equirectangular projection map images
B) Continue using Mercator map images, and fix your lat/long -> Y/X conversion algorithm
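For option B, a rough sketch of a Mercator version of getCoordinates: could look like this (it assumes the 2048x2048 image is a standard square Web Mercator map covering the full longitude range; IMAGE_SIZE, scrollView, and star.png are the names from the question):
- (void)getCoordinates:(NSArray *)latLonArr
{
    double lon = [[latLonArr objectAtIndex:1] doubleValue];
    double lat = [[latLonArr objectAtIndex:0] doubleValue];

    // x still scales linearly with longitude, exactly as before
    double coordX = (lon + 180.0) / 360.0 * IMAGE_SIZE;

    // y goes through the Mercator projection: ln(tan(pi/4 + lat/2)),
    // scaled so the full projection spans the image height
    double latRad = lat * M_PI / 180.0;
    double mercN = log(tan(M_PI / 4.0 + latRad / 2.0));
    double coordY = IMAGE_SIZE / 2.0 - IMAGE_SIZE * mercN / (2.0 * M_PI);

    UIImageView *marker = [[UIImageView alloc] initWithFrame:CGRectMake(coordX, coordY, 10, 10)];
    marker.image = [UIImage imageNamed:@"star.png"];
    [scrollView addSubview:marker];
}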

Scale Up NSImage and Save

I would like to scale up an image that's 64px to make it 512px (even if it's blurry or pixelated).
I'm using this to get the image from my NSImageView and save it:
NSData *customimageData = [[customIcon image] TIFFRepresentation];
NSBitmapImageRep *customimageRep = [NSBitmapImageRep imageRepWithData:customimageData];
customimageData = [customimageRep representationUsingType:NSPNGFileType properties:nil];
NSString* customBundlePath = [[NSBundle mainBundle] pathForResource:@"customIcon" ofType:@"png"];
[customimageData writeToFile:customBundlePath atomically:YES];
I've tried setSize:, but it still saves it at 64px.
Thanks in advance!
You can't use NSImage's size property, as it bears only an indirect relationship to the pixel dimensions of an image representation. A good way to resize pixel dimensions is to use the drawInRect: method of NSImageRep:
- (BOOL)drawInRect:(NSRect)rect
Draws the entire image in the specified rectangle, scaling it as needed to fit.
Here is an image resize method (it creates a new NSImage at the pixel size you want).
- (NSImage*)resizeImage:(NSImage*)sourceImage size:(NSSize)size
{
    NSRect targetFrame = NSMakeRect(0, 0, size.width, size.height);
    NSImage* targetImage = nil;
    NSImageRep *sourceImageRep = [sourceImage bestRepresentationForRect:targetFrame
                                                                context:nil
                                                                  hints:nil];
    targetImage = [[NSImage alloc] initWithSize:size];
    [targetImage lockFocus];
    [sourceImageRep drawInRect:targetFrame];
    [targetImage unlockFocus];
    return targetImage;
}
It's from a more detailed answer I gave here: NSImage doesn't scale
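Tying that back to the save code in the question, a usage sketch might look like this (customIcon and customBundlePath are the names from the question):
NSImage *scaledIcon = [self resizeImage:[customIcon image] size:NSMakeSize(512, 512)];
NSData *scaledData = [scaledIcon TIFFRepresentation];
NSBitmapImageRep *scaledRep = [NSBitmapImageRep imageRepWithData:scaledData];
NSData *scaledPNG = [scaledRep representationUsingType:NSPNGFileType properties:nil];
[scaledPNG writeToFile:customBundlePath atomically:YES];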
Another resize method that works is the NSImage method drawInRect:fromRect:operation:fraction:respectFlipped:hints:
- (void)drawInRect:(NSRect)dstSpacePortionRect
          fromRect:(NSRect)srcSpacePortionRect
         operation:(NSCompositingOperation)op
          fraction:(CGFloat)requestedAlpha
    respectFlipped:(BOOL)respectContextIsFlipped
             hints:(NSDictionary *)hints
The main advantage of this method is the hints NSDictionary, which gives you some control over interpolation. This can yield widely differing results when enlarging an image. The value for the NSImageHintInterpolation key is an NSImageInterpolation enum that can take one of five values:
enum {
    NSImageInterpolationDefault = 0,
    NSImageInterpolationNone = 1,
    NSImageInterpolationLow = 2,
    NSImageInterpolationMedium = 4,
    NSImageInterpolationHigh = 3
};
typedef NSUInteger NSImageInterpolation;
Using this method, there is no need for the intermediate step of extracting an image rep; NSImage will do the right thing:
- (NSImage*)resizeImage:(NSImage*)sourceImage size:(NSSize)size
{
    NSRect targetFrame = NSMakeRect(0, 0, size.width, size.height);
    NSImage* targetImage = [[NSImage alloc] initWithSize:size];
    [targetImage lockFocus];
    [sourceImage drawInRect:targetFrame
                   fromRect:NSZeroRect      // portion of source image to draw
                  operation:NSCompositeCopy // compositing operation
                   fraction:1.0             // alpha (transparency) value
             respectFlipped:YES             // coordinate system
                      hints:@{NSImageHintInterpolation:
                              [NSNumber numberWithInt:NSImageInterpolationLow]}];
    [targetImage unlockFocus];
    return targetImage;
}

Blurred Screenshot of a view being drawn by UIBezierPath

I'm drawing my graph view using UIBezierPath methods and Core Text. I use the addQuadCurveToPoint:controlPoint: method to draw curves on the graph. I also use CATiledLayer to render a graph with a large data set on the x axis. I draw my whole graph into an image context, and in the drawRect: method of my view I draw this image into the whole view. Following is my code.
- (void)drawImage
{
    UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
    // Draw curves
    [self drawDiagonal];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    [screenshot retain];
    UIGraphicsEndImageContext();
}

- (void)drawRect:(CGRect)rect
{
    NSLog(@"Draw in rect with bounds: %@", NSStringFromCGRect(rect));
    [screenshot drawInRect:self.frame];
}
However, in the screenshot the curves drawn between two points are not smooth. I've also set Render with Antialiasing to YES in my Info.plist. Please see the screenshot.
We'd have to see how you construct the UIBezierPath, but in my experience the key issue for smooth curves is whether the slope of the line between a curve's control point and the end point of that particular segment equals the slope between the next segment's start point and its control point. I find it easier to draw general smooth curves using addCurveToPoint rather than addQuadCurveToPoint, so that I can adjust the starting and ending control points to satisfy this criterion more generally.
To illustrate this point, the way I usually draw UIBezierPath curves is to keep an array of points on the curve, each with the angle the curve should take at that point and the "weight" of the addCurveToPoint control points (i.e. how far out the control points should be). I then use those parameters to dictate the second control point of one segment of the UIBezierPath and the first control point of the next segment. So, for example:
@interface BezierPoint : NSObject
@property CGPoint point;
@property CGFloat angle;
@property CGFloat weight;
@end

@implementation BezierPoint

- (id)initWithPoint:(CGPoint)point angle:(CGFloat)angle weight:(CGFloat)weight
{
    self = [super init];
    if (self)
    {
        self.point = point;
        self.angle = angle;
        self.weight = weight;
    }
    return self;
}

@end
And then, an example of how I use that:
- (void)loadBezierPointsArray
{
    // Clearly, you'd do whatever is appropriate for your chart.
    // This is just an unclosed loop, but it illustrates the idea.
    CGPoint startPoint = CGPointMake(self.view.frame.size.width / 2.0, 50);
    _bezierPoints = [NSMutableArray arrayWithObjects:
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x, startPoint.y)
                                                  angle:M_PI_2 * 0.05
                                                 weight:100.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x + 100.0, startPoint.y + 70.0)
                                                  angle:M_PI_2
                                                 weight:70.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x, startPoint.y + 140.0)
                                                  angle:M_PI
                                                 weight:100.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x - 100.0, startPoint.y + 70.0)
                                                  angle:M_PI_2 * 3.0
                                                 weight:70.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x + 10.0, startPoint.y + 10)
                                                  angle:0.0
                                                 weight:100.0 / 1.7],
                     nil];
}
- (CGPoint)calculateForwardControlPoint:(NSUInteger)index
{
    BezierPoint *bezierPoint = _bezierPoints[index];
    return CGPointMake(bezierPoint.point.x + cosf(bezierPoint.angle) * bezierPoint.weight,
                       bezierPoint.point.y + sinf(bezierPoint.angle) * bezierPoint.weight);
}

- (CGPoint)calculateReverseControlPoint:(NSUInteger)index
{
    BezierPoint *bezierPoint = _bezierPoints[index];
    return CGPointMake(bezierPoint.point.x - cosf(bezierPoint.angle) * bezierPoint.weight,
                       bezierPoint.point.y - sinf(bezierPoint.angle) * bezierPoint.weight);
}

- (UIBezierPath *)bezierPath
{
    UIBezierPath *path = [UIBezierPath bezierPath];
    BezierPoint *bezierPoint = _bezierPoints[0];
    [path moveToPoint:bezierPoint.point];
    for (NSInteger i = 1; i < [_bezierPoints count]; i++)
    {
        bezierPoint = _bezierPoints[i];
        [path addCurveToPoint:bezierPoint.point
                controlPoint1:[self calculateForwardControlPoint:i - 1]
                controlPoint2:[self calculateReverseControlPoint:i]];
    }
    return path;
}
When I render this into a UIImage (using the code below), I don't see any softening of the image, though admittedly the images are not identical. (I'm comparing the image rendered by capture against a screen snapshot taken manually by pressing the power and home buttons on my physical device at the same time.)
If you're seeing some softening, I would suggest renderInContext: (as shown below). I also wonder if you are writing the image as JPG, which is lossy; maybe try PNG if you used JPG.
- (void)drawBezier
{
    UIBezierPath *path = [self bezierPath];
    CAShapeLayer *oval = [[CAShapeLayer alloc] init];
    oval.path = path.CGPath;
    oval.strokeColor = [UIColor redColor].CGColor;
    oval.fillColor = [UIColor clearColor].CGColor;
    oval.lineWidth = 5.0;
    oval.strokeStart = 0.0;
    oval.strokeEnd = 1.0;
    [self.view.layer addSublayer:oval];
}

- (void)capture
{
    UIGraphicsBeginImageContextWithOptions(self.view.frame.size, NO, 0.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.view.layer renderInContext:context];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Save the image
    NSData *data = UIImagePNGRepresentation(screenshot);
    NSString *documentsPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    NSString *imagePath = [documentsPath stringByAppendingPathComponent:@"image.png"];
    [data writeToFile:imagePath atomically:YES];

    // Send it to myself so I can look at the file
    NSURL *url = [NSURL fileURLWithPath:imagePath];
    UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:@[url]
                                                                             applicationActivities:nil];
    [self presentViewController:controller animated:YES completion:nil];
}

Redraw NSBezierPath multiple times, with scaling?

I am currently writing many chunks of text on an NSView using the NSString helper method ... However, it is very slow to write a large amount of (in many cases repeated) text. I am trying to rewrite the code so that the text is converted to NSBezierPaths that are generated once, then drawn many times. The following will draw text at the bottom of the screen.
I am still trying to read through the Apple documentation to understand how this works, but in the meantime I wonder if there is an easy way to alter this code to redraw the path in multiple locations?
// Write a path to the view
NSBezierPath* path = [self bezierPathFromText: @"Hello world!" maxWidth: width];
[[NSColor grayColor] setFill];
[path fill];
Here is the method that writes some text into a path:
- (NSBezierPath*)bezierPathFromText:(NSString*)text maxWidth:(float)maxWidth
{
    // Create a container describing the shape of the text area;
    // for testing, don't use the whole width of the NSView.
    NSTextContainer* container = [[NSTextContainer alloc] initWithContainerSize:NSMakeSize(maxWidth - maxWidth/4, 60)];

    // Create a storage object to hold an attributed version of the string to display
    NSFont* font = [NSFont fontWithName:@"Helvetica" size:26];
    NSDictionary* attr = [NSDictionary dictionaryWithObjectsAndKeys:font, NSFontAttributeName, nil];
    NSTextStorage* storage = [[NSTextStorage alloc] initWithString:text attributes:attr];

    // Create a layout manager responsible for writing the text to the NSView
    NSLayoutManager* layoutManager = [[NSLayoutManager alloc] init];
    [layoutManager addTextContainer:container];
    [layoutManager setTextStorage:storage];

    NSRange glyphRange = [layoutManager glyphRangeForTextContainer:container];
    NSGlyph glyphArray[glyphRange.length];
    NSUInteger glyphCount = [layoutManager getGlyphs:glyphArray range:glyphRange];

    NSBezierPath* path = [[NSBezierPath alloc] init];
    //NSBezierPath *path = [NSBezierPath bezierPathWithRect:NSMakeRect(0, 0, 30, 30)];
    [path moveToPoint:NSMakePoint(0, 7)];
    [path appendBezierPathWithGlyphs:glyphArray count:glyphCount inFont:font];

    // Deallocate unused objects
    [layoutManager release];
    [storage release];
    [container release];
    return [path autorelease];
}
Edit: I am attempting to optimise an application that outputs to screen large amounts of text, such as a sequence of 10,000 numbers. Each number has markings around it and/or differing amounts of space between them, and some numbers have dots and/or lines above, below, or between them. It's like the example at the top of page two of this document, but with much, much more output.
You could start by removing the line:
[path moveToPoint: NSMakePoint(0, 7)];
so that your path isn't tied to a particular location in the view. With that done, you can call your method to get the path, move to a point, stroke the path, move to another point, stroke the path, and so on. If you want to move from the starting point within the path description, use -relativeMoveToPoint:.
It appears this might do the trick; however, I am not sure if it is the best way to do it:
// Create the path
NSBezierPath* path = [self bezierPathFromText: @"Fish are fun to watch in a fish tank, but not fun to eat, or something like that." maxWidth: width];

// Draw a copy of it at a transformed (moved) location
NSAffineTransform* transform = [[NSAffineTransform alloc] init];
[transform translateXBy: 10 yBy: 10];
NSBezierPath* path2 = [path copy];
[path2 transformUsingAffineTransform: transform];
[[NSColor greenColor] setFill];
[path2 fill];
[path2 release];
[transform release];

// Draw another copy of it at a transformed (moved) location
transform = [[NSAffineTransform alloc] init];
[transform translateXBy: 10 yBy: 40];
path2 = [path copy];
[path2 transformUsingAffineTransform: transform];
[[NSColor greenColor] setFill];
[path2 fill];
[path2 release];
[transform release];
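A possibly tidier variant of the same idea is -[NSAffineTransform transformBezierPath:], which returns a translated copy and leaves the original path untouched, so the copy/release bookkeeping goes away (sketch only; the offsets here are arbitrary):
NSBezierPath* path = [self bezierPathFromText: @"Hello world!" maxWidth: width];
[[NSColor greenColor] setFill];
NSPoint offsets[] = { {10, 10}, {10, 40}, {10, 70} }; // wherever each copy should be drawn
for (NSUInteger i = 0; i < sizeof(offsets) / sizeof(offsets[0]); i++) {
    NSAffineTransform* transform = [NSAffineTransform transform]; // autoreleased
    [transform translateXBy: offsets[i].x yBy: offsets[i].y];
    [[transform transformBezierPath: path] fill]; // fills a moved copy; path itself is unchanged
}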