Convert geographic coordinates to a CGPoint on a custom UIView - objective-c

I'm building an application in which one of the features will be showing geographic coordinates on a custom UIImageView. I'm bad at math, so I can't seem to get the right values. This is what I'm doing:
I have an image that is 2048x2048, which I put in a UIScrollView. When I get coordinates, let's say Sydney "-33.856963, 151.215219", I turn them into UIView coordinates (x, y):
- (void)viewDidLoad
{
    [super viewDidLoad];

    scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    scrollView.delegate = self;
    scrollView.showsVerticalScrollIndicator = YES;
    scrollView.scrollEnabled = YES;
    scrollView.userInteractionEnabled = YES;
    [scrollView setBounces:NO];
    scrollView.minimumZoomScale = 0.5;
    scrollView.maximumZoomScale = 100.0;
    scrollView.contentSize = CGSizeMake(IMAGE_SIZE, IMAGE_SIZE);

    mainImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, IMAGE_SIZE, IMAGE_SIZE)];
    mainImageView.image = [UIImage imageNamed:@"map.jpg"];
    mainImageView.contentMode = UIViewContentModeScaleAspectFit;

    [scrollView addSubview:mainImageView];
    [self.view addSubview:scrollView];

    // one "latitude,longitude" pair per whitespace-separated entry
    NSString *fileContents = @"-33.856963,151.215219";
    NSArray *pointStrings = [fileContents componentsSeparatedByCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
    for (int i = 0; i < [pointStrings count]; i++) {
        NSString *currentPointString = [pointStrings objectAtIndex:i];
        NSArray *latLonArr = [currentPointString componentsSeparatedByCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@","]];
        [self getCoordinates:latLonArr];
    }
}
- (void)getCoordinates:(NSArray *)latLonArr
{
    // linear (equirectangular) scale: degrees -> pixels
    double comparatorWidth = IMAGE_SIZE / 360.0f;
    double comparatorHeight = IMAGE_SIZE / 180.0f;

    double Xl = [[latLonArr objectAtIndex:1] doubleValue]; // longitude, e.g. 151.215219
    double Yl = [[latLonArr objectAtIndex:0] doubleValue]; // latitude, e.g. -33.856963

    coordX = (Xl + 180.0f) * comparatorWidth;
    coordY = (90.0f - Yl) * comparatorHeight;

    UIImage *image = [UIImage imageNamed:@"star.png"];
    imageView = [[UIImageView alloc] init];
    [imageView setFrame:CGRectMake(coordX, coordY, 10, 10)];
    [imageView setImage:image];
    [scrollView addSubview:imageView];
}
The further I go from the center coordinates (0, 0), the less accurate the points get. If I have coordinates for a city in West Africa, the marker is right on the spot, but Sydney is way off. How can I fix this?

I think the problem is that the Earth is not flat, which means you cannot simply map geographic coordinates onto the two-dimensional coordinate system of the view: http://en.wikipedia.org/wiki/Geographic_coordinate_system
Check this question and the correct answer:
Converting longitude/latitude to X/Y coordinate

Look at the white horizontal lines on the image you posted. They're not evenly spaced out - they get wider towards the bottom of the image. This means that your map image is not made using an Equirectangular projection, and is probably a Mercator projection image.
The code you posted, which converts lat/long to Y/X just by offsetting and scaling, would only work for equirectangular projection images.
For Mercator projections, the conversion is more complex. Please see Convert latitude/longitude point to pixels (x,y) on Mercator projection, and the sketch below.
So you have two options:
A) Use equirectangular projection map images
B) Continue using Mercator map images, and fix your lat/long -> Y/X conversion algorithm
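If you go with option B, a rough, untested sketch of a Mercator-aware replacement for getCoordinates: could look like the following. It assumes the map image is a full-world, square Web Mercator image of side IMAGE_SIZE (such maps only cover latitudes up to roughly ±85°); if your image is cropped differently, the constants need adjusting.
- (void)getCoordinates:(NSArray *)latLonArr
{
    double lon = [[latLonArr objectAtIndex:1] doubleValue];
    double lat = [[latLonArr objectAtIndex:0] doubleValue];

    // x is still linear in longitude
    double x = (lon + 180.0) / 360.0 * IMAGE_SIZE;

    // y uses the Mercator formula instead of a linear scale in latitude
    double latRad = lat * M_PI / 180.0;
    double mercN = log(tan(M_PI / 4.0 + latRad / 2.0));
    double y = (IMAGE_SIZE / 2.0) - (IMAGE_SIZE * mercN / (2.0 * M_PI));

    UIImageView *marker = [[UIImageView alloc] initWithFrame:CGRectMake(x, y, 10, 10)];
    marker.image = [UIImage imageNamed:@"star.png"];
    [scrollView addSubview:marker];
}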

Related

Blurred Screenshot of a view being drawn by UIBezierPath

I'm drawing my graph view using UIBezierPath methods and Core Text. I use the addQuadCurveToPoint:controlPoint: method to draw curves on the graph. I also use CATiledLayer to render the graph with a large data set on the x axis. I draw my whole graph into an image context, and in the drawRect: method of my view I draw this image into the whole view. Following is my code.
- (void)drawImage
{
    UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
    // Draw curves
    [self drawDiagonal];
    screenshot = UIGraphicsGetImageFromCurrentImageContext(); // screenshot is an ivar, used later in drawRect:
    [screenshot retain];
    UIGraphicsEndImageContext();
}
- (void)drawRect:(CGRect)rect
{
    NSLog(@"Draw in rect with bounds: %@", NSStringFromCGRect(rect));
    [screenshot drawInRect:self.frame];
}
However, in the screenshot the curves drawn between two points are not smooth. I've also set Render with Antialiasing to YES in my Info.plist. Please see the screenshot.
We'd have to see how you construct the UIBezierPath, but in my experience, for smooth curves, the key issue is whether the slope of the line between a curve's control point and the end point of that particular segment equals the slope between the next segment's start point and its control point. I find it easier to draw generally smooth curves using addCurveToPoint rather than addQuadCurveToPoint, so that I can adjust the starting and ending control points to satisfy this criterion.
To illustrate this point, the way I usually draw UIBezierPath curves is to have an array of points on the curve, the angle the curve should take at each point, and the "weight" of the addCurveToPoint control points (i.e. how far out the control points should be). I then use those parameters to dictate the second control point of each segment of the UIBezierPath and the first control point of the next segment. So, for example:
@interface BezierPoint : NSObject
@property CGPoint point;
@property CGFloat angle;
@property CGFloat weight;
@end

@implementation BezierPoint

- (id)initWithPoint:(CGPoint)point angle:(CGFloat)angle weight:(CGFloat)weight
{
    self = [super init];
    if (self)
    {
        self.point = point;
        self.angle = angle;
        self.weight = weight;
    }
    return self;
}

@end
And then, an example of how I use that:
- (void)loadBezierPointsArray
{
    // clearly, you'd do whatever is appropriate for your chart.
    // this is just an unclosed loop, but it illustrates the idea.
    CGPoint startPoint = CGPointMake(self.view.frame.size.width / 2.0, 50);
    _bezierPoints = [NSMutableArray arrayWithObjects:
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x, startPoint.y)
                                                  angle:M_PI_2 * 0.05
                                                 weight:100.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x + 100.0, startPoint.y + 70.0)
                                                  angle:M_PI_2
                                                 weight:70.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x, startPoint.y + 140.0)
                                                  angle:M_PI
                                                 weight:100.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x - 100.0, startPoint.y + 70.0)
                                                  angle:M_PI_2 * 3.0
                                                 weight:70.0 / 1.7],
                     [[BezierPoint alloc] initWithPoint:CGPointMake(startPoint.x + 10.0, startPoint.y + 10)
                                                  angle:0.0
                                                 weight:100.0 / 1.7],
                     nil];
}
- (CGPoint)calculateForwardControlPoint:(NSUInteger)index
{
    BezierPoint *bezierPoint = _bezierPoints[index];
    return CGPointMake(bezierPoint.point.x + cosf(bezierPoint.angle) * bezierPoint.weight,
                       bezierPoint.point.y + sinf(bezierPoint.angle) * bezierPoint.weight);
}
- (CGPoint)calculateReverseControlPoint:(NSUInteger)index
{
    BezierPoint *bezierPoint = _bezierPoints[index];
    return CGPointMake(bezierPoint.point.x - cosf(bezierPoint.angle) * bezierPoint.weight,
                       bezierPoint.point.y - sinf(bezierPoint.angle) * bezierPoint.weight);
}
- (UIBezierPath *)bezierPath
{
    UIBezierPath *path = [UIBezierPath bezierPath];
    BezierPoint *bezierPoint = _bezierPoints[0];
    [path moveToPoint:bezierPoint.point];
    for (NSInteger i = 1; i < [_bezierPoints count]; i++)
    {
        bezierPoint = _bezierPoints[i];
        [path addCurveToPoint:bezierPoint.point
                controlPoint1:[self calculateForwardControlPoint:i - 1]
                controlPoint2:[self calculateReverseControlPoint:i]];
    }
    return path;
}
When I render this into a UIImage (using the code below), I don't see any softening of the image, but admittedly the images are not identical. (I'm comparing the image rendered by capture against that which I capture manually with a screen snapshot by pressing power and home buttons on my physical device at the same time.)
If you're seeing some softening, I would suggest renderInContext: (as shown below). I wonder if you're writing the image as a JPG (which is lossy). Maybe try PNG, if you've been using JPG.
- (void)drawBezier
{
    UIBezierPath *path = [self bezierPath];
    CAShapeLayer *oval = [[CAShapeLayer alloc] init];
    oval.path = path.CGPath;
    oval.strokeColor = [UIColor redColor].CGColor;
    oval.fillColor = [UIColor clearColor].CGColor;
    oval.lineWidth = 5.0;
    oval.strokeStart = 0.0;
    oval.strokeEnd = 1.0;
    [self.view.layer addSublayer:oval];
}
- (void)capture
{
    UIGraphicsBeginImageContextWithOptions(self.view.frame.size, NO, 0.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.view.layer renderInContext:context];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // save the image
    NSData *data = UIImagePNGRepresentation(screenshot);
    NSString *documentsPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    NSString *imagePath = [documentsPath stringByAppendingPathComponent:@"image.png"];
    [data writeToFile:imagePath atomically:YES];

    // send it to myself so I can look at the file
    NSURL *url = [NSURL fileURLWithPath:imagePath];
    UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:@[url]
                                                                             applicationActivities:nil];
    [self presentViewController:controller animated:YES completion:nil];
}

How to set position on MKMap

I have the 4 edge coordinates of a rect, like
north="55.9504356" south="55.5485293" east="38.1289043" west="37.1044291"
How can I set the map position to this rectangle?
You should convert your coordinates. Try to obtain the center of the rect somehow. Here's a sample of my code:
CLLocationCoordinate2D coordinate = {[neighborhood.Latitude doubleValue], [neighborhood.Longitude doubleValue]};
MKCoordinateRegion region = MKCoordinateRegionMakeWithDistance(coordinate, 15000, 15000);
[self.mapView setRegion:region animated:NO];
As you can see, you specify the center of the rect and the distance from the center in meters.
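If you only have the four edges of the LatLonBox, one way to obtain that center (and a matching span) directly is sketched below; this is a rough version that assumes the box does not cross the antimeridian.
// values from the LatLonBox in the question
double north = 55.9504356, south = 55.5485293;
double east  = 38.1289043, west  = 37.1044291;

CLLocationCoordinate2D center = CLLocationCoordinate2DMake((north + south) / 2.0, (east + west) / 2.0);
MKCoordinateSpan span = MKCoordinateSpanMake(north - south, east - west);
[self.mapView setRegion:MKCoordinateRegionMake(center, span) animated:NO];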
Google geocoding gives me the viewport <LatLonBox north="55.9504356" south="55.5485293" east="38.1289043" west="37.1044291"/> and the center of the region, Russia, Moscow.
// CLLocationCoordinate2DMake takes latitude first, then longitude
CLLocationCoordinate2D centerOfRegion = CLLocationCoordinate2DMake([data.coordinatesLatitude doubleValue], [data.coordinatesLongitude doubleValue]); // 55.7500000, 37.6166667
// N+E and S+W corners, to get the distance between them
double lat1 = [[data.squareArray objectAtIndex:0] doubleValue];
double lng1 = [[data.squareArray objectAtIndex:2] doubleValue];
double lat2 = [[data.squareArray objectAtIndex:1] doubleValue];
double lng2 = [[data.squareArray objectAtIndex:3] doubleValue];
CLLocation *firstLocation = [[[CLLocation alloc] initWithLatitude:lat1 longitude:lng1] autorelease];
CLLocation *secondLocation = [[[CLLocation alloc] initWithLatitude:lat2 longitude:lng2] autorelease];
CLLocationDistance distance = [secondLocation distanceFromLocation:firstLocation];
// divide by 2 to make the viewport a little smaller
[map setRegion:MKCoordinateRegionMakeWithDistance(centerOfRegion, distance / 2, distance / 2)];
So I got a centered region with a reasonably sized viewport.
That's my solution; I hope it can help somebody else.

Trying to display images using UIScrollView

I want to create a simple app that shows one centered image on the first start screen and then changes images upon swipe gestures (right, left).
I'm very new to this, so here is what I thought is what I'm looking for: http://idevzilla.com/2010/09/16/uiscrollview-a-really-simple-tutorial/ .
This is the code I have in my controller implementation:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSMutableDictionary *myDictionary = [[NSMutableDictionary alloc] init];
    [myDictionary setObject:@"img1.jpg" forKey:@"http://www.testweb.com"];
    [myDictionary setObject:@"img2.jpg" forKey:@"http://www.test2.com"];
    UIScrollView *scroll = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    scroll.pagingEnabled = YES;
    NSInteger numberOfViews = [myDictionary count];
    for (NSString *key in myDictionary) {
        UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:[myDictionary objectForKey:key]]];
        CGRect rect = CGRectMake(10.0f, 90.0f, image.size.width, image.size.height);
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:rect];
        [imageView setImage:image];
        CGPoint superCenter = CGPointMake([self.view bounds].size.width / 2.0, [self.view bounds].size.height / 2.0);
        [imageView setCenter:superCenter];
        self.view.backgroundColor = [UIColor whiteColor];
        [scroll addSubview:imageView];
    }
    scroll.contentSize = CGSizeMake(self.view.frame.size.width * numberOfViews, self.view.frame.size.height);
    [self.view addSubview:scroll];
}
My first issue here is that I get img2 on my initial screen instead of img1. The second issue is that when I swipe right I get a white screen with no image on it. Any suggestions on what I missed or what I can try/read, etc.?
EDIT:
I'm looking to do this the "lightest" possible way, using no fancy gallery APIs, etc. Just display a couple of really small images (i.e. 200x200 px) centered on the screen that I can swipe back and forth (shouldn't be too hard). Well, everything is hard when learning a new language.
There is a project on GitHub, MWPhotoBrowser, that allows you to put in a set of images which are viewed one at a time. You can then scroll from one image to the next with a swipe gesture. It is also easy to add functionality, and it should give you a good understanding of how this is done. There is also Apple's PhotoScroller example, which gives you straightforward code showing how to do the same thing and also tile your images. Since you are new to iOS, you may want to first look at both of these examples, or possibly just use them as your photo viewer.
Your problem is likely to be the fact that you are setting CGRect rect = CGRectMake(10.0f, 90.0f, image.size.width, image.size.height); for both of your image views. This means that both of your UIImageView objects are placed in exactly the same position (both are positioned at 10.0f on the x-coordinate of the scroll view). As the second image view is added last, it covers the first, which means there is white space to the right of it.
In order to fix this you could try something like...
- (void)viewDidLoad
{
    // your setup code
    float xPosition = 10.0f;
    for (NSString *key in myDictionary) {
        UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:[myDictionary objectForKey:key]]];
        CGRect rect = CGRectMake(xPosition, 90.0f, image.size.width, image.size.height);
        xPosition += image.size.width;
        // rest of your code...
    }
    // rest of your code
}
The above would mean that the second view is positioned on the x-coordinate at 10 + the width of the first image. Note that I haven't tested my answer.
First off, the images are placed on top of each other.
CGPoint superCenter = CGPointMake([self.view bounds].size.width / 2.0, [self.view bounds].size.height / 2.0);
[imageView setCenter:superCenter];
Right here you are setting both images to be placed at the center of the screen. The second thing is that you're using an NSDictionary and looping through its keys; an NSDictionary is not ordered like an array is, so you would have to sort the keys to use it in a specific order. So Barjavel had it right, but skipped over the fact that you're setting the images to the center. Try this:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSArray *myArray = [[NSArray alloc] initWithObjects:@"img1.jpg", @"img2.jpg", nil];
    UIScrollView *scroll = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    scroll.pagingEnabled = YES;
    NSInteger numberOfViews = [myArray count];
    int xPosition = 0;
    for (NSString *imageName in myArray) {
        UIImage *image = [UIImage imageNamed:imageName];
        CGRect rect = CGRectMake(xPosition, 90.0f, image.size.width, image.size.height);
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:rect];
        [imageView setImage:image];
        self.view.backgroundColor = [UIColor whiteColor];
        [scroll addSubview:imageView];
        xPosition += image.size.width;
    }
    scroll.contentSize = CGSizeMake(self.view.frame.size.width * numberOfViews, self.view.frame.size.height);
    [self.view addSubview:scroll];
}

clipping an image objective-c

I created a grid of images.
The frames of those images are square-shaped (CGRects).
Unfortunately, the images fill the squares disproportionately.
I would like to "crop" or "mask" the given image, which means my frames would show only part of the image, but proportionally correct. I tried the contentModes of UIImageView, but no luck.
for (int index = 0; index < (_cols * _rows); index++)
{
    NSValue *value = [myArray objectAtIndex:index];
    CGRect myrect = [value CGRectValue];
    UIImageView *myImageView = [[UIImageView alloc] initWithImage:myImage];
    myImageView.frame = myrect;
    [self addSubview:myImageView];
    [myImageView release];
}
Are you sure contentMode won't do what you want? If I understand you correctly, UIViewContentModeScaleAspectFill will do exactly what you describe.
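As a rough sketch of what that could look like in the loop from the question; clipsToBounds is the piece that is easy to miss, since without it the parts of the image that overflow the square frame are still drawn:
UIImageView *myImageView = [[UIImageView alloc] initWithImage:myImage];
myImageView.frame = myrect;
myImageView.contentMode = UIViewContentModeScaleAspectFill; // keep proportions, fill the square
myImageView.clipsToBounds = YES;                            // crop whatever falls outside the frame
[self addSubview:myImageView];
[myImageView release];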

Obtaining the maximum height of a font

So I have an NSFont, and I want to get the maximum dimensions for any characters, i.e. the pitch and letter height. [font maximumAdvancement] seems to return an NSSize of {pitch, 0}, so that's not helping. The bounding rect doesn't seem to work either, and the suggestion from jwz's similar question of creating a bezier path, appending a glyph, and getting the bounding rectangle is also giving me back {0, 0}. What gives here?
UPDATE: The code I'm using to get the bezier size is this:
NSBezierPath *bezier = [NSBezierPath bezierPath];
NSGlyph g;
{
    NSTextStorage *ts = [[NSTextStorage alloc] initWithString:@" "];
    [ts setFont:font];
    NSLayoutManager *lm = [[NSLayoutManager alloc] init];
    NSTextContainer *tc = [[NSTextContainer alloc] init];
    [lm addTextContainer:tc];
    [tc release]; // lm retains tc
    [ts addLayoutManager:lm];
    [lm release]; // ts retains lm
    g = [lm glyphAtIndex:0];
    [ts release];
}
NSPoint pt = {0.0f};
[bezier moveToPoint:pt];
[bezier appendBezierPathWithGlyph:g inFont:font];
NSRect bounds = [bezier bounds];
The glyph for the space character doesn't have any subpaths, so of course its bounds have size NSZeroSize. Try -[NSFont boundingRectForFont] instead.
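For example, a small sketch (using an arbitrary font name and size) of the metrics you can read straight off NSFont without going through a bezier path:
NSFont *font = [NSFont fontWithName:@"Helvetica" size:14.0];
NSRect fontBounds = [font boundingRectForFont];   // union of the bounding rects of all glyphs
CGFloat maxGlyphHeight = NSHeight(fontBounds);
CGFloat pitch = [font maximumAdvancement].width;  // widest advance, i.e. the "pitch"
// if what you really want is the line height, the typographic metrics may be more useful
// (descender is negative, hence the subtraction):
CGFloat lineHeight = [font ascender] - [font descender] + [font leading];
NSLog(@"max glyph height %f, pitch %f, line height %f", maxGlyphHeight, pitch, lineHeight);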