I am marking places on an iOS map using pins and lines. Now, how can I create a view at each marked place?
In the picture above I have marked the places using pins and lines. Now I want to display a view at the marked locations. How can I do this? Please, can somebody help me solve this issue.
You already have annotations on screen. Each annotation is placed at a latitude and longitude value:
NSArray *data = [root objectForKey:@"data"];
for (NSArray *row in data) {
    NSNumber *latitude = [[row objectAtIndex:22] objectAtIndex:1];
    NSNumber *longitude = [[row objectAtIndex:22] objectAtIndex:2];
    NSString *crimeDescription = [row objectAtIndex:18];
    NSString *address = [row objectAtIndex:14];

    CLLocationCoordinate2D coordinate;
    coordinate.latitude = latitude.doubleValue;
    coordinate.longitude = longitude.doubleValue;

    MyLocation *annotation = [[MyLocation alloc] initWithName:crimeDescription address:address coordinate:coordinate];
    [_mapView addAnnotation:annotation];
}
These may help you:
1. http://www.raywenderlich.com/21365/introduction-to-mapkit-in-ios-6-tutorial
2. http://spitzkoff.com/craig/?p=1082
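If by "create a view" you mean showing a view when one of those pins is tapped, the usual approach is to implement the map view delegate's viewForAnnotation: method and enable a callout. A minimal sketch, assuming _mapView.delegate is set to this controller and MyLocation conforms to MKAnnotation:
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation {
    // Keep the default blue dot for the user's location.
    if ([annotation isKindOfClass:[MKUserLocation class]]) {
        return nil;
    }
    static NSString *reuseId = @"MyLocationPin";
    MKPinAnnotationView *pinView = (MKPinAnnotationView *)[mapView dequeueReusableAnnotationViewWithIdentifier:reuseId];
    if (!pinView) {
        pinView = [[MKPinAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:reuseId];
        pinView.canShowCallout = YES; // shows the name/address supplied to MyLocation
        pinView.rightCalloutAccessoryView = [UIButton buttonWithType:UIButtonTypeDetailDisclosure];
    } else {
        pinView.annotation = annotation;
    }
    return pinView;
}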
Related
I've been following this tutorial that helps me draw route directions (a polyline) between two tapped places on the map. However, I don't need to get tapped locations, since by the time I want to show the user the route, I'll already have the latitude and longitude of the origin and destination.
Can I create GMSPath with an array of latitude and longitude I already have?
If so, how..?
- (void)addDirections:(NSDictionary *)json {
    NSDictionary *routes = [json objectForKey:@"routes"][0];
    NSDictionary *route = [routes objectForKey:@"overview_polyline"];
    NSString *overview_route = [route objectForKey:@"points"];
    GMSPath *path = [GMSPath pathFromEncodedPath:overview_route];
    GMSPolyline *polyline = [GMSPolyline polylineWithPath:path];
    polyline.map = self.mainGoogleMap;
}
Yes, you can do it with the following method.
First you need to create a global GMSMutablePath object.
Then add your latitude/longitude pairs to that path as shown below, assign the path to a polyline object, and your polyline is drawn.
In viewDidLoad
pathDynamic = [[GMSMutablePath alloc] init];
When you want to add a location to the mutable path:
CLLocation *loc = [[CLLocation alloc] initWithLatitude:[[dictData valueForKey:@"fl_route_latitude"] doubleValue]
                                             longitude:[[dictData valueForKey:@"fl_route_longitude"] doubleValue]];
[pathDynamic addCoordinate:loc.coordinate]; // Add your location to the mutable path

GMSPolyline *polyline = [GMSPolyline polylineWithPath:pathDynamic];
// Add the polyline to the map.
polyline.strokeColor = AppOrangeColor;
polyline.strokeWidth = 3.0f;
polyline.map = [self getMapView];
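To answer the question more directly: if you already have an array of latitude/longitude values, you can build the whole path up front instead of adding one point at a time. A minimal sketch, where waypoints and its "lat"/"lng" keys are hypothetical placeholders for however you store your coordinates:
// Build a GMSMutablePath from coordinates you already have.
GMSMutablePath *path = [GMSMutablePath path];
for (NSDictionary *point in waypoints) {
    [path addLatitude:[point[@"lat"] doubleValue]
            longitude:[point[@"lng"] doubleValue]];
}
GMSPolyline *routeLine = [GMSPolyline polylineWithPath:path];
routeLine.strokeWidth = 3.0f;
routeLine.map = self.mainGoogleMap; // the GMSMapView from the question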
I am working with MKMapViews, annotations, overlays, etc., but I'm having a pain-in-the-butt issue with MKMapPointForCoordinate() returning what looks like an invalid coordinate.
Code:
MKMapPoint *pointArr;

for (Category *route in validRoutes) {
    NSString *routeID = [route routeid];
    NSArray *pointData = [routes objectForKey:routeID];
    pointArr = malloc(sizeof(MKMapPoint) * [pointData count]);
    int i = 0;
    for (NSDictionary *routeData in pointData) {
        NSString *latitude = [routeData objectForKey:@"latitude"];
        NSString *longitude = [routeData objectForKey:@"longitude"];
        NSLog(@"L: %@ L: %@", latitude, longitude);
        CLLocationCoordinate2D coord = CLLocationCoordinate2DMake([[f numberFromString:latitude] doubleValue],
                                                                  [[f numberFromString:longitude] doubleValue]);
        NSLog(@"Coord: %f %f", coord.latitude, coord.longitude);
        MKMapPoint point = MKMapPointForCoordinate(coord);
        NSLog(@"Point: %f %f", point.x, point.y);
        pointArr[i] = point;
        i++;
    }
    MKPolyline *polyline = [MKPolyline polylineWithPoints:pointArr count:i];
    polyline.title = [route name];
    [routeOverlays setObject:polyline forKey:[route routeid]];
    [map addOverlay:polyline];
    free(pointArr);
}
Output Example:
L: 41.380840 L: -83.641319
Coord: 41.380840 -83.641319
Point: 71850240.204982 100266073.824832
I don't understand why the conversion to an MKMapPoint is destroying the values of my CLLocationCoordinate2D. The overlay doesn't show up on the map because the values are invalid...
EDIT: I got the point working by using MKMapPointMake instead, BUT my overlay still isn't showing. This is the mapView:viewForOverlay: code:
- (MKOverlayView *)mapView:(MKMapView *)mapView viewForOverlay:(id<MKOverlay>)overlay {
    MKOverlayView *overlayView = nil;
    // Checks if the overlay is of type MKPolyline
    if ([overlay isKindOfClass:[MKPolyline class]]) {
        MKPolylineView *routeLineView = [[MKPolylineView alloc] initWithPolyline:overlay];
        routeLineView.strokeColor = [UIColor orangeColor];
        routeLineView.lineWidth = 10;
        return overlayView;
    }
    return nil;
}
The method gets called (used a breakpoint to confirm), and I have annotations working (so the delegate has to be set up correctly, I assume).
Double edit: :facepalm: I was returning nil every time in the delegate code. That's what I get for copying and pasting the previous version's code ;P
An MKMapPoint is not a latitude/longitude in degrees (like CLLocationCoordinate2D).
They are not interchangeable and so you should not expect the MKMapPoint x,y values to show any obvious relation to the corresponding latitude and longitude.
An MKMapPoint is the conversion of latitude and longitude onto a flat projection using x,y values which are not in the same scale or range as latitude and longitude. Please see the Map Coordinate Systems section in the Location Awareness Programming Guide for a more detailed explanation.
By the way, if you have CLLocationCoordinate2D values, it's much easier to create a polyline using polylineWithCoordinates:count: instead of polylineWithPoints:count:. That way, you don't need to bother with any conversion.
See iOS SDK: MapKit MKPolyLine not showing for some more details and an example.
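For example, here is a minimal sketch of the question's inner loop rewritten with polylineWithCoordinates:count:, assuming the same pointData array and f number formatter from the question:
CLLocationCoordinate2D *coords = malloc(sizeof(CLLocationCoordinate2D) * [pointData count]);
NSUInteger i = 0;
for (NSDictionary *routeData in pointData) {
    coords[i] = CLLocationCoordinate2DMake([[f numberFromString:[routeData objectForKey:@"latitude"]] doubleValue],
                                           [[f numberFromString:[routeData objectForKey:@"longitude"]] doubleValue]);
    i++;
}
MKPolyline *polyline = [MKPolyline polylineWithCoordinates:coords count:i];
free(coords); // MKPolyline copies the coordinates, so the buffer can be freed
[map addOverlay:polyline];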
I have many MKMapViews, and each of them has an annotation. I am trying to retrieve the coordinates of each in this way:
for (MKMapView *map in MapViewArray)
{
    // add textfield contents to array
    NSString *latitude = [NSString stringWithFormat:@"%@", map.annotations];
    [latitudes addObject:latitude];
}
I am looking for the right code to use instead of this:
map.annotations
I want to extract the latitude here.
How can I do this?
Each map annotation is an object, so you'll have to get the coordinate value from the annotation instead. The annotations are stored in map.annotations, which is an array. If you only have one annotation per map, you can use this:
CLLocationCoordinate2D coordinate = [[map.annotations lastObject] coordinate];
NSString *latitude = [NSString stringWithFormat:@"%.2f", coordinate.latitude];
If you have multiple annotations, you'll obviously have to iterate through each one individually, then get the location data.
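For example, a minimal sketch of that loop, assuming the MapViewArray and latitudes mutable array from the question:
for (MKMapView *map in MapViewArray) {
    for (id<MKAnnotation> annotation in map.annotations) {
        CLLocationCoordinate2D coordinate = [annotation coordinate];
        [latitudes addObject:[NSString stringWithFormat:@"%f", coordinate.latitude]];
    }
}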
My problem actually seems rather silly... I am writing an iPhone application that uses MKMapKit. The app grabs the EXIF metadata from a provided geotagged photo. The problem is that the latitude and longitude coordinates that I retrieve, for example:
Lat: 34.25733333333334
Lon: 118.5373333333333
return a location in China. Is there a regional setting that I am missing, or do I need to convert the lat/long coordinates before using them?
Thank you all in advance for any help you can provide.
Here is the code I am using to grab the GPS data. You'll notice I am logging everything to the console so I can see what the values are:
void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
{
    NSDictionary *metadata = asset.defaultRepresentation.metadata;
    NSLog(@"Image Meta Data: %@", metadata);
    NSDictionary *gpsdata = [metadata objectForKey:@"{GPS}"];
    self.lat = [gpsdata valueForKey:@"Latitude"];
    self.lng = [gpsdata valueForKey:@"Longitude"];
    NSLog(@"\nLatitude: %@\nLongitude: %@", self.lat, self.lng);
};

NSURL *assetURL = [mediaInfo objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL
         resultBlock:ALAssetsLibraryAssetForURLResultBlock
        failureBlock:^(NSError *error) {
        }];

UIImage *img = [mediaInfo objectForKey:@"UIImagePickerControllerEditedImage"];
previewImage.image = nil;
self.previewImage.image = img;

NSData *imageData = UIImagePNGRepresentation(img);
if ([imageData length] > 0) {
    self._havePictureData = YES;
}
I think you should grab the value using the following:
CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
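For example, a minimal sketch, assuming it runs inside the asset result block from the question:
CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
if ([location isKindOfClass:[CLLocation class]]) {
    NSLog(@"Latitude: %f, Longitude: %f",
          location.coordinate.latitude, location.coordinate.longitude);
}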
Are you sure you're not missing a minus sign on that 118? 34.257, -118.5373 is nicely inside Los Angeles, California.
While you can get the location from the asset per @Allen, it is also valid to get it from the GPS metadata as you were trying to do initially. I'm not 100% sure the asset library coordinate will be the same as the coordinate in the GPS metadata; it depends on how Apple stores it. For example, the asset library timestamp is different from the EXIF creation date (a different topic, admittedly).
In any case, the reason your coordinate is wrong is that you also need to read the direction (reference) info, as follows:
NSDictionary *metadata = asset.defaultRepresentation.metadata;
NSLog(@"Image Meta Data: %@", metadata);
NSDictionary *gpsdata = [metadata objectForKey:@"{GPS}"];
self.lat = [gpsdata valueForKey:@"Latitude"];
self.lng = [gpsdata valueForKey:@"Longitude"];

// lat is negative if direction is south
// (self.lat and self.lng are assumed to be NSNumber properties here)
if ([[gpsdata valueForKey:@"LatitudeRef"] isEqualToString:@"S"]) {
    self.lat = @(-[self.lat doubleValue]);
}
// lng is negative if direction is west
if ([[gpsdata valueForKey:@"LongitudeRef"] isEqualToString:@"W"]) {
    self.lng = @(-[self.lng doubleValue]);
}
NSLog(@"\nLatitude: %@\nLongitude: %@", self.lat, self.lng);
This will also work:
void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
{
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSDictionary *metadata = rep.metadata;
    NSMutableDictionary *GPSDictionary =
        [[[metadata objectForKey:(NSString *)kCGImagePropertyGPSDictionary] mutableCopy] autorelease];
};
I believe the reason there isn't a negative sign is the metadata: exif:GPSLongitudeRef: W, which (I believe) means there should be a negative sign in front of the longitude, since it references the western hemisphere. The same applies to the latitude, with exif:GPSLatitudeRef: N/S for the northern and southern hemispheres. Hope this helped. Just realized this is exactly what @XJones said. Metadata obtained using ImageMagick.
I'm familiar with the Google Maps API and I'm trying to learn the iOS MapKit library now. I have an iPhone application which takes an input string and geocodes it using the Google Maps geocoder service. I also want to set an appropriate zoom level for the new map, but I'm not quite sure how to do it.
After reading, Determining zoom level from single LatLong in Google Maps API
My plan was to parse the JSON response from the Google Maps API and extract the ExtendedData field.
"ExtendedData":{
"LatLonBox":{
"north":34.13919,
"south":34.067018,
"east":-118.38971,
"west":-118.442796
}
Then using that I would like to set the bounds of my map accordingly using the MapKit setRegion function.
I started laying out a function to do this, but I'm a little lost on the logic...
- (void)setMapZoomForLocation:(CLLocationCoordinate2D)location
                        north:(double)north
                        south:(double)south
                         east:(double)east
                         west:(double)west {
    // some fancy math here....
    // set map region
    MKCoordinateRegion region;
    region.center = location;
    MKCoordinateSpan span;
    // set the span so that the map bounds are correct
    span.latitudeDelta = ???;
    span.longitudeDelta = ???;
    region.span = span;
    [self.mapView setRegion:region animated:YES];
}
Alternatively, I guess I could just use the Accuracy result from a geocode result to set a sort of default zoom level. I'm just not sure how to compute the equivalent default zoom levels for the various results.
See https://code.google.com/apis/maps/documentation/geocoding/v2/#GeocodingAccuracy
------------------------ Update: Solution I Used ------------------------------
// parse json result
NSDictionary *results = [jsonString JSONValue];
NSArray *placemarks = (NSArray *)[results objectForKey:@"Placemark"];
NSDictionary *firstPlacemark = [placemarks objectAtIndex:0];

// parse out location
NSDictionary *point = (NSDictionary *)[firstPlacemark objectForKey:@"Point"];
NSArray *coordinates = (NSArray *)[point objectForKey:@"coordinates"];
CLLocationCoordinate2D location;
location.latitude = [[coordinates objectAtIndex:1] doubleValue];
location.longitude = [[coordinates objectAtIndex:0] doubleValue];

// DEBUG
//NSLog(@"Parsed Location: (%g,%g)", location.latitude, location.longitude);

// parse out suggested bounding box
NSDictionary *extendedData = (NSDictionary *)[firstPlacemark objectForKey:@"ExtendedData"];
NSDictionary *latLngBox = (NSDictionary *)[extendedData objectForKey:@"LatLonBox"];
double north = [[latLngBox objectForKey:@"north"] doubleValue];
double south = [[latLngBox objectForKey:@"south"] doubleValue];
double east = [[latLngBox objectForKey:@"east"] doubleValue];
double west = [[latLngBox objectForKey:@"west"] doubleValue];

// DEBUG
//NSLog(@"Parsed Bounding Box: ne = (%g,%g), sw = (%g,%g)", north, east, south, west);

MKCoordinateSpan span = MKCoordinateSpanMake(fabs(north - south), fabs(east - west));
MKCoordinateRegion region = MKCoordinateRegionMake(location, span);
[self.mapView setRegion:region animated:YES];
If all you want to do is generate a MKCoordinateRegion, you shouldn't need to know anything at all about zoom level. Just create a coordinate region using the width and height of the LatLonBox.
MKCoordinateSpan span = MKCoordinateSpanMake(fabs(north - south), fabs(east - west));
MKCoordinateRegion region = MKCoordinateRegionMake(location, span);
[self.mapView setRegion:region animated:YES];