Google Maps Integration in an OS X Application - objective-c

I would like to use Google Maps in my Mac application.
I found the Google Maps SDK for iOS, but there is no equivalent for OS X.
I want to show two annotations and a line connecting them on a Google map. The coordinates of both annotations are dynamic, based on the user's selection.
Below is the one approach I have found that could work:
1) Call an API, passing the location coordinates of both annotations.
2) On the server side, an HTML page is generated using JavaScript that shows the two annotations and the line connecting them.
3) The API response contains the URL of that HTML page.
4) I show this page in a web view.
I want to know whether there is any other way to achieve this.
I want to distribute the application outside the Mac App Store, and to do that I need to sign the app with a Developer ID, which does not support the Maps entitlement.
I didn't find anything related to this, which is why I created this thread.
Thanks in advance.

I recently ported the Mapbox iOS SDK over to OS X. It has a lot of the features of MapKit, but it’s open source and should also work in a developer-signed application such as yours. To use the Mapbox OS X SDK, download the latest release from the GitHub repository (look for releases beginning with “osx-”) and follow the instructions in README.md. An API reference is included.
I want to show two annotations and a line connecting them on a Google map. The coordinates of both annotations are dynamic, based on the user's selection.
To display the annotations on-screen, you’ll need the MGLPointAnnotation and MGLPolyline classes. You can move the point annotations dynamically by setting their coordinate properties. The polyline, however, is immutable; to change its path, remove the existing polyline and add a new one with the new coordinates.
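A minimal sketch of that flow, assuming self.mapView is an already-configured MGLMapView; the coordinates are placeholders for the user's selection:

    // Two movable pins and a connecting line (Mapbox OS X SDK).
    CLLocationCoordinate2D start = CLLocationCoordinate2DMake(37.7749, -122.4194);
    CLLocationCoordinate2D end   = CLLocationCoordinate2DMake(34.0522, -118.2437);

    MGLPointAnnotation *startPin = [[MGLPointAnnotation alloc] init];
    startPin.coordinate = start;
    MGLPointAnnotation *endPin = [[MGLPointAnnotation alloc] init];
    endPin.coordinate = end;
    [self.mapView addAnnotations:@[startPin, endPin]];

    CLLocationCoordinate2D lineCoordinates[] = { start, end };
    MGLPolyline *line = [MGLPolyline polylineWithCoordinates:lineCoordinates count:2];
    [self.mapView addAnnotation:line];

    // When the user picks new coordinates (newStart/newEnd are placeholders),
    // move the pins in place, but remove and re-create the immutable polyline.
    CLLocationCoordinate2D newStart = CLLocationCoordinate2DMake(40.7128, -74.0060);
    CLLocationCoordinate2D newEnd   = CLLocationCoordinate2DMake(41.8781, -87.6298);
    startPin.coordinate = newStart;
    endPin.coordinate = newEnd;
    [self.mapView removeAnnotation:line];
    CLLocationCoordinate2D newLine[] = { newStart, newEnd };
    line = [MGLPolyline polylineWithCoordinates:newLine count:2];
    [self.mapView addAnnotation:line];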

You will have to do it with WebKit and the Google Maps JavaScript API.
MapKit has been available since OS X 10.9 Mavericks: https://developer.apple.com/library/mac/documentation/MapKit/Reference/MapKit_Framework_Reference/index.html
There are of course many ways of hiding the fact that you're using WebKit, but if they violate Apple's or Google's TOS, then submission to the App Store won't be possible.
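A minimal sketch of the WebKit route, which also avoids the server round-trip by generating the Google Maps page locally. It assumes self.webView is a WKWebView (OS X 10.10+), and YOUR_API_KEY is a placeholder:

    #import <WebKit/WebKit.h>
    #import <CoreLocation/CoreLocation.h>

    - (void)showLineFrom:(CLLocationCoordinate2D)a to:(CLLocationCoordinate2D)b {
        // Build an HTML page with two markers and a connecting polyline.
        NSString *html = [NSString stringWithFormat:
            @"<!DOCTYPE html><html><body style='margin:0'>"
            @"<div id='map' style='height:100vh'></div>"
            @"<script src='https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY'></script>"
            @"<script>"
            @"  var map = new google.maps.Map(document.getElementById('map'),"
            @"    {center: {lat: %f, lng: %f}, zoom: 6});"
            @"  new google.maps.Marker({position: {lat: %f, lng: %f}, map: map});"
            @"  new google.maps.Marker({position: {lat: %f, lng: %f}, map: map});"
            @"  new google.maps.Polyline({path: [{lat: %f, lng: %f}, {lat: %f, lng: %f}], map: map});"
            @"</script></body></html>",
            a.latitude, a.longitude,
            a.latitude, a.longitude,
            b.latitude, b.longitude,
            a.latitude, a.longitude, b.latitude, b.longitude];
        [self.webView loadHTMLString:html baseURL:nil];
    }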
Hope this will be helpful!

Related

Is screen share annotation supported with Agora WebSDK?

I am trying to build a platform where I can interact with somebody and annotate on the shared screen. I am exploring Agora; a normal one-to-one call with screen sharing seems to work, but I wanted to know whether an annotation feature exists. I searched the Agora docs and could not find anything.
Please share any useful links.
Screen annotation isn't supported by the Web SDK yet. You can either make a feature request or build it on top of the screen share yourself: draw the annotations on an HTML canvas overlaid on the screen share, and send the drawing data to the remote users using Agora RTM.

Does Telegram API support swipes?

If I have a bunch of pictures, is there any way to see all of them as in the right-hand image, instead of the standard way (left-hand image)?
It's not the responsibility of the Telegram API to support swipes; that comes down to the UI library you are using to build your app. Native apps should support swipes by default. However, judging by your UI widget, you are trying to implement a carousel, since there are forward and back icons. If you let us know which platform you are targeting and which language/framework you are using, there should be plenty of useful suggestions.

How can I determine if my Cocoa Desktop application is on the list of apps to be opened at login?

I am developing a sandboxed Cocoa application. I have successfully implemented the launch-at-login feature by using the Service Management function:
SMLoginItemSetEnabled
I based the implementation on this tutorial.
But now I need a way to determine whether my app is set to launch at login, so that I can show the button in the appropriate state on launch. I would expect a similar function for finding out whether a bundle identifier is on the list of login items, but I couldn't find one.
Another problem is that, although this approach is recommended by Apple, my app is still inconsistent with the "Open at Login" tick in its Dock menu.
You can do it using the functions in the LSSharedFileList.h header, which is in LaunchServices.framework, which is in CoreServices.framework. As far as I can tell, Apple hasn't documented this stuff except in the header comments, but that's probably enough. The basic outline is that you first create a LSSharedFileListRef using the function LSSharedFileListCreate for the list type kLSSharedFileListSessionLoginItems. Then copy a snapshot of the list (which is a CFArrayRef) using LSSharedFileListCopySnapshot. Then for any item in the array, you can get its URL using LSSharedFileListItemCopyResolvedURL. That last function requires Mac OS X 10.10 or later, while I think the others date back to 10.5.
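A minimal read-only sketch of that outline; MyAppIsLoginItem is a hypothetical helper that checks whether the running app's bundle URL appears in the session login items:

    #import <Cocoa/Cocoa.h>
    #import <CoreServices/CoreServices.h>

    static BOOL MyAppIsLoginItem(void) {
        NSURL *appURL = [[NSBundle mainBundle] bundleURL];
        LSSharedFileListRef list =
            LSSharedFileListCreate(NULL, kLSSharedFileListSessionLoginItems, NULL);
        if (!list) return NO;

        UInt32 seed = 0;
        CFArrayRef snapshot = LSSharedFileListCopySnapshot(list, &seed);
        BOOL found = NO;
        for (id item in (__bridge NSArray *)snapshot) {
            LSSharedFileListItemRef itemRef = (__bridge LSSharedFileListItemRef)item;
            // 10.10+; returns NULL if the item cannot be resolved.
            CFURLRef itemURL = LSSharedFileListItemCopyResolvedURL(itemRef, 0, NULL);
            if (itemURL) {
                found = [appURL isEqual:(__bridge NSURL *)itemURL];
                CFRelease(itemURL);
            }
            if (found) break;
        }
        if (snapshot) CFRelease(snapshot);
        CFRelease(list);
        return found;
    }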
By the way, the docs on SMLoginItemSetEnabled say that it's for setting an embedded helper app as a login item, but it sounds like you're talking about a freestanding app.
Apple's sandbox documentation says:
With App Sandbox, you cannot create a login item using functions in the LSSharedFileList.h header file. For example, you cannot use the function LSSharedFileListInsertItemURL.
But maybe you can still use the shared file list functions on a read-only basis.

Show the user's heading (north, east, west, south) using the Google Maps SDK for iOS

I am working on an application where I want to show the user his heading, like in Apple Maps (magnetic direction): a small triangle that rotates as the user physically moves while using the app. I am using the Google Maps SDK and would like to know if Google provides this option; I have researched a bit but didn't find anything. Their official Google Maps app doesn't have the feature either.
I would also like to know if there is a better alternative.
Simply adding mapView.myLocationEnabled = YES; should cover your requirement, except if you need this feature to work well in China.
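That only draws the location dot, though. For the rotating heading triangle itself, one option (not part of the SDK; headingMarker below is a hypothetical GMSMarker you would add yourself) is to rotate your own marker from compass updates:

    #import <GoogleMaps/GoogleMaps.h>
    #import <CoreLocation/CoreLocation.h>

    - (void)viewDidLoad {
        [super viewDidLoad];
        self.mapView.myLocationEnabled = YES;        // blue "my location" dot
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingHeading]; // compass (magnetometer) updates
    }

    // CLLocationManagerDelegate: rotate the custom marker to match the
    // device's magnetic heading, in degrees clockwise from north.
    - (void)locationManager:(CLLocationManager *)manager
           didUpdateHeading:(CLHeading *)newHeading {
        self.headingMarker.rotation = newHeading.magneticHeading;
    }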

Identify the monitor with the browser window in FireBreath

I am using FireBreath to create a cross-browser plugin which makes use of some native libraries for the respective platforms (some .NET-based DLLs for Windows and Objective-C-based dylibs/frameworks for Mac). The native libraries display UI screens. To improve usability when the user has a multi-monitor/extended-desktop setup, I would like the native UIs to appear on the same screen the browser window is currently on.
If an identifier for the monitor containing the browser window can be retrieved, it can be passed down to the native components, which can be configured to display their UIs on that monitor. I have used FireBreath's getWindowPosition() method to get the rect coordinates of the plugin and used that information to identify the correct monitor on Windows.
However, on Mac the returned coordinates always seem to be 0 (or 1), irrespective of the monitor on which the browser window resides. I understand that an event model and a drawing model have to be configured for this to work on Mac. I have tried the following event/drawing model combinations without much success:
1) Cocoa/CoreGraphics
2) Carbon/CoreGraphics
Any help in this regard is much appreciated. Please also share any other approaches to achieving the same goal: identifying the monitor on which the currently active browser window resides on Mac. I am unsure at this point, but it may be possible to achieve this at the Objective-C level (without any changes at the FireBreath level). Also note that I want to support the Safari, Firefox and Chrome browsers.
You won't like this answer, but simply put, you can't do that on Mac. The problem is that with CoreGraphics you are only given a CGContextRef to work with, and it doesn't know where it will be drawn. It was technically possible in older browsers to get an NSWindow by exploiting some internal implementation details, but in many browsers that's no longer possible, and it was never supported.
The other drawing models are the same; with CoreAnimation you have a CALayer, but it doesn't know which screen or monitor it is drawn to. I personally find it a bit annoying as well, but I do not know of any way to find out which monitor your plugin is rendered to, particularly since most browsers actually copy the buffer to something else and render it in a different process.
I did manage to come up with a workaround, and I am posting it here for the completeness of the thread. As @taxilian explained, it is not possible to retrieve the plugin coordinates using the window reference. As an alternative approach, the JavaScript Window object has two properties, screenX and screenY, which return the X and Y coordinates of the browser window relative to the screen. If the user has an extended-monitor setup, these are absolute coordinates with respect to the full extended desktop. We can use these values to determine which monitor the browser window is on (if the X coordinate is outside the bounds of the primary monitor's width, the browser must be on the extended monitor). DOM properties can be retrieved from FireBreath as explained in the following link:
http://www.firebreath.org/display/documentation/Invoking+methods+on+the+DOM
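On the Objective-C side, once the screenX/screenY values have been passed down, a sketch like the following can map them to an NSScreen. Note that the DOM uses a top-left origin while AppKit uses a bottom-left origin, so Y is flipped against the primary screen's height:

    #import <AppKit/AppKit.h>

    // Returns the screen containing the browser window's top-left corner,
    // given the DOM's window.screenX/window.screenY values.
    static NSScreen *ScreenForBrowserWindow(CGFloat screenX, CGFloat screenY) {
        // [NSScreen screens][0] is the primary screen, whose frame has origin (0, 0).
        NSScreen *primary = [[NSScreen screens] firstObject];
        NSPoint point = NSMakePoint(screenX, NSMaxY(primary.frame) - screenY);
        for (NSScreen *screen in [NSScreen screens]) {
            if (NSPointInRect(point, screen.frame)) {
                return screen;
            }
        }
        return primary; // fallback if the point falls outside every screen
    }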