Auto open OS X app when specific USB device connected - objective-c

I am writing an OS X app and want to open it whenever a specific USB device is connected (specifically a camera).
As per my research, one solution is to use launchd to auto-open the app when a USB device is connected, by creating a plist file with specific matching criteria. I was able to open the app when any USB device is connected (an iPhone or a pen drive, for example), but I could not figure out a way to open it only when a camera is connected. Below is the plist snippet I am using for launchd:
<key>LaunchEvents</key>
<dict>
    <key>com.apple.iokit.matching</key>
    <dict>
        <key>com.apple.device-attach</key>
        <dict>
            <key>bDeviceClass</key>
            <integer>0</integer>
            <key>bDeviceSubClass</key>
            <integer>0</integer>
            <key>bDeviceProtocol</key>
            <integer>0</integer>
            <key>IOProviderClass</key>
            <string>IOUSBDevice</string>
            <key>IOMatchStream</key>
            <true/>
            <key>IOMatchLaunchStream</key>
            <true/>
        </dict>
    </dict>
</dict>
My questions are:
1. Is there any other way to auto-open the app without using launchd? I have seen apps that open when a specific USB device is connected, such as iPhoto and iTunes when an iPhone is connected, and many custom apps as well.
2. If there is no way apart from launchd, is it possible to modify the launchd plist so that the app opens only when a camera is connected to the system?
Thanks in advance for your help.
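(Regarding question 2: com.apple.device-attach takes a standard IOKit matching dictionary, so it should also accept device-specific keys such as idVendor and idProduct, which would restrict the match to a single camera model; the real values show up in System Information > USB. A small Objective-C test program, sketched below with placeholder IDs, can verify the matching criteria before they go into the plist.)

#import <Foundation/Foundation.h>
#import <IOKit/IOKitLib.h>
#import <IOKit/usb/IOUSBLib.h>

static void deviceAttached(void *refcon, io_iterator_t iterator) {
    io_service_t device;
    while ((device = IOIteratorNext(iterator)) != 0) {
        NSLog(@"Matching USB device attached");
        IOObjectRelease(device);
    }
}

int main(int argc, const char *argv[]) {
    // Placeholder IDs; substitute the camera's actual vendor/product IDs.
    int vendorID = 0x04a9, productID = 0x1234;

    CFMutableDictionaryRef matching = IOServiceMatching(kIOUSBDeviceClassName);
    CFNumberRef vendor = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &vendorID);
    CFNumberRef product = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &productID);
    CFDictionarySetValue(matching, CFSTR(kUSBVendorID), vendor);   // "idVendor"
    CFDictionarySetValue(matching, CFSTR(kUSBProductID), product); // "idProduct"
    CFRelease(vendor);
    CFRelease(product);

    IONotificationPortRef port = IONotificationPortCreate(kIOMasterPortDefault);
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       IONotificationPortGetRunLoopSource(port),
                       kCFRunLoopDefaultMode);

    io_iterator_t iterator;
    IOServiceAddMatchingNotification(port, kIOFirstMatchNotification, matching,
                                     deviceAttached, NULL, &iterator);
    deviceAttached(NULL, iterator); // drain existing matches to arm the notification
    CFRunLoopRun();
    return 0;
}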

Sharing my research and working solution so that others can use it if needed.
After further research, it appears that OS X Image Capture keeps a mapping of devices to auto-open app paths in the com.apple.ImageCapture2 plist file.
If you run defaults -currentHost read com.apple.ImageCapture2 in Terminal, you will get output like the following (it may be empty if no app has used the auto-open feature yet):
"00000000-0000-0000-0000-000004A93AAA" = {
ICADeviceTypeKey = ICADeviceTypeCamera;
autolaunchApplicationPath = "";
};
HotPlugActionArray = (
);
HotPlugActionPath = "";
LastHotPlugActionPath = "";
Here HotPlugActionPath is the key: any app path assigned to it will be opened when a device is connected to the system. You can also assign a specific app to a particular device using its persistentID string (00000000-0000-0000-0000-000004A93AAA) and autolaunchApplicationPath. I edited this plist file programmatically, assigned my app's path there, and it works as I expected.
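A minimal sketch of the programmatic edit using CFPreferences (the keys are the undocumented ones observed above, so they may change between OS X releases; the app path is a placeholder):

#import <Foundation/Foundation.h>

int main(int argc, const char *argv[]) {
    @autoreleasepool {
        CFStringRef domain = CFSTR("com.apple.ImageCapture2");
        CFStringRef appPath = CFSTR("/Applications/MyCameraApp.app"); // placeholder

        // The file is per-host ("defaults -currentHost"), so write with
        // kCFPreferencesCurrentHost rather than kCFPreferencesAnyHost.
        CFPreferencesSetValue(CFSTR("HotPlugActionPath"), appPath, domain,
                              kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);

        // Optionally bind the app to one specific device via its persistentID.
        NSDictionary *deviceEntry = @{ @"ICADeviceTypeKey" : @"ICADeviceTypeCamera",
                                       @"autolaunchApplicationPath" : (__bridge NSString *)appPath };
        CFPreferencesSetValue(CFSTR("00000000-0000-0000-0000-000004A93AAA"),
                              (__bridge CFPropertyListRef)deviceEntry, domain,
                              kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);

        CFPreferencesSynchronize(domain, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
    }
    return 0;
}

The same edit can also be made from Terminal with defaults -currentHost write com.apple.ImageCapture2 HotPlugActionPath /Applications/MyCameraApp.app.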

Related

How to open default iOS apps from a React Native app?

I'm creating a React Native app and I'd like to be able to open default apps from it. Specifically:
Phone.app (clicking on a button in my app and opening iOS modal to call a specific number)
Apple Maps App (clicking on a button in my app and opening the Apple Maps with routing to specified target destination)
I understand I can open the system maps with
<TouchableOpacity
    onPress={() =>
        Linking.openURL(`geo:${latitude},${longitude}`)}>
    <Text>Open Map</Text>
</TouchableOpacity>
and similarly call openURL("tel:" + telephone) for phone number. (I omit error handling in the code snippet purposefully.)
Next, it should be necessary to update ios/Info.plist to allow these application query schemes. I tried putting this into it:
<key>LSApplicationQueriesSchemes</key>
<array>
    <string>geo</string>
    <string>tel</string>
</array>
I'm still getting EUNSPECIFIED errors. I guess that maybe I'm not supposed to put the LSApplicationQueriesSchemes keys as geo and tel. I was, however, unable to find out what should go in there.
Thanks!
If you are using the simulator, only web URLs can be tested; other URL schemes require an actual device for testing.
Also, geo: is an Android-only scheme. On iOS, open Apple Maps through its own URL scheme instead, for example Linking.openURL(`http://maps.apple.com/?daddr=${latitude},${longitude}`) for directions. Note that LSApplicationQueriesSchemes only whitelists schemes for canOpenURL checks; openURL itself does not need it.

How to add NSPhotoLibraryUsageDescription, NSCameraUsageDescription, and NSMicrophoneUsageDescription to info.plist for react-native-image-picker

I'm new to Xcode and React Native. I'm trying to use react-native-image-picker to add a user profile (uploaded to S3). react-native-image-picker's getting-started guide assumes knowledge of Info.plist. I'm not 100% sure how to proceed given:
For iOS 10+, Add the NSPhotoLibraryUsageDescription,
NSCameraUsageDescription, and NSMicrophoneUsageDescription (if
allowing video) keys to your Info.plist with strings describing why
your app needs these permissions
I know the Info.plist files are found in the ios folder, but:
- which Info.plist do these permissions need to be added to (there are multiple inside the ios folder: build, RNapp, RNapp-tvOS, RNapp.xcodeproj, etc.)?
- how does the XML look?
- should this be happening in Xcode instead of my text editor?
If you don't provide the privacy keys in Info.plist, your app will crash; the crash log tells you which key was missing.
Open the Info.plist of your Xcode project in a text editor and add the entries below.
Adding these grants the permission prompts for using the camera, photo library, and microphone (for video):
<key>NSCameraUsageDescription</key>
<string>${PRODUCT_NAME} Camera Usage</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>${PRODUCT_NAME} PhotoLibrary Usage</string>
<key>NSMicrophoneUsageDescription</key>
<string>${PRODUCT_NAME} Microphone Usage</string>
Add this to your Info.plist:
<key>NSPhotoLibraryUsageDescription</key>
<string>Photo Library Access Warning</string>
You want to edit the plist that you need the permissions for. If you are making a mobile app, that would be RNapp.
You could do this in a text editor, but the easiest way is in Xcode.
Open the plist and, on the last item (making sure it is not expanded), hit the + button to create a new row for a key/value pair. Xcode should autocomplete the keys listed above and set the value to the appropriate type.
Hope this helps.
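For reference, these Info.plist strings are what iOS shows in the runtime permission prompt; react-native-image-picker triggers the prompt from native code, so you don't write this yourself, but a minimal Objective-C sketch of the underlying check looks like this:

#import <AVFoundation/AVFoundation.h>

// iOS displays the NSCameraUsageDescription string from Info.plist in
// the system alert the first time this runs.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    NSLog(@"Camera access %@", granted ? @"granted" : @"denied");
}];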

Capture screen and audio with objective-c

I'm using an AVCaptureSession to create a screen recording (OS X), but I'd also like to add the computer's audio to it (not the microphone, but whatever is playing through the speakers). I'm not really sure how to do that, so the first thing I tried was adding an audio device like so:
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
After adding this device the audio was recorded, but it sounded like it was captured through the microphone. Is it possible to actually capture the computer's output sound this way, like QuickTime does?
Here's an open-source framework that supposedly makes capturing speaker output as easy as taking a screenshot:
https://github.com/pje/WavTap
The home page for WavTap does mention that it requires kernel-extension signing privileges to run under OS X 10.10 and newer, and that requires signing into your Apple Developer account and submitting this form. More information can be found here.
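A sketch of how the capture session might be wired up once a loopback driver such as WavTap is installed (the device lookup by localized name is an assumption; check the names AVCaptureDevice reports on your machine):

#import <AVFoundation/AVFoundation.h>
#import <CoreGraphics/CoreGraphics.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Screen input for the recording itself.
AVCaptureScreenInput *screenInput =
    [[AVCaptureScreenInput alloc] initWithDisplayID:CGMainDisplayID()];
if ([session canAddInput:screenInput]) [session addInput:screenInput];

// Pick the loopback device instead of the default (microphone) device.
AVCaptureDevice *loopback = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio]) {
    if ([device.localizedName rangeOfString:@"WavTap"].location != NSNotFound) {
        loopback = device;
        break;
    }
}
if (loopback) {
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput =
        [AVCaptureDeviceInput deviceInputWithDevice:loopback error:&error];
    if (audioInput && [session canAddInput:audioInput]) [session addInput:audioInput];
}
[session startRunning];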

iOS Location Background Service stops after 10 minutes

I have a Titanium app that registers an iOS background service, which logs the device's GPS data every 30 seconds. I've registered it as a location service, which is supposed to prevent it from stopping after 10 minutes; however, it's not working. Here is the relevant portion of my tiapp.xml:
<ios>
    <plist>
        <dict>
            <key>UISupportedInterfaceOrientations~iphone</key>
            <array>
                <string>UIInterfaceOrientationPortrait</string>
                <string>UIInterfaceOrientationPortraitUpsideDown</string>
                <string>UIInterfaceOrientationLandscapeLeft</string>
                <string>UIInterfaceOrientationLandscapeRight</string>
            </array>
            <key>UISupportedInterfaceOrientations~ipad</key>
            <array>
                <string>UIInterfaceOrientationPortrait</string>
                <string>UIInterfaceOrientationPortraitUpsideDown</string>
                <string>UIInterfaceOrientationLandscapeLeft</string>
                <string>UIInterfaceOrientationLandscapeRight</string>
            </array>
            <key>UIBackgroundModes</key>
            <array>
                <string>location</string>
            </array>
            <key>UIRequiredDeviceCapabilities</key>
            <array>
                <string>gps</string>
                <string>location-services</string>
            </array>
        </dict>
    </plist>
</ios>
Here is how I register it in alloy.js:
if (utils.ios) {
    console.log('registering ios background service');
    Ti.App.iOS.registerBackgroundService({ url: 'tracking/backgroundService.js' });
}
And the background service itself:
var timeout = constants.tracking.interval * 1000;
console.log('starting background gps tracking');
setInterval(function() {
    var user = settings.user();
    if (user && user.password) {
        // user is logged in, let's track them
        gpsTracking.track();
    } else {
        console.log('user is not logged in so not tracking');
    }
}, timeout);
This was tested on the iPhone Simulator; I haven't tested on an actual iOS device because the developer site is still down, so I can't create a provisioning profile.
I checked my info.plist in the build folder and it's correctly adding the key/array values for UIBackgroundModes and UIRequiredDeviceCapabilities, so I'm not sure what to check next.
Any ideas?
Please refer to Chris Bill's answer here; I'll summarize it for you.
Here is a great resource for understanding how an app interacts with the background. Only a few types of apps are allowed to keep running (and executing code) in the background for extra time (I think the normal limit is about 10 minutes):
- Apps that play audible content to the user while in the background, such as a music player app
- Apps that keep users informed of their location at all times, such as a navigation app
- Apps that support Voice over Internet Protocol (VoIP)
- Newsstand apps that need to download and process new content
- Apps that receive regular updates from external accessories
Support for some types of background execution must be declared in advance by the app that uses them. An app declares support for a service using its Info.plist file. Add the UIBackgroundModes key to your Info.plist file and set its value to an array containing one or more of the following strings:
- audio: The app plays audible content to the user while in the background. (This content includes streaming audio or video content using AirPlay.)
- location: The app keeps users informed of their location, even while it is running in the background.
- voip: The app provides the ability for the user to make phone calls using an Internet connection.
- newsstand-content: The app is a Newsstand app that downloads and processes magazine or newspaper content in the background.
- external-accessory: The app works with a hardware accessory that needs to deliver updates on a regular schedule through the External Accessory framework.
- bluetooth-central: The app works with a Bluetooth accessory that needs to deliver updates on a regular schedule through the Core Bluetooth framework.
So take the info.plist file from your build folder and put it into the root of your project.
Then edit it as described above. Hope this helps!
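For reference, a minimal Objective-C sketch of what the location background mode enables natively; Titanium wires this up for you once UIBackgroundModes contains location, so this is illustrative only:

#import <CoreLocation/CoreLocation.h>

// Inside a class that adopts CLLocationManagerDelegate:
CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;
manager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters;
// With "location" in UIBackgroundModes, updates keep arriving after the
// app is backgrounded instead of stopping at the ~10 minute limit.
[manager startUpdatingLocation];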
I faced the same problem. It seems Titanium has some caching logic for GPS locations and can't get the actual latitude and longitude. I also hit the same issue as you: after 10 minutes it stopped, and if I opened the system iOS Maps app, got a location successfully, and then went back to the Titanium app, it worked again. Very weird.
After a long investigation and a lot of trial and error, I finally solved the problem by setting the accuracy to ACCURACY_NEAREST_TEN_METERS and using the location event. Now I get continuous GPS updates, and the accuracy is very good too, about 5 meters.
Ti.Geolocation.accuracy = Ti.Geolocation.ACCURACY_NEAREST_TEN_METERS;
Ti.Geolocation.addEventListener('location', evtGpsResult);
Hope it solves your problem.

iTunes scripting with Scripting Bridge & Sandboxing

I have an app which tells iTunes to play music using the ScriptingBridge framework. The app either tells iTunes to play a playlist or a certain track. The app is also sandboxed.
To play a playlist, here's what I have:
iTunesPlaylist* playlist = ...
[playlist playOnce: YES];
To play a track, it's pretty straightforward as well:
iTunesTrack* track = ...
[track playOnce: YES];
Since my app is sandboxed, I have the following lines in my entitlements file:
<key>com.apple.security.scripting-targets</key>
<dict>
    <key>com.apple.iTunes</key>
    <array>
        <!-- library.read lets me read the user's playlists/tracks -->
        <string>com.apple.iTunes.library.read</string>
        <string>com.apple.iTunes.playback</string>
    </array>
</dict>
I have tested without app sandboxing and the code works perfectly. With sandboxing, though, the playlist code works fine, but playing a track does not. I checked with the Console app, and nothing concerning sandboxd and my app seems to be logged.
At first I thought I might be missing some access group in my entitlements file, but that wouldn't make sense because I already have the playback one. I also couldn't find any list of access groups for iTunes on the net (I even tried using sdef to dump iTunes's scripting definition and searched it for 'access-group', but found nothing), so I couldn't confirm whether I needed more.
To sum up: why is the sandbox preventing this from working?
Never mind. It turns out I was calling filteredArrayUsingPredicate: on an SBElementArray to find the track I wanted to play, and that somehow was messing things up. Now I use the method objectWithName: and it works.
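Since SBElementArray resolves lazily, a by-name specifier such as objectWithName: is evaluated inside iTunes itself, whereas filteredArrayUsingPredicate: forces the whole array to be fetched and filtered in the sandboxed client. A rough sketch of the working lookup (the source index and the playlist/track names are placeholders; the header comes from sdef/sdp as shown in the comment):

#import <ScriptingBridge/ScriptingBridge.h>
// Generated with: sdef /Applications/iTunes.app | sdp -fh --basename iTunes
#import "iTunes.h"

iTunesApplication *iTunes =
    [SBApplication applicationWithBundleIdentifier:@"com.apple.iTunes"];

// objectWithName: builds a by-name object specifier that iTunes resolves,
// so no client-side filtering of the SBElementArray is needed.
iTunesSource *library = [[iTunes sources] objectAtIndex:0]; // usually the local library
iTunesPlaylist *playlist = [[library playlists] objectWithName:@"My Playlist"];
iTunesTrack *track = [[playlist tracks] objectWithName:@"Some Track Name"];
[track playOnce:YES];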