I have an app that tells iTunes to play music using the ScriptingBridge framework. The app tells iTunes to play either a playlist or a specific track. The app is also sandboxed.
To play a playlist, here's what I have:
iTunesPlaylist* playlist = ...
[playlist playOnce: YES];
To play a track, it's pretty straightforward as well:
iTunesTrack* track = ...
[track playOnce: YES];
Since my app is sandboxed, I have the following lines in my entitlements file:
<key>com.apple.security.scripting-targets</key>
<dict>
<key>com.apple.iTunes</key>
<array>
<string>com.apple.iTunes.library.read</string> <!-- also needed to read the playlists/tracks in the user's library -->
<string>com.apple.iTunes.playback</string>
</array>
</dict>
I have tested without app sandboxing and the code works perfectly. With sandboxing, though, the playlist code works fine, but playing a track does not. I checked with the Console app and nothing that concerns sandboxd and my app seems to be logged.
At first I thought I might be missing some access group in my entitlements file, but that wouldn't make sense because I already have the playback one. I also couldn't find any list of access groups for iTunes on the net (I even ran sdef against iTunes and searched its output for 'access-group', but found nothing; it's not there), so I couldn't confirm whether I needed any more.
To sum up: why is the sandbox preventing this from working?
Never mind. It turns out I was calling filteredArrayUsingPredicate: on an SBElementArray to find the track I wanted to play, and that was somehow breaking things under the sandbox. Now I use objectWithName: instead and it works.
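For anyone who hits the same thing, the change looked roughly like this (a sketch; trackName is a placeholder, and the 'before' lines are illustrative rather than my exact predicate):
// Before: filtering the SBElementArray myself (this broke under the sandbox):
// NSArray* matches = [[playlist tracks] filteredArrayUsingPredicate:
//     [NSPredicate predicateWithFormat:@"name == %@", trackName]];
// iTunesTrack* track = [matches lastObject];
// After: letting ScriptingBridge resolve the track by name:
iTunesTrack* track = [[playlist tracks] objectWithName: trackName];
[track playOnce: YES];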
I'm using an AVCaptureSession to create a screen recording (OS X), but I'd also like to add the computer's audio to it (not the microphone, but anything that's playing through the speakers). I'm not really sure how to do that, so the first thing I tried was adding an audio device like so:
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
After adding this device the audio was recorded, but it sounded like it was captured through the microphone. Is it possible to actually capture the computer's output sound this way, like QuickTime does?
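For completeness, here's roughly how I'm adding it to the session (error handling trimmed):
AVCaptureSession* session = [[AVCaptureSession alloc] init];
AVCaptureDevice* audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError* error = nil;
AVCaptureDeviceInput* audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (audioInput && [session canAddInput:audioInput]) {
    // This grabs the default audio *capture* (input) device, i.e. the
    // microphone, which would explain what I'm hearing in the recording.
    [session addInput:audioInput];
}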
Here's an open source framework that supposedly makes capturing speaker output as easy as taking a screenshot:
https://github.com/pje/WavTap
The home page for WavTap does mention that it requires kernel extension signing privileges to run under OS X 10.10 and newer, which requires signing into your Apple Developer Account and submitting this form. More information can be found here.
I am writing an OS X app and want to open it whenever a specific USB device is connected (specifically a camera).
As per my research, one solution is to use launchd to auto-open the app when a USB device is connected, by creating a plist file with specific matching criteria. I was able to open the app when any USB device is connected (an iPhone or a pen drive, for example), but I was not able to figure out a way to open the app only when a camera is connected. Below is the plist snippet I am using for launchd:
<key>LaunchEvents</key>
<dict>
<key>com.apple.iokit.matching</key>
<dict>
<key>com.apple.device-attach</key>
<dict>
<key>bDeviceClass</key>
<integer>0</integer>
<key>bDeviceSubClass</key>
<integer>0</integer>
<key>bDeviceProtocol</key>
<integer>0</integer>
<key>IOProviderClass</key>
<string>IOUSBDevice</string>
<key>IOMatchStream</key>
<true/>
<key>IOMatchLaunchStream</key>
<true/>
</dict>
</dict>
</dict>
My questions are:
Is there any other way to auto-open the app without using launchd? I have seen apps that open when a specific USB device is connected, like iPhoto and iTunes when an iPhone is connected, and even many custom apps.
If there's no other way apart from launchd, is it possible to modify the launchd plist to open the app only when a camera is connected to the system?
Thanks in advance for your help.
Sharing my research and working solution so that others can use it if needed.
After further research, it seems that OS X's Image Capture keeps a mapping from devices to auto-open app paths in the com.apple.ImageCapture2 plist file.
If you run defaults -currentHost read com.apple.ImageCapture2 in Terminal, you will get output something like this (it may be empty if no app has used the auto-open feature):
"00000000-0000-0000-0000-000004A93AAA" = {
ICADeviceTypeKey = ICADeviceTypeCamera;
autolaunchApplicationPath = "";
};
HotPlugActionArray = (
);
HotPlugActionPath = "";
LastHotPlugActionPath = "";
Here HotPlugActionPath is the key; any app path assigned to it will be opened when a device is connected to the system. You can also assign a specific app to a particular device using the device's persistentID string (00000000-0000-0000-0000-000004A93AAA above) and autolaunchApplicationPath. I edited this plist file programmatically, assigned my app's path there, and it works as I expected.
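In case it helps, the programmatic edit was along these lines (a sketch: the persistentID is the one from the listing above, and the app path is a placeholder for your own):
// Write a per-device entry into the current-user/current-host domain,
// which is the same domain that 'defaults -currentHost' reads.
CFStringRef appID = CFSTR("com.apple.ImageCapture2");
NSDictionary* entry = @{ @"ICADeviceTypeKey" : @"ICADeviceTypeCamera",
                         @"autolaunchApplicationPath" : @"/Applications/MyApp.app" };
CFPreferencesSetValue(CFSTR("00000000-0000-0000-0000-000004A93AAA"),
                      (__bridge CFPropertyListRef)entry,
                      appID, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
CFPreferencesSynchronize(appID, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
You could probably achieve the same from Terminal with defaults -currentHost write com.apple.ImageCapture2, but I went through the API.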
I have a Titanium app that registers an iOS background service, which logs the device's GPS data every 30 seconds. I've registered it as a location service, which is supposed to prevent it from being stopped after 10 minutes; however, it's not working. Here is the relevant portion of my tiapp.xml:
<ios>
<plist>
<dict>
<key>UISupportedInterfaceOrientations~iphone</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationPortraitUpsideDown</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
</array>
<key>UISupportedInterfaceOrientations~ipad</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationPortraitUpsideDown</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
</array>
<key>UIBackgroundModes</key>
<array>
<string>location</string>
</array>
<key>UIRequiredDeviceCapabilities</key>
<array>
<string>gps</string>
<string>location-services</string>
</array>
</dict>
</plist>
</ios>
Here is how I register it in alloy.js:
if(utils.ios) {
console.log('registering ios background service');
Ti.App.iOS.registerBackgroundService({ url: 'tracking/backgroundService.js' });
}
And the background service itself:
var timeout = constants.tracking.interval * 1000;
console.log('starting background gps tracking');
setInterval(function() {
var user = settings.user();
if(user && user.password) {
//user is logged in, let's track them.
gpsTracking.track();
}
else {
console.log('user is not logged in so not tracking');
}
}, timeout);
This was tested on the iPhone Simulator; I haven't tested on an actual iOS device because the developer site is still down, so I can't create a provisioning profile.
I checked my Info.plist in the build folder and it's correctly adding the key/array values for UIBackgroundModes and UIRequiredDeviceCapabilities, so I'm not sure what to check next.
Any ideas?
Please refer to Chris Bill's answer here. I'll describe it for you.
Here is a great resource for understanding how the app interacts with the background. There are only a couple of different types of apps that are allowed to run in the background (and execute code) for extra time (I think the normal limit is about 10 minutes):
Apps that play audible content to the user while in the background, such as a music player app
Apps that keep users informed of their location at all times, such as a navigation app
Apps that support Voice over Internet Protocol (VoIP)
Newsstand apps that need to download and process new content
Apps that receive regular updates from external accessories
Support for some types of background execution must be declared in advance by the app that uses them. An app declares support for a service using its Info.plist file. Add the UIBackgroundModes key to your Info.plist file and set its value to an array containing one or more of the following strings:
audio : The app plays audible content to the user while in the background. (This content includes streaming audio or video content using AirPlay.)
location : The app keeps users informed of their location, even while it is running in the background.
voip : The app provides the ability for the user to make phone calls using an Internet connection.
newsstand-content : The app is a Newsstand app that downloads and processes magazine or newspaper content in the background.
external-accessory : The app works with a hardware accessory that needs to deliver updates on a regular schedule through the External Accessory framework.
bluetooth-central : The app works with a Bluetooth accessory that needs to deliver updates on a regular schedule through the Core Bluetooth framework.
So take the Info.plist file from your build folder and put it into the root of your project, then edit it as described above. Hope this helps!
I faced the same problem. It seems Titanium has some caching logic for the GPS location and can't get the actual latitude and longitude. I also hit the same issue as you: after 10 minutes it stopped, and if I opened the system iOS Maps app to get a location successfully and then went back to the Titanium app, it worked again. Very weird.
After a long investigation and much trial and error, I finally solved the problem by setting the accuracy to ACCURACY_NEAREST_TEN_METERS and using the location event. Now I get continuous GPS updates, and the accuracy is also very good, around 5 meters.
Ti.Geolocation.accuracy = Ti.Geolocation.ACCURACY_NEAREST_TEN_METERS;
Ti.Geolocation.addEventListener('location', evtGpsResult);
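For reference, the handler itself can be minimal; a sketch (evtGpsResult is just my callback name):
function evtGpsResult(e) {
    if (e.error) {
        Ti.API.error('location error: ' + e.error);
        return;
    }
    // e.coords carries the fix delivered by the 'location' event
    Ti.API.info('lat ' + e.coords.latitude + ', lon ' + e.coords.longitude);
}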
Hope it solves your problem.
I am developing an iPhone game, and I will implement Game Center in it.
Apparently, I need to do some configuration for my application in iTunes Connect first. Therefore I have to "create" my application in iTunes Connect, right? But how do I do that without having Apple's staff review my app, since I am still working on it?
I tried creating a new app as normal, but it asks me for a bunch of things like screenshots, etc., and I have none to offer in the first place...
Fixed. It seems they won't even review it if there is no binary uploaded yet.
I'm trying to play a YouTube video within an iPhone app using the technique described at this URL:
http://iphoneincubator.com/blog/audio-video/how-to-play-youtube-videos-within-an-application
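That technique boils down to loading a small <embed> snippet into a UIWebView, roughly like this (reconstructed; the method and variable names here are mine, not from the post):
- (void)embedYouTube:(NSString*)urlString inFrame:(CGRect)frame {
    // Build a tiny HTML page whose only content is the YouTube <embed> tag.
    NSString* embedHTML = [NSString stringWithFormat:
        @"<html><body style=\"margin:0\"><embed src=\"%@\" "
        @"type=\"application/x-shockwave-flash\" width=\"%0.0f\" "
        @"height=\"%0.0f\"></embed></body></html>",
        urlString, frame.size.width, frame.size.height];
    UIWebView* videoView = [[UIWebView alloc] initWithFrame:frame];
    [videoView loadHTMLString:embedHTML baseURL:nil];
    [self.view addSubview:videoView];
    [videoView release]; // pre-ARC, as was standard in the 3.1.x era
}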
The technique works and the video plays fine, except that I'm getting this warning:
warning: Unable to read symbols for "/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.1.2 (7D11)/Symbols/System/Library/Internet Plug-Ins/YouTubePlugIn.webplugin/YouTubePlugIn" (file not found).
The warning did slow down the app the first time I got it. It seems like a lot of people are getting the same warning, but none of the forums I read seem to have a solution for getting rid of it.
Do I need to download something or do something specific?
I also tried adding the YouTube framework from "/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.1.2.sdk/System/Library/PrivateFrameworks/YouTube.framework", but that doesn't seem to solve the issue.
Please enlighten me.
Some clips might not play on mobile devices. Check whether your clip does.
The "/Developer/Platforms/iPhoneOS.platform/DeviceSupport/4.2.1 (8C148)/Symbols/usr/lib/dyld' has changed; re-reading symbols" warnings can be ignored (the clip should play anyway if it is available on mobile devices).
Note: I don't know the exact reason why some clips don't play on mobile; it might be a transcoding issue (the clip not being transcoded for mobile) or some publisher setting. It would be interesting to know.
You need to run your app on the device. The Simulator doesn't have the YouTube app.