Detecting when a space changes in Spaces in Mac OS X - objective-c

Let's say I want to write a simple Cocoa app to make the Spaces feature of Leopard more useful. I would like to configure each space to have, say, different
screen resolutions
keyboard layouts
volume (for audio)
So there are two parts to my question:
I suppose there are ways to modify these three things independently of Spaces, right? If so, how?
How can I detect in my app when a space change occurs, and when that happens, determine what space the user just switched to? Does Leopard send out some distributed notifications or something?
Update: There has to be some public API way of doing this, judging from all the Spaces-related apps on the Mac App Store.

As Peter says, on 10.6 you can use NSWorkspace's NSWorkspaceActiveSpaceDidChangeNotification to get a notification when the active space changes.
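For reference, here's a minimal sketch of registering for it; note that this notification is posted on NSWorkspace's own notification center, not the default center (the observer selector name is just an example):
// Observe space changes; must use NSWorkspace's notification center.
[[[NSWorkspace sharedWorkspace] notificationCenter]
    addObserver:self
       selector:@selector(activeSpaceDidChange:)
           name:NSWorkspaceActiveSpaceDidChangeNotification
         object:nil];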
You can then determine the current space using the Quartz Window Services API; the kCGWindowWorkspace dictionary key holds the workspace number. For example:
int currentSpace = 0;
// Get an array of all the windows in the current space.
CFArrayRef windowsInSpace = CGWindowListCopyWindowInfo(kCGWindowListOptionAll | kCGWindowListOptionOnScreenOnly, kCGNullWindowID);
// Now loop over the array looking for a window with the kCGWindowWorkspace key.
for (NSDictionary *thisWindow in (NSArray *)windowsInSpace)
{
    if ([thisWindow objectForKey:(id)kCGWindowWorkspace])
    {
        currentSpace = [[thisWindow objectForKey:(id)kCGWindowWorkspace] intValue];
        break;
    }
}
CFRelease(windowsInSpace);
Alternatively, you can get the space using a private API; take a look at CGSPrivate.h, which allows you to do this:
int currentSpace = 0;
CGSGetWorkspace(_CGSDefaultConnection(), &currentSpace);
To change the screen resolution you'll want to look at Quartz Display Services; for altering the volume, this may be helpful.
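For the resolution part, a rough sketch using the 10.5-era Quartz Display Services calls (deprecated in 10.6 in favor of CGDisplaySetDisplayMode; the 32-bit/1024x768 values are placeholders) might look like:
boolean_t exactMatch = false;
// Find the display mode closest to the requested depth and size.
CFDictionaryRef mode = CGDisplayBestModeForParameters(kCGDirectMainDisplay, 32, 1024, 768, &exactMatch);
if (mode != NULL)
{
    CGDisplaySwitchToMode(kCGDirectMainDisplay, mode);
}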

NSWorkspace posts an NSWorkspaceActiveSpaceDidChangeNotification on its own notification center, but only on Snow Leopard.

In iOS 9 the uitableview footer text alignment/spacing changed, what can I do?

Hi. The two screenshots below show what changed. Here's my code for it; the code itself didn't change.
- (NSString *)tableView:(UITableView *)tableView titleForFooterInSection:(NSInteger)section
{
    if (section == 0) {
        return @"If enabled, closing apps via the app switcher won't actually close the app itself.\n\nThis option is perfect to use with Fast Freeze.";
    }
    NSString *offDescription = @"OFF\nDisables the backgrounding capability completely. The app has to restart every time you close it.";
    NSString *fastFreezeDescription = @"FAST FREEZE\nThis mode is similar to what 'Smart Close' by rpetrich did. Usually an app has up to 10 minutes to perform tasks in the background before it gets suspended in memory. Since this can be an unnecessary battery drain, Fast Freeze will suspend the app right after you close it.";
    NSString *nativeDescription = @"NATIVE\nThis is Apple's built in way of backgrounding.";
    NSString *unlimitedNativeDescription = @"UNLIMITED NATIVE\nThis background mode allows apps to execute background tasks for an unlimited period of time, so the app won't get suspended in memory after 10 minutes.";
    NSString *foregroundDescription = @"FOREGROUND\nForeground tricks the system into thinking that the app wasn't closed and is still running in foreground. This is the perfect way to continue to listen to internet streams or videos while using another app.";
    return [NSString stringWithFormat:@"\n%@\n\n%@\n\n%@\n\n%@\n\n%@", offDescription, fastFreezeDescription, nativeDescription, unlimitedNativeDescription, foregroundDescription];
}
Here are the screenshots:
This is the view before (iOS 8); notice that "\nOFF\nDisables ..." has the proper gap to the last UITableViewCell, as it should.
This is the view after (iOS 9); the gap is wrong, with too much space there.
So, does anyone know why this happens? If anyone has a fix or a workaround, please tell me!
Thanks in advance!
This seems to be a bug in Apple's code (iOS 9.0 - 9.1): the more lines the footer has, the bigger the misplacement gets.
You can even reproduce this in the storyboard using static table view.
I couldn't find any workaround yet; the best advice I can give is to file a bug report with Apple.
Update
The bug in the storyboard seems to be fixed in Xcode 7.2 beta 2. However, the issue still persists when you run the app, even on the iOS 9.2 simulator.
Update 2
I've narrowed down the reproduction of this bug: basically, something breaks after your app presents a table view section header. Check this repo for details.

multi track mp3 playback for iOS application

I am doing an application that involves playing back a song in a multi track format (drums, vocals, guitar, piano, etc...). I don't need to do any fancy audio processing to each track, all I need to be able to do is play, pause, and mute/unmute each track.
I had been using multiple instances of AVAudioPlayer, but when performing device testing I noticed that the tracks play very slightly out of sync when they are first started. Furthermore, when I pause and resume the tracks, they drift further out of sync. After a bit of research I've realized that AVAudioPlayer just has too much latency and won't work for my application.
In my application I basically had an NSArray of AVAudioPlayers that I would loop through and play each one or pause/stop each one, I'm sure this is what caused it to get out of sync on the device.
It seemed like Apple's audio mixer would work well for me, but when I try implementing it I get an EXC_BAD_ACCESS error that I can't figure out.
I know the answer is to use OpenAL or Audio Units, but it just seems unnecessary to spend weeks learning about these when all I need to do is play around 5 .mp3 tracks at the same time. Does anyone have any suggestions on how to accomplish this? Thanks
Thanks to admsyn's suggestion I was able to come up with a solution.
AVAudioPlayer has a currentTime property that returns the current time of a track and can also be set.
So I implemented the startSynchronizedPlayback as stated by admsyn and then added the following when I stopped the tracks:
-(void) stopAll
{
    NSTimeInterval currentTime = 0;
    NSUInteger count = [tracksArr count];
    for (NSUInteger i = 0; i < count; i++)
    {
        trackModel = [tracksArr objectAtIndex:i];
        if (i == 0)
        {
            currentTime = [trackModel currentTime];
        }
        [trackModel stop];
        [trackModel setCurrentTime:currentTime];
    }
}
This code basically loops through my array of tracks which each hold their own AVAudioPlayer, grabs the current time from the first track, then sets all of the following tracks to that time. Now when I use the startSynchronizedPlayback method they all play in sync, and pausing unpausing keeps them in sync as well. Hope this is helpful to someone else trying to keep tracks in sync.
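For completeness, here is a sketch of the matching synchronized start over the same array, assuming the objects in tracksArr respond to AVAudioPlayer's deviceCurrentTime and playAtTime: (the method and variable names mirror the stopAll code above):
-(void) playAll
{
    NSTimeInterval shortStartDelay = 0.01; // seconds
    // All the players share the audio device's clock, so read "now" once.
    NSTimeInterval now = [[tracksArr objectAtIndex:0] deviceCurrentTime];
    for (trackModel in tracksArr)
    {
        [trackModel playAtTime:now + shortStartDelay];
    }
}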
If you're issuing individual play messages to each AVAudioPlayer, it is entirely likely that the messages are arriving at different times, or that the AVAudioPlayers finish their warm up phase out of sync with each other. You should be using playAtTime: and the deviceCurrentTime property to achieve proper synchronization. Note the description of deviceCurrentTime:
Use this property to indicate “now” when calling the playAtTime: instance method. By configuring multiple audio players to play at a specified offset from deviceCurrentTime, you can perform precise synchronization—as described in the discussion for that method.
Also note the example code in the playAtTime: discussion:
// Before calling this method, instantiate two AVAudioPlayer objects and
// assign each of them a sound.
- (void) startSynchronizedPlayback {
    NSTimeInterval shortStartDelay = 0.01; // seconds
    NSTimeInterval now = player.deviceCurrentTime;
    [player playAtTime: now + shortStartDelay];
    [secondPlayer playAtTime: now + shortStartDelay];
    // Here, update state and user interface for each player, as appropriate
}
If you are able to decode the files to disk, then Audio Units are probably the solution that would provide the best latency. If you decide to use such an architecture, you should also check out Novocaine:
https://github.com/alexbw/novocaine
That framework takes a lot of the headache out of dealing with audio units.

Receive remote control events without audio

Here is some background information; otherwise, skip ahead to the question in bold. I am building an app and I would like it to have access to the remote control/lock screen events. The tricky part is that this app does not play audio itself; it controls the audio of another device nearby. The communication between devices is not a problem when the app is in the foreground. As I just found out, an app does not assume control of the remote controls until it has played audio with a playback audio session and was the last app to do so. This presents a problem because, as I said, the app controls ANOTHER device's audio and has no need to play its own.
My first inclination is to have the app play a silent clip every time it is opened in order to assume control of the remote controls. The fact that I have to do this makes me wonder if I am even going to be allowed to do it by Apple or if there is another way to achieve this without fooling the system with fake audio clips.
QUESTION(S): Will Apple approve an app that plays a silent audio clip in order to assume control of the remote/lock screen controls for the purpose of controlling another device's audio? Is there any way of assuming control of the remote controls without an audio session?
P.S. I would prefer to have this functionality on iOS 4.0 and up.
P.P.S I have seen this similar question and it has gotten me brainstorming but the answer provided is not specific to what I need to know.
NOTE: As of iOS 7.1, you should be using MPRemoteCommandCenter instead of the answer below.
You attach targets or handlers to the various system-provided MPRemoteCommand objects exposed as properties of [MPRemoteCommandCenter sharedCommandCenter].
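A minimal sketch of that (the handler selector names here are made up for illustration):
MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];
// Each command is a system-provided MPRemoteCommand; attach a target to it.
[center.playCommand addTarget:self action:@selector(handlePlayCommand:)];
[center.pauseCommand addTarget:self action:@selector(handlePauseCommand:)];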
I'm keeping the rest of this around for historical reference, but the following is not guaranteed to work on recent iOS versions. In fact, it just might not.
You definitely do need an audio player, but not necessarily an explicit session, to take control of the remote control events. (AVAudioSession is implicit to any app that plays audio.) I spent a decent amount of time playing with this to confirm it.
I've seen a lot of confusion on the internet about where to set up the remoteControlReceivedWithEvent: method and various approaches to the responder chain. I know the approach below works on iOS 6 and iOS 7; others have not. Don't waste your time handling remote control events in the app delegate (where they used to work) or in a view controller, which may go away during the lifecycle of your app.
I made a demo project to show how to do this.
Here's a quick rundown of what has to happen:
You need to create a subclass of UIApplication. When the documentation says UIResponder, it means UIApplication, since your application class is a subclass of UIResponder. In this subclass, you're going to implement the remoteControlReceivedWithEvent: and canBecomeFirstResponder methods. You want to return YES from canBecomeFirstResponder. In the remote control method, you'll probably want to notify your audio player that something's changed.
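A bare-bones version of that subclass might look like the following (RCApplication matches the class name used in the next step; the notification posted here is just one example of how you might tell your player something changed):
@interface RCApplication : UIApplication
@end

@implementation RCApplication

- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeRemoteControl)
    {
        // Forward the event to whoever manages playback; the notification
        // name is a made-up example.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"RCRemoteControlEventReceived"
                                                            object:event];
    }
}

@end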
You need to tell iOS to use your custom class to run the app, instead of the default UIApplication. To do so, open main.m and change this:
return UIApplicationMain(argc, argv, nil, NSStringFromClass([RCAppDelegate class]));
to look like this:
return UIApplicationMain(argc, argv, NSStringFromClass([RCApplication class]), NSStringFromClass([RCAppDelegate class]));
In my case RCApplication is the name of my custom class. Use the name of your subclass instead. Don't forget to #import the appropriate header.
OPTIONAL: You should configure an audio session. It's not required, but if you don't, audio won't play if the phone is muted. I do this in the demo app's delegate, but do so where appropriate.
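A typical minimal setup looks like this (using the plain playback category; pick whichever category fits your app):
NSError *error = nil;
// The playback category keeps audio going even when the ringer switch is muted.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];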
Play something. Until you do, the remote controls will ignore your app. I just took an AVPlayer and gave it the URL of a streaming site that I expect to be up. If you find that it fails, put your own URL in there and play with it to your heart's content.
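Something as simple as this will do (the URL is a placeholder, and self.player is assumed to be a strong property; without a strong reference the player will be deallocated and the audio will stop):
// The stream URL below is a placeholder; substitute one you know is up.
self.player = [AVPlayer playerWithURL:[NSURL URLWithString:@"http://example.com/stream"]];
[self.player play];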
This example has a little bit more code in there to log out remote events, but it's not all that complicated. I just define and pass around some string constants.
I bet that a silent looping MP3 file would help work towards your goal.
Moshe's solution worked great for me! However, one issue I noticed is that when you paused the audio, the media controls would go away and you wouldn't be able to resume playback without going back into the app. If you set the media info on the lock screen when you play the audio, this won't happen:
NSDictionary *mediaInfo = @{MPMediaItemPropertyTitle: @"My Title",
                            MPMediaItemPropertyAlbumTitle: @"My Album Name",
                            MPMediaItemPropertyPlaybackDuration: [NSNumber numberWithFloat:0.30f]};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];

How do I exclude iPad 2 and iPod Touch 5th Generation in a function?

I'm trying to create a function that works on ALL iOS devices except the iPad 2 and the iPod Touch 5th Gen.
- (void)doSomething {
    // if iPad 2 or iPod 5th Gen
    if ()
    {
        NSLog(@"You're using an iPad 2 or iPod 5th Gen. Sorry!");
    }
    else
    {
        NSLog(@"Any other iOS device. Congrats!");
    }
}
Can someone post a quick sample snippet of how I would accomplish this?
If your app has a major hardware requirement (i.e., it can't really function/do anything useful if the hardware isn't present on the device), you should add an entry to the UIRequiredDeviceCapabilities entry in your Info.plist for your app. This will keep people who don't have the necessary hardware to use your app from purchasing it/downloading it by accident. It will also cause the App store to show a list of all the models that support your software, so people can see what they need in order to use it.
If your app has a function that requires something specific, there are generally in-framework tests you can do to see if the device has the required features/hardware. If this isn't your app's central purpose, you can then enable/disable this feature of your app based on the device's capabilities. You wouldn't want to try and query which device the user is running (except in maybe very limited circumstances), but would rather query whether the device is capable of doing what you want.
Since you mentioned an auto-focusing camera, we'll use that as an example. If your app requires this to do anything useful, you should add the UIRequiredDeviceCapabilities key to your Info.plist file and add the entry auto-focus-camera to the array. This will ensure that only users who have a device with an auto-focusing camera will be able to purchase and install your app. For more information on UIKit keys for Info.plist, including this one, see the Information Property List Key Reference.
If, on the other hand, your app is usable by any device but has a feature that requires an auto-focusing camera, you can test for its presence using the AVFoundation framework. You can get what you need from the AVCaptureDevice class. For example, to check whether you have access to an auto-focusing camera for video/stills:
// Check for the default camera
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera && [camera isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
    // this device has a default video source capable of autofocus, so enable the feature
} else {
    // this device does not have the required hardware, so disable the feature
}
Check this:
if ([[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
    // device with autofocus
} else {
    // device without autofocus
}

iOS settings toggle switch works the first time but not again

I am using the iPad Settings app to change some button sounds and a background image. It all works well, and the settings are maintained from one app launch to another in the simulator. Now I have implemented a toggle switch to turn sets of sounds on or off. When the app launches, whatever state the switch is in works; e.g. if the "Alert Sounds" switch is OFF, the alert sounds are silent, and if I change it to ON, the sounds start working. However, if I then turn the switch back OFF, the sounds keep playing. Likewise, if the state is ON when the app launches, the sounds work but will not be silenced when the switch is set to OFF.
Note that this is different from the settings not taking effect until a second round of settings. That was a previous problem I solved (thanks to Stack Overflow) by using:
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    [[NSUserDefaults standardUserDefaults] synchronize];
}
I have methods named:
- (void)defaultsChanged:(NSNotification *)notification
(which is called when the notification is sent)
and
-(void)setValuesFromPreferences
(which is called in ViewDidLoad)
The logic looks like this in both:
// Set alert sounds from preferences
NSString *alertSoundPreference = [userDefaults stringForKey:kAlertSound];
BOOL alertSoundEnabled = [userDefaults boolForKey:kAlertSoundEnabled];
if (alertSoundEnabled)
{
    // Create the URLs for the alert audio files
    // Store the alert sound URLs as CFURLRef instances
    // Create system sound objects representing the alert sound files
}
I do not have an else, because I assume that no sound resources will be specified if alertSoundEnabled is NO.
I have searched for explanations and tutorials that mention this problem but have not found any yet, so I'm asking here. Thanks for any suggestions.
viewDidLoad is not necessarily called when the app becomes active again (nor are viewWillAppear/viewDidAppear, IIRC), since the whole point of iOS 4+ multitasking is to prevent that kind of unloading and recreation of objects on app switching.
If I had to guess, the sounds are already allocated because the user had the switch ON at the original launch/viewDidLoad; if your code does nothing to explicitly dispose of them when it reads the preference again, they will keep playing, as they are already set up.
As such, I'd try adding an else clause that (when alertSoundEnabled == NO) destroys your system sound objects.
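Something along these lines (a sketch; alertSoundID stands in for whatever SystemSoundID ivar you create when the sounds are enabled):
else
{
    // Dispose of the previously created system sound object so it
    // can no longer be played.
    if (alertSoundID != 0)
    {
        AudioServicesDisposeSystemSoundID(alertSoundID);
        alertSoundID = 0;
    }
}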