I have encountered a weird problem with one of my OS X apps. The app uses AXAPI to create event taps and to monitor keyboard and mouse events. Some users on OS X 10.6-10.7 report that when the app is active, their mouse doesn't function correctly: they have to click twice or more, otherwise the system doesn't register the click at all.
When these users switched off AXAPI, the problem disappeared.
The weird thing is that I have never encountered this problem on development computers, nor did the testers. Yet around 10% of reporting users have experienced it.
I use an active event tap at the HID level and I also handle mouse events, but I never return NULL from the callback.
The problem is almost certainly in AXAPI, but I can't understand where exactly it is.
Eventually I switched to filtering mouse events with a separate passive event tap. The problem disappeared after the app update, but there was also a minor OS X update around that time, so I can't actually tell which one fixed the bug.
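For anyone curious, here is a minimal sketch of that kind of passive (listen-only) mouse tap; the function names are just illustrative, and unlike an active tap the callback can only observe events, so it can't delay or swallow clicks:

#include <ApplicationServices/ApplicationServices.h>

// Listen-only callback: the return value is ignored for passive taps.
static CGEventRef MouseTapCallback(CGEventTapProxy proxy, CGEventType type,
                                   CGEventRef event, void *refcon) {
    // Inspect or log the mouse event here.
    return event;
}

static void InstallPassiveMouseTap(void) {
    CGEventMask mask = CGEventMaskBit(kCGEventLeftMouseDown) |
                       CGEventMaskBit(kCGEventLeftMouseUp)   |
                       CGEventMaskBit(kCGEventMouseMoved);
    CFMachPortRef tap = CGEventTapCreate(kCGHIDEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionListenOnly, // passive
                                         mask, MouseTapCallback, NULL);
    if (tap) {
        CFRunLoopSourceRef source =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRelease(source);
    }
}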
Josh, thank you for your idea anyway.
I've been using Wineskin for quite a while, and, as of late, I've been attempting to use Winemac.drv (or Mac Driver) from CodeWeavers more than X11. The main difference is that Winemac is fully programmed in Objective-C (including its Window system), while the X11 approach uses X11 dylibs and .nib files for windows.
Since Winemac is still in development, however, it lacks some OS X functionality*. The feature I have in mind is bouncing on the Dock. What's the problem with the bouncing? Well, it simply doesn't bounce, specifically at launch. The code somehow overrides the user's option for "Animating apps on launch", or does something that completely bypasses the usual app-launching animation.
I still have very little experience in Objective-C, so I might have missed some key documentation in Apple's Mac Documentation Library, but my question is:
Can apps usually override this option, or might this be the case just for Wine? If they can, how?
EDIT: I stated incorrectly that the Mac Driver lacked this functionality; the reality, as mentioned by Ken Thomases, is that Wine processes start in the background, so no icon is shown on the Dock at launch, and that means no animation.
I'm the developer of the Mac driver for Wine.
The issue is that all Wine processes start life as background processes with no presence on the Dock. Many Wine processes remain that way because they never present any windows.
When a Wine process does present a window for the first time, it transforms itself from a background process to a foreground process. At this time, it gains a presence on the Dock and in the Command-Tab application switcher and gets a main menu bar. It just so happens that the Dock does not bounce the icon of an app which transforms from a background process to a foreground process. Basically, the Dock is getting involved well after the process was launched and bouncing is for a process which is launching.
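The answer doesn't spell out the API, but the public call for this kind of background-to-foreground promotion is TransformProcessType; a rough sketch of the transformation being described (not necessarily what the Mac driver does internally):

#include <ApplicationServices/ApplicationServices.h>

// Promote the current process from a background process (no Dock icon) to a
// regular foreground app; afterwards it shows up in the Dock and Cmd-Tab switcher.
static void BecomeForegroundApp(void) {
    ProcessSerialNumber psn = { 0, kCurrentProcess };
    TransformProcessType(&psn, kProcessTransformToForegroundApplication);
    SetFrontProcess(&psn);   // optionally also bring it to the front
}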
An application can make its Dock icon bounce by calling -[NSApplication requestUserAttention:]. However, this does nothing if the application is already active. Also, the bounce animation has a different quality. It's sharper and more urgent, rather than a relaxed bounce.
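A minimal sketch of that call in a Cocoa app (NSCriticalRequest keeps bouncing until the app becomes active, NSInformationalRequest bounces once):

#import <Cocoa/Cocoa.h>

// Ask the Dock to bounce our icon; ignored if we are already the active app.
static NSInteger BounceDockIcon(void) {
    return [NSApp requestUserAttention:NSCriticalRequest];
}

// The returned identifier can later be passed to
// -[NSApplication cancelUserAttentionRequest:] if the request becomes moot.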
Basically, there's no way to achieve what you want for the general case. It may be possible to construct a script-based app bundle that configures the environment and then execs Wine. Since the app is bundled and describes itself in its Info.plist as a normal foreground app, it will get a Dock icon immediately and that icon should bounce. I'm not entirely sure how things will behave from there, in terms of the execed Wine taking over the Dock icon. Even if it works for the initial process, any Wine processes which are launched by the initial process will revert to behaving in the manner you're familiar with. (For example, many games have a patcher/launcher which launches a secondary process for the game itself. You might get the patcher/launcher icon to bounce, but that wouldn't help for the game process's icon.)
Before iOS 7 came out, we noticed an issue:
Music remote control from the earbuds or SpringBoard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app become the first responder to remote-control events, do our stuff, and then pass the events on to the Music app. However, we found that the events get swallowed by the first responder and there is no way to pass them back up the responder chain.
We tried to become first responder and block remote-control events altogether while we are in solo-ambient. This worked fine in iOS 6, and still works with earbud controls in iOS 7, but fails with iOS 7's Control Center. Control Center seems to bypass the remote-control event handler remoteControlReceivedWithEvent: completely, which is where we put our blocking code.
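For context, this is roughly the first-responder setup being described (the iOS 6-era pattern), with the handler where our blocking code lived:

#import <UIKit/UIKit.h>

// In the view controller (or other UIResponder) that should receive the events:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // In iOS 6, simply not acting here (and not forwarding the event) was
        // enough to "block" the controls; iOS 7's Control Center bypasses this
        // handler entirely, as described above.
    }
}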
I read something elsewhere that:
You can't block the music app. Your app can become one though (Apple won't like that) and then the control center would control yours.
But I found no documentation whatsoever about Control Center.
And as said above, Control Center does not go through the normal remote-control hooks even if an app is the first responder.
Another quote:
Remote Control Event handling is so your app can be controlled by Control Center, the earbuds, etc... it is not so that your app can eat said controls, preventing control of other apps from said sources. It only worked in iOS 6 because of a bug in iOS, now fixed in iOS 7.
Is it that what we were using was due to this bug? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted solution.
Now we really wonder if we are missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with remote control and Control Center?
Where can we find up-to-date documentation on remote control and Control Center?
The remote control has mysteriously been fixed after clean-building everything against the iOS 7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events are UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS 6's UIEventSubtypeRemoteControlTogglePlayPause.
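For anyone hitting the same difference, handling both the new discrete subtypes and the old toggle looks roughly like this (a sketch; the playback calls are left as comments):

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:            // iOS 7 Control Center
            // resume playback
            break;
        case UIEventSubtypeRemoteControlPause:           // iOS 7 Control Center
            // pause playback
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause: // iOS 6-style toggle (earbuds)
            // toggle playback
            break;
        default:
            break;
    }
}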
I am building a small app that I can use for interval training. It schedules a series of UILocalNotifications, all scheduled at the same time and all firing within a few minutes of each other. The idea is that you put in your headphones and start a workout; when you hear one kind of sound you rest, and when you hear another kind of sound you work out. I do this with local notifications and it works just fine. The reason for doing it this way, rather than just having the app run with a timer, is that I would like the Nike +iPod app to run in the foreground at the same time.
The notifications are just an alert and an OK button:
[notif setHasAction:NO];
So the idea is: pop in the headphones, start my app (it schedules a series of notifications), then start the Nike +iPod workout. When you hear the notification sound, change from rest to workout or vice versa.
OK, when the workout is over there are 15+ notifications on the screen and they need to be dismissed manually, which is a bit annoying and not at all user friendly.
My question is: is there a way to post sound-only notifications? Or to make sure earlier notifications are removed as new ones pop up? Or is there a different/better way of getting a "sound indicator" while the app is in the background?
Hope someone can lend a bit of experience or a good idea for an alternative:)
Thanks in advance.
I can't confirm how well this would actually work, but if you set the alertBody property on your UILocalNotification objects to nil (the default value) when you create them, it should prevent an alert from appearing on screen when they fire.
In addition, you might also want to set the hasAction property to NO, which prevents the user from seeing the action buttons (if you had an alert), or the slider (if they had the device locked).
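Putting the two properties together, a sketch of a sound-only notification (the fire date and sound are just examples):

UILocalNotification *notif = [[UILocalNotification alloc] init];
notif.fireDate  = [NSDate dateWithTimeIntervalSinceNow:60]; // e.g. the next interval change
notif.soundName = UILocalNotificationDefaultSoundName;      // or a sound file in the bundle
notif.alertBody = nil;   // default; no alert text should mean no on-screen alert
notif.hasAction = NO;    // no action button / unlock slider
[[UIApplication sharedApplication] scheduleLocalNotification:notif];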
But as for the actual stacked notification alerts - I don't believe there's any way to prevent them getting stacked. That's something which might be worth raising a Radar for, so Apple could consider allowing that to happen in future iOS versions.
I'm wondering if there's any way to have my application be notified when a drag-and-drop operation starts anywhere on the screen, even if I don't have an active window there.
I've looked into the normal drag-and-drop APIs, but I haven't spotted anything that does this. The NSDraggingDestination protocol along with the -[NSWindow/NSView registerForDraggedTypes:] method allows you to notice when someone is dragging something and that crosses over into your window, but I'd like to notice it when any dragging operation is started anywhere on the screen.
Any tips on how to go about this? Is there a standard Cocoa API for it, or is there a private API / some kind of dirty hack to get this information?
Thanks in advance :)
Take a look at NSEvent’s +addGlobalMonitorForEventsMatchingMask:handler:. I’m not sure if you can track mouse dragging but it’s certainly possible to track mouse button up/down events.
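A rough sketch of what that looks like for mouse events (note that global monitors only observe events delivered to other applications; they can't modify or consume them):

#import <Cocoa/Cocoa.h>

// Watch left-mouse down/drag/up events that happen outside our own windows.
static id InstallGlobalMouseMonitor(void) {
    return [NSEvent addGlobalMonitorForEventsMatchingMask:
                        (NSLeftMouseDownMask | NSLeftMouseDraggedMask | NSLeftMouseUpMask)
                                                  handler:^(NSEvent *event) {
        if ([event type] == NSLeftMouseDragged) {
            NSLog(@"Dragging at %@", NSStringFromPoint([NSEvent mouseLocation]));
        }
    }];
}

// Keep the returned object around and pass it to +[NSEvent removeMonitor:]
// when you no longer need the monitor.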
I haven't done it, but I am assuming you need some kind of external software monitoring ALL mouse activity on the system and reporting it to your app (or your application doing this itself), as dragging events are usually reported to your app only when there is activity inside your app's window.
I'm creating an application that emulates the MacBook's multi-touch trackpad. As you may know, on a MacBook's trackpad:
if you swipe 4 fingers up, it triggers Show Desktop.
if you swipe 4 fingers down, it shows Exposé.
However, if Show Desktop is active and you swipe 4 fingers down, it goes back to normal mode. The same goes for Exposé: if Exposé is active and you swipe 4 fingers up, it also goes back to normal mode.
Here is the problem: I use the keyboard shortcut F3 to show Exposé and F11 for Show Desktop. When Show Desktop is active, if I press F3, it goes straight to Exposé. And when Exposé is active, if I press F11, it goes straight to Show Desktop. But I want it to behave like the trackpad, whose code I imagine looks something like this:
- (void)fourFingersDidSwipeUp {
    if (isExposeBeingActivated() || isShowDesktopBeingActivated()) {
        pressKey("Esc");
    } else {
        pressKey("F11");
    }
}
But I don't know how to implement the "isExposeBeingActivated()" and "isShowDesktopBeingActivated()" methods. I've tried creating a window and checking whether its size has changed (on the assumption that if Exposé is active, its size should be smaller), but the system always returns the same size. I tried monitoring the background processes during Exposé, but nothing happened. Does anyone have any suggestions on this?
(I'm sorry if my English sounds weird.)
As far as I know, there's no public interface to any Exposé-related functionality beyond the ability to specify the "collection behavior" of your own application's windows.
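For completeness, that knob looks like this (assuming window is your NSWindow); it only controls how your own window participates in Exposé and Spaces, and it won't tell you whether Exposé is currently active:

// Example: keep a window on every Space and out of the window cycling order.
[window setCollectionBehavior:(NSWindowCollectionBehaviorCanJoinAllSpaces |
                               NSWindowCollectionBehaviorIgnoresCycle)];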
I came here after reading your email. I understand the problem. After a bit of googling I found out what you already know: there's really no official API or documentation for Exposé. A very ugly solution I've thought of would be to start a timer, when Exposé is triggered, equal to the total time it takes to show all windows fully (guessing this is constant). If a swipe up happens while that timer is running, it would mean Exposé is still active (isExposeBeingActivated()), so you would trigger a cancel instead of Show Desktop. This wouldn't cover the "slow motion" Exposé (via the SHIFT key). Maybe you can detect whether it's a normal or "slow motion" Exposé call?
Really sorry if this doesn't make sense at all within your application's scope; I guess I'm just saying the first solution I thought of.
Cheers.
Pedro.