I have not done this since my NeXTStep days, but I seem to remember that you could create objects that could be made part of the event loop and could generate events co-equal with mouse and keyboard events. But that was 20 years ago, and I may be conflating it with something else.
In my particular case, I need a listener that runs a select() to see whether any of several UDP ports has received a datagram. This needs to happen without disturbing the mouse and keyboard events, although it would be nice to give the UDP check a higher priority.
Basically, I have streams of numbers from one or more other systems that are to be displayed in a GUI, and I want the user to still be able to use buttons and such.
According to the Cocoa Events Guide, you can raise an event with type NSApplicationDefined:
[NSEvent otherEventWithType:NSApplicationDefined location:modifierFlags:timestamp:windowNumber:context:subtype:data1:data2:]
As for how to raise them, I'm not exactly sure what you need, but this post on the Apple developer lists shows how to register a UDP listener. It raises a notification via NSNotificationCenter, but you could post an event instead.
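If it helps, here is a minimal sketch of the select() side, in plain POSIX C++ that you could run on a background thread or drive from a timer. The socket setup is elided, and forward_to_main_thread is a hypothetical stand-in for wherever you post the NSApplicationDefined event:

    // Poll a set of already-bound UDP sockets without blocking.
    #include <sys/select.h>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <vector>

    void poll_udp_sockets(const std::vector<int>& socks,
                          void (*forward_to_main_thread)(int fd)) {
        fd_set readfds;
        FD_ZERO(&readfds);
        int maxfd = -1;
        for (int fd : socks) {
            FD_SET(fd, &readfds);
            if (fd > maxfd) maxfd = fd;
        }
        struct timeval tv = {0, 0};  // zero timeout: never block the caller
        if (select(maxfd + 1, &readfds, nullptr, nullptr, &tv) <= 0)
            return;  // nothing pending (or an error)
        for (int fd : socks) {
            if (FD_ISSET(fd, &readfds)) {
                char buf[2048];
                ssize_t n = recv(fd, buf, sizeof(buf), 0);
                if (n > 0)
                    forward_to_main_thread(fd);  // post your event here
            }
        }
    }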
I've been trying for hours to use global hotkeys and "consume" the key event so that it is not forwarded on to the application that would normally receive it.
So what I want to do is:
- a user presses a shortcut with application A in front, e.g. Cmd+F3
- my application (application B) receives this shortcut through the global event handler and sends mouse and keyboard events to application A
It's probably easiest to think of it as a macro.
I'm using DDHotkey, and it works quite well. The problem is that DDHotkey doesn't "consume" the key events and modifiers: when my application starts sending mouse and keyboard events, the Cmd key from the original global shortcut is still pressed.
This leads to erroneous behavior in my case (for example, I double-click a text field programmatically, and that doesn't open while Cmd is pressed).
So what I'd like to do is truly consume the key event and the modifier keys so that they are not forwarded to application A. Alternatively, I could "flush" the event queue before sending the key events to application A.
Is there any way to achieve this easily?
An even more reliable approach, if it works for your use case, would be not to script the UI by synthesizing mouse events and key presses at all, and instead use the Accessibility API to trigger the higher-level actions (like using Accessibility to tell a button it has been pressed). Unless the app contains some unfortunate code, that path should not look at the modifier keys.
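As a rough illustration, pressing a button through the Accessibility C API (callable from C++) could look like the sketch below. Finding the right AXUIElement for the button is elided, and your app needs the Accessibility permission:

    #include <ApplicationServices/ApplicationServices.h>

    // Get the Accessibility root element for application A by pid.
    AXUIElementRef app_element_for_pid(pid_t targetPid) {
        return AXUIElementCreateApplication(targetPid);
    }

    // Ask the element to behave as if clicked. No synthetic mouse events
    // are sent, so the current modifier-key state is never consulted.
    bool press_element(AXUIElementRef button) {
        return AXUIElementPerformAction(button, kAXPressAction) == kAXErrorSuccess;
    }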
Telling the OS from an event monitor to remove key states would probably cause lots of issues: it would be confusing if the user then actually released the physical keys and a second keyUp came in. Even if the OS tried to avoid that, it is asking for other edge-case bugs. What if the user presses a modifier key while your code is scripting the UI?
But if the applications you are scripting support neither Accessibility nor AppleScript nor any other higher-level approach to automation, you could wait for the user to release your hotkey (i.e., wait for the keyUp events) and only then trigger your scripted actions, as sketched below. It might be necessary to use performSelector:withObject:afterDelay: with a delay of 0.0 to get out of the keyUp handler before you do that.
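A hedged sketch of that waiting step, using a listen-only CGEventTap (a C API, usable from C++) to watch for the modifiers being released; run_scripted_actions is a hypothetical placeholder:

    #include <ApplicationServices/ApplicationServices.h>

    static CGEventRef tap_callback(CGEventTapProxy, CGEventType type,
                                   CGEventRef event, void*) {
        if (type == kCGEventFlagsChanged) {
            CGEventFlags flags = CGEventGetFlags(event);
            // All modifiers up: now it is safe to send synthetic events.
            if (!(flags & (kCGEventFlagMaskCommand | kCGEventFlagMaskAlternate |
                           kCGEventFlagMaskControl | kCGEventFlagMaskShift))) {
                // run_scripted_actions();  // hypothetical
            }
        }
        return event;  // listen-only: events pass through untouched
    }

    void install_release_watcher() {
        CFMachPortRef tap = CGEventTapCreate(
            kCGSessionEventTap, kCGHeadInsertEventTap, kCGEventTapOptionListenOnly,
            CGEventMaskBit(kCGEventFlagsChanged), tap_callback, nullptr);
        if (!tap) return;  // requires the Accessibility permission
        CFRunLoopSourceRef src =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
    }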
Is there any API in Windows 8 that tells you whether a keyboard is connected to your device? I believe the OS should have this information, but I am not sure it is exposed.
I checked Windows.Devices.Input.KeyboardCapabilities. It only returns an object with a KeyboardPresent property, which equals 1 on both my touch and non-touch devices.
The problem has already been reported on Stack Overflow without a solution (another answer proposes a workaround, but it doesn't seem to work either).
A workaround could be to add a TextBox positioned off-screen, set focus to it, and register for InputPane.GetForCurrentView().Showing to see whether it fires. If it fires, you can deduce that no keyboard is connected (and you might be able to reset focus to the page inside the handler so the touch keyboard doesn't actually pop up); if it doesn't fire, a keyboard is connected. That's not a great solution, but it might be the best available.
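For illustration, subscribing to the Showing event might look roughly like this (shown in C++/WinRT here; the same WinRT types are what you would use from C# or JavaScript, and the off-screen TextBox focusing is elided):

    #include <winrt/Windows.UI.ViewManagement.h>
    #include <atomic>

    using namespace winrt::Windows::UI::ViewManagement;

    std::atomic<bool> g_touchKeyboardShown{false};

    void watch_input_pane() {
        // Fires only when the on-screen touch keyboard is about to appear.
        InputPane::GetForCurrentView().Showing(
            [](InputPane const&, InputPaneVisibilityEventArgs const&) {
                // Reaching here after focusing the off-screen TextBox
                // suggests no hardware keyboard is attached.
                g_touchKeyboardShown = true;
            });
    }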
I'm writing a plugin that needs to be notified as changes happen to open files in an Editor. This needs to happen in real time (similar to how syntax checking is done currently).
ResourceChangeEvents works when all I need is a notification when a file is saved.
IPropertyChangeListener will tell me when an editor is marked as dirty.
This question is similar, but is more geared to getting events on a single editor instance and won't scale well for all editors.
What about keypress notifications in an editor? I'm a little surprised they don't cause PropertyChange events. How can I get such notifications for all editors?
You always have to deal with one text editor at a time: create an org.eclipse.ui.IPartListener, start listening to the editor on partActivated, and stop listening on partDeactivated.
Also, I think you probably want to use org.eclipse.jface.text.IDocumentListener to listen for changes to the document in the ITextEditor (instead of targeting the low-level widget itself).
I have an issue with SetWindowsHookEx in VB.NET. If my PC is overloaded, especially from 3D rendering, Windows automatically disconnects my keyboard hook and my hotkeys stop working. I searched around, and it seems there is no way to detect whether a hook is active or disconnected. So I tried the method presented by "moodforaday" in:
Is it possible to detect when a low-level keyboard hook has been automatically disconnected by Windows?
He suggests calling GetLastInputInfo periodically, storing the tick count in a separate variable whenever a key is used, and comparing the two: if the system tick is much newer than your stored one, the hook has likely been disconnected. It's a great method, but the tick count also advances from other input, such as the mouse. My hook class has no mouse hook, so I cannot record a tick count when the mouse moves. So for now I have it create a new instance of the hook class and hook again: every second it checks whether the stored tick is more than 10,000 ticks older than the current one.
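In Win32 terms, that check looks roughly like this (C++ sketch; g_lastHookTick is a hypothetical variable the hook procedure updates with GetTickCount() on every key event it sees):

    #include <windows.h>

    extern volatile DWORD g_lastHookTick;  // written inside the hook proc

    bool hook_seems_disconnected(DWORD toleranceMs = 10000) {
        LASTINPUTINFO lii = { sizeof(LASTINPUTINFO) };
        if (!GetLastInputInfo(&lii)) return false;
        // If the system saw input much more recently than the hook did,
        // the hook has probably been silently removed.
        return (lii.dwTime - g_lastHookTick) > toleranceMs;
    }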
Is it alright to keep creating new instances of hooks? It will keep hooking/unhooking constantly, and I'm wondering whether that is going to be a problem for Windows.
Also, if anyone has another method of detecting whether a hook is disconnected, please let me know; it would fix this whole hassle.
Do your 3D rendering in a background thread. Use Control.Invoke only for code where you directly access UI controls.
Alternatively, you could split the rendering into very small pieces and post them to yourself as messages, to be handled on the main thread, as sketched below. This way you will be able to handle both internal and external messages.
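At the Win32 level, that self-posting pattern looks roughly like this (hypothetical WM_APP_RENDER_CHUNK message and render_chunk function):

    #include <windows.h>

    const UINT WM_APP_RENDER_CHUNK = WM_APP + 1;

    void queue_next_chunk(HWND hwnd, int chunkIndex) {
        PostMessage(hwnd, WM_APP_RENDER_CHUNK, (WPARAM)chunkIndex, 0);
    }

    // In the window procedure:
    //   case WM_APP_RENDER_CHUNK:
    //       render_chunk((int)wParam);                    // one small slice
    //       if (more_chunks_left())
    //           queue_next_chunk(hwnd, (int)wParam + 1);  // re-post the rest
    //       return 0;
    //
    // Between chunks the message pump keeps draining keyboard and mouse
    // messages, so Windows never marks the application as not responding.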
In both cases, your application will respond in a timely fashion, Windows will have no reason to consider it non-responsive, and your keyboard shortcuts will stay in place.
I am implementing a text service on Windows. Things work fine; however, when I shift window focus to another application and then shift it back to the original application, the selected text service gets deactivated (I notice a call to ITfTextInputProcessor::Deactivate). I think this call is unexpected, and after it the service has to be re-activated manually. I am surely doing something goofy; I just don't know what it is.
Offhand, I'd say that you are indeed doing something goofy. :) In particular, I'd pay careful attention to your ITfThreadMgrEventSink::OnSetFocus implementation (and, obviously, you need to implement ITfThreadMgrEventSink in your text service and connect it via AdviseSink if you haven't already).
After more research, I've figured out what’s happening:
- When you set focus back to Word, TSF gets the current thread's active keyboard layout (actually a locale ID).
- It then compares that keyboard layout with the language ID of the currently active text service.
- If they're different, TSF activates the text service for the active keyboard layout and deactivates any previously active text service.
I believe this behavior is different on Vista/Windows 7.
The fix would be to use LoadKeyboardLayout/ActivateKeyboardLayout to set the process keyboard layout in your ITfTextInputProcessor::Activate implementation, as sketched below. Apparently some apps also need you to call ITfInputProcessorProfiles::ChangeCurrentLanguage() as well.
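A minimal sketch of that Activate-side fix (C++; it assumes the KLID string is just the zero-padded language ID, which holds for standard layouts such as 0x0409 but not for every layout):

    #include <windows.h>

    // Call from your ITfTextInputProcessor::Activate implementation.
    HRESULT ActivateLayoutForLangId(LANGID langid) {
        WCHAR klid[KL_NAMELENGTH];
        swprintf_s(klid, KL_NAMELENGTH, L"%08x", langid);  // e.g. "00000409"
        // Loads the layout if needed and activates it for this thread.
        HKL hkl = LoadKeyboardLayoutW(klid, KLF_ACTIVATE);
        return hkl ? S_OK : E_FAIL;
    }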