Native caret position macOS Cocoa - Objective-C

I want to be able to get the global caret position inside any application on macOS High Sierra using Cocoa or AppleScript. I already use NSEvent to hook the keyboard and mouse, but is there a way to hook the caret position?
The caret is different from the mouse position: it moves on key events and mouse clicks. On Windows, you can get the caret position almost anywhere; I want to know whether there is an equivalent for macOS.
I want to show a popup over the text caret that moves with the text as I type or insert a line return. I tried getting the position of the key event (locationInWindow), but it gives me back the mouse position. My app is not sandboxed, so I can even call AppleScript.
UPDATE: It is possible to do this by getting the bounds of the character just before the caret using the Accessibility API (see the sketch below).
Thanks.
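In case it helps anyone else, here is a rough Swift sketch of that Accessibility approach (my own untested code; the same AX calls exist in Objective-C, and not every application exposes these attributes):

import Cocoa
import ApplicationServices

/// Rough sketch: screen bounds of the character just before the caret in the
/// currently focused UI element, via the Accessibility API.
/// The process must be trusted (AXIsProcessTrusted()), and some apps do not
/// expose these attributes at all.
func caretScreenRect() -> CGRect? {
    guard AXIsProcessTrusted() else { return nil }

    let systemWide = AXUIElementCreateSystemWide()

    // Focused UI element (text field, text view, ...) across all applications.
    var focusedRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(systemWide,
                                        kAXFocusedUIElementAttribute as CFString,
                                        &focusedRef) == .success,
          let focused = focusedRef else { return nil }
    let element = focused as! AXUIElement

    // Current selection; with an empty selection, location is the caret index.
    var selectionRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element,
                                        kAXSelectedTextRangeAttribute as CFString,
                                        &selectionRef) == .success,
          let selection = selectionRef else { return nil }
    var caretRange = CFRange()
    guard AXValueGetValue(selection as! AXValue, .cfRange, &caretRange) else { return nil }

    // Ask for the bounds of the character just before the caret (falls back to index 0).
    var queryRange = CFRange(location: max(caretRange.location - 1, 0), length: 1)
    guard let queryValue = AXValueCreate(.cfRange, &queryRange) else { return nil }

    var boundsRef: CFTypeRef?
    guard AXUIElementCopyParameterizedAttributeValue(element,
                                                     kAXBoundsForRangeParameterizedAttribute as CFString,
                                                     queryValue,
                                                     &boundsRef) == .success,
          let bounds = boundsRef else { return nil }
    var rect = CGRect.zero
    guard AXValueGetValue(bounds as! AXValue, .cgRect, &rect) else { return nil }
    return rect // screen coordinates, ready to position the popup over the caret
}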

I don't have the opportunity to try it for myself just yet, so you might beat me to the punch in confirming or rejecting this.
NSEvent has addGlobalMonitorForEventsMatchingMask:handler:, where the mask can include NSEventMaskCursorUpdate, and presumably the returned NSEvent object will contain a coordinate that can be acted upon (i.e. converted to screen space).
The caveat here is that the docs explicitly say:
Key-related events may only be monitored if accessibility is enabled
or if your application is trusted for accessibility access (see
AXIsProcessTrusted).
Your post seems to suggest that you do not wish to use the Accessibility API ("but if not using accessibility API"), so that may mean you're out of luck with the specific combination of requirements you seek to fulfill.
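For reference, here is roughly what that global monitor looks like in Swift (my own sketch, not from the docs); note that, as you observed, the reported location is the pointer position rather than the caret, which is the crux of the question:

import Cocoa

// Global monitor for cursor-update and key-down events. Key-related events
// are only delivered if the app is trusted for accessibility, per the quoted caveat.
let monitor = NSEvent.addGlobalMonitorForEvents(matching: [.cursorUpdate, .keyDown]) { event in
    // For events with no associated window, locationInWindow is already in
    // screen coordinates -- but in practice it is the mouse location, not the caret.
    print("event type:", event.type.rawValue, "location:", event.locationInWindow)
}
// Later, when no longer needed:
// if let monitor = monitor { NSEvent.removeMonitor(monitor) }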

Related

Multiple input.event triggers on single button press caused by Mouse or Gamepad Joystick movement in Godot 3.2.3

I'm building a game in Godot and I am running into an issue where Input.is_action_pressed, Input.is_action_just_pressed, and Input.is_action_just_released are all triggering multiple times if the mouse or gamepad joysticks are moving while clicking the buttons. I have tried checking for is_echo, but nothing registers as an echo.
I am looking for input via:
func _input(event):
    if Input.is_action_just_released("AttackRange"):
        fireGun()
This is very easily repeatable for me right now. All I have to do is move my mouse around while clicking, or move either of the joysticks on the gamepad while pressing buttons. I can't figure out what is causing this. Should I be listening for inputs in a different way?
Help would be greatly appreciated!
Yeah, you are mixing ways to get input.
Either use _input and process only the input you get in the event parameter. This is usually better for pointing input (mouse or touch).
Or put your code in _process (or _physics_process if necessary) and use the Input object.
In this particular case, I'd move the code you have to _process.

LabVIEW disabled Slider is still enabled

I have a large problem disabling a slider in LabVIEW. Here is my minimal example:
I have a simple Slider, which is disabled and grayed out if the value is higher than 5. Otherwise the Slider is enabled.
If I drag the slider higher than 5, it gets grayed out, but I am still able to move it around and change the value. Only after I release the slider is it actually disabled.
In my opinion, this is a serious bug in LabVIEW. Is there any way to disable the slider correctly during a drag?
Thank you for your answers!
Additional information:
Like I said, the snippet is only my minimal example to show the basic problem. In my application the following is happening:
I have a state machine with a state that enables the slider and a state that disables it. The state can change at any moment, so it's possible the user is using the slider at the moment of the state change, i.e. the moment of disabling. At that moment the slider should be disabled immediately (currently it only gets grayed out), not after releasing it. So limiting the maximum is not my real goal; I want to prohibit all slider interaction by the user.
"Link to question asked on NI Discussion Forums"
As suggested by Alexander_Sobolev on the NI forum (but I promise I thought of it independently!), you can end the slider drag by generating a mouse up event. On Windows you can do this with Simulate Mouse.vi from the NI site, which calls mouse_event from user32.dll:
Note that one of that VI's mouse position inputs is erroneously marked as 'Required'; I fixed that before creating the code above.
I do think this is a UI technique that should only be used if it's really justified by the requirements of the system, and if the users will understand why it works like that; otherwise it could make for a frustrating and annoying user experience. I don't think it's a bug, rather a design decision, because the opposite behaviour could be equally undesired in other circumstances.
I guess you could set the slider value to 5 inside the case structure, alongside the greying out, by adding another property node. This should keep the slider stuck at 5, if the user tries to pull it above.
This appears to be strange behavior as the Value Change event is triggered while the mouse button is held down even when the control is Disabled & Grayed Out.
One way I can think of to limit the value would be to update the Data Entry Limits Maximum property for this control and set the Response to Value Outside Limits for Maximum to Coerce.

Intercept wxListCtrl scroll events

I'm using wxWidgets 2.8 on a Linux box.
I'd like to get notifications of scroll events from wxListCtrl (or wxListView). Basically I want to be notified when someone uses the scrollbars.
I tried with EVT_SCROLLWIN and EVT_SCROLL without success.
Can someone provide me some sample code?
Unfortunately I cannot find any documentation/sample about this topic. Any pointer?
I need to intercept this event because I'm using a wxListView under MOTIF (sic) and when I scroll the list, the new items are not redrawn (basically I see the list empty until I click an item). So, my hack would be to call wxWindow::Redraw() after a scroll.
Any alternative solution to my original problem?
Thanks.
You won't get scroll events for what can be a native control (even though it isn't one in wxGTK, actually); this is just not something that wxWidgets guarantees, because it is very difficult (and maybe impossible) to implement in general.
Sorry.

Floating NSWindow steals focus

I'm trying to make an app that sort of functions like the Spotlight search that was demonstrated at WWDC.
I managed to get it to the floating level with kCGFloatingWindowLevelKey; however, the window steals focus from whatever window was previously active. I would like the previous window to keep focus while my window still takes input in the text field from the user. Is that doable?
Answers in Swift are preferred, but Objective-C works as well.
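For illustration only (my own sketch, not from this thread): the configuration usually discussed for Spotlight-style windows is a non-activating NSPanel, which can accept typing without deactivating the previously active app:

import Cocoa

// A panel that can take keyboard input for its text field without ever
// becoming the main window or activating the owning application.
final class SearchPanel: NSPanel {
    override var canBecomeKey: Bool { true }   // let the text field accept typing
    override var canBecomeMain: Bool { false } // but never become the main window
}

let panel = SearchPanel(
    contentRect: NSRect(x: 0, y: 0, width: 600, height: 60),
    styleMask: [.nonactivatingPanel, .titled, .fullSizeContentView],
    backing: .buffered,
    defer: false
)
panel.level = .floating                 // same idea as kCGFloatingWindowLevelKey
panel.titleVisibility = .hidden
panel.titlebarAppearsTransparent = true
// Add an NSTextField (or NSSearchField) to panel.contentView here.
panel.makeKeyAndOrderFront(nil)         // orders front without activating the app

The .nonactivatingPanel style mask is what keeps the previously frontmost app active while the panel's field receives keystrokes; whether that satisfies every detail of the Spotlight behaviour is something you would need to verify.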

Showing the keyboard in a Microsoft Surface application

I am creating an application that has multiple browsers open.
Each browser has its own keyboard to type with, but I don't know how to show the keyboard in this application. When the user wants to type a URL, I have to show a keyboard for each user.
The normal keyboard should - at least in Surface mode - appear as soon as a textbox gets focus. The drawback: there is only one open at a time.
If you really need multiple touch keyboards, you would need to implement a custom control displaying and emulating a keyboard (you would need to handle different layouts on your own!). Basically it could be implemented as a bunch of buttons, each one adding a letter to a label, with a delete button removing the last letter. Selecting, copying, deleting and so on would be the interesting parts to implement.
We have done something similar (although with only one keyboard) to emulate a phone-style keyboard for a promotion. To be honest, it wasn't the best experience in terms of usability (a bit worse than the built-in keyboard). It fit its need, but on-screen keyboards aren't a best-in-class experience at all (you could argue, but I like my mechanical keyboard a lot more than any virtual keyboard, so this might be a matter of taste).