I have a form in my Cocoa app that contains an NSSegmentedControl that I want to be controllable via the keyboard. It seems that NSSegmentedControl is very reluctant to become the first responder, however.
Setting the initial first responder of the window to the segmented control does nothing -- it will not have keyboard focus when the window is first loaded. It does receive focus if I manually set the first responder like this, however:
[segmentedControl.window makeFirstResponder: segmentedControl];
That will work fine if the only part of the form is the segmented control. If I add another field (say, an NSTextField), and I set the nextResponder of the segmented control to that field, the segmented control will never become first responder. Focus will immediately go to the text field, and pressing tab to switch back to the segmented control doesn't work.
I've tried subclassing NSSegmentedControl and overriding acceptsFirstResponder, becomeFirstResponder, etc. to no avail. The only one that makes any difference is resignFirstResponder -- if I return NO from that method then the segmented control will indeed retain focus, but obviously I don't want it to retain focus all the time.
Any ideas on how to get the control to behave like a normal responder?
It's behaving as intended. Not all controls participate in the "key view loop". Full keyboard access is a system-wide setting in System Preferences that applies to all apps; it isn't something individual apps implement on their own.
It's best not to use a segmented control in a form intended for heavy keyboard entry. An NSPopUpButton behaves much more like what we all expect from a web form, so it isn't necessarily the wrong choice for your app's UI.
Rather than answer exactly the question you asked (which someone else can do), I humbly suggest you err on the side of functionality at the cost of a slightly prettier UI element, since that prettier element wasn't intended to get along with the keyboard.
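For what it's worth, a rough sketch of that swap (contentView and textField here are just placeholder outlets, not anything from your project):

// An NSPopUpButton participates in the key view loop, so it can be tabbed to
// when full keyboard access is on, which the segmented control won't do here.
NSPopUpButton *popUp = [[NSPopUpButton alloc] initWithFrame:NSMakeRect(20, 60, 160, 26)
                                                  pullsDown:NO];
[popUp addItemsWithTitles:@[@"First", @"Second", @"Third"]];
[contentView addSubview:popUp];

// Wire it into the tab order alongside the existing text field.
[popUp setNextKeyView:textField];
[textField setNextKeyView:popUp];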
I am working on a small game for Apple TV, and one thing I can't get to work is moving the focus from one button to another programmatically, or temporarily removing focus from all objects (the buttons) while the game is doing stuff.
I've seen the guide about objects and focus, but is there really no way to programmatically move the focus to another part of the screen when input is needed there (instead of making the user move all the way across the screen)?
There will always be a view that has focus. It's not possible to not have a focused view.
You could temporarily change the focused appearance of an item so it doesn't appear to have focus, but that would likely be confusing for the user, or conflict with the Human Interface Guidelines.
There's no explicit way to programmatically move focus from one control to another, per the App Programming Guide for tvOS:
The Focus Engine Controls Focus
Only the focus engine can explicitly update focus, meaning there is no API for directly setting the focused view or moving focus in a certain direction. ...
The focus engine controls focus to make sure that it does not move around the screen unexpectedly, and that it behaves similarly across different applications.
Answers to other questions have suggested that you could "game" the system by overriding preferredFocusedView, conditionally setting it to the desired control to move to, then requesting a focus update. Such an approach would likely be fragile.
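That said, if you want to experiment with it anyway, the general shape is roughly this (buttonToFocus and moveFocusToButton: are hypothetical names; on tvOS 10 and later you'd override preferredFocusEnvironments instead):

#import <UIKit/UIKit.h>

@interface GameViewController : UIViewController
// Hypothetical property: the button we want the focus engine to prefer.
@property (nonatomic, weak) UIButton *buttonToFocus;
@end

@implementation GameViewController

// The focus engine consults this when a focus update is requested.
- (UIView *)preferredFocusedView {
    return self.buttonToFocus ?: [super preferredFocusedView];
}

// Hypothetical helper: ask the focus engine to re-evaluate focus, which it may
// resolve to the preferred view above.
- (void)moveFocusToButton:(UIButton *)button {
    self.buttonToFocus = button;
    [self setNeedsFocusUpdate];
    [self updateFocusIfNeeded];
}

@end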
I'm currently working on an OS X menu bar app that uses a custom status item view and an NSPopover to display content.
I'm trying to get it to dismiss at times that make sense, such as when the user changes Spaces (since the popover doesn't move between Spaces the way a window does) or when Mission Control is activated.
Currently, when in Mission Control, the NSPopover stays on top, as shown in this screenshot.
Right now I'm using NSEvent's addGlobalMonitorForEventsMatchingMask:handler: with some mouse event masks; that works all right but doesn't cover all the events I need.
So, is there a way to detect when major OS events happen, like opening Mission Control, changing Spaces, and so on?
Any help would be greatly appreciated.
You can get notified of space changes by registering for NSWorkspace's NSWorkspaceActiveSpaceDidChangeNotification. There isn't a notification as such for Mission Control, but you might investigate whether NSWorkspaceDidActivateApplicationNotification or other notifications can be used to determine what you need.
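In rough outline (the handler name and the popover property are just examples):

// Somewhere in the status item controller. Note that NSWorkspace notifications
// are posted to NSWorkspace's own notification center, not the default one.
- (void)startObservingSpaceChanges {
    [[[NSWorkspace sharedWorkspace] notificationCenter]
        addObserver:self
           selector:@selector(activeSpaceDidChange:)
               name:NSWorkspaceActiveSpaceDidChangeNotification
             object:nil];
}

// Example handler: close the popover whenever the active space changes.
- (void)activeSpaceDidChange:(NSNotification *)note {
    [self.popover performClose:nil];
}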
HTH
I have a custom NSPanel: http://cl.ly/K8SY
I have it set to NSPopUpMenuWindowLevel, which is the level I want it to stay at. An example is the Spotlight menu: when you click on it, the windows in the background keep their focus appearance, yet you can still type into the search field.
I open it with:
[window orderFront:nil]
but this doesn't give the panel keyboard focus while leaving the background windows as they are.
Is it possible to achieve this? If so, how?
You need to use the -[NSWindow makeKeyAndOrderFront:] call instead.
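That is, roughly:

// Orders the panel front and gives it keyboard focus in one call.
[window makeKeyAndOrderFront:nil];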
NOTE: keyboard focus can only be directed at one view in one window. Cocoa's notion of mainWindow and keyWindow can be different windows, but it's only the first responder within the keyWindow that accepts keyboard input.
Is there a way to have my app's window receive keyboard and/or mouse events (e.g. the user clicking the window's buttons) while another, unrelated app retains focus?
I've tried configuring my window at different levels, including [myWindow setLevel:NSPopUpMenuWindowLevel] to no avail.
You should be able to handle mouse clicks without ordering your window front by having your views (at least the ones that handle mouse clicks) implement acceptsFirstMouse: to send NSApp a preventWindowOrdering message and then return YES.
You should not make your app handle typing without ordering itself front. The user might not realize where their typing is going if the field where it's appearing is obscured by another window.
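A rough sketch of the click-handling part, in the view subclass (with the caveat above in mind):

// In a custom NSView subclass that should take clicks without pulling
// its window (and app) to the front.
- (BOOL)acceptsFirstMouse:(NSEvent *)event {
    // Suppress the usual window ordering for this mouse-down.
    [NSApp preventWindowOrdering];
    return YES;
}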
Found it. Simple, yet elusive.
Use an NSPanel and make sure its style mask includes Non Activating (NSNonactivatingPanelMask), or tick the same option in IB's inspector.
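Programmatically, that looks something like this (the frame and the extra style flags are placeholders):

// A panel that can become key (accept typing) without activating the app.
NSPanel *panel = [[NSPanel alloc] initWithContentRect:NSMakeRect(0, 0, 400, 200)
                                            styleMask:(NSTitledWindowMask |
                                                       NSNonactivatingPanelMask)
                                              backing:NSBackingStoreBuffered
                                                defer:NO];
[panel setLevel:NSPopUpMenuWindowLevel];
[panel makeKeyAndOrderFront:nil];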
I'm trying to implement some rudimentary tabs in a Cocoa editor I'm working on. I am using an NSSegmentedControl and adding segments to it as tabs. I'm using a custom NSSegmentedCell subclass for the tabs to draw a little 'x' icon next to the text for closing tabs, and so far it's been going pretty smoothly.
However, I cannot figure out how to actually process mouse events for the tabs to check whether someone moused over (or clicked) the 'x' icon. I tried overriding mouseMoved: in my NSSegmentedControl subclass, but for some odd reason it stops getting called when I add a new segment (I call setAcceptsMouseMovedEvents: with YES in awakeFromNib; do I also have to set it somewhere else?). NSSegmentedCell, being an NSCell subclass, seems to have no mouse event processing aside from mouse tracking, which is triggered only when the control is clicked.
So the question is, how would I properly process mouse events, either in the NSSegmentedControl or in the NSSegmentedCell subclass?
Take a look at NSTrackingArea. You can add a tracking area to your NSSegmentedControl and get mouse-entered events on that to highlight the close button.
As for getting the click events, you're probably best off using a separate NSActionCell subclass for the close button and doing some hit testing there.
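For the tracking-area half, something along these lines in the NSSegmentedControl subclass might do (the _trackingArea ivar and the hit-testing detail are left as sketches):

// Keep a tracking area covering the control so mouseEntered:/mouseMoved:/
// mouseExited: are delivered for hover effects on the close icons.
- (void)updateTrackingAreas {
    [super updateTrackingAreas];
    if (_trackingArea) {
        [self removeTrackingArea:_trackingArea];
    }
    _trackingArea = [[NSTrackingArea alloc] initWithRect:[self bounds]
                                                 options:(NSTrackingMouseEnteredAndExited |
                                                          NSTrackingMouseMoved |
                                                          NSTrackingActiveInKeyWindow)
                                                   owner:self
                                                userInfo:nil];
    [self addTrackingArea:_trackingArea];
}

- (void)mouseMoved:(NSEvent *)event {
    // Sketch: convert the event location to view coordinates, hit-test it against
    // the close-icon rect of the segment under the cursor, and redraw hover state.
    NSPoint point = [self convertPoint:[event locationInWindow] fromView:nil];
    if (NSPointInRect(point, [self bounds])) {
        [self setNeedsDisplay:YES];
    }
}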