In Cocoa I developed an app which is an agent (it runs only as an icon in the upper status bar).
It can display a popover window, which is basically a subclass of NSWindow with an NSView as its content.
In another NSView subclass (which represents the icon in the status bar) I'm adding:
self.settingsPopoverTransiencyMonitor = [NSEvent addGlobalMonitorForEventsMatchingMask:NSLeftMouseDownMask | NSRightMouseDownMask handler:^(NSEvent *event) {
    [selfReference hideSettingsPopover];
}];
So when the user clicks outside the popover window, it hides.
I want to implement similar behaviour when the user swipes up/down with four fingers (i.e. when Exposé or Mission Control is launched).
I tried a lot of the masks available in NSEvent.h, but none of them helped.
First, enable touch events in your NSView:
view.acceptsTouchEvents = YES;
In your NSView subclass, override the touch event methods declared in the NSResponder superclass. If you want to recognize a 4-finger swipe gesture, you probably want -swipeWithEvent:. Since this fires for swipe events with any number of fingers, you will want to filter it to 4-finger gestures only. The method -[NSEvent touchesMatchingPhase:inView:] returns a set of NSTouch objects, one for each finger (i.e. the count of the set equals the number of fingers).
To summarize, the implementation would look something like this:
- (void)swipeWithEvent:(NSEvent *)event
{
    NSSet *touches = [event touchesMatchingPhase:NSTouchPhaseTouching inView:self];
    if (touches.count == 4) {
        // Handle the event here
    }
}
My app is not document based, and its sole window is managed by a custom, xib-based NSWindowController subclass that I instantiate within the app delegate code:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    _mainWindowController = [MainWindowController new];
    // (stored in ivar just to prevent deallocation)

    //[_mainWindowController showWindow:self];
    // ↕︎ Not sure about the difference between these two... both seem to work.
    [[_mainWindowController window] makeKeyAndOrderFront:self];
}
I have subclassed NSClipView to "center content inside a scroll view" (instead of having it pegged to the lower-left corner) when the content is zoomed to a size smaller than the clip view, and also to implement custom functionality on mouse drag, etc.
My window does have a title bar.
My window isn't borderless (I think), so I am not subclassing NSWindow.
I have overridden -acceptsFirstResponder, -canBecomeKeyView and -becomeFirstResponder in my NSClipView subclass (all return YES).
The drag events do trigger -mouseDown: etc., and if I set a breakpoint there, the first responder at that point is the same as the window hosting my clip view: [self.window firstResponder] and [self window] give the same memory address.
What am I missing?
Update
I put together a minimal project reproducing my setup.
I discovered that if my custom view is the window's main view, -keyDown: is called without problems. But if I place a scroll view and replace its clip view by my custom view (to do that, I need to change the base class from NSView to NSClipView, of course!), -keyDown: is no longer triggered.
I assume it has something to do with how NSScrollView manages events (however, as I said before, -mouseDown:, -mouseDragged: etc. seem to be unaffected).
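One guess I still want to verify: since the window, not the clip view, ends up as first responder, key events may simply never reach the clip view. A minimal sketch of explicitly claiming first-responder status in the clip view subclass (an assumption, not something I have confirmed):

- (void)mouseDown:(NSEvent *)theEvent
{
    // Guess: explicitly claim first-responder status so -keyDown: reaches this view.
    [[self window] makeFirstResponder:self];
    [super mouseDown:theEvent];
}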
I also discovered that I can override -keyDown: in my window controller, and that seems to work, so I have decided to do just that (still open to an answer, though). Also, since I'm trying to detect the shift key alone (not as a modifier of another key), I'd rather use:
- (void)flagsChanged:(NSEvent *)event
{
    if ([event modifierFlags] & NSShiftKeyMask) {
        // Shift key is DOWN
    }
    else {
        // Shift key is UP
    }
}
...instead of -keyDown: / -keyUp: (taken from this answer).
Here's how the scroll views work: One scroll view is paging enabled in the horizontal direction. Each 'page' of this scroll view contains a vertically scrolling UITableView. Without modification, this works OK, but not perfectly.
The behaviour that's not right: When the user scrolls up and down on the table view, but then wants to flick over to the next page quickly, the horizontal flick/swipe will not work initially - it will not work until the table view is stationary (even if the swipe is very clearly horizontal).
How it should work: If the swipe is clearly horizontal, I'd like the page to change even if the table view is still scrolling/bouncing, as this is what the user will expect too.
How can I change this behaviour - what's the easiest or best way?
NOTE: For various reasons, a UIPageViewController, as suggested in some answers, will not work. How can I do this with cross-directional UIScrollViews (one is a table view, but you get the idea)? I've been banging my head against a wall for hours - if you think you can do this then I'll more than happily award a bounty.
According to my understanding of the question, it is only while the tableView is scrolling that we want to change the default behaviour. All the other behaviour stays the same.
Subclass UITableView. UITableView is itself a subclass of UIScrollView. In the UITableView subclass, implement this UIGestureRecognizer delegate method (a scroll view acts as the delegate of its own pan gesture recognizer):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    //Edit 1
    //return self.isDecelerating;
    //return self.isDecelerating | self.bounces; //If we want the simultaneous gesture on bounce and scrolling

    //Edit 2
    return self.isDecelerating || self.contentOffset.y < 0 || self.contentOffset.y > MAX(0, self.contentSize.height - self.bounds.size.height); // #Jordan edited - we don't need to always enable the simultaneous gesture for bounce-enabled tableViews
}
This is because we only want to change the default gesture behaviour while the tableView is decelerating.
Now change every UITableView's class to your newly created table view subclass and run the project; the swipe should work while the tableView is scrolling. :]
But the swipe feels a little too sensitive while the tableView is scrolling, so let's make it a little more restrictive.
Subclass UIScrollView. In the UIScrollView subclass, implement another UIGestureRecognizer delegate method, gestureRecognizerShouldBegin:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        CGPoint velocity = [(UIPanGestureRecognizer *)gestureRecognizer velocityInView:self];
        if (fabs(velocity.y) * 2 < fabs(velocity.x)) { // fabs(), not abs(): the velocity components are CGFloats
            return YES;
        }
    }
    return NO;
}
We want to make sure the "swipe is clearly horizontal". The code above only lets the gesture begin if its velocity along the x axis is more than double its velocity along the y axis. [Feel free to increase the hard-coded value "2" if you like; the higher the value, the more horizontal the swipe needs to be.]
Now change the UIScrollView's class (the one that contains the multiple table views) to your scroll view subclass. Run the project. :]
I've put a sample project on GitHub: https://github.com/rishi420/SwipeWhileScroll
Although Apple doesn't like this approach too much:
Important: You should not embed UIWebView or UITableView objects in UIScrollView objects. If you do so, unexpected behavior can result because touch events for the two objects can be mixed up and wrongly handled.
I've found a great way to accomplish this.
This is a complete solution to the problem. In order to scroll the UIScrollView while your UITableView is scrolling, you'll need to disable the table view's user interaction.
- (void)viewDidLoad
{
    [super viewDidLoad];
    _myScrollView.contentSize = CGSizeMake(2000, 0);
    data = [[NSMutableArray alloc] init];
    for (int i = 0; i < 30; i++)
    {
        [data addObject:[NSString stringWithFormat:@"%d", i]];
    }
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    [_myTableView setContentOffset:_myTableView.contentOffset animated:NO];
}

- (void)scrollViewWillBeginDecelerating:(UIScrollView *)scrollView
{
    scrollView.userInteractionEnabled = NO;
}

- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    scrollView.userInteractionEnabled = YES;
}
To sum up the code above: while the UITableView is decelerating, its userInteractionEnabled is set to NO so the UIScrollView will detect the swipe; if the user taps the screen while the table view is scrolling, the tap handler stops the scrolling, and userInteractionEnabled is set back to YES once deceleration ends.
Instead of using UIScrollView as a container for these multiple table views, try using a UIPageViewController.
You can even integrate this into your existing view controller setup as a child view controller (directly replacing the UIScrollView).
In addition, you'll likely want to implement the required methods from UIPageViewControllerDataSource and possibly one or more of the methods from UIPageViewControllerDelegate.
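For instance, a minimal sketch of that setup (the self.pages array of per-page table view controllers is an assumed property, not something from the question):

// Hypothetical sketch: a UIPageViewController whose pages are table view controllers.
// `self.pages` is an assumed NSArray of UITableViewController instances.
- (void)viewDidLoad
{
    [super viewDidLoad];

    UIPageViewController *pageController =
        [[UIPageViewController alloc] initWithTransitionStyle:UIPageViewControllerTransitionStyleScroll
                                         navigationOrientation:UIPageViewControllerNavigationOrientationHorizontal
                                                       options:nil];
    pageController.dataSource = self;
    [pageController setViewControllers:@[self.pages[0]]
                             direction:UIPageViewControllerNavigationDirectionForward
                              animated:NO
                            completion:nil];

    // Embed as a child view controller in place of the UIScrollView.
    [self addChildViewController:pageController];
    pageController.view.frame = self.view.bounds;
    [self.view addSubview:pageController.view];
    [pageController didMoveToParentViewController:self];
}

- (UIViewController *)pageViewController:(UIPageViewController *)pageViewController
      viewControllerBeforeViewController:(UIViewController *)viewController
{
    NSUInteger index = [self.pages indexOfObject:viewController];
    return (index == 0 || index == NSNotFound) ? nil : self.pages[index - 1];
}

- (UIViewController *)pageViewController:(UIPageViewController *)pageViewController
       viewControllerAfterViewController:(UIViewController *)viewController
{
    NSUInteger index = [self.pages indexOfObject:viewController];
    return (index == NSNotFound || index + 1 >= self.pages.count) ? nil : self.pages[index + 1];
}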
Did you try the directionalLockEnabled property of both your table view and your scroll view, setting them up so that one scrolls only horizontally and the other only vertically?
Edit:
1)
What you want to do is quite complicated, since a touch waits some time (around 0.1 s) before the system knows what kind of movement it will be. And if your table is moving, it will take your touch immediately, whatever it is (because it's supposed to be a reactive movement on it).
I don't see any solution other than overriding the touch handling from scratch so you can immediately detect the kind of movement you want (e.g. whether the movement will be horizontal), but it will be very hard to get it right.
2)
Another solution I can suggest is to give your table left and right margins where you can touch the parent (paging) scroll view; then, even if your table is scrolling, touching in those margins will only reach the paging scroll view. It's simpler, but it may not fit your design...
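A minimal sketch of that margin idea, assuming a pageView container view and a 20-point margin (both made up for illustration):

// Hypothetical sketch: leave 20 pt strips on both sides of each page
// so touches there go to the outer paging scroll view, not the table view.
CGFloat margin = 20.0;
tableView.frame = CGRectInset(pageView.bounds, margin, 0);
[pageView addSubview:tableView];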
Use UIPageViewController and, in the -viewDidLoad method (or any other method that suits your needs or design), get UIPageViewController's UIScrollView subview and assign a delegate to it. Keep in mind that its delegate property won't be nil, so you can optionally keep the original delegate in another reference and then assign your own object, which conforms to UIScrollViewDelegate, to it. For example:
id<UIScrollViewDelegate> originalPageScrollViewDelegate = ((UIScrollView *)[pageViewController.view.subviews objectAtIndex:0]).delegate;
[((UIScrollView *)[pageViewController.view.subviews objectAtIndex:0]) setDelegate:self];
This way you can implement the UIScrollViewDelegate methods with ease, and your UIPageViewController's scroll view will call your delegate's -scrollViewDidScroll: method.
By the way, you may be obliged to keep the original delegate and respond to the delegate methods with that object. You can see an example implementation in the ViewPagerController class in my UI control project here.
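For instance, a minimal forwarding sketch, assuming you stored the original delegate from the snippet above in an ivar (the ivar name here is made up):

// Hypothetical sketch of keeping and forwarding to the original delegate.
// `_originalPageScrollViewDelegate` is an assumed ivar name.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Do your own work first...

    // ...then let UIPageViewController's own delegate do its job.
    if ([_originalPageScrollViewDelegate respondsToSelector:@selector(scrollViewDidScroll:)]) {
        [_originalPageScrollViewDelegate scrollViewDidScroll:scrollView];
    }
}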
I faced the same thing recently. My UIScrollView was in paging mode and every page contained a UITableView, and, like you described, it worked but not as you'd expect it to. This is how I solved it.
First I disabled the scrolling of the UIScrollView.
Then I added a UISwipeGestureRecognizer to the actual UITableView for left and right swipes.
The action for those swipes was:
[scroll setContentOffset:CGPointMake(currentPointX + 320, PointY) animated:YES];
//Or
[scroll setContentOffset:CGPointMake(currentPointX - 320 , PointY) animated:YES];
This works flawlessly; the only downside is that if the user drags a finger across the UITableView, that drag is treated as a swipe, so they won't be able to see half of screen A and half of screen B on the same screen.
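For reference, a rough sketch of that setup (the _scrollView/_tableView outlets, the handler names, and the 320-point page width are all assumptions):

// Hypothetical sketch: disable the paging scroll view's own scrolling and
// drive it from swipe recognizers attached to the table view.
- (void)viewDidLoad
{
    [super viewDidLoad];
    _scrollView.scrollEnabled = NO;

    UISwipeGestureRecognizer *left =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(didSwipeLeft:)];
    left.direction = UISwipeGestureRecognizerDirectionLeft;
    [_tableView addGestureRecognizer:left];

    UISwipeGestureRecognizer *right =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(didSwipeRight:)];
    right.direction = UISwipeGestureRecognizerDirectionRight;
    [_tableView addGestureRecognizer:right];
}

- (void)didSwipeLeft:(UISwipeGestureRecognizer *)recognizer
{
    CGPoint offset = _scrollView.contentOffset;
    [_scrollView setContentOffset:CGPointMake(offset.x + 320, offset.y) animated:YES];
}

- (void)didSwipeRight:(UISwipeGestureRecognizer *)recognizer
{
    CGPoint offset = _scrollView.contentOffset;
    [_scrollView setContentOffset:CGPointMake(MAX(0, offset.x - 320), offset.y) animated:YES];
}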
You could subclass your scroll view and your table views, and add this gesture recognizer delegate method to each of them...
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
I can't be sure this is exactly what you are after, but it may come close.
I have a pretty simple question for which I could not find a simple answer.
When using Cocoa (OS X, Xcode) and the mouseDown: method, which is called when the mouse clicks on a view, how can I detect which object the mouse clicked on? I just need the class name, so I can tell whether the user clicked on, for example, an NSImageView, a WebView, an NSTextView, or the NSView itself. Or even better, if I have two NSImageViews in my NSView, how can I detect which one was clicked?
Cheers.
In your view's mouseDown: method, you can call the hitTest: method to get the farthest descendant of the receiver in the view hierarchy that was clicked (note that hitTest: expects a point in the coordinate system of the receiver's superview):
So in your view subclass, you could do something like:
- (void)mouseDown:(NSEvent *)theEvent
{
    // hitTest: expects a point in the superview's coordinate system
    NSPoint point = [self.superview convertPoint:[theEvent locationInWindow] fromView:nil];
    id clickedObject = [self hitTest:point];
    if ([clickedObject isKindOfClass:[NSImageView class]]) {
        NSLog(@"Clicked an NSImageView");
    } else if ([clickedObject isKindOfClass:[WebView class]]) {
        NSLog(@"Clicked a WebView");
    }
}
Your question seems a bit odd, though, because normally you don't need to do this hit testing yourself.
If you're trying to get a click event when a particular image is clicked, a better way would be to use a borderless button with its image set, then implement an action method and connect it to the button.
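A small sketch of that approach, e.g. in the custom view's awakeFromNib (the frame, image name and action selector are placeholders):

// Rough sketch: a borderless image button instead of hit-testing NSImageViews.
- (void)awakeFromNib
{
    [super awakeFromNib];

    NSButton *imageButton = [[NSButton alloc] initWithFrame:NSMakeRect(20, 20, 100, 100)];
    imageButton.image = [NSImage imageNamed:@"myPicture"];   // assumed image name
    imageButton.bordered = NO;                               // borderless, so only the image shows
    [imageButton setButtonType:NSMomentaryChangeButton];
    imageButton.target = self;
    imageButton.action = @selector(imageClicked:);
    [self addSubview:imageButton];
}

- (void)imageClicked:(id)sender
{
    NSLog(@"Image button clicked");
}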
I'd like to display an NSWindow when right clicking an item in an NSTableView, similarly to how the available outlets are shown in Interface Builder when you right click an object:
Unfortunately you can only use an NSMenu subclass as the menu property.
I also didn't find a delegate method of NSTableView that notifies about right clicks.
I was able to subclass NSTableView and implement rightMouseDown: and rightMouseUp: to be notified about those events, but if I set the menu property of the row cells to nil, they are not highlighted when right-clicked, even though I call the super implementation:
- (void)rightMouseDown:(NSEvent *)theEvent {
    [super rightMouseDown:theEvent];

    NSPoint eventLocation = [theEvent locationInWindow];
    eventLocation = [self convertPoint:eventLocation fromView:nil];
    NSInteger rowIndex = [self rowAtPoint:eventLocation];
    NSLog(@"Right clicked at row index %ld", (long)rowIndex);
}
I would like to have the highlight effect in the image below but display a window instead of the context menu:
First, for the right click: explicitly select the row on right click (e.g. via this message). Then create your own NSWindow descendant, set your own NSView subclass as its contentView, and in that view you can draw the black background, rounded borders and whatnot. Show this window in your right-click handler.
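A rough sketch of that idea in the NSTableView subclass from the question (the HighlightWindow class and the window geometry are assumptions):

// Hypothetical sketch: select the clicked row, then show a custom window next to it.
- (void)rightMouseDown:(NSEvent *)theEvent
{
    NSPoint eventLocation = [self convertPoint:[theEvent locationInWindow] fromView:nil];
    NSInteger rowIndex = [self rowAtPoint:eventLocation];
    if (rowIndex < 0) {
        return;
    }

    // Explicitly select the row so it gets the usual highlight.
    [self selectRowIndexes:[NSIndexSet indexSetWithIndex:rowIndex] byExtendingSelection:NO];

    // `HighlightWindow` is an assumed NSWindow subclass whose content view
    // draws the dark, rounded background.
    NSRect rowRect = [self.window convertRectToScreen:[self convertRect:[self rectOfRow:rowIndex] toView:nil]];
    HighlightWindow *panel = [[HighlightWindow alloc] initWithContentRect:NSMakeRect(NSMaxX(rowRect), NSMinY(rowRect), 300, 150)
                                                                styleMask:NSBorderlessWindowMask
                                                                  backing:NSBackingStoreBuffered
                                                                    defer:NO];
    [self.window addChildWindow:panel ordered:NSWindowAbove];
    [panel orderFront:nil];
}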
You can use an NSPopover, which works quite nicely. A popover creates a window for you, even if it is somewhat hidden. You'll get it from your controls if you send them the window message, and can register to listen for events, for instance.
The whole popover can be created in IB, and you just have to call the showRelativeToRect:ofView:preferredEdge: method in code.
To catch the right-click event, you can use rightMouseDown:, which is originally defined in NSResponder but is overridden in NSView to simply catch the event and show the menu, without passing the event up the responder chain (or the inheritance chain, for that matter). Hence, you simply implement that method to call showRelativeToRect:ofView:preferredEdge:.
You will typically need to have the contents in an NSViewController and its own accompanying nib file.
The NSPopover's contentViewController property can be set in IB, too.
All in all, not much code needed.
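For example, a small sketch in the table view subclass (the popover property is assumed to be an outlet wired up in IB as described):

// Hypothetical sketch: show an IB-configured NSPopover on right click.
// `self.rowPopover` is an assumed NSPopover property connected in the nib.
- (void)rightMouseDown:(NSEvent *)theEvent
{
    NSPoint location = [self convertPoint:[theEvent locationInWindow] fromView:nil];
    NSInteger rowIndex = [self rowAtPoint:location];
    if (rowIndex < 0) {
        return;
    }

    [self selectRowIndexes:[NSIndexSet indexSetWithIndex:rowIndex] byExtendingSelection:NO];
    [self.rowPopover showRelativeToRect:[self rectOfRow:rowIndex]
                                 ofView:self
                          preferredEdge:NSMaxXEdge];
}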
This tutorial is useful.
I have a question that I've searched but can't find a definitive answer to. Here is my layout:
UIView - ViewController
|_UIScrollView - added programmatically
| |_UIView to hold a background/perimeter - added programmatically
|_UIView 1 - added programmatically
|_UIView 2 - added programmatically
and so on
My question is: how come the ViewController's touchesMoved is called only once when I move, say, UIView 2 with a touch?
Now, the UIView has its own touchesMoved method, but I need the controller's touchesMoved to get called, because it needs to talk to the scroll view to update its position; for example, when UIView 2 is near a corner, the scroll view should move a little to fully show UIView 2.
If there is no way around this, is there a way to update the scroll view from UIView 2 so it scrolls when the view is near a corner?
Edit:
I think I may have found a workaround. Not sure if this will be accepted by Apple, but:
I just made an instance variable equal to self.superview, which then allows me to talk back to my scroll view within UIView's touchesMoved.
In it I can call [scrollView setContentOffset:(CGPoint)contentOffset animated:(BOOL)animated], so my scroll view gets updated as the subview (UIView 2) moves close to the edge of the UIWindow.
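For reference, a minimal sketch of that workaround inside the draggable UIView subclass (casting the superview to UIScrollView and the 20-point threshold are assumptions):

// Hypothetical sketch: nudge the parent scroll view when this view is dragged near its right edge.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];

    // Assumption: the superview is the scroll view, as in the workaround above.
    UIScrollView *scrollView = (UIScrollView *)self.superview;
    if (![scrollView isKindOfClass:[UIScrollView class]]) {
        return;
    }

    CGFloat visibleRightEdge = scrollView.contentOffset.x + scrollView.bounds.size.width;
    if (CGRectGetMaxX(self.frame) > visibleRightEdge - 20.0) {
        CGPoint offset = scrollView.contentOffset;
        offset.x += 20.0;
        [scrollView setContentOffset:offset animated:YES];
    }
}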
Thank you for the suggestions.
The behavior you describe is the result of the UIScrollView hijacking the touch-moved event. In other words, as soon as the UIScrollView detects that a touch-moved event falls within its frame, it takes control of it. I experienced the same behavior while trying to create a special swipe handler, and it failed each time a UIScrollView was also interested in the swipe.
In my case, I solved the issue by intercepting the event in sendEvent: overridden in my custom UIWindow, but I don't know if you want to do the same. In any case, this is what worked for me:
- (void)sendEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    UITouch *touch = [allTouches anyObject];
    UIView *touchView = [touch view];

    //-- UIScrollViews will make touchView be nil after a few UITouchPhaseMoved events;
    //-- by storing the initial view getting the touch, we can overcome this problem
    if (!touchView && _initialView && touch.phase != UITouchPhaseBegan)
        touchView = _initialView;

    //-- do your own management of the event

    //-- let the event propagate if you also want the default event management
    [super sendEvent:event];
}
An alternative approach that you might investigate is attaching a gesture recognizer to your views -- they have a pretty high priority, so maybe the UIScrollView will not mess with them and it might work better for you.
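For example, a rough sketch of dragging the view with a pan recognizer instead of touchesMoved: (the draggableView outlet and handler name are made up):

// Hypothetical sketch: drag the view with a pan recognizer.
- (void)viewDidLoad
{
    [super viewDidLoad];
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    [self.draggableView addGestureRecognizer:pan];
}

- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    CGPoint translation = [recognizer translationInView:recognizer.view.superview];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointZero inView:recognizer.view.superview];
}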
"If there is no way around this, is there a way to update the scroll view from UIView 2 so it scrolls when the view is near a corner?"
Have you tried to make the UIScrollView scroll by calling:
- (void)setContentOffset:(CGPoint)contentOffset animated:(BOOL)animated
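For example (the 20-point offset is just for illustration):

// e.g. inside UIView 2's touchesMoved:, via a reference to the scroll view
CGPoint offset = scrollView.contentOffset;
[scrollView setContentOffset:CGPointMake(offset.x + 20.0, offset.y) animated:YES];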