Detect scroll wheel movement and direction - objective-c

I'm programming a game using the new Sprite Kit framework and want to use the mouse scroll wheel to change the player's gun. First of all, I want to detect when the scroll wheel moves. I tried the following method from the Cocoa Event Handling Guide:
- (void)scrollWheel:(NSEvent *)theEvent
{
NSLog(@"SCROOOOOOOLL iN MOVeMENT");
}
Result: nothing. The method was never called when I moved the mouse wheel.
Any idea how I can handle this event?
UPDATE:
I saw two answers talking about classes deriving from NSResponder or NSView, but my class derives from SKScene. I'm programming with the Sprite Kit framework, and Objective-C doesn't allow multiple inheritance.
Here is the class definition:
#import <SpriteKit/SpriteKit.h>
@interface OpcionesMenu : SKScene
@end

Even though SKScene derives from NSResponder, it isn't a direct recipient of AppKit events. Rather, the SKView (derived from NSView/UIView) that houses the scene handles the events. You can observe this by setting a breakpoint on your scene's mouseDown: method and looking at the backtrace:
-[TWTMyScene mouseDown:]
-[SKView mouseDown:] ()
-[NSWindow sendEvent:] ()
-[NSApplication sendEvent:] ()
-[NSApplication run] ()
NSApplicationMain ()
So you can subclass SKView and forward the scrollWheel: event yourself (a sketch of that approach follows the category below). Or, if you like living dangerously, you can just paste this category at the end of any of your code and it will take care of it for you:
@implementation SKView(ScrollTouchForwarding)
- (void)scrollWheel:(NSEvent *)event {
[self.scene scrollWheel:event];
}
@end
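If you prefer the subclassing route instead of the category, a minimal sketch might look like this (the class name ForwardingSKView is made up; set it as the custom class of your SKView in Interface Builder or use it when creating the view in code):
// ForwardingSKView.h (hypothetical name)
#import <SpriteKit/SpriteKit.h>
@interface ForwardingSKView : SKView
@end
// ForwardingSKView.m
#import "ForwardingSKView.h"
@implementation ForwardingSKView
- (void)scrollWheel:(NSEvent *)event {
    // Forward scroll wheel events to the currently presented scene.
    [self.scene scrollWheel:event];
}
@end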

I think this could help:
- (void)scrollWheel:(NSEvent *)theEvent
{
[super scrollWheel:theEvent];
NSLog(@"user scrolled %f horizontally and %f vertically", [theEvent deltaX], [theEvent deltaY]);
}
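Building on that, and going back to the original goal of switching guns with the scroll wheel, a minimal sketch inside the scene might look like this; selectNextWeapon and selectPreviousWeapon are hypothetical game methods, and scrollingDeltaY is used because it honours precise trackpad deltas:
- (void)scrollWheel:(NSEvent *)theEvent {
    // scrollingDeltaY respects the user's scroll-direction preference and
    // precise trackpad deltas; its sign tells us which way the wheel moved.
    CGFloat deltaY = theEvent.scrollingDeltaY;
    if (deltaY > 0) {
        [self selectNextWeapon];      // hypothetical method on this scene
    } else if (deltaY < 0) {
        [self selectPreviousWeapon];  // hypothetical method on this scene
    }
}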

I know this is an old question, but I was having this exact problem with a collection view. I have an NSScrollView that contains an NSCollectionView and some other views such as labels and buttons.
Anyway, I wanted the collection view to scroll horizontally and the parent scroll view to scroll vertically. I achieved this by storing and comparing the deltaX and deltaY values.
I made a custom scroll view class and set it as the class of the collection view's scroll view.
Here is the header code:
#import <Cocoa/Cocoa.h>
@interface CustomCollectionScrollView : NSScrollView {
// Current scroll position.
CGFloat positionAxisX;
CGFloat positionAxisY;
}
@end
And here is the implementation code:
#import "CustomCollectionScrollView.h"
@implementation CustomCollectionScrollView
/// DRAW RECT METHOD ///
-(void)drawRect:(NSRect)dirtyRect {
[super drawRect:dirtyRect];
positionAxisX = 0.0;
positionAxisY = 0.0;
}
/// SCROLL METHODS ///
-(void)scrollWheel:(NSEvent *)theEvent {
// Move the parent scroll view if the user is scrolling
// vertically; otherwise move the collection scroll view.
if ((theEvent.deltaX == positionAxisX) && (theEvent.deltaY != positionAxisY)) {
[self.nextResponder scrollWheel:theEvent];
} else if ((theEvent.deltaX != positionAxisX) && (theEvent.deltaY == positionAxisY)) {
[super scrollWheel:theEvent];
}
// Update the current position values.
positionAxisX = theEvent.deltaX;
positionAxisY = theEvent.deltaY;
}
@end

I don't know much about SpriteKit, but I see that there is an SKView class that is a subclass of NSView. I would guess that you should subclass SKView and put your scrollWheel: method there.

SKScene is part of SpriteKit, which is an iOS framework, not an AppKit framework. Hence there is no scroll wheel handling at all (and no NSResponder, but UIResponder, in the inheritance chain). You can only work with touch events in Sprite Kit.

Related

Move NSWindow by dragging NSView that is above other views

I have a macOS application that contains an NSTableView and an NSVisualEffectView. The visual effect view acts like a bar at the bottom of the window; it sits over the table view and contains a few buttons, etc.
Anyway, if I want to move the NSWindow by dragging the visual effect view, it only works if the table view is not below the visual effect view. The reason I want the visual effect view to be above the table view is so that I get a nice blur effect when the user is scrolling through the table view content.
However, when the visual effect view is above the table view, the mouse/drag/etc. events are not registered. Instead, they get passed to the table view. How can I stop this from happening?
I tried subclassing NSVisualEffectView, but everything I have tried has failed. Here is my code:
#import <Cocoa/Cocoa.h>
@interface BottomMainBar : NSVisualEffectView {
}
@end
Here is the implementation code:
#import "BottomMainBar.h"
@implementation BottomMainBar
/// DRAW RECT METHOD ///
-(void)drawRect:(NSRect)dirtyRect {
[super drawRect:dirtyRect];
[self setWantsLayer:YES];
[self.window setMovableByWindowBackground:YES];
[self setAcceptsTouchEvents:YES];
[self registeredDraggedTypes];
}
/// OTHER METHODS ///
-(BOOL)mouseDownCanMoveWindow {
return YES;
}
-(BOOL)acceptsFirstMouse:(NSEvent *)event {
return YES;
}
-(void)mouseDown:(NSEvent *)event {}
-(void)mouseDragged:(NSEvent *)event {}
-(void)mouseUp:(NSEvent *)event {}
-(void)mouseEntered:(NSEvent *)event {}
-(void)mouseExited:(NSEvent *)event {}
@end
Nothing I have tried has worked, how can I stop the visual effect view from passing on the mouse events to the layer below it?
Thanks for your time, Dan.
In the end I managed to find a solution, and thankfully it involves NO libraries or open-source code (and obviously no private APIs).
The problem
I have an NSVisualEffectView that spans the width of my view controller and is 38 px tall. It is positioned at the top of my view controller. It acts as a custom toolbar that contains a few buttons and labels. It is placed above an NSTableView that displays all sorts of content (images, video, text, etc...).
I placed the visual effect view above the table view because I wanted a nice blur effect when the user scrolled the table view. The problem with this is that the mouse down events on the visual effect view get passed to the table view and NOT the overall NSWindow. This results in the user being unable to drag and move the NSWindow when they click and drag the visual effect view (because the mouse down events are not passed to the window).
I noticed that the top 10 px of the visual effect view DID pass the mouse down events to the window and not the table view. This is because the window's title bar is around 10-15 px tall. However, my visual effect view is 38 px tall, so the bottom half of it was unable to move the window.
The solution
The solution involves making two subclasses, one for the visual effect view and another for the NSWindow. The subclass for the visual effect view simply passes the mouse down events straight to the window, rather than letting them go to the next responder (which, depending on the size of the window title bar, can be the table view).
Header code (Visual Effect View class):
#import <Cocoa/Cocoa.h>
@interface TopMainBar : NSVisualEffectView {
}
@end
Implementation code (Visual Effect View class):
#import "TopMainBar.h"
@implementation TopMainBar
/// INIT WITH FRAME ///
-(id)initWithFrame:(NSRect)frameRect {
if ((self = [super initWithFrame:frameRect])) {
[self setWantsLayer:YES];
[self.window setMovableByWindowBackground:YES];
}
return self;
}
/// MOUSE METHODS ///
-(void)mouseDown:(NSEvent *)event {
[self.window mouseDown:event];
}
@end
The subclass for the window involves turning the window title bar into a toolbar; this in effect increases the size of the title bar (and, as it happens, increases it to around 38 px, which is exactly what I needed). The ideal solution would involve being able to increase the title bar height to any custom size, but that is not possible, so the toolbar solution is the only way.
Because the size of the title bar is increased, all the mouse down events are now passed to the window and not the table view. This enables the user to drag the window from any part of the visual effect view.
Header code (Window class):
#import <Cocoa/Cocoa.h>
@interface CustomWindow : NSWindowController <NSWindowDelegate> {
}
// UI methods.
-(BOOL)isWindowFullScreen;
@end
Implementation code (Window class):
#import "CustomWindow.h"
@interface CustomWindow ()
@end
@implementation CustomWindow
/// WINDOW DID LOAD ///
-(void)windowDidLoad {
[super windowDidLoad];
// Ensure this window is the current selected one.
[self.window makeKeyAndOrderFront:self];
// Ensure the window can be moved.
[self.window setMovableByWindowBackground:YES];
// Set the window title bar options.
self.window.titleVisibility = NSWindowTitleHidden;
self.window.titlebarAppearsTransparent = YES;
self.window.styleMask |= (NSWindowStyleMaskFullSizeContentView | NSWindowStyleMaskUnifiedTitleAndToolbar | NSWindowStyleMaskTitled);
self.window.movableByWindowBackground = YES;
self.window.toolbar.showsBaselineSeparator = NO;
self.window.toolbar.fullScreenAccessoryView.hidden = YES;
self.window.toolbar.visible = ![self isWindowFullScreen];
}
/// UI METHODS ///
-(BOOL)isWindowFullScreen {
return (([self.window styleMask] & NSWindowStyleMaskFullScreen) == NSWindowStyleMaskFullScreen);
}
/// WINDOW METHODS ///
-(void)windowWillEnterFullScreen:(NSNotification *)notification {
self.window.toolbar.visible = NO;
}
-(void)windowDidEnterFullScreen:(NSNotification *)notification {
self.window.toolbar.visible = NO;
}
-(void)windowWillExitFullScreen:(NSNotification *)notification {
self.window.toolbar.visible = YES;
}
-(void)windowDidExitFullScreen:(NSNotification *)notification {
self.window.toolbar.visible = YES;
}
/// OTHER METHODS ///
-(BOOL)mouseDownCanMoveWindow {
return YES;
}
@end
In the custom window class you can see that I am changing the toolbar's visibility depending on the full-screen state of the window. This is to stop the title bar from appearing and covering up my custom visual effect view when the window goes into full-screen mode.
In order for this to work, you need to add an empty toolbar to your window. You can do this in Interface Builder by dragging and dropping an NSToolbar object onto your window.
Make sure you connect the window to the window delegate; otherwise, the full-screen delegate methods will not be called.
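If you prefer to set this up in code rather than Interface Builder, a minimal sketch could look like the following, placed in windowDidLoad (the toolbar identifier @"MainToolbar" is an arbitrary name chosen for illustration):
// Programmatic alternative to the Interface Builder setup (sketch only).
NSToolbar *toolbar = [[NSToolbar alloc] initWithIdentifier:@"MainToolbar"]; // arbitrary identifier
toolbar.showsBaselineSeparator = NO;
self.window.toolbar = toolbar;   // gives the window the taller, toolbar-style title bar
self.window.delegate = self;     // needed so the full-screen delegate methods above are called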
Conclusion
This solution involves increasing the size of the title bar by changing it into a toolbar. The mouse down events that are passed on from the visual effect view class are then handled by the window (not any view behind it), and thus the window can be moved.

NSScrollView how to start from top left corner

How do I set a scroll view's document view to be pinned to the top-left corner? If the scroll view is bigger than its content, everything is drawn from the bottom up and it doesn't look right. Do I have to override isFlipped on the scroll view?
I searched the internet, and overriding isFlipped to return true is not everything. I don't want to make my documentView itself flipped, because then I would have to make changes in that class to get everything to look the way I want.
I created a simple NSView subclass as a container for the elements that I want to have inside my scroll view, and everything looks perfect. I hope this will help someone!
@interface FlippedView : NSView
@end
and implementation:
@implementation FlippedView
- (void)drawRect:(NSRect)dirtyRect {
[super drawRect:dirtyRect];
// Drawing code here.
}
- (BOOL) isFlipped
{
return YES;
}
@end
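For reference, a minimal programmatic setup using this container might look like the sketch below (frame sizes are arbitrary); because the container is flipped, its subviews are laid out from the top-left corner:
NSScrollView *scrollView = [[NSScrollView alloc] initWithFrame:NSMakeRect(0, 0, 400, 300)];
FlippedView *container = [[FlippedView alloc] initWithFrame:NSMakeRect(0, 0, 400, 600)];
scrollView.documentView = container;   // subviews of the flipped container start at the top
scrollView.hasVerticalScroller = YES;
// Scroll to the top of the document (the origin is the top-left corner in a flipped view).
[container scrollPoint:NSMakePoint(0, 0)];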
Swift 4 method to invert the axes:
From https://stackoverflow.com/a/40381180/5464805, thanks to Ben Leggiero:
import Cocoa
class FlippedView: NSView {
override var isFlipped: Bool { return true }
}
Then, in the storyboard, set this class on the NSView below the NSClipView, and it does the trick.
However, the effect won't show up in the storyboard itself, so you'll have to build and run to see it.

How to use touchesBegan from one UIView in another UIViewController

I have made a graph with data in a UIView called HeartrateGraph. In a UIViewController named HRGraphInfo, I have a connected label that should output values when the graph is touched. The problem is that I don't know how to send a touch event from the UIView to the UIViewController using delegates.
Here is my touch assignment code in the UIView:
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self];
for (int i = 0; i < kNumberOfPoints; i++)
{
if (CGRectContainsPoint(touchAreas[i], point))
{
graphInfoRF.heartRateGraphString = [NSString stringWithFormat:@"Heart Rate reading #%d at %@ bpm", i+1, dataArray[i]];
graphInfoRF.touched = YES;
break;
}
}
This segment of code is in a touchesBegan and properly stores the data value and number in the object graphInfoRF (I just did not show the declarations of dataArray, kNumberOfPoints, etc).
I am able to access graphInfoRF in the UIViewController using:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
if (graphInfoRF.touched == YES) {
self.heartRateLabel.text = graphInfoRF.heartRateGraphString;
}
else {
self.heartRateLabel.text = @"No data got over to this file";
}
}
The label will show the correct string, but only after the data point on the graph is touched AND the label is touched right after. How do I change the touchesBegan so that once I touch the data point on the graph it will fill the label with the data automatically without the need for a second and separate touch on the label?
Every ViewController comes with a single view that it manages once it is initialized. You should be familiar with this view: you see it whenever you use a ViewController in Interface Builder, and you can access it with self.view if you're modifying a subclass.
Since a ViewController comes with a view, it also receives touch events for that view. Implementing touchesBegan in the ViewController will then receive events for that view and, normally, for any subviews that view is managing. Since you've done your own implementation of touchesBegan in your HeartRateGraph, and since HeartRateGraph is a subview of the ViewController's main view, HeartRateGraph will receive and handle the touch event first, before the ViewController ever has a chance to receive and handle it the way it normally would (think of events bubbling up).
So what's happening is that the code to change the label in the ViewController is only called when the label is touched, because the label is a subview of the ViewController's main view and doesn't have its own touches implementation; the ViewController is therefore only able to receive and handle the event the way you want when you touch somewhere outside the graph. There are two ways to solve this.
Either pass the event up to your superview
[self.superview touchesBegan:touches withEvent:event];
or the proper recommended way of doing it:
Protocols and delegates, where your view makes a delegate call to its ViewController, letting it know the graph has been touched and that the ViewController needs to update its contents. A sketch of that pattern follows.
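A minimal sketch of the delegate approach, reusing the class names from the question (HeartrateGraph and HRGraphInfo); the protocol name, its method, and the outlet names are made up for illustration:
// HeartrateGraph.h -- declare a delegate protocol (hypothetical names).
@protocol HeartrateGraphDelegate <NSObject>
- (void)heartrateGraph:(UIView *)graph didSelectReading:(NSString *)reading;
@end
@interface HeartrateGraph : UIView
@property (nonatomic, weak) id<HeartrateGraphDelegate> delegate;
@end
// HeartrateGraph.m -- inside touchesBegan:withEvent:, after the touched point is found,
// notify the delegate instead of setting flags on graphInfoRF:
// [self.delegate heartrateGraph:self didSelectReading:readingString];
// HRGraphInfo.m -- conform to HeartrateGraphDelegate, set graphView.delegate = self, and implement:
- (void)heartrateGraph:(UIView *)graph didSelectReading:(NSString *)reading {
    self.heartRateLabel.text = reading;   // the label updates as soon as the graph is touched
}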

How to Stop a UIScrollView from Swallowing Touches

I have a UIScrollView that has many subviews. While scrolling, I want to be able to tap on a subview so that it can be dragged. Is there a way to stop the UIScrollView from swallowing touches? Or is it possible to start a new touch when the scrolling is cancelled (i.e., while it scrolls and I tap a subview, the subview gets the tap as well so I can drag it out)?
Subclass UIScrollView and override the - (BOOL)touchesShouldCancelInContentView:(UIView *)view method. Here's an example that lets UISlider touches pass through:
#import "scrollViewWithButtons.h"
@implementation scrollViewWithButtons
- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
return ![view isKindOfClass:[UISlider class]];
}
@end
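As a usage note: touchesShouldCancelInContentView: is generally only consulted when canCancelContentTouches is YES (its default), and you may also want touches delivered to the content immediately. A small configuration sketch (the surrounding view controller and its view are assumed):
// Sketch: configure the custom scroll view so content views receive touches promptly.
scrollViewWithButtons *scrollView = [[scrollViewWithButtons alloc] initWithFrame:self.view.bounds];
scrollView.delaysContentTouches = NO;     // deliver touches to subviews immediately
scrollView.canCancelContentTouches = YES; // keep YES so touchesShouldCancelInContentView: is asked
[self.view addSubview:scrollView];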

When does setter method get called from UIView subclass

I am taking the free Stanford course on iTunes U (CS193P), and we set up a class that is a subclass of UIView with a public property called scale. The idea is that when we pinch, the scale of the view changes accordingly, but I am confused about when the setter of the scale property gets called. Here is the relevant code:
@interface FaceView : UIView
@property (nonatomic) CGFloat scale; // anyone who wants to can publicly set my scale
-(void)pinch:(UIPinchGestureRecognizer *)gesture;
@end
@synthesize scale = _scale;
#define DEFAULT_SCALE 0.90
-(CGFloat)scale{
if(!_scale){
return DEFAULT_SCALE;
}else {
return _scale;
}
}
-(void)setScale:(CGFloat)scale{
NSLog(@"setting the scale");
if(scale != _scale){
_scale = scale;
[self setNeedsDisplay];
}
}
-(void)pinch:(UIPinchGestureRecognizer *)gesture{
if ( (gesture.state == UIGestureRecognizerStateChanged) || (gesture.state == UIGestureRecognizerStateEnded)){
self.scale *= gesture.scale;
gesture.scale = 1;
}
}
When I am in "pinch mode", the setScale method continues to be called as I am pinching (my NSLog statement keeps printing until I stop the pinch). How does the setScale method keep being called when there isn't any code programmatically calling it? Perhaps I missed something along the way.
@cspam, remember that setting up the gesture recognizer takes two steps:
1) Adding a gesture recognizer to the UIView. This kind of confused me in the lecture, but even though he says "add a gesture recognizer to the UIView", he really means add a gesture recognizer FOR the UIView, IN the UIViewController. That is the code you are missing; you have to add it in the UIViewController subclass (in this case the FaceViewController; your name might be different), and that is what keeps calling your pinch: method in FaceView above:
UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self.faceView action:@selector(pinch:)];
[self.faceView addGestureRecognizer:pinchGesture];
You would add this code in your UIViewController subclass, in the setter method for your UIView outlet [in other words, create and connect an IBOutlet property in your UIViewController to your UIView in the storyboard] and override that setter to include the code above (see the sketch after this list).
2) The second part is what you have in your code. So the pinch: method will be called every time the gesture recognizer senses a pinch gesture.
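A minimal sketch of step 1, placed in the view controller and assuming the outlet is named faceView as in the snippet above:
// FaceViewController.m (sketch) -- attach the recognizer when the outlet is set.
- (void)setFaceView:(FaceView *)faceView {
    _faceView = faceView;
    UIPinchGestureRecognizer *pinchGesture =
        [[UIPinchGestureRecognizer alloc] initWithTarget:self.faceView
                                                  action:@selector(pinch:)];
    [self.faceView addGestureRecognizer:pinchGesture];
}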
Hope this helps.