UIControl Subclass - Events called twice - objective-c

I'm currently working on a custom UIControl subclass. To track touches I use the following method:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"Start");
    CGPoint location = [touch locationInView:self];
    if ([self touchIsInside:location] == YES) {
        // Touch Down
        [self sendActionsForControlEvents:UIControlEventTouchDown];
        return YES;
    }
    else {
        return NO;
    }
}
This works as expected and @"Start" is logged exactly once. The next step is that I add a target and a selector for UIControlEventTouchDown.
[markItem addTarget:self action:@selector(action:) forControlEvents:UIControlEventTouchUpInside];
This also works and the action: method is called. But here's my problem: the action is called twice. What am I doing wrong? I only call [self sendActionsForControlEvents:UIControlEventTouchDown]; once, yet the target action is called twice. What's wrong with my code?
Sandro Meier

The first call to the action method happens automatically by the event dispatcher once you've called:
[markItem addTarget:self action:@selector(action:) forControlEvents:UIControlEventTouchUpInside];
to register the handler.
So when you then call:
//Touch Down
[self sendActionsForControlEvents:UIControlEventTouchDown];
you are generating the second call to your handler (and any others that are hooked up).
So it seems like you don't need both the action handler and the beginTracking - use one or the other.
Update:
Given your comment and further thought: since you are a subclass of UIControl, I think you probably don't want to be registering for event handlers for yourself.
Instead you should exclusively use:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event;
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event;
- (void)endTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event;
- (void)cancelTrackingWithEvent:(UIEvent *)event; // event may be nil if cancelled for non-event reasons, e.g. removed from window
Also make use of the tracking instance variable.
So I think you should not be posting events or listening to events. Further, is it actually possible to get a beginTrackingWithTouch event if it's not in your view? Doesn't seem like it would be. So I don't think you need the testing to see if it's in your view.
So I think it might be worth stepping back and thinking about what you are trying to do and re-reading UIControl documentation. Specifically:
Subclassing Notes
You may want to extend a UIControl subclass for either of two reasons:
To observe or modify the dispatch of action messages to targets for particular events. To do this, override sendAction:to:forEvent:, evaluate the passed-in selector, target object, or UIControlEvents bit mask, and proceed as required.
To provide custom tracking behavior (for example, to change the highlight appearance). To do this, override one or all of the following methods: beginTrackingWithTouch:withEvent:, continueTrackingWithTouch:withEvent:, endTrackingWithTouch:withEvent:.
The first part is for having your UIControl subclass do non-standard handling of target action processing for clients or users of your control (that doesn't sound like what you are trying to do, though you didn't really give a high-level description).
The second part sounds more like what you are wanting to do - custom tracking within your UIControl subclass.
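For reference, the tracking-only approach the docs describe might look roughly like this. This is a sketch, not the asker's actual control; the highlight handling is purely illustrative:

```objc
// Sketch: a UIControl subclass that relies on the tracking methods and
// lets UIControl perform the target/action dispatch itself.
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    self.highlighted = YES;   // custom visual feedback, if desired
    return YES;               // keep receiving continue/end messages
}

- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [touch locationInView:self];
    self.highlighted = CGRectContainsPoint(self.bounds, location);
    return YES;
}

- (void)endTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    [super endTrackingWithTouch:touch withEvent:event];
    self.highlighted = NO;
    // UIControl sends UIControlEventTouchUpInside (or TouchUpOutside)
    // on its own -- no sendActionsForControlEvents: call is needed here.
}
```

Clients of the control then register with addTarget:action:forControlEvents: as usual, and each event fires exactly once.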

I had this same problem as well. When I would register an action with
[button addTarget:self action:@selector(action:) forControlEvents:UIControlEventTouchDown];
the action method would be fired once, but when I registered it with
[button addTarget:self action:@selector(action:) forControlEvents:UIControlEventTouchUpInside];
the action method would be fired twice.
The thing that fixed it for me was to make sure that my action: method returned IBAction instead of void. I am not sure why this seemed to fix it, though.

Hm... check my code for your aims:
UIContr.h
@interface UIContr : UIControl {
}
@end
UIContr.m
#import "UIContr.h"
@implementation UIContr
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code.
    }
    return self;
}
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"Start");
    CGPoint location = [touch locationInView:self];
    // Note: location is in self's own coordinate space, so test against
    // bounds, not frame (frame is in the superview's coordinates).
    if (CGRectContainsPoint(self.bounds, location)) {
        // Touch Down
        [self sendActionsForControlEvents:UIControlEventTouchDown];
        return YES;
    }
    else {
        return NO;
    }
}
- (void)dealloc {
    [super dealloc];
}
@end
How to use it in a UIViewController:
- (void)viewDidLoad {
    UIContr *c = [[UIContr alloc] initWithFrame:CGRectMake(20, 20, 100, 100)];
    [c addTarget:self action:@selector(action:) forControlEvents:UIControlEventTouchUpInside];
    c.backgroundColor = [UIColor redColor];
    [self.view addSubview:c];
}
- (void)action:(id)sender {
    NSLog(@"123");
}

Related

ScrollView with Images - changing tabs on touchesended does not work

In my app I have a UIScrollView and within that I have programmatically created UIImageViews. This is based off the Tab-bar template in Xcode. What I'm trying to do is to change Tabs when a particular image is pressed, and whilst I'm able to register the event as a touch, changing the tabIndex does nothing.
Typing in the following code:
UITabBarController *tab = self.tabBarController;
if (tab) {
    NSLog(@"Good");
}
else {
    NSLog(@"No Good");
}
always results in "No Good" being logged. The code I've written can be seen here, where my UIScrollView is of type scrollViewer:
@implementation scrollViewer
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.dragging) {
        NSLog(@"Received Touch!");
        UITouch *aTouch = [touches anyObject];
        MainView *theInstance = [[MainView alloc] init];
        CGPoint point = [aTouch locationInView:self];
        theInstance.ycoord = point.y;
        [theInstance touchHand];
    }
    [super touchesEnded:touches withEvent:event];
}
@end
Any help is appreciated thanks!
You never gave us any sample output for:
NSLog(@"Y Coord = %d", adjustedy);
but I will assume the value should cause the switch.
So first you need to log self.tabBarController to be sure it is not nil. If it is fine, the system may not want to let you switch tabs from inside touch events (it's bad form anyway). I would do the setting like this:
dispatch_async(dispatch_get_main_queue(), ^{ /* do the work here */ });
so it runs later.
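As a sketch (assuming the view controller actually sits inside a tab bar controller, and that index 1 is the tab you want; both are assumptions), the deferred switch could look like:

```objc
// Hypothetical sketch: defer the tab switch until after the touch event
// has been fully processed. The target index (1) is an assumption.
- (void)switchToOtherTab {
    UITabBarController *tab = self.tabBarController;
    if (tab == nil) {
        NSLog(@"tabBarController is nil -- not embedded in a tab bar controller");
        return;
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        tab.selectedIndex = 1;  // runs on the next main-queue pass
    });
}
```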

Why won't my Cocos2d test app fire "touchesBegan" events?

In my app delegate, I made sure I have the line:
[glView setMultipleTouchEnabled: YES];
And I have a simple layer meant only to figure out how multi touch works. The .mm file looks like:
#import "TestLayer.h"
@implementation TestLayer
- (id)init
{
    if ((self = [super init])) {
        [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
    }
    return self;
}
- (void)draw {
    [super draw];
    glColor4f(1.0, 0.0, 0.0, 0.35);
    glLineWidth(6.0f);
    ccDrawCircle(ccp(500, 500), 250, CC_DEGREES_TO_RADIANS(360), 60, YES);
}
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"got some touches");
}
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"some touches moved.");
}
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    NSLog(@"a touch began");
    return FALSE;
}
@end
When I touch the screen, I always see "a touch began", but no matter how I touch it (simulator or actual device), I never see "some touches moved" or "got some touches".
Is there something further I need to do to make multi touch work?
Specifically, I'm just trying to do basic pinch-to-zoom functionality... I heard there is some sort of gesture recognizer for iPhone... does it work for Cocos2d? Would it work even if I can't get simple multi touch events to fire?
UIGestureRecognizers absolutely work for Cocos2D, I personally used them, you just need to add them to the correct view by using:
[[[CCDirector sharedDirector] openGLView] addGestureRecognizer:myGestureRecognizer];
Regarding your touches, I guess you enabled them for the scene you are working in?
scene.isTouchEnabled = YES;
In any case, you shouldn't use the addTargetedDelegate method; take a look here
add self.isTouchEnabled = YES; to your init
and for the gesture recognizers look at the other answer
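For the pinch-to-zoom part, a minimal sketch of wiring up a UIPinchGestureRecognizer to the Cocos2D GL view might look like this (the method names setupPinchRecognizer and handlePinch: are assumptions, not part of any API):

```objc
// Hypothetical sketch: basic pinch detection via UIGestureRecognizer,
// attached to Cocos2D's OpenGL view.
- (void)setupPinchRecognizer
{
    UIPinchGestureRecognizer *pinch =
        [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handlePinch:)];
    [[[CCDirector sharedDirector] openGLView] addGestureRecognizer:pinch];
}

- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    // scale is relative to the fingers' initial distance and resets
    // to 1.0 each time a new pinch gesture begins.
    NSLog(@"pinch scale: %f", recognizer.scale);
}
```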

UIImage not updating when called from another object

I have a UIImageView with a method to change the image it displays. It works fine if the method is triggered from a touch event received by the UIImageView itself, but the same method fails to update the UIImageView if called from another object, which triggered the touch event. I have an NSLog call in the method in question and thus can see that the method is called in both cases. But only in one case can I see the actual change of the image; in the other case the view is not updated.
When it works, I do not need setNeedsDisplay or setNeedsLayout, but that is what I tried to fix the problem: setting needsDisplay and needsLayout on the UIImageView as well as its superview. To no avail. I can see that the image is actually changed if I rotate the device, which causes a refresh, and I see that the UIImageView has indeed changed.
The superview calls the method eventAtLocation: on an OutletCollection using makeObjectsPerformSelector:withObject:
It sure looks like an embarrassing mistake on my part, but I haven't been able to figure it out for hours. I am running out of ideas what to try :-(
Here's the code in question:
- (void)eventAtLocation:(NSValue *)location
{
    CGPoint loc = [self.superview convertPoint:[location CGPointValue] toView:self];
    if ([self pointInside:loc withEvent:nil]) {
        if (!self.isAnimating) {
            [self autoSwapImage];
        }
    }
    else {
        if (self.isAnimating) {
            [self cancelAnimation];
        }
    }
}
- (void)cancelAnimation
{
    self.isAnimating = NO;
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
}
- (void)swapImage
{
    currentImage++;
    if (currentImage > numberOfImages)
        currentImage = 1;
    NSLog(@"set image to %i", currentImage);
    self.image = [UIImage imageNamed:[NSString stringWithFormat:@"img_%i.jpg", currentImage]];
    [self setNeedsDisplay];
}
- (void)autoSwapImage
{
    self.isAnimating = YES;
    [self swapImage];
    [self performSelector:@selector(autoSwapImage) withObject:self afterDelay:0.1];
}
// The following works, but if eventAtLocation: is called from the superview,
// swapImage gets called (-> NSLog output appears) yet the view is not updated
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    //[self autoSwapImage];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    //[NSObject cancelPreviousPerformRequestsWithTarget:self];
}
@end
It looks like you aren't actually setting the UIImageView's image, you are just setting self.image. You need to do something like
self.myImageView.image = [UIImage imageNamed:[NSString stringWithFormat:@"img_%i.jpg", currentImage]];
Also, you'll want to check that [UIImage imageNamed:] is actually returning an image and that it's not nil:
UIImage *image = [UIImage imageNamed:@"someImage"];
if (!image) {
    NSLog(@"%@", @"Image Not Found");
}

objective C: Buttons created from subclass of UIButton class not working

I am creating a subclass of UIButton in order to create my own customized buttons. My code is as follows:
// interface file (subclass of UIButton)
@interface UICustomButton : UIButton
{
    Answer *answer;
    NSString *btnType;
}
@property (nonatomic, retain) Answer *answer;
@property (nonatomic, assign) NSString *btnType;
- (id)initWithAnswer:(Answer *)ans andButtonType:(NSString *)type andFrame:(CGRect)frame;
- (void)buttonPressed;
@end
// Implementation file (.m)
@implementation UICustomButton
@synthesize answer, btnType;
- (id)initWithAnswer:(Answer *)ans andButtonType:(NSString *)type andFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self)
    {
        self = [[UIButton alloc] initWithFrame:CGRectMake(0, 0, frame.size.width, frame.size.height)];
        self.backgroundColor = [UIColor colorWithHexString:@"#E2E4E7"];
    }
    [self addTarget:self action:@selector(buttonPressed) forControlEvents:UIControlStateNormal];
    self.answer = ans;
    self.btnType = type;
    return self;
}
I am facing some issues in getting the above code to work. I have 2 problems
1) The buttons are not responding to the selector method "buttonPressed"
2) I am hitting a runtime error for the lines 'self.answer = ans' and 'self.btnType = type' Stack trace as follows:
-[UIButton setAnswer:]: unrecognized selector sent to instance 0x614ebc0
2011-06-23 00:55:27.038 onethingaday[97355:207] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[UIButton setAnswer:]: unrecognized selector sent to instance 0x614ebc0'
What am I doing wrong here?
This is happening because you are creating a UIButton object, not a UICustomButton, inside the init method when you do
self = [[UIButton alloc] initWithFrame:CGRectMake(0, 0, frame.size.width, frame.size.height)];
Try replacing your init method with:
- (id)initWithAnswer:(Answer *)ans andButtonType:(NSString *)type andFrame:(CGRect)frame
{
    self = [self initWithFrame:CGRectMake(0, 0, frame.size.width, frame.size.height)];
    if (self)
    {
        self.backgroundColor = [UIColor colorWithHexString:@"#E2E4E7"];
        [self addTarget:self action:@selector(buttonPressed) forControlEvents:UIControlEventTouchUpInside];
        self.answer = ans;
        self.btnType = type;
    }
    return self;
}
This will cause self to be a UICustomButton object.
Also, you are passing a UIControlState value where the addTarget:action:forControlEvents: method expects a UIControlEvents value.
You should use one of the values below:
UIControlEventTouchDown
UIControlEventTouchDownRepeat
UIControlEventTouchDragInside
UIControlEventTouchDragOutside
UIControlEventTouchDragEnter
UIControlEventTouchDragExit
UIControlEventTouchUpInside
UIControlEventTouchUpOutside
UIControlEventTouchCancel
EDIT:
Notes on UIButton subclassing
Many references on the web say you should NOT subclass the UIButton class, but not only did nobody say why, what also deeply annoyed me was that the UIButton Class Reference does not say anything about it at all.
If you take the UIWebView Class Reference for example, it explicitly states that you should not subclass UIWebView:
Subclassing Notes
The UIWebView class should not be subclassed.
The big deal with UIButton is that it inherits from UIControl, and a good and simple explanation is in the UIControl Class Reference itself:
Subclassing Notes
You may want to extend a UIControl subclass for either of two reasons:
To observe or modify the dispatch of action messages to targets for particular events
To provide custom tracking behavior (for example, to change the highlight appearance)
So, this means that you CAN subclass a UIButton, but you should be careful on what you are doing. Just subclass it to change its behavior and not its appearance. To modify a UIButton appearance you should use the interface methods provided for that, such as:
setTitle:forState:
setBackgroundImage:forState:
setImage:forState:
References worth reading
The UIView Programming Guide: View and Window Architecture -> Tips for Using Views Effectively -> Do Not Customize Controls by Embedding Subviews
Source: my post here
Not sure this was in the docs before, but anyway these are the current notes on + (id)buttonWithType:(UIButtonType)buttonType...
To me it looks like subclassing is OK as long as you use alloc/init instead of buttonWithType:. I have yet to try it myself, however.
Discussion
This method is a convenience constructor for creating button objects with specific configurations. If you subclass UIButton, this method does not return an instance of your subclass. If you want to create an instance of a specific subclass, you must alloc/init the button directly.
When creating a custom button—that is, a button with the type UIButtonTypeCustom—the frame of the button is set to (0, 0, 0, 0) initially. Before adding the button to your interface, you should update the frame to a more appropriate value.
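Per that note, a minimal sketch of instantiating a subclass directly could look like this (MyButton and buttonTapped: are hypothetical names, not from the question):

```objc
// Hypothetical sketch: alloc/init the subclass directly instead of using
// +buttonWithType:, which would return a plain UIButton.
MyButton *button = [[MyButton alloc] initWithFrame:CGRectMake(0, 0, 120, 44)];
[button setTitle:@"Tap me" forState:UIControlStateNormal];
[button addTarget:self
           action:@selector(buttonTapped:)
 forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:button];
```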
If you want to get notifications when the user is interacting with your buttons, just subclass UIButton and implement these methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesBegan");
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesEnded");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesCancelled");
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesMoved");
}
No init method required.
Edit
This answer reaches back several years, and things have changed - as Apple docs now explicitly mention subclassing and gives some hints.
So the following answer might be irrelevant or wrong for current development and might be ignored if you're interested in the current state of the art.
UIButton is not meant to be subclassed.
You are better off making a category and defining a factory method that delivers your needed button (with proper call to buttonWithType:). initWithFrame: is not the correct way to initialize a button anyway.
// BtnClass.m
#import "BtnClass.h"
@implementation BtnClass
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
    }
    return self;
}
// added custom properties to the button
- (id)initWithCoder:(NSCoder *)aDecoder
{
    NSLog(@"initWithCoder");
    self = [super initWithCoder:aDecoder];
    if (self) {
        // Initialization code
        _numberOfItems = [[UILabel alloc] initWithFrame:CGRectMake(40, 8, 160, 30)];
        _numberOfItems.textAlignment = NSTextAlignmentLeft;
        _numberOfItems.font = [UIFont boldSystemFontOfSize:18.0];
        _numberOfItems.textColor = [UIColor darkGrayColor];
        [self addSubview:_numberOfItems];
        _leftImage = [[UIImageView alloc] initWithFrame:CGRectMake(10, 10, 25, 25)];
        [self addSubview:_leftImage];
        _rightImage = [[UIImageView alloc] initWithFrame:CGRectMake(280, 10, 15, 15)];
        [self addSubview:_rightImage];
        [self setImage:[UIImage imageNamed:@"list-bg2-1.png"] forState:UIControlStateNormal];
        [_rightImage setImage:[UIImage imageNamed:@"carat.png"]];
        self.backgroundColor = [UIColor blackColor];
        if (self.tag == 1) {
            [_leftImage setImage:[UIImage imageNamed:@"notes-icon.png"]];
        }
        if (self.tag == 2) {
            [_leftImage setImage:[UIImage imageNamed:@"photos-icon.png"]];
        }
        if (self.tag == 3) {
            [_leftImage setImage:[UIImage imageNamed:@"videos-icon.png"]];
        }
    }
    return self;
}
// selected method of UIButton
- (void)setSelected:(BOOL)selected
{
    [super setSelected:selected];
    if (selected) {
        [self setImage:nil forState:UIControlStateNormal];
        _numberOfItems.textColor = [UIColor whiteColor];
        [_rightImage setImage:[UIImage imageNamed:@"carat-open.png"]];
        if (self.tag == 1) {
            [_leftImage setImage:[UIImage imageNamed:@"white-notes-icon.png"]];
        }
        else if (self.tag == 2) {
            [_leftImage setImage:[UIImage imageNamed:@"white-photo-icon.png"]];
        }
        else {
            [_leftImage setImage:[UIImage imageNamed:@"white-video-icon.png"]];
        }
    }
    else {
        _numberOfItems.textColor = [UIColor darkGrayColor];
        if (self.tag == 1) {
            [_leftImage setImage:[UIImage imageNamed:@"notes-icon.png"]];
        }
        if (self.tag == 2) {
            [_leftImage setImage:[UIImage imageNamed:@"photos-icon.png"]];
        }
        if (self.tag == 3) {
            [_leftImage setImage:[UIImage imageNamed:@"videos-icon.png"]];
        }
        [self setImage:[UIImage imageNamed:@"list-bg2-1.png"] forState:UIControlStateNormal];
        [_rightImage setImage:[UIImage imageNamed:@"carat.png"]];
    }
}
/*
// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect
{
    // Drawing code
}
*/
@end

Allowing interaction with a UIView under another UIView

Is there a simple way of allowing interaction with a button in a UIView that lies under another UIView - where there are no actual objects from the top UIView on top of the button?
For instance, at the moment I have a UIView (A) with an object at the top and an object at the bottom of the screen and nothing in the middle. This sits on top of another UIView that has buttons in the middle (B). However, I cannot seem to interact with the buttons in the middle of B.
I can see the buttons in B - I've set the background of A to clearColor - but the buttons in B do not seem to receive touches despite the fact that there are no objects from A actually on top of those buttons.
EDIT - I still want to be able to interact with the objects in the top UIView
Surely there is a simple way of doing this?
You should create a UIView subclass for your top view and override the following method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // UIView will be "transparent" for touch events if we return NO
    return (point.y < MIDDLE_Y1 || point.y > MIDDLE_Y2);
}
You may also look at the hitTest:event: method.
While many of the answers here will work, I'm a little surprised to see that the most convenient, generic and foolproof answer hasn't been given here. @Ash came closest, except that there is something strange going on with returning the superview... don't do that.
This answer is taken from an answer I gave to a similar question, here.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) return nil;
    return hitView;
}
[super hitTest:point withEvent:event] will return the deepest view in that view's hierarchy that was touched. If hitView == self (i.e. if there is no subview under the touch point), return nil, specifying that this view should not receive the touch. The way the responder chain works means that the view hierarchy above this point will continue to be traversed until a view is found that will respond to the touch. Don't return the superview, as it is not up to this view whether its superview should accept touches or not!
This solution is:
convenient, because it requires no references to any other views/subviews/objects;
generic, because it applies to any view that acts purely as a container for touchable subviews, and the configuration of the subviews does not affect the way it works (as it does if you override pointInside:withEvent: to return a particular touchable area).
foolproof, there's not much code... and the concept isn't difficult to get your head around.
I use this often enough that I have abstracted it into a subclass to save pointless view subclasses for one override. As a bonus, add a property to make it configurable:
@interface ISView : UIView
@property (nonatomic, assign) BOOL onlyRespondToTouchesInSubviews;
@end

@implementation ISView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self && self.onlyRespondToTouchesInSubviews) return nil;
    return hitView;
}
@end
Then go wild and use this view wherever you might use a plain UIView. Configuring it is as simple as setting onlyRespondToTouchesInSubviews to YES.
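For example, using the ISView class above as a pass-through container might look like this (someToolbar is a hypothetical subview; everything else is standard UIKit):

```objc
// Hypothetical usage: a full-screen overlay that only intercepts
// touches landing on one of its subviews.
ISView *overlay = [[ISView alloc] initWithFrame:self.view.bounds];
overlay.onlyRespondToTouchesInSubviews = YES;
[overlay addSubview:someToolbar];   // someToolbar is assumed to exist
[self.view addSubview:overlay];
// Touches outside someToolbar now fall through to the views behind overlay.
```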
There are several ways you could handle this. My favorite is to override hitTest:withEvent: in a view that is a common superview (maybe indirectly) to the conflicting views (sounds like you call these A and B). For example, something like this (here A and B are UIView pointers, where B is the "hidden" one, that is normally ignored):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint pointInB = [B convertPoint:point fromView:self];
    if ([B pointInside:pointInB withEvent:event])
        return B;
    return [super hitTest:point withEvent:event];
}
You could also modify the pointInside:withEvent: method as gyim suggested. This lets you achieve essentially the same result by effectively "poking a hole" in A, at least for touches.
Another approach is event forwarding, which means overriding touchesBegan:withEvent: and similar methods (like touchesMoved:withEvent: etc) to send some touches to a different object than where they first go. For example, in A, you could write something like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([self shouldForwardTouches:touches]) {
        [B touchesBegan:touches withEvent:event];
    }
    else {
        // Do whatever A does with touches.
    }
}
However, this will not always work the way you expect! The main thing is that built-in controls like UIButton will always ignore forwarded touches. Because of this, the first approach is more reliable.
There's a good blog post explaining all this in more detail, along with a small working xcode project to demo the ideas, available here:
http://bynomial.com/blog/?p=74
You have to set upperView.userInteractionEnabled = NO; otherwise the upper view will intercept the touches.
The Interface Builder version of this is a checkbox at the bottom of the View Attributes panel called "User Interaction Enabled". Uncheck it and you should be good to go.
Custom implementation of pointInside:withEvent: indeed seemed like the way to go, but dealing with hard-coded coordinates seemed odd to me. So I ended up checking whether the CGPoint was inside the button CGRect using the CGRectContainsPoint() function:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return !CGRectContainsPoint(disclosureButton.frame, point);
}
Lately I wrote a class that will help me with just that. Using it as a custom class for a UIButton or UIView will pass touch events that were executed on a transparent pixel.
This solution is a somewhat better than the accepted answer because you can still click a UIButton that is under a semi transparent UIView while the non transparent part of the UIView will still respond to touch events.
As you can see in the GIF, the Giraffe button is a simple rectangle but touch events on transparent areas are passed on to the yellow UIButton underneath.
Link to class
I guess I'm a bit late to this party, but I'll add this possible solution:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != self) return hitView;
    return [self superview];
}
If you use this code to override a custom UIView's standard hitTest function, it will ignore ONLY the view itself. Any subviews of that view will return their hits normally, and any hits that would have gone to the view itself are passed up to its superview.
-Ash
Just riffing on the Accepted Answer and putting this here for my reference. The Accepted Answer works perfectly. You can extend it like this to allow your view's subviews to receive the touch, OR pass it on to any views behind us:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // If one of our subviews wants it, return YES
    for (UIView *subview in self.subviews) {
        CGPoint pointInSubview = [subview convertPoint:point fromView:self];
        if ([subview pointInside:pointInSubview withEvent:event]) {
            return YES;
        }
    }
    // otherwise return NO, as if userInteractionEnabled were NO
    return NO;
}
Note: You don't even have to do recursion on the subview tree, because each pointInside:withEvent: method will handle that for you.
This approach is quite clean and ensures that transparent subviews do not react to touches either. Just subclass UIView and add the following method to its implementation:
@implementation PassThroughUIView
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *v in self.subviews) {
        CGPoint localPoint = [v convertPoint:point fromView:self];
        if (v.alpha > 0.01 && ![v isHidden] && v.userInteractionEnabled && [v pointInside:localPoint withEvent:event])
            return YES;
    }
    return NO;
}
@end
Setting the userInteractionEnabled property to NO might help. E.g.:
UIView *topView = [[TOPView alloc] initWithFrame:[self bounds]];
[self addSubview:topView];
[topView setUserInteractionEnabled:NO];
(Note: in the code above, 'self' refers to a view.)
This way, you can still display the topView but it won't get user inputs. All user touches will pass through this view, and the bottom view will respond to them. I'd use this topView for displaying transparent images, or animating them.
My solution here:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint pointInView = [self.toolkitController.toolbar convertPoint:point fromView:self];
    if ([self.toolkitController.toolbar pointInside:pointInView withEvent:event]) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
Hope this helps
There's something you can do to intercept the touch in both views.
Top view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Do code in the top view
    [bottomView touchesBegan:touches withEvent:event]; // And pass them on to bottomView
    // You have to implement touchesBegan, touchesEnded, touchesCancelled in the top/bottom views.
}
But that's the idea.
Here is a Swift version:
override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
    return !CGRectContainsPoint(buttonView.frame, point)
}
Swift 3:
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    for subview in subviews {
        if subview.frame.contains(point) {
            return true
        }
    }
    return false
}
}
I have never built a complete user interface using the UI toolkit, so I don't have much experience with it. Here is what I think should work though.
Every UIView, and thus the UIWindow, has a property subviews, which is an NSArray containing all the subviews.
The first subview you add to a view will receive index 0, the next index 1, and so forth. You can also replace addSubview: with insertSubview:atIndex: or insertSubview:aboveSubview: and similar methods that determine the position of your subview in the hierarchy.
So check your code to see which view you add first to your UIWindow. That will be 0, the other will be 1.
Now, from one of your subviews, to reach another you would do the following:
UIView *theOtherView = [[[self superview] subviews] objectAtIndex:0];
// or using the properties syntax
UIView *theOtherView = [self.superview.subviews objectAtIndex:0];
Let me know if that works for your case!
(below this marker is my previous answer):
If views need to communicate with each other, they should do so via a controller (that is, using the popular MVC model).
When you create a new view, you can make sure it registers itself with a controller.
So the technique is to make sure your views register with a controller (which can store them by name or whatever you prefer in a Dictionary or Array). Either you can have the controller send a message for you, or you can get a reference to the view and communicate with it directly.
If your view doesn't have a link back the controller (which may be the case) then you can make use of singletons and/or class methods to get a reference to your controller.
I think the right way is to use the responder chain built into the view hierarchy.
For your subviews that are pushed onto the main view, do not use the generic UIView; instead subclass UIView (or one of its variants like UIImageView) to make MYView : UIView (or whatever supertype you want, such as UIImageView). In the implementation for MYView, implement the touchesBegan method. This method will then get invoked when that view is touched. All you need in that implementation is an instance method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // cannot handle this event; pass it off to super
    [self.superview touchesBegan:touches withEvent:event];
}
touchesBegan is a responder API, so you don't need to declare it in your public or private interface; it's one of those magic APIs you just have to know about. The self.superview call will bubble the request up, eventually reaching the view controller. In the view controller, then, implement touchesBegan to handle the touch.
Note that the touches location (CGPoint) is automatically adjusted relative to the encompassing view for you as it is bounced up the view hierarchy chain.
Just want to post this, because I had a somewhat similar problem and spent a substantial amount of time trying to implement the answers here without any luck. What I ended up doing:
for (UIGestureRecognizer *recognizer in topView.gestureRecognizers)
{
    recognizer.delegate = self;
    [bottomView addGestureRecognizer:recognizer];
}
topView.abView.userInteractionEnabled = NO;
and implementing UIGestureRecognizerDelegate:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Bottom view was a navigation controller with number of segues and I had sort of a door on top of it that could close with pan gesture. Whole thing was embedded in yet another VC. Worked like a charm. Hope this helps.
Swift 4 implementation of the hitTest-based solution:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let hitView = super.hitTest(point, with: event)
    if hitView == self { return nil }
    return hitView
}
Derived from Stuart's excellent, and mostly foolproof answer, and Segev's useful implementation, here is a Swift 4 package that you can drop into any project:
extension UIColor {
    static func colorOfPoint(point: CGPoint, in view: UIView) -> UIColor {
        var pixel: [CUnsignedChar] = [0, 0, 0, 0]
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8,
                                bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        context!.translateBy(x: -point.x, y: -point.y)
        view.layer.render(in: context!)
        let red   = CGFloat(pixel[0]) / 255.0
        let green = CGFloat(pixel[1]) / 255.0
        let blue  = CGFloat(pixel[2]) / 255.0
        let alpha = CGFloat(pixel[3]) / 255.0
        return UIColor(red: red, green: green, blue: blue, alpha: alpha)
    }
}
And then with hitTest:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    guard UIColor.colorOfPoint(point: point, in: self).cgColor.alpha > 0 else { return nil }
    return super.hitTest(point, with: event)
}