UIButton does not remove target for UIControlEventTouchDragInside - objective-c

I'm having some problems with removing targets from a UIButton. Basically I have a calendar and want to be able to move around a button representing an event saved in the calendar. I start with:
[self.chosenButton addTarget:self action:@selector(dragMoving:withEvent:) forControlEvents:UIControlEventTouchDragInside];
and then after the moving is done I call
[self.chosenButton removeTarget:nil action:NULL forControlEvents:UIControlEventTouchDragInside];
After that, however, I can still move the button around even though it should remain still. In the dragMoving:withEvent: method I only set the button's coordinates from the touch point and check that they are valid (within the screen, etc.).
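For context, a drag handler of this kind typically looks something like the sketch below (a hypothetical body, since the actual implementation isn't shown in the question; it assumes the handler lives in the view controller that owns the button):
- (void)dragMoving:(UIButton *)button withEvent:(UIEvent *)event
{
    // Move the button to follow the touch, clamped to the visible area.
    UITouch *touch = [[event touchesForView:button] anyObject];
    CGPoint point = [touch locationInView:self.view];
    CGRect bounds = self.view.bounds;
    point.x = MAX(CGRectGetMinX(bounds), MIN(point.x, CGRectGetMaxX(bounds)));
    point.y = MAX(CGRectGetMinY(bounds), MIN(point.y, CGRectGetMaxY(bounds)));
    button.center = point;
}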
Any idea why dragMoving:withEvent: is still being called?
Thanks

It's because you don't actually remove the target. Try:
[self.chosenButton removeTarget:self action:@selector(dragMoving:withEvent:) forControlEvents:UIControlEventTouchDragInside];

Related

NSTextView sometimes gets grayed out and can't be interacted with

I have an NSView inside my "menubar". I have a button which I click to add a new NSMenuItem to the menu. However, when I run this code, which is inside my custom view's init method, the view sometimes gets grayed out and I'm unable to select it. Any ideas what the cause of this could be? The problem seems to affect all NSMenuItems created after it first occurs.
- (id)initWithFrame:(NSRect)frameRect andTag:(int)tagz
{
    self = [super initWithFrame:frameRect];
    if (self) {
        textfeild = [[NSTextView alloc] initWithFrame:NSMakeRect(19, 1, 110, 18)];
        [textfeild setFont:[NSFont fontWithName:@"Helvetica" size:15]];
        [textfeild setString:[NSString stringWithFormat:@"Notespace %d", tagz]];
        [textfeild selectAll:self];
        [self addSubview:textfeild];
    }
    return self;
}
I don't think you are supposed to use [self addSubview:] on a menu item. Instead, you are supposed to create an NSView and place your custom view inside it. Documentation is here: https://developer.apple.com/library/mac/documentation/Cocoa/Conceptual/MenuList/Articles/ViewsInMenuItems.html
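For reference, the documented approach looks roughly like this (a minimal sketch with hypothetical names; MyNoteView stands in for the asker's custom view class, and statusMenu for whatever NSMenu the status item shows):
NSMenuItem *item = [[NSMenuItem alloc] initWithTitle:@"" action:NULL keyEquivalent:@""];
// MyNoteView is a hypothetical stand-in for the custom view from the question.
MyNoteView *noteView = [[MyNoteView alloc] initWithFrame:NSMakeRect(0, 0, 150, 20) andTag:1];
[item setView:noteView];   // the menu item now hosts and draws the custom view
[statusMenu addItem:item]; // statusMenu is whatever NSMenu the status item shows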
However, according to the documentation, views inside a menu item may not receive keyboard events. It may work sometimes, but since the documentation specifically mentions it, you should not place any view that needs keyboard events inside a menu.
What you should do instead is avoid NSMenu altogether and simply create an NSWindow where the menu would normally appear. You can make your window look and behave like a menu.

iOS - How to control UIBezierPath drawing with UISlider?

I want to control the drawing of my UIBezierPath using a UISlider.
The aim is to be able to scrub the timeline (UISlider) and cause the line (UIBezierPath) to draw based on the current position of the slider.
So when I scrub forward the path renders forward, and when I scrub back the path renders backwards.
I have code to render the path drawing in both directions, based on this sample code to which I added a reverse path (https://github.com/ole/Animated-Paths).
The main thing I'm struggling to understand is how I can have the path render when I move the slider, since I can't really use animations as they would not update with the slider's movement.
Cheers
Damien
If you are tracking the "Control Events" of the UISlider, for example like this:
[yourSlider addTarget:self action:@selector(seekBegan:) forControlEvents:UIControlEventTouchDown];
[yourSlider addTarget:self action:@selector(seekContinues:) forControlEvents:UIControlEventTouchDragInside];
[yourSlider addTarget:self action:@selector(seekEnded:) forControlEvents:UIControlEventTouchUpInside];
... you can then do your animations inside seekContinues:. I can't say how smooth it will be, though, as there may be a lot of calls to that method.
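A seekContinues: handler might then look something like this (a sketch; updatePathToProgress: is a hypothetical method standing in for whatever code updates your drawing):
- (void)seekContinues:(UISlider *)slider
{
    // Read the slider position (assuming minimumValue 0.0 and maximumValue 1.0)
    // and hand it to whatever redraws the path.
    CGFloat progress = slider.value;
    [self updatePathToProgress:progress]; // hypothetical helper
}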
So I found that I was unable to use animations to achieve what I wanted, so I used a custom UIView subclass and implemented the drawing code in its drawRect: method.
Basically I set an index in the subclass that controls how many of the points the drawRect: method draws.
This gave me the desired result and allowed me to drive the value from the slider.
For the drawing I used CGContextMoveToPoint and CGContextAddLineToPoint to render the paths.
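A minimal sketch of that approach (hypothetical class and property names, not the poster's actual code) could look like this:
@interface ScrubbablePathView : UIView
@property (nonatomic, strong) NSArray *points;       // NSValue-wrapped CGPoints making up the path
@property (nonatomic, assign) NSUInteger pointIndex; // how many points to draw
@end

@implementation ScrubbablePathView

- (void)setPointIndex:(NSUInteger)pointIndex
{
    _pointIndex = pointIndex;
    [self setNeedsDisplay]; // redraw whenever the slider changes the index
}

- (void)drawRect:(CGRect)rect
{
    if (self.points.count < 2 || self.pointIndex < 2) {
        return;
    }
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextSetLineWidth(ctx, 2.0);

    // Start at the first point, then draw line segments up to the current index.
    CGPoint start = [self.points[0] CGPointValue];
    CGContextMoveToPoint(ctx, start.x, start.y);

    NSUInteger limit = MIN(self.pointIndex, self.points.count);
    for (NSUInteger i = 1; i < limit; i++) {
        CGPoint p = [self.points[i] CGPointValue];
        CGContextAddLineToPoint(ctx, p.x, p.y);
    }
    CGContextStrokePath(ctx);
}

@end
The slider's action handler would then set something like pathView.pointIndex = (NSUInteger)(slider.value * pathView.points.count).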
Cheers

Clicking through NSWindow/CALayer

So I'm working on an issue I have when trying to do some simple animation using CAKeyframeAnimation, and I believe my problem is more related to not fully understanding how NSWindow, NSView, and CALayer work together. 
I have two main objects in question: MyContainerWindow (an NSWindow subclass) and MyMovableView (an NSView subclass). My goal is to be able to animate MyMovableView back and forth across the screen, while maintaining the ability to click on anything through MyContainerWindow unless you are clicking wherever MyMovableView is. I am able to accomplish the first part fine by calling -addAnimation:forKey: on myMovableView.layer, and everything is great except that I can't click through MyContainerWindow. I could make the window smaller, but then the animation would be clipped by the bounds of the window.
Important points: 
1) MyContainerWindow is initialized with [[NSScreen mainScreen] frame], NSBorderlessWindowMask, buffered backing, and defer NO
2) I call setWantsLayer:YES on MyMovableView
3) MyContainerWindow is clear, and I want it to behave as if there weren't a window at all, but I need it so that I have a larger canvas to animate on.
Is there something obvious I'm missing to be able to click through an NSWindow?
Thanks in advance!
My solution in this scenario was actually to use:
[self setIgnoresMouseEvents:YES];
I was originally hoping to retain mouse events on the specific CALayer I'm animating, but after some further research I understand that this comes at the cost of custom-drawing everything from scratch, which is not ideal for this particular project.
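For reference, a click-through overlay window of this kind is typically set up along these lines (a minimal sketch created in code; the poster's actual setup may differ):
NSWindow *overlay = [[NSWindow alloc] initWithContentRect:[[NSScreen mainScreen] frame]
                                                styleMask:NSBorderlessWindowMask
                                                  backing:NSBackingStoreBuffered
                                                    defer:NO];
[overlay setOpaque:NO];
[overlay setBackgroundColor:[NSColor clearColor]]; // invisible canvas for the animation
[overlay setLevel:NSFloatingWindowLevel];          // keep it above normal windows
[overlay setIgnoresMouseEvents:YES];               // clicks fall through to whatever is underneath
[overlay orderFront:nil];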

UIButton firing selector inconsistently

I have a UIButton linked up in IB correctly (I believe). The button fires inconsistently: every time I reload the view to show updated info, the button sometimes works and sometimes doesn't. It gives no errors. I can't find a pattern to when it works and when it doesn't; the same code runs every time I open the view, and it still only works when it wants to. Besides linking it in IB, I have also tried calling addTarget in viewDidLoad and removing the IB connection, but it still has the same inconsistency:
[_buttonScreen addTarget:self action:@selector(buttonScreenClicked) forControlEvents:UIControlEventTouchUpInside];
If I add NSLog(@"Clicked"); to buttonScreenClicked, I can see that the method doesn't always get called. What would cause it to do this? I have made sure that I set:
[_buttonScreen setAlpha:0.1];
[_buttonScreen setHidden:NO];
[_buttonScreen setUserInteractionEnabled:YES];
I have no image, text, or color in the button, but it still works sometimes.
I'm using AFKPageFlipper on the same view, but it had the same problem before I added AFKPageFlipper, so I don't think it's that.
If anyone could point me in a direction to start troubleshooting this problem, I would appreciate it.
Thanks
I just had the same problem and worked it out. The 5 seconds is the clue.
Somewhere you have a gesture recognizer covering the same space as your button. More specifically, you have a gesture recognizer that is eating your taps but not your long presses. If you just tap the button, the gesture recognizer runs off with your event; hold your finger down long enough, and the gesture recognizer no longer considers it a tap, so the event is passed through to your button.
Instrument your tap gesture recognizer handlers and the problem should pop out at you.
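If that turns out to be the cause, one common fix (a sketch, assuming you own the tap recognizer and have set yourself as its delegate) is to tell the recognizer to ignore touches that land on controls:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch
{
    // Don't let the tap recognizer eat touches on buttons and other controls.
    if ([touch.view isKindOfClass:[UIControl class]]) {
        return NO;
    }
    return YES;
}
Setting cancelsTouchesInView to NO on the recognizer is another option, depending on the behaviour you want.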
Make sure you don't have any other UIView descendants overlaying the button (like a transparent UIScrollView), as these will intercept the touch events first.
Also make sure that the containing view (the view the button is in) is correctly sized. By default you can place a view outside the bounds of its superview, and since clipsToBounds is false by default you will see it but not be able to interact with it.
Things to try:
Do you have any other actions on the button?
Do you have any other UIViews which could possibly be accepting the touches (above or below, or not shown)?
Also, please check that you have only one UIViewController instance for this screen. Other issues may arise because of that.
What happens if you don't set the alpha level?
Do you release the object properly, and only in dealloc?
Hope this helps

Cocoa window position anomaly

I have a weird problem with positioning a window on screen. I want to center the window on the screen, but I don't know how to do that. Here's what I've got. The window is created from a nib by the main controller:
IdentFormController *ftf = [[IdentFormController alloc] initWithWindowNibName:@"IdentForm"];
[[ftf window] makeKeyAndOrderFront:self];
Now IdentFormController has an awakeFromNib method in which it tries to position the window. For the sake of simplicity I've just tried calling setFrameOrigin: with NSMakePoint(0, 0). What happens is as follows:
The first time I create this window, everything works as expected. But if I create it again after releasing the previous one, it starts appearing at random positions. Why does it do that?
So, as I understand it, you want to center the window on the screen?
Well, assuming NSWindow *window is your window object, there are two methods...
[window center];
This is the best way to do it, but it will offset the window to take into account visual weight and the Dock's presence.
If you want dead center then this would work...
// Calculate the actual center
CGFloat x = (window.screen.frame.size.width - window.frame.size.width) / 2;
CGFloat y = (window.screen.frame.size.height - window.frame.size.height) / 2;
// Create a rect to send to the window
NSRect newFrame = NSMakeRect(x, y, window.frame.size.width, window.frame.size.height);
// Send message to the window to resize/relocate
[window setFrame:newFrame display:YES animate:NO];
This code is untested, but it gives you a fair idea of what you need to do to get this working the way you want. Personally, I would advise you to stick with Apple's [window center] because it has been tested and is what the user would expect to see. Also, from a design perspective, as a designer myself I don't always rely on the actual center being where the optical center is.
You're probably running afoul of automatic window positioning. Have you tried calling
[myWindowController setShouldCascadeWindows:NO];
?
First of all, it sounds like you need to check "dealloc on close" or "release on close" in the NSWindow's property inspector. Then the window will clean up after itself and you can remove the (risky) call to [self release] in your own code.
awakeFromNib is called after all objects from the nib have been unarchived and outlets have been connected, but that may be too early to be setting the window coordinates. I believe Cocoa does some work to automatically position subsequent windows below and to the right of existing windows, so that new windows don't completely obscure old ones. It is likely doing this AFTER you set the position in awakeFromNib, stomping on your changes.
The best place to set your window position is probably in one of the NSWindow delegate methods (windowWillBecomeVisible: perhaps), or possibly right before you call makeKeyAndOrderFront:.
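Putting those suggestions together, a minimal sketch (reusing the names from the question; the exact timing and positioning call are assumptions) might look like this:
IdentFormController *ftf = [[IdentFormController alloc] initWithWindowNibName:@"IdentForm"];
[ftf setShouldCascadeWindows:NO];          // stop Cocoa from offsetting each new window
[[ftf window] center];                     // or [[ftf window] setFrameOrigin:NSMakePoint(0, 0)];
[[ftf window] makeKeyAndOrderFront:self];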
Check whether you can align the centre of your window with the centre of your screen, and set the window position from that. It might work out.