This is my first question on Stack Overflow, after reading many results from Googling various programming problems.
My question is about iOS development.
Basically I have a button that, when pressed, creates another button on the view. I want the user to be able to drag that button around the form. From searching, I found code that shows how to move a button, but it requires having an outlet connected to the button (so you can set its center property). Since the button has been created programmatically in another method, I can't figure out how to set the center property.
Here's my addButton: method:
-(IBAction)addButton:(id)sender {
    CGRect frame = CGRectMake(5.0, 25.0, 40.0, 40.0);
    UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [button addTarget:self action:@selector(moveButton:) forControlEvents:UIControlEventTouchDragInside];
    [button setTitle:@"Test Button" forState:UIControlStateNormal];
    button.frame = frame;
    [myView addSubview:button];
}
And what I have typed into the touchesMoved: method:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
}
I'm pretty new to iOS development, so I'm not really sure how to solve this.
You should make the second button a property of your class. First declare the property in the .h/.m file, just like Xcode did automatically for the first one. Then, at the end of your example code, add self.secondButtonPropertyNameHere = button;
Once it's a property of the class, any method can set its properties.
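A minimal sketch of that idea, assuming ARC (secondButton is just a placeholder name for whatever you call the property):
// In the class extension or header (hypothetical property name):
@property (nonatomic, strong) UIButton *secondButton;

// At the end of addButton:, keep a reference so other methods can reach the button later:
self.secondButton = button;

// Then, for example, touchesMoved: can reposition it:
self.secondButton.center = location;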
Edit:
Instead of a UIButton property, make an NSMutableArray property. Then just add each new button to the array so you keep a reference to them all. Just remember, if you go to remove one, to call [buttonToRemove removeFromSuperview] and also remove it from the array.
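A rough sketch of that bookkeeping, again assuming ARC and a hypothetical property name buttons:
// Declared as a property on the class (hypothetical name):
@property (nonatomic, strong) NSMutableArray *buttons;

// In addButton:, after [myView addSubview:button]:
if (self.buttons == nil) {
    self.buttons = [NSMutableArray array];
}
[self.buttons addObject:button];

// Removing a button later: take it off screen and drop the reference.
[buttonToRemove removeFromSuperview];
[self.buttons removeObject:buttonToRemove];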
Edit(2):
Your initial question was about making a second button and moving it around. The UIButton property will work well for that, because you're moving one object on the view that receives the touches* events. If I were you, I would stick with this approach until you are comfortable with the touches* events.
When you are ready to move on and manage multiple runtime-created UIViews (in your case, buttons), you have two choices:
1) In the touchesBegan: of your view controller, iterate through your array of buttons to see if the touch fell on one of them, and then track that touch until the touchesEnded: call.
2) Subclass UIView and let it handle its own touches* events.
The second is much easier to manage.
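As a rough illustration of the second choice, a subclass can drag itself by updating its own center in touchesMoved: (the class name and details here are only a sketch, not tested code):
@interface DraggableView : UIView
@end

@implementation DraggableView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Work in the superview's coordinate space so the view follows the finger.
    self.center = [touch locationInView:self.superview];
}

@end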
Edit(3):
You will need to set [button setUserInteractionEnabled:NO] if you wish the superview to handle the touches* events instead of the buttons.
I've done something similar, but in a different way: you can create buttons, then move them around in a rect (actually a scaled-down view of the real, full-sized view) and change their size too, using four sliders and other buttons for fine tuning. It's very simple: just keep setting the frame of the target button to new values. I didn't do it with dragging because the button could get too small for the finger.
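A minimal sketch of one of those slider actions, assuming hypothetical xSlider and targetButton outlets:
// Hypothetical slider action that nudges the selected button's frame horizontally.
- (IBAction)xSliderChanged:(UISlider *)sender {
    CGRect frame = self.targetButton.frame;
    frame.origin.x = sender.value;
    self.targetButton.frame = frame;
}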
Related
I have an array that varies in size. I want to create NSButtons for each element in a Mac OSX Application. So say this is the array: "thing1", "thing2", "thing3", then I want three buttons on the screen titled "button1", "button2", "button3" respectively.
This obviously needs to be done programmatically. I do know how to create buttons programmatically and how to give them a fixed position.
But this cannot be the solution. How would I make the whole thing dynamic, so that no matter how many buttons are created, it won't lead to massive UI issues? (So basically: place all buttons next to one another within a certain container, for example.)
Thank you.
You can create a button programmatically this way:
NSButton *myButton = [[[NSButton alloc] initWithFrame:NSMakeRect(x, y, width, height)] autorelease];
[[windowOutlet contentView] addSubview:myButton];
[myButton setTitle:@"Button title!"];
[myButton setButtonType:NSMomentaryLightButton]; // Set whatever button type you want
[myButton setBezelStyle:NSRoundedBezelStyle];    // Set whatever bezel style you want
[myButton setTarget:self];
[myButton setAction:@selector(buttonPressed)];
-(void)buttonPressed {
    NSLog(@"Button pressed!");
    // Do what you want here...
}
I think you need to use a horizontal NSStackView for this. Each item would be a custom NSView (created from its own nib) with the buttons and other elements as its subviews. For arranging the controls correctly inside each item, use Auto Layout.
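A rough sketch of the stack view idea, assuming ARC and simplified to plain buttons instead of custom item views; windowOutlet is borrowed from the answer above, and the constraints that pin the stack view inside its superview are left out:
NSStackView *stackView = [[NSStackView alloc] initWithFrame:NSZeroRect];
stackView.orientation = NSUserInterfaceLayoutOrientationHorizontal;
stackView.spacing = 8.0;

// One button per element of the array.
for (NSString *title in @[@"thing1", @"thing2", @"thing3"]) {
    NSButton *button = [[NSButton alloc] initWithFrame:NSZeroRect];
    [button setTitle:title];
    [button setBezelStyle:NSRoundedBezelStyle];
    [stackView addView:button inGravity:NSStackViewGravityLeading];
}

stackView.translatesAutoresizingMaskIntoConstraints = NO;
[[windowOutlet contentView] addSubview:stackView];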
For dynamically placing views you should use constraints. This way you can set up relationships between your views, instead of placing them at fixed positions. Check out the documentation over here:
https://developer.apple.com/library/prerelease/mac/documentation/AppKit/Reference/NSLayoutConstraint_Class/
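A loose sketch of doing this in code, assuming ARC, macOS 10.11+ (for layout anchors), and the hypothetical names windowOutlet, container and titles:
NSView *container = [windowOutlet contentView];   // whatever view holds the buttons
NSView *previous = nil;

for (NSString *title in titles) {                 // titles is your array of strings
    NSButton *button = [[NSButton alloc] initWithFrame:NSZeroRect];
    [button setTitle:title];
    button.translatesAutoresizingMaskIntoConstraints = NO;
    [container addSubview:button];

    // Pin each button to the top of the container.
    [button.topAnchor constraintEqualToAnchor:container.topAnchor constant:20.0].active = YES;

    if (previous == nil) {
        // The first button hugs the leading edge.
        [button.leadingAnchor constraintEqualToAnchor:container.leadingAnchor constant:20.0].active = YES;
    } else {
        // Every following button sits 8 points after the previous one.
        [button.leadingAnchor constraintEqualToAnchor:previous.trailingAnchor constant:8.0].active = YES;
    }
    previous = button;
}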
There are a few possible approaches.
1) NSStackView
A new-ish class (appeared in 10.9) which handles horizontal or vertical stacking of subviews.
2) Constraints
Constraints allow you to define views' spatial relationships to one another, but they're most easily managed in Interface Builder and can be a bit complicated to work with programmatically.
3) Setting each button's frame in code
With a fairly straightforward loop, you can position the buttons how you like. This is the most "manual" of the three approaches, but also arguably the most clear and predictable. Here's an example of a method which will position buttons.
- (void)arrangeViews:(NSArray<NSView *> *)views startingAt:(NSPoint)startPt spacing:(NSSize)spacing
{
    NSPoint loc = startPt;
    for (NSView *view in views) {
        NSRect viewFrame = view.frame;
        viewFrame.origin = loc;
        view.frame = viewFrame;
        loc.x += spacing.width;
        loc.y += spacing.height;
    }
}
If you want to arrange 3 buttons vertically, each one 50 pixels apart, then you'd call the method like this:
[self arrangeViews:@[btn0, btn1, btn2] startingAt:NSMakePoint(100, 100) spacing:NSMakeSize(0, -50)];
I'm working on an iPad app that lets you control different things in a prototype of an intelligent house. For example it lets you turn lights on and off. For this, I have made a UIImageView that shows the floor plan of the house and added UIButtons as subviews for each lamp that can be toggled.
As you can see the buttons are placed perfectly on the floor plan using the setFrame method of each UIButton. However, when I rotate the iPad to portrait orientation, the following happens:
The buttons obviously still have the same origin, however it is not relative to the repositioning of the image.
The floor plan image has the following settings for struts and springs:
and has its content mode set to Aspect Fit.
My question is: how do I dynamically reposition each UIButton so that it keeps the same relative position? I figure I have to handle this in the {did/should}AutorotateToInterfaceOrientation delegate method.
It should be noted that the UIImageView is zoomable, and to handle this I have implemented the scrollViewDidZoom: delegate method as follows:
- (void)scrollViewDidZoom:(UIScrollView *)scrollView {
    for (UIView *button in _floorPlanImage.subviews) {
        CGRect oldFrame = button.frame;
        [button.layer setAnchorPoint:CGPointMake(0.5, 1)];
        button.frame = oldFrame;
        button.transform = CGAffineTransformMakeScale(1.0 / scrollView.zoomScale, 1.0 / scrollView.zoomScale);
    }
}
Thank you in advance!
I find the best way to lay out subviews is in the - (void)layoutSubviews method. You will have to subclass your UIImageView and override this method.
This method is called automatically whenever your frame changes, and it is also called the first time your view is presented.
If you put all your layout code in this method, it prevents layout fragmentation and repetition, keeps your view code in your views, and most things just work by default.
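A rough sketch of that idea, using hypothetical FloorPlanImageView, lampButtons and relativePositions names, and ignoring the aspect-fit letterboxing for brevity:
@interface FloorPlanImageView : UIImageView
@property (nonatomic, strong) NSArray *lampButtons;        // UIButton instances
@property (nonatomic, strong) NSArray *relativePositions;  // NSValue-wrapped CGPoints in the 0..1 range
@end

@implementation FloorPlanImageView

- (void)layoutSubviews {
    [super layoutSubviews];
    [self.lampButtons enumerateObjectsUsingBlock:^(UIButton *button, NSUInteger idx, BOOL *stop) {
        // Map the stored relative position back into the view's current bounds,
        // so the buttons keep their place after rotation or resizing.
        CGPoint relative = [self.relativePositions[idx] CGPointValue];
        button.center = CGPointMake(relative.x * CGRectGetWidth(self.bounds),
                                    relative.y * CGRectGetHeight(self.bounds));
    }];
}

@end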
I have a (vertical) UISlider inside a UIScrollview. I'd like to be able to change the value of the slider, and, without lifting my finger, scroll the scrollview left or right.
Desired behavior:
Touch down inside vertical UISlider, followed by a finger drag left or right causes the scrollview to scroll
Actual behavior:
Touch down inside vertical UISlider, followed by a finger drag left or right causes no movement in UIScrollview. A touch down outside the UISlider followed by a drag will scroll the scrollview as expected
UIView has a property called exclusiveTouch which seems as if it might be related to my problem. I tried setting it to NO, with no luck.
So, how can I set up my UISliders so that the scroll view beneath them will respond to touches that originate inside the UISliders?
Have you tried subclassing UIScrollView and implementing - (BOOL)touchesShouldCancelInContentView:(UIView *)view? According to the Apple documentation:
// called before scrolling begins if touches have already been delivered to a subview of the scroll view. if it returns NO the touches will continue to be delivered to the subview and scrolling will not occur
// not called if canCancelContentTouches is NO. default returns YES if view isn't a UIControl
If you simply return NO when the view is your UISlider, this may do what you want, assuming your UIScrollView only scrolls horizontally. If this doesn't work, you will likely have to do custom touch handling (i.e. overriding touchesBegan:withEvent:, touchesMoved:withEvent:, etc.) for both your UIScrollView and your UISlider.
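A minimal sketch of that subclass (the class name is just an example):
@interface SliderFriendlyScrollView : UIScrollView
@end

@implementation SliderFriendlyScrollView

- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    if ([view isKindOfClass:[UISlider class]]) {
        return NO;   // keep delivering the touches to the slider
    }
    return [super touchesShouldCancelInContentView:view];
}

@end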
What you are seeing is the intended behavior.
Each touch event only gets handled by one control. What exclusiveTouch does is actually to prevent other touch events from being delivered to other views.
To do what you are trying to do, you would have to do some of the touch handling yourself and pass the events to both of your views. You could either implement the touchesBegan:, touchesMoved:, etc. methods and forward the events to both views (you can read more about that approach in the UIResponder documentation), or do the event handling in a UIGestureRecognizer on the scroll view that hit-tests the slider and updates the value of the slider using the y-delta. You can read more about gesture recognizers and event handling in the section about Gesture Recognizers in the Event Handling Guide for iOS.
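A rough, untested sketch of the gesture recognizer approach, assuming a pan recognizer attached to the scroll view and a slider outlet (and using the slider's bounds height as the track length, which would need adjusting if the slider is rotated with a transform):
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    // Only drive the slider when the finger is over it; otherwise do nothing here
    // and let the scroll view scroll as usual.
    CGPoint location = [pan locationInView:self.slider];
    if (![self.slider pointInside:location withEvent:nil]) {
        return;
    }

    // Translate the vertical finger movement into a change of the slider's value.
    CGPoint translation = [pan translationInView:self.slider];
    float range = self.slider.maximumValue - self.slider.minimumValue;
    float delta = -translation.y / CGRectGetHeight(self.slider.bounds) * range;
    [self.slider setValue:self.slider.value + delta animated:NO];
    [pan setTranslation:CGPointZero inView:self.slider];
}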
Side note:
Go to the Settings app and toggle a switch half way (for example the Airplane mode toggle) and then drag down. Nothing will happen. The rest of the OS behaves the same way. Are you sure that this is the interaction that you really want to do? Apps that behave differently often feel weird and unfamiliar.
Your question confused me a bit. You are saying a vertical slider - but dragging left and right?
If you wish to scroll the scrollview when dragging the UISlider, the proper way to do so is
[mySlider addTarget:self action:@selector(sliderMoved:) forControlEvents:UIControlEventValueChanged];
and
- (void)sliderMoved:(UISlider *)slider {
    CGPoint offset = myScrollView.contentOffset;
    offset.x = slider.value * (myScrollView.contentSize.width - myScrollView.bounds.size.width);
    myScrollView.contentOffset = offset;
}
Hope this is what you want.
You need to set delaysContentTouches to NO on the scroll view and decide, in a UIScrollView subclass, whether a touch that lands on a UISlider should begin on the slider or be left to the scroll view. Check the code below.
myScrollView.delaysContentTouches = NO; // delaysContentTouches is a UIScrollView property
- (BOOL)touchesShouldBegin:(NSSet *)touches withEvent:(UIEvent *)event inContentView:(UIView *)view {
    if ([view isKindOfClass:[UISlider class]]) {
        UITouch *touchEvent = [[event allTouches] anyObject];
        CGPoint locationEvent = [touchEvent locationInView:view];

        UISlider *mySlide = (UISlider *)view;
        CGRect trackRect = [mySlide trackRectForBounds:mySlide.bounds];
        CGRect thumbRect = [mySlide thumbRectForBounds:mySlide.bounds trackRect:trackRect value:mySlide.value];

        // Let the slider take the touch only when it lands on the thumb.
        if (CGRectContainsPoint(thumbRect, locationEvent)) {
            return YES;
        }
    }
    // Otherwise let the scroll view handle the drag.
    return NO;
}
I think you can get some guidance from this example; it shows how to cancel touches or gesture recognizers and hand them over to other views.
Maybe this will lead you to a solution of your problem; if it does, just let me know.
Happy Coding :)
In my program I have configured a UIView to be placed over a number of UIButtons. This is so you can pan across the view and have the buttons move along with it, as well as select the buttons themselves. However, it is not possible to select the buttons when they are underneath this UIView. Does anybody know how to configure a UIButton's userInteractionEnabled under these conditions?
This technique has worked for me. Caveat: I just typed this in, it's not compiled or anything; it's given as a starting point.
Create the buttons and place them at the top of the view hierarchy (i.e. where they interact normally and respond to taps, on top of the UIView)
When creating the buttons - keep a reference to them in an array.
NSMutableArray* arrayOfPeskyButtons;
Keep a record of the button frames
NSMutableArray* arrayOfPeskyFrames = [NSMutableArray array];
for (UIButton *button in arrayOfPeskyButtons)
{
    [arrayOfPeskyFrames addObject:[NSValue valueWithCGRect:button.frame]];
}
When your app responds to an action on the UIView (panning, or if you change it to a UIScrollView, which is probably best by the way), move the frames of the buttons correspondingly so that they appear fixed in place to the user.
I have used this technique to create 'overlays' of buttons which sit on top of scroll views, and it works surprisingly well. Simply, in the scrollView delegate method
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
take note of the scroll view's offset and adjust each button's frame accordingly, depending on the offset of the scrollView:
CGPoint offset = scrollView.contentOffset;
Iterate through the array of buttons and move them.
int index = 0;
for (UIButton *button in arrayOfPeskyButtons)
{
    CGRect originalFrame = [[arrayOfPeskyFrames objectAtIndex:index] CGRectValue];
    CGRect currentFrame = button.frame;
    // adjust the frame, e.g.
    currentFrame.origin.x = originalFrame.origin.x + offset.x;
    button.frame = currentFrame;
    index++;
}
Instead of changing the value of the UIButton's userInteractionEnabled, you should set the userInteractionEnabled value of the UIView on top of the UIButton to NO.
When the UIView's userInteractionEnabled is set to YES, the UIView 'eats up' the events, so the button never receives any of them.
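In code that is a single line on the covering view (overlayView is just a placeholder name for it):
// Let touches pass through the covering view down to the buttons underneath.
overlayView.userInteractionEnabled = NO;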
In one of my iPhone projects, I have three views that you can move around by touching and dragging. However, I want to stop the user from moving two views at the same time, by using two fingers. I have therefore tried to experiment with UIView.exclusiveTouch, without any success.
To understand how the property works, I created a brand new project, with the following code in the view controller:
- (void)loadView {
    self.view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)];

    UIButton *a = [UIButton buttonWithType:UIButtonTypeInfoDark];
    [a addTarget:self action:@selector(hej:) forControlEvents:UIControlEventTouchUpInside];
    a.center = CGPointMake(50, 50);
    a.multipleTouchEnabled = YES;

    UIButton *b = [UIButton buttonWithType:UIButtonTypeInfoDark];
    [b addTarget:self action:@selector(hej:) forControlEvents:UIControlEventTouchUpInside];
    b.center = CGPointMake(200, 50);
    b.multipleTouchEnabled = YES;

    a.exclusiveTouch = YES;

    [self.view addSubview:a];
    [self.view addSubview:b];
}
- (void)hej:(id)sender
{
    NSLog(@"hej: %@", sender);
}
When running this, hej: gets called, with different senders, when pressing either of the buttons, even though one of them has exclusiveTouch set to YES. I've tried commenting out the multipleTouchEnabled lines, to no avail. Can somebody explain what I'm missing here?
Thanks,
Eli
From The iPhone OS Programming Guide:
Restricting event delivery to a single view:
By default, a view's exclusiveTouch property is set to NO. If you set the property to YES, you mark the view so that, if it is tracking touches, it is the only view in the window that is tracking touches. Other views in the window cannot receive those touches. However, a view that is marked "exclusive touch" does not receive touches that are associated with other views in the same window. If a finger contacts an exclusive-touch view, then that touch is delivered only if that view is the only view tracking a finger in that window. If a finger touches a non-exclusive view, then that touch is delivered only if there is not another finger tracking in an exclusive-touch view.
It states that the exclusive touch property does NOT affect touches outside the frame of the view.
To handle this in the past, I have used the main view to track ALL TOUCHES on screen instead of letting each subview track touches. The best way is to do:
if (CGRectContainsPoint(thesubviewIcareAbout.frame, theLocationOfTheTouch)) {
    // the subview has been touched, do what you want
}
I was encountering an issue like this where taps on my UIButtons were getting passed through to a tap gesture recognizer that I had attached to self.view, even though I was setting isExclusiveTouch to true on my UIButtons. Upon reviewing the materials here so far, I decided to put some code in my tap gesture code that checks if the tap location is contained in any UIButton frame and if that frame is also visible on the screen at the same time. If both of those conditions are true, then the UIButton will already have handled the tap, and the event triggered in my gesture recognizer can then be ignored as a pass through of the event. My logic allows me to loop over all subviews, checking if they are of type UIButton, and then checking if the tap was in that view and the view is visible.
@objc func singleTapped(tap: UITapGestureRecognizer)
{
    anyControlsBreakpoint()
    let tapPoint = tap.location(in: self.view)
    // Prevent taps inside of buttons from passing through to singleTapped logic
    for subview in self.view.subviews
    {
        if subview is UIButton {
            if pointIsInFrameAndThatFrameIsVisible(view: subview, point: tapPoint)
            {
                return // Completely ignores pass through events that were already handled by a UIButton
            }
        }
    }
    // ... rest of the single-tap handling ...
}
Below is the code that checks whether the point was inside a visible button. Note that I hide my buttons by setting their alpha to zero; if you are using the isHidden property, your logic might need to check for that instead.
func pointIsInFrameAndThatFrameIsVisible(view: UIView, point: CGPoint) -> Bool
{
    return view.frame.contains(point) && view.alpha == 1
}