Unable to push the VIEW - objective-c

I have a Dashboard view with labels and image views drawn on it.
In my application, I want to push a view that lives in StaticPage.h when I touch any of the image views.
I used UITouch, which has the method "- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event", as follows:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    NSLog(@"touches began");
    if ([touch view] == _icon) {
        StaticPage *staticPage = [[StaticPage alloc] initWithNibName:@"StaticPage" bundle:nil];
        [self.navigationController pushViewController:staticPage animated:YES];
        [staticPage release];
    }
}
where StaticPage inherits from UIViewController.
_icon is a UIImageView.
In the AppDelegate, the method "- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions" looks like this:
self.window.rootViewController = self.viewController;
[_window addSubview:navController.view];
[self.window makeKeyAndVisible];
return YES;
where navController is declared in StaticPage's .h file as UINavigationController *navController; and synthesized in StaticPage's .m file.
I wonder why my view from StaticPage is not getting pushed?
Any Help... Thank You in Advance

I figured out the navigation, but not with pushViewController:animated:.
Using presentModalViewController:animated: I can navigate to the desired page.
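For reference, pushViewController:animated: only works when the view controller doing the pushing is already inside a navigation stack; otherwise self.navigationController is nil and the push silently does nothing. A minimal sketch of an app delegate setup that would make the original push work (assuming self.viewController is the dashboard controller from the question, and using manual retain/release to match the question's code):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Wrap the dashboard controller in a navigation controller so that
    // self.navigationController is non-nil inside it.
    UINavigationController *nav = [[UINavigationController alloc]
        initWithRootViewController:self.viewController];
    self.window.rootViewController = nav;
    [nav release];
    [self.window makeKeyAndVisible];
    return YES;
}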


How to add 3D Touch force to a UIButton?

I want to measure the touch force when tapping or dragging a button. I have created a UITapGestureRecognizer (for tapping) and added it to myButton like this:
UITapGestureRecognizer *tapRecognizer2 = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(buttonPressed:)];
[tapRecognizer2 setNumberOfTapsRequired:1];
[tapRecognizer2 setDelegate:self];
[myButton addGestureRecognizer:tapRecognizer2];
I have created a method called buttonPressed like this:
- (void)buttonPressed:(id)sender
{
    [myButton touchesMoved:touches withEvent:event];
    myButton = (UIButton *)sender;
    UITouch *touch = [[event touchesForView:myButton] anyObject];
    CGFloat force = touch.force;
    forceString = [[NSString alloc] initWithFormat:@"%f", force];
    NSLog(@"forceString in imagePressed is : %@", forceString);
}
I keep getting zero values (0.0000) for the touch force. Any help or advice would be appreciated. I did a search and found the DFContinuousForceTouchGestureRecognizer sample project, but found it too complicated. I use an iPhone 6s Plus, which has 3D Touch. I can also measure the force when tapping anywhere else on the screen, just not on buttons, using this code:
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    //CGFloat maximumPossibleForce = touch.maximumPossibleForce;
    CGFloat force = touch.force;
    forceString = [[NSString alloc] initWithFormat:@"%f", force];
    NSLog(@"forceString is : %@", forceString);
}
You are getting 0.0000 in buttonPressed because the user has already lifted their finger by the time it is called.
You are right that you need to read the force in the touchesMoved method, but you need to read it in the UIButton's own touchesMoved method. To do that, subclass UIButton and override touchesMoved:
Header file:
#import <UIKit/UIKit.h>
@interface ForceButton : UIButton
@end
Implementation:
#import "ForceButton.h"
@implementation ForceButton

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGFloat force = touch.force;
    CGFloat relativeForce = touch.force / touch.maximumPossibleForce;
    NSLog(@"force: %f, relative force: %f", force, relativeForce);
}

@end
Also, there is no need to use a UITapGestureRecognizer to detect a single tap on a UIButton. Just use addTarget instead.
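As a sketch of that last point (reusing the question's myButton and buttonPressed: names):
// Plain target/action instead of a tap gesture recognizer:
[myButton addTarget:self
             action:@selector(buttonPressed:)
   forControlEvents:UIControlEventTouchUpInside];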

Xcode keyboard issue on scroll view

I have a scroll view on top of a single view, with some text fields and UIPickers on it. I know how to dismiss the keyboard when Return is pressed, but I am trying to dismiss the keyboard from the text field when the background is tapped or a UIPicker is selected. I tried this:
Interface :
- (IBAction)textFieldReturn:(id)sender;
- (IBAction)backgroundTouched:(id)sender;
Implementation :
- (IBAction)textFieldReturn:(id)sender
{
    [sender resignFirstResponder];
}

- (IBAction)backgroundTouched:(id)sender
{
    [textField resignFirstResponder];
}
But the problem is I can't set the scroll view's class to a control type to wire this up.
Try it like this; it may help you (not sure):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if (![[touch view] isKindOfClass:[UITextField class]]) {
        [yourtextfield resignFirstResponder];
    }
}
And to get touch events on the scroll view, you have to use a gesture recognizer:
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTap)];
[scroll addGestureRecognizer:singleTap];

- (void)singleTap {
    [text resignFirstResponder];
    //write whatever else you want here.
}
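A simpler variant of that tap handler (a sketch, not part of the original answer) avoids keeping a reference to a specific text field by asking the view to end editing:
- (void)singleTap {
    // Asks the view (and all of its subviews) to resign first responder,
    // which dismisses the keyboard no matter which text field is active.
    [self.view endEditing:YES];
}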

Getting touches for a view with subview

I want to have a top view (a subclass of UIView) that catches touches using
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
...
.h
@interface XView : UIView <UITableViewDataSource, UITableViewDelegate>
@property (nonatomic, strong) UITableView *tableView;
@end
This works fine if the view is empty, but as soon as I insert (addSubview) let's say a UITableView,
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        self.tableView = [[UITableView alloc] initWithFrame:self.frame];
        self.tableView.delegate = self;
        self.tableView.dataSource = self;
        [self addSubview:self.tableView];
    }
    return self;
}
then the touch methods inside XView are not triggered.
You would either need to subclass your UITableView or use a UITapGestureRecognizer to solve this. I'd personally go with the UITapGestureRecognizer.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapCode:)];
[self.tableView addGestureRecognizer:tap];
and then you can handle your tap here:
- (void)tapCode:(UITapGestureRecognizer *)recognizer
{
    // code goes here
}
If you subclass UITableView, you can put your touchesBegan:withEvent: in there.
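A minimal sketch of that subclassing route (the class name TouchTableView is made up here); it lets the table view handle the touch and also hands it on to the next responder so the containing XView still sees it:
@interface TouchTableView : UITableView
@end

@implementation TouchTableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    // Forward the touch up the responder chain so the superview (XView)
    // gets its touchesBegan:withEvent: called as well.
    [self.nextResponder touchesBegan:touches withEvent:event];
}

@end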

Getting the UITouch objects for a UIGestureRecognizer

Is there a way to get the UITouch objects associated with a gesture? UIGestureRecognizer doesn't seem to have any methods for this.
Jay's right... you'll want a subclass. Try this one for size, it's from one of my projects. In DragGestureRecognizer.h:
@interface DragGestureRecognizer : UILongPressGestureRecognizer {
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
@end

@protocol DragGestureRecognizerDelegate <UIGestureRecognizerDelegate>
- (void)gestureRecognizer:(UIGestureRecognizer *)gr movedWithTouches:(NSSet *)touches andEvent:(UIEvent *)event;
@end
And in DragGestureRecognizer.m:
#import "DragGestureRecognizer.h"
@implementation DragGestureRecognizer

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    if ([self.delegate respondsToSelector:@selector(gestureRecognizer:movedWithTouches:andEvent:)]) {
        [(id)self.delegate gestureRecognizer:self movedWithTouches:touches andEvent:event];
    }
}

@end
Of course, you'll need to implement the
- (void) gestureRecognizer:(UIGestureRecognizer *)gr movedWithTouches:(NSSet*)touches andEvent:(UIEvent *)event;
method in your delegate -- for example:
DragGestureRecognizer *gr = [[DragGestureRecognizer alloc] initWithTarget:self action:@selector(pressed:)];
gr.minimumPressDuration = 0.15;
gr.delegate = self;
[self.view addGestureRecognizer:gr];
[gr release];
- (void)gestureRecognizer:(UIGestureRecognizer *)gr movedWithTouches:(NSSet *)touches andEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    self.mTouchPoint = [touch locationInView:self.view];
    self.mFingerCount = [touches count];
}
If it is just the location you are interested in, you do not have to subclass; the UIGestureRecognizer will notify you of location changes of the tap.
Initialize with target:
self.longPressGestureRecognizer = [[[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)] autorelease];
[self.tableView addGestureRecognizer: longPressGestureRecognizer];
Handle UIGestureRecognizerStateChanged to get location changes:
- (void)handleGesture:(UIGestureRecognizer *)theGestureRecognizer {
    switch (theGestureRecognizer.state) {
        case UIGestureRecognizerStateBegan:
        case UIGestureRecognizerStateChanged:
        {
            CGPoint location = [theGestureRecognizer locationInView:self.tableView];
            [self infoForLocation:location];
            break;
        }
        case UIGestureRecognizerStateEnded:
        {
            NSLog(@"Ended");
            break;
        }
        default:
            break;
    }
}
If you only need to find out the location of the gesture, you can call either locationInView: or locationOfTouch:inView: on the UIGestureRecognizer object. However if you want to do anything else, then you'll need to subclass.
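A quick sketch of that (the handler name handleTap: is just for illustration):
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    // Overall location of the gesture in the view it is attached to:
    CGPoint location = [recognizer locationInView:recognizer.view];
    // Location of an individual touch (index 0), useful for multi-touch gestures:
    CGPoint firstTouch = [recognizer locationOfTouch:0 inView:recognizer.view];
    NSLog(@"gesture at %@, first touch at %@",
          NSStringFromCGPoint(location), NSStringFromCGPoint(firstTouch));
}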
If you're writing your own UIGestureRecognizer you can get the touch objects overriding:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
or the equivalent moved, ended or canceled
The Apple docs have lots of info on subclassing.
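For completeness, a minimal sketch of such a subclass (the class name TouchReportingGestureRecognizer is made up here); note that subclasses must import UIGestureRecognizerSubclass.h to override the touch methods:
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface TouchReportingGestureRecognizer : UIGestureRecognizer
@property (nonatomic, strong) NSSet *currentTouches;
@end

@implementation TouchReportingGestureRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    // Stash the UITouch objects so the target/action code can inspect them.
    self.currentTouches = touches;
}

@end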
Here is a way to get a long press added to an arbitrary UIView.
This lets you run a long press on any number of UIViews.
In this example I restrict access to the UIView through tags, but you could use isKindOfClass: as well.
(From what I found, you have to use touchesMoved for the long press, because touchesBegan fires before the long press becomes active.)
subclass - .h
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>
@interface MyLongPressGestureRecognizer : UILongPressGestureRecognizer
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
@property (nonatomic) CGPoint touchPoint;
@property (strong, nonatomic) UIView *touchView;
@end
subclass - .m
#import "MyLongPress.h"
@implementation MyLongPressGestureRecognizer

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    self.touchPoint = [touch locationInView:self.view];
    self.touchView = [self.view hitTest:[touch locationInView:self.view] withEvent:event];
}

#pragma mark - Setters

- (void)setTouchPoint:(CGPoint)touchPoint
{
    _touchPoint = touchPoint;
}

- (void)setTouchView:(UIView *)touchView
{
    _touchView = touchView;
}

@end
viewcontroller - .h
//nothing special here
viewcontroller - .m
#import "ViewController.h"
//LOAD
#import "MyLongPress.h"
@interface ViewController ()
//LOAD
@property (strong, nonatomic) MyLongPressGestureRecognizer *longPressGesture;
@end

@implementation ViewController
- (void)viewDidLoad
{
    [super viewDidLoad];
    //LOAD
    self.longPressGesture = [[MyLongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPressHandler:)];
    self.longPressGesture.minimumPressDuration = 1.2;
    [[self view] addGestureRecognizer:self.longPressGesture];
}

//LOAD
- (void)setLongPressGesture:(MyLongPressGestureRecognizer *)longPressGesture {
    _longPressGesture = longPressGesture;
}

- (void)longPressHandler:(MyLongPressGestureRecognizer *)recognizer {
    if (self.longPressGesture.touchView.tag >= 100) /* arbitrary way of limiting which views respond */
    {
        //code goes here
    }
}

@end
Simple and fast:
NSArray *touches = [recognizer valueForKey:@"touches"];
where recognizer is your UIGestureRecognizer. (Note that this reads a private key via KVC, so it is not guaranteed to keep working in future iOS releases.)

objective-c touch-events

I've got a set of images and would like to know which one I have touched. How could I implement that?
To be more precise:
A "Home" class will instantiate a couple of Image classes:
Image *myImageView = [[Image alloc] initWithImage:myImage];
The image-class looks something like this:
- (id)initWithImage:(UIImage *)anImage
{
    if ((self = [super initWithImage:anImage]))
    {
        self.userInteractionEnabled = YES;
    }
    return self;
}
Later on, I also use these touch event methods in the Image class:
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{}
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{}
My problem at the moment: the touchesBegan/Ended methods fire no matter where I touch the screen, but I would like to find out which of the images has been touched.
Whenever you get the touch, check whether it happened inside your image's area. Here is example code; let's suppose you have a UIImageView called img.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(img.frame, location)) {
        // your code here...
    }
}
Inside your *.h (interface) file:
@interface MyViewController : UIViewController {
    IBOutlet UIImageView *imageViewOne;
    IBOutlet UIImageView *imageViewTwo;
    UIImageView *alphaImage;
}
- (BOOL)isTouch:(UITouch *)touch WithinBoundsOf:(UIImageView *)imageView;
@end
Place the UIImageView components in your *.xib and bind them to 'imageViewOne' and 'imageViewTwo' via File's Owner.
Go to the *.m (implementation) file and:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([self isTouch:touch WithinBoundsOf:imageViewOne])
    {
        NSLog(@"Fires first action...");
    }
    else if ([self isTouch:touch WithinBoundsOf:imageViewTwo]) {
        NSLog(@"Fires second action...");
    }
}

//(Optional 01) This is used to reset the transparency of the touched UIImageView
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [alphaImage setAlpha:1.0];
}

- (BOOL)isTouch:(UITouch *)touch WithinBoundsOf:(UIImageView *)imageView {
    CGRect _frameRectangle = [imageView frame];
    CGFloat _imageTop = _frameRectangle.origin.y;
    CGFloat _imageLeft = _frameRectangle.origin.x;
    CGFloat _imageRight = _frameRectangle.size.width + _imageLeft;
    CGFloat _imageBottom = _frameRectangle.size.height + _imageTop;
    CGPoint _touchPoint = [touch locationInView:self.view];
    /*NSLog(@"image top %f", _imageTop);
    NSLog(@"image bottom %f", _imageBottom);
    NSLog(@"image left %f", _imageLeft);
    NSLog(@"image right %f", _imageRight);
    NSLog(@"touch happens at %f-%f", _touchPoint.x, _touchPoint.y);*/
    if (_touchPoint.x >= _imageLeft &&
        _touchPoint.x <= _imageRight &&
        _touchPoint.y >= _imageTop &&
        _touchPoint.y <= _imageBottom) {
        [imageView setAlpha:0.5]; //optional 01 - adds a transparency changing effect
        alphaImage = imageView;   //optional 01 - marks the UIImageView that gets the transparency effect for the moment
        return YES;
    } else {
        return NO;
    }
}
That's how I handled it. I got the idea from reading the post by "itsaboutcode".
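As a side note (a sketch, not part of the original answer), the manual edge comparisons in isTouch:WithinBoundsOf: can be collapsed into a single call to CGRectContainsPoint:
- (BOOL)isTouch:(UITouch *)touch WithinBoundsOf:(UIImageView *)imageView {
    // Same bounds test, expressed with CoreGraphics' rectangle helper.
    return CGRectContainsPoint([imageView frame], [touch locationInView:self.view]);
}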