New to Xcode: I'm creating a simple login form in Xcode 4.2 and I would like to hide the keyboard. I think I have the correct code, but the tutorial says I need to change the class of the view to UIControl, and there is no option for this. Is there another way when working with storyboards?
- (IBAction)backGroundTouched:(id)sender
{
    [emailTextField resignFirstResponder];
    [passTextField resignFirstResponder];
}
Assuming you are doing this inside the view controller, invoke
[self.view endEditing:YES];
If your two text fields are subviews of some higher-level view, you can also use [higherLevelView endEditing:YES]; and not care which subview is currently active.
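If you prefer not to change the view's class to UIControl at all, a UITapGestureRecognizer on the background view does the same job in a storyboard-based project. A minimal sketch (the method name dismissKeyboard is my own placeholder):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Dismiss the keyboard whenever the background is tapped.
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(dismissKeyboard)];
    [self.view addGestureRecognizer:tap];
}

- (void)dismissKeyboard
{
    // Resigns whichever text field is currently first responder.
    [self.view endEditing:YES];
}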
Make sure both text fields are connected to their IBOutlets.
No need to change UIView to UIControl.
// Connect every text field's "Did End On Exit" event to this method.
-(IBAction)textFieldReturn:(id)sender
{
    [sender resignFirstResponder];
}
// Also use this method if you want to hide the keyboard when the user touches the background.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [emailTextField resignFirstResponder];
    [passTextField resignFirstResponder];
    // Pass the touches up the responder chain as well.
    [super touchesBegan:touches withEvent:event];
}
I followed this tutorial: http://www.techotopia.com/index.php/Writing_iOS_7_Code_to_Hide_the_Keyboard and it's working for me:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([_textField isFirstResponder] && [touch view] != _textField) {
        [_textField resignFirstResponder];
    }
    [super touchesBegan:touches withEvent:event];
}
Is it possible to detect user interaction (like swiping a scroll view or starting to edit a text field) without using delegates or notifications, just using the most basic touch events possible?
I've been trying to use touchesBegan and touchesEnded, but these are not always called.
Yes, you can just connect an IBAction to your UITextField's "Editing Changed" event under Sent Events, so every time the text field changes your IBAction will be called. You can do the same for "Editing Did Begin" and "Editing Did End".
https://stackoverflow.com/a/29045440/2303865
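If you'd rather not make the connections in Interface Builder, the same events can be wired up in code. A minimal sketch (textField and textFieldDidChange: are placeholder names):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Equivalent to connecting "Editing Changed" in the storyboard; use
    // UIControlEventEditingDidBegin / UIControlEventEditingDidEnd the same way.
    [self.textField addTarget:self
                       action:@selector(textFieldDidChange:)
             forControlEvents:UIControlEventEditingChanged];
}

- (void)textFieldDidChange:(UITextField *)sender
{
    NSLog(@"text changed to: %@", sender.text);
}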
You can identify the touched object and perform whatever actions you require.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    if ([touch.view isKindOfClass:UITextField.class])
    {
        if (![(UITextField *)touch.view isFirstResponder])
        {
            // Start editing.
            [(UITextField *)touch.view becomeFirstResponder];
        }
    }
    else
    {
        // The touch landed outside any text field; handle as needed.
    }
}
I'm trying to implement a UITableView whose rows can be dragged right and left (to show something behind them).
The code works fine, I've implemented it using the following methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
My problem is that the rows also contain UIButtons: a tap should click the button, but a drag should drag the entire cell.
I've found this solution, which basically bubbles the events up when clicking on the UIButtons:
[super touchesBegan:touches withEvent:event];
[self.nextResponder touchesBegan:touches withEvent:event];
But it seems that the touchesMoved event only bubbles once.
I've seen all sorts of questions in this area (example), but I don't see any solutions or responses.
Any help, suggestion or creative workaround would be appreciated!
Instead of implementing touchesBegan, etc., why not use a UIPanGestureRecognizer? I tested this with just a simple rectangular view, mostly covered by a UIButton. The view dragged no matter where I touched, and the button method fired if I clicked over the button.
@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    [self.theView addGestureRecognizer:panGesture]; // theView is an IBOutlet for the small view containing a button
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    self.currentViewFrame = self.theView.frame; // remember the starting frame so pans are cumulative
}

- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)sender {
    CGPoint translate = [sender translationInView:self.view];
    CGRect newFrame = self.currentViewFrame;
    newFrame.origin.x += translate.x;
    newFrame.origin.y += translate.y;
    sender.view.frame = newFrame;
    if (sender.state == UIGestureRecognizerStateEnded)
        self.currentViewFrame = newFrame;
}

- (IBAction)doClick:(id)sender {
    NSLog(@"click");
}

@end
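One detail worth noting about why this works: the pan only recognizes once the finger actually moves, so plain taps still reach the button, and because cancelsTouchesInView defaults to YES, a drag that starts over the button cancels the button's touch instead of also firing a click. The default can be made explicit when setting up the recognizer in viewDidLoad:
// Optional; YES is already the default. Once the pan recognizes,
// touches are cancelled in the view, so dragging over the button
// will not also trigger doClick:.
panGesture.cancelsTouchesInView = YES;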
Just check to see which one was touched using the tag system.
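A minimal sketch of that approach (the tag values 1 and 2 are made up for illustration; assign them in the storyboard or in code):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    if (touch.view.tag == 1) {
        NSLog(@"email field touched");
    } else if (touch.view.tag == 2) {
        NSLog(@"password field touched");
    }
}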
In my app delegate, I made sure I have the line:
[glView setMultipleTouchEnabled: YES];
And I have a simple layer meant only to figure out how multi touch works. The .mm file looks like:
#import "TestLayer.h"
#implementation TestLayer
-(id) init
{
if( (self=[super init])) {
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
return self;
}
-(void) draw{
[super draw];
glColor4f(1.0, 0.0, 0.0, 0.35);
glLineWidth(6.0f);
ccDrawCircle(ccp(500,500), 250,CC_DEGREES_TO_RADIANS(360), 60,YES);
}
-(void) ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"got some touches");
}
-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"some touches moved.");
}
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
NSLog(#"a touch began");
return FALSE;
}
#end
When I touch the screen, I always see "a touch began", but no matter how I touch it (simulator or actual device), I never see "some touches moved" or "got some touches".
Is there something further I need to do to make multi-touch work?
Specifically, I'm just trying to do basic pinch-to-zoom functionality... I heard there is some sort of gesture recognizer for iPhone. Does it work with Cocos2D? Would it work even if I can't get simple multi-touch events to fire?
UIGestureRecognizers absolutely work with Cocos2D; I have used them personally. You just need to add them to the correct view by using:
[[[CCDirector sharedDirector] openGLView] addGestureRecognizer:myGestureRecognizer];
Regarding your touches: I assume you enabled them for the scene you are working in?
scene.isTouchEnabled = YES;
In any case, you shouldn't use the addTargetedDelegate method; take a look here.
Add self.isTouchEnabled = YES; to your init,
and for the gesture recognizers look at the other answer.
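Putting the two answers together for the pinch-to-zoom case, here is a minimal sketch (assuming cocos2d 1.x, where CCDirector exposes openGLView and the layer inherits a scale property from CCNode; handlePinch: is a placeholder name):
- (void)onEnter
{
    [super onEnter];
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)];
    [[[CCDirector sharedDirector] openGLView] addGestureRecognizer:pinch];
}

- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    // Scale the layer by the pinch delta, then reset the recognizer's
    // scale so each callback applies an incremental zoom.
    self.scale *= recognizer.scale;
    recognizer.scale = 1.0f;
}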
I'm currently working on an application where I'm trying to detect the touch positions of the user.
In the process of implementing the "detect touch position" function I changed from ccTouchBegan to ccTouchesBegan, but I can't get it to work. I changed to:
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
instead of using:
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event{
When I did this, the whole thing crashed when I clicked the screen, generating a SIGABRT error highlighting:
#ifdef __IPHONE_OS_VERSION_MAX_ALLOWED
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    NSAssert(NO, @"Layer#ccTouchBegan override me");
    return YES;
}
#endif
@end
So my questions are:
Why do you think it crashes?
What's the difference between ccTouchBegan and ccTouchesBegan? Multi-touch abilities?
For further help, this is my code:
-(id) init
{
    if ((self = [super init])) {
        [CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA8888];
        self.isTouchEnabled = YES;

        // Set up background
        background = [CCSprite spriteWithFile:@"Image.png"];
        background.scaleX = 1;
        background.scaleY = 1;
        background.position = ccp(0,0);
        [self addChild:background];

        [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                         priority:0
                                                  swallowsTouches:YES];

        // Preload sound effect
        soundFile = [SimpleAudioEngine sharedEngine];
        if (soundFile != nil) {
            [soundFile preloadBackgroundMusic:@"sound.wav"];
        }
    }
    return self;
}
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    NSLog(@"ccTouchesBegan");

    // Sets the sound variable to YES
    ifOne = YES;

    prevPos = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    [self schedule:@selector(timerUpdate:) interval:0.1];
    //return YES;
}
Swallowing touches is a nice feature in cocos2d for cases where you want to handle only a single touch event. Note that registering a targeted delegate (as your init does) obliges you to implement ccTouchBegan:; the NSAssert that crashes you is cocos2d telling you to override it.
Try adding this function to your class:
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:INT_MIN+1 swallowsTouches:YES];
}
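With the targeted delegate registered, a minimal ccTouchBegan: must also exist. A sketch (the GL-coordinate conversion mirrors the code from the question):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint pos = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    NSLog(@"touch began at %@", NSStringFromCGPoint(pos));
    return YES; // claim the touch so ccTouchMoved/ccTouchEnded are delivered to you
}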
ccTouchesBegan happens the second you tap the screen
ccTouchesEnded happens the second you let go after tapping on the screen
Your problem may just be a mismatched method signature; my advice is to make sure the method name matches its parameter and return types. The targeted (singular) variant is:
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
while the standard (plural) variant is:
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
Mixing them up (for example, a ccTouchesBegan that takes a UITouch *, or a ccTouchBegan that returns void) likely won't be called.
I would give more info, but you didn't provide a lot of information to work with, so this is the best I can do.
I'm making a UISearchBar option in the navigation bar of my app.
My app consists of multiple views and subviews.
I have a main view which has 3 other views on it; one is empty (for now), the other 2 have table views on them.
I want my keyboard to show when I'm searching, and hide when I'm doing the actual search or when I touch/click outside the UISearchBar.
I'm using the search bar delegate, as is required.
I'm able to hide the keyboard using [searchBar resignFirstResponder] in the following ways:
When I press the return key.
When I cancel search manually.
When I press any of the keyboard buttons to search or cancel.
When I touch an empty part of the screen, using
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([mySearchBar isFirstResponder] && [touch view] != mySearchBar) {
        [mySearchBar resignFirstResponder];
    }
    [super touchesBegan:touches withEvent:event];
}
What I can't seem to do is make it respond to touching one of my 2 table views, or when I'm refilling the main view to contain something different entirely.
I've tried changing the touchesBegan method to resign the search bar when touching the table views, but it hasn't worked so far.
I've tried several other things found by my dear friend Mr. Google, but it all seems to be something other than what I need.
Anyone have any ideas of what I might do to fix this problem?
EDIT:
Using breakpoints, it appears that the touchesBegan method does respond to the background view, but it doesn't respond when I touch either of the table views or the navigation bar (containing the UISearchBar).
Solved it!
- (BOOL)searchBarShouldBeginEditing:(UISearchBar *)searchBar
{
    [self.myViewController1.customView1 setUserInteractionEnabled:NO];
    [self.myViewController2.customView2 setUserInteractionEnabled:NO];
    [searchBar setShowsCancelButton:YES animated:[mySettings animation]];
    return YES;
}
I started by disabling userInteraction on my 2 subviews at the moment I start using the search bar.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([self.mySearchBar isFirstResponder] && [touch view] != self.mySearchBar)
    {
        [self.mySearchBar resignFirstResponder];
        [self.myViewController1.customView1 setUserInteractionEnabled:YES];
        [self.myViewController2.customView2 setUserInteractionEnabled:YES];
    }
    [super touchesBegan:touches withEvent:event];
}
Then, when I click/tap/touch outside the search bar, I first resign the keyboard (which is still first responder) AND ONLY AFTER that do I turn userInteraction back on for the 2 subviews.
The order of doing this is vital!
This piece of code allows you to resign the keyboard in a single main view controller even when it is crowded with a massload of subviews.
Have you tried this?
UISearchBar *searchBar;
Next, set up getter and setter properties for the UISearchBar,
and call, from any method:
[searchBar resignFirstResponder];
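In other words, a sketch of the property that answer is describing:
// In the view controller's header (or a class extension):
@property (nonatomic, strong) IBOutlet UISearchBar *searchBar;

// Then, wherever the keyboard should go away:
[self.searchBar resignFirstResponder];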