SpriteKit and touch event (coordinates seem flipped) - objective-c

I have added an image on the SKScene using the following code -
char1 = [[SKSpriteNode alloc] initWithImageNamed:@"CH_designb_nerd.png"];
char1.position = CGPointMake(225.0, 65.0);
char1.anchorPoint = CGPointMake(0, 0);
[char1 setScale:0.35];
[self addChild:char1];
Then I tried to create a touch event on the image, but the CGRect of the image seems totally flipped to the other side of the screen. I used the following code:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:self.view];
if (CGRectContainsPoint(char1.frame, location)) {
NSLog(@"yaaa!!");
}
else {
NSLog(@"%.2f", location.x);
NSLog(@"%.2f", char1.position.x);
}
}
I'm totally lost on this, as the image is placed perfectly but the coordinates of the CGRect seem flipped. The image is placed at the bottom, but only when I touch the top of the screen does it register a touch.

I imagine this is due to getting the wrong click location. I would instead look at using the SKNode equivalent for hit testing. Something like this:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];
SKSpriteNode *char1 = (SKSpriteNode *)[self childNodeWithName:@"char1Name"];
if ([char1 containsPoint:location])
{
...
}
There might be some syntax errors here (typing from memory), but you get the gist. Note: you would need to set the name property for char1 to perform the lookup, or you could just enumerate self.children if there is only that one node.
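For reference, the name used in the lookup would be assigned when the node is created. Here is a minimal sketch based on the question's own setup code (the name string itself is arbitrary, but it must match the one passed to childNodeWithName:):

```objc
// Assign a name when creating the node so childNodeWithName: can find it later.
char1 = [SKSpriteNode spriteNodeWithImageNamed:@"CH_designb_nerd.png"];
char1.name = @"char1Name"; // must match the string used in childNodeWithName:
char1.position = CGPointMake(225.0, 65.0);
[char1 setScale:0.35];
[self addChild:char1];
```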

Try this.
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];

Related

Drag and drop. Nudging other images while moving

I've implemented drag and drop for multiple images, which works fine, but I'm facing one issue: when I'm dragging one image, other images get nudged along when my finger moves over them while I'm still dragging the original image. I'd like only one image to be moveable at a time.
Here's some of my code:
-(void)touchesBegan: (NSSet *) touches withEvent:(UIEvent *)event{
UITouch* touch = [touches anyObject];
for (UIImageView *noseImage in noseArray) {
if ([touch.view isEqual:noseArray]) {
firstTouchPoint = [touch locationInView:[self view]];
xd = firstTouchPoint.x - [[touch view]center].x;
yd = firstTouchPoint.y - [[touch view]center].y;
}
}
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *touch = [touches anyObject];
CGPoint oldPoint = [touch previousLocationInView:touch.view];
CGPoint newPoint = [touch locationInView:touch.view];
CGPoint diff = CGPointMake(newPoint.x - oldPoint.x, newPoint.y - oldPoint.y);
for (UIImageView *noseImageView in noseArray) {
if (CGRectContainsPoint(noseImageView.frame, newPoint)) {
CGPoint cntr = [noseImageView center];
[noseImageView setCenter:CGPointMake(cntr.x + diff.x, cntr.y + diff.y)];
}
}
}
Typically you would determine which image is being dragged in touchesBegan: and remember it. Then in touchesMoved:, move only the remembered image by the given amount.
But a gesture recognizer is a lot easier to work with than these low-level methods, so I suggest you use one instead.
You wrote:
for (UIImageView *noseImage in noseArray) {
if ([touch.view isEqual:noseArray]) {
Are you sure the second line shouldn't be the following?
if ([touch.view isEqual: noseImage]) {
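To sketch the first suggestion (remembering the touched image), here is one way it might look. This is an illustration, not the asker's actual code; it assumes a new `activeNose` instance variable of type `UIImageView *`:

```objc
// Remember which image view was touched; move only that one later.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    activeNose = nil;
    for (UIImageView *noseImage in noseArray) {
        if (touch.view == noseImage) {
            activeNose = noseImage; // only this view will be dragged
            break;
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (activeNose == nil) return; // the touch didn't start on a nose
    UITouch *touch = [touches anyObject];
    CGPoint oldPoint = [touch previousLocationInView:self.view];
    CGPoint newPoint = [touch locationInView:self.view];
    CGPoint cntr = activeNose.center;
    activeNose.center = CGPointMake(cntr.x + (newPoint.x - oldPoint.x),
                                    cntr.y + (newPoint.y - oldPoint.y));
}
```

Because the loop runs only in touchesBegan:, images the finger merely passes over during the drag are never touched.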

draw a line in sprite kit in touchesmoved

I would like to draw a line in sprite kit along the points collected in touchesmoved.
What's the most efficient way of doing this? I've tried a few times, and my line is either wrong on the y axis or takes so much processing power that the frame rate drops to 10 fps.
Any ideas?
You could define a CGPath and modify it by adding lines or arcs in your touchesMoved: method. After that, you can create an SKShapeNode from your path and configure it as you prefer.
If you want to draw the line while the finger is moving on the screen, you can create the shape node with an empty path when the touch begins and then modify the path as the touch moves.
Edit: I wrote some code; it works for me and draws a simple red line.
In your MyScene.m:
@interface MyScene ()
{
CGMutablePathRef pathToDraw;
SKShapeNode *lineNode;
}
@end
@implementation MyScene
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch* touch = [touches anyObject];
CGPoint positionInScene = [touch locationInNode:self];
pathToDraw = CGPathCreateMutable();
CGPathMoveToPoint(pathToDraw, NULL, positionInScene.x, positionInScene.y);
lineNode = [SKShapeNode node];
lineNode.path = pathToDraw;
lineNode.strokeColor = [SKColor redColor];
[self addChild:lineNode];
}
- (void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch* touch = [touches anyObject];
CGPoint positionInScene = [touch locationInNode:self];
CGPathAddLineToPoint(pathToDraw, NULL, positionInScene.x, positionInScene.y);
lineNode.path = pathToDraw;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
// delete the following line if you want the line to remain on screen.
[lineNode removeFromParent];
CGPathRelease(pathToDraw);
}
@end

get touch location scrollView

I need to get the touch location, so I created a method for it, but when I touch inside my ScrollView the method doesn't return the location; it only works outside of the scroll view.
This is the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
CGPoint pos = [touch locationInView:nil];
NSLog(@"Position of touch: %.0f, %.0f", pos.x, pos.y);
}
Does anyone know how to do this?
Thanks!
CGPoint pos = [touch locationInView:nil];
You are retrieving the coordinates of the touch with respect to the window's origin. Per the documentation:
view
The view object in whose coordinate system you want the touch located. A custom view that is handling the touch may specify self to get the touch location in its own coordinate system. Pass nil to get the touch location in the window’s coordinates.
To get the coordinates in your scroll view, you need to pass self:
CGPoint pos = [touch locationInView:self];
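As an aside (not part of the original answer): if the scroll view itself is swallowing the touches before your touchesBegan: ever fires, attaching a UITapGestureRecognizer to it is often simpler than overriding the touch methods. A rough sketch, assuming `scrollView` is your UIScrollView outlet:

```objc
// In setup, e.g. viewDidLoad:
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTap:)];
[scrollView addGestureRecognizer:tap];

// The handler reports the location in the scroll view's own coordinates.
- (void)handleTap:(UITapGestureRecognizer *)gesture {
    CGPoint pos = [gesture locationInView:gesture.view];
    NSLog(@"Position of tap: %.0f, %.0f", pos.x, pos.y);
}
```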

How to move a UIImageView with Touch

I'm trying to add moving functionality to my image view (maskPreview in the code below), so that users can move the picture contained in maskPreview around the screen. Here's my code for the touch-began and touch-moved handlers:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
if ([touches count]==1) {
UITouch *touch= [touches anyObject];
originalOrigin = [touch locationInView:maskPreview];
}
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
if ([touches count]==1) {
UITouch *touch = [touches anyObject];
CGPoint lastTouch = [touch previousLocationInView:self.view];
CGFloat movedDistanceX = originalOrigin.x-lastTouch.x;
CGFloat movedDistanceY = originalOrigin.y-lastTouch.y;
[maskPreview setFrame:CGRectMake(maskPreview.frame.origin.x+movedDistanceX, maskPreview.frame.origin.y + movedDistanceY, maskPreview.frame.size.width, maskPreview.frame.size.height)];
}
}
but I'm getting some weird responses from the app. I haven't put restrictions on how far the image view can move, i.e. to prevent it from going off screen, but even on a small move my image view goes wild and disappears.
Thanks a lot in advance for all the help.
Implementing touchesBegan and so on is way overkill in this modern world. You're just confusing the heck out of yourself, and your code will quickly become impossible to understand or maintain. Use a UIPanGestureRecognizer; that's what it's for. Making a view draggable with a UIPanGestureRecognizer is trivial. Here's the action handler for a UIPanGestureRecognizer that makes the view draggable:
- (void) dragging: (UIPanGestureRecognizer*) p {
UIView* vv = p.view;
if (p.state == UIGestureRecognizerStateBegan ||
p.state == UIGestureRecognizerStateChanged) {
CGPoint delta = [p translationInView: vv.superview];
CGPoint c = vv.center;
c.x += delta.x; c.y += delta.y;
vv.center = c;
[p setTranslation: CGPointZero inView: vv.superview];
}
}
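For completeness, wiring the recognizer up might look like this (a sketch, assuming maskPreview is the view to drag; note that a UIImageView has user interaction disabled by default):

```objc
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(dragging:)];
maskPreview.userInteractionEnabled = YES; // UIImageView defaults to NO
[maskPreview addGestureRecognizer:pan];
```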
There are two problems with your code. First, this line is wrong:
CGPoint lastTouch = [touch previousLocationInView:self.view];
It should be this:
CGPoint lastTouch = [touch previousLocationInView:maskPreview];
Really, you shouldn't even be using previousLocationInView:. You should just be using locationInView: like this:
CGPoint lastTouch = [touch locationInView:maskPreview];
Second, you are getting the signs of movedDistanceX and movedDistanceY wrong. Change them to this:
CGFloat movedDistanceX = lastTouch.x - originalOrigin.x;
CGFloat movedDistanceY = lastTouch.y - originalOrigin.y;
Also, the documentation for touchesBegan:withEvent: says this:
If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, if only as stub (empty) implementations.
So make sure you're also overriding touchesEnded:withEvent: and touchesCancelled:withEvent:.
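The stub overrides the documentation asks for can be completely empty, for example:

```objc
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}
```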
Anyway, you can do this quite a bit more simply. One way is to make touchesBegan:withEvent: empty and do all the work in touchesMoved:withEvent: by using both previousLocationInView: and locationInView:, and updating maskPreview.center instead of maskPreview.frame. You won't even need the originalOrigin instance variable:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
if ([touches count]==1) {
UITouch *touch = [touches anyObject];
CGPoint p0 = [touch previousLocationInView:maskPreview];
CGPoint p1 = [touch locationInView:maskPreview];
CGPoint center = maskPreview.center;
center.x += p1.x - p0.x;
center.y += p1.y - p0.y;
maskPreview.center = center;
}
}
Another way to do this is by using a UIPanGestureRecognizer. I leave that as an exercise for the reader.

Point not in Rect but CGRectContainsPoint says yes

I have a UIImageView and want to know whether a user has tapped the image. In touchesBegan, I do the following but always end up in the first conditional. The window is in portrait mode and the image is at the bottom; I can tap in the upper right of the window and still fall into the first condition, which seems very incorrect.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:touch.view];
if(CGRectContainsPoint(myimage.frame, location) == 0){
//always end up here
}
else {
//user didn't tap inside image
}
}
and the values are:
location: x=303,y=102
frame: origin=(x=210,y=394) size=(width=90, height=15)
Any suggestions?
First, you get the touch with:
UITouch *touch = [[event allTouches] anyObject];
Next, you want to be checking for the locationInView relative to your image view.
CGPoint location = [touch locationInView:self]; // or possibly myimage instead of self.
Next, CGRectContainsPoint returns a boolean, so comparing it to 0 is very odd. It should be:
if ( CGRectContainsPoint( myimage.frame, location ) ) {
// inside
} else {
// outside
}
But if self is not myimage, then the myimage view may be getting the touch instead of you; it's not clear from your question what object self is, if it is not a subclass of the UIImageView in question.
Your logic is simply inverted. The CGRectContainsPoint() function returns bool, i.e. true for "the point is inside the rect". True is not equal to 0.