How to get the real position of a sub node in SceneKit after rotation? - objective-c

I am developing a scene with SceneKit. I have a main node with a subnode:
// Main node
SCNNode* planet = [SCNNode node];
planet.geometry = [SCNSphere sphereWithRadius:2];
planet.position = SCNVector3Make(0, -3, 5);
// sub-node
SCNNode* satellite = [SCNNode node];
satellite.geometry = [SCNSphere sphereWithRadius:0.4];
satellite.position = SCNVector3Make(4, 0, 0);
[planet addChildNode:satellite];
[scene.rootNode addChildNode:planet];
I use an NSTimer to run some actions and animations. In the timer callback I do this:
planetRotation += 0.1;
planet.rotation = SCNVector4Make(0, 1, 0, planetRotation);
But if I then read the position of the satellite node, I always get the same value.
I also tried reading the node's position properties to find out the real position of the satellite node, but nothing changes.
How can I get the real position of a sub-node when I change the rotation of its parent node?
Thanks in advance

The position of a node is expressed in its parent's coordinate system, just like views in UIKit/AppKit: if you change the frame of a view, the frames of its subviews do not change.
What you want is what we call the world transform of the subnode (i.e. its transform expressed in the coordinate system of the scene's root node).
You can have a look at worldTransform and -convertPosition:toNode:.
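For example, a minimal sketch (reusing the planet, satellite and scene names from the question) that reads the satellite's position in the root node's coordinate system:
// satellite.position is expressed in planet's coordinate system,
// so convert it into the root node's (world) coordinate system.
SCNVector3 worldPos = [planet convertPosition:satellite.position toNode:scene.rootNode];
NSLog(@"satellite world position: %f %f %f", worldPos.x, worldPos.y, worldPos.z);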

The node's position won't change due to physics; that is why you aren't seeing it change. You need to use the node's presentationNode to get the position of the node as it is presented onscreen:
node.presentationNode.position
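In the context of the question (satellite is the sub-node), a hedged sketch that reads the on-screen world position from the presentation node could look like this:
// The presentation node mirrors what is currently rendered (including
// in-flight animations/physics); its worldTransform holds the translation.
SCNMatrix4 shown = satellite.presentationNode.worldTransform;
SCNVector3 shownWorldPos = SCNVector3Make(shown.m41, shown.m42, shown.m43);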

Related

SceneKit follow camera?

I am trying to make a follow camera in SceneKit. I have just started, so bear with me. I have a node (robotNode) and am trying to have the camera follow the robot. I have partially achieved this by adding the camera as a child node of the robot:
cameraNode = [SCNNode node];
cameraNode.camera = [SCNCamera camera];
[robotNode addChildNode:cameraNode];
// place the camera
cameraNode.position = SCNVector3Make(0, 0, 10);
But the problem is that when I start to rotate the camera, it doesn't follow the node anymore.
How can I get it to continue to follow the node?
What you've written will create a camera node a fixed distance from the robot, but you've done nothing to control where the camera points.
Create an SCNLookAtConstraint whose target is the robot node. Attach that to the camera node.
// warning, written in browser, untested
SCNLookAtConstraint *robotStare = [SCNLookAtConstraint lookAtConstraintWithTarget:robotNode];
// and maybe also
robotStare.gimbalLockEnabled = YES;
cameraNode.constraints = @[robotStare];
// OP added this, to make camera follow node. I'm skeptical.
cameraNode.camera.usesOrthographicProjection = YES;

Preventing SKSpriteNode from going off screen

I have a SKSpriteNode that moves with the accelerometer by using the following code:
-(void)processUserMotionForUpdate:(NSTimeInterval)currentTime {
    SKSpriteNode *gameFish = (SKSpriteNode *)[self childNodeWithName:@"fishderp"];
    CMAccelerometerData *data = self.motionManager.accelerometerData;
    if (fabs(data.acceleration.y) > 0.2) {
        [gameFish.physicsBody applyForce:CGVectorMake(0, data.acceleration.y)];
    }
}
This works well; however, the node (gameFish) moves off the screen. How can I prevent this and have it stay on the screen?
Try using an SKConstraint, which was designed exactly for this purpose and was introduced in iOS 8.
Just add this to the setup method of the gameFish node. The game engine applies the constraint after the physics has run, so you won't have to worry about it. Cool huh?
// get the screen size
CGSize scr = self.scene.frame.size;
// set up a position constraint that keeps the node within the screen bounds
SKConstraint *c = [SKConstraint
    positionX:[SKRange rangeWithLowerLimit:0 upperLimit:scr.width]
            Y:[SKRange rangeWithLowerLimit:0 upperLimit:scr.height]];
gameFish.constraints = @[c]; // takes an array of constraints
The code depends on whether you have added the gameFish node to self or to another node (something like a "worldNode"). If you have added it to self, look at the code below:
// get the screen height, as you are only clamping your node's y
float myHeight = self.view.frame.size.height;
// next, check your node's y coordinate against the screen's y range
// and adjust y if required
if (gameFish.position.y > myHeight) {
    gameFish.position = CGPointMake(gameFish.position.x, myHeight);
}
For the bottom you can do a check of < 0 or whatever value you need.
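As a rough sketch (assuming gameFish was added directly to the scene, i.e. self is the SKScene), clamping both the top and bottom edges could look like this:
// clamp the y position to the visible range [0, scene height]
CGFloat maxY = self.size.height;
CGFloat clampedY = MIN(MAX(gameFish.position.y, 0), maxY);
gameFish.position = CGPointMake(gameFish.position.x, clampedY);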

Camera as child of Object3D positioning issues

Setup a simple scene here:
http://jsfiddle.net/majman/Sps3c/
I was initially trying to demonstrate a problem I was having with rotating a parent container while having the camera maintain its relative offset position, but when setting up this example I couldn't even adjust the camera's initial position.
Current Problem:
// camera
camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 5, 150);
camera.position.z = 50; // this doesn't work?
// object to contain camera & helper
cameraContainer = new THREE.Object3D();
cameraContainer.rotation.order = "YXZ"; // maybe not necessary
// add to container
cameraContainer.add(camera);
scene.add(cameraContainer);
Now when rotating the cameraContainer, the camera's rotation follows - but I'd like the camera's position to be offset from the cameraContainer. I'm unable to modify any position properties for some reason.
Your code is working fine. You are confused because the CameraHelper is not displaying the camera in its actual position. You need to add the CameraHelper as a child of the scene.
// camera helper
cameraHelper = new THREE.CameraHelper( camera2 );
scene.add( cameraHelper );
updated fiddle: http://jsfiddle.net/Sps3c/1/
Tip: I added an OrbitController to your demo for a better view of the situation.
three.js r.66

THREE.js rotating camera around an object using orbit path

I am struggling to solve this problem.
In my scene, I have a camera which looks at the center of mass of an object. I have some buttons that set the camera to a particular view (front view, back view, ...) along an invisible sphere of constant radius that surrounds the object.
When I click a button, I would like the camera to move from its start position to the end position along the sphere's surface. While the camera moves, I would like it to keep looking at the object's center of mass.
Does anyone have a clue on how to achieve this?
Thanks for help!
If you are happy/prefer to use basic trigonometry then in your initialisation section you could do this:
var cameraAngle = 0;
var orbitRange = 100;
var orbitSpeed = 2 * Math.PI/180;
var desiredAngle = 90 * Math.PI/180;
...
camera.position.set(orbitRange,0,0);
camera.lookAt(myObject.position);
Then in your render/animate section you could do this:
// use >= rather than == so a floating-point angle can't step past the target unnoticed
if (cameraAngle >= desiredAngle) { orbitSpeed = 0; }
else {
    cameraAngle += orbitSpeed;
    camera.position.x = Math.cos(cameraAngle) * orbitRange;
    camera.position.y = Math.sin(cameraAngle) * orbitRange;
    camera.lookAt(myObject.position); // keep the camera fixed on the object while it moves
}
Of course, your buttons would set what the desiredAngle is (0°, 90°, 180° or 270°, presumably), you need to rotate around the correct plane (I am rotating around the XY plane above), and you can play with orbitRange and orbitSpeed until you are happy.
You can also modify orbitSpeed as it moves along the orbit path, speeding up and slowing down at various cameraAngles for a smoother ride. This process is called 'tweening' and you could search on 'tween' or 'tweening' if you want to know more. I think Three.js has tweening support but have never looked into it.
Oh, also remember to set your camera's far property to be greater than orbitRange, or you will only see the front half of your object and, depending on what it is, that might look weird.

Particles inside a moving Box2d world are getting drawn on top instead of inside a layer

I'm using LevelHelper to build my level, and I'm adding some particles (dynamically initialized CCParticleSystemQuad instances) inside the level. All works fine until I move the world (it's a dynamically drawn Box2D world in which I follow the player with the camera). When I move the world, newly added particles, which emit continuously, are created at the right position, but afterwards the emitted particles seem to be drawn relative to the global world/screen position. This gives a weird 'trippy' effect that looks totally unrealistic. The particles should be redrawn/refreshed inside the world.
LevelHelperLoader *lh = gameLayer.lh;
LHLayer *layer = [lh layerWithUniqueName:@"MAIN_LAYER"];
NSArray *array = [lh spritesWithTag:WORTEL];
CCParticleSystemQuad *particle;
CGPoint position;
for (LHSprite *sprite in array) {
    particle = [CCParticleSystemQuad particleWithFile:@"DirtParticles.plist"];
    [layer addChild:particle z:0];
    position = sprite.position;
    position.y += sprite.contentSize.height * 0.5f;
    [particle setPosition:position];
    [particle resetSystem];
}
Does anybody know what I might be doing wrong?
Try changing the particle position type:
particle.positionType = kCCPositionTypeFree;
The alternatives are kCCPositionTypeRelative and kCCPositionTypeGrouped. You may have to try them all to see which best fits your scenario; I'm guessing it's either "free" or "relative".