Cocos2d: Emitted particles do not pan correctly with screen - objective-c

I have set up a particle emitter to show a glowing orb, which looks great (added by the code below). The only issue is that when I pan around the level, the particles that have already been created pan around too rather than staying local to the emitter location; the emitter itself pans around correctly and emits new particles from the correct location.
CCParticleSystem *orb = [CCParticleSystemQuad particleWithFile:@"orb.plist"];
orb.position = ccp(screenSize.width / 2, screenSize.height);
[self addChild:orb];
What do I have to do to ensure that emitted particles also pan around with the screen?

There are three possible behaviors for particle positioning (the positionType property of the particle system). As stated in the cocos2d sources:
kCCPositionTypeFree - Living particles are attached to the world and are unaffected by emitter repositioning.
kCCPositionTypeRelative - Living particles are attached to the world but will follow the emitter repositioning. Use case: attach an emitter to a sprite when you want the emitter to follow the sprite.
kCCPositionTypeGrouped - Living particles are attached to the emitter and are translated along with it.
I'm not sure what your expected behavior is; try all of these modes first.
Also, cocos2d ships with a great demo in its sources; check the ParticleTest example.
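The difference between the modes can be sketched framework-free. This is not the cocos2d API, just illustrative names showing how "free" particles keep their world position while "grouped" particles translate with the emitter:

```javascript
// A particle remembers the emitter position at the moment it was spawned,
// plus its own local offset.
function spawnParticle(emitterPos, localOffset) {
  return { spawnEmitterPos: { ...emitterPos }, local: { ...localOffset } };
}

// World position of a living particle, given where the emitter is *now*.
// (kCCPositionTypeRelative sits between these two: it follows emitter
// repositioning but is still expressed in world space.)
function worldPosition(particle, emitterNow, mode) {
  switch (mode) {
    case "free": // kCCPositionTypeFree: fixed in world space once spawned
      return { x: particle.spawnEmitterPos.x + particle.local.x,
               y: particle.spawnEmitterPos.y + particle.local.y };
    case "grouped": // kCCPositionTypeGrouped: translated along with the emitter
      return { x: emitterNow.x + particle.local.x,
               y: emitterNow.y + particle.local.y };
  }
}

// Example: a particle spawns while the emitter is at (100, 100),
// then the emitter moves to (150, 100).
const p = spawnParticle({ x: 100, y: 100 }, { x: 5, y: 0 });
const moved = { x: 150, y: 100 };
const free = worldPosition(p, moved, "free");       // stays where it was born
const grouped = worldPosition(p, moved, "grouped"); // follows the emitter
```

For the question above (particles should pan with the screen/emitter), the "grouped" behavior is the one that keeps already-emitted particles locked to the emitter.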

Related

Sprite kit physics in endless runner game?

How would I use sprite kit physics in my endless runner game?
An endless runner fakes motion by keeping the player stationary but moving the background and all the other objects by a set speed.
BUT, I want to simulate physics.
What if I let my player move with the physics engine, move the background by the displacement of the player from the original position, and then move the player back to its original position?
Would this be smooth and look good? If so, what methods of Sprite Kit do I use so that no visual errors show to the user?
What's the proper solution?
Thank you.
The proper way is to center the scene on a node. The best way to learn how to do so is to visit the docs (go to the section titled 'Centering the Scene on a Node'), but if you encounter any problems with the implementation, let us know!
In case you're wondering how it works: the background stays stationary (unless you want parallax scrolling) while the character moves. However, every frame the camera 'follows' the player, meaning wherever the physics moved your player, the screen will follow and keep the character at the center.
Edit
Here is the code I use in one of my games to center on the sprite (which is a plane controlled by buttons):
-(void)didSimulatePhysics {
    // ... code here to simulate plane movement and such
    SKNode *camera = [self childNodeWithName:@"//camera"];
    SKNode *player = [self childNodeWithName:@"//sprite"];
    if (player.position.y >= 0) camera.position = CGPointMake(player.position.x, player.position.y);
    else camera.position = CGPointMake(player.position.x, 0);
    [self centerOnNode:camera];
    if (velocity > 1) {
        self.directionMeter.zRotation = -M_PI/2 + mdirectionOfTravel;
    }
}
-(void)centerOnNode:(SKNode *)node {
    CGPoint cameraPositionInScene = [node.scene convertPoint:node.position fromNode:node.parent];
    node.parent.position = CGPointMake(node.parent.position.x - cameraPositionInScene.x,
                                       node.parent.position.y - cameraPositionInScene.y);
}
Basically, when the player is above a certain position, I move the camera up with the player. When the player moves anywhere on the x-axis, the camera always moves with them. I have contact detection to find where the player hits the ground (and thus loses), and the background color (the sky) changes according to the altitude of the plane (the higher the plane, the darker the blue).
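The arithmetic inside centerOnNode can be sketched framework-free. Assuming (for illustration, not the SpriteKit API) that the camera node's scene position is simply the world node's offset plus the camera's local position, centering means shifting the world by the negative of that scene position:

```javascript
// Sketch of the centerOnNode idea: the "world" node is the camera's parent;
// moving the world opposite the camera keeps the camera (and the player it
// tracks) at the scene origin.
function centerOnNode(world, cameraLocalPos) {
  // Camera's position in scene space = world offset + camera's local position.
  const cameraInScene = {
    x: world.position.x + cameraLocalPos.x,
    y: world.position.y + cameraLocalPos.y,
  };
  // Shift the world so the camera lands on the scene origin.
  world.position = {
    x: world.position.x - cameraInScene.x,
    y: world.position.y - cameraInScene.y,
  };
  return world.position;
}

// Example: the camera sits 40 points right and 10 up inside the world node,
// so the world is pushed to (-40, -10) to re-center it.
const world = { position: { x: 0, y: 0 } };
centerOnNode(world, { x: 40, y: 10 });
```

The real SpriteKit version uses convertPoint:fromNode: because the camera may be nested several transforms deep, but the net effect is the same subtraction.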

Getting direction of object moved by the physics engine

I am using SpriteKit to learn about physics behaviour in games. When a sprite node (a ball) bounces off a surface, the physics engine automatically calculates its new direction.
How do I get this new direction in code?
I am using Swift and SpriteKit in Xcode, but Objective-C SpriteKit (or cocos2d) is fine too.
(The aim is to get a fire trail behind the bouncing ball, but I don't know how to rotate the particle emitter based on the ball's vector of movement. The emitter is currently added as a child of the ball node.)
To get the "direction" you should read the velocity property of the physicsBody of your node.
To rotate your particle emitter, you should modify the zRotation property of the node.
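The velocity-to-rotation step is just atan2 on the velocity vector. A framework-free sketch (illustrative names, not the SpriteKit API; this assumes the emitter's art points along the +x axis at zero rotation):

```javascript
// Angle of a motion vector, in radians, counterclockwise from the +x axis.
// Feed it the physics body's velocity and assign the result to zRotation.
function directionAngle(velocity) {
  return Math.atan2(velocity.dy, velocity.dx);
}

// Examples: moving straight up gives PI/2, moving right gives 0.
const up = directionAngle({ dx: 0, dy: 10 });
const right = directionAngle({ dx: 10, dy: 0 });
```

If the emitter's art points up rather than right at zero rotation, subtract PI/2 from the result before assigning it.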
The best way to get a node's rotation is to read its eulerAngles. Since that returns radians, I converted to degrees. This is the code I used to get the rotation of a camera node:
print(round(GLKMathRadiansToDegrees(Float(gameController.camera.eulerAngles.z))))
Hope this helps.

How to clip part of a sprite based on its position?

I'm designing a game in Cocos2d, and at one point I have coins shooting out onto a platform from a Zelda-ish perspective. I'd like to display the coin's shadow sprite (a different sprite from the coin) on the platform, but mask or clip the shadow sprite at the edge of the platform. The coin can continue off the edge of the platform, but the shadow should stop at the edge. The platform also moves, so I need the shadow sprite to track with the platform's movement.
I thought it could work to use a CCClippingNode for this, but I can't add it as a child of anything in a spriteBatchNode which is how I'm making my platform. Without having the shadow as a child of the platform, I'll mess up z-order and the shadow movement won't track correctly. I also checked out Ray Wenderlich's tutorial on masking a sprite but I don't think that'll work since it looks like it masks an individual sprite texture and not an area of the view where the sprite shouldn't be displayed. Any ideas on how to solve this?
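One framework-free alternative to stencil clipping, assuming the platform bounds can be treated as an axis-aligned rectangle in world space: clamp the shadow's position to the platform rect each frame (so it "stops at the edge" as described) and hide it once the coin is far past the edge. A sketch with illustrative names; the hide threshold is a made-up tuning value:

```javascript
// Keep a shadow inside an axis-aligned platform rect. The rect is in world
// coordinates, so recompute it each frame as the platform moves.
function clampShadow(coinPos, platformRect) {
  const clamped = {
    x: Math.min(Math.max(coinPos.x, platformRect.x), platformRect.x + platformRect.w),
    y: Math.min(Math.max(coinPos.y, platformRect.y), platformRect.y + platformRect.h),
  };
  // Hide the shadow once the coin is well past the edge (30 is a guess;
  // tune to taste or fade alpha with this distance instead).
  const off = Math.abs(coinPos.x - clamped.x) + Math.abs(coinPos.y - clamped.y);
  return { position: clamped, visible: off < 30 };
}

// Example: a coin 20 units past the right edge keeps its shadow pinned there.
const r = clampShadow({ x: 120, y: 5 }, { x: 0, y: 0, w: 100, h: 10 });
```

This doesn't partially clip the sprite the way a stencil would, but it avoids the CCClippingNode/spriteBatchNode parenting conflict entirely.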

three.js - no update of scene lighting when light moving

I have a simple THREE.js scene where I want a point light to move dynamically with the camera. The movement works well, but the lighting in the scene is not updated immediately. When I change the camera position (and thus the light position), it takes some time (approx. 2-3 seconds) until the lighting in the scene renders correctly with respect to the new light position. I have already updated the matrix/matrixWorld of the lights in every frame. Is a further update needed? How do I tell Three.js that the light position has changed?
There is no specific way needed to tell THREE.js that light sources have changed their positions. What matters is how you update the transformation matrices. I have a lightGroup (an instance of Object3D) which is the parent node for all light sources in my scene. Updating the lightGroup transformation according to the camera works with this setup:
if (camera !== undefined)
{
lightGroup.position.x = camera.position.x;
lightGroup.position.y = camera.position.y;
lightGroup.position.z = camera.position.z;
lightGroup.lookAt(camera.target);
lightGroup.updateMatrix();
lightGroup.updateMatrixWorld();
}
The update of the transformation matrix is the key point here. Another approach is to hang the camera into the THREE.js scenegraph and attach the lights as its children, if you want to avoid a separate lightGroup node.
By the way: Specific update of THREE.js scene is only needed if you change the number of lightsources in a scene. See https://github.com/mrdoob/three.js/wiki/Updates at Materials.
I know it is old post, however maybe someone will need other solutions.
To move lights dynamically with the camera, you can just attach them to the camera.
directionalLight = new THREE.DirectionalLight(0xffffff, 0.6);
directionalLight.position.set(0, 20, 0);
directionalLight.rotation.set(20 / 180 * Math.PI, 0, 0);
directionalLight.castShadow = true;
camera.add(directionalLight);
scene.add(camera);
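Why parenting works: a child's world position is composed from its parent's world transform, so a light added to the camera follows it automatically. A framework-free sketch of that composition (ignoring rotation, which three.js also applies via the full world matrix):

```javascript
// World position of a node parented to the camera = camera position plus the
// node's local offset (rotation omitted for simplicity).
function childWorldPosition(cameraPos, localOffset) {
  return {
    x: cameraPos.x + localOffset.x,
    y: cameraPos.y + localOffset.y,
    z: cameraPos.z + localOffset.z,
  };
}

// A light at local offset (0, 20, 0) stays 20 units above the camera
// wherever the camera moves.
const a = childWorldPosition({ x: 0, y: 0, z: 0 }, { x: 0, y: 20, z: 0 });
const b = childWorldPosition({ x: 5, y: 1, z: -3 }, { x: 0, y: 20, z: 0 });
```

Note that the camera must itself be added to the scene (as in the snippet above) for its children to be rendered.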

Ogre3D Camera, RenderWindow and Viewport understanding

While looking through the tutorials I've seen the Ogre::Camera::getCameraToViewportRay method being used, and I was trying to understand what it does.
First I imagine a viewport, being placed somewhere in the 3D scene, let's say on the screen of the TV object. I can easily imagine how to transform the 2D coordinate on the viewport to the 3D coordinate of the scene and then to make a ray from the camera position point through that point on the VP.
But I cannot understand how it's done when the VP is on the RenderWindow (on my monitor). I mean, where is the render window in the scene, and where is the point on the render window's VP in the scene? How is the point on the render window's VP transformed into a 3D point of the scene?
Thanks for any answer!
The viewport shows what you see through a camera, but the viewport is in front of the camera.
There is a stackoverflow post with information about the relation of camera and viewport and a nice visual illustration: https://stackoverflow.com/a/7125486/2168872
The camera to viewport ray is a worldspace ray, starting from your camera and intersecting the viewport at a certain point, e.g. where your mouse cursor points to.
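The construction can be sketched framework-free. This is a simplified model, not Ogre's implementation: the camera sits at the origin looking down -z with a symmetric frustum, and the real getCameraToViewportRay additionally applies the camera's world transform to both origin and direction:

```javascript
// Build a ray through a normalized viewport point (u, v in [0, 1], with v
// growing downward on screen). fovY is the vertical field of view in radians;
// aspect = viewport width / height.
function cameraToViewportRay(u, v, fovY, aspect) {
  const halfH = Math.tan(fovY / 2); // half-height of the view plane at z = -1
  const halfW = halfH * aspect;
  // Map (u, v) onto that plane; flip v so +y points up in camera space.
  const dir = {
    x: (2 * u - 1) * halfW,
    y: (1 - 2 * v) * halfH,
    z: -1,
  };
  const len = Math.hypot(dir.x, dir.y, dir.z);
  return {
    origin: { x: 0, y: 0, z: 0 }, // camera position (world transform omitted)
    dir: { x: dir.x / len, y: dir.y / len, z: dir.z / len },
  };
}

// The viewport center (0.5, 0.5) yields the camera's view direction (0, 0, -1).
const center = cameraToViewportRay(0.5, 0.5, Math.PI / 2, 16 / 9);
```

So the render window never exists "in" the scene: the viewport point is mapped onto the camera's view plane, and the ray through it is expressed in world space by transforming with the camera's pose.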