Objective-C UIBezierPath, physics body - objective-c

I have clouds of different sizes (seen in the picture) and I'm trying to give them a physics body automatically, but I just don't get how to do it. I have a player; it's just something like a ball, so I could go for bodyWithCircleOfRadius.
I make the size variable for every spawned cloud with this code:
SKSpriteNode *cloud = [SKSpriteNode spriteNodeWithImageNamed:@"wolke"];
float sizeScale = [self getRandomNumberBetween:0.2 to:0.5];
cloud.xScale = sizeScale;
cloud.yScale = sizeScale;
Now somehow I have to fit a physics body to it. I'd be happy for any help. (Maybe an oval or ellipse, but how do I make the path variable to its size?)
Kind Regards

For this I would definitely go with bodyWithCircleOfRadius. The reason is that a perfectly precise physics body path around an object takes up too much extra computing power and should only be used when absolutely necessary. According to Apple:
When choosing a shape for your physics body, do not be overly precise. More complex shapes require more work to be properly simulated. For volume-based bodies, use the following guidelines:
A circle is the most efficient shape. A path-based polygon is the least efficient shape, and the computational work scales with the complexity of the polygon.
The code for a circle would start out like this:
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithImageNamed:@"sphere.png"];
sprite.physicsBody = [SKPhysicsBody bodyWithCircleOfRadius:sprite.size.width/2];
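Applied to the scaled clouds from the question, a minimal sketch might look like this (assuming the body is created after xScale/yScale are set, so cloud.size already reflects the scale; making the body non-dynamic is an assumption about how the clouds should behave):
SKSpriteNode *cloud = [SKSpriteNode spriteNodeWithImageNamed:@"wolke"];
float sizeScale = [self getRandomNumberBetween:0.2 to:0.5];
cloud.xScale = sizeScale;
cloud.yScale = sizeScale;
// size.width already includes the scale, so the radius adapts to each spawned cloud
cloud.physicsBody = [SKPhysicsBody bodyWithCircleOfRadius:cloud.size.width / 2];
cloud.physicsBody.dynamic = NO; // assumption: clouds act as static platforms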

Related

Shrinking a dirty rect

Trying to optimize a falling sand simulation, I'm implementing optimizations that the Noita devs talked about in their GDC talk. At around 10:45 they talk about how they use dirty rects. I've started trying to implement a similar system.
Currently, I am able to create a dirty rect that covers the particles that need updating. Every time a valid particle (one that is not air or a solid like a wall) is set inside a chunk, I call a function to update the dirty rect, passing the placed particle's position as an argument. From there, I can easily calculate the new min/max of the rectangle from this position.
Here's a gif of that working.
and here's the code for updating the rect:
public void UpdateDirtyRect(int2 newPos)
{
minX = Math.Min(minX, newPos.x);
minY = Math.Min(minY, newPos.y);
maxX = Math.Max(maxX, newPos.x);
maxY = Math.Max(maxY, newPos.y);
dirtyrect = .(.(minX, minY), .(maxX, maxY));
//Inflate by two pixels. Not doing this will cause the rect to not change size as particles update
dirtyrect = dirtyrect.Inflate(2);
}
The problem, as can be seen in the gif, is that I currently have no way to shrink the dirty rect. I can do a few things, such as detecting when a particle is erased or replaced with an air/solid particle on the boundary edge of the dirty rect, but I'm unsure what to do from there.
Here's one approach that might work for you (a small sketch follows the steps below):
1. Keep the dirty rectangle updated by the previous frame.
2. Compute the dirty rectangle updated by one frame only.
3. Combine these two rectangles into a single one that contains both of them.
4. Use the rectangle from step 3 to update the screen.
5. Replace the previous-frame rectangle with the one you computed in step 2, not the combined one from step 3; using the combined one would cause the same problem you're describing.
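Expressed as code, the bookkeeping is only a few lines. This is a minimal sketch using CGRect for brevity (the code above is in a different language); previousDirtyRect and the function name are placeholders, not from the project:
#import <CoreGraphics/CGGeometry.h>

// Kept between frames (e.g. as a field on the chunk).
static CGRect previousDirtyRect;

static CGRect DirtyRectForScreenUpdate(CGRect currentFrameDirtyRect) {
    // Step 3: redraw the union of last frame's rect and this frame's rect.
    CGRect combined = CGRectUnion(previousDirtyRect, currentFrameDirtyRect);
    // Step 5: remember only the single-frame rect, so the region can shrink again.
    previousDirtyRect = currentFrameDirtyRect;
    return combined; // Step 4: this is the region to update on screen.
}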

How to use a shaderModifier to alter the color of specific triangles in a SCNGeometry

First, before I go on, I have read through: SceneKit painting on texture with texture coordinates which seems to suggest I'm on the right track.
I have a complex SCNGeometry representing a hexasphere. It's rendering really well, and with a full 60fps on all of my test devices.
At the moment, all of the hexagons are being rendered with a single material, because, as I understand it, every SCNMaterial I add to my geometry adds another draw call, which I can't afford.
Ultimately, I want to be able to color each of the almost 10,000 hexagons individually, so adding another material for each one is not going to work.
I had been planning to limit the color range to (say) 100 colors, and then move hexagons between different geometries, each with their own colored material, but that won't work because SCNGeometry says it works with an immutable set of vertices.
So, my current thought/plan is to use a shader modifier as suggested by @rickster in the above-mentioned question to somehow modify the color of individual hexagons (or sets of 4 triangles).
The thing is, I sort of understand the Apple doco referred to, but I don't understand how to provide the shader with what I think must essentially be an array of colour information, somehow indexed so that the shader knows which triangles to give what colors.
The code I have now that creates the geometry reads as:
NSData *indiceData = [NSData dataWithBytes:oneMeshIndices length:sizeof(UInt32) * indiceIndex];
SCNGeometryElement *oneMeshElement =
    [SCNGeometryElement geometryElementWithData:indiceData
                                  primitiveType:SCNGeometryPrimitiveTypeTriangles
                                 primitiveCount:indiceIndex / 3
                                  bytesPerIndex:sizeof(UInt32)];
[oneMeshElements addObject:oneMeshElement];
SCNGeometrySource *oneMeshNormalSource =
    [SCNGeometrySource geometrySourceWithNormals:oneMeshNormals count:normalIndex];
SCNGeometrySource *oneMeshVerticeSource =
    [SCNGeometrySource geometrySourceWithVertices:oneMeshVertices count:vertexIndex];
SCNGeometry *oneMeshGeom =
    [SCNGeometry geometryWithSources:[NSArray arrayWithObjects:oneMeshVerticeSource, oneMeshNormalSource, nil]
                            elements:oneMeshElements];
SCNMaterial *mat1 = [SCNMaterial material];
mat1.diffuse.contents = [UIColor greenColor];
oneMeshGeom.materials = @[mat1];
SCNNode *node = [SCNNode nodeWithGeometry:oneMeshGeom];
If someone can shed some light on how to provide the shader with a way to color each triangle indexed by the indices in indiceData, that would be fantastic.
EDIT
I've tried providing the shader with a texture as a container for color information that would be indexed by the VertexID, however it seems that SceneKit doesn't make the VertexID available. My thought was to provide this texture (actually just an array of bytes, 1 per hexagon on the hexasphere) via the SCNMaterialProperty class and then, in the shader, pull out the appropriate byte based on the vertex number. That byte would be used to index an array of fixed colors, and the resultant color for each vertex would then give the desired result.
Without a VertexID, this idea won't work, unless there is some other, similarly useful piece of data...
EDIT 2
Perhaps I am stubborn. I've been trying to get this to work, and as an experiment I created an image that is basically a striped rainbow and wrote the following shader, thinking it would colour my sphere with the rainbow.
It doesn't work. The entire sphere is drawn using the colour in the top left corner of the image.
My shaderModifier code is:
#pragma arguments
sampler2D colorMap;
uniform sampler2D colorMap;
#pragma body
vec4 color = texture2D(colorMap, _surface.diffuseTexcoord);
_surface.diffuse.rgba = color;
and I apply this using the code:
SCNMaterial *mat1 = [SCNMaterial material];
mat1.locksAmbientWithDiffuse = YES;
mat1.doubleSided = YES;
mat1.shaderModifiers = @{SCNShaderModifierEntryPointSurface :
    @"#pragma arguments\nsampler2D colorMap;\nuniform sampler2D colorMap;\n#pragma body\nvec4 color = texture2D(colorMap, _surface.diffuseTexcoord);\n_surface.diffuse.rgba = color;"};
colorMap = [SCNMaterialProperty materialPropertyWithContents:[UIImage imageNamed:@"rainbow.png"]];
[mat1 setValue:colorMap forKeyPath:@"colorMap"];
I had thought that the _surface.diffuseTexcoord would be appropriate but I'm beginning to think I need to somehow map that to a coordinate in the image by knowing the dimensions of the image and interpolating somehow.
But if this is the case, what units are _surface.diffuseTexcoord in? How do I know the min/max range of this so that I can map it to the image?
Once again, I'm hoping someone can steer me in the right direction if these attempts are wrong.
EDIT 3
OK, so I know I'm on the right track now. I've realised that by using _surface.normal instead of _surface.diffuseTexcoord I can use that as a latitude/longitude on my sphere to map to an x,y in the image, and I now see the hexagons being colored based on the color in the colorMap. However, no matter what I do (so far), the normal angles seem to be fixed in relation to the camera position, so when I move the camera to look at a different point of the sphere, the colorMap doesn't rotate with it.
Here is the latest shader code:
#pragma arguments
sampler2D colorMap;
uniform sampler2D colorMap;
#pragma body
float x = ((_surface.normal.x * 57.29577951) + 180.0) / 360.0;
float y = 1.0 - ((_surface.normal.y * 57.29577951) + 90.0) / 180.0;
vec4 color = texture2D(colorMap, vec2(x, y));
_output.color.rgba = color;
ANSWER
So I solved the problem. It turned out that there was no need for a shader to achieve my desired results.
The answer was to use a mappingChannel to provide the geometry with a set of texture coordinates for each vertex. These texture coordinates are used to pull color data from the appropriate texture (it all depends on how you set up your material).
So, whilst I did manage to get a shader working, there were performance issues on older devices, and using a mappingChannel was much, much better, working at 60fps on all devices now.
I did find, though, that despite the documentation saying that a mapping channel is a series of CGPoint objects, that wouldn't work on 64-bit devices because CGPoint uses doubles instead of floats there.
I needed to define my own struct:
typedef struct {
    float x;
    float y;
} MyPoint;
MyPoint oneMeshTextureCoordinates[vertexCount];
and then, having built up an array of these (one for each vertex), I created the mappingChannel source as follows:
SCNGeometrySource *textureMappingSource =
    [SCNGeometrySource geometrySourceWithData:[NSData dataWithBytes:oneMeshTextureCoordinates
                                                              length:sizeof(MyPoint) * vertexCount]
                                     semantic:SCNGeometrySourceSemanticTexcoord
                                  vectorCount:vertexCount
                              floatComponents:YES
                          componentsPerVector:2
                            bytesPerComponent:sizeof(float)
                                   dataOffset:0
                                   dataStride:sizeof(MyPoint)];
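For illustration, one way to fill oneMeshTextureCoordinates is to give every vertex of a hexagon the same texture coordinate, pointing at the texel that holds that hexagon's colour. This is only a sketch; hexagonCount, verticesPerHexagon and the 1 x hexagonCount colour-strip layout are assumptions, not taken from the project above:
// Assumed layout: the colour texture is a 1 x hexagonCount strip, one texel per hexagon.
for (int hex = 0; hex < hexagonCount; hex++) {
    float u = (hex + 0.5f) / (float)hexagonCount; // centre of texel 'hex'
    for (int v = 0; v < verticesPerHexagon; v++) {
        // every vertex of this hexagon samples the same texel, so the whole tile is one colour
        oneMeshTextureCoordinates[hex * verticesPerHexagon + v] = (MyPoint){ u, 0.5f };
    }
}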
EDIT:
In response to a request, here is a project that demonstrates how I use this. https://github.com/pkclsoft/HexasphereDemo

How can I move a CCNode along an arc, with linear animation speed?

I'm trying to animate a CCNode in a semi circle motion and have it constantly move at the same speed. I thought I could achieve this with Bezier animation.
I'm trying to find the correct implementation to run an action with CCActionBezierBy (ref) that will not have an ease rate at all.
CGFloat duration = 5;
// bezierConfig is already set
CGFloat rate = 0.0f;
id action = [CCActionBezierBy actionWithDuration:duration bezier:bezierConfig];
id ease = [CCActionEaseRate actionWithAction:action rate:rate];
id spawn = [CCActionSpawn actions:action, ease, nil];
As I manipulate the rate I can see results, with 0 being the lowest ease animation. But how can I make the animation completely linear?
Place the moving node inside a parent node. Its distance from the parent's origin becomes the radius of the motion. Then make two rotation actions: one rotating the parent at a constant speed, and one rotating the node itself in the opposite direction so it keeps its own orientation.
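A minimal sketch of that idea in cocos2d 3.x terms (node is the CCNode to move; orbitCenter and radius are assumed values; the counter-rotation is only needed if the node should keep facing the same way):
CCNode *pivot = [CCNode node];
pivot.position = orbitCenter;          // assumed centre of the arc
[self addChild:pivot];

node.position = ccp(radius, 0);        // offset from the pivot = orbit radius
[pivot addChild:node];

// Constant angular speed around the pivot gives constant linear speed along the arc.
[pivot runAction:[CCActionRepeatForever actionWithAction:
                     [CCActionRotateBy actionWithDuration:5.0 angle:360.0]]];

// Counter-rotate the node so its own orientation stays fixed while it orbits.
[node runAction:[CCActionRepeatForever actionWithAction:
                    [CCActionRotateBy actionWithDuration:5.0 angle:-360.0]]];
For a single semicircle rather than a continuous orbit, run a plain CCActionRotateBy with an angle of 180 instead of wrapping it in CCActionRepeatForever.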

Troubles Prepositioning Node Using Coordinate System Conversions

Okay so I have been trying to preposition a sprite node before adding it to the scene. The only problem is that I need to know the (0, 0.5) or (left, middle) position of the node, in scene coordinates before I can position it properly.
I know about the convertPoint:(CGPoint) toNode/fromNode:(SKSpriteNode *) methods and currently I have worked out the following within the Scene's code:
[node convertPoint:CGPointMake(0,0.5) toNode: self]
I also wasn't sure if it was confusing self (the scene) with self (the node), so I tried
SKScene *scene = self;
[node convertPoint:CGPointMake(0,0.5) toNode: scene]
I am pretty sure that I didn't have to make the distinction, but I tried anyway.
The logged result of both attempts was (0,0.5).
The node.position is (50, 100).
In case the above is not clear, I am trying to find the position on the edge of the frame, which should be equal to the node's width. The reason why I am not using width, though, is because I am placing it with respect to another node and the two nodes may be rotated.
The theories I am trying to reference are from Apple's SpriteKit Programming Guide.
If there is an easier way to establish a distance between two nodes based on the width of one node, taking rotation into account, feel free to post it; I would love to know, although I still need the node conversion for other methods.
Thank you in advance for all of your help.
You shouldn't change the anchor point once the node has been added, as it will inherently change its position. If you are using anchor (0.5, 0.5) to rotate the nodes, leave it like that. If you want to get the maxX point from a rotated node you could do something like this:
SKSpriteNode *sprite2 = [SKSpriteNode spriteNodeWithColor:[UIColor blackColor] size:CGSizeMake(50, 50)];
float angle = - sprite.zRotation;
CGPoint dirVect = CGPointMake(cosf(angle), sinf(angle));
CGFloat distance = sprite.frame.size.width/2 + sprite2.frame.size.width/2;
CGPoint destPoint = CGPointMake(sprite.position.x + (dirVect.x * distance),
                                sprite.position.y + (dirVect.y * distance));
sprite2.position = destPoint;
[self addChild:sprite2];
Where sprite is the node you have rotated and sprite2 is the node you want to add with respect to the first node. distance should be the distance (excuse the pun) between the anchor points of the two nodes.
Let me know if this is what you are looking for. If not, a screenshot would help :)

Applying a vortex / whirlpool effect in Box2d / Cocos2d for iPhone

I've used Nick Vellios' tutorial to create radial gravity with a Box2D object. I am aware of Make a Vortex here on SO, but I couldn't figure out how to implement it in my project.
I have made a vortex object, which is a Box2D circleShape sensor that rotates with a consistent angular velocity. When other Box2D objects contact this vortex object I want them to rotate around at the same angular velocity as the vortex, gradually getting closer to the vortex's centre. At the moment the object is attracted to the vortex's centre, but it heads straight for the centre rather than spinning around it slowly like I want it to. It will also travel against the vortex's rotation just as readily as with it.
Given a vortex and a Box2D body, how can I set the body to rotate with the vortex as it gets 'sucked in'?
I set the rotation of the vortex when I create it like this:
b2BodyDef bodyDef;
bodyDef.type = b2_dynamicBody;
bodyDef.angle = 2.0f;
bodyDef.angularVelocity = 2.0f;
Here is how I'm applying the radial gravity, as per Nick Vellios' sample code.
-(void)applyVortexForcesOnSprite:(CCSpriteSubclass*)sprite spriteBody:(b2Body*)spriteBody withVortex:(Vortex*)vortex VortexBody:(b2Body*)vortexBody vortexCircleShape:(b2CircleShape*)vortexCircleShape{
//From RadialGravity.xcodeproj
b2Body* ground = vortexBody;
b2CircleShape* circle = vortexCircleShape;
// Get position of our "Planet" - Nick
b2Vec2 center = ground->GetWorldPoint(circle->m_p);
// Get position of our current body in the iteration - Nick
b2Vec2 position = spriteBody->GetPosition();
// Get the distance between the two objects. - Nick
b2Vec2 d = center - position;
// The further away the objects are, the weaker the gravitational force is - Nick
float force = 1 / d.LengthSquared(); // 150 can be changed to adjust the amount of force - Nick
d.Normalize();
b2Vec2 F = force * d;
// Finally apply a force on the body in the direction of the "Planet" - Nick
spriteBody->ApplyForce(F, position);
//end radialGravity.xcodeproj
}
Update: I think iforce2d has given me enough info to get on my way; now it's just tweaking. This is what I'm doing at the moment, in addition to the above code. What is happening is that the body gains enough velocity to exit the vortex's gravity well, so somewhere I'll need to check that the velocity stays below this figure. I'm a little concerned that I'm not taking the object's mass into account at the moment.
b2Vec2 vortexVelocity = vortexBody->GetLinearVelocityFromWorldPoint(spriteBody->GetPosition() );
b2Vec2 vortexVelNormal = vortexVelocity;
vortexVelNormal.Normalize();
b2Vec2 bodyVelocity = b2Dot( vortexVelNormal, spriteBody->GetLinearVelocity() ) * vortexVelNormal;
//Using a force
b2Vec2 vel = bodyVelocity;
float forceCircleX = .6 * bodyVelocity.x;
float forceCircleY = .6 * bodyVelocity.y;
spriteBody->ApplyForce( b2Vec2(forceCircleX,forceCircleY), spriteBody->GetWorldCenter() );
It sounds like you just need to apply another force according to the direction of the vortex at the current point of the body. You can use b2Body::GetLinearVelocityFromWorldPoint to find the velocity of the vortex at any point in the world. From Box2D source:
/// Get the world linear velocity of a world point attached to this body.
/// @param a point in world coordinates.
/// @return the world velocity of a point.
b2Vec2 GetLinearVelocityFromWorldPoint(const b2Vec2& worldPoint) const;
So that would be:
b2Vec2 vortexVelocity = vortexBody->GetLinearVelocityFromWorldPoint( suckedInBody->GetPosition() );
Once you know the velocity you're aiming for, you can calculate how much force is needed to go from the current velocity, to the desired velocity. This might be helpful: http://www.iforce2d.net/b2dtut/constant-speed
The topic in that link only discusses a 1-dimensional situation. For your case it is also essentially 1-dimensional, if you project the current velocity of the sucked-in body onto the vortexVelocity vector:
b2Vec2 vortexVelNormal = vortexVelocity;
vortexVelNormal.Normalize();
b2Vec2 bodyVelocity = b2Dot( vortexVelNormal, suckedInBody->GetLinearVelocity() ) * vortexVelNormal;
Now bodyVelocity and vortexVelocity will be in the same direction and you can calculate how much force to apply. However, if you simply apply enough force to match the vortex velocity exactly, the sucked-in body will probably go into orbit around the vortex and never actually get sucked in. I think you would want to make the force quite a bit less than that, and I would scale it down according to the gravity strength as well, otherwise the sucked-in body will be flung away sideways as soon as it contacts the outer edge of the vortex. It could take a lot of tweaking to get the effect you want.
EDIT:
The force you apply should be based on the difference between the current velocity (bodyVelocity) and the desired velocity (vortexVelocity), i.e. if the body is already moving with the vortex then you don't need to apply any force. Take a look at the last code block in the sub-section titled 'Using forces' in the link I gave above. The last three lines there do pretty much what you need if you replace 'vel' and 'desiredVel' with the sizes of your bodyVelocity and vortexVelocity vectors:
float desiredVel = vortexVelocity.Length();
float currentVel = bodyVelocity.Length();
float velChange = desiredVel - currentVel;
float force = body->GetMass() * velChange / (1/60.0); //for a 1/60 sec timestep
body->ApplyForce( b2Vec2(force,0), body->GetWorldCenter() );
But remember this would probably put the body into orbit, so somewhere along the way you would want to reduce the size of the force you apply, eg. reduce 'desiredVel' by some percentage, reduce 'force' by some percentage etc. It would probably look better if you could also scale the force down so that it was zero at the outer edge of the vortex.
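One hedged way to do that scaling, reusing force and vortexVelNormal from the snippets above (vortexRadius is an assumed variable holding the vortex sensor's radius, and the sucked-in body is called suckedInBody as in the earlier snippet):
// Scale the catch-up force down with distance from the vortex centre:
// full strength at the centre, zero at the outer edge.
float distToCenter = (vortexBody->GetWorldCenter() - suckedInBody->GetPosition()).Length();
float falloff = b2Clamp(1.0f - distToCenter / vortexRadius, 0.0f, 1.0f);
suckedInBody->ApplyForce((falloff * force) * vortexVelNormal, suckedInBody->GetWorldCenter());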
I had a project where I had asteroids swirling around a central point (there are things jumping between them...which is a different point).
They are connected to the "center" body via b2DistanceJoints.
You can control the joint length to make them slowly spiral inward (or outward). This gives you fine-grained control, instead of having to balance forces, which may be difficult.
You also apply tangential force to make them circle the center.
By applying different (or randomly changing) tangential forces, you can make them crash into each other, etc.
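A rough sketch of that setup in Box2D (C++, as in the code above; centerBody, asteroidBody, world, the shrink rate, and the force magnitude are assumptions for illustration):
// One-time setup: tie the asteroid to the central body with a distance joint.
b2DistanceJointDef jointDef;
jointDef.Initialize(centerBody, asteroidBody,
                    centerBody->GetWorldCenter(), asteroidBody->GetWorldCenter());
b2DistanceJoint *joint = (b2DistanceJoint *)world->CreateJoint(&jointDef);

// Each step: shorten the joint slightly so the asteroid spirals inward,
// and push it tangentially so it keeps circling the centre.
joint->SetLength(joint->GetLength() * 0.999f);
b2Vec2 toCenter = centerBody->GetWorldCenter() - asteroidBody->GetWorldCenter();
b2Vec2 tangent(-toCenter.y, toCenter.x); // perpendicular to the radius
tangent.Normalize();
asteroidBody->ApplyForce(5.0f * tangent, asteroidBody->GetWorldCenter());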
I posted a more complete answer to this question here.