How to turn mouse position and a number into a speed in XYZ - game-development

I have been working on plane controls for a while. Is there a way to convert a look vector (mouse x and y) and a speed into an x, y, z velocity?
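For reference, a minimal Kotlin sketch of one common approach, assuming the mouse position has already been mapped to yaw and pitch angles in radians; the axis convention here (Y up, Z forward) is an assumption and will differ between engines:

import kotlin.math.cos
import kotlin.math.sin

// Turn a mouse-derived look direction (yaw, pitch) plus a speed into a velocity:
// build a unit direction vector from the angles, then scale it by the speed.
fun lookToVelocity(yaw: Double, pitch: Double, speed: Double): Triple<Double, Double, Double> {
    val dirX = cos(pitch) * sin(yaw)   // left/right
    val dirY = sin(pitch)              // up/down
    val dirZ = cos(pitch) * cos(yaw)   // forward
    return Triple(dirX * speed, dirY * speed, dirZ * speed)
}

fun main() {
    println(lookToVelocity(yaw = 0.3, pitch = 0.1, speed = 50.0))
}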

Related

How to make a mountainous area flat in Blender?

How can I make this selected area flat and not mountainous?
Image of problem
Scale the selected faces to 0 on the Z axis. You can do that by pressing S for the scale operator, then Z to constrain the operation to the Z axis only, and then 0 to set the Z scale to zero.

X axis vs Y axis Kotlin

I drew a perfect large square on a canvas and then split that large square into 15x15 smaller squares.
Now I am moving a bitmap (an image) along the X and Y axes. The bitmap is the same size as the smaller square.
It moves along the X axis perfectly, one square at a time, but it covers more than one square along the Y axis, so I have to multiply by approximately 0.93 to get to the size of the smaller square. The problem is that this gets more complicated across different devices (cellphone vs. tablet).
Does anybody know why the Y axis behaves differently from the X axis?
Thank you in advance...
Most displays have slightly different densities on the X and Y axes. Using my device and running the following command:
adb shell dumpsys display
searching for "density", I got the result:
PhysicalDisplayInfo{1080 x 2280, 60.000004 fps, density 3.0, 442.451 x 438.727 dpi, secure true, appVsyncOffset 0, bufferDeadline 17666666}
Note the 442.451 x 438.727 dpi figures: the X and Y densities are not identical. Depending on how you're declaring your square, that may be the cause.
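A hedged Kotlin sketch of how you could account for this at runtime instead of hard-coding a correction factor like 0.93; the helper name and the physical cell size are my own illustration, not part of the original answer:

import android.util.DisplayMetrics

// Compute the pixel size of a grid cell that is physically square, using the
// per-axis densities reported by the display instead of assuming xdpi == ydpi.
fun physicallySquareCellPx(metrics: DisplayMetrics, sideInches: Float = 0.2f): Pair<Float, Float> {
    val cellWidthPx = sideInches * metrics.xdpi   // pixels per inch along X
    val cellHeightPx = sideInches * metrics.ydpi  // pixels per inch along Y
    return Pair(cellWidthPx, cellHeightPx)
}

// Usage inside an Activity or View:
// val (cellW, cellH) = physicallySquareCellPx(resources.displayMetrics)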

How to convert relative GPS coordinates to a "local custom" x, y, z coordinate?

Let's say two persons are standing at GPS locations A and B, and A is looking at B.
I would like to know B's (x, y, z) coordinates relative to A, where the +y axis is the direction towards B (since A is looking at B) and +z points vertically to the sky (therefore +x is A's right-hand side).
I know how to convert a GPS coordinate to UTM, but in this case a coordinate-system rotation and translation also seem to be needed. I am going to come up with a calculation, but before that, is there some existing code to look at?
I think this must be handled by many applications, but I have not found anything so far.
1. Convert both points to 3D Cartesian
GPS suggests WGS84, so see How to convert a spherical velocity coordinates into cartesian.
2. Construct a transform matrix with your desired axes
See Understanding 4x4 homogenous transform matrices. You need 3 perpendicular unit vectors. Y is the view direction, so
Y = normalize(B - A);
One of the axes will most likely be the up vector, so you can use the approximation
Z = normalize(A);
and as the origin you can use point A directly. Now just exploit the cross product to create X perpendicular to both, and also make Y perpendicular to X and Z (so up stays up). For more info see Representing Points on a Circular Radar Math approach.
3. Transform B to B' by that matrix
Again, the QA linked in #1 shows how to do it. It is a simple matrix/vector multiplication.
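A hedged Kotlin sketch of steps 2 and 3, assuming both points have already been converted to 3D Cartesian (Earth-centred) coordinates in step 1; the Vec3 helper and the function name are mine, not from the linked QAs:

import kotlin.math.sqrt

// Minimal 3D vector helpers for the sketch below.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
    fun cross(o: Vec3) = Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x)
    fun normalized(): Vec3 { val l = sqrt(dot(this)); return Vec3(x / l, y / l, z / l) }
}

// Build the local frame (x right, y toward B, z up) with origin at A,
// then express B in that frame by projecting onto the new axes.
fun localCoordsOfB(a: Vec3, b: Vec3): Vec3 {
    val view = (b - a).normalized()             // rough view direction A -> B
    val zAxis = a.normalized()                  // approximate "up" (away from Earth's centre)
    val xAxis = view.cross(zAxis).normalized()  // right-hand side of A, perpendicular to both
    val yAxis = zAxis.cross(xAxis)              // forward, re-made perpendicular so up stays up
    val d = b - a                               // use A as the origin
    return Vec3(d.dot(xAxis), d.dot(yAxis), d.dot(zAxis))
}

fun main() {
    // Hypothetical Earth-centred coordinates in metres, just to exercise the function.
    val a = Vec3(4_000_000.0, 3_000_000.0, 3_500_000.0)
    val b = Vec3(4_000_100.0, 3_000_050.0, 3_500_020.0)
    println(localCoordsOfB(a, b))   // B expressed in A's local frame
}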

Obtaining 3D location of an object being looked at by a camera with known position and orientation

I am building an augmented reality application and I have the yaw, pitch, and roll for the camera. I want to start placing objects in the 3D environment. I want to make it so that when the user clicks, a 3D point pops up right where the camera is pointed (center of the 2D screen) and when the user moves, the point moves accordingly in 3D space. The camera does not change position, only orientation. Is there a proper way to recover the 3D location of this point? We can assume that all points are equidistant from the camera location.
I am able to accomplish this independently for two axes (OpenGL default orientation). This works for changes in the vertical axis:
x = -sin(pitch)
y = cos(pitch)
z = 0
This also works for changes in the horizontal axis:
x = 0
y = -sin(yaw)
z = cos(yaw)
I was thinking that I should just combine these into:
x = -sin(pitch)
y = sin(yaw) * cos(pitch)
z = cos(yaw)
and that seems to be close, but not exactly correct. Any suggestions would be greatly appreciated!
It sounds like you just want to convert from a rotation vector (pitch, yaw, roll) to a rotation matrix. The conversion can be seen in the Wikipedia article on rotation matrices. The idea is that once you have constructed your matrix, transforming any point is simply
final_pos = rot_mat * initial_pos
where the final and initial positions are 3x1 vectors and rot_mat is a 3x3 matrix.
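A hedged Kotlin sketch of that idea; the rotation order (yaw about Y, then pitch about X) and the OpenGL-style forward vector (0, 0, -1) are assumptions, so adjust them to your own convention:

import kotlin.math.cos
import kotlin.math.sin

// Build a rotation matrix R = Ry(yaw) * Rx(pitch) and apply it to the camera's
// initial forward direction with a plain 3x3 matrix-vector multiply.
fun rotationYawPitch(yaw: Double, pitch: Double): Array<DoubleArray> {
    val (cy, sy) = Pair(cos(yaw), sin(yaw))
    val (cp, sp) = Pair(cos(pitch), sin(pitch))
    return arrayOf(
        doubleArrayOf(cy, sy * sp, sy * cp),
        doubleArrayOf(0.0, cp, -sp),
        doubleArrayOf(-sy, cy * sp, cy * cp)
    )
}

fun multiply(m: Array<DoubleArray>, v: DoubleArray): DoubleArray =
    DoubleArray(3) { r -> m[r][0] * v[0] + m[r][1] * v[1] + m[r][2] * v[2] }

fun main() {
    val rotMat = rotationYawPitch(yaw = 0.5, pitch = 0.2)
    val initialPos = doubleArrayOf(0.0, 0.0, -1.0)   // OpenGL-style forward vector (assumption)
    val finalPos = multiply(rotMat, initialPos)      // final_pos = rot_mat * initial_pos
    println(finalPos.joinToString())                 // unit direction; scale by distance to the point
}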

how to get trajectory of an object1 when we swipe an object2?

I am writing a game app for iPad using cocos2d, and the game is in landscape mode. It has a gun sprite that shoots, positioned at the middle, (512, 10).
The targets appear along the x-axis. By swiping on the gun sprite I have to generate the trajectory of the bullet according to the angle of the swipe.
So I have the initial and final touch coordinates on the gun, and the angle. How can I get the trajectory?
Thank You.
Assuming the ground is flat, no air resistance, and the bullet is fired at coordinates (0, 0), the formula for height as a function of distance travelled along the ground is as follows:
a = launch angle
v = launch speed
x = distance travelled along the ground
y = distance above the ground
g = acceleration due to gravity.
y(x) = (x * tan(a)) - ( ( (g / ( cos(a) * cos(a) ) ) / (2 * v * v) ) * (x * x) )
Check what units your maths/trigonometry library uses for angles (degrees or radians)
So, assuming the bullet is moving in +ve x direction, plot (0, y(0)), (1, y(1)), (2, y(2)) etc. until y(x) is < 0, meaning that the bullet has hit the ground.
(Don't forget to add 512 to x, and 10 to y when plotting, to match the start point at your gun sprite position).
Here endeth the maths lesson. Over to you on the iPad code.
If you want to get really fancy, the Wikipedia Trajectory page is fairly thorough.
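A hedged Kotlin sketch of that sampling loop; the step size, gravity value, and the assumption that x, y, v and g all use consistent units are mine to illustrate, and the resulting points would still need to be drawn with whatever cocos2d API you use:

import kotlin.math.cos
import kotlin.math.tan

// Step along x, evaluate y(x) from the formula above, and stop once the bullet
// is back at ground level. The gun offset (512, 10) is added to each point,
// as suggested in the answer.
fun trajectoryPoints(angleRad: Double, speed: Double, g: Double = 9.81, step: Double = 1.0): List<Pair<Double, Double>> {
    val points = mutableListOf<Pair<Double, Double>>()
    var x = 0.0
    while (true) {
        val y = x * tan(angleRad) - (g / (cos(angleRad) * cos(angleRad))) / (2 * speed * speed) * (x * x)
        if (y < 0) break                        // bullet has hit the ground
        points.add(Pair(x + 512.0, y + 10.0))   // shift to the gun sprite's position
        x += step
    }
    return points
}

fun main() {
    val pts = trajectoryPoints(angleRad = Math.toRadians(45.0), speed = 30.0)
    println("first: ${pts.first()}, last: ${pts.last()}, count: ${pts.size}")
}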