How to include function in documentation test example? - testing

I am trying to make a simple documentation example but it doesn't compile. I tried:
/// Given an axis in 3D, returns the indices of the 3 basis axes in the order such
/// that the first index represents forward (the input), the next the side, and the
/// final the up. I.e. it computes the even permutation of basis indices that creates
/// the rotation aligning the basis with the selected axis.
///
/// # Arguments
///
/// * `index` - an index from 0-2 selecting the forward axis.
///
/// # Examples
///
/// ```
/// use crate::axis_index_to_basis_indices;
/// let x_index = 0;
/// let y_index = 1;
/// let z_index = 2;
/// let (forward, side, up) = axis_index_to_basis_indices(x_index);
/// assert!(forward == x_index);
/// assert!(side == y_index);
/// assert!(up == z_index);
///
/// let (forward, side, up) = axis_index_to_basis_indices(z_index);
/// assert!(forward == z_index);
/// assert!(side == x_index);
/// assert!(up == y_index);
/// ```
fn axis_index_to_basis_indices(index: i32) -> (i32, i32, i32) {
    match index {
        0 => (0, 1, 2),
        1 => (1, 2, 0),
        2 => (2, 0, 1),
        _ => panic!(),
    }
}
Which gives:
error[E0432]: unresolved import `crate::axis_index_to_basis_indices`
--> dual_contouring.rs:107:5
|
3 | use crate::axis_index_to_basis_indices;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ no `axis_index_to_basis_indices` in the root
error: aborting due to previous error
For more information about this error, try `rustc --explain E0432`.
Couldn't compile the test.
failures:
dual_contouring.rs - axis_index_to_basis_indices (line 106)
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.02s
error: doctest failed, to rerun pass `--doc`
The official docs are incomplete in this regard. Reading them, one would assume the use statement should be use doc::name_of_crate, but that doesn't work either.

Related

Not using vertex attributes based on push constants

I have a GLSL vertex shader where one of the attributes is only used if a push constant is set:
layout(location = 0) in ivec2 a_pos;
layout(location = 1) in ivec2 a_nrm;
layout(location = 2) in float a_Height;
void main()
{
<...>
float Offset = ( u_enabling_flag > 0.0 ) ? a_Height : 0.0;
< some calculation involving Offset >
I get the following validation error:
vkDebug: Validation: 0: Validation Error: [ UNASSIGNED-CoreValidation-Shader-InputNotProduced ] Object 0: handle = 0x3a000000003a, type = VK_OBJECT_TYPE_SHADER_MODULE; | MessageID = 0x23e43bb7 | Vertex shader consumes input at location 2 but not provided
The graphical output looks fine but is there a possibility to get rid of the error?
Regards.
The graphical output looks fine but is there a possibility to get rid of the error?
"Vertex shader consumes input at location 2 but not provided"
Remove the input at location 2 from the shader, or attach a buffer binding at that location.

How to use inline documentation for tests in rust

I am trying to get rust to test an example from my inline documentation. I wrote the following code:
#[derive(Clone)]
#[derive(PartialEq)]
#[derive(Debug)]
enum Color {
    Black = 1,
    Empty = 0,
    White = -1,
}
/// Changes Color to the next player
///
/// Returns: White if player is Black, Black if player is White and Empty if
/// player is Empty.
///
/// Example
/// ```
/// assert_eq!(flip(&Color::White),Color::Black);
///```
// Invariant Color must represent Black as 1, Empty as 0 and White as -1!
fn flip(player: &Color) -> Color {
    let int_representation: i8 = (player.clone() as i8) * (-1);
    unsafe {
        std::mem::transmute(int_representation)
    }
}
fn main() {
    assert_eq!(flip(&Color::White), Color::Black);
}
Then I run
rustdoc --test src/main.rs
Which gave me:
running 1 test
test src/main.rs - flip (line 16) ... FAILED
failures:
---- src/main.rs - flip (line 16) stdout ----
error[E0433]: failed to resolve: use of undeclared type or module `Color`
--> src/main.rs:17:18
|
3 | assert_eq!(flip(&Color::White),Color::Black);
| ^^^^^ use of undeclared type or module `Color`
error[E0433]: failed to resolve: use of undeclared type or module `Color`
--> src/main.rs:17:32
|
3 | assert_eq!(flip(&Color::White),Color::Black);
| ^^^^^ use of undeclared type or module `Color`
error[E0425]: cannot find function `flip` in this scope
--> src/main.rs:17:12
|
3 | assert_eq!(flip(&Color::White),Color::Black);
| ^^^^ not found in this scope
error: aborting due to 3 previous errors
Some errors have detailed explanations: E0425, E0433.
For more information about an error, try `rustc --explain E0425`.
Couldn't compile the test.
failures:
src/main.rs - flip (line 16)
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out
How can I get rustdoc to find `flip` and `Color`? The test runs fine in the main function. I have also tried the command:
cargo test
but that did not run any tests.
I have tried adding the following line to the example:
/// use crate::{flip, Color};
making:
/// Changes Color to the next player
///
/// Returns: White if player is Black, Black if player is White and Empty if
/// player is Empty.
///
/// Example
/// ```
/// use crate::{flip, Color};
/// assert_eq!(flip(&Color::White),Color::Black);
///```
but that gives an error
martin#martin-laptop:~/test_code$ rustdoc --test src/main.rs
running 1 test
test src/main.rs - main (line 23) ... FAILED
failures:
---- src/main.rs - main (line 23) stdout ----
error[E0432]: unresolved import `crate::Color`
--> src/main.rs:24:14
|
3 | use crate::{ Color};
| ^^^^^ no `Color` in the root
error[E0425]: cannot find function `flip` in this scope
--> src/main.rs:25:12
|
4 | assert_eq!(flip(&Color::White),Color::Black);
| ^^^^ not found in this scope
error: aborting due to 2 previous errors
Some errors have detailed explanations: E0425, E0432.
For more information about an error, try `rustc --explain E0425`.
Couldn't compile the test.
failures:
src/main.rs - main (line 23)
test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out
I have also tried making `Color` and `flip` public:
#[derive(Clone)]
#[derive(PartialEq)]
#[derive(Debug)]
pub enum Color {
    Black = 1,
    Empty = 0,
    White = -1,
}
/// Changes Color to the next player
///
/// Returns: White if player is Black, Black if player is White and Empty if
/// player is Empty.
///
/// Example
/// ```
/// use crate::{flip, Color};
/// use std::env;
/// assert_eq!(flip(&Color::White),Color::Black);
///```
// Invariant Color must represent Black as 1, Empty as 0 and White as -1!
pub fn flip(player: &Color) -> Color {
    let int_representation: i8 = (player.clone() as i8) * (-1);
    unsafe {
        std::mem::transmute(int_representation)
    }
}
fn main() {
assert_eq!(flip(&Color::White),Color::Black);
}
but that gave the same error.
Doc tests (tests inside /// ```) are compiled separately as tiny programs of their own. Therefore:
They can only access public items: pub mod, pub fn, pub struct, ...
They can only access items from library crates, i.e. crates that export items for use by other crates. If your program is in main.rs then it's a binary crate, and its items cannot be imported at all.
You have to fully qualify or use the names, like use my_library::Color;.
If you want to test things that don't fit this, then you should use #[test] tests instead:
#[test]
fn flip_test() {
    assert_eq!(flip(&Color::White), Color::Black);
}
Any function located anywhere in your program with the attribute #[test] will be run as a test. Such tests can access private items, since they're in the same module (or a submodule; it's common to put them inside a module named tests in the same file, with mod tests { ... }).
You can find more information about how to write test functions and organize your tests at The Rust Programming Language: How to Write Tests.
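That convention looks like the following. This is a minimal sketch based on the code in the question, with the transmute replaced by a safe match for brevity:

```rust
#[derive(Clone, PartialEq, Debug)]
enum Color {
    Black = 1,
    Empty = 0,
    White = -1,
}

// Private function: still reachable from the tests module below,
// because #[test] tests are compiled as part of the same crate.
fn flip(player: &Color) -> Color {
    match player {
        Color::Black => Color::White,
        Color::White => Color::Black,
        Color::Empty => Color::Empty,
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn flip_test() {
        assert_eq!(flip(&Color::White), Color::Black);
        assert_eq!(flip(&Color::Empty), Color::Empty);
    }
}
```

Running `cargo test` then compiles and runs `flip_test` even though `flip` and `Color` are private.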
I have also tried making `Color` and `flip` public:
/// use crate::{flip, Color};
This doesn't work because crate refers to the current crate, which for a doc test is the test program, not your main crate.
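For completeness, here is a layout under which the original doc test does compile. This is a sketch assuming the code is moved into src/lib.rs of a library crate named, say, `othello` (the crate name is illustrative), so the doc test can import items through the crate name:

```rust
// src/lib.rs of a library crate named `othello` (illustrative name).

#[derive(Clone, PartialEq, Debug)]
pub enum Color {
    Black = 1,
    Empty = 0,
    White = -1,
}

/// Changes Color to the next player.
///
/// # Examples
///
/// ```
/// use othello::{flip, Color};
/// assert_eq!(flip(&Color::White), Color::Black);
/// ```
pub fn flip(player: &Color) -> Color {
    match player {
        Color::Black => Color::White,
        Color::White => Color::Black,
        Color::Empty => Color::Empty,
    }
}
```

`cargo test` then builds the doc test as its own tiny crate that depends on `othello`, which is why the `use othello::...` line resolves.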

CGAL: Is 2D poly partitioning supported with epeck kernel?

I'd like to use CGAL convex partitioning in an application that is based on the Epeck kernel, but trying to compile it throws the following error:
error:
no matching constructor for initialization of 'CGAL::Partition_vertex<CGAL::Partition_traits_2<CGAL::Epeck> >'
A simple test case for this is to take, for example, the greene_approx_convex_partition_2.cpp example from the distribution and try to change the kernel parameterization to epeck.
Are the 2D convex partitioning routines supported on an Epeck kernel (or can they be)? Any pointers or advice much appreciated!
thanks much,
Here is a workaround:
--- a/include/CGAL/Partition_2/Indirect_edge_compare.h
+++ b/include/CGAL/Partition_2/Indirect_edge_compare.h
@@ -69,7 +69,7 @@ class Indirect_edge_compare
else
{
// construct supporting line for edge
- Line_2 line = _construct_line_2(*edge_vtx_1, *edge_vtx_2);
+ Line_2 line = _construct_line_2((Point_2)*edge_vtx_1, (Point_2)*edge_vtx_2);
return _compare_x_at_y_2(*vertex, line) == SMALLER;
}
}
@@ -98,10 +98,10 @@ class Indirect_edge_compare
// else neither endpoint is shared
// construct supporting line
- Line_2 l_p = _construct_line_2(*p, *after_p);
+ Line_2 l_p = _construct_line_2((Point_2)*p, (Point_2)*after_p);
if (_is_horizontal_2(l_p))
{
- Line_2 l_q = _construct_line_2(*q, *after_q);
+ Line_2 l_q = _construct_line_2((Point_2)*q, (Point_2)*after_q);
if (_is_horizontal_2(l_q))
{
@@ -130,7 +130,7 @@ class Indirect_edge_compare
return q_larger_x;
// else one smaller and one larger
// construct the other line
- Line_2 l_q = _construct_line_2(*q, *after_q);
+ Line_2 l_q = _construct_line_2((Point_2)*q, (Point_2)*after_q);
if (_is_horizontal_2(l_q)) // p is not horizontal
{
return _compare_x_at_y_2((*q), l_p) == LARGER;
I have also noticed that while greene_approx_convex_partition_2 with epeck results in the compiler error mentioned above, the alternative approx_convex_partition_2 compiles just fine with epeck right out of the box.

Object height using Kinect

For example, I am standing in front of my Kinect. The Kinect can identify the joints, and it exposes them as a data structure. Up to this point I am clear.
So, can we define the height as the difference Head joint - ((LeftAnkle + RightAnkle)/2)?
I have tried trigonometric formulas, but there are two problems I am facing. One is identifying the person in the view. The second is identifying the exact positions of the top of the head and the bottom of the foot.
I have tried the point cloud, but got lost in how to generate the point cloud specific to a person. I mean without including the background objects.
Please suggest some ideas about how I can calculate the height of a person using the Kinect?
You can convert the head joint into the global coordinate system; there is no need to do any math. The y coordinate in global coordinates will be the person's height.
All you need to do is check which pixel the head joint is at and convert the pixel + depth information into world coordinate space, in mm.
I don't know what API you are using, but if it is capable of segmenting a human and returning his joints, you are probably using OpenNI/NITE or the Microsoft SDK. Both of them have a function that converts a pixel + depth coordinate into an (x, y, z) position in mm. I don't know exactly which functions they are, but their names would be something like depth_to_mm or disparity_to_mm. You need to check both documentations to find it, or you can do it yourself.
This site has information on how to convert depth to mm: http://nicolas.burrus.name/index.php/Research/KinectCalibration
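The conversion itself is just the pinhole camera model. Here is a sketch in Rust; the intrinsics (fx, fy, cx, cy) are example values for a Kinect-class depth camera of the kind listed on that calibration page, and should be replaced with your own device's calibration:

```rust
/// Back-project a depth pixel (u, v) with depth in mm into camera-space
/// (x, y, z) coordinates in mm, using the standard pinhole model.
fn pixel_to_world_mm(u: f64, v: f64, depth_mm: f64) -> (f64, f64, f64) {
    // Example intrinsics; use your own device's calibration in practice.
    let (fx, fy) = (594.21, 591.04); // focal lengths in pixels
    let (cx, cy) = (339.5, 242.7);   // principal point in pixels
    let x = (u - cx) * depth_mm / fx;
    let y = (v - cy) * depth_mm / fy;
    (x, y, depth_mm)
}

fn main() {
    // A pixel at the principal point lies on the optical axis:
    // x and y come out as 0, z is just the depth.
    let (x, y, z) = pixel_to_world_mm(339.5, 242.7, 2000.0);
    println!("({x:.1}, {y:.1}, {z:.1})");
}
```

With the head and foot (or floor) points converted this way, the height is simply the difference of their y coordinates.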
I have extracted the two points - Head and Left Foot (or Right Foot) - and then found the Euclidean distance between them, which gave the height with about a 4 inch variation. My test results are satisfactory, so we are using this approach as a temporary workaround.
An old question, but I found a very nice explanation and example here.
It also explains that height isn't merely a function of the head and ankle points, but instead a function of the following line segments:
Head - ShoulderCenter
ShoulderCenter - Spine
Spine - HipCenter
HipCenter - KneeLeft or KneeRight
KneeLeft / KneeRight - AnkleLeft / AnkleRight
AnkleLeft / AnkleRight - FootLeft / FootRight
Here is a formula for the Kinect SDK 2.0. The full project is available at https://github.com/jhealy/kinect2/tree/master/020-FaceNSkin_HowTallAmI
using System;
using Microsoft.Kinect;
// Skeleton is now Bones
public enum BodyHeightMeasurementSystem
{
Meters = 0, Imperial = 1
}
public static class BodyHeightExtension
{
// change this to change the way values are returned, by default everything is meters
public static BodyHeightMeasurementSystem MeasurementSystem = BodyHeightMeasurementSystem.Meters;
/// <summary>
/// Get height of a body in meters
/// </summary>
/// <param name="TargetBody">used for extension method purposes - users should not see</param>
/// <returns>
/// positive value: height in meters
/// -1.0 : null body passed in
/// -2.0 : body not tracked, no height available
/// </returns>
public static double Height( this Body TargetBody )
{
if ( TargetBody == null ) return -1.0;
if (TargetBody.IsTracked == false) return -2.0;
const double HEAD_DIVERGENCE = 0.1;
Joint _head = TargetBody.Joints[JointType.Head];
Joint _neck = TargetBody.Joints[JointType.Neck];
// var spine = skeleton.Joints[JointType.Spine]; // ?
Joint _spine = TargetBody.Joints[JointType.SpineShoulder];
// var waist = skeleton.Joints[JointType.HipCenter]; // ?
// jeh: spinemid is ignored
Joint _waist = TargetBody.Joints[JointType.SpineBase];
Joint _hipLeft = TargetBody.Joints[JointType.HipLeft];
Joint _hipRight = TargetBody.Joints[JointType.HipRight];
Joint _kneeLeft = TargetBody.Joints[JointType.KneeLeft];
Joint _kneeRight = TargetBody.Joints[JointType.KneeRight];
Joint _ankleLeft = TargetBody.Joints[JointType.AnkleLeft];
Joint _ankleRight = TargetBody.Joints[JointType.AnkleRight];
Joint _footLeft = TargetBody.Joints[JointType.FootLeft];
Joint _footRight = TargetBody.Joints[JointType.FootRight];
// Find which leg is tracked more accurately.
int legLeftTrackedJoints = NumberOfTrackedJoints(_hipLeft, _kneeLeft, _ankleLeft, _footLeft);
int legRightTrackedJoints = NumberOfTrackedJoints(_hipRight, _kneeRight, _ankleRight, _footRight);
double legLength = legLeftTrackedJoints > legRightTrackedJoints ? Length(_hipLeft, _kneeLeft, _ankleLeft, _footLeft)
: Length(_hipRight, _kneeRight, _ankleRight, _footRight);
// default is meters. adjust if imperial to feet
double _retval = Length(_head, _neck, _spine, _waist) + legLength + HEAD_DIVERGENCE;
if (MeasurementSystem == BodyHeightMeasurementSystem.Imperial) _retval = MetricHelpers.MetersToFeet(_retval);
return _retval;
}
/// <summary>
/// Returns the upper height of the specified skeleton (head to waist). Useful whenever Kinect provides a way to track seated users.
/// </summary>
/// <param name="skeleton">The specified user skeleton.</param>
/// <returns>The upper height of the skeleton in meters.</returns>
public static double UpperHeight( this Body TargetBody )
{
Joint _head = TargetBody.Joints[JointType.Head];
// used to be ShoulderCenter. Think its SpineMid now
Joint _neck = TargetBody.Joints[JointType.SpineMid];
// .Spine is now .SpineShoulder
Joint _spine = TargetBody.Joints[JointType.SpineShoulder];
// HipCenter is now SpineBase
Joint _waist = TargetBody.Joints[JointType.SpineBase];
return Length(_head, _neck, _spine, _waist);
}
/// <summary>
/// Returns the length of the segment defined by the specified joints.
/// </summary>
/// <param name="p1">The first joint (start of the segment).</param>
/// <param name="p2">The second joint (end of the segment).</param>
/// <returns>The length of the segment in meters.</returns>
public static double Length(Joint p1, Joint p2)
{
return Math.Sqrt(
Math.Pow(p1.Position.X - p2.Position.X, 2) +
Math.Pow(p1.Position.Y - p2.Position.Y, 2) +
Math.Pow(p1.Position.Z - p2.Position.Z, 2));
}
/// <summary>
/// Returns the length of the segments defined by the specified joints.
/// </summary>
/// <param name="joints">A collection of two or more joints.</param>
/// <returns>The length of all the segments in meters.</returns>
public static double Length(params Joint[] joints)
{
double length = 0;
for (int index = 0; index < joints.Length - 1; index++)
{
length += Length(joints[index], joints[index + 1]);
}
return length;
}
/// <summary>
/// Given a collection of joints, calculates the number of the joints that are tracked accurately.
/// </summary>
/// <param name="joints">A collection of joints.</param>
/// <returns>The number of the accurately tracked joints.</returns>
public static int NumberOfTrackedJoints(params Joint[] joints)
{
int trackedJoints = 0;
foreach (var joint in joints)
{
// if (joint.TrackingState == JointTrackingState.Tracked)
if ( joint.TrackingState== TrackingState.Tracked )
{
trackedJoints++;
}
}
return trackedJoints;
}
/// <summary>
/// Scales the specified joint according to the specified dimensions.
/// </summary>
/// <param name="joint">The joint to scale.</param>
/// <param name="width">Width.</param>
/// <param name="height">Height.</param>
/// <param name="MaxX">Maximum X.</param>
/// <param name="MaxY">Maximum Y.</param>
/// <returns>The scaled version of the joint.</returns>
public static Joint ScaleTo(Joint joint, int width, int height, float MaxX, float MaxY)
{
// SkeletonPoint position = new SkeletonPoint()
Microsoft.Kinect.CameraSpacePoint position = new Microsoft.Kinect.CameraSpacePoint()
{
X = Scale(width, MaxX, joint.Position.X),
Y = Scale(height, MaxY, -joint.Position.Y),
Z = joint.Position.Z
};
joint.Position = position;
return joint;
}
/// <summary>
/// Scales the specified joint according to the specified dimensions.
/// </summary>
/// <param name="joint">The joint to scale.</param>
/// <param name="width">Width.</param>
/// <param name="height">Height.</param>
/// <returns>The scaled version of the joint.</returns>
public static Joint ScaleTo(Joint joint, int width, int height)
{
return ScaleTo(joint, width, height, 1.0f, 1.0f);
}
/// <summary>
/// Returns the scaled value of the specified position.
/// </summary>
/// <param name="maxPixel">Width or height.</param>
/// <param name="maxBody">Border (X or Y).</param>
/// <param name="position">Original position (X or Y).</param>
/// <returns>The scaled value of the specified position.</returns>
private static float Scale(int maxPixel, float maxBody, float position)
{
float value = ((((maxPixel / maxBody ) / 2) * position) + (maxPixel / 2));
if (value > maxPixel)
{
return maxPixel;
}
if (value < 0)
{
return 0;
}
return value;
}
}

XNA - how to tell if a thumb stick was "twitched" in a certain direction

Is there anything in the API (3 or 4) to tell me if the stick moved in one direction, as in a menu where it's equivalent to hitting a direction on the DPad? There appear to be some Thumbstick* members in the Buttons enum, but I can't find decent documentation on them.
Just want to make sure I'm not missing something obvious before I go and roll my own. Thanks!
There is no XNA method to tell you if a thumbstick was "twitched" this frame.
The easiest method is to store the old thumbstick state. If the state was zero and is now non-zero, it has been twitched.
Addition:
Instead of checking whether the state was zero and is now non-zero, you can use the thumbstick buttons from the enumeration mentioned in your question to determine if the stick has been "twitched". In this case you are treating the stick like a DPad and have to test each direction independently. The following code shows this method:
private void ProcessUserInput()
{
    GamePadState gamePadState = GamePad.GetState(PlayerIndex.One);

    if (m_lastGamePadState.IsButtonUp(Buttons.LeftThumbstickUp) && gamePadState.IsButtonDown(Buttons.LeftThumbstickUp))
    {
        PrevMenuItem();
    }
    if (m_lastGamePadState.IsButtonUp(Buttons.LeftThumbstickDown) && gamePadState.IsButtonDown(Buttons.LeftThumbstickDown))
    {
        NextMenuItem();
    }

    m_lastGamePadState = gamePadState;
}
The thumbsticks on an Xbox 360 controller can be pushed "in" like buttons, which map to GamePadButtons.LeftStick and GamePadButtons.RightStick. These are obviously not what you want.
Here is the code that I use for detecting "presses" in any direction (where padLeftPushActive is stored between frames):
Vector2 padLeftVector = gamePadState.ThumbSticks.Left;
bool lastPadLeftPushActive = padLeftPushActive;

if (padLeftVector.Length() > 0.85f)
    padLeftPushActive = true;
else if (padLeftVector.Length() < 0.75f)
    padLeftPushActive = false;

if (!lastPadLeftPushActive && padLeftPushActive)
{
    DoSomething(Vector2.Normalize(padLeftVector));
}
It should be fairly simple to modify this so that it detects just presses in the particular directions necessary for your menu.
Is the GamePadState.Thumbsticks property what you're looking for?
Here's the solution I came up with, in case it's useful for anyone:
enum Stick {
    Left,
    Right,
}
GamePadState oldState;
GamePadState newState;
/// <summary>
/// Checks if a thumbstick was quickly tapped in a certain direction.
/// This is useful for navigating menus and other situations where
/// we treat a thumbstick as a D-Pad.
/// </summary>
/// <param name="which">Which stick to check: left or right</param>
/// <param name="direction">A vector in the direction to check.
/// The length, which should be between 0.0 and 1.0, determines
/// the threshold.</param>
/// <returns>True if a twitch was detected</returns>
public bool WasStickTwitched(Stick which, Vector2 direction)
{
    if (direction.X == 0 && direction.Y == 0)
        return false;

    Vector2 sold, snew;
    if (which == Stick.Left)
    {
        sold = oldState.ThumbSticks.Left;
        snew = newState.ThumbSticks.Left;
    }
    else
    {
        sold = oldState.ThumbSticks.Right;
        snew = newState.ThumbSticks.Right;
    }

    Vector2 twitch = snew;
    bool x = (direction.X == 0 || twitch.X / direction.X > 1);
    bool y = (direction.Y == 0 || twitch.Y / direction.Y > 1);
    bool tnew = x && y;

    twitch = sold;
    x = (direction.X == 0 || twitch.X / direction.X > 1);
    y = (direction.Y == 0 || twitch.Y / direction.Y > 1);
    bool told = x && y;

    return tnew && !told;
}