Moving SCNNode in ARSCNView using a gesture - objective-c

I have been trying to use a UIPanGestureRecognizer to move an SCNNode attached to an ARSCNView in iOS ARKit. It moves fine in the x dimension, but I cannot figure out how to move it in the y direction. I have tried two different approaches, each without success:
1. How do I find my mouse point in a scene using SceneKit?
2. https://github.com/rajubd49/ARKit-Sample-ObjC
Any insight into why the y dimension does not behave the same as the x would be greatly appreciated. Could this have something to do with motion being constrained to the XZ plane? Here is the code, following the GitHub sample cited in #2 above:
==============
// Move the SCNNode with a pan gesture
- (void)handlePanGesture:(UIPanGestureRecognizer *)pagr
{
    switch (pagr.state)
    {
        case UIGestureRecognizerStateBegan:
        {
            CGPoint tapPoint = [pagr locationInView:sceneView];
            NSLog(@"%s tapPoint %@", __FUNCTION__, NSStringFromCGPoint(tapPoint));
            NSArray *hitResults = [sceneView hitTest:tapPoint types:ARHitTestResultTypeFeaturePoint | ARHitTestResultTypeEstimatedHorizontalPlane];
            lastHitTestResult = [hitResults firstObject];
        }
        break;
        case UIGestureRecognizerStateChanged:
        {
            CGPoint tapPoint = [pagr locationInView:sceneView];
            if (windScreenView.buttonState == GypsyTargetAdded)
            {
                NSArray *hitResults = [sceneView hitTest:tapPoint types:ARHitTestResultTypeFeaturePoint | ARHitTestResultTypeEstimatedHorizontalPlane];
                ARHitTestResult *result = [hitResults lastObject];
                [SCNTransaction begin];
                // Translation of the hit point since the last event, in world coordinates
                SCNMatrix4 lastMatrix = SCNMatrix4FromMat4(lastHitTestResult.worldTransform);
                SCNVector3 lastVector = SCNVector3Make(lastMatrix.m41, lastMatrix.m42, lastMatrix.m43);
                SCNMatrix4 newMatrix = SCNMatrix4FromMat4(result.worldTransform);
                SCNVector3 newVector = SCNVector3Make(newMatrix.m41, newMatrix.m42, newMatrix.m43);
                CGFloat dx = newVector.x - lastVector.x;
                CGFloat dy = newVector.y - lastVector.y;
                SCNVector3 adjVector = SCNVector3Make(gypsyTargetNode.position.x + dx, gypsyTargetNode.position.y + dy, gypsyTargetNode.position.z);
                gypsyTargetNode.position = adjVector;
                [SCNTransaction commit];
                NSLog(@"%s lastVector: x = %f, y = %f, z = %f", __FUNCTION__, lastVector.x, lastVector.y, lastVector.z);
                NSLog(@"%s newVector: x = %f, y = %f, z = %f", __FUNCTION__, newVector.x, newVector.y, newVector.z);
                NSLog(@"%s dx = %f, dy = %f", __FUNCTION__, dx, dy);
                NSLog(@"%s gypsyTargetNode.position: x = %f, y = %f, z = %f", __FUNCTION__, gypsyTargetNode.position.x, gypsyTargetNode.position.y, gypsyTargetNode.position.z);
                NSLog(@"%s hitResults.count: %lu", __FUNCTION__, (unsigned long)hitResults.count);
                lastHitTestResult = result;
            }
        }
        break;
        case UIGestureRecognizerStateEnded:
        {
            lastHitTestResult = nil;
        }
        break;
        default:
            break;
    }
}
and here are snippets of the console output. Note that after the first event, the calculated "dy" is always 0.000000:
[CalibrationController handlePanGesture:] tapPoint {207.33332824707031, 507}
[CalibrationController handlePanGesture:] lastVector: x = -0.016018, y = -0.083625, z = 0.006696
[CalibrationController handlePanGesture:] newVector: x = -0.015562, y = -0.083862, z = 0.006658
[CalibrationController handlePanGesture:] dx = 0.000456, dy = -0.000237
[CalibrationController handlePanGesture:] gypsyTargetNode.position: x = -0.018142, y = -0.087894, z = 0.008041
[CalibrationController handlePanGesture:] hitResults.count: 2
[CalibrationController handlePanGesture:] lastVector: x = -0.015562, y = -0.083862, z = 0.006658
[CalibrationController handlePanGesture:] newVector: x = -0.015562, y = -0.083862, z = 0.006658
[CalibrationController handlePanGesture:] dx = 0.000000, dy = 0.000000
[CalibrationController handlePanGesture:] gypsyTargetNode.position: x = -0.018142, y = -0.087894, z = 0.008041
[CalibrationController handlePanGesture:] hitResults.count: 2
[CalibrationController handlePanGesture:] lastVector: x = -0.015562, y = -0.083862, z = 0.006658
[CalibrationController handlePanGesture:] newVector: x = -0.015150, y = -0.083862, z = 0.006565
[CalibrationController handlePanGesture:] dx = 0.000412, dy = 0.000000
[CalibrationController handlePanGesture:] gypsyTargetNode.position: x = -0.017730, y = -0.087894, z = 0.008041
[CalibrationController handlePanGesture:] hitResults.count: 2
[CalibrationController handlePanGesture:] lastVector: x = -0.015150, y = -0.083862, z = 0.006565
[CalibrationController handlePanGesture:] newVector: x = -0.014686, y = -0.083862, z = 0.006361
[CalibrationController handlePanGesture:] dx = 0.000463, dy = 0.000000

For the panning gesture, implement the following approach:

let myScene = SCNScene(named: "scene.scn")!
let modelNode: SCNNode = myScene.rootNode
var selectedNode: SCNNode? = nil

override func viewDidLoad() {
    super.viewDidLoad()
    let moveGesture = UIPanGestureRecognizer(target: self,
                                             action: #selector(moveModel))
    self.sceneView.addGestureRecognizer(moveGesture)
}

Add a helper that returns the first hit node other than the model's root node:

private func myMethod(at position: CGPoint) -> SCNNode? {
    let nnn = self.sceneView.hitTest(position, options: nil).first(where: {
        $0.node !== modelNode
    })?.node
    return nnn
}

Then fill in the @objc gesture handler:

@objc func moveModel(_ gesture: UIPanGestureRecognizer) {
    let location = gesture.location(in: self.sceneView)
    switch gesture.state {
    case .began:
        selectedNode = myMethod(at: location)
    case .changed:
        guard let result = self.sceneView.hitTest(location,
                                                  types: .existingPlane).first
        else { return }
        let transform = result.worldTransform
        let newPosition = SIMD3<Float>(transform.columns.3.x,
                                       transform.columns.3.y,
                                       transform.columns.3.z)
        selectedNode?.simdPosition = newPosition
    default:
        selectedNode = nil
    }
}
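Note that both the hit test in the question (feature points plus the estimated horizontal plane) and the .existingPlane hit test above return points lying on detected horizontal planes, which is why y barely changes while panning (the dy = 0.000000 in the console output). If vertical pans should move the node in y as well, one option is to drag the node in the camera's image plane with projectPoint/unprojectPoint. A minimal sketch for the .changed case, to be placed in the same view controller and assuming the same sceneView and selectedNode as above (the handler name dragInScreenPlane is ours):

@objc func dragInScreenPlane(_ gesture: UIPanGestureRecognizer) {
    guard let node = selectedNode else { return }
    let location = gesture.location(in: sceneView)
    // Project the node's position to find its current depth in screen space.
    let screenDepth = sceneView.projectPoint(node.position).z
    // Unproject the touch point at that same depth: the node follows the
    // finger in the camera's image plane, so y changes along with x.
    node.position = sceneView.unprojectPoint(
        SCNVector3(Float(location.x), Float(location.y), screenDepth))
}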

Related

Count pixels in an irregular shape

I have an existing app, written in Objective-C, that works and is on the iTunes App Store. I am updating it to Swift, and I am having trouble with the pixel classification and counting portion of the app. The Objective-C code is below; I would appreciate any suggestions as to how to proceed with the Swift 3 version.
void classifyPixel(unsigned char r, unsigned char g, unsigned char b,
                   NSUInteger *coverRed, NSUInteger *coverGreen, NSUInteger *coverYellow, NSUInteger *coverBlue, NSUInteger *coverWhite)
{
    if ((r == 255) && (g == 0) && (b == 0))
        *coverRed = *coverRed + 1;
    else if ((r == 0) && (g == 255) && (b == 0))
        *coverGreen = *coverGreen + 1;
    else if ((r == 255) && (g == 255) && (b == 0))
        *coverYellow = *coverYellow + 1;
    else if ((r == 0) && (g == 0) && (b == 255))
        *coverBlue = *coverBlue + 1;
    else if ((r == 255) && (g == 255) && (b == 255))
        *coverWhite = *coverWhite + 1;
}
- (IBAction)computeArea:(id)sender {
    // Taken from Hugh's original code, with some slight cleanup
    UIImage *image = self.imageView.image;
    NSUInteger coverRed = 0;
    NSUInteger coverGreen = 0;
    NSUInteger coverYellow = 0;
    NSUInteger coverBlue = 0;
    NSUInteger coverWhite = 0;
    NSUInteger coverTotal = 0;
    size_t image_width = image.size.width;
    size_t image_height = image.size.height;
    size_t bitsPerComponent = 8;
    size_t bytesPerPixel = 4;
    size_t bytesPerRow = bytesPerPixel * image_width;
    unsigned char *raster_buffer = (unsigned char *)calloc(image_height, bytesPerRow);
    if (raster_buffer != NULL)
    {
        CGContextRef context = CGBitmapContextCreate(
            (void *)raster_buffer,
            image_width,
            image_height,
            bitsPerComponent,
            bytesPerRow,
            CGImageGetColorSpace(image.CGImage),
            (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
        if (context != NULL)
        {
            // Render the image into the buffer, then walk it pixel by pixel
            CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, image_width, image_height), image.CGImage);
            unsigned char *row_start_p = raster_buffer;
            for (size_t y = 0; y < image_height; y++)
            {
                unsigned char *nth_pixel_p = row_start_p;
                for (size_t x = 0; x < image_width; x++)
                {
                    classifyPixel(nth_pixel_p[bytesPerPixel*x],
                                  nth_pixel_p[bytesPerPixel*x+1],
                                  nth_pixel_p[bytesPerPixel*x+2],
                                  &coverRed, &coverGreen, &coverYellow, &coverBlue, &coverWhite);
                    coverTotal++;
                }
                row_start_p += bytesPerRow;
            }
            CGContextRelease(context);
        }
    }
    free(raster_buffer);
    float uu = coverGreen;
    float vv = coverYellow;
    float ww = coverRed;
    float xx = coverBlue;
    float yy = coverWhite;
    float zz = uu + vv + ww + xx + yy;
    float aa = (uu/zz)*100;
    float bb = (vv/zz)*100;
    float cc = (ww/zz)*100;
    float dd = (xx/zz)*100;
    float ee = (yy/zz)*100;
    totalcount = [NSNumber numberWithFloat:zz];
    greencountresult = [NSString stringWithFormat:@"%.1f%%", aa];
    yellowcountresult = [NSString stringWithFormat:@"%.1f%%", bb];
    redcountresult = [NSString stringWithFormat:@"%.1f%%", cc];
    bluecountresult = [NSString stringWithFormat:@"%.1f%%", dd];
    whitecountresult = [NSString stringWithFormat:@"%.1f%%", ee];
    NSLog(@"Area compute! Red %lu Green %lu Yellow %lu Blue %lu White %lu\n", (unsigned long)coverRed, (unsigned long)coverGreen, (unsigned long)coverYellow, (unsigned long)coverBlue, (unsigned long)coverWhite);
}
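Since the question asks how to proceed in Swift, here is a minimal Swift sketch of the same classify-and-count pass. It is an illustration rather than a drop-in replacement: the function name countCoverColors is ours, and it lets CGContext own the pixel buffer instead of calling calloc:

import UIKit

// Counts pixels that exactly match the five cover colors, as classifyPixel does.
func countCoverColors(in image: UIImage) -> [String: Int] {
    guard let cgImage = image.cgImage else { return [:] }
    let width = cgImage.width
    let height = cgImage.height
    guard let context = CGContext(data: nil,
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 4 * width,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
          let data = context.data else { return [:] }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    let stride = context.bytesPerRow
    let pixels = data.bindMemory(to: UInt8.self, capacity: stride * height)
    var counts = ["red": 0, "green": 0, "yellow": 0, "blue": 0, "white": 0]
    for y in 0..<height {
        for x in 0..<width {
            let i = y * stride + 4 * x
            // Same exact-match classification as the Objective-C classifyPixel
            switch (pixels[i], pixels[i + 1], pixels[i + 2]) {
            case (255, 0, 0):     counts["red"]!    += 1
            case (0, 255, 0):     counts["green"]!  += 1
            case (255, 255, 0):   counts["yellow"]! += 1
            case (0, 0, 255):     counts["blue"]!   += 1
            case (255, 255, 255): counts["white"]!  += 1
            default:              break
            }
        }
    }
    return counts
}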

MKPolygon area calculation

I'm trying to make an area calculation category for MKPolygon.
I found some JS code https://github.com/mapbox/geojson-area/blob/master/index.js#L1 with a link to the algorithm: http://trs-new.jpl.nasa.gov/dspace/handle/2014/40409.
Here is my code, which gives a wrong result (thousands of times larger than the actual area):
#define kEarthRadius 6378137

@implementation MKPolygon (AreaCalculation)

- (double)area {
    double area = 0;
    NSArray *coords = [self coordinates];
    if (coords.count > 2) {
        CLLocationCoordinate2D p1, p2;
        for (int i = 0; i < coords.count - 1; i++) {
            p1 = [coords[i] MKCoordinateValue];
            p2 = [coords[i + 1] MKCoordinateValue];
            area += degreesToRadians(p2.longitude - p1.longitude) * (2 + sinf(degreesToRadians(p1.latitude)) + sinf(degreesToRadians(p2.latitude)));
        }
        area = area * kEarthRadius * kEarthRadius / 2;
    }
    return area;
}

- (NSArray *)coordinates {
    NSMutableArray *points = [NSMutableArray arrayWithCapacity:self.pointCount];
    for (int i = 0; i < self.pointCount; i++) {
        MKMapPoint *point = &self.points[i];
        [points addObject:[NSValue valueWithMKCoordinate:MKCoordinateForMapPoint(*point)]];
    }
    return points.copy;
}

double degreesToRadians(double degrees) {
    return degrees * M_PI / 180;
}

@end
What did I miss?
The whole algorithm, implemented in Swift 3.0:

import MapKit

let kEarthRadius = 6378137.0

// CLLocationCoordinate2D uses degrees, but we need radians
func radians(degrees: Double) -> Double {
    return degrees * M_PI / 180
}

func regionArea(locations: [CLLocationCoordinate2D]) -> Double {
    guard locations.count > 2 else { return 0 }
    var area = 0.0
    for i in 0..<locations.count {
        let p1 = locations[i > 0 ? i - 1 : locations.count - 1]
        let p2 = locations[i]
        area += radians(degrees: p2.longitude - p1.longitude) * (2 + sin(radians(degrees: p1.latitude)) + sin(radians(degrees: p2.latitude)))
    }
    area = -(area * kEarthRadius * kEarthRadius / 2)
    return max(area, -area) // abs(area), so it doesn't matter whether the polygon is wound clockwise or counterclockwise
}
The final step for i = N-1 and i+1 = 0 (wrap around) is missing in your loop.
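As a quick sanity check of regionArea above (our own example, not from the original answer), a 1°-by-1° quadrilateral at the equator should come out near 12,400 km²:

let square = [
    CLLocationCoordinate2D(latitude: 0, longitude: 0),
    CLLocationCoordinate2D(latitude: 0, longitude: 1),
    CLLocationCoordinate2D(latitude: 1, longitude: 1),
    CLLocationCoordinate2D(latitude: 1, longitude: 0)
]
print(regionArea(locations: square)) // ≈ 1.24e10 m², i.e. about 12,400 km²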
This may help someone...
Pass the shape's edge points into the method below and it returns the area of the polygon (computed by rasterizing the path):
static double areaOfCurveWithPoints(const NSArray *shapeEdgePoints) {
    CGPoint initialPoint = [shapeEdgePoints.firstObject CGPointValue];
    CGMutablePathRef cgPath = CGPathCreateMutable();
    CGPathMoveToPoint(cgPath, &CGAffineTransformIdentity, initialPoint.x, initialPoint.y);
    for (int i = 1; i < shapeEdgePoints.count; i++) {
        CGPoint point = [[shapeEdgePoints objectAtIndex:i] CGPointValue];
        CGPathAddLineToPoint(cgPath, &CGAffineTransformIdentity, point.x, point.y);
    }
    CGPathCloseSubpath(cgPath);

    CGRect frame = integralFrameForPath(cgPath);
    size_t bytesPerRow = bytesPerRowForWidth(frame.size.width);
    CGContextRef gc = createBitmapContextWithFrame(frame, bytesPerRow);
    CGContextSetFillColorWithColor(gc, [UIColor whiteColor].CGColor);
    CGContextAddPath(gc, cgPath);
    CGContextFillPath(gc);
    double area = areaFilledInBitmapContext(gc);
    CGPathRelease(cgPath);
    CGContextRelease(gc);
    return area;
}

static CGRect integralFrameForPath(CGPathRef path) {
    CGRect frame = CGPathGetBoundingBox(path);
    return CGRectIntegral(frame);
}

static size_t bytesPerRowForWidth(CGFloat width) {
    static const size_t kFactor = 64;
    // Round up to a multiple of kFactor, which must be a power of 2.
    return ((size_t)width + (kFactor - 1)) & ~(kFactor - 1);
}

static CGContextRef createBitmapContextWithFrame(CGRect frame, size_t bytesPerRow) {
    CGColorSpaceRef grayscale = CGColorSpaceCreateDeviceGray();
    CGContextRef gc = CGBitmapContextCreate(NULL, frame.size.width, frame.size.height, 8, bytesPerRow, grayscale, kCGImageAlphaNone);
    CGColorSpaceRelease(grayscale);
    CGContextTranslateCTM(gc, -frame.origin.x, -frame.origin.y);
    return gc;
}

static double areaFilledInBitmapContext(CGContextRef gc) {
    size_t width = CGBitmapContextGetWidth(gc);
    size_t height = CGBitmapContextGetHeight(gc);
    size_t stride = CGBitmapContextGetBytesPerRow(gc);
    // Get a pointer to the raw grayscale data
    unsigned char *bitmapData = (unsigned char *)CGBitmapContextGetData(gc);
    uint64_t coverage = 0;
    for (size_t y = 0; y < height; ++y) {
        for (size_t x = 0; x < width; ++x) {
            coverage += bitmapData[y * stride + x];
        }
    }
    // NSLog(@"coverage = %llu UINT8_MAX = %d", coverage, UINT8_MAX);
    return (double)coverage / UINT8_MAX;
}
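The bitmap approach above approximates area by rasterizing the path, which is fine for pixel-space shapes. If the points are already in a flat coordinate space, the shoelace formula gives an exact answer without allocating a bitmap. A small sketch (polygonArea is our own name, not from the answer above):

import CoreGraphics

// Shoelace formula: exact area of a simple (non-self-intersecting) polygon.
func polygonArea(_ points: [CGPoint]) -> CGFloat {
    guard points.count > 2 else { return 0 }
    var sum: CGFloat = 0
    for i in 0..<points.count {
        let p1 = points[i]
        let p2 = points[(i + 1) % points.count] // wraps around to close the path
        sum += p1.x * p2.y - p2.x * p1.y
    }
    return abs(sum) / 2
}

polygonArea([CGPoint(x: 0, y: 0), CGPoint(x: 4, y: 0), CGPoint(x: 4, y: 3)]) // 6.0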

My grid doesn't draw properly in Objective-C

I have a problem with the count in my grid module. I am doing the following:
- (void)makeGridWithData:(NSDictionary *)data
{
    NSLog(@"count is: %lu", (unsigned long)[data count]);
    int xStart = 0;
    int yStart = 0;
    int xCurrent = xStart;
    int yCurrent = yStart;
    int xStepSize = 165;
    int yStepSize = 251;
    int xCnt = 3;
    int yCnt = [data count] % 3;
    int cellCounter = 0;
    UIView *gridContainerView = [[UIView alloc] init];
    [keeperView addSubview:gridContainerView];
    for (int y = 0; y < yCnt; y++) {
        for (int x = 0; x < xCnt; x++) {
            printf("xCurrent %d yCurrent %d \n", xCurrent, yCurrent);
            NSString *url1 = @"player_imgUrl";
            NSString *url2 = [NSString stringWithFormat:@"%i", x];
            NSString *url3 = [url1 stringByAppendingString:url2];
            NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:[data objectForKey:url3]]];
            UIImage *myImage = [[UIImage alloc] initWithData:imageData];
            UIImageView *myView = [[UIImageView alloc] initWithImage:myImage];
            CGRect rect = myView.frame;
            rect.origin.x = xCurrent;
            rect.origin.y = yCurrent;
            myView.frame = rect;
            myView.tag = cellCounter;
            [gridContainerView addSubview:myView];
            // just label stuff
            UILabel *myLabel = [[UILabel alloc] init];
            [gridContainerView addSubview:myLabel];
            //--------------------------------
            xCurrent += xStepSize;
            cellCounter++;
        }
        xCurrent = xStart;
        yCurrent += yStepSize;
    }
    CGRect repositionRect = gridContainerView.frame;
    repositionRect.origin.y = 100;
    gridContainerView.frame = repositionRect;
}
My NSLog says there are 16 values in my data object, but when I run it, it only shows 3 image views. Does anybody know what I am doing wrong?
Please help,
Kind regards.
After reading your code: the number of image views drawn is (xCnt * yCnt).
Your problem is here:
int yCnt = [data count] % 3;
yCnt is 1 when your data count is 16; that is why you get only 3 image views.
To overcome this, do the following:
yCnt = (data.count / xCnt) + 1;
for (int y = 0; y < yCnt; y++)
{
    for (int x = 0; x < xCnt; x++)
    {
        if ((y == (yCnt - 1)) && (x >= (data.count % xCnt)))
        {
            break;
        }
        else {
            // Your grid code here
        }
    }
}
Hope this helps.
int yCnt = [data count] % 3;
gives 1 when [data count] == 16;
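A tidier way to size the grid is ceiling division, which also avoids the extra empty row when the count divides evenly. A small sketch of our own helper (in Swift, for illustration):

// Rows needed to lay out `count` items in `columns` columns.
func rowsNeeded(count: Int, columns: Int) -> Int {
    return (count + columns - 1) / columns
}

rowsNeeded(count: 16, columns: 3) // 6 (five full rows plus one item)
rowsNeeded(count: 15, columns: 3) // 5 (no empty sixth row)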

How do I convert a UIColor to a hexadecimal string?

I have a project where I need to store the RGBA values of a UIColor in a database as an 8-character hexadecimal string. For example, [UIColor blueColor] would be @"0000FFFF".
I know I can get the component values like so:
CGFloat r, g, b, a;
[color getRed:&r green:&g blue:&b alpha:&a];
but I don't know how to go from those values to the hex string. I've seen a lot of posts on how to go the other way, but nothing functional for this conversion.
Convert your floats to int values first, then format with stringWithFormat:
CGFloat rFloat, gFloat, bFloat, aFloat;
[color getRed:&rFloat green:&gFloat blue:&bFloat alpha:&aFloat];

int r = (int)(255.0 * rFloat);
int g = (int)(255.0 * gFloat);
int b = (int)(255.0 * bFloat);
int a = (int)(255.0 * aFloat);
NSString *hexString = [NSString stringWithFormat:@"%02x%02x%02x%02x", r, g, b, a];
Here it goes. Returns an NSString (e.g. ffa567) with the hexadecimal value of the color:
// Note: assumes an RGB color; grayscale CGColors have only two components.
- (NSString *)hexStringFromColor:(UIColor *)color
{
    const CGFloat *components = CGColorGetComponents(color.CGColor);
    CGFloat r = components[0];
    CGFloat g = components[1];
    CGFloat b = components[2];
    return [NSString stringWithFormat:@"%02lX%02lX%02lX",
            lroundf(r * 255),
            lroundf(g * 255),
            lroundf(b * 255)];
}
Swift 4 answer, as a UIColor extension:
extension UIColor {
    var hexString: String {
        let colorRef = cgColor.components
        let r = colorRef?[0] ?? 0
        let g = colorRef?[1] ?? 0
        let b = ((colorRef?.count ?? 0) > 2 ? colorRef?[2] : g) ?? 0
        let a = cgColor.alpha
        var color = String(
            format: "#%02lX%02lX%02lX",
            lroundf(Float(r * 255)),
            lroundf(Float(g * 255)),
            lroundf(Float(b * 255))
        )
        if a < 1 {
            color += String(format: "%02lX", lroundf(Float(a * 255)))
        }
        return color
    }
}
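Usage (our own examples; the values assume an RGB-backed color, since the grayscale fallback above reuses components):

UIColor.red.hexString                                    // "#FF0000"
UIColor(red: 0, green: 0, blue: 1, alpha: 0.5).hexString // "#0000FF80"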

Digits after the decimal point

How do I get the digits after the decimal point from a float number in Objective-C?
OK, this is C-style, but I imagine the process would be the same:
float decimals = number - (int)number;
while (decimals > 0.0f)
{
    reportNextNumber((int)(decimals * 10)); // emit the next digit
    number *= 10;
    decimals = number - (int)number;
}
This code works:
CGFloat x = 2.43;
// CGFloat x = 3.145;
// CGFloat x = 2.0003;
// CGFloat x = 1.0;
// CGFloat x = 3.1415926535;
NSLog(@"%f -> %@", x, [self numberOfFractionDigits:x]);

- (NSString *)numberOfFractionDigits:(CGFloat)number {
    CGFloat fractionalPart = number - (NSInteger)number;
    NSMutableString *r = [NSMutableString string];
    while (fractionalPart) {
        [r appendFormat:@"%@", @((NSUInteger)(fractionalPart * 10 + .5))];
        number *= 10;
        fractionalPart = number - (NSInteger)number;
    }
    return r;
}
output:
2.430000 -> 43
3.145000 -> 145
2.000300 -> 0003
1.000000 ->
3.141593 -> 1415926535 // for x = 3.1415926535;
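Binary floats can't represent most decimal fractions exactly (which is why the 3.1415926535 case above depends on rounding behavior), so converting through Decimal is more predictable. A minimal Swift sketch; fractionDigits is our own name:

import Foundation

// Extracts the digits after the decimal point via Decimal's base-10 printing.
func fractionDigits(of value: Decimal) -> String {
    let text = "\(value)" // Decimal prints exactly in base 10
    guard let dot = text.firstIndex(of: ".") else { return "" }
    return String(text[text.index(after: dot)...])
}

fractionDigits(of: Decimal(string: "2.43")!)   // "43"
fractionDigits(of: Decimal(string: "2.0003")!) // "0003"
fractionDigits(of: Decimal(string: "1.0")!)    // ""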