iOS Align a UIView parallel to X-Z plane - objective-c

I realize that by default a UIView is parallel to the X-Y plane, the Z-axis coming straight at you.
How do I rotate this UIView so that it is parallel to the X-Z plane? I think I have to use CATransform3D, but I am not sure what angle to give it: M_PI_2 or -M_PI_2?
Does anybody have any idea? Thanks for the help.
EDIT: I know that by doing this the UIView will not be visible to the user, but I want to set up a particular UIView this way and then run a rotation animation on it. So this is what I want.

What you probably want to do is apply a transform to the view's backing layer. The transform property on UIView is an affine transform, which does not allow any 3D manipulation. But through view.layer.transform you get a full 4x4 matrix for your transformation needs, which allows some nice, funky 3D effects.
Add QuartzCore.framework to your target.
Import <QuartzCore/QuartzCore.h> to get access to the CALayer API.
Code away.
For example:
// Rotate a quarter turn (90 degrees) about the Y-axis.
myView.layer.transform = CATransform3DMakeRotation(M_PI_2, 0.0f, 1.0f, 0.0f);
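For the original question (getting the view parallel to the X-Z plane rather than the Y-Z plane), the rotation axis would be X instead of Y; a minimal sketch, reusing the same myView:
// Rotate a quarter turn (90 degrees) about the X-axis so the view
// lies parallel to the X-Z plane. M_PI_2 and -M_PI_2 both get you
// there; they just tip the view forward or backward.
myView.layer.transform = CATransform3DMakeRotation(M_PI_2, 1.0f, 0.0f, 0.0f);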

Related

Animation on side menu bar using program

https://github.com/romaonthego/RESideMenu is the link to RESideMenu.
I am new to iOS and I am working on a new project in which I have made this kind of side menu bar. The one thing I still want is for the window that slides back to tilt to one side.
Actually, I want one side to shrink in size, like in the Flipkart app on iOS. Can you please tell me if there is a way to achieve this in code?
You need to apply a CATransform3D to the transform property of the UIView's layer. An affine transform is one in which all sides of the UIView remain parallel; you will need a non-affine transform here, because the top and bottom of your view will not be parallel during the tilt effect. That means using the transform property of the underlying CALayer, which you can access through the layer property on your view object, to apply a CATransform3D. FYI, whole books have been written about Core Animation, so it is not a light topic, especially for a beginner, but as you can see from the code below it is quite simple to apply a 3D rotation to your view.
// Start from the identity transform
CATransform3D transform = CATransform3DIdentity;
// Modify m34 to add perspective
transform.m34 = -1.0 / 500.0;
// Rotate 45 degrees about the y-axis
transform = CATransform3DRotate(transform, M_PI_4, 0, 1, 0);
// Apply the transform to the layer
self.layerView.layer.transform = transform;
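If you want the tilt to animate as the menu opens (the RESideMenu-style effect the question describes), one way, sketched here with an explicit CABasicAnimation and assuming the same self.layerView, is:
// Build the tilted transform (same values as above).
CATransform3D tilted = CATransform3DIdentity;
tilted.m34 = -1.0 / 500.0;
tilted = CATransform3DRotate(tilted, M_PI_4, 0, 1, 0);

// Animate from the identity to the tilted transform.
CABasicAnimation *tilt = [CABasicAnimation animationWithKeyPath:@"transform"];
tilt.fromValue = [NSValue valueWithCATransform3D:CATransform3DIdentity];
tilt.toValue = [NSValue valueWithCATransform3D:tilted];
tilt.duration = 0.3;

// Set the model value so the layer keeps the tilt once the animation ends.
self.layerView.layer.transform = tilted;
[self.layerView.layer addAnimation:tilt forKey:@"tilt"];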

How to find the real position on the image on iOS?

Here is the setup I have: a layer view that detects the user's touch, and an image view that shows the image. The layer view sits on top of the image view. The image view's content mode is aspect fit, so the image does not lose its ratio. If I touch the layer view at (100, 240), that is a coordinate in the layer view, not in the image. I would like to know how to convert the layer view's coordinate into the image's coordinate. In this example, the image size may be 180*180, so the corresponding coordinate in the image would be (60, 90).
Thanks.
If I'm understanding this question correctly, you want to take a point, which is currently in relation to the layer's coordinate system, and convert it to the image view's coordinate system?
In that case, there are a couple of ways to do this.
Easiest is to use convertPoint:fromView: or convertPoint:toView:
CGPoint imageViewTouchPoint = [layerView convertPoint:touchPoint toView:imageView];
CGPoint imageViewTouchPoint = [imageView convertPoint:touchPoint fromView:layerView];
Either one should work.
EDIT - I realize now that this is only if the UIImageView has the same frame as the UIImage, which you said it might not, due to the UIViewContentModeScaleAspectFit property.
In this case, unless I'm mistaken, the image frame is calculated inside the UIImageView drawRect: method and isn't a property that gets set. This means you'll have to calculate this on your own.
Definitely get the imageViewTouchPoint from one of the methods above (just in case you want to use the same logic on a UIImageView which isn't the full screen size).
You will then need to calculate the scaled image frame. There are a couple of ways to do this. Some people go brute force and manually calculate it based on which side of the image is longer, then determine which side should be scaled. They then calculate the origin by centering the image: subtracting the image's size from the image view's size and dividing by two.
I like to write as little code as possible. If you import AVFoundation you get the function AVMakeRectWithAspectRatioInsideRect, which calculates the scaled rectangle for you in one line of code.
CGRect imageRect = AVMakeRectWithAspectRatioInsideRect(image.size, imageView.frame);
Whichever method you use, you will then simply translate your touched point with the scaled image origin:
CGPoint imageTouchPoint = CGPointMake(imageViewTouchPoint.x - imageRect.origin.x, imageViewTouchPoint.y - imageRect.origin.y);
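Putting the pieces together, and adding the final scale step (the aspect-fit rect on screen is usually not the same size as the image in pixels), a rough sketch might look like this. It assumes touchPoint is in layerView's coordinates, that imageView.image is set, and that the content mode is aspect fit; AVMakeRectWithAspectRatioInsideRect is given imageView.bounds, so the rect comes back in the image view's own coordinate space:
#import <AVFoundation/AVFoundation.h>

// 1. Convert the touch from the layer view into the image view.
CGPoint viewPoint = [layerView convertPoint:touchPoint toView:imageView];

// 2. Rect that the aspect-fit image actually occupies inside the image view.
CGRect imageRect = AVMakeRectWithAspectRatioInsideRect(imageView.image.size,
                                                       imageView.bounds);

// 3. Shift into the displayed image's coordinate space...
CGPoint displayedPoint = CGPointMake(viewPoint.x - imageRect.origin.x,
                                     viewPoint.y - imageRect.origin.y);

// 4. ...and scale up to the image's own pixel coordinates.
CGFloat scale = imageView.image.size.width / imageRect.size.width;
CGPoint imagePoint = CGPointMake(displayedPoint.x * scale,
                                 displayedPoint.y * scale);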
You have to do the math yourself. Calculate the aspect ratio of your image and compare with the aspect ratio of the image view's bounds.
Look at this question: How to Get Image position in ImageView
After searching more, got a hack:
// Requires a UIImage resizing category (e.g. UIImage+Resize) that provides
// resizedImageWithContentMode:bounds:interpolationQuality:.
CGSize imageInViewSize = [photo resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                     bounds:imageView.frame.size
                                       interpolationQuality:kCGInterpolationNone].size;
CGRect overlayRect = CGRectMake((imageView.frame.size.width - imageInViewSize.width) / 2,
                                (imageView.frame.size.height - imageInViewSize.height) / 2,
                                imageInViewSize.width,
                                imageInViewSize.height);
NSLog(@"Frame of image inside UIImageView: Left:%f Top:%f Width:%f Height:%f\n",
      overlayRect.origin.x, overlayRect.origin.y,
      overlayRect.size.width, overlayRect.size.height);

Change NSImage Origin

Is it possible to change the origin of an NSImage? If so, how would I go about doing this? I have coordinates in a regular Cartesian system, some of them with negative values, and I am trying to draw them at the corresponding points in the NSImage, but since the origin is at (0,0) some of them are missing.
EDIT: Say I have some drawing that needs to be done on the image at the point (-10,-10); currently this doesn't show up. Is there a way to fix that?
If it's like in iOS (you may have to adapt the code a little) and if my memory is still good, you have to do this, since origin is read-only:
CGRect myFrame = yourImage.frame;
myFrame.origin.x = newX;
myFrame.origin.y = newY;
yourImage.frame = myFrame;
I think you are confusing an NSImage with its container. An NSImage has no bounds or frame, and thus no origin. It does have a size, which may represent the pixel dimensions of its bitmap representation (if it has one) or otherwise its bounding box (if it is a vector image). Drawing into an image at a pixel location of (-10,-10) doesn't really make sense.
An NSImage is displayed in a container (typically an NSImageView), and the container's bounds.origin will dictate the placement of the image relative to the image view, but you can't modify pixels beyond the edge of the bitmap plane.
In any case, you probably want to be using a subclassed NSView in which you override the drawRect: method for your custom drawing. NSView does have a bounds.origin, but that is not relevant to your in-drawing coordinates; rather, it positions the drawn content as a whole within the view's bounding box. The coordinate system you draw into is referenced to your graphics context, which will (usually) pin the origin (0,0) to the bottom-left corner (OS X) or top-left corner (iOS). If you are trying to represent negative points on a Cartesian plane, you will need to apply a translation transform to map your points into this positive coordinate space, as sketched below.
I'm trying to explain in a few words, badly, something which Apple explains in great detail in their Quartz 2D Programming Guide.
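As a rough sketch of that last point (the class name and the offsets here are made up for illustration), an NSView subclass can translate its coordinate system in drawRect: so that negative Cartesian points land inside the visible area:
#import <Cocoa/Cocoa.h>

@interface PlotView : NSView
@end

@implementation PlotView
- (void)drawRect:(NSRect)dirtyRect
{
    [super drawRect:dirtyRect];

    // Move the origin to the middle of the view so negative
    // Cartesian coordinates land inside the visible area.
    NSAffineTransform *shift = [NSAffineTransform transform];
    [shift translateXBy:NSMidX(self.bounds) yBy:NSMidY(self.bounds)];
    [shift concat];

    // Example: a point at (-10, -10) now draws left of and below center.
    NSRect dot = NSMakeRect(-10.0 - 2.0, -10.0 - 2.0, 4.0, 4.0);
    [[NSColor redColor] set];
    [[NSBezierPath bezierPathWithOvalInRect:dot] fill];
}
@end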

Rotate image to point

I have the following:
CGPoint pos //center of an image
CGPoint target //a point, somewhere in the coordinate system
float rotation //the current rotation of the image to the x-axis, clockwise, so "right" would be 90°
Now I want the image to rotate around its center point (pos) so that it looks directly at the target point.
My idea was: Calculate the angle corresponding to the x-axis, subtract rotation, and then rotate it.
Two things:
1.) I fail at calculating the angle. (Yes, I know it's all in rad...)
2.) What's best for rotating?
CGAffineTransform? But then I'd need an image view.
Or: save the context, set the origin to the center of the image, rotate the context, draw the image, restore the context? More complicated, but no image view needed...
Draw it on a CALayer, and move the layer around any way you like.
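For the angle itself, here is a sketch of both options the question lists (imageView and image are placeholder names; the questioner's own "right = 90°" convention may need an extra offset). atan2 of the delta from pos to target gives the angle from the positive x-axis, and since UIKit's y-axis points down, a positive result is a clockwise rotation on screen.
// Angle from pos to target, measured from the positive x-axis (radians).
CGFloat dx = target.x - pos.x;
CGFloat dy = target.y - pos.y;
CGFloat angleToTarget = atan2f(dy, dx);

// Option 1: rotate an image view around its center.
imageView.transform = CGAffineTransformMakeRotation(angleToTarget);

// Option 2: inside drawRect: (or an image context), rotate the context
// around pos and draw the image centered there.
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSaveGState(ctx);
CGContextTranslateCTM(ctx, pos.x, pos.y);
CGContextRotateCTM(ctx, angleToTarget);
[image drawInRect:CGRectMake(-image.size.width / 2.0,
                             -image.size.height / 2.0,
                             image.size.width,
                             image.size.height)];
CGContextRestoreGState(ctx);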

Complex CATransition

I am trying to apply a CATransition to a UIView.
I want to move the UIView to the right, and at the same time (and always on the same point), rotate it.
It is better explained by the image.
I am able to move it with a CATransition, and also to rotate it with a CABasicAnimation, but I don't know how to do those together.
Thanks.
- (void)scaleAndRotate:(UIImageView *)myView andAngle:(float)angle {
    // Build a scale and a rotation (angle is given in degrees), then
    // animate the concatenated transform.
    CGAffineTransform scaleTrans = CGAffineTransformMakeScale(1.5, 1.5);
    CGAffineTransform rotateTrans = CGAffineTransformMakeRotation(angle * M_PI / 180);
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDuration:0.5];
    myView.transform = CGAffineTransformConcat(scaleTrans, rotateTrans);
    [UIView commitAnimations];
}
In the above method, replace the scale and angle values with whatever you need. It'll work, surely :)
All the best.
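If you would rather stay in Core Animation (closer to the original CATransition/CABasicAnimation attempt), one way to run the move and the rotation at the same time is a CAAnimationGroup on the view's layer. A sketch with made-up values (200 points to the right, a quarter turn), using the same myView:
CGPoint oldPosition = myView.layer.position;
CGPoint newPosition = CGPointMake(oldPosition.x + 200.0, oldPosition.y);

// Slide the layer 200 points to the right...
CABasicAnimation *move = [CABasicAnimation animationWithKeyPath:@"position"];
move.fromValue = [NSValue valueWithCGPoint:oldPosition];
move.toValue = [NSValue valueWithCGPoint:newPosition];

// ...while rotating it a quarter turn about the z-axis.
CABasicAnimation *spin = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
spin.fromValue = @0.0;
spin.toValue = @(M_PI_2);

CAAnimationGroup *group = [CAAnimationGroup animation];
group.animations = @[move, spin];
group.duration = 0.5;

// Update the model values so the layer keeps the final state.
myView.layer.position = newPosition;
myView.layer.transform = CATransform3DMakeRotation(M_PI_2, 0, 0, 1);
[myView.layer addAnimation:group forKey:@"moveAndRotate"];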
You should consider making two copies of the image, rotating one, and masking them both so that they can be placed next to each other in the L shape.
Using this technique, you'll be doing two translations at once (moving the mask and the underlying image) to both images A and T. But, notice that rotation will not be animated. You'll put image T into the rotated state immediately and just reveal it by moving it under the mask (while simultaneously doing the opposite on image A to hide it). So you're not actually combining translation and rotation into one animation, but rather using just translation with a mask on both the original image (A) and a rotated copy (T).
You'll need to mask the left side of one and the right side of the other. The shape of the mask should have an opposite 45-degree angle on each, so you can bring those angled edges together to form the L. As time progresses, you just move the mask on each image until the first image is totally gone and you're left with your end state.
The masking part is the hard part. See this answer on masking a UIImage with CoreGraphics: masking a UIImage
The mask PNG would basically just be a rectangle with one side at a 45 degree angle. You could create that in the image editor of your choice (Photoshop, GIMP, Acorn).
Note: this approach will create a sharp edge at the corner. The other approach would be to warp the pixels around that corner as they move from the vertical downward motion to the horizontal rightward motion. (I think) This would be much more involved.
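For the masking part, here is a very rough sketch of the idea using a CAShapeLayer as the layer mask (view names and the slant math are made up; in practice you would animate the mask layer's position to sweep the angled edge across each image):
// Mask for image A: a rectangle whose right edge is slanted at 45 degrees.
CGFloat w = imageViewA.bounds.size.width;
CGFloat h = imageViewA.bounds.size.height;

UIBezierPath *maskPath = [UIBezierPath bezierPath];
[maskPath moveToPoint:CGPointMake(0, 0)];
[maskPath addLineToPoint:CGPointMake(w, 0)];
[maskPath addLineToPoint:CGPointMake(w - h, h)];   // 45-degree edge
[maskPath addLineToPoint:CGPointMake(0, h)];
[maskPath closePath];

CAShapeLayer *mask = [CAShapeLayer layer];
mask.frame = imageViewA.bounds;
mask.path = maskPath.CGPath;
imageViewA.layer.mask = mask;

// Sliding the mask (animating its position) progressively hides image A;
// the rotated copy (image T) gets the mirror-image mask and the opposite slide.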