CALayer renderInContext - core-animation

I apply the following perspective transform:
CATransform3D rotationAndPerspectiveTransform = CATransform3DIdentity;
rotationAndPerspectiveTransform.m34 = 1.0 / -500;
It works on screen, but when I use renderInContext: to get a CGImage from the context, the resulting image does not include the perspective effect.
How can I get an image with this effect from the CALayer?

My original reply follows below. It was valid at the time I posted it, but Apple no longer allows the use of UIGetScreenImage. To the best of my knowledge, since the launch of iOS 4 there is no alternative for rendering layers with 3D transforms, and your app will be rejected if you use UIGetScreenImage.
From the iPhone developer docs on renderInContext :
Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or mask values.
So renderInContext: is not the method to use for rendering a layer that has a 3D transform applied.
The best you can do is call UIGetScreenImage, which essentially gives you a screenshot; you can then extract the image from that screenshot.

Related

Animating a side menu bar programmatically

https://github.com/romaonthego/RESideMenu This is the link to RESideMenu.
I am new to iOS and working on a new project in which I have made this kind of side menu bar. The one thing I still want is for the window that slides back to tilt to one side.
In fact, I want one side to shrink, like in the Flipkart app on iOS. Can you please tell me if there is a way to achieve this in code?
You need to apply a CATransform3D to the transform property of the UIView's underlying CALayer, which you can access through the view's layer property. An affine transform is one in which all sides of the UIView remain parallel; here you need a non-affine transform, because the top and bottom of your view will not be parallel during the flip effect. FYI, entire books have been written about Core Animation, so it is not a light topic, especially for a beginner, but as you can see from the code below it is quite simple to apply a 3D rotation to your view.
//Transform
CATransform3D transform = CATransform3DIdentity;
//Modify the perspective transform
transform.m34 = - 1.0 / 500.0;
//Rotate
transform = CATransform3DRotate(transform, M_PI_4, 0, 1, 0);
//Apply transform to the layer
self.layerView.layer.transform = transform;

CIContext drawImage:inRect:fromRect: scaling

I am using the CIContext method - (void)drawImage:(CIImage *)im inRect:(CGRect)dest fromRect:(CGRect)src to draw my image to the screen, but I need to implement a zoom-in/zoom-out feature. How can I achieve it? I think zoom-in can be achieved by enlarging the dest rect, because the Apple docs say:
The image is scaled to fill the destination rectangle.
But what about zoom-out? If the dest rectangle is scaled down, the image is drawn at its actual size and only the part that fits in the dest rectangle is visible.
What could you suggest?
You may try using this for image resizing (zooming). Hope this helps you.
Take a look at this little toy app I made
It's to demonstrate the NSImage version of your CIContext method:
- (void)drawInRect:(NSRect)dstRect
fromRect:(NSRect)srcRect
operation:(NSCompositingOperation)op
fraction:(CGFloat)delta
I did this to find out exactly how the rects relate to each other. It's interactive: you can play with the sliders and move/zoom the images. It's not a solution, but it might help you work things out.
You can use CIFilter to resize your CIImage before drawing. Quartz Composer comes with a good example of using this filter. Just look up the description of the Image Resize filter in QC.
EDIT:
Another good filter for scaling is CILanczosScaleTransform. There is a snippet demonstrating basic usage.
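Since the snippet itself is not reproduced here, basic usage looks roughly like this (a hedged sketch: the filter name and input keys are the documented Core Image ones, but the surrounding variable names are mine):

```objc
// Scale a CIImage down to 50% with Lanczos resampling before drawing it.
CIFilter *scaleFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
[scaleFilter setValue:sourceImage forKey:kCIInputImageKey]; // your CIImage
[scaleFilter setValue:@0.5 forKey:kCIInputScaleKey];        // zoom-out factor
[scaleFilter setValue:@1.0 forKey:kCIInputAspectRatioKey];  // keep aspect
CIImage *scaledImage = scaleFilter.outputImage;
// Then pass scaledImage to drawImage:inRect:fromRect: as before.
```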

CALayer and view disappeared

I have a large image managed with CATiledLayer (like the Large Image Downsizing iOS sample code).
I had a drawing view (a UIView subclass overriding the drawing methods) on top of it, but when I zoom in a lot I get the following message and my view disappears:
-[<CALayer: 0xb253aa0> display]: Ignoring bogus layer size (25504.578125, 15940.361328)
Is there a way to avoid this ?
Sounds like the levelsOfDetail and levelsOfDetailBias values you are setting allow more zoom than the tiled layer can handle, given the maximum layer size allowed. Try lowering them to limit how far the user can zoom.
Here is a great article explaining some of the undocumented behavior of CATiledLayer.
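As a rough sketch of that tuning (levelsOfDetail and levelsOfDetailBias are the documented CATiledLayer properties; the specific values are placeholders to experiment with):

```objc
CATiledLayer *tiledLayer = (CATiledLayer *)self.layer; // your tiled layer
// Total number of cached resolution levels for the layer.
tiledLayer.levelsOfDetail = 4;
// Number of those levels that are magnified (zoomed in past 1x).
// Lowering this limits how far the user can zoom before the layer
// would have to render at a bogus, oversized backing size.
tiledLayer.levelsOfDetailBias = 2;
```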

How to get the custom projection matrix of UIImagePickerController's camera

I use UIImagePickerController as a camera. Now I want to integrate it with the Ogre 3D renderer to make an AR application. Does anyone know how to get the custom projection matrix of UIImagePickerController's camera?
I convert cameraController.cameraViewTransform to a matrix, and for the view matrix I just convert the 2D affine transform to a 3D transform,
but it doesn't work.
If worse comes to worst, you can generate the projection transform from what you know about the camera, which is primarily just the FOV angle. Things like the orientation and location of the camera in an AR application are simply offsets from your defaults, and that data can be retrieved from the gyroscope and accelerometer.

Resizing CATiledLayer's Using Scale Transforms vs. Bounds Manipulation

I've got my layer-hosted workspace working so that using CATiledLayers for hundreds of images works nicely when the workspace is zoomed out substantially. All the images use lower-resolution representations, and my application is much more responsive when panning and zooming large numbers of images.
However, within my application I also provide the user the ability to resize layers with a resize handle. Before I converted image layers to use CATiledLayers I was doing layer resizes by manipulating the bounds of the image layer according to the resize delta (mouse drag), and it worked well. But now with CATiledLayers in place, this is causing CATiledLayers to get confused when I mix resizing of layers through bounds manipulation and zooming/unzooming the workspace through scale transforms.
Specifically, if I resize a CATiledLayer to half its width and height (a quarter of the area), the image inside it suddenly scales down by a further half within the resized frame, leaving three quarters of the frame empty. This seems to happen exactly when the internal CATiledLayer logic kicks in to provide a lower-resolution representation. It works fine if I never touch the resize handle and only zoom/unzoom the workspace.
Is there a way to make zooming/resizing play nice together with CATiledLayers, or am I going to have to convert my layer resize logic to use scale transforms instead of bounds manipulations?
I ended up solving this by converting my layer resize logic to use scale transforms: I override the setBounds: method of my custom image layer class to scale its contained CATiledLayer and reposition it accordingly. It is also important to set the CATiledLayer's autoresizingMask to kCALayerNotSizable, since resizes are handled manually in setBounds:.
Note: be sure to call the superclass's implementation of setBounds:.
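A hedged sketch of that override (the naturalSize and tiledLayer properties are my own assumptions about the custom image layer class; only the overall technique comes from the answer above):

```objc
- (void)setBounds:(CGRect)bounds {
    [super setBounds:bounds]; // important: keep the superclass behavior

    // Scale the contained CATiledLayer instead of changing its bounds,
    // so the tiling machinery never sees a bounds change.
    CGFloat sx = bounds.size.width  / self.naturalSize.width;
    CGFloat sy = bounds.size.height / self.naturalSize.height;
    self.tiledLayer.transform = CATransform3DMakeScale(sx, sy, 1.0);

    // Recenter the tiled layer inside the new bounds.
    self.tiledLayer.position = CGPointMake(CGRectGetMidX(bounds),
                                           CGRectGetMidY(bounds));
}
```

The tiled layer's autoresizingMask stays kCALayerNotSizable, as noted above, so the system never resizes it behind this method's back.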