How to get Vuforia working with AVCaptureVideoPreviewLayer and SceneKit? - rendering

I am developing an Augmented Reality app using the Vuforia SDK. I am trying to use AVCaptureVideoPreviewLayer and SceneKit for the application's rendering instead of the raw OpenGL calls provided by the Vuforia sample code.
I got AVCaptureVideoPreviewLayer and SCNView working without Vuforia, i.e. I managed to draw a 3D scene on top of the camera video background. The code is at https://github.com/lge88/scenekit-test0/blob/master/scenekit-test0/GameViewController.swift#L74-L85:
func initViews() {
    let rootView = self.view
    let scnView = createSceneView()
    let scene = createScene()
    scnView.scene = scene
    let videoView = createVideoView()
    // Order matters: the video view goes in first so the
    // (transparent) SceneKit view draws on top of it.
    rootView.addSubview(videoView)
    rootView.addSubview(scnView)
}
The implementation can be summarized as:
Create a UIView called videoView.
Initialize an AVCaptureVideoPreviewLayer, and add it as a sublayer of videoView.
Create a SCNView called scnView and assign the scene to scnView.
Add both videoView and scnView to the root UIView.
Currently I am trying to integrate the Augmented Reality feature, GameViewController.swift#L68-L71:
initViews()
animateScene()
initControls()
ARServer(size:viewFrame.size, done: initARCallback)
ARServer is a class that takes care of the Vuforia initialization; its implementation is taken from the Vuforia ImageTargets sample code. The tracker is working: it can successfully track the targets of the sample dataset.
However, the AVCaptureVideoPreviewLayer rendering doesn't work correctly: the video rendering area is resized, and the video layer is not updating; it shows a static image captured when the tracker's camera started. Here is how it looks in an iPad screenshot: https://github.com/lge88/scenekit-test0/blob/master/video-preview-layer-issue.png

This strategy could get ugly on you really fast. Better would be to render everything into one view with one OpenGL context. If Vuforia wants to do its own GL stuff, it can share that view/context, too.
Look at Apple's GLCameraRipple sample code for getting live camera imagery into GL, and SCNRenderer for making SceneKit render its content into an arbitrary OpenGL (ES) context.
Alternatively, if you just want to get camera imagery into a SceneKit view, remember you can assign any Core Animation layer to the contents of a material — this should work for AVCaptureVideoPreviewLayer, too.
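To illustrate that last suggestion, here is a minimal, untested sketch of putting an AVCaptureVideoPreviewLayer onto a SceneKit material. The helper name and the plane sizing are illustrative, and it assumes you already have a configured AVCaptureSession; I have not tried this together with Vuforia:

```swift
import SceneKit
import AVFoundation

// Build a SceneKit plane whose material is backed by the live camera preview.
func makeCameraBackedPlane(session: AVCaptureSession, size: CGSize) -> SCNNode {
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = CGRect(origin: .zero, size: size)
    previewLayer.videoGravity = .resizeAspectFill

    // Arbitrary scale: map points to scene units (illustrative).
    let plane = SCNPlane(width: size.width / 100, height: size.height / 100)
    // Any Core Animation layer can be used as material contents,
    // including AVCaptureVideoPreviewLayer.
    plane.firstMaterial?.diffuse.contents = previewLayer
    plane.firstMaterial?.isDoubleSided = true

    return SCNNode(geometry: plane)
}
```

You would then place the returned node behind the rest of the scene (or on a background node pinned to the camera) instead of keeping a separate videoView.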

Related

blurred image from AVCaptureVideoPreviewLayer in iOS7

I am developing an iOS 7 video recording application. The camera screen in our application needs to show a blurred background similar to the one shown in the iOS 7 Control Center: while the video preview is being shown, we need to show the Control Center-style blur over it.
As suggested in WWDC session 226, I have used the code below to get a snapshot of the camera preview, apply a blur to it, and set the blurred image on my view.
UIGraphicsBeginImageContextWithOptions(_camerapreview.frame.size, NO, 0);
[_camerapreview drawViewHierarchyInRect:_camerapreview.bounds afterScreenUpdates:YES];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
lightImage = [newImage applyLightEffect];
Here _camerapreview is a UIView which contains the AVCaptureVideoPreviewLayer. The newImage obtained from the context is, for some reason, black.
However, if I use [_camerapreview snapshotViewAfterScreenUpdates:NO], it returns a UIView with the preview layer content, but there is no way to apply a blur to a UIView.
How can I get a blurred image from the camera preview layer?
I would suggest that you put your overlay onto the video preview itself as a composited video layer, and add a blur filter to that. (Check the WWDC AVFoundation sessions and the AVSimpleEditoriOS sample code for compositing images onto video) That way you're staying on the GPU instead of doing readbacks from GPU->CPU, which is slow. Then drop your overlay's UI elements on top of the video preview within a clear background UIView.
That should provide greater performance. As good as Apple? Well, they are using some private stuff developers don't yet have access to...
Quoting Apple from this Technical Q&A:
A: Starting from iOS 7, the UIView class provides a method -drawViewHierarchyInRect:afterScreenUpdates:, which lets you render a snapshot of the complete view hierarchy as visible onscreen into a bitmap context. On iOS 6 and earlier, how to capture a view's drawing contents depends on the underlying drawing technique. This new method -drawViewHierarchyInRect:afterScreenUpdates: enables you to capture the contents of the receiver view and its subviews to an image regardless of the drawing techniques (for example UIKit, Quartz, OpenGL ES, SpriteKit, AV Foundation, etc) in which the views are rendered.
In my experience with AVFoundation it is not like that: if you use that method on a view that hosts a preview layer, you will only obtain the content of the view, without the image from the preview layer. Using -snapshotViewAfterScreenUpdates: will return a UIView that hosts a special layer, and if you try to make an image from that view you won't see anything.
The only solutions I know of are AVCaptureVideoDataOutput and AVCaptureStillImageOutput. Each one has its own limit: the first can't work simultaneously with an AVCaptureMovieFileOutput acquisition, and the latter makes the shutter noise.
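For the AVCaptureVideoDataOutput route, a rough, untested sketch (the class and queue names are my own) that keeps the most recent frame around as a CIImage, so the blur stays on the GPU side via Core Image:

```swift
import AVFoundation
import CoreImage

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let output = AVCaptureVideoDataOutput()
    var latestBlurredImage: CIImage?

    // Attach the data output to an already-configured session.
    func attach(to session: AVCaptureSession) {
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Keep the most recent frame, blurred with Core Image.
        latestBlurredImage = CIImage(cvPixelBuffer: buffer)
            .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: 10])
    }
}
```

You would render latestBlurredImage with a CIContext (or a Metal/GL-backed view) when the blurred snapshot is needed, rather than snapshotting the preview layer.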

Take a screenshot of a UIView whose subviews are camera sessions

I'm building an app where I need to take a screenshot of a view whose subviews are camera sessions (AVFoundation sessions). I've tried this code:
CGRect rect = [self.containerView bounds];
UIGraphicsBeginImageContextWithOptions(rect.size,YES,0.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
[self.containerView.layer renderInContext:context];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Which effectively gets me a UIImage with the views, only that the camera sessions are black:
I've tried the private method UIGetScreenImage() and it works perfectly, but as Apple doesn't allow it, I can't use it. I've also tried the one in Apple's docs, but it's the same. I've tracked the problem down to the AVFoundation sessions using layers. How can I achieve this? The app has a container view with two subviews, each showing a stopped camera session.
If using iOS 7, it's fairly simple and you could do something like this from a UIViewController:
UIView *snapshotView = [self.view snapshotViewAfterScreenUpdates:YES];
You can also refer to this question: iOS: what's the fastest, most performant way to make a screenshot programatically?
For iOS 6 and earlier, I could only find the following Apple Technical Q&A: How do I take a screenshot of my app that contains both UIKit and Camera elements? It boils down to two steps:
Capture the contents of your camera view.
Draw that captured camera content yourself into the graphics context in which you are rendering your UIKit elements. (Similar to what you did in your code)
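Those two steps could look roughly like this (untested sketch; cameraImage would come from your AVCaptureStillImageOutput capture, and the function name is illustrative):

```swift
import UIKit

// Composite a captured camera frame under the UIKit overlay hierarchy.
func composite(cameraImage: UIImage, overlay: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(overlay.bounds.size, true, 0)
    defer { UIGraphicsEndImageContext() }

    // 1. Draw the captured camera content first.
    cameraImage.draw(in: overlay.bounds)
    // 2. Render the UIKit views on top of it.
    _ = overlay.drawHierarchy(in: overlay.bounds, afterScreenUpdates: true)

    return UIGraphicsGetImageFromCurrentImageContext()
}
```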
I too am currently looking for a solution to this problem!
I am currently out at the moment so I can't test what I have found, but take a look at these links:
Screenshots-A Legal Way To Get Screenshots
seems like it's on the right track; here is the
Example Project (and here is the initial post)
When I manage to get it to work I will definitely update this answer!

Optimize QuartzCore Animation with Shadows and Rasterization

Within my iOS app, I have a UIView that needs to be animated, transformed with gestures, shaded (using QuartzCore shadows), and edited. When I perform animations and gestures on this UIView, it is extremely "laggy". The animations aren't very "laggy" on the iPhone, but on the iPad they become almost unresponsive (to the point where it seems like my app is crashing). I've tested my app using Instruments, and the app isn't taking up much memory / CPU / power until the animations begin. I have tested both on the device and on my Intel i7 8GB iMac, and the animations are "laggy" on both.
The animation I am performing is nothing complex, it is simply a translation across the X Axis. After looking through every line of code related to the animation, I found that these lines are the issue(s):
viewer.layer.masksToBounds = NO;
viewer.layer.shadowOffset = CGSizeMake(-1, 1);
viewer.layer.shadowRadius = 3;
viewer.layer.shadowOpacity = 0.3;
The above code adds a shadow to the view that lags whenever I animate it (viewer). If I use the above code, but I add the following line animations work nicely:
viewer.layer.shouldRasterize = YES;
The problem with this code is that it seriously decreases the quality of the content displayed inside the UIView (viewer). Here's an image with shouldRasterize set to YES:
Then the UIView without shouldRasterize:
Those screenshots are both from the same Retina iPad.
The ultimate question: How can I smoothly perform animations on a UIView with shadows (preferably using QuartzCore)? Is there a way to rasterize the content without degrading its quality?
The shadow properties on CALayer can be very inefficient while animating because it requires recalculating the shadow on every frame based on the contents of the layer. Luckily, the expensive part is the shadow path calculation, so if you just create a CGPath representing the shape of your content and assign it to layer.shadowPath then performance will skyrocket.
Since your layer seems to be completely filled with opaque data, the shadowPath is pretty simple:
layer.shadowPath = [UIBezierPath bezierPathWithRect:(CGRect){CGPointZero, layer.bounds.size}].CGPath;
The only downside is you'll need to edit this whenever the size of the layer changes.
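One way to handle that is to set the path from layoutSubviews, so it always tracks the current bounds. A sketch as a UIView subclass (the class name is illustrative):

```swift
import UIKit

final class ShadowedView: UIView {
    override func layoutSubviews() {
        super.layoutSubviews()
        layer.masksToBounds = false
        layer.shadowOffset = CGSize(width: -1, height: 1)
        layer.shadowRadius = 3
        layer.shadowOpacity = 0.3
        // Recompute the path whenever the bounds change, so Core Animation
        // never has to derive the shadow shape from the layer contents.
        layer.shadowPath = UIBezierPath(rect: bounds).cgPath
    }
}
```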

How to draw lines/arcs on a CIImage?

I have a CIImage and I am trying to draw lines along the path of a mouse drag. Can I do that?
I am building an app for the Mac, and I am new to Core Image.
You need to create a graphics context first and then retrieve your image from that. Here's a link that will answer your question. It's for the iPhone, but the Mac and iPhone APIs are very similar:
How to create a UIImage from the current graphics context?
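A rough macOS sketch of that approach (untested; the function and parameter names are mine): render the CIImage into an NSImage and stroke an NSBezierPath over it, using the points collected during the mouse drag.

```swift
import AppKit
import CoreImage

// Draw a polyline over a CIImage and return the combined result.
func drawLines(on image: CIImage, points: [CGPoint]) -> NSImage {
    let rep = NSCIImageRep(ciImage: image)
    let result = NSImage(size: rep.size)

    result.lockFocus()
    // 1. Draw the original image into the current graphics context.
    _ = rep.draw(in: NSRect(origin: .zero, size: rep.size))

    // 2. Stroke the drag path on top of it.
    let path = NSBezierPath()
    path.lineWidth = 2
    if let first = points.first {
        path.move(to: first)
        points.dropFirst().forEach { path.line(to: $0) }
    }
    NSColor.red.setStroke()
    path.stroke()
    result.unlockFocus()

    return result
}
```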

animate images in cocos2d

I am a newbie in cocos2d but have developed several view-based applications.
So what I want to know is:
Is there an alternative in cocos2d to the animationImages property used for animating images in a UIImageView?
I want to do the same thing, i.e. animate a sequence of images like a GIF file.
Thanks in advance....
Animation in cocos2d is handled by the CCAnimation class; you can read the documentation here:
http://www.cocos2d-iphone.org/api-ref/latest-stable/interface_c_c_animation.html
There are a number of examples for animating sprites, using a Sprite Sheet, single Images, a Texture Cache, etc.
This is one of my favorite ones:
http://www.raywenderlich.com/1271/how-to-use-animations-and-sprite-sheets-in-cocos2d