HDR implementation in AVCaptureSession - cocoa-touch

Is there a convenient way to use HDR photo capture in iOS 6 or iOS 7 while capturing an image using AVCaptureSession? I've searched Stack Overflow and couldn't find anything.

No, sadly, this feature is not currently exposed by the APIs. The only way would be to use UIImagePickerController, but that's probably not what you want.
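For completeness, falling back to the system camera UI looks roughly like this. It's a minimal sketch; whether HDR gets applied is decided by the built-in camera, not by anything you control in code.

    // Minimal sketch: present the system camera via UIImagePickerController.
    // HDR behaviour is up to the built-in camera UI, not this code.
    - (void)presentSystemCamera
    {
        if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
            return;
        }
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.delegate = self; // self conforms to UIImagePickerControllerDelegate, UINavigationControllerDelegate
        [self presentViewController:picker animated:YES completion:nil];
    }

    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info
    {
        UIImage *photo = info[UIImagePickerControllerOriginalImage];
        [picker dismissViewControllerAnimated:YES completion:nil];
        // hand `photo` off for whatever processing you need
    }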

Related

How to get first video frame?

I am programming a media player using VLCKit. I want to take a preview picture of the video. How can I do that using VLCKit, or perhaps another tool?
P.S. I've already tried AVFoundation and QTKit, but neither worked: they complain about the video format (.mkv).
You want to use VLCKit's thumbnailer class. It does everything for you.
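A rough sketch of what that could look like (the class and delegate-method names here are based on the MobileVLCKit headers as I recall them, so verify them against the version you're linking):

    #import <MobileVLCKit/MobileVLCKit.h>

    // Sketch: fetch a preview frame with VLCMediaThumbnailer.
    // Keep a strong reference to the thumbnailer until the delegate fires.
    @interface PreviewGrabber : NSObject <VLCMediaThumbnailerDelegate>
    @property (nonatomic, strong) VLCMediaThumbnailer *thumbnailer;
    @property (nonatomic, strong) UIImage *preview;
    @end

    @implementation PreviewGrabber

    - (void)grabPreviewForURL:(NSURL *)videoURL
    {
        VLCMedia *media = [VLCMedia mediaWithURL:videoURL];
        self.thumbnailer = [VLCMediaThumbnailer thumbnailerWithMedia:media andDelegate:self];
        [self.thumbnailer fetchThumbnail];
    }

    - (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail
    {
        self.preview = [UIImage imageWithCGImage:thumbnail];
    }

    - (void)mediaThumbnailerDidTimeOut:(VLCMediaThumbnailer *)mediaThumbnailer
    {
        // No frame could be decoded in time; handle the failure here.
    }

    @end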

How to use photos filters in iOS 7

Quick question here; I didn't find the answer on Stack Overflow or in the Apple docs. Is it possible to use filters in camera mode the way Apple does in the Camera application?
If you want to apply filters to a photo, whether selected from the photo gallery or captured with the camera, you can refer to this link: Apply Filter on Image
OK, it seems it's not possible to use Apple's own filter options. That's what I wanted to know. I'll check alternatives. Thank you.
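For what it's worth, Apple's built-in Camera.app filter pipeline isn't public, but Core Image lets you apply filters to a captured still yourself. A minimal sketch (the sepia filter and intensity value are just illustrative):

    #import <CoreImage/CoreImage.h>

    // Sketch: apply a Core Image filter to an existing UIImage.
    // CISepiaTone and the 0.8 intensity are illustrative; any CIFilter works the same way.
    static UIImage *ApplySepia(UIImage *input)
    {
        CIImage *ciInput = [CIImage imageWithCGImage:input.CGImage];
        CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
        [filter setValue:ciInput forKey:kCIInputImageKey];
        [filter setValue:@0.8 forKey:kCIInputIntensityKey];

        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgOutput = [context createCGImage:filter.outputImage fromRect:[filter.outputImage extent]];
        UIImage *result = [UIImage imageWithCGImage:cgOutput];
        CGImageRelease(cgOutput);
        return result;
    }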

AVFoundation capture UIImage

I'm trying to capture one or more UIImages programmatically using AVFoundation.
I set up the session and input devices and everything, but when I try to find explanations on how to actually take the photos, all I get is muddled information about connections and whatnot.
I couldn't find a single example of actually taking a photo and saving it to a UIImage for further processing. All the examples use the constant kCGImagePropertyExifDictionary, which doesn't seem to exist in the iOS 5 SDK.
Can someone please provide code, or a top-to-bottom explanation, of how to take an image from the front-facing camera and save it to a UIImage using AVFoundation?
Thanks a lot!
To use kCGImagePropertyExifDictionary, you should #import <ImageIO/ImageIO.h>.
All of the other information you seek is in the AVFoundation Programming Guide, particularly the Media Capture section.
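To sketch the capture step itself, assuming you already have an AVCaptureSession configured with the front camera as input (the variable names are placeholders):

    // Sketch: grab a still image from a running AVCaptureSession.
    // `session` is assumed to be configured with the front-facing camera as input.
    AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
    stillOutput.outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
    if ([session canAddOutput:stillOutput]) {
        [session addOutput:stillOutput];
    }

    AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                              completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer == NULL) {
            return; // capture failed; inspect `error`
        }
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // use `image` for further processing (dispatch to the main queue for any UI work)
    }];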

360 degree video in MPMoviePlayerController

I am trying to develop an iPhone application that needs to show a 360-degree video like the one, rotating the video as the phone moves. How can I do this? Is it possible with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well if you modify the code accordingly. They already handle direction, accelerometer input, etc., so you don't have to implement all of that yourself.

Library for displaying image sets on an iPad

I'm looking for some cool libraries to display a bunch of images in my iPad application.
I suspect they already exist, so I don't have to start from scratch, but I can't find any good ones.
Check out AQGridView as well.
Check out KTPhotoBrowser; I use it in a universal app, and it works pretty well.
Also, search GitHub for "Photo Gallery" to find more open-source solutions.
See the answers to these questions on Stack Overflow:
How To Create A Gallery on iOS
float:left in objective-c
There are probably more questions about the same subject. Seek and you shall find.
I would simply create a web page with all the images, and then display it using UIWebView.
Did you want to do the coding yourself?
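If you do go the web-view route, a bare-bones sketch might look like this (the image file names are placeholders for whatever you bundle with the app):

    // Sketch: build a trivial HTML gallery and show it in a UIWebView.
    // The image names are placeholders for files shipped in the app bundle.
    NSMutableString *html = [NSMutableString stringWithString:@"<html><body>"];
    for (NSString *name in @[ @"photo1.jpg", @"photo2.jpg", @"photo3.jpg" ]) {
        [html appendFormat:@"<img src='%@' width='240' style='margin:8px'/>", name];
    }
    [html appendString:@"</body></html>"];

    UIWebView *webView = [[UIWebView alloc] initWithFrame:self.view.bounds];
    [webView loadHTMLString:html baseURL:[[NSBundle mainBundle] bundleURL]];
    [self.view addSubview:webView];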
I found the Apple example "LazyTableImages" to be a useful and elegant way to display many images in a table view. It's important to a) keep the scrolling nice and smooth, and b) not freeze the user interface while downloading.
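The core idea from that sample, reduced to a sketch with plain GCD (the `imageURLs` array and the "PhotoCell" identifier are assumptions):

    // Sketch: load each image off the main thread, then update the cell on the main queue.
    // Assumes `self.imageURLs` holds NSURLs and a cell class is registered for @"PhotoCell".
    - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
    {
        UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"PhotoCell"];
        NSURL *url = self.imageURLs[indexPath.row];
        cell.imageView.image = nil; // placeholder while the real image loads

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSData *data = [NSData dataWithContentsOfURL:url];
            UIImage *image = data ? [UIImage imageWithData:data] : nil;
            dispatch_async(dispatch_get_main_queue(), ^{
                // Re-fetch the cell in case it was reused while the download was in flight.
                UITableViewCell *visibleCell = [tableView cellForRowAtIndexPath:indexPath];
                visibleCell.imageView.image = image;
                [visibleCell setNeedsLayout];
            });
        });
        return cell;
    }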
Have you tried using a UIImageView?