I am making an app like Instagram using the GPUImage framework, in which I have to take photos and videos and share them. I am able to capture photos with this framework, but now I have to capture video and I am struggling with how to switch the camera from photo mode to video mode. Any help or tutorial would be greatly appreciated. This is the code I use for photo mode:
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
{
    self.imagepicker.sourceType = UIImagePickerControllerSourceTypeCamera;

    [[NSBundle mainBundle] loadNibNamed:@"OverlayView" owner:self options:nil];
    self.overlayView.frame = self.imagepicker.cameraOverlayView.frame;
    self.imagepicker.cameraOverlayView = self.overlayView;
    self.overlayView = nil;

    CGSize result = [[UIScreen mainScreen] bounds].size;
    self.imagepicker.showsCameraControls = NO;
    self.imagepicker.allowsEditing = NO;
    self.imagepicker.wantsFullScreenLayout = NO;
    // self.imagepicker.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
}
else
{
    self.imagepicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
}
In my case, I'm using GPUImage to do both (pictures and videos). Therefore I've created two objects: one of type GPUImageStillCamera (pictures) and the other of type GPUImageVideoCamera (videos).
So whenever you need to switch between cameras, you basically stop the GPUImageStillCamera capture and initialize a video camera (note that you have to adapt this snippet to your project):
func initializeVideoCamera() {
    // Stop the capture of GPUImageStillCamera
    stillCamera.stopCameraCapture()

    videoCamera = GPUImageVideoCamera.init(sessionPreset: AVCaptureSessionPreset1920x1080, cameraPosition: .Back)
    videoCamera?.outputImageOrientation = .Portrait
    videoCamera?.addTarget(filter)

    // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    unlink(pathToMovieFile)

    initializeWriteWithPath(pathToMovieFile)
    videoCamera?.startCameraCapture()
}
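Switching back to photo mode is symmetric: stop the video capture, close out the movie file, and bring the still camera back up. A sketch in Objective-C, to match the rest of this page (stillCamera, videoCamera, movieWriter, and filter are assumed to be your own ivars from the same setup):

- (void)initializeStillCamera
{
    // Stop video capture and finish writing the movie before switching
    [videoCamera stopCameraCapture];
    [movieWriter finishRecording];

    stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                                      cameraPosition:AVCaptureDevicePositionBack];
    stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    [stillCamera addTarget:filter];
    [stillCamera startCameraCapture];
}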
Hopefully someone can help.
I've got an application that currently uses a UIWebView to play audio, which keeps working even if the device is locked because I enable mediaPlaybackAllowsAirPlay.
However, I've started to move the code over to the new WKWebView and have enabled allowsAirPlayForMediaPlayback; now, when I press the lock button, the web view stops playing the audio.
Any ideas how I can get WKWebView to behave like the UIWebView?
Update: here's an example of my code:
WKUserContentController *wkUController = [[WKUserContentController alloc] init];
WKWebViewConfiguration *theConfiguration = [[WKWebViewConfiguration alloc] init];
theConfiguration.userContentController = wkUController;
theConfiguration.allowsAirPlayForMediaPlayback = YES;

webViewActive = [[WKWebView alloc] initWithFrame:viewForWeb.frame configuration:theConfiguration];
webViewActive.navigationDelegate = self;
webViewActive.frame = viewForWeb.bounds;
[webViewActive setAutoresizingMask:UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth];
[viewForWeb addSubview:webViewActive];

NSURL *nsurl = [NSURL URLWithString:@"URL TO AUDIO"];
NSURLRequest *nsrequest = [NSURLRequest requestWithURL:nsurl];
[webViewActive loadRequest:nsrequest];
Thanks
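For context, lock-screen playback usually also depends on app-level audio configuration, not just the web view. A hedged sketch of that companion setup using the standard AVFoundation audio-session API (this is an assumption about the surrounding app, not part of the original question, and it also presumes the "audio" UIBackgroundModes entry in Info.plist):

#import <AVFoundation/AVFoundation.h>

// Configure the shared audio session for playback so audio can keep
// running when the screen locks (requires the "audio" background mode).
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];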
I'm very new to iOS and Objective-C, and I'm trying to design something that requires taking two consecutive pictures (and saving them both). I just learned how to use the camera in iOS 5, so I was wondering how to implement something like that.
I need to be able to open the camera, take a picture, save it, and then automatically bring the camera back up to take another picture.
P.S. I've been using this to access the camera:
- (void)useCamera
{
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
        imagePicker.delegate = self;
        imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        // kUTTypeImage is declared in <MobileCoreServices/MobileCoreServices.h>
        imagePicker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeImage, nil];
        imagePicker.allowsEditing = NO;
        [self presentModalViewController:imagePicker animated:YES];
        newMedia = YES;
    }
}
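One way to chain the second shot is from the picker's delegate callback: save the image, and simply leave the picker presented so the camera comes straight back. A sketch under those assumptions (picturesTaken is a hypothetical counter ivar, and self is the picker's delegate as in the snippet above):

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, NULL);

    picturesTaken++;   // hypothetical counter ivar
    if (picturesTaken >= 2) {
        picturesTaken = 0;
        [self dismissModalViewControllerAnimated:YES];
    }
    // Otherwise do nothing: the picker stays up and the user can
    // immediately take the second picture.
}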
Hint: use AVCaptureDevice, AVCaptureDeviceInput, AVCaptureStillImageOutput, and AVCaptureSession.
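A minimal sketch of the session setup that hint points at (pre-iOS 10 AVFoundation API, ARC assumed, error handling trimmed):

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
[stillOutput setOutputSettings:[NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                           forKey:AVVideoCodecKey]];
if ([session canAddOutput:stillOutput]) {
    [session addOutput:stillOutput];
}

[session startRunning];
// Then call -captureStillImageAsynchronouslyFromConnection:completionHandler:
// twice, kicking off the second capture from the first one's completion
// block, to grab two frames back to back with no UI round trip.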
These two links might be helpful. I am fairly new to iOS programming myself, but this looks promising:
How fast can iPhone to be programmed take 2 pictures at one time?
iPhone SDK 4 AVFoundation - How to use captureStillImageAsynchronouslyFromConnection correctly?
I want to add different effects to a UIImage. Currently I am trying to use the GPUImage library: https://github.com/BradLarson/GPUImage
But I am unable to run most of the examples provided. The only working example for me is FilterShowcase, and it works great.
The problem is that it only works with the camera. What I want is to load a static image into a UIImageView and apply those filters to it.
I tried, but I was unable to do it.
This is how I tried it:
- (void)setupFilter
{
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
    // videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 350)];
    imgView.image = [UIImage imageNamed:@"WID-small.jpg"];
    [self.view addSubview:imgView];

    BOOL needsSecondImage = NO;
    switch (filterType)
    {
        case GPUIMAGE_SEPIA:
        {
            self.title = @"Sepia Tone";
            self.filterSettingsSlider.hidden = NO;
            [self.filterSettingsSlider setValue:1.0];
            [self.filterSettingsSlider setMinimumValue:0.0];
            [self.filterSettingsSlider setMaximumValue:1.0];

            filter = [[GPUImageSepiaFilter alloc] init];
            sourcePicture = [[GPUImagePicture alloc] initWithImage:imgView.image smoothlyScaleOutput:YES];
            [sourcePicture processImage];
            [sourcePicture addTarget:filter];
        }; break;
        // ... (remaining filter cases elided)
What I actually tried to do is load a still image instead of the videoCamera, but it doesn't work.
If anyone can help me, it would be highly appreciated.
First, there are a couple of reasons why the other sample applications might not be building for you. As is stated in this answer, make sure that you have the scheme for the application selected in the upper left of your Xcode window, not the GPUImage framework project. If changing that doesn't help, exit out of Xcode, delete the relevant project directories from your DerivedData directory, and restart Xcode. That sometimes seems to be needed due to a bug in recent Xcode versions.
The filtering of an image is described in the documentation for the project, which is in the Readme.md file and on the page you link above. In particular, see the section titled "Processing a still image":
There are a couple of ways to process a still image and create a result. The first way you can do this is by creating a still image source object and manually creating a filter chain:
UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:stillImageFilter];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentlyProcessedOutput];
For single filters that you wish to apply to an image, you can simply do the following:
GPUImageSepiaFilter *stillImageFilter2 = [[GPUImageSepiaFilter alloc] init];
UIImage *quickFilteredImage = [stillImageFilter2 imageByFilteringImage:inputImage];
The SimpleImageFilter example application shows how to do this in more detail.
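If the goal is to display the filtered result on screen (as in the question) rather than extract a UIImage, the filter can also be targeted at a GPUImageView instead of a UIImageView. A sketch, reusing the image name from the question above (keep sourcePicture in an ivar so it outlives the call):

GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 350)];
[self.view addSubview:filteredView];

// GPUImagePicture -> filter -> GPUImageView renders straight to the screen
sourcePicture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"WID-small.jpg"]];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[sourcePicture addTarget:sepiaFilter];
[sepiaFilter addTarget:filteredView];
[sourcePicture processImage];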
I have some pretty basic code to capture a still image using AVFoundation.
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];

AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                AVVideoCodecJPEG, AVVideoCodecKey,
                                nil];
[newStillImageOutput setOutputSettings:outputSettings];
[outputSettings release];

AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession beginConfiguration];
newCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;
[newCaptureSession commitConfiguration];

if ([newCaptureSession canAddInput:newVideoInput]) {
    [newCaptureSession addInput:newVideoInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}

self.stillImageOutput = newStillImageOutput;
self.videoInput = newVideoInput;
self.captureSession = newCaptureSession;

[newStillImageOutput release];
[newVideoInput release];
[newCaptureSession release];
My method that captures the still image is also pretty simple, and it prints out the orientation, which is AVCaptureVideoOrientationPortrait:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
                                                                        fromConnections:[[self stillImageOutput] connections]];
    if ([stillImageConnection isVideoOrientationSupported]) {
        NSLog(@"isVideoOrientationSupported - orientation = %d", orientation);
        [stillImageConnection setVideoOrientation:orientation];
    }

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) { /* HANDLE */ }
        };

        if (imageDataSampleBuffer != NULL) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments) {
                NSLog(@"attachments: %@", exifAttachments);
            } else {
                NSLog(@"no attachments");
            }

            self.stillImageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            self.stillImage = [UIImage imageWithData:self.stillImageData];
            UIImageWriteToSavedPhotosAlbum(self.stillImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
        } else {
            completionBlock(nil, error);
        }
    }];
}
So the device understands it's in portrait mode, as it should be; the EXIF attachments show me:
PixelXDimension = 640;
PixelYDimension = 480;
so it seems to know that we're in 640x480, and that means W x H (obviously...).
However, when I email the photo to myself from Apple's Photos app, I get a 480x640 image if I check the properties in Preview. This didn't make any sense to me until I dug further into the image properties and found that the image orientation is set to "6 (Rotated 90 degrees CCW)". I'm sure CCW means counter-clockwise.
So looking at the image in a browser:
http://tonyamoyal.com/stuff/things_that_make_you_go_hmm/photo.JPG
we see the image rotated 90 degrees CCW, and it is 640x480.
I'm really confused about this behavior. When I take a 640x480 still image using AVFoundation, I would expect the default to have no rotated orientation. I expect a 640x480 image oriented exactly as my eye sees the image in the preview layer. Can someone explain why this is happening and how to configure the capture so that when I save my image to the server to later display in a web view, it is not rotated 90 degrees CCW?
This happens because the orientation set in the metadata of the new image is being affected by the orientation of the AV system that creates it. The layout of the actual image data is, of course, different from the orientation mentioned in your metadata. Some image viewing programs respect the metadata orientation, some ignore it.
You can affect the metadata orientation of the AV system by calling:
AVCaptureConnection *videoConnection = ...;
if ([videoConnection isVideoOrientationSupported])
[videoConnection setVideoOrientation:AVCaptureVideoOrientationSomething];
You can affect the metadata orientation of a UIImage by calling:
UIImage *rotatedImage = [[UIImage alloc] initWithCGImage:image.CGImage scale:1.0f orientation:UIImageOrientationSomething];
But the actual data from the AVCapture system will always appear with the wider dimension as X and the narrower dimension as Y, and will appear to be oriented in LandscapeLeft.
If you want the actual data to line up with what your metadata claims, you need to modify the actual data. You can do this by writing the image out to a new image using CGContexts and affine transforms. Or there is an easier workaround: use the UIImage+Resize package as discussed here, and resize the image to its current size by calling:
UIImage *rotatedImage = [image resizedImage:CGSizeMake(image.size.width, image.size.height) interpolationQuality:kCGInterpolationDefault];
This will rectify the data's orientation as a side effect.
If you don't want to include the whole UIImage+Resize package, you can check out its code and strip out the parts where the data is transformed.
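If pulling in the category is overkill, the same effect can be had by redrawing through UIKit, which applies the imageOrientation while drawing. A minimal sketch (ARC assumed; image is the UIImage to normalize):

// Redraw into a fresh bitmap context; the result's pixel data is "up",
// so viewers that ignore orientation metadata display it correctly.
UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
[image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();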
I am trying to explore iPad development using PhoneGap. The problem I have now is this:
I have four different launch images for the different orientations. When the app is launched, the correct launch image is shown for a moment. After this, PhoneGap loads an image view with Default.png and displays it until the web view has fully loaded. The problem lies here: the image view is autorotated based on the current orientation. So if the current orientation is LandscapeLeft, the image view rotates Default.png before displaying it, which is not what I want and is exactly why I have different launch images. So in effect, you see LandscapeLeft.png and then the auto-rotated Default.png before you get to see the web view.
So I tried changing the PhoneGap delegate like this (in applicationDidFinishLaunching):
UIImage *image = nil;
if ([UIApplication sharedApplication].statusBarOrientation == UIInterfaceOrientationLandscapeRight) {
    NSLog(@"In default5");
    image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"Default5" ofType:@"png"]];
}
if ([UIApplication sharedApplication].statusBarOrientation == UIInterfaceOrientationLandscapeLeft) {
    NSLog(@"In default6");
    image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"Default6" ofType:@"png"]];
}
if ([UIApplication sharedApplication].statusBarOrientation == UIInterfaceOrientationPortraitUpsideDown) {
    NSLog(@"In default7");
    image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"Default7" ofType:@"png"]];
}

imageView = [[UIImageView alloc] initWithImage:image];
[image release];
It didn't work, and upon debugging I found that the statusBarOrientation is always portrait. That may be because PhoneGap sets portrait as the status bar orientation in the same applicationDidFinishLaunching method (before loading this image view).
Can someone tell me how to load the image view with the correct image?
Use these names:
Default-LandscapeLeft-ipad.png
Default-LandscapeRight-ipad.png
Default-Portrait-ipad.png
Default-PortraitUpsideDown-ipad.png
More: http://www.weston-fl.com/blog/?p=840
applicationDidFinishLaunching is too early for the app to receive orientation change notifications from the OS, hence you cannot determine the orientation there.
So the best solution is to do it at a lifecycle stage that is capable of determining the current orientation. In my case, that was webViewDidStartLoad.
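A sketch of what that override might look like, reusing the Default5/6/7 images from the question (this assumes the app delegate subclasses the PhoneGap delegate that implements webViewDidStartLoad:, and that imageView is the ivar PhoneGap shows while loading):

- (void)webViewDidStartLoad:(UIWebView *)theWebView
{
    // By this point the app reports the real status bar orientation
    UIInterfaceOrientation o = [UIApplication sharedApplication].statusBarOrientation;

    NSString *name = nil;   // nil -> keep the default portrait image
    if (o == UIInterfaceOrientationLandscapeRight)          name = @"Default5";
    else if (o == UIInterfaceOrientationLandscapeLeft)      name = @"Default6";
    else if (o == UIInterfaceOrientationPortraitUpsideDown) name = @"Default7";

    if (name) {
        imageView.image = [UIImage imageNamed:name];
    }
    [super webViewDidStartLoad:theWebView];
}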