UIActivityViewController progress & result - iOS 7

I am using UIActivityViewController to save a bunch of video assets to the user's camera roll. The problem is that there is no way to know whether the save to the photo library succeeded, or to get an error code if it failed. Is there any way to override the default behavior of the built-in activity? The completionHandler of UIActivityViewController is pretty useless in this regard.
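For context, this is everything the iOS 7 completion handler gives you; note that there is no NSError anywhere in the callback:
// The UIActivityViewControllerCompletionHandler only reports which activity
// ran and a completed flag -- there is no error for a failed save.
// (activityVC stands in for the controller presenting the video items.)
activityVC.completionHandler = ^(NSString *activityType, BOOL completed) {
    NSLog(@"activity %@ finished, completed = %d", activityType, completed);
};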

Use ALAssetsLibrary and its completionBlock to do the save yourself:
ALAssetsLibrary *lib = [[[ALAssetsLibrary alloc] init] autorelease];
if ([lib videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
    [lib writeVideoAtPathToSavedPhotosAlbum:videoURL
                            completionBlock:^(NSURL *assetURL, NSError *error) {
        if (!error)
        {
            [self performSelectorOnMainThread:@selector(dismissAlertView)
                                   withObject:nil
                                waitUntilDone:NO];
        }
    }];
}
- (void)dismissAlertView
{
    // dismiss your alert view here
}
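If you also need the error, one option (just a sketch, not something I have shipped) is to exclude the built-in UIActivityTypeSaveToCameraRoll and pass in your own UIActivity subclass that performs the save itself, so a completion block like the one above runs under your control:
// Sketch of a custom save activity; the identifier and title are placeholders.
@interface SaveVideoActivity : UIActivity
@property (nonatomic, retain) NSURL *videoURL;
@end

@implementation SaveVideoActivity

- (NSString *)activityType  { return @"com.example.SaveVideoActivity"; }
- (NSString *)activityTitle { return @"Save Video"; }

- (BOOL)canPerformWithActivityItems:(NSArray *)activityItems
{
    for (id item in activityItems) {
        if ([item isKindOfClass:[NSURL class]]) return YES;
    }
    return NO;
}

- (void)prepareWithActivityItems:(NSArray *)activityItems
{
    for (id item in activityItems) {
        if ([item isKindOfClass:[NSURL class]]) self.videoURL = item;
    }
}

- (void)performActivity
{
    ALAssetsLibrary *lib = [[[ALAssetsLibrary alloc] init] autorelease];
    [lib writeVideoAtPathToSavedPhotosAlbum:self.videoURL
                            completionBlock:^(NSURL *assetURL, NSError *error) {
        // The NSError the built-in activity never exposes is available here.
        [self activityDidFinish:(error == nil)];
    }];
}

- (void)dealloc
{
    [_videoURL release];
    [super dealloc];
}

@end
Pass an instance via applicationActivities: when you create the UIActivityViewController, and add UIActivityTypeSaveToCameraRoll to excludedActivityTypes so only your version is shown.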

iOS 7 non-deprecated solution to resuming background music from duck [closed]

I am able to duck my background audio while playing new sounds. However, I am unable to bring the background audio back up to full volume afterwards: when my delegate tries to "unduck", the audio just stays ducked. The old fix for this was AudioSessionSetProperty, but that's deprecated in iOS 7 and Apple doesn't give any hints in the deprecation warnings or documentation.
I call this method when the view loads:
- (void)configureAVAudioSession
{
    // get the app's audio session singleton
    AVAudioSession *session = [AVAudioSession sharedInstance];

    NSError *error;
    BOOL success = [session setCategory:AVAudioSessionCategoryPlayback
                            withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                  error:&error];
    if (!success) {
        NSLog(@"AVAudioSession error: %@", error);
    }

    success = [session setActive:YES error:&error];
    if (!success) {
        NSLog(@"Error setting active: %@", error);
    } else {
        NSLog(@"success setting active");
    }
}
This is how I play the audio:
- (void)playTimeOnGo
{
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                         pathForResource:@"just-like-magic"
                                                  ofType:@"mp3"]];
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    self.audioPlayer.delegate = (id<AVAudioPlayerDelegate>)self;

    // get the app's audio session singleton and switch to the ducking option
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error;
    BOOL success = [session setCategory:AVAudioSessionCategoryPlayback
                            withOptions:AVAudioSessionCategoryOptionDuckOthers
                                  error:&error];

    [self.audioPlayer prepareToPlay];
    [self.audioPlayer play];
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}
This is my delegate callback, which fires when the audio finishes and should resume (unduck) the background audio:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    [self configureAVAudioSession];
    NSLog(@"playback ended");
}
So how do I unduck the background music again without deprecated APIs? Calling [self configureAVAudioSession] apparently doesn't work.
Ladies and gentlemen, here is a non-deprecated, working example of how to use ducking properly in iOS 7.
In your view-load method (viewDidLoad or similar), call this method with the bool set to YES. This mixes your audio with the background audio and prepares it for ducking:
- (void)configureAVAudioSession:(BOOL)active
{
    // get the app's audio session singleton
    AVAudioSession *session = [AVAudioSession sharedInstance];

    NSError *error;
    BOOL success = [session setCategory:AVAudioSessionCategoryPlayback
                            withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                  error:&error];
    if (!success) {
        NSLog(@"AVAudioSession error: %@", error);
    }

    success = [session setActive:active error:&error];
    if (!success) {
        NSLog(@"Error setting active: %@", error);
    }
}
Play your audio file with this method and watch the ducking happen. Remember to set the delegate, as in this example, EVERY time you play a new audio alert; don't set it just once in your view-load method.
- (void)playTimeOnGo
{
    // alertName is the name of your audio file without the extension;
    // alertExtension is the extension itself, e.g. @"mp3"
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                         pathForResource:_dataManager.optionsSettings.setStarts.alertName
                                                  ofType:_dataManager.optionsSettings.setStarts.alertExtension]];
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    self.audioPlayer.delegate = (id<AVAudioPlayerDelegate>)self;

    // get the app's audio session singleton and switch to the ducking option
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error;
    BOOL success = [session setCategory:AVAudioSessionCategoryPlayback
                            withOptions:AVAudioSessionCategoryOptionDuckOthers
                                  error:&error];

    [self.audioPlayer prepareToPlay];
    [self.audioPlayer play];
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}
The delegate will call this method after playback and turn the background music's volume back up :) Don't be confused by the thread here: you can just call the method directly if you wish, but that stalls the main thread for a second, maybe even two. If that's okay with you, skip the thread and simply call [self configureAVAudioSession:NO];
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    // Passing nil as the object means the BOOL argument arrives as NO,
    // so the session is deactivated and other audio is unducked.
    _playBackFinishedDelegateThead = [[NSThread alloc] initWithTarget:self
                                                             selector:@selector(configureAVAudioSession:)
                                                               object:nil];
    [_playBackFinishedDelegateThead start];
}
This example IS 100% tested and working in my app.
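For completeness, another non-deprecated route (an assumption on my part, not part of the tested example above) is simply to deactivate the session when your alert finishes; passing the notify-others option also lets other apps know they can resume at full volume.
// Sketch: deactivating the ducking session lets background audio return to
// full volume; the option additionally notifies other apps they may resume.
// setActive:withOptions:error: has been available since iOS 6.
NSError *error = nil;
BOOL ok = [[AVAudioSession sharedInstance]
              setActive:NO
            withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                  error:&error];
if (!ok) {
    NSLog(@"Error deactivating session: %@", error);
}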

Switching from AFNetworking to RestKit

I started developing my application using AFNetworking. Everything went fine until I wanted to use Core Data. I know there is an additional class (AFIncrementalStore) for that, but because I'm new to iOS development and there isn't much information about it, I decided to switch to RestKit, where a lot more information is available. Following an AFNetworking tutorial, I created an API class with this method in it:
+ (API *)sharedInstance
{
    static API *sharedInstance = nil;
    static dispatch_once_t oncePredicate;
    dispatch_once(&oncePredicate, ^{
        sharedInstance = [[self alloc] initWithBaseURL:[NSURL URLWithString:kAPIHost]];
    });
    return sharedInstance;
}
#pragma mark - init

// initialize the API class with the destination host name
- (API *)init
{
    // call super init
    self = [super init];
    if (self != nil) {
        // initialize the object
        user = nil;
        [self registerHTTPOperationClass:[AFJSONRequestOperation class]];
        // Accept HTTP header; see http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.1
        [self setDefaultHeader:@"Accept" value:@"application/json"];
    }
    return self;
}
- (void)loginCommand:(NSMutableDictionary *)params onCompletion:(JSONResponseBlock)completionBlock
{
    NSLog(@"%@%@", kAPIHost, kAPILogin);
    NSMutableURLRequest *apiRequest =
        [self multipartFormRequestWithMethod:@"POST"
                                        path:kAPILogin
                                  parameters:params
                   constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
                       // TODO: attach file if needed
                   }];

    AFJSONRequestOperation *operation =
        [[AFJSONRequestOperation alloc] initWithRequest:apiRequest];
    [operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
        // success!
        NSLog(@"SUCCESSSS!");
        completionBlock(responseObject);
    } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        // failure
        NSLog(@"FAILUREE!");
        completionBlock([NSDictionary dictionaryWithObject:[error localizedDescription]
                                                    forKey:@"error"]);
    }];
    [operation start];
}
This handles the communication between my web service and the application.
In the view controller itself I call the method like this:
[[API sharedInstance] loginCommand:[NSMutableDictionary dictionaryWithObjectsAndKeys:
                                       _txtLogin.text, @"email",
                                       _txtPass.text, @"pwd", nil]
                      onCompletion:^(NSDictionary *json) {
    // completion
    if (![json objectForKey:@"error"]) {
        NSLog(@"status %@", [json valueForKeyPath:@"data.status"]);
        if ([[json valueForKeyPath:@"data.status"] intValue] == 200) {
            // everything is okay and login is successful
        } else {
            // show validation
        }
    } else {
        NSLog(@"Cannot connect to the server");
    }
}];
This is how I do it with AFNetworking, but what are the differences when I do this in RestKit? I searched for tutorials, but after the update from RestKit 1.0 to 2.0 a lot of them are outdated. I hope somebody can help me out with this!
Kind regards!
I used this tutorial to get started with RestKit. It shows you how to use it, and you can pick up the other details from there: http://www.youtube.com/watch?v=dFi9t8NW0oY
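To give you a concrete starting point, a RestKit 0.20-style version of the login call might look roughly like this. The Login class, the "data"/"status" key paths and the kAPIHost/kAPILogin constants are placeholders taken from your question, so adjust them to your own API and mappings:
#import <RestKit/RestKit.h>

// Sketch: one-time object manager setup (do this in your API singleton).
RKObjectManager *manager = [RKObjectManager managerWithBaseURL:[NSURL URLWithString:kAPIHost]];
[manager.HTTPClient setDefaultHeader:@"Accept" value:RKMIMETypeJSON];

// Describe how the JSON response maps onto your (placeholder) Login class.
RKObjectMapping *loginMapping = [RKObjectMapping mappingForClass:[Login class]];
[loginMapping addAttributeMappingsFromArray:@[@"status"]];
RKResponseDescriptor *descriptor =
    [RKResponseDescriptor responseDescriptorWithMapping:loginMapping
                                                 method:RKRequestMethodPOST
                                            pathPattern:kAPILogin
                                                keyPath:@"data"
                                            statusCodes:RKStatusCodeIndexSetForClass(RKStatusCodeClassSuccessful)];
[manager addResponseDescriptor:descriptor];

// Sketch: the login request itself; RestKit maps the response for you.
NSDictionary *params = @{@"email": _txtLogin.text, @"pwd": _txtPass.text};
[manager postObject:nil
               path:kAPILogin
         parameters:params
            success:^(RKObjectRequestOperation *operation, RKMappingResult *mappingResult) {
                NSLog(@"login mapped result: %@", [mappingResult firstObject]);
            }
            failure:^(RKObjectRequestOperation *operation, NSError *error) {
                NSLog(@"Cannot connect to the server: %@", error);
            }];
The main difference from the AFNetworking code above is that instead of handing a raw JSON dictionary to a completion block, you describe the response once with a mapping and a response descriptor, and RestKit hands you mapped objects back.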

iPad capturing 16:9 photos

I am building a prototype app on iOS, and I'm cannibalizing some Apple sample code to do it (thin ice, I know; this code uses goto statements :\ ). I am using the AVCam project from Session 520 - What's New in Camera Capture. I don't need video capture capability, just still photos.
The device inputs and outputs are set up thusly:
// Init the device inputs
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];

// Set up the still image file output
AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
[newStillImageOutput setOutputSettings:outputSettings];

// Create session (uses the default AVCaptureSessionPresetHigh)
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

// Add inputs and output to the capture session
if ([newCaptureSession canAddInput:newVideoInput]) {
    [newCaptureSession addInput:newVideoInput];
}
if ([newCaptureSession canAddInput:newAudioInput]) {
    [newCaptureSession addInput:newAudioInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}

[self setStillImageOutput:newStillImageOutput];
[self setVideoInput:newVideoInput];
[self setAudioInput:newAudioInput];
[self setSession:newCaptureSession];
And here is the method that’s called when I tap the shutter button:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection =
        [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:orientation];

    [[self stillImageOutput]
     captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                 completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error)
            {
                if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)])
                {
                    [[self delegate] captureManager:self didFailWithError:error];
                }
            }
        };

        if (imageDataSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            UIImage *image = [[UIImage alloc] initWithData:imageData];

            if ([self.delegate respondsToSelector:@selector(captureManagerCapturedImage:)])
            {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.delegate captureManagerCapturedImage:image];
                });
            }

            [library writeImageToSavedPhotosAlbum:[image CGImage]
                                      orientation:(ALAssetOrientation)[image imageOrientation]
                                  completionBlock:completionBlock];
        }
        else
        {
            completionBlock(nil, error);
        }

        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)])
        {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}
This code successfully captures an image and saves it to the library. However, at some point while I was working on it, it changed from capturing 5-megapixel 4:3 images to capturing 1920x1080 16:9 images. I can’t find anywhere that the aspect ratio is specified, and I didn’t change any of the code relating to the configuration of the camera, capture sessions, or capture connection. Why did my camera start taking 16:9 photos?
Update: I just re-ran Apple's original sample code, and it appears that it also saves 16:9 images captured directly from the video. It is quite possible that I was insane before, or that I took a test shot with Camera.app and was looking at that. So my real question is: how do I show a live feed from the camera on screen while I'm shooting, yet capture a full-resolution photo? I can't use UIImagePickerController, because I need to be able to overlay things on top of the live camera feed.
Update 2: I was able to solve this by throwing out the AVCapture code I was using. It turns out that UIImagePickerController does what I needed. I didn’t realize you could overlay custom controls - I thought it took over the whole screen until you were done taking a picture.
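A minimal sketch of that setup (the overlay view and its shutter button are whatever custom UI you need):
// Sketch: live camera feed with your own controls overlaid on top.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;                   // hide the stock shutter UI
picker.cameraOverlayView = self.customOverlayView; // placeholder for your own controls
picker.delegate = self; // UIImagePickerControllerDelegate & UINavigationControllerDelegate
[self presentViewController:picker animated:YES completion:nil];

// Later, wired to your overlay's shutter button:
[picker takePicture];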
If you're capturing frames from a video source, you'll end up with a 16:9 aspect ratio. Capturing frames from a video stream and taking photos are different things.
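If you would rather stay with AVFoundation, asking the session for the photo preset (instead of the default high preset, which is video-oriented) should give you full-resolution 4:3 stills. A sketch, using the session variable from the question:
// Sketch: switch the session to photo quality before starting it.
if ([newCaptureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    newCaptureSession.sessionPreset = AVCaptureSessionPresetPhoto;
}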

Upload photo using new iOS Facebook SDK API (3.0)

How can I upload a photo to Facebook from an iOS app using their new API/SDK? I've already tried and I'm not getting anywhere; I just keep running in circles. Here is the code I currently have:
- (void)dataForFacebook
{
    self.postParams = [[NSMutableDictionary alloc] initWithObjectsAndKeys:
                          self.uploadPhoto.image, @"picture", nil];
}
- (void)uploadToFacebook
{
    [self dataForFacebook];
    NSLog(@"Going to facebook: %@", self.postParams);

    // Hide keyboard if showing when button clicked
    if ([self.photoCaption isFirstResponder]) {
        [self.photoCaption resignFirstResponder];
    }

    // Add user message parameter if user filled it in
    if (![self.photoCaption.text isEqualToString:kPlaceholderPostMessage] &&
        ![self.photoCaption.text isEqualToString:@""])
    {
        [self.postParams setObject:self.photoCaption.text forKey:@"message"];
    }

    [FBRequestConnection startWithGraphPath:@"me/feed"
                                 parameters:self.postParams
                                 HTTPMethod:@"POST"
                          completionHandler:^(FBRequestConnection *connection,
                                              id result,
                                              NSError *error)
    {
        NSString *alertText;
        if (error) {
            alertText = [NSString stringWithFormat:@"error: domain = %@, code = %d",
                         error.domain, error.code];
        } else {
            alertText = [NSString stringWithFormat:@"Posted action, id: %@",
                         [result objectForKey:@"id"]];
        }
        // Show the result in an alert
        [[[UIAlertView alloc] initWithTitle:@"Result"
                                    message:alertText
                                   delegate:self
                          cancelButtonTitle:@"OK!"
                          otherButtonTitles:nil] show];
    }];
}
Your code is fine; only a couple of slight changes are needed:
Add the image to the dictionary as NSData, like
[params setObject:UIImagePNGRepresentation(_image) forKey:@"picture"];
and change the graph path to "me/photos" instead of "me/feed".
Make these changes; it worked for me.
Remember that you need the "publish_actions" permission.
"me/photos" puts the photo in the Photos section of your Facebook profile; "me/feed" is just a post on the timeline.

Error trying to assign a __block ALAsset from inside assetForURL:resultBlock:

I am trying to create a method that returns an ALAsset for a given asset URL. (I need to upload the asset later and want to do that outside the result block, using the result.)
+ (ALAsset *)assetForPhoto:(Photo *)photo
{
    ALAssetsLibrary *library = [[[ALAssetsLibrary alloc] init] autorelease];
    __block ALAsset *assetToReturn = nil;

    NSURL *url = [NSURL URLWithString:photo.assetUrl];
    NSLog(@"assetForPhoto: %@[", url);

    [library assetForURL:url resultBlock:^(ALAsset *asset)
    {
        NSLog(@"asset: %@", asset);
        assetToReturn = asset;
        NSLog(@"asset: %@ %d", assetToReturn, [assetToReturn retainCount]);
    } failureBlock:^(NSError *error)
    {
        assetToReturn = nil;
    }];

    NSLog(@"assetForPhoto: %@]", url);
    NSLog(@"assetToReturn: %@", assetToReturn); // invalid access exception coming here
    return assetToReturn;
}
The problem is that assetToReturn gives an EXC_BAD_ACCESS.
Is there some problem with assigning pointers from inside the block? The examples of blocks I've seen always use simple types like integers.
A few things:
You must keep the ALAssetsLibrary instance that created the ALAsset alive for as long as you use the asset.
You must register an observer for ALAssetsLibraryChangedNotification; when it is received, any ALAssets you hold (and any other AssetsLibrary objects) are no longer valid and must be refetched. This can happen at any time. (A sketch of the registration follows the code below.)
You shouldn't expect -assetForURL:resultBlock:failureBlock:, or any of the AssetsLibrary methods that take a failureBlock:, to be synchronous. They may need to prompt the user for access to the library, and their blocks will not always be executed immediately. It's better to put the work that depends on success inside the success block itself.
Only if you absolutely must make this method synchronous (which I'd advise you not to do), you'll need to wait on a semaphore after calling assetForURL:resultBlock:failureBlock:, and optionally spin the runloop if you would otherwise block the main thread.
The following implementation should work as a synchronous call in all situations, but really, you should try very hard to make your code asynchronous instead.
- (ALAsset *)assetForURL:(NSURL *)url
{
    __block ALAsset *result = nil;
    __block NSError *assetError = nil;
    dispatch_semaphore_t sema = dispatch_semaphore_create(0);

    [[self assetsLibrary] assetForURL:url resultBlock:^(ALAsset *asset) {
        result = [asset retain];
        dispatch_semaphore_signal(sema);
    } failureBlock:^(NSError *error) {
        assetError = [error retain];
        dispatch_semaphore_signal(sema);
    }];

    if ([NSThread isMainThread]) {
        while (!result && !assetError) {
            [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode
                                     beforeDate:[NSDate distantFuture]];
        }
    } else {
        dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
    }
    dispatch_release(sema);
    [assetError release];

    return [result autorelease];
}
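As for the ALAssetsLibraryChangedNotification point above, registering for library changes is an ordinary notification observation. A sketch, with placeholder method and selector names:
// Sketch: observe library changes so any cached ALAssets can be refetched.
- (void)startObservingAssetsLibrary
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(assetsLibraryDidChange:)
                                                 name:ALAssetsLibraryChangedNotification
                                               object:nil];
}

- (void)assetsLibraryDidChange:(NSNotification *)note
{
    // Everything fetched earlier (ALAsset, ALAssetsGroup, ...) is now stale;
    // refetch it from the same ALAssetsLibrary instance before using it again.
}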
You should retain and autorelease the asset:
// ...
assetToReturn = [asset retain];
// ...
return [assetToReturn autorelease];