Play video in Mac OS X 10.7 - Objective-C

What is the simplest way to play a video programmatically with Objective-C in Mac OS X 10.7 (Lion)? And if I want to support OS X 10.6 (Snow Leopard) too?
I noticed that the AV Foundation framework from iOS was introduced in OS X 10.7. Unfortunately the documentation seems to be written for iOS, and I found it confusing.

Here's an NSView subclass that plays a video given a URL, using AV Foundation (and therefore Mac OS X 10.7 and up only). It's based on Apple's AVSimplePlayer sample code.
Header:
#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>
@interface RMVideoView : NSView
@property (nonatomic, readonly, strong) AVPlayer* player;
@property (nonatomic, readonly, strong) AVPlayerLayer* playerLayer;
@property (nonatomic, retain) NSURL* videoURL;
- (void) play;
@end
Implementation:
static void *RMVideoViewPlayerLayerReadyForDisplay = &RMVideoViewPlayerLayerReadyForDisplay;
static void *RMVideoViewPlayerItemStatusContext = &RMVideoViewPlayerItemStatusContext;
@interface RMVideoView ()
- (void)onError:(NSError*)error;
- (void)onReadyToPlay;
- (void)setUpPlaybackOfAsset:(AVAsset *)asset withKeys:(NSArray *)keys;
@end
@implementation RMVideoView
@synthesize player = _player;
@synthesize playerLayer = _playerLayer;
@synthesize videoURL = _videoURL;
- (id)initWithFrame:(NSRect)frame {
self = [super initWithFrame:frame];
if (self) {
self.wantsLayer = YES;
_player = [[AVPlayer alloc] init];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:RMVideoViewPlayerItemStatusContext];
}
return self;
}
- (void) dealloc {
[self.player pause];
[self removeObserver:self forKeyPath:@"player.currentItem.status"];
[self removeObserver:self forKeyPath:@"playerLayer.readyForDisplay"];
[_player release];
[_playerLayer release];
[_videoURL release];
[super dealloc];
}
- (void) setVideoURL:(NSURL *)videoURL {
[videoURL retain];
[_videoURL release];
_videoURL = videoURL;
[self.player pause];
[self.playerLayer removeFromSuperlayer];
AVAsset *asset = [AVAsset assetWithURL:self.videoURL];
NSArray *assetKeysToLoadAndTest = [NSArray arrayWithObjects:@"playable", @"hasProtectedContent", @"tracks", @"duration", nil];
[asset loadValuesAsynchronouslyForKeys:assetKeysToLoadAndTest completionHandler:^(void) {
dispatch_async(dispatch_get_main_queue(), ^(void) {
[self setUpPlaybackOfAsset:asset withKeys:assetKeysToLoadAndTest];
});
}];
}
#pragma mark - KVO
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if (context == RMVideoViewPlayerItemStatusContext) {
AVPlayerItemStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status) {
case AVPlayerItemStatusUnknown:
break;
case AVPlayerItemStatusReadyToPlay:
[self onReadyToPlay];
break;
case AVPlayerItemStatusFailed:
[self onError:nil];
break;
}
} else if (context == RMVideoViewPlayerLayerReadyForDisplay) {
if ([[change objectForKey:NSKeyValueChangeNewKey] boolValue]) {
self.playerLayer.hidden = NO;
}
} else {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
}
#pragma mark - Private
- (void)onError:(NSError*)error {
// Notify delegate
}
- (void)onReadyToPlay {
// Notify delegate
}
- (void)setUpPlaybackOfAsset:(AVAsset *)asset withKeys:(NSArray *)keys {
for (NSString *key in keys) {
NSError *error = nil;
if ([asset statusOfValueForKey:key error:&error] == AVKeyValueStatusFailed) {
[self onError:error];
return;
}
}
if (!asset.isPlayable || asset.hasProtectedContent) {
[self onError:nil];
return;
}
if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) { // Asset has video tracks
_playerLayer = [[AVPlayerLayer playerLayerWithPlayer:self.player] retain];
self.playerLayer.frame = self.layer.bounds;
self.playerLayer.autoresizingMask = kCALayerWidthSizable | kCALayerHeightSizable;
self.playerLayer.hidden = YES;
[self.layer addSublayer:self.playerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:RMVideoViewPlayerLayerReadyForDisplay];
}
// Create a new AVPlayerItem and make it our player's current item.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.player replaceCurrentItemWithPlayerItem:playerItem];
}
#pragma mark - Public
- (void) play {
[self.player play];
}
@end
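Usage is straightforward (a minimal sketch, assuming manual reference counting like the class above; the surrounding window code and the file path are made up for illustration):

RMVideoView *videoView = [[RMVideoView alloc] initWithFrame:[self.window.contentView bounds]];
videoView.autoresizingMask = NSViewWidthSizable | NSViewHeightSizable;
[self.window.contentView addSubview:videoView];
videoView.videoURL = [NSURL fileURLWithPath:@"/path/to/movie.m4v"]; // triggers asynchronous loading
[videoView play]; // in practice, call this from the "ready to play" callback instead
[videoView release]; // the superview retains it (MRC)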

"Simplest" depends on exactly what you're trying to do. If you want more control (e.g., rendering the movie as an OpenGL texture) or less (e.g., a completely independent window that you can just pop up and ignore), there might be different answers.
But for most use cases, if you want 10.6+ support, the simplest way to show a movie is QTKit. See the article "Using QTKit for Media Playback" in the Xcode documentation for a good starting point.
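For reference, here's roughly what that looks like with QTKit (a sketch only, assuming a QTMovieView has already been added to the window, e.g. as an outlet named movieView; the file path is a placeholder):

#import <QTKit/QTKit.h>

NSError *error = nil;
QTMovie *movie = [QTMovie movieWithURL:[NSURL fileURLWithPath:@"/path/to/movie.mov"] error:&error];
if (movie) {
    [self.movieView setMovie:movie];
    [self.movieView play:nil]; // QTMovieView's play: action; [movie play] also works
} else {
    NSLog(@"Could not open movie: %@", error);
}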

Xcode iOS app hanging on launch screen "semaphore_wait_trap()"

Excuse me, but I'm a total noob, not a programmer. I based a photo editing app on a template and customised it heavily with help from Google searches, tutorials, etc.
I'm using Xcode 7.3.1, iOS 9.3, the newer Photos framework and only Objective-C.
I've got the app to a point that I'm happy with, except that I noticed that on first launch the app hangs (the debugger reports semaphore_wait_trap()).
The app never gets to the next step, the "request to access photos" alert popup in iOS 9.3. The only way to get to it is to hit the home button, see the grant-access alert, then switch back to the app. If I then quit the app and reload it, it runs fine every time after that. This is of course not an ideal user experience.
If I pause in debug mode, it's hanging on "semaphore_wait_trap()".
I've googled and searched for days and can't find a solution to get the permissions alert popup to show on top of my app window.
It's beyond me. Any ideas would be greatly appreciated.
See the screenshot of the launch image that remains on top of the alert popup.
If you press the "Home" button, the alert to grant access to photos appears.
The app delegate:
@implementation AppDelegate
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
if ([UIApplication instancesRespondToSelector:@selector(registerUserNotificationSettings:)]){
[application registerUserNotificationSettings:[UIUserNotificationSettings settingsForTypes:UIUserNotificationTypeAlert|UIUserNotificationTypeBadge|UIUserNotificationTypeSound categories:nil]];
}
UILocalNotification *locationNotification = [launchOptions objectForKey:UIApplicationLaunchOptionsLocalNotificationKey];
if (locationNotification) {
// Sets icon badge number to zero
application.applicationIconBadgeNumber = 0;
}
// END Local Notification ==========================
return true;
}
-(void)application:(UIApplication *)application didReceiveLocalNotification:(UILocalNotification *)notification {
// Resets icon's badge number to zero
application.applicationIconBadgeNumber = 0;
}
Here is a snippet of the main view controller (hope it's not too long; I'm not sure where the problem lies).
HomeVC.m:
#import "HomeVC.h"
#import "Configs.h"
#import "AAPLGridViewCell2.h"
#import "NSIndexSet+Convenience.h"
#import "UICollectionView+Convenience.h"
#import "AAPLRootListViewController.h"
#import "Configs.h"
#import "ImageEditorTheme.h"
#import "ImageEditorTheme+Private.h"
@import PhotosUI;
@import UIKit;
@interface HomeVC () <PHPhotoLibraryChangeObserver, UICollectionViewDelegateFlowLayout, UICollectionViewDataSource, UICollectionViewDelegate>
@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) NSArray *sectionLocalizedTitles;
@property (nonatomic, strong) PHCachingImageManager *imageManager;
@property CGRect previousPreheatRect;
@property (nonatomic, strong) IBOutlet UICollectionViewFlowLayout *flowLayout;
@property (nonatomic, assign) CGSize lastTargetSize;
@end
@implementation HomeVC
{
UIActivityIndicatorView *_indicatorView;
}
static NSString * const AllPhotosReuseIdentifier = @"AllPhotosCell";
static NSString * const CollectionCellReuseIdentifier = @"CollectionCell";
static NSString * const CellReuseIdentifier = @"Cell";
static CGSize AssetGridThumbnailSize;
- (void)awakeFromNib {
self.imageManager = [[PHCachingImageManager alloc] init];
[self resetCachedAssets];
[[PHPhotoLibrary sharedPhotoLibrary] registerChangeObserver:self];
}
- (void)dealloc {
[[PHPhotoLibrary sharedPhotoLibrary] unregisterChangeObserver:self];
}
- (void)viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
_logoImage.layer.cornerRadius = 30;
[self loadPhotos];
[_libraryOutlet addTarget:self action:@selector(touchUp:) forControlEvents:UIControlEventTouchUpInside];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handle_data) name:@"reload_data" object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(hideMenu) name:@"hide_menu" object:nil];
}
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
// Begin caching assets in and around collection view's visible rect.
[self updateCachedAssets];
}
-(void)handle_data {
//[self.collectionView2 layoutIfNeeded];
//[self resetCachedAssets];
[self.collectionView2 reloadData];
[self updateCachedAssets];
NSLog(@"did it work?");
}
- (void)viewDidLayoutSubviews
{
NSInteger section = [self.collectionView2 numberOfSections] - 1;
NSInteger item = [self.collectionView2 numberOfItemsInSection:section] - 1;
NSIndexPath *indexPath = [NSIndexPath indexPathForItem:item inSection:section];
[self.collectionView2 scrollToItemAtIndexPath:indexPath atScrollPosition:(UICollectionViewScrollPositionTop) animated:NO];
//[self loadPhotos];
}
-(void) loadPhotos {
PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
PHFetchResult *allPhotos = [PHAsset fetchAssetsWithOptions:allPhotosOptions];
if (self.assetsFetchResults == nil) {
self.assetsFetchResults = allPhotos;
}
}
#pragma mark - PHPhotoLibraryChangeObserver
- (void)photoLibraryDidChange:(PHChange *)changeInstance {
// Check if there are changes to the assets we are showing.
PHFetchResultChangeDetails *collectionChanges = [changeInstance changeDetailsForFetchResult:self.assetsFetchResults];
if (collectionChanges == nil) {
return;
}
/*
Change notifications may be made on a background queue. Re-dispatch to the
main queue before acting on the change as we'll be updating the UI.
*/
dispatch_async(dispatch_get_main_queue(), ^{
// Get the new fetch result.
self.assetsFetchResults = [collectionChanges fetchResultAfterChanges];
UICollectionView *collectionView = self.collectionView;
if (![collectionChanges hasIncrementalChanges] || [collectionChanges hasMoves]) {
// Reload the collection view if the incremental diffs are not available
[collectionView reloadData];
} else {
/*
Tell the collection view to animate insertions and deletions if we
have incremental diffs.
*/
[collectionView performBatchUpdates:^{
NSIndexSet *removedIndexes = [collectionChanges removedIndexes];
if ([removedIndexes count] > 0) {
[collectionView deleteItemsAtIndexPaths:[removedIndexes aapl_indexPathsFromIndexesWithSection:0]];
}
NSIndexSet *insertedIndexes = [collectionChanges insertedIndexes];
if ([insertedIndexes count] > 0) {
[collectionView insertItemsAtIndexPaths:[insertedIndexes aapl_indexPathsFromIndexesWithSection:0]];
}
NSIndexSet *changedIndexes = [collectionChanges changedIndexes];
if ([changedIndexes count] > 0) {
[collectionView reloadItemsAtIndexPaths:[changedIndexes aapl_indexPathsFromIndexesWithSection:0]];
}
} completion:NULL];
}
[self resetCachedAssets];
});
}
#pragma mark - UICollectionViewDataSource
- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section {
return self.assetsFetchResults.count;
}
- (CGSize)collectionView:(UICollectionView *)collectionView layout:(UICollectionViewLayout*)collectionViewLayout sizeForItemAtIndexPath:(NSIndexPath *)indexPath {
CGFloat colum = 3.0, spacing = 0.0;
CGFloat value = floorf((CGRectGetWidth(self.view.bounds) - (colum - 1) * spacing) / colum);
UICollectionViewFlowLayout *layout = [[UICollectionViewFlowLayout alloc] init];
layout.itemSize = CGSizeMake(value, value);
layout.sectionInset = UIEdgeInsetsMake(0, 0, 0, 0);
layout.minimumInteritemSpacing = spacing;
layout.minimumLineSpacing = spacing;
return CGSizeMake(value, value);
//return self.collectionView.frame.size;
}
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
PHAsset *asset = self.assetsFetchResults[indexPath.item];
// Dequeue an AAPLGridViewCell.
AAPLGridViewCell2 *cell = [collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath];
cell.representedAssetIdentifier = asset.localIdentifier;
// Request an image for the asset from the PHCachingImageManager.
[self.imageManager requestImageForAsset:asset
targetSize:CGSizeMake(130, 130)
contentMode:PHImageContentModeAspectFill
options:nil
resultHandler:^(UIImage *result, NSDictionary *info) {
// Set the cell's thumbnail image if it's still showing the same asset.
if ([cell.representedAssetIdentifier isEqualToString:asset.localIdentifier]) {
cell.thumbnailImage = result;
}
}];
CGPoint bottomOffset = CGPointMake(0, self.collectionView.contentSize.height - self.collectionView.bounds.size.height + self.collectionView.contentInset.bottom);
[self.collectionView setContentOffset:bottomOffset animated:NO];
return cell;
}
- (void) collectionView:(UICollectionView *)collectionView didSelectItemAtIndexPath:(NSIndexPath *)indexPath
{
// Prepare the options to pass when fetching the live photo.
PHAsset *asset = self.assetsFetchResults[indexPath.item];
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = NO;
dispatch_async(dispatch_get_main_queue(), ^{
_indicatorView = [ImageEditorTheme indicatorView];
_indicatorView.center = self.containerView.center;
[self.containerView addSubview:_indicatorView];
[_indicatorView startAnimating];
UIStoryboard *storyboard = [UIStoryboard storyboardWithName:@"Main" bundle:nil];
PreviewVC *prevVC = (PreviewVC *)[storyboard instantiateViewControllerWithIdentifier:@"PreviewVC"];
[[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFit options:options resultHandler:^(UIImage *result, NSDictionary *info) {
// Show the UIImageView and use it to display the requested image.
passedImage = result;
prevVC.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
[self presentViewController:prevVC animated:true completion:nil];
[_indicatorView stopAnimating];
}];
});
}
#pragma mark - UIScrollViewDelegate
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
// Update cached assets for the new visible area.
[self updateCachedAssets];
}
I managed to solve the issue. It was as simple as removing the call to "[self resetCachedAssets];" in "awakeFromNib"
Works great now.
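For anyone else hitting the same hang: another option (my suggestion, not something the poster tried) is to defer all Photos-library work until access has explicitly been granted, so the permission alert can appear before any blocking fetches run. A sketch, reusing the loadPhotos and collectionView2 names from the code above:

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
        // The callback may arrive on a background queue; hop back to main before touching UI.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (status == PHAuthorizationStatusAuthorized) {
                [self loadPhotos];
                [self.collectionView2 reloadData];
            }
        });
    }];
}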

tapAtPoint on UIWebView subclass

I have subclassed UIWebView so that I can get touch events, and I've also implemented this handy method. I'm curious whether this will work on an actual iOS device. I'm not at the office, so I don't know if it does. It seems to work in the simulator.
- (void) tapAtPoint:(CGPoint)point
{
id /*UIWebBrowserView*/ webBrowserView = nil;
id webViewInternal = nil;
object_getInstanceVariable(self, "_internal", (void **)&webViewInternal);
object_getInstanceVariable(webViewInternal, "browserView", (void **)&webBrowserView);
if (webBrowserView) {
[webBrowserView tapInteractionWithLocation:point];
}
}
Has anyone tried something like this? I'll find out for sure in the morning, lol.
Please try this code; it's working fine here.
/* TapDetectingWindow.m */
#import "TapDetectingWindow.h"
@implementation TapDetectingWindow
@synthesize viewToObserve;
@synthesize controllerThatObserves;
- (id)initWithViewToObserver:(UIView *)view andDelegate:(id)delegate {
if((self = [super init])) {
self.viewToObserve = view;
self.controllerThatObserves = delegate;
}
return self;
}
- (void)dealloc {
[viewToObserve release];
[super dealloc];
}
- (void)forwardTap:(id)touch {
[controllerThatObserves userDidTapWebView:touch];
}
- (void)sendEvent:(UIEvent *)event {
[super sendEvent:event];
if (viewToObserve == nil || controllerThatObserves == nil)
return;
NSSet *touches = [event allTouches];
if (touches.count != 1)
return;
UITouch *touch = touches.anyObject;
if (touch.phase != UITouchPhaseEnded)
return;
if ([touch.view isDescendantOfView:viewToObserve] == NO)
return;
CGPoint tapPoint = [touch locationInView:viewToObserve];
NSLog(@"TapPoint = %f, %f", tapPoint.x, tapPoint.y);
NSArray *pointArray = [NSArray arrayWithObjects:[NSString stringWithFormat:@"%f", tapPoint.x],
[NSString stringWithFormat:@"%f", tapPoint.y], nil];
if (touch.tapCount == 1) {
[self performSelector:@selector(forwardTap:) withObject:pointArray afterDelay:0.5];
}
else if (touch.tapCount > 1) {
[NSObject cancelPreviousPerformRequestsWithTarget:self selector:@selector(forwardTap:) object:pointArray];
}
}
@end
/* WebViewController.h */
@interface WebViewController : UIViewController <TapDetectingWindowDelegate> {
IBOutlet UIWebView *mHtmlViewer;
TapDetectingWindow *mWindow;
}
/* WebViewController.m */
- (void)viewDidLoad {
[super viewDidLoad];
mWindow = (TapDetectingWindow *)[[UIApplication sharedApplication].windows objectAtIndex:0];
mWindow.viewToObserve = mHtmlViewer;
mWindow.controllerThatObserves = self;
}
- (void)userDidTapWebView:(id)tapPoint
{
// tapPoint is the array of x/y strings forwarded by TapDetectingWindow
NSLog(@"TapPoint = %@, %@", [tapPoint objectAtIndex:0], [tapPoint objectAtIndex:1]);
}
Thanks. Let me know if you face any problems.
Short answer: yes, I tried something like this in the same way and it works on real devices as well (tested with iOS 6).
ARC version of your method:
- (void) tapAtPoint:(CGPoint)point
{
Ivar internalWebViewIvar = class_getInstanceVariable([self class], "_internal");
id internalWebView = object_getIvar(self, internalWebViewIvar);
Ivar browserViewIvar = class_getInstanceVariable(object_getClass(internalWebView), "browserView");
id browserView = object_getIvar(internalWebView, browserViewIvar);
if (browserView) {
[browserView performSelector:@selector(tapInteractionWithLocation:) withObject:[NSValue valueWithCGPoint:point]];
}
}

Why is my NSOperation subclass never finishing?

I have an NSOperation subclass that I want to run concurrently.
My understanding is that for concurrent operations to work:
I need to define isConcurrent to return YES.
I need to define the start method
I need to send KVO notifications for isExecuting and isFinished when it's done.
Using @synthesize will automatically send the appropriate KVO notifications when the values of isExecuting and isFinished change.
Despite this, I have verified that my queue never moves on to the next item.
Here's the meat of my code:
@interface MyOperation ()
@property (readwrite) BOOL isExecuting;
@property (readwrite) BOOL isFinished;
@end
@implementation MyOperation
- (void)start
{
@autoreleasepool {
self.isExecuting = YES;
self.HTTPOperation = [[AFHTTPRequestOperation alloc] initWithRequest: URLRequest];
_HTTPOperation.completionBlock = [^{
[self completed];
self.isExecuting = NO;
self.isFinished = YES;
} copy];
[_HTTPOperation start];
}
}
- (BOOL)isConcurrent
{
return YES;
}
- (void)completed
{
}
@end
What am I missing?
(This is on an iPhone, but I can't imagine that matters.)
It looks like whatever KVO notifications @synthesize sends aren't enough for NSOperationQueue to move on.
Sending the notifications manually fixes the problem:
- (void)start
{
@autoreleasepool {
[self willChangeValueForKey:@"isExecuting"];
self.isExecuting = YES;
[self didChangeValueForKey:@"isExecuting"];
NSURLRequest *URLRequest = [self buildRequest];
if (!URLRequest) {
[self willChangeValueForKey:@"isFinished"];
[self willChangeValueForKey:@"isExecuting"];
_isExecuting = NO;
_isFinished = YES;
[self didChangeValueForKey:@"isExecuting"];
[self didChangeValueForKey:@"isFinished"];
return;
}
self.HTTPOperation = [[AFHTTPRequestOperation alloc] initWithRequest: URLRequest];
_HTTPOperation.completionBlock = [^{
[self completed];
[self willChangeValueForKey:@"isFinished"];
[self willChangeValueForKey:@"isExecuting"];
_isExecuting = NO;
_isFinished = YES;
[self didChangeValueForKey:@"isExecuting"];
[self didChangeValueForKey:@"isFinished"];
} copy];
[_HTTPOperation start];
}
}
See also:
Why does NSOperation disable automatic key-value observing?
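A slightly tidier variant (my own sketch, not the poster's code) is to wrap the two state transitions in helper methods so the will/did notifications can never get out of sync; start would call markStarted on entry and markFinished from the completion block:

// Hypothetical helpers; _isExecuting and _isFinished are the backing ivars used above.
- (void)markStarted {
    [self willChangeValueForKey:@"isExecuting"];
    _isExecuting = YES;
    [self didChangeValueForKey:@"isExecuting"];
}

- (void)markFinished {
    [self willChangeValueForKey:@"isExecuting"];
    [self willChangeValueForKey:@"isFinished"];
    _isExecuting = NO;
    _isFinished = YES;
    [self didChangeValueForKey:@"isFinished"];
    [self didChangeValueForKey:@"isExecuting"];
}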
What does your "queue" look like? Are you using an NSOperationQueue?
Anyway, I'll try to answer your question with what I understood :P
I would create a delegate for my NSOperation and have KVO take care of calling this.
Say for example your NSOperation class looks like this
@interface MyOperation : NSOperation
@property (assign) id<MyOperationDelegate> delegate;
@end
Your implementation:
@synthesize delegate;
@synthesize error;
-(id)init{
self = [super init];
if(self){
[self addObserver:self forKeyPath:@"isFinished"
options:NSKeyValueObservingOptionNew
context:NULL];
}
return self;
}
-(void)dealloc{
[self removeObserver:self forKeyPath:@"isFinished"];
[super dealloc];
}
-(void)observeValueForKeyPath:(NSString *)keyPath
ofObject:(id)object change:(NSDictionary *)change context:(void *)context{
if([keyPath isEqualToString:@"isFinished"] == YES){
if([self isCancelled] == NO){
if(delegate != nil && [delegate respondsToSelector:@selector(operationComplete:)]){
[delegate operationComplete:self];
}
}else{
if(delegate != nil && [delegate respondsToSelector:@selector(operationCancelled)]){
[delegate operationCancelled];
}
}
}
}
-(void)main{
[[NSException exceptionWithName:kTaskException
reason:@"Only to be used with subclass"
userInfo:nil] raise];
}
And finally your protocol
@class MyOperation;
@protocol MyOperationDelegate <NSObject>
@optional
-(void)operationComplete:(MyOperation*)operation;
-(void)operationCancelled;
@end

Objective-C, Need help creating an AVAudioPlayer singleton

I'm working on a soundboard app that has several pages of buttons to play sound effects, with a stop button on every page should the user wish to manually interrupt the clip. I'm using AVAudioPlayer in each view to play the sound when the button for that clip is pressed. It works fine until the view is changed: if a user jumps to a new page, the sound keeps playing and the stop button stops working, even if they return to the original view. Pressing a sound button no longer interrupts the running sound, resulting in two sounds playing over each other.
From googling and searching this site, I know the issue is that each view change creates a new instance of the player, and the remedy is to create a singleton class. Unfortunately I have yet to find any further examples of how to actually do this. If someone could provide or point the way to a beginner's guide for creating an AVAudioPlayer singleton, I would really appreciate it. All I need is to pass the file name to the shared player, start playback with a sound-clip button, and have the stop button stop sounds no matter what view the user is on. I am using the iOS 5.1 SDK with storyboards and ARC enabled.
My solution, as used in one of my own projects, is posted below. Feel free to copy and paste; I intend to open-source this project once it's finished :)
A preview of the player can be seen on YouTube: http://www.youtube.com/watch?v=Q98DQ6iNTYM
AudioPlayer.h
@protocol AudioPlayerDelegate;
@interface AudioPlayer : NSObject
@property (nonatomic, assign, readonly) BOOL isPlaying;
@property (nonatomic, assign) id <AudioPlayerDelegate> delegate;
+ (AudioPlayer *)sharedAudioPlayer;
- (void)playAudioAtURL:(NSURL *)URL;
- (void)play;
- (void)pause;
@end
@protocol AudioPlayerDelegate <NSObject>
@optional
- (void)audioPlayerDidStartPlaying;
- (void)audioPlayerDidStartBuffering;
- (void)audioPlayerDidPause;
- (void)audioPlayerDidFinishPlaying;
@end
AudioPlayer.m
#import <AVFoundation/AVFoundation.h> // for AVPlayer and AVPlayerItem
@interface AudioPlayer ()
- (void)playerItemDidFinishPlaying:(id)sender;
@end
@implementation AudioPlayer
{
AVPlayer *player;
}
@synthesize isPlaying, delegate;
+ (AudioPlayer *)sharedAudioPlayer
{
static dispatch_once_t pred;
static AudioPlayer *sharedAudioPlayer = nil;
dispatch_once(&pred, ^
{
sharedAudioPlayer = [[self alloc] init];
[[NSNotificationCenter defaultCenter] addObserver:sharedAudioPlayer selector:@selector(playerItemDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
});
return sharedAudioPlayer;
}
- (void)playAudioAtURL:(NSURL *)URL
{
if (player)
{
[player removeObserver:self forKeyPath:@"status"];
[player pause];
}
player = [AVPlayer playerWithURL:URL];
[player addObserver:self forKeyPath:@"status" options:0 context:nil];
if (delegate && [delegate respondsToSelector:@selector(audioPlayerDidStartBuffering)])
[delegate audioPlayerDidStartBuffering];
}
- (void)play
{
if (player)
{
[player play];
if (delegate && [delegate respondsToSelector:@selector(audioPlayerDidStartPlaying)])
[delegate audioPlayerDidStartPlaying];
}
}
- (void)pause
{
if (player)
{
[player pause];
if (delegate && [delegate respondsToSelector:@selector(audioPlayerDidPause)])
[delegate audioPlayerDidPause];
}
}
- (BOOL)isPlaying
{
DLog(@"%f", player.rate);
return (player.rate > 0);
}
#pragma mark - AV player
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if (object == player && [keyPath isEqualToString:@"status"])
{
if (player.status == AVPlayerStatusReadyToPlay)
{
[self play];
}
}
}
#pragma mark - Private methods
- (void)playerItemDidFinishPlaying:(id)sender
{
DLog(@"%@", sender);
if (delegate && [delegate respondsToSelector:@selector(audioPlayerDidFinishPlaying)])
[delegate audioPlayerDidFinishPlaying];
}
@end
AudioPlayerViewController.h
extern NSString *const kAudioPlayerWillShowNotification;
extern NSString *const kAudioPlayerWillHideNotification;
@interface AudioPlayerViewController : UIViewController
@property (nonatomic, assign, readonly) BOOL isPlaying;
@property (nonatomic, assign, readonly) BOOL isPlayerVisible;
- (void)playAudioAtURL:(NSURL *)URL withTitle:(NSString *)title;
- (void)pause;
@end
AudioPlayerViewController.m
NSString *const kAudioPlayerWillShowNotification = @"kAudioPlayerWillShowNotification";
NSString *const kAudioPlayerWillHideNotification = @"kAudioPlayerWillHideNotification";
@interface AudioPlayerViewController () <AudioPlayerDelegate>
@property (nonatomic, strong) AudioPlayerView *playerView;
- (void)playButtonTouched:(id)sender;
- (void)closeButtonTouched:(id)sender;
- (void)hidePlayer;
@end
@implementation AudioPlayerViewController
@synthesize playerView, isPlaying, isPlayerVisible;
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self)
{
playerView = [[AudioPlayerView alloc] initWithFrame:CGRectZero];
[AudioPlayer sharedAudioPlayer].delegate = self;
}
return self;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
}
#pragma mark - View lifecycle
// Implement loadView to create a view hierarchy programmatically, without using a nib.
- (void)loadView
{
self.view = playerView;
}
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad
{
[super viewDidLoad];
[playerView.playButton addTarget:self action:@selector(playButtonTouched:) forControlEvents:UIControlEventTouchUpInside];
[playerView.closeButton addTarget:self action:@selector(closeButtonTouched:) forControlEvents:UIControlEventTouchUpInside];
}
- (void)viewDidUnload
{
[super viewDidUnload];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
#pragma mark - Private methods
- (AudioPlayerView *)playerView
{
return (AudioPlayerView *)self.view;
}
- (void)hidePlayer
{
[[NSNotificationCenter defaultCenter] postNotificationName:kAudioPlayerWillHideNotification object:nil];
[self.playerView hidePlayer];
}
- (void)playButtonTouched:(id)sender
{
DLog(@"play / pause");
if ([AudioPlayer sharedAudioPlayer].isPlaying)
{
[[AudioPlayer sharedAudioPlayer] pause];
}
else
{
[[AudioPlayer sharedAudioPlayer] play];
}
[self.playerView showPlayer];
}
- (void)closeButtonTouched:(id)sender
{
DLog(@"close");
if ([AudioPlayer sharedAudioPlayer].isPlaying)
[[AudioPlayer sharedAudioPlayer] pause];
[self hidePlayer];
}
#pragma mark - Instance methods
- (void)playAudioAtURL:(NSURL *)URL withTitle:(NSString *)title
{
playerView.titleLabel.text = title;
[[AudioPlayer sharedAudioPlayer] playAudioAtURL:URL];
[[NSNotificationCenter defaultCenter] postNotificationName:kAudioPlayerWillShowNotification object:nil];
[playerView showPlayer];
}
- (void)pause
{
[[AudioPlayer sharedAudioPlayer] pause];
[[NSNotificationCenter defaultCenter] postNotificationName:kAudioPlayerWillHideNotification object:nil];
[playerView hidePlayer];
}
#pragma mark - Audio player delegate
- (void)audioPlayerDidStartPlaying
{
DLog(@"did start playing");
playerView.playButtonStyle = PlayButtonStylePause;
}
- (void)audioPlayerDidStartBuffering
{
DLog(@"did start buffering");
playerView.playButtonStyle = PlayButtonStyleActivity;
}
- (void)audioPlayerDidPause
{
DLog(@"did pause");
playerView.playButtonStyle = PlayButtonStylePlay;
}
- (void)audioPlayerDidFinishPlaying
{
[self hidePlayer];
}
#pragma mark - Properties
- (BOOL)isPlaying
{
return [AudioPlayer sharedAudioPlayer].isPlaying;
}
- (BOOL)isPlayerVisible
{
return !playerView.isPlayerHidden;
}
@end
AudioPlayerView.h
typedef enum
{
PlayButtonStylePlay = 0,
PlayButtonStylePause,
PlayButtonStyleActivity,
} PlayButtonStyle;
@interface AudioPlayerView : UIView
@property (nonatomic, strong) UIButton *playButton;
@property (nonatomic, strong) UIButton *closeButton;
@property (nonatomic, strong) UILabel *titleLabel;
@property (nonatomic, strong) UIActivityIndicatorView *activityView;
@property (nonatomic, assign) PlayButtonStyle playButtonStyle;
@property (nonatomic, assign, readonly) BOOL isPlayerHidden;
- (void)showPlayer;
- (void)hidePlayer;
@end
AudioPlayerView.m
@implementation AudioPlayerView
{
BOOL _isAnimating;
}
@synthesize playButton, closeButton, titleLabel, playButtonStyle, activityView, isPlayerHidden = _playerHidden;
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self)
{
self.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"musicplayer_background.png"]];
_playerHidden = YES;
activityView = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhite];
activityView.frame = CGRectMake(0.0f, 0.0f, 30.0f, 30.0f);
[self addSubview:activityView];
playButton = [[UIButton alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 30.0f, 30.0f)];
[playButton setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];
[playButton setBackgroundImage:[UIImage imageNamed:@"button_pause.png"] forState:UIControlStateNormal];
playButton.titleLabel.textAlignment = UITextAlignmentCenter;
[self addSubview:playButton];
closeButton = [[UIButton alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 30.0f, 30.0f)];
[closeButton setBackgroundImage:[UIImage imageNamed:@"button_close.png"] forState:UIControlStateNormal];
[closeButton setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];
closeButton.titleLabel.textAlignment = UITextAlignmentCenter;
[self addSubview:closeButton];
titleLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 240.0f, 30.0f)];
titleLabel.text = nil;
titleLabel.textAlignment = UITextAlignmentCenter;
titleLabel.font = [UIFont boldSystemFontOfSize:13.0f];
titleLabel.numberOfLines = 2;
titleLabel.textColor = [UIColor whiteColor];
titleLabel.backgroundColor = [UIColor clearColor];
[self addSubview:titleLabel];
}
return self;
}
- (void)layoutSubviews
{
#define PADDING 5.0f
DLog(@"%@", NSStringFromCGRect(self.bounds));
CGRect frame = self.bounds;
CGFloat y = frame.size.height / 2;
titleLabel.center = CGPointMake(frame.size.width / 2, y);
CGFloat x = titleLabel.frame.origin.x - (playButton.frame.size.width / 2) - PADDING;
playButton.center = CGPointMake(x, y);
activityView.center = CGPointMake(x, y);
x = titleLabel.frame.origin.x + titleLabel.frame.size.width + (closeButton.frame.size.width / 2) + PADDING;
closeButton.center = CGPointMake(x, y);
}
#pragma mark - Instance methods
- (void)showPlayer
{
if (_isAnimating || _playerHidden == NO)
return;
_isAnimating = YES;
[UIView
animateWithDuration:0.5f
animations:^
{
CGRect frame = self.frame;
frame.origin.y -= 40.0f;
self.frame = frame;
}
completion:^ (BOOL finished)
{
_isAnimating = NO;
_playerHidden = NO;
}];
}
- (void)hidePlayer
{
if (_isAnimating || _playerHidden)
return;
_isAnimating = YES;
[UIView
animateWithDuration:0.5f
animations:^
{
CGRect frame = self.frame;
frame.origin.y += 40.0f;
self.frame = frame;
}
completion:^ (BOOL finished)
{
_isAnimating = NO;
_playerHidden = YES;
}];
}
- (void)setPlayButtonStyle:(PlayButtonStyle)style
{
playButton.hidden = (style == PlayButtonStyleActivity);
activityView.hidden = (style != PlayButtonStyleActivity);
switch (style)
{
case PlayButtonStyleActivity:
{
[activityView startAnimating];
}
break;
case PlayButtonStylePause:
{
[activityView stopAnimating];
[playButton setBackgroundImage:[UIImage imageNamed:@"button_pause.png"] forState:UIControlStateNormal];
}
break;
case PlayButtonStylePlay:
default:
{
[activityView stopAnimating];
[playButton setBackgroundImage:[UIImage imageNamed:@"button_play.png"] forState:UIControlStateNormal];
}
break;
}
[self setNeedsLayout];
}
@end
AppDelegate - didFinishLaunching
// setup audio player
audioPlayer = [[AudioPlayerViewController alloc] init]; // public property ...
CGRect frame = self.window.rootViewController.view.frame;
UITabBarController *tabBarController = (UITabBarController *)self.window.rootViewController;
CGFloat tabBarHeight = tabBarController.tabBar.frame.size.height;
audioPlayer.view.frame = CGRectMake(0.0f, frame.size.height - tabBarHeight, 320.0f, 40.0f);
[self.window.rootViewController.view insertSubview:audioPlayer.view belowSubview:tabBarController.tabBar];
From any view controller inside the app I start audio with the following code:
- (void)playAudioWithURL:(NSURL *)URL title:(NSString *)title
{
OnsNieuwsAppDelegate *appDelegate = (OnsNieuwsAppDelegate *)[[UIApplication sharedApplication] delegate];
[appDelegate.audioPlayer playAudioAtURL:URL withTitle:title];
}
Assets
For the above example, white play/pause/close button images and a background image were used (the original attachments aren't reproduced here; the button images are white, so they're hard to see against the background).
There's a lot of discussion (and links to blogs, etc.) about singletons over at What should my Objective-C singleton look like?, and I see a fair number of tutorials as a result of this Google search: http://www.google.com/search?q=+cocoa+touch+singleton+tutorial, but the real answer to your question, I believe, is that you should do one of two things:
If you do want the sound for a particular view to continue playing when the user switches, create the player as you're doing now, but when the view (re)appears, check that a player exists, and don't make a new one.
If you want the sound to stop, then stop the sound when the view changes (i.e., in viewWillDisappear:).
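If you do want the shared-player approach the question asks about, a minimal AVAudioPlayer singleton could look something like this (my own sketch, assuming ARC and sound files bundled with the app; the class name, method names and the .caf extension are made up for illustration):

// SoundManager.h
#import <AVFoundation/AVFoundation.h>

@interface SoundManager : NSObject
+ (SoundManager *)sharedManager;
- (void)playSoundNamed:(NSString *)name; // file name without extension
- (void)stop;
@end

// SoundManager.m
@implementation SoundManager
{
    AVAudioPlayer *_player;
}

+ (SoundManager *)sharedManager
{
    static SoundManager *shared = nil;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ shared = [[self alloc] init]; });
    return shared;
}

- (void)playSoundNamed:(NSString *)name
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"caf"];
    if (!url) return;
    [_player stop]; // interrupt whatever is currently playing
    _player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
    [_player play];
}

- (void)stop
{
    [_player stop];
}
@end

Every sound button then calls [[SoundManager sharedManager] playSoundNamed:@"someClip"] and every stop button calls [[SoundManager sharedManager] stop], so there is only ever one player no matter which page the user is on.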

NSNotification touch overlay

Four files for two classes, FirstViewController and MyOverlayView. MyOverlayView just receives touches and sends a notification which should be received by FirstViewController. I'm getting the touch event on the overlay view but not receiving the notification in FirstViewController. Any ideas?
FVC-Header
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
#import "MyOverlayView.h"
extern NSString *const OverlayViewTouchNotification;
@interface FirstViewController : UIViewController {
MPMoviePlayerViewController *mvc;
MyOverlayView *overlayView;
NSArray *keyArray;
NSMutableDictionary *urlDic;
}
@property (nonatomic, retain) MPMoviePlayerViewController *mvc;
@property (nonatomic, retain) IBOutlet MyOverlayView *overlayView;
@property (nonatomic, retain) NSArray *keyArray;
@property (nonatomic, retain) NSMutableDictionary *urlDic;
-(void)playMovieAtURL:(NSDictionary *)dic:(NSArray *)array:(int) rand;
-(void)overlayViewTouches:(NSNotification *)notification;
-(void)loadArray;
-(void)reset;
@end
FVC-Implementation
#import "FirstViewController.h"
#import "SMWebRequest.h"
NSString * const OverlayViewTouchNotification = @"overlayViewTouch";
@implementation FirstViewController
@synthesize mvc;
@synthesize overlayView;
@synthesize keyArray, urlDic;
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad
{
[super viewDidLoad];
}
-(void)viewDidAppear:(BOOL)animated
{
[self loadArray];
if (!overlayView) {
overlayView = [[MyOverlayView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]];
}
NSArray *windows = [[UIApplication sharedApplication] windows];
if ([windows count] >= 1)
{
// Locate the movie player window
UIWindow *moviePlayerWindow = [[UIApplication sharedApplication] keyWindow];
// Add our overlay view to the movie player's subviews so it is
// displayed above it.
[moviePlayerWindow addSubview:self.overlayView];
}
//[self.parentViewController.view addSubview:overlayView];
}
-(void)loadArray{
if (!urlDic) {
NSMutableDictionary *tempDic = [NSMutableDictionary new];
[tempDic setObject:@"mp4" forKey:@"euromount_high_res"];
urlDic = [[NSDictionary alloc] initWithDictionary:tempDic];
}
if (!keyArray) {
NSArray *tempArray = [urlDic allKeys];
keyArray = [[NSArray alloc] initWithArray:tempArray];
}
//Random choice
int point = rand() % ([keyArray count]);
//Call play movie
[self playMovieAtURL :urlDic :keyArray :point];
}
-(void)playMovieAtURL:(NSDictionary *)dic:(NSArray *)array:(int)rand
{
NSString *key = [array objectAtIndex:rand];
NSString *path = [[NSBundle mainBundle] pathForResource:key ofType:[dic valueForKey:key]];
if (mvc == nil) { mvc = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:path]]; }
mvc.moviePlayer.scalingMode = MPMovieScalingModeAspectFit;
mvc.moviePlayer.shouldAutoplay = TRUE;
mvc.moviePlayer.controlStyle = MPMovieControlStyleNone;
// Register for the playback finished notification.
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(myMovieFinishedCallback:)
name:MPMoviePlayerPlaybackDidFinishNotification
object:nil];
[self presentModalViewController:mvc animated:YES];
// Movie playback is asynchronous, so this method returns immediately.
[mvc.moviePlayer play];
}
// When the movie is done,release the controller.
-(void)myMovieFinishedCallback:(NSNotification*)aNotification
{
[[NSNotificationCenter defaultCenter] removeObserver:self
name:MPMoviePlayerPlaybackDidFinishNotification
object:nil];
//for (UIView *view in self.view.subviews) {
// [view removeFromSuperview];
//}
mvc = nil;
[self loadArray];
}
// Touches in the overlay view (not in the overlay button)
// post the "overlayViewTouch" notification and will send
// the overlayViewTouches: message
- (void)overlayViewTouches:(NSNotification *)notification
{
NSLog(@"screen touched");
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return YES;
}
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc. that aren't in use.
}
- (void)viewDidUnload
{
mvc = nil;
keyArray = nil;
urlDic = nil;
[super viewDidUnload];
}
-(void)reset{
for (UIView *view in self.view.subviews) {
[view removeFromSuperview];
}
mvc = nil;
keyArray = nil;
urlDic = nil;
[self loadArray];
}
- (void)dealloc
{
[super dealloc];
}
@end
MyOverlayView-Header
#import <UIKit/UIKit.h>
@interface MyOverlayView : UIView {
}
- (void)awakeFromNib;
- (void)dealloc;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
@end
MyOverlayView-Implementation
#import "MyOverlayView.h"
#import "FirstViewController.h"
@implementation MyOverlayView
// MPMoviePlayerController will play movies full-screen in
// landscape mode, so we must rotate MyOverlayView 90 degrees and
// translate it to the center of the screen so when it draws
// on top of the playing movie it will display in landscape
// mode to match the movie player orientation.
//
- (void)awakeFromNib
{
CGAffineTransform transform = self.transform;
// Rotate the view 90 degrees.
transform = CGAffineTransformRotate(transform, (M_PI / 2.0));
UIScreen *screen = [UIScreen mainScreen];
// Translate the view to the center of the screen
transform = CGAffineTransformTranslate(transform,
((screen.bounds.size.height) - (self.bounds.size.height))/2,
0);
self.transform = transform;
CGRect newFrame = self.frame;
newFrame.origin.x = 190;
self.frame = newFrame;
}
- (void)dealloc {
[super dealloc];
}
// Handle any touches to the overlay view
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch* touch = [touches anyObject];
if (touch.phase == UITouchPhaseBegan)
{
NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];
[nc postNotificationName:OverlayViewTouchNotification object:nil];
}
}
@end
EDIT: I wasn't adding an observer for the notification. Got it all sorted out.
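For anyone else who lands here: the missing piece is an addObserver: call in FirstViewController to match the postNotificationName: in MyOverlayView, along these lines (a sketch using the names from the code above):

// e.g. in FirstViewController's viewDidLoad
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(overlayViewTouches:)
                                             name:OverlayViewTouchNotification
                                           object:nil];

// and balance it when the controller goes away (e.g. in dealloc)
[[NSNotificationCenter defaultCenter] removeObserver:self
                                                 name:OverlayViewTouchNotification
                                               object:nil];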