I'm not familiar with iOS, but I'm trying to find out when the default, built-in camera application is focusing. To do this I created my own separate Objective-C application, following this answer: iPhone : camera autofocus observer?. However, I'm not getting anything from observeValueForKeyPath in the NSLog.
#import "ViewController.h"
#import "AVFoundation/AVCaptureDevice.h"
#import "AVFoundation/AVMediaFormat.h"
#interface ViewController ()
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
NSLog(#"viewDidLoad");
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
// callback
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
NSLog(@"observeValueForKeyPath");
if( [keyPath isEqualToString:@"adjustingFocus"] ){
BOOL adjustingFocus = [ [change objectForKey:NSKeyValueChangeNewKey] isEqualToNumber:[NSNumber numberWithInt:1] ];
NSLog(@"Is adjusting focus? %@", adjustingFocus ? @"YES" : @"NO" );
NSLog(@"Change dictionary: %@", change);
}
if( [keyPath isEqualToString:@"focusMode"] ){
AVCaptureFocusMode focusMode = [ [change objectForKey:NSKeyValueChangeNewKey] integerValue ]; // read the enum value, not a BOOL comparison
NSLog(@"focusMode? %ld", (long)focusMode);
}
}
// register observer
- (void)viewWillAppear:(BOOL)animated{
[super viewWillAppear: animated];
NSLog(@"viewWillAppear");
AVCaptureDevice *camDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
int flags = NSKeyValueObservingOptionNew;
[camDevice addObserver:self forKeyPath:@"adjustingFocus" options:flags context:nil];
[camDevice addObserver:self forKeyPath:@"focusMode" options:flags context:nil];
}
@end
Any help much appreciated.
For anyone who visits this question, the answer is what Bluewings wrote as a comment. I was trying to use KVO to observe one application from another, which is not possible, since only one lock on a capture device is possible at a time.
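For reference, observing adjustingFocus does work when your own application owns the capture device, since the notifications only fire inside the process that holds the session. A minimal sketch (untested, assumes self implements observeValueForKeyPath: as above):
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camDevice error:&error];
if (input) {
    [session addInput:input];
    [camDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
    [session startRunning]; // focus changes for this session's device now reach the observer
}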
I am implementing the parallax effect using a category, by doing the following:
add a UIView to the UITableView (via the category)
add addObserver:forKeyPath: so that whenever the table view scrolls, I reframe the view above
Details are below.
UIScrollView+Parallax.h
#import <UIKit/UIKit.h>
@class ParallaxView;
@interface UIScrollView (Parallax)
@property (strong, nonatomic) ParallaxView *parallaxView;
- (void) addParallaxViewWith:(ParallaxView*)parallaxView;
- (void) removeKVO;
@end
@interface ParallaxView : UIView
@end
UIScrollView+Parallax.m
#import "UIScrollView+Parallax.h"
#import <objc/runtime.h> // required for objc_setAssociatedObject / objc_getAssociatedObject
static char parallaxKey;
@implementation UIScrollView (Parallax)
@dynamic parallaxView;
#pragma mark - Add parallax view to scrollView
- (void) addParallaxViewWith:(ParallaxView*)pView {
if ( !self.parallaxView) {
[self addSubview:pView];
[self setParallaxView:pView];
}
}
#pragma mark - Set parallaxView + register parallaxView as an observer
- (void) setParallaxView:(ParallaxView *)parallaxView {
objc_setAssociatedObject(self, &parallaxKey, parallaxView, OBJC_ASSOCIATION_ASSIGN);
/* THESE LINES ARE CRASHING THE APP */
// [self addObserver:self.parallaxView
// forKeyPath:@"contentOffset"
// options:NSKeyValueObservingOptionNew
// context:nil];
}
#pragma mark - Get parallaxView
- (ParallaxView*) parallaxView {
return (objc_getAssociatedObject(self, &parallaxKey));
}
#pragma mark - Remove
- (void)removeKVO {
[self removeObserver:self.parallaxView forKeyPath:@"contentOffset"];
}
@end
@implementation ParallaxView
-(id)init
{
//load xib from main bundle and assign it to self
self = [[[NSBundle mainBundle]loadNibNamed:@"Parallex"
owner:self
options:nil] objectAtIndex:0];
return self;
}
-(id)initWithFrame:(CGRect)frame
{
self = [self init];
[self setFrame:frame];
return self;
}
................
@end
And I am adding the parallax view to the table by doing:
ParallaxView *pView = [[ParallaxView alloc]initWithFrame:CGRectMake(0, 0, 320, 160)];
[self.tableView addParallaxViewWith:pView];
However, [self addObserver:forKeyPath:options:context:] keeps crashing the app with no clues at all. If I comment this line out the app doesn't crash, but the parallax effect doesn't work.
Any ideas about this problem? Please help. Thanks.
Problem in code
-(id)initWithFrame:(CGRect)frame
{
self = [self init];
[self setFrame:frame];
return self;
}
In the above code, self = [self init]; and [self setFrame:frame]; will go into recursion and crash.
First fix this; I guess it will solve your problem. It should be like this:
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
// Initialization code
}
return self;
}
Also, loading the view from a nib using
self = [[[NSBundle mainBundle]loadNibNamed:@"Parallex"
owner:self
options:nil] objectAtIndex:0];
is really a bad idea.
You can refer to THIS for this task.
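For reference, one common alternative is a class-method loader that returns the nib-loaded view instead of reassigning self inside -init. A sketch (nib name taken from the question; untested):
+ (instancetype)parallaxViewFromNib {
    // Load the top-level view from the xib without touching self in -init.
    NSArray *objects = [[NSBundle mainBundle] loadNibNamed:@"Parallex" owner:nil options:nil];
    return [objects objectAtIndex:0];
}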
Happy and clean coding...
@implementation ParallaxView
//Add Observe Method
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context{
if([keyPath isEqualToString:@"contentOffset"]){
NSLog(@"contentOffset:%@", [change objectForKey:NSKeyValueChangeNewKey]);
}
}
@end
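To get the actual parallax movement rather than just a log, the observer can reframe the view from the new offset. A sketch that could go inside that same method (the 0.5 factor is an arbitrary assumption):
CGPoint offset = [[change objectForKey:NSKeyValueChangeNewKey] CGPointValue];
CGRect frame = self.frame;
frame.origin.y = offset.y * 0.5f; // move at half the scroll speed for the parallax effect
self.frame = frame;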
Try to replace
objc_setAssociatedObject(self, &parallaxKey, parallaxView, OBJC_ASSOCIATION_ASSIGN);
with
//Change to OBJC_ASSOCIATION_RETAIN_NONATOMIC
objc_setAssociatedObject(self, &parallaxKey, parallaxView, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
parallaxView should be a strong reference. With OBJC_ASSOCIATION_ASSIGN the category stores an unretained pointer, which dangles once the view is released, so messaging it (as KVO does) crashes.
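Putting it together, the setter might look like this once the association retains the view (a sketch; the addObserver: call is the one the question had commented out):
- (void) setParallaxView:(ParallaxView *)parallaxView {
    objc_setAssociatedObject(self, &parallaxKey, parallaxView, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
    [self addObserver:parallaxView
           forKeyPath:@"contentOffset"
              options:NSKeyValueObservingOptionNew
              context:nil];
}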
I'm trying to build a very simple application whose purpose is to listen to an audio stream (AAC, 64 kbps). To do so I'm using AVPlayer from Apple's AVFoundation, as follows:
ViewController.m
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
#synthesize playerItem, player;
- (void)viewDidLoad
{
[super viewDidLoad];
}
- (void) viewWillAppear:(BOOL)animated{
[super viewWillAppear:animated];
playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://stream.myjungly.fr/MYJUNGLY2"]];
[playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];
player = [AVPlayer playerWithPlayerItem:playerItem];
[player play];
NSLog(@"player item error : %@", playerItem.error.description);
NSLog(@"player error : %@", player.error.description);
}
- (void) observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object
change:(NSDictionary*)change context:(void*)context {
if ([keyPath isEqualToString:@"timedMetadata"])
{
AVPlayerItem* _playerItem = object;
for (AVMetadataItem* metadata in _playerItem.timedMetadata)
{
NSLog(@"\nkey: %@\nkeySpace: %@\ncommonKey: %@\nvalue: %@", [metadata.key description], metadata.keySpace, metadata.commonKey, metadata.stringValue);
}
}
}
@end
My player and playerItem objects are strong properties:
ViewController.h
@interface ViewController : UIViewController
@property (nonatomic, strong) AVPlayerItem* playerItem;
@property (nonatomic, strong) AVPlayer* player;
@end
The key-value observer is working great, here is my log:
2013-05-14 11:18:03.725 MusicAvPlayer[6494:907] player item error : (null)
2013-05-14 11:18:03.728 MusicAvPlayer[6494:907] player error : (null)
2013-05-14 11:18:08.140 MusicAvPlayer[6494:907]
key: title
keySpace: comn
commonKey: title
value: Alabama Shakes - Be Mine
But the audio is not playing, I've got no sound! Any idea why?
EDIT: I already looked at these questions:
No sound coming from AVPlayer
AVAudioPlayer, No Sound
AVAudioPlayer not playing any sound
That's why I'm using strong properties, so I guess my problem is not ARC related.
I found the problem: the iPhone was in silent mode... so no sound could come out of the speaker; the sound played when I was using headphones.
But I've got a new question now: how can you play sound on the speaker when the phone is in silent mode? (like the official Music application)
EDIT: ... and the answer is here:
Play sound on iPhone even in silent mode
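In short (a sketch based on that answer; AVAudioSession is part of AVFoundation): setting the audio session category to AVAudioSessionCategoryPlayback lets audio keep playing even with the ring/silent switch set to silent.
NSError *sessionError = nil;
// The Playback category ignores the silent switch.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];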
// Init PlayerItem
playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://stream.myjungly.fr/MYJUNGLY2"]];
// Init Player Obj
player = [AVPlayer playerWithPlayerItem:playerItem];
// Add observer on Player
[player addObserver:self forKeyPath:@"status" options:0 context:nil];
Add the observer method to your class:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
change:(NSDictionary *)change context:(void *)context {
if (object == player && [keyPath isEqualToString:@"status"]) {
if (player.status == AVPlayerStatusReadyToPlay) {
// Start playing...
[player play];
} else if (player.status == AVPlayerStatusFailed) {
// something went wrong. player.error should contain some information
}
}
if ([keyPath isEqualToString:@"timedMetadata"])
{
AVPlayerItem* _playerItem = object;
for (AVMetadataItem* metadata in _playerItem.timedMetadata)
{
NSLog(@"\nkey: %@\nkeySpace: %@\ncommonKey: %@\nvalue: %@", [metadata.key description], metadata.keySpace, metadata.commonKey, metadata.stringValue);
}
}
}
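One caveat worth adding (my note, not part of the original answer): observers registered this way should be removed before the player and item go away, or the app can crash on a later notification. A sketch:
// e.g. in dealloc or viewWillDisappear:, assuming the observers above were added
[player removeObserver:self forKeyPath:@"status"];
[playerItem removeObserver:self forKeyPath:@"timedMetadata"];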
What is the simplest way to play a video programmatically with Objective-C in Mac OS X 10.7 (Lion)? And if I want to support OS X 10.6 (Snow Leopard) too?
I noticed that iOS AV Foundation was introduced to OS X 10.7. Unfortunately the documentation seems to be written for iOS and I found it confusing.
Here's an NSView subclass that plays a video given a URL, using AV Foundation (thus Mac OS X 10.7 upwards only). Based on the AVSimplePlayer sample code.
Header:
#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>
@interface RMVideoView : NSView
@property (nonatomic, readonly, strong) AVPlayer* player;
@property (nonatomic, readonly, strong) AVPlayerLayer* playerLayer;
@property (nonatomic, retain) NSURL* videoURL;
- (void) play;
@end
Implementation:
static void *RMVideoViewPlayerLayerReadyForDisplay = &RMVideoViewPlayerLayerReadyForDisplay;
static void *RMVideoViewPlayerItemStatusContext = &RMVideoViewPlayerItemStatusContext;
@interface RMVideoView()
- (void)onError:(NSError*)error;
- (void)onReadyToPlay;
- (void)setUpPlaybackOfAsset:(AVAsset *)asset withKeys:(NSArray *)keys;
@end
@implementation RMVideoView
@synthesize player = _player;
@synthesize playerLayer = _playerLayer;
@synthesize videoURL = _videoURL;
- (id)initWithFrame:(NSRect)frame {
self = [super initWithFrame:frame];
if (self) {
self.wantsLayer = YES;
_player = [[AVPlayer alloc] init];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:RMVideoViewPlayerItemStatusContext];
}
return self;
}
- (void) dealloc {
[self.player pause];
[self removeObserver:self forKeyPath:@"player.currentItem.status"];
[self removeObserver:self forKeyPath:@"playerLayer.readyForDisplay"];
[_player release];
[_playerLayer release];
[_videoURL release];
[super dealloc];
}
- (void) setVideoURL:(NSURL *)videoURL {
[_videoURL release];
_videoURL = [videoURL retain]; // retain manually; this class uses manual reference counting (see dealloc)
[self.player pause];
[self.playerLayer removeFromSuperlayer];
AVAsset *asset = [AVAsset assetWithURL:self.videoURL];
NSArray *assetKeysToLoadAndTest = [NSArray arrayWithObjects:@"playable", @"hasProtectedContent", @"tracks", @"duration", nil];
[asset loadValuesAsynchronouslyForKeys:assetKeysToLoadAndTest completionHandler:^(void) {
dispatch_async(dispatch_get_main_queue(), ^(void) {
[self setUpPlaybackOfAsset:asset withKeys:assetKeysToLoadAndTest];
});
}];
}
#pragma mark - KVO
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if (context == RMVideoViewPlayerItemStatusContext) {
AVPlayerItemStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status) {
case AVPlayerItemStatusUnknown:
break;
case AVPlayerItemStatusReadyToPlay:
[self onReadyToPlay];
break;
case AVPlayerItemStatusFailed:
[self onError:nil];
break;
}
} else if (context == RMVideoViewPlayerLayerReadyForDisplay) {
if ([[change objectForKey:NSKeyValueChangeNewKey] boolValue]) {
self.playerLayer.hidden = NO;
}
} else {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
}
#pragma mark - Private
- (void)onError:(NSError*)error {
// Notify delegate
}
- (void)onReadyToPlay {
// Notify delegate
}
- (void)setUpPlaybackOfAsset:(AVAsset *)asset withKeys:(NSArray *)keys {
for (NSString *key in keys) {
NSError *error = nil;
if ([asset statusOfValueForKey:key error:&error] == AVKeyValueStatusFailed) {
[self onError:error];
return;
}
}
if (!asset.isPlayable || asset.hasProtectedContent) {
[self onError:nil];
return;
}
if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) { // Asset has video tracks
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.frame = self.layer.bounds;
self.playerLayer.autoresizingMask = kCALayerWidthSizable | kCALayerHeightSizable;
self.playerLayer.hidden = YES;
[self.layer addSublayer:self.playerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:RMVideoViewPlayerLayerReadyForDisplay];
}
// Create a new AVPlayerItem and make it our player's current item.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.player replaceCurrentItemWithPlayerItem:playerItem];
}
#pragma mark - Public
- (void) play {
[self.player play];
}
@end
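Usage might look like this (a sketch; the window and the file path are assumptions, and the autorelease matches the class's manual reference counting):
RMVideoView *videoView = [[[RMVideoView alloc] initWithFrame:[window.contentView bounds]] autorelease];
[window.contentView addSubview:videoView];
videoView.videoURL = [NSURL fileURLWithPath:@"/path/to/movie.mov"]; // hypothetical path
[videoView play]; // playback starts once the player item is ready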
"Simplest" depends on exactly what you're trying to do. If you want more control (e.g., rendering the movie as an OpenGL texture) or less (e.g., a completely independent window that you can just pop up and ignore), there might be different answers.
But for most use cases, if you want 10.6+ support, the simplest way to show a movie is QTKit. See the article "Using QTKit for Media Playback" in the Xcode documentation for a good starting point.
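For a rough idea of what that looks like, here is a sketch (untested; assumes a QTMovieView outlet named movieView placed in the window, and movieURL pointing at your media):
#import <QTKit/QTKit.h>

NSError *error = nil;
QTMovie *movie = [QTMovie movieWithURL:movieURL error:&error]; // movieURL: file or stream URL
if (movie) {
    [self.movieView setMovie:movie]; // hand the movie to the view
    [movie play];
}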
I have an NSOperation subclass that I want to run concurrently.
My understanding is that for concurrent operations to work:
I need to define isConcurrent to return YES.
I need to define the start method
I need to send KVO notifications for isExecuting and isFinished when it's done.
Using @synthesize will automatically send the appropriate KVO notifications when the values for isExecuting and isFinished are changed.
Despite this, I have verified that my queue never moves on to the next item.
Here's the meat of my code:
@interface MyOperation()
@property (readwrite) BOOL isExecuting;
@property (readwrite) BOOL isFinished;
@end
@implementation MyOperation
- (void)start
{
@autoreleasepool {
self.isExecuting = YES;
self.HTTPOperation = [[AFHTTPRequestOperation alloc] initWithRequest: URLRequest];
_HTTPOperation.completionBlock = [^{
[self completed];
self.isExecuting = NO;
self.isFinished = YES;
} copy];
[_HTTPOperation start];
}
}
- (BOOL)isConcurrent
{
return YES;
}
- (void)completed
{
}
@end
What am I missing?
(This is on an iPhone, but I can't imagine that matters.)
It looks like whatever KVO notifications @synthesize sends aren't enough for NSOperationQueue to move on.
Sending the notifications manually fixes the problem:
- (void)start
{
@autoreleasepool {
[self willChangeValueForKey:@"isExecuting"];
self.isExecuting = YES;
[self didChangeValueForKey:@"isExecuting"];
NSURLRequest *URLRequest = [self buildRequest];
if (!URLRequest) {
[self willChangeValueForKey:@"isFinished"];
[self willChangeValueForKey:@"isExecuting"];
_isExecuting = NO;
_isFinished = YES;
[self didChangeValueForKey:@"isExecuting"];
[self didChangeValueForKey:@"isFinished"];
return;
}
self.HTTPOperation = [[AFHTTPRequestOperation alloc] initWithRequest: URLRequest];
_HTTPOperation.completionBlock = [^{
[self completed];
[self willChangeValueForKey:@"isFinished"];
[self willChangeValueForKey:@"isExecuting"];
_isExecuting = NO;
_isFinished = YES;
[self didChangeValueForKey:@"isExecuting"];
[self didChangeValueForKey:@"isFinished"];
} copy];
[_HTTPOperation start];
}
}
See also:
Why does NSOperation disable automatic key-value observing?
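As an aside, another common pattern (a sketch, not from the original answer) is to back the keys with plain ivars and override the getters, wrapping every state change in the will/did pair:
// In the subclass: BOOL _executing, _finished; declared as ivars.
- (BOOL)isExecuting { return _executing; }
- (BOOL)isFinished { return _finished; }

- (void)finishOperation {
    [self willChangeValueForKey:@"isExecuting"];
    [self willChangeValueForKey:@"isFinished"];
    _executing = NO;
    _finished = YES;
    [self didChangeValueForKey:@"isFinished"];
    [self didChangeValueForKey:@"isExecuting"];
}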
What's your "queue" look like? Are you using an NSOperationQueue?
Anyway, I'll try to answer your question with what I understood :P
I would create a delegate for my NSOperation and have KVO take care of calling this.
Say for example your NSOperation class looks like this
@interface MyOperation : NSOperation
@property (assign) id<MyOperationDelegate> delegate;
@end
Your implementation
@synthesize delegate;
@synthesize error;
-(id)init{
self = [super init];
if(self){
[self addObserver:self forKeyPath:@"isFinished"
options:NSKeyValueObservingOptionNew
context:NULL];
}
return self;
}
-(void)dealloc{
[self removeObserver:self forKeyPath:@"isFinished"];
[super dealloc];
}
-(void)observeValueForKeyPath:(NSString *)keyPath
ofObject:(id)object change:(NSDictionary *)change context:(void *)context{
if([keyPath isEqualToString:@"isFinished"] == YES){
if([self isCancelled] == NO){
if(delegate != nil && [delegate respondsToSelector:@selector(operationComplete:)]){
[delegate operationComplete:self]; // call the selector we just checked for (was taskComplete:)
}
}else{
if(delegate != nil && [delegate respondsToSelector:@selector(operationCancelled)]){
[delegate operationCancelled]; // call the selector we just checked for (was taskCancelled)
}
}
}
}
-(void)main{
[NSException raise:kTaskException
format:@"Only to be used with subclass"]; // actually raise it; the original only created the exception
}
And finally your protocol
@class MyOperation;
@protocol MyOperationDelegate <NSObject>
@optional
-(void)operationComplete:(MyOperation*)operation;
-(void)operationCancelled;
@end
I have now nearly figured out how to filter an NSTreeController. To do this I have subclassed NSManagedObject and added some code to my App Delegate. I have also bound my NSSearchField to the filterPredicate of my App Delegate, but I think I need to connect my NSTreeController and NSSearchField in some way to make it work.
Below I have posted all the code I have used so far to try and make it work.
NSManagedObject Sub-Class Header File.
@interface Managed_Object_Sub_Class : NSManagedObject {
NSArray *filteredChildren; // this should fix the compiler error
}
- (NSArray *)filteredChildren;
@end
NSManagedObject Sub-Class Implementation File.
@implementation Managed_Object_Sub_Class
static char *FilteredChildrenObservationContext;
- (id)initWithEntity:(NSEntityDescription *)entity insertIntoManagedObjectContext:(NSManagedObjectContext *)context {
if (self = [super initWithEntity:entity insertIntoManagedObjectContext:context]) {
[[NSApp delegate] addObserver:self forKeyPath:@"filterPredicate" options:0 context:&FilteredChildrenObservationContext];
[self addObserver:self forKeyPath:@"subGroup" options:0 context:&FilteredChildrenObservationContext];
}
return self;
}
// use finalize with GC
- (void)dealloc {
[[NSApp delegate] removeObserver:self forKeyPath:@"filterPredicate"];
[self removeObserver:self forKeyPath:@"subGroup"];
[super dealloc];
}
- (NSArray *)filteredChildren {
if (filteredChildren == nil) {
NSPredicate *predicate = [[NSApp delegate] filterPredicate];
if (predicate)
filteredChildren = [[[self valueForKey:@"subGroup"] filteredArrayUsingPredicate:predicate] copy];
else
filteredChildren = [[self valueForKey:@"subGroup"] copy];
}
return filteredChildren;
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if (context == &FilteredChildrenObservationContext) {
[self willChangeValueForKey:@"filteredChildren"];
[filteredChildren release];
filteredChildren = nil;
[self didChangeValueForKey:@"filteredChildren"];
} else {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
}
@end
Code Added To App Delegate Header File
NSPredicate *filterPredicate;
Code Added To App Delegate Implementation File
- (NSPredicate *)filterPredicate {
return filterPredicate;
}
- (void)setFilterPredicate:(NSPredicate *)newFilterPredicate {
if (filterPredicate != newFilterPredicate) {
[filterPredicate release];
filterPredicate = [newFilterPredicate retain];
}
}
Search Field Binding
(Screenshot of the search field binding: http://snapplr.com/snap/vs9q)
This doesn't work yet, which is why I am asking what I need to do from here to make it work. Like I said, I think I need to connect the NSSearchField and NSTreeController together in some way.
Again I have answered my own question. I also hope this will help other people learn how to filter an NSTreeController.
To make it work from my post above, do the following.
1. For your entity, set the Class to your NSManagedObject subclass, in my case JGManagedObject.
(Screenshot: http://dvlp.me/c3k)
2. For your search field in IB, set the predicate format to the property in your entity that you want to filter on (for me it is name).
(Screenshot: http://dvlp.me/9k9rw)
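For reference, the predicate format string in the search field's predicate binding uses the $value substitution variable for the typed text; assuming the attribute is name, it would look something like:
name CONTAINS[cd] $value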