How can I extract metadata from an MP3 file in iOS development (Objective-C)?

I am working on an iOS music player with cloud storage.
I need to extract music information such as the title, artist, and artwork.
I have an action called playIt that plays and pauses the MP3 file. It should also populate some UILabels and a UIImageView with the metadata associated with the MP3 file. The problem is that I could not get the metadata extracted from more than 25 different MP3 files.
The file URL is correct, because the audio player is able to find and play the file, but I do not know why AVMetadataItem is not able to get the metadata. Here is my code:
- (IBAction)playIt:(id)sender {
    AVAudioPlayer *audioPlayer;
    AVAsset *asset;
    NSError *error = nil;

    NSString *applicationPath = [[NSBundle mainBundle] resourcePath];
    NSString *secondParentPath = [applicationPath stringByDeletingLastPathComponent];
    NSString *soundFilePath = [[secondParentPath stringByAppendingPathComponent:@"fisal1407"]
                               stringByAppendingPathComponent:[musicFiles objectForKey:@"show_id"]];
    NSURL *fileURL = [NSURL URLWithString:[soundFilePath stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];

    asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    NSArray *metadata = [asset commonMetadata];
    for (NSString *format in metadata) {
        for (AVMetadataItem *item in [asset metadataForFormat:format]) {
            if ([[item commonKey] isEqualToString:@"title"]) {
                filename.text = (NSString *)[item value];
                NSLog(@"title: %@", (NSString *)[item value]);
            }
            if ([[item commonKey] isEqualToString:@"artist"]) {
                show_id.text = (NSString *)[item value];
            }
            if ([[item commonKey] isEqualToString:@"albumName"]) {
                // _albumName = (NSString *)[item value];
            }
            if ([[item commonKey] isEqualToString:@"artwork"]) {
                NSData *data = [(NSDictionary *)[item value] objectForKey:@"data"];
                UIImage *img = [UIImage imageWithData:data];
                imageView.image = img;
                continue;
            }
        }
    }

    if (audioPlayer == nil) {
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
        audioPlayer.numberOfLoops = -1;
        [audioPlayer play];
        [sender setImage:[UIImage imageNamed:@"player_044.gif"] forState:UIControlStateNormal];
    } else {
        if (audioPlayer.isPlaying) {
            [sender setImage:[UIImage imageNamed:@"player_04.gif"] forState:UIControlStateNormal];
            [audioPlayer pause];
        } else {
            [sender setImage:[UIImage imageNamed:@"player_044.gif"] forState:UIControlStateNormal];
            [audioPlayer play];
        }
    }
}

Try
for (NSString *format in [asset availableMetadataFormats]) {
instead of
NSArray *metadata = [asset commonMetadata];
for (NSString *format in metadata) {
commonMetadata returns an array of AVMetadataItem objects, not format identifiers, so the values you feed to metadataForFormat: are never valid formats; availableMetadataFormats gives you the format strings that metadataForFormat: expects.
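A minimal sketch of the corrected loop, reusing the asset variable and the filename, show_id, and imageView outlets from the question:
for (NSString *format in [asset availableMetadataFormats]) {
    for (AVMetadataItem *item in [asset metadataForFormat:format]) {
        if ([[item commonKey] isEqualToString:AVMetadataCommonKeyTitle]) {
            filename.text = (NSString *)[item value];
        } else if ([[item commonKey] isEqualToString:AVMetadataCommonKeyArtist]) {
            show_id.text = (NSString *)[item value];
        } else if ([[item commonKey] isEqualToString:AVMetadataCommonKeyArtwork]) {
            // For ID3 artwork on older iOS versions the value is a dictionary with a "data" key;
            // on newer versions it may already be an NSData.
            id value = [item value];
            NSData *data = [value isKindOfClass:[NSDictionary class]] ? [value objectForKey:@"data"] : (NSData *)value;
            imageView.image = [UIImage imageWithData:data];
        }
    }
}
If the files come from cloud storage you may also want to load the metadata asynchronously with -[AVAsset loadValuesAsynchronouslyForKeys:completionHandler:] before reading it, rather than forcing a synchronous load on the main thread.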

Related

How to save an AVAudioFile to the documents directory?

I want to save an AVAudioFile (created with an NSDictionary of settings) to the documents directory. Can anyone help me?
AVAudioFile *audiofile=[[AVAudioFile alloc] initForWriting:destinationURL settings:settings error:&error];
I want to save this audio file to the documents directory...
Path to the documents directory:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:fileName];
Saving a dictionary (for example, the recording settings) to the documents directory:
BOOL status = [settings writeToFile:filePath atomically:YES]; // writeToFile: is an instance method, so call it on the NSDictionary instance
if (status) {
    NSLog(@"File written successfully");
}
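To save the recorded audio file itself, one option (a sketch, assuming destinationURL from the question points at the finished recording and documentsDirectory from above) is to copy it with NSFileManager; alternatively, build destinationURL inside the documents directory in the first place so no copy is needed:
NSString *destinationPath = [documentsDirectory stringByAppendingPathComponent:[destinationURL lastPathComponent]];
NSError *copyError = nil;
BOOL copied = [[NSFileManager defaultManager] copyItemAtURL:destinationURL
                                                      toURL:[NSURL fileURLWithPath:destinationPath]
                                                      error:&copyError];
if (!copied) {
    NSLog(@"Could not copy audio file: %@", copyError);
}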
- (NSString *)dateString
{
    // return a formatted string for a file name
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"ddMMMYY_hhmmssa";
    return [[formatter stringFromDate:[NSDate date]] stringByAppendingString:@".aif"];
}
This saves a file such as the following in Documents:
23Aug16_044104PM.aif
We name the file with a timestamp so that each recording gets a unique name and earlier recordings are not confused with or overwritten by later ones.
ViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVAudioSessionDelegate, AVAudioRecorderDelegate, AVAudioPlayerDelegate>
{
    NSURL *temporaryRecFile;
    AVAudioRecorder *recorder;
    AVAudioPlayer *player;
}
- (IBAction)actionRecordAudio:(id)sender;
- (IBAction)actionPlayAudio:(id)sender;
@end
ViewController.m
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
[audioSession setActive:YES error:nil];
[recorder setDelegate:self];
}
- (IBAction)actionRecordAudion:(id)sender
{
NSError *error;
// Recording settings
NSMutableDictionary *settings = [NSMutableDictionary dictionary];
[settings setValue: [NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
[settings setValue: [NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey];
[settings setValue: [NSNumber numberWithInt: 1] forKey:AVNumberOfChannelsKey];
[settings setValue: [NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
[settings setValue: [NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
[settings setValue: [NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];
[settings setValue: [NSNumber numberWithInt: AVAudioQualityMax] forKey:AVEncoderAudioQualityKey];
NSArray *searchPaths =NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentPath_ = [searchPaths objectAtIndex: 0];
NSString *pathToSave = [documentPath_ stringByAppendingPathComponent:[self dateString]];
NSLog(#"the path is %#",pathToSave);
// File URL
NSURL *url = [NSURL fileURLWithPath:pathToSave];//FILEPATH];
//Save recording path to preferences
NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
[prefs setURL:url forKey:#"Test1"];
[prefs synchronize];
// Create recorder
recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
[recorder prepareToRecord];
[recorder record];
}
- (IBAction)actionPlayAudio:(id)sender
{
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
[audioSession setActive:YES error:nil];
//Load recording path from preferences
NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
temporaryRecFile = [prefs URLForKey:#"Test1"];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:temporaryRecFile error:nil];
player.delegate = self;
[player setNumberOfLoops:0];
player.volume = 1;
[player prepareToPlay];
[player play];
}
Record and Save Audio Permanently
Record Audio File and save Locally
I just tried the code below on an iPhone 4S and it works perfectly.
AudioViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface AudioViewController : UIViewController <AVAudioSessionDelegate, AVAudioRecorderDelegate, AVAudioPlayerDelegate>

- (IBAction)actionRecordAudio:(id)sender;
- (IBAction)actionPlayAudio:(id)sender;
- (IBAction)actionStopAudio:(id)sender;

@property (strong, nonatomic) AVAudioRecorder *audioRecorder;
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;

@end
AudioViewController.m
#import "AudioViewController.h"
#interface AudioViewController ()
#end
#implementation AudioViewController
#synthesize audioPlayer,audioRecorder;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view.
NSArray *dirPaths;
NSString *docsDir;
dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
docsDir = dirPaths[0];
NSString *soundFilePath = [docsDir stringByAppendingPathComponent:#"sound.caf"];
NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
NSDictionary *recordSettings = [NSDictionary
dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:AVAudioQualityMin],
AVEncoderAudioQualityKey,
[NSNumber numberWithInt:16],
AVEncoderBitRateKey,
[NSNumber numberWithInt: 2],
AVNumberOfChannelsKey,
[NSNumber numberWithFloat:44100.0],
AVSampleRateKey,
nil];
NSError *error = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
audioRecorder = [[AVAudioRecorder alloc]initWithURL:soundFileURL settings:recordSettings error:&error];
if (error)
{
NSLog(#"error: %#", [error localizedDescription]);
}
else {
[audioRecorder prepareToRecord];
}
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)actionRecordAudio:(id)sender
{
if (!audioRecorder.recording)
{
[audioRecorder record];
}
}
- (IBAction)actionPlayAudio:(id)sender
{
if (audioRecorder.recording)
{
NSError *error;
audioPlayer = [[AVAudioPlayer alloc]
initWithContentsOfURL:audioRecorder.url
error:&error];
audioPlayer.delegate = self;
if (error)
NSLog(#"Error: %#",
[error localizedDescription]);
else
[audioPlayer play];
}
}
- (IBAction)actionStopAudio:(id)sender
{
if (audioRecorder.recording)
{
[audioRecorder stop];
}
else if (audioPlayer.playing) {
[audioPlayer stop];
}
}
-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
}
-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
NSLog(#"Decode Error occurred");
}
-(void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag
{
}
-(void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder error:(NSError *)error
{
NSLog(#"Encode Error occurred");
}
#end
Here is the source.
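One additional note, not in the original answer: from iOS 7 onward, recording also requires microphone permission, which you can request up front. A sketch of the standard AVAudioSession call:
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    if (!granted) {
        NSLog(@"Microphone access denied; recording will not capture audio.");
    }
}];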

How to open a view controller after data has been loaded into the model objects?

How can I check whether the NSData dataWithContentsOfURL: calls running on my secondary thread have finished? I want to open my view controller only when every image has finished downloading, not before. Right now I can open my view controller directly, and sometimes, if I'm too quick, my table view has no images because they haven't finished yet. Any ideas?
The following code runs in didFinishLaunchingWithOptions in the AppDelegate. I'm using the SBJSON framework for parsing.
(I'm using a storyboard in this project, so there's no code for opening the first view controller.)
Code:
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"json_template" ofType:@"json"];
NSString *contents = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil];

SBJsonParser *jsonParser = [[SBJsonParser alloc] init];
NSMutableDictionary *json = [jsonParser objectWithString:contents];
tabs = [[NSMutableArray alloc] init];
jsonParser = nil;

// parsing json into model objects
for (NSString *tab in json)
{
    Tab *tabObj = [[Tab alloc] init];
    tabObj.title = tab;
    NSDictionary *categoryDict = [[json valueForKey:tabObj.title] objectAtIndex:0];
    for (NSString *key in categoryDict)
    {
        Category *catObj = [[Category alloc] init];
        catObj.name = key;
        NSArray *items = [categoryDict objectForKey:key];
        for (NSDictionary *dict in items)
        {
            Item *item = [[Item alloc] init];
            item.title = [dict objectForKey:@"title"];
            item.desc = [dict objectForKey:@"description"];
            item.url = [dict objectForKey:@"url"];
            if ([dict objectForKey:@"image"] != [NSNull null])
            {
                dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void)
                {
                    NSURL *imgUrl = [NSURL URLWithString:[dict objectForKey:@"image"]];
                    NSData *imageData = [NSData dataWithContentsOfURL:imgUrl];
                    dispatch_async(dispatch_get_main_queue(), ^(void)
                    {
                        item.image = [UIImage imageWithData:imageData];
                    });
                });
            }
            else
            {
                UIImage *image = [UIImage imageNamed:@"standard3.png"];
                item.image = image;
            }
            [catObj.items addObject:item];
        }
        [tabObj.categories addObject:catObj];
    }
    [tabs addObject:tabObj];
}

// sort array
[tabs sortUsingComparator:^NSComparisonResult(id obj1, id obj2) {
    Tab *r1 = (Tab *)obj1;
    Tab *r2 = (Tab *)obj2;
    return [r1.title caseInsensitiveCompare:r2.title];
}];
/***** END PARSING JSON *****/

[[UINavigationBar appearance] setTitleTextAttributes:@{
    UITextAttributeTextShadowOffset : [NSValue valueWithUIOffset:UIOffsetMake(0.0f, 0.0f)],
    UITextAttributeFont : [UIFont fontWithName:@"GreatLakesNF" size:20.0f]
}];

UIImage *navBackgroundImage = [UIImage imageNamed:@"navbar.png"];
[[UINavigationBar appearance] setBackgroundImage:navBackgroundImage forBarMetrics:UIBarMetricsDefault];

UIImage *backButtonImage = [[UIImage imageNamed:@"backBtn.png"] resizableImageWithCapInsets:UIEdgeInsetsMake(0, 0, 0, 0)];
UIImage *backButtonSelectedImage = [[UIImage imageNamed:@"backBtn_selected.png"] resizableImageWithCapInsets:UIEdgeInsetsMake(0, 0, 0, 0)];
[[UIBarButtonItem appearance] setBackButtonBackgroundImage:backButtonImage forState:UIControlStateNormal barMetrics:UIBarMetricsDefault];
[[UIBarButtonItem appearance] setBackButtonBackgroundImage:backButtonSelectedImage forState:UIControlStateHighlighted barMetrics:UIBarMetricsDefault];

return YES;
Also, if this way of parsing is bad, please tell me!
First of all, you shouldn't download remote content this way.
There are libraries such as AFNetworking and ASIHTTPRequest,
built around CFNetwork or NSURLConnection, that handle things like redirects and error handling for you.
So you should definitely move to one of those (or implement your own based on NSURLConnection).
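For example, a minimal sketch of the NSURLConnection route (reusing the imgUrl and item variables from the question's code) would be:
NSURLRequest *request = [NSURLRequest requestWithURL:imgUrl];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    // completion handler runs on the main queue once the download has finished (or failed)
    if (data && !error) {
        item.image = [UIImage imageWithData:data];
    }
}];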
As a direct answer to your question:
You should use some kind of identifier for counting the downloaded images (e.g. the for-loop iteration counter) and pass it via [NSNotificationCenter defaultCenter] as a parameter of a custom notification.
Example (assuming that you are blocking the current thread with +[NSData dataWithContentsOfURL:]):
for (int i = 0; i < 10; i++) {
    // post i + 1 so the observer can tell when the 10th (last) download has finished
    [[NSNotificationCenter defaultCenter] postNotificationName:@"someCustomNotificationClassName"
                                                        object:nil
                                                      userInfo:@{ @"counter" : @(i + 1) }];
}
A more complete example of the NSNotification-based approach:
- (id)init {
    self = [super init];
    if (self) {
        // subscribing to the notification
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(handleDataDownload:)
                                                     name:@"someCustomNotificationClassName"
                                                   object:nil];
    }
    return self;
}

- (void)dealloc {
    // unsubscribing from the notification on -dealloc
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - downloading delegation

- (void)handleDataDownload:(NSNotification *)notification {
    NSDictionary *userInfo = [notification userInfo];
    int counter = [userInfo[@"counter"] intValue];
    if (counter == 10) {
        // do some work afterwards,
        // assuming that the last item was downloaded
    }
}
You can also use a callback technique to handle the download state:
void (^callback)(id result, int identifier) = ^(id result, int identifier) {
    if (identifier == 10) {
        // do some work afterwards
    }
};

for (int i = 0; i < 10; i++) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, kNilOptions), ^{
        // some downloading work that blocks this thread
        id data = nil;
        callback(data, i + 1); // pass i + 1 so the identifier reaches 10 on the last download
    });
}
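Another option, not mentioned in the original answer, is a dispatch group, which removes the manual counter entirely. A minimal sketch, assuming the same blocking downloads:
dispatch_group_t group = dispatch_group_create();
for (int i = 0; i < 10; i++) {
    dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // blocking download, e.g. +[NSData dataWithContentsOfURL:]
    });
}
// runs on the main queue once every block added to the group has finished
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    // all images downloaded; safe to present the view controller here
});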

AVAudioPlayer breaking video capture

In one of the views of my app there's a button. When pressed, it is supposed to start recording a video, trigger a sound file to start playing, and hide itself while unhiding another button. The second button is supposed to stop the video recording and make it save. Here's the code I have for the video recording, which initially worked with no problems:
in viewDidLoad:
finishButton.hidden = TRUE;
session = [[AVCaptureSession alloc] init];
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
NSError *error;
AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self cameraWithPosition:AVCaptureDevicePositionFront] error:&error];
if (videoInput)
{
[session addInput:videoInput];
}
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *audioError = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&audioError];
if (audioInput)
{
[session addInput:audioInput];
}
Float64 TotalSeconds = 35; //Total seconds
int32_t preferredTimeScale = 30; //Frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);
movieFileOutput.maxRecordedDuration = maxDuration;
movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;
if ([session canAddOutput:movieFileOutput])
[session addOutput:movieFileOutput];
[session setSessionPreset:AVCaptureSessionPresetMedium];
if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size based configs are supported before setting them
[session setSessionPreset:AVCaptureSessionPreset640x480];
[self cameraSetOutputProperties];
[session startRunning];
and for the button:
-(IBAction)start:(id)sender
{
startButton.hidden = TRUE;
finishButton.hidden = FALSE;
//Create temporary URL to record to
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
self.outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
NSError *error;
if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
{
//Error - handle if required
}
}
//Start recording
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
finally, under the last button:
[movieFileOutput stopRecording];
and here's the code to save the video:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(#"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCCESSFULLY -----
NSLog(@"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
}
}];
}
}
}
All of this was working just fine. Then I added a few lines so that a song file would play when the start button was pressed.
in viewDidLoad:
NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/Song.aiff", [[NSBundle mainBundle] resourcePath]]];
NSError *audioFileError;
player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&audioFileError];
player.numberOfLoops = 0;
[self.player prepareToPlay];
and under the start button:
if (player == nil)
NSLog(#"Audio file could not be played");
else
[player play];
Now when the start button is pressed the song plays with no problems, but the video capture is messed up. Before adding the AVAudioPlayer stuff I would get the "didFinishRecordingToOutputFileAtURL - enter" and "didFinishRecordingToOutputFileAtURL - success" logs when I pressed the finish button, and now I get the first log as soon as I press the start button, nothing happens when I press the finish button, and no video is recorded. If I comment out the lines that make the song play then the video capture works just fine again. Any ideas what's going on here?
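A likely cause: creating and starting the AVAudioPlayer reconfigures the app's shared audio session, which interrupts the audio input the running AVCaptureSession is using and ends the recording early. The code below works around this by setting up the audio session with the mix-with-others override before playing the sound: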
- (void)setupAudioSession
{
static BOOL audioSessionSetup = NO;
if (audioSessionSetup)
{
return;
}
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error: nil];
UInt32 doSetProperty = 1;
AudioSessionSetProperty (kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);
[[AVAudioSession sharedInstance] setActive: YES error: nil];
audioSessionSetup = YES;
}
- (void)playAudio
{
[self setupAudioSession];
NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"btnClick" ofType:@"wav"];
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:soundFilePath];
AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
[fileURL release];
self.audioPlayer = newPlayer;
[newPlayer release];
[audioPlayer setDelegate:self];
[audioPlayer prepareToPlay];
audioPlayer.volume=1.0;
[audioPlayer play];
}
NOTE: Add the framework: AudioToolbox.framework.
#import <AudioToolbox/AudioServices.h>
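As an alternative to the C-level AudioSessionSetProperty call (which is deprecated in later SDKs), the same mix-with-others behavior can be requested through AVAudioSession category options, available from iOS 6. A sketch using the same Playback category as above:
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                       error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];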

How to return a result after openWithCompletionHandler: is complete

I want to query a photo in the Core Data database.
This is my code.
This is the category on my NSManagedObject subclass:
// Photo+creat.m
#import "Photo+creat.h"

@implementation Photo (creat)

+ (Photo *)creatPhotoByString:(NSString *)photoName inManagedObjectContext:(NSManagedObjectContext *)context {
    Photo *picture = nil;
    NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Photo"];
    request.predicate = [NSPredicate predicateWithFormat:@"name = %@", photoName];
    NSArray *matches = [context executeFetchRequest:request error:nil];
    if (!matches || [matches count] > 1) {
        // error
    } else if ([matches count] == 0) {
        picture = [NSEntityDescription insertNewObjectForEntityForName:@"Photo" inManagedObjectContext:context];
        picture.name = photoName;
    } else {
        picture = [matches lastObject];
    }
    return picture;
}

+ (BOOL)isPhoto:(NSString *)photoName here:(NSManagedObjectContext *)context {
    NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Photo"];
    request.predicate = [NSPredicate predicateWithFormat:@"name = %@", photoName];
    NSArray *matches = [context executeFetchRequest:request error:nil];
    switch ([matches count]) {
        case 1:
            return YES;
            break;
        default:
            return NO;
            break;
    }
}

@end
Code inside the view controller:
// View Controller
- (IBAction)insertData:(UIButton *)sender {
    NSURL *url = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
    url = [url URLByAppendingPathComponent:@"test"];
    UIManagedDocument *defaultDocument = [[UIManagedDocument alloc] initWithFileURL:url];
    if (![[NSFileManager defaultManager] fileExistsAtPath:[url path]]) {
        [defaultDocument saveToURL:defaultDocument.fileURL forSaveOperation:UIDocumentSaveForCreating completionHandler:NULL];
    }
    [defaultDocument openWithCompletionHandler:^(BOOL success) {
        [Photo creatPhotoByString:@"test" inManagedObjectContext:defaultDocument.managedObjectContext];
        [defaultDocument saveToURL:defaultDocument.fileURL forSaveOperation:UIDocumentSaveForOverwriting completionHandler:NULL];
    }];
    [sender setTitle:@"Okay" forState:UIControlStateNormal];
    [sender setEnabled:NO];
}

- (IBAction)queryFromDatabase:(UIButton *)sender {
    NSURL *url = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
    url = [url URLByAppendingPathComponent:@"test"];
    UIManagedDocument *defaultDocument = [[UIManagedDocument alloc] initWithFileURL:url];
    BOOL isItWorking = [checkPhoto isPhoto:@"test" inManagedDocument:defaultDocument];
    if (isItWorking) {
        [sender setTitle:@"Okay" forState:UIControlStateNormal];
    } else {
        [sender setTitle:@"NO" forState:UIControlStateNormal];
    }
}
The NSObject class that hooks them up:
// checkPhoto.m
#import "checkPhoto.h"

@implementation checkPhoto

+ (BOOL)isPhoto:(NSString *)photoToCheck inManagedDocument:(UIManagedDocument *)document {
    __block BOOL isPhotoHere = NO;
    if (document.documentState == UIDocumentStateClosed) {
        [document openWithCompletionHandler:^(BOOL success) {
            isPhotoHere = [Photo isPhoto:photoToCheck here:document.managedObjectContext];
        }];
    }
    return isPhotoHere;
}

@end
The Core Data model has only one entity, named "Photo", and it has only one attribute, "name".
The problem is that the return always executes before the block completes, so the method always returns NO.
Test code here
Or should I do something other than openWithCompletionHandler: when querying?
You need to rework your method to work asynchronously, like -openWithCompletionHandler:. It needs to take a block which is invoked when the answer is known and which receives the answer, true or false, as a parameter.
Then, the caller should pass in a block that does whatever is supposed to happen after the answer is known.
Or, alternatively, you should delay the whole chunk of logic which cares about the photo being in the database. It should be done after the open has completed.
You'd have to show more code for a more specific suggestion.
So, you could rework the isPhoto... method to something like:
+ (void)checkIfPhoto:(NSString *)photoToCheck isInManagedDocument:(UIManagedDocument *)document handler:(void (^)(BOOL isHere))handler {
    if (document.documentState == UIDocumentStateClosed) {
        [document openWithCompletionHandler:^(BOOL success) {
            handler([Photo isPhoto:photoToCheck here:document.managedObjectContext]);
        }];
    }
    else {
        handler(NO);
    }
}
Then you can rework this:
- (IBAction)queryFromDatabase:(UIButton *)sender {
    NSURL *url = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
    url = [url URLByAppendingPathComponent:@"test"];
    UIManagedDocument *defaultDocument = [[UIManagedDocument alloc] initWithFileURL:url];
    [checkPhoto checkIfPhoto:@"test" isInManagedDocument:defaultDocument handler:^(BOOL isHere) {
        if (isHere) {
            [sender setTitle:@"Okay" forState:UIControlStateNormal];
        } else {
            [sender setTitle:@"NO" forState:UIControlStateNormal];
        }
    }];
}
Or try this:
+(BOOL)isPhoto:(Photo *)photo inDataBase:(UIManagedDocument *)defaultDocument{
__block BOOL isPhotoThere = NO;
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[defaultDocument openWithCompletionHandler:^(BOOL success) {
[defaultDocument.managedObjectContext performBlock:^{
isPhotoThere = [Photo checkPhoto:photo];
dispatch_semaphore_signal(sema);
}];
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
dispatch_release(sema);
return isPhotoThere;
}
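One caution about the semaphore version: dispatch_semaphore_wait blocks the calling thread until the signal arrives, so if it is called on the main thread while the completion handler or performBlock: needs the main queue, it can deadlock. Depending on your deployment target, the dispatch_release call may also be unnecessary (or disallowed) under ARC, which manages dispatch objects from iOS 6 onward. The asynchronous handler-based version above avoids both issues.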

Fetch an image by using a URL

I want to show an image in my view by fetching it from a URL. What changes do I need in the code below?
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // UIImage *image = [[UIImage imageNamed:<#(NSString *)name#>   (incomplete line from the question)
    NSString *mediaUrl = [[[self appDelegate] currentlySelectedBlogItem] mediaUrl];
    [[self image] setImage:[UIImage imageNamed:@"unknown.jpg"]];
    if (nil != mediaUrl) {
        NSData *imageData;
        [UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
        @try {
            imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:mediaUrl]];
        }
        @catch (NSException *e) {
            // Some error while downloading data
        }
        @finally {
            UIImage *imageFromImageData = [[UIImage alloc] initWithData:imageData];
            [[self image] setImage:imageFromImageData];
            [imageData release];
            [imageFromImageData release];
        }
        [UIApplication sharedApplication].networkActivityIndicatorVisible = NO;
    }
    self.titleTextView.text = [[[self appDelegate] currentlySelectedBlogItem] title];
    self.descriptionTextView.text = [[[self appDelegate] currentlySelectedBlogItem] description];
}
This will give you a solution:
NSURL *url = [NSURL URLWithString:@"ENTER YOUR URL HAVING THE IMAGE"];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data];
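Note that dataWithContentsOfURL: is synchronous and will block whatever thread it runs on. As a sketch (reusing the mediaUrl string and the image outlet from the question), the download can be pushed to a background queue:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // download off the main thread
    NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:mediaUrl]];
    dispatch_async(dispatch_get_main_queue(), ^{
        // update the UI back on the main thread
        if (imageData) {
            [[self image] setImage:[UIImage imageWithData:imageData]];
        }
    });
});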
I have used the following:
NSString *url = [NSString stringWithFormat:@"Your URL"];
// NSLog(@"URL=%@", url);
UIImage *myImage = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:url]]];
NSLog(@"%lu bytes of data", (unsigned long)[[NSData dataWithContentsOfURL:[NSURL URLWithString:url]] length]); // note: this downloads the data a second time just to log its size
if (myImage)
{
    // THIS CODE WILL STORE THE IMAGE IN THE DOCUMENTS DIRECTORY
    NSString *jpegFilePath = [NSString stringWithFormat:@"%@/%@.jpg", [self pathForDocumentDirectory], [self.idOfImagesToDownload objectAtIndex:i]];
    NSData *data1 = [NSData dataWithData:UIImageJPEGRepresentation(myImage, 1.0f)]; // 1.0f = 100% quality
    [data1 writeToFile:jpegFilePath atomically:YES];
}
NOTE: [self pathForDocumentDirectory] is a method returning the path of the documents directory.
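That helper isn't shown in the answer; a minimal sketch of what such a method might look like:
- (NSString *)pathForDocumentDirectory
{
    // first (and only) entry is the app's Documents directory
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    return [paths objectAtIndex:0];
}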