Getting frames through a stream and displaying them on screen - objective-c

I have a requirement of streaming from a server and displaying the streamed content on the screen. Streaming works fine using NSStream (NSInputStream and NSOutputStream). How can I display it on the screen?
The stream URL looks like @"http://191.168.143.41:1212/".
if (stream == inputStream) {
    uint8_t buf[1024];
    unsigned int len = 0;
    len = [inputStream read:buf maxLength:1024];
    if (len > 0) {
        NSMutableData *datas = [[NSMutableData alloc] initWithLength:0];
        [datas appendBytes:(const void *)buf length:len];
        NSString *s = [[NSString alloc] initWithData:datas encoding:NSASCIIStringEncoding];
        [self readIn:s];
        NSLog(@"ss%@", s);
        [self loadMovie:s]; // method for movie player
    }
}
I tried to display this in a movie player as below:
- (void)loadMovie:(NSString *)moviePrefix
{
    NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix];
    NSURL *url = [NSURL fileURLWithPath:path];
    if (url) {
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = CGRectMake(0, 70, 600, 450);
        _moviePlayer.controlStyle = MPMovieControlStyleNone;
        _moviePlayer.scalingMode = MPMovieScalingModeNone;
        [dic setObject:_moviePlayer forKey:path];
    }
    [_moviePlayer prepareToPlay];
    [self.view addSubview:_moviePlayer.view];
    [self.view bringSubviewToFront:_moviePlayer.view];
    [_moviePlayer play];
}
Is NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix]; the correct way?
This displays a black screen. What is wrong?
If this approach is not correct, is there any other way I can display those frames?
Can anyone help me solve this?

MJPEG is just JPEG images sent one after the other.
I worked on this a few years ago.
On one iOS version (iOS 5?) the stream could easily be rendered with a UIWebView, but a later iOS update broke that, and with it all of my work at the time.
Maybe a UIWebView could do the trick again today.
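If you want to try the UIWebView route first, a minimal (untested) sketch would just point it at the stream URL from the question:
NSURL *streamURL = [NSURL URLWithString:@"http://191.168.143.41:1212/"];
UIWebView *webView = [[UIWebView alloc] initWithFrame:self.view.bounds];
[webView loadRequest:[NSURLRequest requestWithURL:streamURL]];
[self.view addSubview:webView];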
Anyway, since it's just a bunch of JPEGs, you can detect the start and end of each JPEG in the incoming data, build a UIImage from that range, and show it in a UIImageView.
A workaround (not tested), but it should give you the whole idea:
// Properties
@property (nonatomic, strong) NSMutableData *data;
@property (nonatomic, weak) IBOutlet UIImageView *streamImageView;
// Initialize somewhere
_data = [[NSMutableData alloc] init];
// In the stream delegate method:
// Start of JPEG: FFD8, end of JPEG: FFD9
UInt8 startJPEGBytes[2];
startJPEGBytes[0] = 0xFF;
startJPEGBytes[1] = 0xD8;
NSData *startData = [NSData dataWithBytes:&startJPEGBytes length:2];
UInt8 endJPEGBytes[2];
endJPEGBytes[0] = 0xFF;
endJPEGBytes[1] = 0xD9;
NSData *endData = [NSData dataWithBytes:&endJPEGBytes length:2];
[_data appendBytes:(const void *)buf length:len];
NSRange startRange = [_data rangeOfData:startData options:0 range:NSMakeRange(0, [_data length])];
if (startRange.location != NSNotFound) // we found the start of a JPEG
{
    NSRange endRange = [_data rangeOfData:endData options:0 range:NSMakeRange(startRange.location, [_data length] - startRange.location)];
    if (endRange.location != NSNotFound) // we found the end of a JPEG
    {
        NSRange imageRange = NSMakeRange(startRange.location, endRange.location + endRange.length - startRange.location);
        NSData *imageData = [_data subdataWithRange:imageRange];
        UIImage *streamImage = [UIImage imageWithData:imageData];
        [_streamImageView setImage:streamImage];
        // Remove everything up to the end of this JPEG frame. Start at 0, since there could be garbage at the start.
        [_data replaceBytesInRange:NSMakeRange(0, imageRange.location + imageRange.length) withBytes:NULL length:0];
    }
}
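For context, this is roughly where that code would run in the NSStream delegate callback (untested sketch, assuming inputStream and _data are set up as in the question and above):
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    if (aStream == inputStream && eventCode == NSStreamEventHasBytesAvailable) {
        uint8_t buf[1024];
        NSInteger len = [inputStream read:buf maxLength:sizeof(buf)];
        if (len > 0) {
            // Append buf to _data and run the FFD8/FFD9 search shown above here.
        }
    }
}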

You are not adding moviePrefix to the string
NSString *path = [NSString stringWithFormat:@".mjpg", moviePrefix, @"movie"];
Change it to
NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix, @"movie"];

https://github.com/horsson/mjpeg-iphone/tree/55251a85e2c2489014036ddf5a491783f9b1962d
Used this to get the stream and display it. It works.

Related

AVAudioPCMBuffer for music files

I've been trying to play music in my SpriteKit game and used the AVAudioPlayerNode class to do so via AVAudioPCMBuffers. Every time I exported my OS X project, it would crash and give me an error regarding audio playback. After banging my head against the wall for the last 24 hours I decided to re-watch WWDC session 501 (see 54:17). My solution was the one the presenter used: break the audio file being read into smaller, buffer-sized pieces.
NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
    AVAudioFramePosition readPosition = audioFile.framePosition;
    if (![audioFile readIntoBuffer: readBuffer error: &error])
        return NO;
    if (readBuffer.frameLength == 0) // end of file reached
        break;
}
My current problem is that the player only plays the last frame read into the buffer. The music that I'm playing is only 2 minutes long. Apparently, this is too long to just read into the buffer outright. Is the buffer being overwritten every time the readIntoBuffer: method is called inside the loop? I'm such a noob at this stuff...how can I get the entire file played?
If I can't get this to work, what is a good way to play music (2 different files) across multiple SKScenes?
This is the solution that I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament that I've found myself in. I created a singleton class to handle this job. One improvement that can be made in the future is to only load sound effects and music files needed for a particular SKScene at the time they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it's not using an excessive amount of memory.
Overview
My strategy was the following:
Store the audio file names for the game in a plist
Read from that plist and create two dictionaries (one for music and one for short sound effects)
The sound effect dictionary is composed of an AVAudioPCMBuffer and an AVAudioPlayerNode for each of the sounds
The music dictionary is composed of an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode and the sample rate of the original audio file
The sample rate is necessary for figuring out the time at which each buffer should be played (you'll see the calculations done in code)
Create an AVAudioEngine and get the main mixer from the engine and attach all AVAudioPlayerNodes to the mixer (as per usual)
Play sound effects or music using their various methods
Sound effect playing is straightforward: call the method -(void) playSfxFile:(NSString*)file; and it plays a sound
For music, I just couldn't find a good solution without invoking the help of the scene trying to play the music. The scene calls -(void) playMusicFile:(NSString*)file; and it schedules the buffers to play in the order they were created. I couldn't find a good way to get the music to repeat once completed within my AudioEngine class, so I decided to have the scene check in its update: method whether or not the music was playing for a particular file and, if not, play it again (not a very slick solution, but it works)
AudioEngine.h
#import <Foundation/Foundation.h>
@interface AudioEngine : NSObject
+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;
@end
AudioEngine.m
#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)
@interface AudioEngine()
@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;
@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;
@property NSString *audioInfoPList;
@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;
@end
@implementation AudioEngine
int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";
static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";
+(instancetype) sharedData {
static AudioEngine *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
sharedInstance = [[self alloc] init];
[sharedInstance startEngine];
});
return sharedInstance;
}
-(instancetype) init {
if (self = [super init]) {
_engine = [[AVAudioEngine alloc] init];
_mixer = [_engine mainMixerNode];
_audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist
[self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
[self initMusic];
[self initSfx];
}
return self;
}
//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
_musicDict = [NSMutableDictionary dictionary];
_audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *musicFileName in audioInfoData[@"music"]) {
[self loadMusicIntoBuffer:musicFileName];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
[_engine connect:player to:_mixer format:buffer.format];
[_musicDict[musicFileName] setObject:player forKey:@"player"];
}
}
//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
//NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
[_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];
NSMutableArray *buffers = [NSMutableArray array];
NSMutableArray *framePositions = [NSMutableArray array];
const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
[framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
if (![audioFile readIntoBuffer:readBuffer error:&error]) {
NSLog(@"failed to read audio file: %@", error);
return;
}
if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
break;
}
[buffers addObject:readBuffer];
}
[_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
[_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}
-(void) initSfx {
_sfxDict = [NSMutableDictionary dictionary];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
[self loadSoundIntoBuffer:sfxFileName];
AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
[_engine connect:player to:_mixer format:buffer.format];
[_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
}
}
//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
[audioFile readIntoBuffer:readBuffer error:&error];
[_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}
-(void)startEngine {
[_engine startAndReturnError:nil];
}
-(void) playSfxFile:(NSString*)file {
AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:@"player"];
AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
[player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
[player setVolume:1.0];
[player setVolume:_sfxVolumePercent];
[player play];
}
-(void) playMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying] == NO) {
NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];
double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];
for (int i = 0; i < [buffers count]; i++) {
long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];
AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i];
[player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
if (i == [buffers count] - 1) {
[player stop];
}
}];
[player setVolume:_musicVolumePercent];
[player play];
}
}
}
-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
if ([file isEqualToString:@"menuscenemusic"]) {
AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
else {
AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
}
//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
_timerCount = FADE_ITERATIONS;
_fadeVolume = _musicVolumePercent;
[self fadeOutMusicForPlayer:player]; //fade out the music
}
}
//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
[NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}
//helper method for stopMusicFile:
-(void) handleTimer:(NSTimer*)timer {
AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
if (_timerCount > 0) {
_timerCount--;
_fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
[player setVolume:_fadeVolume];
}
else {
[player stop];
[player setVolume:_musicVolumePercent];
[timer invalidate];
}
}
-(void) pauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
[player pause];
}
}
-(void) unpauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player play];
}
//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
_musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
_sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
//immediately sets music to new volume
for (NSString *file in [_musicDict allKeys]) {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player setVolume:_musicVolumePercent];
}
}
-(bool) isPlayingMusic:(NSString *)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying])
return YES;
return NO;
}
@end
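A quick usage sketch from an SKScene that imports AudioEngine.h (the file names are placeholders for entries in AudioInfo.plist), including the restart-from-update: workaround described in the overview:
// In an SKScene subclass.
- (void)didMoveToView:(SKView *)view {
    [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"]; // placeholder name
}
- (void)update:(NSTimeInterval)currentTime {
    // Restart the music when it has finished playing, as described above.
    if (![[AudioEngine sharedData] isPlayingMusic:@"menuscenemusic"]) {
        [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
    }
}
- (void)playClickSound {
    [[AudioEngine sharedData] playSfxFile:@"buttonclick"]; // placeholder name
}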

Image not displayed on UICollectionViewCell

I am trying to get images for contacts. Here I used a UICollectionViewCell, but in the collection view I don't get the image for the contact, only the name and number. Here is my code:
- (IBAction)ContactDisplay:(id)sender {
_addressBookController = [[ABPeoplePickerNavigationController alloc] init];
[_addressBookController setPeoplePickerDelegate:self];
[self presentViewController:_addressBookController animated:YES completion:nil];
}
- (void)peoplePickerNavigationController:(ABPeoplePickerNavigationController*)peoplePicker didSelectPerson:(ABRecordRef)person
{
[self displayPerson:person];
}
- (void)displayPerson:(ABRecordRef)person
{
NSString* name = (__bridge_transfer NSString*)ABRecordCopyValue(person,
kABPersonFirstNameProperty);
NSLog(@"%@",name);
NSString* phone = nil;
ABMultiValueRef phoneNumbers = ABRecordCopyValue(person,
kABPersonPhoneProperty);
if (ABMultiValueGetCount(phoneNumbers) > 0) {
phone = (__bridge_transfer NSString*)
ABMultiValueCopyValueAtIndex(phoneNumbers, 0);
} else {
phone = @"[None]";
}
NSLog(@"%@",phone);
UIImage *img ;
if (person != nil && ABPersonHasImageData(person)) {
if ((&ABPersonCopyImageDataWithFormat) != nil ) {
img= [UIImage imageWithData:(__bridge NSData *)ABPersonCopyImageDataWithFormat(person, kABPersonImageFormatThumbnail)];
}
} else {
NSString *imageUrlString = @"http://www.google.co.in/intl/en_com/images/srpr/logo1w.png";
NSURL *url = [NSURL URLWithString:imageUrlString];
NSData *data = [[NSData alloc] initWithContentsOfURL:url];
img= [UIImage imageWithData:data];
}
NSString *string;
string = [NSString stringWithFormat:@"%@", img];
NSLog(@"%@", img);
self.name.text=name;
self.number.text=phone;
[self.nameArray addObject:name];
[self.imageArray addObject:string];
NSLog(@"%@",self.nameArray);
NSLog(@"%@",self.imageArray);
[self.collectionView reloadData];
[self.collectionView performBatchUpdates:^{
[self.collectionView reloadSections:[NSIndexSet indexSetWithIndex:0]];
} completion:nil];
}
Finally, the image array I get looks like this:
(
"add-button.png",
"<UIImage: 0x17e56c80>, {148, 148}"
)
When every entry in the image array is a .png file name like the first one, it displays fine, so how can I modify this?
Can you please suggest how I can solve this? Thank you.
I don't fully agree with everything you're doing there but I think you're getting your data wrong. Try using this instead when you're fetching the ABPerson image data.
if (person != nil) {
    CFDataRef imageData = ABPersonCopyImageData(person);
    NSData *data = CFBridgingRelease(imageData);
    if (data != nil && data.length > 10) { // arbitrary length to make sure our data object isn't really empty
        img = [UIImage imageWithData:data];
    } else {
        NSString *imageUrlString = @"http://www.google.co.in/intl/en_com/images/srpr/logo1w.png";
        NSURL *url = [NSURL URLWithString:imageUrlString];
        NSData *fallbackData = [[NSData alloc] initWithContentsOfURL:url];
        img = [UIImage imageWithData:fallbackData];
    }
}
Then don't store your images as strings in your array. Store them either as NSData or UIImage, but NOT STRINGS.
So
[myArray addObject:img]; //not the string.
And when you fetch it later, make sure you treat it as an image and not as a string.
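For example, if imageArray holds UIImage objects, the cell configuration stays simple. This is only a sketch; MyContactCell, its outlets, and the @"ContactCell" identifier are placeholders for whatever your collection view actually uses:
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    MyContactCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"ContactCell" forIndexPath:indexPath];
    cell.nameLabel.text = self.nameArray[indexPath.item];
    cell.imageView.image = self.imageArray[indexPath.item]; // a UIImage, not a string
    return cell;
}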
On your storyboard, select the image view and look at the properties panel.
There are "Installed" options at the bottom; check the topmost "Installed" box.
I think there might be an issue with the conversion of the image to a string:
NSString *string;
string = [NSString stringWithFormat:@"%@", img];
Add the image to the image array without converting it to a string:
[self.imageArray addObject:img];
I do it like this in my app. Assuming 'person' is an ABRecordRef.
NSMutableDictionary *contactInfoDict = [[NSMutableDictionary alloc]
initWithObjects:@[@"", @"", @"", @""]
forKeys:@[@"firstName", @"lastName", @"birthday", @"picture"]];
CFTypeRef generalCFObject;
// First name
generalCFObject = ABRecordCopyValue(person, kABPersonFirstNameProperty);
if (generalCFObject) {
    [contactInfoDict setObject:(__bridge NSString *)generalCFObject forKey:@"firstName"];
    CFRelease(generalCFObject);
}
// Last name
generalCFObject = ABRecordCopyValue(person, kABPersonLastNameProperty);
if (generalCFObject) {
    [contactInfoDict setObject:(__bridge NSString *)generalCFObject forKey:@"lastName"];
    CFRelease(generalCFObject);
}
// Birthday
generalCFObject = ABRecordCopyValue(person, kABPersonBirthdayProperty);
if (generalCFObject) {
    [contactInfoDict setObject:(__bridge NSString *)generalCFObject forKey:@"birthday"];
    NSLog(@"Date : %@", [contactInfoDict objectForKey:@"birthday"]);
    CFRelease(generalCFObject);
}
// User image
CFDataRef photo = ABPersonCopyImageData(person);
if (photo) {
    UIImage *image = [UIImage imageWithData:(__bridge NSData*)photo];
    [contactInfoDict setObject:image forKey:@"picture"];
    CFRelease(photo); // release only after the data has been used
}

custom Annotations being switched when reloaded on MKMapView

I've been having this issue for a couple of weeks now, and I still have not found an answer. On my map view I have custom annotations, and when I hit the reload button all the information in the annotation (title, subtitle) is correct, but the annotation image has changed. The annotations are in an NSMutableArray and I'm sure the issue revolves around that. Here is the code I am using to reload the annotations.
Just to prevent any confusion: my custom annotations work just fine when I first load the map view. But once I hit the reload button, all the annotation's information like location, title and subtitle is still correct; it's just the actual annotation image that has changed, as if all the annotations have been switched around.
If anyone can help, it would be greatly appreciated! Thanks!
- (IBAction)refreshMap:(id)sender {
NSArray *annotationsOnMap = myMapView.annotations;
[myMapView removeAnnotations:annotationsOnMap];
[locations removeAllObjects];
[citiesArray removeAllObjects];
[self retrieveData];
}
-(void) retrieveData {
userLAT = [NSString stringWithFormat:@"%f", myMapView.userLocation.coordinate.latitude];
userLNG = [NSString stringWithFormat:@"%f", myMapView.userLocation.coordinate.longitude];
NSString *fullPath = [mainUrl stringByAppendingFormat:@"map_json.php?userID=%@&lat=%@&lng=%@",theUserID,userLAT,userLNG];
NSURL * url =[NSURL URLWithString:fullPath];
NSData *data = [NSData dataWithContentsOfURL:url];
json =[NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:nil];
citiesArray =[[NSMutableArray alloc]init];
for (int i = 0; i < json.count; i++)
{
//create city object
NSString * eID =[[json objectAtIndex:i] objectForKey:@"userid"];
NSString * eAddress =[[json objectAtIndex:i] objectForKey:@"full_address"];
NSString * eHost =[[json objectAtIndex:i] objectForKey:@"username"];
NSString * eLat =[[json objectAtIndex:i] objectForKey:@"lat"];
NSString * eLong =[[json objectAtIndex:i] objectForKey:@"lng"];
NSString * eName =[[json objectAtIndex:i] objectForKey:@"Restaurant_name"];
NSString * eState = [[json objectAtIndex:i] objectForKey:@"type"];
NSString * annotationPic = [[json objectAtIndex:i] objectForKey:@"Annotation"];
NSString * eventID = [[json objectAtIndex:i] objectForKey:@"id"];
//convert lat and long from strings
float floatLat = [eLat floatValue];
float floatLONG = [eLong floatValue];
City * myCity =[[City alloc] initWithRestaurantID: (NSString *) eID andRestaurantName: (NSString *) eName andRestaurantState: (NSString *) eState andRestaurantAddress: (NSString *) eAddress andRestaurantHost: eHost andRestaurantLat: (NSString *) eLat andRestaurantLong: (NSString *) eLong];
//Add our city object to our cities array
// Do any additional setup after loading the view.
[citiesArray addObject:myCity];
//Annotation
locations =[[NSMutableArray alloc]init];
CLLocationCoordinate2D location;
Annotation * myAnn;
//event1 annotation
myAnn =[[Annotation alloc]init];
location.latitude = floatLat;
location.longitude = floatLONG;
myAnn.coordinate = location;
myAnn.title = eName;
myAnn.subtitle = eHost;
myAnn.type = eState;
myAnn.AnnotationPicture = annotationPic;
myAnn.passEventID = eventID;
myAnn.hotZoneLevel = hotZone;
[locations addObject:myAnn];
[self.myMapView addAnnotations:locations];
}
}
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>)annotation
{
if([annotation isKindOfClass:[MKUserLocation class]])
return nil;
static NSString *annotationIdentifier = @"AnnotationIdentifier";
MKAnnotationView *annotationView = (MKAnnotationView *) [self.myMapView
dequeueReusableAnnotationViewWithIdentifier:annotationIdentifier];
if (!annotationView)
{
annotationView = [[MKAnnotationView alloc]
initWithAnnotation:annotation
reuseIdentifier:annotationIdentifier];
NSString *restaurant_Icon = ((Annotation *)annotation).AnnotationPicture;
NSString *restaurant_Callout = [NSString stringWithFormat:@"mini.%@",restaurant_Icon];
UIImage *oldImage = [UIImage imageNamed:restaurant_Icon];
UIImage *newImage;
CGSize newSize = CGSizeMake(75, 75);
newImage = [oldImage imageScaledToFitSize:newSize]; // uses MGImageResizeScale
annotationView.image= newImage;
annotationView.canShowCallout = YES;
UIImage *Mini_oldImage = [UIImage imageNamed:restaurant_Callout];
UIImage *Mini_newImage;
CGSize Mini_newSize = CGSizeMake(30,30);
Mini_newImage = [Mini_oldImage imageScaledToFitSize:Mini_newSize]; // uses MGImageResizeScale
UIImageView *finalMini_callOut = [[UIImageView alloc] initWithImage:Mini_newImage];
annotationView.leftCalloutAccessoryView = finalMini_callOut;
UIButton* rightButton = [UIButton buttonWithType:UIButtonTypeDetailDisclosure];
annotationView.rightCalloutAccessoryView = rightButton;
}
else
{
annotationView.annotation = annotation;
}
return annotationView;
}
If nothing else, you're setting the icon and the callout based upon the annotation, but only doing that in viewForAnnotation: when the annotation view was not dequeued. You really want to do any annotation-specific customization outside of that if block, in case an annotation view is reused (see the sketch below).
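In practice that means restructuring viewForAnnotation: along these lines (a sketch that reuses your MGImageResizeScale category and Annotation class):
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation
{
    if ([annotation isKindOfClass:[MKUserLocation class]]) return nil;
    static NSString *annotationIdentifier = @"AnnotationIdentifier";
    MKAnnotationView *annotationView = [mapView dequeueReusableAnnotationViewWithIdentifier:annotationIdentifier];
    if (!annotationView) {
        annotationView = [[MKAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:annotationIdentifier];
        annotationView.canShowCallout = YES;
        annotationView.rightCalloutAccessoryView = [UIButton buttonWithType:UIButtonTypeDetailDisclosure];
    } else {
        annotationView.annotation = annotation;
    }
    // Per-annotation customization happens every time, whether or not the view was reused.
    NSString *restaurant_Icon = ((Annotation *)annotation).AnnotationPicture;
    annotationView.image = [[UIImage imageNamed:restaurant_Icon] imageScaledToFitSize:CGSizeMake(75, 75)];
    UIImage *miniImage = [[UIImage imageNamed:[NSString stringWithFormat:@"mini.%@", restaurant_Icon]] imageScaledToFitSize:CGSizeMake(30, 30)];
    annotationView.leftCalloutAccessoryView = [[UIImageView alloc] initWithImage:miniImage];
    return annotationView;
}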
Unrelated to your reported issue, there are a few other observations:
You probably should be doing retrieveData asynchronously so you don't tie up the main thread with your data retrieval/parsing. Go ahead and dispatch the adding of the entry to your array and the adding of the annotation to the map onto the main queue, but the network and parsing work should definitely be done asynchronously (see the sketch after these notes).
You probably should check to make sure data is not nil (e.g. no network connection or some other network error) because JSONObjectWithData will crash if you pass it a nil value.
Your use of locations seems unnecessary because you're resetting it for every entry in your JSON. You could either (a) retire locations entirely and just add the myAnn object to your map's annotations; or (b) initialize locations before the for loop. But it's probably misleading to maintain this ivar, but only populate it with the last annotation.
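As a rough sketch of those first two points (the method name is illustrative, and fullPath would be built the same way as in retrieveData):
- (void)retrieveDataAsyncWithURLString:(NSString *)fullPath {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:fullPath]];
        if (!data) {
            NSLog(@"Map download failed"); // never hand nil to NSJSONSerialization
            return;
        }
        NSError *jsonError = nil;
        NSArray *parsedJSON = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&jsonError];
        if (!parsedJSON) {
            NSLog(@"JSON parse failed: %@", jsonError);
            return;
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            // Build the City objects and annotations from parsedJSON here,
            // then add them to the map on the main queue.
        });
    });
}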

How Can I Save This Array of Images?

I'm very new to programming, and I jumped right into a project (I know that's not the smartest thing to do, but I'm learning as I go). The app I'm writing has 10 UIImageViews that display a picture from the user's camera roll. The code I'm using needs each of the UIImageViews to have tags. I'm currently using NSData to save the array of images, and it works great, but I can't use this method anymore because NSData doesn't support the use of tags. I also can't use NSUserDefaults, because I can't save images to a plist. Here is how I'm attempting to do this (using the NSData method, which works, but I have to edit it so that my tags work).
This is my current code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img editingInfo:(NSDictionary *)editInfo {
if (imageView.image == nil) {
imageView.image = img;
[self.array addObject:imageView.image];
[picker dismissModalViewControllerAnimated:YES];
[self.popover dismissPopoverAnimated:YES];
return;
}
if (imageView2.image == nil) {
imageView2.image = img;
NSLog(@"The image is a %@", imageView);
[self.array addObject:imageView2.image];
[picker dismissModalViewControllerAnimated:YES];
[self.popover dismissPopoverAnimated:YES];
return;
}
...
- (void)applicationDidEnterBackground:(UIApplication*)application {
NSLog(@"Image on didenterbackground: %@", imageView);
[self.array addObject:imageView.image];
[self.array addObject:imageView2.image];
[self.user setObject:self.array forKey:@"images"];
[user synchronize];
}
- (void)viewDidLoad
{
self.user = [NSUserDefaults standardUserDefaults];
NSLog(@"It is %@", self.user);
self.array = [[self.user objectForKey:@"images"] mutableCopy];
imageView.image = [[self.array objectAtIndex:0] copy];
imageView2.image = [[self.array objectAtIndex:1] copy];
UIApplication *app = [UIApplication sharedApplication];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(applicationDidEnterBackground:)
name:UIApplicationDidEnterBackgroundNotification
object:app];
[super viewDidLoad];
}
Any help or suggestions on how to edit this code so that I can save the images, while using tags is much appreciated, thanks!
EDIT: Here is my updated code:
-(IBAction)saveButtonPressed:(id)sender {
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask,YES) objectAtIndex:0];
for (UIImageView *imageView in self.array) {
NSInteger tag = self.imageView.tag;
UIImage *image = self.imageView.image;
NSString *imageName = [NSString stringWithFormat:@"Image%i.png",tag];
NSString *imagePath = [docsDir stringByAppendingPathComponent:imageName];
[UIImagePNGRepresentation(image) writeToFile:imagePath atomically:YES];
}
NSLog(@"Saved Button Pressed");
}
- (void)applicationDidEnterBackground:(UIApplication*)application {
}
-(void)viewDidLoad {
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask,YES) objectAtIndex:0];
NSArray *docFiles = [[NSFileManager defaultManager]contentsOfDirectoryAtPath:docsDir error:NULL];
for (NSString *fileName in docFiles) {
if ([fileName hasSuffix:@".png"]) {
NSString *fullPath = [docsDir stringByAppendingPathComponent:fileName];
UIImage *loadedImage = [UIImage imageWithContentsOfFile:fullPath];
if (!imageView.image) {
imageView.image = loadedImage;
} else {
imageView2.image = loadedImage;
}
}
}
}
You need to use "Fast Enumeration" to parse the array's objects, and write each object to disk sequentially. First, you're going to need to add the UIImageView objects to the array instead of the UIImage property of the UIImageView, so you can recover the tag. So instead of writing
[self.array addObject:imageView.image];
It will be
[self.array addObject:imageView];
Try to follow along with my code. I inserted comments on each line to help.
-(void)applicationDidEnterBackground:(UIApplication *)application {
//Obtain the documents directory
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask,YES) objectAtIndex:0];
//begin fast enumeration
//this is special to ObjC: it will iterate over any array one object at a time
//it's easier than using for (i=0;i<array.count;i++)
for (UIImageView *imageView in self.array) {
//get the imageView's tag to append to the filename
NSInteger tag = imageView.tag;
//get the image from the imageView;
UIImage *image = imageView.image;
//create a filename, in this case "ImageTAGNUM.png"
NSString *imageName = [NSString stringWithFormat:@"Image%i.png",tag];
//concatenate the docsDirectory and the filename
NSString *imagePath = [docsDir stringByAppendingPathComponent:imageName];
[UIImagePNGRepresentation(image) writeToFile:imagePath atomically:YES];
}
}
To load the images from disk, you'll have to look at your viewDidLoad method
-(void)viewDidLoad {
//get the contents of the docs directory
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask,YES) objectAtIndex:0];
//Get the list of files from the file manager
NSArray *docFiles = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:docsDir error:NULL];
//use fast enumeration to iterate the list of files searching for .png extensions and load those
for (NSString *fileName in docFiles) {
//check to see if the file is a .png file
if ([fileName hasSuffix:@".png"]) {
NSString *fullPath = [docsDir stringByAppendingPathComponent:fileName];
UIImage *loadedImage = [UIImage imageWithContentsOfFile:fullPath];
//you'll have to sort out how to put these images in their proper place
if (!imageView1.image) {
imageView1.image = loadedImage;
} else {
imageView2.image = loadedImage;
}
}
}
}
Hope this helps
One thing you need to be aware of is that when an app enters the background it has about 5 seconds to clean up its act before it's suspended. The UIImagePNGRepresentation() function takes a significant amount of time and is not instantaneous. You should be aware of this. It would probably be better to write some of this code in other places and do it earlier than at app backgrounding. FWIW
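If you do keep the save at backgrounding time, you can at least ask the system for extra time with a background task. A rough sketch (the selector name is illustrative, and saveAllImages stands in for the PNG-writing loop shown above):
- (void)handleEnterBackground:(NSNotification *)note {
    UIApplication *app = [UIApplication sharedApplication];
    __block UIBackgroundTaskIdentifier taskID = [app beginBackgroundTaskWithExpirationHandler:^{
        [app endBackgroundTask:taskID];
        taskID = UIBackgroundTaskInvalid;
    }];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self saveAllImages]; // the UIImagePNGRepresentation loop from above
        [app endBackgroundTask:taskID];
        taskID = UIBackgroundTaskInvalid;
    });
}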
You can store the images in the app's Documents directory (the main bundle is read-only at runtime).
To get the path:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
First, there's still a problem in your for loop.
for (UIImageView *imageView in self.array) {
NSInteger tag = self.imageView.tag;
UIImage *image = self.imageView.image;
// ...
}
Before you make any other changes, you must understand why. imageView is your for loop control variable, which changes on each iteration through the loop. self.imageView is a different thing. It is the first of the 10 imageViews attached to your viewController. Every time this loop cycles, it looks at the first imageView, and only the first.
As for why saving doesn't work, it's probably because the arrays elsewhere aren't working. Add some logging to make sure there's something in the array, and that it has as many elements as you expect.
-(IBAction)saveButtonPressed:(id)sender {
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask,YES) objectAtIndex:0];
// Log to make sure the views expected have previously been stored.
// If the array is empty, or shorter than expected, the problem is elsewhere.
NSLog(@"Image view array before saving = %@", self.array);
for (UIImageView *imageViewToSave in self.array) {
NSInteger tag = imageViewToSave.tag;
UIImage *image = imageViewToSave.image;
NSString *imageName = [NSString stringWithFormat:@"Image%i.png",tag];
NSString *imagePath = [docsDir stringByAppendingPathComponent:imageName];
// Log the image and path being saved. If either of these are nil, nothing will be written.
NSLog(@"Saving %@ to %@", image, imagePath);
[UIImagePNGRepresentation(image) writeToFile:imagePath atomically:NO];
}
NSLog(@"Save Button Pressed");
}

Resizing image and saving it to the specified directory path in Cocoa

Using this code I am trying to resize the selected image and then save it to a specific path:
-(void)processImage:(NSString*)inputPath:(int)imageWidth:(int)imageHeight:(NSString*)outputPath {
NSImage * img = [NSImage imageNamed:inputPath];
[img setSize: NSMakeSize(imageWidth,imageHeight)];
}
-(void)startProcessingImages {
int i; // Loop counter.
// Loop through all the files and process them.
for( i = 0; i < [files count]; i++ )
{
inputFilePath = [[files objectAtIndex:i] retain];
NSLog(@"filename::: %@", inputFilePath);
// Do something with the filename.
[selectedFile setStringValue:inputFilePath];
NSLog(@"selectedFile:::: %@", selectedFile);
}
NSLog(@"curdir:::::%@", inputFilePath);
NSString *aString = [[NSString stringWithFormat:@"%@%@%@", thumbnailDirPath, @"/", fileNameNumber] retain];
fileNameJPG = [[aString stringByAppendingString:@".jpg"] retain];
[self processImage:inputFilePath: 66 :55 :thumbnailDirPath];
[self processImage:inputFilePath: 800 :600 :thumbnailDirPath];
[self processImage:inputFilePath: 320 :240 :thumbnailDirPath];
}
My issue is that I don't know how to save it to thumbnailDirPath.
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.8] forKey:NSImageCompressionFactor];
NSData *tiffData = [img TIFFRepresentation];
NSData *JPEGData = [[NSBitmapImageRep imageRepWithData:tiffData] representationUsingType:NSJPEGFileType properties:options];
NSError *anError;
if (![JPEGData writeToFile:outputPath options:0 error:&anError])
    NSLog(@"Error saving image: %@ to: %@", anError, outputPath);
Check the documentation for NSJPEGFileType as it will show you the other format options for saving, such as PNG.
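For instance, producing PNG output instead only changes the representation call (a sketch using the same img and outputPath as above):
NSData *tiffData = [img TIFFRepresentation];
NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:tiffData];
NSData *PNGData = [rep representationUsingType:NSPNGFileType properties:[NSDictionary dictionary]];
[PNGData writeToFile:[outputPath stringByAppendingPathExtension:@"png"] atomically:YES];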
You should export your image to a file.
Currently I only see how to store the TIFF image:
[[img TIFFRepresentation] writeToFile:outputPathName atomically:NO];
Where outputPathName is the path (including the file name) for your thumbnail file.