When changing the image of a contact - objective-c

I'm trying to change the image of a selected contact in peoplePickerNavigationController, but when I choose a contact, the image does not change and no error is shown.
As far as I can tell, everything looks right. Is something missing?
This is the code:
ABAddressBookRef aBook = ABAddressBookCreate();
UIImage *img = [UIImage imageNamed:@"90x90.png"];
NSData *dataRef = UIImagePNGRepresentation(img);
CFDataRef cfdata = CFDataCreate(NULL, [dataRef bytes], [dataRef length]);
CFErrorRef error;
if (ABPersonRemoveImageData(person, &error)) {
    NSLog(@"OK");
}
if (ABAddressBookSave(aBook, &error)) {
    NSLog(@"OK");
}
if (ABPersonSetImageData(person, cfdata, &error)) {
    NSLog(@"OK");
}
if (ABAddressBookSave(aBook, &error)) {
    NSLog(@"OK");
}
CFRelease(cfdata);
[self dismissModalViewControllerAnimated:YES];
return NO;
Note that the above code runs inside this delegate method:
- (BOOL)peoplePickerNavigationController: (ABPeoplePickerNavigationController *)peoplePicker shouldContinueAfterSelectingPerson:(ABRecordRef)person { }
Thanks!

Managed to solve it. For anyone with the same question in the future: just use ABRecordGetRecordID and ABAddressBookGetPersonWithRecordID, as in the code below. The person reference handed to the delegate belongs to the picker's own address book, so you have to re-fetch the record in your own ABAddressBookRef before saving:
ABAddressBookRef aBook = ABAddressBookCreate();
NSNumber *recordId = [NSNumber numberWithInteger:ABRecordGetRecordID(person)];
ABRecordRef pID = ABAddressBookGetPersonWithRecordID(aBook, recordId.integerValue);
UIImage *img = [UIImage imageNamed:@"90x90.png"];
NSData *dataRef = UIImagePNGRepresentation(img);
CFDataRef cfdata = CFDataCreate(NULL, [dataRef bytes], [dataRef length]);
CFErrorRef error;
ABPersonRemoveImageData(pID, &error);
ABAddressBookSave(aBook, &error);
ABPersonSetImageData(pID, cfdata, &error);
ABAddressBookSave(aBook, &error);
CFRelease(cfdata);
[self dismissModalViewControllerAnimated:YES];
return NO;
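One caveat worth adding: since iOS 6 the address book is access-protected, and the save may fail unless the user has granted Contacts permission. A minimal sketch of wrapping the save in an access request (the setContactImage: helper is hypothetical, not part of the code above):
ABAddressBookRef aBook = ABAddressBookCreateWithOptions(NULL, NULL);
ABAddressBookRequestAccessWithCompletion(aBook, ^(bool granted, CFErrorRef accessError) {
    if (!granted) {
        NSLog(@"Contacts access denied; cannot change the image");
        return;
    }
    // Safe to call ABPersonSetImageData / ABAddressBookSave here,
    // e.g. [self setContactImage:person]; // hypothetical helper
});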
Thanks!

Related

Loading image from url ios 8 objective c

I'm trying to obtain the image from this URL, @"file:///var/mobile/Media/DCIM/100APPLE/IMG_0158.JPG", but I can't; it is always nil.
This is my code:
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"file:///var/mobile/Media/DCIM/100APPLE/IMG_0158.JPG"]];
UIImage *image = [UIImage imageWithData:data];
self.pruebaTmp.image = image;
I obtain the URL with this code:
if (asset) {
    // Get photo info from this asset
    PHImageRequestOptions *imageRequestOptions = [[PHImageRequestOptions alloc] init];
    imageRequestOptions.synchronous = YES;
    [[PHImageManager defaultManager]
        requestImageDataForAsset:asset
        options:imageRequestOptions
        resultHandler:^(NSData *imageData, NSString *dataUTI,
                        UIImageOrientation orientation,
                        NSDictionary *info)
    {
        NSURL *path = [info objectForKey:@"PHImageFileURLKey"];
        // Store the path of the image selected in the gallery
        self.pathImagen = path;
    }];
}
If someone could help I would be very grateful, because I can't load the image from the URL obtained.
You can't get a UIImage or its metadata from that URL.
You can get a UIImage from the asset's local identifier:
PHFetchResult *savedAssets = [PHAsset fetchAssetsWithLocalIdentifiers:@[localIdentifier] options:nil];
[savedAssets enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    // This gets called for every asset fetched from the localIdentifier you saved
    PHImageRequestOptions *imageRequestOptions = [[PHImageRequestOptions alloc] init];
    imageRequestOptions.synchronous = NO;
    imageRequestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
    imageRequestOptions.resizeMode = PHImageRequestOptionsResizeModeFast;
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:PHImageManagerMaximumSize
                                              contentMode:PHImageContentModeAspectFill
                                                  options:imageRequestOptions
                                            resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
        NSLog(@"get image from result");
        if (result) {
            // Use the UIImage here
        }
    }];
}];
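To tie this back to the question: instead of keeping the PHImageFileURLKey path, you could keep the asset's localIdentifier in the first result handler and re-fetch it later. A minimal sketch, reusing the question's self.pruebaTmp image view:
// In the first result handler, instead of reading PHImageFileURLKey:
NSString *localIdentifier = asset.localIdentifier;
// Later, re-fetch the asset and request a displayable UIImage:
PHFetchResult *assets = [PHAsset fetchAssetsWithLocalIdentifiers:@[localIdentifier] options:nil];
PHAsset *fetched = assets.firstObject;
if (fetched) {
    [[PHImageManager defaultManager] requestImageForAsset:fetched
                                               targetSize:PHImageManagerMaximumSize
                                              contentMode:PHImageContentModeAspectFit
                                                  options:nil
                                            resultHandler:^(UIImage *result, NSDictionary *info) {
        self.pruebaTmp.image = result;
    }];
}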

Image not displayed on UICollectionViewCell

I am trying to show images for contacts. I used a UICollectionViewCell, but in the collection view I don't get the image for the contact; I get only the name and number. Here is my code:
- (IBAction)ContactDisplay:(id)sender {
    _addressBookController = [[ABPeoplePickerNavigationController alloc] init];
    [_addressBookController setPeoplePickerDelegate:self];
    [self presentViewController:_addressBookController animated:YES completion:nil];
}

- (void)peoplePickerNavigationController:(ABPeoplePickerNavigationController *)peoplePicker didSelectPerson:(ABRecordRef)person
{
    [self displayPerson:person];
}

- (void)displayPerson:(ABRecordRef)person
{
    NSString *name = (__bridge_transfer NSString *)ABRecordCopyValue(person, kABPersonFirstNameProperty);
    NSLog(@"%@", name);
    NSString *phone = nil;
    ABMultiValueRef phoneNumbers = ABRecordCopyValue(person, kABPersonPhoneProperty);
    if (ABMultiValueGetCount(phoneNumbers) > 0) {
        phone = (__bridge_transfer NSString *)ABMultiValueCopyValueAtIndex(phoneNumbers, 0);
    } else {
        phone = @"[None]";
    }
    NSLog(@"%@", phone);
    UIImage *img;
    if (person != nil && ABPersonHasImageData(person)) {
        if ((&ABPersonCopyImageDataWithFormat) != nil) {
            img = [UIImage imageWithData:(__bridge NSData *)ABPersonCopyImageDataWithFormat(person, kABPersonImageFormatThumbnail)];
        }
    } else {
        NSString *imageUrlString = @"http://www.google.co.in/intl/en_com/images/srpr/logo1w.png";
        NSURL *url = [NSURL URLWithString:imageUrlString];
        NSData *data = [[NSData alloc] initWithContentsOfURL:url];
        img = [UIImage imageWithData:data];
    }
    NSString *string;
    string = [NSString stringWithFormat:@"%@", img];
    NSLog(@"%@", img);
    self.name.text = name;
    self.number.text = phone;
    [self.nameArray addObject:name];
    [self.imageArray addObject:string];
    NSLog(@"%@", self.nameArray);
    NSLog(@"%@", self.imageArray);
    [self.collectionView reloadData];
    [self.collectionView performBatchUpdates:^{
        [self.collectionView reloadSections:[NSIndexSet indexSetWithIndex:0]];
    } completion:nil];
}
Finally, the image array I get looks like this:
(
"add-button.png",
"<UIImage: 0x17e56c80>, {148, 148}"
)
Entries in the image array that are plain .png file names display fine, but the others don't; how can I modify this?
Can you please suggest how I can solve this? Thank you.
I don't fully agree with everything you're doing there but I think you're getting your data wrong. Try using this instead when you're fetching the ABPerson image data.
if (person != nil) {
    CFDataRef imageData = ABPersonCopyImageData(person);
    NSData *data = CFBridgingRelease(imageData);
    if (data != nil && data.length > 10) { // arbitrary length to make sure our data object isn't really empty
        img = [UIImage imageWithData:data];
    } else {
        NSString *imageUrlString = @"http://www.google.co.in/intl/en_com/images/srpr/logo1w.png";
        NSURL *url = [NSURL URLWithString:imageUrlString];
        NSData *data = [[NSData alloc] initWithContentsOfURL:url];
        img = [UIImage imageWithData:data];
    }
}
Then don't store your images as strings in your array. Store them either as NSData or UIImage, but NOT STRINGS.
So:
[myArray addObject:img]; // not the string
And when you fetch it later, make sure you treat it as an image and not as a string.
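For example, here is a minimal sketch of the cell-configuration side, assuming imageArray now holds UIImage objects and the cell class has imageView and nameLabel outlets (those names are mine, not from the question):
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    ContactCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"Cell" forIndexPath:indexPath];
    // Use the stored objects directly; no string round-trip
    cell.nameLabel.text = self.nameArray[indexPath.item];
    cell.imageView.image = self.imageArray[indexPath.item]; // a UIImage, not a string
    return cell;
}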
On your storyboard, select the image view and look at the properties panel.
There are "Installed" options at the bottom; check the topmost "Installed" box.
I think the issue is the conversion of the image to a string:
NSString *string;
string = [NSString stringWithFormat:@"%@", img];
Add the image to the image array without converting it to a string:
[self.imageArray addObject:img];
I do it like this in my app. Assuming 'person' is an ABRecordRef.
NSMutableDictionary *contactInfoDict = [[NSMutableDictionary alloc]
    initWithObjects:@[@"", @"", @"", @""]
    forKeys:@[@"firstName", @"lastName", @"birthday", @"picture"]];
CFTypeRef generalCFObject;
// First name
generalCFObject = ABRecordCopyValue(person, kABPersonFirstNameProperty);
if (generalCFObject) {
    [contactInfoDict setObject:(__bridge NSString *)generalCFObject forKey:@"firstName"];
    CFRelease(generalCFObject);
}
// Last name
generalCFObject = ABRecordCopyValue(person, kABPersonLastNameProperty);
if (generalCFObject) {
    [contactInfoDict setObject:(__bridge NSString *)generalCFObject forKey:@"lastName"];
    CFRelease(generalCFObject);
}
// Birthday (an NSDate, not a string)
generalCFObject = ABRecordCopyValue(person, kABPersonBirthdayProperty);
if (generalCFObject) {
    [contactInfoDict setObject:(__bridge NSDate *)generalCFObject forKey:@"birthday"];
    NSLog(@"Date : %@", [contactInfoDict objectForKey:@"birthday"]);
    CFRelease(generalCFObject);
}
// User image (release the CFDataRef only after the UIImage has been created)
CFDataRef photo = ABPersonCopyImageData(person);
if (photo) {
    UIImage *image = [UIImage imageWithData:(__bridge NSData *)photo];
    [contactInfoDict setObject:image forKey:@"picture"];
    CFRelease(photo);
}

Capturing iSight image using AVFoundation on Mac

I previously had this code to capture a single image from a Mac's iSight camera using QTKit:
- (NSError*)takePicture
{
    BOOL success;
    NSError* error;
    captureSession = [QTCaptureSession new];
    QTCaptureDevice* device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];
    success = [device open: &error];
    if (!success) { return error; }
    QTCaptureDeviceInput* captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice: device];
    success = [captureSession addInput: captureDeviceInput error: &error];
    if (!success) { return error; }
    QTCaptureDecompressedVideoOutput* captureVideoOutput = [QTCaptureDecompressedVideoOutput new];
    [captureVideoOutput setDelegate: self];
    success = [captureSession addOutput: captureVideoOutput error: &error];
    if (!success) { return error; }
    [captureSession startRunning];
    return nil;
}

- (void)captureOutput: (QTCaptureOutput*)captureOutput
  didOutputVideoFrame: (CVImageBufferRef)imageBuffer
     withSampleBuffer: (QTSampleBuffer*)sampleBuffer
       fromConnection: (QTCaptureConnection*)connection
{
    CVBufferRetain(imageBuffer);
    if (imageBuffer) {
        [captureSession removeOutput: captureOutput];
        [captureSession stopRunning];
        NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
        _result = [[NSImage alloc] initWithSize: [imageRep size]];
        [_result addRepresentation: imageRep];
        CVBufferRelease(imageBuffer);
        _done = YES;
    }
}
However, I found today that QTKit has been deprecated, so we now have to use AVFoundation.
Can anyone help me convert this code to its AVFoundation equivalent? Many methods seem to share names, but at the same time a lot is different, and I'm at a complete loss here. Any help?
Alright, I found the solution!! Here it is:
- (void)takePicture
{
    NSError* error;
    AVCaptureDevice* device = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: device error: &error];
    if (!input) {
        _error = error;
        _done = YES;
        return;
    }
    AVCaptureStillImageOutput* output = [AVCaptureStillImageOutput new];
    [output setOutputSettings: @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];
    captureSession = [AVCaptureSession new];
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    [captureSession addInput: input];
    [captureSession addOutput: output];
    [captureSession startRunning];
    AVCaptureConnection* connection = [output connectionWithMediaType: AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection: connection completionHandler: ^(CMSampleBufferRef sampleBuffer, NSError* error) {
        if (error) {
            _error = error;
            _result = nil;
        }
        else {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            if (imageBuffer) {
                CVBufferRetain(imageBuffer);
                NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
                _result = [[NSImage alloc] initWithSize: [imageRep size]];
                [_result addRepresentation: imageRep];
                CVBufferRelease(imageBuffer);
            }
        }
        _done = YES;
    }];
}
I hope this helps whoever has any problems in doing this same thing.
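In case it helps: the capture completes asynchronously, so a synchronous caller has to wait for the completion handler to fire. A minimal sketch of how the method above could be driven, assuming the _done, _result and _error ivars from the code:
[self takePicture];
// Spin the run loop until the completion handler sets _done
while (!_done) {
    [[NSRunLoop currentRunLoop] runMode: NSDefaultRunLoopMode
                             beforeDate: [NSDate dateWithTimeIntervalSinceNow: 0.1]];
}
if (_error) {
    NSLog(@"Capture failed: %@", _error);
}
// Otherwise _result now holds the captured NSImage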

GCDAsyncSocket not receiving all transmitted data, missing last "Chunk"

I am trying to send some strings and image data from a Python script to an Objective-C application running on OS X.
I am collecting the transmitted data using GCDAsyncSocket, appending it to an NSMutableData until the server disconnects, and then processing that NSData and splitting it into its original parts.
The transmitted data consists of the following:
ID string, filled out to 16 bytes.
Image number string, filled out to 16 bytes.
Raw image data.
Termination string, filled out to 16 bytes.
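So once the whole buffer has arrived, I expect it to split like this (a sketch based on the sizes above):
// imageBuffer = 16-byte ID + 16-byte image number + JPEG bytes + 16-byte terminator
NSData *idData     = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
NSData *numberData = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
NSData *jpegData   = [imageBuffer subdataWithRange:NSMakeRange(32, [imageBuffer length] - 48)];
NSData *endData    = [imageBuffer subdataWithRange:NSMakeRange([imageBuffer length] - 16, 16)];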
The problem is that I am not receiving the last chunk of data; I end up missing the end of the JPEG image, resulting in a corrupt (though mostly displayed) image and a missing termination string.
Here is the code I am using with GCDAsyncSocket to get the data and process it:
Socket connection:
- (void)socket:(GCDAsyncSocket *)sock didAcceptNewSocket:(GCDAsyncSocket *)newSocket
{
    // This method is executed on the socketQueue (not the main thread)
    @synchronized(connectedSockets)
    {
        [connectedSockets addObject:newSocket];
    }
    NSString *host = [newSocket connectedHost];
    UInt16 port = [newSocket connectedPort];
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            [self logInfo:FORMAT(@"Accepted client %@:%hu", host, port)];
        }
    });
    [newSocket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}
Socket Data Received
- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
    // This method is executed on the socketQueue (not the main thread)
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            NSLog(@"Thread Data Length is %lu", (unsigned long)[data length]);
            if (!imageBuffer) {
                imageBuffer = [[NSMutableData alloc] init];
            }
            [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
            NSLog(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
        }
    });
    // Echo message back to client
    [sock writeData:data withTimeout:-1 tag:ECHO_MSG];
    [sock readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}
Socket Disconnected
- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
    if (sock != listenSocket)
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                [self logInfo:FORMAT(@"Client Disconnected")];
                NSData *cameraNumberData;
                NSData *imageNumberData;
                NSData *imageData;
                NSData *endCommandData;
                //if ([data length] > 40){
                cameraNumberData = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
                imageNumberData = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
                imageData = [imageBuffer subdataWithRange:NSMakeRange(32, [imageBuffer length]-34)];
                endCommandData = [imageBuffer subdataWithRange:NSMakeRange([imageBuffer length]-16, 16)];
                //}
                NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                NSImage *image = [[NSImage alloc] initWithData:imageData];
                if (cameraNumberString)
                {
                    NSLog(@"Image received from Camera no %@", cameraNumberString);
                    [self logMessage:cameraNumberString];
                }
                else
                {
                    [self logError:@"Error converting received data into UTF-8 String"];
                }
                if (imageNumberString)
                {
                    NSLog(@"Image is number %@", imageNumberString);
                    [self logMessage:imageNumberString];
                }
                else
                {
                    [self logError:@"Error converting received data into UTF-8 String"];
                }
                if (image)
                {
                    NSLog(@"We have an image");
                    [self.imageView setImage:image];
                }
                else
                {
                    [self logError:@"Error converting received data into image"];
                }
                if (endCommandString)
                {
                    NSLog(@"Command String is %@", endCommandString);
                    [self logMessage:endCommandString];
                }
                else
                {
                    [self logError:@"No command string"];
                }
                //self.imageBuffer = nil;
            }
        });
        @synchronized(connectedSockets)
        {
            [connectedSockets removeObject:sock];
        }
    }
}
I have used Wireshark, and the data is being transmitted; it's just not getting through GCDAsyncSocket.
So I'm obviously missing something. Socket programming and encoding/decoding data like this is relatively new to me, so I am probably being an idiot.
Help greatly appreciated!
Thanks,
Gareth
OK, so I finally got this working. It involved modifying the transmitting code in Python to send a completion string at the end of the data, and watching for that. The biggest takeaway was that I needed to re-call the readDataToData: method each time the socket read some data; otherwise it would just sit there and wait, and the transmitting socket would also just sit there.
I also had to re-issue the second read with a tag so I could store the received data in the correct NSMutableData object in an NSMutableArray; otherwise I had no way of knowing, after the first receive, which transmitting socket the data was coming from, as the ID was only at the beginning of the first message.
Here is the didReadData code:
- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            NSInteger cameraNumberNumber = 0;
            NSString *cameraNumberString = [[NSString alloc] init];
            if (tag > 10) {
                cameraNumberNumber = tag - 11;
                DDLogVerbose(@"Second data loop, tag is %ld", tag);
            } else {
                NSData *cameraNumberData;
                //if ([data length] > 40){
                cameraNumberData = [data subdataWithRange:NSMakeRange(0, 16)];
                NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                cameraNumberNumber = [cameraNumberString intValue] - 1;
            }
            if (cameraNumberNumber + 1 <= self.images.count) {
                if ([self.images objectAtIndex:cameraNumberNumber] == [NSNull null]) {
                    image *cameraImage = [[image alloc] init];
                    [self.images replaceObjectAtIndex:cameraNumberNumber withObject:cameraImage];
                }
                image *cameraImage = [self.images objectAtIndex:cameraNumberNumber];
                [cameraImage.imageData appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                cameraImage.cameraNumber = cameraNumberString;
                if (!imageBuffer) {
                    imageBuffer = [[NSMutableData alloc] init];
                }
                [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                DDLogVerbose(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
            } else {
                DDLogInfo(@"Wrong camera quantity!");
                NSAlert *testAlert = [NSAlert alertWithMessageText:@"Wrong camera quantity!"
                                                     defaultButton:@"Ok"
                                                   alternateButton:nil
                                                       otherButton:nil
                                         informativeTextWithFormat:@"We have received more images than cameras, please set No.Cameras correctly!"];
                [testAlert beginSheetModalForWindow:[self window]
                                      modalDelegate:self
                                     didEndSelector:@selector(stop)
                                        contextInfo:nil];
            }
            [sock readDataToData:[@"end" dataUsingEncoding:NSUTF8StringEncoding] withTimeout:-1 tag:cameraNumberNumber + 11];
        }
    });
}
And here is the socketDidDisconnect code. A lot of things in here don't make sense out of context, but it shows how I handled the received data.
- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
    if (sock != listenSocket)
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                totalCamerasFetched = [NSNumber numberWithInt:1 + [totalCamerasFetched intValue]];
                if ([totalCamerasFetched integerValue] >= [numberOfCameras integerValue]) {
                    for (image *cameraImage in self.images) {
                        NSData *cameraNumberData;
                        NSData *imageNumberData;
                        NSData *imageData;
                        NSData *endCommandData;
                        NSInteger cameraNumberNumber = 0;
                        cameraNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(0, 16)];
                        imageNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(16, 16)];
                        imageData = [cameraImage.imageData subdataWithRange:NSMakeRange(32, [cameraImage.imageData length] - 32)];
                        endCommandData = [cameraImage.imageData subdataWithRange:NSMakeRange([cameraImage.imageData length] - 16, 16)];
                        NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                        imageNumberString = [imageNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                        NSImage *image = [[NSImage alloc] initWithData:imageData];
                        cameraNumberNumber = [cameraNumberString intValue] - 1;
                        if (cameraNumberString)
                        {
                            DDLogInfo(@"Image received from Camera no %@", cameraNumberString);
                        }
                        else
                        {
                            DDLogError(@"No Camera number in data");
                        }
                        if (imageNumberString)
                        {
                            DDLogInfo(@"Image is number %@", imageNumberString);
                        }
                        else
                        {
                            DDLogError(@"No Image number in data");
                        }
                        if (image)
                        {
                            DDLogVerbose(@"We have an image");
                            NSString *dataPath = [[NSString alloc] initWithFormat:@"%@/image%@/", self.exportLocation, imageNumberString];
                            if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath]) {
                                NSError *error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error];
                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert(FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }
                            NSString *dataPathVideo = [[NSString alloc] initWithFormat:@"%@/video%@/", self.exportLocation, imageNumberString];
                            if (![[NSFileManager defaultManager] fileExistsAtPath:dataPathVideo]) {
                                NSError *error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPathVideo withIntermediateDirectories:NO attributes:nil error:&error];
                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert(FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }
                            NSString *exportLocationFull = [[NSString alloc] initWithFormat:@"%@/image%@/camera_%@.jpg", self.exportLocation, imageNumberString, cameraNumberString];
                            DDLogInfo(@"Full export URL = %@", exportLocationFull);
                            [imageData writeToFile:exportLocationFull atomically:YES];
                            self.currentSet = [NSNumber numberWithInt:[imageNumberString intValue]];
                            NSImage *imageToStore = [[NSImage alloc] initWithData:imageData];
                            [self.imagesToMakeVideo replaceObjectAtIndex:cameraNumberNumber withObject:imageToStore];
                        } else {
                            DDLogError(@"No image located in data");
                        }
                        if (endCommandString)
                        {
                            DDLogVerbose(@"Command String is %@", endCommandString);
                            //[self logMessage:endCommandString];
                        }
                        else
                        {
                            //[self logError:@"No command string"];
                        }
                        self.imageBuffer = nil;
                    } // end of for loop
                    self.totalCamerasFetched = [NSNumber numberWithInt:0];
                    [self loadandDisplayLatestImages];
                    [self createVideowithImages:imagesToMakeVideo toLocation:[[NSString alloc] initWithFormat:@"%@/video%@/image_sequence_%@.mov", self.exportLocation, self.currentSet, self.currentSet]];
                    processing = false;
                }
            }
        });
        @synchronized(connectedSockets)
        {
            [connectedSockets removeObject:sock];
        }
    }
}
Also, here is how I modified the Python code to add the extra "end" tag:
def send_media_to(self, ip, port, media_name, media_number, media_dir):
    camera_number = self.camera.current_mode['option'].number
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((ip, port))
    try:
        sock.send(bytes(str(camera_number).ljust(16), 'utf-8'))
        sock.send(bytes(str(media_number).ljust(16), 'utf-8'))
        with open(media_dir + media_name, 'rb') as media:
            sock.sendall(media.read())
    finally:
        sock.send(bytes(str("end").ljust(16), 'utf-8'))
        sock.close()
Hopefully this helps someone else stuck in the same situation!

How to get now-playing information from a third-party music application?

I want to get the now-playing information, so I am using the following code:
NSDictionary *info = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo];
NSString *title = [info valueForKey:MPMediaItemPropertyTitle];
NSLog(@"%@", title);
MPMusicPlayerController *pc = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *playingItem = [pc nowPlayingItem];
if (playingItem) {
    NSInteger mediaType = [[playingItem valueForProperty:MPMediaItemPropertyMediaType] integerValue];
    if (mediaType == MPMediaTypeMusic) {
        NSString *songTitle = [playingItem valueForProperty:MPMediaItemPropertyTitle];
        NSString *albumTitle = [playingItem valueForProperty:MPMediaItemPropertyAlbumTitle];
        NSString *artist = [playingItem valueForProperty:MPMediaItemPropertyArtist];
        NSString *genre = [playingItem valueForProperty:MPMediaItemPropertyGenre];
        TweetTextField.text = [NSString stringWithFormat:@"#nowplaying %@ - %@ / %@ #%@", artist, songTitle, albumTitle, genre];
        MPMediaItemArtwork *artwork = [playingItem valueForProperty:MPMediaItemPropertyArtwork];
        CGSize newSize = CGSizeMake(250, 250);
        UIGraphicsBeginImageContext(newSize);
        [[artwork imageWithSize:CGSizeMake(100.0, 100.0)] drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
        UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        _imageView.image = newImage;
    }
    if (_imageView.image == nil) {
    } else {
        _tableView.alpha = 0.5;
    }
}
But this code only gets the now-playing information from the default iPod application.
How can I get the now-playing information from a third-party music application
(e.g. Mobile Safari, the YouTube app, gMusic, Melodies, etc.)?
I think this isn't possible. The documentation states that MPNowPlayingInfoCenter is only for setting information on the lock screen.
Here is a related question.
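For reference, the API is designed for the opposite direction: your own app publishes its now-playing info so the system can show it on the lock screen. A minimal sketch (the values are placeholders):
MPNowPlayingInfoCenter *center = [MPNowPlayingInfoCenter defaultCenter];
center.nowPlayingInfo = @{
    MPMediaItemPropertyTitle  : @"Song Title",
    MPMediaItemPropertyArtist : @"Artist Name",
    MPNowPlayingInfoPropertyPlaybackRate : @(1.0)
};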