How can I get rid of page margins using UIPrintInteractionController for AirPrint - Objective-C

I am trying to AirPrint a UIImage using UIPrintInteractionController.
This code works successfully, except that a margin is added to the print preview, which is not desirable for me. How can I set the left, right, top, and bottom margins to zero (or other values)? I am aware of printableRect, but how do I modify it? Thanks for any help in advance :)
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
NSData *imageData = UIImageJPEGRepresentation(imageforPrint, 1.0);
if (pic && [UIPrintInteractionController canPrintData:imageData]) {
pic.delegate = self;
UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printInfo.jobName = @"Job 1";
printInfo.duplex = UIPrintInfoDuplexLongEdge;
pic.printInfo = printInfo;
UIPrintPageRenderer *renderer = [[UIPrintPageRenderer alloc] init];
UIPrintFormatter *formatter = pic.printFormatter;
formatter.startPage = 0;
formatter.perPageContentInsets = UIEdgeInsetsMake(0.0, 0.0, 0.0, 0.0);
[renderer addPrintFormatter:formatter startingAtPageAtIndex:0];
pic.showsPaperSelectionForLoadedPapers = YES;
pic.printPageRenderer = renderer;
pic.printingItem = imageData;
pic.showsPageRange = YES;
void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
^(UIPrintInteractionController *pic, BOOL completed, NSError *error) {
//self.content = nil;
if (!completed && error)
NSLog(@"FAILED! due to error in domain %@ with error code %ld", error.domain, (long)error.code);
};
[pic presentAnimated:YES completionHandler:completionHandler];
}
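printableRect itself is read-only; it is computed from the selected printer and paper, so it cannot be assigned directly. One workaround that is often suggested is to subclass UIPrintPageRenderer so that the reported printable area covers the whole sheet. This is only a sketch (the class name BorderlessPrintPageRenderer is my own, and whether the margins actually disappear depends on the printer's borderless support):
@interface BorderlessPrintPageRenderer : UIPrintPageRenderer
@end
@implementation BorderlessPrintPageRenderer
// Pretend the whole page is printable; the printer may still clamp this.
- (CGRect)printableRect {
return self.paperRect;
}
@end
Assign an instance of this subclass to pic.printPageRenderer. For photos, setting printInfo.outputType = UIPrintInfoOutputPhoto can also let the printer choose a borderless paper size.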

Related

How to Add text in a video

I have a video that lasts 5 minutes, and I want to add text to that video for a particular time range, say seconds 5 to 15.
Can anyone help me out? I have tried the code below for adding text to an image:
CATextLayer *subtitle1Text = [[CATextLayer alloc] init];
[subtitle1Text setFont:(__bridge CFTypeRef)@"Helvetica-Bold"];
[subtitle1Text setFontSize:36];
[subtitle1Text setFrame:CGRectMake(0, 0, size.width, 100)];
[subtitle1Text setString:_subTitle1.text];
[subtitle1Text setAlignmentMode:kCAAlignmentCenter];
[subtitle1Text setForegroundColor:[[UIColor whiteColor] CGColor]];
// 2 - The usual overlay
CALayer *overlayLayer = [CALayer layer];
[overlayLayer addSublayer:subtitle1Text];
overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
[overlayLayer setMasksToBounds:YES];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
Can anyone help me do this for videos?
You can try the method below to edit a video frame by frame using AVFoundation:
-(void)startEditingVideoAtURL:(NSURL *)resourceURL
{
NSError *error = nil;
// Temp file path to write the edited movie
NSURL *movieURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), resourceURL.lastPathComponent]];
// Remove any stale file at the destination first
NSFileManager *fm = [NSFileManager defaultManager];
[fm removeItemAtPath:movieURL.path error:&error];
// Create AVAssetWriter to convert the images into movie
AVAssetWriter * videoWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeQuickTimeMovie error:&error];
NSParameterAssert(videoWriter);
AVAsset * avAsset = [[AVURLAsset alloc] initWithURL:resourceURL options:nil];
// Set your output video frame size here
NSDictionary * videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:480], AVVideoWidthKey,
[NSNumber numberWithInt:480], AVVideoHeightKey,
nil];
// Create AVAssetWriterInput with video Settings
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
NSDictionary * sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
[NSNumber numberWithInt:480], kCVPixelBufferWidthKey,
[NSNumber numberWithInt:480], kCVPixelBufferHeightKey,
nil];
AVAssetWriterInputPixelBufferAdaptor * assetWriterPixelBufferAdaptor1 = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(videoWriterInput);
NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
NSError *aerror = nil;
// Create AVAssetReader to read the video files frame by frame
AVAssetReader * reader = [[AVAssetReader alloc] initWithAsset:avAsset error:&aerror];
NSArray * vTracks = [avAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetReaderTrackOutput * asset_reader_output = nil;
if (vTracks.count)
{
AVAssetTrack * videoTrack = [vTracks objectAtIndex:0];
videoWriterInput.transform = videoTrack.preferredTransform;
NSDictionary *videoOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
asset_reader_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoOptions];
[reader addOutput:asset_reader_output];
}
else
{
[[NSNotificationCenter defaultCenter] postNotificationName:EDITOR_FAILED_NOTI object:resourceURL.path];
[reader cancelReading];
queueInProgress = NO;
[self removeCurrentVideo];
return;
}
//audio setup
AVAssetWriterInput * audioWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeAudio
outputSettings:nil];
AVAssetReader * audioReader = [AVAssetReader assetReaderWithAsset:avAsset error:&error];
NSArray * aTrack = [avAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetReaderOutput * readerOutput = nil;
if (aTrack.count)
{
AVAssetTrack * audioTrack = [aTrack objectAtIndex:0];
readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[audioReader addOutput:readerOutput];
}
NSParameterAssert(audioWriterInput);
NSParameterAssert([videoWriter canAddInput:audioWriterInput]);
audioWriterInput.expectsMediaDataInRealTime = NO;
[videoWriter addInput:audioWriterInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
[reader startReading]; // Here the video reader starts the reading
dispatch_queue_t _processingQueue = dispatch_queue_create("assetAudioWriterQueue", NULL);
[videoWriterInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
^{
while ([videoWriterInput isReadyForMoreMediaData])
{
CMSampleBufferRef sampleBuffer;
if ([reader status] == AVAssetReaderStatusReading &&
(sampleBuffer = [asset_reader_output copyNextSampleBuffer]))
{
CMTime currentTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage * _ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
CVPixelBufferLockBaseAddress(imageBuffer,0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
void * baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
CGDataProviderRef dataProvider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGDataProviderRelease(dataProvider);
// Add your text drawing code here
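// A sketch of what the text drawing could look like here (the string, font,
// and position are illustrative): wrap the locked pixel buffer in a bitmap
// context and draw with UIKit string drawing.
CGContextRef textContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, rgbColorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
UIGraphicsPushContext(textContext);
// Flip the context so UIKit drawing comes out upright.
CGContextTranslateCTM(textContext, 0, height);
CGContextScaleCTM(textContext, 1.0, -1.0);
[@"Sample text" drawAtPoint:CGPointMake(20.0, 20.0) withAttributes:@{ NSFontAttributeName : [UIFont boldSystemFontOfSize:36.0], NSForegroundColorAttributeName : [UIColor whiteColor] }];
UIGraphicsPopContext();
CGContextRelease(textContext);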
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CGColorSpaceRelease(rgbColorSpace);
BOOL result = [assetWriterPixelBufferAdaptor1 appendPixelBuffer:imageBuffer withPresentationTime:currentTime];
// imageBuffer is owned by sampleBuffer; do not release it separately.
CFRelease(sampleBuffer);
if (!result)
{
dispatch_async(dispatch_get_main_queue(), ^
{
[[NSNotificationCenter defaultCenter] postNotificationName:EDITOR_FAILED_NOTI object:resourceURL.path];
[reader cancelReading];
queueInProgress = NO;
[self removeCurrentVideo];
});
break;
}
}
else
{
[videoWriterInput markAsFinished]; // Called once video writing is done
switch ([reader status])
{
case AVAssetReaderStatusReading:
// the reader has more for other tracks, even if this one is done
break;
case AVAssetReaderStatusCompleted:
{
if (!readerOutput)
{
if ([videoWriter respondsToSelector:@selector(finishWritingWithCompletionHandler:)])
[videoWriter finishWritingWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^
{
});
}];
else
{
if ([videoWriter finishWriting])
{
dispatch_async(dispatch_get_main_queue(), ^
{
});
}
}
break;
}
[audioReader startReading];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[audioWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^
{
while (audioWriterInput.readyForMoreMediaData) {
CMSampleBufferRef nextBuffer;
if ([audioReader status] == AVAssetReaderStatusReading &&
(nextBuffer = [readerOutput copyNextSampleBuffer]))
{
if (nextBuffer)
{
[audioWriterInput appendSampleBuffer:nextBuffer];
}
CFRelease(nextBuffer);
}else
{
[audioWriterInput markAsFinished];
switch ([audioReader status])
{
case AVAssetReaderStatusCompleted:
if ([videoWriter respondsToSelector:@selector(finishWritingWithCompletionHandler:)])
[videoWriter finishWritingWithCompletionHandler:^
{
}];
else
{
if ([videoWriter finishWriting])
{
}
}
break;
}
}
}
}
];
break;
}
// your method for when the conversion is done
// should call finishWriting on the writer
//hook up audio track
case AVAssetReaderStatusFailed:
{
break;
}
}
break;
}
}
}
];
}
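As a lighter alternative for the original goal (text visible only from second 5 to second 15), you can keep the layer-based approach from the question and fade the text layer in and out with timed opacity animations when exporting through AVVideoCompositionCoreAnimationTool. A sketch, assuming subtitle1Text from the question's snippet (the 0.01-second durations just make the toggles effectively instant):
// Hide the text initially, then toggle opacity at t=5s and t=15s.
// In an export, animation times are interpreted relative to
// AVCoreAnimationBeginTimeAtZero, not wall-clock time.
subtitle1Text.opacity = 0.0;
CABasicAnimation *show = [CABasicAnimation animationWithKeyPath:@"opacity"];
show.fromValue = @0.0;
show.toValue = @1.0;
show.beginTime = 5.0;
show.duration = 0.01;
show.fillMode = kCAFillModeForwards;
show.removedOnCompletion = NO;
[subtitle1Text addAnimation:show forKey:@"show"];
CABasicAnimation *hide = [CABasicAnimation animationWithKeyPath:@"opacity"];
hide.fromValue = @1.0;
hide.toValue = @0.0;
hide.beginTime = 15.0;
hide.duration = 0.01;
hide.fillMode = kCAFillModeForwards;
hide.removedOnCompletion = NO;
[subtitle1Text addAnimation:hide forKey:@"hide"];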
Thanks!

CAKeyframeAnimation not animating CATextLayer when exporting video

I have an application in which I am attempting to put a timestamp on a video. To do this I am using AVFoundation and Core Animation to place a CATextLayer over the video layer. If I place text into the CATextLayer's string property, the string is properly displayed in the exported video. If I then add the animation to the CATextLayer, the text never changes. I figure I've overlooked something, but I can't find what it is.
Thank you in advance for any help.
Here is a code example.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSDictionary *assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVAsset *asset = [[AVURLAsset alloc] initWithURL:myUrl options:assetOptions];
AVAssetTrack *audioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
videoComposition.renderSize = videoCompositionTrack.naturalSize;
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
[videoCompositionLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
videoComposition.instructions = @[videoCompositionInstruction];
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0.0f, 0.0f, videoComposition.renderSize.width, videoComposition.renderSize.height);
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0.0f, 0.0f, videoComposition.renderSize.width, videoComposition.renderSize.height);
[parentLayer addSublayer:videoLayer];
CATextLayer *textLayer = [CATextLayer layer];
textLayer.font = (__bridge CFTypeRef)([UIFont fontWithName:@"Helvetica Neue" size:45.0f]);
textLayer.fontSize = 45.0f;
textLayer.foregroundColor = [UIColor colorWithRed:1.0f green:1.0f blue:1.0f alpha:1.0f].CGColor;
textLayer.shadowColor = [UIColor colorWithRed:0.0f green:0.0f blue:0.0f alpha:1.0f].CGColor;
textLayer.shadowOffset = CGSizeMake(0.0f, 0.0f);
textLayer.shadowOpacity = 1.0f;
textLayer.shadowRadius = 4.0f;
textLayer.alignmentMode = kCAAlignmentCenter;
textLayer.truncationMode = kCATruncationNone;
CAKeyframeAnimation *keyFrameAnimation = [CAKeyframeAnimation animationWithKeyPath:@"string"];
// Step 8: Set the animation values to the object.
keyFrameAnimation.calculationMode = kCAAnimationLinear;
keyFrameAnimation.values = @[@"12:00:00", @"12:00:01", @"12:00:02", @"12:00:03",
@"12:00:04", @"12:00:05", @"12:00:06", @"12:00:07",
@"12:00:08", @"12:00:09"];
keyFrameAnimation.keyTimes = #[[NSNumber numberWithFloat:0.0f], [NSNumber numberWithFloat:0.1f],
[NSNumber numberWithFloat:0.2f], [NSNumber numberWithFloat:0.3f],
[NSNumber numberWithFloat:0.4f], [NSNumber numberWithFloat:0.5f],
[NSNumber numberWithFloat:0.6f], [NSNumber numberWithFloat:0.7f],
[NSNumber numberWithFloat:0.8f], [NSNumber numberWithFloat:0.9f]];
keyFrameAnimation.beginTime = AVCoreAnimationBeginTimeAtZero;
keyFrameAnimation.duration = CMTimeGetSeconds(composition.duration);
keyFrameAnimation.removedOnCompletion = YES;
[textLayer addAnimation:keyFrameAnimation forKey:@"string"];
textLayer.frame = CGRectMake(0.0f, 20.0f, videoComposition.renderSize.width, 55.0f);
[parentLayer addSublayer:textLayer];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPreset1920x1080];
exportSession.videoComposition = videoComposition;
exportSession.audioMix = audioMix;
exportSession.outputFileType = [exportSession.supportedFileTypes objectAtIndex:0];
exportSession.outputURL = mySaveUrl;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
self.delegate = nil;
switch (exportSession.status) {
case AVAssetExportSessionStatusCancelled:
NSLog(@"Export session cancelled.");
break;
case AVAssetExportSessionStatusCompleted:
NSLog(@"Export session completed.");
break;
case AVAssetExportSessionStatusFailed:
NSLog(@"Export session failed.");
break;
case AVAssetExportSessionStatusUnknown:
NSLog(@"Export session unknown status.");
break;
case AVAssetExportSessionStatusWaiting:
NSLog(@"Export session waiting.");
break;
default:
break;
}
NSError *error = exportSession.error;
if (error != nil) {
NSLog(@"An error occurred while exporting. Error: %@: %@", error.localizedDescription, error.localizedFailureReason);
}
}];
I stayed in contact with Apple support, and they told me that it's a bug in their SDK; they are trying to fix the issue.
I also opened an incident in the Apple bug reporter. If you are an Apple developer, I recommend you open a new one to put more pressure on this incident:
https://bugreport.apple.com/logon
Best regards.
Finally, I found a solution for the CATextLayer animation problem. It's probably not the best solution, but at least it works fine.
To work around the issue, render each NSString into a UIImage, collect the resulting CGImages in an NSMutableArray, load that array into a CAKeyframeAnimation, and animate the layer's contents property.
-(UIImage *)imageFromText:(NSString *)text
{
UIGraphicsBeginImageContextWithOptions(sizeImageFromText,NO,0.0);//Better resolution (antialiasing)
//UIGraphicsBeginImageContext(sizeImageFromText); // iOS is < 4.0
[text drawInRect:aRectangleImageFromText withAttributes:attributesImageFromText];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
Adding images to NSMutableArray
[NSMutableArrayAux addObject:(__bridge id)[self imageFromText:@"Hello"].CGImage];
Set up a CAKeyframeAnimation that modifies the layer's contents (changing the image):
CAKeyframeAnimation *overlay = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
overlay.values = [[NSArray alloc] initWithArray:NSMutableArrayAux];
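For the animation to render in an exported video, its timing generally has to be anchored to the composition's timeline. A sketch of attaching the overlay animation to a layer (imageLayer here is an assumed CALayer that displays the frames, and composition the AVMutableComposition):
overlay.beginTime = AVCoreAnimationBeginTimeAtZero;
overlay.duration = CMTimeGetSeconds(composition.duration);
overlay.removedOnCompletion = NO;
overlay.calculationMode = kCAAnimationDiscrete; // step between images instead of interpolating
[imageLayer addAnimation:overlay forKey:@"contents"];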
It works really well, but it takes a second or two to build an array of 5,000-6,000 images, and with 15,000-16,000 images the export crashes with an overflow error. This is another bug in the framework.
As you can see, this is a framework issue, and this is a workaround at least until Apple fixes the CATextLayer animation problem; I have reported this solution to Apple as well.

AirPrint action not doing anything unless file type is png

I'm wrangling with AirPrint not doing anything from a button press; I'm trying to print a file from the screen (my first time implementing this).
I've added UIPrintInteractionControllerDelegate in the .h, and in the .m I've hooked up my button, which has been verified as working correctly.
Button handler code:
NSString *path = [[NSBundle mainBundle] pathForResource:@"mylocalhtmlfile" ofType:@"HTML"];
NSData *dataFromPath = [NSData dataWithContentsOfFile:path];
UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];
if(printController && [UIPrintInteractionController canPrintData:dataFromPath]) {
printController.delegate = self;
UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printInfo.jobName = [path lastPathComponent];
printInfo.duplex = UIPrintInfoDuplexLongEdge;
printController.printInfo = printInfo;
printController.showsPageRange = YES;
printController.printingItem = dataFromPath;
void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) = ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
if (!completed && error) {
NSLog(@"FAILED! due to error in domain %@ with error code %ld", error.domain, (long)error.code);
}
};
[printController presentAnimated:YES completionHandler:completionHandler];
}
I get no action whatsoever, no presentation of the AirPrint controls or anything, unless the file type is set to png.
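A likely cause: canPrintData: only returns YES for data that printingItem can render directly (image formats and PDF), so HTML data fails the check and the block never runs. HTML is normally printed through a print formatter instead. A sketch, reusing the same path variable from the snippet above:
NSString *htmlString = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:nil];
UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];
UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printInfo.jobName = [path lastPathComponent];
printController.printInfo = printInfo;
// Render the HTML markup instead of handing over raw data
UIMarkupTextPrintFormatter *formatter = [[UIMarkupTextPrintFormatter alloc] initWithMarkupText:htmlString];
printController.printFormatter = formatter;
[printController presentAnimated:YES completionHandler:nil];
UIMarkupTextPrintFormatter handles simple HTML; for complex pages, a web view's viewPrintFormatter is another option.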

Function that sends an image to AirPrint

I'm trying to find a function that lets me print using AirPrint.
I have a button, btnPrint, that when pressed should print myPic.jpg to the default AirPrint device, but I cannot figure out if there even is such a function.
I cannot find a lot of documentation on AirPrint in Xcode.
Apple has documentation on printing that would probably benefit you.
And the following is from Objective-C code for AirPrint:
Check whether printing is available:
if ([UIPrintInteractionController isPrintingAvailable])
{
// Available
} else {
// Not Available
}
Print after button click:
-(IBAction)buttonClicked:(id)sender
{
NSMutableString *printBody = [NSMutableString stringWithFormat:@"%@, %@", self.encoded.text, self.decoded.text];
[printBody appendFormat:@"\n\n\n\nPrinted From *myapp*"];
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
pic.delegate = self;
UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printInfo.jobName = self.titleLabel.text;
pic.printInfo = printInfo;
UISimpleTextPrintFormatter *textFormatter = [[UISimpleTextPrintFormatter alloc] initWithText:printBody];
textFormatter.startPage = 0;
textFormatter.contentInsets = UIEdgeInsetsMake(72.0, 72.0, 72.0, 72.0); // 1 inch margins
textFormatter.maximumContentWidth = 6 * 72.0;
pic.printFormatter = textFormatter;
[textFormatter release];
pic.showsPageRange = YES;
void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
if (!completed && error) {
NSLog(@"Printing could not complete because of error: %@", error);
}
};
[pic presentFromBarButtonItem:self.rightButton animated:YES completionHandler:completionHandler];
}
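To print the JPEG itself rather than text, a minimal sketch (assuming myPic.jpg is in the app bundle; the names are illustrative):
UIImage *image = [UIImage imageNamed:@"myPic.jpg"];
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
if (image && [UIPrintInteractionController isPrintingAvailable]) {
UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputPhoto;
printInfo.jobName = @"myPic";
pic.printInfo = printInfo;
pic.printingItem = image; // a UIImage is directly printable
[pic presentAnimated:YES completionHandler:nil];
}
Note that this API always presents the printer picker; there is no way here to send a job silently to a default device (later iOS versions added UIPrinter and printToPrinter:completionHandler: for a more direct flow).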

How to get now-playing information from a third-party music application?

I want to get the now-playing information, so I am using the following code:
NSDictionary *info = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo];
NSString *title = [info valueForKey:MPMediaItemPropertyTitle];
NSLog(@"%@", title);
MPMusicPlayerController *pc = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *playingItem = [pc nowPlayingItem];
if (playingItem) {
NSInteger mediaType = [[playingItem valueForProperty:MPMediaItemPropertyMediaType] integerValue];
if (mediaType == MPMediaTypeMusic) {
NSString *songTitle = [playingItem valueForProperty:MPMediaItemPropertyTitle];
NSString *albumTitle = [playingItem valueForProperty:MPMediaItemPropertyAlbumTitle];
NSString *artist = [playingItem valueForProperty:MPMediaItemPropertyArtist];
NSString *genre = [playingItem valueForProperty:MPMediaItemPropertyGenre];
TweetTextField.text = [NSString stringWithFormat:@"#nowplaying %@ - %@ / %@ #%@", artist, songTitle, albumTitle, genre];
MPMediaItemArtwork *artwork = [playingItem valueForProperty:MPMediaItemPropertyArtwork];
CGSize newSize = CGSizeMake(250, 250);
UIGraphicsBeginImageContext(newSize);
[[artwork imageWithSize:CGSizeMake(100.0, 100.0)] drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
_imageView.image = newImage;
}
if (_imageView.image == nil){
} else {
_tableView.alpha=0.5;
}
}
But this code only gets now-playing information from the default iPod application.
How can I get now-playing information from third-party music applications
(e.g., Mobile Safari, the YouTube app, gMusic, Melodies, etc.)?
I don't think this is possible. The documentation states that MPNowPlayingInfoCenter is only for setting information on the lock screen.
Here is a related question.