Saving thumbnail image to directory - Objective-C

Hi, I want to save a thumbnail image taken from a video to a directory, overlaid with the video's duration and a video icon, just like in the screenshot (image not shown).
How can I save it like this?
I'm using the code below to save the image to a directory; later I'll display it in a GridView.
if ([[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo])
{
    // Get the duration of the video file
    NSURL *videoUrl = alassetRep.url;
    AVURLAsset *avUrl = [AVURLAsset assetWithURL:videoUrl];
    CMTime time = [avUrl duration];
    // CMTimeGetSeconds avoids the integer-division truncation of time.value / time.timescale
    NSUInteger dTotalSeconds = (NSUInteger)ceil(CMTimeGetSeconds(time));
    NSUInteger dMinutes = (dTotalSeconds % 3600) / 60;
    NSUInteger dSeconds = dTotalSeconds % 60;
    // %02lu (not %02i) is the matching format specifier for NSUInteger
    NSString *videoDurationText = [NSString stringWithFormat:@"%02lu:%02lu",
                                   (unsigned long)dMinutes, (unsigned long)dSeconds];
    NSLog(@"%@", videoDurationText);

    /************************************ Low-resolution image ************************************/
    UIImage *image = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
    UIImage *thumbImage = [self imageWithImage:image scaledToSize:CGSizeMake(50, 50)];
    NSData *thumbImageData = UIImageJPEGRepresentation(thumbImage, 0.8);
    NSString *thumbOriginalPath = [NSString stringWithFormat:@"SMALL_IMAGE_%d_%d.jpg", (int)currentDate, i];
    NSString *thumbImagePath = [DucPath stringByAppendingPathComponent:thumbOriginalPath];
    NSLog(@"Image path at save time: %@", thumbImagePath);
    [thumbImageData writeToFile:thumbImagePath atomically:YES];
    [pMediaArray addObject:thumbOriginalPath];
}

Yes, I've found a solution:
if ([[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo])
{
    // Copy the raw video bytes out of the asset and write them to our own file
    unsigned long dataSize = (unsigned long)[alassetRep size];
    Byte *buffer = (Byte *)malloc(dataSize);
    NSUInteger buffered = (NSUInteger)[alassetRep getBytes:buffer fromOffset:0 length:(NSUInteger)alassetRep.size error:nil];
    NSData *videoData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    NSString *newVideoName = [NSString stringWithFormat:@"video_%d_%d.mov", (int)currentDate, i];
    NSString *newVideoPath = [DucPath stringByAppendingPathComponent:newVideoName];
    [videoData writeToFile:newVideoPath atomically:YES];
    [pImageMediaArray addObject:newVideoName];
    NSLog(@"%@", newVideoName);

    // Get the duration of the video file
    NSURL *videoUrl = alassetRep.url;
    AVURLAsset *avUrl = [AVURLAsset assetWithURL:videoUrl];
    CMTime time = [avUrl duration];
    NSUInteger dTotalSeconds = (NSUInteger)ceil(CMTimeGetSeconds(time));
    NSUInteger dMinutes = (dTotalSeconds % 3600) / 60;
    NSUInteger dSeconds = dTotalSeconds % 60;
    NSString *videoDurationText = [NSString stringWithFormat:@"%02lu:%02lu",
                                   (unsigned long)dMinutes, (unsigned long)dSeconds];

    /************************************ Low-resolution image ************************************/
    // Compose the thumbnail, a duration badge, and a video icon in an off-screen
    // view hierarchy, then snapshot the whole thing into a single image
    UIImage *lowResImage = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
    UIImage *thumbImage = [self imageWithImage:lowResImage scaledToSize:CGSizeMake(50, 50)];
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:(CGRect){ CGPointZero, thumbImage.size }];
    imageView.image = thumbImage;

    UIImageView *timeView = [[UIImageView alloc] initWithFrame:CGRectMake(imageView.frame.size.width - 35, imageView.frame.size.height - 20, 30, 15)];
    timeView.image = [UIImage imageNamed:@"screen2-videoTime-icon.png"];
    UIImageView *videoIcon = [[UIImageView alloc] initWithFrame:CGRectMake(imageView.frame.origin.x + 5, imageView.frame.size.height - 15, 15, 10)];
    videoIcon.image = [UIImage imageNamed:@"screen2-video-icon.png"];

    UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 30, 15)];
    timeLabel.textColor = [UIColor whiteColor];
    timeLabel.text = videoDurationText;
    timeLabel.textAlignment = NSTextAlignmentCenter;
    timeLabel.font = [UIFont fontWithName:@"Helvetica" size:10.0];
    [timeView addSubview:timeLabel];
    [timeView bringSubviewToFront:timeLabel];
    [imageView addSubview:videoIcon];
    [imageView addSubview:timeView];

    // Render the composed view hierarchy into a UIImage
    UIGraphicsBeginImageContext(imageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [imageView.layer renderInContext:context];
    UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *thumbImageData = UIImageJPEGRepresentation(screenShot, 0.8);
    NSString *thumbOriginalPath = [NSString stringWithFormat:@"SMALL_IMAGE_%d_%d.jpg", (int)currentDate, i];
    NSString *thumbImagePath = [DucPath stringByAppendingPathComponent:thumbOriginalPath];
    NSLog(@"Image path at save time: %@", thumbImagePath);
    [thumbImageData writeToFile:thumbImagePath atomically:YES];
    [pMediaArray addObject:thumbOriginalPath];
}
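The code above assumes an imageWithImage:scaledToSize: helper that isn't shown in the question. A minimal sketch of what it could look like (the name and signature come from the call sites; the body is an assumption):

- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    // Draw the source image into a context of the target size and capture it
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}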

Related

Get all Images from Live Photo

I want to get an NSArray of all the UIImages from a Live Photo so I can create a GIF from them. I tried taking screenshots while the Live Photo animates, but that doesn't work.
Can anyone help me?
Thanks!
First, you need to convert the Live Photo to a video, using something like this:
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil, completionHandler: { error in
        // Video file has been written to the path specified via fileURL
})
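The snippet assumes you already have the Live Photo's paired-video resource in assetRes. A minimal sketch of obtaining it (in Objective-C, matching the rest of this page; livePhotoAsset and fileURL are placeholders):

// Find the paired-video resource of a Live Photo asset and write it to disk
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:livePhotoAsset];
for (PHAssetResource *res in resources) {
    if (res.type == PHAssetResourceTypePairedVideo) {
        [[PHAssetResourceManager defaultManager] writeDataForAssetResource:res
                                                                    toFile:fileURL
                                                                   options:nil
                                                         completionHandler:^(NSError *error) {
            // Video file has been written to fileURL (or error is non-nil)
        }];
        break;
    }
}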
Finally, use this library to convert the video to a GIF, or search Google for another way: https://github.com/NSRare/NSGIF
Hope this helps.
This is what I did to achieve what you're asking for.
PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
// The two conditions must be combined into one predicate; assigning
// options.predicate twice would leave only the second assignment in effect.
// PHAsset's property is mediaSubtypes, a bit mask.
options.predicate = [NSCompoundPredicate andPredicateWithSubpredicates:@[
    [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage],
    [NSPredicate predicateWithFormat:@"(mediaSubtypes & %d) != 0", PHAssetMediaSubtypePhotoLive]]];
options.includeAllBurstAssets = NO;

PHFetchResult *allLivePhotos = [PHAsset fetchAssetsWithOptions:options];
NSLog(@"Total live photo count: %lu", (unsigned long)allLivePhotos.count);

NSMutableArray *arrAllLiveImagesGroups = [NSMutableArray array];
for (PHAsset *asset in allLivePhotos) {
    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        // "videoURL" is an undocumented key on PHLivePhoto; it may break in future OS versions
        NSURL *urlMov = [contentEditingInput.livePhoto valueForKey:@"videoURL"];

        NSMutableArray *arrLive = [NSMutableArray array];
        NSMutableArray *arrSingleLiveImagesGroup = [NSMutableArray array];

        AVURLAsset *avAsset = [[AVURLAsset alloc] initWithURL:urlMov options:nil];
        AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:avAsset];
        generator.requestedTimeToleranceAfter = kCMTimeZero;
        generator.requestedTimeToleranceBefore = kCMTimeZero;

        // Grab five frames per second of the paired video
        for (Float64 i = 0; i < CMTimeGetSeconds(avAsset.duration) * 5; i++) {
            @autoreleasepool {
                CMTime time = CMTimeMake((int64_t)i, 5);
                NSError *err;
                CMTime actualTime;
                CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
                UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image scale:1.0 orientation:UIImageOrientationDown];
                [arrLive addObject:generatedImage];
                CGImageRelease(image);
            }
        }
        [arrSingleLiveImagesGroup addObject:arrLive];
        [arrAllLiveImagesGroups addObject:arrSingleLiveImagesGroup];
    }];
}
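Once the frames are collected, one way to build the GIF without a third-party library is ImageIO's CGImageDestination. A minimal sketch, meant to run after the frame loop while arrLive is still in scope (the 0.2 s frame delay matches the five-frames-per-second capture; the output URL is an assumption):

#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Write the captured frames out as an animated GIF
NSArray *frames = arrLive;
NSURL *gifURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"live.gif"]];
NSDictionary *fileProps  = @{ (__bridge id)kCGImagePropertyGIFDictionary :
                                  @{ (__bridge id)kCGImagePropertyGIFLoopCount : @0 } };   // loop forever
NSDictionary *frameProps = @{ (__bridge id)kCGImagePropertyGIFDictionary :
                                  @{ (__bridge id)kCGImagePropertyGIFDelayTime : @0.2 } }; // 5 fps
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((__bridge CFURLRef)gifURL,
                                                             kUTTypeGIF, frames.count, NULL);
CGImageDestinationSetProperties(dest, (__bridge CFDictionaryRef)fileProps);
for (UIImage *frame in frames) {
    CGImageDestinationAddImage(dest, frame.CGImage, (__bridge CFDictionaryRef)frameProps);
}
CGImageDestinationFinalize(dest);
CFRelease(dest);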

Colored Artifacts while transforming PDF to NSImageRep (png)

I want to convert individual PDF pages to PNGs. After that I iterate over all pixels looking for colored ones; the goal is to find the colored pages of the PDF.
For most pages this works great, but on some pages I get colored artifacts: red to the left of a letter and blue to the right.
The attached images show the original letter from the PDF and the converted letter in the PNG (images omitted; source: brebook.de).
How can I prevent these ugly artifacts? With them in the output, my whole colored-pixel idea is unusable.
This is the code that converts a single page to a PNG:
// Path for saving colored pages
NSURL *savePath = [self.homePath URLByAppendingPathComponent:@"brebook_temp/"];
//PDFDocument *thePDF = [[PDFDocument alloc] initWithURL:self.filename];
NSPDFImageRep *img = [NSPDFImageRep imageRepWithContentsOfURL:self.filename];
int coloredPages = 0;
for (int i = 0; i < self.getNumberOfPages; i++) {
    @autoreleasepool {
        // Set the current-page label
        self.currentPage = i + 1;
        // Move the image rep to page i
        [img setCurrentPage:i];
        // Wrap the current page in a new NSImage
        NSImage *singlePage = [NSImage new];
        [singlePage addRepresentation:img];
        // Scale the 72 dpi page size up to 150 dpi
        NSSize oldSize = [singlePage size];
        NSSize newSize;
        newSize.width = oldSize.width * 150 / 72;
        newSize.height = oldSize.height * 150 / 72;
        // Draw the page into a resized image
        NSImage *resizedImage = [[NSImage alloc] initWithSize:NSMakeSize(newSize.width, newSize.height)];
        [resizedImage lockFocus];
        [singlePage drawInRect:NSMakeRect(0, 0, newSize.width, newSize.height) fromRect:NSMakeRect(0, 0, oldSize.width, oldSize.height) operation:NSCompositeSourceOver fraction:1.0];
        [resizedImage unlockFocus];
        // Write the page as a PNG
        // (note: this writes singlePage, so resizedImage is never actually used)
        NSURL *pageFilename = [savePath URLByAppendingPathComponent:[NSString stringWithFormat:@"Seite_%d.png", i + 1]];
        [[NSFileManager defaultManager] createFileAtPath:[pageFilename path]
                                                contents:[[NSBitmapImageRep imageRepWithData:[singlePage TIFFRepresentation]]
                                                          representationUsingType:NSPNGFileType properties:nil]
                                              attributes:nil];
        if ([self getColoredPixelOfImageFromURL:pageFilename] > 0) {
            coloredPages++;
            NSLog(@"SEITE %d -----------------------------------------------------------------------------------------------", i + 1);
            _coloredPages = coloredPages;
            [self.coloredPagesList appendString:[NSString stringWithFormat:@"%d,", i + 1]];
        } else {
            // Page has no color: delete the PNG again
            NSError *error = nil;
            [[NSFileManager defaultManager] removeItemAtURL:pageFilename error:&error];
            if (error) {
                NSLog(@"%@", error);
            }
        }
        resizedImage = nil;
        singlePage = nil;
    }
}
[self.appTimer invalidate];
[self.appTimer invalidate];
Thank you so much for helping!!!
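The red/blue fringes are characteristic of subpixel (LCD) font smoothing being baked into the rendered bitmap. One way to avoid them is to render each page into your own bitmap context with font smoothing disabled; a minimal sketch, reusing img and pageFilename from the code above (the 150/72 scale matches it, everything else is an assumption):

CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
NSSize pageSize = [img size]; // 72 dpi page size
size_t pixelsWide = (size_t)(pageSize.width * 150 / 72);
size_t pixelsHigh = (size_t)(pageSize.height * 150 / 72);
CGContextRef ctx = CGBitmapContextCreate(NULL, pixelsWide, pixelsHigh, 8, 0, cs,
                                         (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(cs);
CGContextSetShouldSmoothFonts(ctx, false); // no subpixel (red/blue) antialiasing
CGContextSetShouldAntialias(ctx, true);    // keep plain grayscale antialiasing
// White background, so transparent page areas don't read as "colored"
CGContextSetRGBFillColor(ctx, 1, 1, 1, 1);
CGContextFillRect(ctx, CGRectMake(0, 0, pixelsWide, pixelsHigh));
NSGraphicsContext *gc = [NSGraphicsContext graphicsContextWithGraphicsPort:ctx flipped:NO];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:gc];
[img drawInRect:NSMakeRect(0, 0, pixelsWide, pixelsHigh)]; // draws the current page
[NSGraphicsContext restoreGraphicsState];
CGImageRef cgPage = CGBitmapContextCreateImage(ctx);
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCGImage:cgPage];
[[rep representationUsingType:NSPNGFileType properties:nil] writeToURL:pageFilename atomically:YES];
CGImageRelease(cgPage);
CGContextRelease(ctx);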

How to add text to a video

I have a video that is 5 minutes long, and I want to add text to it for a particular time range, say from second 5 to second 15.
Can anyone help me out? I have tried the code below, which adds a text overlay to the video composition:
CATextLayer *subtitle1Text = [[CATextLayer alloc] init];
[subtitle1Text setFont:@"Helvetica-Bold"];
[subtitle1Text setFontSize:36];
[subtitle1Text setFrame:CGRectMake(0, 0, size.width, 100)];
[subtitle1Text setString:_subTitle1.text];
[subtitle1Text setAlignmentMode:kCAAlignmentCenter];
[subtitle1Text setForegroundColor:[[UIColor whiteColor] CGColor]];

// 2 - The usual overlay
CALayer *overlayLayer = [CALayer layer];
[overlayLayer addSublayer:subtitle1Text];
overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
[overlayLayer setMasksToBounds:YES];

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];

composition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
Can anyone help me do the same for a specific time range of the video?
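Since the question's code already drives the overlay through AVVideoCompositionCoreAnimationTool, one way to limit the text to the 5-15 s window is to animate the layer's opacity on the composition's timeline. A minimal sketch, assuming the composition's total length is known (300 s here is a placeholder):

// Show subtitle1Text only between 5 s and 15 s of the composition
CGFloat videoDuration = 300.0; // placeholder for the real length in seconds
subtitle1Text.opacity = 0.0;   // hidden outside the animated window
CAKeyframeAnimation *visibility = [CAKeyframeAnimation animationWithKeyPath:@"opacity"];
visibility.calculationMode = kCAAnimationDiscrete;
visibility.values = @[@0.0, @1.0, @0.0];
// With discrete keyframes, keyTimes has one more entry than values:
// hidden until 5 s, visible from 5 s to 15 s, hidden afterwards
visibility.keyTimes = @[@0.0, @(5.0 / videoDuration), @(15.0 / videoDuration), @1.0];
visibility.duration = videoDuration;
// A beginTime of 0 means "now" to Core Animation, so use the AVFoundation constant
visibility.beginTime = AVCoreAnimationBeginTimeAtZero;
visibility.removedOnCompletion = NO;
visibility.fillMode = kCAFillModeForwards;
[subtitle1Text addAnimation:visibility forKey:@"visibility"];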
Try the method below, which edits a video frame by frame using AVFoundation.framework:
- (void)startEditingVideoAtURL:(NSURL *)resourceURL
{
    NSError *error = nil;
    // Temp file path to write the edited movie to
    NSURL *movieURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), resourceURL.lastPathComponent]];
    NSFileManager *fm = [NSFileManager defaultManager];
    // Remove any leftover file from a previous run
    [fm removeItemAtPath:movieURL.path error:&error];

    // Create an AVAssetWriter to turn the processed frames back into a movie
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeQuickTimeMovie error:&error];
    NSParameterAssert(videoWriter);

    AVAsset *avAsset = [[AVURLAsset alloc] initWithURL:resourceURL options:nil];

    // Set your output video frame size here
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:480], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey,
                                   nil];

    // Create an AVAssetWriterInput with the video settings
    AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeVideo
                                            outputSettings:videoSettings];
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:480], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:480], kCVPixelBufferHeightKey,
                                                           nil];
    AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferAdaptor1 = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];
    NSError *aerror = nil;
    // Create an AVAssetReader to read the video file frame by frame
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:avAsset error:&aerror];
    NSArray *vTracks = [avAsset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetReaderTrackOutput *asset_reader_output = nil;
    if (vTracks.count)
    {
        AVAssetTrack *videoTrack = [vTracks objectAtIndex:0];
        videoWriterInput.transform = videoTrack.preferredTransform;
        NSDictionary *videoOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        asset_reader_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoOptions];
        [reader addOutput:asset_reader_output];
    }
    else
    {
        // No video track: report failure and bail out
        [[NSNotificationCenter defaultCenter] postNotificationName:EDITOR_FAILED_NOTI object:resourceURL.path];
        [reader cancelReading];
        queueInProgress = NO;
        [self removeCurrentVideo];
        return;
    }
    // Audio setup: the audio track is passed through unmodified
    AVAssetWriterInput *audioWriterInput = [AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeAudio
                                            outputSettings:nil];
    AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:avAsset error:&error];
    NSArray *aTrack = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetReaderOutput *readerOutput = nil;
    if (aTrack.count)
    {
        AVAssetTrack *audioTrack = [aTrack objectAtIndex:0];
        readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
        [audioReader addOutput:readerOutput];
    }
    NSParameterAssert(audioWriterInput);
    NSParameterAssert([videoWriter canAddInput:audioWriterInput]);
    audioWriterInput.expectsMediaDataInRealTime = NO;
    [videoWriter addInput:audioWriterInput];

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    [reader startReading]; // The video reader starts reading here
    dispatch_queue_t _processingQueue = dispatch_queue_create("assetAudioWriterQueue", NULL);
    [videoWriterInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:^
    {
        while ([videoWriterInput isReadyForMoreMediaData])
        {
            CMSampleBufferRef sampleBuffer;
            if ([reader status] == AVAssetReaderStatusReading &&
                (sampleBuffer = [asset_reader_output copyNextSampleBuffer]))
            {
                CMTime currentTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                CVPixelBufferLockBaseAddress(imageBuffer, 0);
                size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
                size_t width = CVPixelBufferGetWidth(imageBuffer);
                size_t height = CVPixelBufferGetHeight(imageBuffer);
                void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
                CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
                // Add your text drawing code here (see the sketch after this listing)
                CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
                CGColorSpaceRelease(rgbColorSpace);
                BOOL result = [assetWriterPixelBufferAdaptor1 appendPixelBuffer:imageBuffer withPresentationTime:currentTime];
                // imageBuffer is owned by sampleBuffer, so releasing the sample buffer is enough
                CFRelease(sampleBuffer);
                if (!result)
                {
                    // Writing failed: report and stop
                    dispatch_async(dispatch_get_main_queue(), ^
                    {
                        [[NSNotificationCenter defaultCenter] postNotificationName:EDITOR_FAILED_NOTI object:resourceURL.path];
                        [reader cancelReading];
                        queueInProgress = NO;
                        [self removeCurrentVideo];
                    });
                    break;
                }
            }
            else
            {
                [videoWriterInput markAsFinished]; // Called once the video track is fully written
                switch ([reader status])
                {
                    case AVAssetReaderStatusReading:
                        // The reader has more for other tracks, even if this one is done
                        break;
                    case AVAssetReaderStatusCompleted:
                    {
                        if (!readerOutput)
                        {
                            // No audio track, so finish the file right away
                            if ([videoWriter respondsToSelector:@selector(finishWritingWithCompletionHandler:)])
                            {
                                [videoWriter finishWritingWithCompletionHandler:^
                                {
                                    dispatch_async(dispatch_get_main_queue(), ^
                                    {
                                        // Conversion done; update your UI here
                                    });
                                }];
                            }
                            else
                            {
                                if ([videoWriter finishWriting])
                                {
                                    dispatch_async(dispatch_get_main_queue(), ^
                                    {
                                        // Conversion done; update your UI here
                                    });
                                }
                            }
                            break;
                        }
                        // Hook up the audio track: copy its samples across unmodified,
                        // then finish writing once the audio reader completes
                        [audioReader startReading];
                        [videoWriter startSessionAtSourceTime:kCMTimeZero];
                        dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
                        [audioWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^
                        {
                            while (audioWriterInput.readyForMoreMediaData)
                            {
                                CMSampleBufferRef nextBuffer;
                                if ([audioReader status] == AVAssetReaderStatusReading &&
                                    (nextBuffer = [readerOutput copyNextSampleBuffer]))
                                {
                                    if (nextBuffer)
                                    {
                                        [audioWriterInput appendSampleBuffer:nextBuffer];
                                    }
                                    CFRelease(nextBuffer);
                                }
                                else
                                {
                                    [audioWriterInput markAsFinished];
                                    switch ([audioReader status])
                                    {
                                        case AVAssetReaderStatusCompleted:
                                            // Everything read: finish the output file
                                            if ([videoWriter respondsToSelector:@selector(finishWritingWithCompletionHandler:)])
                                            {
                                                [videoWriter finishWritingWithCompletionHandler:^
                                                {
                                                }];
                                            }
                                            else
                                            {
                                                if ([videoWriter finishWriting])
                                                {
                                                }
                                            }
                                            break;
                                        default:
                                            break;
                                    }
                                    break;
                                }
                            }
                        }];
                        break;
                    }
                    case AVAssetReaderStatusFailed:
                    {
                        break;
                    }
                    default:
                        break;
                }
                break;
            }
        }
    }];
}
Thanks!
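As a sketch of what could go at the "Add your text drawing code here" marker above: wrap the locked BGRA pixel buffer in a CGBitmapContext and draw with UIKit string drawing. The string, font, position, and the 5-15 s window are assumptions:

// Wrap the locked pixel buffer in a bitmap context (BGRA, as configured above)
CGContextRef textCtx = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                             rgbColorSpace,
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Only draw during the 5-15 s window
if (CMTimeGetSeconds(currentTime) >= 5.0 && CMTimeGetSeconds(currentTime) <= 15.0)
{
    // Flip the context so UIKit string drawing comes out right side up
    CGContextTranslateCTM(textCtx, 0, height);
    CGContextScaleCTM(textCtx, 1.0, -1.0);
    UIGraphicsPushContext(textCtx);
    [@"Hello" drawAtPoint:CGPointMake(20, 20)
           withAttributes:@{ NSFontAttributeName : [UIFont boldSystemFontOfSize:36],
                             NSForegroundColorAttributeName : [UIColor whiteColor] }];
    UIGraphicsPopContext();
}
CGContextRelease(textCtx);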

How to get now-playing information in a third-party music application?

I want to get the now-playing information, so I tried the following code:
NSDictionary *info = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo];
NSString *title = [info valueForKey:MPMediaItemPropertyTitle];
NSLog(@"%@", title);

MPMusicPlayerController *pc = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *playingItem = [pc nowPlayingItem];
if (playingItem) {
    NSInteger mediaType = [[playingItem valueForProperty:MPMediaItemPropertyMediaType] integerValue];
    if (mediaType == MPMediaTypeMusic) {
        NSString *songTitle = [playingItem valueForProperty:MPMediaItemPropertyTitle];
        NSString *albumTitle = [playingItem valueForProperty:MPMediaItemPropertyAlbumTitle];
        NSString *artist = [playingItem valueForProperty:MPMediaItemPropertyArtist];
        NSString *genre = [playingItem valueForProperty:MPMediaItemPropertyGenre];
        TweetTextField.text = [NSString stringWithFormat:@"#nowplaying %@ - %@ / %@ #%@", artist, songTitle, albumTitle, genre];

        // Scale the artwork up to 250x250 for display
        MPMediaItemArtwork *artwork = [playingItem valueForProperty:MPMediaItemPropertyArtwork];
        CGSize newSize = CGSizeMake(250, 250);
        UIGraphicsBeginImageContext(newSize);
        [[artwork imageWithSize:CGSizeMake(100.0, 100.0)] drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
        UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        _imageView.image = newImage;
    }
    if (_imageView.image == nil) {
    } else {
        _tableView.alpha = 0.5;
    }
}
But this code only gets now-playing information from the default iPod application.
How can I get now-playing information from third-party music applications
(e.g. Mobile Safari, the YouTube app, gMusic, Melodies, etc.)?
I don't think this is possible. The documentation states that MPNowPlayingInfoCenter is only for setting now-playing information on the lock screen.
Here is a related question.
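For reference, the supported direction is the opposite one: your own app publishes its now-playing info for the system to display, along these lines:

// Publish our own app's track info to the lock screen / control center
MPNowPlayingInfoCenter *center = [MPNowPlayingInfoCenter defaultCenter];
center.nowPlayingInfo = @{
    MPMediaItemPropertyTitle                    : @"Some Title",
    MPMediaItemPropertyArtist                   : @"Some Artist",
    MPMediaItemPropertyPlaybackDuration         : @300,
    MPNowPlayingInfoPropertyElapsedPlaybackTime : @0
};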

Issue in generating image files in a Mac OS X Cocoa app

- (void)processImage:(NSString *)inputPath :(int)imageWidth :(int)imageHeight :(NSString *)outputPath
{
    // NSImage *img = [NSImage imageNamed:inputPath];
    NSImage *image = [[NSImage alloc] initWithContentsOfFile:inputPath];
    [image setSize:NSMakeSize(imageWidth, imageHeight)];
    [[image TIFFRepresentation] writeToFile:outputPath atomically:NO];
    NSLog(@"image file created");
}
- (IBAction)processImage:(id)sender
{
    NSTimeInterval timeStamp = [[NSDate date] timeIntervalSince1970];
    // NSTimeInterval is defined as double; truncate it to an int for the file name
    NSNumber *timeStampObj = [NSNumber numberWithInt:timeStamp];
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    [formatter setNumberStyle:NSNumberFormatterNoStyle];
    NSString *convertNumber = [formatter stringForObjectValue:timeStampObj];
    NSLog(@"timeStampObj:: %@", convertNumber);
    fileNameNumber = [[convertNumber stringByAppendingString:[self genRandStringLength:8]] retain];

    int i; // Loop counter
    // Loop through all the files and process them
    for (i = 0; i < [files count]; i++)
    {
        inputFilePath = [[files objectAtIndex:i] retain];
        NSLog(@"filename::: %@", inputFilePath);
        // Do something with the filename
        [selectedFile setStringValue:inputFilePath];
        NSLog(@"selectedFile:::: %@", selectedFile);
    }
    NSLog(@"curdir:::::%@", inputFilePath);

    NSString *aString = [[NSString stringWithFormat:@"%@%@%@", thumbnailDirPath, @"/", fileNameNumber] retain];
    fileNameJPG = [[aString stringByAppendingString:@"_small.jpg"] retain];
    fileNameJPG1 = [[aString stringByAppendingString:@".jpg"] retain];
    fileNameJPG2 = [[aString stringByAppendingString:@"_H.jpg"] retain];

    [self processImage:inputFilePath :66 :55 :fileNameJPG];
    [self processImage:inputFilePath :800 :600 :fileNameJPG1];
    [self processImage:inputFilePath :320 :240 :fileNameJPG2];
}
The issue I'm facing: the code above generates 3 files with different names (as I defined them), but all 3 files end up with the same pixel dimensions, not the width/height I pass to the function.
What could be the issue?
-setSize: only changes an NSImage's logical (display) size; it doesn't resample the underlying bitmap representation, so -TIFFRepresentation still writes the original pixels.
You should use something like the following code (adapted from here), which actually redraws the image at the target size:
- (void)saveImageAtPath:(NSString *)sourcePath toPath:(NSString *)targetPath withWidth:(int)targetWidth andHeight:(int)targetHeight
{
    NSImage *sourceImage = [[NSImage alloc] initWithContentsOfFile:sourcePath];
    NSImage *targetImage = [[NSImage alloc] initWithSize:NSMakeSize(targetWidth, targetHeight)];
    NSSize sourceSize = [sourceImage size];
    NSRect sourceRect = NSMakeRect(0, 0, sourceSize.width, sourceSize.height);
    // Note: targetHeight here (the original answer mistakenly used targetWidth for the height)
    NSRect targetRect = NSMakeRect(0, 0, targetWidth, targetHeight);

    // Redraw the source image into the target image at the new size
    [targetImage lockFocus];
    [sourceImage drawInRect:targetRect fromRect:sourceRect operation:NSCompositeSourceOver fraction:1.0];
    [targetImage unlockFocus];

    [[targetImage TIFFRepresentation] writeToFile:targetPath atomically:NO];
    NSLog(@"image file created");
    [sourceImage release];
    [targetImage release];
}
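With that helper in place, the three processImage: calls in the question would become, for example:

[self saveImageAtPath:inputFilePath toPath:fileNameJPG withWidth:66 andHeight:55];
[self saveImageAtPath:inputFilePath toPath:fileNameJPG1 withWidth:800 andHeight:600];
[self saveImageAtPath:inputFilePath toPath:fileNameJPG2 withWidth:320 andHeight:240];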