Why is TIFFRepresentation bigger than the actual image file? - objective-c

Here's the code I use:
NSImage *image = [[NSBundle bundleForClass:[self class]] imageForResource:@"test.jpg"];
NSData *originalData = [image TIFFRepresentationUsingCompression:NSTIFFCompressionJPEG factor:1.];
originalData.length gives me 1802224 bytes (1.7MB), while the image on disk is 457KB. Why does the TIFFRepresentation get larger, and which representation should I use to get the original image? (I want to transfer the image over the network.)

TIFF is much bigger than JPEG. No surprise there.
However, the answer to your actual question is: don't use any "representation". You have the data (namely, the file itself); send it! Don't turn the image file into an NSImage; grab it as an NSData.
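To make "send the file itself" concrete, here is a minimal sketch in plain C (the function name and error handling are illustrative, not from the question). In Cocoa the equivalent is a one-liner such as [NSData dataWithContentsOfFile:path], which hands you exactly the 457KB sitting on disk, with no decode/re-encode round trip:

```c
#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch: read a file's raw bytes exactly as they sit on
   disk. No image decoding happens, so a 457KB JPEG stays 457KB. */
unsigned char *read_file_bytes(const char *path, size_t *out_len) {
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);
    unsigned char *buf = malloc((size_t)size);
    if (buf && fread(buf, 1, (size_t)size, f) != (size_t)size) {
        free(buf);
        buf = NULL;
    }
    fclose(f);
    if (buf && out_len) *out_len = (size_t)size;
    return buf;  /* caller frees; these are the bytes to send */
}
```

Whatever transport you use, sending these bytes preserves the original compression; re-encoding through TIFFRepresentation does not.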

Related

Cannot get creation date from screenshot png file stored in Photo Library

I am trying to get the creation date for pictures stored in the photo library using PHPhotoLibrary. The code works fine for all pictures except for a couple of PNG files that were created as screenshots on the phone by clicking the home and power buttons.
When I iterate through the PHFetchResult and load the CIImage, the PNG files do not contain the TIFF dictionary that holds the creation date for all the other JPG files. I have searched Stack Overflow extensively and found similar issues, but no solution yet. I have tried loading the files in different ways and have also set the "Compress PNG Files" setting to NO in the Build Settings.
When I iterate through the PHAssets (called pic) in the PHFetchResult, the PNG image properties return a dictionary with only five elements and no TIFF dictionary. The JPG files all return a much larger dictionary, including the TIFF dictionary where I can find the creation date. Here is my current code, which returns an empty pictiff dictionary for the PNG files:
[imagemanager requestImageDataForAsset:pic options:ro resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
    CIImage *ciimage = [CIImage imageWithData:imageData];
    NSMutableDictionary *exif = [NSMutableDictionary dictionary];
    [exif addEntriesFromDictionary:ciimage.properties];
    // Get the file name
    NSURL *picpath = [info objectForKey:@"PHImageFileURLKey"];
    NSString *myfilename = [picpath lastPathComponent];
    // Get the creation date
    NSDictionary *pictiff = exif[@"{TIFF}"];
    NSString *picdate = pictiff[@"DateTime"];
}];
I would really appreciate some help on getting the creation date for these PNG files.
Deployment target 8.0, Xcode 7.3.1, iPhone 6, iOS 8.4
Well, I was hoping I could get the file name and creation date from the same object for all the pictures but I couldn't. I still don't know why the TIFF dictionary was not available in the PNG files so I used these answers:
To get the creation date: How to get photo creation date in objective c
To get the file name: iOS8 Photos Framework: How to get the name(or filename) of a PHAsset?

Create thumbnail for large .psd and .tiff files in cocoa (objective-c)

I am developing an OS X application dealing with very large images (500 MB to 1.0+ GB). My application needs to load the images (.psd & .tif) and let the user sort the images, rate them, and so on.
I would like to load a small thumbnail of the image. So here is what I am struggling with:
I have tried generating the thumbnail in three different ways, and the fastest has been about 17 seconds per thumbnail.
I am looking for recommendations on how to decrease the thumbnail generation time. Do you guys know a library that I could use? Maybe another way to speed this up.
Attempts:
I used CGImage's thumbnail generation method CGImageSourceCreateThumbnailAtIndex to generate the image.
I embedded AppleScript in my Cocoa app and used the following command: do shell script ("/usr/bin/qlmanage -t -s640 " & quoted form of file_Path & " -o " & quoted form of save_path)
I used grabbing the thumbnail from image preview using QLThumbnailImageCreate
Both 1 & 2 take around 17 seconds per image. 3 returns a blank image; I think that has to do with the fact that Preview needs to load it first.
I also tried using GCD (Grand Central Dispatch) to speed things up, but it seems that, due to the disk-read bottleneck, the operations always run serially and never in parallel. So multi-threading with different queues didn't help (I used dispatch_async).
It's worth mentioning that all these images live on an external hard drive that my application reads from. The idea is to do this without needing to move the files.
Again, I am using Objective-C and developing on OS X 10.8. I am hoping there is a C/C++ library or something faster than the three options I found myself.
Any help is greatly appreciated.
Thank you.
If there is a preview/thumbnail embedded in the file, you can try to create the CGImageSource with a mapped file (so only the bytes that are really necessary for generating the thumbnail will be read from the disk):
NSURL* inURL = ... // The URL for your PSD or TIFF file
// Create an NSData object that maps the file instead of copying it all into memory:
NSDataReadingOptions dataReadingOptions = NSDataReadingMappedIfSafe;
NSData* data = [[NSData alloc] initWithContentsOfURL:inURL options:dataReadingOptions error:nil];
// Create a CGImageSourceRef that does not cache the decompressed result:
NSDictionary* sourceOptions =
@{(id)kCGImageSourceShouldCache: (id)kCFBooleanFalse,
  (id)kCGImageSourceTypeIdentifierHint: (id)typeName
};
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, (CFDictionaryRef)sourceOptions);
// Create a thumbnail without caching the decompressed result:
NSDictionary* thumbOptions = @{(id)kCGImageSourceShouldCache: (id)kCFBooleanFalse,
  (id)kCGImageSourceCreateThumbnailWithTransform: (id)kCFBooleanTrue,
  (id)kCGImageSourceCreateThumbnailFromImageIfAbsent: (id)kCFBooleanTrue,
  (id)kCGImageSourceCreateThumbnailFromImageAlways: (id)kCFBooleanFalse,
  (id)kCGImageSourceThumbnailMaxPixelSize: [NSNumber numberWithInteger:kIMBMaxThumbnailSize]};
CGImageRef result = CGImageSourceCreateThumbnailAtIndex(source, 0, (CFDictionaryRef)thumbOptions);
// Clean up
CFRelease(source);
[data release];
// Return the result (autoreleased):
return [NSMakeCollectable(result) autorelease];
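The NSDataReadingMappedIfSafe option above works because the file is memory-mapped: the kernel only pages in the bytes the image decoder actually touches. The same idea, sketched in plain C with POSIX mmap (the function and its name are hypothetical, for illustration only):

```c
#include <fcntl.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Illustrative sketch: map a file and copy out only its first n bytes.
   Pages are faulted in lazily, so "opening" a 1 GB file this way does
   not read 1 GB from disk -- only the pages actually touched. */
int read_header_mapped(const char *path, unsigned char *out, size_t n) {
    int fd = open(path, O_RDONLY);
    if (fd < 0) return -1;
    struct stat st;
    if (fstat(fd, &st) < 0 || (size_t)st.st_size < n) {
        close(fd);
        return -1;
    }
    void *map = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd);
    if (map == MAP_FAILED) return -1;
    memcpy(out, map, n);  /* touches only the first page(s) */
    munmap(map, (size_t)st.st_size);
    return 0;
}
```

Whether this actually avoids the 17-second cost depends on the file containing an embedded preview; if CGImageSourceCreateThumbnailAtIndex has to decode the full image anyway, mapping alone won't save you.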

How to serialize NSImage to an sql script in order to import it in a blob column?

My iPhone app uses an SQLite database to store data. I have a table with a blob column where I store my images.
When I do an update I don't want to overwrite the user's database; I want to execute some SQL scripts and inject new data if needed.
I have a utility app for Mac that should generate the SQL scripts to be run on the iPhone.
I have the images stored as NSImages in my app, but I have problems when I want to export the data as SQL scripts (simple text files).
My files should have lines like:
Insert into Images(imageData) values ( ___IMAGE1_DATA___ );
Insert into Images(imageData) values ( ___IMAGE2_DATA___ );
Insert into Images(imageData) values ( ___IMAGE3_DATA___ );
The question is: how can I serialize the image data into my SQL script so that it imports correctly into the blob column?
You can use TIFFRepresentation of the NSImage to get hold of the NSData representation.
Actually, I would save the images to disk and only store references to them in your SQLite database. This can improve performance when your images tend to be large.
You will need to get the underlying CGImage object, using the CGImage method of UIImage. Then take a look at the CGImage reference:
http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Reference/CGImage/Reference/reference.html
Then use a CGContextRef object to draw the image, and obtain the pixel data.
There's a good example on StackOverflow:
How to get pixel data from a UIImage (Cocoa Touch) or CGImage (Core Graphics)?
You can also use the UIImagePNGRepresentation or UIImageJPEGRepresentation functions, which return an NSData object directly, if those formats suit you. That will be a lot simpler.
Then use the bytes method of NSData to get a pointer to the image data (const void *).
I found a solution to serialize the images into an SQL script and then insert them into the DB (blob column).
I extract the NSData from an image like this: NSData *thumbnailData = [thumbnail TIFFRepresentation]; (thanks Nick).
After I extract the NSData, I convert it into a hex string using the method below, which I added to a category on NSData.
- (NSString *)hexString {
    NSMutableString *stringBuffer = [NSMutableString stringWithCapacity:([self length] * 2)];
    const unsigned char *dataBuffer = [self bytes];
    NSUInteger i;
    for (i = 0; i < [self length]; ++i)
        [stringBuffer appendFormat:@"%02x", dataBuffer[i]];
    return [[stringBuffer copy] autorelease];
}
NSString *hexRepresentation = [thumbnailData hexString];
The hexRepresentation will look like below:
4d4d002a00005a48fafafafff8f8f8fff8f8f8fff9f9f9fff8f8f8fff8f8f8
…
In order to serialize the hexRepresentation of the image i created an SQL script like below:
INSERT INTO Thumbnails (Picture_uid, Thumbnail) Values(10, x'4d4d002a00005a48fafafafff8f8f8fff8f8f8fff9f9f9fff8f8f8fff8f8f8 … ');
The x'…' syntax tells the database that it is receiving data in hex format, so it knows how to deal with it.
One problem with this solution is that it doubles the size of the script: a 200KB image produces a 400KB script, although in the DB the image is still 200KB.
For me this was a good solution to update my DB using SQL scripts without writing any code.
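For reference, the hex transform itself is trivial and not Cocoa-specific. Here is a plain-C sketch of the same encoding (function name hypothetical), whose output can be pasted into the x'…' literal:

```c
#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch: encode raw bytes as lowercase hex, the same
   transform the -hexString category method above performs on NSData. */
char *hex_encode(const unsigned char *data, size_t len) {
    char *out = malloc(len * 2 + 1);
    if (!out) return NULL;
    for (size_t i = 0; i < len; i++)
        sprintf(out + i * 2, "%02x", data[i]);  /* two hex chars per byte */
    return out;  /* NUL-terminated; caller frees */
}
```

For example, the four TIFF magic bytes 4d 4d 00 2a encode to the string "4d4d002a", which is exactly how the dump above starts, and also why the script ends up twice the size of the image.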

How to append a UIImage or CGImage to an AVAssetWriter?

Is it possible to convert a UIImage instance to a CMSampleBufferRef so that it can be appended to a specified output file using AVAssetWriter's appendSampleBuffer: method?
If so ... how?
Thanks
While I haven't really tried this, one possibility is to create a CVPixelBuffer using CVPixelBufferCreateWithBytes, pointing it at the raw pixels from the UIImage.
Once that is done, since CVPixelBuffers are just CVImageBuffers, you can use CMSampleBufferCreateForImageBuffer to get a CMSampleBufferRef that you can then pass to the appendSampleBuffer: method.
As I said previously I haven't ever tried this, but it looks plausible.

Reading and writing images to an SQLite DB for iPhone use

I've set up a SQLite DB that currently reads and writes NSStrings perfectly. I also want to store an image in the database and recall it later. I've read up a bit on using NSData and encoding the image, but I'm not entirely sure what the syntax is for what I want to do. Any code snippets or examples would be greatly appreciated.
My current process goes like this:
UIImagePickerController -> User Chooses Image from Photos -> chosenImage is set to instance of UIImageView -> Now I want to take this image and store it in the DB
I should mention this call will eventually be replaced with a call to a remote server. Not sure if this makes a difference as far as performance goes.
You'll need to convert the UIImage hosted within your UIImageView into a binary BLOB for storage in SQLite. To do that, you can use the following:
NSData *dataForImage = UIImagePNGRepresentation(cachedImage);
sqlite3_bind_blob(yourSavingSQLStatement, 2, [dataForImage bytes], [dataForImage length], SQLITE_TRANSIENT);
This will generate a PNG representation of your image, store it in an NSData instance, and then bind the bytes from the NSData as a BLOB for the second argument in your SQL query. Use UIImageJPEGRepresentation in the above to store in that format, if you like. You will need to have a BLOB column added to the appropriate table in your SQLite database.
To retrieve this image, you can use the following:
NSData *dataForCachedImage = [[NSData alloc] initWithBytes:sqlite3_column_blob(yourLoadingSQLStatement, 2) length: sqlite3_column_bytes(yourLoadingSQLStatement, 2)];
self.cachedImage = [UIImage imageWithData:dataForCachedImage];
[dataForCachedImage release];
One option (and generally preferred when working in SQL) is to write the image to a file on the system and store the path (or some other kind of identifier) in the database.
Apple's recommendation is not to store BLOBs bigger than ~2 kilobytes in SQLite databases.
SQLite organizes databases into pages. Each page is 4 kilobytes in size. When you read data from the SQLite database file it loads these pages into an internal page cache. On the iPhone I think this cache defaults to 1 megabyte in size. This makes reading adjacent records very fast because they will probably be in the page cache already.
When SQLite reads your database record into memory it reads the entire record and all of the pages that it occupies. So if your record contains a BLOB, it could occupy many pages and you will be ejecting existing pages from the cache and replacing them with your BLOB record's pages.
This isn't so bad if you're just scanning through and loading all your BLOBs to do something with them (display them, for example). But if, say, you ran a query that only wanted some data in the same row as the BLOB, that query would be much slower than if the record did not contain the large BLOB.
So at a minimum you should store your BLOB data in a separate table. Eg:
CREATE TABLE blobs ( id INTEGER PRIMARY KEY, data BLOB );
CREATE TABLE photos ( id INTEGER PRIMARY KEY, name TEXT, blob_id INTEGER,
FOREIGN KEY(blob_id) REFERENCES blobs(id) );
Or better yet, store the BLOB data as files outside of the SQLite database.
Note that it may be possible to tweak the page cache size with SQL PRAGMA statements (if you're not using CoreData).
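To put rough numbers on the page argument above (assuming the default 4 KB page size and the ~1 MB cache mentioned earlier; both can vary by device and PRAGMA settings), a small sketch:

```c
/* Illustrative arithmetic: how many pages a record spans. The page
   size is the default discussed above, not a guarantee. */
#define SQLITE_PAGE_SIZE 4096u

unsigned pages_for_record(unsigned record_bytes) {
    /* round up: a record occupies whole pages */
    return (record_bytes + SQLITE_PAGE_SIZE - 1) / SQLITE_PAGE_SIZE;
}
```

A single 200 KB image BLOB therefore spans 50 pages, roughly a fifth of a 1 MB page cache evicted just to read one row, which is why moving BLOBs to their own table (or out of the database) helps.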
Writing Image to SQLite DB
if (myImage != nil) {
    NSData *imgData = UIImagePNGRepresentation(myImage);
    sqlite3_bind_blob(update_statement, 6, [imgData bytes], [imgData length], NULL);
}
else {
    sqlite3_bind_blob(update_statement, 6, nil, -1, NULL);
}
Reading From SQLite DB:
NSData *data = [[NSData alloc] initWithBytes:sqlite3_column_blob(init_statement, 6) length:sqlite3_column_bytes(init_statement, 6)];
if (data == nil)
    NSLog(@"No image found.");
else
    self.pictureImage = [UIImage imageWithData:data];
You should first write the image to the file system, then take the image path and store that path (URL) as TEXT in SQLite.
The best practice is to compress the image and store it in the database or on the file system. If you don't care about the resolution of the image, you can even resize it using:
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
After that you can use UIImageJPEGRepresentation for JPEG images with a compressionQuality of 0 for maximum compression, or UIImagePNGRepresentation for PNG images.