This question already has answers here:
UIImageWriteToSavedPhotosAlbum saves to wrong size and quality
(2 answers)
Closed 7 years ago.
I'm using code like the following to save the current screen to the photos library:
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(result, nil,nil, nil);
This works, but the quality looks a lot like a compressed JPEG (and the saved file name is in the standard IMG_XXXX.JPG format), even though nowhere in this code is the image type or its quality specified. Is there a way to control the quality? That is, could I have it saved as an uncompressed PNG instead?
You can save it as a PNG by using the UIImagePNGRepresentation() function, which returns an NSData; do something like this:
....
UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData* data = UIImagePNGRepresentation(result);
UIImage* pngImage = [UIImage imageWithData:data];
UIImageWriteToSavedPhotosAlbum(pngImage, nil, nil, nil);
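Note that if the result also looks soft on Retina devices, the capture itself may be happening at 1x scale: UIGraphicsBeginImageContext always creates a 1x context. A variant of the capture code using UIGraphicsBeginImageContextWithOptions (passing 0.0 so the context picks up the device's screen scale) may help:

UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.0); // 0.0 = use device scale
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Also keep in mind that UIImageWriteToSavedPhotosAlbum ultimately decides the on-disk format, so round-tripping through PNG data first may not prevent the photo library from storing a JPEG.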
Hope this helps.
Related
During first launch of the app I want to show UIImage. After 2 seconds I have to write it to file. Each next launch of the app, I have to read the file and draw it on the screen and then each 1 second, I have to draw UIBezierPath of 5 random points over it. When the app closes, the latest version of the UIImage with all UIBezierPaths renderings, have to be written to the file.
I would like to use code like this:
UIImage *imageSrc = [UIImage imageWithContentsOfFile:path];
UIGraphicsBeginImageContext(imageSrc.size);
[imageSrc drawInRect:CGRectMake(0, 0, imageSrc.size.width, imageSrc.size.height)];
[bezierPath fill]; // bezier path with 5 points
UIImage *imageNew = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *dataNew = UIImagePNGRepresentation(imageNew);
[dataNew writeToFile:path atomically:YES];
My question: is UIImagePNGRepresentation lossless? After 500 repeats of this read-draw-write cycle, will I observe quality loss? Or would it be better to use a CGBitmapContext?
I have a program that fetches an image from the library, but I'm using code I found online to resize that image so that it can fit on the screen (basically making it 640x960), but then it would still be too big to display, so in another UIImage I'm copying the first resized image and re-resizing this one to make it about 1/4 of the screen (or 160x240). The code is this:
for ViewController.h:
UIImage *img;
UIImage *thumb;
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize;
(this of course, is only the code related to my problem)
for ViewController.m
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize {
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
and in the same m file on another function, the scaleImage function is called when pressing a button with these lines:
[self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = img;
[self scaleImage:thumb toSize:CGSizeMake(160, 240)];
In the project I've previously been able to provide an image for img using [info objectForKey:UIImagePickerControllerOriginalImage], which is the image chosen from the library. I've already set up the File's Owner connections so that this function runs (and it does: a UIAlert I create inside it shows, and an NSLog prints when scaleImage starts, twice), but the image is never resized! Does anyone know why? Thank you to anyone who comments with help or suggestions!
Your scaleImage method returns the scaled image; it does not modify the image you pass in, so you need to assign the result. For example:
img = [self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = [self scaleImage:img toSize:CGSizeMake(160, 240)];
How to take graphics drawn in the UIView and save as a jpg or something on the iPhone to pull up in the "Photos" and or attach to an email?
thanks
It's fairly straightforward to get a UIImage:
UIImage* image = nil;
UIGraphicsBeginImageContext(_myView.frame.size);
{
[_myView.layer renderInContext: UIGraphicsGetCurrentContext()];
image = UIGraphicsGetImageFromCurrentImageContext();
}
UIGraphicsEndImageContext();
Then save to the photos library with this:
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
I am looking for a class that will allow the user to draw?(the main purpose is to get their signature).
I checked Cocoa Controls but couldn't find such a class. Just want to now if there's any before I start writing it myself.
Thanks
Look at this tutorial: http://www.ifans.com/forums/showthread.php?t=132024
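In case that link goes dead, here is a minimal sketch of a signature-capture view built on UIBezierPath and the standard touch callbacks. The class and property names are illustrative, not from any library:

@interface SignatureView : UIView
@property (nonatomic, strong) UIBezierPath *path;
@end

@implementation SignatureView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.path) {
        self.path = [UIBezierPath bezierPath];
        self.path.lineWidth = 2.0;
    }
    // Start a new stroke where the finger lands.
    [self.path moveToPoint:[[touches anyObject] locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Extend the current stroke and redraw.
    [self.path addLineToPoint:[[touches anyObject] locationInView:self]];
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    [[UIColor blackColor] setStroke];
    [self.path stroke];
}
@end

You could then capture the finished signature as a UIImage with the same layer-rendering technique shown elsewhere on this page.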
To get a JPG or PNG file, you can use the functions UIImageJPEGRepresentation or UIImagePNGRepresentation. They take a UIImage as input. And to get a UIImage from a UIView, you can use something like this.
UIGraphicsBeginImageContext(view.frame.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
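For the email part of the question, a sketch of attaching the resulting image via MFMailComposeViewController (assumes the MessageUI framework is linked and the presenting controller adopts MFMailComposeViewControllerDelegate; the file name is arbitrary):

#import <MessageUI/MessageUI.h>

if ([MFMailComposeViewController canSendMail]) {
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.9); // 0.9 = compression quality
    MFMailComposeViewController *mail = [[MFMailComposeViewController alloc] init];
    mail.mailComposeDelegate = self;
    [mail addAttachmentData:jpegData mimeType:@"image/jpeg" fileName:@"drawing.jpg"];
    [self presentViewController:mail animated:YES completion:nil];
}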
I Use the following code to load png image:
UIImage *imageBack1 = [UIImage imageNamed:@"Bar1.png"];
UIImage *imageBack2 = [UIImage imageNamed:@"Bar2.png"];
imageBack1 loads correctly, but imageBack2's value is nil. Bar1.png and Bar2.png are located in the same place, so why can't Bar2.png be loaded?
In fact, the problem came from the PNG file itself, not from Cocoa: iPhoto couldn't recognize it either.