I have a simple list of items that needs to be printed using Cocoa. I have a half-baked solution that uses an NSView with a custom drawRect: method, but it's fairly complex and not very easy to maintain.
What I would like to have is an HTML string (which could be easily constructed from the list) that can be embedded in a one-off WebView, then printed.
Assuming I have a simple NSString like:
NSString *htmlString = @"<b>Test</b>";
What's the easiest method for creating a WebView displaying this content? I've tried the below code, but it results in a single blank page:
WebView *webView = [[WebView alloc] init];
NSString *dir = @"/Users/Me/Desktop/";
NSString *fileUrl = [dir stringByAppendingPathComponent:@"Temp_Print.html"];
NSString *htmlString = @"<b>Hi!</b>";
[[htmlString dataUsingEncoding:NSUTF8StringEncoding] writeToFile:fileUrl atomically:YES];
[[webView mainFrame] loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:fileUrl]]];
[webView setFrame:NSMakeRect(0, 0, 500, 500)];
NSPrintInfo *pi = [[NSPrintInfo sharedPrintInfo] copy]; // (print info assumed to be set up along these lines)
NSPrintOperation *po = [NSPrintOperation printOperationWithView:webView printInfo:pi];
[pi release];
[po runOperation];
Another one of those questions you solve right after asking it!
The run loop needs to iterate in order for the content to actually load. I simply moved the actual print operation into the frame load delegate method:
- (void)webView:(WebView *)sender didFinishLoadForFrame:(WebFrame *)frame {
...
}
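For reference, here's a minimal sketch of the whole flow under manual reference counting, assuming the class that kicks off the load also acts as the frame load delegate (the HTML string and frame size are just placeholders):

// Somewhere in the controller: load the HTML string directly, no temporary file needed.
WebView *webView = [[WebView alloc] initWithFrame:NSMakeRect(0, 0, 500, 500)
                                        frameName:nil
                                        groupName:nil];
[webView setFrameLoadDelegate:self];
[[webView mainFrame] loadHTMLString:@"<b>Hi!</b>" baseURL:nil];

// Frame load delegate: print only once the content has actually loaded.
- (void)webView:(WebView *)sender didFinishLoadForFrame:(WebFrame *)frame
{
    if (frame != [sender mainFrame]) {
        return;
    }
    NSPrintOperation *po = [NSPrintOperation printOperationWithView:[[frame frameView] documentView]
                                                           printInfo:[NSPrintInfo sharedPrintInfo]];
    [po runOperation];
}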
Is there any way to load text from a file into a UITextView in parts yet keep scrolling smooth (like loading text into a UITableView and splitting it up among multiple cells)?
For example: I have an RTF (rich-text file) with over 120,000 words in it. Due to the attributes and the number of words, scrolling is choppy.
I want to try to free up memory. Instead of loading the entire text file at once, I would like to load just enough text to fit on the screen and make scrolling smooth.
I've seen this answer: Reading Specific Part Of A Text File that deals with reading specific parts of a file, but my file isn't broken up by special characters in any way.
Instead of using a UITextView and RTF, you can use a UIWebView and HTML (use a conversion tool to turn the RTF into HTML). Here is how to hook it all up and change the text size. You can call resizeWebViewToSize: at any time to resize the text (a usage example follows the code).
- (void)viewDidLoad
{
[super viewDidLoad];
self.webView.delegate = self;
}
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
NSString *htmlFile = [[NSBundle mainBundle] pathForResource:@"HtmlFile" ofType:@"html"];
NSString* htmlString = [NSString stringWithContentsOfFile:htmlFile encoding:NSUTF8StringEncoding error:nil];
[self.webView loadHTMLString:htmlString baseURL: [[NSBundle mainBundle] bundleURL]];
}
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
[self resizeWebViewToSize:@"1000"];
}
- (void)resizeWebViewToSize:(NSString *)size
{
NSString *fontSize = size;
NSString *jsString = [NSString stringWithFormat:@"document.getElementsByTagName('body')[0].style.webkitTextSizeAdjust = '%d%%'", [fontSize intValue]];
[self.webView stringByEvaluatingJavaScriptFromString:jsString];
}
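As a quick usage example, you could then bump the text size from any control action (the method name and the 150% value here are arbitrary):

- (IBAction)makeTextBigger:(id)sender
{
    // 150 means 150% of the default text size.
    [self resizeWebViewToSize:@"150"];
}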
In the application I'm creating, I load a long page of HTML into a webView and then print it to a PDF using the following:
-(void)webView:(WebView *)sender didFinishLoadForFrame:(WebFrame *)frame
{
if ([frame isEqual:[[self doc] mainFrame]])
{
NSMutableData *newData = [[NSMutableData alloc] init];
NSPrintInfo *newInfo = [NSPrintInfo sharedPrintInfo];
NSView *docView = [[[[self doc] mainFrame] frameView] documentView];
NSPrintOperation *newPrintOp = [NSPrintOperation PDFOperationWithView:docView insideRect:docView.bounds toData:newData printInfo:newInfo];
BOOL runPrint = [newPrintOp runOperation];
if (!runPrint)
{
NSLog(@"Print Failed");
}
PDFDocument *newDoc = [[PDFDocument alloc] initWithData:newData];
[newData release];
[self setPdf:newDoc];
//Other code here
}
}
The problem is that when I look at newDoc, it is a huge PDF of a single page. What I would prefer would be the printing acting the same as it does from the "save as PDF..." dialog - that is, splitting the PDF into multiple reasonably-sized pages.
Does anyone know how to accomplish this?
I attempted inserting the following after NSPrintInfo *newInfo = [NSPrintInfo sharedPrintInfo];
[newInfo setVerticalPagination:NSAutoPagination];
[newInfo setHorizontalPagination:NSAutoPagination];
NSAutoPagination is described in the docs as the following:
NSAutoPagination
The image is divided into equal-sized rectangles and placed in one column of pages.
Available in Mac OS X v10.0 and later.
Declared in NSPrintInfo.h.
This had no effect on the printed PDF.
You get a file with one large page because the +PDFOperationWithView:insideRect:toData:printInfo: method doesn't support pagination at all. For that reason, calling -setVerticalPagination: or -setHorizontalPagination: doesn't change anything.
You could try the "classical" +printOperationWithView:printInfo: method, configure it to save the PDF to a temporary location, and then create a PDFDocument with the contents of the obtained file. I hope the fragment of code below helps.
NSMutableDictionary *dict = [[NSPrintInfo sharedPrintInfo] dictionary];
[dict setObject:NSPrintSaveJob forKey:NSPrintJobDisposition];
[dict setObject:temporaryFilePath forKey:NSPrintSavePath];
NSPrintInfo *pi = [[NSPrintInfo alloc] initWithDictionary:dict];
[pi setHorizontalPagination:NSAutoPagination];
[pi setVerticalPagination:NSAutoPagination];
NSPrintOperation *op = [NSPrintOperation printOperationWithView:[[[webView mainFrame] frameView] documentView] printInfo:pi];
[pi release];
[op setShowsPrintPanel:NO];
[op setShowsProgressPanel:NO];
if ([op runOperation]) {
PDFDocument *doc = [[[PDFDocument alloc] initWithURL:[NSURL fileURLWithPath: temporaryFilePath]] autorelease];
// do with doc what you want, remove file, etc.
}
So I have an app I've written for the iPad, and I'd like to be able to allow users to insert images into their documents by selecting an image from an album or the camera. All that works great. Because the user might keep the document longer than they keep the image in an album, I make a copy of it, scale it down a bit, and store it in a Core Data table used just for this purpose.
I store the image like this:
NSManagedObjectContext* moc=[(ActionNote3AppDelegate *)[[UIApplication sharedApplication] delegate] managedObjectContext];
NSString* imageName=[NSString stringWithFormat:@"img%lf.png",[NSDate timeIntervalSinceReferenceDate]];
Image* anImage = [NSEntityDescription insertNewObjectForEntityForName:@"Image" inManagedObjectContext:moc];
anImage.imageName=imageName;
anImage.imageData=UIImagePNGRepresentation(theImage);
NSError* error=nil;
if(![moc save:&error]) {...
I subclass NSURLCache, as suggested on Cocoa With Love, and override cachedResponseForRequest: as follows:
- (NSCachedURLResponse *)cachedResponseForRequest:(NSURLRequest *)request {
NSString *pathString = [[[request URL] absoluteString]lastPathComponent];
NSData* data = [Image dataForImage:pathString];
if (!data) {
return [super cachedResponseForRequest:request];
}
NSURLResponse *response =[[[NSURLResponse alloc]
initWithURL:[request URL]
MIMEType:@"image/png"
expectedContentLength:[data length]
textEncodingName:nil]
autorelease];
NSCachedURLResponse* cachedResponse =[[[NSCachedURLResponse alloc] initWithResponse:response data:data] autorelease];
return cachedResponse;
}
I also make sure the app uses the subclassed NSURLCache by doing this in my app delegate, in didFinishLaunchingWithOptions:
ANNSUrlCache *uCache = [[ANNSUrlCache alloc] init];
[NSURLCache setSharedURLCache:uCache];
The method that returns the image data from the core data record looks like this:
+(NSData*)dataForImage:(NSString *)name {
NSData* retval=nil;
NSManagedObjectContext* moc=[(ActionNote3AppDelegate *)[[UIApplication sharedApplication] delegate] managedObjectContext];
NSEntityDescription *entityDescription = [NSEntityDescription entityForName:@"Image" inManagedObjectContext:moc];
NSFetchRequest *request = [[[NSFetchRequest alloc] init] autorelease];
[request setEntity:entityDescription];
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"imageName==%@", name];
[request setPredicate:predicate];
NSError* error=nil;
NSArray *array = [moc executeFetchRequest:request error:&error];
if ([array count]>0) {
retval=((Image*)[array objectAtIndex:0]).imageData;
}
return retval;
}
To insert the image into the web view, I have an HTML img tag where the name in src="" relates back to the name in the image table. The point of the NSURLCache code above is to watch for a name we have stored in the image table, intercept it, and send back the actual image data for the image requested.
When I run this, I see the image getting requested in my subclassed NSURLCache object. It is finding the right record and returning the data as it should. However, I'm still getting the image-not-found icon in my UIWebView.
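For context, the img tag in the generated HTML is built along these lines (a sketch; only the bare name goes into src so the cache subclass can match it against the stored imageName):

NSString *imgTag = [NSString stringWithFormat:@"<img src=\"%@\" />", anImage.imageName];
// imgTag is appended to the HTML string loaded into the UIWebView, so the
// request's lastPathComponent matches the imageName record in Core Data.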
Marcus (below) suggested that I not store the image data in a Core Data table, so I made changes to accommodate that:
Storing the image:
NSString* iName=[NSString stringWithFormat:@"img%lf.png",[NSDate timeIntervalSinceReferenceDate]];
NSData* iData=UIImagePNGRepresentation(theImage);
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* documentsDirectory = [paths objectAtIndex:0];
NSString* fullPathToFile = [documentsDirectory stringByAppendingPathComponent:iName];
[iData writeToFile:fullPathToFile atomically:NO];
Retrieving the image:
- (NSCachedURLResponse *)cachedResponseForRequest:(NSURLRequest *)request {
NSString *pathString = [[[request URL] absoluteString]lastPathComponent];
NSString* iPath = [Image pathForImage:pathString];
if (!iPath) {
return [super cachedResponseForRequest:request];
}
NSData* idata=[NSData dataWithContentsOfFile:iPath];
NSURLResponse *response =[[[NSURLResponse alloc]
initWithURL:[request URL]
MIMEType:@"image/png"
expectedContentLength:[idata length]
textEncodingName:nil]
autorelease];
NSCachedURLResponse* cachedResponse =[[[NSCachedURLResponse alloc] initWithResponse:response data:idata] autorelease];
return cachedResponse;
}
In debug mode, I see that idata does get loaded with the proper image data.
And I still get the image-not-found image! Obviously, I'm doing something wrong here. I just don't know what it is.
So... What am I doing wrong here? How can I get this to work properly?
Thank you.
I would strongly suggest that you do not store the binary data in Core Data. Storing binary data in Core Data, especially on an iOS device, causes severe performance issues with the cache.
The preferred way would be to store the actual binary data on disk in a file and have a reference to the file stored within Core Data. From there it is a simple matter to change the image url to point at the local file instead.
So it turns out I was way overthinking this. When I write the HTML, I just write the path to the image into the img tag (see the sketch below). Works like a charm.
I would love to know why the solution I posed in my question did not work, though.
And, I did wind up not storing the images in a table.
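A sketch of what that ends up looking like, reusing fullPathToFile from the snippet above (whether the file:// prefix is needed depends on how the HTML is loaded, e.g. the baseURL passed to loadHTMLString:baseURL:):

NSString *imgTag = [NSString stringWithFormat:@"<img src=\"file://%@\" />", fullPathToFile];
// Append imgTag to the HTML string and load it into the UIWebView as before;
// no NSURLCache interception is required for local file paths.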
I'm fairly new to Objective-C and iOS development (coming from PHP), and I have a relatively simple question that I can't seem to find an answer to:
I am following along with an example for a split view design where a web page is loaded into the detail view when a user taps an item in the master view. I got all this working, but would like to substitute an image for the web view. So I've amended the app to load a UIImage instead of a WebView. What I'm looking for is the equivalent of this code:
NSString *urlString = [pagesAddress objectAtIndex:indexPath.row];
NSURL *url = [NSURL URLWithString:urlString];
// these 2 is where I get lost with the images.
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[detailViewController.webView loadRequest:request];
I came up with this:
NSString *imageName = [pagesAddress objectAtIndex:indexPath.row];
UIImage *myImage = [UIImage imageNamed:imageName];
// missing the last 2 calls: one to tell Xcode that it's an image "request" I want, and the second to load the actual image (based on its name, which is already in an array) into the image view.
Thanks.
PS
I tried this:
NSString *imageName = [pagesAddress objectAtIndex:indexPath.row];
[detailViewController.imageView setImage:[UIImage imageNamed:imageName]];
And it shows just the first image, then crashes when I try to show the last one.
In the end, the solution was these 2 lines, once I amended the code:
NSString *imageName = [pagesAddress objectAtIndex:indexPath.row];
[detailViewController.imageView setImage:[UIImage imageNamed:imageName]];
Notice that I had to change the setImage: call to convert the NSString to a UIImage, or Xcode would complain. It turns out it was crashing because, in the array where I had the image names, I had put 3 images into one entry (basically, I forgot the commas!), so it was out of range.
Tim:
This line you gave me
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
is unnecessary because I already have a view created; it would create another view which I never used. Also, replacing it with a CGRect seems like overkill if I already have a UIImage placeholder, no?
In any case, it works now and I'm very grateful for all the help. iPad development with Objective-C is a very thorny road and I expect I'll be bugging you guys some more.
Cheers.
Try this:
UIImage *myImage = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]]];
// don't know if you already got the following?
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
[imageView setImage:myImage];
[self.view addSubview:imageView];
The first line is synchronous (= blocking), so in production you should rather load the image asynchronously, e.g. with an NSURLConnection (but that's a bit more complicated).
Or use this for your local images:
UIImage *myImage = [UIImage imageNamed:imageName];
// Now, follow the same steps as in the first code-example, just skip the first line.
Try this (on iOS 4.0 and later):
// Execute a block of code on a background thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),
^(void)
{
NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];
UIImage* image = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
// When IO is done and image created, set it on the main thread.
dispatch_async(dispatch_get_main_queue(),
^(void)
{
imageView.image = image;
});
[pool release];
});
I'm working on my first JSON example in Objective-C and came across this great tutorial that I'm trying to reproduce. Along the way, I decided to push the returned JSON into my already working tableView (just to ensure I could do something with the data in the view).
- (void)viewDidLoad {
[super viewDidLoad];
responseData = [[NSMutableData data] retain];
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:@"http://www.unpossible.com/misc/lucky_numbers.json"]];
[[NSURLConnection alloc] initWithRequest:request delegate:self];
}
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
[connection release];
NSString *responseString = [[NSString alloc] initWithData:responseData encoding:NSUTF8StringEncoding];
[responseData release];
NSArray *luckyNumbers = [responseString JSONValue];
NSMutableString *text = [NSMutableString stringWithString:@"Nums "];
for (int i = 0; i < [luckyNumbers count]; i++)
    [text appendFormat:@"%@", [luckyNumbers objectAtIndex:i]];
self.movies = [[NSArray alloc] initWithObjects:@"First", text, @"Last", nil];
}
What I've found is that when I set the array in "connectionDidFinishLoading" it shows up as nothing in the running application - yet if I set this directly in the "viewDidLoad" method with 3 simple string values it shows up fine.
When I debug the running application I see the JSON response and the string looks valid (no issues that I can see).
Is the datasource for my tableView already set in stone before this "connectionDidFinishLoading" method or did I miss something?
Your UITableView will call upon its data source for data once initially, presumably sometime after viewDidLoad. After that first load, it will only request data as it needs it (i.e. as you scroll to different cells). If you want to make it refresh its contents when your data is ready (like after you've received your URL data), call [tableView reloadData].
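A minimal sketch of that, assuming the connection delegate also owns the table view (the tableView property name is illustrative):

- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    // ...parse the JSON and populate self.movies as before...
    [self.tableView reloadData]; // ask the table to re-query its data source
}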
My initial question was solved by this solution:
At the end of my "connectionDidFinishLoading" method I call a method on the appDelegate called "jsonFinished".
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
//do all the json work and set the array that I'm using as my datasource
self.movies = [[NSArray alloc] initWithObjects:@"First", @"Last", nil];
[appDelegate jsonFinished]; //have the app delegate do the refresh call back
}
Then inside the appDelegate I simply provide an implementation for the "jsonFinished" method that does a refresh of the UITableView
- (void)jsonFinished
{
[moviesController refreshDisplay];
}
And in the "refreshDisplay" method I do the reloadData on the tableView
- (void)refreshDisplay
{
[moviesTableView reloadData];
}
And now, after the data is loaded, the appDelegate fires off the method that reloads the data for the tableView.