NSImageView tiles inside NSScrollView draw unwanted borders when zoomed out - objective-c

I have an NSScrollView whose documentView is a huge NSView made of many sub-NSImageViews, which act as tiles in a map. (The entire map is the NSView, and since it is way bigger than the screen size, it's embedded in a scroll view.)
I'm able to display the map with correct tile positions and scroll around with the bars/gestures. However, when I enable magnification to be able to zoom, the following happens:
I'm assuming auto-layout somehow adds the tile borders below, and I don't know how to disable them. These are surely borders, since I have checked thousands of times that my tiles and subviews are the same size, so where does this come from?
I have quite some experience with iOS development, but I am completely lost with NSScrollView (where are my delegate methods?). How do I disable this behavior of the scroll view?
Here's my subview code:
- (void)setupSubViews
{
    NSLog(@"--------Creating subviews!-------");
    //first we create the subviews..
    //This is the key part, we traverse from top Left, and since OS X coordinates start at bottom left, we need to invert the rows!
    for (int i = 0; i < NUMBER_OF_COLUMNS; i++) {
        for (int j = NUMBER_OF_ROWS - 1; j >= 0; j--) {
            CGRect frame = CGRectMake(i * 256, j * 256, 256, 256);
            NSImageView *newView = [[NSImageView alloc] initWithFrame:frame];
            newView.focusRingType = NSFocusRingTypeNone; //I gave this focusRing a try, it didn't work :(
            [self addSubview:newView];
        }
    }
}
And this is where I connect the subviews to the actual images:
- (void)updateMap:(id)tilesPassed
{
    if (downloadFinished) {
        NSLog(@"--------DRAW RECT-------------");
        NSImageView *subView;
        NSInteger idx = 0;
        for (int i = 0; i < [self.subviews count]; i++) {
            subView = [self.subviews objectAtIndex:i];
            [subView setAllowsCutCopyPaste:NO];
            [subView setImageFrameStyle:NSImageFrameNone]; //This doesn't work either :(
            MapTile *tile = [tilesArray objectAtIndex:idx];
            subView.image = tile.image;
            idx++;
        }
    }
}

You probably don't want to use subviews for this. There is a CALayer subclass for exactly this purpose: CATiledLayer: https://developer.apple.com/library/mac/documentation/GraphicsImaging/Reference/CATiledLayer_class/Introduction/Introduction.html
With it you can even load different image tiles based on how far you are zoomed in, just like Google Maps. It will render without borders and the performance will be WAY better than using lots of subviews. Subviews are expensive, layers are (generally) cheap.
Update: This example will get you up and running: http://bill.dudney.net/roller/objc/entry/catiledlayer_example
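To give an idea of the shape of that approach, here is a rough, untested sketch of a layer-backed NSView backed by a CATiledLayer; the class name and the tile lookup method are placeholders, not part of the question's code:
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

// Hypothetical layer-backed map view that replaces the grid of NSImageViews.
@interface TiledMapView : NSView
@end

@implementation TiledMapView

- (instancetype)initWithFrame:(NSRect)frameRect
{
    if ((self = [super initWithFrame:frameRect])) {
        self.wantsLayer = YES; // back the view with the layer returned below
    }
    return self;
}

// Provide a CATiledLayer as the backing layer, so only the tiles that are
// actually visible at the current zoom level get drawn.
- (CALayer *)makeBackingLayer
{
    CATiledLayer *layer = [CATiledLayer layer];
    layer.tileSize = CGSizeMake(256.0, 256.0);
    layer.levelsOfDetail = 4;      // cached zoom-out levels
    layer.levelsOfDetailBias = 2;  // extra levels of detail when zoomed in
    return layer;
}

// With a tiled backing layer, drawRect: is called once per visible tile,
// with dirtyRect describing that tile's rectangle in view coordinates.
- (void)drawRect:(NSRect)dirtyRect
{
    NSInteger column = (NSInteger)(NSMinX(dirtyRect) / 256.0);
    NSInteger row    = (NSInteger)(NSMinY(dirtyRect) / 256.0);
    NSImage *tile = [self tileImageForColumn:column row:row]; // your own lookup
    [tile drawInRect:dirtyRect fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
}

// Placeholder: fetch the tile from tilesArray, disk, or the network.
- (NSImage *)tileImageForColumn:(NSInteger)column row:(NSInteger)row
{
    return nil;
}

@end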

As a quick workaround without using CATiledLayer, you can stitch all the images together as one image and add that to the main view.
Below is sample code that stitches two images:
UIImage *bottomImage = [UIImage imageNamed:@"bottom.png"]; //first image
UIImage *image = [UIImage imageNamed:@"top.png"]; //foreground image
CGSize newSize = CGSizeMake(209, 260); //size of image view
UIGraphicsBeginImageContext(newSize);
// drawing 1st image
[bottomImage drawInRect:CGRectMake(0, 0, newSize.width/2, newSize.height/2)];
// drawing the 2nd image after the 1st
[image drawInRect:CGRectMake(0, newSize.height/2, newSize.width/2, newSize.height/2)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then directly add this newImage, formed by stitching all of your tile images, to the main view.
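Since the original question is AppKit, here is a hedged sketch of the same idea with NSImage, stitching every 256x256 tile into one big image; it assumes the MapTile class, tilesArray, and the NUMBER_OF_COLUMNS / NUMBER_OF_ROWS constants from the question:
// Stitch all 256x256 tiles into a single NSImage (AppKit sketch).
NSSize mapSize = NSMakeSize(NUMBER_OF_COLUMNS * 256.0, NUMBER_OF_ROWS * 256.0);
NSImage *stitched = [[NSImage alloc] initWithSize:mapSize];
[stitched lockFocus];
NSInteger idx = 0;
for (int i = 0; i < NUMBER_OF_COLUMNS; i++) {
    for (int j = NUMBER_OF_ROWS - 1; j >= 0; j--) { // same bottom-left flip as the question
        MapTile *tile = [tilesArray objectAtIndex:idx++];
        [tile.image drawInRect:NSMakeRect(i * 256.0, j * 256.0, 256.0, 256.0)
                      fromRect:NSZeroRect
                     operation:NSCompositeSourceOver
                      fraction:1.0];
    }
}
[stitched unlockFocus];
// Show the single stitched image in one NSImageView used as the documentView.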

Related

How to crop a user scaled UIImageView

I have a UIImageView *userImage and a UIImageView *imageSquare whose size is 320x320. The user is able to play with userImage, changing its size and position. imageSquare is static, placed in the middle of the screen, and should be seen as the cropping view.
The code below can crop userImage at imageSquare's original size, but not with its new aspect ratio/scale.
I've been going crazy trying to do this but I can't find a way. How could I crop the current view of userImage (the one the user is manipulating)?
CGSize pageSize = imageSquare.frame.size;
UIGraphicsBeginImageContext(pageSize);
CGContextRef resizedContext = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(resizedContext, -imageSquare.frame.origin.x, -imageSquare.frame.origin.y);
[userImage.layer renderInContext:resizedContext];
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
if (image != nil) {
    NSLog(@"is not nil");
    NSData *imgData = UIImagePNGRepresentation(image);
    imageSquare.image = [[UIImage alloc] initWithData:imgData];
}
Your 320x320 cropping view should be a UIView with clipsToBounds ("Clip Subviews" in Interface Builder) set to YES, and then userImage should be one of its subviews.
When the user changes size/position, you change userImage.frame, and the cropping will be handled for you.
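A hedged sketch of that setup, plus one way to render the cropped result; cropView is a hypothetical name for the 320x320 clipping view described above, and userImage is the image view from the question:
// Hypothetical clipping view: anything outside its 320x320 bounds is not drawn.
UIView *cropView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 320)];
cropView.clipsToBounds = YES;
[cropView addSubview:userImage];
[self.view addSubview:cropView];

// Later, to capture exactly what is visible inside the square
// (renderInContext: needs QuartzCore):
UIGraphicsBeginImageContextWithOptions(cropView.bounds.size, NO, 0.0);
[cropView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();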

Image of NSView with background

I'm trying to get an image of a view so I can put it in an NSImageView.
I'm using NSView's bitmapImageRepForCachingDisplayInRect: method.
However, this method only returns the contents of the view itself, without the background of the window.
How can I get an image of the whole view as it looks on the screen, with the background color etc.?
Current Code
- (NSImage *)imageOfView:(NSView *)view {
    NSBitmapImageRep *rep = [view bitmapImageRepForCachingDisplayInRect:view.bounds];
    [view cacheDisplayInRect:view.bounds toBitmapImageRep:rep];
    return [[NSImage alloc] initWithCGImage:[rep CGImage] size:view.bounds.size];
}
This snippet from the project takes a snapshot of the chosen display:
/* Get the index for the chosen display from the CGDirectDisplayID array. */
NSInteger displaysIndex = [menuItem tag];
/* Make a snapshot image of the current display. */
CGImageRef image = CGDisplayCreateImage(displays[displaysIndex]);
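To get just the view out of that display snapshot, one possible follow-up (a hedged sketch, not part of the original answer) is to convert the view's rectangle to screen coordinates and crop the CGImage; this ignores Retina scaling and multi-display origins, and reuses the view, image, and displays variables from the snippets above:
// Convert the view's rectangle to screen coordinates (bottom-left origin)...
NSRect rectInWindow = [view convertRect:view.bounds toView:nil];
NSRect rectOnScreen = [view.window convertRectToScreen:rectInWindow];

// ...then flip it, because CGDisplayCreateImage uses a top-left origin.
CGFloat displayHeight = CGDisplayPixelsHigh(displays[displaysIndex]);
CGRect cropRect = CGRectMake(NSMinX(rectOnScreen),
                             displayHeight - NSMaxY(rectOnScreen),
                             NSWidth(rectOnScreen),
                             NSHeight(rectOnScreen));

CGImageRef viewImage = CGImageCreateWithImageInRect(image, cropRect);
NSImage *result = [[NSImage alloc] initWithCGImage:viewImage
                                              size:rectOnScreen.size];
CGImageRelease(viewImage);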

background of UIView gets scaled incorrectly and is offset

I saw other posts but still can't get my background image to scale correctly. I use a UIView to set a background image because I need the background image to tile vertically. I set the frame of the UIView to the same size as the pixel width and height of the PNG, but the image gets offset and scaled incorrectly. Could it be because the image has some transparent pixels? I tried the following:
_view.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"background.png"]];
_view.opaque = NO;
_view.contentMode = UIViewContentModeScaleAspectFit;
//_view.layer.contentsGravity = kCAGravityTop;
You might find it easier to just make a subclass of UIView. In the subclass, override drawRect: to draw your image using drawAsPatternInRect:. That method will fill the view by tiling the image:
- (void)drawRect:(CGRect)dirtyRect {
    [self.tileImage drawAsPatternInRect:self.bounds];
}
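A minimal sketch of the full subclass; the class name and the tileImage property are assumptions to match the snippet above:
@interface TiledBackgroundView : UIView
@property (nonatomic, strong) UIImage *tileImage; // the image to repeat until the view is filled
@end

@implementation TiledBackgroundView
- (void)drawRect:(CGRect)rect
{
    // drawAsPatternInRect: tiles the image across the whole bounds
    [self.tileImage drawAsPatternInRect:self.bounds];
}
@end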
I think it is because you are using an image as a background colour.
It would be much better if you used an image view and added that as a subview, like this:
//Edit for vertical tiling
int y = 0;
int imgsize = 100; //tile height in points
for (int i = 0; i < max; i++) { //max is the number of tiles you want
    UIImageView *img = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"background.png"]];
    img.frame = CGRectMake(0, y, 100, 100);
    y += imgsize; //increase y each time (new starting point for the next tile)
    [_view addSubview:img];
}
Might not be the best way to do it, but it will work.

Making UITableView with cell-sized images smooth scrolled

EVERYTHING WRITTEN HERE ACTUALLY WORKS RIGHT
EXCEPT FOR [UIImage imageNamed:] METHOD USAGE
Implementation
I am using a model in which you have a custom UITableViewCell with one custom UIView set up as the cell's backgroundView.
The custom UIView contains two cell-sized images (320x80 px), one of which is 100% transparent over half of the view. All elements are set to be opaque and have an alpha of 1.0.
I don't reuse cells because I failed to make them load different images. Cells are reused one-by-one, up to 9 cells overall, so I have 9 reusable cells in memory.
Part of the cell's initWithStyle:reuseIdentifier: method:
CGRect viewFrame = CGRectMake(0.0f, 0.0f, 320.0f, 80.0f);
customCellView = [[CustomCellView alloc] initWithFrame:viewFrame];
customCellView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[self setBackgroundView:customCellView];
CustomCellView's initialization method:
- (id)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.opaque = YES;
        self.backgroundColor = [UIColor UICustomColor];
    }
    return self;
}
Images are pre-loaded into an NSMutableArray as UIImage objects from PNG files with UIImage's imageNamed: method.
They are set in the data source method tableView:cellForRowAtIndexPath: and passed through the UITableViewCell to the UIView with a custom method.
They are then drawn in the UIView's overridden drawRect: method:
- (void)drawRect:(CGRect)rect {
    CGRect contentRect = self.bounds;
    if (!self.editing) {
        CGFloat boundsX = contentRect.origin.x;
        CGFloat boundsY = contentRect.origin.y;
        CGPoint point;
        point = CGPointMake(boundsX, boundsY);
        if (firstImage) { [firstImage drawInRect:contentRect blendMode:kCGBlendModeNormal alpha:1.0f]; }
        if (secondImage) { [secondImage drawInRect:contentRect blendMode:kCGBlendModeNormal alpha:1.0f]; }
    }
}
As you see images are being drawn with drawInRect:blendMode:alpha: method.
Problem
Well, the UITableView can't be scrolled at all; it gets stuck on every cell, and it's chunky and creepy.
Thoughts
Digging through sample code, Stack Overflow and forums gave me the thought of using OpenGL ES to pre-render the images, but really, is it that hard to get smooth scrolling?
What's wrong with just using UIImageViews? Are they not fast enough? (They should be if you're preloading the UIImages).
One thing to note is that [UIImage imageNamed:] won't explicitly load the image data into memory. It'll give you a reference which is backed by the data on disk. You can get around this by making a call to [yourImage CGImage].
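A hedged illustration of that preload step (imageNames is an assumed array of PNG file names); touching the CGImage while filling the array is the trick described above, so the decode work happens up front instead of during the first scroll:
NSMutableArray *preloadedImages = [NSMutableArray array];
for (NSString *name in imageNames) {   // imageNames is an assumed array of file names
    UIImage *img = [UIImage imageNamed:name];
    (void)[img CGImage];               // touch the backing CGImage so decoding happens now
    [preloadedImages addObject:img];
}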

UIScrollView setZoomScale not working?

I am struggling to get my UIScrollView to zoom in on the underlying UIImageView. In my view controller I set:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return myImageView;
}
In the viewDidLoad method I try to set the zoomScale as follows (note the UIImageView and image are set in Interface Builder):
- (void)viewDidLoad {
    [super viewDidLoad];
    myScrollView.contentSize = CGSizeMake(myImageView.frame.size.width, myImageView.frame.size.height);
    myScrollView.contentOffset = CGPointMake(941.0, 990.0);
    myScrollView.minimumZoomScale = 0.1;
    myScrollView.maximumZoomScale = 10.0;
    myScrollView.zoomScale = 0.7;
    myScrollView.clipsToBounds = YES;
    myScrollView.delegate = self;
    NSLog(@"zoomScale: %.1f, minZoomScale: %.3f", myScrollView.zoomScale, myScrollView.minimumZoomScale);
}
I tried a few variations of this, but the NSLog always shows a zoomScale of 1.0.
Any ideas where I screw this one up?
I finally got this to work. What caused the problem was the delegate assignment being at the end; I moved it up and... here we go.
New code looks like this:
- (void)viewDidLoad {
    [super viewDidLoad];
    myScrollView.delegate = self;
    myScrollView.contentSize = CGSizeMake(myImageView.frame.size.width, myImageView.frame.size.height);
    myScrollView.contentOffset = CGPointMake(941.0, 990.0);
    myScrollView.minimumZoomScale = 0.1;
    myScrollView.maximumZoomScale = 10.0;
    myScrollView.zoomScale = 0.7;
    myScrollView.clipsToBounds = YES;
}
Here is another example I made. This one uses an image that is included in the resource folder. Compared to the one you have, this one adds the UIImageView to the view as a subview and then applies the zoom to the whole view.
- (void)viewDidLoad {
    [super viewDidLoad];
    UIImage *image = [UIImage imageNamed:@"random.jpg"];
    imageView = [[UIImageView alloc] initWithImage:image];
    [self.view addSubview:imageView];
    [(UIScrollView *)self.view setContentSize:[image size]];
    [(UIScrollView *)self.view setMaximumZoomScale:2.0];
    [(UIScrollView *)self.view setMinimumZoomScale:0.5];
}
I know this is quite late as answers go, but the problem is that your code sets zoomScale before it sets the delegate. You are right that the other things in there don't require the delegate, but zoomScale does, because the scroll view has to be able to call back when the zoom is complete. At least that's how I think it works.
My code must be completely crazy, because the scale I use is the complete opposite of what tutorials and others do. For me, minScale = 1, which indicates that the image is fully zoomed out and fits the UIImageView that contains it.
Here's my code:
[self.imageView setImage:image];
// Makes the content size the same size as the imageView size.
// Since the image size and the scroll view size should be the same, the scroll view shouldn't scroll, only bounce.
self.scrollView.contentSize = self.imageView.frame.size;
// despite what tutorials say, the scale actually goes from one (image sized to fit screen) to max (image at actual resolution)
CGRect scrollViewFrame = self.scrollView.frame;
CGFloat minScale = 1;
// max is calculated by finding the max ratio factor of the image size to the scroll view size (which will change based on the device)
CGFloat scaleWidth = image.size.width / scrollViewFrame.size.width;
CGFloat scaleHeight = image.size.height / scrollViewFrame.size.height;
self.scrollView.maximumZoomScale = MAX(scaleWidth, scaleHeight);
self.scrollView.minimumZoomScale = minScale;
// ensure we are zoomed out fully
self.scrollView.zoomScale = minScale;
This works as I expect. When I load the image into the UIImageView, it is fully zoomed out. I can then zoom in and then I can pan the image.
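For completeness, this approach still relies on the same zooming delegate method shown at the top of this question, returning the image view that gets scaled:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return self.imageView; // the view the scroll view scales while zooming
}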