UIWebView redraw on rotate - objective-c

I have added a UIWebView object to a custom UIView (TSAlertView). The UIWebView sits in the middle of the UIView, as a subview.
When I rotate the iOS device screen, the UIWebView object redraws right on top of the old one, without deleting the old one. All the other elements of this custom UIView are destroyed and redrawn when the screen is rotated.
I imagine I must have missed some sort of procedure for deallocating resources or removing the UIWebView from view. I have tried adding 'autorelease' to the declaration of the UIWebView, but to no avail. I wonder if this is a common symptom of a simple oversight in the way the UIWebView is created?
This seems to be a simple case of not telling the UIWebView object to destroy itself before it is redrawn on each rotation - but I don't know how to go about this...
Any wisdom gratefully received.

Ideally you should not create the views again while rotating. Instead, rearrange the frames appropriately in the rotation delegate methods. In your case you are probably creating a new UIWebView on each rotation, which should be avoided. Instead, keep it as a class-level member (declare it in the .h file) and adjust its frame in the rotation delegate methods. Also note that release/autorelease won't remove your view from its superview; you need to call removeFromSuperview to achieve that.
Check this for view resizing and layout handling
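A minimal sketch of that suggestion, assuming the web view lives in a view controller. The names and frame values below are illustrative, not from the original code: the UIWebView is created once, kept as an ivar declared in the .h file, and only re-framed when the interface rotates.

```objc
// Illustrative only: webView is an ivar declared in the .h file, so it is
// created once and merely re-framed on rotation instead of being re-added.
- (void)viewDidLoad {
    [super viewDidLoad];
    if (webView == nil) {
        webView = [[UIWebView alloc] initWithFrame:CGRectMake(20, 60, 280, 300)];
        [self.view addSubview:webView];
    }
}

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)orientation
                                         duration:(NSTimeInterval)duration {
    // Reposition the existing web view; do not alloc/init another one here.
    if (UIInterfaceOrientationIsLandscape(orientation)) {
        webView.frame = CGRectMake(20, 40, 440, 240);
    } else {
        webView.frame = CGRectMake(20, 60, 280, 300);
    }
}
```

If you ever do need the old web view gone, releasing it is not enough; call [webView removeFromSuperview] to actually take it off screen.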

Related

Rotate only a subview of a UIViewController

I read up on other questions and it isn't very clear what I could do about this problem, but I decided to ask anyway.
So I have a UIViewController, wherein I block auto-rotation.
I bring up one of the subviews added to the controller. Now I rotate the device, and this UIView alone should change orientation; I will use Auto Layout to support all orientations on this one view.
Is this possible?
What I tried is to listen for device orientation changes and then send the orientation to the subview; I was thinking of applying a transform, but that won't be accurate and it is probably a lot of work.
Any clue?
So, there isn't any direct method without using another view controller to hitch the view to. Add a new UIViewController and make your UIView that view controller's view. Add your rotation parameters/methods to the newly made view controller.
So it's not the best solution, because I now have to present that view from a view controller, which means more code. That's kind of fundamental to a UIView: it can't handle rotation by itself unless you write hacks and override the view's behavior.
Another solution is to listen for interface orientation changes and then re-lay out your view based on that change.
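One rough sketch of that second approach, assuming a portrait-locked controller that owns the single rotating subview (rotatingView is a placeholder name): observe UIDeviceOrientationDidChangeNotification and apply a transform to just that view.

```objc
// Placeholder names; rotatingView is the one subview that should follow the device.
- (void)viewDidLoad {
    [super viewDidLoad];
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(deviceOrientationDidChange:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}

- (void)deviceOrientationDidChange:(NSNotification *)note {
    UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
    CGAffineTransform transform = CGAffineTransformIdentity;

    // Counter-rotate the subview so it stays upright for the user;
    // swap the signs if it comes out the wrong way round in your setup.
    if (orientation == UIDeviceOrientationLandscapeLeft) {
        transform = CGAffineTransformMakeRotation(M_PI_2);
    } else if (orientation == UIDeviceOrientationLandscapeRight) {
        transform = CGAffineTransformMakeRotation(-M_PI_2);
    } else if (orientation == UIDeviceOrientationPortraitUpsideDown) {
        transform = CGAffineTransformMakeRotation(M_PI);
    }

    [UIView animateWithDuration:0.3 animations:^{
        rotatingView.transform = transform;   // rotate only this subview
    }];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];
    [super dealloc];   // drop this line under ARC
}
```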

Abort processing of CATiledLayer?

I have a view controller which holds a scroll view, a content view, and a CATiledLayer as a sublayer of the content view.
If I remove my controller's view from its superview while the CATiledLayer is still busy (rendering a PDF page, for instance), I get the weirdest crashes and null references. It seems like CATiledLayer is not happy if you disturb it. Is there a way I can abort what it is currently doing?
Am I right that the controller you are removing is the delegate of the CATiledLayer?
Then you have to set the CATiledLayer's delegate to nil when you remove your controller.
Possibly related link (my own question XD): CATiledLayer drawLayer:inContext: crashing when the view is deallocated while the image to draw is being retrieved
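For instance (a sketch, with tiledLayer standing in for however you keep a reference to the layer): detach the delegate before the controller's view goes away, so any pending tile draws can no longer call back into a deallocated object.

```objc
// Call this before removing the controller's view from its superview.
// tiledLayer is a placeholder for your reference to the CATiledLayer.
- (void)tearDownTiledLayer {
    tiledLayer.delegate = nil;           // stop drawLayer:inContext: callbacks into this controller
    [tiledLayer removeFromSuperlayer];   // keeps in-flight tiles from rendering into a dead layer tree
}
```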

Looking for guidance on detecting overlap of UIImageViews

The scenario is that I have a UIViewController containing multiple "InteractiveUIImageViews" (inherited from UIImageView), each containing its own UIImage. In InteractiveUIImageView I have implemented the touchesBegan, touchesMoved and touchesEnded methods to handle their movement and behaviour on screen. Certain objects of this type will be set as 'containers' (think recycle bin), the objective being that when an image is dragged onto one, it is removed from the screen and placed inside it, to be potentially retrieved later.
My current thinking would be to call a new method on the UIViewController from the touchesEnded method of my InteractiveUIImageView, but being new to all this I'm not really sure how to go about that (e.g. calling a method on the 'parent'), or indeed whether this is the best way to achieve what I want.
Thanks in advance for any advice.
I'm afraid your question is (to me at least) a bit unclear. I gather that you are trying to drag a UIImageView around a scene and drop it in drop locations.
What is unclear is your class hierarchy. I believe you are going about it the wrong way: you shouldn't have to subclass UIImageView at all.
Instead, I would urge you to let the UIViewController manage the movement of the images. When you touch down on an image, you also touch down on its parent (containing) view.
What you have to do then is reposition the UIImageView (all handled by the UIViewController) as you drag the image across the screen. When you let go, check whether your finger was inside your drop zone on touch up.
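A sketch of that controller-driven approach (draggedView, binView and the dragging flag are illustrative names, not from the question): the view controller's own touch handlers move the image view and test for the drop zone on touch up. This works when the image views leave user interaction disabled, so touches fall through to the controller's view.

```objc
// Illustrative names; dragging is a BOOL ivar on the view controller.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    dragging = CGRectContainsPoint(draggedView.frame, point);   // did the touch land on the image?
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!dragging) return;
    draggedView.center = [[touches anyObject] locationInView:self.view];   // follow the finger
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!dragging) return;
    dragging = NO;
    CGPoint point = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(binView.frame, point)) {
        // Finger lifted inside the drop zone: stash the image and take the view off screen.
        [draggedView removeFromSuperview];
    }
}
```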

touchesBegan method is not being called

I am trying to detect touches, but the touchesBegan method is not being called.
In my view controller, I have added the touchesBegan method. My nib's File's Owner is set to the correct view controller. The nib itself consists of the view, with a scroll view and a tab bar; nested in the scroll view is an image view, which has user interaction enabled. What is preventing touches from being registered, or preventing my implementation of touchesBegan from being called?
I've scoured the Internet and Apple docs, and I can't see what I am doing wrong. Also, I'm not really sure what code I can post here to help with my query. Thanks.
Okay, after a lot more reading, I now have a scroll view and an image view, both created programmatically. The image view is a subview of the scroll view, and the scroll view has been subclassed so that its touchesEnded method can decide whether the touch was a single tap; if so, it calls the view controller's touchesEnded method, otherwise it calls super. This works just fine. However, why can't this be done without subclassing the scroll view? Is it my lack of understanding of how UIScrollView works, or is it just a limitation of it?
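For reference, a sketch of the subclass described above (ForwardingScrollView is a made-up name). UIScrollView intercepts touches for its own pan/scroll handling, which is why the view controller's touch methods never fire without forwarding of this kind:

```objc
// Hypothetical subclass: pass single taps that did not start a scroll
// up the responder chain so they eventually reach the view controller.
@interface ForwardingScrollView : UIScrollView
@end

@implementation ForwardingScrollView

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch.tapCount == 1 && !self.dragging) {
        // Single tap, no scrolling in progress: hand it to the next responder,
        // which forwards it on until the view controller's touchesEnded runs.
        [self.nextResponder touchesEnded:touches withEvent:event];
    } else {
        [super touchesEnded:touches withEvent:event];
    }
}

@end
```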

UIView. How do I redisplay a "data driven" UIView?

I am doing a scientific visualization app for iPhone. Periodically a UIView subclass will receive some data from the network that it must render via its overridden drawRect: method.
My challenge is that there is no way to indicate to Cocoa Touch that this data has arrived or changed, other than [myView setNeedsDisplay], which is ignored except for changes to the bounds of the UIView. I have tried hiding and unhiding. Nuthin'.
Maddeningly, all I did was alter some internal state of the UIView before calling [myView setNeedsDisplay], and this change is completely invisible to Cocoa Touch. It is not one of the criteria that warrants a redraw, according to Cocoa Touch, and thus my UIView sits there, unchanged.
This is very, very, very frustrating. I have hit a wall here.
Could someone please suggest a technique, a trick, a hack, that will prompt my UIView to re-render?
Cheers,
Doug
[myView setNeedsDisplay] should cause drawRect: to be sent to myView the next time you're back in the run loop and myView is visible. If changing the bounds is causing the view to redraw (myView.contentMode is UIViewContentModeRedraw), then setNeedsDisplay must be working, since that's how the redraw is signalled by the bounds change. See the UIView class reference for details.
Is your drawRect: being invoked the first time the view is shown? If not, it may be misspelled, overridden in a subclass, or even on the wrong object.
Is the view visible and on screen? It won't be drawn if it is off screen.
Is control returning to the event loop? drawRect: won't be invoked if your application is busy.
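To illustrate the usual pattern for a data-driven view (DataView and samples are illustrative names, not from the question): store the new data in a setter, call setNeedsDisplay, and let the overridden drawRect: run on the next pass through the run loop.

```objc
// Illustrative names only. The setter stores the data and marks the view dirty;
// drawRect: is never called directly.
@interface DataView : UIView {
    NSArray *samples;
}
- (void)setSamples:(NSArray *)newSamples;
@end

@implementation DataView

- (void)setSamples:(NSArray *)newSamples {
    [newSamples retain];      // under ARC, a plain assignment to a strong ivar is enough
    [samples release];
    samples = newSamples;
    [self setNeedsDisplay];   // schedule a redraw for the next run-loop pass
}

- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetRGBFillColor(ctx, 1.0, 1.0, 1.0, 1.0);
    CGContextFillRect(ctx, self.bounds);
    // ... render `samples` here ...
}

@end
```

One common catch with network data: UIKit is not thread-safe, so if the data arrives on a background thread, hop to the main thread (for example via performSelectorOnMainThread:) before calling setNeedsDisplay; otherwise the redraw may never be scheduled.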