I am working on an app that works much like MS Paint (something to draw lines with, etc.).
I have a white UIView, which is basically where the user draws. On top of this UIView I set up a UIImage, which is gray, with a 0.4 alpha. I want this UIImage to be used as blotting paper. The idea is to disable touch when the user puts the palm of his hand on this area, so it's more comfortable to draw (multitouch is disabled, and with this "blotting paper" you won't accidentally draw something with your palm...).
Even if I bring the UIImage to the front, on top of the view, and even if I disable user interaction on this UIImage, it is still possible to draw on the UIView behind the UIImage (which is kind of strange!).
I don't understand what's happening: it seems that the image is transparent and that the UIView "behind" is still active, even though it's overlaid by the UIImage?!
Any help/indication/idea would be much appreciated! Thank you :-)
Have you set the "userInteractionEnabled" property of the UIImage to "NO"?
You may actually want to do the opposite. When you disable user interaction or touches, the view basically becomes invisible to touches and they are passed on to the next view.
In your case you do want userInteractionEnabled because you want the view to catch those touches.
You have to set userInteractionEnabled on the UIImageView, not the UIImage, and it should work.
Edit:
Or you could be sneaky and just add an empty view over it. Use the same frame size so it overlaps perfectly, and that's it. You'll be able to see everything you need, and since it's not a subview of the drawing view there will be no interaction: touches will get registered but won't have any effect. :P
No better ideas unless you post some of your code...
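A minimal sketch of that overlay idea (strokesView is just an assumed name for your drawing view):
// strokesView is an assumed name for the drawing view from the question.
UIView *blocker = [[UIView alloc] initWithFrame:strokesView.frame];
blocker.backgroundColor = [UIColor colorWithWhite:0.5 alpha:0.4]; // gray "blotting paper" look
blocker.userInteractionEnabled = YES; // swallow touches so they never reach the drawing view
[strokesView.superview addSubview:blocker];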
OK, so I managed to do what I wanted to! YAY!
I have 3 different classes:
StrokesViewController (UIViewController) - the view controller.
StrokesView (UIView) - the view where the user draws the strokes.
BlottingPaper (UIView) - the blotting paper.
I have a XIB file "linked" to all three.
I created this new class, called "BlottingPaper", a subclass of UIView. The .h and .m files are actually empty (I do #import <Foundation/Foundation.h>).
User interaction is enabled on BlottingPaper.
I do not use exclusiveTouch on this class.
In the XIB file, I just put a view on top of StrokesView and set its class to BlottingPaper (modify the alpha as I want, blablabla...).
And that's it! When I put the palm of my hand on it, it doesn't draw anything in the area where my hand is, but I can still draw with my finger on the rest of the StrokesView!
In addition to Dancreek's response, you should be setting buvard.userInteractionEnabled = YES; so that it captures interaction.
You should also set buvard.exclusiveTouch = YES; so that buvard is the only view which will receive touch events.
When you remove buvard you should set buvard.exclusiveTouch = NO; so that other views will regain their ability to receive touches.
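For example (a rough sketch; buvard is the blotting-paper view and strokesView the drawing view, both names assumed):
// Show the blotting paper over the drawing view.
buvard.userInteractionEnabled = YES;  // catch touches instead of letting them fall through
buvard.exclusiveTouch = YES;          // while buvard tracks a touch, no other view receives touches
[strokesView.superview addSubview:buvard];

// Later, when taking the blotting paper away:
buvard.exclusiveTouch = NO;
[buvard removeFromSuperview];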
I want to use the new UIViewController custom transition API in my iPad app project, and I'm despairing over it. What I want to do sounds very simple at first. My "FirstViewController" (simply the name) is a normal full-screen VC. From that VC I present a "SecondViewController" modally with the default Form Sheet presentation style. Everything is all right so far. The SecondViewController is a normal UITableViewController. From inside the SecondViewController I want to present a "ThirdViewController", also modally, with a custom transition. This ThirdViewController has to overlap the SecondVC with the Form Sheet presentation, and the content of the second view controller has to be dimmed as well. But I get many problems inside the animateTransition: method of the transition delegate. My best idea so far is to make a UIView snapshot of the from view, create a new UIView with a black background and alpha 0.5, and put it as a subview inside the UIView snapshot. Then I transfer the frame and the center of the fromView to the toView, add the UIView snapshot as a subview of the toView and send it to the back, and finally add the toView to the containerView.
But when I do this, I get two problems. The first is that the transition doesn't recognize that I'm using a Retina display, because I copy the center of the fromView to the toView; the toView doesn't overlap the fromView, i.e. it's not at the same position. It ends up near the bottom left of the screen instead of in the middle. The second problem is that the toView's content seems to be translucent. In the storyboard and in code I say "be opaque, white background color", but at runtime I see the controls of the view while its background color is the background color of the dimmed view behind it. Why?
At the moment I feel pretty dumb :( What on earth am I doing wrong?
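To illustrate the kind of animator I'm trying to write (heavily simplified and not my real code; the view names and duration are placeholders):
- (void)animateTransition:(id<UIViewControllerContextTransitioning>)transitionContext {
    UIView *containerView = [transitionContext containerView];
    UIViewController *toVC = [transitionContext viewControllerForKey:UITransitionContextToViewControllerKey];

    // Dimming view over the presenting content
    UIView *dimmingView = [[UIView alloc] initWithFrame:containerView.bounds];
    dimmingView.backgroundColor = [UIColor blackColor];
    dimmingView.alpha = 0.0;
    [containerView addSubview:dimmingView];

    // Place the presented view using the frame the context reports for it
    toVC.view.frame = [transitionContext finalFrameForViewController:toVC];
    toVC.view.transform = CGAffineTransformMakeTranslation(0, CGRectGetHeight(containerView.bounds));
    [containerView addSubview:toVC.view];

    [UIView animateWithDuration:0.4 animations:^{
        dimmingView.alpha = 0.5;
        toVC.view.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        [transitionContext completeTransition:![transitionContext transitionWasCancelled]];
    }];
}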
Thanks
Avarlon
I want to show a logo UIView that is always on top while the app is running.
I know one way to do that: add the same UIView to every UIViewController,
but I don't think this is the best way to do it.
When I have a lot of pages and need to modify the logo UIView, I have to modify it on every page.
Does someone have a better way to do this?
Thanks.
It looks like this:
Since you only ever have one window per app, and views don't have z-levels, you have to make sure that view stays on top of the hierarchy, no matter what. One relatively easy way is to add it directly to the window, above the rest of the interface (the navigation controller):
In the app delegate's application:didFinishLaunchingWithOptions: method:
// After the main navigation controller or tab controller has been added
// either programmatically or in the xib:
UIImage *logo = [UIImage imageNamed:@"logo.png"];
UIImageView *logoView = [[UIImageView alloc] initWithImage:logo];
[self.window addSubview:logoView];
Actually, I think that (a) creating a subclass of UIView that shows your logo and has all the necessary setup in it, and then (b) adding this subclass to each view controller, is the cleanest and most manageable way to do this.
The reason I prefer this method over adding the view to the window is that if you ever have a view where you don't want to show the logo, you won't need to show and hide something you added to the window. Also, in my experience, adding directly to the window may cause rotation challenges on certain iOS devices, depending on what you're doing.
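For illustration, a bare-bones sketch of such a subclass (the class name is just an assumption; the asset name is the one from the other answer):
@interface LogoView : UIView
@end

@implementation LogoView
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // All the logo setup lives here, so every screen gets the same look.
        UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"logo.png"]];
        imageView.frame = self.bounds;
        imageView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self addSubview:imageView];
    }
    return self;
}
@end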
Also, to make sure your logo view is always on top of the view hierarchy, you can do two things:
If the view already exists, you can bring it to the front using UIView's bringSubviewToFront: method:
[myParentView bringSubviewToFront:myLogoSubview];
If you are creating the view, it will be on top when you add it with UIView's addSubview: method:
// Set up myLogoSubview first here with alloc+init, etc.
[myParentView addSubview:myLogoSubview];
It looks like in your image you would replace myParentView with self.view and myLogoSubview with the view you're looking to keep on top, but this is just my assumption based on your image.
I have a UICollectionView, and it works great, but I have a doubt.
In my view I have a UICollectionViewCell and inside that I have a UIImage.
The cell is linked to a blank view controller, and inside that I have a UIScrollView (for managing the zoom) and a UIImage (the full-size image).
I wondered if there was some delegate or something that could handle the image opening process automatically (with zoom, etc.).
Now I'm handling the zoom effect with the UIScrollViewDelegate method viewForZoomingInScrollView:... but the result is very poor, definitely not fluid!
There's no built-in view for doing what you want, no. You're doing the right thing with a UIImageView inside a UIScrollView. If it's not very fluid, it's likely because your image is huge. The way to get around that is to load different images for different zoom levels. So as you zoom in, listen to the UIScrollViewDelegate method scrollViewDidZoom: and change the image to a better resolution as you zoom in. Or take a look at CATiledLayer.
Note that this has nothing to do with your UICollectionView.
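For illustration, a rough sketch of the two delegate methods mentioned above (property and asset names are assumptions):
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return self.fullImageView; // the full-size UIImageView inside the scroll view
}

- (void)scrollViewDidZoom:(UIScrollView *)scrollView {
    // Past an arbitrary threshold, swap in a higher-resolution version of the image.
    if (scrollView.zoomScale > 2.0 && !self.showingHighResImage) {
        self.fullImageView.image = [UIImage imageNamed:@"photo-highres.jpg"];
        self.showingHighResImage = YES;
    }
}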
The scenario is that I have a UIViewController containing multiple "InteractiveUIImageViews" (inherited from UIImageView), each containing its own UIImage. In InteractiveUIImageView I have implemented touchesBegan, touchesMoved and touchesEnded to handle their movement and behaviour on screen. Certain objects of this type will be set as 'containers' (think recycle bin), with the objective being that when one image is dragged onto it, it will be removed from the screen and placed inside it, to be potentially retrieved later.
My current thinking is to call a new method on the UIViewController from the touchesEnded method of my InteractiveUIImageView, but being new to all this I'm not really sure how to go about doing that (e.g. calling a method on the 'parent'), or indeed whether this is the best way to achieve what I want to do.
Thanks in advance for any advice.
I'm afraid your question is (to me at least) a bit unclear. I gather that you are trying to drag a UIImage around a scene and drop it in drop locations.
What is unclear is your class hierarchy. I believe that you are going about it the wrong way. You shouldn't have to subclass UIImage at all.
Instead I would urge you to let the UIViewController manage the movement of the images. When you touch down on an image, you also touch down on its parent (containing) view.
What you have to do then is reposition the UIImage (all handled by the UIViewController) as you drag the image across the screen. When you let go, you check whether your finger was inside your drop zone on touch-up.
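A rough sketch of that idea, with the view controller doing the work (property names such as draggedImageView and binView are assumptions):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Move the image view being dragged so it follows the finger.
    self.draggedImageView.center = [touch locationInView:self.view];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint dropPoint = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(self.binView.frame, dropPoint)) {
        // Dropped on the "recycle bin": take it off screen and keep it for later.
        [self.storedImageViews addObject:self.draggedImageView];
        [self.draggedImageView removeFromSuperview];
    }
    self.draggedImageView = nil;
}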
I have made a very simple web browser app using a web view. Now I need to get the app so that when the iPhone is rotated, the text of the page is rotated as well.
How do I do this?
I am very confused by the auto-resize dialog, so it is possible I have done something wrong there.
Any help would be appreciated!
I think you should rotate the UIWebView widget, not its contents; the contents should rotate along with it. To support rotation, add the following code to your view controller:
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    return YES;
}
The rotated widget might look different than expected. Adjust the struts and springs in Interface Builder.
I think you need to give us some sample code in order to determine what's going wrong. It is as Jacek says: the only thing you should need to do is support autorotation on the UIWebView itself. The content should be rotated automatically.
I think you are confusing device orientation with the view frame.
In most cases UIViews do change in response to an orientation change. But to clarify: it is not because of the orientation change itself, but because of the resulting layout change.
Only UIViewControllers need to consider device orientation; UIViews do NOT. When the device orientation changes, the UIViewController receives the event through its instance methods:
– willRotateToInterfaceOrientation:duration:
– willAnimateRotationToInterfaceOrientation:duration:
– didRotateFromInterfaceOrientation:
The UIViewController then re-lays out the views, which leads to reframing of the UIViews. In many cases, iOS can help you simplify the relayout process through the view's autoresizing mask. For example:
myWebview.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
implies that when the web view's superview changes its bounds, the web view's frame will change accordingly.
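If you need more control than the autoresizing mask gives you, you can also reframe the view manually in one of the rotation callbacks listed above; a minimal sketch, assuming myWebview is a property of the controller:
- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                         duration:(NSTimeInterval)duration {
    // Reframe the web view to fill the controller's view in the new orientation.
    self.myWebview.frame = self.view.bounds;
}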
In summary, a UIView only takes care of its own frame / bounds, etc.