Different transparencies in Cocoa? - objective-c

I have subclassed NSWindow to create a custom window of my own (borderless, with an alphaValue of 0.3). I am going to be drawing images in this window. Is there any way to keep the window transparent but have the images drawn in it appear fully opaque? How would I do this?
Mac OS X Snow Leopard
Xcode 3.2.6

@ughoavgfhw is on the right track, but it's actually much easier. You just need to set opaque to NO and set backgroundColor to a semi-transparent color.
@implementation MYWindow

- (void)setup
{
    [self setStyleMask:NSBorderlessWindowMask];
    [self setOpaque:NO];
    [self setBackgroundColor:[NSColor colorWithCalibratedWhite:1.0 alpha:0.3]];
}

// We override init and awakeFromNib so this works with or without a nib file
- (id)initWithContentRect:(NSRect)contentRect styleMask:(NSUInteger)aStyle backing:(NSBackingStoreType)bufferingType defer:(BOOL)flag
{
    self = [super initWithContentRect:contentRect styleMask:aStyle backing:bufferingType defer:flag];
    if (self)
    {
        [self setup];
    }
    return self;
}

- (void)awakeFromNib
{
    [super awakeFromNib];
    [self setup];
}

@end

There's a potential problem with this approach.
Calling [self setStyleMask:NSBorderlessWindowMask] (or setStyleMask: with any of the possible style mask values) causes the window to stop receiving keystroke events on subsequent presentations of the window as a sheet. I reported a bug to Apple today on this point.

Leave the alphaValue property at 1 and set the opaque property to NO. Then replace the default contentView with one that fills itself with a color whose alpha component is 0.3 in its drawRect: method. Changing the alphaValue property changes how everything drawn in the window is displayed. Making the window non-opaque simply stops it from drawing a black background beneath the content view, so if the content view is transparent, the window will be too, but anything drawn on top of it is unaffected.
Here's an example which uses a white background with a 0.3 alpha component. Note that I overrode the setContentView: method so that I can move any views from the passed-in view into the transparent content view. This is especially necessary if you load the window from a nib, since nib loading replaces the content view when the nib is loaded. (You could change the content view's class in IB instead.)
@interface MyWindow_ContentView : NSView
@end

@implementation MyWindow

- (id)initWithContentRect:(NSRect)contentRect styleMask:(NSUInteger)aStyle backing:(NSBackingStoreType)bufferingType defer:(BOOL)flag {
    if(self = [super initWithContentRect:contentRect styleMask:aStyle backing:bufferingType defer:flag]) {
        [super setOpaque:NO];
        [super setContentView:[[[MyWindow_ContentView alloc] init] autorelease]];
    }
    return self;
}

// Ignore outside attempts to change the opacity.
- (void)setOpaque:(BOOL)ignored {}

- (void)setContentView:(NSView *)newView {
    // Empty out the transparent content view, then move the subviews of the
    // passed-in view into it.
    NSArray *views = [[self.contentView subviews] copy];
    [views makeObjectsPerformSelector:@selector(removeFromSuperview)];
    [views release];
    views = [[newView subviews] copy];
    [views makeObjectsPerformSelector:@selector(removeFromSuperview)];
    for(NSView *view in views) [self.contentView addSubview:view];
    [views release];
}

@end

@implementation MyWindow_ContentView

- (void)drawRect:(NSRect)rect {
    // Fill with white at 0.3 alpha; anything drawn on top of this view is unaffected.
    [[NSColor colorWithCalibratedWhite:1 alpha:0.3] set];
    NSRectFill(rect);
}

@end


Custom Window Style in Cocoa

OK, this is what I'm trying to do:
I have a custom NSPanel subclass.
I want the NSPanel to be borderless (NO title, since I'm drawing a title bar myself) AND resizable.
The thing is that:
once I set the styleMask to NSResizableWindowMask, the default title bar appears as well.
once I set the styleMask to NSBorderlessWindowMask, the default title bar disappears (that's good), but the window loses its resizing ability.
This is my code:
- (id)initWithContentRect:(NSRect)contentRect styleMask:(NSUInteger)windowStyle backing:(NSBackingStoreType)bufferingType defer:(BOOL)deferCreation
{
    if ((self = [super initWithContentRect:contentRect styleMask:NSTitledWindowMask backing:bufferingType defer:deferCreation])) {
        [self setOpaque:NO];
        [self setBackgroundColor:[NSColor clearColor]];
        [self setMovableByWindowBackground:YES];
        [self setLevel:NSFloatingWindowLevel];
        //[self setStyleMask:[self styleMask] & ~NSTitledWindowMask];
    }
    return self;
}
As you can see from the commented-out code, I've tried every possible combination of bit operations on the mask to get what I need.
Any ideas?
Just combine them in a single mask, like this:
styleMask:NSTitledWindowMask | NSResizableWindowMask
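In case it helps, applied to the initializer from the question it would look roughly like this (an untested sketch, otherwise unchanged from your code):
- (id)initWithContentRect:(NSRect)contentRect styleMask:(NSUInteger)windowStyle backing:(NSBackingStoreType)bufferingType defer:(BOOL)deferCreation
{
    // Pass the combined mask straight to super instead of NSTitledWindowMask alone.
    self = [super initWithContentRect:contentRect
                            styleMask:(NSTitledWindowMask | NSResizableWindowMask)
                              backing:bufferingType
                                defer:deferCreation];
    if (self) {
        [self setOpaque:NO];
        [self setBackgroundColor:[NSColor clearColor]];
        [self setMovableByWindowBackground:YES];
        [self setLevel:NSFloatingWindowLevel];
    }
    return self;
}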

Reading touch events in a QLPreviewController

I've got a QuickLook view that I view some of my app's documents in. It works fine, but I'm having my share of trouble closing the view again. How do I create a touch event / gesture recognizer for which I can detect when the user wants to close the view?
I tried the following, but no events seem to trigger when I test it.
//------------------------ [ TouchPreviewController.h ]---------------------------
#import <QuickLook/QuickLook.h>

@interface TouchPreviewController : QLPreviewController
@end

//------------------------ [ TouchPreviewController.m ]---------------------------
#import "TouchPreviewController.h"

@implementation TouchPreviewController

- (id)init:(CGRect)aRect {
    if (self = [super init]) {
        // We set it here directly for convenience
        // As by default for a UIImageView it is set to NO
        UITapGestureRecognizer *singleFingerDTap = [[UITapGestureRecognizer alloc]
            initWithTarget:self action:@selector(handleSingleDoubleTap:)];
        singleFingerDTap.numberOfTapsRequired = 2;
        [self.view addGestureRecognizer:singleFingerDTap];
        [self.view setUserInteractionEnabled:YES];
        [self.view setMultipleTouchEnabled:YES];
        //[singleFingerDTap release];
    }
    return self;
}

- (IBAction)handleSingleDoubleTap:(UIGestureRecognizer *)sender {
    CGPoint tapPoint = [sender locationInView:sender.view.superview];
    [UIView beginAnimations:nil context:NULL];
    sender.view.center = tapPoint;
    [UIView commitAnimations];
    NSLog(@"TouchPreviewController tap!");
}

// I also tried adding this
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

@end
Edit: For clarification, this is how I instantiate the controller:
documents = [[NSArray alloc] initWithObjects:filename, nil];
preview = [[TouchPreviewController alloc] init];
preview.dataSource = self;
preview.delegate = self;

// set the frame from the parent view
CGFloat w = backgroundViewHolder.frame.size.width;
CGFloat h = backgroundViewHolder.frame.size.height;
preview.view.frame = CGRectMake(0, 0, w, h);

// refresh the preview controller
[preview reloadData];
[[preview view] setNeedsLayout];
[[preview view] setNeedsDisplay];
[preview refreshCurrentPreviewItem];

// add it
[quickLookView addSubview:preview.view];
Also, I've defined the callback methods as this:
- (NSInteger)numberOfPreviewItemsInPreviewController:(QLPreviewController *)controller
{
    return [documents count];
}

- (id <QLPreviewItem>)previewController:(QLPreviewController *)controller previewItemAtIndex:(NSInteger)index
{
    return [NSURL fileURLWithPath:[documents objectAtIndex:index]];
}
Edit 2: One thing I noticed: if I try making swiping gestures, I get the following message. Maybe this sheds some light on what is wrong or missing?
Ignoring call to [UIPanGestureRecognizer setTranslation:inView:] since
gesture recognizer is not active.
I think your example code is incomplete. It isn't clear how you are instantiating the TouchPreviewController (storyboard, nib file, or loadView).
I have never used the class so I could be way out in left field.
If you've already instantiated a UITapGestureRecognizer in the parent viewController, it is absorbing the tap events and they aren't passed on to your TouchPreviewController.
I would implement the view hierarchy differently: attach the UITapGestureRecognizer to the parent viewController and handle presentation and unloading of the QLPreviewController there.
I also think you might not have to subclass QLPreviewController at all if you instantiate the view controller that way.
When your parent viewController's UITapGestureRecognizer receives an event, you would either push the QLPreviewController onto the navigation stack or pop it off when you're done.
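A rough sketch of that arrangement (it assumes the parent view controller sits inside a UINavigationController and already acts as the QLPreviewControllerDataSource; the togglePreview: method name is made up, and it assumes ARC):
#import <QuickLook/QuickLook.h>

// In the parent view controller (sketch).
- (void)viewDidLoad
{
    [super viewDidLoad];
    UITapGestureRecognizer *doubleTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(togglePreview:)];
    doubleTap.numberOfTapsRequired = 2;
    [self.view addGestureRecognizer:doubleTap];
}

- (void)togglePreview:(UITapGestureRecognizer *)sender
{
    if ([self.navigationController.topViewController isKindOfClass:[QLPreviewController class]]) {
        // The preview is showing; pop it off the navigation stack.
        [self.navigationController popViewControllerAnimated:YES];
    } else {
        // Present a plain QLPreviewController; no subclass needed.
        QLPreviewController *preview = [[QLPreviewController alloc] init];
        preview.dataSource = self;   // the same data source methods you already have
        [self.navigationController pushViewController:preview animated:YES];
    }
}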
Hope this is of some help.

Draw rect from another class

I have two classes in different xibs: one has a window and a slider (controller 1), the other has a window and a view (controller 2). Controller 1 is File's Owner of the first xib, and controller 2 is an NSView subclass. What I want is that when I move the slider, the color of the rect drawn in the view changes.
In controller 1 I have my slider action:
- (IBAction)moveSlider:(id)sender
{
    Controller2 *view = [[Controller2 alloc] init];
    [view redraw];
}
and in controller 2 I have my drawRect: and my method to redraw.
- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        RectColor = [NSColor blackColor];
    }
    return self;
}

- (void)redraw
{
    NSLog(@"changed");
    RectColor = [NSColor blueColor];
    [self setNeedsDisplay:YES];
}

- (void)drawRect:(NSRect)rect
{
    [RectColor set];
    NSRectFill(rect);
}
The weird thing is that when I have my action and the slider in the same xib as the view with the drawRect:, it works just fine. However, when the slider is in the other xib it won't work. Any ideas? Thanks!
In Controller1, the moveSlider: method creates a brand-new instance of Controller2 every time it is called, and that new instance is never the one on screen. Controller1 needs to keep a reference to the existing Controller2 instance and call redraw on that instance.
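For example, something along these lines (a sketch only; how you obtain the reference depends on how the second xib is loaded, and the drawingView name is just an illustration):
// Controller1.h (sketch): keep a reference to the Controller2 instance that is
// actually on screen, instead of allocating a new one in the action method.
#import <Cocoa/Cocoa.h>
#import "Controller2.h"

@interface Controller1 : NSObject
{
    Controller2 *drawingView;   // set this to the instance from the second xib
}
- (IBAction)moveSlider:(id)sender;
@end

// Controller1.m (sketch)
@implementation Controller1
- (IBAction)moveSlider:(id)sender
{
    [drawingView redraw];   // message the view that is actually in the window
}
@end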

mouseExited isn't called when mouse leaves trackingArea while scrolling

Why isn't mouseExited/mouseEntered called when the mouse leaves an NSTrackingArea because of scrolling or an animation?
My code looks like this:
Mouse entered and exited:
- (void)mouseEntered:(NSEvent *)theEvent {
    NSLog(@"Mouse entered");
}

- (void)mouseExited:(NSEvent *)theEvent
{
    NSLog(@"Mouse exited");
}
Tracking area:
- (void)updateTrackingAreas
{
    if (trackingArea != nil) {
        [self removeTrackingArea:trackingArea];
        [trackingArea release];
    }

    int opts = (NSTrackingMouseEnteredAndExited | NSTrackingActiveAlways);
    trackingArea = [[NSTrackingArea alloc] initWithRect:[self bounds]
                                                options:opts
                                                  owner:self
                                               userInfo:nil];
    [self addTrackingArea:trackingArea];
}
More details:
I have added NSViews as subviews of the NSScrollView's document view. Each NSView has its own tracking area, and when I scroll my scroll view and leave a tracking area, mouseExited isn't called; without scrolling, everything works fine. The problem seems to be that updateTrackingAreas is called when I scroll, and I think that is what causes the trouble.
* The same problem occurs with a plain NSView that is not added as a subview, so that isn't the cause.
As you noted in the title of the question, mouseEntered and mouseExited are only called when the mouse moves. To see why this is the case, let's first look at the process of adding NSTrackingAreas for the first time.
As a simple example, let's create a view that normally draws a white background, but if the user hovers over the view, it draws a red background. This example uses ARC.
@interface ExampleView : NSView

@property (nonatomic, retain) NSColor *backgroundColor;
@property (nonatomic, retain) NSTrackingArea *trackingArea;

- (void)createTrackingArea;

@end

@implementation ExampleView

@synthesize backgroundColor;
@synthesize trackingArea;
- (void)awakeFromNib
{
    [self setBackgroundColor:[NSColor whiteColor]];
    [self createTrackingArea];
}
- (void)createTrackingArea
{
    int opts = (NSTrackingMouseEnteredAndExited | NSTrackingActiveAlways);
    trackingArea = [[NSTrackingArea alloc] initWithRect:[self bounds]
                                                options:opts
                                                  owner:self
                                               userInfo:nil];
    [self addTrackingArea:trackingArea];
}

- (void)drawRect:(NSRect)rect
{
    [[self backgroundColor] set];
    NSRectFill(rect);
}
- (void)mouseEntered:(NSEvent *)theEvent
{
    [self setBackgroundColor:[NSColor redColor]];
    [self setNeedsDisplay:YES];   // redraw with the new color
}

- (void)mouseExited:(NSEvent *)theEvent
{
    [self setBackgroundColor:[NSColor whiteColor]];
    [self setNeedsDisplay:YES];
}

@end
There are two problems with this code. First, when -awakeFromNib is called, if the mouse is already inside the view, -mouseEntered is not called. This means that the background will still be white, even though the mouse is over the view. This is actually mentioned in the NSView documentation for the assumeInside parameter of -addTrackingRect:owner:userData:assumeInside:
If YES, the first event will be generated when the cursor leaves aRect, regardless of whether the cursor is inside aRect when the tracking rectangle is added. If NO, the first event will be generated when the cursor leaves aRect if the cursor is initially inside aRect, or when the cursor enters aRect if the cursor is initially outside aRect.
In both cases, if the mouse is inside the tracking area, no events will be generated until the mouse leaves the tracking area.
So to fix this, when we add the tracking area, we need to find out whether the cursor is within the tracking area. Our -createTrackingArea method thus becomes:
- (void)createTrackingArea
{
    int opts = (NSTrackingMouseEnteredAndExited | NSTrackingActiveAlways);
    trackingArea = [[NSTrackingArea alloc] initWithRect:[self bounds]
                                                options:opts
                                                  owner:self
                                               userInfo:nil];
    [self addTrackingArea:trackingArea];

    // If the mouse is already inside the view, no enter event will arrive until
    // it moves, so work out where it is now and call the handler ourselves.
    NSPoint mouseLocation = [[self window] mouseLocationOutsideOfEventStream];
    mouseLocation = [self convertPoint:mouseLocation fromView:nil];

    if (NSPointInRect(mouseLocation, [self bounds]))
    {
        [self mouseEntered:nil];
    }
    else
    {
        [self mouseExited:nil];
    }
}
The second problem is scrolling. When scrolling or moving a view, we need to recalculate the NSTrackingAreas in that view. This is done by removing the tracking areas and then adding them back in. As you noted, -updateTrackingAreas is called when you scroll the view. This is the place to remove and re-add the area.
- (void)updateTrackingAreas
{
    [self removeTrackingArea:trackingArea];
    [self createTrackingArea];
    [super updateTrackingAreas]; // Needed, according to the NSView documentation
}
And that should take care of your problem. Admittedly, needing to find the mouse location and then convert it to view coordinates every time you add a tracking area is something that gets old quickly, so I would recommend creating a category on NSView that handles this automatically. You won't always be able to call [self mouseEntered:nil] or [self mouseExited:nil], so you might want to make the category accept a couple of blocks: one to run if the mouse is in the NSTrackingArea, and one to run if it is not.
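For instance, a category along these lines might do it (the category and method names are invented, and it assumes ARC like the example above; treat it as a sketch rather than drop-in code):
// NSView+TrackingBlocks.h (hypothetical)
#import <Cocoa/Cocoa.h>

@interface NSView (TrackingBlocks)
- (NSTrackingArea *)addTrackingAreaWithOptions:(NSTrackingAreaOptions)options
                                   mouseInside:(void (^)(void))insideBlock
                                  mouseOutside:(void (^)(void))outsideBlock;
@end

// NSView+TrackingBlocks.m
@implementation NSView (TrackingBlocks)

- (NSTrackingArea *)addTrackingAreaWithOptions:(NSTrackingAreaOptions)options
                                   mouseInside:(void (^)(void))insideBlock
                                  mouseOutside:(void (^)(void))outsideBlock
{
    NSTrackingArea *area = [[NSTrackingArea alloc] initWithRect:[self bounds]
                                                        options:options
                                                          owner:self
                                                       userInfo:nil];
    [self addTrackingArea:area];

    // No enter/exit event arrives until the mouse moves, so check where the
    // mouse is right now and run the matching block immediately.
    NSPoint mouseLocation = [[self window] mouseLocationOutsideOfEventStream];
    mouseLocation = [self convertPoint:mouseLocation fromView:nil];
    if (NSPointInRect(mouseLocation, [self bounds])) {
        if (insideBlock) insideBlock();
    } else {
        if (outsideBlock) outsideBlock();
    }
    return area;
}

@end
The view's -createTrackingArea and -updateTrackingAreas could then just call this one method, passing the two blocks.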
@Michael offers a great answer, and it solved my problem. But there is one thing:
if (CGRectContainsPoint([self bounds], mouseLocation))
{
    [self mouseEntered:nil];
}
else
{
    [self mouseExited:nil];
}
I found that CGRectContainsPoint works on my machine, rather than NSPointInRect.

Displaying an NSString on a Custom View

I have an interface that has an NSTextField, NSButton, and an NSView. When I type something in the NSTextfield and press the button, I want the text to be drawn in the NSView. So far I have everything connected and working, except for the view.
How can I connect the text and the view so that every time I press the button, the text is drawn to the view?
How can I connect the text and the view so that every time I press the button, the text is drawn to the view?
Views do their own drawing.
You need to give the view the string to draw, and then set the view as needing display. You'll do these in the action method that you wire the button up to.
First, your custom view class needs to have a property for the value (string, in this case) that it's going to display. From your action method, which should generally be on a controller object, send the view object a setFoo: message (assuming you named the property foo). That takes care of job one: The view now has the value to display.
Job two is even easier: Send the view a setNeedsDisplay: message, with the value YES.
That's it. The action method is two lines.
Of course, since views draw themselves, you also need your custom view to actually draw, so you need to implement the drawRect: method in that class. It, too, will be short; all you need to do is tell the string to draw itself.
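For illustration only: assuming the view's string property is named foo and the controller has outlets named textField and customView (your names will differ), the two pieces might look roughly like this:
// In the controller (sketch)
- (IBAction)drawText:(id)sender
{
    [customView setFoo:[textField stringValue]];   // job one: give the view the value
    [customView setNeedsDisplay:YES];              // job two: ask it to redraw
}

// In the custom view (sketch): the string draws itself.
- (void)drawRect:(NSRect)dirtyRect
{
    [[self foo] drawAtPoint:NSMakePoint(10.0, 10.0)
             withAttributes:nil];   // nil attributes means the default font and color
}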
Bindings
http://developer.apple.com/mac/library/documentation/cocoa/Conceptual/CocoaBindings/Concepts/WhatAreBindings.html#//apple_ref/doc/uid/20002372-CJBEJBHH
For simplicity's sake I didn't mention this before, but the app also has a speech element to speak the string. This aspect of the program works fine, so just ignore any messages involving the SpeakAndDraw class (it's actually misnamed and only includes a speech method, nothing about drawing).
View.m
#import "View.h"

@implementation View

@synthesize stringToDraw;

- (id)initWithFrame:(NSRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self setAttributes];
        stringToDraw = @"Hola";
    }
    return self;
}

- (void)drawRect:(NSRect)dirtyRect {
    NSRect bounds = [self bounds];
    [self drawStringInRect:bounds];
}

- (void)setAttributes
{
    attributes = [[NSMutableDictionary alloc] init];
    [attributes setObject:[NSFont fontWithName:@"Helvetica" size:75]
                   forKey:NSFontAttributeName];
    [attributes setObject:[NSColor blackColor]
                   forKey:NSForegroundColorAttributeName];
}

- (void)drawStringInRect:(NSRect)rect
{
    NSSize strSize = [stringToDraw sizeWithAttributes:attributes];
    NSPoint strOrigin;
    strOrigin.x = rect.origin.x + (rect.size.width - strSize.width) / 2;
    strOrigin.y = rect.origin.y + (rect.size.height - strSize.height) / 2;
    [stringToDraw drawAtPoint:strOrigin withAttributes:attributes];
}

@end
SpeakerController.m
#import "SpeakerController.h"

@implementation SpeakerController

- (id)init
{
    speakAndDraw = [[SpeakAndDraw alloc] init];
    view = [[View alloc] init];
    [mainWindow setContentView:mainContentView];
    [mainContentView addSubview:view];
    return self;
}

- (IBAction)speakText:(id)sender
{
    [speakAndDraw setStringToSay:[text stringValue]];
    [speakAndDraw speak];
    [view setStringToDraw:[text stringValue]];
    [view setNeedsDisplay:YES];
    NSLog(@"%@", view.stringToDraw);
    NSLog(@"%@", [view window]);
}

@end
#end