I made a simple utility application for switching between multiple instances of a game that runs in fake fullscreen or borderless window mode. Essentially it just displays a row of buttons with names for the different instances and when I click the button it switches to that instance.
Things work great except for one little annoyance: when I click, the taskbar flashes to the foreground for a split second. I think this is because my application gets focus for a brief moment before handing it off to the game instance, which hides the taskbar again.
Is there a way to keep my application from taking focus when its buttons are clicked? Its form sits on top of all other windows at all times anyway.
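A minimal sketch of one common way to do this, assuming the switcher is a C#/WinForms form (the question doesn't say which toolkit it uses): add WS_EX_NOACTIVATE to the form's extended window style and opt out of activation on show, so clicking its buttons never moves focus away from the game.

using System.Windows.Forms;

// Sketch: a switcher form that never takes focus. The class name is
// illustrative; the key parts are WS_EX_NOACTIVATE in CreateParams and
// ShowWithoutActivation returning true.
public class SwitcherForm : Form
{
    private const int WS_EX_NOACTIVATE = 0x08000000;
    private const int WS_EX_TOPMOST = 0x00000008;

    protected override CreateParams CreateParams
    {
        get
        {
            CreateParams cp = base.CreateParams;
            // Never activate this window when it is clicked; keep it topmost.
            cp.ExStyle |= WS_EX_NOACTIVATE | WS_EX_TOPMOST;
            return cp;
        }
    }

    // Also prevent the form from stealing focus when it is first shown.
    protected override bool ShowWithoutActivation
    {
        get { return true; }
    }
}

With the form set up like this, the button's Click handler can bring the chosen game instance to the foreground (for example via SetForegroundWindow) without the switcher ever becoming the active window, so the taskbar should no longer flash.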
I've developed a VBA-style add-on for PowerPoint in C# using the NetOffice framework (https://github.com/NetOfficeFw). The add-on is now working, and in fact my question is not really related to that.
On my desktop PC I have four screens, and when PowerPoint is in "slide show mode" it takes over two of the screens, one for the slide show and one for the presenter screen, both shown in full screen mode. The normal PowerPoint window is also still there on one of the other two screens.
On my old portable PC with an extra screen connected, the extra screen normally shows a mirror image of the built-in display. But when PowerPoint goes into "slide show mode" it somehow reconfigures the system and shows the slide show in full screen on the connected screen and the presenter screen on the built-in display. Very clever.
But what exactly has PowerPoint done, and how? Is this documented anywhere?
This is mostly just to satisfy my curiosity. I have now recoded my add-on so it works - I was previously using the .NET System.Windows.Forms.Screen class to figure out where to position the mouse cursor once I'd put PowerPoint into "slide show mode", but that doesn't work on my portable PC because it maintains that there is still only one screen on the system.
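Roughly what I was doing before, simplified for illustration (the class and the output format are not the real add-on code):

using System;
using System.Windows.Forms;

// Simplified illustration of the original approach: enumerate the monitors
// with System.Windows.Forms.Screen and pick one to position the cursor on.
// On the portable PC this reports only a single screen, which is why the
// approach broke there.
static class ScreenDump
{
    static void Main()
    {
        foreach (Screen screen in Screen.AllScreens)
        {
            Console.WriteLine("{0}  primary={1}  bounds={2}",
                screen.DeviceName, screen.Primary, screen.Bounds);
        }
    }
}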
Older versions of PowerPoint behave differently, but in current versions, when you start a slide show, PowerPoint changes your Windows display settings from mirror to extend and puts the slide show view on the second monitor, unless you override the default settings. It then restores the previous display setting when you end the slide show.
When the displays are mirrored, Windows treats them as a single screen, which I think explains why your PC reports only the one screen.
I have developed an application in VB.NET that uses a touchscreen (it's a Point of Sale app). I have used Button Click events to execute the code, like in a normal Windows application. Is this the correct way to do it, or should I use the MouseUp and MouseDown events instead?
Using Click events is correct. On a touchscreen, tapping a button generates a Click event, just as it would if you clicked the button with a mouse.
P.S. You mention in the comments that the application sometimes hangs when you click a button. That is most likely caused by the code that responds to the Click event and is not related to using a touchscreen.
Just ran into an issue on our touch POS app yesterday: on some touchscreen monitors the Click event is fired twice. Different monitors apparently handle the tap differently, and some ship with software to prevent the duplicate event while others do not. In our case, clicking button "1" resulted in "11", but only when using the touchscreen, not the mouse. Pressing the button put a "1" in the field, and lifting the finger off the button put another "1" in. When debugging the Click event it fired only once and put a single "1" in the field; with the debugger out of the picture it went back to "11".
Save yourself the headaches and use MouseUp.
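A sketch of the MouseUp approach with a simple time-based guard against the duplicated event described above. It is shown in C# (the VB.NET version is a direct translation), and the 250 ms threshold is just an illustrative value:

using System;
using System.Windows.Forms;

// A button that raises a Tapped event from MouseUp and ignores a second
// event arriving almost immediately after the first, which some touchscreen
// drivers generate.
public class KeypadButton : Button
{
    private DateTime _lastAccepted = DateTime.MinValue;

    public event EventHandler Tapped;

    protected override void OnMouseUp(MouseEventArgs e)
    {
        base.OnMouseUp(e);

        // Drop duplicates that arrive within the guard window.
        if ((DateTime.Now - _lastAccepted).TotalMilliseconds < 250)
            return;

        _lastAccepted = DateTime.Now;
        Tapped?.Invoke(this, EventArgs.Empty);
    }
}

Wire the POS buttons to Tapped instead of Click; a normal mouse click still works, because MouseUp fires for mouse input as well.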
I have an Objective-C application with a main window and a small progress window with a stack view to show the current progress.
If the application is put in the background by activating some other application and I then click on the Dock icon, both the main and the secondary window are brought to the front and shown.
But if I just click on one of the windows while the app is in the background, only that window is activated and brought to the front; the other stays in the back.
I want to make clicking on the main window do the same thing as clicking on the Dock icon: both windows should come to the front, with the main window activated.
But if I click on the progress window, I don't want the main window to be brought to the front.
I haven't been able to find a way to do this. How should I go about achieving it?
You can detect the window being clicked with the window delegate's -windowDidBecomeKey: (or by overriding -becomeKeyWindow on the window), or with the app delegate's -applicationDidBecomeActive:.
Then, depending on your exact needs, you can call -activateIgnoringOtherApps: on NSApp or -activateWithOptions: on [NSRunningApplication currentApplication] (passing NSApplicationActivateAllWindows if you want every window brought forward).
I am using QTP 11.0. A Java button in my application is highlighted on a big monitor screen (19 or 20 inch) but not on a laptop screen (14 or 15 inch).
When I click the button a pop-up should appear. This works fine on the big screen, but on the laptop screen the pop-up does not appear. Is there any workaround?
On the small screen a scroll bar appears and the Java button is below the visible area, whereas on the big screen the button is visible without scrolling, so it works fine there.
I have also tried scrolling down through scripting on the small screen, but scrolling down does not work, nor do PgDn/PgUp work in the application.
The add-ins selected are ActiveX, Java and Web.
Thanks in advance...
Based on the given information, I assume the problem is with the object properties of the Java button. Could you try adding only the html id/name property and then highlighting the object on both screens? Please disable Smart Identification and the location and index ordinal identifiers, if applicable. If possible, share the properties of the object for further analysis.
Does QTP fail to recognise the button during playback? QTP may not highlight the button because it is not visible, but in most cases it will still be able to perform actions (such as a click) on the button. Make the button visible in the web page / application and then try clicking the Highlight button in QTP.
I've created a simple Cocoa application in Xcode 4.6 with an NSPanel instead of the default NSWindow. When I enable the Non-Activating option and start the application, everything works fine: the panel is displayed in front of everything else, and when the mouse cursor hovers over the panel's edges it changes from the normal arrow cursor to the appropriate resize cursor, so the user knows that the panel can be resized.
This works fine as long as I don't click on any other application, such as Safari or Finder. From the moment I give focus to another application, I can click on and hover over my panel as much as I want, but the cursor style will not change anymore - it always stays an arrow, and it's not possible to return to the normal behaviour. The panel stays selectable and in front, and you can still move and resize it, but the mouse cursor stays an arrow the whole time. You cannot even change it manually using something like [[NSCursor crosshairCursor] set].
So I need to find a way to create an NSPanel that keeps the normal automatic change-cursor-style-when-hovering-over-panel-edges behaviour even when focus is given to another application.
I have already tried a customized NSPanel subclass where I override the canBecomeKeyWindow and canBecomeMainWindow methods so that they return YES, but even when I make my panel the key and main window...
[myPanel makeKeyAndOrderFront:self];
[myPanel makeMainWindow];
...it doesn't solve the cursor issue.
Would be great if someone could help me here :)
PS: the Base SDK and the Deployment Target are set to 10.8 in my project.
So I found out that the described issue has nothing to do with the panel's window state. It really doesn't matter whether the panel is key or main; instead, the cursor problem (it stays an arrow all the time) is related to the application's activation state.
Everything works fine as long as the application that owns the panel is active, but once you click on another application, my application is deactivated and does not get activated again - even if you click on the panel - because the "Non-Activating" option is enabled.
The problem is that I need the "Non-Activating" option, because I am creating a status-bar screen-capturing app that should be displayed and operate in front of everything else without deactivating any running application. I could solve the cursor problem with
[NSApp activateIgnoringOtherApps:YES];
but then taking a screenshot of a fullscreen video running in Safari would deactivate Safari and minimize the video, which I don't want.
I don’t think it’s possible through normal APIs to change the cursor when your app isn’t active. I’m pretty sure the window system doesn’t allow it: it’d be a violation of the boundaries between apps—if you try to set a cursor from the background, and the foreground app also tries to set a cursor, who would win?
Of course the system can do it (like when you take a screenshot with ⌘⇧4), because that’s in the window system itself.