Detect mouse pressed events outside the window in Processing 2.0 - mouseevent

I am trying to find a way to click outside of the program's window and have the program pick up every time I press the mouse (e.g. have Chrome or a game open, and every time the mouse button is pressed Processing is told the mouse is being pressed). I do not need any information sent to the program besides the fact that the mouse button has been pressed.
Thank you
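
One way to approach this (it is not built into Processing itself) is a global input hook library such as JNativeHook, which can report mouse events system-wide even when another application has focus. A minimal plain-Java sketch, assuming the jnativehook jar is on the classpath (newer releases use the com.github.kwhat.jnativehook package instead of org.jnativehook shown here):

import org.jnativehook.GlobalScreen;
import org.jnativehook.NativeHookException;
import org.jnativehook.mouse.NativeMouseEvent;
import org.jnativehook.mouse.NativeMouseListener;

public class GlobalClickSketch {
    public static void main(String[] args) throws NativeHookException {
        // Install a system-wide hook; events arrive even when Chrome or a game has focus.
        GlobalScreen.registerNativeHook();

        GlobalScreen.addNativeMouseListener(new NativeMouseListener() {
            public void nativeMousePressed(NativeMouseEvent e) {
                // Only the fact that a button went down matters here.
                System.out.println("mouse pressed (button " + e.getButton() + ")");
            }
            public void nativeMouseReleased(NativeMouseEvent e) { }
            public void nativeMouseClicked(NativeMouseEvent e) { }
        });
    }
}

In an actual Processing sketch you could set a boolean flag from nativeMousePressed() and read it in draw(); treat the class and package names above as a sketch to adapt, not a drop-in answer.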

Related

How many times will the program call the increment function when entering the Button control element?

When the user enters the Button control element, how many times will the program call the increment function?
For this question, I don't know how to work out how many times the program calls the increment function. The answer is 2, but I don't know the reason. Can anyone explain it to me?
Your event case is called when the mouse button is clicked on your pane and on your button.
Notice that your button sits on your pane. That means that when a mouse-down event happens on your button, the same event happens on your pane at the same time.
So when you press the mouse button down, LabVIEW registers two mouse-down events: one on your pane and one on your button.

AHK script to click a specific button in a program not working inside the windowed program?

I'm trying to create a hotkey for a software button inside a program since the devs of the program did not do so.
Here is what I tried to move the mouse, click the button and move the mouse back to the original position:
F3::                          ; hotkey: F3
CoordMode, ToolTip, Screen    ; set the coordinate mode for ToolTips to screen coordinates
MouseGetPos, X, Y             ; remember the current mouse position
Click 512, 516                ; click at coordinates 512, 516
MouseMove, %X%, %Y%           ; move the mouse back to where it was
Return
This works on, say, the desktop, but when the program window is active, nothing happens. Is there some way to make this work inside the program window? It is a windowed program (not full screen) and it is a scientific tool used to analyse physiological data.
Remove the "CoordMode, ToolTip, Screen" line; it sets the coordinate mode for ToolTips, not for the mouse commands. Buttons in other programs can often be clicked by posting a message to the window instead, e.g. "PostMessage, 0x111" (0x111 is WM_COMMAND).

Which event is best for a touchscreen application?

I have developed an application using VB.NET that uses a touchscreen (it's a point-of-sale app). I have used Button Click events to execute the code, like a normal Windows application. Is this the correct way to do it, or should I use MouseUp and MouseDown events?
Using Click events is correct. On a touch screen, tapping a button will generate a Click event, just like it would if you clicked the button with a mouse.
P.S. You mention in the comments that sometimes the application hangs when you click a button. This is most likely caused by the code that responds to the Click event, and is not related to using a touch screen.
I just ran into an issue on our touch POS app yesterday: the Click event is fired twice in some cases on some monitors. It seems that different touchscreen monitors handle the Click event differently; some have software to prevent this and others do not. The specific issue was that our Click event was being fired two times, so clicking button "1" would result in "11". It only happened when using the touchscreen, not the mouse. The first press of the button would enter "1", and then when you took your finger off the button another "1" would appear. If you debugged the Click event, it would only fire once and put a single "1" in the field. If you took the debugging out, it went back to "11".
Save yourself the headaches and use MOUSEUP.
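
The question is about VB.NET/WinForms, but the Click-versus-MouseUp distinction exists in other toolkits as well. Purely as an illustration (Java Swing rather than WinForms, with a made-up class name), the sketch below wires both kinds of handler to the same button so you can watch when each one fires:

import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;

public class ClickVsMouseUpDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Click vs MouseUp");
            JButton button = new JButton("1");

            // High-level "Click" (ActionListener): fires when the button is activated,
            // i.e. pressed and then released while the pointer is still over the button.
            button.addActionListener(e -> System.out.println("Click"));

            // Low-level "MouseUp" (mouseReleased): fires whenever a press that started
            // on the button is released, even if the pointer has moved off the button.
            button.addMouseListener(new MouseAdapter() {
                public void mouseReleased(MouseEvent e) {
                    System.out.println("MouseUp");
                }
            });

            frame.add(button);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });
    }
}

Logging both events side by side like this is also a quick way to see which of the two is actually being duplicated on a misbehaving touch monitor.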

Detect mouse cursor icon change in application

I've been looking for a while now, to no avail.
What I'm trying to do is find a way to detect whether the mouse cursor icon changes when you mouse over something.
For example: if you mouse over a link, it changes from the arrow to a finger.
My plan is to grab the ID of a window, and scan it for clickable objects based on the mouse icon changing. I can grab the window, bring it to the front, and move the mouse around by setting the x,y coord of the mouse, but I don't see a way to detect if the mouse has found anything.
I would prefer this to be something built into vb.net, but if I have to use an API I'm fine with that.
The approach is wrong because the underlying concepts are different from what you observe visually.
There's no such thing as a "click" at this level: there is a Button Down event (a Windows message sent by the GUI subsystem to the application), a Button Up event, and a Mouse Move event. If a Button Down and a Button Up occur with little or no Mouse Move in between, the OS considers this a click.
All events are sent to the window under the mouse cursor's hotspot, unless mouse input has been captured by another window.
When the cursor is moved over a window, the OS sends a WM_NCHITTEST message to the window to determine how the window treats the area under the cursor. Based on the window's response, Windows either performs a window operation (move, resize, etc.) or passes the mouse-related events to the window procedure. The procedure then decides how to react: do nothing, make visual changes, perform some action, and so on.
As you can see from this description, changing the cursor and performing an action are two different, loosely related operations. There can be an action without a cursor change, or a cursor change without an action.

Cocoa: mouse down on one window and mouse up in another

I am developing a Cocoa application and I have a special need. In my main window, when I mouse down on a certain area, a new window (like a complex tooltip) appears. I want to be able to do the following:
- mouse down on the main window (the mouse button stays pressed)
- the user moves the mouse onto the "tooltip" window and mouses up on it.
My issue is that the tooltip window does not get any mouse events until the mouse up.
How can I fix this?
Thanks in advance for your help,
Regards,
And it won't, since the mouse is tracked by the main window. However, you can process the mouse up in the main window, transform the click coordinates into desktop (screen) space, get the tooltip window's frame, and check whether the click occurred on the tooltip. After that, you can send a message to the tooltip window manually.
Or you can try to find another way to implement the final goal :) It is usually better to follow the rules, in this case the rules of mouse tracking.