We have a hardware device with an LCD display. It supports a USB interface for connecting a keyboard and mouse. Using the keyboard and mouse, we can navigate to various menu items and edit entries.
We have a couple of test cases written to verify that mouse-click and keyboard-input events work when the respective keys are pressed.
My task is to automate these test cases.
I do not have any control over the hardware device, as I cannot access the OS kernel or any application running on it. There is one way to verify what is currently displayed on the UI, so I have to use that to check whether the mouse/keyboard has performed the appropriate events.
Having gone through a couple of previous posts, it seems that one way to achieve this is through a virtual HID device driver rather than an actual keyboard and mouse, but I am not sure how to do it.
Please do help me with this. I am fine with any programming language.
I am mainly interested in simulating the mouse and keyboard events.
You probably don't need to write your own driver. AutoHotKey does pretty much anything you can think of, and the scripting language is quite easy to learn.
You can get it here:
http://www.autohotkey.com/
Since you're using Linux, here's a similar project that will run on Linux:
http://sikuli.org/
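If you do end up going the virtual-device route on Linux instead of a scripting tool, you don't need to write a full driver: the kernel's uinput facility lets you create a fake keyboard or mouse from user space. Here is a minimal C sketch of that idea (it assumes a kernel recent enough to have UI_DEV_SETUP; the vendor/product IDs are arbitrary placeholders and error handling is omitted):

    /* Create a virtual keyboard via /dev/uinput and tap the "A" key.
       Run with write access to /dev/uinput (usually root). */
    #include <fcntl.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/uinput.h>

    static void emit(int fd, int type, int code, int value)
    {
        struct input_event ie = {0};
        ie.type = type;
        ie.code = code;
        ie.value = value;
        write(fd, &ie, sizeof(ie));
    }

    int main(void)
    {
        int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);

        /* Declare what this virtual device is able to send. */
        ioctl(fd, UI_SET_EVBIT, EV_KEY);
        ioctl(fd, UI_SET_KEYBIT, KEY_A);

        struct uinput_setup usetup = {0};
        usetup.id.bustype = BUS_USB;
        usetup.id.vendor = 0x1234;   /* placeholder IDs */
        usetup.id.product = 0x5678;
        strcpy(usetup.name, "virtual-test-keyboard");

        ioctl(fd, UI_DEV_SETUP, &usetup);
        ioctl(fd, UI_DEV_CREATE);
        sleep(1);  /* give userspace a moment to register the device */

        emit(fd, EV_KEY, KEY_A, 1);       /* key down */
        emit(fd, EV_SYN, SYN_REPORT, 0);
        emit(fd, EV_KEY, KEY_A, 0);       /* key up */
        emit(fd, EV_SYN, SYN_REPORT, 0);

        ioctl(fd, UI_DEV_DESTROY);
        close(fd);
        return 0;
    }

To applications, input injected this way is indistinguishable from a real keyboard. Note this only works on a machine whose OS you control; to feed a closed device through its physical USB port you would need something like a USB gadget board posing as a HID keyboard.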
Related
I've been looking around for a way to detect whether a physical keyboard is attached to my computer from VB.NET, but can't find anything. The problem is that all the Google results assume I mean detecting keyboard input (i.e. keystrokes), whereas I want to know if a physical keyboard is attached. I have also looked into listing USB devices, but realized that not all computers (take notebooks, for instance) use USB keyboards.
How can I check if a keyboard is attached/operational from VB.NET?
I am developing a desktop application that captures computer activity on Mac OS X using Objective-C. I know it's possible to capture keyboard presses and the mouse position, but I don't know how to detect when the user switches tasks, such as closing a window (of another application) or activating another window (of another application).
Does anyone have any experience with that?
Yes, via the Accessibility system. For example, the NSAccessibilityMainWindowChangedNotification.
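For windows that belong to other applications, the same notification is exposed through the Accessibility C API as kAXMainWindowChangedNotification, delivered via an AXObserver. A rough sketch, assuming the target PID is known (a placeholder below) and your process is trusted in the Accessibility preferences; link against the ApplicationServices framework:

    #include <unistd.h>
    #include <ApplicationServices/ApplicationServices.h>

    /* Called whenever the watched app's main window changes. */
    static void mainWindowChanged(AXObserverRef observer, AXUIElementRef element,
                                  CFStringRef notification, void *refcon)
    {
        CFShow(notification);
    }

    int main(void)
    {
        pid_t pid = 12345;  /* placeholder: PID of the app to observe */
        AXUIElementRef app = AXUIElementCreateApplication(pid);

        AXObserverRef observer;
        AXObserverCreate(pid, mainWindowChanged, &observer);
        AXObserverAddNotification(observer, app,
                                  kAXMainWindowChangedNotification, NULL);

        /* Observer callbacks arrive through the run loop. */
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer),
                           kCFRunLoopDefaultMode);
        CFRunLoopRun();
        return 0;
    }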
Requirements:
Our application replaces the usual windows shell (explorer.exe). This is a product requirement for a closed system that we're supplying.
We need to let the user select a Wi-Fi network and connect to it.
The problem: the Wi-Fi networks dialog only shows up when explorer.exe is running.
What we tried:
Write our own Wi-Fi manager that uses the WLAN API. It lists connectable networks and allows the user to connect/disconnect. Problem: too many network types/configurations that have to be tested, especially when the wheel has already been invented and reinvented all over.
Try and check how the networks dialog is implemented. It appears that it's an undocumented COM interface (IUIRadioManager). Problem: it's undocumented, so there is no public API.
Use an existing network manager, for instance the one that comes with the driver. Problems: it's ugly, not to the product's taste; and it opens too many options for the user, like creating and loading profiles, browsing for files on a file system - these things are unacceptable.
Run explorer.exe just for the purpose of showing the networks dialog, then kill it. Problem: once we run explorer.exe, it pops up the Metro view and hides our fullscreen application, or shows the taskbar.
The last option seems like the preferred solution: no need to reinvent the wheel, and it does what's needed. We just have to keep explorer.exe from popping anything up and keep it quiet in the background.
So, we're down to two options:
How to show the networks flyout dialog without explorer.exe?
How to run explorer.exe without it popping up Metro or the taskbar above our application?
Your first solution would be incredibly difficult to implement. I am almost certain that the Networks window is dependent on explorer.
However, your second is entirely possible.
To hide the taskbar, find the taskbar window (using FindWindowEx; its class name is Shell_TrayWnd) and hide it with ShowWindow. This hides both the taskbar and the Start button. EDIT: Unless you are implementing your own taskbar, you might want to set the taskbar to auto-hide instead.
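In C, the hide step might look like this (a sketch; plain FindWindow is enough for a top-level window with a known class name):

    #include <windows.h>

    /* Hide the taskbar window (class Shell_TrayWnd); the Start button
       it hosts disappears with it. */
    void HideTaskbar(void)
    {
        HWND tray = FindWindowW(L"Shell_TrayWnd", NULL);
        if (tray != NULL)
            ShowWindow(tray, SW_HIDE);
    }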
Next, you will need to hide all of the Metro programs. In a similar fashion to the above, find the window whose class is EdgeUiInputWndClass and close it. You should be able to get its owning process and then kill that process.
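A sketch of that step (be careful: whichever process owns this window may be explorer.exe itself, so test what terminating the owner actually does on your system):

    #include <windows.h>

    /* Find the edge-gesture window and terminate the process owning it. */
    void KillEdgeUi(void)
    {
        HWND edge = FindWindowW(L"EdgeUiInputWndClass", NULL);
        if (edge == NULL)
            return;

        DWORD pid = 0;
        GetWindowThreadProcessId(edge, &pid);

        HANDLE proc = OpenProcess(PROCESS_TERMINATE, FALSE, pid);
        if (proc != NULL) {
            TerminateProcess(proc, 0);
            CloseHandle(proc);
        }
    }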
The Windows key is a little more difficult. You will probably need to either remove the key with a scancode remapping, or install a low-level keyboard hook and simply ignore key presses whose virtual-key code matches the Windows keys: Left Windows is 0x5B and Right is 0x5C (source). Note that this will not block Ctrl+Alt+Del.
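The hook variant might look like the sketch below; a low-level keyboard hook needs a message loop on the thread that installs it:

    #include <windows.h>

    /* Swallow left/right Windows key presses; everything else passes. */
    static LRESULT CALLBACK KbHook(int code, WPARAM wParam, LPARAM lParam)
    {
        if (code == HC_ACTION) {
            const KBDLLHOOKSTRUCT *k = (const KBDLLHOOKSTRUCT *)lParam;
            if (k->vkCode == VK_LWIN || k->vkCode == VK_RWIN)
                return 1;  /* non-zero result eats the keystroke */
        }
        return CallNextHookEx(NULL, code, wParam, lParam);
    }

    int main(void)
    {
        SetWindowsHookExW(WH_KEYBOARD_LL, KbHook,
                          GetModuleHandleW(NULL), 0);

        MSG msg;  /* the hook is serviced through this message loop */
        while (GetMessageW(&msg, NULL, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        return 0;
    }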
Finally, to show the Flyout, you can run %windir%\explorer.exe shell:::{38A98528-6CBF-4CA9-8DC0-B1E1D10F7B1B} (source).
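From code, the same command can be launched with ShellExecute, for example:

    #include <windows.h>
    #include <shellapi.h>

    /* Open the Networks flyout via its shell CLSID. */
    void ShowNetworksFlyout(void)
    {
        ShellExecuteW(NULL, L"open", L"explorer.exe",
                      L"shell:::{38A98528-6CBF-4CA9-8DC0-B1E1D10F7B1B}",
                      NULL, SW_SHOWNORMAL);
    }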
EDIT2:
You should also be able to hide toast notifications via this
Of course, I don't see why you cannot just use Windows 8/8.1 and put the app in kiosk mode.
I'm trying to send keystrokes and mouse movements to a Java program, but once the application has focus, nothing is sent. It's as if the Java application grabs all input, because AutoHotkey stops responding. Everything works fine in a regular Windows app (e.g. Notepad).
I've tried using various send methods (Send, SendInput, and SendEvent) but nothing works. Does anyone have any suggestions?
The program in particular is ThinkOrSwim's ThinkDesktop.
I was able to get my script running with ThinkOrSwim by running the SciTE editor as Administrator [or running the compiled scripts as Administrator].
The TOS UI had some refresh issues, but my scripts went through fine and did what I needed them to do.
After some playing around, I've discovered that TOS on Mac OS X can be controlled via scripting with Keyboard Maestro. It's an ugly, hacked-together solution, but it works. You can edit text boxes and click things if you know the X,Y positions of the elements.
Keyboard Maestro can be driven from scripts (AppleScript, Python, etc.), so maybe you can build some elaborate Rube Goldberg machine.
I suggest you use Easy Macro Recorder
http://download.cnet.com/Easy-Macro-Recorder/3000-2094_4-10414139.html
It's a great tool to automate keystrokes and mouse movements.
Hope this helps :)
If you aren't familiar with Cinch, it's an application on the Mac App Store that allows you to resize ANY window to half/full screen size by dragging the window to the edge of the screen, exactly like the functionality in Windows 7.
Now my question is: how is it done? I have looked all over the Cocoa APIs for notifications/delegate methods that fire whenever a window is being dragged (ALL windows, not just windows owned by the app the code is running from) but can't find any. I looked in the Core Graphics API... Quartz Display Services... but can't find it.
Any help will be greatly appreciated, as I have been looking for the past week... Thanks!
Edit: Resizing the window is easy, since it can be done through the AppleScript bridge.
Are you the developer behind i-Snap or some other Mac App Store clone of Cinch?
I'm the developer behind Cinch, and while I try to maintain an "abundance mentality" (which basically says "there's enough out there for everyone"), I've been upset by the Mac App Store lowering the barrier to entry for this market, which has produced a number of half-baked competitors.
I would be thrilled to see some real innovation around the work I have done, and not just clones looking to make a quick buck.
Anyway, you want to look at the Accessibility APIs. It's a Carbon C API. This is probably your best reference: http://developer.apple.com/library/mac/#samplecode/UIElementInspector/Introduction/Intro.html%23//apple_ref/doc/uid/DTS10000728
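To give a flavor of it: once you have decided a window should snap, the move/resize itself is just two attribute writes through that same Accessibility C API. A sketch, assuming the process is AX-trusted and ignoring menu bar and Dock insets:

    #include <ApplicationServices/ApplicationServices.h>

    /* Snap the target app's focused window to the left half of the
       main display. The PID comes from whatever window you tracked. */
    void SnapLeftHalf(pid_t pid)
    {
        AXUIElementRef app = AXUIElementCreateApplication(pid);

        AXUIElementRef win = NULL;
        AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute,
                                      (CFTypeRef *)&win);
        if (win == NULL) { CFRelease(app); return; }

        CGRect screen = CGDisplayBounds(CGMainDisplayID());
        CGPoint pos  = screen.origin;
        CGSize  size = CGSizeMake(screen.size.width / 2, screen.size.height);

        AXValueRef posVal  = AXValueCreate(kAXValueCGPointType, &pos);
        AXValueRef sizeVal = AXValueCreate(kAXValueCGSizeType, &size);
        AXUIElementSetAttributeValue(win, kAXPositionAttribute, posVal);
        AXUIElementSetAttributeValue(win, kAXSizeAttribute, sizeVal);

        CFRelease(posVal); CFRelease(sizeVal);
        CFRelease(win); CFRelease(app);
    }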
I've not used the Cinch app, but if I were to do this, I'd expect to use Cocoa events (also see here), specifically the mouse-handling events combined with where the mouse currently is on-screen. They probably set a flag when a window is grabbed and then track the mouse pointer until it hits a screen edge or the mouse button is released.
Events are very powerful and provide very low level access to what is happening, but can also be very complex. Good luck!
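One concrete way to watch mouse drags across all applications is a CoreGraphics event tap rather than per-app Cocoa events. A sketch (listen-only; it needs the Accessibility trust prompt on current systems, and you would hit-test the drag location against the screen edges in the callback):

    #include <stdio.h>
    #include <stdbool.h>
    #include <ApplicationServices/ApplicationServices.h>

    /* Receives every left-button drag/up event session-wide. */
    static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon)
    {
        if (type == kCGEventLeftMouseDragged) {
            CGPoint p = CGEventGetLocation(event);
            printf("drag at %.0f,%.0f\n", p.x, p.y);
        }
        return event;  /* pass the event through unchanged */
    }

    int main(void)
    {
        CFMachPortRef tap = CGEventTapCreate(
            kCGSessionEventTap, kCGHeadInsertEventTap,
            kCGEventTapOptionListenOnly,
            CGEventMaskBit(kCGEventLeftMouseDragged) |
            CGEventMaskBit(kCGEventLeftMouseUp),
            tapCallback, NULL);

        CFRunLoopSourceRef src =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopDefaultMode);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();
        return 0;
    }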
I'm not sure. Maybe the developers combine AppleScript and Carbon events. You can create Carbon event handlers to know when the mouse has been clicked or dragged.