I mean the real USB keyboard, not a software keyboard driver.
I know that keyboards differ, but in general, is it easy or even possible to reprogram or rewire one?
Maybe there are models that are easier to do that?
And yes, I can use a soldering iron or a hardware flash programmer.
In general, no. Most keyboards that you find have a small bit of brains and a switch matrix, and not much else. They have just enough brains to communicate over USB while scanning their switch matrix. The entire mess is in one mask-programmed chip whose programming you can't change.
There are a few keyboards out there that do things like key remapping or macro programming in the keyboard, but they are pretty rare and/or pretty expensive. And, in my experience, pretty damned annoying when you accidentally hit the 'program macro' key.
You could, in theory, tear a keyboard apart, remove its existing brains, install a micro-controller, and write code to send whatever codes you want to the host when a given point on the switch matrix is hit. You'll need to work out which traces are which on the switch matrix, and you'll need to write/find micro-controller code to talk USB. And don't forget that a switch matrix is susceptible to ghosting effects (one keypress masks others), so don't try to put things like shift, control, alt on keys that are subject to ghosting.
If you can't or don't want to do the micro-controller work yourself, you could use something like an I-PAC (it's the micro-controller part that I just described) to do the job, but you'll have to get one that understands a switch matrix.
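To give a flavor of the custom-brains approach: the matrix-scanning half is only a screenful of code. The sketch below is for an AVR-class micro with a made-up 4x4 matrix, made-up pin assignments, and a made-up remap table; the actual USB HID reporting (via something like LUFA or V-USB) is stubbed out.

    /* Scan-and-remap sketch for an AVR-class micro.  Pin assignments,
     * matrix size, and the keymap are invented for illustration; the
     * USB stack is stubbed out as send_hid_report(). */
    #ifndef F_CPU
    #define F_CPU 16000000UL          /* assumed clock */
    #endif
    #include <avr/io.h>
    #include <util/delay.h>
    #include <stdint.h>

    #define ROWS 4
    #define COLS 4

    /* Maps a matrix position to the HID usage code we want to send. */
    static const uint8_t keymap[ROWS][COLS] = {
        { 0x04, 0x05, 0x06, 0x07 },   /* hypothetical layout */
        { 0x08, 0x09, 0x0A, 0x0B },
        { 0x0C, 0x0D, 0x0E, 0x0F },
        { 0x10, 0x11, 0x12, 0x13 },
    };

    static void send_hid_report(uint8_t usage, uint8_t pressed)
    {
        /* USB stack call (LUFA, V-USB, etc.) would go here. */
        (void)usage; (void)pressed;
    }

    int main(void)
    {
        DDRB  = 0x0F;          /* PB0..PB3 drive the rows (outputs)    */
        DDRD  = 0x00;          /* PD0..PD3 read the columns (inputs)   */
        PORTD = 0x0F;          /* enable pull-ups on the column inputs */

        uint8_t state[ROWS][COLS] = {{0}};

        for (;;) {
            for (uint8_t r = 0; r < ROWS; r++) {
                PORTB = (uint8_t)~(1 << r);   /* pull one row low          */
                _delay_us(5);                 /* let the lines settle      */
                uint8_t cols = ~PIND & 0x0F;  /* low column = key pressed  */
                for (uint8_t c = 0; c < COLS; c++) {
                    uint8_t pressed = (cols >> c) & 1;
                    if (pressed != state[r][c]) {
                        state[r][c] = pressed;
                        send_hid_report(keymap[r][c], pressed);
                    }
                }
            }
        }
    }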
For some keyboards it is possible, as a recent hack of an Apple keyboard revealed.
Yes. You need to remove the manufacturer's controller and wire in a programmable keyboard controller. The better controllers contain a ROM to hold the key matrix (essential keyboard functions and raw scancodes are segregated and remain untouched). A convenient Windows app is used to create the matrix, then compile it to a binary, then flash the ROM.
Search for FlexMatrix SK5100/SK5101. I think it does macros too! =D
Related
I bought a new mouse (which doesn't have its own software) and I was wondering:
Since it has RGB lights that change on their own, as far as my understanding goes, it has some software inside it that controls this.
First, the simpler question: when I first connect the mouse, Windows says it's "installing" some stuff. Where can I find this stuff (files probably)?
Second: Is there any way for me to "reverse engineer" this and get access to the mouse's code, so that I would be able to control the LEDs' color, for example?
When Windows says it is "installing" something for your mouse, it is looking at the USB descriptors, figuring out what driver to associate with the mouse, and recording other metadata. You can look in your registry under "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\USB" to see what gets recorded. For a more complicated device, I think Windows could actually download driver files from the internet during this stage and install them on your computer. But for a standard HID mouse, it should already have the drivers it needs.
There is no standard way to read the code from a hardware device, and it is likely to be extremely difficult if the device is not open source. The code is likely stored in the memory of a microcontroller that has read protection enabled, meaning that it cannot be read from an external programmer. It is also possible that much of the functionality of the mouse is actually implemented in application-specific hardware instead of software.
If there is existing software on your PC that allows you to control the LED color of your mouse, your best hope is to run that software and look at what USB packets it is sending to the mouse using a USB protocol analyzer.
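Once you know what the vendor software sends, replaying it is usually a few lines with a library like hidapi. The sketch below is only illustrative: the vendor/product IDs and the report bytes are placeholders for whatever your own capture shows, and the header path varies by platform.

    /* Rough sketch of replaying a captured LED command with hidapi.
     * The IDs and report bytes below are placeholders; they would come
     * from your own capture, not from any real mouse. */
    #include <stdio.h>
    #include <hidapi/hidapi.h>

    int main(void)
    {
        if (hid_init() != 0)
            return 1;

        /* Hypothetical IDs; substitute the ones shown by your USB sniffer. */
        hid_device *dev = hid_open(0x1234, 0x5678, NULL);
        if (!dev) {
            fprintf(stderr, "device not found\n");
            return 1;
        }

        /* Hypothetical feature report: report ID 0x02, then R, G, B. */
        unsigned char report[] = { 0x02, 0xFF, 0x00, 0x40 };
        int res = hid_send_feature_report(dev, report, sizeof(report));
        printf("sent %d bytes\n", res);

        hid_close(dev);
        hid_exit();
        return 0;
    }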
I'm working on an arcade cabinet that will be able to play various video game consoles (real hardware, not emulated). There will be a PC inside to run a selection menu. I'll have to write that myself. I'll also need to program a PLC which will do various things like control the relays that switch audio/video/controls between the PC and the various consoles, etc. I'll need help with those two tasks in time, but they are not what I'm working on right now.
What I'm working on as a starting point has to do with the controller encoding. Basically, the controls for each player consist of a few buttons and a joystick. These use momentary, normally-open contact switches, one for each button, and one for each cardinal direction on the joystick. Pressing the button, or joystick direction, closes the switch. The state of the buttons is then communicated to the console by an encoder.
The encoder has a connection for each button and joystick direction which is connected to 5 volts ("high") through a pull-up resistor. When a button or direction is pressed, a connection to ground is made through the momentary switch. When the encoder reads ground ("low") on a button connection, it knows that a button has been pressed and it communicates this to the console.
I already have all this working with the various consoles, but I've thought of some features that would be nice to add. This is where my current task comes in.
The first feature is button remapping. Some of these games were designed with controllers in mind, so when you use them with an arcade control panel, some of the buttons may not be where you want them. Some games allow buttons to be remapped via software, but others do not. My idea is to add a PLC in between the joystick and buttons and the encoder. I'll call this PLC a "pre-encoder."
The pre-encoder would read the states of the buttons on some input pins, then write these states back to some output pins, relaying them to the encoder. The advantage is that its programming could associate any input pin with any output pin, effectively remapping the buttons. Whenever a console is selected via the computer's menu, a button-mapping profile associated with a particular game could be selected as well, and forwarded to the pre-encoder.
Of course, the pre-encoder's routine which reads the buttons and relays their states to the encoder must repeat very quickly for smooth control. These games will be running at about 50 to 60 Hz, meaning a new video frame every 16.67 to 20 ms. Ideally, the pre-encoder will be able to repeat this routine many, many times per frame to ensure the absolute minimum input lag. I want to ensure that the code and hardware selection is optimized to run as fast as possible.
The second feature is turbo buttons. Some games, especially arcade games, require a fire button to be pressed repeatedly every time you want to fire your gun, or your ship's cannons, etc, even if you have unlimited ammo. This seems unnecessary, and it will tire your fingers out pretty quickly. A turbo button is one that can be held down continuously, yet the game is being told that you are rapidly pressing and releasing it. This could be done in software for anything running on the PC, or with an analog solution like a 555 timer, but the best method is to synchronize the turbo button timing with the video refresh rate. By feeding the vertical sync pulse from the PC or video game console's video output to a PLC, it will know exactly how often a frame of video is rendered. Turbo button timing can then be controlled by defining, in numbers of frames, the periods when the button should be pressed and released. Timing information could also be included with the game-specific button profiles.
The third feature is slow buttons. Actually, this would probably only be applied to the joystick, but I'm referring to the switches for its cardinal directions as buttons. In certain games (it will probably only be used in shmups) it is sometimes needed to move your character (ship/plane) through very tight spaces. If movement is too fast in response to even minimal joystick input, you may go too far and crash. The idea is that, while a slow activation button is held, the joystick will be made less responsive by rapidly activating and deactivating it in the same manner as the turbo buttons.
I'm not sure if I want the pre-encoder itself to be watching the vertical sync pulse or if it will slow it down too much. My current thinking is that a separate PLC will be responsible for general management of the cab itself; watching the "on" button, switching relays, communicating directly with the PC, watching the vertical sync pulse, etc. This will free up the pre-encoder to run more quickly.
Here is some example "code" for the pre-encoder. Obviously, it's just a rough outline of what I have in mind, as I don't even know what language it will be. This example assumes that a dedicated PLC will be used just as the pre-encoder. A separate PLC will be responsible for watching the vertical sync pulse, in addition to other tasks, like getting a game profile from the computer and passing some of that info to the pre-encoder. That PLC will know what the frame timing should be for turbo and slow functions, it will count frames, and during frames when turbo buttons should be disabled, it outputs high to a pin on the pre-encoder PCB, letting it know to disable turbo buttons. During frames when it should be enabled, it outputs low to that pin. Same idea with the slow buttons. There is also a pin which the pre-encoder checks at the end of its routine, so it can be told to stop and await a different game profile.
get info from other PLC (which got it from the computer, from a user-selected game profile):
    array containing list of turbo buttons (buttons are identified by what input pin they are connected to)
    array containing list of slow buttons (will probably only be the joystick directions, if any)
    array containing list of slow activation buttons (should normally be only one button, if any)
    array containing list of normal buttons (not turbo or slow)
    array containing which output pin to use for each button (this determines remapping)

Begin Loop
    if turbo pin is high
        for each turbo button
            output pin = high
        next
    else
        for each turbo button
            output pin = input pin
        next
    end if
    if slow pin is high and slow activation button is pressed
        for each slow button
            output pin = high
        next
    else
        for each slow button
            output pin = input pin
        next
    end if
    for each normal button
        output pin = input pin
    next
Restart Loop unless stop pin is low
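For reference, here is roughly how that loop might look in C on an AVR-class part. This is only a sketch: the port names, remap table, and control-pin assignments are placeholders, and the active-low convention (low = pressed) is kept from above.

    /* Very rough C rendering of the pseudocode above for an AVR-class part.
     * Port names, the remap table, and the control-pin assignments are all
     * placeholders; low = pressed throughout. */
    #include <avr/io.h>
    #include <stdint.h>

    #define NUM_BUTTONS 8

    /* Per-button profile sent down from the supervising controller. */
    typedef struct {
        uint8_t out_bit;   /* which output pin this input maps to (remapping) */
        uint8_t is_turbo;  /* gated by the turbo-disable line                 */
        uint8_t is_slow;   /* gated by the slow-disable line                  */
    } button_cfg;

    static button_cfg cfg[NUM_BUTTONS];       /* filled in when a profile loads */

    #define TURBO_DISABLE  (PINC & _BV(PC0))    /* high = hold turbo buttons released */
    #define SLOW_DISABLE   (PINC & _BV(PC1))    /* high = hold slow buttons released  */
    #define SLOW_ACTIVATE  (!(PINC & _BV(PC2))) /* active-low slow-mode button        */
    #define STOP_REQUEST   (!(PINC & _BV(PC3))) /* low = stop, wait for new profile   */

    int main(void)
    {
        DDRB  = 0x00;  PORTB = 0xFF;   /* button inputs, pull-ups on        */
        DDRD  = 0xFF;  PORTD = 0xFF;   /* outputs to the encoder, idle high */
        DDRC  = 0x00;  PORTC = 0x00;   /* control lines from the other PLC  */

        /* load_profile(cfg);  would come from the supervising controller */

        while (!STOP_REQUEST) {
            uint8_t in  = PINB;        /* snapshot all inputs at once      */
            uint8_t out = 0xFF;        /* start with everything released   */

            for (uint8_t i = 0; i < NUM_BUTTONS; i++) {
                uint8_t released = in & _BV(i);    /* nonzero if input is high */

                if (cfg[i].is_turbo && TURBO_DISABLE)
                    released = 1;                  /* force released this frame */
                else if (cfg[i].is_slow && SLOW_DISABLE && SLOW_ACTIVATE)
                    released = 1;

                if (!released)
                    out &= (uint8_t)~_BV(cfg[i].out_bit); /* drive mapped pin low */
            }
            PORTD = out;
        }
        for (;;) { /* wait here until a new profile arrives */ }
    }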
If you've read all this, thank you for your time. So (finally), here are my questions:
What are your overall thoughts; on my idea in general, feasibility, etc.?
What kind of PLC should I use for the pre-encoder? I was originally thinking of trying an Arduino, but my reading indicates that it will be much too slow, due to its use of high-level programming libraries. I don't have a problem building my own board around another PLC.
What language should I use to program the PLC? I don't mind learning a new language. There's no time limit on this project, and I'll put in whatever it takes to get the pre-encoder running as fast as possible.
What will I need to flash my program onto the PLC?
At run-time, how should these PLC's communicate with each other, and with the PC?
Am I asking in the right place; right forum, right section, etc.? Anywhere else I should ask?
Awaiting your response eagerly,
-Rob
I have some thoughts that might be useful to you:
What are your overall thoughts; on my idea in general, feasibility, etc.?
This project sounds like you want to cheat at Defender, like I used to do with a 555 timer chip in my Atari joystick when I was a kid.
The project is feasible but you will need a pretty fast PLC.
You might spend a lot of time making this work, like a quest.
What kind of PLC should I use for the pre-encoder? I was originally thinking of trying an Arduino, but my reading indicates that it will be much too slow, due to its use of high-level programming libraries. I don't have a problem building my own board around another PLC.
As I thought of what PLC might be fast enough, a few things came to mind.
If you use a PLC that has a task architecture, you can use an event to trigger a task on the v-sync pulse, and another event to trigger on console activity. If you use a PLC without a task architecture, the user might notice the variable latency that occurs as the program scan moves in and out of phase with the v-sync and the activity in the game. This might not be true if the PLC is fast enough, say a 1 ms scan time.
Most inexpensive PLCs are never going to make it. The overhead and performance will keep most PLCs around 5-10ms per scan. However, a PC-based PLC might work well. So maybe a Beckhoff controller will work nicely. If you use something like a CX2000, it has Windows 7, USB, DVI for the user interface, and an Ethercat bus on the side to attach physical I/O cards for the controller and console connections. See about the software below. There are many non-PC-based PLCs that would work fine, but these will likely be expensive and harder to integrate.
The Arduino solution should work if you are using a fast enough model, but your development time will be higher because it doesn't come with anything but a blank screen and a bunch of libraries. Troubleshooting is much more of a pain in the neck than with a PLC; that's where PLCs really shine. You'll need to plan carefully to get the Arduino to work. Also, hardware interfacing a microcontroller is harder, and you'll have to manage debouncing the switches in your code. Every PLC has filtering on its inputs, and the variety of I/O makes design easy. But the Arduino or another microcontroller is really the choice if money is an issue. A fast PLC can be really expensive ($800 to $20k, think around $1500). If you are going to build more than a few systems, the Arduino might be better.
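As an example of the kind of thing you end up writing yourself on a microcontroller, here is one common counter-based debounce; the sample interval and threshold are just illustrative values.

    /* Require N consecutive identical reads before accepting a new state. */
    #include <stdint.h>

    #define DEBOUNCE_COUNT 4   /* e.g. 4 samples ~250 us apart = ~1 ms */

    uint8_t debounce(uint8_t raw, uint8_t *stable, uint8_t *counter)
    {
        if (raw == *stable) {
            *counter = 0;                  /* no change pending */
        } else if (++(*counter) >= DEBOUNCE_COUNT) {
            *stable  = raw;                /* change has persisted long enough */
            *counter = 0;
        }
        return *stable;
    }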
What language should I use to program the PLC? I don't mind learning a new language. There's no time limit on this project, and I'll put in whatever it takes to get the pre-encoder running as fast as possible.
IEC61131 is a standard for PLC programming languages. In the USA most PLCs are programmed in ladder logic because it is really easy to learn and quicker to troubleshoot and maintain in machinery. Structured text has its advantages too, particularly in performance. It looks like some amalgamation of basic/C/Java, easy to learn and looks almost like your pseudocode example. As for your project, I think it could be programmed in either language. I would never use the other IEC61131 languages for this task.
Beckhoff TwinCAT3 uses MS Visual Studio as the IDE, where you can write both the selection menu (in VB/C++/C#) and the PLC code (in IEC61131) in the same project. The runtime license for TwinCAT (on the CX2000 unit) runs in kernel mode, yielding processor time to Windows 7 only when it isn't doing something more important. I've used a few CX1020 models and they were great performers. The scan times were around 5 ms with a significant amount of code. Faster units will scan in under 1 ms.
What will I need to flash my program onto the PLC?
PLCs don't "flash" like microcontrollers. Whatever software you use to write the program will have a way to connect to the controller. The term "go online" refers to making that connection. The terms "download" and "upload" refer to transferring the program between the development computer and the PLC. The term "online edit" refers to making code changes while the PLC is executing the code. When modern PLCs are powered down, they use a battery to copy program and user RAM to flash; when they power up, they copy the flash back to RAM. To make a connection to any modern PLC, you will use a USB or Ethernet cable.
At run-time, how should these PLC's communicate with each other, and with the PC?
You plan to use more than one PLC? A PLC-to-PC connection is a complicated subject. The term "OPC server" refers to some [expensive] software that lets your custom Windows PC application access memory in PLCs. The Beckhoff solution glues all of that together nicely without buying more stuff. PLC-to-PLC communication is easier; the method is usually Ethernet, and the details vary widely.
Am I asking in the right place; right forum, right section, etc.? Anywhere else I should ask?
Sure, there is some PLC activity on this forum, which appears to tend toward hardcore PC/Web/Mobile development. I come here for awesomely intelligent answers to my deeper software questions.
You could try plctalk.net, a forum that is a little more geared toward nuts-and-bolts engineers and service techs with wild connectivity and compatibility questions related to machinery and automation. You might get some blank stares about vertical sync pulses. Their skill sets revolve around an industrial paradigm, where reliability is probably their highest calling.
You might also ask questions about performance on an Arduino or Microchip/Atmel/ARM forums. If you tell them that a PLC is faster than their hardware, that will rile them up real good! They might tell you that you can get microsecond performance numbers, which you can if you are using hardware interrupts and lots of physical circuitry to make that a reality, and you are able to cope with the sleepless nights of troubleshooting.
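For what it's worth, the hardware-interrupt approach they would point you at looks roughly like this on an AVR. The pin choices and the two-frames-on / two-frames-off turbo schedule are placeholders, not a recommendation.

    /* Count frames off the v-sync pulse with an external interrupt and
     * toggle a "turbo disable" gate line to the pre-encoder. */
    #include <avr/io.h>
    #include <avr/interrupt.h>
    #include <stdint.h>

    #define TURBO_ON_FRAMES   2
    #define TURBO_OFF_FRAMES  2

    static volatile uint8_t frame = 0;

    ISR(INT0_vect)                        /* one v-sync pulse = one frame */
    {
        if (frame < TURBO_ON_FRAMES)
            PORTB &= (uint8_t)~_BV(PB0);  /* turbo enabled: gate line low   */
        else
            PORTB |= _BV(PB0);            /* turbo disabled: gate line high */

        if (++frame >= TURBO_ON_FRAMES + TURBO_OFF_FRAMES)
            frame = 0;
    }

    int main(void)
    {
        DDRB  |= _BV(PB0);                /* gate line to the pre-encoder   */
        EICRA  = _BV(ISC01) | _BV(ISC00); /* interrupt on rising edge, INT0 */
        EIMSK  = _BV(INT0);               /* enable INT0                    */
        sei();
        for (;;) { /* everything happens in the ISR */ }
    }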
-Dennis
So, I'm looking into Permanent DOS attacks for a class, and I'm having a hard time coming up with concrete examples. There's a lot of information about Phlashing (flashing firmware to either brick the device, or put malicious firmware in its place, for those of you who don't know the term) but I'd like to have a broader set of examples.
That being said, there has to be a way to write code that will do something like wear out disk arms, right? Something that will have the disk seek to the end of the disk, then back to the front, on and on. Anyone have an example of how that would be accomplished? Is there some way to specify where to track to on a disk in C (similar to traversing to a certain point in a file, but for the entire HDD!)? If not, I guess there's always trying to force a file's location on the disk, which seems like less fun to try to accomplish. Again, can you do something like that programmatically?
If anyone has any insight into these types of attacks, or any good resources for me to check into, I'd appreciate it. Maybe you read a story about it on Slashdot a few years back? Let me know! The more info I can gather, the less likely I'll be forced to kill time during my talk by bricking my router in the class :) I'm not made of money OR routers!
Seems like these would primarily be limited to physical attacks and social engineering ("To enable your computer's hidden turbo function, remove the cover and pry this part"). But:
Adjust screen refresh rates to insane values to blow older CRTs
Monkey with ACPI fan, charge, or battery controls if possible to cause overheating or battery failure.
Overwrite every rewritable storage device of every kind attached to any bus. Discover and overwrite any IDE, USB, etc... device you know the flash updater details for.
Of course nothing is permanent. You can replace the hard drive, BIOS chips, CPU, motherboard, memory, etc...
Although it is mostly fictional, the Halt and Catch Fire instruction would be a very convenient and permanent DOS attack.
Steve Gibson (google his name) has a paper he wrote a few years back about protocol-level vulnerabilities in TCP/IP. Some of it is still pertinent today.
Socially engineer the power company or ISP to turn off service at the location in question.
Many devices in the computer today have their own firmware, including but not limited to the CPU, DVD drive, HDD, video card, motherboard (BIOS), etc. Most of these devices also have a way of updating their respective firmware, which can also be used to brick them pretty efficiently. Although this does require an individual approach to every device, often using privileged instructions and undocumented interfaces.
It's possible for a virus to do this. I seem to recall an actual virus doing this back in the day, but can't find anything to back that up.
I was able to find an article where the author has a conversation with the VP from Western Digital wherein he states a program could potentially access a hard drive's firmware causing such a DOS attack:
There are back doors if you will that allow us to get into places that the operating system can't go through the IDE connector
There used to be a few viruses that could cause old CRT monitors to break. They could send invalid sync signals out the VGA port that would be too high in frequency for the video sweep. I also remember a few that would use bad-sector flagging to draw images on the old versions of Scandisk (we are talking early 90s or older). I don't remember any of the names or have any references, but they used to be quite annoying.
Fortunately, better circuits, memory protection, and API abstraction have made such attacks very difficult, if not impossible.
As a hobby project to keep myself out of trouble, I'd like to build a little programmable timer device. It will basically accept a program, which is a list of times, and then count down from each time.
I'd like to use a microcontroller that can be programmed in C or Java. I have used BASIC in the past to make a little autonomous robot, so this time around I'd like something different.
What microcontroller and display would you recommend? I am looking to keep it simple, so the program would be loaded into memory via a computer (serial is OK, but USB would make it easier).
Just use a PIC like the 16F84 or 16F877 for this. It is more than enough.
For the display, use a 16x2 character LCD. It is easy to use and will give a nice look to your project.
The language doesn't really matter. You can use PIC C, Micro C, or anything you like. The LCD interface is really easy to drive.
For other components, you will just need a crystal and two capacitors for the oscillator, plus a pull-up resistor. The rest of the components depend on the input method you are going to use to set the times.
If you are using a computer to load the list, then you will need an additional circuit to convert the signal levels; use a MAX232 to do that. If you want to use USB, go ahead and use a PIC with USB support (the 18F series).
There is a set of nice tutorials at sodoityourself.com you can use. You can purchase the parts from them as well; I purchased from them once.
I would go with the MSP430. An eZ430 is $20 and you can get them at Digi-Key or from TI directly; after that, sets of 3 microcontroller boards are $10. There is LLVM and GCC (and binutils) compiler support. Super simple to program, extremely small, and extremely low power.
There are many ways to do this, and a number of people have already given pretty good suggestions. AVR or PIC are good starting points for a microcontroller that doesn't require too much in the way of complicated setup (hardware and software) or expense (these micros are very cheap). Honestly, I'm somewhat surprised that nobody has mentioned Arduino here yet. It has the advantage of being pretty easy to get started with, provides a USB connection (USB-to-serial, really), and if you don't like the board that the ATmega MCU is plugged into, you can later plug the chip in wherever you might want it. Also, while the provided programming environment offers some high-level tools to easily prototype things, you're still free to tweak the registers on the device and write any C code you might want to run on it.
As for an LCD display to use, I would recommend looking for anything that's either based on an HD44780 or emulates the behavior of one. These will typically use a set of parallel lines for talking to the display, but there are tons of code examples for interfacing with them. In Arduino's case, you can find examples for this type of display, and many others, on the Arduino Playground here: http://www.arduino.cc/playground/Code/LCD
As far as a clock is concerned, you can use the built-in clock that many 8-bit micros these days provide, although they're not always ideal in terms of precision. You can find an example for Arduino on doing this sort of thing here: http://www.arduino.cc/playground/Code/DateTime. If you want something that might be a little more precise you can get a DS1307 (Arduino example: http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1191209057/0).
I don't necessarily mean to steer you toward an Arduino, since there are a huge number of ways to do this sort of thing. Lately I've been working with 32-bit ARM micros (don't go that route first, as it's a much steeper learning curve, but they have many benefits) and I might use something in that ecosystem these days, but the Arduino is easy to recommend because it's relatively inexpensive, there's a large community of people out there using it, and chances are you can find a code example for at least part of what you're trying to do. When you need something that has more horsepower, configuration options, or RAM, there are options out there.
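Whatever board you pick, the countdown core itself is small. Here is a rough sketch in plain C for an ATmega328-class AVR; the 1 ms tick assumes a 16 MHz clock, and the program list and the display/alarm hooks are stand-ins for whatever you actually build.

    /* Countdown core driven by a 1 ms tick from Timer1 (CTC mode). */
    #include <avr/io.h>
    #include <avr/interrupt.h>
    #include <stdint.h>

    static const uint16_t program_s[] = { 90, 30, 120 };   /* example program */
    #define NUM_STEPS (sizeof(program_s) / sizeof(program_s[0]))

    static volatile uint16_t seconds_left;
    static volatile uint8_t  step;

    static void show(uint16_t seconds) { (void)seconds; }  /* LCD update stub */
    static void alarm(void)            { }                 /* buzzer stub     */

    ISR(TIMER1_COMPA_vect)                 /* fires every 1 ms */
    {
        static uint16_t ms = 0;
        if (step >= NUM_STEPS)
            return;                        /* whole program finished */
        if (++ms < 1000)
            return;
        ms = 0;
        if (--seconds_left == 0) {         /* this step finished */
            alarm();
            if (++step < NUM_STEPS)
                seconds_left = program_s[step];
        }
        show(seconds_left);
    }

    int main(void)
    {
        step = 0;
        seconds_left = program_s[0];

        TCCR1A = 0;
        TCCR1B = _BV(WGM12) | _BV(CS11) | _BV(CS10);  /* CTC, clk/64           */
        OCR1A  = 249;                                 /* 16 MHz/64/250 = 1 kHz */
        TIMSK1 = _BV(OCIE1A);
        sei();

        for (;;) { /* main loop stays free for buttons / serial */ }
    }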
Here are a few places where you can find some neat hardware (Arduino-related and otherwise) for projects like the one you're describing:
SparkFun Electronics
Adafruit Industries
DigiKey (this is a general electronics supplier, they have a bit of everything)
There are certainly tons more, though :-)
I agree with the other answers about using a PIC.
The PIC16F family does have C compilers available, though it is not ideally suited for C code. If performance is an issue, the 18F family would be better.
Note also that some PICs have internal RC oscillators. These aren't as precise as external crystals, but if that doesn't matter, then it's one less component (or three with its capacitors) to put on your board.
Microchip's ICD PIC programmer (for downloading and debugging your PIC software) plugs into the PC's USB port, and connects to the microcontroller via an RJ-11 connector.
Separately, if you want the software on the microcontroller to send data to the PC (e.g. to print messages in HyperTerminal), you can use a USB to RS232/TTL converter. One end goes into your PC's USB socket, and appears as a normal serial port; the other comes out to 5 V or 3.3 V signals that can be connected directly to your processor's UART, with no level-shifting required.
We've used the TTL-232R-3V3 from FTDI Chip, which works perfectly for this kind of application.
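On the microcontroller side, the transmit code is only a few lines. A minimal polled-UART sketch for an ATmega328-class AVR follows; the register names and the 16 MHz clock are assumptions, and a PIC would use its own USART registers instead.

    /* Polled UART transmit, the kind of thing you'd hang off a TTL-level
     * USB-serial cable. */
    #include <avr/io.h>
    #include <stdint.h>

    #ifndef F_CPU
    #define F_CPU 16000000UL
    #endif
    #define BAUD  9600
    #define UBRR_VALUE ((F_CPU / (16UL * BAUD)) - 1)

    static void uart_init(void)
    {
        UBRR0H = (uint8_t)(UBRR_VALUE >> 8);
        UBRR0L = (uint8_t)UBRR_VALUE;
        UCSR0B = _BV(TXEN0);                    /* transmit only                 */
        UCSR0C = _BV(UCSZ01) | _BV(UCSZ00);     /* 8 data bits, no parity, 1 stop */
    }

    static void uart_puts(const char *s)
    {
        while (*s) {
            while (!(UCSR0A & _BV(UDRE0)))      /* wait for empty data register */
                ;
            UDR0 = *s++;
        }
    }

    int main(void)
    {
        uart_init();
        uart_puts("hello from the micro\r\n");
        for (;;) ;
    }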
There are several ways to do this, and there is a lot of information on the net. If you are going to use micro controllers then you might need to invest in some programming equipment for them. This won't cost you much though.
The simplest way is to use the sine wave from the power grid. In Europe the AC power has a frequency of 50 Hz, and you can use that as the basis for your clock signal.
I've used Atmel's ATtiny and ATmega, which are great for programming simple and advanced projects. You can program it with C or Assembly, there are lots of great projects for it on the net, and the programmers available are very cheap.
Here is a project I found by Googling AVR 7 segment clock.
A second vote for PIC. Also, I recommend the magazine Circuit Cellar Ink. Some technical bookstores carry it, or you can subscribe: http://www.circellar.com/
The PIC series will be good. Since you are creating a timer, I recommend C or assembly (assembly is good), and use MPLAB as the development environment. You can check how accurate your timer is with the 'Stopwatch' in MPLAB. The PIC16F877 has a built-in hardware serial port, as does the PIC16F628, but the PIC16F877 has more ports. For more accurate timers, using a higher-frequency oscillator is recommended.
I would like to create/start a simulator for the following microcontroller board: http://www.sparkfun.com/commerce/product_info.php?products_id=707#
The firmware is written in assembly, so I'm looking for some pointers on how one would go about simulating the inputs that the hardware would receive, and having the simulator respond to the outputs from the firmware (which would also require running the firmware in the simulated environment).
Any pointers on how to start?
Thanks
Chris
Writing a whole emulator is going to be a real challenge. I've attempted to write an ARM emulator before, and let me tell you, it's not a small project. You're going to either have to emulate the entire CPU core, or find one that's already written.
You'll also need to figure out how all the IO works. There may be docs from sparkfun about that board, but you'll need to write a memory manager if it uses MMIO, etc.
The concept of an emulator isn't that far away from an interpreter, really. You need to interpret the firmware code, and basically follow along with the instructions.
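To make that concrete, here is the fetch/decode/execute shape in miniature. The two opcodes are loosely modeled on PIC18's MOVLW/MOVWF encodings, but this is nowhere near a faithful decoder; a real core would also model W, the status flags, banked RAM, the call stack, the peripherals, and so on.

    /* The interpreter idea in miniature: fetch, decode, execute. */
    #include <stdint.h>
    #include <stdio.h>

    #define MEM_SIZE 256

    static uint8_t  ram[MEM_SIZE];
    static uint16_t rom[MEM_SIZE];
    static uint16_t pc;
    static uint8_t  w;                           /* working register */

    enum { OP_MOVLW = 0x0E, OP_MOVWF = 0x6E };   /* illustrative encodings */

    static void step(void)
    {
        uint16_t insn    = rom[pc++];            /* fetch  */
        uint8_t  opcode  = insn >> 8;            /* decode */
        uint8_t  operand = insn & 0xFF;

        switch (opcode) {                        /* execute */
        case OP_MOVLW: w = operand;        break;    /* load literal into W */
        case OP_MOVWF: ram[operand] = w;   break;    /* store W into "RAM"  */
        default:       printf("unimplemented opcode 0x%02X\n", opcode); break;
        }
    }

    int main(void)
    {
        rom[0] = (OP_MOVLW << 8) | 0x2A;         /* tiny test program */
        rom[1] = (OP_MOVWF << 8) | 0x10;

        step();
        step();
        printf("ram[0x10] = 0x%02X\n", ram[0x10]);   /* prints 0x2A */
        return 0;
    }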
I would recommend a good interactive debugger instead of tackling an emulator. The chances of destroying the hardware are low, but really, would you rather buy a new board or spend 9 months writing something that won't implement the entire system?
It's likely that the PIC 18F2520 already has an emulator core written for it, but you'll need to delve into all the hardware specs to see how all the IO is mapped still. If you're feeling up to it, it would be a good project, but I would consider just using a remote debugger instead.
You'll have to write a PIC simulator and then emulate the IO functionality of the ports.
To be honest, it looks like it's designed as a dev kit - I wouldn't worry about your code destroying the device if you take care. Unless this is a runner-up for an enterprise package, I would seriously question the ROI on writing a sim.
Is there a particular reason to make an emulator/simulator, vs. just using the real thing?
The board is inexpensive; Microchip now has the RealICE debugger which is quite a bit more responsive than the old ICD2 "hockey puck".
Microchip's MPLAB already has a built-in simulator. It won't simulate the whole board for you, but it will handle the 18F2520. You can sort of use input test vectors & log output files, I've done this before with a different Microchip IC and it was doable but kinda cumbersome. I would suggest you take the unit-testing approach and modularize the way you do things; figure out your test inputs and expected outputs for a manageable piece of the system.
It's likely that the PIC 18F2520 already has an emulator core written for it,
An open-source, cross-platform simulator for Microchip PICs is available under the name "gpsim".
It's extremely unlikely that a bug in your code could damage the physical circuitry. If that's possible, then it is either a bug in the board design or it should be very clearly documented.
If I may offer you a suggestion from many years of experience working with these devices: don't program them in assembly. You will go insane. Use C or BASIC or some higher-level language. Microchip produces a C compiler for most of their chips (dunno about this one), and other companies produce them as well.
If you insist on using an emulator, I'm pretty sure Microchip makes an emulator for nearly every one of their microcontrollers (at least one from each product line, which would probably be good enough). These emulators are not always cheap, and I'm unsure of their ability to accept complex external input.
If you still want to try writing your own, I think you'll find that emulating the PIC itself will be fairly straightforward -- the format of all the opcodes is well documented, as is the memory architecture, etc. It's going to be emulating the other devices on the board and the interconnections between them that will kill you. You might want to look into coding the interconnections between the components using a VHDL tool that will allow you to create custom simulations for the different components.
Isn't this a hardware-in-the-loop simulator problem? (e.g. http://www.embedded.com/15201692 )