Mag stripe keyboard wedge print speed - input

I am writing an application where I need to accept information from a mag swipe keyboard wedge. I am running into concerns about the wedge's print speed: any lag in the application, even at input, matters a great deal.
I cannot seem to find an answer to this question:
What is the main bottleneck for a mag stripe reader's print speed (keyboard wedge specific, via USB or USB slave)?
I have noticed that the print speed differs from machine to machine. I want to say that it is either the USB port or the RAM on the device, but I unfortunately don't know enough about the hardware end of things to come to a conclusion. Furthermore, might it be the operating system or the card reader? I know the reader itself has a print speed, but that does not explain the fluctuation.
Does anyone know what it might be?

So it turns out it is the baud rate. You can change the baud rate on both the computer and the card reader to adjust the speed at which the data (or "print") is sent and received.
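For what it's worth, if your reader can also be put into a USB-serial (CDC) mode (some models support this alongside keyboard emulation), the baud rate becomes explicit and you can read a whole swipe in one go rather than keystroke by keystroke. A rough sketch using pyserial - the port name and the carriage-return terminator are assumptions about your particular reader:

    # Rough sketch: reading one swipe over a USB-serial (CDC) connection.
    # Assumptions: the reader shows up as a serial port (port name below is
    # hypothetical) and terminates each swipe with '\r'. Requires pyserial.
    import serial

    PORT = "COM3"      # hypothetical; something like /dev/ttyUSB0 on Linux
    BAUD = 9600        # must match the baud rate configured on the reader

    with serial.Serial(PORT, BAUD, timeout=2) as ser:
        print("Swipe a card...")
        swipe = ser.read_until(b"\r")                 # one whole swipe at once
        print("Raw track data:", swipe.decode("ascii", "replace").strip())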

Related

GNU Radio and bladeRF on Raspberry Pi (simple FSK system)

I am having a problem porting a GNU Radio setup from a PC (Windows 10, USB 3) to a Raspberry Pi 2 (USB 2). USB bandwidth and CPU should not be a problem, I think (only around 30% utilization while running). Essentially, it looks like the RPi is 'pausing' during transmission, while the PC is not. The receiver is running on the PC in both cases. I am including a picture of what I see after the FSK demod when running the transmitter on the PC vs. the Pi (circled 'pause' area), as well as a picture of my (admittedly sloppy) schematic. Any help/tips are greatly appreciated.
Edit: It appears it may actually be processing limitations. Switching from 9400 baud to 2400 baud makes the issue go away. If anyone has experience with GNU Radio: am I doing anything overly inefficient, or should I just drop the comm rate?
The first thing I would do would be to lower your sample rates.
You don't need 1.5Ms/s if you are going to keep only the lowest 32k in your low pass filter.
Then you could do the same for the second stage, after the quadrature demod, if that's not enough. (By the way, the sample rate of your second low-pass filter does not seem to match the actual sample rate of that stage, which is still 1.5 Ms/s if I'm not mistaken.)
Anyway, GNU Radio uses a lot of processing power, so try not to use a sampling rate way above what you actually need ;)
In your case, you could cut the incoming sample rate down to 64k (say 80k for safety). Roughly 18 times fewer samples to process might do the trick :)
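To make that concrete, here is roughly what filtering and decimating at the first low-pass stage looks like in the GNU Radio Python API. The block names and rates are assumptions based on the description above (1.5 Ms/s in, about 32 kHz of interest), not your actual flowgraph:

    # Sketch of filtering *and* decimating early in a flowgraph.
    # 1.5 Ms/s in, keep ~32 kHz, decimate by 18 to land near ~83 ks/s.
    from gnuradio import gr, blocks, filter
    from gnuradio.filter import firdes

    class DecimateEarly(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self, "Decimate early")
            samp_rate = 1.5e6      # incoming rate from the SDR source
            decim = 18             # 1.5e6 / 18 ~ 83 ks/s, still > 2 * 32 kHz
            taps = firdes.low_pass(1.0, samp_rate, 32e3, 8e3)
            src = blocks.null_source(gr.sizeof_gr_complex)  # stand-in for the bladeRF source
            lpf = filter.fir_filter_ccf(decim, taps)        # low-pass and decimate in one block
            snk = blocks.null_sink(gr.sizeof_gr_complex)    # downstream demod would go here
            self.connect(src, lpf, snk)

Every block after that filter then runs at the lower rate, which is where the CPU savings come from.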

Hacking computer hardware to do experiment control

I am a physicist, and I had a revelation a few weeks ago about how I might be able to use my personal computer to get much finer control over laboratory experiments than is typically the case. Before I ran off to try this out though, I wanted to check the feasibility with people who have more expertise than myself in such matters.
The idea is to use the i/o ports---VGA, ethernet, speaker jacks, etc.---on the computer to talk directly to the sensors and actuators in the experimental setup. E.g. cut open one side of an ethernet cable (with the other end attached to the computer) and send each line to a different device. I knew a postdoc who did something very similar using a BeagleBone. He wrote some assembly code that let him sync everything with the internal clock and used the GPIO pins to effectively give him a hybrid signal generator/scope that was completely programmable. It seems like the same thing should be possible with a laptop, and this would have the additional benefit that you can do data analysis from the same device.
The main potential difficulty that I foresee is that the hardware on a BeagleBone is designed with this sort of i/o in mind, whereas I expect the hardware on a laptop will probably be harder to control directly. I know for example (from some preliminary investigation, http://ask.metafilter.com/125812/Simple-USB-control-how-to-blink-an-LED-via-code) that USB ports will be difficult to access this way, and VGA is (according to VGA 15 pin port data read and write using Matlab) impossible. I haven't found anything about using other ports like ethernet or speaker jacks, though.
So the main question is: will this idea be feasible (without investing many months for each new variation of the hardware), and if so what type of i/o (ethernet, speaker jacks, etc.) is likely to be the best bet?
Auxiliary questions are:
Where can I find material to learn how I might go about executing this plan? I'm not even sure what keywords to plug in on Google.
Will the ease with which I can do this depend strongly on operating system or hardware brand?
The only cable I can think of for a PC that can get close to this would be a parallel printer cable, which has pretty much gone away. It's a 25-wire cable that data is spread across so that more data can be sent at the same time. I'm just not sure if you can target a specific line or if it's more of a left-to-right fill as data is sent.
To use one on a laptop today would definitely be difficult; you won't find any laptops with parallel ports. There are USB-to-parallel and serial-to-parallel cables, but I would guess that the only control you would have is over the USB or serial interface, not the parallel side.
As for Ethernet, you have four twisted pairs, with only two pairs in use (for 10/100 Mbit) and two spare.
There's some hardware available called Z-Wave that you might want to look into. Z-Wave will let you build a network of devices that communicate in a mesh. I'm not sure what kind of response time you need, though.
I actually just thought of something that might be a good solution: check out security equipment. There's a lot of equipment available for PCs that monitors doors, windows, sensors, etc. That industry might have what you're looking for.
I think the easiest way would be to use the USB port as a Human Interface Device (HID): use a PIC with built-in USB functionality and a custom-built PIC program to encode the data sent to the computer. That way you can program it independently of the OS, since all major operating systems support HID over USB.
In any case, if you use the mic/VGA/HDMI or whatever other port, you still need a device to encode and transmit the data, plus a program on the computer to decode the data being sent.
And remember that different hardware has different software (drivers) that might decode the raw data in odd ways, making your I/O hardware-dependent.
Hope this helps, but that's why USB was invented in the first place: to make things hardware- and OS-independent.
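To illustrate the PC side of the HID route: once the PIC enumerates as a vendor-defined HID device, reading its reports from Python is also OS-independent. A sketch using the cython-hidapi bindings (installed as 'hidapi'; note that the various Python HID packages have slightly different APIs, and the VID/PID below are placeholders):

    # PC-side sketch: read reports from a custom HID device (e.g. a USB PIC).
    # VID/PID are placeholders; replace with the IDs your firmware reports.
    import hid

    VID, PID = 0x04D8, 0x003F      # placeholder vendor/product IDs

    dev = hid.device()
    dev.open(VID, PID)
    dev.set_nonblocking(True)
    try:
        while True:
            report = dev.read(64)          # up to one 64-byte HID report
            if report:
                print("sensor bytes:", report)
    finally:
        dev.close()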

How to get a single screen shot from photo camera using microcontroller

Let's imagine that we have any popular photo camera (a Canon or whatever) installed on a mechanical platform. This platform allows us to accurately point the camera's lens at any object of interest, and it is controlled from a PC via a microcontroller board. But we need feedback from the camera: the image that currently appears on the camera's display. Obviously, this feedback is required to be sure that the camera is looking in the right direction. At the moment I don't know how to get a single still image from the camera via a microcontroller.
Could you please recommend any directions to dig into? Any recommendations on how to select a photo camera (web cameras are not allowed)? Any tips?
Thank you in advance =)
Dwelch is right, you need to pick a "friendly" camera and work from there - google CHDK for a starter.
You could use the SPI interface of a micro to spoof being an SD card, and accept image data from the camera straight into the micro, but you would probably need quite a fast micro with a fair amount of RAM, especially if you want to do any processing on it.
Other than that, you could sample the camera's AV-output (if it has one), either into the micro or straight into the PC via a USB capture stick (or USB capture stick into micro if you're being a show-off), or maybe interrogate the camera over its USB or (insert name of proprietary port here) IO port.
Getting more hacky (yes, even more!) you could sniff the LCD data bus of the camera and steal the image from that, but that brings all sorts of pain, and tiny, tiny screws.
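If the camera speaks PTP over USB, a quick way to prove out the 'interrogate it over USB' route from the PC side (before any microcontroller is involved) is gphoto2. A sketch assuming gphoto2 is installed and the camera is supported by libgphoto2:

    # Sketch: trigger a shot on a PTP-capable camera over USB and download it,
    # using the gphoto2 command-line tool. PC-side only, for testing the camera.
    import subprocess

    subprocess.run(
        ["gphoto2", "--capture-image-and-download", "--filename", "feedback.jpg"],
        check=True,
    )

If that works, the PC could keep the image-feedback job and leave the micro to the platform control, or the same protocol could in principle be driven from a sufficiently capable micro.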

Plot a graph of Time vs RSSI for a 433MHz RF ASK Receiver

Hi, I'm using the following RF module
http://www.apogeekits.com/rf_receiver_module_rx433.htm
on an embedded board with the PIC16F628A. Sadly, I realized that the signal-strength output is analog, and I couldn't come up with any ideas for getting the RSSI reading off the pin, because, well, my PIC is digital, duh!
My basic idea was
To get the RSSI value from my Receiver
Send it to the PIC
Link the PIC to a PC via RS232
Plot a graph of time vs RSSI of the receiver (so I can make out how close my TX is to my RX)
I thought it was bloody brilliant at first, but I've hit a dead end here. Any ideas on getting the RSSI data to my PC from this receiver would be nice.
Thanks in Advance
You can get a PIC that has an integrated ADC for sampling the analog signal. Or, you can use an external ADC chip to do the conversion. You would connect that to your PIC using SPI or I2C.
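Once the PIC (or the external ADC) is streaming readings out over RS232, the PC side of the time-vs-RSSI plot is straightforward. A sketch assuming the PIC prints one integer ADC reading per line at 9600 baud (the port name and output format are assumptions):

    # PC-side sketch: log RSSI samples arriving over a serial port and plot
    # time vs RSSI. Requires pyserial and matplotlib.
    import time
    import serial
    import matplotlib.pyplot as plt

    PORT, BAUD = "COM1", 9600          # hypothetical port name
    times, rssi = [], []

    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        t0 = time.time()
        while time.time() - t0 < 30:   # capture 30 seconds of data
            line = ser.readline().strip()
            if line:
                rssi.append(int(line))
                times.append(time.time() - t0)

    plt.plot(times, rssi)
    plt.xlabel("Time (s)")
    plt.ylabel("RSSI (ADC counts)")
    plt.show()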
The simplest thing to do is obviously to use a more appropriate microcontroller - one with an ADC! There are many (most have one), including PICs (though that wouldn't be my first choice).
Attaching an external SPI or I2C ADC might be a bit tedious: since your part has no SPI or I2C peripheral, you'd have to bit-bang it. If you do that, use an SPI part - it's simpler. Your sample rate will suffer and may end up being a bit jittery if you are not careful.
Another solution is to use a voltage-controlled PWM, then use the timer input capture to time the pulse width. That will give you good regularity and potentially good resolution. You can get a chip (example) to do that, or roll your own. That last option requires a triangle-wave input as well as the measured (control) voltage, but on the same site...
In a similar vein, you could use a low-frequency VCO (example) and use its output to clock one of the timers, then use a second timer to periodically sample the first and reset it. The count will relate to the voltage, though not necessarily linearly; linearisation could be done on the PIC or at the receiving PC - I'd go for the latter, since your micro will be poor at arithmetic (performance-wise), even integer arithmetic, especially if it involves division.
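If you do the linearisation on the PC as suggested, a small calibration table plus interpolation is usually enough. A sketch with made-up calibration points (you would measure real count/voltage pairs yourself):

    # Sketch: linearise raw timer counts on the PC using a calibration table.
    # The calibration points below are placeholders, not real measurements.
    import numpy as np

    cal_counts = np.array([120, 340, 610, 930, 1300])   # counts from the PIC
    cal_volts  = np.array([0.5, 1.0, 1.5, 2.0, 2.5])    # measured RSSI voltages

    def counts_to_volts(counts):
        """Piecewise-linear interpolation between calibration points."""
        return np.interp(counts, cal_counts, cal_volts)

    print(counts_to_volts(500))   # e.g. a raw count received over RS232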

Is there software or code to alter USB power output

I had a look at this and this, but no one sounded particularly sure of their ideas, and I'm after a somewhat different thing anyway. I want to hook the USB power wires (red and black) up to my phone so I don't have to use a battery (the battery is dead anyway, and this is just an experiment). The problem is that the USB standard ensures that a minimum of 4.35V is supplied, when I only want 3.7V. Does anyone know for sure whether you can or cannot regulate the output voltage programmatically? Some other queries I have are: what kind of power does sleep mode provide? And what would I need to code something in to play with this - C++?
No, you won't find a computer that allows you to set this voltage in software. It would break the USB specification.
You can get 100mA by default (150mA under USB 3.0), and 500mA if your USB device negotiates it with the computer (requiring a little bit of logic in the device). Multiply by 5V to get the provided power.
A bit more info on the answer from Pascal:
The normal operation (Non-Configured mode) is 100mA
In theory, the operating system should check the MaxPower value of the device's configuration descriptor to decide whether to allow it to draw more than 100mA.
In practice, PCs do not do this (and have no way to control it), so you can try taking 500mA.
(Of course, connecting a bus-powered hub and attaching more than one 500mA device should not work.)
If the device is not actively used, the OS may (and should) suspend it. When suspended, the current is limited to 0.5-1mA (again, in theory, since it cannot be controlled by software).
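To see what a device actually declares, you can read bMaxPower from its active configuration descriptor on the PC. A sketch using pyusb (the VID/PID are placeholders; for USB 2.0 devices the value is reported in units of 2mA):

    # Sketch: print the declared MaxPower of a USB device's active configuration.
    # Requires pyusb; VID/PID below are placeholders.
    import usb.core

    dev = usb.core.find(idVendor=0x1234, idProduct=0x5678)   # placeholder IDs
    if dev is None:
        raise SystemExit("device not found")

    cfg = dev.get_active_configuration()
    print("Declared MaxPower: %d mA" % (cfg.bMaxPower * 2))  # 2 mA units (USB 2.0)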