How to make a picture program for a GPS

How can I re-program a GPS with my own software? I'm new to this, but I think I have to write all 0's to the hard disk first to wipe it clean. From there, what program can I download or use to build my own software for putting pictures/audio on the device? Basically, I want to transform it from a GPS into a digital picture frame. I need help, so can someone please advise?

You're going to need to flash custom firmware to the device - a process that requires a lot of expertise, as well as information about the specifics of the hardware, which is likely not publicly disclosed. It's highly unlikely you'll be able to pull this off.

Related

Details on USB - no luck so far

I've been looking for a long time, with no luck, for a detailed description of how the USB protocol and cabling work. I'm after a detailed yet not overcomplicated explanation of how things work on the software and hardware sides of USB. Links and explanations would be appreciated. I've really run out of ideas, so it would be great if you could help me out.
This is what I do know:
USB hardware carries four lines: 5 V power, ground, and a differential data pair (D+/D-).
When connecting, the device can ask for a specified amount of current.
The transfer speeds for USB are quite fast compared to traditional serial connections.
When connecting, a device will send descriptors to the host describing itself. These descriptors also define the format of the data it will transfer.
What I don't know:
How does a program in C/C++ write directly to a USB port? Does it write to an address in the port?
How do some devices describe themselves as HID?
How do drivers work?
Everything else...
Thank you!
Identification
Every device has a (unique) Vendor and Product ID. These are provided (sold) by usb.org to identify a device. You can use a library like libusbx to enumerate all connected devices and select the one with the Vendor and Product ID you are looking for.
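A minimal sketch of that enumeration step, using the libusb-1.0-style API that libusbx also exposes (the 0x16c0:0x05df VID/PID pair below is just a placeholder; substitute your own device's IDs):

```c
/* List all attached devices, then open one by Vendor/Product ID.
   Build with: gcc usbfind.c -lusb-1.0 */
#include <stdio.h>
#include <libusb-1.0/libusb.h>

int main(void)
{
    libusb_context *ctx = NULL;
    if (libusb_init(&ctx) != 0) return 1;

    /* walk the device list and print every VID:PID pair */
    libusb_device **list;
    ssize_t n = libusb_get_device_list(ctx, &list);
    for (ssize_t i = 0; i < n; i++) {
        struct libusb_device_descriptor d;
        if (libusb_get_device_descriptor(list[i], &d) == 0)
            printf("found %04x:%04x\n", d.idVendor, d.idProduct);
    }
    libusb_free_device_list(list, 1);

    /* convenience wrapper that does the same walk and match for you;
       0x16c0:0x05df is a placeholder - use your device's IDs */
    libusb_device_handle *h = libusb_open_device_with_vid_pid(ctx, 0x16c0, 0x05df);
    if (h) {
        printf("device opened\n");
        libusb_close(h);
    }
    libusb_exit(ctx);
    return 0;
}
```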
HID Descriptors
The point of HID descriptors is actually to do away with drivers. HID descriptors are a universal way of describing your device so you don't need to waste time on a driver for every system/architecture/etc. (Same concept as the JVM.)
Reports
You will use input, output, or feature reports to read from or write to your device. You send data to your device in an output or feature report. A report is typically 8 bytes, I believe, only one of which is the single character you wish to write. The HID descriptor contains all the information you need to put together a report, although I'm struggling to find a link that clarifies this. A sketch of one way to send a report follows.
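To make the report mechanics concrete, here is a hedged sketch of writing one byte via a HID SET_REPORT control transfer with the libusb-1.0/libusbx API. The 8-byte report size, report ID 0, and interface 0 are assumptions (your device's HID descriptor dictates the real values), and you would normally claim the interface with libusb_claim_interface() first:

```c
/* Hedged sketch: write one byte to a HID device via a SET_REPORT control
   transfer. Assumes interface 0 has already been claimed. */
#include <libusb-1.0/libusb.h>

int send_report(libusb_device_handle *h, char c)
{
    unsigned char buf[8] = {0};   /* assumed 8-byte report, zero padding */
    buf[0] = (unsigned char)c;    /* the one meaningful byte */

    /* bmRequestType 0x21  = host-to-device | class | interface
       bRequest     0x09   = SET_REPORT
       wValue       0x0200 = (report type 2 = output) << 8 | report ID 0
       wIndex       0      = interface number */
    return libusb_control_transfer(h, 0x21, 0x09, 0x0200, 0,
                                   buf, sizeof buf, 1000 /* ms timeout */);
}
```

The mirror-image read is a GET_REPORT request, with bmRequestType 0xA1 and bRequest 0x01.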
Potential Libraries
In an effort to be open-minded, here are all the libraries I'm familiar with, along with some info about each.
libusb-0.1
First off is libusb-0.1. This used to be the go-to library and was built into many Linux kernels and, I believe, Windows. It is very easy to use and there is a lot of documentation. However, the maintainer stopped updating it, and it went untouched for many years. It supports only synchronous transfers. (If an error occurs, the program can block indefinitely while it waits for a transfer.)
libusbx
Next is libusbx. This is what most people would suggest today, and I agree. It was published by developers frustrated with the maintainer of libusb-0.1. The code is much more lightweight and up-to-date, and importantly it does not require root privileges, unlike libusb-0.1 and libusb-1.0 (discussed in a second). It supports both synchronous and asynchronous transfers.
libusb-1.0
Then there is libusb-1.0. This was the first update to libusb-0.1 in many years, and it is not compatible with libusb-0.1. It was published the same day as libusbx, as retaliation (I assume) and as an attempt to make up for the lack of updates and retain a user base. It supports both synchronous and asynchronous transfers.
hid.h
Finally, there is the HID library (hid.h). It was built on top of libusb as another layer of abstraction, but honestly, I find it confusing, and it adds more overhead than necessary.
Some Good Resources
Understanding HID Descriptors
Control Message Transfer Documentation (Very Good Link IMO)
Rolling Your Own HID Descriptor
Good Visual of HID Reports for Transfers
Great List of bmRequestType constants (You will need this or similar)
A simple terminal app for speaking with DigiSpark using libusbx and libusb-0.1
I know this isn't exactly what you are looking for, but maybe it will get you started!
This website has a general overview of how USB devices work:
https://www.beyondlogic.org/usbnutshell/usb1.shtml
Particular sections answer items from your list of things you don't yet know about USB.
E.g. to find out how USB devices identify themselves, read about USB descriptors:
https://www.beyondlogic.org/usbnutshell/usb5.shtml#DeviceDescriptors
To learn how a C/C++ program can talk to a USB device, see examples on using the libusb library:
https://github.com/libusb/libusb/tree/master/examples
To learn how USB drivers work, see a tutorial from Bootlin:
https://bootlin.com/blog/usb-slides/

How do you hack/decompile Camera firmware? (w/ decompiling tangent)

I wanted to know what steps one would need to take to "hack" a camera's firmware to add/change features, specifically cameras of Canon or Olympus make.
I understand this is an involved topic, but a general outline of the steps and what issues I should keep an eye out for would be appreciated.
I presume the first step is to take the firmware, load it into a decompiler (any recommendations?) and examine the contents. I admit I've never decompiled code before, so this will be a good challenge to get me started. Any advice? Books? Tutorials? What should I expect?
Thanks stack as always!
Note: I know about Magic Lantern and CHDK; I want technical advice on how they were started and came to be.
http://magiclantern.wikia.com/wiki/Decompiling
http://magiclantern.wikia.com/wiki/Struct_Guessing
http://magiclantern.wikia.com/wiki/Firmware_file
http://magiclantern.wikia.com/wiki/GUI_Events/550D
http://magiclantern.wikia.com/wiki/Register_Map/Brute_Force
I wanted to know what steps one would need to take to "hack" a camera's firmware to add/change features, specifically cameras of Canon or Olympus make.
General steps for this hacking/reverse engineering:
Gathering information about the camera system (main CPU, image coprocessor, RAM/flash chips, ...). Challenges: camera makers tend to hide such sensitive information, and datasheets/documentation for proprietary chips are not released to the public at all.
Getting the firmware: either by dumping the flash memory inside the camera or by extracting the firmware from the update packages used for camera firmware updates. Challenges: accessing the flash readout circuitry is not a trivial job, especially given that camera systems have some of the most densely populated PCBs. Also, proprietary firmware is often heavily protected with sophisticated encryption when embedded in update packages.
Disassembly: getting somewhat more readable instructions out of the firmware opcodes. Challenges: although disassemblers are widely available, they will give you the "operational" equivalent assembly code with no guarantee of it being human-readable or meaningful. (A useful first pass is simply scanning the image for printable strings; see the sketch after this list.)
Customization: once you understand most of the code's functionality, you can make modifications, which must not harm the normal operation of the camera system. Challenges: not an easy task.
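As a concrete starting point for the information-gathering and disassembly stages, here is a minimal sketch in the spirit of the Unix strings tool: it scans a firmware image for runs of printable ASCII, which often surface version strings and debug messages:

```c
/* Reconnaissance sketch: list printable ASCII strings (6+ chars) in a
   firmware image, with file offsets - the same idea as the Unix `strings`
   tool. Version strings and debug messages are often the first foothold. */
#include <stdio.h>
#include <ctype.h>

int main(int argc, char **argv)
{
    if (argc != 2) { fprintf(stderr, "usage: %s firmware.bin\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    char buf[256];
    int c, len = 0;
    long off = 0, start = 0;
    while ((c = fgetc(f)) != EOF) {
        if (isprint(c) && len < (int)sizeof buf - 1) {
            if (len == 0) start = off;        /* remember where the run began */
            buf[len++] = (char)c;
        } else {
            if (len >= 6) { buf[len] = '\0'; printf("0x%06lx: %s\n", start, buf); }
            len = 0;
        }
        off++;
    }
    if (len >= 6) { buf[len] = '\0'; printf("0x%06lx: %s\n", start, buf); }
    fclose(f);
    return 0;
}
```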
Alternatively, I highly recommend looking into an already open-source camera software (and hardware) project; you can learn a lot about camera systems that way. Such projects include Elphel and AXIOM.

Windows Low-Level Graphics [closed]

I'm new to programming. I know C/C++ and the basics of Win32. I am now trying to do graphics, but I want the fastest connection to the screen. I realize most people go with OpenGL or DirectX, but I don't want the overhead; I want to start from scratch and control the pixel data. I know about GDI bitmaps, but I'm not sure whether they give the best access to the data. I know that I have to talk through Windows, which is the trouble. Do OpenGL and DirectX compile down to the level of GDI, or is there a special way they do it? Do they bypass it or use similar code? Please don't ask why I want to do this. An explanation of how this is done might help - like how Windows combines all windows to create the final image.
The most direct access to pixel data is via shaders, which are supported by both OpenGL and Direct3D. They are cross-compiled and run directly on the video card. They do not use OpenGL, they do not have OpenGL overhead. OpenGL is just used to get them to the graphics card's own processor in the first place.
Anything you do on the CPU has to first be copied across the bus (typically PCI-express) to the video card. GDI is actually many levels removed from the graphics memory.
OpenGL, Direct3D, Direct2D, GDI, and GDI+ are all abstraction layers. The GPU vendor writes a driver that accepts these standard command functions, re-encodes the data in the card-specific format, then sends it to the card. Typically OpenGL and Direct3D are the most heavily optimized and also require the least amount of re-encoding.
How Windows combines the various on-screen windows to create the full-screen image depends heavily on what version of Windows you are talking about. DWM changed everything. Since DWM was introduced in Vista, programs render to their personal areas of GPU memory, then the window manager uses the texture lookup units of the video card to efficiently layer each of the programs' individual areas onto the screen primary buffer. When a program (usually a game) requests full-screen exclusive access, this step is skipped and the driver causes rendering commands from that application to affect the primary screen buffer directly.
Assuming that the CPU is generating the data which needs to be displayed, the fastest and most efficient approach is likely to be block-copying that data into a vertex buffer object and using OpenGL commands to rasterize it as lines or polygons or whatever (or the Direct3D equivalent). If you previously thought that GDI was the low-level interface, you've got some reading ahead of you to make this work. But it will run several orders of magnitude faster than pure GDI - so much faster, in fact, that in the current architecture GDI (and WPF) are built on top of Direct2D and/or Direct3D.
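As a minimal sketch of that pattern (assuming Linux with GLFW and a driver that exposes the GL 1.5 buffer-object entry points directly), the CPU regenerates a line strip every frame and block-copies it into a VBO with a single glBufferData call:

```c
/* Minimal sketch: CPU generates vertex data each frame and block-copies it
   into a VBO (GLFW + legacy OpenGL). Build: gcc vbo.c -lglfw -lGL -lm */
#define GL_GLEXT_PROTOTYPES
#include <GLFW/glfw3.h>
#include <math.h>

#define N 256

int main(void)
{
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "VBO streaming", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    float verts[2 * N];                       /* x,y pairs for a line strip */
    double t = 0.0;
    while (!glfwWindowShouldClose(win)) {
        for (int i = 0; i < N; i++) {         /* CPU generates the data */
            verts[2 * i]     = -1.0f + 2.0f * i / (N - 1);
            verts[2 * i + 1] = (float)sin(t + i * 0.05);
        }
        /* one block copy from system memory into the buffer object */
        glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STREAM_DRAW);

        glClear(GL_COLOR_BUFFER_BIT);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, (void *)0);
        glDrawArrays(GL_LINE_STRIP, 0, N);
        glDisableClientState(GL_VERTEX_ARRAY);

        glfwSwapBuffers(win);
        glfwPollEvents();
        t += 0.02;
    }
    glfwTerminate();
    return 0;
}
```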
but I want the fastest connection to the screen
I want to start from scratch and control the pixel data
You're asking for the impossible. You get the best performance when you use GPU-accelerated functions. However, in this case you don't get direct access to pixel data, and trying to access it (reading it back or writing it) will negatively impact performance, because you'll have to transfer data between system memory and video memory. As a result, anything that is streamed from system memory to video memory should be handled with care. Plus, you'll have to study the API.
If you "start from scratch" and do rendering on CPU, you'll get easy access to pixel data and full control over the rendering, but performance will be inferior to GPU (CPU is less suitable for parallel processing, and system memory can be slower by order of magnitude than video memory), plus you'll spend significant amount of time reinventing the wheel.
Do Opengl and DirectX compile down to the level of GDI or is there a special way they do it, do they bypass or use similar code?
No. They communicate with the graphics hardware nearly directly, using drivers provided by the hardware manufacturer. And those "direct hardware access" interfaces used by DirectX/OpenGL won't be available to you - they're hardware-specific and manufacturer-specific, can be internal, and are possibly even protected by patents.
There are, of course, a few legacy hardware interfaces which ARE available to you (namely VESA, or VGA mode 13h); however, their direct use is normally forbidden by the operating system (you can't easily access VESA on Windows), so to access them you'll have to either boot MS-DOS, use a custom operating system, or use helper libraries (such as SVGAlib on Linux) which may only function with root privileges. And of course, even if you actually use VESA/VGA to render something yourself, on any hardware newer than a RivaTNT 2 Pro the performance will be horrible compared to hardware-accelerated rendering done by OpenGL/DirectX. Have you ever seen how fast Windows XP works when it doesn't have a proper GPU driver (it takes a second to redraw a window)? That's how fast it is going to work with direct VESA/VGA access.
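For historical flavor only, here is what the VGA mode 13h access mentioned above looked like: a real-mode DOS sketch (Turbo C or a similar 16-bit compiler required; it will not run on a modern protected-mode OS):

```c
/* Historical sketch: VGA mode 13h under real-mode DOS (Turbo C / Borland C
   only). The 320x200x256 framebuffer is directly addressable at A000:0000. */
#include <dos.h>
#include <conio.h>

int main(void)
{
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    union REGS r;
    unsigned int i;

    r.x.ax = 0x0013;                 /* INT 10h: set 320x200x256 mode 13h */
    int86(0x10, &r, &r);

    for (i = 0; i < 320u * 200u; i++)
        vga[i] = (unsigned char)(i & 0xFF);   /* write pixels directly */

    getch();                         /* wait for a key */

    r.x.ax = 0x0003;                 /* back to 80x25 text mode */
    int86(0x10, &r, &r);
    return 0;
}
```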
Please, Don't ask why I want to do this.
It makes sense to ask why you would want to do that. Your "I want direct low-level access" approach was suitable maybe 15-20 years ago, in the DOS era. Right now the reasonable solution is to use an existing API (one that is maintained by somebody who isn't you) and search for a way to fully utilize it. Of course, if you wanted to develop drivers, that would be another story.
Do Opengl and DirectX compile down to the level of GDI or is there a special way they do it
and
I realize most are going with Opengl or DirectX. But, I don't want the overhead
So what you're saying is: you have absolutely no clue what OpenGL or DirectX actually do, and yet you've decided that they are not efficient enough for your needs.
I'm sorry, but this is nonsense. It is impossible to answer a question like that.
In the real world, you have a small supercomputer dedicated to doing graphics. And you get access to it through OpenGL and DirectX.
And the reason they are fast is that they do NOT just "start from scratch and control the pixel data".
So please, if you want serious answers, let those with the knowledge to answer your questions decide which question is best.
The correct answer, if you want efficient graphics, is to use DirectX or OpenGL.

How do I control a motor wirelessly?

I am an ME undergrad and am designing an implant device that requires programming knowledge. I honestly have no idea how to get started and am looking for advice. Basically, what I need is a way to control a stepper motor. Stepper motors use steps (pulses) to rotate the gear head; the motor I'm using needs 20 steps to revolve once. I need to be able to control the number of steps per day, say. The motor I'm purchasing comes with an encoder, which I'm guessing connects to the circuit board. What I want is an external control (like a remote control for a toy) that can set these rates. I don't know anything about radio transmitters or how to program the circuit board to do this for me. Any help would be appreciated - books I can look into, websites, or tutorials. Thanks.
There are many ways of solving this problem, but it is more of a systems engineering question than a programming question; until you know what the system looks like, there is no way of determining what parts will be implemented in software. More details would be required to provide a specific answer.
For example what are the security/safety considerations?
What wireless technology do you need to use? E.g. RF or IR; if RF, then licensing may be an issue, and that may vary from country to country. You could use Bluetooth, ZigBee, or even WiFi, but these technologies are probably more expensive and complex than necessary for such a simple application. If IR, then is immunity from interference from TV remotes, PC IrDA ports, or similar required?
If the commands/signals from the remote are complex, you will probably need both the remote and the motor driver to incorporate a micro-controller and software. On the other hand, if you just need increase/decrease functions, then it would be entirely possible to implement the remote functionality you describe without any processing at all (depending on the communication technology you choose).
What is the motor encoder for? Stepper motors do not normally need an encoder, since the controller can simply count steps executed in either direction to determine position. Is the encoder incremental or absolute? If it is incremental, then it is certainly not needed; if it is absolute, then it may be useful if you need to know the exact position of the motor on power-up without having to perform an initialisation or requiring end-stop switches.
You mentioned a "circuit board"; what hardware do you already have? What does it do? Do you have documentation for it? If it is commercially available, can you provide a link so we can see the documentation?
As you can see, you have more system-level design issues to solve before you even consider software implementation, so the question is not yet ready to be answered here on SO. I suggest you seek out your university's EE department and team up with someone with electronics expertise to design a complete system, then consider the software aspects.
Well worth taking a look at the Microchip site:
http://www.microchip.com/forums/f170.aspx
They produce microcontrollers that can be programmed to do exactly what you require (and a lot more).
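To give a feel for the firmware side, here is a deliberately generic sketch of step generation for the 20 steps/revolution motor described above, assuming a step/direction driver chip. The gpio_write() and delay_ms() calls are hypothetical HAL stand-ins, not any specific vendor API:

```c
/* Deliberately generic firmware sketch for a 20 steps/rev motor behind a
   step/direction driver. gpio_write() and delay_ms() are HYPOTHETICAL
   stand-ins for whatever HAL your microcontroller vendor provides; pulse
   widths and step rates must come from your driver's datasheet. */
#include <stdint.h>

#define STEP_PIN      0
#define DIR_PIN       1
#define STEPS_PER_REV 20

extern void gpio_write(int pin, int level);   /* assumed HAL call */
extern void delay_ms(uint32_t ms);            /* assumed HAL call */

/* Advance the motor n steps (negative n = reverse); one pulse per step. */
void step_motor(int32_t n)
{
    gpio_write(DIR_PIN, n >= 0);
    if (n < 0) n = -n;
    while (n--) {
        gpio_write(STEP_PIN, 1);
        delay_ms(1);              /* pulse width - check the datasheet */
        gpio_write(STEP_PIN, 0);
        delay_ms(9);              /* step interval: ~100 steps/s here */
    }
}

/* e.g. called when the radio link delivers a "move n revolutions" command */
void on_remote_command(int revs)
{
    step_motor((int32_t)revs * STEPS_PER_REV);
}
```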

Permanent DOS Attacks - Anyone Knowledgeable?

So, I'm looking into Permanent DOS attacks for a class, and I'm having a hard time coming up with concrete examples. There's a lot of information about Phlashing (flashing firmware to either brick the device, or put malicious firmware in its place, for those of you who don't know the term) but I'd like to have a broader set of examples.
That being said, there has to be a way to write code that will do something like wear out disk arms, right? Something that will have the disk seek to the end of the disk, then back to the front, on and on. Does anyone have an example of how that would be accomplished? Is there some way to specify where to seek to on a disk in C (similar to seeking to a certain point in a file, but for the entire HDD)? If not, I guess there's always trying to force a file's location on the disk... which seems like less fun to attempt. Again, can you do something like that programmatically?
If anyone has any insight into these types of attacks, or any good resources for me to check into, I'd appreciate it. Maybe you read a story about it on Slashdot a few years back? Let me know! The more info I can gather, the less likely I'll be forced to kill time during my talk by bricking my router in the class :) I'm not made of money OR routers!
It seems like these would primarily be limited to physical attacks and social engineering ("To enable your computer's hidden turbo function, remove the cover and pry this part out."). But:
Adjust screen refresh rates to insane values to blow older CRTs
Monkey with ACPI fan, charge, or battery controls if possible to cause overheating or battery failure.
Overwrite every rewritable storage device of every kind attached to any bus. Discover and overwrite any IDE, USB, etc... device you know the flash updater details for.
Of course nothing is permanent. You can replace the hard drive, BIOS chips, CPU, motherboard, memory, etc...
Although it is mostly fictional, the Halt and Catch Fire (HCF) instruction would be a very convenient and permanent DOS attack.
Steve Gibson (google his name) has a paper he wrote a few years back about protocol-level vulnerabilities in TCP/IP. Some of it is still pertinent today.
Socially engineer the power company or ISP to turn off service at the location in question.
Many devices in a computer today have their own firmware, including but not limited to the CPU, DVD drive, HDD, VGA card, and motherboard (BIOS). Most of these devices also have a way of updating their respective firmware, which can be used to brick them pretty efficiently. This does require an individual approach to every device, though, often using privileged instructions and undocumented interfaces.
It's possible for a virus to do this. I seem to recall an actual virus doing this back in the day, but can't find anything to back that up.
I was able to find an article in which the author has a conversation with a VP from Western Digital, wherein he states that a program could potentially access a hard drive's firmware, causing such a DOS attack:
There are back doors if you will that allow us to get into places that the operating system can't go through the IDE connector
There used to be a few viruses that could cause old CRT monitors to break. They could send invalid sync signals out the VGA port that would be too high in frequency for the video sweep. I also remember a few that would use bad-sector flagging to draw images on the old versions of Scandisk (we are talking early 90's or older). I don't remember any of the names or have any references, but they used to be quite annoying.
Fortunately, better circuits, memory protection, and API abstraction have made such attacks very difficult to impossible.