Video transmission from OV2640 camera using STM32F429IGT6 microcontroller

For a project, I need to transfer video from an OV2640 camera to the 7-inch screen of the Open429I-C board, using an STM32F429IGT6 microcontroller. So I have a few questions.
What would be better in terms of FPS, quality and additional features: displaying the video on this screen, or writing a utility for a PC and streaming the video there (especially considering that the functionality may be expanded in the future)?
Can someone share tips/materials/examples on this topic?
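On the F429 the usual route for the on-board option is to let the DCMI peripheral DMA the OV2640's RGB565 frames straight into a framebuffer in external SDRAM that the LTDC is already scanning out to the LCD, so the CPU barely touches the pixel data. Below is a minimal, untested sketch of that idea using ST's HAL; the hdcmi/hltdc handles are assumed to come from a CubeMX-generated project, the SDRAM base address 0xD0000000 is board-specific, and ov2640_init() is a hypothetical helper that configures the sensor over SCCB.

```c
/* Sketch only, not tested on real hardware.  Assumes a CubeMX-generated
 * project: hdcmi is the DCMI interface wired to the OV2640, hltdc drives the
 * RGB LCD, and the external SDRAM is mapped at 0xD0000000 (board-specific). */
#include "stm32f4xx_hal.h"

extern DCMI_HandleTypeDef hdcmi;
extern LTDC_HandleTypeDef hltdc;

/* Hypothetical helper: configures the sensor for QVGA RGB565 over SCCB/I2C. */
void ov2640_init(uint16_t width, uint16_t height);

#define FRAME_W  320U
#define FRAME_H  240U

/* One RGB565 frame, placed in external SDRAM. */
static uint16_t *const framebuf = (uint16_t *)0xD0000000;

void camera_to_lcd_start(void)
{
    ov2640_init(FRAME_W, FRAME_H);

    /* Point LTDC layer 1 at the capture buffer so whatever DCMI writes
     * is what the panel shows. */
    HAL_LTDC_SetAddress(&hltdc, (uint32_t)framebuf, LTDC_LAYER_1);

    /* Continuous capture: DMA every frame into the framebuffer.
     * The length argument is in 32-bit words. */
    HAL_DCMI_Start_DMA(&hdcmi, DCMI_MODE_CONTINUOUS,
                       (uint32_t)framebuf,
                       (FRAME_W * FRAME_H * 2U) / 4U);
}
```

With a single buffer like this you will see tearing; the usual fix is double-buffering in SDRAM and swapping the LTDC layer address in the DCMI frame-complete callback.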

Related

How does a steering wheel with force feedback read and send data to Windows using USB HID?

How does a steering wheel with force feedback read and send data to Windows using USB HID?
Is there any sample project of a driver for a steering wheel that communicates with a PC?
Or detailed docs about developing firmware for a steering wheel device?
I have looked through pid.pdf but have no idea how to do it in detail.
Thanks!
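For orientation, this is roughly what the input side of such a device's HID report descriptor looks like. It is only a sketch of a plain 2-axis, 8-button joystick; a real force-feedback wheel additionally implements the Physical Interface Device (PID) usages from Usage Page 0x0F, which is exactly what pid.pdf specifies.

```c
/* Sketch only: a minimal 2-axis / 8-button HID joystick report descriptor.
 * A force-feedback wheel also needs output reports built from the PID
 * usage page (0x0F) on top of this. */
#include <stdint.h>

static const uint8_t joystick_report_desc[] = {
    0x05, 0x01,        /* USAGE_PAGE (Generic Desktop)        */
    0x09, 0x04,        /* USAGE (Joystick)                    */
    0xA1, 0x01,        /* COLLECTION (Application)            */
    0x09, 0x30,        /*   USAGE (X)  - wheel position       */
    0x09, 0x31,        /*   USAGE (Y)  - a pedal, for example */
    0x15, 0x81,        /*   LOGICAL_MINIMUM (-127)            */
    0x25, 0x7F,        /*   LOGICAL_MAXIMUM (127)             */
    0x75, 0x08,        /*   REPORT_SIZE (8)                   */
    0x95, 0x02,        /*   REPORT_COUNT (2)                  */
    0x81, 0x02,        /*   INPUT (Data,Var,Abs)              */
    0x05, 0x09,        /*   USAGE_PAGE (Button)               */
    0x19, 0x01,        /*   USAGE_MINIMUM (Button 1)          */
    0x29, 0x08,        /*   USAGE_MAXIMUM (Button 8)          */
    0x15, 0x00,        /*   LOGICAL_MINIMUM (0)               */
    0x25, 0x01,        /*   LOGICAL_MAXIMUM (1)               */
    0x75, 0x01,        /*   REPORT_SIZE (1)                   */
    0x95, 0x08,        /*   REPORT_COUNT (8)                  */
    0x81, 0x02,        /*   INPUT (Data,Var,Abs)              */
    0xC0               /* END_COLLECTION                      */
};
```

The wheel then sends 3-byte input reports (X, Y, button bits) over the interrupt IN endpoint, and Windows' built-in HID class driver exposes it to games; the PID output reports are how the host pushes force-feedback effects back down.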

Wireless camera for Convolutional Neural Networks

Soon I will start working on a project that requires me to classify different objects using a CNN (Convolutional Neural Network) and track them using a drone. The camera I need should stream live video in FHD at 60 fps. I searched a lot, especially GoPro cameras, but I didn't find anything about how many frames they deliver during a live stream. I hope you can suggest some cameras.
You could use a GoPro camera with an HDMI-to-USB capture card; this will give you 1080p at 30 or 60 fps depending on the resolution and frame rate chosen. I'd also look at the OpenMV Cam H7 if possible, since that camera is designed for computer vision.

Does the Sony Remote Camera API control HDR modes, ISO, shutter speed, aperture and other "manual" settings?

I just bought a Sony A7 and I am blown away with the incredible pictures it takes, but now I would like to interact with and automate this camera using the Sony Remote Camera API. I consider myself a maker and would like to do some fun stuff: add a laser trigger with an Arduino, do some computer-controlled light painting, and some long-term (on the order of weeks) time-lapse photography. One reason I purchased this Sony camera over other models from famous brands such as Canon, Nikon, or Samsung is because of the ingenious Sony Remote Camera API. However, after reading through the API reference it seems that many of the features cannot be accessed. Is this true? Does anyone know a workaround?
Specifically, I am interested in changing a lot of the manual settings that you can change through the menu system on the camera such as ISO, shutter speed, and aperture. I am also interested in taking HDR images in a time-lapse manner and it would be nice to change this setting through the API as well. If anyone knows, why wasn't the API opened up to the whole menu system in the first place?
Finally, if any employee of Sony is reading this I would like to make this plea: PLEASE PLEASE PLEASE keep supporting the Remote Camera API and improve upon an already amazing idea! I think the more control you offer to makers and developers the more popular your cameras will become. I think you could create a cult following if you can manage to capture the imagination of makers across the world and get just one cool project to go viral on the internet. Using HTTP and POST commands is super awesome, because it is OS agnostic and makes communication a breeze. Did I mention that it is awesome?! Sony's cameras will nicely integrate themselves into the internet of things.
I think the Remote Camera API strategy is better than the strategies of Sony's competitors. Nikon and Canon have nothing comparable. The closest thing is Samsung gluing Android onto the Galaxy NX, but that is a completely unnecessary cost since most people already own a smart phone; all that needs to exist is a link that allows the camera to talk to the phone, like the Sony API. Sony gets it. Please don't abandon this direction you are taking or the Remote Camera API, because I love where it is heading.
Thanks!
The API for the Lens-Style Cameras DSC-QX100 and DSC-QX10 will be expanded with new features during the spring of 2014. Shutter speed functionality, white balance, ISO settings and more will be included! Check out the official announcement here: https://developer.sony.com/2014/02/24/new-cameras-now-support-camera-remote-api-beta-new-api-features-coming-this-spring-to-selected-cameras/
Thanks a lot for your valuable feedback. It's great to hear that the APIs are being used, and we are looking forward to nice implementations!
Peter
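For reference, the Camera Remote API is plain JSON-RPC over HTTP POST, so it can be driven from almost anything. The sketch below uses libcurl; the service URL and the setIsoSpeedRate method name are written from memory and should be checked against the official API reference for your camera model.

```c
/* Sketch only: one JSON-RPC call to the Camera Remote API (beta).
 * URL and method name are assumptions; verify them against the API docs. */
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    /* Typical service URL when the PC is joined to the camera's own Wi-Fi AP. */
    const char *url  = "http://192.168.122.1:8080/sony/camera";
    const char *body =
        "{\"method\":\"setIsoSpeedRate\",\"params\":[\"800\"],"
        "\"id\":1,\"version\":\"1.0\"}";

    struct curl_slist *hdrs =
        curl_slist_append(NULL, "Content-Type: application/json");

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body);

    /* The JSON response is written to stdout by libcurl's default handler. */
    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));

    curl_slist_free_all(hdrs);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}
```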

Windows PC camera image capture, not grabbing one frame from a video stream

I have a question about image capture with a PC camera (an integrated notebook camera or a webcam). I am developing a computer vision system in which high-quality image capture is the key issue, but most current methods use VFW or DirectShow to capture a video stream and snap one frame as an image.
However, this method does not produce a high-quality image (or use the full capability of the camera). For example, I have a 5-megapixel webcam, but the video stream maxes out at 720p (a USB bandwidth problem?). Video streaming wastes some of the camera sensor's resolution.
Could I stream video and take pictures independently? For example, input and render a 640*480 video stream, then take a 1280*720 picture from the same camera? I guess this might be a hardware issue, like the HTC One X camera?
In short, is there a way for a PC system to take a picture that makes full use of the sensor's capability, rather than streaming video and capturing one frame? Is this a hardware-related problem? Do common webcams support this? Or is it a software problem, and I should learn DirectShow?
Thanks a lot.
I vaguely remember that (some) video sources offer both a capture pin and a still pin; the latter, I assume, would offer you higher quality. You can easily test this in GraphEdit. If it works, then yes, you'll have to learn DirectShow. Or pay someone to code it for you.

How to get a single snapshot from a photo camera using a microcontroller

Let's imagine that we have any popular photo camera (a Canon or whatever) installed on a mechanical platform. This platform allows us to accurately point the camera's lens at any object of interest. The platform is controlled from a PC via a microcontroller board. But we need feedback from the photo camera: the image that currently appears on the camera's display. Obviously, this feedback is required to be sure that the camera is looking in the right direction. At the moment I don't know how to get a single-shot image from a photo camera with a microcontroller.
Could you please recommend any directions to dig into? Any recommendations on how to select a photo camera (web cameras are not allowed)? Any tips?
Thank you in advance =)
Dwelch is right, you need to pick a "friendly" camera and work from there - google CHDK for a starter.
You could use the SPI interface of a micro to spoof being an SD card, and accept image data from the camera straight into the micro (a rough sketch of that idea is at the end of this answer), but you would probably need quite a fast micro with a fair amount of RAM, especially if you want to do any processing on it.
Other than that, you could sample the camera's AV-output (if it has one), either into the micro or straight into the PC via a USB capture stick (or USB capture stick into micro if you're being a show-off), or maybe interrogate the camera over its USB or (insert name of proprietary port here) IO port.
Getting more hacky (yes, even more!) you could sniff the LCD data bus of the camera and steal the image from that, but that brings all sorts of pain, and tiny, tiny screws.
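To make the SD-card-spoofing idea above more concrete, here is a very rough sketch of the command loop such firmware might run. spi_rx_byte()/spi_tx_byte() and store_block() are hypothetical helpers around the micro's SPI-slave peripheral, and a real camera will also issue CMD8/ACMD41/CMD58 and more during card initialisation before it starts writing blocks.

```c
/* Sketch only: the rough shape of answering SD-card SPI-mode commands as a
 * slave device.  All helpers are hypothetical; a real implementation must
 * handle the full initialisation sequence and tight SPI timing. */
#include <stdint.h>

uint8_t spi_rx_byte(void);                               /* hypothetical */
void    spi_tx_byte(uint8_t b);                          /* hypothetical */
void    store_block(uint32_t addr, const uint8_t *data); /* hypothetical sink */

void sd_spoof_loop(void)
{
    uint8_t buf[512];

    for (;;) {
        uint8_t b = spi_rx_byte();
        if ((b & 0xC0) != 0x40)            /* wait for a command byte 01xxxxxx */
            continue;

        uint8_t  cmd = b & 0x3F;
        uint32_t arg = 0;
        for (int i = 0; i < 4; i++)        /* 32-bit argument, MSB first */
            arg = (arg << 8) | spi_rx_byte();
        (void)spi_rx_byte();               /* CRC byte, ignored in SPI mode */

        switch (cmd) {
        case 0:                            /* CMD0 GO_IDLE_STATE */
            spi_tx_byte(0x01);             /* R1: in idle state */
            break;
        case 24:                           /* CMD24 WRITE_BLOCK: the image data */
            spi_tx_byte(0x00);             /* R1: ok */
            while (spi_rx_byte() != 0xFE)  /* wait for the data start token */
                ;
            for (int i = 0; i < 512; i++)
                buf[i] = spi_rx_byte();
            (void)spi_rx_byte();           /* 2 CRC bytes */
            (void)spi_rx_byte();
            spi_tx_byte(0x05);             /* data-response token: accepted */
            store_block(arg, buf);
            break;
        default:
            spi_tx_byte(0x00);             /* pretend everything else succeeded */
            break;
        }
    }
}
```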