Is it possible to access video data from both cameras on the iPad 2 at the same time?

Can we get two live video streams at the same time, or do we have to switch between cameras to get two streams?

According to Apple, accessing both cameras at once is not supported. If you have access to the Apple Developer Forums, see this post.
Also, for what it's worth, switching streams is quite slow in my experience and would be no substitute for accessing both cameras at once.

Related

Any follow-up to the StitcHD video stitching project by Luke Yeager?

Has any progress been made in video stitching since the last answer by Luke Yeager?
I plan to develop a 360° surround view for my car.
Since the StitcHD project by Luke is already 5 years old, I expect some progress to have been announced: faster GPU live-video processing and better depth-map matching.
https://github.com/lukeyeager/StitcHD
I would prefer WebRTC video tools, but I haven't gotten any answer on how to connect 4 USB webcams and get 4 live video streams for stitching.
If you want to enumerate all devices using WebRTC,
https://webrtc.github.io/samples/src/content/devices/input-output/
shows how to do that and then open a specific device. You can open multiple devices at the same time, but be aware that this will be taxing for the USB bus.
The canvas sample, which shows how to grab a frame from the video stream and convert it into an image, might be useful too.
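To sketch the approach described above: enumerate the devices, open each video input with its own getUserMedia call, and grab frames via a canvas. This is a minimal browser-side sketch, assuming a secure context where navigator.mediaDevices is available; the helper names are my own.

```javascript
// Filter a device list down to video inputs (cameras).
function videoInputs(devices) {
  return devices.filter((d) => d.kind === 'videoinput');
}

// Open every camera found; each getUserMedia call returns its own stream.
// Browser-only: navigator.mediaDevices requires a secure context (https).
async function openAllCameras() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const streams = [];
  for (const cam of videoInputs(devices)) {
    streams.push(
      await navigator.mediaDevices.getUserMedia({
        video: { deviceId: { exact: cam.deviceId } },
      })
    );
  }
  return streams;
}

// Grab a single frame from a playing <video> element as a PNG data URL,
// as in the canvas sample mentioned above.
function grabFrame(videoEl) {
  const canvas = document.createElement('canvas');
  canvas.width = videoEl.videoWidth;
  canvas.height = videoEl.videoHeight;
  canvas.getContext('2d').drawImage(videoEl, 0, 0);
  return canvas.toDataURL('image/png');
}
```

Each stream can then be attached to its own video element via `videoEl.srcObject = stream`; whether four simultaneous streams actually work depends on the USB bus bandwidth noted above.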

Is there a Broadcast Receiver concept in Windows Phone?

I have developed many Android apps and have used the Broadcast Receiver concept in many of them, for example to know when the battery is low.
I would like to use the same concept when developing for Windows Phone, but I could not find a Broadcast Receiver there.
I want my application to know whenever the user takes a new photo with the camera, and to keep track of it.
What do you suggest?
There’s no broadcast receiver concept in Windows Phone.
IMO, this is by design.
I know two good reasons why I wouldn't want that on my phone: privacy (almost no one reads the permission list while installing apps) and battery life.
You can read those photos while your app is running (caching the file list and modified times for faster updates).
And/or you might want to create a lens extension; here's a link for WP8.

Does the Sony Remote Camera API control HDR modes, ISO, shutter speed, aperture and other "manual" settings?

I just bought a Sony A7 and I am blown away with the incredible pictures it takes, but now I would like to interact with and automate this camera using the Sony Remote Camera API. I consider myself a maker and would like to do some fun stuff: add a laser trigger with an Arduino, do some computer-controlled light painting, and some long-term (on the order of weeks) time-lapse photography. One reason I purchased this Sony camera over other models from famous brands such as Canon, Nikon, or Samsung is because of the ingenious Sony Remote Camera API. However, after reading through the API reference it seems that many of the features cannot be accessed. Is this true? Does anyone know a workaround?
Specifically, I am interested in changing a lot of the manual settings that you can change through the menu system on the camera such as ISO, shutter speed, and aperture. I am also interested in taking HDR images in a time-lapse manner and it would be nice to change this setting through the API as well. If anyone knows, why wasn't the API opened up to the whole menu system in the first place?
Finally, if any employee of Sony is reading this I would like to make this plea: PLEASE PLEASE PLEASE keep supporting the Remote Camera API and improve upon an already amazing idea! I think the more control you offer to makers and developers, the more popular your cameras will become. I think you could create a cult following if you can manage to capture the imagination of makers across the world and get just one cool project to go viral on the internet. Using HTTP and POST commands is super awesome, because it is OS agnostic and makes communication a breeze. Did I mention that it is awesome?! Sony's cameras will nicely integrate themselves into the internet of things.
I think the Remote Camera API strategy is better than the strategies of Sony's competitors. Nikon and Canon have nothing comparable. The closest thing is Samsung gluing Android onto the Galaxy NX, but that is a completely unnecessary cost since most people already own a smart phone; all that needs to exist is a link that allows the camera to talk to the phone, like the Sony API. Sony gets it. Please don't abandon this direction you are taking or the Remote Camera API, because I love where it is heading.
Thanks!
New API features for the Lens Style Cameras DSC-QX100 and DSC-QX10 will be expanded during the spring of 2014. The shutter speed functionality, white balance, ISO settings and more will be included! Check out the official announcement here: https://developer.sony.com/2014/02/24/new-cameras-now-support-camera-remote-api-beta-new-api-features-coming-this-spring-to-selected-cameras/
Thanks a lot for your valuable feedback. It's great to hear that the APIs are being used, and we are looking forward to seeing some nice implementations!
Peter
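For reference, the Camera Remote API works over plain HTTP POST with JSON-RPC bodies, which is what makes it so automation-friendly. The sketch below shows the request shape; the method names (`setShutterSpeed`, `setIsoSpeedRate`) follow the published API beta reference, but the camera URL is a hypothetical example and each model supports a different subset, so query `getAvailableApiList` first.

```javascript
// Build a JSON-RPC request body in the shape the Camera Remote API expects.
function sonyRpc(method, params, id) {
  return JSON.stringify({ method, params, version: '1.0', id });
}

// POST a JSON-RPC call to the camera's service endpoint.
// cameraUrl is hypothetical; the real base URL comes from the camera's
// device-description XML discovered over SSDP. Requires Node 18+ for fetch.
async function callCamera(cameraUrl, method, params) {
  const res = await fetch(`${cameraUrl}/sony/camera`, {
    method: 'POST',
    body: sonyRpc(method, params, 1),
  });
  return res.json();
}

// Example usage (assumed address, not verified):
// await callCamera('http://192.168.122.1:8080', 'setShutterSpeed', ['1/30']);
// await callCamera('http://192.168.122.1:8080', 'setIsoSpeedRate', ['800']);
```

Because the protocol is just HTTP + JSON, the same calls work from an Arduino with a WiFi shield, a Raspberry Pi, or a browser, which fits the laser-trigger and time-lapse projects described above.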

Does Google Glass Mirror API support WebRTC

The Glass Mirror Timeline API document suggests that video can be streamed to a Glass headset (https://developers.google.com/glass/timeline). I'm trying to determine whether this could theoretically work with a WebRTC connection? Documentation is limited around the browser/rendering capabilities of the timeline so has anyone tried something similar?
Ran across a discussion about WebRTC + Glass in a reported issue - https://code.google.com/p/webrtc/issues/detail?id=2083
From the sound of things, someone got it to work in Chrome Beta at 176×144. There were some audio/frame-rate issues that appear to have been resolved. Note, though, that they talk about streaming video/audio from the Glass to another machine, not streaming video into the Glass display. I believe that at this point it will only work in Chrome Beta, and I doubt you could integrate this into the timeline smoothly, though with how hard Google is pushing WebRTC I would not be surprised to see increased support. I'll be testing this out with my own WebRTC demos soon and will know more.
WebRTC on Google Glass has been reported working: http://www.ericsson.com/research-blog/context-aware-communication/field-service-support-google-glass-webrtc/. There were some limitations, e.g. the device overheated after 10 minutes.
Another case - http://tech.pristine.io/pristine-presents-at-the-austin-gdg (thanks to Arin Sime)

Streaming IP Camera solutions that do not require a computer?

I want to embed a video stream into my web page, which is part of our own cloud based software. The video should be low-latency (like video conferencing), and it would be preferable, but not required, for it to include audio. I am comfortable serving streaming binary data from the server-side, and embedding it into the page using HTML5 video.
What I am not comfortable with is capturing the video data to begin with. The client does not already have a solution in place and is looking to us for assistance. The video would be routed through our server equipment, not an embedded piece that connects directly to the video source.
Using a USB or built-in camera on a computer is a known quantity for us. What I would like more information about is stand-alone cameras.
Some models of cameras have their own API documentation (example). It would seem from what I am reading that a manufacturer would typically have their own API which they repeat on many or all of their models, and that each manufacturer would be different in their API. However, I have only done surface reading and hope to gain more knowledge from someone who has already researched this, or perhaps even had first hand experience.
Do stand-alone cameras generally include an API? (Wouldn't this be a common requirement, so that security software can work with multiple lines of cameras?) Or if not an API, how is the data retrieved from the on-board web server? Is it usually Flash based? Perhaps there is a reusable video stream I could capture from there? Or is the stream formatting usually diverse?
What would I run into when trying to get the server-side to capture that data?
How does latency on a stand-alone device compare with a USB camera solution?
Do you have tips on picking out a stand-alone camera that would be a good fit for streaming through a server?
I am experienced with JavaScript (both HTML5 and Node.js), Perl, and Java.
Each camera manufacturer has its own take on access endpoints; generally you should be able to request a snapshot or an MJPEG stream, but it can vary. Take a look at this entry on CodeProject; it tackles two common methodologies. Here's another one targeted specifically at Foscam.
Get a good NAS; I suggest Synology. Check out their long list of supported IP webcams. You can connect the cameras through a hub or a router, whatever you wish. A NAS is not a "computer" as in a "tower", but it does many computer jobs, and it can stay on while your computer is off or away, handling things like video feeds, torrents, backups, etc.
I'm not an expert on all the features, so I don't know how to get it to broadcast without recording, but even if it records, at least the recording is separate. Synology is a popular brand and there are a lot of authorized and unauthorized plugins for it. Check them out and see if one suits you.