Access to SD card on Sony DSC-QX100 camera via Remote Camera API?

I'm working with a Sony DSC-QX100 camera via Sony's Remote Camera API, specifically for use on a Windows 8 tablet (basically replacing the tablet's built-in camera with this unit). I'm able to consume the camera's LiveView (streaming) and LiveShot (take a picture and retrieve the image from a URL) features triggered from my application.
My question is whether the Remote Camera API exposes any functionality to access pictures stored on the camera's SD card (when one is present). Bottom line: my user may choose to take a picture directly with the camera unit (manually, rather than remotely via my application on the tablet), and I've not yet found a method to retrieve that picture from the camera (other than moving the SD card from the camera to a PC). Has anyone tried this, or seen something in the API documentation that I'm missing?

The current Camera Remote API does not provide direct access to the SD card.
All available APIs are documented.
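For context, the LiveShot flow the question already uses goes through the Camera Remote API's JSON-RPC interface: you POST a method call such as actTakePicture to the camera's service endpoint, and the response carries a URL for the postview image, which you then download. Below is a minimal sketch in Python; the endpoint address is a placeholder (the real service URL is discovered from the camera's SSDP device description), and it only retrieves the shot just triggered, not pictures already sitting on the SD card.

    import requests

    # Placeholder endpoint -- the real "camera" service URL comes from the
    # camera's SSDP device-description XML.
    CAMERA_ENDPOINT = "http://10.0.0.1:10000/sony/camera"

    def call_api(method, params=None):
        """Send one JSON-RPC request to the Camera Remote API."""
        payload = {"method": method, "params": params or [], "id": 1, "version": "1.0"}
        resp = requests.post(CAMERA_ENDPOINT, json=payload, timeout=10)
        resp.raise_for_status()
        return resp.json()

    # Trigger a shot remotely; the result contains the postview image URL.
    result = call_api("actTakePicture")
    postview_url = result["result"][0][0]

    # Download the postview JPEG. Pictures taken manually on the camera body
    # never surface through this call, and there is no SD-card browsing method.
    with open("postview.jpg", "wb") as f:
        f.write(requests.get(postview_url, timeout=10).content)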

Related

Agora Live Stream Dual Camera

I am currently trying to produce an Android app that can do live broadcasting. May I know if Agora has the functionality to access both the rear and front cameras of the broadcaster at the same time? If yes, which part of the code do we need to modify (based on Open-Live-Android)?
Agora does not offer a demo that directly shows the code you are looking for, but if you can get frames from both cameras (which some devices may not support), you can take a look at this demo app: https://github.com/AgoraIO/Advanced-Video/tree/dev/win-screenshare/Screensharing/Agora-Screen-Sharing-Android. In this demo app, the SDK sends both the camera view and the screen-share view at the same time. To achieve that, you need to make the screen share a standalone service. Following a similar logic, you can change the screen-sharing part to one of the camera views.

photo shoot not working with camera connected to the computer

I am trying to take photos while the camera is connected to the computer. As soon as the camera is connected to the PC, the camera changes its mode to busy. I would like to trigger the camera to take a picture while it is connected to the computer.
You can use the EdsSendStatusCommand function with kEdsCameraStatusCommand_UIUnLock to use the camera manually while plugged in. Note that some commands issued with the Canon SDK may lock the camera UI again.
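As an illustration only, the unlock call looks like the following. This is a sketch in Python using ctypes against EDSDK.dll rather than the usual C/C++ host code, and the status-command values are assumed from EDSDKTypes.h (kEdsCameraStatusCommand_UIUnLock = 1), so check them against your SDK version.

    import ctypes

    # Load the Canon SDK (path and 32/64-bit architecture must match your install).
    edsdk = ctypes.WinDLL("EDSDK.dll")

    # Assumed constants from EDSDKTypes.h -- verify against your header version.
    kEdsCameraStatusCommand_UILock = 0
    kEdsCameraStatusCommand_UIUnLock = 1

    camera_list = ctypes.c_void_p()
    camera = ctypes.c_void_p()

    # Initialize the SDK and open a session with the first attached camera.
    assert edsdk.EdsInitializeSDK() == 0
    assert edsdk.EdsGetCameraList(ctypes.byref(camera_list)) == 0
    assert edsdk.EdsGetChildAtIndex(camera_list, 0, ctypes.byref(camera)) == 0
    assert edsdk.EdsOpenSession(camera) == 0

    # Release the UI lock so the camera can be operated by hand while tethered.
    err = edsdk.EdsSendStatusCommand(camera, kEdsCameraStatusCommand_UIUnLock, 0)
    print("EdsSendStatusCommand returned", hex(err))

    edsdk.EdsCloseSession(camera)
    edsdk.EdsTerminateSDK()

As the answer notes, later SDK calls may re-lock the UI, so the unlock may need to be reissued.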

Still pin capture on Linux. Is this possible?

A. I have a general understanding question about the "still pin" or "snapshot" functionality on some web cameras: how does this work? It must be one of the following possibilities:
The camera is on and video is streaming to the host. When the snap button is pressed, a signal is sent to the host's camera driver (/dev/input/event0 on Linux), the driver extracts a frame from the stream, and sends it up the stack.
The camera is on and video is streaming (or not) to the host. When the snap button is pressed, the on-board firmware puts aside the current frame and tells the host a new "still" is available.
B. I have 4 USB cameras attached to a Raspberry Pi (single USB host). All cameras have a still pin. I don't care about the video, no need for streaming; I want to take 4 simultaneous photos. My idea is to trigger all 4 cams to capture a frame using the still pin. How can I capture those 4 images without streaming video (bandwidth issues)?
Note: I have already experimented a lot and I am capable of capturing a frame from a video stream. My cameras are unknown brands but expose "video capture" as device caps. When using AMCap on Windows, the snap button triggers a snapshot.
Thanks for any help.
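For reference, the baseline described in the note above (grabbing one frame per camera from its video stream, not the still-pin trigger the question is actually asking about) can be sketched with OpenCV. The device indices are assumptions, and the cameras are opened one at a time so the four streams never run concurrently on the single USB host.

    import cv2

    # Assumed device indices -- adjust to match /dev/video* on the Pi.
    DEVICES = [0, 1, 2, 3]

    for index in DEVICES:
        # Open each camera just long enough to grab one frame, then release it,
        # so only one stream is active at a time on the shared USB bandwidth.
        cap = cv2.VideoCapture(index)
        ok, frame = cap.read()
        cap.release()
        if ok:
            cv2.imwrite(f"cam{index}.jpg", frame)
        else:
            print(f"camera {index}: no frame captured")

This is sequential rather than simultaneous, which is exactly the limitation the still-pin approach is meant to avoid.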

Camera live view in Apple Watch

In the 3/9 Apple demo, they showed how to use the Apple Watch to see a camera's live view and open the garage door.
I think they integrate with HomeKit to control the garage door, but how do they get the camera live view?
Do the live-view images come from the iPhone app, which then uses Bluetooth to pass the image data to the Watch?
The iOS parent app to the Watch app most likely integrates with HomeKit to control the garage door and the webcam.
In order to display the images from the webcam on the Watch, they are most likely writing those images into the shared App Group between the iOS app and the Watch extension using MMWormhole or a similar approach. They then read the images from the App Group and push them to the Watch over Bluetooth and Wi-Fi using the WKInterfaceDevice addCachedImage(_:name:) method. Once the image is uploaded to the Watch, it can then be displayed using a WKInterfaceImage or a WKInterfaceGroup background image.

CFBundleDocumentTypes and iPad camera connection kit (SD reader)

I am trying to register a custom document type in an iPad app, in the hope that I can be prompted to work with files on an SD card attached to the iPad through the camera connection kit.
Does anyone know if this is currently possible (or even for sure if it's not possible)? I was hoping that the Camera tab in Photos would use a document interaction controller for types it did not understand, but it seems Photos does not check for alternate handlers at all - I also tried registering support for the JPG type but was not prompted when using Photos and loading in JPG images from an SD card.
There is an SD reader app called Zoomit that appears to read many different types from an SD card:
http://currentphotographer.com/zoomit-application-enabled-sd-card-reader-for-the-iphone/
But it seems that involves custom hardware that differs from the standard SD reader.