Camera live view on Apple Watch

In the Apple demo on 3/9, they showed how to use the Apple Watch to see a camera's live view and open a garage door.
I assume they integrate with HomeKit to control the garage door, but how do they get the camera's live view?
Do the live-view images come from the iPhone app, which then uses Bluetooth to pass the image data to the Watch?

The iOS parent app of the Watch app most likely integrates with HomeKit to control the garage door and the webcam.
To display the images from the webcam on the Watch, they are most likely writing those images into the shared App Group between the iOS app and the Watch extension, using MMWormhole or a similar approach. They then read the images from the App Group and push them to the Watch over Bluetooth and Wi-Fi using the WKInterfaceDevice addCachedImage(_:name:) method. Once an image has been uploaded to the Watch, it can be displayed with a WKInterfaceImage or as a WKInterfaceGroup background image.
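For illustration, here is a minimal sketch of the Watch-facing half of that approach, written against the WatchKit 1-era API (addCachedImage(_:name:) was deprecated in watchOS 2, where WatchConnectivity replaced it). The App Group identifier, file name, class, and outlet are hypothetical; this is only a sketch of the described flow, not the app's actual code.

```swift
import WatchKit
import UIKit

// Sketch: read the latest webcam frame that the iOS app wrote into the shared
// App Group (e.g. via MMWormhole or by writing the JPEG directly), cache it on
// the Watch, and display it. Names below are hypothetical.
class CameraInterfaceController: WKInterfaceController {

    @IBOutlet weak var liveViewImage: WKInterfaceImage!

    func showLatestFrame() {
        let groupID = "group.com.example.garagecam"   // hypothetical App Group
        guard let container = FileManager.default
                .containerURL(forSecurityApplicationGroupIdentifier: groupID),
              let data = try? Data(contentsOf: container.appendingPathComponent("frame.jpg")),
              let frame = UIImage(data: data) else { return }

        // Upload the image to the Watch over Bluetooth/Wi-Fi, then reference it
        // by name so WatchKit can reuse the cached copy.
        if WKInterfaceDevice.current().addCachedImage(frame, name: "latestFrame") {
            liveViewImage.setImageNamed("latestFrame")
        }
    }
}
```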

Related

Agora Live Stream Dual Camera

I am currently trying to produce an Android app that can do live broadcasting. May I know if Agora has the functionality to access both the rear and front cameras of the broadcaster at the same time? If yes, which part of the code do we need to modify (based on Open-Live-Android)?
Agora does not offer a demo that shows exactly what you are looking for, but if you can get frames from both cameras (which some devices may not support), you can take a look at this demo app: https://github.com/AgoraIO/Advanced-Video/tree/dev/win-screenshare/Screensharing/Agora-Screen-Sharing-Android. In that demo app, the SDK sends the camera view and the screen-share view at the same time. To achieve that, screen sharing runs as a standalone service. Following a similar logic, you can change the screen-sharing part to the second camera's view.

tvOS notification when TV's source/input/HDMI changed

So I have a video app for tvOS and I would like to serve ads in it, but one of the requirements is not to serve them while the TV is not actually showing them.
All good, but if you change your TV's source while the Apple TV is still running, the app keeps running and playing the videos rather than locking or turning off. I thought the resign-active or did-enter-background notifications would give me that information, but they do not.
Is there any way to check whether tvOS is currently being shown on screen?
I am using tvOS 12 and AVPlayer.

Access to SD card on Sony DSC-QX100 camera via Remote Camera API?

I'm working with a Sony DSC-QX100 camera via Sony's Remote Camera API, specifically for use on a Windows 8 tablet (basically replacing the tablet's built-in camera with this unit). I'm able to consume the camera's LiveView (streaming) and LiveShot (take a picture and retrieve the image from a URL) features, triggered from my application.
My question is whether or not the Remote Camera API exposes any functionality to access pictures stored on the camera's SD card (when one is present). Bottom line: my user may choose to take the picture directly with the camera unit (manually, instead of remotely via my application on the tablet), and I have not yet found a way to retrieve that picture from the camera (other than moving the SD card from the camera to a PC). Has anyone tried this, or seen something in the API documentation that I'm missing?
The current Camera Remote API does not provide direct access to the SD card.
All available APIs are documented.

Open the iPhone camera?

We need to open the iPhone camera to take images that will be saved to the camera roll.
I have read many examples here, and all of them open a UIImagePickerController.
Besides the fact that I can't understand why I have to open the picker view in order to open the camera, I just can't do that: I don't want the picker view, because we have our own custom photo album that we built, and we just need a little button in it that opens the camera to take an image, without opening any other views on top of it.
Is it possible to use the camera without this picker view covering my scene?
Or can I send the user to the Camera app and then bring them back to my app?
Thanks.
Instead of the high-level classes (where Apple supplies the UI element), you have to go down to a more foundational level of APIs, namely AVCaptureDevice and AVCaptureDeviceInput.
Apple also has some nice sample code available in its AVCam project.
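As a rough sketch of what that lower-level setup can look like (class name and button wiring are illustrative; a real app also needs the camera and photo-library usage descriptions plus authorization checks, which the AVCam sample covers):

```swift
import UIKit
import AVFoundation

// Minimal custom camera screen using AVFoundation, with no picker UI on top.
// Error handling, authorization, and rotation handling are omitted for brevity.
class CustomCameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Wire the default camera into the capture session.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(photoOutput) else { return }
        session.addInput(input)
        session.addOutput(photoOutput)

        // Show the live camera feed inside this view controller's own view.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        session.startRunning()
    }

    // Hook this up to the app's own "take photo" button.
    @objc func takePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        // Save the captured image to the camera roll.
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}
```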
If you want to display the camera stream in your app without UIImagePickerController, then you should use the AVFoundation framework.
Here are some examples and tutorials:
take-photos-with AVFoundation
Custom camera
Displaying camera

Objective-C iPhone - Playing YouTube within an app

Is it possible to play a YouTube video using the method described in this URL
http://iphoneincubator.com/blog/audio-video/how-to-play-youtube-videos-within-an-application
but with a custom button? (i.e. in the picture in the link, it's a baseball game with the play-button overlay on top; I want that to be a custom button that I create.)
Thank you,
Tee
No. The way you play YouTube videos is by opening them via the mobile site, which takes you to the embedded QuickTime/YouTube viewer. It doesn't play them inside a view of your own app the way a QTView would on a Mac.
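To illustrate what the embedded-viewer approach amounts to, here is a rough sketch that loads YouTube's own embedded player in a web view (shown with the modern WKWebView rather than the UIWebView of that era; the video ID is a placeholder). The play button overlay is drawn by YouTube's player itself, which is why it cannot simply be swapped for custom artwork.

```swift
import UIKit
import WebKit

// Sketch: embed YouTube's own player in a web view. The surrounding HTML and
// the VIDEO_ID placeholder are illustrative only.
class YouTubeViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let webView = WKWebView(frame: view.bounds)
        view.addSubview(webView)

        let html = """
        <html><body style="margin:0">
        <iframe width="100%" height="100%"
                src="https://www.youtube.com/embed/VIDEO_ID"
                frameborder="0" allowfullscreen></iframe>
        </body></html>
        """
        webView.loadHTMLString(html, baseURL: nil)
    }
}
```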