Agora Live Stream Dual Camera - agora.io

I am currently trying to build an Android app that can do live broadcasting. May I know if Agora has the functionality to access both the rear and front cameras of the broadcaster at the same time? If yes, which part of the code do we need to modify (based on Open-Live-Android)?

Agora doesn't offer a demo that directly shows the code you are looking for, but if you can get frames from both cameras (which some devices may not support), you can take a look at this demo app: https://github.com/AgoraIO/Advanced-Video/tree/dev/win-screenshare/Screensharing/Agora-Screen-Sharing-Android. In that demo app, the SDK sends the camera view and the screen-share view at the same time. To achieve that, the screen share runs as a standalone service. Following a similar logic, you can change the screen-sharing part into the second camera view.
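To make that adaptation more concrete, below is a rough Java sketch of what the camera half of such a service could look like. It follows the same pattern as the screen-sharing demo: a second RtcEngine instance lives inside a standalone service, joins the channel with a different uid, and pushes externally captured frames instead of the screen. The class name SecondCameraService, the uid value, and the pushFrame helper are placeholders of my own, not Agora APIs, and whether the second camera can actually be opened while the first one is streaming still depends on the device.

    // SecondCameraService.java — hypothetical name; a minimal sketch, not the demo's code.
    import android.app.Service;
    import android.content.Intent;
    import android.os.IBinder;

    import io.agora.rtc.Constants;
    import io.agora.rtc.IRtcEngineEventHandler;
    import io.agora.rtc.RtcEngine;
    import io.agora.rtc.video.AgoraVideoFrame;

    public class SecondCameraService extends Service {

        // Placeholders — use your own App ID, token and channel name.
        private static final String APP_ID = "<your-agora-app-id>";
        private static final String CHANNEL = "<your-channel-name>";
        private static final int SECOND_UID = 1001; // any uid different from the main camera's

        private RtcEngine engine;

        @Override
        public void onCreate() {
            super.onCreate();
            try {
                // A second, independent engine instance for the second video stream.
                engine = RtcEngine.create(getApplicationContext(), APP_ID,
                        new IRtcEngineEventHandler() { });
            } catch (Exception e) {
                throw new RuntimeException("RtcEngine init failed", e);
            }
            engine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
            engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);

            // Instead of the screen-capture source used in the demo, feed frames yourself.
            engine.setExternalVideoSource(true /* enable */, false /* useTexture */, true /* pushMode */);

            // Audio is already published by the main broadcaster; keep this stream video-only.
            engine.muteLocalAudioStream(true);

            engine.joinChannel(null /* token, if your project requires one */, CHANNEL, "", SECOND_UID);
        }

        /**
         * Call this from wherever the second camera delivers frames
         * (e.g. a Camera2 ImageReader callback), after converting them to NV21.
         */
        public void pushFrame(byte[] nv21, int width, int height) {
            AgoraVideoFrame frame = new AgoraVideoFrame();
            frame.format = AgoraVideoFrame.FORMAT_NV21;
            frame.stride = width;
            frame.height = height;
            frame.buf = nv21;
            frame.timeStamp = System.currentTimeMillis();
            engine.pushExternalVideoFrame(frame);
        }

        @Override
        public void onDestroy() {
            if (engine != null) {
                engine.leaveChannel();
            }
            RtcEngine.destroy();
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }

As with the screen-sharing demo, the service still has to be declared in the manifest, and you need to drive pushFrame from your own second-camera capture code (for example, an ImageReader callback that converts each frame to NV21 before pushing it).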

Related

What's the name of the camera tile view in agora.io?

I'm just playing around with agora.io and WebRTC, and I want to implement a "camera tile view". I hope you understand what I mean: all the users' (small) camera feeds should be displayed in a row/table next to each other, or in a list if there are too many users. The actively speaking user gets a border around their camera view, or something like that.
Can anybody tell me the name of this kind of view, or point me to a location where I can check some samples of it?
Best regards, Alex
The Agora SDKs provide all the APIs for building your own UI, but there is no method within the SDK for generating a tile view; you would have to build that yourself.
That being said, the Agora developer community has some open-source UI Kits that serve as a good starting template you can adjust. The Agora Web UIKit supports tile view as the default.
Vanilla JS: https://www.agora.io/en/blog/adding-video-chat-or-live-streaming-to-your-website-in-5-lines-of-code-using-the-agora-web-uikit/
React: https://agoraio-community.github.io/Web-React-UIKit/

React Native: Multiple previews of camera stream

I've got an application in React Native to be developed in Android and iOS, in which I'd like to take the camera stream, make some processing, and render it multiple times.
Imagine an app like Instagram, where you can apply filters to the camera live and it shows the applied filters on live previews. It would be more or less the same.
I need to get the camera stream and be able to show it on screen multiple times.
I've tried react-native-camera, but it only lets me get one instance of the camera to show.
With this same library, I've tried taking pictures from the camera at intervals and rendering them via an <Image> container but, of course, that kills performance and the app ends up crashing.
How can I achieve what I'm trying to do? Do you know of any module or any approach that allows me to do so?
Thank you.

How to find out if the video is currently playing in full screen on OS X

We have an application for Mac OS X that needs to know when the user is watching a movie in full screen to change its behavior.
Are there any programmatic system "hooks" that would allow a native Objective-C application to know when fullscreen playback has started?
You can get a list of all windows by using the CGWindow API, like in the Son of Grab sample.
From there, you can look at the window levels to figure out which windows are full screen, but I am not aware of any way to detect video playback specifically, as different apps (VLC, QuickTime Player) all use slightly different methods. Of course, you could hard-code specific process names and assume they are doing video playback if they have a fullscreen window.

How to open the camera within a specific area?

I want to open the camera like in the pictures below: the camera opens within some area only, for focusing on a particular area of the image.
Please help me. Thanks in advance.
Normal Camera View
My Expectation
If you want to use the CameraCaptureTask, you are limited to what it offers: it launches the phone's built-in camera app, and the available settings are the ones in that app.
If you prefer a customized photo-taking experience, you should use the PhotoCamera class:
PhotoCamera Class
You will find some guidance here:
How to create a base camera app for Windows Phone

iphone app that takes photo or video in one view

I am looking for some sample code or a tutorial that demonstrates taking a photo or video using one view in an iPhone app, with a toggle switch, similar to the built-in iOS camera app.
Any help is greatly appreciated!
This library includes functionality for both photo and video capture. It also lets you present a popup for the user to choose between photo and video.
https://www.cocoacontrols.com/controls/fdtake