I'm developing an AR remote assistance app over WebRTC with Node.js, the OpenTok SDK, and Three.js. When I initialize the AR session, the AR experience goes into fullscreen mode and my video sharing, which runs over the OpenTok session, stops. How can I create an AR scene alongside my OpenTok video sharing so that the video sharing doesn't stop?
Thanks.
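For reference, the OpenTok side of my setup looks roughly like this (a minimal sketch; `apiKey`, `sessionId`, and `token` are placeholders from my Node.js backend, and the AR/Three.js initialization is omitted):

```javascript
// Minimal sketch of the OpenTok publish/subscribe flow.
// apiKey, sessionId, and token are placeholders obtained from the backend.
const session = OT.initSession(apiKey, sessionId);

session.on('streamCreated', (event) => {
  // Render the remote participant's stream into a regular DOM element.
  session.subscribe(event.stream, 'subscriber', { insertMode: 'append' });
});

session.connect(token, (err) => {
  if (err) return console.error(err);
  // Publish the local camera; this is the feed that stops
  // once the AR session takes over in fullscreen.
  const publisher = OT.initPublisher('publisher', { insertMode: 'append' });
  session.publish(publisher);
});
```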
I have an Inskam Wi-Fi endoscope.
After launch, it starts its own Wi-Fi network. To see the video stream, you have to connect to that network from a phone with the Inskam application for Android or iOS installed.
But I need to capture the video stream on my PC.
I think the camera runs a streaming application onboard. My idea is to access the streaming resource directly.
I've tried to access it from my PC by connecting to the camera's Wi-Fi network:
http://192.168.29.102:8080/
http://192.168.29.102:8080/?action=stream
http://192.168.1.1:8080/
http://192.168.1.1:8080/?action=stream
The connection fails.
How can I capture the camera's video stream on a PC?
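A quick way to probe those endpoints from Node (a sketch reusing the same guessed addresses; an mjpg-streamer style endpoint would normally answer with a multipart/x-mixed-replace content type):

```javascript
// Probe the guessed camera endpoints and report status / content type.
const http = require('http');

const candidates = [
  'http://192.168.29.102:8080/?action=stream',
  'http://192.168.1.1:8080/?action=stream',
];

for (const url of candidates) {
  http.get(url, (res) => {
    // mjpg-streamer replies to ?action=stream with multipart/x-mixed-replace.
    console.log(url, res.statusCode, res.headers['content-type']);
    res.destroy();
  }).on('error', (err) => console.log(url, 'failed:', err.message));
}
```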
Do you want to use your PC to play the stream from your Wi-Fi endoscope, which only provides an iOS/Android app to watch the stream?
Maybe there is another solution: use iOS or Android to play the stream, then use OBS to capture the screen of your mobile phone (please see the tutorial for iOS or Android), and then you can use OBS to stream anywhere.
It works like this:
Wi-Fi endoscope ---WiFi--> iOS/Android ---USB--> OBS (PC)
OBS runs on your PC, and you can record the stream there.
If you want to broadcast the stream to the internet or to other mobile phones, you can use OBS to publish it to a media server (e.g. SRS) or a live streaming platform, like below:
OBS (PC) ---RTMP--> SRS/YouTube/Twitch ---> Players (PC/Mobile)
It enables multiple users to watch the stream.
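If you'd rather keep the server side in Node instead of running SRS, a minimal RTMP server can also be stood up with the node-media-server package (a sketch for local testing; ports and options are the package defaults, and SRS remains the more complete option):

```javascript
// Minimal RTMP server in Node using node-media-server
// (an alternative to SRS for local testing).
const NodeMediaServer = require('node-media-server');

const nms = new NodeMediaServer({
  rtmp: {
    port: 1935,        // OBS publishes here, e.g. rtmp://localhost/live/stream
    chunk_size: 60000,
    gop_cache: true,   // cache a GOP so new viewers get video immediately
    ping: 30,
    ping_timeout: 60,
  },
  http: {
    port: 8000,        // HTTP-FLV playback, e.g. http://localhost:8000/live/stream.flv
    allow_origin: '*',
  },
});

nms.run();
```

Point OBS at rtmp://your-pc/live with stream key "stream", and players can then pull the same stream back from the server.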
I'm currently building an app that requires a live stream integration from a Raspberry Pi, connected to a camera module, into a React Native Expo CLI app. Since I'm particularly concerned about latency, I'm not sure what the best method for streaming video is. Currently I've found https://mux.com/, which would require setting up a server on the Pi. Does anybody have any suggestions?
I am trying to build a live stream video app.
I built an RTMP server which is ready for publishing and playing streams. I need a way to capture the mobile user's camera and send the live stream to my RTMP server.
I use React Native on the client side. I found react-native-camera, which is great for dealing with the camera, but I couldn't find any event/API for accessing the camera stream in its documentation.
Another problem is how to send the stream to the RTMP server. I have no knowledge in this area, so any kind of help will be appreciated.
For anyone else who faces the same issue, this repo is the ultimate solution:
https://github.com/NodeMedia/react-native-nodemediaclient
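Its NodeCameraView previews the camera and pushes RTMP in one component; usage looks roughly like this (a sketch following the repo's README, with the outputUrl as a placeholder for your own RTMP ingest URL):

```javascript
// Sketch: publish the phone camera to an RTMP server
// with react-native-nodemediaclient.
import React from 'react';
import { NodeCameraView } from 'react-native-nodemediaclient';

export default class Publisher extends React.Component {
  render() {
    return (
      <NodeCameraView
        style={{ flex: 1 }}
        ref={(vb) => { this.vb = vb; }}
        outputUrl={'rtmp://your.server/live/stream'} // placeholder: your RTMP server
        camera={{ cameraId: 1, cameraFrontMirror: true }}
        audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
        video={{ preset: 12, bitrate: 400000, profile: 1, fps: 15, videoFrontMirror: false }}
        autopreview={true}
      />
    );
  }
  // Call this.vb.start() to begin publishing and this.vb.stop() to end it.
}
```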
Good evening,
I'm starting development of an app for the Action Cam FDR-X1000, and I have two questions about the capabilities of the system:
- Can I manage the camera (using the Camera Remote API) over the USB interface instead of Wi-Fi?
- Using the Wi-Fi interface, if I'm operating the camera from, for example, a smartwatch, can I also connect a smartphone to the same camera and send commands to retrieve stored images, etc.?
Thank you for your attention.
Andrea Carapezzi
To answer your questions:
You will not be able to communicate with the camera using the Camera Remote API over USB.
You will be able to connect your camera to a smartphone and get images from the camera using the Camera Remote API. You can visit the Camera Remote API landing page to find out more: https://developer.sony.com/develop/cameras/
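For reference, over Wi-Fi the Camera Remote API is plain JSON-RPC over HTTP, so sending a command from Node looks roughly like this (a sketch; the IP, port, and /sony/camera path here are common defaults, and the real endpoint URL is advertised in the camera's device description via SSDP discovery):

```javascript
// Sketch: send a Camera Remote API command over Wi-Fi (JSON-RPC over HTTP).
const http = require('http');

function callCamera(method, params = []) {
  return new Promise((resolve, reject) => {
    const body = JSON.stringify({ method, params, id: 1, version: '1.0' });
    const req = http.request({
      host: '192.168.122.1',  // common default IP when joined to the camera's Wi-Fi
      port: 8080,
      path: '/sony/camera',   // "camera" service endpoint
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Content-Length': Buffer.byteLength(body),
      },
    }, (res) => {
      let data = '';
      res.on('data', (chunk) => (data += chunk));
      res.on('end', () => resolve(JSON.parse(data)));
    });
    req.on('error', reject);
    req.end(body);
  });
}

// List the API methods this camera model/firmware exposes.
callCamera('getAvailableApiList').then(console.log).catch(console.error);
```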
Does Apple offer any developer APIs for reading data from the Camera Connection Kit (either via USB or from a card that's plugged in)?
Nope, not at this moment at least.