How to develop new WebRTC HTML5 webcam video samples? getUserMedia() - webrtc

I would like to develop a number of demo WebRTC HTML5 video/canvas apps, based on the WebRTC samples at
https://webrtc.github.io/samples/
integrating some of their features and adding new ones.
I am especially interested in whether MS Windows with Firefox or Chrome can support two independent live webcam video input streams (Android can, Linux can).
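For what it's worth, below is a minimal sketch (untested on Windows) of opening two independent cameras in the browser with enumerateDevices() and deviceId constraints. Whether both streams actually run concurrently depends on the OS, browser, and camera drivers, and the #videoA/#videoB elements are assumed to exist in the page.

```javascript
// Minimal sketch: open two independent webcams by deviceId.
// Assumes an HTTPS page with <video id="videoA"> and <video id="videoB">.
async function openTwoCameras() {
  // A first getUserMedia() call is needed in some browsers before
  // enumerateDevices() exposes full device IDs/labels; stop it right away.
  const probe = await navigator.mediaDevices.getUserMedia({ video: true });
  probe.getTracks().forEach(track => track.stop());

  const cams = (await navigator.mediaDevices.enumerateDevices())
    .filter(device => device.kind === 'videoinput');
  if (cams.length < 2) throw new Error('Fewer than two cameras found');

  // Open each camera by its deviceId; whether both can run at once
  // depends on the OS, browser, and camera drivers.
  const [streamA, streamB] = await Promise.all([
    navigator.mediaDevices.getUserMedia({ video: { deviceId: { exact: cams[0].deviceId } } }),
    navigator.mediaDevices.getUserMedia({ video: { deviceId: { exact: cams[1].deviceId } } })
  ]);

  document.querySelector('#videoA').srcObject = streamA;
  document.querySelector('#videoB').srcObject = streamB;
}
```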
I would also like to split one webcam video input stream into two half-frame-rate video streams.
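One way to do that, sketched below, is to paint alternate frames to two canvases and capture each with canvas.captureStream(). This assumes requestVideoFrameCallback() is available (Chrome, and recent Firefox); requestAnimationFrame() could stand in where it is not.

```javascript
// Rough sketch: derive two roughly half-frame-rate streams from one camera
// by painting alternate frames to two canvases.
async function splitIntoHalfRateStreams() {
  const source = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement('video');
  video.srcObject = source;
  await video.play();

  const canvasA = document.createElement('canvas');
  const canvasB = document.createElement('canvas');
  canvasA.width = canvasB.width = video.videoWidth;
  canvasA.height = canvasB.height = video.videoHeight;
  const ctxA = canvasA.getContext('2d');
  const ctxB = canvasB.getContext('2d');

  // captureStream(0) only emits a frame when requestFrame() is called.
  const streamA = canvasA.captureStream(0);
  const streamB = canvasB.captureStream(0);
  const trackA = streamA.getVideoTracks()[0];
  const trackB = streamB.getVideoTracks()[0];

  let even = true;
  const paint = () => {
    if (even) {
      ctxA.drawImage(video, 0, 0);
      trackA.requestFrame();
    } else {
      ctxB.drawImage(video, 0, 0);
      trackB.requestFrame();
    }
    even = !even;
    video.requestVideoFrameCallback(paint);
  };
  video.requestVideoFrameCallback(paint);

  return [streamA, streamB]; // attach to <video> elements or an RTCPeerConnection
}
```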
Please also let me know whether you have tested all the samples above and whether they all worked for you.

Related

Video live streaming application in React Native

I am trying to implement video live streaming in React Native: live streaming, uploading it to a server, and saving the streamed video for playback. Can anyone help me with a sample project?
This will be helpful: https://www.npmjs.com/package/react-native-video
Regarding the "upload it to server" point, what exactly do you need to upload? The video itself, or something else?
So - you'll need a backend server that can accept a video stream, and convert it into a stream that can be consumed in React Native. You'd also like the server to save the streamed video, and encode it so it can be played back as video on demand (VOD) after the stream has stopped. None of this is React - it'll all be done on the backend.
You can build all this yourself, but there are a number of APIs that can do this for you. Disclaimer: I work for one such company: api.video. (A Google search will find others)
For livestreaming from the browser, you can use getUserMedia and stream to a server (you can see my demo using JavaScript at livestream.a.video). This stream will be visible to all your users, and then also recorded and saved as VOD for later playback.
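As an illustration only (not necessarily how the demo above is built), one common browser-side pattern is getUserMedia() plus MediaRecorder, pushing the recorded chunks to an ingest endpoint over a WebSocket. The wss://example.com/ingest URL here is a placeholder, and the server side is out of scope.

```javascript
// Sketch: capture with getUserMedia, chunk with MediaRecorder,
// push each chunk over a WebSocket to a (hypothetical) ingest endpoint.
async function startBrowserLivestream() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const socket = new WebSocket('wss://example.com/ingest'); // placeholder URL

  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8,opus' });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // each Blob is roughly one second of encoded media
    }
  };

  socket.onopen = () => recorder.start(1000); // emit a chunk about every second
  socket.onclose = () => recorder.stop();
}
```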
To upload a recorded video, you can use file.slice() to break the video into manageable chunks and upload them to the server for transcoding into a video stream (demo at upload.a.video, and tutorial).
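A rough sketch of that chunked-upload idea is below, assuming a generic endpoint that accepts ranged POSTs; the URL and the Content-Range usage are placeholders, not api.video's actual API.

```javascript
// Sketch: break a File/Blob into chunks with file.slice() and upload them in order.
async function uploadInChunks(file, uploadUrl, chunkSize = 5 * 1024 * 1024) {
  for (let start = 0; start < file.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, file.size);
    const chunk = file.slice(start, end);

    // Content-Range tells the server where this chunk sits in the whole file.
    const response = await fetch(uploadUrl, {
      method: 'POST',
      headers: { 'Content-Range': `bytes ${start}-${end - 1}/${file.size}` },
      body: chunk
    });
    if (!response.ok) throw new Error(`Chunk upload failed at byte ${start}`);
  }
}
```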
For playback, these APIs will give you a player URL or an m3u8 URL that you can incorporate into a video player. This is your React Native part - there are several video players that you can add to your application to play back HLS video. At api.video, we have our own player, and we also support third-party players.
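For example, a minimal HLS player screen using the react-native-video package linked above might look like this (the .m3u8 URL is a placeholder you would replace with the one returned by your video API):

```javascript
// Sketch: play an HLS (m3u8) stream in React Native with react-native-video.
import React from 'react';
import Video from 'react-native-video';

export default function LivePlayer() {
  return (
    <Video
      source={{ uri: 'https://example.com/stream/playlist.m3u8' }} // placeholder URL
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      controls
      resizeMode="contain"
    />
  );
}
```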

Play a stream with Windows.Media.Playback.MediaPlayer?

In Windows Phone 8.1, I want to play an internet radio station with Windows.Media.Playback.MediaPlayer. I'm aware of a sample which plays mp3 files with MediaPlayer.SetUriSource. However, I don't know how to play a stream. I see MediaPlayer.SetStreamSource, but it appears to be for random access streams which support seek, etc. I'm not sure it's for live streams.
Which method should I use?
Any tutorials or docs on this? I couldn't find any.
You could have a look at the Windows Phone Streaming Media library, which supports live streaming and HLS formats.
http://phonesm.codeplex.com/

Stream camera video directly to an Android device

Our company wants to use the DSC-QX10 for sports video capturing.
We have to cut the videos in the Android app and send them to our server.
Can we record the video directly to the Android device? (Stream the video, or is there a different way to get the video onto Android?)
What is the maximum length of a video? (We would need a 2-hour video.)
Thanks
A movie recorded using the Camera Remote API on the DSC-QX10 camera is stored on the memory card.
Best Regards,
Prem, Developer World team

Can you obtain the audio stream data sent to the system output device using Core Audio?

Is it possible to obtain a stream of audio data arriving at the system output (speakers, headphones, etc.) using CoreAudio or another framework?
Example: You're listening to a song on iTunes while watching a YouTube video, all while playing a computer game that makes sounds of its own, all of which are being played through your computer's speakers (Probably terribly annoying). My app would need to receive the entire mix as streaming data.
Thanks in advance.
Not at a user application's Core Audio or other app framework level. Some audio output capture/snoop apps may do this with a kernel extension (kext), or perhaps a replacement audio hardware driver.

How to record audio and video on a Mac separately?

I'm building an application that needs to send recorded audio and video data over the network separately.
Currently I'm using QTKit to capture media, but it only allows working with the video and audio data combined.
Is there any way, or another framework, that allows working with video and audio separately?
"I'm using QTKit to capture media but it only allows working with the video and audio data combined."
False.
You can add multiple inputs and outputs to your capture session with QTKit. They don't have to combine the A/V. Start here and read the entire document, then post another question if you have trouble.
The AVFoundation framework also supports recording audio and video.
Refer to this sample code: AVRecorder
Note that AVFoundation is only supported on OS X v10.7 or later.