Playing a GStreamer UDP stream in VLC

I'm trying to play the output of the following GStreamer pipeline in VLC:
appsrc -> omxh264enc -> h264parse -> mpegtsmux -> udpsink
To play it in VLC, I set the network stream source to:
udp://@192.168.1.12:5000/
VLC only plays the video if I follow this sequence:
1) Start VLC and open the network connection (it remains in a waiting state).
2) Start/play the GStreamer pipeline.
But if I try it the other way around, i.e. start the GStreamer pipeline first and then VLC, VLC is not able to display the incoming UDP video stream.
Ideally I need to start the GStreamer UDP stream once and then be able to open VLC at any time to see the video.
Does anyone have any idea why it behaves like this?

I finally got it working: I had to set the distance between two consecutive intra frames (the GOP length) to a value greater than 0, so that the encoder emits periodic keyframes. A decoder joining mid-stream can only start decoding at a keyframe, which is why VLC previously had to be listening before the very first frame was sent. Now I can start VLC at any time and it plays the video stream.
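For reference, a sketch of an equivalent test pipeline; the element names are assumptions for a software-encoder setup (x264enc's key-int-max property plays the same role as the OMX encoder's gop-length mentioned above):
gst-launch-1.0 videotestsrc ! x264enc key-int-max=30 ! h264parse ! mpegtsmux ! udpsink host=192.168.1.12 port=5000
With a keyframe every 30 frames, a late-joining VLC only has to wait for the next keyframe before it can start decoding.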

Related

Can't send Media Stream of captureStream() through WebRTC

I created a WebRTC connection with simple-peer. I need to capture the stream from a video element and send it to peers. I use vidEl.captureStream() to get a MediaStream from the video element and send it to the other peer. It partly works, but only the audio comes through; the video is a black screen (nothing is displayed).
[Testing] I created another video element on the same origin as the first one, used captureStream(), and tested setting srcObject with the captured MediaStream, and that works.
How can I send the MediaStream from video.captureStream() to a new peer through WebRTC and display the video from the remote peer?
Sorry for my bad English.
https://webrtc.github.io/samples/src/content/capture/video-pc/
This sample doesn't work on Chrome or Edge, but it does work on Firefox. I don't know why.
There is currently an open bug in Chrome which breaks this:
https://bugs.chromium.org/p/chromium/issues/detail?id=1156408
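For reference, a minimal sketch of the captureStream()-to-simple-peer flow under discussion; sendToRemote and onRemoteSignal are hypothetical placeholders for whatever signaling channel you use:
const vidEl = document.querySelector('video');
const stream = vidEl.captureStream(); // grab audio+video from the playing element
const peer = new SimplePeer({ initiator: true, stream });
peer.on('signal', data => sendToRemote(data)); // hypothetical signaling send
onRemoteSignal(data => peer.signal(data));     // hypothetical signaling receive
peer.on('stream', remoteStream => {
  // on the receiving side, attach the remote stream to a video element
  document.querySelector('#remote').srcObject = remoteStream;
});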

WebRTC audio streaming without attaching to an HTML element

With reference to this tutorial: WebRTC - Voice Demo.
The audio stream is attached to an <audio /> element for both the local and the remote audio.
I am a bit confused now. Aren't we supposed to connect the local audio to a mic instead of playing it?
Besides, for the remote audio, is it possible to play the audio without attaching it to an <audio /> element?
No, you need to create an audio element to play local audio as well.
The moment you call navigator.webkitGetUserMedia with the constraints { video: false, audio: true }, you get your mic stream as a parameter of the callback. To play it, you then attach it to an audio element, just like you would for remote audio; the remote audio stream, however, arrives via the ontrack event instead.
But honestly, there is no reason for you to play your own local stream, since the sound of your voice would overlap with the sound from the audio player. I guess they did it in the tutorial for demonstration purposes.
By the way, the tutorial you referred to looks quite old, so if things don't work, that might be the reason why.
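For reference, a minimal sketch of the same flow with the current promise-based API (navigator.mediaDevices.getUserMedia, which replaced the prefixed webkitGetUserMedia):
navigator.mediaDevices.getUserMedia({ audio: true, video: false })
  .then(stream => {
    // attach the mic stream to an audio element to play it
    const audioEl = document.createElement('audio');
    audioEl.srcObject = stream;
    audioEl.autoplay = true;
    // for local monitoring you would normally keep this muted,
    // exactly because of the overlap described above
    audioEl.muted = true;
    document.body.appendChild(audioEl);
  })
  .catch(err => console.error('getUserMedia failed:', err));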

WebRTC video stream: stop sharing

I have created a WebRTC-based video chat using PeerJS.
The local and remote video elements are created using these controls:
local:
<video id="[local_peer_id]" autoplay="true" controls="true">
remote:
<video id="[remote_peer_id]" autoplay="true" controls="true">
Now, during the video chat, if the local user mutes audio, the remote user cannot hear anything; that works perfectly.
The problem is with the video. If the local user pauses his own video, he can see that the video is paused, but the remote user still sees it live.
On the other hand, if the remote user pauses his video, the local user still sees his video live.
Can anyone tell me what is needed to implement a "pause"/"resume" video feature that works in real time for both peers?
You need to know the difference between the HTML tags and the WebRTC streams...
You can have streams running without attaching them to any HTML tag, and the media can still be sent and received by each peer. Each peer can attach a stream to an audio/video tag, but the tag only acts as a player for a stream that is already running.
So, if you mute the HTML tag, you are only muting the player, not the stream. If you want anything to have an effect on the other peer, you need to act on the stream or on the peer connection.
In particular, to pause and resume audio or video, you need to toggle the media tracks in the media stream:
// create a button to toggle video
var button = document.createElement("button");
button.appendChild(document.createTextNode("Toggle Hold"));
button.onclick = function () {
  mediaStream.getVideoTracks()[0].enabled =
    !mediaStream.getVideoTracks()[0].enabled;
};
To pause/resume audio, use getAudioTracks() instead.
Calling mediaStream.stop() will stop the camera, where mediaStream is the stream you got when calling getUserMedia.
Here is a working example:
mediaStream.getAudioTracks()[0].stop();
mediaStream.getVideoTracks()[0].stop();
Hope this works with the new standards. It's working fine in my app.
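A small generic sketch of the toggling approach from the answer above (not PeerJS-specific; localStream is assumed to be the same stream you handed to the peer connection). Because both sides share that stream, disabling a track also pauses what the remote peer receives: black frames for video, silence for audio.
function toggleTracks(stream, kind) {
  // kind is 'audio' or 'video'
  stream.getTracks().forEach(function (track) {
    if (track.kind === kind) track.enabled = !track.enabled;
  });
}
// usage:
// toggleTracks(localStream, 'video'); // pause/resume video for both peers
// toggleTracks(localStream, 'audio'); // mute/unmute audio for both peers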

Recording a remote WebRTC stream with RecordRTC

I am using the OpenTok JavaScript WebRTC library to host a 1-to-1 video chat (peer-to-peer).
I can see my peer's video and hear the audio flawlessly.
My wish is to record the audio/video of the other chat party (the remote one). For this purpose, I'm using RecordRTC.
I was able to record the video of the other chat participant (the video is output to an HTML video element), but, so far, I have not succeeded in recording the audio (a dead-silent .wav file is as far as I could get). I'm using Chrome Canary (30.0.1554.0). This is my method:
var clientVideo = $('#peerdiv video')[0];   // peer's video (HTML element)
var serverVideo = $('#myselfdiv video')[0]; // my video (HTML element)
var context = new webkitAudioContext();
var clientStream = context.createMediaStreamSource(clientVideo.webRTCStream);
var serverStream = context.createMediaStreamSource(serverVideo.webRTCStream);
webRTCStream is a custom property I assigned to the HTMLVideoElement object by modifying the source of the OpenTok JS library. It contains the MediaStream object linked to the respective <video> element.
var recorder = RecordRTC({
  video: clientVideo,
  stream: clientStream
});
recorder.recordAudio();
recorder.recordVideo();
The video is recorded. An audio file is also created, with a length close to the video's; however, it is completely silent (and yes, there was a lot of noise being made on the other side during the recording).
I've tested this with the video element that displays my own webcam's stream (and audio), and it worked: both audio and video were recorded:
...
var recorder = RecordRTC({
  video: serverVideo,
  stream: serverStream
});
...
Is there something special about streams originating from a remote location? Any guidance on this issue would be very helpful.
The same issue occurs in the following situations...
If the audio is not stereo (dual-channel audio), i.e. it is mono audio
If the number of audio input channels is not equal to the number of audio output channels
If the audio input device is not the default device selected in Chrome
I'm still trying to find the actual issue.
I added this experiment for testing purposes... see the console...
https://webrtc-experiment.appspot.com/demos/remote-stream-recording.html
Updated at: Saturday, 1 February 2014, 09:22:04 PKT
Remote audio recording is not supported; this issue is considered low-priority and marked WontFix:
Support feeding remote WebRTC MediaStreamTrack output to WebAudio
Connect WebRTC MediaStreamTrack output to Web Audio API
Updated March 28, 2016
Remote audio+video recording is now supported in RecordRTC, since Chrome 49+.
Firefox, on the other hand, can only record remote audio.
If Chrome/WebRTC/Opus outputs mono audio by default and that is the problem here, I see two options:
Make Opus output stereo - not sure how.
Make the RecordRTC/Recorder.js code work with mono audio.
Or does anyone know of any other recording library that works?
This actually works fine in Firefox now. I am using Firefox 29.0.1, and the Audio API can now work with audio stream sources grabbed from remote parties via a peer connection.
To test, go to Muaz Khan's experiment page. I am not sure in which version of Firefox this rolled out, but I would like to thank the team for cranking it out!
The Chrome bug was moved to the Audio API team; see the cr bug to track progress.
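For anyone reading this today, a minimal sketch of recording a remote stream with the standard MediaRecorder API, which modern browsers support; remoteStream is assumed to come from the peer connection's ontrack callback, and the webm mime type is an assumption:
const chunks = [];
const recorder = new MediaRecorder(remoteStream, { mimeType: 'video/webm' });
recorder.ondataavailable = e => chunks.push(e.data);
recorder.onstop = () => {
  // assemble the recorded chunks into a single playable blob
  const blob = new Blob(chunks, { type: 'video/webm' });
  document.querySelector('#playback').src = URL.createObjectURL(blob);
};
recorder.start(1000); // gather data in 1-second chunks
// ...later: recorder.stop();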

What is streaming video?

In my code I capture a frame from the camera, open a UDP connection, and send the frame data:
UDP_Client.Connect(Host, Port)                ' connect the UDP client to the receiver
Dim sendBytes As Byte() = Data                ' Data holds the captured frame bytes
UDP_Client.Send(sendBytes, sendBytes.Length)  ' send the frame
I do this every 400 ms.
Is this streaming video?
Streaming is basically playing a video from a network source; you can stream live video or pre-recorded video à la YouTube.
At 400 ms per frame your source runs at 2.5 fps (1000 ms / 400 ms = 2.5 frames per second), but technically speaking even 0.0001 fps is still a video feed, just a very slow one. What is the definition of a video? A sequence of still images; so as long as the picture updates, it can be considered video.
Is 2.5 fps a good video feed? Maybe not, but it is still video.