How are video and audio synchronized by the live555 framework? - webrtc

When video and audio are packaged into RTP packets, the sending end transmits those RTP packets to the receiving end. I want to know at which end the synchronization is performed: the sending end or the receiving end?

Audio and video are sent in separate RTP sessions, each with its own random timestamp offset. They are synchronised at the receiver using RTCP Sender Reports (SR) sent by the sender. An RTCP SR maps the RTP timestamps of an RTP session (e.g. video or audio) to an NTP timestamp, allowing the receiver to synchronise audio and video. This is all specified in RFC 3550. Colin Perkins' book "RTP: Audio and Video for the Internet" has an excellent description of many aspects of RTP, including synchronisation.
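As a rough illustration of the receiver-side mapping described above, here is a minimal sketch (not production code) of converting RTP timestamps to the sender's wall clock via the most recent SR. The clock rates and numeric values are assumptions for the example; a real stack would take them from the SDP and from parsed RTCP packets, and would handle timestamp wrap-around properly.

```python
# Minimal sketch: map RTP timestamps to NTP time using the last RTCP Sender Report,
# then compare audio and video to estimate the skew the receiver has to compensate.

AUDIO_CLOCK_RATE = 48000   # e.g. Opus
VIDEO_CLOCK_RATE = 90000   # standard video RTP clock

def rtp_to_ntp(rtp_ts, sr_rtp_ts, sr_ntp_seconds, clock_rate):
    """Map an RTP timestamp to wall-clock (NTP) time using the most recent
    Sender Report, which pairs an RTP timestamp with an NTP timestamp."""
    # RTP timestamps wrap at 2**32; a real implementation must unwrap them.
    delta_ticks = (rtp_ts - sr_rtp_ts) & 0xFFFFFFFF
    return sr_ntp_seconds + delta_ticks / clock_rate

# Hypothetical values: last SR seen on each session plus the latest packet timestamps.
audio_ntp = rtp_to_ntp(rtp_ts=160000, sr_rtp_ts=150400, sr_ntp_seconds=1000.0,
                       clock_rate=AUDIO_CLOCK_RATE)
video_ntp = rtp_to_ntp(rtp_ts=903000, sr_rtp_ts=900000, sr_ntp_seconds=1000.0,
                       clock_rate=VIDEO_CLOCK_RATE)

# Positive skew means audio is ahead of video on the sender's clock;
# the receiver delays the earlier stream to line the two up for playout.
skew_ms = (audio_ntp - video_ntp) * 1000
print(f"audio-video skew: {skew_ms:.1f} ms")
```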

Related

Can an isochronous stream and control transfers work simultaneously over USB-OTG without any data corruption or delay in the video stream?

Here, the data transfer is used for controlling video pause and video record. We are using an iMX8 Mini evaluation board to stream video to an Android phone via USB-OTG. We would like to know whether the video stream is affected by commands sent over the same USB-OTG connection.

How do audio and video in a WebRTC peer connection stay in sync?

How do audio and video in a WebRTC peer connection stay in sync? I am using an API which publishes audio and video (I assume as one peer connection) to a media server. The audio can occasionally go out of sync by up to 200 ms. I am attributing this to the possibility that the audio and video are separate streams, which would account for why the sync can drift.
In addition to Sean's answer:
The WebRTC player in browsers has a very low tolerance for timestamp differences between arriving audio and video samples. Your audio and video streams must be aligned (interleaved) precisely: the timestamp of the last audio sample received from the network should be within roughly ±200 ms of the timestamp of the last video frame received from the network. Otherwise the WebRTC player stops using the NTP timestamps and plays the streams individually. This is because the WebRTC player tries to keep latency at a minimum; I am not sure that was a good decision by the WebRTC team. If your bandwidth is insufficient, or if the live encoder produces streams that are not timestamp-aligned, you will get out-of-sync playback. In my opinion, the WebRTC player could offer a setting for whether to apply that tolerance or to always play in sync using the NTP timestamps, at the expense of latency.
RTP/RTCP (which WebRTC uses) traditionally uses the RTCP Sender Report for this. That allows each SSRC stream to be synced to an NTP timestamp. Browsers do use them today, so things should work.
Are you doing any protocol bridging or anything that could be RTP only? What Media Server are you using?
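To make the tolerance idea above concrete, here is a tiny sketch of the kind of alignment check a player might apply before committing to NTP-based lip-sync. The ~200 ms threshold comes from the answer above; the function names and the exact behaviour are illustrative assumptions, not Chrome's actual code.

```python
# Sketch: decide whether audio/video interleaving is tight enough to sync on
# NTP timestamps, or whether the player should play each stream independently.

SYNC_TOLERANCE_MS = 200  # rough tolerance cited in the answer above (assumed)

def streams_aligned(last_audio_ntp_ms, last_video_ntp_ms,
                    tolerance_ms=SYNC_TOLERANCE_MS):
    """Return True if the most recently received audio and video samples map
    to sender times close enough for lip-sync without adding large latency."""
    return abs(last_audio_ntp_ms - last_video_ntp_ms) <= tolerance_ms

print(streams_aligned(5000.0, 5120.0))  # True: 120 ms apart, sync applied
print(streams_aligned(5000.0, 5400.0))  # False: 400 ms apart, streams play individually
```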

Where to start learning the basics of sending RTP packets for audio and video

Can someone point to training or documentation about how video actually flows between two SIP clients? I know the basics of the client APIs but don't have much knowledge of how RTP packets are formatted, sent over the wire, and received by the other client.
What are the headers, and what happens during packet loss?
And how does video get converted into RTP packets?
Colin Perkins' book "RTP: Audio and Video for the Internet" is, despite being published in 2003, still the best guide for this.
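For a feel of what "converting video into RTP packets" looks like, here is a minimal sketch of the fixed 12-byte RTP header from RFC 3550 and of splitting one encoded frame across several packets. The payload type (96) and SSRC value are arbitrary assumptions; real payload formats (e.g. H.264 per RFC 6184) also add their own payload headers on top of this.

```python
import struct

def build_rtp_packet(payload: bytes, seq: int, timestamp: int, ssrc: int,
                     payload_type: int = 96, marker: bool = False) -> bytes:
    """Pack a 12-byte RTP fixed header (RFC 3550) followed by the payload."""
    version = 2            # RTP version
    padding = 0
    extension = 0
    csrc_count = 0
    byte0 = (version << 6) | (padding << 5) | (extension << 4) | csrc_count
    byte1 = (int(marker) << 7) | payload_type
    header = struct.pack("!BBHII", byte0, byte1,
                         seq & 0xFFFF,            # sequence number: lets the receiver detect loss/reordering
                         timestamp & 0xFFFFFFFF,  # media clock, e.g. 90 kHz for video
                         ssrc)                    # identifies this stream
    return header + payload

# A large encoded video frame is split into MTU-sized packets that all share
# one timestamp; the marker bit flags the last packet of the frame.
frame = b"\x00" * 3000          # stand-in for an encoded frame
mtu_payload = 1200
chunks = [frame[i:i + mtu_payload] for i in range(0, len(frame), mtu_payload)]
packets = [build_rtp_packet(chunk, seq=1000 + i, timestamp=90000, ssrc=0x1234ABCD,
                            marker=(i == len(chunks) - 1))
           for i, chunk in enumerate(chunks)]
print(len(packets), "RTP packets,", len(packets[0]), "bytes in the first")
```

On packet loss: RTP itself does not retransmit; gaps in the sequence numbers tell the receiver what was lost, and RTCP receiver reports feed that loss information back to the sender.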

RTMP live and RTMP FLV

How do I interrupt an RTMP FLV broadcast and publish a live RTMP broadcast on Red5?
I am using the OSMF Strobe player. I have my FLV playlist working, but when I broadcast live from my webcam, what is the formula to stop the FLV streams, play an FLV countdown video, and then connect the live broadcast from the webcam?
Here is how I would do it, from a high level, since I don't have the code for what you're asking and "easy" transitions between streams aren't built into the server.
First, create a signaling or event system within your app to accept actions triggered by your broadcaster. Using the signaling system, transition your subscribers/viewers by sending triggered events telling their players to stop playing the current video and start a new one. I suggest using Shared Objects for passing these signals around, with server-side methods called by your broadcaster to send the signals on the Shared Object. The "play" functionality is the easy part, since you simply provide the stream name in your signal/event.
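The flow above is language-agnostic; in a real Red5 deployment it would live in server-side Java/ActionScript and use Shared Object sync events. The sketch below (in Python, with a plain callback hub standing in for the Shared Object) only shows the transition logic, and all names are illustrative.

```python
# Sketch of the broadcaster-driven transition: stop the playlist, play a
# countdown clip, then switch every viewer to the live stream.

class SubscriberPlayer:
    def __init__(self, name):
        self.name = name
        self.current = None

    def on_signal(self, event, stream_name=None):
        # In Flash/OSMF this handler would react to a SharedObject sync event.
        if event == "stop":
            print(f"{self.name}: stopping {self.current}")
            self.current = None
        elif event == "play":
            self.current = stream_name
            print(f"{self.name}: playing {stream_name}")

class SignalHub:
    """Stand-in for the server-side Shared Object that fans out broadcaster events."""
    def __init__(self):
        self.subscribers = []

    def send(self, event, stream_name=None):
        for subscriber in self.subscribers:
            subscriber.on_signal(event, stream_name)

hub = SignalHub()
hub.subscribers += [SubscriberPlayer("viewer-1"), SubscriberPlayer("viewer-2")]

hub.send("stop")                  # interrupt the FLV playlist
hub.send("play", "countdown.flv") # play the countdown clip
hub.send("play", "live/webcam")   # switch to the live webcam broadcast
```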

How do Chrome/Firefox handle SRTCP reports coming from a WebRTC connection?

SRTCP tracks the number of sent and lost bytes and packets, the last received sequence number, the inter-arrival jitter for each SRTP packet, and other SRTP statistics.
Do the mentioned browsers do anything with SRTCP reports when dealing with an audio stream, for example adjusting the bitrate on the fly if network conditions change?
Given that Chrome does adjust the bitrate and resolution of VP8 on the fly during a connection, I would assume that Opus settings are changed within the connection as well.
You can see the feedback on the sent audio in this image. The bitrate obviously drops slightly when using Opus. However, I would imagine that the video bitrate would be the first thing changed in a video call, as changing it has the greater effect.
Obviously, one cannot change the bitrate of a codec that only supports constant bitrates.
All the other stats are a combination of what the RTCP reports provide (packetsLost, RTT, bits sent, etc.) and Google's monitoring of the inputs/outputs (audio level, echo cancellation, etc.).
NOTE: this is taken from a session created by AppRTC in Chrome.
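To illustrate the feedback loop discussed in this answer, here is a deliberately simplified sketch of a sender reacting to the fraction-lost field of RTCP receiver reports. The thresholds and step sizes are made-up assumptions; real browsers layer much more elaborate congestion control (e.g. transport-wide CC) on top of these basic statistics.

```python
# Sketch: adjust an encoder's target bitrate from RTCP receiver-report loss feedback.

def adapt_bitrate(current_bps, fraction_lost, min_bps=6_000, max_bps=510_000):
    """fraction_lost is the RTCP 8-bit value scaled to 0.0-1.0."""
    if fraction_lost > 0.10:
        # Heavy loss: back off multiplicatively.
        new_bps = current_bps * (1 - 0.5 * fraction_lost)
    elif fraction_lost < 0.02:
        # Network looks clean: probe upwards slowly.
        new_bps = current_bps * 1.05
    else:
        new_bps = current_bps
    return int(min(max(new_bps, min_bps), max_bps))

rate = 32_000  # e.g. an Opus encoder target
for loss in (0.0, 0.0, 0.15, 0.30, 0.01):
    rate = adapt_bitrate(rate, loss)
    print(f"loss={loss:.2f} -> target {rate} bps")
```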