I'm working on a project where I need to broadcast live video to YouTube, Twitch, Facebook, or other platforms from my website, using HTML5, RTMP, WebRTC, Node.js, and so on.
In other words, instead of going to YouTube and starting a live video there, I want to start the stream from my website.
I'm new to WebRTC and live streaming, though, and I don't know how to start. If anyone has ideas or suggestions about how to do this, please leave a comment.
This is what I did:
SERVER SIDE (NodeJs)
io.on('connection', (socket) => {
    socket.on('stream', stream => {
        console.log(stream);
        socket.broadcast.emit('stream', stream);
    });
});
Client Side
Html (video.html)
<div id="videos">
<video id="video" autoplay>
</video>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.3.0/socket.io.js"></script>
<script src="js/video.js"></script>
Javascript (video.js)
var socket = io();

navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true
})
.then(stream => {
    document.getElementById('video').srcObject = stream;
    socket.emit('stream', stream);
});

socket.on('stream', stream => {
    var video = document.createElement('video');
    video.srcObject = stream;
    video.setAttribute('autoplay', '');
    document.getElementById('videos').appendChild(video);
});
You will need to do a WebRTC-to-RTMP bridge on your backend; a MediaStream object can't be serialized, so broadcasting it over socket.io like this won't work. There are a lot of things to consider, but this is a common question, so I threw together an example that streams to Twitch to show one way of doing it.
What I am trying to accomplish is to have an audio file and a video file on my page and send them through WebRTC. I managed to do this with audio using the Web Audio API, like this:
HTML:
<audio id="audio">
<source src="../assets/outfoxing.mp3" />
</audio>
JS:
const audioContext = new AudioContext();
const audioElement = document.getElementById("audio");
const incomingSource = audioContext.createMediaElementSource(audioElement);
const outgoingStream = audioContext.createMediaStreamDestination();
incomingSource.connect(outgoingStream);
const outgoingTrack = outgoingStream.stream.getAudioTracks()[0];
audioElement.play();
await this.sendTransport.produce({ track: outgoingTrack });
For WebRTC I am using mediasoup.
Now I want to do the same with video, but there is no such thing as a "Web Video API", so I am stuck. How can I accomplish this?
There are some limitations, but you could refer to this sample implementation.
It uses the captureStream() method.
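A minimal sketch of the captureStream() approach, assuming a mediasoup-client sendTransport like the one in the question (the helper name is mine, not part of any API):

```javascript
// Hedged sketch: capture a MediaStream from a <video> element and produce
// its tracks over a mediasoup sendTransport (assumed from the question).
async function produceFromVideoElement(videoElement, sendTransport) {
  // captureStream() is prefixed as mozCaptureStream() in Firefox
  const stream = videoElement.captureStream
    ? videoElement.captureStream()
    : videoElement.mozCaptureStream();
  await videoElement.play();
  const producers = [];
  for (const track of stream.getTracks()) {
    // produce() ships each captured track (video and audio) to the server
    producers.push(await sendTransport.produce({ track }));
  }
  return producers;
}
```

Note that the element must actually be playing for the captured tracks to carry data, which is why the sketch calls play() before producing.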
Safari doesn't support using MediaRecorder to listen to the stream from the webcam, as in the code below.
This works perfectly in Chrome, and I'm able to convert the blobs to a WebM video file.
if (navigator.mediaDevices.getUserMedia) {
    navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(stream => {
        videoRef.srcObject = stream;
        mediaRecorder.value = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8,opus' });
        mediaRecorder.value.addEventListener('dataavailable', function (e) {
            blobs.push(e.data);
        });
    });
}
I need to save the video streamed from the webcam on my server. What should the approach be to achieve this in Safari?
I researched a lot and saw a similar question, but no proper solution was given.
Could someone point me to a tutorial on how to achieve this using WebRTC, if that is what's needed?
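One thing worth checking before reaching for WebRTC: rather than hard-coding the WebM mimeType, you can feature-detect a container the current browser can actually record; where Safari supports MediaRecorder at all, it records MP4 rather than WebM. A sketch, with a helper name of my own choosing:

```javascript
// Sketch: pick the first MediaRecorder mimeType the browser supports,
// instead of hard-coding 'video/webm' (which Safari rejects).
function pickSupportedMimeType(MediaRecorderImpl, candidates) {
  for (const type of candidates) {
    if (MediaRecorderImpl.isTypeSupported(type)) return type;
  }
  return ""; // fall back to the browser's default container
}

const CANDIDATES = [
  'video/webm; codecs=vp8,opus', // Chrome, Firefox
  'video/mp4',                   // Safari
];
```

Usage would be `new MediaRecorder(stream, { mimeType: pickSupportedMimeType(MediaRecorder, CANDIDATES) })`; the server then has to accept either container.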
let localStream;
let peerConnection;
navigator.mediaDevices.getUserMedia({
audio: true,
video: true
}).then(function(stream) {
createPeerConnection();
localStream = stream;
peerConnection.addStream(localStream);
});
So when I stop the stream, it stops the video:
localStream.getTracks().forEach(track => track.stop());
But the browser tab still shows the red dot indicating that it is accessing the camera or microphone. I don't want to have to reload the page to make that go away.
Note: this happens after establishing a peer connection using WebRTC; after the peers disconnect, the camera light stays on.
Is there any way to do that? Thanks for your help in advance.
You can use a boolean value or a condition to track whether the tab should access the camera: after track.stop(), set the value to false so the camera is not accessed anymore. (P.S. try this and see if it works.)
<!DOCTYPE html>
<html>
<head>
<title>Web Client</title>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
<div id="callerIDContainer">
<button onclick="call_user();">Call User</button>
</div>
<div class="video-container">
<video autoplay muted class="local-video" id="local-video"></video>
</div>
<div>
<button onclick="hangup();">Hangup</button>
</div>
</body>
<script>
var localStream;
var accessRequired = true;
function call_user() //your function
{
if(accessRequired)
{
navigator.mediaDevices.getUserMedia({
audio: true,
video: true
}).then(function(stream) {
localStream = stream;
const localVideo = document.getElementById("local-video");
if (localVideo) {
localVideo.srcObject = localStream;
}
});
}
}
function hangup() {
    localStream.getTracks().forEach(track => track.stop());
    accessRequired = false;
}
</script>
</html>
Try this: call the user, then hang up. It works.
The sample code in your question looks like it uses gUM() to create an audio-only stream ({video: false, audio:true}).
It would be strange if calling .stop() on all the tracks of your audio-only stream also stopped the video track on some other stream. If you want to turn off your camera's on-the-air light, you'll need to stop the video track you used in peerConnection.addTrack(videoTrack). You probably also need to tear down the call using peerConnection.close().
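Put together, a teardown sketch along those lines, using the `localStream`/`peerConnection` names from the question's snippet:

```javascript
// Sketch: release the devices AND tear down the call. `localStream` and
// `peerConnection` are the variables from the question's snippet.
function hangup(localStream, peerConnection) {
  // Stop every track, audio and video, so the browser releases the devices
  localStream.getTracks().forEach((track) => track.stop());
  // Close the peer connection so nothing keeps a live reference to the tracks
  if (peerConnection) {
    peerConnection.close();
  }
}
```

Once no live track references the camera, the red recording indicator should disappear without a page reload.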
I had the same issue with WebRTC and React. I had stopped the tracks of the remote stream, but I forgot to stop the local stream:
window.localStream.getTracks().forEach((track) => {
track.stop();
});
I am using Video.js version 7.6.6. It will not play an HTML5 video if the src is a blob URL. It loads the video duration, but it will not play. I get this warning, and then it loads forever:
VIDEOJS: WARN: Problem encountered with the current HLS playlist. Trying again since it is the only playlist.
This is the way my code runs:
<video id="my_video" class="video-js vjs-matrix vjs-default-skin vjs-big-play-centered" controls
preload="none" width="640" height="268" data-setup="{}"></video>
<script type="text/javascript" src="/js/video-766.min.js"></script>
<script>
fetch("https://server/hls/index.m3u8").then(result => result.blob())
.then(blob => {
var blobURL = URL.createObjectURL(blob);
var player = videojs("my_video");
player.src({ src: blobURL, type: "application/x-mpegURL" });
}
);
</script>
If I try it without a blob, just a regular URL to the index.m3u8 file, then it works. So I think the problem is with the creation of the blob URL. This works, and the video starts playing:
<video id="my_video" class="video-js vjs-default-skin" height="360" width="640" controls preload="none">
<source src="https://server/hls/index.m3u8" type="application/x-mpegURL" />
</video>
<script>
var player = videojs('my_video');
</script>
I have searched for this issue and found a lot, but none of it helps me. Am I creating the blob wrong?
The object URL generated for the blob uses the blob: scheme rather than http(s). The playlist text itself loads, but an HLS playlist references its segment files with relative URIs, and those can't be resolved against a blob: URL, so playback never starts. I ran into a similar problem, so I created a simple server in my app that returns the requested files over https://.
Your plain index.m3u8 URL works because it is served over the http(s) protocol, so the relative segment URIs resolve against it.
I am trying to use Google Play Games Services to create a web-based multiplayer game, but I can't create the rooms. Is it possible to use Google Play Games Services for real-time multiplayer on the web?
<meta name="google-signin-clientid" content="32241234345-oklsdfhgiodf89789gfgfy9ym.apps.googleusercontent.com" />
<meta name="google-signin-cookiepolicy" content="single_host_origin" />
<meta name="google-signin-callback" content="signinCallback" />
<meta name="google-signin-scope" content="https://www.googleapis.com/auth/games https://www.googleapis.com/auth/plus.login">
<!--meta name="google-signin-scope" content="https://www.googleapis.com/auth/games-->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script>
<script src="https://apis.google.com/js/client.js"></script>
<script src="https://apis.google.com/js/client:plusone.js"></script>
<script>
var appId = '0000000000';
var clientId = '0000000000-dfdsfsdfsdfsdgsdgfsdtodnjj8hq6hjm.apps.googleusercontent.com';
var apiKey = 'SDFGSDKjjk123123';
//var scopes = 'https://www.googleapis.com/auth/plus.me';
var scopes = 'https://www.googleapis.com/auth/games';
function create(){
gapi.client.request({
path: 'https://www.googleapis.com/games/v1/rooms/create',
method: 'POST',
callback: function(response) {
console.log(response);
}
});
return false;
}
</script>
The answer depends on whether or not you want the game to be real-time. If you are OK with a turn-based multiplayer game, then you can use the REST API.
For more info visit this link: https://developers.google.com/games/services/common/concepts/turnbasedMultiplayer
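As a sketch, the turn-based analogue of the rooms/create call in the question would go through the turnbasedmatches collection. The request body fields here are illustrative placeholders; check the Games v1 REST reference for the full schema.

```javascript
// Hedged sketch: create a turn-based match through the REST API, mirroring
// the rooms/create request in the question. Body fields are illustrative.
function createTurnBasedMatch(gapi, callback) {
  gapi.client.request({
    path: "https://www.googleapis.com/games/v1/turnbasedmatches/create",
    method: "POST",
    body: {
      // auto-match with exactly one other player (placeholder values)
      autoMatchingCriteria: {
        minAutoMatchingPlayers: 1,
        maxAutoMatchingPlayers: 1,
      },
    },
    callback: callback,
  });
}
```

You would call this after sign-in, the same way create() is wired up in the question.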
If you want real-time multiplayer, then the answer is no. You currently cannot do that with Google's API; it is only available on mobile.
Unfortunately, it only works on Android and iOS:
The Google Play Games real-time multiplayer service is supported on
these platforms:
Android through the Google Play services SDK
iOS through the iOS SDK
https://developers.google.com/games/services/common/concepts/realtimeMultiplayer