How do I use a Webcam feed as an A-Frame texture? - webrtc

I'd like to attach a webcam stream as a texture to an entity within an A-Frame scene. Is this possible, and how would I do it?
Examples of the effect I'm going for include:
Projecting my webcam feed onto a TV within the VR scene
"Face timing" someone within VR
Seeing yourself within VR for debugging purposes

https://media.giphy.com/media/cJjZg8kXSUopNzZP4V/giphy.gif
Adding the Asset
The first step is to add the video as an asset:
<a-assets>
  <video id="webcam" playsinline></video>
</a-assets>
Note the playsinline attribute, which prevents the page from entering full-screen mode, particularly on mobile devices. It's a small detail I like to add: our app will run full screen anyway, but I want the app to decide that, not some random video element!
Create the Stream
Next we create the stream with:
<!-- Start the webcam stream and attach it to the video element -->
<script>
  // You can also set which camera to use (front/back/etc.)
  navigator.mediaDevices.getUserMedia({ audio: false, video: true })
    .then(stream => {
      let $video = document.querySelector('video')
      $video.srcObject = stream
      $video.onloadedmetadata = () => {
        $video.play()
      }
    })
    .catch(err => console.error('Could not access the webcam:', err))
</script>
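To pick a specific camera, as the comment above hints, you can pass a facingMode constraint to getUserMedia. A minimal sketch (the cameraConstraints helper is illustrative, not part of any library):

```javascript
// Build getUserMedia constraints for a given camera facing direction:
// 'user' is the front camera, 'environment' the rear one.
function cameraConstraints(facing) {
  return {
    audio: false,
    // "ideal" degrades gracefully if the device has only one camera
    video: { facingMode: { ideal: facing } }
  }
}

// In the browser, request the rear camera and attach it as before:
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia(cameraConstraints('environment'))
    .then(stream => {
      const $video = document.querySelector('video')
      $video.srcObject = stream
      $video.onloadedmetadata = () => $video.play()
    })
    .catch(err => console.error('getUserMedia failed:', err))
}
```

Using `{ exact: facing }` instead of `ideal` makes the request fail outright when no matching camera exists, which can be preferable for debugging.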
Apply the Texture
Finally, we apply the stream as a material onto any entity with: material="src: #webcam"
Working Demo
<script src="https://aframe.io/releases/0.8.2/aframe.min.js"></script>
<!-- Create an empty video tag to hold our webcam stream -->
<a-assets>
  <video id="webcam" playsinline></video>
</a-assets>
<!-- Create the scene; the box uses the webcam stream as its material -->
<a-scene background="color: #ECECEC">
  <a-box position="-1 0.5 -3" rotation="0 45 0" shadow material="src: #webcam"></a-box>
  <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E" shadow></a-sphere>
  <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D" shadow></a-cylinder>
  <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4" shadow></a-plane>
</a-scene>
<!-- Start the webcam stream and attach it to the video element -->
<script>
  // You can also set which camera to use (front/back/etc.)
  // #SEE https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia
  navigator.mediaDevices.getUserMedia({ audio: false, video: true })
    .then(stream => {
      let $video = document.querySelector('video')
      $video.srcObject = stream
      $video.onloadedmetadata = () => {
        $video.play()
      }
    })
    .catch(err => console.error('Could not access the webcam:', err))
</script>
If the Code Runner doesn't work, you can also play with it here: https://glitch.com/~webcam-as-aframe-texture

Related

Browser keeps accessing the camera (red dot) even after stopping the stream after establishing the peer connection using WebRTC

let localStream;
let peerConnection;
navigator.mediaDevices.getUserMedia({
  audio: true,
  video: true
}).then(function(stream) {
  createPeerConnection();
  localStream = stream;
  peerConnection.addStream(localStream);
});
Stopping the stream like this does stop the video:
localStream.getTracks().forEach(track => track.stop());
But the browser tab still shows the red dot indicating camera or microphone access. I don't want to have to reload the page to make it go away.
Note: this happens after establishing a peer connection using WebRTC; after disconnecting the peers, the camera light stays on.
Is there any way to do that? Thanks for your help in advance.
You can use a boolean flag for whether the tab should access the camera: after track.stop(), set the flag to false so the camera is not accessed again. (You can try whether this works.)
<!DOCTYPE html>
<html>
<head>
  <title>Web Client</title>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
  <div id="callerIDContainer">
    <button onclick="call_user();">Call User</button>
  </div>
  <div class="video-container">
    <video autoplay muted class="local-video" id="local-video"></video>
  </div>
  <div>
    <button onclick="hangup();">Hangup</button>
  </div>
  <script>
    var localStream;
    var accessRequired = true;
    function call_user() { // your function
      if (accessRequired) {
        navigator.mediaDevices.getUserMedia({
          audio: true,
          video: true
        }).then(function(stream) {
          localStream = stream;
          const localVideo = document.getElementById("local-video");
          if (localVideo) {
            localVideo.srcObject = localStream;
          }
        });
      }
    }
    function hangup() {
      // forEach returns undefined, so it cannot be chained with .then();
      // stop the tracks first, then clear the flag
      localStream.getTracks().forEach(track => track.stop());
      accessRequired = false;
    }
  </script>
</body>
</html>
Try this: call the user, then hang up. It works.
The sample code in your question looks like it uses gUM() to create an audio-only stream ({video: false, audio:true}).
It would be strange if using .stop() on all the tracks on your audio-only stream also stopped the video track on some other stream. If you want to turn off your camera's on-the-air light you'll need to stop the video track you used in peerConnection.addTrack(videoTrack). You probably also need to tear down the call using peerConnection.close().
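Putting that together, a minimal sketch of a complete teardown, assuming localStream and peerConnection are the variables from the question (the hangUp name is illustrative):

```javascript
// Release the camera completely after a WebRTC call.
function hangUp(localStream, peerConnection) {
  // Stop every local track -- this is what turns off the camera light
  if (localStream) {
    localStream.getTracks().forEach(track => track.stop())
  }
  // Tear down the call itself so the browser fully releases the devices
  if (peerConnection) {
    peerConnection.close()
  }
}
```

Stopping only the remote stream's tracks is not enough; the red dot tracks the local capture devices, so the local tracks must be stopped too.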
I had the same issue with WebRTC and React. I had stopped the tracks of the remote stream but forgot to stop the local stream:
window.localStream.getTracks().forEach((track) => {
  track.stop();
});

broadcast a web-cam to (YouTube, Twitch , facebook) using HTML5 and WebRTC

I'm working on a project where I need to broadcast live video to YouTube, Twitch, Facebook, or other platforms from my website, using HTML5, RTMP, WebRTC, Node.js, and so on.
So instead of going to YouTube and starting a live video there, I want to start the video from my website.
But I'm new to WebRTC and live streaming, and I don't know how to start. If anyone has ideas or suggestions about how to do this, please contact me or leave a comment here.
This is what I did:
Server Side (Node.js)
io.on('connection', (socket) => {
  socket.on('stream', stream => {
    console.log(stream)
    socket.broadcast.emit('stream', stream);
  });
})
Client Side
HTML (video.html)
<div id="videos">
  <video id="video" autoplay></video>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.3.0/socket.io.js"></script>
<script src="js/video.js"></script>
JavaScript (video.js)
var socket = io();
navigator.mediaDevices.getUserMedia({
  video: true,
  audio: true
})
.then(stream => {
  document.getElementById('video').srcObject = stream
  socket.emit("stream", stream);
})
socket.on('stream', stream => {
  video = document.createElement("video")
  video.srcObject = stream
  video.setAttribute('autoplay', '')
  document.getElementById("videos").appendChild(video)
})
You will need a WebRTC-to-RTMP bridge on your backend. Note that a MediaStream object cannot be serialized and sent over socket.io the way your code does; you have to send encoded media (for example, MediaRecorder chunks) instead.
There are a lot of things to consider, but this is a common question, so I threw together an example of doing this for Twitch.

Videojs can't play m3u8 blob URL

I am using Video.js version 7.6.6. It will not play an HTML5 video if the src is a blob URL. The video duration loads, but playback never starts. I get this warning, and then it loads forever:
VIDEOJS: WARN: Problem encountered with the current HLS playlist. Trying again since it is the only playlist.
This is the way my code runs:
<video id="my_video" class="video-js vjs-matrix vjs-default-skin vjs-big-play-centered" controls
preload="none" width="640" height="268" data-setup="{}"></video>
<script type="text/javascript" src="/js/video-766.min.js"></script>
<script>
fetch("https://server/hls/index.m3u8").then(result => result.blob())
  .then(blob => {
    var blobURL = URL.createObjectURL(blob);
    var player = videojs("my_video");
    player.src({ src: blobURL, type: "application/x-mpegURL" });
  });
</script>
If I try it without a blob, just a regular URL to the index.m3u8 file, then it works. So this is a problem with the creation of the blob URL I think. This works, the video starts playing:
<video id="my_video" class="video-js vjs-default-skin" height="360" width="640" controls preload="none">
<source src="https://server/hls/index.m3u8" type="application/x-mpegURL" />
</video>
<script>
var player = videojs('my_video');
</script>
I have searched for this issue and found a lot, but none of it helps me. Am I creating the blob wrong?
The object URL generated for the blob uses the blob: scheme, and an m3u8 playlist usually references its media segments by relative path, so the player cannot resolve those segment URLs against a blob: URL. I ran into a similar problem, so I created a simple server in my app that returns the requested files over https://.
Your index.m3u8 works when referenced directly because it is served over the https protocol, so the relative segment URLs resolve correctly.

Airplay audio only with Safari

I'm trying to use the Safari's Airplay API to play an audio file to an Airplay device:
<audio id="player" src="audio.mp3" controls style="width: 100%"></audio>
<button id="airplay" disabled>airplay</button>
<script>
const player = document.getElementById('player');
const airplay = document.getElementById('airplay');
player.addEventListener('webkitplaybacktargetavailabilitychanged', e => {
airplay.disabled = e.availability !== 'available';
});
airplay.addEventListener('click', () => player.webkitShowPlaybackTargetPicker());
</script>
The button works as expected, but the device is unable to play.
I'm trying on an Apple TV: when I select it, the screen flashes and nothing happens (no music, and the player stays paused).
I have tried with an AAC file and the behavior is the same.
Any suggestions?
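One way to narrow this down is to check whether the AirPlay target actually connects. A sketch using Safari's prefixed event (the addAirplayDebug name is illustrative):

```javascript
// Safari fires this prefixed, Safari-only event whenever the playback
// target switches to or from a wireless (AirPlay) device; logging the
// flag shows whether the Apple TV connected before playback failed.
function addAirplayDebug(mediaEl) {
  mediaEl.addEventListener('webkitcurrentplaybacktargetiswirelesschanged', () => {
    console.log('wireless target active:',
      mediaEl.webkitCurrentPlaybackTargetIsWireless)
  })
}

// In the page: addAirplayDebug(document.getElementById('player'))
```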

VideoJS doesn´t play after pause RTMP live stream

I'm using Video.js for a live stream from a Wowza server, but when I pause the player and then play again, it does not recover the stream. I need to reload the page to restart the stream.
<video id="videoID" class="video-js vjs-default-skin vjs-big-play-centered" poster="/images/image.png" controls="controls" width="320" height="240" data-setup='{"techOrder": ["flash"]}'>
<source src="rtmp://www.myhost.com:1935/live/live.stream" type="rtmp/mp4" />
</video>
Is there any method to stop or reload Video.js when the pause event occurs?
EDIT: I've encountered the solution using this script:
<script type="text/javascript">
  var myPlayer = videojs('videoID');
  videojs("videoID").ready(function(){
    var myPlayer = this;
    myPlayer.on("pause", function () {
      myPlayer.on("play", function () { myPlayer.load(); myPlayer.off("play"); });
    });
  });
</script>
OK, so I found a solution. Instead of stripping all of the play events from the player, you actually just need to edit the Flash play event inside video.dev.js on line 7337 (in version 4.11.4, I think). This is the line that says:
vjs.Flash.prototype.play = function(){
  this.el_.vjs_play();
};
it should be changed to say:
vjs.Flash.prototype.play = function(){
  this.el_.vjs_load();
  this.el_.vjs_play();
};
so that the load event is called before the play event.
I found the solution here
$(document).ready(function(){
  var player = videojs('really-cool-video', { /* Options */ }, function () {
    // ...
    var player = this;
    player.on("pause", function () {
      player.one("play", function () {
        player.load();
        player.play();
      });
    });
  });
});