AirPlay audio only with Safari

I'm trying to use Safari's AirPlay API to play an audio file on an AirPlay device:
<audio id="player" src="audio.mp3" controls style="width: 100%"></audio>
<button id="airplay" disabled>airplay</button>
<script>
  const player = document.getElementById('player');
  const airplay = document.getElementById('airplay');

  // Enable the button only while an AirPlay target is available
  player.addEventListener('webkitplaybacktargetavailabilitychanged', e => {
    airplay.disabled = e.availability !== 'available';
  });

  airplay.addEventListener('click', () => player.webkitShowPlaybackTargetPicker());
</script>
The button works as expected, but the device never plays the audio.
I'm testing with an Apple TV: when I pick it in the picker, the TV screen flashes and nothing happens (no music, and the player stays paused).
I have also tried an AAC file, and the behavior is the same.
Any suggestions?
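Not a confirmed answer, but two things may be worth checking: Safari supports an explicit x-webkit-airplay attribute on media elements, and in some AirPlay modes the target device fetches the media URL itself, so a relative src like audio.mp3 may need to be an absolute URL reachable from the device. A minimal sketch (the URL is a placeholder, and neither change is verified as the fix here):
<!-- Hypothetical variation: absolute, device-reachable URL plus the
     x-webkit-airplay attribute -->
<audio id="player"
       src="https://example.com/audio.mp3"
       x-webkit-airplay="allow"
       controls
       style="width: 100%"></audio>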

Related

Stream a static video file through WebRTC

What I am trying to accomplish is to have an audio file and a video file on my page and send them through WebRTC. I managed to do this with audio using the Web Audio API, like this.
HTML:
<audio id="audio">
<source src="../assets/outfoxing.mp3" />
</audio>
JS:
const audioElement = document.getElementById("audio");

// An AudioContext is required here (it was missing from the snippet)
const audioContext = new AudioContext();

// Route the <audio> element through the audio graph so its output is
// available as a MediaStreamTrack that WebRTC can send
const incomingSource = audioContext.createMediaElementSource(audioElement);
const outgoingStream = audioContext.createMediaStreamDestination();
incomingSource.connect(outgoingStream);

const outgoingTrack = outgoingStream.stream.getAudioTracks()[0];
audioElement.play();
await this.sendTransport.produce({ track: outgoingTrack });
For WebRTC I am using mediasoup.
Now I want to do the same with video, but there is no equivalent "Web Video API", so I am stuck. How can I accomplish this task?
There are some limitations, but you could refer to this sample implementation.
It uses the captureStream() method.
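For a concrete idea, here is a minimal sketch of that approach, assuming a <video id="video"> element and the same sendTransport as in the question (captureStream() is prefixed as mozCaptureStream() in Firefox):
const videoElement = document.getElementById("video");
await videoElement.play();

// captureStream() exposes the element's playback as a live MediaStream
const stream = videoElement.captureStream
  ? videoElement.captureStream()
  : videoElement.mozCaptureStream();

const videoTrack = stream.getVideoTracks()[0];
await this.sendTransport.produce({ track: videoTrack });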

WebRTC disable track doesn't turn off webcam

I am trying to implement a toggle-video feature using WebRTC. Refer to the following code:
<video id="remote" autoPlay></video>
<button onclick="toggleVideo()">Toggle video</button>
let localVideo = document.querySelector('#local');
const toggleVideo = () => {
localVideo.srcObject.getVideoTracks()[0].enabled = !localVideo.srcObject.getVideoTracks()[0].enabled
}
This turns off the video as well as the webcam indicator light in Firefox, but not in Chrome; Chrome only turns off the video.
According to the MDN docs:
If the MediaStreamTrack represents the video input from a camera, disabling the track by setting enabled to false also updates device activity indicators to show that the camera is not currently recording or streaming. For example, the green "in use" light next to the camera in iMac and MacBook computers turns off while the track is muted in this way.
Is there any other workaround?
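No fix is posted here, but a workaround commonly suggested elsewhere (a sketch, not a verified answer) is to stop the video track entirely, which makes Chrome release the camera and its indicator light, and to re-acquire a fresh track when toggling back on, swapping it into the existing sender. peerConnection is assumed to be the question's RTCPeerConnection:
let videoOn = true;

const toggleVideo = async () => {
  const localVideo = document.querySelector('#local');
  if (videoOn) {
    // stop() releases the camera, so the indicator light goes off in Chrome
    localVideo.srcObject.getVideoTracks().forEach(track => track.stop());
  } else {
    // Re-acquire a new track and swap it into the sender so the remote
    // side keeps receiving video
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    const [newTrack] = stream.getVideoTracks();
    localVideo.srcObject = stream;
    const sender = peerConnection
      .getSenders()
      .find(s => s.track && s.track.kind === 'video');
    await sender.replaceTrack(newTrack);
  }
  videoOn = !videoOn;
};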

Broadcast a webcam to YouTube, Twitch, or Facebook using HTML5 and WebRTC

I'm working on a project where I need to broadcast live video to YouTube, Twitch, Facebook, or other platforms from my website, using HTML5, RTMP, WebRTC, Node.js, and so on.
Instead of going to YouTube and starting a live video there, I want to start the video from my website.
I'm new to WebRTC and live streaming, so I don't know how to start; any ideas or suggestions would be appreciated.
This is what I have so far.
SERVER SIDE (NodeJs)
io.on('connection', (socket) => {
  socket.on('stream', stream => {
    console.log(stream);
    socket.broadcast.emit('stream', stream);
  });
});
Client Side
Html (video.html)
<div id="videos">
<video id="video" autoplay>
</video>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.3.0/socket.io.js"></script>
<script src="js/video.js"></script>
Javascript (video.js)
var socket = io();

navigator.mediaDevices.getUserMedia({
  video: true,
  audio: true
})
.then(stream => {
  document.getElementById('video').srcObject = stream;
  // Note: socket.io serializes this to a plain object; the MediaStream
  // itself cannot travel over the socket
  socket.emit('stream', stream);
});

socket.on('stream', stream => {
  const video = document.createElement('video');
  video.srcObject = stream;
  video.setAttribute('autoplay', ''); // setAttribute needs a value argument
  document.getElementById('videos').appendChild(video);
});
You will need to do a WebRTC-to-RTMP bridge on your backend.
There are a lot of things to consider, but this is a common question, so I threw together twitch. It is an example of doing this.
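As a rough illustration of the RTMP half of such a bridge (a sketch under assumptions, not the answerer's project): if the browser records its stream with MediaRecorder and sends the chunks over the socket as binary buffers, the backend can pipe them into ffmpeg, which pushes to the platform's RTMP ingest URL. The ingest URL, stream key, and 'stream-chunk' event name are placeholders:
const { spawn } = require('child_process');

// Transcode whatever arrives on stdin to FLV and push it over RTMP
const ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',                    // media chunks arrive on stdin
  '-c:v', 'libx264', '-preset', 'veryfast',
  '-c:a', 'aac', '-ar', '44100',
  '-f', 'flv',                       // RTMP expects an FLV container
  'rtmp://live.twitch.tv/app/STREAM_KEY'
]);

io.on('connection', (socket) => {
  // The browser emits MediaRecorder chunks as binary data
  socket.on('stream-chunk', chunk => ffmpeg.stdin.write(Buffer.from(chunk)));
});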

VideoJS player shows LIVE mode for VOD stream

I am using the Video.js player with an m3u8 playlist loaded dynamically from the server. The playlist has #EXT-X-ENDLIST at the end, so it shouldn't be interpreted as a LIVE stream. However, the player shows the LIVE button and the total time is negative.
See the screenshot:
https://adamant69-my.sharepoint.com/:i:/g/personal/tvbegovic_adamant69_onmicrosoft_com/EYPU0uKR8ypCtf9L4Z6FFAYBq1YPKDE95z5KD0lBLIHdcw?e=12C0is
The link to m3u8 list:
http://accessb.streamsink.com/hls-africa/tv1/GetVODClip.m3u8?date=2019-09-05&start=14:00:00&end=15:00:00
HTML:
<video-js id="wp_video1" class="video-js vjs-default-skin" controls data-setup='{"overrideNative": true,"responsive": true}'></video-js>
The player is given its src dynamically in code:
player.ready(() => {
  player.src({
    src: `${$scope.selectedChannel.ArchivePath}/GetVODClip.m3u8?date=${mClipFrom.format('YYYY-MM-DD')}&start=${mClipFrom.format('HH:mm:ss')}&end=${mClipTo.format('HH:mm:ss')}`,
    type: 'application/x-mpegURL'
  });
});
Is there a problem with my m3u8 list or should I configure player differently?
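One detail worth checking (an assumption, not a verified fix): in Video.js, overrideNative is an option of the HLS tech, not a top-level option, so placing it at the top level of data-setup as above is likely ignored. It would look something like this, with vhs being hls on older versions of the http-streaming tech:
var player = videojs('wp_video1', {
  html5: {
    vhs: { overrideNative: true },
    nativeAudioTracks: false,
    nativeVideoTracks: false
  }
});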

How do I use a Webcam feed as an A-Frame texture?

I'd like to attach a webcam stream as a texture to an entity within an A-Frame scene. Is this possible, and how would I do it?
Examples of the effect I'm going for include:
Projecting my webcam feed onto a TV within the VR scene
"Face timing" someone within VR
Seeing yourself within the VR scene for debugging purposes
https://media.giphy.com/media/cJjZg8kXSUopNzZP4V/giphy.gif
Adding the Asset
The first step is to add the video as an asset:
<a-assets>
  <video id="webcam" playsinline></video>
</a-assets>
Note the playsinline attribute, which prevents the video from forcing the page into fullscreen mode, particularly on mobile devices. It's just a little detail I like to add: while our app will run fullscreen anyway, I want the app to decide that, not some random video element!
Create the Stream
Next we create the stream with:
<!-- Start the webcam stream and attach it to the video element -->
<script>
  // You can also set which camera to use (front/back/etc)
  navigator.mediaDevices.getUserMedia({ audio: false, video: true })
    .then(stream => {
      let $video = document.querySelector('video')
      $video.srcObject = stream
      $video.onloadedmetadata = () => {
        $video.play()
      }
    })
</script>
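To pick a specific camera, as that comment hints, replace video: true with a constraint object; for example (a small variation on the above, not part of the original snippet):
navigator.mediaDevices.getUserMedia({
  audio: false,
  video: { facingMode: 'environment' } // prefer the rear camera on mobile
})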
Apply the Texture
Finally, we apply the stream as a material onto any entity with: material="src: #webcam"
Working Demo
<script src="https://aframe.io/releases/0.8.2/aframe.min.js"></script>
<!-- Create an empty video tag to hold our webcam stream -->
<a-assets>
  <video id="webcam" playsinline></video>
</a-assets>

<!-- Create the scene -->
<a-scene background="color: #ECECEC">
  <a-box position="-1 0.5 -3" rotation="0 45 0" shadow material="src: #webcam"></a-box>
  <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E" shadow></a-sphere>
  <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D" shadow></a-cylinder>
  <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4" shadow></a-plane>
</a-scene>
<!-- Start the webcam stream and attach it to the video element -->
<script>
  // You can also set which camera to use (front/back/etc)
  // #SEE https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia
  navigator.mediaDevices.getUserMedia({ audio: false, video: true })
    .then(stream => {
      let $video = document.querySelector('video')
      $video.srcObject = stream
      $video.onloadedmetadata = () => {
        $video.play()
      }
    })
</script>
If the code snippet doesn't run for you, you can also play with it here: https://glitch.com/~webcam-as-aframe-texture