I am getting a dropout while making a call using the JsSIP/SIP.js library, and there is no audio either. The following is shown in the JavaScript console.
====
Fri Apr 04 2014 10:14:30 GMT+0530 (IST) | sip.sanitycheck | Via sent-by in the response does not match UA Via host value. Dropping the response sip-0.5.0.js:170
Fri Apr 04 2014 10:14:34 GMT+0530 (IST) | sip.sanitycheck | Via sent-by in the response does not match UA Via host value. Dropping the response sip-0.5.0.js:170
Fri Apr 04 2014 10:14:38 GMT+0530 (IST) | sip.sanitycheck | Via sent-by in the response does not match UA Via host value. Dropping the response sip-0.5.0.js:170
Fri Apr 04 2014 10:14:42 GMT+0530 (IST) | sip.sanitycheck | Via sent-by in the response does not match UA Via host value. Dropping the response sip-0.5.0.js:170
The setup involves an Asterisk server and a WebRTC service.
Below is the sample HTML page I am using to test.
<!DOCTYPE html>
<html>
<head>
<!-- <script type="text/javascript" src="SIPml-api.js"></script> -->
</head>
<body>
Hello world
<video id="remoteVideo"></video>
<video id="localVideo" muted="muted"></video>
<button type="button" id="endButton">End</button>
<button type="button" id="callButton">Call</button>
</body>
<script type="text/javascript" src="sip-0.5.0.js"></script>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>
<script type="text/javascript">
(function () {
var session;
function onAccepted () {
alert("You made a call!");
}
var userAgent = new SIP.UA({
uri: 'sip:100@X.X.X.X',
// wsServers: ['ws://mywebrtc.com:10060'],
wsServers: ['wss://mywebrtc.com:10062'],
authorizationUser: '100',
password: '1234'
});
$( document ).ready(function() {
var endButton = document.getElementById('endButton');
endButton.addEventListener("click", function() {
session.bye();
alert("Call Ended");
}, false);
});
//here you determine whether the call has video and audio
var options = {
mediaConstraints: {
audio: true,
video: true
}
};
//makes the call
session = userAgent.invite('111', options);
session.on('accepted', onAccepted);
}) ();
</script>
</html>
=====
Can somebody help me with this?
Try replacing the following in your code:
<video id="remoteVideo"></video>
<video id="localVideo" muted="muted"></video>
with
<audio id="remoteAudio"></audio>
<audio id="localAudio" muted="muted"></audio>
and
//here you determine whether the call has video and audio
var options = {
mediaConstraints: {
audio: true,
video: true
}
};
with
//Here you determine whether the call has video and audio
var options = {
media: {
constraints: {
audio: true,
video: false,
},
render: {
remote: {
audio: document.getElementById('remoteAudio')
},
local: {
audio: document.getElementById('localAudio')
}
}
}
};
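For context, a rough sketch of how the call from the question would then be placed, assuming the same userAgent, the '111' target, and the audio elements and options shown above:
// Sketch: placing the call with the audio-only options defined above.
// userAgent and options are the objects from the question/answer.
var session = userAgent.invite('111', options);

session.on('accepted', function () {
  alert('You made a call!');
});

document.getElementById('endButton').addEventListener('click', function () {
  session.bye(); // hang up the established session
}, false);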
I have tested with Asterisk 11.11.0, Firefox 31.0, and Opera 22.0.1471.70, and the audio call is working fine. I am facing a no-audio issue with the latest Chrome browser (Version 37.0.2062.58 beta-m (64-bit)); otherwise it works like a charm.
One more thing: Asterisk 11 does not support the VP8 codec, so video will not work. Asterisk 12 supports VP8 in passthrough mode; I have yet to test this feature.
Related
let localStream;
let peerConnection;
navigator.mediaDevices.getUserMedia({
audio: true,
video: true
}).then(function(stream) {
createPeerConnection();
localStream = stream;
peerConnection.addStream(localStream);
});
So when stopping the stream, it stops the video:
localStream.getTracks().forEach(track => track.stop());
But the browser tab still shows that it is accessing the camera or microphone, with a red dot beside it. I just do not want to reload the page in order to stop that.
Note: this happens after establishing a peer connection using WebRTC; after disconnecting the peers, the camera light stays on.
Is there any way to do that? Thanks in advance for your help.
You can use a boolean flag that controls whether the tab accesses the camera. After track.stop(), set the flag to false so the camera will not be accessed anymore. (P.S. try it and see if it works.)
<!DOCTYPE html>
<html>
<head>
<title>Web Client</title>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
<div id="callerIDContainer">
<button onclick="call_user();">Call User</button>
</div>
<div class="video-container">
<video autoplay muted class="local-video" id="local-video"></video>
</div>
<div>
<button onclick="hangup();">Hangup</button>
</div>
</body>
<script >
var localStream;
var accessRequired=true
function call_user() //your function
{
if(accessRequired)
{
navigator.mediaDevices.getUserMedia({
audio: true,
video: true
}).then(function(stream) {
localStream = stream;
const localVideo = document.getElementById("local-video");
if (localVideo) {
localVideo.srcObject = localStream;
}
});
}
}
function hangup(){
localStream.getTracks().forEach(track => track.stop());
accessRequired = false; // forEach returns undefined, so set the flag here instead of chaining .then()
}
</script>
</html>
Try this: call the user, then hang up. It is working.
The sample code in your question looks like it uses gUM() to create an audio-only stream ({video: false, audio:true}).
It would be strange if using .stop() on all the tracks of your audio-only stream also stopped the video track on some other stream. If you want to turn off your camera's on-the-air light, you'll need to stop the video track you used in peerConnection.addTrack(videoTrack). You probably also need to tear down the call using peerConnection.close().
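For illustration, a minimal sketch of that teardown, reusing the localStream and peerConnection variables from the question (a sketch only, not a drop-in fix):
// Sketch: stop every local track (this releases the camera/mic and turns off
// the recording indicator), then tear down the peer connection.
function hangUp() {
  if (localStream) {
    localStream.getTracks().forEach(function (track) {
      track.stop();
    });
    localStream = null;
  }
  if (peerConnection) {
    peerConnection.close();
    peerConnection = null;
  }
}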
I had the same issue with WebRTC and React. I had stopped the tracks of the remote stream, but I forgot to stop the local stream:
window.localStream.getTracks().forEach((track) => {
track.stop();
});
The OneDrive Picker does not load after the authentication process; it shows a spinner instead.
Steps to Reproduce
OneDrive Scripts Tested:
https://js.live.net/v7.2/OneDrive.debug.js
https://js.live.net/v7.2/OneDrive.js
Code used to initiate the OneDrive Picker:
function launchOneDriverPicker() {
debugger;
var odOptions = {
clientId: "${clientId}",
action: "share",
multiSelect: true,
openInNewWindow: true,
advanced: {
redirectUri: "${redirectUri}"
},
success: function(r) {
},
cancel: function() {
},
error: function(error) {
}
};
OneDrive.open(odOptions);
}
Environments Tested:
- Chrome (Normal/Incognito)
- Firefox (Normal/Incognito)
Steps
The page hosting the OneDrive Picker and redirect URLs are served from the same domain
The redirect occurs to a redirect page with the following content hosted under the same domain (domain/redirect):
<html>
<head>
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<script type="text/javascript" src="https://js.live.net/v7.2/OneDrive.debug.js"></script>
</head>
</html>
Additional Notes
The above setup works correctly with the OneDrive Picker 7.0, as noted in an earlier issue:
https://github.com/OneDrive/onedrive-api-docs/issues/824
While debugging the issue I noticed that the OneDrive Picker makes a cross-document call to the parent window that opened the Picker. There are no errors up to this point, but the parent page does not receive this message (see the listener sketch below).
The domain specified in the cross-document call is correct.
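To verify whether that message ever reaches the opener, you could log every message event on the parent page; a debugging sketch (the origin string in the comment is a placeholder, not taken from the issue):
// Debugging sketch: log any cross-document (postMessage) traffic arriving at the parent page.
window.addEventListener('message', function (event) {
  console.log('message from', event.origin, event.data);
  // Optionally filter to your own domain, e.g.:
  // if (event.origin !== 'https://your-domain.example') return;
});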
Reference
[1] https://learn.microsoft.com/en-us/onedrive/developer/controls/file-pickers/js-v72/open-file?view=odsp-graph-online#using-a-custom-redirect-uri
I am trying to play back a video (currently hosted on S3 with public access) by creating a blob URL.
I have used Elastic Transcoder to encode the video, since it is supposed to place the MOOV atom at the top (beginning) of the file.
I am unable to get the code to work, but I also found a working example: link here
Here is my code:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
</head>
<body>
<video controls></video>
<script>
var video = document.querySelector('video');
var assetURL = 'https://ovation-blob-url-test.s3.amazonaws.com/AdobeStock_116640093_Video_WM_NEW.mp4';
// Need to be specific for Blink regarding codecs
// ./mp4info frag_bunny.mp4 | grep Codec
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
var mediaSource = new MediaSource;
//console.log(mediaSource.readyState); // closed
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
console.error('Unsupported MIME type or codec: ', mimeCodec);
}
function sourceOpen (_) {
//console.log(this.readyState); // open
var mediaSource = this;
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
fetchAB(assetURL, function (buf) {
sourceBuffer.addEventListener('updateend', function (_) {
mediaSource.endOfStream();
video.play();
//console.log(mediaSource.readyState); // ended
});
sourceBuffer.appendBuffer(buf);
});
};
function fetchAB (url, cb) {
console.log(url);
var xhr = new XMLHttpRequest;
xhr.open('get', url);
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
cb(xhr.response);
};
xhr.send();
};
</script>
</body>
</html>
What am I doing wrong? I looked at tools like MP4Box or qt-faststart, but they seem kind of old school. I would also be willing to switch from MP4 to an M3U8 playlist, but then I don't know what MIME types to use.
At the end of the day I am trying to play back a video/stream and hide the URL (origin), potentially using a blob.
Thank you guys!
So, first, even though this code seems to be taken from the Mozilla documentation site, there are a few issues: you are not checking the readyState before calling endOfStream, so the error you get is valid; secondly, the play() call is blocked by the autoplay policy changes. If you add an error handler, you will actually see that the appendBuffer call fails. Here is the updated snippet:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
</head>
<body>
<video controls></video>
<script>
var video = document.querySelector('video');
var assetURL = 'https://ovation-blob-url-test.s3.amazonaws.com/AdobeStock_116640093_Video_WM_NEW.mp4';
// Need to be specific for Blink regarding codecs
// ./mp4info frag_bunny.mp4 | grep Codec
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
var mediaSource = new MediaSource;
//console.log(mediaSource.readyState); // closed
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
console.error('Unsupported MIME type or codec: ', mimeCodec);
}
function sourceOpen (_) {
//console.log(this.readyState); // open
var mediaSource = this;
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
fetchAB(assetURL, function (buf) {
sourceBuffer.addEventListener('updateend', function (_) {
// console.log(mediaSource.readyState); // ended
if (mediaSource.readyState === "open") {
mediaSource.endOfStream();
video.play();
}
});
sourceBuffer.addEventListener('error', function (event) {
console.log('an error encountered while trying to append buffer');
});
sourceBuffer.appendBuffer(buf);
});
};
function fetchAB (url, cb) {
console.log(url);
var xhr = new XMLHttpRequest;
xhr.open('get', url);
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
cb(xhr.response);
};
xhr.send();
};
</script>
</body>
</html>
So let's move on to the next issue, the actual error. Using chrome://media-internals/ we can see that the video actually fails to load due to incompatibility with the ISOBMFF format:
I am not familiar with Elastic Transcoder, but it seems that it is not producing an MP4 file suitable for live streaming. Also, when using MSE, putting the moov atom at the beginning is not enough; the video actually has to meet all of the ISOBMFF byte stream requirements (see chapters 3 and 4).
The working sample you mentioned is not a valid comparison, since it uses the blob for the src, where the ISOBMFF rules do not apply. If it is fine for you to go that way, don't use MSE and put the blob directly in the src. If you need MSE, you have to mux the file correctly.
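To illustrate that last option, a minimal sketch that skips MSE entirely: download the file and put the blob straight into src (assuming the same assetURL and video element as above, and that the S3 bucket allows cross-origin requests):
// Sketch: play the mp4 through a blob: URL without MediaSource.
// The element only ever sees a blob: URL, not the S3 origin.
var video = document.querySelector('video');
fetch(assetURL)
  .then(function (response) { return response.blob(); })
  .then(function (blob) {
    video.src = URL.createObjectURL(blob);
    return video.play(); // may still be subject to autoplay policy
  })
  .catch(function (err) {
    console.error('failed to load video as blob', err);
  });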
Ok, so I got the original code example to work by encoding my MP4 videos with ffmpeg:
ffmpeg -i input.mp4 -vf scale=1920:1080,setsar=1:1 -c:v libx264 -preset medium -c:a aac -movflags empty_moov+default_base_moof+frag_keyframe output.mp4 -hide_banner
Important is: -movflags empty_moov+default_base_moof+frag_keyframe
This setup also scales the video to 1920x1080 (disregarding any aspect ratio of the input video)
However, based on the comments on the original post, I do believe there might be a more efficient way to generate the blob URL and feed it into a video tag. This example was copied straight from https://developer.mozilla.org.
If anyone comes up with a better script (not over-engineered), please post it here.
Thank you @Rudolfs Bundulis for all your help!
I am trying to understand how push notifications work. I tried to do some tests of push technology, but so far I have failed.
The base assumptions are:
1) Use the Apache web server as the main application web server (mandatory, since all our code relies on it).
2) A cross-browser push notification server written in Node.js (socket.io was suggested since it is cross-browser).
So far I have failed; here is my code (p1.html):
<!doctype html>
<html>
<head>
<meta charset="UTF-8">
<title>P1</title>
</head>
<body>
<h1>P1</h1>
<section id="content"></section>
<script src="/socket.io.js"></script> <!--socket.io-->
<script src="/socket.js"></script> <!--socket.io-client-->
<script src="http://code.jquery.com/jquery-1.7.1.min.js"></script>
<script>
var socket = io.connect('http://localhost:8080');
socket.on('notification', function (data) {
$('#content').append(data.message + '<br>')
});
</script>
</body>
</html>
and my server script (p1.js):
var app = require('http').createServer(handler)
, io = require('socket.io').listen(app)
, url = require('url')
app.listen(8080);
console.log("creating a connection");
io.sockets.on( 'connection', function ( socket ) {
console.log("runing time");
sendTimeMessage(socket);
});
function sendTimeMessage(socket){
console.log("in time");
var time= new Date().getTime();
console.log(time);
socket.volatile.emit( 'notification' , time );
setTimeout(sendTimeMessage, 5000);
}
function handler (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.end("");
}
function sendMessage(message) {
io.sockets.emit('notification', {'message': message});
}
I changed the IPs to localhost for the example, so I hope there is no mistake in the syntax.
When I run it, the Apache web server is the one that serves the page, and the idea is for socket.io to update a few fields.
Current state:
1. If I don't add the socket.io-client JS file, I get a reference error for socket.io-client.
2. If I do add socket.io-client, I get "ReferenceError: require is not defined
[Break On This Error] 'undefined' != typeof io ? io : module.exports"
I could really use help understanding this and making it work; I am also open to alternative solutions.
Here is a working example of what you want to achieve. The first mistake is the wrong JavaScript path on the client side; the right one is /socket.io/socket.io.js. The second mistake is the use of socket.volatile, which doesn't exist.
var app = require('http').createServer(handler)
, io = require('socket.io').listen(app)
, url = require('url')
console.log("creating a connection");
io.sockets.on( 'connection', function ( socket ) {
console.log("runing time");
sendTimeMessage(socket);
});
function sendTimeMessage(socket){
console.log("in time");
var now= new Date().getTime();
socket.emit('notification', {'message': now});
setTimeout(function() {
socket.emit('notification', {'message': "after 5s"});
},5000);
}
function handler (req, res) {
res.writeHead(200, {'Content-Type': 'text/html'});
res.end("<html><script src=\"/socket.io/socket.io.js\"></script> <!--socket.io--><script>io.connect().on('notification', function (data) {console.log(data)});</script></html>");
}
app.listen(8080);
OK, I partially solved it with huge help from the guys on IRC. I created:
1) an HTML page served by Apache on port 80
2) a live notification service that updates my HTML over port 8080
(There might still be code issues in the values coming from the functions, since it's not fully debugged.)
p1.html (client)
<!doctype html>
<html>
<head>
<meta charset="UTF-8">
</head>
<body>
<section id="content"></section>
<script src="/node_modules/socket.io-client/dist/socket.io.js"></script>
<script src="http://code.jquery.com/jquery-1.7.1.min.js"></script>
<script>
var socket = io.connect('http://10.10.10.1:8080');
socket.on('notification', function (from,msg) {
$('#content').append(msg.message + '<br>')
});
</script>
</body>
</html>
p1.js (service)
var io = require('socket.io').listen(8080)
console.log("creating a connection");
io.sockets.on( 'connection', function ( socket ) {
console.log("runing time");
var oldtime= new Date().getTime();
while (1){
var newtime= new Date().getTime();
if (newtime%5323==0 && newtime != oldtime){
oldtime = newtime;
console.log(newtime);
socket.emit( 'notification' , {'message': "the time is - " + newtime} );
}
}
});
enjoy
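One caveat on the service above: the while (1) loop never yields, so it blocks Node's event loop. A timer does the same job without spinning; a minimal sketch of the same idea using setInterval (assuming the same event name and message shape as above):
// Sketch: emit the current time every 5 seconds instead of busy-waiting.
io.sockets.on('connection', function (socket) {
  console.log('running time');
  var timer = setInterval(function () {
    var now = new Date().getTime();
    socket.emit('notification', { message: 'the time is - ' + now });
  }, 5000);
  socket.on('disconnect', function () {
    clearInterval(timer); // stop emitting once the client is gone
  });
});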
Thanks @yanger, I was helped by your code.
I wanted to add a comment, but I can't comment yet.
In my case, I want to make a real-time alarm, and I use a web server on port 80 and an alarm server on port 81.
So I just use this code (Client.js):
var socket = io.connect(':81');
It's totally working.
I hope someone reads this and finds it helpful.
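In case it helps, a minimal sketch of the kind of alarm server this client could connect to on port 81 (hypothetical; it assumes socket.io, as in the answers above, and the same 'notification' event):
// Sketch: a bare socket.io alarm server listening on port 81.
var io = require('socket.io').listen(81);

io.sockets.on('connection', function (socket) {
  console.log('client connected');
  // Example push: send an alarm to this client after 5 seconds.
  setTimeout(function () {
    socket.emit('notification', { message: 'alarm!' });
  }, 5000);
});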
I followed the info from this post: here to load the JS SDK into my page.
Here is the page: https://www.tkwk.be/client/babyboom/www/
It works great (the SSL connection is valid) until I use the SDK.
The problem is that when I try to use the function setAutoGrow() just before my closing </head> tag, I get an error.
<script type="text/javascript">
window.fbAsyncInit = function() {
FB.Canvas.setAutoGrow();
}
</script>
The page at about:blank displayed insecure content from
http://static.ak.facebook.com/connect/canvas_proxy.php?version=3#behavior=p&method=setSize&params=%7B%22height%22%3A892%2C%22width%22%3A1630%2C%22frame%22%3A%22iframe_canvas%22%7D.
However, I did load the JS SDK over HTTPS, like this:
<div id="fb-root"></div><script src="https://connect.facebook.net/en_US/all.js"></script>
<script>
FB._https = true;
FB._https = (window.location.protocol == "https:");
FB.init({
appId : 'XXXXXXXXXXX',
status : true, // check login status
cookie : true, // enable cookies to allow the server to access the session
xfbml : true // parse XFBML
});
</script>
I would like to understand where I made a mistake.
Thanks in advance for your time.