No video in WebRTC call on smartphone (Android) via Asterisk

I have built a WebRTC system based on Asterisk and sipml5. I can make audio calls on my Android smartphone, but when I enable video, the caller gets the callee's video for about 5 seconds, and the callee gets no video at all. Are there any additional settings needed in Asterisk?
sip.conf:
[2004]
type=friend
defaultuser=2004
username=2004
host=dynamic
secret=pass
encryption=yes
avpf=yes
icesupport=yes
context=rtc-01-dev.demo.net
directmedia=no
transport=udp,ws,wss
force_avp=yes
dtlsenable=yes
dtlsverify=no
dtlscertfile=/etc/asterisk/keys/asterisk.pem
dtlsprivatekey=/etc/asterisk/keys/asterisk.pem
dtlssetup=actpass
allow=vp8,h264
nat=yes
[2005]
type=friend
defaultuser=2005
username=2005
host=dynamic
secret=pass
encryption=yes
avpf=yes
icesupport=yes
context=rtc-01-dev.demo.net
directmedia=no
transport=udp,ws,wss
force_avp=yes
dtlsenable=yes
dtlsverify=no
dtlscertfile=/etc/asterisk/keys/asterisk.pem
dtlsprivatekey=/etc/asterisk/keys/asterisk.pem
dtlssetup=actpass
allow=vp8,h264
nat=yes
extensions.conf:
[rtc-01-dev.demo.net]
exten => _200Z,1,Dial(SIP/${EXTEN},30)
exten => _200Z,2,Congestion
exten => _200Z,102,Busy

Asterisk does nothing with your video; it just forwards the packets it receives.
There are no MCU features in Asterisk.
You should ensure that the video codec AND resolution are the SAME in all your apps. A quick sanity check is sketched below.
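As a minimal, hedged sketch (640x480 is just an example figure, and handing the stream to sipml5 is outside this snippet), you can run this in the browser console on both the caller's and the callee's device to confirm each side can actually capture the same resolution before blaming Asterisk:
navigator.mediaDevices.getUserMedia({
    video: { width: { exact: 640 }, height: { exact: 480 } }
}).then(stream => {
    // report the resolution the browser actually settled on
    const s = stream.getVideoTracks()[0].getSettings();
    console.log("capturing at", s.width, "x", s.height);
    stream.getTracks().forEach(t => t.stop()); // release the camera
}).catch(err => console.error("this device cannot capture 640x480:", err));
It can also help to pin both peers to a single codec in sip.conf (for example disallow=all followed by allow=vp8), so the endpoints cannot negotiate different codecs.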

Related

Is it safe to send a file to the server?

I'm creating a React Native app that transforms audio to text.
First, the user records the audio. Then the recording is sent with RNFS.uploadFiles to a Flask API, which I created to convert the audio file into text and send the text back to the user.
Honestly, I'm not sure how this really works. For example, I don't see the audio files that were sent from React Native to the Flask server. They are not saved on my server (or are they?)
Should I encrypt the recordings before they are sent?
I send audio with this function:
RNFS.uploadFiles({
    toUrl: uploadUrl,
    files: fileToSend,
    method: 'POST',
    headers: {
        'Accept': 'application/json',
    },
}).promise.then((response) => {
    setResults(response.body);
}).catch((err) => {
    console.log(err);
});
To see the audio files sent from your application to the Flask server, you can use an HTTP debugging tool such as Charles Proxy or Fiddler. It will show you the HTTP requests being exchanged.
If your Flask server has SSL enabled (HTTPS) and your React Native app connects over HTTPS, then the communication is already encrypted in transit.
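If you want a belt-and-braces check in the app itself, here is a minimal sketch (uploadUrl is the variable from the question) that refuses to send a recording over plain HTTP:
// assumption: recordings should never leave the device unencrypted
if (!uploadUrl.startsWith('https://')) {
    throw new Error('uploadUrl must use https so the recording is encrypted in transit');
}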

Postman WebSocket API: "connected", but no messages or acknowledgement from the server

I cannot get Postman to connect to the server using WebSockets.
// socket.io server setup (inferred; the question does not show how io is created)
const { Server } = require("socket.io");
const port = 5001;
const io = new Server(port);

io.on("connection", (socket) => {
    console.log("Client connected", socket.id);
    socket.emit("handshake", "connected to backend");
    socket.on("test", (data) => {
        console.log("test data is:", data);
        socket.emit("test", "server heard you!");
    });
});
In Postman, the request address is:
ws://localhost:5001/socket.io/?transport=websocket
The symptoms: Postman says it's connected, but if I try to send anything, it disconnects after a timeout.
If I set the reconnection attempts to 1, it automatically reconnects after it disconnects...
But I don't think it's actually connecting, because nothing happens on the server (no "Client connected" message).
I have also experimented with the format of the messages, to no avail:
42["test","i hear you"]
42[test,i hear you]
["test":"i hear you"]
{"test":"I hear you"}
42{"test":"I hear you"}
{"event":"test","data":"I hear you"}
42{"event":"test","data":"I hear you"}
42["event","test","data","I hear you"]
["event","test","data","I hear you"]
I have inspected the console output and have not found any leads there yet. What could I be missing?
You are using Socket.IO as if it were plain WebSocket, and that does not work because Socket.IO is not an implementation of WebSocket.
From the official Socket.IO documentation:
Socket.IO is NOT a WebSocket implementation. Although Socket.IO indeed uses WebSocket as a transport when possible, it adds additional metadata to each packet. That is why a WebSocket client will not be able to successfully connect to a Socket.IO server, and a Socket.IO client will not be able to connect to a plain WebSocket server either.
// WARNING: the client will NOT be able to connect!
const socket = io("ws://echo.websocket.org");
Source: https://socket.io/docs/v4#What-Socket-IO-is-not
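You can observe this yourself with a raw WebSocket client. A minimal sketch using the ws package (the EIO=4 query parameter assumes a Socket.IO v4 server): the Engine.IO open packet arrives, but the server never logs a connection because the Socket.IO-level handshake never happens, and the socket is dropped after the ping timeout, which matches the symptoms described:
const WebSocket = require("ws");
const ws = new WebSocket("ws://localhost:5001/socket.io/?EIO=4&transport=websocket");
// prints the Engine.IO open packet, e.g. 0{"sid":"..."}; the server's
// "connection" handler never fires for this raw client
ws.on("message", (data) => console.log("raw frame:", data.toString()));
ws.on("close", (code) => console.log("dropped after ping timeout, code", code));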
Postman v8.10.0 added support for Socket.IO.
Just enter ws://localhost:5001 as the connection URL and hit Connect.
You can also configure the client version (default: v3), the handshake path (default: /socket.io), and other reconnection options in the request settings.
That is because you did not add a listener. Add a listener for "handshake" in Postman and you will receive the message.
This is my code:
io.on('connection', () => {
    console.log('user connected');
    setInterval(() => {
        io.emit('msg', { data: [1, 2, 3] });
    }, 5000);
});
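For completeness, here is a minimal Node client sketch (using the socket.io-client package; the URL and event names are taken from the question) that exercises the same server outside of Postman:
const { io } = require("socket.io-client");
const socket = io("ws://localhost:5001");

socket.on("connect", () => socket.emit("test", "i hear you")); // fires once the Socket.IO handshake succeeds
socket.on("handshake", (msg) => console.log("handshake:", msg));
socket.on("test", (msg) => console.log("server replied:", msg));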

How to send a video note in a Telegram bot?

Can anybody help me send a video note in a Telegram bot?
In fact, my problem is that the video note is not sent as a circle; it is sent just like an ordinary video.
I followed all the necessary points for posting the video. I uploaded a file that is:
MP4 format
Less than a minute long
Square
And the code I've used:
The main function:
define('API_KEY', 'Token');

function bot($method, $datas = []) {
    $url = "https://api.telegram.org/bot" . API_KEY . "/" . $method;
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($datas));
    $res = curl_exec($ch);
    if (curl_error($ch)) {
        var_dump(curl_error($ch));
    } else {
        return json_decode($res);
    }
}
Sending the video note:
bot("sendVideoNote", [
    "chat_id" => $chat_id,
    "video_note" => $video_file_id,
]);
In place of the $video_file_id variable (a file_id), I also tried the direct URL of the file, but I did not get any result in the bot.
Thanks for your help...
As stated in the Telegram Bot API documentation:
Sending video notes by a URL is currently unsupported.
This causes video notes that are passed by URL to be displayed as normal videos.
However, you can upload the file directly to create a real video note. Using CURLFile, it works as follows. Note that for the file upload to succeed, the bot() helper must pass the array to CURLOPT_POSTFIELDS as-is (without http_build_query), so that cURL sends it as multipart/form-data:
$path = "path/to/video.mp4";
$realpath = realpath($path);
bot("sendVideoNote", [
    "chat_id" => $chat_id,
    "video_note" => new CURLFile($realpath),
]);

Why is the WebRTC remote video source generated by URL.createObjectURL?

In this document, URL.createObjectURL is used to set the video source. (This is the code for answering a call.)
var offer = getOfferFromFriend();
navigator.getUserMedia({video: true}, function(stream) {
    pc.onaddstream = e => video.src = URL.createObjectURL(e.stream);
    pc.addStream(stream);
    pc.setRemoteDescription(new RTCSessionDescription(offer), function() {
        pc.createAnswer(function(answer) {
            pc.setLocalDescription(answer, function() {
                // send the answer to a server to be forwarded back to the caller (you)
            }, error);
        }, error);
    }, error);
});
I expected video.src to be an address from which the remote video is retrieved, so it should be fixed and provided by the other side of the connection (whoever initiated the call). But the value of URL.createObjectURL is generated on the answerer's side, and it even depends on when the function is called. How can it be used to get the remote video stream?
Edit:
The result of URL.createObjectURL looks like blob:http://some.site.com/xxxx-the-token-xxxx. With this string, how does the video element know where to load the remote stream from? Is there a hashmap of {url: stream} stored somewhere? If so, how does the video element access that hashmap?
A stream object does store a token string, which you can get with stream.toURL. But it is different from the result of URL.createObjectURL. The value of URL.createObjectURL depends on time: if you call it twice in a row, you get different values.
URL.createObjectURL(stream) is a hack. Stop using it. Efforts are underway to remove it.
Use video.srcObject = stream directly instead. It is standard and well-implemented.
This assignment of a local resource should never have been a URL in the first place, and it is a red herring when it comes to understanding how WebRTC works.
WebRTC is a transmission API; it sends data directly from one peer to another. No content URLs are involved. The remote stream you get from onaddstream is a local object on the receiver's side, and it is the live streaming result of the transmission, ready to be played.
The documentation you read is old and outdated. Thanks for pointing it out, I'll fix it. It has other problems: you should call setRemoteDescription immediately, not wait for the receiver to share their camera, otherwise incoming candidates are missed. Instead of the code you show, do this:
pc.onaddstream = e => video.srcObject = e.stream;

function getOfferFromFriend(offer) {
    return pc.setRemoteDescription(new RTCSessionDescription(offer))
        .then(() => navigator.mediaDevices.getUserMedia({video: true}))
        .then(stream => {
            pc.addStream(stream);
            return pc.createAnswer();
        })
        .then(answer => pc.setLocalDescription(answer))
        .then(() => {
            // send the answer to a server to be forwarded back to the caller (you)
        })
        .catch(error);
}
It uses srcObject, avoids the deprecated callback API, and won't cause intermittent ICE failures.
Because a WebRTC connection involves several steps, and what you get from such a connection is a stream. But the src property of the video tag does not accept a stream, only a URL, and this is a way to "convert" a stream into a URL.
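To answer the "hashmap" question directly: conceptually, yes. URL.createObjectURL registers the object in a browser-internal map keyed by a freshly minted blob: URL, and the video element resolves src through that map. A rough sketch of the idea (an illustration only, not how any browser actually implements it):
// conceptual sketch of the registry behind blob: URLs
const registry = new Map();
function createObjectURL(obj) {
    // a fresh token is minted on every call, which is why calling it
    // twice on the same stream yields two different URLs
    const url = "blob:" + location.origin + "/" + crypto.randomUUID();
    registry.set(url, obj);
    return url;
}
The URL is only meaningful inside the browsing context that created it, which is also why it cannot be sent to the other peer.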

Open SCO Audio Connection with IOBluetoothAddSCOAudioDevice

I am trying to emulate a hands-free device from Mac OS X.
I advertise my SDP service correctly as Handsfree.
I can pair my Android phone with my computer, which is seen as a "HandsFree" device.
I can send a couple of AT commands (AT+BRSF, CIND, CMER) to establish a service-level connection.
I then install an audio driver to route all incoming/outgoing sound to/from the device.
Unfortunately, it seems that the SCO channel is never opened. My Android phone still emits sound (e.g., when pressing the dial pad) through its own speaker.
In the sound preferences of Mac OS X, I can see my device as an input/output device, but the sound level never changes.
Here is my code:
- (void)rfcommChannelData:(IOBluetoothRFCOMMChannel *)rfcommChannel data:(void *)dataPointer length:(size_t)dataLength
{
    // ... STATE MACHINE GOES HERE and SENDS AT COMMANDS ...
    // ... AT+CMER OK RECEIVED ...
    ret = IOBluetoothAddSCOAudioDevice([[rfcommChannel getDevice] getDeviceRef], NULL);
    if (ret != kIOReturnSuccess) {
        IOBluetoothRemoveSCOAudioDevice([[rfcommChannel getDevice] getDeviceRef]);
        NSLog(@"%@", @"Deleting previously installed audio device");
        ret = IOBluetoothAddSCOAudioDevice([[rfcommChannel getDevice] getDeviceRef], NULL);
        if (ret != kIOReturnSuccess) {
            NSLog(@"%@", @"Can't install Audio Driver");
        }
    }
}
Any idea why the audio driver is installed and reported by the system, but an SCO connection can't be opened?