Ion-sfu Flutter SDK not publishing stream - webrtc

I am working on a WebRTC video platform using Ion-sfu, which has been a really great experience so far. I have the SFU up and running, the AVP to record sessions, and a browser client working beautifully with the JavaScript SDK.
Where I am stuck is getting the Flutter SDK to play ball. I am using the sample code from the GitHub repo here:
https://github.com/pion/ion-sdk-flutter
The Flutter client is connecting to the SFU; I can see the debug logs from the SFU:
[DEBUG] [57][subscriber.go][func1] => ice connection state: connected
I'm also receiving a stream in the Flutter app coming from my JavaScript client, which works flawlessly, so it appears the RTC connection between all the clients and the SFU is healthy.
I am rendering the local stream from ion.LocalStream.getUserMedia() in my Flutter app, so the stream also seems to be working; it's just not being published when I call await client.publish(localStream);
I am seeing this message in my debug logs:
I/flutter (12711): [ion-sdk-flutter] DEBUG: msg: {"id":0,"error":{"code":500,"message":"no rtc transport exists for this Peer","data":null},"jsonrpc":"2.0"}
I/flutter (12711): [ion-sdk-flutter] ERROR: error: code => 500, message => no rtc transport exists for this Peer
I am on the Flutter dev channel.
I have been hitting my head against this for some time now, so if anybody has experience with this SDK and could point me in the right direction, that would be a great help!
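For comparison, here is a rough sketch of the join-then-publish order with the JavaScript SDK (ion-sdk-js), based on its README example; exact signatures may differ between SDK versions. The error text suggests the SFU has not created transports for the peer yet, i.e. that the publish may be reaching it before a join has completed:

// Rough sketch based on the ion-sdk-js README; session/uid values are placeholders.
import { Client, LocalStream } from "ion-sdk-js";
import { IonSFUJSONRPCSignal } from "ion-sdk-js/lib/signal/json-rpc-impl";

const signal = new IonSFUJSONRPCSignal("wss://sfu.example.com/ws");
const client = new Client(signal);

signal.onopen = async () => {
  // join() is what sets up the publisher/subscriber transports on the SFU side,
  // so it needs to have completed before publish() is called.
  await client.join("test-session", "test-uid");

  const local = await LocalStream.getUserMedia({
    resolution: "vga",
    audio: true,
    codec: "vp8",
  });
  await client.publish(local);
};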

Related

React Native SignalWire application unable to establish ICE server peer connection

I am trying to implement SignalWire calling in a React Native app. I am facing an issue connecting to the ICE servers. Sometimes it connects and the call between two people succeeds, but most of the time it throws this error:
"Cannot set properties of undefined (setting 'onicecandidate')"
I have searched a lot but could not find a solution. Can you please guide me on how this issue can be resolved? I am using the following iceServers:
iceServers = [
{urls: ['stun:stun.l.google.com:19302','stun:stun1.l.google.com:19302','stun:stun2.l.google.com:19302']}
];
I have tried to find the ICE servers associated with my SignalWire account but could not find them. Please guide me on how to get ICE/TURN/STUN server URLs and credentials. I am using the Relay SDK for React Native.
It's not possible from this post to tell if you're using the SignalWire SDKs exclusively, if you're working from third party tools, or if you're doing your own setup. There are also a few variables in terms of how you've set up your app, what errors you're seeing from the SignalWire side, and the config of your ice servers.
With all that in mind, if you could reach out to SignalWire support (from your SignalWire space, select 'Help and Support' at the top right and then Submit a New Support Request) we can take a look at your setup and work through this with you.
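In the meantime, for reference, the "Cannot set properties of undefined (setting 'onicecandidate')" message usually means the object the handler is being assigned to was never created, which points at session setup failing before the ICE list even matters. The general shape WebRTC expects for iceServers, including a TURN entry, is along the lines of the sketch below; the TURN URL and credentials are placeholders, and the real values would come from your provider:

// Generic sketch; the TURN entry is hypothetical and its URL/credentials are placeholders.
const iceServers = [
  { urls: ["stun:stun.l.google.com:19302"] },
  {
    urls: ["turn:turn.example.com:3478?transport=udp"],
    username: "TURN_USERNAME",
    credential: "TURN_CREDENTIAL",
  },
];

// STUN only discovers public addresses; a TURN relay is usually what makes calls
// work on restrictive networks, so flaky connectivity often points to missing
// TURN credentials rather than to the STUN list itself.
const pc = new RTCPeerConnection({ iceServers });
pc.onicecandidate = (event) => console.log("candidate:", event.candidate);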

Unable to connect my Chromecast with my custom receiver

I'm following this guide: https://codelabs.developers.google.com/codelabs/cast-receiver#4 to create a custom receiver. Unfortunately, when I try to use the cactool for testing: https://casttool.appspot.com/cactool/ I always get an error regarding my device ID: "Failed to cast, please try again later." In the browser I get:
My application id: 8FEE03DD
Status: Published
Chromecast serial id: 8917AD6D304
Any idea?
The "is_device_registered" error is caused by the Cast Debug Logger calls which don't work to run in a browser (since it isn't running on the Chromecast device and thus can't find it).
After many iterations and testing I have noticed the following common causes for "Failed to cast. Please try again" while using the code labs setup:
Chromecast caches the application data, so if you for instance updates the Receiver App URL the device has to be restarted before the new URL is used
Ngrok session times out (solved by restarting ngrok and updating the Receiver App URL in the Cast Developer Console)
Observe if you see GET requests from the Chromecast in the http-server console (my CC gen 3 has 'Linux armv71' etc in User-Agent). If so, the Receiver App is configured correctly and your issue is with the code (HTML, js, etc)
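If you want to sanity-check the receiver page itself, a minimal CAF receiver along the lines of what the codelab starts from looks roughly like this (it assumes the Web Receiver SDK script is loaded on the page, which defines the global cast namespace):

// Assumes the page loads //www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js,
// which provides the global `cast` namespace; `declare` just keeps TypeScript happy.
declare const cast: any;

const context = cast.framework.CastReceiverContext.getInstance();

// start() is enough for a bare receiver: if the Chromecast fetches the page
// (the GET requests mentioned above) but nothing renders, the problem is in
// this code or the surrounding HTML rather than in device registration.
context.start();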

Error 2001 AUDIO_INPUT_LEVEL_TOO_LOW in Agora video Call

Everything was working fine, but after updating Agora to 3.1.2, once the remote user joined the video call, the call disconnected after a few seconds and I got this error in the log:
"type":"exception","code":2001,"msg":"AUDIO_INPUT_LEVEL_TOO_LOW"
Version info.
"ngx-agora": "2.0.1",
"agora-rtc-sdk": "3.1.2",
Angular 10.0.8
It is a known issue acknowledged by the developer; the team is working on fixing it, and it is an open bug on the Agora IO Community Repo here.
In the words of the developer:
How to reproduce
If you create and publish your microphone audio track without any user interaction, the remote user may not hear you. In this case, the console will print some logs like SEND_AUDIO_BITRATE_TO_LOW and AUDIO_INPUT_LEVEL_TOO_LOW.
And once you interact with the webpage, the remote user will hear you.
Root cause
The Agora Web SDK NG uses the AudioContext API to do some audio pre-processing by default. However, the AudioContext is restricted by the browser's autoplay policy. If the user has not interacted with your webpage, the AudioContext will not run, so no audio data is produced by the SDK's pre-processing module in this case.
How to avoid
We will fix this issue in v4.0.2, and it will be released next month.
For now, we recommend that you ensure the user has interacted with the webpage before the audio track is published. For example, require the user to click an accept or confirm button to start a call.
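A rough sketch of that recommendation with the 3.x Web SDK the question is using: the whole join/publish sequence is started from a click handler, so the browser's autoplay policy lets the SDK's AudioContext run. The app ID, token, channel name, and button ID below are placeholders.

// Sketch only: APP_ID, TOKEN, the channel name, and the button ID are placeholders.
import AgoraRTC from "agora-rtc-sdk";

const client = AgoraRTC.createClient({ mode: "rtc", codec: "h264" });

// Starting the call from an explicit user gesture (e.g. an "Accept" button)
// keeps the AudioContext unblocked, so the published audio is not silent.
document.getElementById("accept-call")!.addEventListener("click", () => {
  client.init("APP_ID", () => {
    client.join("TOKEN", "my-channel", null, () => {
      const localStream = AgoraRTC.createStream({ audio: true, video: true });
      localStream.init(() => client.publish(localStream));
    });
  });
});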

Fatal error in ../../webrtc/modules/utility/source/jvm_android.cc

We are facing an issue related to the Twilio Programmable Video SDK and AppRTC version 57 for Android, as we have integrated both into an existing Android application. You can have a look at the link below for our Gradle dependencies and Logcat crash logs.
Logcat crash logs -
E/rtc: #
# Fatal error in ../../webrtc/modules/utility/source/jvm_android.cc, line 233
# last system error: 88
# Check failed: !g_jvm
#
#
08-01 16:54:30.975 9534-9534/? A/libc: Fatal signal 6 (SIGABRT), code -6 in tid 9534
Twilio Programmable Video SDK
When we make a Twilio Programmable multi-party video call, the app crashes the first time; when we make the same Twilio multi-party call a second time, it connects, but the AppRTC P2P video call then crashes.
AppRTC
When we make an AppRTC P2P video call, the app crashes the first time; when we make the same AppRTC P2P call a second time, it connects, but the Twilio multi-party call then crashes.
We need both AppRTC and the Twilio Programmable Video SDK in our existing project.
Steps to reproduce
Perform an AppRTC P2P/Twilio video call.
When the video call is connected, the app crashes.
Perform a Twilio/AppRTC P2P video call.
When the video call is connected, the app crashes.
Thanks!
Twilio developer evangelist here.
I believe you've been in contact with Twilio support regarding this issue. I just thought I'd update this publicly too.
Currently, Twilio Video Android SDK is not side-by-side compatible with AppRTC. There is likely to be work in the future to make this possible, but for now it won't work.

react-native send crash report with stack trace to my server

I'm building a mobile app for both Android and iOS with React Native. Now I want to send crash reports with the stack trace plus the client's info to my server (via a RESTful API). How can I catch errors which I cannot handle (that make the app crash)?
I have checked Crashlytics, but it cannot send logs to my server.
I would recommend Crashlytics even though you already mentioned it. I have been using it for some time now and it works great. It is worth mentioning that you won't receive any logs if your app isn't built in release mode (on Android).
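If the goal is specifically to get crashes onto your own server, one option for the JavaScript side is a global error handler; the sketch below posts the stack trace to a placeholder endpoint. Note that this only catches JS exceptions, not native crashes, which is where Crashlytics or a similar tool is still needed.

// Sketch: the endpoint URL is a placeholder; only JavaScript errors are caught here.
import { Platform } from "react-native";

const ErrorUtils = (global as any).ErrorUtils;
const defaultHandler = ErrorUtils.getGlobalHandler();

ErrorUtils.setGlobalHandler(async (error: Error, isFatal?: boolean) => {
  try {
    await fetch("https://example.com/api/crash-reports", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        message: error.message,
        stack: error.stack,
        isFatal,
        platform: Platform.OS,
      }),
    });
  } catch (_) {
    // Don't let the report itself throw and hide the original crash.
  }
  // Hand back to the default handler (red screen in dev, crash in release).
  defaultHandler(error, isFatal);
});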