I've been trying to debug an issue in a peer-to-peer audio streaming app I'm building.
Each time a mobile device using Safari receives a peer's incoming MediaStream, it plays the stream for a fraction of a second and then stops.
I've checked the actual MediaStream and the Audio object I'm using to play the incoming stream, but nothing in their status suggests that it's paused, inactive, or stopped.
I know there are some issues with Safari when it comes to programmatically playing Audio, but I thought that only applied to the initial playback of the client's first stream, and that all subsequent streams should then work.
I used the implementation from this article.
Anyone know of the reason for this?
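For context, the playback setup is roughly along these lines (a simplified sketch, not the exact code from the article; the handler and variable names are illustrative):

// Simplified sketch of how the incoming stream is attached for playback.
peerConnection.ontrack = (event) => {
  const remoteStream = event.streams[0];
  const audio = new Audio();
  audio.srcObject = remoteStream;   // attach the peer's incoming MediaStream
  audio.autoplay = true;
  audio.play().catch((err) => console.log('play() was rejected:', err));
};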
We are facing an issue with the Twilio Programmable Video SDK and AppRTC version 57 for Android. We have integrated both into an existing Android application. You can have a look at the link below for the Gradle dependencies and the logcat crash logs.
Logcat crash logs -
E/rtc: #
# Fatal error in ../../webrtc/modules/utility/source/jvm_android.cc, line 233
# last system error: 88
# Check failed: !g_jvm
#
#
08-01 16:54:30.975 9534-9534/? A/libc: Fatal signal 6 (SIGABRT), code -6 in tid 9534
Twilio Programmable Video SDK
When we make a Twilio Programmable multi-party video call, it crashes the first time. When we perform the same Twilio multi-party video call a second time, it connects, but then the AppRTC P2P video call crashes.
AppRTC
When we make an AppRTC P2P video call, it crashes the first time. When we perform the same AppRTC P2P video call a second time, it connects, but then the Twilio multi-party call crashes.
We need both AppRTC and the Twilio Programmable Video SDK in our existing project.
Steps to reproduce
Perform an AppRTC P2P/Twilio video call.
When the video call is connected, the app crashes.
Perform a Twilio/AppRTC P2P video call.
When the video call is connected, the app crashes.
Thanks!
Twilio developer evangelist here.
I believe you've been in contact with Twilio support regarding this issue. I just thought I'd update this publicly too.
Currently, the Twilio Video Android SDK is not side-by-side compatible with AppRTC. There is likely to be work in the future to make this possible, but for now it won't work.
I know how to access audio input devices via getUserMedia() and route them to the WebAudio API. This works fine for microphones and such. But in my use case, I'd like to hook into the audio stream of an output device instead. The use case is that I want to create a spectrum analyser for audio coming from a digital audio workstation (DAW) running on the same PC.
I tried enumerating the devices and calling getUserMedia() with the device id of an audio device, but the returned stream contained only silence. The only solution I've found so far is to install an audio loopback device (like Soundflower on Macs) to route the DAW's output to, and then use that as an input device for getUserMedia(). But this requires the user to install third-party software.
Is there any way to hook directly into the audio stream of an output device instead, before it is actually sent to the physical device (speaker or external soundcard)?
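For reference, the microphone path that already works is roughly this (a simplified sketch; someInputDeviceId is just a placeholder for an id from enumerateDevices()):

// Working input path: microphone -> Web Audio analyser (simplified sketch).
const audioCtx = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: { deviceId: someInputDeviceId } })
  .then((stream) => {
    const source = audioCtx.createMediaStreamSource(stream);
    const analyser = audioCtx.createAnalyser();
    analyser.fftSize = 2048;
    source.connect(analyser);

    // Read spectrum data from the analyser (e.g. on each animation frame).
    const bins = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(bins);
  });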
This can be achieved using the desktop capture APIs (chrome.desktopCapture.chooseDesktopMedia). An example for Chrome is included here.
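Roughly, the flow looks like this (a sketch from a Chrome extension context; the desktop-capture constraint format is Chrome-specific and may vary between versions, so treat it as an assumption rather than a guaranteed recipe):

// Sketch: ask the user to pick a capture source that includes system audio,
// then open it with getUserMedia using Chrome's desktop-capture constraints.
chrome.desktopCapture.chooseDesktopMedia(['screen', 'audio'], (streamId) => {
  navigator.mediaDevices.getUserMedia({
    audio: {
      mandatory: {
        chromeMediaSource: 'desktop',
        chromeMediaSourceId: streamId
      }
    },
    video: {
      mandatory: {
        chromeMediaSource: 'desktop',
        chromeMediaSourceId: streamId
      }
    }
  }).then((stream) => {
    // Feed only the captured audio into a Web Audio analyser.
    const audioCtx = new AudioContext();
    const source = audioCtx.createMediaStreamSource(stream);
    const analyser = audioCtx.createAnalyser();
    source.connect(analyser);
  });
});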
We are facing a strange issue in a WebRTC call: in a connected audio-only WebRTC call, when someone upgrades the call (adds video), the audio track drops on the originator's side.
Steps to reproduce the problem
1. Make an audio-only call between two peers, A and B.
2. Upgrade the call to video by calling getUserMedia again from peer A (roughly as in the sketch after these steps).
3. The call is established.
4. A can hear audio and see video.
5. B can't hear audio.
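The upgrade on peer A's side is done roughly like this (a simplified sketch of the approach, not the exact code; it uses the legacy stream-based API available in Chrome 51, and the variable names are illustrative):

// Peer A upgrades the existing audio-only call to audio + video.
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then((newStream) => {
    pc.removeStream(localAudioStream);  // drop the old audio-only stream
    pc.addStream(newStream);            // add the new audio + video stream
    return pc.createOffer();            // renegotiate
  })
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => {
    // send pc.localDescription to peer B over the signalling channel
  });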
What is the expected result?
In onaddstream(e), e.stream should contain both audio and video tracks.
What do you see instead?
Only the video track is present on B's side (the recipient).
What version of the product are you using? On what operating system?
Chrome 51 / Windows 7
Please find the WebRTC dump at the link below:
Webrtc dump
I've seen that kurento-room isn't able to manage a user who joins with only a microphone (no webcam).
The user actually appears in the room, with a black screen where the webcam video would normally be, but no audio is received from them either.
Why could this be happening?
That's an error in the connection of the WebRTC endpoints. The thing is that the endpoint only negotiates audio, but the connection was made with audio and video profiles, and the media server committed seppuku. It should be fixed by now, by providing the right media profiles in the connect method.
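For reference, the fix is along these lines (a sketch assuming the Kurento client's connect overload that accepts a MediaType; the endpoint names are illustrative):

// Connect only the audio profile when the participant publishes no video.
senderEndpoint.connect(receiverEndpoint, 'AUDIO', (error) => {
  if (error) {
    console.error('connect failed:', error);
  }
});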