Muting all sound for a streaming video in ActionScript 2

So I'm trying to mute a streamed video. For now I'm working with one in the same directory.
var flvURL = 'flvInThisDirectory.flv';
var netConn:NetConnection = new NetConnection();
netConn.connect(null);
var netStream:NetStream = new NetStream(netConn);
my_video2.attachVideo(netStream);
netStream.setBufferTime(0);
netStream.play(flvURL);
// All of the below are different, supposedly valid attempts to mute sound for
// this stream; none of them work.
var nsst = new SoundTransform(0); // SoundTransform is an ActionScript 3 class; it doesn't exist in AS2
nsst.volume = 0;
netStream.soundTransform = nsst; // NetStream.soundTransform is likewise AS3-only
netStream.fromSrvr.attachAudio(false); // fromSrvr isn't a NetStream property; in AS2, attachAudio is a MovieClip method
var so = new Sound(netStream); // the AS2 Sound constructor expects a MovieClip target, not a NetStream
so.setVolume(0);
stopAllSounds();
But there's still sound. This is the same if the URL is remote, too. Any ideas?

Whoops. The culprit was this line:
var so = new Sound(netStream);
Change it to:
var so = new Sound();
In AS2 the Sound constructor expects a MovieClip target, not a NetStream, so the original call controlled nothing; with no argument the Sound object controls the whole SWF's audio, and so.setVolume(0) then mutes the stream. Not even gonna give myself a point for this one.

Related

Cannot control audio volume in Safari when using createMediaElementSource()

I have an HTML audio player and I use it with createMediaElementSource() to make a visualizer, but I can't change its volume in Safari. When I set the volume to 0 it logs correctly (0) in the console, but the actual volume stays at 1. It works in other browsers.
const AudioContext = window.AudioContext ||
window.webkitAudioContext
var audio = new AudioContext()
var $audioPlayer = document.querySelector('.player-element__player')
var analyser = audio.createAnalyser()
analyser.fftSize = 32
analyser.connect(audio.destination)
var source = audio.createMediaElementSource($audioPlayer)
source.connect(analyser)
$audioPlayer.volume = 0
Seems like the solution for now is to make a GainNode and control the volume from there; see the sketch below.
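A minimal sketch of that workaround, reusing the same $audioPlayer element as above; the idea is that Safari honours the gain node's value even where it ignores the element's volume property:
const AudioContext = window.AudioContext || window.webkitAudioContext
var audio = new AudioContext()
var $audioPlayer = document.querySelector('.player-element__player')
var analyser = audio.createAnalyser()
analyser.fftSize = 32
var gainNode = audio.createGain()
var source = audio.createMediaElementSource($audioPlayer)
// element -> analyser -> gain -> speakers (instead of analyser -> destination)
source.connect(analyser)
analyser.connect(gainNode)
gainNode.connect(audio.destination)
// control volume through the gain node rather than the element
gainNode.gain.value = 0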

Firefox WebAudio API

I can't process audio data via createAnalyser: when a remote audio stream is connected to the analyser, the sound gets muted or there is no playback. Is this ever going to get fixed? It's kind of frustrating.
You can see a complete example with remote audio and createAnalyser here:
http://git.io/vegtp
var audioData = ajaxRequest.response;
var audioCtx = new (window.AudioContext)();
audioCtx.decodeAudioData(audioData ...
var analyser = audioCtx.createAnalyser();
With stream source: http://git.io/vegmk
var audioCtx = new AudioContext();
audioCtx.createMediaStreamSource(stream);
var analyser = audioCtx.createAnalyser();
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createAnalyser
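One thing worth double-checking, offered as an assumption since the full graph isn't shown: a MediaStreamAudioSourceNode is only audible when the graph is connected through to audioCtx.destination, so an analyser left dangling yields silence unless the stream is also played elsewhere:
var audioCtx = new AudioContext();
var source = audioCtx.createMediaStreamSource(stream);
var analyser = audioCtx.createAnalyser();
source.connect(analyser);
// without this last connection the analysed audio never reaches the speakers
analyser.connect(audioCtx.destination);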

createMediaElementSource plays but getByteFrequencyData returns all 0's

I am attempting to visualize audio coming out of an element on a webpage. The source for that element is a WebRTC stream connecting to an Asterisk call via sip.js. The audio works as intended.
However, when I attempt to get the frequency data using the Web Audio API, it returns an array of all 0's, even though the audio is working. This seems to be a problem with createMediaElementSource. If I call getUserMedia and use createMediaStreamSource to connect my microphone to the input, I indeed get the frequency data returned.
This was attempted in both Chrome 40.0 and Firefox 31.4. In my search I found similar errors with Android Chrome, but my versions of desktop Chrome and Firefox seem like they should be functioning correctly. So far I have a feeling that the error may be due to the audio player getting its audio from another AudioContext in sip.js, or something having to do with CORS. All of the demos that I have tried work correctly, but they only use createMediaStreamSource to get mic audio, or use createMediaElementSource to play a file (rather than streaming to an element).
My Code:
var context = new (window.AudioContext || window.webkitAudioContext)();
var analyser = context.createAnalyser();
analyser.fftSize = 64;
analyser.minDecibels = -90;
analyser.maxDecibels = -10;
analyser.smoothingTimeConstant = 0.85;
var frequencyData = new Uint8Array(analyser.frequencyBinCount);
var visualisation = $("#visualisation");
var barSpacingPercent = 100 / analyser.frequencyBinCount;
for (var i = 0; i < analyser.frequencyBinCount; i++) {
    $("<div/>").css("left", i * barSpacingPercent + "%").appendTo(visualisation);
}
var bars = $("#visualisation > div");
function update() {
    window.requestAnimationFrame(update);
    analyser.getByteFrequencyData(frequencyData);
    bars.each(function (index, bar) {
        bar.style.height = frequencyData[index] + 'px';
        console.debug(frequencyData[index]);
    });
}
$("audio").bind('canplay', function() {
    var source = context.createMediaElementSource(this);
    source.connect(analyser);
    update();
});
Any help is greatly appreciated.
Chrome doesn't support WebAudio processing of RTCPeerConnection output streams (remote streams); see this question. Their bug is here.
Edit: this is now supported as of Chrome 50.
For Firefox, see the test code about to land as part of Bug 1081819. That bug adds WebAudio input to RTCPeerConnections in Firefox; WebAudio processing of output MediaStreams has worked there for some time. The test code exercises both sides; note that it depends heavily on the test framework, so use it only as a guide to hooking into WebAudio.
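In the meantime, a hedged sketch of the usual approach once the browser allows it: analyse the remote MediaStream directly with createMediaStreamSource instead of going through the <audio> element. Here pc (the RTCPeerConnection) is an assumed variable; context, analyser and update come from the question's code:
pc.onaddstream = function (event) {
    // feed the remote WebRTC stream straight into the analyser graph
    var source = context.createMediaStreamSource(event.stream);
    source.connect(analyser);
    update();
};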

Alter a stream received via a WebRTC PeerConnection with AudioNodes of the Web Audio API

A complementary question to Send MediaStream object with Web Audio effects over PeerConnection.
So far I've tried connecting the remote stream to AudioContext.destination:
var remoteSource = context.createMediaStreamSource(webRTCStream);
var delayFilter = context.createDelay(10.0);
var destination = context.destination;
remoteSource.disconnect(destination);
remoteSource.connect(delayFilter);
delayFilter.connect(destination);
I've also tried connecting to a MediaStreamAudioDestinationNode and then to context.destination:
var destination = context.createMediaStreamDestination();
delayFilter.connect(destination);
destination.connect(context.destination);
I've even tried to intercept the track from the HTML element itself (which works perfectly for a regular HTML5 <audio> tag):
myAudio.src = window.URL.createObjectURL(webRtcStream);
var remoteSource = context.createMediaElementSource(myAudio);
var delayFilter = context.createDelay(10.0);
var destination = context.createMediaStreamDestination();
remoteSource.disconnect(context.destination);
remoteSource.connect(delayFilter);
delayFilter.connect(destination);
destination.connect(context.destination);
However, none of them works in either Chrome or Firefox... Am I missing something, or is it simply not supposed to work / not implemented?
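Two things may be at play here, offered as hedged guesses rather than a confirmed fix. As the previous answer notes, Chrome before version 50 couldn't feed remote RTCPeerConnection streams into Web Audio at all. Separately, createDelay(10.0) only sets the maximum delay; the delayTime parameter defaults to 0, so even a working graph would sound unchanged. A minimal sketch, assuming a browser where createMediaStreamSource works on remote streams (e.g. Firefox):
var context = new AudioContext();
var remoteSource = context.createMediaStreamSource(webRTCStream);
var delayFilter = context.createDelay(10.0); // 10.0 is only the maximum delay
delayFilter.delayTime.value = 5.0;           // the actual delay; it defaults to 0
remoteSource.connect(delayFilter);
delayFilter.connect(context.destination);
// Note: don't also play webRTCStream through an <audio>/<video> element,
// or the dry, undelayed signal will be heard alongside the delayed one.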

How to change the resolution of camera while recording video in WP8

I am using the video recording sample provided by Microsoft here. I want to change the resolution of the video being recorded in my app; currently it records at the highest resolution by default. How do I do that?
videoCaptureDevice.DesiredFormat = new VideoFormat(PixelFormatType.Unknown, 480, 640, 30);
The above statement is throwing Argument Exception.
Also, if possible, let me know how to capture from the front camera. Please help.
The second parameter of AudioVideoCaptureDevice.OpenAsync is the resolution, and you can get the supported resolutions using AudioVideoCaptureDevice.GetAvailableCaptureResolutions(sensor). For the front camera, pass CameraSensorLocation.Front as the sensor.
You may try this:
private AudioVideoCaptureDevice VideoRecordingDevice;
private Windows.Foundation.Size resolution = new Windows.Foundation.Size(320, 240);
VideoRecordingDevice = await AudioVideoCaptureDevice.OpenAsync(CameraSensorLocation.Back, resolution);
NB: Remember that AudioVideoCaptureDevice is only available on WP8 or later.
The solution, as far as I know:
VideoCaptureDevice webcam = CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice();
int videoformatcount = webcam.SupportedFormats.Count(); // get the available video formats
if (videoformatcount > 0)
{
    var Temp = webcam.SupportedFormats;
    VideoFormat objVideoFormat = Temp[videoformatcount - 1];
    webcam.DesiredFormat = new VideoFormat(PixelFormatType.Format8bppGrayscale, objVideoFormat.PixelWidth, objVideoFormat.PixelHeight, 1);
}
captureSource.VideoCaptureDevice = webcam;
This will produce the lowest-resolution video.
Alternatively, use AudioVideoCaptureDevice to record video:
StorageFolder isoStore = await ApplicationData.Current.LocalFolder.GetFolderAsync("Shared");
var file = await isoStore.CreateFileAsync("foos1.wmv", CreationCollisionOption.ReplaceExisting);
using (var s = await file.OpenAsync(FileAccessMode.ReadWrite))
{
    Windows.Foundation.Size resolution = new Windows.Foundation.Size(640, 480);
    // Last() picks the last advertised capture resolution; pass `resolution`
    // as the second argument instead to record at a specific size.
    avDevice = await AudioVideoCaptureDevice.OpenAsync(CameraSensorLocation.Back,
        AudioVideoCaptureDevice.GetAvailableCaptureResolutions(CameraSensorLocation.Back).Last());
    VideoBrush videoRecorderBrush = new VideoBrush();
    videoRecorderBrush.SetSource(avDevice);
    viewfinderRectangle.Fill = videoRecorderBrush;
    await avDevice.StartRecordingToStreamAsync(s);
    Thread.Sleep(30000); // record for 30 seconds (blocks the thread; demo only)
    await avDevice.StopRecordingAsync();
}
new MediaPlayerLauncher()
{
    Media = new Uri(file.Path, UriKind.Relative),
}.Show();