Cycle.js: Getting Stream, Expecting Blob

I'm trying to create an audio-recording stream. I'm creating a promise stream from navigator.mediaDevices.getUserMedia, then mapping that stream to a MediaRecorder stream. Finally, I want to create a blob stream from the MediaRecorder stream.
What I'm running into is that the blob variable in the subscribe function is a stream, not a blob.
What is the correct way to take the results from addEventListener and turn them into a stream of blobs?
const mediaSource$ = xs.fromPromise(
  navigator.mediaDevices.getUserMedia({ audio: true, video: false })
)

const mediaRecorder$ = mediaSource$
  .map(mediaSource => {
    const mediaRecorder = new window.MediaRecorder(
      mediaSource,
      { mimeType: 'audio/webm' }
    )
    return mediaRecorder
  })

const blob$ = mediaRecorder$
  .map(mediaRecorder =>
    xs.create({
      start: (listener) => {
        mediaRecorder.addEventListener('dataavailable', (e) => {
          console.log('Data Available', e.data)
          listener.next(e.data)
        })
      },
      stop: () => {}
    })
  )

xs.combine(action$, mediaRecorder$, blob$).subscribe({
  next: ([action, mediaRecorder, blob]: [any, any, any]) => {
    console.log('BLOB', blob) // getting a stream, not a blob
    if (action.key === 'start_recording') mediaRecorder.start()
    if (action.key === 'stop_recording') mediaRecorder.stop()
  }
})

Your approach is almost correct, including the xs.create, but if you map to a stream you now have a stream of streams of events. To get a normal stream of events out, just add .flatten() after the map.
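A minimal sketch of that fix, reusing the code from the question; the only change is the trailing .flatten(), which subscribes to each inner stream and re-emits its events:

// blob$ now emits Blob values instead of inner streams.
const blob$ = mediaRecorder$
  .map(mediaRecorder =>
    xs.create({
      start: (listener) => {
        mediaRecorder.addEventListener('dataavailable', (e) => {
          listener.next(e.data) // e.data is a Blob
        })
      },
      stop: () => {}
    })
  )
  .flatten() // unwrap the stream of streams

With that in place, the subscribe callback receives the actual Blob emitted by each dataavailable event.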

Related

webrtc getUserMedia: how to get a stream from getUserMedia and publish it to SRS?

How do I get a stream using HTML5 getUserMedia and publish it to SRS?
I want to get a stream directly from the browser, without using OBS or ffmpeg.
Is there any sample available?
Well, it depends on your usage scenario.
If you want to do live streaming, please see this post, the media flow:
Browser --WebRTC--> SRS --HLS/HTTP-FLV--> Viewer
If you want to do video meeting, please see this post, the media flow:
Browser <--WebRTC--> SRS <--WebRTC--> Viewer
Note that for a video meeting, there will be N×N streams in a room.
I have a solution. Check the code below.
HTML: here you only need the video tag.
Index.html
<video id="remoteScreen" autoplay="true"></video>
Screenshare.js file
const getLocalScreenCaptureStream = async () => {
  try {
    const constraints = { video: { cursor: 'always' }, audio: false };
    const screenCaptureStream = await navigator.mediaDevices.getDisplayMedia(constraints);
    return screenCaptureStream;
  } catch (error) {
    console.error('failed to get local screen', error);
  }
};
main.js
var localStreamScreen = null;

async function shareScreen() {
  localStreamScreen = await getLocalScreenCaptureStream();
  console.log("localStreamScreen: ", localStreamScreen);
}
screenshare.js
function handleRemoteStreamAddedScreen(event) {
  console.log('Remote stream added.');
  alert('Remote stream added.');
  if ('srcObject' in remoteScreen) {
    remoteScreen.srcObject = event.streams[0];
  } else {
    // deprecated
    remoteScreen.src = window.URL.createObjectURL(event.stream);
  }
  remoteScreenStream = event.stream;
}
Hope it works for you.
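For completeness, a hedged usage sketch (not from the answer) of how these pieces could be wired together for a local preview; it assumes the remoteScreen video element from Index.html is reused for the local capture:

// Hypothetical glue code: preview the captured screen locally.
const remoteScreen = document.getElementById('remoteScreen');

shareScreen().then(() => {
  if (localStreamScreen) {
    remoteScreen.srcObject = localStreamScreen; // show our own capture
  }
});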

How to play an audio file into Tone.Offline buffer output

Question
How do I play a local audio file inside Tone.Offline so that when it finishes and returns its promised buffer, the buffer contains the played audio?
Code
Tone.Offline(({ transport }) => {
  const p = new Tone.Player(src, () => {
    transport.start();
  }).toDestination();
  transport.schedule((time) => {
    p.start(time + 5).stop(time + 9); // this never happens!
  });
}, 10).then((buffer) => {
  // how do I push parts of my audio file onto the buffer?
  const wav = toWav(buffer);
  saveAs(new Blob([wav], { type: "audio/wav" }), "./blibli.wav");
});
Issue
I've noticed that the callback inside transport.schedule never runs, so I never get to build the final buffer.
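No answer was posted for this question, but one likely cause is that the offline rendering completes before the Player's buffer has finished loading, so the schedule fires against a player with no audio. A hedged sketch of that fix, assuming Tone.js v14+, where the Tone.Offline callback may be async and player.load() returns a promise:

// Hypothetical fix: await the sample before starting the transport,
// so the offline render actually contains the played audio.
Tone.Offline(async ({ transport }) => {
  const p = new Tone.Player().toDestination();
  await p.load(src);              // wait until the buffer is ready
  transport.schedule((time) => {
    p.start(time + 5).stop(time + 9);
  }, 0);                          // schedule at transport time 0
  transport.start();
}, 10).then((buffer) => {
  const wav = toWav(buffer);
  saveAs(new Blob([wav], { type: "audio/wav" }), "./blibli.wav");
});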

Closing WebRTC track will not close camera device or tab camera indicator

Banging my head against the wall with this one; I can't figure out what is holding on to the camera video stream and keeping the camera open after MediaStreamTrack.stop() is called.
I have a TypeScript class where I get the WebRTC stream track and pass it, via an observable event, to a functional React component. The code below shows the component registering for the event and keeping the stream track in state.
const [videoStreamTrack, setVideoStreamTrack] = useState<MediaStreamTrack>(
  null
)

useEffect(() => {
  return () => {
    videoStreamTrack?.stop()
    videoElement.current.srcObject.getVideoTracks().forEach((track) => {
      track.stop()
      videoElement.current.srcObject.removeTrack(track)
    })
    videoElement.current.srcObject = null
  }
}, [])
case RoomEvents.WebcamProducerAdded:
case RoomEvents.VideoStreamReplaced: {
  if (result.data?.track) {
    if (result.data.track.kind === 'video') {
      previewVideoStreamTrack?.stop()
      setPreviewVideoStreamTrack(null)
      setVideoStreamTrack(result.data.track)
    }
  }
  break
}
In the "Room" class I use the below code to grab the stream.
const videoDevice = this.webcam.device
if (!videoDevice) {
  throw new Error('no webcam devices')
}
const userMedia = await navigator.mediaDevices.getUserMedia({
  video: this.environmentPlatformService.isMobile
    ? true
    : {
        deviceId: { exact: this.webcam.device.deviceId },
        ...VIDEO_CONSTRAINS[this.webcam.resolution],
      },
})
const videoTrack = userMedia.getVideoTracks()[0]
this.eventSubject.next({
  eventName: RoomEvents.WebcamProducerAdded,
  data: {
    track: videoTrack,
  },
})
I am holding on to the this.webcam.device details using the code below.
async updateInputOutputMediaDevices(): Promise<MediaDeviceInfo[]> {
  await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true
  })
  const devices = await navigator.mediaDevices.enumerateDevices()
  await this.updateWebcams(devices)
  await this.updateAudioInputs(devices)
  await this.updateAudioOutputs(devices)
  return devices
}
private async updateWebcams(devices: MediaDeviceInfo[]) {
  this.webcams = new Map<string, MediaDeviceInfo>()
  for (const device of devices.filter((d) => d.kind === 'videoinput')) {
    this.webcams.set(device.deviceId, device)
  }
  const array = Array.from(this.webcams.values())
  this.eventSubject.next({
    eventName: RoomEvents.CanChangeWebcam,
    data: {
      canChangeWebcam: array.length > 1,
      mediaDevices: array,
    },
  })
}
Refreshing the page will close the camera and tab indicator.
useEffect(() => {
  return () => {
    videoStreamTrack?.stop()
    videoElement.current.srcObject.getVideoTracks().forEach((track) => {
      track.stop()
      videoElement.current.srcObject.removeTrack(track)
    })
    videoElement.current.srcObject = null
  }
}, [])
So here you are searching for and destroying video tracks. Seems right-ish; we'll see.
async updateInputOutputMediaDevices(): Promise<MediaDeviceInfo[]> {
  await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true
  })
  const devices = await navigator.mediaDevices.enumerateDevices()
  await this.updateWebcams(devices)
  await this.updateAudioInputs(devices)
  await this.updateAudioOutputs(devices)
  return devices
}
Above, I see there's a getUserMedia call that requests audio as well, which might be where the hiccup is. I can't examine this fully, but maybe you're opening both audio and video and closing only the video? Try looping through all the tracks, not just the video ones, and see what's there, as in the sketch below.
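A hedged sketch of that suggestion, reusing the question's videoElement ref; getTracks() returns audio and video tracks alike:

// Stop every track on the stream, not just the video ones.
const stream = videoElement.current.srcObject
stream.getTracks().forEach((track) => {
  console.log('stopping track', track.kind, track.label)
  track.stop()
  stream.removeTrack(track)
})
videoElement.current.srcObject = null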
@blanknamefornow's answer helped me nail the issue.
We were calling getUserMedia in multiple places, not only in the "room" class handling mediasoup actions but also for preview/device-selection/etc., and we never really closed the tracks we retrieved.
Sometimes those tracks are held in useState variables, and by the time the component unmounts they have already been nulled by React, so the cleanup can't reach them. The workaround is that, since the HTML elements are still referenced, you can stop the tracks through them when needed. I believe this was the missing ingredient when trying to figure it out.
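A hedged sketch (not from the thread) of another way around the nulled-state problem: mirror the track in a useRef, which React does not clear before the unmount cleanup runs. adoptTrack is a hypothetical helper name:

// Hypothetical pattern: keep the live track in a ref for cleanup.
const trackRef = useRef<MediaStreamTrack | null>(null)

const adoptTrack = (track: MediaStreamTrack) => {
  trackRef.current = track
  setVideoStreamTrack(track)
}

useEffect(() => {
  return () => {
    trackRef.current?.stop() // still reachable at unmount
    trackRef.current = null
  }
}, [])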

React Native: updating an array in Firestore

I am working on a mobile chat application to learn how to use cloud services, and I've been having some difficulty updating my array of maps without overwriting the whole array. I have tried multiple approaches, but I can only either get it to overwrite the whole array or do nothing. I was trying to follow Firebase's Node.js documentation for appending to an array, but I cannot get it to work. Any tips on what I am doing wrong? (In my code, db = firebase.firestore();.)
sendMessage = async e => {
  e.preventDefault();
  let date = new Date();
  const res2 = await db.collection('indivualChats').doc(this.state.chatID).update({
    messages: db.FieldValue.arrayUnion({
      mid: res.id,
      msg: this.state.message,
      timeSent: date.getDate() + "/" + date.getMonth() + "/" + date.getFullYear(),
      uid: auth.currentUser.uid
    })
  });
  this.setState({
    message: '',
  })
};
cloud data layout
From Doug's answer, you can't directly update a specific index of an array. The alternative is to read the entire document, modify the data in memory, and write the field back to the document; or, depending on the document, you can use arrayUnion() and arrayRemove() to add and remove elements.
My example:
Data structure:
Code in Node.js:
async function transaction(db) {
  const ref = db.collection('users').doc('john');
  try {
    await db.runTransaction(async (t) => {
      const doc = await t.get(ref);
      const newmessage = 'hello world!';
      t.update(ref, {
        message: admin.firestore.FieldValue.arrayUnion({
          text: newmessage
        })
      });
      t.update(ref, {
        message: admin.firestore.FieldValue.arrayRemove({
          text: doc.data().message[0].text
        })
      });
    });
    console.log('Transaction success!');
  } catch (e) {
    console.log('Transaction failure:', e);
  }
}

transaction(db);
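To illustrate the other alternative mentioned above, here is a hedged sketch (same Admin SDK setup, hypothetical function name) of the read-modify-write approach for a specific index. As a side note, in the asker's web v8 code, db.FieldValue.arrayUnion would need to be firebase.firestore.FieldValue.arrayUnion:

// Hedged sketch: read the whole array, edit one index in memory,
// then write the field back; Firestore has no direct index update.
async function editFirstMessage(db) {
  const ref = db.collection('users').doc('john');
  const snap = await ref.get();
  const messages = snap.data().message;    // the entire array
  messages[0].text = 'edited!';            // change a specific index
  await ref.update({ message: messages }); // overwrite the field
}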

Upload images from React Native to LoopBack

I need to upload a selection of images that the user picked from the CameraRoll to the LoopBack component storage. The component storage itself is working fine, because I can upload and download files through Postman. But when I try to upload from React Native to LoopBack, it always returns "No file content upload" with HTTP status 400.
I've read a lot of discussion about this and tried everything, and nothing worked for me.
First, I am taking the images from the CameraRoll and my images array looks like this:
[
  {
    exists: 1,
    file: "assets-library://asset/asset.JPG?id=3FF3C864-3A1A-4E55-9455-B56896DDBF1F&ext=JPG",
    isDirectory: 0,
    md5: "428c2e462a606131428ed4b45c695030",
    modificationTime: 1535592967.3309255,
    size: 153652,
    uri: null
  }
]
In the example above I just selected one image.
I transformed it to a Blob, and then I got:
[
  {
    _data: {
      blobId: "3FF3C864-3A1A-4E55-9455-B56896DDBF1F",
      name: "asset.JPG",
      offset: 0,
      size: 153652,
      type: "image/jpeg"
    }
  }
]
After that I tried a lot of things: sending the blob itself as the request body, appending it to a FormData and sending that, but no matter how I try it, I always get the "No file content upload" response.
I also tried the example from Facebook; it didn't work: https://github.com/facebook/react-native/blob/master/Libraries/Network/FormData.js#L28
The way I am trying now:
In my view:
finalizarCadastro = async () => {
  let formData = new FormData();
  let blobs = [];
  for (let i = 0; i < this.state.fotos.length; i++) {
    let response = await fetch(this.state.fotos[i]);
    let blob = await response.blob();
    blobs.push(blob);
  }
  formData.append("files", blobs);
  this.props.servico.criar(formData);
}
And the function that sends it to my server:
criar: (servico) => {
  this.setState({carregando: true});
  axios.post(`${REQUEST_URL}/arquivos/seila/upload`, servico, {
    headers: {'content-type': 'multipart/form-data'}
  }).then(() => {
    this.setState({carregando: false});
    this.props.alertWithType("success", "Sucesso", "Arquivo salvo com sucesso");
  }).catch(error => {
    this.setState({carregando: false});
    console.log(error.response);
    this.props.alertWithType("error", "Erro", error.response.data.error.message);
  })
}
I found the solution. The problem was actually not the code itself; the problem was sending multiple files at the same time. To fix everything, I did this:
this.state.fotos.forEach((foto, i) => {
  formData.append(`foto${i}`, {
    uri: foto,
    type: "image/jpg",
    name: "foto.jpg"
  });
})
this.props.servico.criar(formData);
And my function that sends the request to the server:
criar: (servico) => {
  this.setState({carregando: true});
  axios.post(`${REQUEST_URL}/arquivos/seila/upload`, servico).then((response) => {
    this.setState({carregando: false});
    this.props.alertWithType("success", "Sucesso", "Arquivo salvo com sucesso");
  }).catch(error => {
    this.setState({carregando: false});
    this.props.alertWithType("error", "Erro", error.response.data.error.message);
  })
},
So you don't need to set the Content-Type header to multipart/form-data, and you don't need to transform the images to blobs; you actually just need the uri of each one. I think the type and name attributes are optional.