WebRTC getUserMedia: how to get a stream from getUserMedia and publish it to SRS? - simple-realtime-server

How do I get a stream using the HTML5 getUserMedia API and publish it to SRS?
I want to get the stream directly from the browser, without using OBS or ffmpeg.
Is there any sample available?

Well, it depends on your use scenario.
If you want to do live streaming, please see this post; the media flow is:
Browser --WebRTC--> SRS --HLS/HTTP-FLV--> Viewer
If you want to do a video meeting, please see this post; the media flow is:
Browser <--WebRTC--> SRS <--WebRTC--> Viewer
Note that for a video meeting there will be NxN streams in a room.
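For the live-streaming case, the browser side is roughly: capture a stream with getUserMedia, add its tracks to an RTCPeerConnection, then exchange the SDP offer/answer with SRS over its WebRTC HTTP API. Below is a minimal sketch only; the /rtc/v1/publish/ path, port 1985, and the streamurl value are assumptions based on a typical SRS setup, so check them against your SRS version (newer releases also support WHIP).
// Sketch: publish a camera/mic stream from the browser to SRS.
// Assumption: SRS is reachable at https://your-srs-host:1985 and exposes /rtc/v1/publish/.
async function publishToSRS() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  const pc = new RTCPeerConnection();
  // Send our camera/mic tracks over the connection.
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // Hand the offer to SRS and apply its answer (the exact request/response shape may differ per SRS version).
  const res = await fetch('https://your-srs-host:1985/rtc/v1/publish/', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      streamurl: 'webrtc://your-srs-host/live/livestream',
      sdp: offer.sdp,
    }),
  });
  const { sdp: answerSdp } = await res.json();
  await pc.setRemoteDescription({ type: 'answer', sdp: answerSdp });
}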

I have a solution.
Check the code below.
HTML: here you only need a video tag.
Index.html
<video id="remoteScreen" autoplay="true"></video>
Screenshare.js file
const getLocalScreenCaptureStream = async () => {
  try {
    const constraints = { video: { cursor: 'always' }, audio: false };
    return await navigator.mediaDevices.getDisplayMedia(constraints);
  } catch (error) {
    console.error('failed to get local screen', error);
  }
};
main.js
var localStreamScreen = null;
async function shareScreen() {
  localStreamScreen = await getLocalScreenCaptureStream();
  console.log("localStreamScreen: ", localStreamScreen);
}
screenshare.js
function handleRemoteStreamAddedScreen(event) {
console.log('Remote stream added.');
alert('Remote stream added.');
if ('srcObject' in remoteScreen) {
remoteScreen.srcObject = event.streams[0];
} else {
// deprecated
remoteScreen.src = window.URL.createObjectURL(event.stream);
}
remoteScreenStream = event.stream;
}
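Note that handleRemoteStreamAddedScreen only runs if it is wired up to a peer connection's track event somewhere; a minimal sketch of that wiring (assuming a single RTCPeerConnection named pc is created elsewhere for the session):
// Assumption: pc is the RTCPeerConnection used for this session (created elsewhere).
pc.addEventListener('track', handleRemoteStreamAddedScreen);
// And to actually send the captured screen, add its tracks to the same connection:
localStreamScreen.getTracks().forEach((track) => pc.addTrack(track, localStreamScreen));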
Hope it works for you.

Related

Webcam not working with JavaScript on some machines

I'm building an Electron application with Vue.js that uses a webcam. The webcam works within the Electron application on one computer, but it just shows a black screen on another. The only notable difference (I think) is that the machine where it works (Machine A) uses Node v14.15.0 and the machine where it doesn't work (Machine B) uses v12.18.4.
I have tested the webcam on Machine B separately. It works via the native camera app on Windows and on this online tool. The interesting thing is that both the integrated and external webcams fail to work. As soon as I start the stream the light comes on, but that's it. It seems that the promise from .getUserMedia() isn't resolving (see code snippet), but I can't identify why.
How can I get the webcam to stream?
let mediaInputs = [];
let devices = [];
if (!navigator.mediaDevices || !navigator.mediaDevices.enumerateDevices) {
console.log("enumerateDevices() not supported.");
return;
}
mediaInputs = await navigator.mediaDevices.enumerateDevices();
devices = await mediaInputs.filter(
(device) => device.kind === "videoinput"
);
//Stop any existing streams
if (this.video.srcObject !== undefined) {
const l = this.video.srcObject;
l.getTracks().forEach((track) => track.stop());
}
const sourceInfo = this.videoSources.find((o) => o.label === this.source);
const constraints = {
video: {
deviceId: sourceInfo.deviceId,
width: 1280,
height: 720,
},
audio: false,
};
try {
console.log('This line is logged');
//This is where I start the stream.
const stream = await navigator.mediaDevices.getUserMedia(constraints);
console.log('This line is never reached');
this.video = this.$refs.video;
this.video.srcObject = stream;
this.video.play();
} catch (error) {
this.showSnackbar(error);
console.error(error);
}
I just had to update to the latest version of Electron (11.1.1 at the time of writing) for it to work.
However, if you're still having trouble, there's a GitHub thread on the topic that's still active.

React Native Expo Audio | Play live stream from latest position

I'm writing an audio player, using Expo Audio, for an app I'm making for an online radio.
The audio comes from an online live stream. I've successfully added the player and everything related to it; however, the one issue I'm having is that if I pause the audio and then resume it, playback continues from where I paused rather than from the live position, and I have to pause and play again to get it to jump to what's currently being broadcast.
I play it with playAsync(), and I've tried pausing with pauseAsync(), stopAsync(), and setStatusAsync({ shouldPlay: false, positionMillis: 0 }).
Any tips on how I can get it to work the way it should?
Here's the code I have for the audio player. It's a class I create an instance of so I can manage it from different places in the app:
class audioPlayer {
static instance = null;
static createInstance() {
var object = new audioPlayer();
return object;
}
_radioStream;
/**
* @returns {audioPlayer}
*/
static getInstance() {
if (audioPlayer.instance == null) {
audioPlayer.instance = audioPlayer.createInstance();
}
return audioPlayer.instance;
}
// Call this first to create a new audio element
createAudio() {
this._radioStream = new Audio.Sound();
};
async loadAudioAsync() {
try {
await this._radioStream.loadAsync(
{ uri: "radio straem"},
);
store.dispatch(setLiveState(true));
this.toggleAudio(); // Autoplay at start
return true;
} catch (error) {
if (error.code === "E_LOAD_ERROR") {
// In the case of an error we try to load again
setTimeout(this.loadAudioAsync, 10000);
throw new Error(error.code);
} else {
throw new Error(error);
};
};
};
async unloadAudioAsync() {
await this._radioStream.unloadAsync();
};
async getStatusAsync() {
return await this._radioStream.getStatusAsync();
};
async toggleAudio() {
// We're gonna play or pause depending on the status
let { isLoaded, isPlaying } = await this._radioStream.getStatusAsync();
// If the user presses the audio and the stream connection has been lost or something
// we try to load it again
if (!isLoaded) {
let res = await this.loadAudioAsync(); // Try to loadAudio again
if (res) this.toggleAudio(); // Retrigger the toggle to start playing
}
if (isLoaded && !isPlaying) {
store.dispatch(setPlayingStatus(true));
await this._radioStream.playAsync();
} else if (isLoaded && isPlaying) {
store.dispatch(setPlayingStatus(false));
await this._radioStream.setStatusAsync({ shouldPlay: false, positionMillis: 0 });
};
};
};
I just had the exact same problem (for my internet radio https://notylus.fr).
It seems that I found a solution: instead of using
playbackInstance.pauseAsync()
I now use
playbackInstance.stopAsync()
AND for the play part, I add
await playbackInstance.playAsync() // play stream
playbackInstance.setPositionAsync(0) // ensure that you're at position 0
Regards,
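Applied to the toggleAudio method from the question above, that change would look roughly like this (a sketch only, reusing the question's store actions, not tested against a real stream):
async toggleAudio() {
  const { isLoaded, isPlaying } = await this._radioStream.getStatusAsync();
  if (isLoaded && !isPlaying) {
    store.dispatch(setPlayingStatus(true));
    await this._radioStream.playAsync();
    await this._radioStream.setPositionAsync(0); // per the answer above, reset position right after starting
  } else if (isLoaded && isPlaying) {
    store.dispatch(setPlayingStatus(false));
    await this._radioStream.stopAsync(); // stop instead of pause so resuming reconnects to the live stream
  }
}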

stream is undefined when using navigator.getUserMedia

I am using WebRTC and trying to show the video after obtaining permission from getUserMedia().
Here is what I am trying to do:
var mediaConstraints = { audio: true, video: true };
const stream = await navigator.getUserMedia(mediaConstraints, function() {
console.log("obtained successfully");
}, function() {
console.error("access was denied OR hardware issue");
});
However, stream is undefined; it should have a value of some kind.
navigator.getUserMedia is deprecated.
Try this instead:
navigator.mediaDevices.getUserMedia()
navigator.getUserMedia is the legacy variant of getUserMedia.
It uses callbacks and does not return a promise.
You're mixing styles: either use callbacks, or use navigator.mediaDevices.getUserMedia without callbacks.
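For reference, the promise-based form would look something like this (a minimal sketch; the "#localVideo" element id is just an illustrative assumption):
const mediaConstraints = { audio: true, video: true };
async function startCamera() {
  try {
    // navigator.mediaDevices.getUserMedia returns a promise, so await works here
    const stream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
    console.log("obtained successfully");
    document.querySelector("#localVideo").srcObject = stream; // hypothetical video element
  } catch (err) {
    console.error("access was denied OR hardware issue", err);
  }
}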

Google Speech To Text API in React Native

I'm doing a project with React Native that has the following features:
1. Users start recording audio
2. Stop recording audio
3. Save audio
4. Translate it to text (In my case translate to Vietnamese)
I'm done with 1, 2, and 3, but I'm stuck at 4: I don't know how to use the Google STT API with an audio file as input, because it seems like STT is normally used via an intent.
Any ideas or related solutions would be appreciated!
Here's my code:
requestAPI() {
// Imports the Google Cloud client library
const Speech = require('@google-cloud/speech')({
projectId: 'speech-to-text-175801',
keyFilename: '/keyfile.json'
});
const RNFS = require('react-native-fs');
// Your Google Cloud Platform project ID
const projectId = 'speech-to-text-175801';
// Instantiates a client
const speechClient = Speech({
projectId: projectId,
keyFilename: '/keyfile.json'
});
// The name of the audio file to transcribe
const fileName = this.state.audioPath;
// Reads a local audio file and converts it to base64
const file = RNFS.readFile(fileName);
const audioBytes = file.toString('base64');
// The audio file's encoding, sample rate in hertz, and BCP-47 language code
const audio = {
content: audioBytes
};
const config = {
encoding: 'aac',
sampleRateHertz: 32000,
languageCode: 'vi-VN'
};
const request = {
audio: audio,
config: config
};
// Detects speech in the audio file
speechClient.recognize(request)
.then((results) => {
const transcription = results[0].results[0].alternatives[0].transcript;
this.setState({
textReceived: transcription
})
})
.catch((err) => {
this.setState({
textReceived: 'Có lỗi Google Cloud STT, mời bạn request lại'
})
});
}
The error now is: 'Unable to resolve module "child_process" from ...\google-auth-library\lib\auth\googleauth.js'
I think the problem is that you can't use the Google Speech SDK in React Native; it's a Node.js library. So you need to first send your file to a Node.js server and then call the SDK methods there.
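On the Node.js side, the recognition call could look roughly like this (a sketch using the @google-cloud/speech client; the encoding and sampleRateHertz values are placeholders and must match the format your app actually records in):
// Server-side sketch; assumes the audio file has already been uploaded to the server.
const speech = require('@google-cloud/speech');
const fs = require('fs');
async function transcribe(filePath) {
  const client = new speech.SpeechClient({ keyFilename: '/keyfile.json' });
  const audioBytes = fs.readFileSync(filePath).toString('base64');
  const request = {
    audio: { content: audioBytes },
    config: { encoding: 'LINEAR16', sampleRateHertz: 16000, languageCode: 'vi-VN' },
  };
  const [response] = await client.recognize(request);
  return response.results.map((r) => r.alternatives[0].transcript).join('\n');
}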
You can use react-native-speech-to-text-android to perform speech to text conversion.
import SpeechToText from 'react-native-google-speech-to-text';
const speechToTextHandler = async () => {
let result = null;
try {
result = await SpeechToText.startSpeech('Try saying something', 'en_IN');
console.log('Result: ', result);
} catch (error) {
console.log('error: ', error);
}
}

How to upload file to server using react-native

I am developing an app where I need to upload an image to the server. Based on the image, I get a response which I need to render.
Can you please help me with how to upload an image using React Native?
There is file uploading built into React Native.
Example from React Native code:
var photo = {
uri: uriFromCameraRoll,
type: 'image/jpeg',
name: 'photo.jpg',
};
var body = new FormData();
body.append('authToken', 'secret');
body.append('photo', photo);
body.append('title', 'A beautiful photo!');
var xhr = new XMLHttpRequest();
xhr.open('POST', serverURL);
xhr.send(body);
My solution is using fetch API and FormData.
Tested on Android.
const file = {
uri, // e.g. 'file:///path/to/file/image123.jpg'
name, // e.g. 'image123.jpg',
type // e.g. 'image/jpg'
}
const body = new FormData()
body.append('file', file)
fetch(url, {
method: 'POST',
body
})
I wrote something like that. Check out https://github.com/kamilkp/react-native-file-transfer
I have been struggling to upload images recently in React Native; the images didn't seem to get uploaded. This was actually because I was using react-native-debugger with network inspect turned on while sending the requests. As soon as I switched network inspect off, the requests were successful and the files uploaded.
I am using the example from the answer above and it works for me.
This article on GitHub about the limitations of the network inspect feature may clear things up for you.
Just to build on the answer by Dev1, this is a good way to upload files from React Native if you also want to show upload progress. It's pure JS, so this would actually work in any JavaScript environment.
(Note that in step #6 you have to replace the variables inside the strings with your file's type and file ending. That said, you could just take those fields out.)
Here's a gist I made on Github: https://gist.github.com/nandorojo/c641c176a053a9ab43462c6da1553a1b
1. for uploading one file:
// 1. initialize request
const xhr = new XMLHttpRequest();
// 2. open request
xhr.open('POST', uploadUrl);
// 3. set up callback for request
xhr.onload = () => {
const response = JSON.parse(xhr.response);
console.log(response);
// ... do something with the successful response
};
// 4. catch for request error
xhr.onerror = e => {
console.log(e, 'upload failed');
};
// 5. catch for request timeout
xhr.ontimeout = e => {
console.log(e, 'cloudinary timeout');
};
// 6. create formData to upload
const formData = new FormData();
formData.append('file', {
uri: 'some-file-path', // this is the path to your file. see Expo ImagePicker or React Native ImagePicker
type: `${type}/${fileEnding}`, // example: image/jpg
name: `upload.${fileEnding}` // example: upload.jpg
});
// 7. send the request
xhr.send(formData);
// 8. track upload progress
if (xhr.upload) {
// track the upload progress
xhr.upload.onprogress = ({ total, loaded }) => {
const uploadProgress = (loaded / total);
console.log(uploadProgress);
};
}
2. uploading multiple files
Assuming you have an array of files you want to upload, you'd just change step #6 from the code above to look like this:
// 6. create formData to upload
const arrayOfFilesToUpload = [
// ...
];
const formData = new FormData();
arrayOfFilesToUpload.forEach(file => {
formData.append('file', {
uri: file.uri, // this is the path to your file. see Expo ImagePicker or React Native ImagePicker
type: `${type}/${fileEnding}`, // example: image/jpg
name: `upload.${fileEnding}` // example: upload.jpg
});
})
In my opinion, the best way to send a file to the server is to use the react-native-fs package, so install it
with the following command:
npm install react-native-fs
Then create a file called file.service.js with content like the following:
import { uploadFiles } from "react-native-fs";
export async function sendFileToServer(files) {
return uploadFiles({
toUrl: `http://xxx/YOUR_URL`,
files: files,
method: "POST",
headers: { Accept: "application/json" },
begin: () => {
// console.log('File Uploading Started...')
},
progress: ({ totalBytesSent, totalBytesExpectedToSend }) => {
// console.log({ totalBytesSent, totalBytesExpectedToSend })
},
})
.promise.then(({ body }) => {
// Response Here...
// const data = JSON.parse(body); => You can access to body here....
})
.catch(_ => {
// console.log('Error')
})
}
NOTE: do not forget to change the URL.
NOTE: You can use this service to send any file to the server.
Then call that service like the following:
var files = [{ name: "xx", filename:"xx", filepath: "xx", filetype: "xx" }];
await sendFileToServer(files)
NOTE: each object must have name, filename, filepath, and filetype.
A couple of potential alternatives are available. Firstly, you could use the XHR polyfill:
http://facebook.github.io/react-native/docs/network.html
Secondly, just ask the question: how would I upload a file in Obj-C? Answer that and then you could just implement a native module to call it from JavaScript.
There's some further discussion on all of this on this Github issue.
Tom's answer didn't work for me, so I implemented a native FilePickerModule which helps me choose the file, and then I use remobile's react-native-file-transfer package to upload it. FilePickerModule returns the path of the selected file (fileURL), which react-native-file-transfer uses to upload it.
Here's the code:
var FileTransfer = require('@remobile/react-native-file-transfer');
var FilePickerModule = NativeModules.FilePickerModule;
var that = this;
var fileTransfer = new FileTransfer();
FilePickerModule.chooseFile()
.then(function(fileURL){
var options = {};
options.fileKey = 'file';
options.fileName = fileURL.substr(fileURL.lastIndexOf('/')+1);
options.mimeType = 'text/plain';
var headers = {
'X-XSRF-TOKEN':that.state.token
};
options.headers = headers;
var url = "Set the URL here" ;
fileTransfer.upload(fileURL, encodeURI(url),(result)=>
{
console.log(result);
}, (error)=>{
console.log(error);
}, options);
})
Upload files: using the expo-image-picker npm module you can pick any image or video on the device. Files on the device are accessed with the launchImageLibraryAsync method, and the selected media can then be uploaded.
import * as ImagePicker from "expo-image-picker";
const loadFile = async () => {
  const result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.All,
    aspect: [4, 3],
  });
  return result;
};
// Render the button somewhere in your component:
// <Button title="Pick an image from camera roll" onPress={loadFile} />
The above code is used to access the files on a device.
You can also use the camera to capture an image or video to upload, by calling
launchCameraAsync with its media type options set to videos or photos.
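To actually send the picked file to a server, you can combine this with the FormData approach from the earlier answers; here's a rough sketch (the upload URL and the 'photo' field name are placeholders):
// Sketch: pick an image and upload it; https://example.com/upload and 'photo' are assumptions.
const pickAndUpload = async () => {
  const result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.Images,
  });
  if (result.canceled) return; // newer expo-image-picker versions expose `canceled` and `assets`
  const asset = result.assets ? result.assets[0] : result; // older versions return the uri directly
  const body = new FormData();
  body.append('photo', { uri: asset.uri, name: 'upload.jpg', type: 'image/jpeg' });
  await fetch('https://example.com/upload', { method: 'POST', body });
};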