How can I access a USB camera by device ID in Electron?

I have two cameras on my laptop (integrated and USB). When I list the devices and choose the USB camera, I don't get a capture from it.
I am passing the device ID parameter to getUserMedia like this:
Camera.js (a separate file):
const { ipcRenderer } = require('electron');

var video = document.querySelector('#videoElement');
let deviceId;

function stop(e) {
  var stream = video.srcObject;
  var tracks = stream.getTracks();
  for (var i = 0; i < tracks.length; i++) {
    var track = tracks[i];
    track.stop();
  }
  video.srcObject = null;
}
ipcRenderer.on('camera-id', (event, arg) => {
  //alert(arg);
  if (navigator.mediaDevices.getUserMedia) {
    const constraints = {
      video: {
        deviceId: {
          exact: arg,
        },
      },
    };
    navigator.mediaDevices
      .getUserMedia(constraints)
      .then(function (stream) {
        alert(stream);
        video.srcObject = stream;
      })
      .catch(function (err) {
        alert(err);
      });
  }
});
I get an "OverconstrainedError" if I use the exact parameter. If I pass the device ID directly, it works, but the capture always comes from the integrated camera; it never switches to the USB camera.
const constraints = {
  video: {
    deviceId: arg,
  },
};
Update:
The deviceId parameter has no effect. If I set width and height in the constraints, whichever camera supports that resolution opens automatically.
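One thing worth checking: enumerateDevices() only returns meaningful labels and stable device IDs once the page has camera permission, so the ID you forward over IPC may be stale. As a minimal sketch (pickCameraByLabel is a hypothetical helper, not part of any API), you could re-enumerate after permission is granted, pick the USB camera by label, and request it with an exact deviceId:

```javascript
// Hypothetical helper: choose a camera from enumerateDevices() output by a
// label keyword (e.g. "usb"). Labels are only populated after the page has
// been granted camera permission.
function pickCameraByLabel(devices, keyword) {
  const cams = devices.filter((d) => d.kind === 'videoinput');
  const match = cams.find((d) =>
    d.label.toLowerCase().includes(keyword.toLowerCase())
  );
  return match ? match.deviceId : null;
}

// Browser-side usage (not runnable outside a renderer):
// const devices = await navigator.mediaDevices.enumerateDevices();
// const id = pickCameraByLabel(devices, 'usb');
// const stream = await navigator.mediaDevices.getUserMedia({
//   video: { deviceId: { exact: id } },
// });
```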

Related

How to fix black video streaming on iOS - Kurento

I have a video conference app built with Node.js and Kurento.
When I connect my camera on desktop/Android, everything is OK.
When I connect the camera on an iPhone, I can see the shared video on the iPhone itself (locally) very well, but on the other (receiving) devices in the room the camera stays black and never shows up.
My client code:
const videoConstraints = {};
videoConstraints.facingMode = 'environment';
const constraints = {
  video: videoConstraints,
  audio: false
};

receiveMediaLocal(token, 'webcam', socket.id, constraints);
setTimeout(() => {
  iKurentoClient.sendMessage({ ...data, type: 'video', mode: 'webcam', id: 'connectMedia' });
}, 100);
function receiveMediaLocal(sender, mode, connId, constraints, mediaScreen = null) {
  var participant = new Participant(this, sender, mode, connId);
  if (!this.participants[sender]) this.participants[sender] = {};
  this.participants[sender][mode] = participant;
  var media = participant.getMediaElement();
  var options = {
    localVideo: media,
    mediaConstraints: constraints,
    onicecandidate: participant.onIceCandidate.bind(participant)
  };
  participant.rtcPeer = new kurentoUtils.WebRtcPeer.WebRtcPeerSendonly(options,
    function (error) {
      if (error) {
        this.socket.emit("error", error);
        return console.error(error);
      }
      this.generateOffer(participant.offerToReceiveMedia.bind(participant));
    }
  );
}
How can I fix this bug?
Thank you.
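A frequent cause of black remote video involving iPhones is iOS Safari's autoplay policy rather than Kurento itself: a remote <video> element will often render black unless it is muted and carries the playsinline attribute. A minimal sketch you could apply to the receiving element (the helper name is made up, not part of kurento-utils):

```javascript
// Hypothetical helper: make a <video> element acceptable to iOS Safari's
// autoplay policy before attaching a remote WebRTC stream to it.
function prepareVideoElementForIos(video) {
  video.muted = true;                       // autoplay requires muted on iOS
  video.autoplay = true;
  video.setAttribute('playsinline', '');    // prevent fullscreen takeover / black frame
  return video;
}

// Usage in the app above (browser-side, not runnable in Node):
// prepareVideoElementForIos(participant.getMediaElement());
```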

FCM Push notifications arrive twice (Nuxt with firebase)

I searched here for this problem and saw that many people have experienced it, but I still haven't found a solution.
FCM push notifications arrive twice if the browser is in the background.
Thanks for your help.
My Nuxt config for the Firebase services:
services: {
  auth: true, // Just as an example; can be any other service.
  messaging: {
    createServiceWorker: true,
    fcmPublicVapidKey: "###", // OPTIONAL: sets the VAPID key for FCM after initialization
    inject: fs.readFileSync("./serviceWorker.js")
  }
}
My service worker:
messaging.setBackgroundMessageHandler(function(payload) {
  console.log("[firebase-messaging-sw.js] Received background message");
  self.registration.hideNotification();
  return null;
});

self.addEventListener("push", function(e) {
  const data = e.data.json();
  const options = {
    tag: "notification-1",
    body: data.notification.body,
    vibrate: [100, 50, 100],
    data: {
      dateOfArrival: Date.now(),
      primaryKey: "1"
    }
  };
  self.registration.showNotification(data.notification.title, options);
});
self.addEventListener(
  "notificationclick",
  function(event) {
    console.log("test", event);
    event.notification.close();
    const url = "home";
    event.waitUntil(
      self.clients.matchAll({ type: "window" }).then(windowClients => {
        // Check if there is already a window/tab open with the target URL
        for (let i = 0; i < windowClients.length; i++) {
          const client = windowClients[i];
          // If so, just focus it.
          if (client.url === url && "focus" in client) {
            return client.focus();
          }
        }
        if (self.clients.openWindow) {
          console.log("open window");
        }
      })
    );
  },
  false
);
Add
self.registration.hideNotification();
just above the line
self.registration.showNotification(
This allows your app to hide the default notification, in which case you will only have one notification.
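Another common explanation for the doubled notification: when an FCM message includes a notification payload, the Firebase SDK displays it itself, and a custom push listener that also calls showNotification() produces a second copy. One workaround is to show a notification manually only for data-only messages. A sketch, assuming that setup (shouldShowManually is a hypothetical helper):

```javascript
// Hypothetical guard: let the Firebase SDK display messages that carry a
// `notification` block, and show a notification ourselves only for
// data-only messages, so each message is displayed exactly once.
function shouldShowManually(payload) {
  return !payload.notification && !!payload.data;
}

// In the service worker (not runnable in Node):
// self.addEventListener("push", (e) => {
//   const payload = e.data.json();
//   if (!shouldShowManually(payload)) return;
//   e.waitUntil(self.registration.showNotification(payload.data.title, {
//     body: payload.data.body,
//   }));
// });
```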

Closing WebRTC track will not close camera device or tab camera indicator

Banging my head against the wall with this one: I can't seem to understand what is holding on to the camera video stream and why it doesn't close when MediaStreamTrack.stop() is called.
I have a TypeScript class where I get the WebRTC stream track and pass it, via an observable event, to a functional React component. The code below is the component subscribing to the event and keeping the stream track in state.
const [videoStreamTrack, setVideoStreamTrack] = useState<MediaStreamTrack>(null)

useEffect(() => {
  return () => {
    videoStreamTrack?.stop()
    videoElement.current.srcObject.getVideoTracks().forEach((track) => {
      track.stop()
      videoElement.current.srcObject.removeTrack(track)
    })
    videoElement.current.srcObject = null
  }
}, [])
case RoomEvents.WebcamProducerAdded:
case RoomEvents.VideoStreamReplaced: {
  if (result.data?.track) {
    if (result.data.track.kind === 'video') {
      previewVideoStreamTrack?.stop()
      setPreviewVideoStreamTrack(null)
      setVideoStreamTrack(result.data.track)
    }
  }
  break
}
In the "Room" class I use the code below to grab the stream.
const videoDevice = this.webcam.device
if (!videoDevice) {
  throw new Error('no webcam devices')
}
const userMedia = await navigator.mediaDevices.getUserMedia({
  video: this.environmentPlatformService.isMobile
    ? true
    : {
        deviceId: {
          exact: this.webcam.device.deviceId
        },
        ...VIDEO_CONSTRAINS[this.webcam.resolution],
      },
})
const videoTrack = userMedia.getVideoTracks()[0]
this.eventSubject.next({
  eventName: RoomEvents.WebcamProducerAdded,
  data: {
    track: videoTrack,
  },
})
I hold on to the this.webcam.device details using the code below.
async updateInputOutputMediaDevices(): Promise<MediaDeviceInfo[]> {
  await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true
  })
  const devices = await navigator.mediaDevices.enumerateDevices()
  await this.updateWebcams(devices)
  await this.updateAudioInputs(devices)
  await this.updateAudioOutputs(devices)
  return devices
}

private async updateWebcams(devices: MediaDeviceInfo[]) {
  this.webcams = new Map<string, MediaDeviceInfo>()
  for (const device of devices.filter((d) => d.kind === 'videoinput')) {
    this.webcams.set(device.deviceId, device)
  }
  const array = Array.from(this.webcams.values())
  this.eventSubject.next({
    eventName: RoomEvents.CanChangeWebcam,
    data: {
      canChangeWebcam: array.length > 1,
      mediaDevices: array,
    },
  })
}
Refreshing the page will close the camera and tab indicator.
useEffect(() => {
  return () => {
    videoStreamTrack?.stop()
    videoElement.current.srcObject.getVideoTracks().forEach((track) => {
      track.stop()
      videoElement.current.srcObject.removeTrack(track)
    })
    videoElement.current.srcObject = null
  }
}, [])
So here you are searching for and destroying video tracks. Seems right-ish; we'll see.
async updateInputOutputMediaDevices(): Promise<MediaDeviceInfo[]> {
  await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true
  })
  const devices = await navigator.mediaDevices.enumerateDevices()
  await this.updateWebcams(devices)
  await this.updateAudioInputs(devices)
  await this.updateAudioOutputs(devices)
  return devices
}
Above I see there's a call for audio as well, which might be where the hiccup is. I can't examine it closely, but maybe you're opening both audio and video and closing just the video? Try looping through all the tracks, not just the video ones, and see what's there.
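The loop through all tracks suggested above could look like this (a sketch; the return format is arbitrary and only there to make what was open visible):

```javascript
// Stop every track on a MediaStream -- audio and video alike -- and report
// what was stopped, so nothing is left holding the camera or microphone open.
function stopAllTracks(stream) {
  const stopped = [];
  stream.getTracks().forEach((track) => {
    stopped.push(`${track.kind}:${track.label}`);
    track.stop();
  });
  return stopped;
}
```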
@blanknamefornow's answer helped me nail the issue.
We are calling getUserMedia in multiple places, not only in the "room" class handling mediasoup actions but also for preview/device selection/etc., and we never really closed the retrieved tracks.
Sometimes those tracks are held in useState variables, and by the time the component unmounts, if you try to access those variables they have already been nulled by React. The workaround: since the HTML elements are still referenced, stop the tracks through them when needed. I believe this was the missing ingredient when trying to figure it out.

React Native Expo Audio | Play live stream from latest position

I'm writing an audio player, using Expo Audio, for an app I'm making for an online radio.
The audio comes from an online live stream. I've successfully added the player and everything related to it; the one issue I'm having is that if I pause the audio, then on resume playback continues from where I paused rather than from the live position, and I need to pause and play again to get it to catch up to what's currently being broadcast.
I play it with playAsync(), and I've tried pausing with pauseAsync(), stopAsync(), and setStatusAsync({ shouldPlay: false, positionMillis: 0 }).
Any tips on how I can get it to work the way it should?
Here's the code for the audio player. It's a class from which I create a single instance so it can be managed from different places in the app:
class audioPlayer {
  static instance = null;

  static createInstance() {
    var object = new audioPlayer();
    return object;
  }

  _radioStream;

  /**
   * @returns {audioPlayer}
   */
  static getInstance() {
    if (audioPlayer.instance == null) {
      audioPlayer.instance = audioPlayer.createInstance();
    }
    return audioPlayer.instance;
  }

  // Call this first to create a new audio element
  createAudio() {
    this._radioStream = new Audio.Sound();
  }

  async loadAudioAsync() {
    try {
      await this._radioStream.loadAsync(
        { uri: "radio stream" },
      );
      store.dispatch(setLiveState(true));
      this.toggleAudio(); // Autoplay at start
      return true;
    } catch (error) {
      if (error.code === "E_LOAD_ERROR") {
        // In the case of an error we try to load again
        setTimeout(this.loadAudioAsync, 10000);
        throw new Error(error.code);
      } else {
        throw new Error(error);
      }
    }
  }

  async unloadAudioAsync() {
    await this._radioStream.unloadAsync();
  }

  async getStatusAsync() {
    return await this._radioStream.getStatusAsync();
  }

  async toggleAudio() {
    // We're gonna play or pause depending on the status
    let { isLoaded, isPlaying } = await this._radioStream.getStatusAsync();
    // If the user presses play and the stream connection has been lost,
    // we try to load it again
    if (!isLoaded) {
      let res = await this.loadAudioAsync(); // Try to load the audio again
      if (res) this.toggleAudio(); // Retrigger the toggle to start playing
    }
    if (isLoaded && !isPlaying) {
      store.dispatch(setPlayingStatus(true));
      await this._radioStream.playAsync();
    } else if (isLoaded && isPlaying) {
      store.dispatch(setPlayingStatus(false));
      await this._radioStream.setStatusAsync({ shouldPlay: false, positionMillis: 0 });
    }
  }
}
I just had the same exact problem (for my internet radio https://notylus.fr).
It seems that I found a solution: instead of using
playbackInstance.pauseAsync()
I now use
playbackInstance.stopAsync()
AND for the play part, I add:
await playbackInstance.playAsync() // play stream
playbackInstance.setPositionAsync(0) // ensure that you're at position 0
Regards,
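The stop-then-seek approach in the answer above can be summarized as a small decision helper: given the player status, pick the next action for a live-stream toggle. This is a sketch with made-up names, not part of the Expo AV API:

```javascript
// Hypothetical state helper for a live-stream play/pause toggle, following
// the answer above: never pauseAsync() a live stream; stop it instead, and
// on resume, play and seek to position 0 so playback rejoins the live edge.
function nextLiveStreamAction(status) {
  if (!status.isLoaded) return 'reload';   // connection lost: loadAsync() again
  if (status.isPlaying) return 'stop';     // stopAsync(), not pauseAsync()
  return 'play-from-live';                 // playAsync() + setPositionAsync(0)
}
```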

react native ble plx - Bluetooth is not supported in ios

When I start a device scan by calling startDeviceScan(), most of the time a "Bluetooth is not supported" alert comes up. This happens only on iOS, and I am unable to figure out why. I tested the code on actual devices: iPhone 5s, iPhone 6s, and an iPad.
Below is the code snippet:
let that = this;
let deviceList = [];
let count = 0;

that.manager.startDeviceScan(["1802"], null, (error, device) => {
  if (error) {
    try {
      alert(`${error.message}`);
      Log.info(`BLE scan error: ${error.message}`);
    } catch (e) {
      Log.info(`BLE scan error: ${error}`);
    }
    return;
  }
  let localDeviceList = deviceList;
  let exists = localDeviceList.some((d) => d.id === device.id);
  if (!exists) {
    deviceList.push(device);
    this.homeScreen.setState({ listBLEDevices: deviceList });
    let arr = this.homeScreen.state.availableDevices;
    arr.push({ id: device.id, name: device.name, connected: false });
    this.homeScreen.setState({ availableDevices: arr });
  }
});
};
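As a side note, the duplicate check in the scan callback scans the whole array on every advertisement; a Map keyed by device id does the same job in constant time. A sketch (names are illustrative):

```javascript
// De-duplicate BLE scan results with a Map keyed by device id: returns true
// only the first time a given device is seen, so UI state is updated once.
function addDiscoveredDevice(deviceMap, device) {
  if (deviceMap.has(device.id)) return false; // already seen, skip
  deviceMap.set(device.id, device);
  return true;
}

// In the scan callback:
// if (addDiscoveredDevice(seenDevices, device)) {
//   this.homeScreen.setState({ listBLEDevices: [...seenDevices.values()] });
// }
```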