react-native-video: How to manually select video quality - react-native

I am building a video streaming app using HLS from S3, and I want to let users select the video quality.
I am still unable to find how to select the desired quality.
Can anyone please help me with this issue?
If someone knows a react-native API or some other solution, please help.
Thanks

It is possible to select a video quality manually. According to the react-native-video documentation, use the selectedVideoTrack prop.
See https://github.com/react-native-community/react-native-video#selectedvideotrack
Example:
selectedVideoTrack={{
  type: "resolution",
  value: 1080
}}

Add these lines in react-native.config.js:
module.exports = {
  dependencies: {
    'react-native-video': {
      platforms: {
        android: {
          sourceDir: '../node_modules/react-native-video/android-exoplayer',
        },
      },
    },
  },
  assets: ['./src/assets/fonts/'],
};
Remove assets: ['./src/assets/fonts/'] if you don't have a fonts directory.
Then, in the Video component, select the video track like this:
selectedVideoTrack={{
  type: "resolution",
  value: 1080
}}
This solution was tested only on Android devices.
Original answer: https://stackoverflow.com/a/71148981/4291272

If you have multiple quality renditions in your HLS video, you could use hls.js for the playback component. Normally it will switch quality for you automatically, but you can control this manually with the Quality Switch API.
That is, you would get the available qualities using hls.levels and iterate through them, then set hls.currentLevel to the desired quality.

Related

Change sound in expo.sendPushNotificationsAsync

I want to play a different sound when a notification arrives in my Expo app from a Node server. I can't find any sound other than 'default'. Is there any option I can use to play a sound other than the default one on notifications in Expo? My Node.js code is below:
const receipts = expo.sendPushNotificationsAsync([
  {
    to: userObj.pushToken,
    sound: 'default',
    body: notification,
    data: { withSome: notification },
    priority: 'high',
  },
]);
I looked up the Message Format section of the docs and found this:
A sound to play when the recipient receives this notification. Specify
"default" to play the device's default notification sound, or omit this
field to play no sound.
Note that on apps that target Android 8.0+ (if using `expo build`, built
in June 2018 or later), this setting will have no effect on Android.
Instead, use `channelId` and a channel with the desired setting.
sound?: 'default' | null,
Looks like it is not possible to change the sound through the message alone.

Using a 4k Logitech webcam with WebRTC/getUserMedia

I have a 4k Logitech Brio webcam and I can pull live video from it using WebRTC/getUserMedia. Sadly, only in HD 1920x1080. Is there any way to use the 4k capabilities of the camera in the browser/Electron app?
I'm working on a single instance media installation, so cross browser support is not an issue. I'm targeting towards whatever webkit electron-builder will package.
Thanks!
getUserMedia can be very... peculiar currently in most browsers, electron included.
First, make sure you are using your constraints correctly. To get 4k you should be trying something similar to this:
{
  audio: false,
  video: {
    width: { exact: 3840 },
    height: { exact: 2160 }
  }
}
Then, if that works, tone down the constraints from there so that other non-UHD webcams also work. Make sure you read up on the constraints and what is possible here, and always include the WebRTC adapter.js; even in the latest version of Electron it is still needed (mainly for conversion of error names to the proper "standard" ones).
Most likely you will end up with a constraints setup similar to this:
{
  audio: false,
  video: {
    width: {
      min: 1280,
      ideal: 3840,
      max: 3840
    },
    height: {
      min: 720,
      ideal: 2160,
      max: 2160
    }
  }
}
That will make the browser attempt to get a 4k resolution, but then will step down to a minimum of 720p if needed.
Also, if you want to check if your browser/camera supports UHD correctly, you can always try this website which will run a test to get which resolutions getUserMedia supports on your system.
And finally, make sure you are choosing the right camera. Many new devices include multiple environment-facing cameras, and if you don't define the deviceId you want to use, the user agent will pick for you, and it often chooses poorly (for example, a Kyocera phone I recently worked with used a wide-angle lens by default unless told otherwise, and that lens didn't support any "normal" resolutions, making it fall back to a very low resolution and a very strange aspect ratio).
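The deviceId selection and the stepped-down constraints can be combined as below. A sketch assuming a browser/Electron context; the `open4kCamera` helper and the 'BRIO' label match are illustrative, not part of any API:

```javascript
// Sketch: pick a specific camera by label, then request 4k with fallback to 720p.
// Assumes navigator.mediaDevices is available (browser or Electron renderer).
async function open4kCamera(preferredLabel) {
  const devices = await navigator.mediaDevices.enumerateDevices();
  // Labels are only populated after the user has granted camera permission
  const cam = devices.find(
    (d) => d.kind === 'videoinput' && d.label.includes(preferredLabel)
  );
  return navigator.mediaDevices.getUserMedia({
    audio: false,
    video: {
      deviceId: cam ? { exact: cam.deviceId } : undefined,
      width: { min: 1280, ideal: 3840, max: 3840 },
      height: { min: 720, ideal: 2160, max: 2160 },
    },
  });
}

// Usage: open4kCamera('BRIO').then((stream) => { videoEl.srcObject = stream; });
```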

Rally SDK 2.0 displaying debug information

I can get simple queries working and displaying using the new 2.0 framework. I want to display some debug-type information back to the screen, but I am finding that the default Rally app overlays everything.
The app is based on the standard, i.e.
Ext.define('ReleaseHistory', {
  extend: 'Rally.app.App',
  componentCls: 'app',
  grid: null,
  etc...
What is the best way for me to 'dump' information back to the screen using this framework? Let's say I have an array of stuff; in the old SDK I could do something like this:
var results = JSON.stringify(dataAddedOrRemoved);
dojo.byId("debug").innerHTML = results;
Usually the easiest way to inspect code or data in an app is through the developer tools provided by your browser. console.log statements can also be useful.
Otherwise, you will probably find the following guide helpful:
https://help.rallydev.com/apps/2.0rc2/doc/#!/guide/add_content
SDK2 is built on top of ExtJS and everything is a component or a container. You will have a bad time directly modifying the DOM without the framework's knowledge. Here is a simple example of how to just add some arbitrary html content to your app body:
this.add({
  xtype: 'component',
  html: 'content-goes-here'
});

Phonegap camera API save picture taken to camera roll

Is it possible, using the Cordova camera API, to take a picture and then store it locally in the camera roll on iOS and Android? I know it's possible, but does it involve native code somehow, or can it be done in pure HTML? The documentation doesn't say anything about this.
Simply add saveToPhotoAlbum: true to the cameraOptions param.
For example
navigator.camera.getPicture(onPhotoDataSuccess, onFail,
  {
    quality: 50,
    destinationType: destinationType.FILE_URI,
    saveToPhotoAlbum: true
  });
saveToPhotoAlbum is set to false by default.
Reference
Do you mean the device's gallery?
If so, just use FILE_URI for Camera.DestinationType option.
Reference: http://docs.phonegap.com/en/2.6.0/cordova_camera_camera.md.html#cameraOptions

Sencha Touch 2: Simulating different profiles (phone/tablet) on Desktop OS (Linux/Windows)

I am learning Sencha Touch 2. I figured that if I want my app to be "responsive" to phone vs. tablet platforms, I can use Profiles.
Ext.define('Mail.profile.Phone', {
  extend: 'Ext.app.Profile',
  config: {
    name: 'Phone',
    views: ['Main']
  },
  isActive: function() {
    return Ext.os.is.Phone;
  },
  launch: function() {
    Ext.create('Mail.view.phone.Main');
  }
});
I am developing on Arch Linux with Chromium. I installed user agent switchers to fake running on a tablet/phone OS, but the profiles don't appear to change anything. I added console.log() in the launch function.
I wonder if I am using Profiles for the right thing? For example, I may want my app to display a list and a details view side by side on tablets, but on a phone show only the list and animate to the details view. If so, do I need a real tablet/phone just to test these profiles?
You can append deviceType=Phone or deviceType=Tablet to the URL to simulate these different device types.
Example here:
http://dev.sencha.com/deploy/touch/examples/production/kitchensink/index.html?deviceType=Phone
I usually just change the isActive function in the Tablet and Phone profiles to achieve this. For example, if I want to test the Phone profile on my laptop, I just change it like this:
isActive: function() {
  return !Ext.os.is.Phone;
}
That is the exact opposite of what it should be; it is in fact the function I use in my Tablet profile.