How to publish LocalScreenShare tracks from iOS to a web browser in React Native using Twilio - react-native

I want to integrate a screen-share feature into my React Native application, which uses Twilio for video communication. On the web we are able to achieve this with the following steps.
1 : We get the display media stream using
const stream = await navigator.mediaDevices.getDisplayMedia({
  video: true,
});
2 : Then we get the first of the stream's video tracks using
const newScreenTrack = first(stream.getVideoTracks()); // first() e.g. from lodash
3 : After that we wrap this newScreenTrack in a Twilio LocalVideoTrack and keep it in a useState
const localScreenTrack = new TwilioVideo.LocalVideoTrack(
  newScreenTrack
);
4 : After that we first unpublish the previous tracks and then publish the new track using
videoRoom.localParticipant.publishTrack(newScreenTrack, {
  name: "screen_share",
});
5 : And finally we pass these tracks to our ScreenShare component and render them so the screen share can be viewed by the remote participant. (The whole flow is sketched below.)
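Putting the five steps together, here is a minimal consolidated sketch of this web flow, assuming the twilio-video JS SDK, an already-connected videoRoom, and a previously published track (previousTrack is a name introduced here for illustration):

// Consolidated sketch of steps 1-5 above (web, twilio-video JS SDK).
async function startScreenShare(videoRoom, previousTrack) {
  // 1: capture the screen
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  // 2: take the first video track
  const newScreenTrack = stream.getVideoTracks()[0];
  // 3: wrap it in a Twilio LocalVideoTrack for local rendering
  const localScreenTrack = new TwilioVideo.LocalVideoTrack(newScreenTrack);
  // 4: unpublish the previous track, then publish the new one
  if (previousTrack) {
    videoRoom.localParticipant.unpublishTrack(previousTrack);
  }
  await videoRoom.localParticipant.publishTrack(newScreenTrack, {
    name: "screen_share",
  });
  // 5: hand localScreenTrack to the ScreenShare component for rendering
  return localScreenTrack;
}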
I need to do the same thing in my React Native application: the local participant asks another participant for screen-share permission, that participant accepts, and is then able to publish their local screen-share tracks.
If anyone knows how to do this, please help me. It would be really helpful. Thank you.

I think this is an issue with the react-native-twilio-video-webrtc package. It seems, as you discovered in this issue, that screen sharing was previously a feature of the library and was removed as part of a refactor.
Sadly, the library does more work than the underlying Twilio libraries to look after the video and audio tracks. The Twilio library is built to publish more than one track at a time; however, this React Native library only allows you to publish a single audio track and a single camera video track at a time.
In order to add screen sharing, you can either support pull requests like this one or refactor the library to separate getting access to the camera from publishing a video track, so that you can publish multiple video tracks at a time, including screen tracks.
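To illustrate the gap: the underlying twilio-video JS SDK (on the web) happily publishes a camera track and a screen track side by side. A minimal sketch, assuming a valid access token:

// Sketch: the underlying twilio-video JS SDK can carry a camera track and
// a screen track at the same time; this is the capability the React Native
// wrapper currently hides behind its one-camera-track model.
import { connect, createLocalVideoTrack, LocalVideoTrack } from "twilio-video";

async function publishCameraAndScreen(token) {
  const room = await connect(token, { audio: true, video: false });
  const cameraTrack = await createLocalVideoTrack({ name: "camera" });
  await room.localParticipant.publishTrack(cameraTrack);

  const screenStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const screenTrack = new LocalVideoTrack(screenStream.getVideoTracks()[0], {
    name: "screen_share",
  });
  await room.localParticipant.publishTrack(screenTrack);
  return room; // both video tracks are now live in the same room
}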

Related

React-native-video extremely slow and possible workaround

What am I doing?
A react-native-video component is loaded on the screen with a source URI pointing at my Node.js backend.
What is the problem? (only tried on Android)
My backend is working fine and the video loads, but extremely slowly. I checked on the backend that the video component requests a 1 MB byte range roughly every second. In contrast, if I request the video ranges with axios, fetching multiple byte ranges takes only a couple of milliseconds.
When using a video stored locally on the Android device, loading is almost instantaneous.
I tried the default Android player as well as ExoPlayer, with no difference in loading times.
I am trying all this with only one Video component, and when I comment it out the app runs smoothly.
Possible solution:
I don't want to load the data directly from the Video component, so I am now downloading the byte ranges and storing them locally. Later I plan to load these byte ranges into the Video component.
import AsyncStorage from '@react-native-async-storage/async-storage';
...
await AsyncStorage.setItem(video_name, JSON.stringify(byte_range_data_from_backend))
This works fine and I can read the byte_range_data_from_backend later with:
const data_from_item = await AsyncStorage.getItem(video_name)
What is my question?
How can I get URIs for these video ranges to pass to the Video component?
I think I need to create some kind of file and update its content every time I receive a new byte range. Meanwhile, I would like the video to play using the already-downloaded byte ranges.
Is this possible, or is it not a good approach? What would be a good way to load videos fast? It would be nice to download some byte ranges before even trying to play the video, so that the waiting time for the user is minimal.
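One way to realize the file-based idea is a file-system module such as react-native-fs (an assumption; any module that can append to a file would work). Append each byte range to a file in the cache directory and point the Video component at the resulting file:// URI. downloadRange below is a hypothetical helper that fetches one byte range as base64:

// Sketch, assuming react-native-fs; downloadRange() is a hypothetical
// helper that fetches bytes [start, end] from the backend as base64.
import RNFS from 'react-native-fs';

const videoPath = `${RNFS.CachesDirectoryPath}/buffered_video.mp4`;

async function bufferVideo(url, rangeSize, totalSize, downloadRange) {
  await RNFS.writeFile(videoPath, '', 'base64'); // start with an empty file
  for (let start = 0; start < totalSize; start += rangeSize) {
    const end = Math.min(start + rangeSize - 1, totalSize - 1);
    const base64Chunk = await downloadRange(url, start, end);
    await RNFS.appendFile(videoPath, base64Chunk, 'base64'); // grow the file
  }
  return `file://${videoPath}`; // use this as the Video component's source uri
}

Note that playing a file while it is still being appended only works if the container supports progressive playback (for MP4, the moov atom has to sit at the front), so downloading the first few ranges before mounting the player is the safer bet.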

How to read data in background in React Native

I have a React Native app that uses a Bluetooth connection to read data from external devices.
I need a way to keep reading this data in the background when, for example, the user starts a reading session and puts the app in the background.
What should I use to do this?
My code is divided into two parts:
Scan and Connect
Reading Data from external devices.
You need a background service for this. The following link for Android will help you.
github => react-native-foreground-service
If you want, you can do it yourself as described on the RN official site, but you'll have to write Java code for it.
React Native => Native Module
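As a sketch of the do-it-yourself route on Android: a native (Java) foreground service keeps the process alive and kicks off a Headless JS task, which then runs your existing reading code with no UI. readSession and the module names below are placeholders for your own scan/connect/read logic:

// index.js: register a Headless JS task (Android only). The Java foreground
// service you write starts this task; readSession is a hypothetical wrapper
// around the existing scan/connect/read code.
import { AppRegistry } from 'react-native';
import App from './App';
import { readSession } from './bluetooth';

const BleReadTask = async (taskData) => {
  // Runs in the background with no UI attached.
  await readSession(taskData.deviceId);
};

AppRegistry.registerHeadlessTask('BleReadTask', () => BleReadTask);
AppRegistry.registerComponent('MyApp', () => App);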

React Native background geolocation

I have a requirement to save the user's geolocation while the React Native app is running in the background. I know react-native-background-geolocation is available for this, but I don't want to use any third-party tool. Is there a way I can fetch the user's current location even when the app is running in the background?
With Expo SDK 32 you should be able to do it.
We’re excited to announce that this release includes initial support for background location, a highly requested feature from many Expo users. You can now define simple JavaScript tasks in your app and register them to receive location updates in the background. Additionally, you can set up geofencing tasks that are triggered when the device enters or leaves specific geographic regions.
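A minimal sketch of that task-based approach, assuming the expo-location and expo-task-manager packages (API names follow the current Expo docs rather than SDK 32's exact surface):

// Sketch, assuming expo-location and expo-task-manager are installed.
import * as Location from 'expo-location';
import * as TaskManager from 'expo-task-manager';

const LOCATION_TASK = 'background-location-task';

// Runs whenever new locations arrive, including in the background.
TaskManager.defineTask(LOCATION_TASK, ({ data, error }) => {
  if (error) {
    console.warn(error.message);
    return;
  }
  const { locations } = data;
  // Persist or upload the coordinates here.
  console.log('got locations', locations);
});

export async function startBackgroundTracking() {
  const { status } = await Location.requestBackgroundPermissionsAsync();
  if (status !== 'granted') return;
  await Location.startLocationUpdatesAsync(LOCATION_TASK, {
    accuracy: Location.Accuracy.Balanced,
  });
}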

How do I notify users of new content available in tvOS apps from the home screen?

Push notifications have been left out of tvOS (understandably so), but the docs seem to contradict themselves about whether you can alert users that there is something new available in your tvOS app.
Here it seems to say that you can add an app badge: https://developer.apple.com/library/prerelease/tvos/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Chapters/WhatAreRemoteNotif.html
Here it says they've been removed from UIKit: https://developer.apple.com/library/prerelease/tvos/releasenotes/General/tvOS90APIDiffs/Objective-C/UIKit.html
Removed UIApplication.applicationIconBadgeNumber
Assuming the badge approach is not supported in this release, does anyone know the best practice for alerting a user that there is new content in your app without the user taking an explicit action, i.e. focusing on the app and showing them something in the Top Shelf?
I encountered the same problem and dove into this. Probably your best option is to update the Top Shelf with the latest items, which is how I am solving this for now. You can use network calls to update the Top Shelf with content from your backend.
What fits depends on the type of application, e.g. showing the latest top movies for a movies app.
You can trigger an update of the Top Shelf after your network call completes using the following code:
NSNotificationCenter.defaultCenter().postNotificationName(TVTopShelfItemsDidChangeNotification, object: nil)
Make sure to implement TVTopShelfProvider, which should be clear from the following documentation:
This protocol is adopted by the principal class of an app’s TV Services extension. Apps that implement this extension can provide dynamic content to the Top Shelf element rather than having the system use the static image submitted with the app. The topShelfStyle property specifies the interface style you want, and the topShelfItems property specifies the content items to display. Whenever you change the content provided by the extension, post a TVTopShelfItemsDidChangeNotification notification to prompt the system to reload your content.
Icon badges have been removed for app icons, and push notifications as well (except for silent push notifications).

How do I access native APIs with Sencha Touch?

If I wanted to create a mobile app that allows the user to take pictures with their phone, record audio notes and record video, how would I do that?
I was browsing through the Sencha Touch 2 API, and while I see documentation on video and audio files, it seems like it just provides a way for me to access files stored on the phone, not actual triggers to record or take pictures.
Am I missing something?
How would I do what I want?
In order for Sencha Touch to have access to your phone's capabilities, you need to use a product like PhoneGap.
Unless there is an HTML5 API for doing those sorts of things, I don't think you can do that. I know PhoneGap adds native extensions to the platform for access to things like the microphone, camera, etc. I don't know if Sencha Touch has added any of those sorts of extensions to let you do this.
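For illustration, this is roughly what taking a picture looks like once a Sencha Touch app is wrapped in PhoneGap/Cordova with its camera plugin (a sketch of the plugin's classic navigator.camera API, callable from any Sencha handler):

// Sketch: taking a picture through the PhoneGap/Cordova camera plugin,
// e.g. from a Sencha Touch button handler.
function takePicture() {
    navigator.camera.getPicture(
        function (imageUri) {
            // Success: imageUri points at the captured photo.
            console.log('picture saved at ' + imageUri);
        },
        function (message) {
            console.log('camera failed: ' + message);
        },
        {
            quality: 50,
            destinationType: Camera.DestinationType.FILE_URI
        }
    );
}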
Just thinking outside the box here, but you might be able to put Sencha JavaScript into a WebView from within an Android Java process. The Java code could then expose an object in its process as an extension point to the JavaScript engine, giving access to the camera, microphone, and so on.