React Native - Directional pad support for an Android TV app - react-native

I want to build an Android TV app using React Native. I have followed the recommendations in this document: Building For TV Devices.
After updating the AndroidManifest.xml file, I ran the application from the command line with react-native run-android. The app runs without any issue; however, when I tried to use the directional pad (D-pad) on the Android TV (720p) API 23 emulator, it didn't work. I was expecting to catch the events handled in the code below and write the corresponding text for each event to the console. In addition, the component used for the text never got highlighted or focused when I tried to navigate with the D-pad.
I am reaching out to the community to see if anyone has run into this before, what the cause was, and how you resolved it. Also, since I am listing my steps below, could you let me know if I am missing something?
Please let me know if you need any extra information in order to help me.
react-native init Dpad
cd Dpad
Update the code based on Building For TV Devices.
Start Android TV (720p) API 23 emulator.
react-native run-android
ANNEX:
Android TV (720p) API 23
Here is the code:
import React, { Component } from 'react';
import { Text, View } from 'react-native';
import Channel from '../channel/channel.component';
import styles from './presentation.component.styles';
var TVEventHandler = require('TVEventHandler');

export default class Grid extends Component {
  constructor(props) {
    super(props);
    this.state = {
      command: 'undefined'
    };
  }

  setCommand(command) {
    this.setState(() => { return { command: command }; });
  }

  _tvEventHandler: null;

  _enableTVEventHandler() {
    this._tvEventHandler = new TVEventHandler();
    // The callback receives the component passed to enable() as `cmp`,
    // so call setCommand on it (a bare setCommand() call here would throw).
    this._tvEventHandler.enable(this, function(cmp, evt) {
      if (evt && evt.eventType === 'right') {
        cmp.setCommand('Press Right!');
      } else if (evt && evt.eventType === 'up') {
        cmp.setCommand('Press Up!');
      } else if (evt && evt.eventType === 'left') {
        cmp.setCommand('Press Left!');
      } else if (evt && evt.eventType === 'down') {
        cmp.setCommand('Press Down!');
      }
    });
  }

  _disableTVEventHandler() {
    if (this._tvEventHandler) {
      this._tvEventHandler.disable();
      delete this._tvEventHandler;
    }
  }

  componentDidMount() {
    this._enableTVEventHandler();
    console.warn("component did mount");
  }

  componentWillUnmount() {
    this._disableTVEventHandler();
    console.warn("component Will Unmount");
  }

  render() {
    return (
      <View style={styles.container}>
        <Text>{this.state.command}</Text>
        <Channel name="Globo" description="It's a Brazilian news TV channel"/>
        <Channel name="TVI" description="It's a Portuguese news TV channel"/>
        <Channel name="TVI" description="It's a Portuguese news TV channel"/>
      </View>
    );
  }
}

I've also been struggling with this problem for a month and still can't find a solution.
I'm testing this on the Android Studio emulator and also on a few real Android TV boxes with real remote D-pads.
I still can't figure out whether it's a React Native bug or whether Android TV devices simply don't emit a response (keyCode) for the directional D-pad arrows.
I can reproduce events like focus, blur, select, fastForward, playPause and rewind, but there is no way to get events like "left".
I've searched Google and other sites a lot, and you are the first person I've found struggling with the same issue.
I feel like no one cares about Android TV in React Native.
You can also comment on my issue thread on the React Native GitHub page:
https://github.com/facebook/react-native/issues/20924
I hope we figure it out soon.
Cheers

I was not able to verify that the D-pad works with React Native; that was the goal of this demo. I am learning to build Android TV apps with React Native, and so far the D-pad has been a big challenge, given that TV users won't be using touch-screen events.
However, I couldn't find out why my app was not responding to the D-pad keys (left, right, up and down). There was no code error.
Have you tried to use the D-pad to navigate in your React Native app?
Thank you,

Dave -
I decided to develop an Android TV app using React Native because of the video the React Native team shared [https://www.youtube.com/watch?v=EzIQErHhY20] and the tutorial page [https://facebook.github.io/react-native/docs/building-for-apple-tv]. I think that's everything we have; beyond that we won't get much further support.
GOOD NEWS - I have started a new project from scratch using React Native 0.57.0, Node v10.7 and npm 4.6.1. For navigation, I am using react-navigation version 2. I was able to see that the D-pad on my emulator was working; however, I was not able to see focus on the element I am navigating to (left, right, down, up).
I will be working on how to fix the focus issue; see the sketch below for the direction I'm exploring.
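One way the focus highlighting might be approached (a minimal sketch only: it assumes TouchableOpacity's onFocus/onBlur callbacks fire on TV platforms and that the hasTVPreferredFocus prop is honoured on Android TV, as the Building For TV Devices page describes, so double-check against your React Native version):
import React, { Component } from 'react';
import { Text, TouchableOpacity } from 'react-native';

// A focusable item that draws its own highlight when the TV focus engine lands on it.
class FocusableItem extends Component {
  state = { focused: false };

  render() {
    return (
      <TouchableOpacity
        hasTVPreferredFocus={this.props.preferred}
        onFocus={() => this.setState({ focused: true })}
        onBlur={() => this.setState({ focused: false })}
        onPress={this.props.onPress}
        style={{ borderWidth: 2, borderColor: this.state.focused ? 'white' : 'transparent' }}>
        <Text>{this.props.label}</Text>
      </TouchableOpacity>
    );
  }
}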
Let's keep sharing our progress, and feel free to reach out.
Thank you,
Justimiano Alves

Use this code and you can see the events logged to the console in the debugger:
_tvEventHandler: any;

_enableTVEventHandler() {
  var self = this;
  // TVEventHandler is obtained the same way as in the question's snippet.
  this._tvEventHandler = new TVEventHandler();
  this._tvEventHandler.enable(this, function (cmp, evt) {
    console.log('TV event: ' + evt.eventType);
    if (evt && evt.eventType === 'right') {
      console.log('right');
    } else if (evt && evt.eventType === 'up') {
      console.log('up');
    } else if (evt && evt.eventType === 'left') {
      console.log('left');
    } else if (evt && evt.eventType === 'down') {
      console.log('down');
    } else if (evt && evt.eventType === 'select') {
      //self.press();
    }
  });
}

_disableTVEventHandler() {
  if (this._tvEventHandler) {
    this._tvEventHandler.disable();
    delete this._tvEventHandler;
  }
}

componentDidMount() {
  this._enableTVEventHandler();
}

componentWillUnmount() {
  this._disableTVEventHandler();
}

I have really good news about support for the D-pad arrow events (up, down, left, right).
It turned out that one of the Android TV contributors to React Native is from my country. I reached out to him and told him about this problem. He checked it out, and the code for this is in fact missing.
He made a pull request to add that support to React Native. It should be fixed in one of the upcoming releases (he said it might take about a month).
In the meantime I know how to handle this (add the code and recompile the Java files); I have already tested it and it works great. All events are working now. If you really need that support now and don't want to wait, I can share how to do it.
Cheers

Yes, I would like to see your solution, because I am able to navigate using the D-pad but I can't see which element I am navigating to. I need to highlight or show focus on the element I am navigating to.

Having console.log inside the TVEventHandler callback seems to break it when running without the remote JS debugger on. See the sketch below for one way around that.
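One possible workaround, assuming it really is the console.log call inside the callback that trips things up: keep logging out of the handler and surface the event through state instead (a sketch only):
this._tvEventHandler.enable(this, (cmp, evt) => {
  if (evt && evt.eventType) {
    // Render this value somewhere instead of logging it from inside the callback.
    cmp.setState({ lastTvEvent: evt.eventType });
  }
});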

I have observed that the D-pad does not work if there is no focusable component. To solve this, I placed a transparent TouchableOpacity component on my screen (a sketch of that follows after the handler code below). After that, the D-pad started working. My code for the D-pad key events is given below:
enableTVEventHandler() {
  this.tvEventHandler = new TVEventHandler();
  this.tvEventHandler.enable(this, (cmp, { eventType, eventKeyAction }) => {
    // eventKeyAction is an integer representing button press (key down) and
    // release (key up): "key down" is 0, "key up" is 1.
    if (eventType === 'playPause' && eventKeyAction === 0) {
      console.log('play pressed')
    } else if (eventType === 'fastForward' && eventKeyAction === 0) {
      console.log('forward pressed')
    } else if (eventType === 'rewind' && eventKeyAction === 0) {
      console.log('rewind pressed')
    } else if (eventType === 'select' && eventKeyAction === 0) {
      console.log('select pressed')
    } else if (eventType === 'left' && eventKeyAction === 0) {
      console.log('left pressed')
    } else if (eventType === 'right' && eventKeyAction === 0) {
      console.log('right pressed')
    } else if (eventType === 'up' && eventKeyAction === 0) {
      console.log('up pressed')
    } else if (eventType === 'down' && eventKeyAction === 0) {
      console.log('down pressed')
    }
  });
}
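As a rough sketch of the transparent focusable component described above (the component name and styling are illustrative, not from the original post): the idea is just to give the Android TV focus engine something to focus so the D-pad events are delivered.
import React from 'react';
import { TouchableOpacity } from 'react-native';

// An invisible, focusable placeholder rendered somewhere on the screen.
const FocusTrap = () => (
  <TouchableOpacity
    activeOpacity={1}
    style={{ width: 1, height: 1, opacity: 0 }}
    onPress={() => {}}
  />
);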

Related

$refs are null after route change

I have a keyboard navigation system. When you press ArrowUp or ArrowDown, an event is emitted FROM app.js (the best place I found to listen to these keypresses, since they need to be system-wide) TO the mounted() hook of the component.
The Event.$on() inside the mounted() hook of the component then calls a function that uses $refs to identify the currently selected item and, when Enter is pressed, shows its modal.
app.js code (listening for the keypresses):
else if (event.key === 'ArrowUp' || event.key === 'ArrowDown' || event.key === 'Enter') {
  event.preventDefault()
  switch (this.$router.currentRoute.path) {
    case "/pedidos":
      Event.$emit('navegarSetasPedidos', event.key)
      break;
    case "/clientes":
      Event.$emit('navegarSetasClientes', event.key)
      break;
  }
}
mounted() section of the component in question:
mounted() {
  Event.$on('navegarSetasPedidos', (key) => { this.navegarSetas(key) })
}
Function responsible for the navigation (sorry for the bad formatting, I haven't figured out how Stack Overflow's code-block feature works yet):
navegarSetas(key) {
  if (this.navegacaoSetasAtiva == false) {
    this.navegacaoSetasAtiva = true
    this.navegacaoAtual = 0
  } else if (this.modalAtivado == false && this.navegacaoSetasAtiva == true) {
    if (key == 'ArrowDown' && this.navegacaoAtual < this.pedidos.length - 1) {
      this.navegacaoAtual++
      let elementoSelecionado = this.$refs['pedido' + this.navegacaoAtual][0].$el
      let boundaries = elementoSelecionado.getBoundingClientRect()
      if (boundaries.top < 0 || boundaries.top > (window.innerHeight || document.documentElement.clientHeight)) {
        elementoSelecionado.scrollIntoView({ behavior: 'smooth' })
      }
    } else if (key == 'ArrowUp' && this.navegacaoAtual <= this.pedidos.length && this.navegacaoAtual > 0) {
      this.navegacaoAtual--
      let elementoSelecionado = this.$refs['pedido' + this.navegacaoAtual][0].$el
      let boundaries = elementoSelecionado.getBoundingClientRect()
      if (boundaries.top < 0 || boundaries.top > (window.innerHeight || document.documentElement.clientHeight)) {
        elementoSelecionado.scrollIntoView({ behavior: 'smooth' })
      }
    } else if (key == 'Enter') {
      let pedidoSelecionado = this.pedidos[this.navegacaoAtual].id
      Event.$emit('changeShow', pedidoSelecionado)
    }
  }
}
This works very well the first time it is accessed. The problem is, if I change the current route to show another component and then return to the previous component, I get a lot of "this.$refs['pedido'+this.navegacaoAtual][0].$el is undefined" errors, but the system still works normally, albeit erratically.
The funny thing is: if I console.log the value that is reported as undefined, I get an EMPTY log before the errors, then ANOTHER one right below it, this time not empty.
Everywhere else I've searched says the problem is due to how Vue re-renders things, and that I'm calling this event BEFORE the component is rendered, which shouldn't be possible since I'm registering it inside mounted().
Any help is greatly appreciated, thank you!
It turns out, after a LOT of searching, that Event.$on listeners behave like normal JavaScript ones (which makes a lot of sense now that I think about it), meaning you have to remove them whenever your component is unmounted (i.e. destroyed).
Even though Vue Dev Tools was picking up only one event after the re-route, two handlers were still firing (seen through console.log() returning one empty value, a bunch of errors, and another value with a filled array AFTER the errors).
The solution was simply adding Event.$off('eventName') in the destroyed() hook of the component.
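A minimal sketch of that fix, reusing the event name from this question (the global Event bus is assumed to be the same one used above):
mounted() {
  Event.$on('navegarSetasPedidos', (key) => { this.navegarSetas(key) })
},
destroyed() {
  // Remove the listener so a re-mounted component doesn't end up with two handlers.
  Event.$off('navegarSetasPedidos')
}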

WebRTC ontrack how to tell if it is a screen sharing session?

function OnTrack(e) {
  if (e.track.kind === "audio") {
  }
  else if (e.track.kind === "video") {
    // Screen sharing also arrives here with e.track.kind === 'video'
  }
};
In the code above we can distinguish between audio and video. But how can I tell if the video stream is actually coming from a screen-sharing session?
Modify the SDP
I found a workaround by modifying the SDP. I hope someone can come up with a better solution than this.
The getSettings() method returns the properties of the track, including cursor, displaySurface and logicalSurface, which are only present on a screen-sharing track.
Using displaySurface as an example:
function OnTrack(e) {
  let settings = e.track.getSettings()
  if (e.track.kind === "audio") {
  }
  else if (e.track.kind === "video") {
    if (settings.displaySurface && (
        settings.displaySurface === "application" ||
        settings.displaySurface === "browser" ||
        settings.displaySurface === "monitor" ||
        settings.displaySurface === "window")) {
      // Screen sharing
    }
  }
};
The values of displaySurface are application, browser, monitor, window.
You can also get displaySurface from the getConstraints() method of the track.
Update
It turns out you need to pass these constraints/settings when you call getDisplayMedia(constraints). From MDN:
"These constraints apply to MediaTrackConstraints objects specified as part of the DisplayMediaStreamConstraints object's video property when using getDisplayMedia() to obtain a stream for screen sharing."
Note from the MDN page: not all user agents support all of these surface types.
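A hedged sketch of what that getDisplayMedia() call might look like (browser support for the displaySurface constraint varies, so treat it as illustrative):
async function startScreenShare() {
  // Ask for a whole-screen surface up front.
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: { displaySurface: 'monitor' },
    audio: false
  });
  // On user agents that support it, getSettings().displaySurface then reports
  // the surface type that was actually captured.
  console.log(stream.getVideoTracks()[0].getSettings().displaySurface);
}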
More information
MediaStreamTrack getSettings()
MediaTrackSettings
MediaTrackSettings DisplaySurface
MediaStreamTrack getConstraints()
MediaDevices getDisplayMedia()

Identify orientation with degrees on startup

Without a third-party lib, we can detect orientation changes with DeviceEventEmitter, using this undocumented event:
import { DeviceEventEmitter } from 'react-native'

function handleOrientationDidChange(data) {
  console.log('orientation changed, data:', data)
}

DeviceEventEmitter.addListener('namedOrientationDidChange', handleOrientationDidChange);
This gives us data that looks like this:
{ rotationDegrees: -90, isLandscape: true, name: "landscape-primary" }
Note: I tested this only on Android. It would be nice to know if it works on iOS too.
However this only works ON CHANGE. Is there a way to get this info on startup?
Have you tried this library?
Here is example usage from the repo:
componentWillMount() {
  // The getOrientation method is async. It happens sometimes that
  // you need the orientation at the moment the JS runtime starts running on device.
  // `getInitialOrientation` returns directly because it's a constant set at the
  // beginning of the JS runtime.
  const initial = Orientation.getInitialOrientation();
  if (initial === 'PORTRAIT') {
    // do something
  } else {
    // do something else
  }
}
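If you also need to stay in sync after startup, the same library exposes a change listener. A small sketch, with method names taken from the repo's README (worth double-checking against the version you install):
componentDidMount() {
  this._onOrientationChange = (orientation) => {
    // orientation is e.g. 'PORTRAIT' or 'LANDSCAPE'
    console.log('orientation is now', orientation);
  };
  Orientation.addOrientationListener(this._onOrientationChange);
}

componentWillUnmount() {
  Orientation.removeOrientationListener(this._onOrientationChange);
}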

How to check if a user's device is using fingerprint/face as the unlock method? [React Native] [Expo]

I'm using React Native with the Expo toolkit to develop an app, and I want to know how I can check whether the user is using fingerprint (Touch ID on iPhone) or face recognition (Face ID on iPhone X and later) to unlock the device.
I already know how to check whether the device has the required hardware using the Expo SDK, as follows:
let hasFPSupport = await Expo.Fingerprint.hasHardwareAsync();
But I need to check whether the user has chosen fingerprint/face as the unlock method on their device, instead of a pattern or PIN.
Thanks
Here's an update to Donald's answer that takes into account Expo's empty string for the model name of the new iPhone XS. It also takes into account the Simulator.
const hasHardwareSupport =
  (await Expo.LocalAuthentication.hasHardwareAsync()) &&
  (await Expo.LocalAuthentication.isEnrolledAsync());

let hasTouchIDSupport;
let hasFaceIDSupport;

if (hasHardwareSupport) {
  if (Constants.platform.ios) {
    if (
      Constants.platform.ios.model === '' ||
      Constants.platform.ios.model.includes('X')
    ) {
      hasFaceIDSupport = true;
    } else {
      if (
        Constants.platform.ios.model === 'Simulator' &&
        Constants.deviceName.includes('X')
      ) {
        hasFaceIDSupport = true;
      }
    }
  }
  hasTouchIDSupport = !hasFaceIDSupport;
}
EDIT: Expo released an update that fixes the blank model string. However you might want to keep a check for that just in case the next iPhone release cycle causes the same issue.
Currently, you could determine that a user has Face ID by checking Expo.Fingerprint.hasHardwareAsync() and Expo.Fingerprint.isEnrolledAsync(), and then also checking that they have an iPhone X using Expo.Constants.platform (docs here).
So:
const hasHardwareSupport = await Expo.Fingerprint.hasHardwareAsync() && await Expo.Fingerprint.isEnrolledAsync();

if (hasHardwareSupport) {
  const hasFaceIDSupport = Expo.Constants.platform.ios && Expo.Constants.platform.ios.model === 'iPhone X';
  const hasTouchIDSupport = !hasFaceIDSupport;
}
In case you tried the above answer and it's not working, please note that, as of the time of my post, Expo's documentation has changed:
import * as LocalAuthentication from 'expo-local-authentication';
let compatible = await LocalAuthentication.hasHardwareAsync();
We can check whether the device has enrolled fingerprints:
await Expo.Fingerprint.isEnrolledAsync()
So this can be combined to reach the objective as follows:
let hasFPSupport = await Expo.Fingerprint.hasHardwareAsync() && await Expo.Fingerprint.isEnrolledAsync();
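Newer Expo SDKs move this into expo-local-authentication and also expose supportedAuthenticationTypesAsync(), which distinguishes fingerprint from facial recognition without any device-model check. A small sketch, worth verifying against the current docs:
import * as LocalAuthentication from 'expo-local-authentication';

async function getBiometricSupport() {
  const hasHardware = await LocalAuthentication.hasHardwareAsync();
  const isEnrolled = await LocalAuthentication.isEnrolledAsync();
  const types = await LocalAuthentication.supportedAuthenticationTypesAsync();
  return {
    usable: hasHardware && isEnrolled,
    hasFaceID: types.includes(LocalAuthentication.AuthenticationType.FACIAL_RECOGNITION),
    hasTouchID: types.includes(LocalAuthentication.AuthenticationType.FINGERPRINT),
  };
}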

QML - ReferenceError: Screen is not defined

I'm trying to build a little football game as a school project, but I'm having some issues. When I run the code, it says ReferenceError: Screen is not defined, but as far as I can tell I have defined it.
This code is just a prototype; I'm going to change the keys to buttons later on so that it can actually work on a phone.
import QtQuick 2.0

Item {
    id: root
    width: Screen.width
    height: Screen.height - 10
    focus: true

    Keys.onPressed: {
        if (event.key === Qt.Key_Up) {
            event.accepted = true;
            player.y = (player.y) - 40
        }
        if (event.Key === Qt.Key_Down) {
            event.accepted = true;
            player.y = (player.y) + 40
        }
        if (event.key === Qt.Key_Right) {
            event.accepted = true;
            player.x = (player.x) - 40
        }
        if (event.key === Qt.Key_Left) {
            event.accepted = true;
            player.x = (player.x) + 40
        }
    }

    Flickable {
        width: Screen.width
        height: Screen.height
        contentHeight: Screen.height * 4
        contentWidth: Screen.width
        interactive: true
        boundsBehavior: Flickable.StopAtBounds

        Image {
            id: feild
            anchors.fill: parent
            source: "Namnlös.png"
            sourceSize.height: Screen.height * 4
            sourceSize.width: Screen.width
        }

        Image {
            id: player
            source: "asd.png"
            x: Screen.width / 2
            y: Screen.height / 2
        }
    }
}
So if you run this code, you'll only get the player showing up and then disappearing instantly; the field is not shown.
You're missing the Screen import:
import QtQuick.Window 2.1
Screen docs
Resizing items to the screen is unusual; you should simply use the
resizeMode property
and anchor all child items inside the root item.