Fonts getting cut off in React Native on OnePlus, Oppo etc - react-native

I am facing a rare issue where fonts are getting cut off on some devices, like OnePlus and Oppo.
I don't know the reason behind it, but it is a device-specific issue that appears on certain devices only.
If anyone comes across the same problem, here are the workarounds (a sketch combining them follows below):
Add a space " " at the end of the text.
Give the Text a set width.
There is a proposed component at the bottom of the issue.
You can refer to this React Native issue link.
This Expo link also helps.
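If it helps, here is a minimal sketch of a wrapper that combines the two workarounds above (the AppText name and the Roboto fallback are my own illustrative choices, not from the linked issue):

```jsx
import React from 'react';
import { Platform, StyleSheet, Text } from 'react-native';

// Hypothetical wrapper around <Text> combining both workarounds:
// a trailing space and an explicit fontFamily, so glyphs are not
// clipped by the OEM system font on OnePlus/Oppo devices.
const AppText = ({ style, children, ...props }) => (
  <Text {...props} style={[styles.text, style]}>
    {children}{' '}
  </Text>
);

const styles = StyleSheet.create({
  text: {
    // Forcing a known Android font sidesteps the OEM font that clips;
    // leave iOS alone, where the problem does not occur.
    fontFamily: Platform.OS === 'android' ? 'Roboto' : undefined,
  },
});

export default AppText;
```

Swapping such a wrapper in for Text still means touching every call site, which is exactly the extra work complained about below.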
But my main concern is that these are all hacks. Is there any rock-solid answer to this? Why do extra work; why can't React Native solve it? Or is there any permanent fix?
Please don't mark this as a duplicate. I have the workaround; I want to know the reason, or a global answer that solves this in one go. I don't want to change all my 1000 text styles. :(

Related

How to configure mocap mapping and expressions with MetaHuman and Live Link

As my title says, I am having trouble mapping face data from the Live Link app to a MetaHuman.
Here is what I have done so far:
Created a UE5 project (Film/Video & Live Event)
Imported a MetaHuman (custom-made) into the project
Added the required plugins to the project (Live Link, ARKit, Apple ARKit, etc.)
Connected the Live Link mobile application to the local network
Set the MetaHuman's animation controller to the Live Link feed
Calibrated the Live Link data within the Live Link application
The problem I am having:
Parts of the face are not responding at all (e.g. the MetaHuman's right eyebrow does not respond to me lifting my left eyebrow).
The left corner of the mouth seems to be stuck (e.g. when I try to open my mouth, all points respond except for that single point, which stays where it is).
The mapping/naming of facial components seems to be mirrored/off/labeled wrong (e.g. if I wink my right eye, my right eye closes and my right cheek presses upward; on the MetaHuman, the left eye blinks and the right cheek raises).
These issues are very frustrating, as I cannot seem to get past this basic calibration. I see people online using these same tools and getting really clean results with the MetaHuman's facial movements. Is there something I am missing? I know that after the MetaHuman has been calibrated I will create sequences; am I supposed to be modifying these values there? I am not sure. I have commented on every video I can find, and I have posted this question in the Unreal Discord (here) with basically no help.
Note: I don't need a full solution; I just need to be pointed in the right direction! Please let me know if there is anything I am missing in my setup or calibration workflow.
Thanks for reading.
I had the exact same problem with a customer project, and the issue was that it wasn't using the correct animation blueprint. Once we switched it to the correct animation blueprint for Live Link, the issue was immediately resolved. I remember that exact same facial expression - sorry, as I know this was a frustrating one for the customer as well.
It seemed that I was doing everything right. After upgrading from 5.0.3 to 5.1 the issue stopped completely.

Rendering different screens based on user input in React Native

Hello fellow programmers, I wanted to ask how I can render different screens based on user input. I have done it one way, but I want to know whether there is a better way to do it.
Attached is a link to a Snack on Expo. You can run it and see the code for how I have done it.
If there are better alternatives, please let me know.
Thank you.
Please don't mark the question as a duplicate before going through it.
For a better choice, you can use react-navigation (https://reactnavigation.org/docs/auth-flow/). This package works in Expo.
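Roughly, the auth-flow pattern from that link boils down to conditionally mounting screens based on state; a minimal sketch (the screen names and token state are illustrative, not from your Snack):

```jsx
import * as React from 'react';
import { Button, Text, View } from 'react-native';
import { NavigationContainer } from '@react-navigation/native';
import { createNativeStackNavigator } from '@react-navigation/native-stack';

const Stack = createNativeStackNavigator();

function HomeScreen() {
  return (
    <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
      <Text>Home</Text>
    </View>
  );
}

function SignInScreen({ onSignIn }) {
  return (
    <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
      <Button title="Sign in" onPress={onSignIn} />
    </View>
  );
}

export default function App() {
  // Whatever the user "entered" (a token here) decides which screen exists.
  const [userToken, setUserToken] = React.useState(null);

  return (
    <NavigationContainer>
      <Stack.Navigator>
        {userToken == null ? (
          <Stack.Screen name="SignIn">
            {() => <SignInScreen onSignIn={() => setUserToken('demo')} />}
          </Stack.Screen>
        ) : (
          <Stack.Screen name="Home" component={HomeScreen} />
        )}
      </Stack.Navigator>
    </NavigationContainer>
  );
}
```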
The example you posted looks mostly like something I would use conditional rendering techniques for, like the && syntax.
See here for more guidance:
https://reactjs.org/docs/conditional-rendering.html
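For example, a minimal sketch of that && technique in React Native (the option names are made up):

```jsx
import React, { useState } from 'react';
import { Button, Text, View } from 'react-native';

export default function Screen() {
  const [choice, setChoice] = useState(null);

  return (
    <View>
      <Button title="Option A" onPress={() => setChoice('A')} />
      <Button title="Option B" onPress={() => setChoice('B')} />
      {/* Each block renders only while its condition is truthy. */}
      {choice === 'A' && <Text>You picked A</Text>}
      {choice === 'B' && <Text>You picked B</Text>}
    </View>
  );
}
```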
If your pages are dramatically different, though (not just options flipping on and off), then I would recommend React Navigation, as giri commented.

How to detect a screenshot in an RN application?

I'm having a problem with a screenshot detector in an application.
I'm using Expo, but I have no idea how I can detect a screenshot.
Suppose I have a profile screen. Each user has a profile page, and I want to detect when another user takes a screenshot of that screen.
My problem is: how can I detect this? I read about gestures here, but that didn't help me. With gestures I can instead detect scrolling on a page, and the x and y locations.
I also read this answer, but I have not found anything about the touchesCancelled:withEvent: event in the React Native docs (or the Expo docs).
So: what is the idea behind a screenshot detector? Thanks for your help!
You can't do it without detaching from Expo. There is already a feature request for that.
If you decide to detach from Expo, react-native-screenshot-detector might be helpful. The solution is very similar to the one from the linked question.
Currently there is no way to do that with Expo, but if you detach you can use the react-native-detector package, which supports screenshot detection on Android and iOS.
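For reference, a minimal sketch of how react-native-detector is typically wired up; the exact import name and the unsubscribe behavior are from memory of the package README, so verify them against the docs:

```jsx
import React, { useEffect } from 'react';
import { Text, View } from 'react-native';
// NOTE: import name assumed from the package README; check before relying on it.
import { addScreenshotListener } from 'react-native-detector';

export default function ProfileScreen() {
  useEffect(() => {
    // The callback fires whenever the OS reports a screenshot.
    const unsubscribe = addScreenshotListener(() => {
      console.log('User took a screenshot of this profile');
    });
    // Clean up the native listener when the screen unmounts.
    return unsubscribe;
  }, []);

  return (
    <View>
      <Text>Profile</Text>
    </View>
  );
}
```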

Custom input box styling in react-native-elements

Being new to React Native, I am trying to style my input fields using react-native-elements to look similar to their example ones (Example).
Even after going through the source code for the example, I still can't figure out where to start when it comes to customizing these fields to that extent.
As a heads up for anyone else looking to do the same thing, I ended up using this library to make nicer-looking animated input fields. It isn't quite the same as what I had originally planned, but if you're in the same scenario as I was, I definitely recommend taking a look.
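For anyone starting out, a minimal sketch of how far you can get with react-native-elements' own style props (Input and Icon are the library's components; the colors and sizes here are my own choices):

```jsx
import React from 'react';
import { StyleSheet } from 'react-native';
import { Icon, Input } from 'react-native-elements';

// Illustrative styled input; tweak the values to match the example theme.
export default function FancyInput() {
  return (
    <Input
      placeholder="Email"
      label="Your email address"
      labelStyle={styles.label}
      leftIcon={<Icon name="email" size={20} color="#86939e" />}
      containerStyle={styles.container}
      inputContainerStyle={styles.inputContainer}
      inputStyle={styles.input}
    />
  );
}

const styles = StyleSheet.create({
  container: { paddingHorizontal: 16 },
  // The underline/border lives on the input container, not on Input itself.
  inputContainer: {
    borderBottomWidth: 2,
    borderBottomColor: '#2089dc',
  },
  label: { color: '#86939e', fontWeight: 'bold' },
  input: { color: '#43484d', fontSize: 18 },
});
```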

React Native TouchableNativeFeedback

Using React Native, I get a long-press error
when using TouchableHighlight and TouchableNativeFeedback:
Attempted to transition from state RESPONDER_INACTIVE_PRESS_IN to RESPONDER_ACTIVE_LONG_PRESS_IN
My code
This issue usually only appears on Android when you are debugging.
This issue goes into further detail: https://github.com/facebook/react-native/issues/5823
Try turning debugging off and see if you still get the error.
Aligning the clocks on your phone and computer may solve this problem. In my experience, the two clocks do not need to be an exact match; just make sure they are not off by more than a few seconds. You can easily achieve this by adjusting one clock manually.
Yes, I faced the same issue. After much more research, I found a solution: instead of using TouchableHighlight, you can use the TouchableOpacity component.
For details, refer to this link:
http://androidseekho.com/others/reactnative/how-to-solve-touchable-longpressdelaytimeout-problem-in-react-native/
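A minimal sketch of that swap (the handler names and delay value are illustrative):

```jsx
import React from 'react';
import { Text, TouchableOpacity } from 'react-native';

// TouchableOpacity handling both press and long press, which avoids
// the responder-state transition warning for many people.
export default function LongPressButton() {
  return (
    <TouchableOpacity
      onPress={() => console.log('pressed')}
      onLongPress={() => console.log('long pressed')}
      delayLongPress={500} // milliseconds before onLongPress fires
    >
      <Text>Hold me</Text>
    </TouchableOpacity>
  );
}
```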