Programmatically turn screen upside-down in react-native - react-native

I am building a two player game.
When player A makes a turn, the screen should turn upside down for the player seated on the opposite side.
I tried using transform on a View, but I think that only works on text.
So I am looking for a solution to either
a. keep the device orientation but turn the screen upside down by 180 degrees, or
b. rotate the root view of my app by 180 degrees.
I would appreciate suggestions.

Correction - turning the view using transform worked for me.
It was simply not working in storyboard mode (not sure why).
Below works!
const rotateView = {
  flex: 1,
  transform: [{
    rotate: '-180deg'
  }],
};
The view is rotated by 180 degrees.
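For illustration, a minimal sketch of how that style could be applied only for the opposite player; the isOpponentTurn flag and the Board component are hypothetical, not from the original post.

import React from 'react';
import { View } from 'react-native';

// Hypothetical wrapper: flip the whole board only while it is the opposite player's turn.
const Board = ({ isOpponentTurn, children }) => (
  <View style={[{ flex: 1 }, isOpponentTurn && rotateView]}>
    {children}
  </View>
);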

Related

getUserMedia (Selfie) Full Screen on Mobile

I have the following constraints, which work perfectly fine in Chrome on desktop (simulating a mobile resolution):
const constraints = {
  audio: false,
  video: {
    width: screen.width,
    height: screen.height
  }
};
navigator.mediaDevices.getUserMedia(constraints).then(stream => {});
However, when actually trying this on iPhone / Safari, the camera doesn't respect this at all and the feed gets super small or distorted. Removing the width / height from the constraints gives a better ratio, but it is not full screen at all, just centered.
I've also tried with min / max constraints, without luck.
Is there any way to get this working on iPhones?
I have built a few AR websites which are mobile first. When you request a resolution, the web browser checks whether the resolution exists, and if it doesn't, it then decides whether it should emulate the feed for you. Not all browsers do emulation (even though it is part of the spec). This is why it may work in some browsers and not others. Safari won't emulate the resolution you are asking for with the camera you have picked (I presume the front one).
You can read more about this here (different problem, but it provides a deeper explanation): Why the difference in native camera resolution -vs- getUserMedia on iPad / iOS?
Solution
The way I tackled this is:
Without canvas
Ask for a 720p feed, and fall back to a 480p feed if 720p gives an over-constrained error. This will work cross-browser.
Have a div element which is 100% width and height, fills the screen, and sets overflow to hidden.
Place the video element connected to the MediaStream inside it, and make it 100% height of the container. The parent div's hidden overflow will in effect crop the sides. There will be no feed distortion.
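A rough sketch of that approach (untested; it uses exact constraints so an unsupported resolution rejects with an OverconstrainedError, and the container / videoElement arguments are assumed to be created elsewhere):

// Ask for a 720p feed, fall back to 480p if the 720p request is over-constrained.
async function startCamera(container, videoElement) {
  const attempts = [
    { width: { exact: 1280 }, height: { exact: 720 } },
    { width: { exact: 640 }, height: { exact: 480 } },
  ];
  let stream = null;
  for (const video of attempts) {
    try {
      stream = await navigator.mediaDevices.getUserMedia({ audio: false, video });
      break;
    } catch (err) {
      if (err.name !== 'OverconstrainedError') throw err; // only retry on constraint failures
    }
  }
  if (!stream) throw new Error('No supported camera resolution');

  // Full-screen container with overflow hidden; the video fills its height,
  // so the overflowing sides are cropped without distorting the feed.
  container.style.cssText = 'position:fixed;inset:0;overflow:hidden';
  videoElement.style.cssText =
    'height:100%;position:absolute;left:50%;transform:translateX(-50%)';
  videoElement.srcObject = stream;
  await videoElement.play();
  return stream;
}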
With canvas
Do not show the video element; use a canvas as the video view. Make the canvas the same size as your screen, or the same aspect ratio with CSS making it fill the screen (the latter is more performant).
Calculate the top, left, width and height variables to draw the video into the canvas (make sure your calculation centers the video). Make sure you do a cover calculation vs fill. The aim is to crop the parts of the video which do not need to be shown (i.e. like the descriptions of the various methods in https://css-tricks.com/almanac/properties/o/object-fit). There is an example of how to draw video into a canvas here: http://html5doctor.com/video-canvas-magic/
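A hedged sketch of that cover calculation (drawCover and its arguments are illustrative names, not from the original answer):

// Draw the current video frame into the canvas, cropped "cover"-style and centered.
function drawCover(ctx, video) {
  const canvas = ctx.canvas;
  // Scale so the video covers the whole canvas (it may overflow on one axis).
  const scale = Math.max(canvas.width / video.videoWidth, canvas.height / video.videoHeight);
  const width = video.videoWidth * scale;
  const height = video.videoHeight * scale;
  const left = (canvas.width - width) / 2;  // center horizontally
  const top = (canvas.height - height) / 2; // center vertically
  ctx.drawImage(video, left, top, width, height);
  requestAnimationFrame(() => drawCover(ctx, video)); // redraw every frame
}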
This will give you the same effect of what you are looking for. Production examples of something similar.
https://www.maxfactor.com/vmua/
https://demo.holitionbeauty.com/
P.S. when I get time I can code an example; I'm short on hours this week.
There are a couple of quirks on mobile gUM() you need to know about.
First, if the device is in portrait orientation things work weirdly. You need to swap the width and height. So, let's say you're on a 480x640 device (do those even exist? who cares? it's an example). To get the appropriately sized video you need:
const constraints = {
  audio: false,
  video: {
    width: screen.height,
    height: screen.width
  }
};
I can't figure out exactly why it's like this. But it is. On iOS and Android devices.
Second, it's hard to get the cameras to deliver exactly the same resolution as the device screen size. I tweak the width and height to make them divisible by eight and I get a decent result.
Third, I figure out the sizes I need by putting a <video ...> tag in my little web app with CSS that makes it fill the browser screen, then querying its size with:
const rect = videoElement.getBoundingClientRect();
const width = rect.width > rect.height ? rect.width : rect.height;
const height = rect.width > rect.height ? rect.height : rect.width;
This makes the mobile browser do the work of figuring out what size you actually need, and adapts nicely to the browser's various toolbars.
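Putting those three points together, a sketch might look like this (the divisible-by-eight rounding is the tweak mentioned above; videoElement is the full-screen <video> tag):

// Measure the full-screen <video> element, round to multiples of eight, and
// request the dimensions in landscape order (long side as width) so portrait
// devices get the right feed.
const rect = videoElement.getBoundingClientRect();
const roundTo8 = n => Math.round(n / 8) * 8;

const constraints = {
  audio: false,
  video: {
    width: roundTo8(Math.max(rect.width, rect.height)),
    height: roundTo8(Math.min(rect.width, rect.height)),
  },
};

navigator.mediaDevices.getUserMedia(constraints).then(stream => {
  videoElement.srcObject = stream;
});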

How to detect if screen has rounded corners in react-native

How can I detect whether the device screen has rounded corners, and also estimate the radius of the corners (if possible)?
I want to adapt my view (more specifically, a card view) to these screens. I have successfully retrieved the screen width and height by using Dimensions:
width: Dimensions.get('window').width,
height: Dimensions.get('window').height
I am not able to adapt my parent view to the curves at the 4 corners with the above approach. If I give a static radius to the parent view, it gives a bad look & feel on rectangular screens, which is not acceptable.
One approach I thought of is to generate a list of all devices with rounded corners and apply a border radius only on those devices. But it's hard to maintain such a list and keep it up to date as new devices come to market.
Can anyone help me with this? Any sort of approach or guideline will really help me. Thank you in advance.
After struggling with it, I came up with a much simpler solution: why not ask the user whether their screen has rounded corners when the app starts for the first time, and later give them an option in my app's settings to change it whenever they want?
I stored the user's selection in local storage and modified my view based on that flag. Now I don't have to maintain a list of all devices, and it covers all use cases.
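As an illustration, a minimal sketch of persisting that flag with @react-native-async-storage/async-storage (the storage key and helper names are just examples, not from the original answer):

import AsyncStorage from '@react-native-async-storage/async-storage';

// Hypothetical storage key for the user's answer.
const ROUNDED_CORNERS_KEY = 'hasRoundedCorners';

export const saveRoundedCornerChoice = value =>
  AsyncStorage.setItem(ROUNDED_CORNERS_KEY, JSON.stringify(value));

export const loadRoundedCornerChoice = async () => {
  const raw = await AsyncStorage.getItem(ROUNDED_CORNERS_KEY);
  return raw != null ? JSON.parse(raw) : null; // null means: ask the user on first launch
};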
Currently there is no option to get the corner radius from Dimensions. There are only four values in the Dimensions object, given below:
{ width: 384, height: 592, scale: 2, fontScale: 1 }
Even if you have answered and accepted your own answer already, it is not really a solution for the original post.
I think basically all phones with notches have round corners, while devices without notches typically do not have round corners. If you have notches, you have an inset in your Safe Area. If you are using react-native-safe-area-context, for example, you can get the insets with:
const insets = useSafeAreaInsets();
const hasNotch = insets.top || insets.bottom || insets.right || insets.left;
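Applied to the card-view case from the question, that check could drive the border radius like this (the Card component and the 16-point radius are just example values):

import React from 'react';
import { View } from 'react-native';
import { useSafeAreaInsets } from 'react-native-safe-area-context';

const Card = ({ children }) => {
  const insets = useSafeAreaInsets();
  // Any non-zero safe-area inset is taken as a hint that the screen has a notch / rounded corners.
  const hasNotch = insets.top > 0 || insets.bottom > 0 || insets.left > 0 || insets.right > 0;
  return (
    <View style={{ flex: 1, borderRadius: hasNotch ? 16 : 0, overflow: 'hidden' }}>
      {children}
    </View>
  );
};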

Replicating Camera View - DeviceOrientationControls to TrackballControls

I am trying to replicate a view from a phone (using DeviceOrientationControls) to a desktop (using TrackballControls). I am passing the view state (camera position & direction) through an intermediary server, and have that part mostly working.
I'm having trouble setting the camera rotation on the desktop. The cameras are synced to look at the same point, but the view on the desktop (receiving the view state from the phone) rotates around the viewing axis.
I definitely don't fully understand quaternions or rotation order. I've tried applying those, but clearly I'm out of my element. I guess I'm just looking for some hints on how to sync the camera rotation on the desktop.
Looks like I had a (trackball) controls.update() in my animate() that was blowing away the rotation I was setting. The camera position and direction are not changed by this, but the rotation (the "roll" of the camera) was.
In TrackballControls, it would be nice to have a setting for programmatically updating the camera's rotation that wouldn't get squashed by a call to rotateCamera(). I'll have to think about that, because it doesn't seem like it would be easy to implement.
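As an illustration of the workaround, re-applying the received orientation after the controls update keeps update() from squashing it (remoteQuaternion stands in for whatever orientation data arrives from the phone; camera, controls, scene and renderer are the usual three.js objects):

// Desktop render loop: let TrackballControls update first, then restore the
// roll received from the phone so controls.update() cannot overwrite it.
function animate() {
  requestAnimationFrame(animate);
  controls.update();                        // TrackballControls
  camera.quaternion.copy(remoteQuaternion); // orientation synced from the phone
  renderer.render(scene, camera);
}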

Implementing image gestures: UIImageView mode conflicts with pinch gesture

Pan works fine for me, but pinch with recognizer code like this does not:
- (void)pinchDetected:(UIPinchGestureRecognizer *)pinchRecognizer
{
    CGFloat scale = pinchRecognizer.scale;
    self.imageView.transform = CGAffineTransformScale(self.imageView.transform, scale, scale);
    pinchRecognizer.scale = 1.0;
}
What happens is that the image view is continuously resetting the image according to its "mode", whether it's center, aspect fit, etc.
I solved my problem: I'm making my first image viewer, and to learn how to pinch and zoom, I naively googled for how to support gestures, which are not enabled by simply adding an image view to a view controller.
Unfortunately, there are many "tutorials" on this, showing how to program with the gesture recognizers, etc., and I spent a few hours going down this route unnecessarily. I kept going because I felt tantalizingly close to getting things working: the pan gesture was flawless and it was "just" zoom that was broken.
(Side question: is there some awesome source for current, iOS 6 "best practices"?)
It turns out this is the wrong path and needlessly complex for basic gesture recognition. All that's needed is to place the image view in a scroll view; 99% of the programming is then taken care of. (I was convinced this had to be the case; I couldn't believe that such core functionality wouldn't be provided by Cocoa Touch.)

AssetsLibrary with UIImagePicker - ELCImagePickerController

I was wondering if anyone else has used the following in their iOS applications.
https://github.com/elc/ELCImagePickerController
Basically it is a clone of the UIImagePicker using the AssetsLibrary that is available with iOS 4.0.
For the most part I like it, but I ran into two issues.
1) When on a device, it takes quite a while to load when there are more than 200 images in a library. While it works once it loads, it takes quite a bit longer than I would ideally like.
2) When selecting some images, it brings them over with a different orientation than is shown on the screen. (It looks like that happens most with pictures I took with the iPhone.) I've even seen it turn an image upside down.
I am just curious if anyone else has used this, and if so, were they able to overcome these issues.
Regarding orientation, you can use "ALAssetPropertyOrientation" to get the image orientation and then convert to whatever orientation you may need.
Below are the orientations that iOS supports:
typedef enum {
    UIImageOrientationUp,            // default orientation
    UIImageOrientationDown,          // 180 deg rotation
    UIImageOrientationLeft,          // 90 deg CCW
    UIImageOrientationRight,         // 90 deg CW
    UIImageOrientationUpMirrored,    // as above but image mirrored along other axis. horizontal flip
    UIImageOrientationDownMirrored,  // horizontal flip
    UIImageOrientationLeftMirrored,  // vertical flip
    UIImageOrientationRightMirrored, // vertical flip
} UIImageOrientation;
1) Load the first 100 images, update the GUI, and load the others in the background.
2) UIImage has an imageOrientation property.