I'm a newbie to React Native, so if I'm asking a very dumb question, please forgive me for wasting your time.
I need to support multiple device layouts in my React Native app. Let's say my application's screens have completely different appearances but the same business processes on mobile and tablet devices.
How do I achieve that in React Native? Where do I start digging?
EDIT 2020:
Ok, I was a newbie...! Sorry!
Like @Hariks says, you could try using something like this module
and put something like:
import Device from 'react-native-device-detection';

// Mobile Styles
let imageSize = 60;

// Tablet Styles
if (Device.isTablet) {
    imageSize = 120;
}
Old answer: (if you want to detect OS)
I'm a newbie too, and from what I've understood and extracted from here, there are two methods:
By naming files (recommended). Platform-specific files can be named "[filename].android.js" and "[filename].ios.js" for Android and iOS respectively. If we import or require [filename], it picks up the file depending on the host platform.
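For instance, with a hypothetical Header component split into two files (the names are just for illustration), the import stays platform-agnostic:

// Header.ios.js and Header.android.js each export their own version;
// this import resolves to the right file on each platform:
import Header from './Header';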
By adding conditionals in the source code. For example, if you want the background of the navbar in different colors for iOS and Android, we can write the following code:
backgroundColor: (Platform.OS === 'ios') ? 'gray' : 'blue'
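In context, that conditional could live in a stylesheet; a minimal sketch:

import { Platform, StyleSheet } from 'react-native';

const styles = StyleSheet.create({
    navbar: {
        // gray on iOS, blue on Android
        backgroundColor: Platform.OS === 'ios' ? 'gray' : 'blue',
    },
});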
Of course, you should take a look at the official documentation.
If you are styling based on the OS, you could use Platform as mentioned by @anfuca. If you need to style based on the device type, i.e. tablets vs. phones, there is a handy module, react-native-device-detection.
You could do something like this in your style file
import Device from 'react-native-device-detection';

// Mobile Styles
let imageSize = 60;

// Tablet Styles
if (Device.isTablet) {
    imageSize = 120;
}
You could also create a global style file where you define font sizes and the like based on the device type and pixel ratio.
commonstyle.js
import Device from 'react-native-device-detection';

let h1 = 20;
let h2 = 18;
let h3 = 16;

if (Device.isTablet) {
    h1 = 25;
    h2 = 22;
    h3 = 20;
}

module.exports = {
    h1,
    h2,
    h3
};
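A component could then pull its font sizes from that file; a minimal sketch, assuming commonstyle.js sits next to the component:

import { StyleSheet } from 'react-native';
import { h1, h2 } from './commonstyle';

const styles = StyleSheet.create({
    title: { fontSize: h1 },    // 20 on phones, 25 on tablets
    subtitle: { fontSize: h2 }, // 18 on phones, 22 on tablets
});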
You can use device detection to detect whether the device is a mobile or a tablet, and apply separate styling for each accordingly:
https://www.npmjs.com/package/react-native-device-detection
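For example, you could select an entire style set per device class; a sketch with illustrative values:

import Device from 'react-native-device-detection';
import { StyleSheet } from 'react-native';

// Pick the whole layout per device class (values are illustrative)
const base = Device.isTablet
    ? { padding: 16, fontSize: 18 }
    : { padding: 8, fontSize: 14 };

const styles = StyleSheet.create({
    container: { padding: base.padding },
    label: { fontSize: base.fontSize },
});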
Setting MaterialTheme.colors
I'm trying to make a very basic window in Jetpack Compose for Desktop (not mobile), but I'm having some difficulties with changing the colors of the window. I've looked at some tutorials and examples, but maybe I don't quite understand how color themes are correctly implemented.
The code that I wrote should create a window with a dark background, but the window when the program runs is white.
Please provide any insights you can as to what I am doing wrong.
Code (Kotlin)
import androidx.compose.desktop.*
import androidx.compose.material.*
import androidx.compose.ui.unit.*
fun main() = Window(
    title = "Window",
    resizable = false,
    size = IntSize(1200, 800),
) {
    MaterialTheme(colors = darkColors()) {
    }
}
Other Info
macOS Big Sur
IntelliJ 2021.2
Jetpack Compose 0.4.0
MaterialTheme only provides colors for the views inside it; it does not create or render any view itself.
Most Material components will use these colors as default values, but you can also use them in your own views via, for example, MaterialTheme.colors.background.
You need to put some view inside, give it a size, and apply a background color, for example:
MaterialTheme(colors = darkColors()) {
    Box(Modifier.fillMaxSize().background(MaterialTheme.colors.background))
}
You can also use Scaffold to see the theme applied.
In your example:
...
MaterialTheme(colors = darkColors()) {
    Scaffold {
        // your content
    }
}
...
You can read about it:
https://developer.android.com/jetpack/compose/layouts/material
or here:
https://metanit.com/kotlin/jetpack/4.11.php
Native resolution for Apple TV seems to be 1920x1080 (as expected), but for Android TV / Fire TV it seems to be 961.5022957581195x540.8450413639423 (according to Dimensions.get('window')).
So, when I run my app on Apple TV everything looks fine. But when I run it on an Android TV nothing fits on the screen.
Is there a way to force the Android TV to shrink everything? Or do I have to create two different style sheets for the different devices to change font sizes and dimensions of all my components?
We use a different approach: in the base Application class for TV we add this:
class TvApplication extends Application {
    @Override
    protected void attachBaseContext(Context base) {
        Configuration configuration = new Configuration(base.getResources().getConfiguration());
        // Halve the reported density so layout sizes match other platforms
        configuration.densityDpi = configuration.densityDpi / 2;
        Context newContext = base.createConfigurationContext(configuration);
        super.attachBaseContext(newContext);
    }
}
With this, we get consistent width and height when using Dimensions, and we can use the same styling values on all platforms without doing any manipulation on the JS side.
It's not perfect, but it's more convenient when building for multiple platforms.
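On the JS side nothing changes; Dimensions just starts reporting comparable logical sizes on both platforms (the numbers below are illustrative):

import { Dimensions } from 'react-native';

// With densityDpi halved on the native side, Android TV reports roughly
// double the logical size it did before, e.g. ~1920x1080 instead of ~960x540
const { width, height } = Dimensions.get('window');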
Use Platform.OS to check for the platform, and use the margin property in your styles to get the content to fit the screen on Android. This is normal behavior on Android TV.
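A minimal sketch of that idea (the margin value is illustrative):

import { Platform, StyleSheet } from 'react-native';

const styles = StyleSheet.create({
    screen: {
        // give Android TV some breathing room; other platforms keep 0
        margin: Platform.OS === 'android' ? 24 : 0,
    },
});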
You have PixelRatio and Dimensions for this purpose in React Native. Along with these, you can use the RN module react-native-pixel-perfect, which keeps your app pixel-perfect across all devices, quickly and easily.
import {PixelRatio, Dimensions} from 'react-native';
import {create} from 'react-native-pixel-perfect';
let displayProps = {
    width: PixelRatio.roundToNearestPixel(
        Dimensions.get('window').width * PixelRatio.get(),
    ),
    height: PixelRatio.roundToNearestPixel(
        Dimensions.get('window').height * PixelRatio.get(),
    ),
};
let perfectSize = create(displayProps);
Now always pass your sizes in pixels to this function; it converts them to the appropriate size for the device.
const styles = StyleSheet.create({
    container: {
        width: perfectSize(500),
        height: perfectSize(300)
    }
});
Your container will now adapt correctly to each device based on its screen resolution.
If you have a minimum width and height to support, but some devices fall below that minimum resolution and you still want the same results on them, you can set the minimum resolution in displayProps, like below:
let displayProps = {
    width: PixelRatio.roundToNearestPixel(
        Math.max(1920, Dimensions.get('window').width * PixelRatio.get()),
    ),
    height: PixelRatio.roundToNearestPixel(
        Math.max(1080, Dimensions.get('window').height * PixelRatio.get()),
    ),
};
So if a device's screen resolution is less than 1920x1080 (say, a 720p device), this will still render the UI as if the screen were 1920x1080.
I'm trying to implement this camera, but one of the obstacles I'm facing right now is the merging of two cameras (what he describes here).
At first I tried to make a non-rectangular camera, but I don't think it's possible without changing a lot of things in the way HaxeFlixel renders.
And then I found the alphaMask() function in the FlxSpriteUtil package and I think it would be a better solution.
Not only would it solve my problem, it would actually permit all kinds of funky-shaped cameras; you just have to create the right mask!
But the new problem is that I don't know how to apply it to the camera (and again, whether it's even possible without changing FlxCamera a bit).
Internally, FlxCamera only uses a FlxSprite in blit render mode, and I am in tiles render mode (I haven't found how to change that, and it wouldn't be a good enough solution in my opinion), which uses a Flash Sprite instead, and I don't know what to do with it.
So in short, do you have an idea how to apply an AlphaMask to a FlxCamera? Or another way to achieve what I'm trying to do?
PS: If you want to have a look at the (ugly, and commented in French) code, it's over here!
You can render the contents of a FlxCamera to a FlxSprite (though it does require conditional code based on the render mode). The TurnBasedRPG tutorial game uses this for the wave effect in the combat screen; see CombatHUD.hx:
if (FlxG.renderBlit)
    screenPixels.copyPixels(FlxG.camera.buffer, FlxG.camera.buffer.rect, new Point());
else
    screenPixels.draw(FlxG.camera.canvas, new Matrix(1, 0, 0, 1, 0, 0));
Here's a code example that uses this to create a HaxeFlixel-shaped camera:
package;

import flash.geom.Matrix;
import flixel.FlxCamera;
import flixel.FlxG;
import flixel.FlxSprite;
import flixel.FlxState;
import flixel.graphics.FlxGraphic;
import flixel.system.FlxAssets;
import flixel.tweens.FlxTween;
import flixel.util.FlxColor;
import openfl.geom.Point;

using flixel.util.FlxSpriteUtil;

class PlayState extends FlxState
{
    static inline var CAMERA_SIZE = 100;

    var maskedCamera:FlxCamera;
    var cameraSprite:FlxSprite;
    var mask:FlxSprite;

    override public function create():Void
    {
        super.create();

        maskedCamera = new FlxCamera(0, 0, CAMERA_SIZE, CAMERA_SIZE);
        maskedCamera.bgColor = FlxColor.WHITE;
        maskedCamera.scroll.x = 50;
        FlxG.cameras.add(maskedCamera);
        // this is a bit of a hack - we need this camera to be rendered so we can copy the content
        // onto the sprite, but we don't want to actually *see* it, so just move it off-screen
        maskedCamera.x = FlxG.width;

        cameraSprite = new FlxSprite();
        cameraSprite.makeGraphic(CAMERA_SIZE, CAMERA_SIZE, FlxColor.WHITE, true);
        cameraSprite.x = 50;
        cameraSprite.y = 100;
        cameraSprite.cameras = [FlxG.camera];
        add(cameraSprite);

        // the HaxeFlixel logo, used as the camera's alpha mask
        mask = new FlxSprite(0, 0, FlxGraphic.fromClass(GraphicLogo));

        var redSquare = new FlxSprite(0, 25);
        redSquare.makeGraphic(50, 50, FlxColor.RED);
        add(redSquare);
        FlxTween.tween(redSquare, {x: 150}, 1, {type: FlxTween.PINGPONG});
    }

    override public function update(elapsed:Float):Void
    {
        super.update(elapsed);

        // copy the camera's content onto the sprite, then mask it
        var pixels = cameraSprite.pixels;
        if (FlxG.renderBlit)
            pixels.copyPixels(maskedCamera.buffer, maskedCamera.buffer.rect, new Point());
        else
            pixels.draw(maskedCamera.canvas);
        cameraSprite.alphaMaskFlxSprite(mask, cameraSprite);
    }
}
Is there a code sample available that illustrates how to use a 2D transform (such as rotate and scale) with a JPG in a react-native application, perhaps with the code in the tutorial as a starting point?
If possible, it would be helpful to see code for two scenarios:
1) automatically apply a transform when the app is launched
2) apply a transform after different types of user gestures
At some point in the future it would be interesting to see how to create 3D transforms and animation effects.
Update: You can see the entire example in my sample app here: https://github.com/grgur/react-native-memory-game
Animation is now AnimationExperimental so we'll need to modify zvona's solution.
First, make sure RCTAnimationExperimental is a linked library
If not, then follow these steps:
Navigate to node_modules/react-native/Libraries/Animation/
Drag and drop RCTAnimationExperimental.xcodeproj into your project's Libraries group
Click on your project name (in this example, my project name is Memory)
Switch to the Build Phases tab
Expand Libraries/RCTAnimationExperimental.xcodeproj/Products
Drag libRCTAnimationExperimental.a to Link Binary With Libraries
Ok, the hardest part is now over. Head over to your JavaScript file. Animation is no longer part of the react-native package, so we have to include it explicitly.
var React = require('react-native');
var AnimationExperimental = require('AnimationExperimental');
Alright, champ, you're ready to animate. Make sure you know what you're animating. The view you will be animating is referred to as node.
AnimationExperimental.startAnimation({
    node: this.refs.image,
    duration: 400,
    easing: 'easeInQuad',
    property: 'opacity',
    toValue: 0.1,
});
And that's it!
At the moment of writing, available properties are: opacity, position, positionX, positionY, rotation, scaleXY
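Since the original question asks about rotation and scale, the same call should work with those properties; a sketch, assuming a ref named image on the <Image> you render (verify the unit rotation expects in your RN version, as this experimental API changed often):

// e.g. <Image ref='image' source={...} style={styles.image} />
AnimationExperimental.startAnimation({
    node: this.refs.image,
    duration: 400,
    easing: 'easeInQuad',
    property: 'rotation',
    toValue: Math.PI, // assumed to be radians here
});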
Currently this is a somewhat complex process, and I'm planning to write a blog post about it. However, as a brief starter, I'll write something here.
The first problem is that RCTAnimation / RCTAnimationManager is not present at all if you've created your project with react-native init [ProjectName] (https://github.com/facebook/react-native/issues/226).
You need to add it in Xcode from the plus sign in the top left corner: "Add Files to [ProjectName]". Then you navigate to node_modules > react-native > Libraries > Animation > RCTAnimation.xcodeproj. After it's imported, you need to drag it under Libraries in your project.
Then you need to open the Build Phases tab. There you have the menu Link Binary With Libraries (x items). Drag the file named libRCTAnimation.a from Products under RCTAnimation.xcodeproj to that menu.
Now you can build your project to support animations. I'm not that familiar with Xcode, so there could be an even simpler way of achieving this, but this is how I got it sorted.
The second problem is that not all of the available (or planned) functionality is there. At least, I ran through a jungle of trial and error before I got anything on the screen.
Try e.g. this code at first to verify that animations are working:
var React = require('react-native');

var {
    Animation,
    AppRegistry,
    StyleSheet,
    Text,
    View
} = React;

var styles = StyleSheet.create({
    test: {
        width: 400,
        height: 400,
        backgroundColor: 'blue',
        opacity: 0
    }
});
var AnimationTest = React.createClass({
    componentDidMount () {
        // fade the view in as soon as it mounts
        Animation.startAnimation(this.refs['this'], 400, 0, 'linear', {opacity: 1});
    },

    render () {
        return (
            <View ref='this' style={styles.test}>
                <Text>Just an animation test</Text>
            </View>
        );
    }
});

AppRegistry.registerComponent('AnimationTest', () => AnimationTest);
This should get you going. If you need any further assistance, please let me know.
If I ever succeed in writing more complete instructions in the form of a blog article, I'll update this answer.
Check out the 2048 demo application for example usage of the RCTAnimation library:
https://github.com/facebook/react-native/tree/master/Examples/2048
It doesn't use any especially complex transforms, but it does animate position, opacity, and scaleXY of various elements with code that looks like this:
Animation.startAnimation(this.refs['this'], 300, 0, 'easeInOutQuad', {scaleXY: [1, 1]});
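For the static case in the question (apply a transform when the app launches), a plain transform style on the Image needs no animation library at all; a minimal sketch, with illustrative values:

var React = require('react-native');
var { StyleSheet } = React;

var styles = StyleSheet.create({
    photo: {
        width: 100,
        height: 100,
        // applied as soon as the component renders
        transform: [{rotate: '45deg'}, {scale: 1.5}],
    },
});

// then: <Image source={...} style={styles.photo} />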
I have created an app for iPhone using Appcelerator, in which clicking a button plays the corresponding sound. Here is the code. The problem is that when I play different audios many times using this function, the sound starts to lag and gets noise in it. Can anybody help me with it? Thanks.
var soundplaying = 0;
var sound;

function playaudio(url) {
    if (soundplaying == 0) {
        sound = Ti.Media.createSound({});
        sound.setUrl('../assets/audio/' + url);
        sound.addEventListener('complete', function() {
            sound.release();
            soundplaying = 0;
        });
        sound.play();
        soundplaying = 1;
    }
}
(I have tried releasing the sound object each time, but still no use. I also tried calling createSound only once, but it seems Titanium does not support changing the url of a Media.Sound dynamically.)
I could solve this issue, at least temporarily, by changing the audio file format to .m4a (AAC); I was using .mp3 earlier.