I am trying to disable the seek function on react native video. I have a full video that I want to preview for 30 seconds. In order to do this I want to disable the seek button so a user cannot skip through the video.
I have tried giving onSeek the value of a function that exits the video player, however this does not seem to do anything.
if (!loading) {
  return <Video source={{ uri: uri }} // Can be a URL or a local file.
    onFullscreenPlayerDidDismiss={this.onDismiss}
    preferredPeakBitrate={this.state.preferredPeakBitrate}
    ref={(player) => {
      if (!this.state.playing && player) {
        player.presentFullscreenPlayer()
        this.setState({ playing: true })
      }
    }} // Store reference
    rate={1.0} // 0 is paused, 1 is normal.
    volume={1.0} // 0 is muted, 1 is normal.
    muted={false} // Mutes the audio entirely.
    paused={false} // Pauses playback entirely.
    resizeMode="cover" // Fill the whole screen at aspect ratio.
    repeat={false} // Repeat forever.
    playInBackground={true} // Audio continues to play when app enters background.
    playWhenInactive={true} // [iOS] Video continues to play when control or notification center are shown.
    ignoreSilentSwitch={"ignore"} // [iOS] ignore | obey - When 'ignore', audio still plays with the iOS hard silent switch set to silent. When 'obey', audio toggles with the switch. When not specified, inherits audio settings as usual.
    progressUpdateInterval={PROGRESS_MILLISECONDS} // [iOS] Interval to fire onProgress (defaults to ~250ms)
    onError={this.onVideoError} // Callback when video cannot be loaded
    onProgress={this.onProgress}
    onLoadStart={this.onStart}
    onEnd={this.stopPlaybackPing}
  />
} else {
  return <View />
}
}
Short answer: No, you can't.
You called presentFullscreenPlayer() to play the video, and unfortunately you can't disable any buttons on that player. On an iPhone it is the default player provided by Apple, not by the people who created react-native-video, and I don't believe there's any public API that allows you to do so.
What you can do, however, is write your own full screen player, with whatever buttons you want/don't want on there. Here's a hint:
Create a custom component called CustomVideo, which takes the url of the video as a prop:
// CustomVideo.js file
import React, { PureComponent } from 'react';
import { ... } from 'react-native';
import Video from 'react-native-video';

export class CustomVideo extends PureComponent {
  constructor(props) {
    super(props)
    this.state = {
      // Have any state you want here, for example
      paused: false,
      played: 0,
      duration: 0,
      isFullscreen: false
    }
  }

  render() {
    const { url } = this.props;
    const { paused, played, duration, isFullscreen } = this.state;
    return (
      <View style={{ ... }}>
        <Video
          source={{ uri: url }}
          ...
        />
        {/* =======> Here, you add your custom buttons <======= */}
        {/* Suppose you want a pause/play button */}
        <TouchableOpacity onPress={this.toggleVideo}>
          <Text>{paused ? "Play" : "Pause"}</Text>
        </TouchableOpacity>
        {/* If you want a progress indicator, which users
            can use to skip through the video, use a `Slider` component */}
        <Slider
          value={...}
          step={...}
          onValueChange={(value) => ...}
        />
        {/* Here, you toggle whether you want to play the video
            in full screen mode; if so, render it in a modal.
            Also, add a full screen toggle button to the video
            the same way you add a play/pause button */}
        <Modal visible={isFullscreen}>
          <View>
            <Video ... />
          </View>
        </Modal>
      </View>
    );
  }
}
So, next time, when you want to render a video, instead of calling <Video source={{ uri: '...' }} />, you can use your own <CustomVideo url='https://....' /> component.
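If the goal from the question is a 30-second preview, the same custom player can also enforce that limit itself, for example by pausing in onProgress. This is only a rough sketch; the 30-second cutoff and the onPreviewEnd prop are illustrative, not part of the original code:

<Video
  source={{ uri: url }}
  paused={paused}
  // Pause once the preview window has elapsed.
  onProgress={({ currentTime }) => {
    if (currentTime >= 30 && !this.state.paused) {
      this.setState({ paused: true });
      // Optionally notify the parent that the preview ended.
      if (this.props.onPreviewEnd) this.props.onPreviewEnd();
    }
  }}
/>

Because the seek UI is yours (the Slider above), you can simply clamp or omit it for preview mode.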
Related
I'm trying to create a workaround to calculate/get the content height which should be rendered inside a React Native WebView component.
For that I'm artificially calling the onScroll event with JavaScript, then trying to update state with the actual content height. But even though the console log shows different values for the content height, which should trigger a state update, it doesn't happen. It only gets updated when really scrolling. The event is definitely triggered even without scrolling, since the console log is working.
Any idea why the state update fails here? (In the code snippet some backticks are missing.)
const [contentHeight, setContentHeight] = useState(10);

const webViewScript = `
  setTimeout(function() {
    window.postMessage(document.body.scrollDown);
  }, 1000);
  true; // note: this is required, or you'll sometimes get silent failures
`;

<WebView
  injectedJavaScript={`window.ReactNativeWebView.postMessage(${webViewScript})`}
  domStorageEnabled={true}
  automaticallyAdjustContentInsets={false}
  onScroll={(event) =>
    setContentHeight(Number(event.nativeEvent.contentSize.height))
  }
  javaScriptEnabled
  style={{ height: contentHeight }}
  source={{ html: htmlTempl, baseUrl: '' }}
/>
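For reference, here is a minimal sketch of the same idea that reads the height through the WebView's onMessage bridge instead of onScroll. It assumes the react-native-webview package, and that document.body.scrollDown above was meant to be document.body.scrollHeight; AutoHeightWebView and the htmlTempl prop are just illustrative names:

// Sketch only: size the WebView to its content via the onMessage bridge.
import React, { useState } from 'react';
import { WebView } from 'react-native-webview';

// Injected into the page: post the body height back to the app after load.
const webViewScript = `
  setTimeout(function() {
    window.ReactNativeWebView.postMessage(String(document.body.scrollHeight));
  }, 1000);
  true; // required, or you'll sometimes get silent failures
`;

export function AutoHeightWebView({ htmlTempl }) {
  const [contentHeight, setContentHeight] = useState(10);
  return (
    <WebView
      injectedJavaScript={webViewScript}
      javaScriptEnabled
      domStorageEnabled
      automaticallyAdjustContentInsets={false}
      // The injected script posts the height; read it here instead of onScroll.
      onMessage={(event) => setContentHeight(Number(event.nativeEvent.data))}
      style={{ height: contentHeight }}
      source={{ html: htmlTempl, baseUrl: '' }}
    />
  );
}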
I'm working on an app where we want to allow our users (tech support personnel) to take a photo of an employee's badge so we can analyze the photo and extract the badge ID (we already have an open source algorithm to do this) to create a ticket. I have the app running a webview of the application used by helpdesk personnel from their desktops, as well as a button at the top to open the camera (currently it's just an alert). Code is below:
const onPress = () => {
  Alert.alert('Will capture the badge using the camera');
};

export default class App extends React.Component {
  render() {
    return (
      <View style={styles.container}>
        <Button
          onPress={onPress}
          title="Capture Badge"
          color="#841584"
          accessibilityLabel="Capture a Customer's ID via a picture of their badge"
        />
        <WebView
          source={{uri: "www.example.com"}}
        />
      </View>
    );
  }
}
Basically, what do I need to put in onPress so that I can take a picture and then send it to our algorithm (the algorithm will be just a function call at the end of onPress)? I've done research on using a camera in React Native, but everything is talking about rendering a camera in the view. I only want to open a camera view if a user taps the "Capture Badge" button at the top of the app.
You can use react-native-image-crop-picker. If I understood you correctly, you want to be able to launch the camera and then, once its promise resolves, send the image to your object detection algorithm. With this library you can launch the camera as follows:
openCamera() {
  ImagePicker.openCamera({
    width: 300,
    height: 400,
    cropping: true
  }).then(image => {
    // Your image
    console.log(image);
  });
}
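A minimal sketch of how that might be wired into the onPress from the question, handing the captured image to a hypothetical analyzeBadge() function that stands in for your algorithm:

import ImagePicker from 'react-native-image-crop-picker';

const onPress = () => {
  ImagePicker.openCamera({
    width: 300,
    height: 400,
    cropping: true
  })
    .then(image => analyzeBadge(image.path)) // image.path is the local file path of the capture
    .catch(err => console.warn('Camera cancelled or failed', err));
};

The camera only opens when the button is tapped, so nothing camera-related needs to be rendered in the view itself.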
I have a list of items with a video URL in each row. I have used the react-native-video component for the video view.
<TouchableOpacity
  style={styles.fullScreen}
  onPress={() => this.setState({ paused: !this.state.paused })}
>
  <Video source={{ uri: rowData.podcastUrl }} // URL
    ref={(ref) => {
      // Store reference
      this.player = ref
      ......
      ......
    }}
    paused={true}
  />
</TouchableOpacity>
The code above is inside the renderMyList() function, which is called when rendering each row.
When I load my UI, all the videos are either playing or stopped, based on whether the 'paused' state is false or true (whatever I pass in above). But I want to properly handle play/pause for each individual video (list item). I have maintained a paused state variable.
Please suggest some sample code.
If you are planning to let only one video play at any given moment, you can do something like below:
onPress={() => { this.setState({ playing: 'someUniqueIdForThisVideo'}) }}
And check whether that particular video is the one that should be playing:
paused={this.state.playing !== 'videoIdOrSomething'}
If you want multiple videos to play at the same time, you can do something like below:
onPress={() => { this.setState({ ['someUniqueIdForThisVideo']: true}) }}
And check the state like below:
paused={this.state['someUniqueIdForThisVideo'] !== true}
This is just to give you a rough idea. You need to implement some logic to set the state back to false if the video is already playing.
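For example, here is a rough sketch of the one-video-at-a-time pattern inside a list. It assumes each row has an id and a podcastUrl field (podcastUrl is taken from the question, the rest is illustrative):

import React, { Component } from 'react';
import { FlatList, TouchableOpacity } from 'react-native';
import Video from 'react-native-video';

class VideoList extends Component {
  state = { playingId: null };

  renderItem = ({ item }) => (
    <TouchableOpacity
      // Tapping a row plays it; tapping the currently playing row pauses it again.
      onPress={() =>
        this.setState(({ playingId }) => ({
          playingId: playingId === item.id ? null : item.id
        }))
      }
    >
      <Video
        source={{ uri: item.podcastUrl }}
        paused={this.state.playingId !== item.id}
        style={{ height: 200 }}
      />
    </TouchableOpacity>
  );

  render() {
    return (
      <FlatList
        data={this.props.data}
        keyExtractor={item => String(item.id)}
        renderItem={this.renderItem}
      />
    );
  }
}

Because paused is derived from a single playingId in state, starting one video automatically pauses whichever one was playing before.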
I'm interested in having a view which initially loads with my React Native app and essentially has nested components in it. These components will give visual cues to the user as to what state the app is in, e.g. still loading data from the server, etc. Basically, it's not just a static splash screen. I might also add some kind of spinner/progress bar, i.e. other animated components.
There are solutions out there for static splash screens that initially show while your app loads into memory, but I need to be able to load an initial component, and then remove it when the application's data is ready to go. Is there a convention/pattern to follow in order to achieve this? Is there a way to mount a component, then remove it when it's no longer necessary allowing the rest of the app to be displayed? What's a best practice for this using React Native?
This is what I used to do:
Use <Modal /> to provide your initial, interactive page. It blocks the screen with a semi-transparent background; if you'd like it to be full screen, just use flex: 1 on the <View /> inside the <Modal />.
Use a global object/queue for loading status information. My choice is rxjs; your initial page can then listen to this single source of truth (I suggest a BehaviorSubject). You can subscribe to it and receive something like:
...
{ tag: 'FetchRemoteData', progress: 10 }
{ tag: 'LoadingComponent', progress: 5 }
{ tag: 'FetchRemoteData', progress: 20 }
...
Read it until it matches your "load complete" condition, then close the modal.
To make it clearer, here it is in code.
app.js
render() {
  return (
    <View>
      <InitialBlockingPage />
      <YourMainApp />
    </View>
  );
}
initial-blocking-page.js
constructor(props) {
  super(props);
  this.state = {
    visible: true
  };
}

componentDidMount() {
  globalQueue.subscribe(() => {
    // pseudo code: until fully loaded
    if (fullloaded) this.setState({ visible: false });
  });
}

render() {
  return (
    <Modal visible={this.state.visible}>
      <SplashScreenWithData />
    </Modal>
  );
}
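For completeness, one possible shape for globalQueue, using an rxjs BehaviorSubject as the single source of truth for the { tag, progress } events shown earlier. The reportProgress helper is only illustrative:

// global-queue.js
import { BehaviorSubject } from 'rxjs';

// Emits the latest { tag, progress } event to every subscriber.
export const globalQueue = new BehaviorSubject({ tag: 'Init', progress: 0 });

// Call this from anywhere in the app while loading.
export function reportProgress(tag, progress) {
  globalQueue.next({ tag, progress });
}

The splash page subscribes as in componentDidMount above and flips visible to false once its "load complete" condition is met; keep the subscription object around and unsubscribe in componentWillUnmount.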
I am trying to insert a YouTube video into my react-native project. I added the react-native-video library, then this code:
import React, { Component } from 'react';
import {
  AppRegistry,
  StyleSheet,
  Text,
  View,
  Video
} from 'react-native';

export default class video extends Component {
  render() {
    return (
      <Video source={{uri: "Gladiator trailer.mp4"}}
        ref={(ref) => {
          this.player = ref
        }} // Store reference
        rate={1.0} // 0 is paused, 1 is normal.
        volume={1.0} // 0 is muted, 1 is normal.
        muted={false} // Mutes the audio entirely.
        paused={false} // Pauses playback entirely.
        resizeMode="cover" // Fill the whole screen at aspect ratio.
        repeat={true} // Repeat forever.
        playInBackground={false} // Audio continues to play when app entering background.
        playWhenInactive={false} // [iOS] Video continues to play when control or notification center are shown.
        progressUpdateInterval={250.0} // [iOS] Interval to fire onProgress (default to ~250ms)
        onLoadStart={this.loadStart} // Callback when video starts to load
        onLoad={this.setDuration} // Callback when video loads
        onProgress={this.setTime} // Callback every ~250ms with currentTime
        onEnd={this.onEnd} // Callback when playback finishes
        onError={this.videoError} // Callback when video cannot be loaded
        style={styles.backgroundVideo} />
    );
  }
}

const styles = StyleSheet.create({
  backgroundVideo: {
    position: 'absolute',
    top: 0,
    left: 0,
    bottom: 0,
    right: 0,
  }
});

AppRegistry.registerComponent('video', () => video);
I am receiving this error: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: undefined. Check the render method of 'video'.
That error is thrown when there's something wrong with a component you're returning from the render method. In this case I think it's your Video component; to my knowledge, there is no Video component provided by React Native itself. If you're pulling it in from another library, make sure to import it from there and not from RN.
For example:
import Video from 'react-native-video';
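Applied to the snippet above, that roughly means dropping Video from the react-native import list and pulling it from the library instead; a minimal sketch of the corrected imports:

import React, { Component } from 'react';
import { AppRegistry, StyleSheet, View } from 'react-native';
import Video from 'react-native-video'; // Video comes from the library, not from RN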