I am trying to make a video player app in React Native. I want a feature that resumes a video from where the user left off before closing the application. Any clue?
I am using react-native-video.
You can get the current playback time from the video component's onProgress callback (data.currentTime).
You can then store it with AsyncStorage (React Native does not have the browser's localStorage).
More specifically, if you want to achieve this using a store or AsyncStorage, please check this out:
import Video from 'react-native-video'

constructor(props) {
  super(props)
  this.progress = 0
  // Store this progress time in AsyncStorage (or your store) so that when
  // you navigate back to this screen you can restore it; otherwise fall
  // back to the initial value of 0.
}

onProgress = (data) => {
  this.progress = data.currentTime
}

<Video
  ref={(ref) => { this.player = ref }}
  source={{ uri: '.....' }}
  onProgress={this.onProgress}
  onLoad={() => {
    if (menuVideoProgress > 0) {
      // menuVideoProgress is the saved progress time read back from storage
      this.player.seek(menuVideoProgress)
    }
  }}
/>
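Persisting on every onProgress tick (which fires frequently) is wasteful, so one option is to throttle writes. A minimal sketch, assuming you only want to write every few seconds; shouldPersist and the 'videoProgress' key are my own names, not part of react-native-video:

```javascript
// Decide whether enough playback time has elapsed since the last saved
// position to justify another write to storage.
function shouldPersist(lastSavedSeconds, currentSeconds, intervalSeconds) {
  return currentSeconds - lastSavedSeconds >= intervalSeconds;
}

// Hypothetical usage inside onProgress (key name is an assumption):
// if (shouldPersist(this.lastSaved, data.currentTime, 5)) {
//   this.lastSaved = data.currentTime;
//   AsyncStorage.setItem('videoProgress', String(data.currentTime));
// }
```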
I am writing a function to save a photo taken with react-native-vision-camera.
For that, I am using RNFS from react-native-fs to access the Android file path and save the photo captured through react-native-vision-camera's ref.
Below is my code.
function Cam() {
  const devices = useCameraDevices()
  const device = devices.back
  const camera = useRef(null)

  const takePhotoOptions = {
    qualityPrioritization: 'speed',
    flash: 'on'
  }

  const captureHandler = async () => {
    try {
      if (camera.current == null) throw new Error('Camera Ref is Null');
      console.log('Photo taking ....');
      const data = await camera.current.takePhoto(takePhotoOptions)
      await RNFS.moveFile(`/${data.path}`, `${RNFS.ExternalDirectoryPath}/temp.jpg`)
        .then(() => console.log("Image Moved", `${data.path}`, '-- to --', `${RNFS.ExternalDirectoryPath}`))
      await RNFS.writeFile(`${RNFS.ExternalDirectoryPath}/temp.jpg`, data, 'ascii')
        .then(() => console.log("Image wrote", `${data.path}`, '-- to --', `${RNFS.ExternalDirectoryPath}`))
    } catch (err) {
      console.log(err)
    }
  }
  return (<>
    {device != null && <>
      <Button title='Take a Photo' onPress={() => captureHandler()} />
      <View style={{ flex: 1 }}>
        <Camera
          ref={camera}
          style={StyleSheet.absoluteFill}
          device={device}
          photo={true}
          isActive={true}
        />
      </View>
    </>}
  </>)
}
I have the CAMERA and WRITE_EXTERNAL_STORAGE permissions, and I can read data.path and RNFS.ExternalDirectoryPath.
But I cannot find the file of the picture I took.
I tried both the moveFile and writeFile functions from RNFS.
Below is the log I got:
LOG Photo taking...
LOG Image Moved /data/user/0/com.camera/cache/mrousavy6998557498651783260.jpg -- to -- /storage/emulated/0/Android/data/com.camera/files
LOG Image wrote /data/user/0/com.camera/cache/mrousavy6998557498651783260.jpg -- to -- /storage/emulated/0/Android/data/com.camera/files
From react-native-fs documentation for Android:
Android support is currently limited to only the DocumentDirectory. This maps to the app's files directory.
Try using DocumentDirectory instead of ExternalDirectoryPath.
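A minimal sketch of what that could look like. Note that vision-camera's photo.path sometimes comes back with a file:// scheme prefix depending on platform and version, so normalizing it first is safer than prepending a slash; stripFileScheme and savePhoto are my own helper names, not part of either library:

```javascript
// Remove a leading file:// scheme so the path can be passed to RNFS.
const stripFileScheme = (path) =>
  path.startsWith('file://') ? path.slice('file://'.length) : path;

// Hypothetical save helper (a sketch, not the asker's exact code):
// moveFile alone is enough here; writeFile would need the file's actual
// base64/binary contents, not the photo object itself.
async function savePhoto(RNFS, photo) {
  const src = stripFileScheme(photo.path);
  const dest = `${RNFS.DocumentDirectoryPath}/temp.jpg`;
  await RNFS.moveFile(src, dest);
  return dest;
}
```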
I implemented Google Analytics in a React Native app.
The code that logs screen views is as follows.
<NavigationContainer
linking={linking}
ref={(navigationRef) =>
(this.navigationRef = navigationRef)
}
onReady={() => {
this.routeNameRef =
this.navigationRef.getCurrentRoute().name;
// first log in init
Analytics.logScreenView({
screen_name: this.routeNameRef,
screen_class: this.routeNameRef,
});
}}
onStateChange={async () => {
const previousRouteName = this.routeNameRef;
const currentRoute = this.navigationRef.getCurrentRoute();
const currentRouteName = currentRoute.name;
const currentScreenName =
(currentRoute.params &&
currentRoute.params.screenName) ||
currentRouteName;
if (previousRouteName !== currentRouteName) {
await Analytics.logScreenView({
screen_name: currentScreenName,
screen_class: currentRouteName,
});
}
// Save the current route name for later comparison
this.routeNameRef = currentRouteName;
}}
>
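For reference, the screen-name fallback used in onStateChange above can be isolated into a small pure helper (a sketch; resolveScreenName is my own name, not part of React Navigation or Analytics):

```javascript
// Pick the screen name to report: an explicit `screenName` route param
// wins; otherwise fall back to the route's own name.
function resolveScreenName(route) {
  return (route.params && route.params.screenName) || route.name;
}
```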
After the app was released, I tried to determine which pages users were browsing using path exploration in Google Analytics.
But when I set the STARTING POINT to "first_open" (the first time a user launches the app after installing or re-installing it), step 1's page title and screen name show as (not set).
What is wrong with this? What am I doing wrong?
Also, step 2 has 39 events but step 3 has only 35. Where are the other 4 events? Did 4 users exit the app?
Is there something wrong with the way I read the report?
How exactly should I see which page a user moved to after opening the app?
Debugging did not show any screen view with an unset name.
I am trying to incorporate this WYSIWYG package into my react native project (0.64.3). I built my project with a managed workflow via Expo (~44.0.0).
The problem I am noticing is that the editor will sometimes render with the text from my database and sometimes render without it.
Here is a snippet of the function that retrieves the information from Firebase.
const [note, setNote] = useState("");

const getNote = () => {
  const myDoc = doc(db, "/users/" + user.uid + "/Destinations/Trip-" + trip.tripID + '/itinerary/' + date);
  getDoc(myDoc)
    .then(data => {
      setNote(data.data()[date]);
    })
    .catch();
}
The above code and the editor component are nested within a larger function:
export default function ItineraryScreen({ route }) {
  // functions
  return (
    <RichEditor
      onChange={newText => {
        setNote(newText)
      }}
      scrollEnabled={false}
      ref={text}
      initialFocus={false}
      placeholder={'What are you planning to do this day?'}
      initialContentHTML={note}
    />
  )
}
Here is what it should look like with the text rendered (screenshot of simulator):
But this is what I get most of the time (screenshot from physical device):
My assumption is that there is a slight delay between when the data for the text editor becomes available and when the editor is rendered. I believe my simulator renders correctly because it processes the getNote() function faster.
What I have tried is applying a setTimeout to the display of the parent View, but it does not address the issue.
What do you recommend?
I believe I have solved the issue. I needed to parse the response better before assigning a value to note, and to only show the editor and toolbar once a value was established.
Before Firebase is queried, I assign a null value to note:
const [note, setNote] = useState(null);
Below, I always assign a value to note regardless of the outcome:
if (data.data() !== undefined) {
  setNote(data.data()[date]);
} else {
  setNote("");
}
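The guard above can also be expressed as a small pure helper (a sketch; parseNote is my own name, not part of the Firebase SDK):

```javascript
// Return the stored note for `date`, or "" when the document is missing
// or has no entry for that date.
function parseNote(docData, date) {
  if (docData !== undefined && docData[date] !== undefined) {
    return docData[date];
  }
  return "";
}
```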
The last step was to only show the editor once note no longer had a null value.
{note !== null && <>
  {/* Wrap the two siblings in a fragment; `&&` needs a single child. */}
  <RichToolbar
    style={{ backgroundColor: "white", width: "114%", flex: 1, position: "absolute", left: 0, zIndex: 4, bottom: (toolbarVisible) ? keyboardHeight * 1.11 : 0, marginBottom: -40, display: toolbarVisible ? "flex" : "none" }}
    editor={text}
    actions={[actions.undo, actions.setBold, actions.setItalic, actions.setUnderline, actions.insertLink, actions.insertBulletsList, actions.insertOrderedList, actions.keyboard]}
    iconMap={{ [actions.heading1]: ({ tintColor }) => (<Text style={[{ color: tintColor }]}>H1</Text>) }}
  />
  <RichEditor
    disabled={disableEditor}
    initialFocus={false}
    onChange={descriptionText => { setNote(descriptionText) }}
    scrollEnabled={true}
    ref={text}
    placeholder={'What are you planning to do?'}
    initialContentHTML={note}
  />
</>}
It is working properly.
Does anyone know how to play only the first few seconds of a video, on repeat, in React Native?
I have a profile page which displays all of a user's videos. Currently it plays each whole video, but I only want the first few seconds (something like TikTok).
If you just want to loop the first few seconds, add an onProgress handler and seek back to the start once playback passes a certain second:
class Video extends Component {
  render() {
    const { source, YOUR_VARIABLE_FOR_SECONDS } = this.props;
    return (
      <VideoPlayer
        source={source}
        ref={(ref) => {
          this.player = ref
        }}
        onProgress={({ currentTime }) => {
          if (currentTime > YOUR_VARIABLE_FOR_SECONDS) {
            this.player.seek(0)
          }
        }}
      />
    )
  }
}
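The seek-back decision is just a pure function of the current time, which makes it easy to test in isolation (a sketch; nextSeekTime is my own helper name):

```javascript
// Return 0 (meaning "seek back to the start") once playback passes
// `limitSeconds`, or null when no seek is needed.
function nextSeekTime(currentTime, limitSeconds) {
  return currentTime > limitSeconds ? 0 : null;
}
```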
There is little sense in doing it that way. TikTok probably takes those "five seconds" and transforms them into a GIF using something like gifify (https://github.com/jclem/gifify). If someone asked me to do this, that is how I would do it.
EDIT:
That said, I think you can create a component that plays only the first X seconds; it is possible.
I'm using the react-native-camera library in my React Native project.
But I have a problem when taking a photo: it takes 3-4 seconds for the photo to be saved. When I press the button to take the photo I hear the shutter sound, but then it takes about 3-4 seconds to save it.
The render method is as follows:
return (
  <View style={styles.container}>
    <Camera
      ref={(cam) => { this.camera = cam; }}
      style={styles.preview}
      aspect={Camera.constants.Aspect.fill}>
      {this.imageOverlay()}
      <Text style={styles.capture} onPress={this.takePicture.bind(this, this.state.type)}>[CAPTURE]</Text>
    </Camera>
  </View>
)
And the takePicture function is as follows:
takePicture(type) {
  let self = this;
  this.camera.capture({ target: Camera.constants.CaptureTarget.disk })
    .then((data) => {
      // --------------------------------------------------------
      // It takes 3-4 seconds to reach this point
      // --------------------------------------------------------
      let n_pictures = [];
      each(this.state.pictures, function (picture) {
        if (picture.item !== type.item) {
          n_pictures.push(picture)
        } else {
          n_pictures.push({
            title: type.title,
            noImage: false,
            imageOverlay: type.imageOverlay,
            image: data.path,
            imageIcon: type.imageIcon,
            overlay: false,
            item: type.item,
            mandatory: type.mandatory
          })
        }
      });
      self.setState({ pictures: n_pictures, showCamera: false })
    })
    .catch(err => console.error(err));
}
Any idea how to solve this?
Or can I at least show a loading screen until the photo is saved?
I had this same issue, and after a while of searching the internet I could not find a way to speed it up.
However, to answer your question about a loading screen, you may want to look into JavaScript promises. In my app, I redirected the user immediately to a new screen and showed a loading image while the promise was not yet resolved or rejected. Once it resolved, the picture was shown.
I know this is an old question, but I'm leaving this here for anyone else who runs into a similar issue.
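The loading-screen idea can be sketched with a plain promise wrapper that exposes a pending flag to drive a spinner (a sketch; trackPending is my own helper, not part of react-native-camera):

```javascript
// Wrap a promise and expose a `pending` flag: show a spinner while
// pending, then the photo once the promise settles.
function trackPending(promise) {
  const state = { pending: true };
  state.done = promise.then(
    (value) => { state.pending = false; return value; },
    (err) => { state.pending = false; throw err; }
  );
  return state;
}
```

In a component you would flip a piece of state instead of a plain flag, e.g. setState({ saving: false }) in both branches.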