react-native-vision-camera save a photo with react-native-fs - react-native

I am writing a function to save a photo taken with react-native-vision-camera.
For that, I am using RNFS from react-native-fs to access the Android file path and save the photo captured via react-native-vision-camera's camera ref.
Below is my code:
function Cam() {
  const devices = useCameraDevices()
  const device = devices.back
  const camera = useRef(null)
  const takePhotoOptions = {
    qualityPrioritization: 'speed',
    flash: 'on'
  }
  const captureHandler = async () => {
    try {
      if (camera.current == null) throw new Error('Camera Ref is Null');
      console.log('Photo taking ....');
      const data = await camera.current.takePhoto(takePhotoOptions)
      await RNFS.moveFile(`/${data.path}`, `${RNFS.ExternalDirectoryPath}/temp.jpg`)
        .then(() => console.log("Image Moved", `${data.path}`, '-- to --', `${RNFS.ExternalDirectoryPath}`))
      await RNFS.writeFile(`${RNFS.ExternalDirectoryPath}/temp.jpg`, data, 'ascii')
        .then(() => console.log("Image wrote", `${data.path}`, '-- to --', `${RNFS.ExternalDirectoryPath}`))
    } catch (err) {
      console.log(err)
    }
  }
  return (<>
    {device != null && <>
      <Button title='Take a Photo' onPress={() => captureHandler()} />
      <View style={{ flex: 1 }}>
        <Camera
          ref={camera}
          style={StyleSheet.absoluteFill}
          device={device}
          photo={true}
          isActive={true}
        />
      </View>
    </>}
  </>)
}
I have the CAMERA and WRITE_EXTERNAL_STORAGE permissions.
I am able to read both data.path and RNFS.ExternalDirectoryPath.
But I could not find the file for the picture I took and tried to save.
I attempted both the moveFile and writeFile functions from RNFS.
Below is the log I got:
LOG Photo taking...
LOG Image Moved /data/user/0/com.camera/cache/mrousavy6998557498651783260.jpg -- to -- /storage/emulated/0/Android/data/com.camera/files
LOG Image wrote /data/user/0/com.camera/cache/mrousavy6998557498651783260.jpg -- to -- /storage/emulated/0/Android/data/com.camera/files

From the react-native-fs documentation for Android:
Android support is currently limited to only the DocumentDirectory. This maps to the app's files directory.
Try using DocumentDirectory instead of ExternalDirectoryPath.
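For example, a minimal sketch of the capture handler rewritten against RNFS.DocumentDirectoryPath (a single moveFile should be enough; the follow-up writeFile call is not needed, and the rest of the component is assumed unchanged):

const captureHandler = async () => {
  try {
    if (camera.current == null) throw new Error('Camera Ref is Null');
    const photo = await camera.current.takePhoto(takePhotoOptions);
    // takePhoto already returns an absolute path on Android,
    // so no extra leading slash is needed
    const destination = `${RNFS.DocumentDirectoryPath}/temp.jpg`;
    await RNFS.moveFile(photo.path, destination);
    console.log('Image moved to', destination);
  } catch (err) {
    console.log(err);
  }
};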

Related

How can I remove (not set) in Google Analytics path exploration with React Native?

I implemented Google Analytics in my React Native app.
The code that logs screen views is as follows:
<NavigationContainer
  linking={linking}
  ref={(navigationRef) => (this.navigationRef = navigationRef)}
  onReady={() => {
    this.routeNameRef = this.navigationRef.getCurrentRoute().name;
    // first log in init
    Analytics.logScreenView({
      screen_name: this.routeNameRef,
      screen_class: this.routeNameRef,
    });
  }}
  onStateChange={async () => {
    const previousRouteName = this.routeNameRef;
    const currentRoute = this.navigationRef.getCurrentRoute();
    const currentRouteName = currentRoute.name;
    const currentScreenName =
      (currentRoute.params && currentRoute.params.screenName) || currentRouteName;
    if (previousRouteName !== currentRouteName) {
      await Analytics.logScreenView({
        screen_name: currentScreenName,
        screen_class: currentRouteName,
      });
    }
    // Save the current route name for later comparison
    this.routeNameRef = currentRouteName;
  }}
>
After the app was released, I tried to determine which pages users were browsing in Google Analytics path exploration.
But when I set STARTING POINT to "first_open" (the first time a user launches an app after installing or re-installing it), STEP 1's page title and screen name are shown as (not set).
What is wrong with this? What am I doing wrong?
Also, step 2 has an event count of 39, but step 3 has an event count of 35.
Where did the other 4 events go? Did 4 users exit the app?
Is there something wrong with the way I am reading the report?
How exactly do I see which pages a user moved to after opening the app?
Debugging did not reveal any case where the screen name was not set.

Populate WYSIWYG editor after React Native fetch

I am trying to incorporate this WYSIWYG package into my React Native project (0.64.3). I built my project with a managed workflow via Expo (~44.0.0).
The problem I am noticing is that the editor sometimes renders with the text from my database and sometimes renders without it.
Here is a snippet of the function that retrieves the information from Firebase:
const [note, setNote] = useState("");

const getNote = () => {
  const myDoc = doc(db, "/users/" + user.uid + "/Destinations/Trip-" + trip.tripID + '/itinerary/' + date);
  getDoc(myDoc)
    .then(data => {
      setNote(data.data()[date]);
    })
    .catch();
};
The above code and the editor component are nested within a larger function:
export default function ItineraryScreen({ route }) {
  // ...other functions...
  return (
    <RichEditor
      onChange={newText => {
        setNote(newText)
      }}
      scrollEnabled={false}
      ref={text}
      initialFocus={false}
      placeholder={'What are you planning to do this day?'}
      initialContentHTML={note}
    />
  )
}
Here is what it should look like with the text rendered (screenshot of simulator):
But this is what I get most of the time (screenshot from physical device):
My assumption is that there is a very slight delay between when the data for the text editor actually becomes available and when the editor is rendered. I believe my simulator renders correctly because it processes the getNote() function faster.
What I have tried is delaying the display of the parent View with setTimeout, but it does not address the issue.
What do you recommend?
I believe I have solved the issue. I needed to parse the response better before assigning a value to note and only show the editor and toolbar once a value was established.
Before Firebase gets queried, I assign a null value to note:
const [note, setNote] = useState(null);
Below, I always assign a value to note regardless of the outcome (this runs inside the getDoc(...).then(...) callback shown above):
if (data.data() !== undefined) {
  setNote(data.data()[date]);
} else {
  setNote("");
}
The last step was to only show the editor once note no longer had a null value.
{
  note !== null &&
  /* a fragment is needed so the two siblings form valid JSX */
  <>
    <RichToolbar
      style={{ backgroundColor: "white", width: "114%", flex: 1, position: "absolute", left: 0, zIndex: 4, bottom: toolbarVisible ? keyboardHeight * 1.11 : 0, marginBottom: -40, display: toolbarVisible ? "flex" : "none" }}
      editor={text}
      actions={[actions.undo, actions.setBold, actions.setItalic, actions.setUnderline, actions.insertLink, actions.insertBulletsList, actions.insertOrderedList, actions.keyboard]}
      iconMap={{ [actions.heading1]: ({ tintColor }) => (<Text style={[{ color: tintColor }]}>H1</Text>) }}
    />
    <RichEditor
      disabled={disableEditor}
      initialFocus={false}
      onChange={descriptionText => { setNote(descriptionText) }}
      scrollEnabled={true}
      ref={text}
      placeholder={'What are you planning to do?'}
      initialContentHTML={note}
    />
  </>
}
It is working properly.
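For completeness, a minimal sketch of triggering the fetch when the screen mounts (an assumption for illustration; the snippets above do not show where getNote is called):

useEffect(() => {
  // query Firebase once when the screen mounts
  getNote();
}, []);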

Start video from where we left off

I am trying to make a video player app in React Native. I want a feature that starts the video from where the user left off before closing the application. Any clue?
I am using react-native-video.
You can get the current time from the video component by reading videoRef.currentTime.
You can then store this value in your local storage.
More specifically, if you want to achieve this using a store or local storage, please check this out:
import React from "react"
import Video from "react-native-video"

class VideoScreen extends React.Component {
  constructor(props) {
    super(props)
    this.progress = 0
    // Store this progress time in async storage or a store, so that when you
    // navigate back to this screen you can read the progress from storage or
    // fall back to an initial value of 0.
  }

  onProgress = (data) => {
    this.progress = data.currentTime
  }

  render() {
    return (
      <Video
        ref={(ref) => { this.player = ref }}
        source={{ uri: '.....' }}
        onProgress={this.onProgress}
        onLoad={() => {
          // the recent progress time comes from storage
          if (menuVideoProgress > 0)
            this.player.seek(menuVideoProgress)
        }}
      />
    )
  }
}
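A minimal sketch of the storage side, using @react-native-async-storage/async-storage (the key name is an illustrative assumption; a real app would use one key per video):

import AsyncStorage from '@react-native-async-storage/async-storage'

const PROGRESS_KEY = 'videoProgress' // hypothetical key

// Persist the last known position; call this (throttled) from onProgress or on unmount.
async function saveProgress(seconds) {
  try {
    await AsyncStorage.setItem(PROGRESS_KEY, String(seconds))
  } catch (e) {
    console.warn('Failed to save progress', e)
  }
}

// Read the stored position before rendering the player; 0 means start from the beginning.
async function loadProgress() {
  try {
    const value = await AsyncStorage.getItem(PROGRESS_KEY)
    return value != null ? Number(value) : 0
  } catch (e) {
    return 0
  }
}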

react-native-camera slow on react native android

I'm using the react-native-camera library in my React Native project.
But I have a problem when I take a photo: it takes 3-4 seconds before the photo is saved. When I tap the button to take the photo I hear the shutter sound, but it then takes about 3-4 seconds to save the photo.
The render method is as follows:
return (
  <View style={styles.container}>
    <Camera
      ref={(cam) => { this.camera = cam; }}
      style={styles.preview}
      aspect={Camera.constants.Aspect.fill}>
      {this.imageOverlay()}
      <Text style={styles.capture} onPress={this.takePicture.bind(this, this.state.type)}>[CAPTURE]</Text>
    </Camera>
  </View>
)
And the takePicture function is as follows:
takePicture(type) {
  let self = this;
  this.camera.capture({ target: Camera.constants.CaptureTarget.disk })
    .then((data) => {
      // ------------------------------------------------------------
      // It takes 3-4 seconds for execution to reach this point
      // ------------------------------------------------------------
      let n_pictures = [];
      each(this.state.pictures, function (picture) {
        if (picture.item !== type.item) {
          n_pictures.push(picture)
        } else {
          n_pictures.push({
            title: type.title,
            noImage: false,
            imageOverlay: type.imageOverlay,
            image: data.path,
            imageIcon: type.imageIcon,
            overlay: false,
            item: type.item,
            mandatory: type.mandatory
          })
        }
      });
      self.setState({ pictures: n_pictures, showCamera: false })
    })
    .catch(err => console.error(err));
}
Any idea how to solve it?
Can I at least put up a loading screen until the photo is saved?
So I had this same issue, and after a while of searching the internet I could not find a way to speed it up.
However, to answer your query about using a loading screen, you may want to look into JavaScript promises. For my app, I redirected the user immediately to a new screen and showed a loading picture while the promise was pending. Once it resolved, the picture was shown.
I know this is an old answer, but I'm going to put this here for anyone else who may have a similar issue.
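A rough sketch of that approach (the component, prop, and state names are illustrative assumptions, not part of react-native-camera's API):

import React from 'react';
import { ActivityIndicator, Image, View } from 'react-native';

// Hypothetical result screen: shows a spinner until the capture promise settles.
class CaptureResultScreen extends React.Component {
  state = { saving: true, path: null };

  componentDidMount() {
    // capturePromise is assumed to be handed over from the camera screen,
    // e.g. the promise returned by this.camera.capture(...)
    this.props.capturePromise
      .then((data) => this.setState({ saving: false, path: data.path }))
      .catch(() => this.setState({ saving: false }));
  }

  render() {
    if (this.state.saving) {
      return (
        <View style={{ flex: 1, justifyContent: 'center' }}>
          <ActivityIndicator size="large" />
        </View>
      );
    }
    // data.path may need a file:// prefix depending on platform
    return <Image source={{ uri: this.state.path }} style={{ flex: 1 }} />;
  }
}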

React Native Retrieve Actual Image Sizes

I would like to be able to know the actual size of a network-loaded image that has been passed into <Image />. I have tried using onLayout to work out the size (as taken from https://github.com/facebook/react-native/issues/858), but that seems to return the sanitised size after the image has already been pushed through the layout engine.
I tried looking into onLoadStart, onLoad, onLoadEnd, and onProgress to see if any other information was available, but I cannot seem to get any of these to fire. I have declared them as follows:
onImageLoadStart: function (e) {
  console.log("onImageLoadStart");
},
onImageLoad: function (e) {
  console.log("onImageLoad");
},
onImageLoadEnd: function (e) {
  console.log("onImageLoadEnd");
},
onImageProgress: function (e) {
  console.log("onImageProgress");
},
onImageError: function (e) {
  console.log("onImageError");
},
render: function (e) {
  return (
    <Image
      source={{ uri: "http://adomain.com/myimageurl.jpg" }}
      style={[this.props.style, this.state.style]}
      onLayout={this.onImageLayout}
      onLoadStart={(e) => { this.onImageLoadStart(e) }}
      onLoad={(e) => { this.onImageLoad(e) }}
      onLoadEnd={(e) => { this.onImageLoadEnd(e) }}
      onProgress={(e) => { this.onImageProgress(e) }}
      onError={(e) => { this.onImageError(e) }} />
  );
}
Thanks.
The Image component now provides a static method to get the size of an image. For example:
Image.getSize(myUri, (width, height) => {this.setState({width, height})});
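Image.getSize also accepts an optional failure callback, which is worth wiring up for network images:

Image.getSize(
  myUri,
  (width, height) => { this.setState({ width, height }); },
  (error) => { console.warn(error); },
);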
You can use resolveAssetSource method from the Image component :
import picture from 'pathToYourPicture';
const {width, height} = Image.resolveAssetSource(picture);
This answer is now out of date. See Bill's answer.
Image.getSize(myUri, (width, height) => { this.setState({ width, height }) });
Old Answer (valid for older builds of react native)
Ok, I got it working. Currently this requires modifying the React Native installation, as it is not natively supported.
I followed the tips in this thread to get it working:
https://github.com/facebook/react-native/issues/494
Mainly, alter the RCTNetworkImageView.m file: add the following to setImageURL:
void (^loadImageEndHandler)(UIImage *image) = ^(UIImage *image) {
  NSDictionary *event = @{
    @"target": self.reactTag,
    @"size": @{
      @"height": @(image.size.height),
      @"width": @(image.size.width)
    }
  };
  [_eventDispatcher sendInputEventWithName:@"loaded" body:event];
};
Then edit the line that handles the load completion:
[self.layer removeAnimationForKey:@"contents"];
self.layer.contentsScale = image.scale;
self.layer.contents = (__bridge id)image.CGImage;
loadEndHandler();
replace
loadEndHandler();
with
loadImageEndHandler(image);
Then in React Native you have access to the size via the native event data from the onLoaded function - note that the documentation currently says the function is onLoad, but this is incorrect. The correct functions for v0.8.0 are as follows:
onLoadStart
onLoadProgress
onLoaded
onLoadError
onLoadAbort
These can be accessed like so:
onImageLoaded: function (data) {
  try {
    console.log("image width: " + data.nativeEvent.size.width);
    console.log("image height: " + data.nativeEvent.size.height);
  } catch (e) {
    // error
  }
},
...
render: function () {
  return (
    <View style={{ width: 1, height: 1, overflow: 'hidden' }}>
      <Image source={{ uri: yourImageURL }} resizeMode='contain' onLoaded={this.onImageLoaded} style={{ width: 5000, height: 5000 }} />
    </View>
  );
}
Points to note:
I have set a large image size and placed the image inside a 1x1px wrapping element; this is because the image must fit inside it if you are to retrieve meaningful values.
The resize mode must be 'contain' for you to get the correct sizes; otherwise the constrained size will be reported.
The image sizes are scaled proportionally to the scale factor of the device, e.g. a 200*200 image on an iPhone 6 (not 6 Plus) will be reported as 100*100. I assume it would be reported as 67*67 on an iPhone 6 Plus, but I have not tested this.
I have not yet got this to work for GIF files, which traverse a different path on the Obj-C side of the bridge. I will update this answer once I have done that.
I believe there is a PR going through for this at the moment, but until it is included in the core, this change will have to be made to the react-native installation every time you update or re-install.
TypeScript example:
import { Image } from 'react-native';

export interface ISize {
  width: number;
  height: number;
}

function getImageSize(uri: string): Promise<ISize> {
  const success = (resolve: (value?: ISize | PromiseLike<ISize>) => void) => (width: number, height: number) => {
    resolve({
      width,
      height
    });
  };
  const error = (reject: (reason?: any) => void) => (failure: Error) => {
    reject(failure);
  };
  return new Promise<ISize>((resolve, reject) => {
    Image.getSize(uri, success(resolve), error(reject));
  });
}
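For example (the URL is a placeholder):

getImageSize('https://example.com/photo.jpg')
  .then(({ width, height }) => console.log(width, height))
  .catch((err) => console.warn(err));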