React Native expo-image-picker does not work on all phones - react-native

I am using expo-image-picker. If I select an image in the Android emulator and save it, I cannot see that image when I open the app on my real device. In other words, whichever device I save the picture on, it only appears on that device; it does not appear on other devices. How can I solve this?
I am using an API for database operations (with axios).
Here is the code:
const PickImage = async () => {
  allowPhotoRequests()
  let result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.All,
    allowsEditing: true,
    aspect: [4, 3],
    quality: 1,
    base64: true
  })
  if (!result.cancelled) {
    setImage(result.uri) // I think I have to do something here
  }
}
Submit code:
const addPet = async () => {
  try {
    await petApi.post('/', {
      userId: userId,
      Age: age,
      Weight: weight,
      userName: currentUser,
      userPhone: currentUserPhone,
      petImage: image,
      city: city,
      district: district
    })
    .then(function (response) {
      alert('Success!')
    })
  }
  catch (error) {
    console.log(error);
  }
}
Example image output:
file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540yas1nkiziltas%252FPettyApp/ImagePicker/cb2923b3-5de8-4692-8244-0ce9b987001a.jpg

Since you're using Expo, there are two ways you can solve this problem:
Submit the image data as base64 (a sketch follows the BLOB code below)
Check that your backend API supports BLOBs; you can fetch a BLOB with the code below.
const blob = await new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onload = function () {
    resolve(xhr.response);
  };
  xhr.onerror = function (e) {
    reject(new TypeError("Network request failed"));
  };
  xhr.responseType = "blob";
  xhr.open("GET", [YOUR_FILE_PATH_URI], true);
  xhr.send(null);
});
// Use the blob after fetching it
console.log(blob)
// We're done with the blob, close and release it
blob.close();
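For the first option, note that PickImage above already requests base64: true, so a minimal sketch (assuming your database column can hold a large base64 string) is to store a data URI instead of the device-local file path:

if (!result.cancelled) {
  // result.base64 holds the raw image bytes; a data URI renders on any
  // device, unlike the file:// path, which exists only on the picking device.
  setImage(`data:image/jpeg;base64,${result.base64}`)
}

The addPet submit code can stay the same; petImage now carries the image itself rather than a path.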

You are saving the petImage in your petApi database for a specific userId. On any device, to get that image back, you need to fetch the data again; I don't see you fetching this image data after you post it. This is the part you are missing.
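For illustration, a sketch of fetching the saved data back (the GET route, query parameter, response shape, and the setPets state setter are assumptions, not from your API):

const getPets = async () => {
  try {
    // Assumed endpoint: returns the rows saved by addPet for this user
    const response = await petApi.get('/', { params: { userId: userId } })
    // If petImage was stored as a base64 data URI, it renders directly:
    // <Image source={{ uri: pet.petImage }} />
    setPets(response.data)
  }
  catch (error) {
    console.log(error)
  }
}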

Related

Image not found React Native (Expo)

Hello fellow programmers,
I am trying to show an image using the UserAvatar component in React Native (Expo), but I am facing a problem where the link I am getting from the API is not working (404 Not Found). What is the best possible way to avoid this problem? I tried to create a blob using the URL of the image, but it was not successful.
This is the error message I am getting in the app:
Online fetched source is not a supported image at node_modules/react-native-user-avatar/src/helpers.js:41:6 in fetchImage
Here is one of the solutions I have tried:
urlToBlob = (url) => new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onerror = reject;
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      resolve(xhr.response);
    }
  };
  xhr.open('GET', url);
  xhr.responseType = 'blob'; // convert type
  xhr.send();
})

this.urlToBlob(data)
  .then((blob) => {
    console.log(blob);
  });
The approach that I took to solve this problem is very simple:
axios
  .get(image_url)
  .then((res) => {
    if (res.status === 200) {
      setImageStatus(200);
    }
  })
  .catch((err) => {
    setImageStatus(err.response.status);
  });
When the response status is 200, the image exists; if not, fall back to the default image.
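For completeness, a minimal render sketch using that status (imageStatus, image_url, and default_image are names assumed from the snippet above; the src prop is assumed here as the way react-native-user-avatar accepts an image source):

<UserAvatar
  name={userName}
  // Only pass the remote URL once we know it resolves;
  // otherwise fall back to a bundled default image.
  src={imageStatus === 200 ? image_url : default_image}
/>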

ReactNative upload blob data to ASP .NET CORE Web API

This is my React Native code:
const changeAvatar = async () => {
  // No permissions request is necessary for launching the image library
  let result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.Images,
    allowsEditing: true,
    aspect: [4, 3],
    quality: 1,
  });
  if (!result.cancelled) {
    const image = await fetchImageFromUri(result.uri);
    const uploadUrl = await uploadImage(image);
  }
}

const fetchImageFromUri = async (uri) => {
  const response = await fetch(uri);
  const blob = await response.blob();
  return blob;
};

const uploadImage = async (image) => {
  var formdata = new FormData();
  formdata.append("blob", image, image._data.name);
  formdata.append("nameeee", "nameeee");
  var requestOptions = {
    method: 'POST',
    body: formdata,
    redirect: 'follow'
  };
  fetch(global.domain + "/api/profile/avatar", requestOptions)
    .then(response => response.text())
    .then(result => console.log(result))
    .catch(error => console.log('error', error));
}
And this is my ASP.NET Core Web API code to handle the file upload:
[Route("avatar")]
[HttpPost]
public async Task<object> Avatar([FromForm] string nameeee, [FromForm] IFormFile blob)
{
return "";
}
I have tested my API by using Postman to upload a file, and it works great.
But in my React Native code, the file chooser converts the selected image to a blob, so it can't be uploaded to the API. What do I need to modify in my React Native code to upload the selected image to the web API?
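One commonly suggested approach (a sketch under assumptions, not verified against this exact API) is to skip the blob conversion entirely: React Native's FormData accepts a plain { uri, name, type } object, which the networking layer sends as a multipart file part:

const uploadImage = async (uri) => {
  var formdata = new FormData();
  // Append the picked file by its local URI instead of converting it to a
  // blob; the name and type values here are assumptions for illustration.
  formdata.append("blob", {
    uri: uri,
    name: "avatar.jpg",
    type: "image/jpeg",
  });
  formdata.append("nameeee", "nameeee");
  fetch(global.domain + "/api/profile/avatar", {
    method: 'POST',
    body: formdata,
  })
    .then(response => response.text())
    .then(result => console.log(result))
    .catch(error => console.log('error', error));
}

With this shape, changeAvatar can call uploadImage(result.uri) directly, and fetchImageFromUri is no longer needed.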

React Native: convert local image to URL

I'm pretty new to React Native (Expo in this case) and the Firebase database.
My problem is that when I upload an image in my app with Image Picker, the link is a local link, readable only on my device, and it is deleted when I erase the cache.
Here is my code:
useEffect(() => {
  (async () => {
    if (Platform.OS !== "web") {
      const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync();
      if (status !== "granted") {
        alert("Sorry, we need camera roll permissions to make this work!");
      }
    }
  })();
}, []);

const pickImage = async () => {
  let result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.All,
    allowsEditing: true,
    quality: 1,
  });
  if (!result.cancelled) {
    setImage(result.uri);
  }
};
// My current image is locate to : "file:///data/user/0/host.exp.exponent/cache/
// ExperienceData/ImagePicker/2abe4097-05ed-4d23-5648-f279d5a6f995.jpg"
// And what I want is to locate my image to : "https://someting..."
So I want to convert this image URI into a URL that can be shared and is never erased.
Does anyone have an idea about how to proceed?
Thanks a lot!
Let's break down the problem as below:
1. Pick an image from the Media Library.
const pickImage = async () => {
  let result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.All,
    allowsEditing: true,
    quality: 1,
  });
  if (!result.cancelled) {
    setImage(result.uri);
  }
};
2. Fetch Image BLOB Data
const blob = await new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onload = function () {
    resolve(xhr.response);
  };
  xhr.onerror = function (e) {
    reject(new TypeError("Network request failed"));
  };
  xhr.responseType = "blob";
  xhr.open("GET", [IMAGE_URI_HERE], true);
  xhr.send(null);
});
3. Upload image BLOB to a remote server (Firebase Storage)
const metadata = { contentType: "image/jpeg" };
const imgRef = firebase.storage().ref('folderName/fileName');
await imgRef.put(blob, metadata);
// We're done with the blob, close and release it
blob.close();
// Image permanent URL
const imgURL = await imgRef.getDownloadURL();
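Putting the three steps together, a hedged end-to-end sketch (the function name and storage path are assumptions, and fetch().blob() is used as a simpler stand-in for the XHR snippet in step 2):

const uploadImageAsync = async (localUri) => {
  // Step 2: read the picked file into a blob
  const response = await fetch(localUri);
  const blob = await response.blob();
  // Step 3: push it to Firebase Storage and return the permanent URL
  const imgRef = firebase.storage().ref(`images/${Date.now()}.jpg`);
  await imgRef.put(blob, { contentType: "image/jpeg" });
  return await imgRef.getDownloadURL();
};

// Usage after pickImage:
// const url = await uploadImageAsync(result.uri);
// setImage(url); // now a shareable https:// link instead of file://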

React Native: Failed to execute 'append' on 'FormData': parameter 2 is not of type 'Blob'. at new ApolloError

I am trying to upload an image from my React Native app to GraphQL using Apollo Client with createUploadLink(). When I try to mutate data by passing a ReactNativeFile as a variable, it says:
"network request failed: Failed to execute 'append' on 'FormData': parameter 2 is not of type 'Blob'. at new ApolloError"
This is the mutation I am trying to use:
mutation publishPost(
  $content: String!
  $LocationInput: LocationInput!
  $InputPostAttachment: [InputPostAttachment!]
) {
  publishPost(
    content: $content
    location: $LocationInput
    attachments: $InputPostAttachment
  ) {
    content
  }
}
InputPostAttachment has the type:
type InputPostAttachment {
  type: PostAttachmentType!
  file: Upload!
}
Apollo Client settings (I am using apollo-upload-client):
const httpLink = createUploadLink({
  uri: 'http://localhost:8000/graphql',
});

const authLink = setContext(async (headers: any) => {
  const token = await getToken();
  return {
    ...headers,
    headers: {
      authorization: token ? `Bearer ${token}` : null,
    },
  };
});

const link = authLink.concat(httpLink);

// create an in-memory cache instance for caching graphql data
const cache = new InMemoryCache();

// instantiate apollo client with apollo link instance and cache instance
export const client = new ApolloClient({
  link,
  cache,
});
File upload function (I am using react-native-image-crop-picker for multi-image selection):
const [image, setimage] = useState([]);

const _pickImage = () => {
  ImagePicker.openPicker({
    includeBase64: true,
    multiple: true,
  }).then((images: any) => {
    let imageData: any = [];
    images.map((data: any) => {
      const file = new ReactNativeFile({
        uri: data.path,
        name: data.filename,
        type: data.mime,
      });
      imageData.push({
        type: 'IMAGE',
        file: file,
      });
    });
    setimage(imageData);
    console.log(images);
  });
};

const handlePost = async () => {
  const InputPostAttachment: any = [...image];
  const LocationInput = {
    place: place,
    vicinity: vicinity,
    province: province,
  };
  publishPost({variables: {content, LocationInput, InputPostAttachment}})
    .then(({data}) => {
      console.log(data);
      props.navigation.navigate('Home');
    })
    .catch((err) => {
      console.log('err happened');
      console.log(err);
    });
};
Could someone please help me out with this?
In addition to the Chrome debugger issue, this error also happens on Expo web.
To anyone uploading images on Expo web (or react-native-web), here's a working solution:
/** Load image from camera/roll. */
const result = await ImagePicker.launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.All,
  allowsEditing: true,
  quality: 1,
});
if (result.cancelled) {
  return;
}

/** Web platform: blob. */
const convertBase64ToBlob = async (base64) => {
  const response = await fetch(base64);
  const blob = await response.blob();
  return blob;
};

/** Android/iOS platform: ReactNativeFile. */
const createReactNativeFile = (uri) => {
  const file = new ReactNativeFile({
    uri,
    type: mime.lookup(uri) || 'image',
    name: `file-${Date.now()}`,
  });
  return file;
};

/** Use blob for web, ReactNativeFile otherwise. */
const file = Platform.OS === 'web'
  ? await convertBase64ToBlob(result.uri)
  : createReactNativeFile(result.uri);

/** Upload image with apollo. */
mutate({ variables: { file } });
On the web platform, ImagePicker returns a base64 data URI instead of a file path. This problem doesn't happen on Android or iOS, as ImagePicker returns a file path there, which is what apollo-upload-client expects.
The solution is to detect whether the URI is base64 (which happens when the platform is web) and convert it to a blob.
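If you'd rather branch on the URI format than on the platform, a minimal check (an assumption, equivalent in spirit to the Platform.OS test above) could be:

// A base64 result from ImagePicker on web is a data URI such as
// "data:image/jpeg;base64,...", while native platforms return "file://...".
const isBase64Uri = (uri) => uri.startsWith('data:');

const file = isBase64Uri(result.uri)
  ? await convertBase64ToBlob(result.uri)
  : createReactNativeFile(result.uri);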
My Apollo client was configured using apollo-boost, and using the Chrome debugger to intercept the network requests was causing this issue.
To be more specific, I was using the code below to see the network requests sent by my app in the Chrome debugger:
global.XMLHttpRequest =
  global.originalXMLHttpRequest || global.XMLHttpRequest;
global.FormData = global.originalFormData || global.FormData;
if (window.FETCH_SUPPORT) {
  window.FETCH_SUPPORT.blob = false;
} else {
  global.Blob = global.originalBlob || global.Blob;
  global.FileReader = global.originalFileReader || global.FileReader;
}
apollo-upload-client won't send the data as multipart form data if we are using the Chrome debugger; we will face a network issue. This issue has the answer. Also, I had not removed apollo-boost and some part of my app was still using it; that was also an issue.

Cannot return position from navigator.geolocation.getCurrentPosition() in react-native

I am trying to get the geolocation after an image has been taken in React Native. A user captures an image, and the image along with the geolocation is stored in an object and sent via an HTTP request to the server.
The function to get the geolocation works fine, but I am unable to return the geolocation to be stored in the object for HTTP transfer; I get undefined.
getCoordinates = async () => {
  console.log('getCoordinates run')
  await navigator.geolocation.getCurrentPosition(
    position => {
      let coordinates = `${position.coords.longitude},
        ${position.coords.latitude}`
      return coordinates
    },
    error => Alert.alert(error.message),
    { enableHighAccuracy: false, timeout: 20000, maximumAge: 1000 }
  )
}
captureImage = async () => {
  if (this.camera) {
    const options = { quality: 0.5, base64: true };
    const data = await this.camera.takePictureAsync(options);
    console.log(data);
    let postData = {
      user: 1,
      coordinates: this.getCoordinates(),
      image: `data:image/jpeg;base64${data.base64}`,
    }
    console.log(postData)
    axios.post('https://localhost:5000/api/posts', postData)
      .then(post => res.json(post))
      .catch(err => console.log(err))
  }
}
The expected result is that when the captureImage function runs, the getCoordinates call within the postData object returns the current geolocation before the data is transferred to the server.
The way geolocation.getCurrentPosition works here is that it registers a callback that fires once the user's location has been acquired. Acquiring and delivering that data takes time; that's why we use callbacks or promises. But in your code, you just call the function and, without waiting for its response, do the API call.
I assume you have used an async function to do this. But if I were you, I'd use Promises here to resolve this issue. A simple example would be:
captureImage = async () => {
  if (this.camera) {
    // ... do your camera tasks
  }
  this.sendImageData(data); // data is what you got from the camera.
}

getGeoInfo = () => {
  return new Promise((resolve, reject) => {
    navigator.geolocation.getCurrentPosition(
      position => {
        let coordinates = `${position.coords.longitude},
          ${position.coords.latitude}`
        resolve(coordinates);
      },
      error => reject(error),
      { enableHighAccuracy: false, timeout: 20000, maximumAge: 1000 }
    )
  })
}

sendImageData = async (cameraData) => {
  // Wait for the promise to resolve before building the payload;
  // firing .then() without awaiting would build it too early.
  const coordinates = await this.getGeoInfo();
  // Now coordinates holds the relevant location data.
  const data = { /* ... make the object as you want */ }
  // Do the API call using axios as you've already done.
}