Invalid imageTag error from ImageStore.getBase64ForTag - react-native

I am trying to get a base64 image from a Facebook profile picture.
getImageFromFacebook() {
  const imageURL = this.props.userInfo.picture;
  Image.getSize(imageURL, (width, height) => {
    var imageSize = { width, height };
    ImageEditor.cropImage(imageURL, imageSize, (imageURI) => {
      console.log(imageURI);
      ImageStore.getBase64ForTag(imageURI, (base64Data) => {
        this.setState({ pictureBase64: base64Data });
        ImageStore.removeImageForTag(imageURI);
      }, (reason) => console.log(reason));
    }, (reason) => console.log(reason));
  }, (reason) => console.log(reason));
}
I am following the steps described in https://github.com/facebook/react-native/issues/1158:
Use Image.getSize(uri) to get the image dimensions.
Use ImageEditor.cropImage(uri, cropData) to store a copy of the image in the ImageStore. If you pass the width and height you got in step 1, cropImage won't actually crop the image, although it may still make a copy of it.
Use ImageStore.getBase64ForTag(uri) to get the base64 data of the new image (pass the uri you got from the cropImage callback, not the original).
Don't forget to call ImageStore.removeImageForTag(uri) once you're done, to delete the copy.
Although ImageEditor.cropImage returns a valid URI (rct-image-store://0), ImageStore.getBase64ForTag fails with:
code: "ERCTERRORDOMAIN0",
domain: "RCTErrorDomain",
message: "Invalid imageTag: rct-image-store://0"
What am I doing wrong?

Found the error!
imageSize should be set like this:
var imageSize = {
  size: {
    width,
    height,
  },
  offset: {
    x: 0,
    y: 0,
  },
};
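For reference, here is a minimal sketch of the full handler with the corrected cropData object plugged in (same Image, ImageEditor and ImageStore calls as in the question; nothing else changes):

getImageFromFacebook() {
  const imageURL = this.props.userInfo.picture;
  Image.getSize(imageURL, (width, height) => {
    // cropData needs both an offset and a size; passing the full
    // dimensions copies the image into the ImageStore without cropping it.
    const cropData = {
      offset: { x: 0, y: 0 },
      size: { width, height },
    };
    ImageEditor.cropImage(imageURL, cropData, (imageURI) => {
      ImageStore.getBase64ForTag(imageURI, (base64Data) => {
        this.setState({ pictureBase64: base64Data });
        ImageStore.removeImageForTag(imageURI); // clean up the copy when done
      }, (reason) => console.log(reason));
    }, (reason) => console.log(reason));
  }, (reason) => console.log(reason));
}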

Related

Image to PDF not working in React Native

I am trying to generate a PDF from an image path in React Native, so I am using the plugin below:
https://www.npmjs.com/package/react-native-image-to-pdf/v/1.2.0
As per the documentation above I configured everything, and below is my code:
const myAsyncPDFFunction = async () => {
  try {
    console.log('Call a');
    let path = 'file:///Users/macminiharshalk/Library/Developer/CoreSimulator/Devices/FADDF530-05FD-4A0E-9E61-C6AEDB719955/data/Containers/Data/Application/37B8FE42-B23A-4018-865F-F57670B3411E/tmp/606C88B3-5759-4942-A544-1231A0C17532.jpg';
    const options = {
      imagePaths: [path],
      name: 'PDFName',
      maxSize: {
        // optional maximum image dimension - larger images will be resized
        width: 900,
        height: Math.round(
          (Dimensions.get('window').height / Dimensions.get('window').width) * 900,
        ),
      },
      quality: 0.7, // optional compression parameter
      // targetPathRN: "/storage/emulated/0/Download/", // only for Android version 9 and lower;
      // for versions higher than 9 it is stored in (Download/img-to-pdf/)
    };
    console.log("options-->", options);
    const pdf = await RNImageToPdf.createPDFbyImages(options);
    console.log('PDF URIs-->', pdf);
    console.log(pdf.filePath);
  } catch (e) {
    console.log(e);
  }
};
When I console.log the result, I can see the PDF path as below:
/Users/macminiharshalk/Library/Developer/CoreSimulator/Devices/FADDF530-05FD-4A0E-9E61-C6AEDB719955/data/Containers/Data/Application/37B8FE42-B23A-4018-865F-F57670B3411E/Documents/PDFName.pdf
When I console.log the options parameter, it shows:
{"imagePaths": ["file:///Users/macminiharshalk/Library/Developer/CoreSimulator/Devices/FADDF530-05FD-4A0E-9E61-C6AEDB719955/data/Containers/Data/Application/37B8FE42-B23A-4018-865F-F57670B3411E/tmp/606C88B3-5759-4942-A544-1231A0C17532.jpg"], "maxSize": {"height": 1948, "width": 900}, "name": "PDFName", "quality": 0.7}
But when I open the PDF, the image has not been copied in; it is a blank PDF. Any idea how I can show the image in the PDF?
Please try stripping the file:// scheme from the path:
const newPath = path.replace('file://', '');
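A minimal sketch of how that stripped path plugs back into the options from the question (inside the same async function; the RNImageToPdf.createPDFbyImages call is unchanged):

const newPath = path.replace('file://', ''); // plain filesystem path, no scheme
const options = {
  imagePaths: [newPath],
  name: 'PDFName',
  quality: 0.7,
};
const pdf = await RNImageToPdf.createPDFbyImages(options);
console.log(pdf.filePath); // the generated PDF should now include the image page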

react-native-image-crop-picker is Compressing the image before cropping

I have implemented this library, but on Android only, the image is being compressed when it is picked. For example, if I select a large image that is exactly 9921x4961 and then log its width and height after it has been picked, it is 1241x621, which is exactly the original divided by 8. This issue only appears on Android.
Here is my implementation:
const res = await ImagePicker.openPicker({
  multiple: true,
  mediaType: 'any',
  compressVideoPreset: 'HighestQuality',
  maxFiles: 10,
});
for await (const image of res) {
  Image.getSize(image.path, async (width, height) => {
    console.log('SIZE-PICKER: ', { width, height });
    await cropAndAttachImage(image);
  });
}
Set compressImageQuality: 1 in the options.
If that still does not help, the solution is to set a larger height/width manually:
const res = await ImagePicker.openPicker({
  width: 2048, // or something else depending on your use case
  height: 1024, // same as above
  compressImageQuality: 1,
  ....
})
You can also set the width/height to some multiple of the viewport size, as sketched below.
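A rough sketch of that idea, using Dimensions to derive the target size (the 3x multiplier is an arbitrary assumption; the rest mirrors the options above):

import { Dimensions } from 'react-native';

const { width: vw, height: vh } = Dimensions.get('window');

const res = await ImagePicker.openPicker({
  width: Math.round(vw * 3),  // 3x the viewport width; arbitrary multiplier, tune as needed
  height: Math.round(vh * 3), // 3x the viewport height
  compressImageQuality: 1,
  multiple: true,
  mediaType: 'any',
});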
Similar issues #1356, #1085

React-Native-Camera get height and width from uri

I have an app which takes a photo and adds it to a PDF file. The problem is resizing: I can resize in pixels, but that does not keep the original ratio. I need the original height and width to calculate the right size.
Calculation: divide the height by the width, for example 1200/1600 = 0.75. Then we can resize the PDF image to a height of 100px, and the width is 100 / 0.75.
The question is: how can I get the size of the image (data.uri)?
My code:
takePicture = async () => {
  if (this.camera) {
    const options = { quality: 0.5, base64: true, fixOrientation: true }; // options for quality etc.
    const data = await this.camera.takePictureAsync(options); // take the image
    console.log(data.uri); // prints the image (data) address to the console log
    back(data.uri); // go back to the ReportFault page
  }
};
let imgX = 297;
let imgY = 618;
page.drawImage(arr[i].path.substring(7), 'jpg', {
  x: imgX, // position in the A4 PDF
  y: imgY, // position in the A4 PDF
  width: 200,
  height: 150,
});
You can use React Native's Image.getSize method. Try this:
import { Image } from 'react-native';
.....

Image.getSize(uri, (width, height) => {
  console.log(`The image dimensions are ${width}x${height}`);
}, (error) => {
  console.error(`Couldn't get the image size: ${error.message}`);
});
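To tie that back to the PDF sizing in the question, here is a minimal sketch that keeps the original ratio when drawing (it assumes the same page.drawImage call and path handling shown in the question):

Image.getSize(data.uri, (width, height) => {
  const ratio = height / width;          // e.g. 1200 / 1600 = 0.75
  const targetWidth = 200;               // chosen width in the PDF
  const targetHeight = Math.round(targetWidth * ratio); // height derived from the ratio

  page.drawImage(data.uri.substring(7), 'jpg', {
    x: imgX,
    y: imgY,
    width: targetWidth,
    height: targetHeight,
  });
}, (error) => {
  console.error(`Couldn't get the image size: ${error.message}`);
});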

Trying to load obj & mtl file with Three.js in React Native

Main objective: load animated models exported from Maya into a React Native app.
Exported files: obj, mtl & png.
I have set up https://github.com/react-community/react-native-webgl in my React Native project and it is working properly.
Now, when I try to load the MTL file using the MTLLoader, I get the following error:
Can't find variable: document
Apparently, the MTLLoader calls TextureLoader, which internally calls a load function that references document. So what could be the solution to this?
Here are the two files that I am using:
three.js
const THREE = require("three");
global.THREE = THREE;
if (!window.addEventListener)
  window.addEventListener = () => { };
// require("three/examples/js/renderers/Projector");
require("three/examples/js/loaders/MTLLoader");
require("three/examples/js/loaders/OBJLoader");
export default THREE;
ThreeView.js
import React, { Component } from "react";
import { StyleSheet, View } from "react-native";
import { WebGLView } from "react-native-webgl";
import THREE from "./three";
import { image } from "src/res/image";

export default class ThreeView extends Component {
  requestId: *;

  componentWillUnmount() {
    cancelAnimationFrame(this.requestId);
  }

  onContextCreate = (gl: WebGLRenderingContext) => {
    const rngl = gl.getExtension("RN");
    const { drawingBufferWidth: width, drawingBufferHeight: height } = gl;
    const renderer = new THREE.WebGLRenderer({
      canvas: {
        width,
        height,
        style: {},
        addEventListener: () => { },
        removeEventListener: () => { },
        clientHeight: height
      },
      context: gl
    });
    renderer.setSize(width, height);
    renderer.setClearColor(0xffffff, 1);
    let camera, scene;
    let cube;

    function init() {
      camera = new THREE.PerspectiveCamera(75, width / height, 1, 1100);
      camera.position.y = 150;
      camera.position.z = 500;
      scene = new THREE.Scene();
      var mtlLoader = new THREE.MTLLoader();
      mtlLoader.load('female-croupier-2013-03-26.mtl', function (materials) {
        materials.preload();
        var objLoader = new THREE.OBJLoader();
        objLoader.setMaterials(materials);
        objLoader.load('female-croupier-2013-03-26.obj', function (object) {
          scene.add(object);
        }, onLoading, onErrorLoading);
      }, onLoading, onErrorLoading);
    }

    const onLoading = (xhr) => {
      console.log((xhr.loaded / xhr.total * 100) + '% loaded');
    };
    const onErrorLoading = (error) => {
      console.log('An error happened', error);
    };
    const animate = () => {
      this.requestId = requestAnimationFrame(animate);
      renderer.render(scene, camera);
      // cube.rotation.y += 0.05;
      gl.flush();
      rngl.endFrame();
    };
    init();
    animate();
  };

  render() {
    return (
      <View style={styles.container}>
        <WebGLView
          style={styles.webglView}
          onContextCreate={this.onContextCreate}
        />
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: "#fff",
    alignItems: "center",
    justifyContent: "center"
  },
  webglView: {
    width: 300,
    height: 300
  }
});
This error is, as others have said, caused by three.js trying to use browser features that React Native does not have.
I've gotten as far as being able to load the textures (which is the stage your error comes from) by monkey-patching the texture loader to use the loader in react-native-webgl. Add this in your init function (preferably right near the top).
// make sure you have defined renderer and rngl
/*
const renderer = new THREE.WebGLRenderer(...)
const rngl = gl.getExtension("RN");
*/
const loadTexture = async function (url, onLoad, onProgress, onError) {
  let textureObject = new THREE.Texture();
  console.log("loading", url, "with fancy texture loader");
  let properties = renderer.properties.get(textureObject);
  var texture = await rngl.loadTexture({ yflip: false, image: url });
  /*
  rngl.loadTexture({ image: url })
    .then(({ texture }) => {
  */
  console.log("Texture [" + url + "] Loaded!");
  texture.needsUpdate = true;
  properties.__webglTexture = texture;
  properties.__webglInit = true;
  console.log(texture);
  if (onLoad !== undefined) {
    // console.warn('loaded tex', texture);
    onLoad(textureObject);
  }
  // });
  return textureObject;
};
THREE.TextureLoader.prototype.load = loadTexture;
This solves the problem of loading the textures, and I can see them load in Charles, but they still don't render on the model, so I'm stuck past that point. Technically this is a correct answer, but you'll be stuck as soon as you've implemented it. I'm hoping you can comment back and tell me you've gotten further.
I had a similar setup and encountered the same issue. My option was to switch to JSONLoader, which doesn't need document to render in React Native. So I just loaded my model in Blender with the three.js add-on, then exported it as JSON. Check out this walkthrough of adding the three.js add-on to Blender:
https://www.youtube.com/watch?v=mqjwgTAGQRY
All the best.
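For illustration, a minimal sketch of loading such an export with the legacy THREE.JSONLoader (the file name is hypothetical, and this loader has since been removed from newer three.js releases):

// Sketch only: 'model.json' is a hypothetical path to the Blender JSON export.
var jsonLoader = new THREE.JSONLoader();
jsonLoader.load('model.json', function (geometry, materials) {
  var mesh = new THREE.Mesh(geometry, materials);
  scene.add(mesh); // add to the same scene created in init()
});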
This might get you closer:
The glTF format supports embedding texture images (as base64). If your asset pipeline allows it, you could convert to glTF and then load that into three.js in React Native.
I had to provide some window polyfills for decodeURIComponent and atob, because GLTFLoader uses FileLoader to parse the base64.
I've successfully loaded embedded buffers, but you'll need more polyfills to load textures: TextureLoader uses ImageLoader, which uses document.createElementNS.
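As an illustration of the kind of polyfill meant above (the base-64 package is an assumption; any atob-compatible decoder would do):

// Rough sketch: expose on window the globals that FileLoader expects in a browser.
import { decode } from 'base-64'; // assumed helper providing an atob-compatible decoder

if (typeof window !== 'undefined') {
  if (!window.atob) window.atob = decode;
  if (!window.decodeURIComponent) window.decodeURIComponent = decodeURIComponent;
}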
You are using the MTLLoader, which uses TextureLoader, and the TextureLoader uses the ImageLoader.
The ImageLoader uses the document.createElementNS() function.
What I did to solve this was to construct the THREE.js Texture directly:
let texture = new THREE.Texture(
  url // url = a base64 JPEG string in this case
);
(For the use of Texture, check the Texture documentation.)
Then I used the Image class from React Native (instead of the THREE.js Image, which requires the DOM to be constructed) and assigned it to the texture as a property:
import { Image } from 'react-native';
var img = new Image(128, 128);
img.src = url;
texture.normal = img;
And then finally map the texture onto the target material:
const mat = new THREE.MeshPhongMaterial();
mat.map = texture;
The React Native documentation explains how the React Native Image element can be used; it supports base64-encoded JPEGs.
Maybe there's a way for you to single out the part where it calls the TextureLoader and replace that part with this answer. Let me know how it works out.
Side note: I haven't tried to display this yet in my WebGLView, but in the logs it looked like a normal three.js object, so it's worth a try.
Use TextureLoader from expo-three
import { TextureLoader } from "expo-three";
import { NearestFilter } from "three";

const textureCache = {};

export function loadTexture(resource) {
  if (textureCache[resource]) {
    return textureCache[resource].clone();
  }
  const texture = new TextureLoader().load(resource);
  texture.magFilter = NearestFilter;
  texture.minFilter = NearestFilter;
  textureCache[resource] = texture;
  return texture;
}
Source: https://github.com/EvanBacon/Expo-Crossy-Road/blob/master/src/Node/Generic.js

Is it possible to use RGBA colors with PlanetaryJS elements?

I'm drawing some points over a planet drawn in PlanetaryJS, and I'd like to make them semi-transparent using an RGBA color.
planet.loadPlugin(function (planet) {
  planet.onDraw(function () {
    planet.withSavedContext(function (context) {
      var verylow = (function () {
        var verylow = null;
        $.ajax({
          'async': false,
          'global': false,
          'url': 'verylow.json',
          'dataType': "json",
          'success': function (data) {
            verylow = data;
          }
        });
        return verylow;
      })();
      context.beginPath();
      planet.path.context(context)(verylow);
      context.fillStyle = 'green';
      context.fill();
      //context.stroke();
      context.closePath();
    });
  });
});
context.fillStyle will also take hex codes, e.g.
context.fillStyle = '#999999';
but it won't take
context.fillStyle = 'rgba (255, 0, 0, 0.6)';
Is this just a limitation of the way that Planetary is designed, or can anyone suggest another way to get any level of transparency?
edit: verylow.json contains a bunch of "Point" objects, FYI.
You need to remove the space after the rgba.
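In other words, with the space dropped the canvas context parses the color string and the fill becomes semi-transparent:

context.fillStyle = 'rgba(255, 0, 0, 0.6)'; // no space between rgba and the parenthesis
context.fill();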