I'm using the Adobe PDF Embed API and want to save annotated PDFs from the browser window to Firebase Storage.
The file is uploaded to Firebase, but it is corrupt and only about 9 bytes in size.
Please see the code below. Is there something I need to do with "content" in the callback?
I've also attached a picture of the console.log output.
const previewConfig = {
    embedMode: "FULL_WINDOW",
    showAnnotationTools: true,
    showDownloadPDF: true,
    showPrintPDF: true,
    showPageControls: true
}

document.addEventListener("adobe_dc_view_sdk.ready", function () {
    var adobeDCView = new AdobeDC.View({
        clientId: "2eab88022c63447f8796b580d5058e71",
        divId: "adobe-dc-view"
    });
    adobeDCView.previewFile({
        content: { location: { url: decoded } },
        metaData: { fileName: decodedTitle }
    }, previewConfig);
    /* Register save callback */
    adobeDCView.registerCallback(
        AdobeDC.View.Enum.CallbackType.SAVE_API,
        async function (metaData, content, options) {
            console.log(metaData);
            console.log(content);
            var meta = {
                contentType: 'application/pdf'
            };
            var pdfRef = storageRef.child(decodedTitle);
            var upload = await pdfRef.put(content, meta);
            console.log('Uploaded a file!');
            return new Promise(function (resolve, reject) {
                /* Dummy implementation of Save API, replace with your business logic */
                setTimeout(function () {
                    var response = {
                        code: AdobeDC.View.Enum.ApiResponseCode.SUCCESS,
                        data: {
                            metaData: Object.assign(metaData, { updatedAt: new Date().getTime() })
                        },
                    };
                    resolve(response);
                }, 2000);
            });
        }
    );
});
In the end I was able to use putString() in Firebase Storage to upload the PDF.
Before, I was only using put(), which resulted in a corrupt file.
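For reference, a minimal sketch of the change, assuming the content argument passed to the SAVE_API callback is an ArrayBuffer that has to be converted to a base64 string before calling putString():

/* Sketch only: assumes `content` is an ArrayBuffer and that `storageRef` and
   `decodedTitle` are defined as in the question. */
async function (metaData, content, options) {
    // Convert the ArrayBuffer to a base64 string
    const bytes = new Uint8Array(content);
    let binary = '';
    for (let i = 0; i < bytes.length; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    const base64 = btoa(binary);

    // Upload with putString() instead of put()
    const pdfRef = storageRef.child(decodedTitle);
    await pdfRef.putString(base64, 'base64', { contentType: 'application/pdf' });

    return {
        code: AdobeDC.View.Enum.ApiResponseCode.SUCCESS,
        data: {
            metaData: Object.assign(metaData, { updatedAt: new Date().getTime() })
        }
    };
}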
I am trying to upload a lot of files from S3 to IPFS via Pinata. I haven't found anything like that in the Pinata documentation.
This is my solution, using the form-data library. I haven't tested it yet (I will soon, I still need to code a few things).
Is this a correct approach? Has anyone done something similar?
async uploadImagesFolder(
    items: ItemDocument[],
    bucket?: string,
    path?: string,
) {
    try {
        const form = new FormData();
        for (const item of items) {
            const file = getObjectStream(item.tokenURI, bucket, path);
            form.append('file', file, {
                filename: item.tokenURI,
            });
        }
        console.log(`Uploading files to IPFS`);
        const pinataOptions: PinataOptions = {
            cidVersion: 1,
        };
        const result = await pinata.pinFileToIPFS(form, {
            pinataOptions,
        });
        console.log(`Piñata Response:`, JSON.stringify(result, null, 2));
        return result.IpfsHash;
    } catch (e) {
        console.error(e);
    }
}
I had the same problem.
I found this: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But if I'm not wrong, the article uses a version different from the AWS SDK for JavaScript v3 (nowadays the most recent: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html).
This is for the client side with TypeScript.
If you are on that version, this code snippet works for me:
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3'

export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
    try {
        const BUCKET = data.bucketTarget
        const KEY = data.key
        const client = new S3Client({
            region: 'your-region',
            credentials: {
                accessKeyId: 'your-access-key',
                secretAccessKey: 'secret-key'
            }
        })
        const resource = await client.send(new GetObjectCommand({
            Bucket: BUCKET,
            Key: KEY
        }))
        const response = resource.Body
        if (response) {
            // Read the object body into memory and wrap it in a Blob
            return new Response(await response.transformToByteArray()).blob()
        }
        return null
    } catch (error) {
        return null
    }
}
With the previous code you can get a Blob object, pass it to the method below as the file, and get the resource URL back from the API:
import axios from 'axios'

export const uploadFileToIPFS = async (file: Response) => {
    const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
    const data = new FormData()
    data.append('file', file)
    try {
        const response = await axios.post(url, data, {
            maxBodyLength: Infinity,
            headers: {
                pinata_api_key: 'your-api',
                pinata_secret_api_key: 'your-secret'
            },
            data: data
        })
        return {
            success: true,
            pinataURL: `https://gateway.pinata.cloud/ipfs/${response.data.IpfsHash}`
        }
    } catch (error) {
        console.log(error)
        return null
    }
}
I found this solution in a nice article, where you can also explore other implementations (including the Node.js side).
I have been looking for a solution to convert an image from a base64 string into a Blob.
I get my images via react-native-image-crop-picker.
The image object I get is formatted this way:
{
    creationDate: "1299975445"
    cropRect: null
    data: "/9j...AA"
    duration: null
    exif: null
    filename: "IMG_0001.JPG"
    height: 2848
    localIdentifier: "10...001"
    mime: "image/jpeg"
    modificationDate: "1441224147"
    path: "/Users/...351F66445.jpg"
    size: 1896240
    sourceURL: "file:///Users/...IMG_0001.JPG"
    width: 4288
}
This means I have the path, the source URL, and the image data as a base64 string.
What I need is to upload the images that the user picks as Blob files to the server.
So is there any way to do the conversion in React Native?
PS: I have tried solutions I found online, but none of them seem to work for me.
urlToBlob = (url) => new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.onerror = reject;
    xhr.onreadystatechange = () => {
        if (xhr.readyState === 4) {
            resolve(xhr.response);
        }
    };
    xhr.open('GET', url);
    xhr.responseType = 'blob'; // convert type
    xhr.send();
})

this.urlToBlob(data)
    .then((blob) => {
        console.log(blob);
    });
I tried this piece of code and this is what I got in my console:
{
    _data: {
        blobId: "B69744A5-B8D7-4E6B-8D15-1C95069737EA"
        name: "Unknown"
        offset: 0
        size: 1896240
        type: "image/jpeg"
        __collector: null
    }
}
After a lot of searching I found a solution, in case anyone else faces this issue.
I used the package 'extract-files'; it allowed me to send the images as files to the server.
It worked like this:
import { ReactNativeFile } from 'extract-files';
const FormData = require('form-data');

_addImage = async () => {
    const { images } = this.state;
    if (images.imageData.length > 0) {
        const formData = new FormData();
        // converting images to files
        images.imageData.forEach((element) => {
            const file = new ReactNativeFile({
                uri: element.sourceURL,
                name: element.filename,
                type: element.mime,
            });
            formData.append('files', file);
        });
        // sending files to the server
        await addImages(formData).then((response) => {
            console.log('response : ', response);
            if (response.success) {
                this.pictures = response.filesNames;
                this.setState({
                    isLoading: true,
                    pictures: response.filesNames
                });
                return response.success;
            }
        }).catch((err) => {
            console.error(err);
        });
    }
}
I hope this is helpful.
We've created a Cloud Function that generates a PDF. The library that we're using is
https://www.npmjs.com/package/html-pdf
The problem is that when we try to execute the .create() method, it times out with the following errors:
"Error: html-pdf: PDF generation timeout. Phantom.js script did not exit.
at Timeout.execTimeout (/srv/node_modules/html-pdf/lib/pdf.js:91:19)
at ontimeout (timers.js:498:11)
This works fine on localhost, but the error happens when we deploy the function to GCP.
Some solutions we've already tried:
Solution #1
Yes, we've updated the timeout settings to
const options = {
    format: "A3",
    orientation: "portrait",
    timeout: "100000"
    // zoomFactor: "0.5"
    // orientation: "portrait"
};
and it still doesn't work.
Here's the final snippet that triggers the PDF function:
const options = {
    format: "A3",
    orientation: "portrait",
    timeout: "100000"
    // zoomFactor: "0.5"
    // orientation: "portrait"
};

try {
    // let pdfRes = await new Promise(async (resolve, reject) => {
    console.log("Before pdf.create()")
    let pdfResponse = await pdf.create(html, options).toFile(localPDFFile, async function (err, res) {
        if (err) {
            console.log(err)
        }
        console.log('response of pdf.create(): ', res);
        let uploadBucket = await bucket.upload(localPDFFile, {
            metadata: { contentType: "application/octet-stream" }
        });
        let docRef = await db
            .collection("Organizations")
            .doc(context.params.orgId)
            .collection("regulations")
            .doc(context.params.regulationId)
            .collection("reports")
            .doc(context.params.reportId);
        await docRef.update({
            pdf: {
                status: "created",
                reportName: pdfName
            }
        });
    });
} catch (error) {
    console.log('error: ', error);
}
I have seen many cases like this; even in my current project we use step functions (when cloud functions need more computational power, we divide them into chunks, i.e. mini cloud functions).
But I think step functions will not work in your case either, because you are using a single module.
In your case you should use Compute Engine to perform this operation.
Using a Promise, we can fix this timeout error:
var Handlebars = require('handlebars');
var pdf = require('html-pdf');

var options = {
    height: "10.5in", // allowed units: mm, cm, in, px
    width: "8in",     // allowed units: mm, cm, in, px
    timeout: 600000
};

var document = {
    html: html1,
    path: resolvedPath + "/" + filename,
    data: {}
};

var create = function(document, options) {
    return new Promise((resolve, reject) => {
        // Compiles a template
        var html = Handlebars.compile(document.html)(document.data);
        var pdfPromise = pdf.create(html, options);
        // Create PDF from html template generated by handlebars
        // Output will be PDF file
        pdfPromise.toFile(document.path, (err, res) => {
            if (!err)
                resolve(res);
            else
                reject(err);
        });
    });
}
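A minimal usage sketch for the create helper above; the then/catch handling is my own addition, not part of the original snippet:

// Hypothetical usage: generate the PDF and log where it was written
create(document, options)
    .then(function (res) {
        console.log('PDF written to', res.filename);
    })
    .catch(function (err) {
        console.error('PDF generation failed:', err);
    });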
This seems to be a problem with the HTML. My problem was that I had an image source linked to an image that had been deleted from the server, and that was what caused the timeout. I solved it by putting the image back at the server route the HTML referenced. I hope this is useful to someone.
With React Native on Android I am trying to send the user's profile image from the local cache to a Firebase Storage bucket. If I send it as a Blob or Uint8Array, when I open the image on Firebase I get the error: The image "https://firebasestorage<rest of url here>" cannot be displayed because it contains errors. If I send it as a base64 data URL, it does not upload to the bucket and I get the message: Firebase Storage: String does not match format 'base64': Invalid character found. I have tested the base64 data URL with a decoder and it works. How can I get this to work, either as Blob, Uint8Array, or base64? Here is the code:
As blob
let mime = 'image/jpeg';

getFile(imageUri)
    .then(data => {
        return new Blob([data], { type: mime });
    })
    .then(blob => {
        return imageRef.put(blob, { contentType: mime });
    })

async function getFile(imageUri) {
    let bytes = await FileSystem.readAsStringAsync(imageUri);
    return Promise.resolve(bytes);
}
As Uint8Array
let mime = 'image/jpeg';

getFile(imageUri)
    .then(data => {
        return imageRef.put(data, { contentType: mime });
    })

async function getFile(imageUri) {
    let bytes = await FileSystem.readAsStringAsync(imageUri);
    const imageBytes = new Uint8Array(bytes.length);
    for (let i = 0; i < imageBytes.length; i++) {
        imageBytes[i] = bytes.charCodeAt(i);
    }
    return Promise.resolve(imageBytes);
}
As base64 data url
imageBase64Url = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAADCAIAAADZSiLoAAAAF0lEQVQI12P8//8/AwMDAwMDEwMMIFgAVCQDA25yGkkAAAAASUVORK5CYII=";
return imageRef.putString(imageBase64Url, 'data_url');
The URI
I retrieve the uri from this object:
Object {
    "cancelled": false,
    "height": 60,
    "type": "image",
    "uri": "file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252FMCC_Project-ee81e7bd-82b1-4624-8c6f-8c882fb131c4/ImagePicker/6ec14b33-d2ec-4f80-8edc-2ee501bf6e92.jpg",
    "width": 80,
}
We found at least two problems with the way I was trying to retrieve the picture and send it to the Firebase bucket:
1) When retrieving the image from memory and trying to send it as a Blob to the bucket, FileSystem.readAsStringAsync(imageUri) was, for some reason, returning a corrupted file.
2) When instead trying to save the image to the Firebase bucket as base64, the problem seems to be with Firebase, since not even the examples provided at https://firebase.google.com/docs/storage/web/upload-files were working.
The solution:
We retrieved the image from the local cache with XMLHttpRequest instead of Expo's FileSystem, and saved it to the Firebase bucket as a Blob:
import React, { Component } from 'react';
import firebase from './firebase';

export default async function saveImage(picture, uid) {
    const storageRef = firebase
        .storage('gs://*<bucket-here>*')
        .ref(uid + '/' + 'profile-picture.jpeg');

    const blob = await new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        xhr.onload = function() {
            resolve(xhr.response);
        };
        xhr.onerror = function(e) {
            console.log(e);
            reject(new TypeError('Network request failed'));
        };
        xhr.responseType = 'blob';
        xhr.open('GET', picture.uri, true);
        xhr.send(null);
    });

    const metadata = {
        contentType: 'image/jpeg',
    };

    return (downloadURL = await new Promise((resolve, reject) => {
        try {
            storageRef.put(blob, metadata).then(snapshot => {
                snapshot.ref.getDownloadURL().then(downloadURL => {
                    resolve(downloadURL);
                });
            });
        } catch (err) {
            reject(err);
        }
    }));
}
I'm trying to build an image uploader with Meteor to Amazon S3. Thanks to Hubert OG, I've found AWS-SDK, which makes things easy.
My problem is that the uploaded data seems to be corrupt. When I download the file it says the file may be corrupt. Probably it is.
Inserting the data into an image src does work, and the preview of the image shows up as it is supposed to, so the original file, and probably the data, is correct.
I'm loading the file with FileReader and then passing the result data to the AWS-SDK putObject method.
var file = template.find('[type=file]').files[0];
var key = "uploads/" + file.name;
var reader = new FileReader();
reader.onload = function(event) {
    var data = event.target.result;
    template.find('img').src = data;
    Meteor.call("upload_to_s3", file, "uploads", reader.result);
};
reader.readAsDataURL(file);
and this is the method on the server:
"upload_to_s3":function(file,folder,data){
s3 = new AWS.S3({endpoint:ep});
s3.putObject(
{
Bucket: "myportfoliositebucket",
ACL:'public-read',
Key: folder+"/"+file.name,
ContentType: file.type,
Body:data
},
function(err, data) {
if(err){
console.log('upload error:',err);
}else{
console.log('upload was succesfull',data);
}
}
);
}
I wrapped an npm module as a smart package found here: https://atmosphere.meteor.com/package/s3policies
With it you can make a Meteor Method that returns a write policy, and with that policy you can upload to S3 using an ajax call.
Example:
Meteor.call('s3Upload', name, function (error, policy) {
    if (error)
        onFinished({ error: error });

    var formData = new FormData();
    formData.append("AWSAccessKeyId", policy.s3Key);
    formData.append("policy", policy.s3PolicyBase64);
    formData.append("signature", policy.s3Signature);
    formData.append("key", policy.key);
    formData.append("Content-Type", policy.mimeType);
    formData.append("acl", "private");
    formData.append("file", file);

    $.ajax({
        url: 'https://s3.amazonaws.com/' + policy.bucket + '/',
        type: 'POST',
        xhr: function() { // custom xhr
            var myXhr = $.ajaxSettings.xhr();
            if (myXhr.upload) { // check if upload property exists
                myXhr.upload.addEventListener('progress',
                    function (e) {
                        if (e.lengthComputable)
                            onProgressUpdate(e.loaded / e.total * 100);
                    }, false); // for handling the progress of the upload
            }
            return myXhr;
        },
        success: function () {
            // file finished uploading
        },
        error: function () { onFinished({ error: arguments[1] }); },
        processData: false,
        contentType: false,
        // Form data
        data: formData,
        cache: false,
        xhrFields: { withCredentials: true },
        dataType: 'xml'
    });
});
EDIT:
The "file" variable in the line: formData.append("file", file); is from a line similar to this: var file = document.getElementById('fileUpload').files[0];
The server side code looks like this:
Meteor.methods({
    s3Upload: function (name) {
        var myS3 = new s3Policies('my key', 'my secret key');
        var location = Meteor.userId() + '/' + moment().format('MMM DD YYYY').replace(/\s+/g, '_') + '/' + name;
        if (Meteor.userId()) {
            var bucket = 'my bucket';
            var policy = myS3.writePolicy(location, bucket, 10, 4096);
            policy.key = location;
            policy.bucket = bucket;
            policy.mimeType = mime.lookup(name);
            return policy;
        }
    }
});
The body should be converted to buffer – see the documentation.
So instead of Body: data you should have Body: new Buffer(data, 'binary').
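A minimal sketch of how that looks applied to the server method above; note that new Buffer() is deprecated in current Node versions, so Buffer.from() is used here as the equivalent:

// Sketch only: same putObject call as above, with the Body converted to a Buffer.
// Buffer.from(data, 'binary') is the non-deprecated form of new Buffer(data, 'binary').
s3.putObject(
    {
        Bucket: "myportfoliositebucket",
        ACL: 'public-read',
        Key: folder + "/" + file.name,
        ContentType: file.type,
        Body: Buffer.from(data, 'binary') // was: Body: data
    },
    function(err, res) {
        if (err) {
            console.log('upload error:', err);
        } else {
            console.log('upload was successful', res);
        }
    }
);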