GCloud Function & GStorage: 'write after end' error when sending a downloaded file - express

My goal is to retrieve a file from Google Storage and then send it back in the response. The problem is that when I launch this function for the first time, it crashes with Error [ERR_STREAM_WRITE_AFTER_END]: write after end. The next executions work fine.
exports = module.exports = region(defaultRegion).https.onRequest(async (req, res): Promise<void> => {
  const [authError] = await to(handleAuth(req, res));
  if (authError) {
    res.status(500).send(authError.message);
    return;
  }

  const { assetId, contentType, path } = req.query;
  const file = bloqifyStorage.bucket().file(`assets/${assetId}/${path}`);
  const fileExists = (await file.exists())[0];
  if (!fileExists) {
    res.status(404).send(`${path} was not found`);
    return;
  }

  const fileStream = file.createReadStream();
  fileStream.pipe(res).on('end', (): void => {
    res.setHeader('content-type', contentType);
  });
});
What am I missing here?
Edit: removing the setHeader call doesn't solve it. For example:
const fileStream = file.createReadStream();
// res.setHeader('content-type', contentType);
fileStream.pipe(res).on('error', (e): void => {
  console.log(e);
});
It will print the same error.
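For reference, here is a minimal sketch of the usual streaming pattern, with the header set before piping and errors handled on the read stream itself (pipe() returns the destination, so the .on('error') above is attached to res, not to the file stream). This is only an assumption about the intended behaviour, not a confirmed fix for the first-invocation crash:
const fileStream = file.createReadStream();
// Set headers before any data is written to the response.
res.setHeader('content-type', contentType as string);

fileStream.on('error', (err: Error): void => {
  // If nothing has been sent yet, we can still answer with a status code.
  if (!res.headersSent) {
    res.status(500).send(err.message);
  } else {
    res.end();
  }
});

fileStream.pipe(res);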

Related

Accessing cached images to upload to S3, in React Native

Using react-native-vision-camera, I saw that there is a path for the image, and it seems readable by the React Native image tag.
I attempted to upload the file using this path (I used file:// for Android, and the same for iOS), but it failed. Each time the file was detected as "jpeg" or "jpg", but I couldn't access it.
After downloading the file (from the Amazon S3 bucket I uploaded to) and converting the .jpg to .txt, I only find the "file://" path inside.
import { Storage } from "aws-amplify";

export default async function s3UploadBackup(file, user) {
  let formatted_date = moment().format("DD-MM-YYYY");
  let filePath = file.split("/");
  let fileImageName = filePath[filePath.length - 1];

  try {
    console.log("Files contains :" + JSON.stringify(file));
    // example of one of the URLs I used: "file:///storage/emulated/0/Android/data/com.app/files/Pictures/image-c64a66b3-489d-4af6-bf93-7adb507ceda1790666367.jpg"
    const fileName = `${formatted_date}---${user.businessName}---${user.phoneNumber}---${user.location}---${fileImageName}`;
    return Storage.put(uploadBackup.path + user.sub + "/" + user.phoneNumber + "/" + fileName, file, {
      // contentType: "image/jpeg"
      contentType: file.mime
    });
  } catch (error) {
    console.log(error);
  }
}
AWS Amplify supports uploading a file as a Blob and converting it to the specified file extension (JPEG, PNG, ...).
Assume that we have a local file URI: file:///storage/emulated/0/Android/data/com.app/files/Pictures/image-c64a66b3-489d-4af6-bf93-7adb507ceda1790666367.jpg
Let's refactor the s3UploadBackup function:
import { Storage } from "aws-amplify";

export default async function s3UploadBackup(file, user) {
  let formatted_date = moment().format("DD-MM-YYYY");
  let filePath = file.split("/");
  let fileImageName = filePath[filePath.length - 1];

  try {
    console.log("Files contains :" + JSON.stringify(file));
    // example of one of the URLs I used: "file:///storage/emulated/0/Android/data/com.app/files/Pictures/image-c64a66b3-489d-4af6-bf93-7adb507ceda1790666367.jpg"
    const fileName = `${formatted_date}---${user.businessName}---${user.phoneNumber}---${user.location}---${fileImageName}`;

    // Read the local file URI into a Blob via XMLHttpRequest.
    const blob = await new Promise((resolve, reject) => {
      const xhr = new XMLHttpRequest();
      xhr.onload = function () {
        resolve(xhr.response);
      };
      xhr.onerror = function (e) {
        reject(new TypeError("Network request failed"));
      };
      xhr.responseType = "blob";
      xhr.open("GET", file, true); // `file` is the local file:// URI
      xhr.send(null);
    });

    await Storage.put("yourKeyHere", blob, {
      contentType: "image/jpeg", // contentType is optional
    });

    // We're done with the blob, close and release it
    blob.close();
  } catch (error) {
    console.log(error);
  }
}
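A hypothetical call site, just to illustrate the expected arguments (the URI and the user fields below are made up, not taken from the original post):
// Hypothetical usage: `uri` comes from the camera/picker, `user` holds the fields used above.
const uri = "file:///storage/emulated/0/Android/data/com.app/files/Pictures/photo.jpg";
await s3UploadBackup(uri, {
  sub: "user-sub",
  businessName: "Acme",
  phoneNumber: "+10000000000",
  location: "NYC",
});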

getDownloadURL from Firebase Storage (Expo CLI)

I am trying to display an image from Firebase Storage. Below is the file location copied from Firebase Storage; it is a JPEG file:
profile/2186uPKjgo4pMOQNm0Cm/profilepic
The following code returned an error:
useEffect(() => {
  function geturl() {
    const filename = "profile/" + userid + "/profilepic.jpeg";
    var ref = firebase.storage().ref(filename);
    console.log(filename);
    // This returns the exact file name
    ref.getDownloadURL().then((url) => {
      console.log(url);
    });
  }
  geturl();
}, []);
I got the error [object Object]. After that, I tried the following code with async/await:
useEffect(() => {
  async function geturl() {
    const filename = "profile/" + userid + "/profilepic.jpeg";
    var ref = firebase.storage().ref(filename);
    console.log("inside geturl");
    const downloadurl = await ref.getDownloadURL();
    console.log(downloadurl);
  }
  geturl();
}, []);
Now I'm getting the following error:
Possible Unhandled Promise Rejection (id: 29):
"code_": "storage/object-not-found",
"message_": "Firebase Storage: Object 'profile/2186uPKjgo4pMOQNm0Cm/profilepic.jpeg' does not exist.",
"name_": "FirebaseError",
"serverResponse_": "{
\"error\": {
\"code\": 404,
\"message\": \"Not Found. Could not get object\",
\"status\": \"GET_OBJECT\"
}
}",
}
Please let me know how I can get the URL.
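As an aside, the path copied from Storage (profile/2186uPKjgo4pMOQNm0Cm/profilepic) has no .jpeg extension, while the code requests profilepic.jpeg. Here is a small sketch that asks for the key exactly as it appears in the console and catches the rejection, assuming userid matches the folder name:
useEffect(() => {
  async function getUrl() {
    // Use the object key exactly as shown in the Firebase console (no ".jpeg" suffix).
    const filename = "profile/" + userid + "/profilepic";
    try {
      const downloadUrl = await firebase.storage().ref(filename).getDownloadURL();
      console.log(downloadUrl);
    } catch (err) {
      // storage/object-not-found ends up here instead of an unhandled rejection.
      console.log(err.code, err.message);
    }
  }
  getUrl();
}, []);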
Here you go, you can use this function; it uploads the image to Firebase Storage and gets the image URI at the same time:
const uploadImage = async () => {
  const response = await fetch(image);
  const blob = await response.blob();

  let filename = image.substring(image.lastIndexOf('/') + 1);
  const ext = filename.split('.').pop();
  const name = filename.split('.').slice(0, -1).join('.');
  filename = name + Date.now() + '.' + ext;

  try {
    var ref = firebase.storage().ref().child('post-images/' + filename);
    const snapshot = await ref.put(blob);
    const downloadURL = await snapshot.ref.getDownloadURL();
    console.log('Successfully uploaded file and got download link');
    return downloadURL;
  } catch (error) {
    console.log(error);
    return null;
  }
};
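A hypothetical way to call it, assuming image holds a local URI from an image picker and the call happens inside an async function:
// Hypothetical usage; uploadImage() resolves to the download URL, or null on failure.
const downloadURL = await uploadImage();
if (downloadURL) {
  console.log('Image available at', downloadURL);
}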

Why, when I upload a file with apollo-server, is the file uploaded but 0 KB?

I tried to solve the problem, but I don't understand why the file is uploaded yet its size is 0 KB.
I saw this code in a tutorial, and it works there, but it doesn't work for me:
const { ApolloServer, gql } = require('apollo-server');
const path = require('path');
const fs = require('fs');

const typeDefs = gql`
  type File {
    url: String!
  }

  type Query {
    hello: String!
  }

  type Mutation {
    fileUpload(file: Upload!): File!
  }
`;

const resolvers = {
  Query: {
    hello: () => 'Hello world!',
  },
  Mutation: {
    fileUpload: async (_, { file }) => {
      const { createReadStream, filename, mimetype, encoding } = await file;
      const stream = createReadStream();
      const pathName = path.join(__dirname, `/public/images/${filename}`);
      await stream.pipe(fs.createWriteStream(pathName));

      return {
        url: `http://localhost:4000/images/${filename}`,
      };
    },
  },
};

const server = new ApolloServer({
  typeDefs,
  resolvers,
});

server.listen().then(({ url }) => {
  console.log(`🚀 Server ready at ${url}`);
});
Then when I upload the file, it is uploaded, but the file is 0 KB.
What is happening is the resolver is returning before the file has uploaded, causing the server to respond before the client has finished uploading. You need to promisify and await the file upload stream events in the resolver.
Here is an example:
https://github.com/jaydenseric/apollo-upload-examples/blob/c456f86b58ead10ea45137628f0a98951f63e239/api/server.js#L40-L41
In your case:
// Assumes these imports at the top of the file:
// const { createWriteStream, unlink } = require("fs");
const resolvers = {
  Query: {
    hello: () => "Hello world!",
  },
  Mutation: {
    fileUpload: async (_, { file }) => {
      const { createReadStream, filename } = await file;
      const stream = createReadStream();
      // Don't shadow the `path` module with a local variable of the same name.
      const storedPath = path.join(__dirname, `/public/images/${filename}`);

      // Store the file in the filesystem.
      await new Promise((resolve, reject) => {
        // Create a stream to which the upload will be written.
        const writeStream = createWriteStream(storedPath);

        // When the upload is fully written, resolve the promise.
        writeStream.on("finish", resolve);

        // If there's an error writing the file, remove the partially written
        // file and reject the promise.
        writeStream.on("error", (error) => {
          unlink(storedPath, () => {
            reject(error);
          });
        });

        // In Node.js <= v13, errors are not automatically propagated between
        // piped streams. If there is an error receiving the upload, destroy the
        // write stream with the corresponding error.
        stream.on("error", (error) => writeStream.destroy(error));

        // Pipe the upload into the write stream.
        stream.pipe(writeStream);
      });

      return {
        url: `http://localhost:4000/images/${filename}`,
      };
    },
  },
};
Note that it's probably not a good idea to use the filename like that to store the uploaded files, as future uploads with the same filename will overwrite earlier ones. I'm not really sure what will happen if two files with the same name are uploaded at the same time by two clients.
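For example, here is a minimal sketch of one way to avoid those collisions: prefix the stored name with something unique (crypto.randomUUID is just one option, not part of the original answer, and needs Node 14.17+):
const crypto = require("crypto"); // at the top of the file, next to the other requires

// Inside the fileUpload resolver, before writing the file: prefixing with a
// random UUID keeps identical client filenames from overwriting each other.
const storedName = `${crypto.randomUUID()}-${filename}`;
const storedPath = path.join(__dirname, `/public/images/${storedName}`);

// ...write the upload to storedPath exactly as above, then:
return {
  url: `http://localhost:4000/images/${storedName}`,
};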

Cannot upload image to S3 using the Serverless Framework, but it works offline (buffer issue)

I'm trying to deploy a lambda function allowing me to upload a picture to S3.
The lambda works well offline, but when I deploy it to AWS, the function doesn't work.
The first error I encountered was this one:
ERROR (node:7) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
So I followed the recommendation to use the Buffer.from() method instead, but that doesn't work either. The lambda runs until it times out.
Can someone tell me where I went wrong, or suggest another solution?
Below is my lambda function:
const AWS = require("aws-sdk");
const Busboy = require("busboy");
const uuidv4 = require("uuid/v4");
require("dotenv").config();

AWS.config.update({
  accessKeyId: process.env.ACCESS_KEY_ID,
  secretAccessKey: process.env.SECRET_ACCESS_KEY,
  subregion: process.env.SUB_REGION
});

const s3 = new AWS.S3();

const getContentType = event => {
  // see the second block of code
};

const parser = event => {
  // see the third block of code
};

module.exports.main = (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;

  const uuid = uuidv4();

  const uploadFile = async (image, uuid) =>
    new Promise(() => {
      // const bitmap = new Buffer(image, "base64"); // <====== deprecated
      const bitmap = Buffer.from(image, "base64"); // <======== problem here
      const params = {
        Bucket: "my_bucket",
        Key: `${uuid}.jpeg`,
        ACL: "public-read",
        Body: bitmap,
        ContentType: "image/jpeg"
      };
      s3.putObject(params, function (err, data) {
        if (err) {
          return callback(null, "ERROR");
        }
        return callback(null, "SUCCESS");
      });
    });

  parser(event).then(() => {
    uploadFile(event.body.file, uuid);
  });
};
getContentType():
const getContentType = event => {
  const contentType = event.headers["content-type"];
  if (!contentType) {
    return event.headers["Content-Type"];
  }
  return contentType;
};
parser():
const parser = event =>
  new Promise((resolve, reject) => {
    const busboy = new Busboy({
      headers: {
        "content-type": getContentType(event)
      }
    });
    const result = {};

    busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
      file.on("data", data => {
        result.file = data;
      });
      file.on("end", () => {
        result.filename = filename;
        result.contentType = mimetype;
      });
    });

    busboy.on("field", (fieldname, value) => {
      result[fieldname] = value;
    });

    busboy.on("error", error => reject(error));

    busboy.on("finish", () => {
      event.body = result;
      resolve(event);
    });

    busboy.write(event.body, event.isBase64Encoded ? "base64" : "binary");
    busboy.end();
  });
new Buffer(number) // Old
Buffer.alloc(number) // New
new Buffer(string) // Old
Buffer.from(string) // New
new Buffer(string, encoding) // Old
Buffer.from(string, encoding) // New
new Buffer(...arguments) // Old
Buffer.from(...arguments) // New
You are using callbackWaitsForEmptyEventLoop, which basically makes the Lambda function think that the work is not over yet. Also, you are wrapping the upload in a promise but never resolving it. You can simplify this logic using the built-in .promise() method of the aws-sdk:
module.exports.main = async event => {
  const uuid = uuidv4();

  await parser(event); // not sure if this needs to be async or not. check

  const bitmap = Buffer.from(event.body.file, "base64");
  const params = {
    Bucket: "my_bucket",
    Key: `${uuid}.jpeg`,
    ACL: "public-read",
    Body: bitmap,
    ContentType: "image/jpeg"
  };

  const response = await s3.putObject(params).promise();
  return response;
};
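If the function is invoked through API Gateway (which the headers and isBase64Encoded handling suggest), the handler probably also needs to return a proxy-style response rather than the raw S3 result. A minimal sketch, assuming the Lambda proxy integration and reusing the parser above:
// Sketch only: wrap the result in an API Gateway proxy response.
module.exports.main = async event => {
  const uuid = uuidv4();
  await parser(event);

  await s3.putObject({
    Bucket: "my_bucket",
    Key: `${uuid}.jpeg`,
    ACL: "public-read",
    Body: Buffer.from(event.body.file, "base64"),
    ContentType: "image/jpeg"
  }).promise();

  // API Gateway (proxy integration) expects a statusCode/body shape.
  return {
    statusCode: 200,
    body: JSON.stringify({ key: `${uuid}.jpeg` })
  };
};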

React Native FS copyFile() never resolves

I have to copy a file picked with react-native-document-picker to my app's temporary path and then read it, but when it reaches the await ReactNativeFS.copyFile(realPath, tempPath); line, it never resolves. Here's the code:
searchAndReadFiles = async () => {
  try {
    const fileSelected = await DocumentPicker.pick({
      type: DocumentPicker.types.plainText,
    });
    const decodedURI = decodeURIComponent(fileSelected.uri);
    const split = decodedURI.split('/');
    const name = split.pop();
    const inbox = split.pop();
    const realPath = `${ReactNativeFS.TemporaryDirectoryPath}/${name}`;
    const tempPath = `${ReactNativeFS.ExternalStorageDirectoryPath}/${fileSelected.name}`;
    await ReactNativeFS.copyFile(realPath, tempPath);
    const fileRead = await ReactNativeFS.readFile(tempPath);
  } catch (err) {
    console.warn(err);
  }
}
So I discovered that the copyFile() method decodes the path by itself. I just passed the encoded URI (exactly as received from document-picker's pick() method) as the argument, and it worked fine. Thank you.
searchAndReadFiles = async () => {
  try {
    const fileSelected = await DocumentPicker.pick({
      type: DocumentPicker.types.allFiles,
    });
    const destPath = `${ReactNativeFS.CachesDirectoryPath}/${fileSelected.name}`;
    await ReactNativeFS.copyFile(fileSelected.uri, destPath);
    ...