How to delete an uploaded file in Sails.js (req.file)? - file-upload

In Sails.js, one can receive an uploaded file like this:

myControllerAction: function(req, res) {
  req.file('avatar').upload(function(err, uploadedFiles) {
    // the uploaded avatar image will be available here
    console.log(uploadedFiles[0]);
  });
}
Suppose I received a file, but it is not properly formatted the way I want. I would just reply with an error. One thing I would like to do is make sure the received file does not remain in the filesystem (i.e. if it exists somewhere, delete it). How can I ensure that?

Just use Node's fs module to delete the uploaded file:

const fs = require('fs');

fs.unlink(insertFilePathHere, function(err) {
  if (err) return console.log(err); // handle the error as you wish
  // file deleted... continue your logic
});
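With Sails' default Skipper disk adapter, each entry in uploadedFiles exposes the path it was written to on its fd property, so a rough sketch of the combined flow (the response methods and validation step are placeholders) could look like this:

myControllerAction: function(req, res) {
  const fs = require('fs');

  req.file('avatar').upload(function(err, uploadedFiles) {
    if (err) return res.serverError(err);
    if (uploadedFiles.length === 0) return res.badRequest('No file was uploaded.');

    // Suppose the file fails validation: remove it from disk before replying.
    // `fd` is the path where the default skipper-disk adapter stored the upload.
    fs.unlink(uploadedFiles[0].fd, function(unlinkErr) {
      if (unlinkErr) console.log(unlinkErr); // log and continue; the reply is an error either way
      return res.badRequest('File was not in the expected format.');
    });
  });
}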

Related

How to get a stream while a video is being recorded?

I'd like to upload data chunks to the server while recording a video (not after recording).
I tried to go with react-native-camera or react-native-vision-camera,
but as far as I know, the recordAsync method only resolves with the final version of the recorded video.
Is there any smart way to get video chunks or a stream during recording,
or should I use react-native-fs or rn-fetch-blob or something like that?
== update ==
I could probably achieve it the way it's done in the link below.
https://medium.com/react-native-training/build-youtube-alike-livestreams-with-react-native-8dde24adf543
If your problem is about uploading a large file to the server, you could go with react-native-background-upload and show a progress notification; this will upload the file even when the app is in the background (a rough sketch follows).
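A minimal sketch of that approach, based on the package's documented startUpload/addListener API (the URL and file path below are placeholders):

import Upload from 'react-native-background-upload';

const options = {
  url: 'https://example.com/upload',          // placeholder endpoint
  path: 'file:///path/to/recorded/video.mp4', // placeholder local file path
  method: 'POST',
  type: 'multipart',
  field: 'video',
  headers: { 'content-type': 'multipart/form-data' },
  notification: { enabled: true },            // progress notification
};

Upload.startUpload(options)
  .then((uploadId) => {
    Upload.addListener('progress', uploadId, (data) => {
      console.log(`Progress: ${data.progress}%`);
    });
    Upload.addListener('completed', uploadId, () => {
      console.log('Upload completed!');
    });
    Upload.addListener('error', uploadId, (data) => {
      console.log(`Error: ${data.error}`);
    });
  })
  .catch((err) => console.log('Upload could not start', err));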
There is also a package that supports chunked upload by breaking the file into multiple chunks: react-native-chunk-upload.
import ChunkUpload from 'react-native-chunk-upload';
import axios from 'axios';

const chunk = new ChunkUpload({
  path: response.path,         // Path to the file
  size: 10095,                 // Chunk size (must be a multiple of 3)
  fileName: response.fileName, // Original file name
  fileSize: response.size,     // Original file size
  // Errors
  onFetchBlobError: (e) => console.log(e),
  onWriteFileError: (e) => console.log(e),
});

chunk.digIn(this.upload.bind(this));

upload(file, next, retry, unlink) {
  const body = new FormData();
  body.append('video', file.blob); // param name

  axios.post('url', body, {
    headers: {
      'Content-Type': 'multipart/form-data',
      'Accept': 'application/json',
      // Customize the headers
      'x-chunk-number': file.headers['x-chunk-number'],
      'x-chunk-total-number': file.headers['x-chunk-total-number'],
      'x-chunk-size': file.headers['x-chunk-size'],
      'x-file-name': file.headers['x-file-name'],
      'x-file-size': file.headers['x-file-size'],
      'x-file-identity': file.headers['x-file-identity']
    }
  }).then((res) => {
    ...
  });
}

React-native: download and unzip large language file

I have a multilingual react-native app. Each language bundle is ~50MB, so it doesn't make sense to include all of them in the app bundle. What do I do about it?
I assume the right way to go here is to download the respective language file upon language selection.
What do I do with it next? Am I supposed to store it using AsyncStorage or what?
Briefly, you will:
Store JSON as ZIP in Google Storage (save memory/bandwidth/time)
Unzip file to JSON (in RN)
Store JSON in AsyncStorage (in RN)
Retrieve from AsyncStorage (in RN)
[Dependencies summary] You can do this using these deps:
react-native
react-native-async-storage
rn-fetch-blob
react-native-zip-archive
Tip: always store big language JSON files in zip format (this can save up to 90% of the size).
I made a quick test here: a 3.52MB JSON file turned into a 26KB zipped file!
Let's assume your stored zip file can be accessed via a public URL, e.g. https://storage.googleapis.com/bucket/folder/lang-file.zip.
Install and link all of the RN deps above; this is required to get it working.
Import the deps
import RNFetchBlob from 'rn-fetch-blob';
import { unzip } from 'react-native-zip-archive';
import AsyncStorage from '@react-native-community/async-storage';
Download the file using rn-fetch-blob. This can be done like so:

RNFetchBlob
  .config({
    // add this option so the response data is stored as a file;
    // this is much more performant.
    fileCache: true,
  })
  .fetch('GET', 'http://www.example.com/file/example.zip', {
    // some headers ..
  })
  .then((res) => {
    // the temp file path
    console.log('The file saved to ', res.path());
    // Unzip will be called here!
    unzipDownloadFile(res.path(), (jsonFilePath) => {
      // Let's store this json.
      storeJSONtoAsyncStorage(jsonFilePath);
      // Done!
      // Now you can read it from AsyncStorage every time you need it (using the function below).
    });
  });
[function] Unzip the downloaded file using react-native-zip-archive:

function unzipDownloadFile(zipPath, cb) {
  const sourcePath = zipPath;                // the downloaded zip file
  const targetPath = `${zipPath}-extracted`; // directory to extract into
  const charset = 'UTF-8';

  unzip(sourcePath, targetPath, charset)
    .then((path) => {
      console.log(`unzip completed at ${path}`);
      // Pass the extracted JSON file back to the caller
      // (adjust the file name to whatever is inside your zip).
      return cb(`${path}/lang-file.json`);
    })
    .catch((error) => {
      console.error(error);
    });
}
[function] Store the JSON in AsyncStorage:

function storeJSONtoAsyncStorage(path) {
  RNFetchBlob.fs.readFile(path, 'utf8')
    .then((data) => {
      AsyncStorage.setItem('myJSON', data);
    });
}
Retrieve the JSON data from AsyncStorage (any time you want):

AsyncStorage.getItem('myJSON', (err, json) => {
  if (err) {
    console.log(err);
  } else {
    const myJSON = JSON.parse(json);
    // ... do what you need with your JSON lang file here ...
  }
});

That's enough to get dynamic JSON lang files working in React Native.
I'm using this approach to provide a similar feature in my i18n'ed project.
Yes, you are right to make the translation file downloadable.
You can store the downloaded file in the document directory of your app.
After that you can use a package to load the translations, for instance
https://github.com/fnando/i18n-js.
I would also suggest taking a look at the i18n library, which is a standard tool for internationalisation in JavaScript.
Consider taking a look at its documentation page, where you can find options for loading a translation bundle or for setting up a backend provider and hooking into it.
Also, to answer the storage question: if you do not plan on setting up a backend, AsyncStorage would be an appropriate place to store your key/translation-text pairs.
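To tie this back to the previous answer, a minimal sketch of feeding the stored bundle into i18n-js (the 'myJSON' key matches the AsyncStorage snippet above; the locale name is a placeholder):

import i18n from 'i18n-js';
import AsyncStorage from '@react-native-community/async-storage';

async function applyDownloadedLocale(locale) {
  // 'myJSON' is the key used when the downloaded file was stored above.
  const json = await AsyncStorage.getItem('myJSON');
  if (!json) return;

  // Register the downloaded bundle for the chosen locale and activate it.
  i18n.translations[locale] = JSON.parse(json);
  i18n.locale = locale;
  i18n.fallbacks = true;
}

// e.g. after the user picks French:
// applyDownloadedLocale('fr').then(() => console.log(i18n.t('greeting')));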

Input form provides File - how do I upload it to Azure Blob storage using Vue?

I'm clearly missing something here, so forgive me - all examples seem to involve Express, and I don't have Express in my setup. I am using Vue.js.
Ultimately, I want my client-side Vue app to be able to upload any file to Azure Blob Storage.
I have the file (File API) from my Vue form. However, it does not provide a path (I believe this is for security reasons). The Azure docs have this example snippet:
// Node-side example from the Azure docs; it assumes Node's path module and a
// blobService created with the azure-storage SDK, so it reads from the local filesystem.
const uploadLocalFile = async (containerName, filePath) => {
  return new Promise((resolve, reject) => {
    const fullPath = path.resolve(filePath);
    const blobName = path.basename(filePath);
    blobService.createBlockBlobFromLocalFile(containerName, blobName, fullPath, err => {
      if (err) {
        reject(err);
      } else {
        resolve({ message: `Local file "${filePath}" is uploaded` });
      }
    });
  });
};
Is this not the API I should be using? What should I be doing to upload any type of blob to Blob Storage?
UPDATE
Following @Adam Smith-MSFT's comments below, I have tried vue-azure-storage-upload but can't seem to get the files up to Azure.
startUpload () {
  if (!this.files || !this.baseUrl) {
    window.alert('Provide proper data first!')
  } else {
    this.files.forEach((file:File) => {
      this.$azureUpload({
        baseUrl: this.baseUrl + file.name,
        sasToken: this.sasToken,
        file: file,
        progress: this.onProgress,
        complete: this.onComplete,
        error: this.onError
        // blockSize
      })
    })
  }
},
According to the console, response.data is undefined, and when the onError method fires, that too gives me an undefined event.
I'd highly recommend checking the following tutorial: https://www.npmjs.com/package/vue-azure-blob-upload
The author used a specific npm package (you could also use the file service) to upload the objects:
npm i --save vue-azure-blob-upload
I'd also recommend checking the Storage JS documentation: https://github.com/Azure/azure-storage-js/tree/master/file , which provides specific examples related to Azure File Storage as well.
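As an alternative to that package, a minimal browser-side sketch using the official @azure/storage-blob SDK could look roughly like this (the container URL and SAS token are placeholders, and the SAS must grant create/write permissions):

import { BlockBlobClient } from '@azure/storage-blob';

// Upload a File object obtained from an <input type="file"> in a Vue form.
async function uploadToAzure(file) {
  // Placeholder account/container/SAS values: adjust to your storage account.
  const containerUrl = 'https://myaccount.blob.core.windows.net/mycontainer';
  const sasToken = '?sv=...'; // SAS token with write/create permissions

  const blobUrl = `${containerUrl}/${encodeURIComponent(file.name)}${sasToken}`;
  const blockBlobClient = new BlockBlobClient(blobUrl);

  // uploadData accepts a Blob/File directly in the browser.
  await blockBlobClient.uploadData(file, {
    blobHTTPHeaders: { blobContentType: file.type },
    onProgress: (ev) => console.log(`Uploaded ${ev.loadedBytes} bytes`),
  });
}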

Deleting an image when deleting an item using multer

OK, so I am using multer to upload an image for an item that a user submits. However, when the user deletes the item, the image file is still on the server.
I have been trying to figure this out, and I am not sure if it's because findByIdAndRemove() deletes the item before I make the call to remove the image or what, but I am getting an error saying that it can't find the file name of undefined. I am using a promise, thinking it would let me do this, but I am not sure where else to go. I am new at this and learning as I go. Here is my delete route:
router.delete("/item/:id", middleware.isLoggedIn, (req, res) => {
  Promise.all([
    (Item.findByIdAndRemove(req.params.id),
    fs.unlinkSync("./public/uploads/" + req.item.image))
  ])
    .then(() => {
      return res.render("products");
    })
    .catch(err => {
      return console.log("err", err.stack);
    });
});
Any suggestions?
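One common fix, sketched here under the assumption that Item is a Mongoose model whose image field holds the stored file name, is to let findByIdAndRemove resolve first (it returns the removed document) and then unlink the file using that document rather than req.item:

const fs = require('fs');
const path = require('path');

router.delete('/item/:id', middleware.isLoggedIn, async (req, res) => {
  try {
    // findByIdAndRemove resolves with the removed document,
    // so the image name is still available here.
    const item = await Item.findByIdAndRemove(req.params.id);
    if (!item) return res.status(404).send('Item not found');

    // Remove the uploaded image that multer stored on disk.
    await fs.promises.unlink(path.join('./public/uploads', item.image));

    return res.render('products');
  } catch (err) {
    console.log('err', err.stack);
    return res.status(500).send('Could not delete item');
  }
});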

Download large object from AWS S3

I have an Angular web application which allows users to download files locally (installers). Some files exceed 1.5 GB in size, which causes the browser (Chrome) to crash when using 'normal' s3.getObject(opts, function(err, data){}) calls, since the entire file's binary data is cached...?
I have tried to use other techniques, like streaming (StreamSaver.js), but with no luck.
I am trying to chunk the file data, but in the following code, the 'httpData' event does not get called until the entire file's binary data has loaded, which seems to defeat the purpose of chunking. Either I am not understanding this event, or I have something misconfigured.
cache.S3.getObject({ Bucket: 'anduin-installers', Key: filePath })
  .on('httpDownloadProgress', function (progress) {
    $timeout(function () {
      pkg.Download.Progress = Math.floor((progress.loaded / progress.total) * 100.0);
    });
  })
  .on('httpData', function (chunk, response) {
    console.log('???');
  })
  .on('complete', function (response) {
    $timeout(function () {
      pkg.Download.Active = false;
      pkg.Download.Progress = 0;
    });
  })
  .send();
Any ideas on how to make the 'httpData' event fire as data chunks are received, instead of waiting for the whole file? Or should I go with another solution?
Thanks!
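One possible alternative is to request the object in byte ranges via getObject's Range parameter, handing each chunk to a streaming saver (e.g. StreamSaver.js) as it arrives, so only one window of data is buffered at a time. A rough sketch with the AWS SDK v2 (bucket, key, and chunk size are placeholders):

const CHUNK_SIZE = 8 * 1024 * 1024; // 8 MB per request (placeholder)

async function downloadInChunks(s3, bucket, key, onChunk) {
  // Find the total object size first.
  const head = await s3.headObject({ Bucket: bucket, Key: key }).promise();
  const total = head.ContentLength;

  for (let start = 0; start < total; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE, total) - 1;
    // Ranged GET: only this byte window is held in memory at a time.
    const part = await s3.getObject({
      Bucket: bucket,
      Key: key,
      Range: `bytes=${start}-${end}`,
    }).promise();

    onChunk(part.Body, start, total); // e.g. write to a StreamSaver WritableStream
  }
}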