Download large object from AWS S3

I have an Angular web application that allows users to download files locally (installers). Some files exceed 1.5 GB in size, which causes the browser (Chrome) to crash when using a 'normal' s3.getObject(opts, function(err, data){}) call, since the entire file's binary data is buffered in memory.
I have tried to use other techniques, like streaming (StreamSaver.js), but with no luck.
I am trying to chunk the file data, but in the following code the 'httpData' event does not fire until the entire file's binary data is loaded, which seems to defeat the purpose of chunking. Either I am not understanding this event, or I have something misconfigured.
cache.S3.getObject({ Bucket: 'anduin-installers', Key: filePath })
    .on('httpDownloadProgress', function (progress) {
        $timeout(function () {
            pkg.Download.Progress = Math.floor((progress.loaded / progress.total) * 100.0);
        });
    })
    .on('httpData', function (chunk, response) {
        console.log('???');
    })
    .on('complete', function (response) {
        $timeout(function () {
            pkg.Download.Active = false;
            pkg.Download.Progress = 0;
        });
    })
    .send();
Any ideas on how to make the 'httpData' event fire as data chunks are received instead of waiting for the whole file? Or should I go with another solution?
Thanks!
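For what it's worth, one alternative worth sketching (an assumption on my part, not something from this thread): generate a pre-signed GET URL and let the browser perform the download natively, so the file streams straight to disk and never passes through JavaScript memory. This assumes the AWS SDK for JavaScript v2 and the same cache.S3 client as above:

// Sketch: pre-signed URL download; no file data is buffered in JS.
var url = cache.S3.getSignedUrl('getObject', {
    Bucket: 'anduin-installers',
    Key: filePath,
    Expires: 300,
    ResponseContentDisposition: 'attachment' // ask the browser to save rather than render
});
var a = document.createElement('a');
a.href = url;
document.body.appendChild(a);
a.click();
a.remove();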

Related

In which stage should I generate signed urls to load S3 bucket objects in an app?

I am using S3 to store images in my app.
This is the function that generates the signed url that the user can use to upload an image:
const key = `images/${Date.now()}.jpeg`;
s3_config
    .getImageSignedUrl(key)
    .then((url) => {
        res.status(200).send({ key, url });
    })
    .catch((error) => {
        res.status(500).send({
            message: "There was an error generating pre-signed url.",
        });
    });
So after the image is uploaded, the image URL will look like this:
https://BUCKET_NAME.s3.amazonaws.com/images/1667119739573.jpeg
Now, in order to make the images accessible only through the website, I will also use a signed URL.
This way, when someone uses the direct link:
https://BUCKET_NAME.s3.amazonaws.com/images/1667119739573.jpeg
they will get an AccessDenied error.
Only users within the app will be able to access the images using the signed URLs.
This is how I generated the signed url for loading an image:
var getImageReadSignedUrl = async function (key) {
    return new Promise((resolve, reject) => {
        s3.getSignedUrl(
            "getObject",
            {
                Bucket: AWS_BUCKET_NAME,
                Key: key,
                Expires: 300,
            },
            (err, url) => {
                if (err) {
                    reject(err);
                } else {
                    resolve(url);
                }
            }
        );
    });
};
And if I feed it an image key:
getImageReadSignedUrl("images/1667119739573.jpeg");
It will generate a signed url that will allow the user to access the private image:
https://BUCKET_NAME.s3.eu-west-3.amazonaws.com/images/1667119739573.jpeg?X-Amz-Algorithm=xxxxxxxxxxxxxxxxx&X-Amz-Credential=xxxxxxxxxxxxxxxxxxxx9%2Feu-xxxx-3%2Fs3%2Faws4_request&X-Amz-Date=202211xxxxxxxx35Z&X-Amz-Expires=300&X-Amz-Signature=5ab0exxxxxxxxxxxxxxxxxx8dc401dc7fxxxxxxxxa5124&X-Amz-SignedHeaders=host
Now, so far so good. Everything works perfectly as intended.
My problem is when, how, and where exactly I should use the function getImageReadSignedUrl.
Since in the database, I am saving the direct link to the image:
https://BUCKET_NAME.s3.amazonaws.com/images/1667119739573.jpeg
When the user is using the app, they will receive that URL.
And it will be used inside the img HTML tag to render the image.
Now, the question is: should I call getImageReadSignedUrl every time there's an image URL in the data sent back to the user, and send the signed URL instead?
Even though this makes sense, it means I will have to go through the entire backend and call that function every time there's an image to send back to the user.
Is there another approach that makes more sense and is not as tedious as this?
FYI, I am using the MERN stack and an EC2 instance.
Thank you.
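One common pattern (a sketch, not from the original post): keep only the object key in the database and sign at read time in one shared helper, so each route changes by a single line. The Post model and imageKey field below are hypothetical, and getImageReadSignedUrl is the function from above:

// Hypothetical helper: wraps any document that stores an S3 key.
const withSignedImageUrl = async (doc) => ({
    ...doc,
    imageUrl: await getImageReadSignedUrl(doc.imageKey),
});

// Hypothetical Express route using the helper.
app.get("/posts/:id", async (req, res) => {
    const post = await Post.findById(req.params.id).lean();
    res.status(200).send(await withSignedImageUrl(post));
});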

How to get a stream while a video is being recorded?

I'd like to upload data chunks to the server while recording a video (not after recording).
I tried to go with react-native-camera or react-native-vision-camera.
but as far as I know, the recordAsync method only resolves with the final version of the recorded video.
Is there any smart way to get video chunks or a stream during recording,
or should I use react-native-fs or rn-fetch-blob or something like that?
== update ==
I could probably achieve it the way it's done in the link below.
https://medium.com/react-native-training/build-youtube-alike-livestreams-with-react-native-8dde24adf543
If your problem is with regards to uploading a large file to the server, maybe you can go with react-native-background-upload, and show a progress notification, this will upload the file even when the app is in background.
There is also a package where chunked upload is possible by breaking the file into multiple chunks: react-native-chunk-upload
import axios from 'axios';
import ChunkUpload from 'react-native-chunk-upload';

const chunk = new ChunkUpload({
    path: response.path,         // Path to the file
    size: 10095,                 // Chunk size (must be a multiple of 3)
    fileName: response.fileName, // Original file name
    fileSize: response.size,     // Original file size
    // Errors
    onFetchBlobError: (e) => console.log(e),
    onWriteFileError: (e) => console.log(e),
});
chunk.digIn(this.upload.bind(this));

upload(file, next, retry, unlink) {
    const body = new FormData();
    body.append('video', file.blob); // param name
    axios.post('url', body, {
        headers: {
            "Content-Type": "multipart/form-data",
            "Accept": 'application/json',
            // Customize the headers
            "x-chunk-number": file.headers["x-chunk-number"],
            "x-chunk-total-number": file.headers["x-chunk-total-number"],
            "x-chunk-size": file.headers["x-chunk-size"],
            "x-file-name": file.headers["x-file-name"],
            "x-file-size": file.headers["x-file-size"],
            "x-file-identity": file.headers["x-file-identity"]
        }
    }).then((res) => {
        ...
    });
}
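For completeness, a sketch of what the elided .then handler might do with the next/retry/unlink callbacks the package passes to upload ({ headers } stands for the header object shown above; the res.data.done response shape is an assumed server contract, not part of the package):

axios.post('url', body, { headers })
    .then((res) => {
        // Assumed contract: the server replies { done: true } once all chunks have arrived.
        if (res.data.done) {
            unlink(file.path); // remove the temporary chunk file
        } else {
            next();            // tell the package to send the next chunk
        }
    })
    .catch(() => retry());     // re-send the current chunk on failure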

electron.js and sql - correct way to set it up?

I am new to electron.js - been reading the documentation and some similar posts here:
How do I make a database call from an Electron front end?
Secure Database Connection in ElectronJS Production App?
Electron require() is not defined
How to use preload.js properly in Electron
But it's still not super clear how to properly implement a secure SQL integration. Basically, I want to create a desktop database client. The app will connect to the remote DB, users will run all kinds of predefined queries, and the results will show up in the app.
The documentation says that if you are working with a remote connection you shouldn't run Node in the renderer. Should I then require the SQL module in the main process, use IPC to send data back and forth, and expose the IPC functions through a preload script?
Thanks for the help
Short answer: yes
Long answer:
Allowing Node in your renderer poses a big security risk for your app. Best practice in this case is to pass a function to your renderer through your preload script. There are a few options you can use to do this:
Pass an ipcRenderer.invoke function wrapped in another function to your renderer in your preload script. You can then invoke a call to your main process, which can either send info back via the same function, or send it via the window.webContents.send command while you listen for it on the window API in your renderer. E.g.:
Preload.js:
const { contextBridge, ipcRenderer } = require('electron');

const invoke = (channel, args, cb = () => { return; }) => {
    ipcRenderer.invoke(channel, args).then((res) => {
        cb(res);
    });
};
const handle = (channel, cb) => {
    ipcRenderer.on(channel, function (event, message) {
        cb(event, message);
    });
};
contextBridge.exposeInMainWorld("GlobalApi", {
    invoke: invoke,
    handle: handle,
});
Renderer:
let users;
window.GlobalApi.handle("users", (event, data) => { users = data; });
window.GlobalApi.invoke("get", "users");
or:
let users;
window.GlobalApi.invoke("get", "users", (data) => { users = data; });
Main:
const { ipcMain } = require('electron');

ipcMain.handle("get", async (event, path) => { // the first argument is the IPC event
    let data = dbFunctions.get(path);
    // 'window' is the BrowserWindow instance you created
    window.webContents.send(path, data);
    return data; // also resolves the renderer's invoke callback
});
Create a DB interface in your preload script that exposes certain functions to your renderer, which, when called, will return the value that you need from your DB. E.g.:
Renderer:
let users = window.myCoolApi.get("users");
Preload.js:
let get = function (path) {
    let data = dbFunctions.readSomeDataFromDB(path);
    return data; // Return the data itself
    // return dbFunctions.readSomeDataFromDB; Don't do this: never expose the DB function itself
};
contextBridge.exposeInMainWorld("myCoolApi", {
    get: get,
});
There are more options, but these should generally ensure security as far as my knowledge goes.
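To tie this back to the SQL part of the question, here is a minimal sketch of what dbFunctions.get might look like in the main process. The mysql2 package, the credentials, and the queries table are all assumptions; any SQL client works the same way as long as it lives only in the main process:

// main.js — sketch only.
const { ipcMain } = require('electron');
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
    host: 'db.example.com',            // your remote DB host
    user: 'app',
    password: process.env.DB_PASSWORD, // keep secrets out of the renderer entirely
    database: 'mydb',
});

// Whitelist of predefined queries; never accept raw SQL from the renderer.
const queries = {
    users: 'SELECT id, name FROM users',
};

ipcMain.handle('get', async (event, path) => {
    if (!queries[path]) throw new Error('Unknown query: ' + path);
    const [rows] = await pool.query(queries[path]);
    return rows; // resolves the renderer's invoke call
});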

Input form provides File - how do I upload it to Azure Blob storage using Vue?

I'm clearly missing something here so forgive me - all examples seem to involve express and I don't have express in my setup. I am using Vue.js.
Ultimately, want my client-side Vue app to be able to upload any file to azure blob storage.
I have the file (File API) from my Vue form. However, it does not provide a path (I believe this is for security reasons). The Azure docs have this snippet example:
const uploadLocalFile = async (containerName, filePath) => {
    return new Promise((resolve, reject) => {
        const fullPath = path.resolve(filePath);
        const blobName = path.basename(filePath);
        blobService.createBlockBlobFromLocalFile(containerName, blobName, fullPath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Local file "${filePath}" is uploaded` });
            }
        });
    });
};
Is this not the API I should be using? What should I be doing to upload any type of blob to blob storage?
UPDATE
Following @Adam Smith-MSFT's comments below, I have tried vue-azure-storage-upload but can't seem to get the files up to Azure.
startUpload () {
    if (!this.files || !this.baseUrl) {
        window.alert('Provide proper data first!')
    } else {
        this.files.forEach((file: File) => {
            this.$azureUpload({
                baseUrl: this.baseUrl + file.name,
                sasToken: this.sasToken,
                file: file,
                progress: this.onProgress,
                complete: this.onComplete,
                error: this.onError
                // blockSize
            })
        })
    }
},
According to the console, response.data is undefined, and when the onError method fires, it too gives me an undefined event.
I'd highly recommend checking the following tutorial: https://www.npmjs.com/package/vue-azure-blob-upload
The author used a specific npm package to upload blobs (you can also use the file service):
npm i --save vue-azure-blob-upload
I'd also recommend checking the Storage JS documentation: https://github.com/Azure/azure-storage-js/tree/master/file , it provides specific examples related to Azure File Storage as well.
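As one more option (my suggestion, not from the original answer): the current @azure/storage-blob SDK (v12) can upload a browser File directly to a SAS URL, which sidesteps the missing-file-path problem entirely. The account and container names below are placeholders:

// Sketch: browser-side upload with @azure/storage-blob (npm i @azure/storage-blob).
import { BlockBlobClient } from "@azure/storage-blob";

async function uploadBrowserFile(file, accountName, containerName, sasToken) {
    const url = `https://${accountName}.blob.core.windows.net/${containerName}/${file.name}?${sasToken}`;
    const client = new BlockBlobClient(url);
    // uploadData accepts a File/Blob directly; no local file path is needed.
    await client.uploadData(file, {
        blobHTTPHeaders: { blobContentType: file.type },
        onProgress: (ev) => console.log(`${ev.loadedBytes}/${file.size} bytes`),
    });
}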

Local storage solutions for large data including images on React Native

Here's the flow of how my end-product should work:
When the user opens the app for the first time, fetch all the data, i.e. the images (150+) and relevant JSON objects.
On opening the app subsequently, the images and data should load from local storage, i.e. no need for internet at all.
I know it seems weird but this is my use case:
The product is a Wayfinder running on an Android Box (55-inch touchscreen TV) which will be placed in a shopping mall. It will not have access to the internet unless I manually connect it.
Hence it should load the data when opened for the first time, i.e. when I'm configuring the application.
Solutions I have come across:
Realm: Local database management with excellent support for react-native - my option right now
Native Async Storage: Not suitable for large data
SQLite: Not comfortable with SQL queries
I'm still looking for options on how differently this problem can be tackled. Also, I'm familiar with Redux.
Thanks.
Check out react-native-fs (or expo-file-system if working with expo).
It is specially designed to store files on the device. In your component, it would look something like this:
const RNFS = require('react-native-fs');

RNFS.downloadFile({ fromUrl: myURL, toFile: myFilePath })
    .promise
    .then(res => console.log('Done'));
Use the PouchDB database; it works with the browser's local IndexedDB.
Make an XHR request for the image, convert the response to binary data, and store it in the local database.
When you need to preview the image, get it from the database, make a blob URL, and show it in an img tag.
axios.get(url, {
    progress: false,
    responseType: 'arraybuffer',
    onDownloadProgress: (progressEvent) => {
        const percent = (100 * progressEvent.loaded / progressEvent.total);
        console.log(percent);
    }
})
.then(resp => {
    // get the db
    let db = $db.dbModel;
    // store the binary data as an attachment on the document
    db.get(doc._id).then((doc) => {
        db.putAttachment(doc._id, 'index.mp4', doc._rev, new Blob([new Uint8Array(resp.data)], { type: 'video/mp4' }), 'video/mp4')
            .then(res => {
                // console.log('file stored successfully')
            });
    });
});
https://github.com/mohammadnazari110/pwa_offline_video_download
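The preview step described above isn't shown in the snippet; here is a minimal sketch, reusing the same db, doc, and attachment name (swap in an image attachment and an img element for the image case):

// Read the attachment back as a Blob and point the element at a blob URL.
db.getAttachment(doc._id, 'index.mp4').then((blob) => {
    const blobUrl = URL.createObjectURL(blob);
    document.querySelector('video').src = blobUrl; // e.g. imgEl.src for images
});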