Input form provides File - how do I upload it to Azure Blob storage using Vue? - vue.js

I'm clearly missing something here, so forgive me: all the examples I've found involve Express, and I don't have Express in my setup. I am using Vue.js.
Ultimately, I want my client-side Vue app to be able to upload any file to Azure Blob Storage.
I have the file (File API) from my Vue form. However, it does not provide a path (I believe this is for security reasons). The Azure docs have this snippet example:
const uploadLocalFile = async (containerName, filePath) => {
  return new Promise((resolve, reject) => {
    const fullPath = path.resolve(filePath);
    const blobName = path.basename(filePath);
    blobService.createBlockBlobFromLocalFile(containerName, blobName, fullPath, err => {
      if (err) {
        reject(err);
      } else {
        resolve({ message: `Local file "${filePath}" is uploaded` });
      }
    });
  });
};
Is this not the API I should be using? What should I be doing to upload any type of blob to Blob Storage?
UPDATE
Following @Adam Smith-MSFT's comments below, I have tried vue-azure-storage-upload but can't seem to get the files up to Azure.
startUpload () {
  if (!this.files || !this.baseUrl) {
    window.alert('Provide proper data first!')
  } else {
    this.files.forEach((file: File) => {
      this.$azureUpload({
        baseUrl: this.baseUrl + file.name,
        sasToken: this.sasToken,
        file: file,
        progress: this.onProgress,
        complete: this.onComplete,
        error: this.onError
        // blockSize
      })
    })
  }
},
According to the console, response.data is undefined, and when the onError method fires, it too gives me an undefined event.

I'd highly recommend checking the following tutorial: https://www.npmjs.com/package/vue-azure-blob-upload
The author uses a specific npm package to upload blobs (you can do the same with the File service):
npm i --save vue-azure-blob-upload
I'd also recommend checking the Azure Storage JS documentation: https://github.com/Azure/azure-storage-js/tree/master/file ; it provides specific examples related to Azure File Storage as well.
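If you'd rather not pull in a package, a browser File can also be uploaded with a single authenticated PUT against the Blob REST API. Below is a minimal sketch, assuming you already have a valid SAS token; the account, container, and token values are placeholders, and the helper names are mine:

```javascript
// Build the blob URL from its parts; all four arguments are placeholders here.
function buildBlobUrl (account, container, blobName, sasToken) {
  return `https://${account}.blob.core.windows.net/${container}/` +
    `${encodeURIComponent(blobName)}?${sasToken}`
}

// Upload a File object from an <input type="file"> straight to Blob Storage.
// The x-ms-blob-type header is required for a single-shot block blob PUT.
async function uploadFileToAzure (file, account, container, sasToken) {
  const url = buildBlobUrl(account, container, file.name, sasToken)
  const res = await fetch(url, {
    method: 'PUT',
    headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': file.type || 'application/octet-stream'
    },
    body: file
  })
  if (!res.ok) {
    throw new Error(`Upload failed: ${res.status}`)
  }
}
```

Note that a single-shot PUT like this is size-limited by the service; larger files have to be split into blocks, which is what the blockSize option in the package above is for.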

Related

"The original argument must be of type function" ERROR for promisifying client.zrem?

I am making a cron job instance running under Node that removes posts from my Redis cache.
I want to promisify client.zrem for removing many posts from the cache, to ensure they are all removed, but when running my code I get the error below on the line client.zrem = util.promisify(client.zrem):
"TypeError [ERR_INVALID_ARG_TYPE]: The "original" argument must be of type function. Received undefined"
I have another Node instance that runs this same code with no errors, and I have updated npm to the latest version, per a similar SO question, but I am still getting the error:
TypeError [ERR_INVALID_ARG_TYPE]: The "original" argument must be of type Function. Received type undefined
Any idea how I can fix this?
const Redis = require("redis")
const util = require("util")

const client = Redis.createClient({
  url: process.env.REDIS,
})

client.zrem = util.promisify(client.zrem) // ERROR THROWN HERE

// DELETE ONE POST
const deletePost = async (deletedPost) => {
  await client.zrem("posts", JSON.stringify(deletedPost))
}

// DELETES MANY POSTS
const deleteManyPosts = (postsToDelete) => {
  postsToDelete.map(async (post) => {
    await client.zrem("posts", JSON.stringify(post))
  })
}

module.exports = { deletePost, deleteManyPosts }
Node Redis 4.x introduced several breaking changes. Adding support for Promises was one of them; renaming the methods to be camel-cased was another. Details can be found in the README in the GitHub repo for Node Redis.
You simply need to delete the offending line and rename the calls to .zrem to .zRem.
I've also noticed that you aren't explicitly connecting to Redis after creating the client. You'll want to do that.
Try this:
const Redis = require("redis")

const client = Redis.createClient({
  url: process.env.REDIS,
})

client.on('error', (err) => console.log('Redis Client Error', err));

// CONNECT TO REDIS
// NOTE: this code assumes that the Node.js version supports top-level await
await client.connect();

// DELETE ONE POST
const deletePost = async (deletedPost) => {
  await client.zRem("posts", JSON.stringify(deletedPost))
}

// DELETES MANY POSTS
const deleteManyPosts = (postsToDelete) => {
  postsToDelete.map(async (post) => {
    await client.zRem("posts", JSON.stringify(post))
  })
}

module.exports = { deletePost, deleteManyPosts }
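As a side note, ZREM accepts multiple members in one command, so deleteManyPosts can be collapsed into a single round trip. A sketch of that idea (the serialization helper name is mine, and client is assumed to be a connected node-redis v4 client):

```javascript
// Turn an array of post objects into the string members stored in the sorted set.
const serializePosts = (posts) => posts.map((post) => JSON.stringify(post))

// Remove all posts in one ZREM round trip instead of one call per post.
const deleteManyPosts = async (client, postsToDelete) => {
  if (postsToDelete.length === 0) return 0
  // node-redis v4 accepts an array of members for variadic commands like ZREM
  return client.zRem("posts", serializePosts(postsToDelete))
}
```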

Storage.put() throwing - AWSS3Provider - error uploading TypeError: Cannot read property 'byteLength' of undefined

JavaScript Framework (Vue)
Amplify APIs (Storage)
Amplify Categories (storage)
Getting Cannot read property 'byteLength' of undefined while uploading media to an S3 bucket using Amplify Storage; below is the code I am currently using:
async onUpload(fileArr) {
  if (fileArr.length > 0) {
    console.log("fileArr", fileArr);
    fileArr.map(async (obj) => {
      try {
        console.log({ Storage, Amplify });
        console.log("Object =>", obj);
        let baseData = await this.toBase64(obj);
        console.log("Base Data =>", baseData);
        const arrayBuffer = decode(baseData);
        console.log("Array buffer =>", arrayBuffer);
        let result = await Storage.put(`hub/${obj.name}`, arrayBuffer, {
          contentType: obj.type,
        });
        console.log("S3 Upload Result =>", result);
      } catch (err) {
        console.log("Error in uploading", err);
      }
    });
  }
},
I tried converting the media to base64 before uploading but still get the same error.
ERROR OUTPUT
Error - AWSS3Provider - error uploading TypeError: Cannot read property 'byteLength' of undefined
I granted full access to my IAM user and role, but that didn't work either. I have looked for multiple solutions available online but still haven't figured it out.
I also faced this problem while uploading data to S3 using Amplify.
The problem is not in your code; it's a library version issue. Make sure to use the same Amplify version that is used on the server.
The latest Amplify version at the time of writing is 4.3.0:
https://www.npmjs.com/package/aws-amplify
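Independent of the version issue, it's worth noting that Amplify's Storage.put accepts a browser File/Blob body directly, so the base64 round trip is usually unnecessary. A minimal sketch (the helper names are mine; Storage is assumed to be the configured aws-amplify Storage module):

```javascript
// Build the S3 key for a file; the 'hub/' prefix matches the original code.
function buildStorageKey (prefix, fileName) {
  return `${prefix}/${fileName}`
}

// Storage.put accepts a File/Blob body directly, so no base64 conversion is needed.
async function uploadToS3 (Storage, file) {
  return Storage.put(buildStorageKey('hub', file.name), file, {
    contentType: file.type,
  })
}
```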

Using AWS SDK (JS) for s3.selectObjectContent gives error on 'on' keyword

I'm using the AWS SDK for JavaScript version 2.730.0 (latest at the time of writing) in a TypeScript file in Node.js.
I'm using the selectObjectContent operation to query a CSV file, and following the guide in the documentation I have this block:
import * as S3 from 'aws-sdk/clients/s3';
const s3 = new S3();
...
s3.selectObjectContent(params, (err, data) => {
  if (!err) {
    data.Payload.on('data', (event) => {
      // Do something with returned records
    });
  }
});
The line data.Payload.on('data', (event) => { is giving this error in the linter:
Property 'on' does not exist on type 'EventStream<{ Records?: RecordsEvent; Stats?: StatsEvent; Progress?: ProgressEvent; Cont?: ContinuationEvent; End?: EndEvent; }>'.
What do I need to change for on to work?
I ran into the same problem myself. I found this post on another forum:
https://www.gitmemory.com/issue/aws/aws-sdk-js/3525/725076849
It does not explicitly show code to solve the problem, but based on the information there, I solved it as follows:
import { ReadStream } from "fs";

const eventStream = data.Payload as ReadStream;
eventStream.on("data", ({ Records, Stats, Progress, Cont, End }) => {
  // ...
});
TypeScript no longer complains.
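For reference, only the Records event carries the selected bytes (in its Payload), so a handler typically accumulates them into a string. A small sketch of that, with a helper name of my own choosing:

```javascript
// Pull the selected CSV text out of an S3 Select event.
// Events other than Records (Stats, Progress, Cont, End) carry no row data.
function extractRecords (event) {
  if (event.Records && event.Records.Payload) {
    return event.Records.Payload.toString()
  }
  return ''
}

// Usage against the cast stream from the answer above:
//   let csv = ''
//   eventStream.on('data', (event) => { csv += extractRecords(event) })
```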

React-native: download and unzip large language file

A multilingual React Native app. Each language bundle is ~50MB, so it doesn't make sense to include all of them in the bundle. What do I do about it?
I assume the right way to go here is to download the respective language file upon language selection.
What do I do with it next? Am I supposed to store it using AsyncStorage, or what?
Briefly, you will:
Store the JSON as a ZIP in Google Storage (saves memory/bandwidth/time)
Unzip the file to JSON (in RN)
Store the JSON in AsyncStorage (in RN)
Retrieve it from AsyncStorage (in RN)
[Dependencies Summary] You can do this using these deps:
react-native
react-native-async-storage
rn-fetch-blob
react-native-zip-archive
Tip: always store a big language JSON in zip format (this can save up to 90% of the size).
I made a quick test here: a 3.52MB JSON file turned into a 26KB zipped file!
Let's assume your stored zip file can be accessed via a public URL, e.g.: https://storage.googleapis.com/bucket/folder/lang-file.zip.
Install and link all of the above RN deps; this is required to get this working.
Import the deps
import RNFetchBlob from 'rn-fetch-blob';
import { unzip } from 'react-native-zip-archive';
import AsyncStorage from '@react-native-community/async-storage';
Download the file using rn-fetch-blob. This can be done using:
RNFetchBlob
  .config({
    // add this option so the response data is stored as a file,
    // which is much more performant.
    fileCache: true,
  })
  .fetch('GET', 'http://www.example.com/file/example.zip', {
    // some headers ..
  })
  .then((res) => {
    // the temp file path
    console.log('The file saved to ', res.path())
    // Unzip will be called here!
    unzipDownloadFile(res.path(), (jsonFilePath) => {
      // Let's store this json.
      storeJSONtoAsyncStorage(jsonFilePath);
      // Done!
      // Now you can read from AsyncStorage whenever you need (using the function below).
    });
  });
[function] Unzip the downloaded file, using react-native-zip-archive (note: the source must be the downloaded zip and the target a destination folder):
function unzipDownloadFile(zipPath, cb) {
  const sourcePath = zipPath;                // the downloaded .zip file
  const targetPath = `${zipPath}-extracted`; // folder to unzip into
  const charset = 'UTF-8';
  unzip(sourcePath, targetPath, charset)
    .then((path) => {
      console.log(`unzip completed at ${path}`)
      // pass the JSON file inside the extracted folder
      // (this assumes the archive contains a single lang-file.json)
      return cb(`${path}/lang-file.json`);
    })
    .catch((error) => {
      console.error(error)
    });
}
[function] Store JSON in AsyncStorage:
function storeJSONtoAsyncStorage(path) {
  RNFetchBlob.fs.readFile(path, 'utf-8')
    .then((data) => {
      AsyncStorage.setItem('myJSON', data);
    });
}
Retrieve the JSON data from AsyncStorage (whenever you want):
AsyncStorage.getItem('myJSON', (err, json) => {
  if (err) {
    console.log(err);
  } else {
    const myJSON = JSON.parse(json);
    // ... do what you need with your json lang file here ...
  }
})
That's enough to get dynamic JSON lang files working in React Native.
I'm using this approach to provide a similar feature in my i18n'ed project.
Yes, you are right to make the translation file downloadable.
You can store the downloaded file in the document directory of your app.
After that you can use a package to load the translations, for instance:
https://github.com/fnando/i18n-js
I would also suggest taking a look at the i18n library, which is a standard tool for internationalisation in JavaScript.
Consider taking a look at its documentation page, where you can find options for loading a translation bundle or setting up a backend provider and hooking into it.
Also, to answer the storage question: if you do not plan on setting up a backend, AsyncStorage would be an appropriate place to store your key/translation-text pairs.
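To make the last step concrete, here is a minimal sketch of turning a stored JSON string into translations. The lookup helper is hand-rolled to keep the example self-contained; with a library like i18n-js you would instead assign the parsed object to its translations store. The sample JSON content is hypothetical:

```javascript
// Resolve a dotted key like "home.title" against a parsed translation object.
function translate (translations, key) {
  return key.split('.').reduce(
    (node, part) => (node == null ? undefined : node[part]),
    translations
  )
}

// Example: what a parsed lang file might look like (hypothetical content).
const json = '{"home":{"title":"Welcome","cta":"Get started"}}'
const translations = JSON.parse(json)
```

Usage: translate(translations, 'home.title') looks up the nested value, and a missing key resolves to undefined so you can fall back to a default locale.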

In cloud code seems impossible to use Parse.Config.get() with express is it correct?

Is there any way to use Parse.Config.get() inside an Express app hosted in Cloud Code?
It looks easy to use Parse.Object and Parse.User, but with Parse.Config.get() the code does not work once deployed using "parse deploy".
We managed to use it by adding the JS SDK in the HTML and using "frontend JS", but haven't found any way to use it directly in Express controllers.
Thanks
It seems to be related to some kind of permissions issue...
var Parse = require('parse-cloud-express').Parse;
var Util = require('util')

Parse.Cloud.define("currentConfig", function(request, response) {
  console.log('Ran currentConfig cloud function.');
  // why do I have to do this?
  Parse.initialize(xxx, yyy);
  Parse.Config.get().then(function(config) {
    // never called
    // ...
    console.log(config.get('xxx'))
  }, function(error) {
    console.log(Util.inspect(error))
  });
});
Output:
Ran currentConfig cloud function.
{ code: undefined, message: 'unauthorized' }
Edited code which works for me:
var Parse = require('parse-cloud-express').Parse;
var Util = require('util')

Parse.initialize("appId", "restApiKey", "masterKey");

Parse.Cloud.define("currentConfig", function(request, response) {
  console.log('Ran currentConfig cloud function.');
  Parse.Cloud.useMasterKey();
  Parse.Config.get().then(function(config) {
    // now resolves correctly
    console.log(config.get('xxx'))
  }, function(error) {
    console.log(Util.inspect(error))
  });
});
EDIT: Add solution :)