Convert from buffer

I want to return a string from the function below.
I'm using the Node.js "crypto" library to encrypt, but this function returns a Buffer.
I tried .toString('utf8'), but it didn't work.
import fs from 'fs'
import crypto from 'crypto'

export function encryptText (plainText) {
  return crypto.publicEncrypt(
    {
      key: fs.readFileSync('public_key.pem', 'utf8'),
      padding: crypto.constants.RSA_PKCS1_OAEP_PADDING,
      oaepHash: 'sha256'
    },
    // We convert the data string to a buffer
    Buffer.from(plainText)
  )
}
Is it possible to convert it?
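A likely fix, as a minimal sketch: the encrypted output is raw binary, not valid UTF-8, so decoding it with .toString('utf8') mangles the bytes. Encoding the Buffer as base64 (or hex) gives a safe, reversible string:

import fs from 'fs'
import crypto from 'crypto'

export function encryptText (plainText) {
  const encrypted = crypto.publicEncrypt(
    {
      key: fs.readFileSync('public_key.pem', 'utf8'),
      padding: crypto.constants.RSA_PKCS1_OAEP_PADDING,
      oaepHash: 'sha256'
    },
    Buffer.from(plainText)
  )
  // base64 round-trips binary data safely; 'utf8' would corrupt it
  return encrypted.toString('base64')
}

// To recover the ciphertext bytes later, rebuild the Buffer with
// Buffer.from(encryptedString, 'base64') before decrypting.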

Related

Google Cloud Function -- Convert BigQuery Data to Gzip (Compressed) Json then Load to Cloud Storage

For context, this script is largely based on the one found in this guide from Google: https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table-json#bigquery_extract_table_json-nodejs
I have the below script, which is functioning. However, it writes a plain JSON file to Cloud Storage. To be a bit more optimized for file transfer and storage, I wanted to use const {pako} = require('pako'); to compress the files before loading.
I haven't been able to figure out how to accomplish this, unfortunately, after numerous attempts.
Anyone have any ideas?
I'm assuming it has something to do with the options in .extract(storage.bucket(bucketName).file(filename), options);, but again, I'm pretty lost on how to figure this out, unfortunately...
Any help would be appreciated! :)
The intent of this function is:
It is a Google Cloud function
It gets data from BigQuery
It writes that data in JSON format to Cloud Storage
My goal is to integrate Pako (or another means of compression) to compress the JSON files to gzip format prior to moving into storage.
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');
const functions = require('@google-cloud/functions-framework');

const bigquery = new BigQuery();
const storage = new Storage();

functions.http('extractTableJSON', async (req, res) => {
  // Exports my_dataset:my_table to gcs://my-bucket/my-file as JSON.
  // https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table-json#bigquery_extract_table_json-nodejs
  const DateYYYYMMDD = new Date().toISOString().slice(0, 10).replace(/-/g, "");
  const datasetId = "dataset-1";
  const tableId = "example";
  const bucketName = "domain.appspot.com";
  const filename = `/cache/${DateYYYYMMDD}/example.json`;

  // Location must match that of the source table.
  const options = {
    format: 'json',
    location: 'US',
  };

  // Export data from the table into a Google Cloud Storage file
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(storage.bucket(bucketName).file(filename), options);

  console.log(`Job ${job.id} created.`);
  res.send(`Job ${job.id} created.`);

  // Check the job's status for errors
  const errors = job.status.errors;
  if (errors && errors.length > 0) {
    res.send(errors);
  }
});
If you want to gzip-compress the result, simply use the gzip option:
// Location must match that of the source table.
const options = {
  format: 'json',
  location: 'US',
  gzip: true,
};
Job done ;)
From guillaume blaquiere: "Ah, you are looking for an array of rows!!! Ok, you can't have it out of the box. BigQuery exports a JSONL file (JSON Lines, with one valid JSON object per line, each representing a row in BQ)."
Turns out that I had a misunderstanding of the expected output. I was expecting a JSON Array, whereas the output is individual JSON lines, as Guillaume mentioned above.
So, if you're looking for a JSON array output, you can still use the helper below to convert the response; but it turns out the JSONL was in fact the expected output, and I was mistaken in thinking it was inaccurate (sorry, I'm new...).
// Step #1: Use the below options to export to compressed JSON (as per guillaume blaquiere's note)
const options = {
  format: 'json',
  location: 'US',
  gzip: true,
};

// Step #2 (if you're looking for a JSON array): use the below helper function to convert the response.
function convertToJsonArray(text: string): any {
  // Wrap in an array, turn each newline into a comma, and drop the trailing comma
  // (note: the trailing slice assumes the exported text ends with a newline)
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  const jsonArray = JSON.parse(wrappedText);
  return jsonArray;
}
For reference, in case it's helpful, I created this function that handles both compressed and uncompressed JSON responses.
The application of this is that I'm writing the BigQuery table to JSON in Cloud Storage to act as a "cache", then requesting that file from a React app and using the below to parse the file in the React app for use on the frontend.
import pako from 'pako';

function convertToJsonArray(text: string): any {
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  const jsonArray = JSON.parse(wrappedText);
  return jsonArray;
}

async function getJsonData(arrayBuffer: ArrayBuffer): Promise<any> {
  try {
    // Try to gunzip first; pako throws if the data isn't actually compressed
    const uint8Arr = pako.inflate(arrayBuffer);
    const decodedText = new TextDecoder().decode(uint8Arr);
    return convertToJsonArray(decodedText);
  } catch (error) {
    // Fall back to treating the buffer as plain (uncompressed) text
    console.log("Error unzipping file, trying to parse as is.", error);
    const parsedText = new TextDecoder().decode(arrayBuffer);
    return convertToJsonArray(parsedText);
  }
}
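A usage sketch from the React side, assuming the file is fetched with fetch() and read as an ArrayBuffer (the cache URL below is hypothetical; substitute your own bucket and path):

// Hypothetical cache URL for the exported file
const CACHE_URL = 'https://storage.googleapis.com/domain.appspot.com/cache/20230101/example.json';

async function loadCachedTable(): Promise<any> {
  const response = await fetch(CACHE_URL);
  const arrayBuffer = await response.arrayBuffer();
  // getJsonData handles both gzipped and plain JSONL exports
  return getJsonData(arrayBuffer);
}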

react-native-fs library readFile() function returns infinite base64 string

I got to check out the library react-native-fs because I have a task wherein I have to convert a video into a base64 string before sending it to our API. I am using the readFile function provided by this library to achieve that. But I am not sure why it returns a seemingly infinite base64-encoded string.
Here is my code snippet:
import RNFS from 'react-native-fs'

const callbackFunctionVideo = async (videoData: any) => {
  const videoUri = {
    uri: videoData.assets[0].uri,
    fileName: videoData.assets[0].fileName,
    type: videoData.assets[0].type
  }

  const base64_vid = RNFS.readFile(videoUri.uri, 'base64').then(res => res).catch(err => err)
  console.log(base64_vid)
}
Then when it logs, it prints a seemingly endless base64 string (I'm not sure if it is looping, though).
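A likely explanation, sketched below: RNFS.readFile returns a Promise, and a video file encodes to a very large base64 string, so the log output only looks infinite. Awaiting the Promise yields the full (large but finite) string:

import RNFS from 'react-native-fs'

const callbackFunctionVideo = async (videoData: any) => {
  const uri = videoData.assets[0].uri

  try {
    // Await the Promise instead of logging the Promise object
    const base64Vid = await RNFS.readFile(uri, 'base64')
    // Log the size rather than the whole payload to keep the console readable
    console.log(`base64 length: ${base64Vid.length} characters`)
  } catch (err) {
    console.log('readFile failed:', err)
  }
}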

Google Buckets / Read by line

I know that it is currently possible to download objects by byte range from Google Cloud Storage buckets.
const options = {
  destination: destFileName,
  start: startByte,
  end: endByte,
};

await storage.bucket(bucketName).file(fileName).download(options);
However, I would need to read by line as the files I deal with are *.csv:
await storage
  .bucket(bucketName)
  .file(fileName)
  .download({ destination: '', lineStart: number, lineEnd: number });
I couldn't find any API for it, could anyone advise on how to achieve the desired behaviour?
You cannot read a file line by line directly from Cloud Storage, as it stores files as opaque objects, as shown in this answer:
The string you read from Google Storage is a string representation of a multipart form. It contains not only the uploaded file contents but also some metadata.
To read the file line by line as desired, I suggest loading it into a variable and then parsing that variable as needed. You could use the sample code provided in this answer:
const { Storage } = require("@google-cloud/storage");
const Papa = require("papaparse");

const storage = new Storage();

// Read file from Storage
var downloadedFile = storage
  .bucket(bucketName)
  .file(fileName)
  .createReadStream();

// Concat data
let fileBuffer = "";
downloadedFile
  .on("data", function (data) {
    fileBuffer += data;
  })
  .on("end", function () {
    // CSV file data
    // console.log(fileBuffer);

    // Parse data using the newline character as delimiter
    var rows;
    Papa.parse(fileBuffer, {
      header: false,
      delimiter: "\n",
      complete: function (results) {
        // Shows the parsed data on the console
        console.log("Finished:", results.data);
        rows = results.data;
      },
    });
  });
To parse the data, you could use a library like PapaParse as shown on this tutorial.
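If you only need a specific range of lines, here is a sketch using Node's built-in readline module over the download stream; lineStart and lineEnd are hypothetical parameters of this helper, not a Cloud Storage API:

const { Storage } = require("@google-cloud/storage");
const readline = require("readline");

const storage = new Storage();

// Read lines [lineStart, lineEnd] (1-indexed) from a CSV object in a bucket
async function readLineRange(bucketName, fileName, lineStart, lineEnd) {
  const stream = storage.bucket(bucketName).file(fileName).createReadStream();
  const rl = readline.createInterface({ input: stream, crlfDelay: Infinity });

  const lines = [];
  let lineNumber = 0;
  for await (const line of rl) {
    lineNumber += 1;
    if (lineNumber >= lineStart) lines.push(line);
    if (lineNumber >= lineEnd) {
      rl.close();
      stream.destroy(); // stop downloading once we have what we need
      break;
    }
  }
  return lines;
}

Note that this still downloads the object from the start up to lineEnd; there is no server-side line index, so a true "skip to line N" is not possible with plain CSV objects.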

asyncStorage with ReactNative: JSON.parse doesn't when getting object back

Hi everyone!
I've stored a simple object in AsyncStorage in a React Native app.
But when I get it back, it isn't correctly parsed: all keys still have quote marks (added by JSON.stringify() when storing it)...
I store data like that:
const storeData = () => {
  let myData = {
    title: 'Hummus',
    price: '6.90',
    id: '1'
  }
  AsyncStorage.setItem('@storage_Key', JSON.stringify(myData));
}
and then access data like that:
const getData = async () => {
  const jsonValue = await AsyncStorage.getItem('@storage_Key')
  console.log(jsonValue);
  return JSON.parse(jsonValue);
}
and my object after parsing looks like that:
{"title":"Hummus","price":"6.90","id":"1"}
Any idea why the quotes aren't removed from the keys?
That's because the JSON specification says keys must be strings. What you are used to seeing is the modern representation of JSON called JSON5 (https://json5.org/). JSON5 is a superset of the JSON specification, and it does not require keys to be surrounded by quotes in some cases. When you stringify, it returns the result in JSON format.
Both JSON and JSON5 are equally valid in modern browsers, so you should not be worried about breaking anything programmatically just because they look different.
You can use JSON5 as shown below, and it will give you your desired stringified result.
let myData = {
  title: 'Hummus',
  price: '6.90',
  id: '1'
}

console.log(JSON5.stringify(myData));
console.log(JSON.stringify(myData));

<script src="https://unpkg.com/json5@^2.0.0/dist/index.min.js"></script>
Like this:
// JSON5.stringify
{title:'Hummus',price:'6.90',id:'1'}
// JSON.stringify
{"title":"Hummus","price":"6.90","id":"1"}

How can I encode a string in base64 using Meteor

I am trying to use a form to upload files to an S3 bucket using Meteor. I am following this Amazon article. At "Sign Your S3 POST Form", near the end, I need to encode a string to base64, but I've been unable to find a way to do this. Can anyone tell me how? Note that the string first needs to be encoded and then signed. This is how it's done in Python:
import base64
import hmac, hashlib
policy = base64.b64encode(policy_document)
signature = base64.b64encode(hmac.new(AWS_SECRET_ACCESS_KEY, policy, hashlib.sha1).digest())
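For reference, a rough Node-side equivalent of the Python above, as a sketch using the built-in crypto module (policyDocument and AWS_SECRET_ACCESS_KEY are placeholders for your own values):

const crypto = require('crypto');

// Base64-encode the policy, then sign it with HMAC-SHA1, mirroring the Python
const policy = Buffer.from(policyDocument).toString('base64');
const signature = crypto
  .createHmac('sha1', AWS_SECRET_ACCESS_KEY)
  .update(policy)
  .digest('base64');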
You can do this without the NodeJS crypto module; creating a package looked a bit like breaking a butterfly on the wheel to me, so I figured out this:
if (Meteor.isServer) {
  Meteor.methods({
    'base64Encode': function (unencoded) {
      return new Buffer(unencoded || '').toString('base64');
    },
    'base64Decode': function (encoded) {
      return new Buffer(encoded || '', 'base64').toString('utf8');
    },
    'base64UrlEncode': function (unencoded) {
      var encoded = Meteor.call('base64Encode', unencoded);
      return encoded.replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
    },
    'base64UrlDecode': function (encoded) {
      encoded = encoded.replace(/-/g, '+').replace(/_/g, '/');
      while (encoded.length % 4) {
        encoded += '=';
      }
      return Meteor.call('base64Decode', encoded);
    }
  });

  console.log(Meteor.call('base64Encode', 'abc')); // "YWJj"
}
This is based on base64.js by John Hurliman, found at https://gist.github.com/jhurliman/1250118. Note that this will work like a charm on the server, but to port it to the client you have to call the methods with a callback function that stores the result in a session variable.
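For instance, a client-side call might look like this sketch (the session key name is an assumption):

// On the client: Meteor.call is asynchronous, so pass a callback
Meteor.call('base64Encode', 'abc', function (error, result) {
  if (!error) {
    // 'encodedValue' is a hypothetical session key
    Session.set('encodedValue', result);
  }
});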
You need the NodeJS crypto module to perform these tasks.
First create a "packages" directory at the root of your Meteor project, then create a "my-package" directory inside it.
In it, you need two files: a "package.js" and a "my-package.js".
package.js should look like:
Package.describe({
  summary: "MyPackage doing amazing stuff with AWS."
});

Package.on_use(function (api) {
  // add your package file to the server app
  api.add_files("my-package.js", "server");
  // what we export outside of the package
  // (this is important: packages have their own scope!)
  api.export("MyPackage", "server");
});
my-package.js should look like:
var crypto = Npm.require("crypto");

MyPackage = {
  myFunction: function (arguments) {
    // here you can use crypto functions!
  }
};
The function you will probably need is crypto.createHmac.
Here is an example code of how I encode a JSON security policy in base64 then use it to generate a security signature in my own app :
encodePolicy: function (jsonPolicy) {
  // stringify the policy and store it in a NodeJS Buffer object
  var buffer = new Buffer(JSON.stringify(jsonPolicy));
  // convert it to base64
  var policy = buffer.toString("base64");
  // replace "/" and "+" so that it is URL-safe
  return policy.replace(/\//g, "_").replace(/\+/g, "-");
},
encodeSignature: function (policy) {
  var hmac = crypto.createHmac("sha256", APP_SECRET);
  hmac.update(policy);
  return hmac.digest("hex");
}
This will allow you to call MyPackage.myFunction in the server-side of your Meteor app.
Last but not least, don't forget to "meteor add my-package" in order to use it!
You can use the meteor-crypto-base64 package.
CryptoJS.enc.Base64.stringify(CryptoJS.enc.Utf8.parse('Hello, World!'));
//"SGVsbG8sIFdvcmxkIQ=="