AsyncStorage with React Native: JSON.parse doesn't remove quotes from keys when getting an object back - react-native

Hi everyone!
I've stored a simple object in AsyncStorage in a React Native app.
But when I get it back, it isn't correctly parsed: all the keys still have the quote marks (added by JSON.stringify() when storing it)...
I store data like this:
const storeData = async () => {
  const myData = {
    title: 'Hummus',
    price: '6.90',
    id: '1'
  };
  await AsyncStorage.setItem('@storage_Key', JSON.stringify(myData));
}
and then access the data like this:
const getData = async () => {
  const jsonValue = await AsyncStorage.getItem('@storage_Key');
  console.log(jsonValue);
  return JSON.parse(jsonValue);
}
and my object after parsing looks like this:
{"title":"Hummus","price":"6.90","id":"1"}
Any idea why the quotes aren't removed from the keys?

That's because the JSON specification says keys must be strings. What you are writing in your source code is the modern representation called JSON5 (https://json5.org/). JSON5 is a superset of the JSON specification and does not require keys to be surrounded by quotes in most cases. When you stringify, JSON.stringify() returns the result in standard JSON format.
Both JSON and JSON5 are valid data representations, though note that the built-in JSON.parse only accepts standard JSON; parsing JSON5 requires the json5 library. Either way, you should not be worried about breaking anything programmatically just because the two outputs look different.
You can use JSON5 as shown below and it will give you your desired stringified result.
let myData = {
  title: 'Hummus',
  price: '6.90',
  id: '1'
}
console.log(JSON5.stringify(myData));
console.log(JSON.stringify(myData));
<script src="https://unpkg.com/json5@^2.0.0/dist/index.min.js"></script>
Like this:
// JSON5.stringify
{title:'Hummus',price:'6.90',id:'1'}
// JSON.stringify
{"title":"Hummus","price":"6.90","id":"1"}
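Worth noting: the quotes around keys exist only in the serialized string. After JSON.parse you get back a plain object whose keys are ordinary property names. A minimal round-trip sketch (standard JSON only, no JSON5):

```javascript
// Round-trip through standard JSON: the quotes around keys exist only in the
// serialized string, not on the parsed object.
const myData = { title: 'Hummus', price: '6.90', id: '1' };
const stored = JSON.stringify(myData); // '{"title":"Hummus","price":"6.90","id":"1"}'
const parsed = JSON.parse(stored);
console.log(parsed.title); // plain property access: 'Hummus'
```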

Related

Google Cloud Function -- Convert BigQuery Data to Gzip (Compressed) Json then Load to Cloud Storage

For context, this script is largely based on the one found in this guide from Google: https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table-json#bigquery_extract_table_json-nodejs
I have the below script, which is functioning. However, it writes a normal JSON file to Cloud Storage. To be a bit more optimized for file transfer and storage, I wanted to use const pako = require('pako'); to compress the files before loading.
I haven't been able to figure out how to accomplish this, unfortunately, after numerous attempts.
Anyone have any ideas?
I'm assuming it has something to do with the options in .extract(storage.bucket(bucketName).file(filename), options);, but again, I'm pretty lost in how to figure this out, unfortunately...
Any help would be appreciated! :)
The intent of this function is:
It is a Google Cloud Function
It gets data from BigQuery
It writes that data in JSON format to Cloud Storage
My goal is to integrate pako (or another means of compression) to compress the JSON files to gzip format prior to moving them into storage.
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');
const functions = require('@google-cloud/functions-framework');
const bigquery = new BigQuery();
const storage = new Storage();
functions.http('extractTableJSON', async (req, res) => {
  // Exports my_dataset:my_table to gcs://my-bucket/my-file as JSON.
  // https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table-json#bigquery_extract_table_json-nodejs
  const DateYYYYMMDD = new Date().toISOString().slice(0, 10).replace(/-/g, "");
  const datasetId = "dataset-1";
  const tableId = "example";
  const bucketName = "domain.appspot.com";
  const filename = `/cache/${DateYYYYMMDD}/example.json`;
  // Location must match that of the source table.
  const options = {
    format: 'json',
    location: 'US',
  };
  // Export data from the table into a Google Cloud Storage file
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(storage.bucket(bucketName).file(filename), options);
  console.log(`Job ${job.id} created.`);
  // Check the job's status for errors before sending the success response,
  // so we never call res.send() twice
  const errors = job.status.errors;
  if (errors && errors.length > 0) {
    res.send(errors);
    return;
  }
  res.send(`Job ${job.id} created.`);
});
If you want to gzip-compress the result, simply use the gzip option:
// Location must match that of the source table.
const options = {
  format: 'json',
  location: 'US',
  gzip: true,
};
Job done ;)
From guillaume blaquiere: "Ah, you are looking for an array of rows! OK, you can't have that out of the box. BigQuery exports a JSONL file (JSON Lines: one valid JSON object per line, each representing a row in BQ)."
Turns out that I had a misunderstanding of the expected output. I was expecting a JSON array, whereas the output is individual JSON lines, as Guillaume mentioned above.
So, if you're looking for a JSON array output, you can still use the helper found below to convert the output. But it turns out that was in fact the expected output, and I was mistaken in thinking it was inaccurate (sorry, I'm new...).
// step #1: Use the below options to export to compressed JSON (as per guillaume blaquiere's note)
const options = {
  format: 'json',
  location: 'US',
  gzip: true,
};
// step #2 (if you're looking for a JSON array): you can use the below helper function to convert the response.
function convertToJsonArray(text: string): any {
  // wrap in an array, add a comma at the end of each line, and remove the last comma
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  const jsonArray = JSON.parse(wrappedText);
  return jsonArray;
}
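To make the helper's newline handling concrete, here is the same logic run in plain JavaScript against a hypothetical two-row JSONL export (note the trailing newline, whose comma slice(0, -1) removes):

```javascript
// Same helper as above, in plain JavaScript.
function convertToJsonArray(text) {
  // replace each newline with a comma, drop the trailing comma, wrap in []
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  return JSON.parse(wrappedText);
}

// Hypothetical JSONL as a BigQuery export would produce: one JSON object per line.
const jsonl = '{"id":1,"name":"a"}\n{"id":2,"name":"b"}\n';
const rows = convertToJsonArray(jsonl); // [{id: 1, name: 'a'}, {id: 2, name: 'b'}]
```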
For reference / in case it's helpful, I created this function that'll handle both compressed and uncompressed JSON that's returned.
The application of this is that I'm writing the BigQuery table to JSON in Cloud Storage to act as a "cache", then requesting that file from a React app and using the below to parse the file in the React app for use on the frontend.
import pako from 'pako';

function convertToJsonArray(text: string): any {
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  const jsonArray = JSON.parse(wrappedText);
  return jsonArray;
}

async function getJsonData(arrayBuffer: ArrayBuffer): Promise<any> {
  try {
    const uint8Arr = pako.inflate(arrayBuffer);
    const decoded = new TextDecoder().decode(uint8Arr);
    return convertToJsonArray(decoded);
  } catch (error) {
    console.log("Error unzipping file, trying to parse as is.", error);
    const decoded = new TextDecoder().decode(arrayBuffer);
    return convertToJsonArray(decoded);
  }
}

Passing array of objects into AWS Amplify GraphQL API mutation

I am using AWS Amplify's GraphQL API on my React Native app. I have an array of objects in my app like so:
const [data, setData] = useState([
  {
    id: someid,
    thing: thing,
    otherThing: otherThing
  },
  {
    id: someid,
    thing: thing,
    otherThing: otherThing
  }
]);
What would this need to look like in my schema.graphql? I currently have this defined like so:
type someThing @model {
  UserID: String!
  thingName: String
  thingID: String! @primaryKey(sortKeyFields: ["UserID"])
  data: [AWSJSON]
}
I'm currently getting this error in my app after calling createSomeThing mutation where I pass data: data as an input:
Variable 'data' has an invalid value. Unable to parse {id=8f3aa794-1881-4eaa-ba4e-0ac979b5b0a6, thing=pasta, otherThing=one} as valid JSON.
What's the issue here? Did I define this incorrectly in the schema.graphql? Or do I need to transform the data before passing into my mutation?
Solved by passing in data as a string: simply JSON.stringify(data).
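A sketch of that fix (the field names come from the question's schema; the key values are hypothetical). Each AWSJSON value must be a serialized JSON string rather than a live object, so stringify before passing it into the mutation input:

```javascript
// Hypothetical input for the createSomeThing mutation from the question.
// AWSJSON values must be JSON strings, not live objects.
const data = [
  { id: '8f3aa794-1881-4eaa-ba4e-0ac979b5b0a6', thing: 'pasta', otherThing: 'one' },
];
const input = {
  thingID: 'thing-1', // hypothetical key values
  UserID: 'user-1',
  data: data.map((d) => JSON.stringify(d)), // one JSON string per AWSJSON list element
};
```

On the read path, the same values come back as strings and need a matching JSON.parse per element.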

GraphQL queries must be strings

I am writing a data-fetching service on an Express backend. It needs to fetch data from a GraphQL endpoint.
I get the following error. I know it's descriptive of the issue, but I don't understand it.
'GraphQL queries must be strings. It looks like you're sending the internal graphql-js representation of a parsed query in your request instead of a request in the GraphQL query language. You can convert an AST to a string using the `print` function from `graphql`, or use a client like `apollo-client` which converts the internal representation to a string for you.'
This is the function I am using:
fetchMultipleProducts(first: number, offset: number) {
  fetch({
    query: gql`
      query {
        getProduct(query: {}, first: ${first}, offset: ${offset}) {
          id
          code
        }
      }
    `
  })
    .then(res => {
      Logger.info("Fetched data");
      console.log(res);
      return res;
    })
    .catch(err => {
      Logger.error("Failed to fetch", err);
    });
}
I am trying to pass variables into it; I assume that's allowed? And using the gql tag is standard?
Some help would be appreciated, thanks guys.
I removed the gql tag and sent a string as instructed in the error message. Apologies for my silliness.
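A sketch of that fix in plain JavaScript (the function and field names are taken from the question; the values are hypothetical): build the query as an ordinary template string instead of a gql AST, and send that string in the request body.

```javascript
// Build the GraphQL request as a plain string -- no gql tag, no AST.
function buildProductQuery(first, offset) {
  return `
    query {
      getProduct(query: {}, first: ${first}, offset: ${offset}) {
        id
        code
      }
    }
  `;
}

// A typical GraphQL-over-HTTP request body is JSON with a "query" string field.
const body = JSON.stringify({ query: buildProductQuery(10, 0) });
```

The error message's other suggestion also works: keep the gql AST and convert it back with `print` from the `graphql` package before sending.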

var within firebase set

I am trying to create some dynamic JSON based on the value of a name, like below:
this.merchantFirebase.child(firebase.auth().currentUser.uid).update({
  this.props.data.name: {
    status: this.state.productSwitch
  }
});
I was thinking this would create something like:
this.merchantFirebase.child(firebase.auth().currentUser.uid).update({
  latte: {
    status: this.state.productSwitch
  }
});
but it just gives me an "unexpected token" error.
You'll need to use a different notation for this:
var updates = {};
updates[this.props.data.name] = { status: this.state.productSwitch };
this.merchantFirebase.child(firebase.auth().currentUser.uid).update(updates);
By using square-bracket notation, JavaScript "knows" that it needs to evaluate this.props.data.name as an expression, instead of using it as the literal name of the property (as it tries to do in your code).
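Since ES2015 you can also do this inline with a computed property name; a sketch with stand-in values for this.props.data.name and this.state.productSwitch:

```javascript
// The expression inside [] is evaluated and its result used as the property key.
const name = 'latte';       // stands in for this.props.data.name
const productSwitch = true; // stands in for this.state.productSwitch
const updates = { [name]: { status: productSwitch } };
// updates is now { latte: { status: true } }
```

The object literal form `{ [name]: ... }` and the square-bracket assignment in the answer produce the same result; use whichever fits your code.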

vue.js - Cannot read property 'toLowerCase' of undefined

I am filtering projects with a computed property like this:
filtered_projects() {
  return this.projects.filter((project) => {
    return project.title.toLowerCase().indexOf(this.project_filter.toLowerCase()) !== -1
  })
}
When I add a new project using this:
submitNewProject() {
  let path = '/api/projects';
  Vue.http.post(path, this.project)
    .then((rsp) => {
      this.projects.push(rsp.data);
      this.project = this.getSingleProject();
      this.create_project = false;
      return true;
    });
}
I get an error whose source I can't find:
TypeError: Cannot read property 'toLowerCase' of undefined
It may just be that you are not correctly passing the projects data to the projects array.
First, vue-resource now uses body, not data, to get the response, so you may need to use:
this.projects.push(rsp.body)
However, that will give you a single array item containing all projects, which doesn't look like what you want. I believe you're instead after:
this.projects = rsp.body
Assuming the response body is an array of objects, this will then allow:
this.projects.filter(project => {})
to work as expected, meaning project.title should now be valid.
EDIT
For a project title to be lowercased, you must be returning an object with a title param, i.e.
rsp.body = {
  title: 'FOO',
}
which you'd then set on the projects array via:
this.projects.push(rsp.body)
So the first thing to fix is your response: sort out your endpoint so it returns what you are expecting, and then the rest of the above code should work.
You need to preserve "this" before Vue.http.post (self = this) and then, in the callback, change this.projects to self.projects.
Explanation:
How to access the correct `this` context inside a callback?
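Separately from the response-shape fix, the filter itself can be made defensive so a project without a title doesn't throw; a sketch extracted as a plain function:

```javascript
// Defensive filter: treat a missing title or filter as an empty string,
// so projects without a title are skipped instead of throwing.
function filterProjects(projects, projectFilter) {
  const query = (projectFilter || '').toLowerCase();
  return projects.filter(
    (project) => (project.title || '').toLowerCase().indexOf(query) !== -1
  );
}
```

Note that with an empty filter every project matches, since indexOf('') is 0 for any string.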