How does node-redis method setex() convert Buffer to string? - node-redis

I'm using node-redis and I was hoping someone could help me figure out how this library converts a Buffer to a string. I gzip my data with node-gzip before storing it in Redis, and that call returns a Promise<Buffer>:
const data = JSON.stringify({ data: 'test' });
const compressed = await gzip(data, { level: 9 });
I tested the following two approaches to saving the buffer data into Redis.
Without .toString() - I pass the Buffer to the library and let it take care of the conversion:
const result = await redisClient.setex('testKey', 3600, compressed);
and with .toString():
const result = await redisClient.setex('testKey', 3600, compressed.toString());
When I try these two approaches I don't get the same value saved in Redis. I tried different parameters for .toString() to match the output of 1), but it didn't work.
The reason I need the value saved in the format of 1) is that I'm matching the format that one of our PHP pages generates.
My code is working fine without .toString(), but I would like to know how node-redis handles it internally.
I've tried to find the answer in the source code and by debugging and stepping into the library calls, but I didn't find the answer I was looking for, and I hope someone can help me with this.

It looks like it happens in the utils.js file:
utils.js
if (reply instanceof Buffer) {
  return reply.toString();
}
Also use the proper options (i.e. return_buffers):
node-redis README
redis.createClient({ return_buffers: true });
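For what it's worth, the mismatch between the two approaches can be reproduced without Redis at all: Buffer.prototype.toString() defaults to UTF-8, and gzip output is arbitrary binary data, so decoding it as UTF-8 is lossy and no .toString() parameter will make it round-trip. A small standalone sketch (using Node's built-in zlib here instead of node-gzip, purely for illustration):
const { promisify } = require('util');
const zlib = require('zlib');
const gzip = promisify(zlib.gzip);

(async () => {
  const data = JSON.stringify({ data: 'test' });
  const compressed = await gzip(data, { level: 9 });

  // Decoding binary gzip output as UTF-8 replaces invalid byte sequences
  // with U+FFFD, so re-encoding the string does not reproduce the bytes.
  const asString = compressed.toString();            // same as toString('utf8')
  const roundTripped = Buffer.from(asString, 'utf8');

  console.log(compressed.length, roundTripped.length); // lengths differ
  console.log(compressed.equals(roundTripped));        // false
})();
So whatever the client does internally, a value written from the raw Buffer and a value written from compressed.toString() cannot end up byte-identical.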


Google Cloud Function -- Convert BigQuery Data to Gzip (Compressed) Json then Load to Cloud Storage

For context, this script is largely based on the one found in this guide from Google: https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table-json#bigquery_extract_table_json-nodejs
I have the below script which is functioning. However, it writes a normal JSON file to Cloud Storage. To be a bit more optimized for file transfer and storage, I wanted to use const {pako} = require('pako'); to compress the files before loading.
I haven't been able to figure out how to accomplish this, unfortunately, after numerous attempts.
Anyone have any ideas?
I'm assuming it has something to do with the options in .extract(storage.bucket(bucketName).file(filename), options);, but again, I'm pretty lost on how to figure this out, unfortunately...
Any help would be appreciated! :)
The intent of this function is:
It is a Google Cloud function
It gets data from BigQuery
It writes that data in JSON format to Cloud Storage
My goal is to integrate Pako (or another means of compression) to compress the JSON files to gzip format prior to moving into storage.
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');
const functions = require('@google-cloud/functions-framework');

const bigquery = new BigQuery();
const storage = new Storage();

functions.http('extractTableJSON', async (req, res) => {
  // Exports my_dataset:my_table to gcs://my-bucket/my-file as JSON.
  // https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table-json#bigquery_extract_table_json-nodejs
  const DateYYYYMMDD = new Date().toISOString().slice(0, 10).replace(/-/g, "");
  const datasetId = "dataset-1";
  const tableId = "example";
  const bucketName = "domain.appspot.com";
  const filename = `/cache/${DateYYYYMMDD}/example.json`;

  // Location must match that of the source table.
  const options = {
    format: 'json',
    location: 'US',
  };

  // Export data from the table into a Google Cloud Storage file
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(storage.bucket(bucketName).file(filename), options);

  console.log(`Job ${job.id} created.`);
  res.send(`Job ${job.id} created.`);

  // Check the job's status for errors
  const errors = job.status.errors;
  if (errors && errors.length > 0) {
    res.send(errors);
  }
});
If you want to gzip-compress the result, simply use the gzip option:
// Location must match that of the source table.
const options = {
  format: 'json',
  location: 'US',
  gzip: true,
};
Job done ;)
From guillaume blaquiere's comment: "Ah, you are looking for an array of rows!!! Ok, you can't have it out of the box. BigQuery exports a JSONL file (JSON Lines, with one valid JSON per line, each representing a row in BQ)."
Turns out that I had a misunderstanding of the expected output. I was expecting a JSON array, whereas the output is individual JSON lines, as Guillaume mentioned above.
So, if you're looking for a JSON array output, you can still use the helper found below to convert the output, but it turns out that was in fact the expected output and I was mistakenly thinking it was inaccurate (sorry - I'm new ...)
// step #1: Use the below options to export to compressed JSON (as per guillaume blaquiere's note)
const options = {
  format: 'json',
  location: 'US',
  gzip: true,
};
// step #2 (if you're looking for a JSON Array): you can use the below helper function to convert the response.
function convertToJsonArray(text: string): any {
  // wrap in an array, add a comma at the end of each line, and remove the last comma
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  const jsonArray = JSON.parse(wrappedText);
  return jsonArray;
}
For reference / in case it's helpful, I created this function that will handle both compressed and uncompressed JSON that's returned.
The application of this is that I'm writing the BigQuery table to JSON in Cloud Storage to act as a "cache", then requesting that file from a React app and using the below to parse the file in the React app for use on the frontend.
import pako from 'pako';

function convertToJsonArray(text: string): any {
  const wrappedText = `[${text.replace(/\r?\n|\r/g, ",").slice(0, -1)}]`;
  const jsonArray = JSON.parse(wrappedText);
  return jsonArray;
}

async function getJsonData(arrayBuffer: ArrayBuffer): Promise<any> {
  try {
    const Uint8Arr = pako.inflate(arrayBuffer);
    const arrayBuf = new TextDecoder().decode(Uint8Arr);
    const jsonArray = convertToJsonArray(arrayBuf);
    return jsonArray;
  } catch (error) {
    console.log("Error unzipping file, trying to parse as is.", error);
    const parsedBuffer = new TextDecoder().decode(arrayBuffer);
    const jsonArray = convertToJsonArray(parsedBuffer);
    return jsonArray;
  }
}
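For reference, a hypothetical caller on the React side might look like the following. The bucket URL is a placeholder, and whether the browser has already decompressed the body depends on the object's Content-Encoding metadata in Cloud Storage, which is exactly why the fallback inside getJsonData is handy:
// Hypothetical usage; the URL below is a placeholder.
async function loadCachedTable() {
  const response = await fetch('https://storage.googleapis.com/my-bucket/cache/20240101/example.json');
  const buffer = await response.arrayBuffer();
  // getJsonData tries pako.inflate first and falls back to parsing the text as-is.
  return getJsonData(buffer);
}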

Redis StackExchange LuaScripts with parameters

I'm trying to use the following Lua script with the C# StackExchange.Redis library:
private const string LuaScriptToExecute = @"
    local current
    current = redis.call(""incr"", KEYS[1])
    if current == 1 then
        redis.call(""expire"", KEYS[1], KEYS[2])
        return 1
    else
        return current
    end";
Whenever I'm evaluating the script "as a string", it works properly:
var incrementValue = await Database.ScriptEvaluateAsync(LuaScriptToExecute,
    new RedisKey[] { key, ttlInSeconds });
If I understand correctly, each time I invoke the ScriptEvaluateAsync method the script is transmitted to the Redis server, which is not very efficient.
To overcome this, I tried using the "prepared script" approach, by running:
_setCounterWithExpiryScript = LuaScript.Prepare(LuaScriptToExecute);
...
...
var incrementValue = await Database.ScriptEvaluateAsync(_setCounterWithExpiryScript,
    new[] { key, ttlInSeconds });
Whenever I try to use this approach, I receive the following error:
ERR Error running script (call to f_7c891a96328dfc3aca83aa6fb9340674b54c4442): @user_script:3: @user_script: 3: Lua redis() command arguments must be strings or integers
What am I doing wrong?
What is the right approach in using "prepared" LuaScripts that receive dynamic parameters?
If I look in the documentation: no idea.
If I look at the unit tests on GitHub it looks really easy.
(By the way, is your ttlInSeconds really a RedisKey and not a RedisValue? You are accessing it through KEYS[2] - shouldn't that be ARGV[1]? Anyway...)
It looks like you should rewrite your script to use named parameters and not arguments:
private const string LuaScriptToExecute = @"
    local current
    current = redis.call(""incr"", @myKey)
    if current == 1 then
        redis.call(""expire"", @myKey, @ttl)
        return 1
    else
        return current
    end";
// We should load the script to the whole Redis cluster, even when we don't have one.
// In that case, there will be only one EndPoint, one iteration, etc...
_myScripts = _redisMultiplexer.GetEndPoints()
    .Select(endpoint => _redisMultiplexer.GetServer(endpoint))
    .Where(server => server != null)
    .Select(server => lua.Load(server))
    .ToArray();
Then just execute it with an anonymous class as the parameter:
foreach (var setCounterWithExpiryScript in _myScripts)
{
    var incrementValue = await Database.ScriptEvaluateAsync(
        setCounterWithExpiryScript,
        new {
            myKey = (RedisKey)key, // or new RedisKey(key) or idk
            ttl = (RedisKey)ttlInSeconds
        }
    ); // .ConfigureAwait(false); // ? ;-)

    // when ttlInSeconds is a value and not a key, just don't cast it to RedisKey
    /*
    var incrementValue = await Database.ScriptEvaluateAsync(
        setCounterWithExpiryScript,
        new {
            myKey = (RedisKey)key,
            ttl = ttlInSeconds
        }
    ).ConfigureAwait(false);
    */
}
Warning:
Please note that Redis blocks all other commands while executing scripts. Your script looks super easy (you sometimes save one trip to Redis, when current != 1), so I have a feeling this script will be counterproductive at greater-than-trivial scale. Just do one or two calls from C# and don't bother with this script.
First of all, Jan's comment above is correct.
The script line that updates the key's TTL should be redis.call(""expire"", KEYS[1], ARGV[1]).
Regarding the issue itself, after searching for similar issues in StackExchange.Redis's GitHub, I found that Lua scripts do not work really well in cluster mode.
Fortunately, it seems that "loading the scripts" isn't really necessary.
The ScriptEvaluateAsync method works properly in cluster mode and is sufficient (caching-wise).
More details can be found in the following Github issue.
So at the end, using ScriptEvaluateAsync without "preparing the script" did the job.
As a side note regarding Jan's comment above that this script isn't needed and can be replaced with two C# calls: the script is actually quite important, since the operation needs to be atomic because it implements a "rate limiter" pattern.

Cloud Function retrieving a value based on URL parameter

I am trying to write a Cloud Function in node where I can return a token from a parameter.
The URL I use is...
https://us-central1-nmnm03.cloudfunctions.net/GetAccount?taccount=Asd
my function is this... and it's wrong. I suspect I am not assigning tt properly.
var functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

exports.GetAccount = functions.https.onRequest((req, res) => {
  const t = admin.database().ref('/newaccout/' + req.query.account)
  const tt = t.child(token)
  res.send("res is " + tt);
});
req.query.account is the key. One of the items in the document is token.
ideally, I would like to get something like...
{"token":"23453458885"}
Could I get a node hint please... thanks
Though I am not a Firebase geek, from the documentation it seems there are two events you can use to listen for retrieving child data. You can read more here. The given options are used for different cases; please follow the mentioned link to get a clear view.
Inside your cloud function you can try doing the following:
const t = admin.database().ref('/newaccout/' + req.query.account)
t.on('child_added', function (data) {
  res.json({
    token: data.token
  })
})
Or maybe like this:
const t = admin.database().ref('/newaccout/' + req.query.account)
t.once('value', function (snapshot) {
  // Process it like above,
  // but here you will get all child elements at once
});
It looks like you are expecting to query the value found at the database reference stored in t. Unfortunately, you haven't actually performed a query yet. tt is just yet another Reference object that points to a location in the database. You should use the once() method on Reference to query a database location. Also bear in mind that you are using a variable called token which you haven't defined yet in your code; that looks like it would generate an error.
You might be well served by looking at a bunch of the sample code.
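To make that concrete, here is a minimal sketch using once(). Note that the URL in the question passes taccount while the code reads account, so the parameter name below is an assumption, as is the existence of a token child under /newaccout/<key>:
exports.GetAccount = functions.https.onRequest((req, res) => {
  // Parameter name and data layout are assumptions based on the question.
  const ref = admin.database().ref('/newaccout/' + req.query.taccount);

  // once('value') actually performs the read; the Reference by itself does not.
  return ref.once('value')
    .then(snapshot => {
      res.json({ token: snapshot.child('token').val() });
    })
    .catch(err => {
      res.status(500).send(String(err));
    });
});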

Dojo datagrid jsonrest response headers

I'd like to use custom headers to provide some more information about the response data. Is it possible to get the headers in a response from a dojo datagrid hooked up to a jsonRest object via an object store (dojo 1.7)? I see this is possible when you are making the XHR request, but in this case it is being made by the grid.
The API provides an event for a response error which returns the response object:
on(this.grid, 'FetchError', function (response, req) {
  var header = response.xhr.getAllResponseHeaders();
});
Using this I am successfully able to access my custom response headers. However, there doesn't appear to be a way to get the response object when the request is successful. I have been using the undocumented private event _onFetchComplete with aspect.after; however, this does not allow access to the response object, just the returned values:
aspect.after(this.grid, '_onFetchComplete', function (response, request) {
  // unable to get headers; response is the returned values
}, true);
Edit:
I managed to get something working, but I suspect it is very over-engineered and someone with a better understanding could come up with a simpler solution. I ended up using aspect.around to get hold of the deferred object in the REST store which is returned to the object store. There I added a new function to the deferred to return the headers. I then hooked into the onFetch of the object store using dojo hitch (because I needed the results in the current scope). It seems messy to me:
aspect.around(restStore, "query", function (original) {
  return function (method, args) {
    var def = original.call(this, method, args);
    def.headers = deferred1.then(function () {
      var hd = def.ioArgs.xhr.getResponseHeader("myHeader");
      return hd;
    });
    return def;
  };
});

aspect.after(objectStore, 'onFetch', lang.hitch(this, function (response) {
  response.headers.then(lang.hitch(this, function (evt) {
    var headerResult = evt;
  }));
}), true);
Is there a better way?
I solved this today after reading this post, thought I'd feed back.
dojo/store/JsonRest solves it also but my code ended up slightly different.
var MyStore = declare(JsonRest, {
  query: function () {
    var results = this.inherited(arguments);
    console.log('Results: ', results);
    results.response.then(function (res) {
      var myheader = res.xhr.getResponseHeader('My-Header');
      doSomethingWith(myheader);
    });
    return results;
  }
});
So you override the normal query() function, let it execute and return its promise, and attach your own listener to its 'response' member resolving, in which you can access the xhr object that has the headers. This ought to let you interpret the JsonRest result while fitting nicely into the chain for all invokers of query().
One word of warning, this code is modified for posting here, and actually inherited from another intermediary class that also overrode query(), but the basics here are pretty sound.
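For completeness, a hypothetical sketch of wiring such a subclassed store into a grid under dojo 1.7 AMD. The target URL, column layout, and DOM node id are placeholders, and MyStore is the subclass declared above, assumed to be in scope:
require(["dojo/data/ObjectStore", "dojox/grid/DataGrid", "dojo/domReady!"],
  function (ObjectStore, DataGrid) {
    var restStore = new MyStore({ target: "/rest/items/" });

    // Adapt the dojo/store API to the dojo/data API that the grid expects.
    var grid = new DataGrid({
      store: new ObjectStore({ objectStore: restStore }),
      structure: [{ name: "Name", field: "name", width: "auto" }]
    }, "gridNode");
    grid.startup();
  });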
If what you want is to get info from the server, a custom key-value in a cookie can also be a solution. That was my case: first I was looking for a custom response header, but I couldn't make it work, so I went the cookie way, getting the info after the grid data is fetched:
dojo.connect(grid, "_onFetchComplete", function () {
  doSomethingWith(dojo.cookie("My-Key"));
});
This is useful, for example, to present a SUM(field) for all rows in a paginated datagrid, and not only those included in the current page. On the server you can fetch the COUNT and the SUM; the COUNT will be sent in the Content-Range header and the SUM can be sent in the cookie.

Does a foreach loop work directly with a JSonStore data object store?

I create a JSonStore with a JSON formatted array of objects.
I have verified it is properly formatted.
I then try to use a dojo forEach loop on it, but the JsonStore doesn't seem to have any data in it. I can specify the target in my web page URL and it shows the right data, but console.log(myJsonStore) shows an object and I don't see the data in Firebug. I also don't see any GET to the service providing the data. It's as if specifying the target path as a URL in the browser fires the GET, but it doesn't fire when I try to trigger it in postCreate, where my forEach is located.
The answer from Ricardo, I believe, is a little incorrect, seeing as the JsonRest query function returns a dojo.Deferred.
You have a REST call being made asynchronously through the store read API, and once it returns values, it will promise to run what's set as the callback.
Try this for your loop iterator instead:
storeObj.query({}).then(function (results) {
  dojo.forEach(results, function (obj) {
    console.log(obj);
  });
});
you can do this:
var storeObj = new JsonRest({
  target: "/some/resource"
});
storeObj.query({}).forEach(function (obj) {
  console.log(obj);
});
that should do the trick