How to run SQL queries inside a JavaScript loop (map)?

What I'm trying to do is:
loop through an array of objects using the map function
run multiple SQL queries for each object
append the results of the queries to the object
I'm looping through offers, which is an array of objects. For each offer I fetch the associated free_item and buy_item. I am using Knex.js with a PostgreSQL database in Node.js.
Here's the actual code:
offers = await Promise.all(offers.map(async offer => {
    const free_item_id = await db("offer_free_items").where({ "offer_free_items.offer_id": offer.id }).select(["item_id"]).first()
    console.log("-----------------debug line 1------------------")
    const buy_item_id = await db("offer_buy_items").where({ "offer_buy_items.offer_id": offer.id }).select(["item_id"]).first()
    console.log("-----------------debug line 2-------------------")
    offer["free_item_id"] = free_item_id
    offer["buy_item_id"] = buy_item_id
    return offer
}))
The problem is that it does not run in the correct sequence. The output order is:
debug line 1
debug line 1
debug line 2
debug line 2
The correct order should be like this:
debug line 1
debug line 2
debug line 1
debug line 2

map() iterates through your array, calling the provided function on each item and collecting the results in a new array. Because you're passing an async function, each call returns a pending promise immediately, so all of the offers are processed concurrently; Promise.all then simply waits for every one of those promises to settle. Within a single offer the two queries still run in order, but the log lines from different offers interleave, which is why you see the output above.
If you're looking to run through your array sequentially, a simple for...of loop will do:
for (const offer of offers) {
    offer["free_item_id"] = await db("offer_free_items").where({ "offer_free_items.offer_id": offer.id }).select(["item_id"]).first()
    console.log("-----------------debug line 1------------------")
    offer["buy_item_id"] = await db("offer_buy_items").where({ "offer_buy_items.offer_id": offer.id }).select(["item_id"]).first()
    console.log("-----------------debug line 2-------------------")
}
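If the fully sequential loop feels too slow because every query waits for the previous one, a middle ground is to keep the concurrent map but start both queries for a single offer together. This is only a sketch based on the same tables and knex calls as the question; the log lines from different offers will still interleave:
offers = await Promise.all(offers.map(async offer => {
    // Start both lookups for this offer at once and wait for the pair.
    const [free_item, buy_item] = await Promise.all([
        db("offer_free_items").where({ "offer_free_items.offer_id": offer.id }).select(["item_id"]).first(),
        db("offer_buy_items").where({ "offer_buy_items.offer_id": offer.id }).select(["item_id"]).first()
    ])
    offer["free_item_id"] = free_item
    offer["buy_item_id"] = buy_item
    return offer
}))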

Related

Execute Lua script to delete all keys matching a pattern on Redis DB via stackexchange.redis

I have a Lua script which deletes all keys matching a pattern. The script is the following:
EVAL "return redis.call('del', 'defaultKey', unpack(redis.call('keys', ARGV[1])))" 0 ad:*
This works fine within redis-cli, but I want to execute this within a .NET app using StackExchange.Redis.
I tried the following:
await db.ExecuteAsync("EVAL", "\"return redis.call('del', 'defaultKey', unpack(redis.call('keys', ARGV[1])))\" 0 ad:*");
but I get the following error
Error message: ERR wrong number of arguments for 'eval' command
Found the answer
var script = "return redis.call('del', 'defaultKey', unpack(redis.call('keys', @keypattern)))";
var prepared = LuaScript.Prepare(script);
var noOfDeletedKeys = db.ScriptEvaluate(prepared, new { keypattern = (RedisKey)"ad:*" });

How to pass an array as a parameter to statementExecPromisified(statement,[]) function of sap-hdbext-promisfied in nodejs?

I have an array of users as below
let usersarr = ["'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"]
I want to fetch data about the above users (if they exist) from the HANA database. I am using the sap-hdbext-promisfied library in Node.js.
My database connection is working fine. So, I am trying to execute a select query as below
async function readUsers(xsaDbConn){
    try {
        let usersarr = ["'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"]
        const checkuserexiststatement = await xsaDbConn.preparePromisified("SELECT USER_NAME FROM USERS WHERE USER_NAME IN (?)")
        let checkuserexistresult = await xsaDbConn.statementExecPromisified(checkuserexiststatement, [usersarr])
        console.log(checkuserexistresult)
        return checkuserexistresult
    } catch(err) {
        console.log(err)
        return;
    }
}
Below is the output I get
PS C:\Users\Documents\XSA\SAC_POC\cap_njs> npm start
> cap_njs#1.0.0 start C:\Users\Documents\XSA\SAC_POC\cap_njs
> node server.js
myapp is using Node.js version: v12.18.3
myapp listening on port 3000
[]
I get an empty array object as output. This is not the expected output, instead it should provide details about the users as they exist in the database.
The above code works when I provide a single user value instead of an array of multiple users, as shown below
async function readUsers(xsaDbConn, tempxsahdbusers){
    try {
        let usersarr = 'SAC_XSA_HDB_USER_ABC'
        const checkuserexiststatement = await xsaDbConn.preparePromisified("SELECT USER_NAME FROM USERS WHERE USER_NAME IN (?)")
        let checkuserexistresult = await xsaDbConn.statementExecPromisified(checkuserexiststatement, [usersarr])
        console.log(checkuserexistresult)
        return checkuserexistresult
    } catch(err) {
        console.log(err)
        return;
    }
}
Output of the above code:
PS C:\Users\Documents\XSA\SAC_POC\cap_njs> npm start
> cap_njs#1.0.0 start C:\Users\Documents\XSA\SAC_POC\cap_njs
> node server.js
myapp is using Node.js version: v12.18.3
myapp listening on port 3000
[ 'SAC_XSA_HDB_USER_ABC' ]
So, why does it return an empty array when I provide an array as a parameter instead of a single value? Is it possible to pass an array as a parameter to the statementExecPromisified(statement, []) function of the sap-hdbext-promisfied library in Node.js?
Your
let usersarr = ["'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"]
has exactly one value, the string:
"'SAC_XSA_HDB_USER_ABC','SAC_XSA_HDB_USER_DEF'"
When you pass usersarr to statementExecPromisified as [usersarr], you are actually passing an array nested inside another array. You could either try
xsaDbConn.statementExecPromisified(checkuserexiststatement, [usersarr[0]])
or separate the values in usersarr, add one ? per value to the prepared statement, and reference each single value with usersarr[x].
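A rough sketch of that second approach, assuming the same preparePromisified / statementExecPromisified API used in the question and plain user names without the extra inner quotes:
async function readUsers(xsaDbConn) {
    const usersarr = ["SAC_XSA_HDB_USER_ABC", "SAC_XSA_HDB_USER_DEF"]
    // Build one ? placeholder per user, e.g. "?, ?"
    const placeholders = usersarr.map(() => "?").join(", ")
    const statement = await xsaDbConn.preparePromisified(
        "SELECT USER_NAME FROM USERS WHERE USER_NAME IN (" + placeholders + ")")
    // Bind the values as a flat array, one per placeholder.
    return await xsaDbConn.statementExecPromisified(statement, usersarr)
}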

IntelliJ IDEA LiveTemplate auto increment between usages

I am trying to make my life easier with Live Templates in IntelliJ.
I need to increment some param by 1 every time I use the snippet.
So I tried to write some groovyScript, and I am close, but my Groovy skills are holding me back: the number is not incremented by 1, but by 57 for some reason... (UTF-8?)
Here is the script:
File file = new File("out.txt");
int code = Integer.parseInt(file.getText('UTF-8'));
code = code + 1;
try {
    if (_1) {
        code = Integer.parseInt(_1);
    }
} catch (Exception e) {}
file.text = code.toString();
return code
So whenever a param is passed to this script (via _1) the initial value is set; otherwise the counter is simply incremented.
This script needs to be passed to the live template param with:
groovyScript("File file = new File(\"out.txt\");int code = Integer.parseInt(file.getText(\'UTF-8\'));code=code+1;String propName = \'_1\';if(this.hasProperty(propName) && this.\"$propName\"){code = Integer.parseInt(_1);};file.text =code.toString();return code", "<optional initial value>")

Iterate over a CSV Data Set Config with varying starting index in Apache JMeter

My requirement is to iterate over a CSV Data Set Config in Apache JMeter with a varying starting index. Let us assume I have started a test plan in JMeter today and my CSV file has 8 rows. The first time, my sampler will run from the 1st row to the 8th row. The next time I run my test plan, I want the sampler to pick values from the 2nd row to the 8th row. In this manner, I want to iterate over the CSV file using the CSV Data Set Config.
I am able to initialize a counter for every test run in Apache JMeter using a setUp Thread Group and a tearDown Thread Group, and I can read it with __P(count) in JMeter.
In the setUp Thread Group I have included a JSR223 Sampler and written a script like:
def file = new File('number')
if (!file.exists() || !file.canRead()) {
    number = '1'
}
else {
    number = file.text
}
props.put('number', number as String)
In the tearDown Thread Group the JSR223 Sampler has a script like:
def number = props.get('number') as int
number++
new File('number').text = number
I want to loop over my CSV Data Set Config file using the counter stored in the properties file (which is incremented by 1 for every test run).
Please check the below plan:
Input CSV example:
If Controller has the below code:
${__groovy(vars.get('Used').take(1)!='Y')}
In the JSR223 PostProcessor, I have the below code:
def inputFile = new File("C:\\Path\\toFile\\Excel\\OutputCSV.csv")
def lines = inputFile.readLines()
boolean isWrite = false;
lines.each { String line ->
    if (line.contains('Used')) {
        // Header row: rewrite the file from scratch, starting with the header
        inputFile.write(line + '\n')
    }
    else {
        if (line.startsWith('Y')) {
            // Row already marked as used: keep it as-is
            inputFile.append(line + '\n')
        }
        else if (!isWrite) {
            // First row not yet used: mark it with a leading 'Y'
            inputFile.append('Y' + line + '\n')
            isWrite = true;
        }
        else {
            // Remaining unused rows: keep them unchanged
            inputFile.append(line + '\n')
        }
    }
}
First Run output:
Second Run output:
As you can see, in the first run Sample 1 is executed 4 times, and in the second run it is executed 3 times.
This is not the nicest or best code, just a first try.
Please check if it helps.

ReactiveX collect elements processed before a failure

I'm using RxJava to create a background job synchronizing my DB.
It connects to an external source and starts to process entries, map them, and insert them in the DB.
When it ends I need the list of all the elements processed. I can get it when everything goes right, but how can I collect all the elements that were processed if something fails during the flow?
final List<String> res = Observable.create(onSubscribe)
        .buffer(4)
        .flatMap(TestRx::doStuff)
        .buffer(8)
        .map(TestRx::calculateList)
        .toList()
        .toBlocking()
        .single();
System.out.println("strings = " + res);
What I would like is a way such that if doStuff or calculateList throws an exception, the flow stops and returns the list with everything it processed up to the error.
List<String> res = Observable.create(onSubscribe)
        .buffer(4)
        .flatMap(TestRx::doStuff)
        .onErrorResumeNext(Observable.empty()) // turn error into completion
        .buffer(8)
        .map(TestRx::calculateList)
        .onErrorResumeNext(Observable.empty()) // turn error into completion
        .toList()
        .toBlocking()
        .single();
System.out.println("strings = " + res);