How to query a map in Groovy with SQL?

I am trying to query a map in Groovy with MySQL using:
def dat = [["id": person[-1], "date": appt]]
dat.each { db ->
    sql.eachRow(
        "select * from ${Sql.expand(db)};",
        { println "\t$db ${it.mid}" })
}
but I get an error:
java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '=10886, date=19-01-2017}' at line 1
what seems to be the problem here?
************************************************* EDIT *****************************************************
I'm now trying to insert the map into a MySQL table, which I am then querying using:
sql.execute '''DROP TABLE IF EXISTS EDSS'''
sql.execute '''
    CREATE TABLE EDSS (
        id INT,
        Clinic VARCHAR(15),
        EDSS VARCHAR(64),
        item VARCHAR(64)
    );
'''
sql.withBatch("INSERT INTO EDSS (id, Clinic, EDSS, item) VALUES (?,?,?,?)") { bt ->
    bt.addBatch(df)
}
def res = sql.eachRow("select * from EDSS") { row ->
    println "$row"
}
The sql.eachRow part works fine, i.e. I can select, but the insert statement only seems to insert the first row of the map, i.e.
println(df):
[1025386, 20-10-2017, null, ahddkw9d9c]
[10213446, 19-04-2017, 2.5, null]
[102382, 19-04-2017, null, null]
[1628466, 19-04-2017, null, 292jdmd02d]
[1111345, 18-09-2015, unchanged, null]
but:
println(res):
[1025386, 20-10-2017, null, ahddkw9d9c]
*********************************** Another EDIT *****************************************************
So trying to loop round all the values in the map, df, with:
sql.withBatch { stmt ->
    df.each { k, v, x, y ->
        stmt.addBatch("INSERT INTO EDSS (study_id, Clinic, EDSS, NHS) VALUES ('$k', '$v', '$x', '$y')")
    }
}
results in:
groovy.lang.MissingMethodException: No signature of method: sql$_run_closure1$_closure2$_closure4.doCall() is applicable for argument types: (java.lang.String) values: [1025386]
I'm used to R where everything is nice and vectorized. If anyone can help at all that would be excellent!

What I observed here:
You are trying to insert a java.util.ArrayList of the form [[val11, val12, ...], [val21, val22, ...]]. You are almost there with what you are trying to do.
Note: dagget already mentioned this in a comment; I am just showing it in code.
Answer :
def df = [
    [1025386, '20-10-2017', null, 'ahddkw9d9c'],
    [10213446, '19-04-2017', 2.5, null],
    [102382, '19-04-2017', null, null],
    [1628466, '19-04-2017', null, '292jdmd02d'],
    [1111345, '18-09-2015', 'unchanged', null]
]
sql.withBatch("INSERT INTO EDSS (id, Clinic, EDSS, item) VALUES (?,?,?,?)") { bt ->
    df.each { row ->
        bt.addBatch(row)   // <==== answer line
    }
}
sql.eachRow("select * from EDSS") { row ->
    println "$row"
}
Description :
The variable bt is actually a BatchingPreparedStatementWrapper object (see the Groovy documentation), so its addBatch() method accepts either a List<Object> or an Object[].
Output :
[id:1025386, Clinic:20-10-2017, EDSS:[null], item:ahddkw9d9c]
[id:10213446, Clinic:19-04-2017, EDSS:2.5, item:[null]]
[id:102382, Clinic:19-04-2017, EDSS:[null], item:[null]]
[id:1628466, Clinic:19-04-2017, EDSS:[null], item:292jdmd02d]
[id:1111345, Clinic:18-09-2015, EDSS:unchanged, item:[null]]
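As a side note, the bt wrapper is equally happy with an Object[]. A minimal sketch (reusing the df list and EDSS table from above) of the same batch written that way:
sql.withBatch("INSERT INTO EDSS (id, Clinic, EDSS, item) VALUES (?,?,?,?)") { bt ->
    df.each { row ->
        bt.addBatch(row as Object[])   // an Object[] is accepted just like a List<Object>
    }
}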

Related

How to pass a param for a binding in PostgreSQL - COPY (... ) TO STDOUT (FORMAT binary)?

I have a simple test table in Postgres, like below:
--DROP TABLE test_point
CREATE TABLE test_point
(
    serie_id INT NOT NULL,
    version_ts INT NOT NULL,
    PRIMARY KEY (serie_id, version_ts)
);
I try to load data from it using COPY TO STDOUT and binary buffers. This is the SQL definition I use in a test case:
COPY (
SELECT version_ts
FROM test_point
WHERE
serie_id = $1::int
) TO STDOUT (FORMAT binary);
It works fine if I don't provide any param to bind to in the SQL. If I use a simple SELECT, it recognizes params as well.
I also tried providing explicit info about the param type during statement preparation, but the results were similar (it doesn't recognize the param).
This is a message I receive during the test case:
0x000001740a288ab0 "ERROR: bind message supplies 1 parameters, but prepared statement \"test1\" requires 0\n"
How to properly provide a param for COPY() statement?
I don't want to cut/concatenate strings for timestamp params and similar types.
Below is a test case showing the issue.
TEST(TSStorage, CopyParamTest)
{
    auto sql = R"(
        COPY (
            SELECT version_ts
            FROM test_point
            WHERE
                serie_id = $1::int
        ) TO STDOUT (FORMAT binary);
    )";

    auto connPtr = PQconnectdb("postgresql://postgres:pswd@localhost/some_db");
    auto result = PQprepare(connPtr, "test1", sql, 0, nullptr);

    // Lambda to test result status
    auto testRes = [&](ExecStatusType status)
    {
        if (PQresultStatus(result) != status)
        {
            PQclear(result);
            auto errorMsg = PQerrorMessage(connPtr);
            PQfinish(connPtr);
            throw std::runtime_error(errorMsg);
        }
    };

    testRes(PGRES_COMMAND_OK);
    PQclear(result);

    int seriesIdParam = htonl(5);
    const char *paramValues[] = {(const char *)&seriesIdParam};
    const int paramLengths[] = {sizeof(seriesIdParam)};
    const int paramFormats[] = {1}; // 1 means binary

    // Execute prepared statement
    result = PQexecPrepared(connPtr,
                            "test1",
                            1, // nParams
                            paramValues,
                            paramLengths,
                            paramFormats,
                            1); // Output format - binary

    // Ensure it's in COPY_OUT state
    //testRes(PGRES_COPY_OUT);
    if (PQresultStatus(result) != PGRES_COPY_OUT)
    {
        auto errorMsg = PQerrorMessage(connPtr);
        int set_breakpoint_here = 0; // !!! !!! !!!
    }

    PQclear(result);
    PQfinish(connPtr);
}

A remote access exception in DolphinDB: Can't find the object with name loadTable('dfs://zctestDB','trainInfoTable')

I want to remotely query a database in DolphinDB. The database is created on the server 38.124.2.173 with the following script, run on the server:
tableSchema = table(100:0,`trainID`ts`tag01`tag02`tag03,[INT,TIMESTAMP,FLOAT,FLOAT,FLOAT])
db1 = database("",VALUE,(today()-92)..(today()+60))
db2 = database("",RANGE,0..80*10+1)
db = database("dfs://zctestDB",COMPO,[db1,db2])
dfsTable = db.createPartitionedTable(tableSchema,"trainInfoTable",`ts`trainID)
My query code is as below:
def testParallelQuery( connVector,trainIDs,startTime, endTime ){
    cols=`trainID`ts`tag01
    whereConditions=[<trainID in trainIDs>,expr(sqlCol(`ts),between,startTime:endTime)]
    script=sql(sqlCol(cols),"loadTable('dfs://zctestDB','trainInfoTable')",whereConditions)
    return ploop(remoteRun{,script}, connVector)
}
host="38.124.2.173"
port=30599
connVector = loop(xdb, take(host, 10), port, "admin", "123456")
testParallelQuery( connVector,1..5,2019.06.14T00:00:00.000, 2019.06.14T01:00:00.000 )
The following exception occurred after I ran it,
Error was raised when execution : Can't find the object with name loadTable('dfs://zctestDB','trainInfoTable')
How can I solve this problem?
The function sql helps you construct a SQL statement dynamically. Its from parameter accepts three types of data: (1) a table object, (2) an expression representing a table or a table join, (3) a variable associated with a table object.
In your particular case, please pass an expression to from as follows.
def runSQL(trainIDs, startTime, endTime){
    cols = `trainID`ts`tag01
    whereConditions = [<trainID in trainIDs>, expr(sqlCol(`ts), between, startTime:endTime)]
    return sql(sqlCol(cols), loadTable('dfs://zctestDB','trainInfoTable'), whereConditions).eval()
}
def testParallelQuery( connVector,trainIDs,startTime, endTime ){
    return ploop(remoteRun{,runSQL{trainIDs, startTime, endTime}}, connVector)
}
host="38.124.2.173"
port=30599
connVector = loop(xdb, take(host, 10), port, "admin", "123456")
testParallelQuery( connVector,1..5,2019.06.14T00:00:00.000, 2019.06.14T01:00:00.000 )

What is the proper way to use IF THEN in AQL?

I'm trying to use IF THEN style AQL, but the only relevant operator I could find in the AQL documentation was the ternary operator. I tried to add IF THEN syntax to my already working AQL but it gives syntax errors no matter what I try.
LET doc = DOCUMENT('xp/a-b')
LET now = DATE_NOW()
doc == null || now - doc.last >= 45e3 ?
LET mult = (doc == null || now - doc.last >= 6e5 ? 1 : doc.multiplier)
LET gained = FLOOR((RAND() * 3 + 3) * mult)
UPSERT {_key: 'a-b'}
INSERT {
amount: gained,
total: gained,
multiplier: 1.1,
last: now
}
UPDATE {
amount: doc.amount + gained,
total: doc.total + gained,
multiplier: (mult < 4 ? FLOOR((mult + 0.1) * 10) / 10 : 4),
last: now
}
IN xp
RETURN NEW
:
RETURN null
Gives the following error message:
stacktrace: ArangoError: AQL: syntax error, unexpected identifier near 'doc == null || now - doc.last >=...' at position 1:51 (while parsing)
The ternary operator cannot be used like an if/else construct in the way you tried. It is for conditional (sub-)expressions, like the one you use to calculate mult. It cannot stand by itself; there is nothing for it to be returned or assigned to if you write it like an if-statement.
Moreover, it would require braces, but the actual problem is that the body contains operations like LET, UPSERT and RETURN. These are language constructs which cannot be used inside expressions.
If I understand correctly, you want to:
insert a new document if no document with key a-b exists yet in collection xp
if it does exist, then update it, but only if the last update was 45 seconds or longer ago
Does the following query work for you?
FOR id IN [ 'xp/a-b' ]
    LET doc = DOCUMENT(id)
    LET key = PARSE_IDENTIFIER(id).key
    LET now = DATE_NOW()
    FILTER doc == null || now - doc.last >= 45e3
    LET mult = (doc == null || now - doc.last >= 6e5 ? 1 : doc.multiplier)
    LET gained = FLOOR((RAND() * 3 + 3) * mult)
    UPSERT { _key: key }
    INSERT {
        _key: key,
        amount: gained,
        total: gained,
        multiplier: 1.1,
        last: now
    }
    UPDATE {
        amount: doc.amount + gained,
        total: doc.total + gained,
        multiplier: (mult < 4 ? FLOOR((mult + 0.1) * 10) / 10 : 4),
        last: now
    }
    IN xp
    RETURN NEW
I added _key to INSERT, otherwise the document would get an auto-generated key, which does not seem intended. Using a FOR loop and a FILTER acts like an IF construct (without ELSE). Because this is a data modification query, it is not necessary to explicitly RETURN anything, and in your original query you RETURN null for the ELSE case anyway. While yours would result in [ null ], mine produces [ ] (a truly empty result) if you execute the query in quick succession and nothing gets updated or inserted.
Note that it is necessary to use PARSE_IDENTIFIER() to get the key from the document ID string and do INSERT { _key: key }. With INSERT { _key: doc._key } you would run into an invalid document key error in the insert case, because if there is no document xp/a-b, DOCUMENT() returns null and doc._key is therefore also null, leading to _key: null - which is invalid.

Is there unpivot or cross apply in ServiceStack ormlite?

I am using ServiceStack 4.5.14. I want to pass a list of Guids to a query such as the one below.
Table Name: Image
Columns: (Id -> Type=Guid) (ImageId -> Type=Guid) (Guid -> Type=Guid)
var result = Db.ExecuteSql(
    "select value from image unpivot (value for col in (Id, ImageId)) un where Guid=(@param) order by Guid",
    new { param = "5de7f247-f590-479a-9c29-2e68a57e711c" });
It returns a result in which Id and ImageId come back as 000... even though they are null.
Another question: how can I send a list of Guids as a parameter to the above query?
To query a parameterized field you should include the Guid instead of the string, e.g:
var result = Db.ExecuteSql(
    @"select value from image unpivot (value for col in (Id, ImageId)) un
      where Guid=(@param) order by Guid",
    new { param = new Guid("5de7f247-f590-479a-9c29-2e68a57e711c") });
If values are null, it's likely masking an error; you can bubble errors with:
OrmLiteConfig.ThrowOnError = true;
Or enable debug logging with:
LogManager.LogFactory = new ConsoleLogFactory();
In v5+ you can also inspect SQL commands before they're executed with:
OrmLiteConfig.BeforeExecFilter = dbCmd => Console.WriteLine(dbCmd.GetDebugString());

List is returning with two sets of square brackets from sql executeInsert

I am working on a Groovy script which uses the result of an SQL executeInsert in the next insert statement executed.
The list returned from executeInsert has two sets of square brackets around it, which is causing SQL syntax errors. I am unsure why it is returned with two sets of brackets and whether there is a way of removing both.
I have managed to remove one set of brackets using a join. My code is as follows:
db.withTransaction {
    def ticketResultList
    parsedTicketData.each { ticket ->
        String ticketQuery = "INSERT INTO ticket" + "(name, summary) VALUES" + "('${ticket.name}','${ticket.summary}')"
        def ticketResult = db.executeInsert(ticketQuery)
        ticketResultList = ticketResult.join(",")
    }
    parsedStatusData.each { status ->
        String ticketStatusQuery = "INSERT INTO ticket_status" + "(status, status_date, ticket_id, version) VALUES" + "('${status.status}','${status.statusDate}', ${ticketResultList}, 1)"
        db.executeInsert(ticketStatusQuery)
    }
}
The following is the sql error that is being received: Rolling back due to: Incorrect integer value: [31041] for column 'ticket_id' at row 1
Without knowing anything about the code or the motivation behind it (and without reading the comments too!) I'd write something like this:
db.withTransaction {
    def ticketResultList = []
    parsedTicketData.each { ticket ->
        def ticketQuery = "INSERT INTO ticket" + "(name, summary) VALUES" + "('${ticket.name}','${ticket.summary}')"
        def ticketResult = db.executeInsert(ticketQuery)
        ticketResultList << ticketResult
    }
    parsedStatusData.each { status ->
        def ticketStatusQuery = "INSERT INTO ticket_status" + "(status, status_date, ticket_id, version) VALUES" + "('${status.status}','${status.statusDate}', ${ticketResultList.flatten().join(',')}, 1)"
        db.executeInsert(ticketStatusQuery)
    }
}
Based on the information from Groovy executeInsert.
I'm assuming:
your table ticket returns one auto-generated key
the lists parsedStatusData and parsedTicketData have the same length and corresponding entries at the same position.
db.withTransaction {
    def ticketResultList = []   // collect the generated keys
    parsedTicketData.each { ticket ->
        def INS1 = "INSERT INTO ticket (name, summary) VALUES (${ticket.name},${ticket.summary})"
        def ticketResult = db.executeInsert(INS1)
        ticketResultList << ticketResult[0][0] /* get the 1st returned value */
    }
    parsedStatusData.eachWithIndex { status, i ->
        def INS2 = """INSERT INTO ticket_status (status, status_date, ticket_id, version)
                      VALUES (${status.status},${status.statusDate}, ${ticketResultList[i]}, 1)"""
        db.executeInsert(INS2)
    }
}
Note also that I reformulated your SQL strings; this is not only more compact, but also leads to the use of bind variables. The produced SQL is something like:
INSERT INTO ticket (name, summary) VALUES (:1,:2) RETURNING ID INTO :3
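For illustration, a small hedged sketch (table and values made up) of what the GString form buys you: groovy.sql.Sql turns each ${...} expression into a JDBC bind variable instead of splicing text into the statement.
def name = 'Widget'
def summary = 'test ticket'
db.executeInsert("INSERT INTO ticket (name, summary) VALUES (${name}, ${summary})")
// sent to the driver roughly as: INSERT INTO ticket (name, summary) VALUES (?, ?) with parameters ['Widget', 'test ticket']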