Scala 22-parameter limit: trying to find a workaround and still use for comprehensions instead of plain SQL in Slick

I'm working with 23 fields here, at last count. I've mostly thrown my hands up at counting them precisely after reducing a 31-column table by splitting some fields out behind a foreign key.
All the good links
A fundamental explanation of how to read and understand Slick's schema code, provided by the very helpful Faiz.
On 22+ parameters...
Stefan Zeiger has been immensely helpful with the example code he's written in this discussion, and also more directly in the code linked here on Github.
The good Stefan Zeiger has also posted here on plain SQL queries.
What this post is about
I think the above is enough to get me on my way to a working refactoring of my app so that CRUD is feasible. I'll update this question or ask new ones if something comes up and stalls me. The thing is...
I miss using for comprehensions for querying. I'm talking about Slick's query templates.
The problem I run into when I use a for comprehension is that the table will probably look something like this:
object Monsters extends Table[Int]("monster_table") {
  // lots of column definitions
  def * = id // for a Table[Int], despite having 21 other columns
             // that this projection/ColumnBase doesn't describe
}
and the * projection won't describe everything I want to return in a query.
The usual simple Slick query template built from a for comprehension looks something like this:
def someQueryTemplate = for {
  m <- Monsters
} yield m
and m will be an Int instead of the entire row I want, because I declared the table as a Table[Int]. I can't construct a mapped projection of 22+ params, because the compiler only generates tuple classes and the supporting code up to arity 22, which is an essentially arbitrary cutoff.
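To make that concrete, the ceiling is easy to hit in the REPL (the exact error wording may vary by Scala version):
scala> val t = (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23)
<console>:7: error: too many elements for tuple: 23, allowed: 22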
So... in a nutshell:
Is there any way to use Query Templates in Slick with 22+ columns?

I stumbled on this question after answering a similar one a couple of days ago. Here's a link to that question: slick error : type TupleXX is not a member of package scala (XX > 22)
To answer the question here: you can use nested tuples to get around the 22-column limit. When solving the problem myself, I found the following post extremely helpful: https://groups.google.com/forum/#!msg/scalaquery/qjNW8P7VQJ8/ntqCkz0S4WIJ
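Here is a minimal sketch of the nested-tuple idea, written against a Slick 2.x-style API with only three columns for brevity (the names are made up; the same pattern scales past 22 columns by splitting them into groups of at most 22 each):

import scala.slick.driver.H2Driver.simple._

case class Monster(id: Int, name: String, strength: Int)

class Monsters(tag: Tag) extends Table[Monster](tag, "monster_table") {
  def id = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def strength = column[Int]("strength")

  // each group stays at or below 22 columns; the outer shape is a tuple
  // of tuples, so no single tuple ever exceeds the arity limit
  def part1 = (id, name)
  def part2 = strength

  def * = (part1, part2) <> (
    { case ((i, n), s) => Monster(i, n, s) },          // read: nested tuples -> case class
    (m: Monster) => Some(((m.id, m.name), m.strength)) // write: case class -> nested tuples
  )
}

// the for comprehension now yields full Monster rows again
val monsters = TableQuery[Monsters]
def someQueryTemplate = for (m <- monsters) yield m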
Another option is to use the soon-to-be-released version of Slick, which uses HLists to remove the column-count limitation. This version of Slick can be pulled from the Slick master branch on Github. Kudos to cvogt for pointing out this option. https://github.com/slick/slick/blob/master/slick-testkit/src/main/scala/com/typesafe/slick/testkit/tests/MapperTest.scala#L249
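For reference, a rough sketch of what the HList-style projection looks like (based on the API in the linked test, which later shipped in Slick 2.0; the two columns here stand in for an arbitrarily long list):

import scala.slick.driver.H2Driver.simple._
import scala.slick.collection.heterogeneous._
import scala.slick.collection.heterogeneous.syntax._

class BigTable(tag: Tag) extends Table[Int :: String :: HNil](tag, "big_table") {
  def id = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  // ...add as many more columns as the schema needs; HLists have no arity-22 ceiling
  def * = id :: name :: HNil
}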

Related

Automation framework design with Karate and database lookup [duplicate]

I am trying to design a test framework using Karate and I am facing a challenge. Any help or pointers will be much appreciated.
It's an application where extensive calculations are involved, with complex mathematical rules. Some of the static parameters are fetched from a database (MS SQL Server). Values for these parameters are expected to change every quarter or every six months.
How should I construct the expected result set in my feature files so that I won't have to change it manually every time a value in the database changes?
Best Regards,
Abhi
Here's what I would do. First, move all "dynamic" stuff into JSON files. For example: constants.json
{
  "riskFactor": 0.5,
  "ciRatio": 2
}
Then read this into a test:
Background:
* def constants = read('constants.json')
Then use the values in a test like this:
* def result = someCalc()
* match result == constants.riskFactor * 100 / constants.ciRatio
Now once you get that working, all you need to figure out is how to read that JSON file from a database. For that, refer to this: https://stackoverflow.com/a/52714248/143475
That said, I strongly recommend NOT using a database. It all sounds great in theory, but you are just going to add complexity and dependencies to your tests. Also, if you are going to rely on a "production" database, that violates some testing principles, and I've seen many teams fall into that trap.
P.S. you can inject all key-value pairs in a JSON into scope as variables:
* karate.set(read('constants.json'))
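With that in place, the constants become plain variables, so the earlier assertion could drop the constants. prefix, along these lines:
* karate.set(read('constants.json'))
* def result = someCalc()
* match result == riskFactor * 100 / ciRatio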

Dynamic where clause and operand SQL Server

Having read almost all topics related to dynamic where clauses, I still can't find a way through.
Here is my source table:
Source Table
And the result I want is:
Results
In fact I want to return all values satisfying the Test value condition, but I don't know how to implement it dynamically (I have a table with 700K lines).
Thanks a lot for your help.
EDIT:
Following your answers, I will detail the approach a bit more.
Unfortunately, as I'm a new user, I'm not allowed to post pictures directly in the post.
I'm basically performing segregation-of-duties controls over the SAP system.
Basically, I want to test whether some of a customer's access rights are conflicting, based on SAP extractions checked against a knowledge template stating potential conflicts.
Here is a simplified example of the SAP extract:
And here is a simplified example of the potential conflict template:
Field | Value
----- | -----
ACTVT | AZ0220
KOART | K
BUKRS | *
EKGRP | 03
This is a fake example of the customer's raw data:
FIELD | RANGE_START | RANGE_END
----- | ----------- | ---------
ACTVT | AZ01*       | AZ99*
KOART | A           | L
BUKRS | 011         | 099
EKGRP | 2           | 10
I thought one way of doing this would be to use a dynamic where clause.
Thanks a lot for your help
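One way to sketch this without building the WHERE clause as a string is to treat the template as data and join it against the ranges. The table and column names below are taken from the examples above but otherwise assumed, and the string BETWEEN only loosely approximates SAP's range semantics (wildcards inside the bounds would need extra handling):

SELECT t.Field, t.Value
FROM ConflictTemplate AS t
JOIN CustomerAuth AS r
  ON r.FIELD = t.Field
WHERE t.Value = '*'                                  -- a template wildcard matches any range
   OR t.Value BETWEEN r.RANGE_START AND r.RANGE_END; -- string-ordered range check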

Continued issue using OrderBy and Take() with Included relations

I'm aware of numerous issues raised on the EF7 Github repo, and these still appear to be present with the latest RC2 nightly builds.
The issue appears when using OrderBy and Take() while trying to Include relations within your query. I believe, according to previous issues raised on Github, that the SQL generated for the join is incorrect and does not take the OrderBy into account. In replies, people have suggested using Skip(0).Take(x) as a workaround, but unfortunately that didn't work in my scenario.
In my specific scenario, I'm querying a model based on an ancestor PK. So, in order to work around this, instead of passing the PK straight into the query, I've added the ancestor model to a List<T> and used the following code in the query instead:
Before (not working when Including nested relations): p => p.examplePK == id
After (appears to be working):
p => myList.Select(c => c.myId).Contains(p.examplePK)
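Expanded, the two variants look roughly like this (the context, model, and property names are placeholders, not my real schema):

// assumes the usual usings: System.Collections.Generic, System.Linq,
// plus the EF Include/Where/OrderBy extension methods

var id = 42;

// before: OrderBy + Take + Include generates a faulty join in RC2
var broken = context.Parents
    .Include(p => p.Children)
    .Where(p => p.examplePK == id)
    .OrderBy(p => p.Name)
    .Take(10)
    .ToList();

// after: routing the key through a list turns the predicate into a Contains,
// which translates to SQL IN (...) and sidesteps the join generation
var myList = new List<Ancestor> { ancestor };
var working = context.Parents
    .Include(p => p.Children)
    .Where(p => myList.Select(c => c.myId).Contains(p.examplePK))
    .OrderBy(p => p.Name)
    .Take(10)
    .ToList();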
I'm not entirely sure why the 2nd example works, and I would be grateful for any info that can be given - could there potentially be some client-side evaluation going on here instead? From what I gather, the 2nd example will be performing a SQL IN statement, will it not?
Thanks in advance!

Converting an SQL Statement into R Code Without SQLDF

I'm a new-ish programmer in R and I'm having a bit of an issue with some SQL code.
What I want to do is convert this operation to base R code. I know it's quite complicated, and I tried using merge but didn't really manage to get anywhere.
Censored <- sqldf("SELECT Censored1.ModelYearID, Censored1.InServiceDate, Censored1.Censored, Censored1.VIN
FROM Censored1 LEFT JOIN Claims ON Censored1.VIN = Claims.VIN
GROUP BY Censored1.ModelYearID, Censored1.InServiceDate, Censored1.Censored, Censored1.VIN, Claims.VIN
HAVING (((Claims.VIN) Is Null))")
The reason I want to do this is because I have ~1600 different Claims tables in a data frame list (df_listl) which are named like this:
LabourOperation.ModelYearID e.g. Q123456.1997, Q234567.1998
and I need to run this query for every one of these tables, putting each of the comparable censored tables in the same kind of list.
If anyone could help me with this, that would be great. It's a bit complicated and I'm really struggling as I've only just learnt that you can put data frames in lists!
I was thinking that lapply might be a good way to go but I'm not very good with functions yet.
Thank you in advance :D
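In case it helps: the query above is an anti-join, which base R can express with %in%. A minimal sketch, assuming Censored1 has the four columns named in the query and each element of df_listl is one Claims table:

# keep the Censored1 rows whose VIN has no match in Claims,
# then de-duplicate, mirroring the GROUP BY
censored_for <- function(censored1, claims) {
  res <- censored1[!(censored1$VIN %in% claims$VIN),
                   c("ModelYearID", "InServiceDate", "Censored", "VIN")]
  unique(res)
}

# run it against every Claims table in the list, keeping the list names
censored_list <- lapply(df_listl, function(claims) censored_for(Censored1, claims))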

SQL queries to their natural language description

Are there any open source tools that can generate a natural language description of a given SQL query? If not, some general pointers would be appreciated.
I don't know much about NLP, so I am not sure how difficult this is, although I saw from some previous discussion that the reverse conversion (natural language to SQL) is still an active area of research. It might help to add that the SQL tables I will be handling are not arbitrary: they are mine, which means I know the exact semantics of each table and its columns.
I can devise two approaches:
SQL was intended to be "legible" to non-technical people. A naïve and simple way would be to perform a series of replacements right on the SQL query: "SELECT" -> "display"; "X=Y" -> "when the field X equals the value Y"... (see the sketch after this list). In this approach, handling functions may be problematic.
Use a SQL parser and a series of templates to render the parsed structure in textual form: "(SELECT (SUM(X)) (FROM (Y)))" -> "(display (the summation of (X)) (in the table (Y))"...
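To give a feel for the first approach, here is a minimal sketch in Scala (the phrase table is illustrative only, and it already shows the weakness: anything beyond simple keywords, such as functions or nesting, needs a real parser):

val phrases = Seq(
  "SELECT" -> "display",
  "FROM" -> "from the table",
  "WHERE" -> "when",
  "=" -> "equals"
)

// replace whole keywords case-insensitively, leaving identifiers intact
def naiveDescribe(sql: String): String =
  phrases.foldLeft(sql) { case (text, (kw, words)) =>
    text.replaceAll("(?i)(?<![A-Za-z])" + java.util.regex.Pattern.quote(kw) + "(?![A-Za-z])", words)
  }

// naiveDescribe("SELECT name FROM users WHERE age = 30")
// -> "display name from the table users when age equals 30"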
ANTLR has a grammar for SQL that you can use: https://github.com/antlr/grammars-v4/blob/master/sqlite/SQLite.g4 and there are a couple of SQL parsers:
http://www.sqlparser.com/sql-parser-java.php
https://github.com/facebook/presto/tree/master/presto-parser/src/main
http://db.apache.org/derby/
Parsing is a core step in executing a SQL query; check this for more information: https://decipherinfosys.wordpress.com/2007/04/19/parsing-of-sql-statements/
There is a new project (I am part of) called JustQuery.Me, which intends to do just that with NLP and Google's SyntaxNet. You can go to the https://github.com/justquery-me/justqueryme page for more info. Also, sign up for the mailing list at justqueryme-development@googlegroups.com and we will notify you when we have a proof of concept ready.