Why use a CROSSJOIN when it seems to be implied? - ssas

What is the difference between the following two, and which is preferred over the other?
CROSSJOIN(
  [Team].[Hierarchy].[Conference].[NFC],
  {[Team].[Name].[Name].[Detroit Lions], [Team].[Name].[Name].[Minnesota Vikings]}
)
And:
(
  [Team].[Hierarchy].[Conference].[NFC],
  {[Team].[Name].[Name].[Detroit Lions], [Team].[Name].[Name].[Minnesota Vikings]}
)
It seems any sets supplied within a tuple are automatically crossjoined, so why the need for the CROSSJOIN keyword/function? Is it correct to say that the following three are the same?
CROSSJOIN({}, {}, ...) == ({}, {}, ...) == {} * {} * ...

There is no difference between the above two statements. However, in the last statement
CROSSJOIN({}, {}, ...) == ({}, {}, ...) == {} * {} * ...
the last part, ({}, {}, ...) == {} * {} * ..., is not necessarily true. When you write {} * {}, it only exists if both sets (represented by {}) are the same in dimensionality and hierarchality.
For details, refer to:
Dimensionality and hierarchality

How to make complex nested where conditions with typeORM?

I have multiple nested where conditions and want to generate them without too much code duplication with TypeORM.
The SQL where condition should be something like this:
WHERE "Table"."id" = $1
AND
"Table"."notAvailable" IS NULL
AND
(
"Table"."date" > $2
OR
(
"Table"."date" = $2
AND
"Table"."myId" > $3
)
)
AND
(
"Table"."created" = $2
OR
"Table"."updated" = $4
)
AND
(
"Table"."text" ilike '%search%'
OR
"Table"."name" ilike '%search%'
)
But with the FindConditions it doesn't seem possible to nest them, so I would have to spell out every possible combination of ANDs in a FindConditions array. It also isn't possible to split the query into .where() and .andWhere(), because andWhere can't take an object literal.
Is there another possibility to achieve this query with TypeORM without using raw SQL?
When using the query builder I would recommend using Brackets, as described in the TypeORM docs: https://typeorm.io/#/select-query-builder/adding-where-expression
You could do something like:
createQueryBuilder("user")
.where("user.registered = :registered", { registered: true })
.andWhere(new Brackets(qb => {
qb.where("user.firstName = :firstName", { firstName: "Timber" })
.orWhere("user.lastName = :lastName", { lastName: "Saw" })
}))
That will result in:
SELECT ...
FROM users user
WHERE user.registered = true
AND (user.firstName = 'Timber' OR user.lastName = 'Saw')
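Applied to the WHERE clause from the question, a sketch could look like this (hedged: the entity and column names are assumed from the SQL above, and :id/:date/:myId/:updated stand in for $1..$4):
import { Brackets } from "typeorm";

// Sketch only: assumes an entity registered for "Table" with matching columns.
const rows = await repository
  .createQueryBuilder("t")
  .where("t.id = :id", { id })
  .andWhere("t.notAvailable IS NULL")
  .andWhere(new Brackets(qb => {
    // date > $2 OR (date = $2 AND myId > $3)
    qb.where("t.date > :date", { date })
      .orWhere(new Brackets(inner => {
        inner.where("t.date = :date")
          .andWhere("t.myId > :myId", { myId });
      }));
  }))
  .andWhere(new Brackets(qb => {
    qb.where("t.created = :date")
      .orWhere("t.updated = :updated", { updated });
  }))
  .andWhere(new Brackets(qb => {
    qb.where("t.text ILIKE :search", { search: '%search%' })
      .orWhere("t.name ILIKE :search");
  }))
  .getMany();
Parameters are registered once per query, so reusing :date and :search across brackets is fine.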
I think you are mixing two ways of retrieving entities from TypeORM: find from the repository, and the query builder. The FindConditions are used in the find function. The andWhere function is used by the query builder. When building more complex queries it is generally better/easier to use the query builder.
Query builder
When using the query builder you get much more freedom to make sure the query is what you need it to be. In the where you are free to add any SQL you please:
const desiredEntity = await connection
  .getRepository(User)
  .createQueryBuilder("user")
  .where("user.id = :id", { id: 1 })
  .andWhere(
    "user.date > :date OR (user.date = :date AND user.myId = :myId)",
    {
      date: specificCreatedAtDate,
      myId: mysteryId,
    })
  .getOne();
Note that the actual SQL you use here needs to be compatible with your database. That also brings a possible drawback of this method: you tie your project to a specific database. Make sure to read up on the aliases you can set for tables; if you are using relations, they come in handy.
Repository
You already saw that this is much less comfortable. This is because the find function, or more specifically the FindOptions, uses objects to build the where clause. This makes it harder to offer a proper interface for nested AND and OR clauses side by side. Therefore (I assume) they have chosen to split AND and OR clauses. This makes the interface much more declarative, but means you have to pull your OR clauses to the top:
const desiredEntity = await repository.find({
  where: [{
    id: id,
    notAvailable: IsNull(),
    date: MoreThan(date)
  }, {
    id: id,
    notAvailable: IsNull(),
    date: date,
    myId: MoreThan(myId)
  }]
})
Looking at the size of the desired query, I cannot imagine that this code would be very performant.
Alternatively you could use the Raw find helper. This requires you to rewrite your clause per field, since you only get access to one alias at a time. You could guess the column names or aliases, but that would be poor practice and very unstable, since you cannot easily control them.
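For completeness, a minimal sketch of that Raw route under the same assumptions as above (recent TypeORM versions let Raw take a parameters object alongside the alias callback):
import { IsNull, Raw } from "typeorm";

// Sketch only: each Raw callback sees just its own column's alias.
const entities = await repository.find({
  where: {
    id: id,
    notAvailable: IsNull(),
    date: Raw(alias => `${alias} > :date`, { date: specificCreatedAtDate }),
  },
});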
If you want to nest andWhere statements when a condition is met, here is an example:
async getTasks(filterDto: GetTasksFilterDto, user: User): Promise<Task[]> {
  const { status, search } = filterDto;
  /* create a query using the query builder */
  // 'task' is what refers to the Task entity
  const query = this.createQueryBuilder('task');
  // only get the tasks that belong to the user
  query.where('task.userId = :userId', { userId: user.id });
  /* if status is defined then add a where clause to the query */
  if (status) {
    // :<variable-name> is a placeholder for the second argument's key/value pair
    query.andWhere('task.status = :status', { status });
  }
  /* if search is defined then add a where clause to the query */
  if (search) {
    query.andWhere(
      /*
        LIKE: find a similar match (doesn't have to be exact)
          - https://www.w3schools.com/sql/sql_like.asp
        LOWER is a SQL function
          - https://www.w3schools.com/sql/func_sqlserver_lower.asp
        Bug: the search filter could bypass the userId WHERE clause.
        Fix: wrap the whole condition in parentheses. andWhere stitches
        the WHERE clauses together, so the parentheses turn the OR/LIKE
        conditions into a single WHERE term.
      */
      '(LOWER(task.title) LIKE LOWER(:search) OR LOWER(task.description) LIKE LOWER(:search))',
      // :search is like a param variable, and the search object is the key/value pair; both have to match
      { search: `%${search}%` },
    );
  }
  /* execute the query
     - getMany means that you are expecting an array of results
  */
  let tasks;
  try {
    tasks = await query.getMany();
  } catch (error) {
    this.logger.error(
      `Failed to get tasks for user "${user.username}", Filters: ${JSON.stringify(filterDto)}`,
      error.stack,
    );
    throw new InternalServerErrorException();
  }
  return tasks;
}
I have a list of objects like:
{
  date: specificCreatedAtDate,
  userId: mysteryId
}
My solution is:
.andWhere(
  new Brackets((qb) => {
    qb.where(
      'userTable.date = :date0 AND userTable.userId = :userId0',
      {
        date0: dates[0].date,
        userId0: dates[0].userId,
      }
    );
    for (let i = 1; i < dates.length; i++) {
      qb.orWhere(
        `userTable.date = :date${i} AND userTable.userId = :userId${i}`,
        {
          [`date${i}`]: dates[i].date,
          [`userId${i}`]: dates[i].userId,
        }
      );
    }
  })
)
That will produce something similar to:
const userEntity = await repository.find({
  where: [{
    userId: userId0,
    date: date0
  }, {
    userId: userId1,
    date: date1
  }
  ....
  ]
})

Querying an array of objects in JSONB

I have a table with a column of the data type JSONB. Each row in the column has a JSON that looks something like this:
[
  {
    "A": {
      "AA": "something",
      "AB": false
    }
  },
  {
    "B": {
      "BA": [
        {
          "BAAA": [1, 2, 3, 4]
        },
        {
          "BABA": {
            ....
          }
        }
      ]
    }
  }
]
Note: the JSON is a complete mess of lists and objects, and it has a total of 300 lines. Not my data but I am stuck with it. :(
I am using PostgreSQL version 12.
How would I write the following queries:
Return all rows that have the value of AB set to false.
Return the values of BAAA in each row.
You can find the AB = false rows with a JSON Path query:
select *
from test
where data @@ '$[*].A.AB == false'
If you don't know where exactly the key AB is located, you can use:
select *
from test
where data @@ '$[*].**.AB == false'
To display all elements from the array as rows, you can use:
select id, e.*
from test
cross join jsonb_array_elements(jsonb_path_query_first(data, '$[*].B.BA.BAAA')) with ordinality as e(item, idx)
I include a column "id" as a placeholder for the primary key column, so that the source of the array element can be determined in the output.
Online example

Ramda.js - how to view many values from a nested array

I have this code:
import {compose, view, lensProp, lensIndex, over, map} from "ramda";
let order = {
  lineItems: [
    {name: "A", total: 33},
    {name: "B", total: 123},
    {name: "C", total: 777},
  ]
};
let lineItems = lensProp("lineItems");
let firstLineItem = lensIndex(0);
let total = lensProp("total");
My goal is to get all the totals of all the lineItems (because I want to sum them). I approached the problem incrementally like this:
console.log(view(lineItems, order)); // -> the entire lineItems array
console.log(view(compose(lineItems, firstLineItem), order)); // -> { name: 'A', total: 33 }
console.log(view(compose(lineItems, firstLineItem, total), order)); // -> 33
But I can't figure out the right expression to get back the array of totals
console.log(view(?????, order)); // -> [33,123,777]
That is my question - what goes where the ????? is?
I coded around my ignorance by doing this:
let collector = [];
function collect(t) {
  collector.push(t);
}
over(lineItems, map(over(total, collect)), order);
console.log(collector); // -> [33,123,777]
But I'm sure a ramda-native knows how to do this better.
It is possible to achieve this using lenses (traversals), though it will likely not be worth the additional complexity.
The idea is that we can use R.traverse with the applicative instance of a Const type as something that is composable with a lens and combines zero or more targets together.
The Const type allows you to wrap up a value that does not change when mapped over (i.e. it remains constant). How do we combine two constant values together to support the applicative ap? We require that the constant values have a monoid instance, meaning they are values that can be combined together and have some value representing an empty instance (e.g. two lists can be concatenated with the empty list being the empty instance, two numbers can be added with zero being the empty instance, etc.)
const Const = x => ({
  value: x,
  map: function (_) { return this },
  ap: other => Const(x.concat(other.value))
})
Next we can create a function that will let us combine the lens targets in different ways, depending on the provided function that wraps the target values in some monoid instance.
const foldMapOf = (theLens, toMonoid) => thing =>
  theLens(compose(Const, toMonoid))(thing).value
This function will be used like R.view and R.over, accepting a lens as its first argument and then a function for wrapping the target in an instance of the monoid that will combine the values together. Finally it accepts the thing that you want to drill into with the lens.
Next we'll create a simple helper function that can be used to create our traversal, capturing the monoid type that will be used to aggregate the final target.
const aggregate = empty => traverse(_ => Const(empty))
This is an unfortunate leak where we need to know how the end result will be aggregated when composing the traversal, rather than simply knowing that it is something that needs to be traversed. Other languages can make use of static types to infer this information, but no such luck with JS without changing how lenses are defined in Ramda.
Given you mentioned that you would like to sum the targets together, we can create a monoid instance that does exactly that.
const Sum = x => ({
  value: x,
  concat: other => Sum(x + other.value)
})
This just says that you can wrap two numbers together and when combined, they will produce a new Sum containing the value of adding them together.
We now have everything we need to combine it all together.
const sumItemTotals = order => foldMapOf(
  compose(
    lensProp('lineItems'),
    aggregate(Sum(0)),
    lensProp('total')
  ),
  Sum
)(order).value

sumItemTotals({
  lineItems: [
    { name: "A", total: 33 },
    { name: "B", total: 123 },
    { name: "C", total: 777 }
  ]
}) //=> 933
If you just wanted to extract a list instead of summing them directly, we could use the monoid instance for lists instead (e.g. [].concat).
const itemTotals = foldMapOf(
  compose(
    lensProp('lineItems'),
    aggregate([]),
    lensProp('total')
  ),
  x => [x]
)

itemTotals({
  lineItems: [
    { name: "A", total: 33 },
    { name: "B", total: 123 },
    { name: "C", total: 777 }
  ]
}) //=> [33, 123, 777]
Based on your comments on the answer from customcommander, I think you can write this fairly simply. I don't know how you receive your schema, but if you can turn the pathway to your lineItems node into an array of strings, then you can write a fairly simple function:
const lineItemTotal = compose (sum, pluck ('total'), path)
let order = {
  path: {
    to: {
      lineItems: [
        {name: "A", total: 33},
        {name: "B", total: 123},
        {name: "C", total: 777},
      ]
    }
  }
}

console .log (
  lineItemTotal (['path', 'to', 'lineItems'], order)
)
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.27.0/ramda.js"></script>
<script> const {compose, sum, pluck, path} = R </script>
You can wrap curry around this and call the resulting function with lineItemTotal (['path', 'to', 'lineItems']) (order), potentially saving the intermediate function for reuse.
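For instance, a minimal sketch with curryN (needed here because compose hides the arity of the underlying binary path function):
// curryN keeps the two-argument shape of (pathToItems, order)
const lineItemTotal = R.curryN(2, R.compose(R.sum, R.pluck('total'), R.path));

const totalFor = lineItemTotal(['path', 'to', 'lineItems']);
totalFor(order); //=> 933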
Is there a particular reason why you want to use lenses here? Don't get me wrong; lenses are nice but they don't seem to add much value in your case.
Ultimately this is what you're trying to accomplish (as far as I can tell):
map(prop('total'), order.lineItems)
you can refactor this a little bit with:
const get_total = compose(map(prop('total')), propOr([], 'lineItems'));
get_total(order);
You can use R.pluck to get an array of values from an array of objects:
const order = {"lineItems":[{"name":"A","total":33},{"name":"B","total":123},{"name":"C","total":777}]};
const result = R.pluck('total', order.lineItems);
console.log(result);
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.27.0/ramda.js"></script>

Parsing JSON in Snowflake

I'm trying to parse the nested JSON below in Snowflake using the LATERAL FLATTEN function, but I want each nested element in "GoalTime" to show up as a column. For example,
GoalTime_InDoorOpen
2020-03-26T12:58:00-04:00
GoalTime_InLastOff
null
GoalTime_OutStartBoarding
2020-03-27T14:00:00-04:00
"GoalTime": [
{
"GoalName": "GoalTime_InDoorOpen",
"GoalTime": "2020-03-26T12:58:00-04:00"
},
{
"GoalName": "GoalTime_InLastOff"
},
{
"GoalName": "GoalTime_InReadyToTow"
},
{
"GoalName": "GoalTime_OutTowAtGate"
},
{
"GoalName": "GoalTime_OutStartBoarding",
"GoalTime": "2020-03-27T14:00:00-04:00"
},
Or, if you have many rows (what appear to be flights) and thus need two columns per flight, this code may be what you are after:
with data as (
  select flight_code, parse_json(json) as json
  from values
    ('nz101', '{"GoalTime": [{"GoalName": "GoalA", "GoalTime": "2020-03-26T12:58:00-04:00"}, {"GoalName": "GoalB"}]}'),
    ('nz201', '{"GoalTime": [{"GoalName": "GoalA"}, {"GoalName": "GoalB", "GoalTime": "2020-03-26T12:58:00-02:00"}]}')
    j(flight_code, json)
), unrolled as (
  select d.flight_code, f.value:GoalName as goal_name, f.value:GoalTime as goal_time
  from data d,
    lateral flatten (input => json:GoalTime) f
)
select *
from unrolled
pivot(min(goal_time) for goal_name in ('GoalA', 'GoalB'))
order by flight_code;
It gives the results:
FLIGHT_CODE  'GoalA'                       'GoalB'
nz101        "2020-03-26T12:58:00-04:00"   null
nz201        null                          "2020-03-26T12:58:00-02:00"
create or replace function JSON_STRING()
returns string
language javascript
as
$$
return `
[
{
"GoalName": "GoalTime_InDoorOpen",
"GoalTime": "2020-03-26T12:58:00-04:00"
},
{
"GoalName": "GoalTime_InLastOff"
},
{
"GoalName": "GoalTime_InReadyToTow"
},
{
"GoalName": "GoalTime_OutTowAtGate"
},
{
"GoalName": "GoalTime_OutStartBoarding",
"GoalTime": "2020-03-27T14:00:00-04:00"
}
]
`;
$$;
select value:GoalName::string as GoalName, value:GoalTime::timestamp as GoalTime
from lateral flatten(input => parse_json(JSON_STRING()));
-- See how the lateral flatten combination works on a JSON variant:
select * from lateral flatten(input => parse_json(JSON_STRING()));
I wrote this to run in any Snowflake worksheet, no tables needed. The function on top simply allows the JSON to be written as a multi-line string in the SQL statement below it. It has no other use than representing a string holding your JSON.
Step 1 is to PARSE_JSON, which converts a string into a variant data type formatted as a JSON object.
Step 2 is the lateral flatten. If you do a select star on that, it will return a number of columns. One of them is "value".
Step 3 is to extract the properties you want using single : notation for the property name and dots to traverse down the nodes from there (if there are any).
Step 4 is to cast the property to the data type you want using double :: notation. This is especially important if you're doing comparisons on the column particularly in join keys.
Note that there was a slightly invalid part of the JSON that prevented it from parsing: at the top level, the array was preceded by a property name, which is not valid JSON on its own. I removed that to allow parsing.
Probably close to what you seek is using a standard SQL UNION statement.
Given the following are true to recreate the solution:
You created a table 'JSON_GOALS' with one column for the raw JSON, called GOALS_RAW
You have loaded the JSON data into that table as raw JSON, with compliant JSON object array syntax and a parent key, GoalTimeGroup, e.g. {"GoalTimeGroup": [{...}]}, so:
{
"GoalTimeGroup": [{
"GoalName": "GoalTime_InDoorOpen",
"GoalTime": "2020-03-26T12:58:00-04:00"
},
{
"GoalName": "GoalTime_InLastOff"
},
{
"GoalName": "GoalTime_InReadyToTow"
},
{
"GoalName": "GoalTime_OutTowAtGate"
},
{
"GoalName": "GoalTime_OutStartBoarding",
"GoalTime": "2020-03-27T14:00:00-04:00"
}
]
}
Doing so allows you to write a fairly standard JSON retrieval in Snowflake with the following syntax:
SELECT GOALS_RAW:GoalTimeGroup[0].GoalName, GOALS_RAW:GoalTimeGroup[1].GoalName, GOALS_RAW:GoalTimeGroup[2].GoalName
FROM JSON_GOALS
UNION
SELECT GOALS_RAW:GoalTimeGroup[0].GoalTime, GOALS_RAW:GoalTimeGroup[1].GoalTime, GOALS_RAW:GoalTimeGroup[2].GoalTime
FROM JSON_GOALS
;
This gets you closer to the answer you are looking for and seems to provide a simpler solution. You can also control how many rows you'd want based on the JSON object attributes of each GOAL object.
A recommendation to enhance this would be a function that detects the depth of each nested element and auto-generates the indexes for 'n' columns, as sketched below.
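As a rough sketch of that idea, a small generator (hypothetical helper, plain JavaScript, not tied to any library) could emit the indexed select list for 'n' goals:
// Hypothetical: builds "GOALS_RAW:GoalTimeGroup[0].GoalName, ..." for n elements.
function goalSelectList(column, field, n) {
  return Array.from(
    { length: n },
    (_, i) => `${column}:GoalTimeGroup[${i}].${field}`
  ).join(', ');
}

// goalSelectList('GOALS_RAW', 'GoalName', 3) =>
// "GOALS_RAW:GoalTimeGroup[0].GoalName, GOALS_RAW:GoalTimeGroup[1].GoalName, GOALS_RAW:GoalTimeGroup[2].GoalName"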
The library below provides a method called "ExecuteAll"; one of its params is "tags", so if you provide an array of tags and values, all of them will be parsed and validated, while keeping Snowflake's SQL injection protection.
snowflake-multisql

Node.js - sqlstring alternative which allows named replacements

The sqlstring node module allows creating queries using an ordered array. So if I have a template query like:
sqlstring.format('Select * from users where id = ?', ['my_id'])
It will become:
Select * from users where id = 'my_id'
However, here I need to remember the order of the question marks, so if the same value is used in multiple places it becomes a hassle. Is there an alternative which allows me to do the following:
sqlstring.format('Select :id + :foo as bar from users where id = :id', {id: 1, foo: 3})
Which would become:
Select 1 + 3 as bar from users where id = 1
I know the knex query builder does this, but I don't want to install the entirety of knex just for the query builder.
You can use the mysql2 package, which supports that format:
Named placeholders
You can use named placeholders for parameters by setting the namedPlaceholders config value or a query/execute time option. Named placeholders are converted to unnamed ? on the client (the mysql protocol does not support named parameters). If you reference a parameter multiple times under the same name, it is sent to the server multiple times.
connection.config.namedPlaceholders = true;

connection.execute('select :x + :y as z', { x: 1, y: 2 }, function (err, rows) {
  // statement prepared as "select ? + ? as z" and executed with [1, 2] values
  // rows returned: [ { z: 3 } ]
});

connection.execute('select :x + :x as z', { x: 1 }, function (err, rows) {
  // select ? + ? as z, executed with [1, 1]
});

connection.query('select :x + :x as z', { x: 1 }, function (err, rows) {
  // query: select 1 + 1 as z
});
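If you would rather keep sqlstring itself, a tiny wrapper that rewrites :name placeholders is easy to sketch (hypothetical helper, not part of sqlstring's API; it only relies on sqlstring.escape):
const sqlstring = require('sqlstring');

// Naive sketch: replaces each :name token with its escaped value.
// Repeated names are filled consistently; unknown names are left untouched.
function formatNamed(sql, values) {
  return sql.replace(/:(\w+)/g, (match, name) =>
    Object.prototype.hasOwnProperty.call(values, name)
      ? sqlstring.escape(values[name])
      : match
  );
}

formatNamed('Select :id + :foo as bar from users where id = :id', { id: 1, foo: 3 });
// => 'Select 1 + 3 as bar from users where id = 1'
Note this is a string formatter, not a prepared statement, so it inherits sqlstring's escaping semantics rather than server-side parameter binding.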