MS Access HAVING Clause

Alright so I understand the point of the HAVING clause. I am having an issue and I am wondering if I can solve this the way I want to.
I want to execute one query using ADODB.Recordset and then use the Filter function to sift through the data set.
The problem is the query, which at the moment looks like this:
SELECT tblMT.Folder, tblMT.MTDATE, tblMT.Cust, Sum(tblMT.Hours)
FROM tblMT
GROUP BY tblMT.Folder, tblMT.MTDATE, tblMT.Cust
HAVING tblMT.Cust LIKE "TEST*" AND Min(tblMT.MTDATE)>=Date()-30 AND MAX(tblMT.MTDATE)<=Date()
ORDER BY tblMT.MTDATE DESC;
So the above works as expected... however, I want to be able to use tblMT.Cust as the filter without having to keep re-querying the database. If I remove it from the HAVING clause I get a:
Data type mismatch in criteria expression.
Is what I am trying to do possible? If someone could point me in the right direction here, that would be great.

OK... the type mismatch is caused because either tblMT.MTDATE isn't a date field or tblMT.Hours isn't a number field, AND you have data that either isn't a date or isn't a number when the customer isn't like 'TEST*'. Or, for some customers, you have a NULL in MTDATE, and NULL can't be compared with >=. You'd still get the error if you said WHERE tblMT.Cust NOT LIKE "TEST*" too.
The problem is likely with the data or your expectations, and you need to handle it.
What data types are tblMT.Hours and tblMT.MTDATE?
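If it does turn out that MTDATE is NULL (or holds junk) for some of the non-TEST customers, a sketch along these lines (assuming MTDATE really is a Date/Time field that is simply NULL or dirty for those rows) would let you drop the Cust condition from the HAVING clause and keep Cust available for the client-side filter:
SELECT tblMT.Folder, tblMT.MTDATE, tblMT.Cust, Sum(tblMT.Hours) AS TotalHours
FROM tblMT
WHERE tblMT.MTDATE IS NOT NULL
GROUP BY tblMT.Folder, tblMT.MTDATE, tblMT.Cust
HAVING Min(tblMT.MTDATE) >= Date()-30 AND Max(tblMT.MTDATE) <= Date()
ORDER BY tblMT.MTDATE DESC;
Filtering the NULL dates out in the WHERE clause (before grouping) keeps the >= comparison from ever seeing a NULL. On the ADO side, the Recordset's Filter property should then accept something like Cust LIKE 'TEST*' (a guess at your exact field naming), since that wildcard only trails the pattern.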

"Cannot construct data type datetime" when filtering data, but all values filtered DO have valid dates

I am convinced that this question is NOT a duplicate of:
Cannot construct data type datetime, some of the arguments have values which are not valid
In that case the values passed in are explicitly not valid. Whereas in this case, the values that the function could be expected to be called on are all valid.
I know what the actual problem is, and it's not something that would help most people that find the other question. But it IS something that would be good to be findable on SO.
Please read the answer, and understand why it's different from the linked question before voting to close as dupe of that question.
I've run some SQL that's errored with the error message: Cannot construct data type datetime, some of the arguments have values which are not valid.
My SQL uses DATETIMEFROMPARTS, but it's fine evaluating that function in the select - it's only a problem when I filter on the selected value.
It's also demonstrating weird, can't-possibly-be-happening behaviour w.r.t. other changes to the query.
My query looks roughly like this:
WITH FilteredDataWithDate AS (
SELECT *, DATETIMEFROMPARTS(...some integer columns representing date data...) AS Date
FROM Table
WHERE <unrelated pre-condition filter>
)
SELECT * FROM FilteredDataWithDate
WHERE Date > '2020-01-01'
If I run that query, then it errors with the invalid data error.
But if I omit the final Date > filter, then it happily renders every result record, so clearly none of the values it's filtering on are invalid.
I've also manually examined the contents of Table WHERE <unrelated pre-condition filter> and verified that everything is a valid date.
It also has a wild collection of other behaviours:
If I replace all of ...some integer columns representing date data... with hard-coded numbers then it's fine.
If I replace some parts of that data with hard-coded values, that fixes it, but replacing other parts doesn't. I can't find any particular pattern in what does or doesn't help.
If I remove most of the * columns from the Table select, then it starts to be fine again.
Specifically, it appears to break any time I include an nvarchar(max) column in the CTE.
If I add an additional filter to the CTE that limits the results to Id values in the following ranges, then the results are:
Between 130,000 and 140,000: Error.
Between 130,000 and 135,000: Fine.
Between 135,000 and 140,000: Fine!!!!
Filtering by the Date column breaks everything ... but ORDER BY Date is fine. (and confirms that all dates lie within perfectly sensible bounds.)
Adding TOP 1000000 makes it work ... even though there are only about 1000 rows.
... WTAF?!
This took me a while to decode, but it turns out that the SQL Server compiler doesn't necessarily restrict its execution of the function just to rows that are, or could be, relevant to the result set.
Depending on the execution plan it arrives at, the function could get called on any record in Table, even one that doesn't satisfy WHERE <unrelated pre-condition filter>.
This was found by another user, for another function, over here.
So the fact that it could return all the results without the filter wasn't actually proving that every input into the function was valid. And indeed there were some records in the table that weren't in the result set, but still had invalid data.
That actually means that even if you were to add an explicit WHERE filter to exclude rows containing invalid date-component data ... that isn't actually guaranteed to fix it, because the function may still get called against the 'excluded' rows.
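For illustration, if you did have to keep querying the dirty data, the guard would need to live inside the expression itself rather than in a separate WHERE clause. A sketch (date parts only for brevity, with hypothetical part columns Y, M and D standing in for the real integer columns) that swaps DATETIMEFROMPARTS for a TRY_CONVERT-based construction, which returns NULL instead of raising an error no matter which rows the plan evaluates it against:
WITH FilteredDataWithDate AS (
    SELECT *,
           -- yyyymmdd (style 112) is unambiguous; TRY_CONVERT yields NULL for invalid parts
           TRY_CONVERT(datetime,
               CONCAT(Y, RIGHT(CONCAT('0', M), 2), RIGHT(CONCAT('0', D), 2)),
               112) AS Date
    FROM Table
    WHERE <unrelated pre-condition filter>
)
SELECT * FROM FilteredDataWithDate
WHERE Date > '2020-01-01'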
Each of the random other things I did will have been influencing the query plan in one way or another that happened to fix/break things.
The solution is, naturally, to fix the underlying table data.
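To track down those rows, something like this (again with the hypothetical Y, M and D column names rather than the real ones) flags every record whose parts can't form a valid date, deliberately scanning the whole table without the pre-condition filter, since 'excluded' rows can still trip the function:
SELECT *
FROM Table
WHERE TRY_CONVERT(date,
          CONCAT(Y, RIGHT(CONCAT('0', M), 2), RIGHT(CONCAT('0', D), 2)),
          112) IS NULL
Rows with NULL parts get flagged as well, which is usually what you want for a clean-up pass.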

SQL column name with function

I want to create 3 new columns with names referring to some date variables, but it is not possible to write them like this. The first column name should be YEAR2022, the 2nd column YEAR2021 and the 3rd column YEAR2020.
Can you please give me an idea how to write this?
select column1*2 as CONCAT('YEAR',YEAR(DATEADD(YY,0,GETDATE()))),
column1*3 as CONCAT('YEAR',YEAR(DATEADD(YY,-1,GETDATE()))),
column1*4 as CONCAT('YEAR',YEAR(DATEADD(YY,-2,GETDATE()))) FROM table1
The error that I get is:
Incorrect syntax near 'YEAR'.
As I mentioned in my comment, an alias cannot be an expression, it has to be a literal. As such the expressions you have tried to use are not allowed and generate syntax errors.
Normally, this sort of requirement is the sign of a design flaw, or of trying to do something in the SQL layer that should be done in the presentation layer. I'm going to assume the latter here. As a result, you should instead use static names for the columns, and then control the names of the columns in your presentation layer, so that when they are presented to the end user they have the names you want (for today that would be YEAR2022, YEAR2021 and YEAR2020). Then your query would just look like this:
select column1*2 AS ThisYear,
column1*3 AS YearPrior,
column1*4 AS Year2Prior
FROM dbo.table1;
How you change the names of the columns in your presentation layer is a completely different question (we don't even know what language you are using to write your application). If you want to ask about that, I suggest asking a new question (but only after you've adequately researched the problem and failed to find a solution), so that we can help you there.
Note that though you can achieve a solution via dynamic SQL, I would strongly suggest it is the wrong solution here, which is why I haven't given an answer providing a demonstration.

Grafana multi-value query in Timestream

I have some problems displaying my AWS Timestream data in Grafana. I added DevEUI as a global dashboard variable with 3 different specific values. But when I use the multi-value syntax ${DevEUI} in my query with more than one value, I get an error every time.
Hope somebody can give me a hint.
Regards and thanks in advance
You most probably have a list of values as the value of your multi-value Grafana variable, but you are still using the = operator in your query. Try ... AND DevEUI IN ('${DevEUI}'). Or maybe without the single quotes or the parentheses... the exact syntax depends on your Grafana variable.
But this is just an educated guess, since I can see neither your database schema nor the definition of this Grafana variable (both of which are important details in a question like yours, for future reference).
This is how I did it for a multivalued string value:
timestream_variable_name = ANY(VALUES ${grafana_variable_name:singlequote})
You might have to adjust the formatting Grafana applies to the concatenated variable value it generates, depending on your data type.
I know this is long after the original question, but @alparius pointed me in the right direction, so I wanted to update the fix for the problem Joe reported.
Use formatting to get the proper quotes/values when formatting your query. Something like this:
Select * from database where searchterm IN (${Multi-Value_Variable:sqlstring})
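Putting the pieces together, a query shaped roughly like this should accept one selected value or several, since Grafana expands ${DevEUI:singlequote} to 'a','b','c' (the database, table and measure names here are made up, and $__timeFilter assumes the Timestream data source's time-range macro):
SELECT time, DevEUI, measure_value::double
FROM "myDatabase"."myTable"
WHERE $__timeFilter
  AND DevEUI IN (${DevEUI:singlequote})
Whether :singlequote or :sqlstring formatting fits best depends on how your variable values are stored, as noted above.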

Google BigQuery, WHERE clause based on JSON item

I've got a BigQuery import from a Firestore database where I want to query on a particular field from a document. This was populated via the firestore-bigquery extension, and the document data is stored as a JSON string.
I'm trying to use a WHERE clause in my query that uses one of the fields from the JSON data. However this doesn't seem to work.
My query is as follows:
SELECT json_extract(data,'$.title') as title,p
FROM `table`
left join unnest(json_extract_array(data, '$.tags')) as p
where json_extract(data,'$.title') = 'technology'
data is the JSON object and title is an attribute of all of the items. The above query will run but yield 'no results' (There are definitely results there for the title in question as they appear in the table preview).
I've tried using WHERE title = 'technology' as well but this returns an error that title is an unrecognized field (hence the json_extract).
From my research this should work as a standard SQL JSON query, but it doesn't seem to work on BigQuery. Does anyone know of a way around this?
All I can think of is if I put the results in another table, but I don't know if that's a workable solution as the data is updated via the extension on an update, so I would need to constantly refresh my second table as well.
Edit
I'm wondering if configuring a view would help with this? Though ultimately I would like to query this based on different parameters, and the docs here (https://cloud.google.com/bigquery/docs/views) suggest you can't reference query parameters in a view.
I've since managed to work this out, and will share the solution for anyone else with the same problem.
The solution was to use JSON_VALUE in the WHERE clause instead, e.g.:
where JSON_VALUE(data,'$.title') = 'technology';
I'm still not sure if this is the best way to do this in terms of performance and cost so I will wait to see if anyone else leaves a better answer.
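For reference, the full query with that fix applied (same table and data column as above) looks like this; the reason the original WHERE never matched is that JSON_EXTRACT returns a JSON-formatted string, i.e. "technology" including the quotes, while JSON_VALUE returns the bare scalar:
SELECT
  JSON_VALUE(data, '$.title') AS title,
  p
FROM `table`
LEFT JOIN UNNEST(JSON_EXTRACT_ARRAY(data, '$.tags')) AS p
WHERE JSON_VALUE(data, '$.title') = 'technology'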

Using a SQL subquery in Tableau

I am trying to calculate a new field in Tableau, similar to a subquery in SQL. However, my numbers are not matching up when I try to do this. I am stuck at this point and am trying to see what others have done.
To provide reference, below is the subquery that I am trying to duplicate in Tableau.
select
((sum(n.non_influenced_sales*n.calls)/sum(n.calls))-(sum(n.average_sales*calls)/sum(calls)))/
(sum(n.non_influenced_sales*n.calls)/sum(n.calls)) as impact
from (select
count(d.id) as calls
,avg(d.sale) as average_sales
,avg(case when non_influenced=1 then d.sale else null end) as non_influenced_sales
from data d
group by skill) n
When I try to build the same calculation in Tableau, I am able to get the same results as long as I comment out the group by skill. However, when I try to group by skill, my attempts to match the number have not been working.
The closest I have come is when I try to set the level of detail by using an INCLUDE expression. Tableau code:
{ INCLUDE [skill] : ([non_influenced_sales]-[average_sales])/[non_influenced_sales] }
However, doing this or using FIXED has not worked, and I can't match the numbers I am getting from SQL.
FYI, Impact is an aggregated measure. I built the subquery part in Tableau by just creating separate fields for the calculations I needed. So, for example,
Non Influenced Sales calculated in Tableau:
avg(if [non_influenced]=1 then [non_influenced_sales] end)
However, I am not sure if this matters or not.
I have also tried creating custom sql. I am able to get a rolled up version using all of the dates correct. But when I want to get down to different dates/use other filters, things get messy real quick. I am trying to build relationships on a date level, but that hasn't worked either.
Is there an easier way to do this?