SQL Command + CASE in Crystal Reports

I am having an issue with SQL Command and CASE. I am pretty new to Crystal Reports/SQL and I have some basic code that I am playing around with to learn. I want to clean up a field that was imported from SQL Server. I just want to do something simple like this:
SELECT "I"."I_TYPE", "Alleg” =
CASE
WHEN "ALLEGs"."ALLEG” LIKE ‘*im*’ THEN ‘Improper’
ELSE ‘UNKNOWN’
END
I get an error that says Database Connector Error:
'42000:[MS][SQL..Incorrect syntax near '.'. Database vendor code 102.
Can you even use CASE as an IF/THEN statement in a SQL Command? I am aware of SQL Expressions, but I am trying to do this work in the SQL Command to avoid a performance hit.

I am not sure about Crystal Reports, but your query formation doesn't look correct. It should be:
SELECT I.I_TYPE,
CASE WHEN ALLEGs.ALLEG LIKE '%im%' THEN 'Improper' ELSE 'UNKNOWN' END AS 'Alleg'
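For context, a complete command might look like the sketch below. The join between the two tables is an assumption (the question does not show it), so substitute your real table names and join key. Also make sure every quote is a plain straight quote; curly quotes pasted from a word processor are enough by themselves to trigger "Incorrect syntax" errors from SQL Server.
SELECT I.I_TYPE,
       CASE
           WHEN ALLEGs.ALLEG LIKE '%im%' THEN 'Improper'   -- SQL Server uses % as the wildcard, not *
           ELSE 'UNKNOWN'
       END AS Alleg
FROM I
JOIN ALLEGs ON ALLEGs.I_ID = I.I_ID   -- hypothetical join key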

Rahul is correct for direct SQL commands to the server.
However, sometimes when running Crystal Reports we use VBA within the report to do some data tuning rather than modifying the raw SQL on the fly.
This leaves the raw SQL as a known result (verifiable on the SQL Server), and we then modify the output in Crystal to fit the end users' filtering requirements.
This is not efficient with large result sets, but when the results are smaller (under 50k records) we usually have our team go with simple post-filtering to reduce design and testing time.
This technique works very well with dynamic filters in the record selection.
Example (Record Selection formula):
if {?Select Sales Person} <> "ENT" then
    {R0033___P2A.ProjectionSP} = {?Select Sales Person} and {R0033___P2A.FM} >= 0
else
    {R0033___P2A.ProjectionSP} > "" and {R0033___P2A.FM} >= 0
Where {?Select Sales Person} is a user selection filter and {R0033___P2A} is a predefined report view or stored procedure.

Dealing with filtered Pass Through Query in MS Access

I have a relatively complex SQL query (complex to run in Access) and want to run it in MS Access. It works well as a pass-through query, but going forward I will face an issue related to a filter I apply in the query. I select the current report date within the WHERE clause. Below is the part of my query I am trying to handle:
select LS.PID_FACILITY, LS.ASOF_DTE, LS.DATA_CYCLE_FLG, LS.CUST_ACC, LS.CUST_SMUN, LS.CUST_NME, LS.CUST_CTY,
WHERE LS.ASOF_DTE='19-SEP-22'
I do not want to change the ASOF_DTE filter manually every day. If this were a normal Access query I could join another table that contains only the current report date, but I cannot do that in a pass-through query. What is the alternative way to do it? I read something about creating variables or strings, but I could not relate them to my problem, since I am a beginner at creating such solutions.
Thank you all.
Well, two VERY interesting things here.
First, YES, it is a great idea to include the date in the PT query. But you don't want to change that date each time.
Solution:
Add a parameter to the query, and then supply that parameter from Access code. It is VERY easy to do this (one line of code! Don't adopt the zillion examples out there that have a boatload of ADO code; it is NOT required).
However, BEFORE we start dealing with the above?
There is a MUCH better, simpler, less-work way to approach this.
In place of a stored procedure?
If possible, create a view, and use that for the report.
Why?
Because you then get TWO VERY valuable bonuses.
First, you can freely use the report's "where" clause, and it respects that where clause and STILL runs server side!
In other words, create a view for that existing query, but WITHOUT the date set in that view.
You then link to the view from the Access client side.
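A minimal sketch of such a view, using the columns from the question (the view name and source table name are assumptions, so adjust them to your schema):
-- Hypothetical view: the same query as the pass-through, but with no date filter
CREATE VIEW dbo.vw_LS_Report AS
SELECT LS.PID_FACILITY, LS.ASOF_DTE, LS.DATA_CYCLE_FLG, LS.CUST_ACC,
       LS.CUST_SMUN, LS.CUST_NME, LS.CUST_CTY
FROM dbo.LS_SOURCE LS;   -- replace with your real source table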
Now, to open (filter) the report, you can do this:
docmd.OpenReport "MyReport",acViewPreview,,"LS.ASOF_DTE='19-SEP-22'"
Now, of couse the above "where" clause can be a varible (string).
NOTE: be SUPER, but SUPER, careful here:
If you base the report on a pass-through query (that then uses the stored procedure), then the filter occurs CLIENT SIDE (all rows will be returned and THEN filtered if your report is based on that stored procedure).
But if you use a view?
Then the filter makes it to the server side!
While both the pass-through query and the view can be filtered with the "open report" where clause we have above?
The view will still filter server side; the pass-through query will NOT!
Now, the 3rd way is of course to build the stored procedure to accept a date parameter.
You then could do this:
With CurrentDb.QueryDefs("MyPassThoughQueryGoesHere")
    .SQL = "EXEC MyStoreProc '19-SEP-22'"   ' the date parameter needs to be quoted inside the SQL string
End With
DoCmd.OpenReport "MyReport", acViewPreview
So, you CAN have a PT query and add a parameter as per above.
However, unless that stored procedure has some special code, you are MUCH better off creating a view server side, basing the report on that view, and simply passing the traditional "where" clause of the open report command. Even if that view has no filter and returns all rows in the table?
With the "where" clause of the open report command, ONLY the rows that meet the criteria will be pulled down the network pipe.
So, say an invoice table with 1 million rows.
Create a view, link the view in Access.
Base the report on that view.
Now, do this:
docmd.OpenReport "rptInvoice",,,"InvoiceNum = 134343"
The above will ONLY PULL down 1 row from the server. Even if the view has no filter and would return 1 million rows.
So, using a view is less work than creating the stored procedure.
But you can modify the stored procedure to accept a parameter, and then as noted use the above example to modify the PT query you have, and THEN open the report.
I think overall it is less work to use a view. Furthermore, if you have a slow-running report now?
Move the query to the SQL Server side as a view. Get it working. Now link to that view (give it the same name as the client-side query had in Access).
Now, EVEN if you had some fancy filter code in VBA and used OpenReport with the "where" clause? It will still work, only pull the matching records down the network pipe, and you get stored-procedure performance without the hassles. And the date format and "where" clause for OpenReport are Access/VBA style, not SQL Server style SQL.
So, I highly recommend you try to dump the stored procedure and use a view (and EVEN better, any where clause works, not just one based on pre-defined parameters for the stored procedure, so you are not limited to parameters).
However, it is no big deal; the above "EXEC MyStoreProc '" & strDate & "'" example would also work fine if you have a date parameter you wish to supply to the pass-through query.

Writing SQL in Informatica Data Quality Analyst

I am new to Informatica Data Quality Analyst (Version 9.5.1 HotFix3) and I am having trouble generating a basic SQL statement.
The SQL statement is being written against a mapping specification of a table that was originally imported as a flat file. The statement looks like:
Select ColumnA, ColumnB FROM Table1
WHERE Table1.ColumnA = 'S'
The SELECT .... FROM portion of the statement works fine but I encounter errors when I throw in the WHERE clause. I think my statement looks like standard SQL so I'm not sure why this will not work. Does Informatica Analyst accept SQL written only in a specific form? Are the inverted commas causing problems?
The query you are trying to execute looks like it should work. If it is not fetching results, do the following:
1) Load the data from the source flat file into any database (e.g. Oracle). You can import the data from the flat file into a table directly via SQL Developer.
2) Execute the query with the filter condition there. If it does not fetch any rows either, then your query in IDQ is returning the correct result. If it does fetch rows, there is something missing in your IDQ code.
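As a rough illustration of step 2 (the staging table name below is an assumption), once the flat file has been imported you can run the same filter directly in SQL Developer and compare it with what IDQ returns:
-- Hypothetical Oracle staging table loaded from the flat file
SELECT ColumnA, ColumnB
FROM Table1_Stage
WHERE ColumnA = 'S';

-- Quick check of how many rows the filter should return
SELECT COUNT(*) FROM Table1_Stage WHERE ColumnA = 'S';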

How does Tableau run queries on Redshift? (And/or why can't Redshift display Tableau queries?)

I'm kicking the tires on BI tools, including, of course, Tableau. Part of my evaluation includes correlating the SQL generated by the BI tool with my actions in the tool.
Tableau has me mystified. My database has 2 billion things; however, no matter what I do in Tableau, the query Redshift reports as having been run is "Fetch 10000 in SQL_CURxyz", i.e. a cursor operation. You can see the cursor IDs change, indicating new queries are being run, but you don't see the original queries.
Is this a Redshift or Tableau quirk? Any idea how to see what's actually running under the hood? And why is Tableau always operating on 10000 records at a time?
I just ran into the same problem and wrote this simple query to get all queries for currently active cursors:
SELECT
usr.usename AS username
, min(cur.starttime) AS start_time
, DATEDIFF(second, min(cur.starttime), getdate()) AS run_time
, min(cur.row_count) AS row_count
, min(cur.fetched_rows) AS fetched_rows
, listagg(util_text.text)
WITHIN GROUP (ORDER BY sequence) AS query
FROM STV_ACTIVE_CURSORS cur
JOIN stl_utilitytext util_text
ON cur.pid = util_text.pid AND cur.xid = util_text.xid
JOIN pg_user usr
ON usr.usesysid = cur.userid
GROUP BY usr.usename, util_text.xid;
Ah, this has already been asked on the AWS forums.
https://forums.aws.amazon.com/thread.jspa?threadID=152473
Redshift's console apparently doesn't display the query behind cursors. To get that, you can query STV_ACTIVE_CURSORS: http://docs.aws.amazon.com/redshift/latest/dg/r_STV_ACTIVE_CURSORS.html
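For a quick look without the joins, a minimal query against that system table (using only the columns referenced in the answer above) might be:
-- One row per currently open cursor, including how many rows the client has fetched so far
SELECT pid, xid, starttime, row_count, fetched_rows
FROM STV_ACTIVE_CURSORS;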
Also, you can alter your .TWB file (which is really just an XML file) and add the following parameters to the odbc-connect-string-extras property.
UseDeclareFetch=0;
FETCH=0;
You would end up with something like:
<connection class='redshift' dbname='yourdb' odbc-connect-string-extras='UseDeclareFetch=0;FETCH=0' port='0000' schema='schm' server='any.redshift.amazonaws.com' [...] >
Unfortunately there's no way of changing this behavior through the application; you must edit the file directly.
You should be aware of the performance implications of doing so. While this greatly enhances debugging, there must be a reason why Tableau chose not to allow modification of these parameters through the application.

SQL queries in batch don't execute

My project is in Visual FoxPro and I use MS SQL Server 2008. When I fire SQL queries in a batch, some of the queries don't execute; however, no error is thrown. I haven't used BEGIN TRAN and ROLLBACK yet. What should be done?
That all depends... You don't have any sample of your queries posted to give us an indication of the possible failure. However, one thing I've had good results with from VFP to SQL is to build the command into a string (I prefer using TEXT/ENDTEXT for readability), then send that entire value to SQL. If there are any "parameter"-based values that come from VFP locally, you can use "?" to indicate the value will come from a variable. Then you can batch everything in a single call vs multiple individual queries...
vfpField = 28
vfpString = 'Smith'
text to lcSqlCmd noshow
    select
        YT.blahKey,
        YT.blah,
        YT.blah2
    into
        #tempSqlResult
    from
        yourTable YT
    where
        YT.SomeKey = ?vfpField;

    select
        ost.Xblah,
        t.blah,
        t.blah2
    from
        OtherSQLTable ost
        join #tempSqlResult t
            on ost.Xblah = t.blahKey;

    drop table #tempSqlResult;
endtext
nHandle = sqlstringconnect( "your connection string" )
nAns = sqlexec( nHandle, lcSqlCmd, "LocalVFPCursorName" )
No, I don't have error trapping in here; this is just to show the principle and readability. I know the sample query could easily have been done via a join, but if you are working with some pre-aggregations and want to put them into temp work areas (like localized VFP cursors) from one query to be used in your next step, this works via #tempSqlResult, as "#" indicates a temporary table on SQL Server for whatever the current connection handle is.
If you want to return MULTIPLE RESULT SETs from a single SQL call, you can do that too: just add another query that doesn't have an "into #tmpSQLblah" clause. Then all of those result cursors will be brought back down to VFP based on the "LocalVFPCursorName" prefix. If you are returning 3 result sets, then VFP will have 3 cursors open called:
LocalVFPCursorName
LocalVFPCursorName1
LocalVFPCursorName2
and will be based on the sequence of the queries in the SqlExec() call. But if you can provide more detail on what you ARE trying to do, along with samples, we can offer more specific help too.

Is there a size limit for the SQL text in a PeopleSoft App Engine SQL Step/Action?

I'm getting the following error: AeSymResolveStatement [775] ... Meta-SQL error at or near position 34338 in statement (108,512). The SQL statement itself is over 40,000 chars long, hence the question.
The DB is Oracle. Running on Tools 8.49.24.
I know that there is a limit on the size of the SQL used in an Application Engine (SQL Step). I once received a similar error while trying to use an exceptionally long SQL in an Application Engine.
I wouldn't be surprised if that same limit applies to SQL objects.
To fix the problem, I was able to split the SQL into 2 (it was an update statement). Hopefully that's possible in your case as well.
There is no such limit.
You can confirm this yourself by creating an SQL like:
select 'x' from PS_INSTALLATION where
1 = 1 and
1 = 1 and
1 = 1 and
1 = 1 and
/* ... copy and paste '1 = 1 and' 90,000 or so more times */
1 = 1
Although it makes PSIDE quite slow, it saves and validates just fine.
There are limits within PeopleCode, mostly due to the limits on string length; however, I have never found a limit on stored SQL statements.
Personally, I'd look at breaking the statement into pieces in some way.
You could:
Use the inbuilt looping mechanism of App Engines
Use a mixture of SQL and PeopleCode
Use a temporary table and perform intermediate SQLs, storing the results in the temp table (sketched below)
Apart from not giving your database a heart seizure (not to mention the DBA when he sees the statement in the SQL monitor), you are saving yourself a world of pain if you ever have to look at the statement again.
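As a hedged illustration of the temp-table option from the list above (the record and field names here are invented), the one giant statement could become two smaller SQL actions:
-- Step 1: stage intermediate results in a dedicated temporary table (hypothetical names)
INSERT INTO PS_MY_TMP_TBL (EMPLID, TOTAL_AMT)
SELECT EMPLID, SUM(AMT)
FROM PS_SOURCE_TBL
GROUP BY EMPLID;

-- Step 2: a later SQL action works from the staged rows instead of repeating the logic inline
UPDATE PS_TARGET_TBL T
SET T.TOTAL_AMT = (SELECT S.TOTAL_AMT FROM PS_MY_TMP_TBL S WHERE S.EMPLID = T.EMPLID)
WHERE EXISTS (SELECT 1 FROM PS_MY_TMP_TBL S WHERE S.EMPLID = T.EMPLID);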
I think the SQLs in App Engines are stored as longs, so it would be 4GB under Oracle, something similarly huge under DB2.