The query time is controllable using parameter value [' | case randomblob(1000000000) when not null then "" else "" end | '], which caused the request to take [142] milliseconds, when the original unmodified query with value [24] took [66] milliseconds.
So I found a SQL injection vulnerability on my site, and the payload is:
' | case randomblob(1000000000) when not null then "" else "" end | '
my site
https://sample.com/cdn-cgi/bm/cv/result?req_id=6506bd25b9e42c3e
I don't know how to use sqlmap to look at the database and see whether the vulnerability is that serious. How can I test this SQL injection manually?
The PortSwigger links below should help you understand the issue. If your server's response is delayed by the request, your database server is vulnerable to SQLi.
https://portswigger.net/web-security/sql-injection/blind/lab-time-delays
https://portswigger.net/web-security/sql-injection/blind/lab-time-delays-info-retrieval
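The principle behind a manual test is simple: send the same request twice, once with a payload that forces the database to do expensive work only if the injection executes, and compare response times. Since `randomblob()` is SQLite-specific (which the payload above suggests the backend is), here is a minimal local sketch of that timing primitive using Python's `sqlite3`; sizes are reduced so it runs quickly, while the real payload uses `randomblob(1000000000)` to force a human-visible delay.

```python
# Local demo of the time-based blind SQLi primitive: an expression that is
# cheap when a condition is false and expensive when it is true. The timing
# difference is the information leak. SQLite stands in for the target DB.
import sqlite3
import time

conn = sqlite3.connect(":memory:")

def timed(sql):
    start = time.perf_counter()
    row = conn.execute(sql).fetchone()
    return row, time.perf_counter() - start

# Baseline: a trivial expression, answered almost instantly.
baseline_row, baseline_t = timed("SELECT 1")

# "Payload": force the engine to build a large random blob before answering.
# If the WHEN condition were false, no blob would be built and the response
# would be as fast as the baseline.
payload_row, payload_t = timed(
    "SELECT CASE WHEN 1=1 THEN length(randomblob(20000000)) ELSE 0 END"
)

print(baseline_row, payload_row)  # (1,) (20000000,)
```

Against the live site you would do the same thing over HTTP: time the original request, then time one carrying the payload, exactly as the scanner output above did (66 ms vs 142 ms).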
Related
I have the following stored procedure (In MS SQL):
ALTER PROCEDURE [dbo].[proc_GetWorksWithEngineerVisits3]
@sTextSearch nvarchar(255) = NULL,
@bCompleteFlag bit = NULL,
@dExpectedStartDateTime datetime = NULL,
@dExpectedEndDateTime datetime = NULL,
@sResponsible_UserIDs nvarchar(255) = NULL,
@bEnableTextSearchFilter bit = 0,
@bEnableCompleteFlagFilter bit = 0,
@bEnableExpectedDateTimeRangeFilter bit = 0,
@bEnableResponsible_UserIDFilter bit = 0
AS
SELECT *
FROM dbo.vwWorksWithEngineerVisits
WHERE
--TextSearch Filter Start
(sCustomer LIKE CASE
WHEN @bEnableTextSearchFilter = 1
THEN '%' + @sTextSearch + '%'
ELSE sCustomer
END
OR
sSite LIKE CASE
WHEN @bEnableTextSearchFilter = 1
THEN '%' + @sTextSearch + '%'
ELSE sSite
END
OR
sCallID LIKE CASE
WHEN @bEnableTextSearchFilter = 1
THEN '%' + @sTextSearch + '%'
ELSE sCallID
END)
--TextSearch Filter End
AND
--Complete Filter Start
bIsComplete = CASE
WHEN @bEnableCompleteFlagFilter = 1
THEN @bCompleteFlag
ELSE bIsComplete
END
--Complete Filter End
AND
--Expected DateTime Range Filter Start
dExpectedStartDateTime >= CASE
WHEN @bEnableExpectedDateTimeRangeFilter = 1
THEN @dExpectedStartDateTime
ELSE dExpectedStartDateTime
END
AND
dExpectedEndDateTime <= CASE
WHEN @bEnableExpectedDateTimeRangeFilter = 1
THEN @dExpectedEndDateTime
ELSE dExpectedEndDateTime
END
--Expected DateTime Range Filter End
AND
--Responsible_UserID Filter Start
lResponsible_UserID IN (
CASE
WHEN @bEnableResponsible_UserIDFilter = 0
THEN lResponsible_UserID
ELSE (SELECT Value FROM dbo.CSVToList(@sResponsible_UserIDs) AS CSVToList_1)
END
)
--Responsible_UserID Filter End
ORDER BY dExpectedEndDateTime
The output is correct, but it is very slow: 15 seconds for only 5000 rows, while querying dbo.vwWorksWithEngineerVisits directly takes 1 second for the same number. When executing the SP, I am setting all enable flags to 0:
DECLARE @return_value int
EXEC @return_value = [dbo].[proc_GetWorksWithEngineerVisits3]
@sTextSearch = NULL,
@bCompleteFlag = 0,
@dExpectedStartDateTime = N'01/01/1969',
@dExpectedEndDateTime = N'01/01/2021',
@sResponsible_UserIDs = NULL,
@bEnableTextSearchFilter = 0,
@bEnableCompleteFlagFilter = 0,
@bEnableExpectedDateTimeRangeFilter = 0,
@bEnableResponsible_UserIDFilter = 0
SELECT 'Return Value' = @return_value
I want to be able to filter on a column only if the corresponding flag is set. I could probably just check the primary parameters for NULL and drop the flag parameters, but I don't think that changes the problem I am having.
The first 4 CASE filters are very basic, and when I comment out the remaining last 3, the performance/result is instantaneous. As soon as I add one of the last 3 back into the mix, things slow down as above. What makes these different is that they use ">=" or "IN" rather than just "=" or "LIKE". The other thing I noticed is that when I changed the following:
lResponsible_UserID IN (
CASE
WHEN @bEnableResponsible_UserIDFilter = 0
THEN lResponsible_UserID
ELSE (SELECT Value FROM dbo.CSVToList(@sResponsible_UserIDs) AS CSVToList_1)
END
to
lResponsible_UserID IN (
CASE
WHEN @bEnableResponsible_UserIDFilter = 0
THEN lResponsible_UserID
ELSE lResponsible_UserID
END
this also speeds things up to 1 second. How can changing the ELSE part of the statement make any difference whatsoever, when the flag is always 0, so that branch should never run?
I need these filters, and I need them dynamic. There is a mix of operator types (including an IN that targets a function). Is there a way to refactor this stored procedure to return the same results (it does work), but in a much more optimal way?
Apologies if I have missed something in my post; I will edit if it's pointed out.
Thanks
That's a big query!
SQL Server compiles the queries in your stored procedure the first time it runs and caches the resulting plan. It then reuses that compiled plan, blithely ignoring any optimizations that might come from your specific parameter values. This page explains:
When SQL Server executes procedures, any parameter values that are used by the procedure when it compiles are included as part of generating the query plan. If these values represent the typical ones with which the procedure is subsequently called, then the procedure benefits from the query plan every time that it compiles and executes. If parameter values on the procedure are frequently atypical, forcing a recompile of the procedure and a new plan based on different parameter values can improve performance.
In your situation, your parameter settings dramatically simplify the search you want. But the compiled sp doesn't know that so it uses an excessively generalized search plan.
Try appending this to the query in your SP (after your ORDER BY clause) to force the generation of a new, hopefully more specific, execution plan.
OPTION (RECOMPILE)
Also, you can tidy up your filter clauses and make them a little less gnarly.
Try this for your text-search cases: Change
sCustomer LIKE CASE
WHEN @bEnableTextSearchFilter = 1
THEN '%' + @sTextSearch + '%'
ELSE sCustomer
END
to
(@bEnableTextSearchFilter <> 1 OR sCustomer LIKE '%' + @sTextSearch + '%')
This avoids saying column LIKE column when your filter is disabled, and may save some time.
You can apply the same principle to the rest of your CASE statements too.
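The "(flag disabled OR predicate)" rewrite above can be demonstrated end to end. The original is T-SQL, so here is a small stand-in sketch using SQLite via Python's `sqlite3`; the table contents are made up, and only the `sCustomer` filter is shown.

```python
# Demo of the optional-filter pattern: when the enable flag is not 1, the
# first disjunct is true and the LIKE never decides the outcome; when it is
# 1, the LIKE does the filtering. SQLite stands in for SQL Server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE works (sCustomer TEXT)")
conn.executemany("INSERT INTO works VALUES (?)",
                 [("Acme Ltd",), ("Globex",), ("Acme Corp",)])

SQL = """
SELECT sCustomer FROM works
WHERE (:enable <> 1 OR sCustomer LIKE '%' || :search || '%')
"""

# Filter disabled: every row comes back, even with a NULL search term.
all_rows = conn.execute(SQL, {"enable": 0, "search": None}).fetchall()

# Filter enabled: only matching rows come back.
acme_rows = conn.execute(SQL, {"enable": 1, "search": "Acme"}).fetchall()

print(len(all_rows), len(acme_rows))  # 3 2
```

Note that SQLite concatenates with `||` where T-SQL uses `+`; the shape of the predicate is otherwise the same.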
Note: the filter pattern column LIKE '%value%' is inherently slow; it can't use an index range scan on column because the text-matching isn't anchored at the beginning of the pattern. Rather it must scan all the values.
I have a major problem and am trying to find a workaround. I have an application in PB12.5 that works on both SQL Server and Oracle databases (with a lot of data),
and I'm using a CURSOR at one point, but the application crashes only on SQL Server. Debugging in PB, I found that the SQL connection returns -1 due to a huge transaction size. But I want to fetch my data row by row. Is there any workaround to fetch data in pages? I mean, fetch the first 1000 rows, then the next 1000, and so on. I hope you understand what I want to achieve (to break up the fetch process and so reduce the transaction size, if possible). Here is my code:
DECLARE trans_Curs CURSOR FOR
SELECT associate_trans.trans_code
FROM associate_trans
WHERE associate_trans.usage_code = :ggs_vars.usage ORDER BY associate_trans.trans_code ;
OPEN trans_Curs;
FETCH trans_Curs INTO :ll_transId;
DO WHILE sqlca.sqlcode = 0
ll_index += 1
hpb_1.Position = ll_index
if not guo_associates.of_asstrans_updatemaster( ll_transId, ls_error) then
ROLLBACK;
CLOSE trans_Curs;
SetPointer(Arrow!)
MessageBox("Update Process", "Problem with the update process on~r~n" + sqlca.sqlerrtext)
cb_2.Enabled = TRUE
return
end if
FETCH trans_Curs INTO :ll_transId;
LOOP
CLOSE trans_Curs;
Since the structure of your source table is not fully presented, I'll make some assumptions here.
Let's assume that the records include a unique field that can be used as a reference (could be a counter or a timestamp). I'll assume here that the field is a timestamp.
Let's also assume that PB accepts cursors with parameters (not all solutions do; if it does not, there are simple workarounds).
You could modify your cursor to be something like:
[Note: I'm assuming also that the syntax presented here is valid for your environment; if not, adaptations are simple]
DECLARE TopTime TIMESTAMP ;
DECLARE trans_Curs CURSOR FOR
SELECT ots.associate_trans.trans_code
FROM ots.associate_trans
WHERE ots.associate_trans.usage_code = :ggs_vars.usage
AND ots.associate_trans.Timestamp < TopTime
ORDER BY ots.associate_trans.trans_code
LIMIT 1000 ;
:
:
IF (p_Start_Timestamp IS NULL) THEN
TopTime = CURRENT_TIMESTAMP() ;
ELSE
TopTime = p_Start_Timestamp ;
END IF ;
OPEN trans_Curs;
FETCH trans_Curs INTO :ll_transId;
:
:
In the above:
p_Start_Timestamp is a received timestamp parameter which would initially be empty, and would then contain the OLDEST timestamp fetched in the previous invocation;
CURRENT_TIMESTAMP() is a function of your environment returning the current timestamp.
This solution will only work when you need to progress in one direction (i.e. from present to past), and it assumes you accumulate all the fetched records in an internal buffer in case you need to scroll back up.
Hope this makes things clearer.
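The keyset-paging idea described above can be sketched compactly. The original is PowerBuilder against SQL Server or Oracle; this stand-in uses SQLite via Python's `sqlite3`, pages on the key itself rather than a timestamp, and uses a made-up batch size.

```python
# Keyset ("seek") pagination: fetch one batch at a time, remembering the last
# key seen, instead of holding one huge cursor/transaction open. Each batch
# is an independent, short-lived query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE associate_trans (trans_code INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO associate_trans VALUES (?)",
                 [(i,) for i in range(1, 26)])  # 25 mock rows

BATCH = 10
last_key = 0          # smaller than any real key
seen = []

while True:
    batch = conn.execute(
        """SELECT trans_code FROM associate_trans
           WHERE trans_code > ?
           ORDER BY trans_code
           LIMIT ?""",
        (last_key, BATCH),
    ).fetchall()
    if not batch:
        break                     # no more rows: done
    for (trans_code,) in batch:
        seen.append(trans_code)   # per-row processing would go here
    last_key = batch[-1][0]       # resume after the last key fetched

print(len(seen))  # 25
```

Paging on an indexed key with `WHERE key > ?` stays fast on every page, unlike `OFFSET`, which re-scans all skipped rows each time.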
First of all, thank you FDavidov for your effort. I managed to do it using a dynamic DataStore instead of a cursor, so here is my solution in case someone else needs it.
String ls_sql, ls_syntax, ls_err
Long ll_row, ll_count
DataStore lds_info
ls_sql = "SELECT associate_trans.trans_code " &
+ " FROM associate_trans " &
+ " WHERE associate_trans.usage_code = '" + ggs_vars.usage +"' "&
+ " ORDER BY associate_trans.trans_code"
ls_syntax = SQLCA.SyntaxFromSQL( ls_sql, "", ls_err )
IF ls_err <> '' THEN
MessageBox( 'Error...', ls_err )
RETURN
END IF
lds_info = CREATE DataStore
lds_info.Create( ls_syntax, ls_err )
lds_info.SetTransObject( SQLCA )
ll_count = lds_info.Retrieve( )
FOR ll_row = 1 TO ll_count
ll_transId = lds_info.GetItemNumber( ll_row, 'trans_code' )
ll_index += 1
hpb_1.Position = ll_index
do while yield(); loop
if not guo_associates.of_asstrans_updatemaster( ll_transId, ls_error) then
ROLLBACK;
DESTROY lds_info
SetPointer(Arrow!)
MessageBox("Update Process", "Problem with the update process on~r~n" + sqlca.sqlerrtext)
cb_2.Enabled = TRUE
return
end if
NEXT
DESTROY lds_info
We've got a QVW Script failing as it can't find the table to concatenate on, or load into the QVD.
ERROR messages shown on partial reload
ERROR MESSAGE 1
Table not found
Concatenate (DATES)
LOAD
'P' & Num(period,'00') & yearcode AS #dFinYearPeriod,
Num(period,'00') as dFinYearPeriod,
Num(period,'00') as dFinPeriod,
'' as dMonthEnd,
Text(yearcode-1) & '/' & Text(yearcode-2000) AS dFinYear,
Num(yearcode) AS dFinYearOnly,
'' AS dMonth,
yearcode as dYear,
'' AS dMonthNo,
'' as dFinYearEnd_Cur,
'' as dFinYearEnd_Prev
ERROR MESSAGE 2
Table not found
STORE DATES into C:\QlikView\QVD\DATES.qvd (qvd)
We've been running back and forth through the script and can't find the cause of the error. Nothing has been changed in the QVW as far as we're aware, the OLEDB connection is fine, and the stored procedure involved is working correctly, as is the sql script.
From the error messages we're getting, this looks to be the script failure point, but we can't work out why...
DATES:
LOAD
'P' & Num(dFinPeriod,'00') & Date(dFinYearEnd_Cur,'YYYY') AS #dFinYearPeriod,
if(isnull(dMonthEnd),
Num(dFinPeriod,'00'),
(if(dMonthEnd = '',
Num(dFinPeriod,'00'),
Num(dFinPeriod,'00') & ' (' & Text(Date (dMonthEnd,'MMM')) & ')'
)
)
) as dFinYearPeriod,
Num(dFinPeriod,'00') as dFinPeriod,
Date(dMonthEnd, 'DD/MM/YYYY') as dMonthEnd,
Text(Date(dFinYearEnd_Prev,'YYYY')) & '/' & Text(Date (dFinYearEnd_Cur,'YY')) AS dFinYear,
Year(Date(dFinYearEnd_Cur, 'DD/MM/YYYY')) AS dFinYearOnly, //Return integer
Text(Date(dMonthEnd,'MMM')) AS dMonth,
Text(Date(dMonthEnd,'YYYY')) as dYear,
Num(Month(dMonthEnd),'00') AS dMonthNo,
Date(dFinYearEnd_Cur,'DD/MM/YYYY') as dFinYearEnd_Cur,
Date(dFinYearEnd_Prev,'DD/MM/YYYY') as dFinYearEnd_Prev
//Filter to only financial year 2011/2 and later
WHERE Text(Date(dFinYearEnd_Cur,'YYYY'))>=2012
;
SQL EXEC
dbo.spGetMonthEnds
;
//Add on the non-date f periods ie. 13 to 16
Concatenate (DATES)
LOAD
'P' & Num(period,'00') & yearcode AS #dFinYearPeriod,
Num(period,'00') as dFinYearPeriod,
Num(period,'00') as dFinPeriod,
'' as dMonthEnd,
Text(yearcode-1) & '/' & Text(yearcode-2000) AS dFinYear,
Num(yearcode) AS dFinYearOnly,
'' AS dMonth,
yearcode as dYear,
'' AS dMonthNo,
'' as dFinYearEnd_Cur,
'' as dFinYearEnd_Prev
;
SQL Select
yearcode,
period
from
d_details
where
period <>'R' and
period >12 and period <=16
and yearcode >=2012
group by
yearcode,
period
;
STORE DATES into $(vFolder)DATES.qvd (qvd);
DROP Table DATES;
message on full reload
Connecting to Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=WRVS;Data Source=rvs-psfsql-1-a;Use Procedure for Prepare=1;Auto Translate=True;Packet Size=4096;Workstation ID=WRVS-CLICK-1-A;Use Encryption for Data=False;Tag with column collation when possible=False
Connected
DATES << EXEC
dbo.spGetMonthEnds
48 lines fetched
DATES << Select
yearcode,
period
from
d_details
where
61 lines fetched
The script execution seems to work, ish; it's pulling lines back, but it can't seem to find the DATES table to concatenate onto or to store into the QVD.
The date manipulation taking place was all in place previously, and there's nothing weird coming through in the SQL scripts to break any of that.
Any ideas please?
Thanks
It turned out to be a Kerberos error unrelated to the QlikView application itself:
Error Code: 0x7 KDC_ERR_S_PRINCIPAL_UNKNOWN
I think it was related to some virus cleanup work that took place overnight. A server reboot solved the issue.
I'm attempting to execute a simple "if/then" statement on an Oracle SQL database using RODBC in R. The SQL statement works fine in SQL Developer v4.0.2.15 but throws an error when performing the same statement in R
sqlQuery(channel, "
select
Variable1,
Variable2,
CASE WHEN Variable1 = 0 then 0 else 1 end as Var3
from schema.TABLE
where ROWNUM<100;
"
)
The error message (updated):
[1] "HY000 936 [Oracle][ODBC][Ora]ORA-00936: missing expression\n"
[2] ...
The statement works fine when removing the CASE WHEN line, so the error must be in the CASE WHEN syntax.
I've found that a trailing semicolon sometimes causes problems in cases like this.
> sqlQuery(con, "select case when dummy = 'X' then 1 else 0 end from dual")
CASEWHENDUMMY='X'THEN1ELSE0END
1 1
> sqlQuery(con, "select case when dummy = 'X' then 1 else 0 end from dual;")
[1] "HY000 911 [Oracle][ODBC][Ora]ORA-00911: invalid character\n"
[2] "[RODBC] ERROR: Could not SQLExecDirect 'select case when dummy = 'X' then 1 else 0 end from dual;'"
> close(con)
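If you build statements programmatically, a tiny sanitizing step along these lines avoids the problem entirely. This is an illustrative sketch (the helper name is made up, and it is not part of RODBC or any ODBC driver): strip any trailing semicolon before handing the statement to a driver that, like Oracle's, rejects it with ORA-00911.

```python
# Hypothetical helper: remove a single trailing semicolon (and surrounding
# trailing whitespace) from a SQL statement before executing it over ODBC.
def strip_trailing_semicolon(sql: str) -> str:
    sql = sql.rstrip()            # drop trailing whitespace/newlines
    if sql.endswith(";"):
        sql = sql[:-1].rstrip()   # drop the semicolon and anything before it
    return sql

print(strip_trailing_semicolon("select 1 from dual;\n"))  # select 1 from dual
```

Statements without a trailing semicolon pass through unchanged, so the helper is safe to apply unconditionally.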
If you look at the SQL statement (which seems to appear in its entirety in the error message), we see:
'\nselect\n UW_NET_ULTIMATE_LOSS_USD,\n UW_NET_WRITTEN_PREMIUM_USD,\n
IF UW_NET_WRITTEN_PREMIUM_USD = 0 THEN testvar=0 ELSE testvar=1 end\n
from bqiread.POLICY_METRICS\nwhere ROWNUM<100;\n
There's no CASE statement here... it appears to be an IF/THEN/ELSE, which is not valid Oracle syntax.
Perhaps your tool is generating SQL for a different RDBMS than Oracle, and attempting to execute it in Oracle?
I have a table that I receive as a table-type parameter in my stored procedure (it comes from an excel workbook source, but that's a different story). It has several columns that I need to validate the values against a list of valid values for each column.
Let's say my table OriginDetails looks like this (please note that this is just mock data; I have two such tables with each 8 columns I will be validating) -
Origin | Status | Priority | ErrMsg
------------------------------------------
Testing | In Review | Low |
Design | Initiated | Medium |
Prod | Declined | Critical |
And, I am validating the values in the columns Origin, Status and Priority against three different lists (actually I am validating the values against data in tables, but for simplicity I hard-coded these values here), and updating the ErrMsg column based on my validations -
UPDATE OriginDetails
SET ErrMsg = ErrMsg + '|Invalid Origin'
WHERE Origin NOT IN ('Pre-Design','Design','Development')
UPDATE OriginDetails
SET ErrMsg = ErrMsg + '|Unrecognized Status'
WHERE Status NOT IN ('In Review','Approved')
UPDATE OriginDetails
SET ErrMsg = ErrMsg + '|Priority check failed'
WHERE Priority NOT IN ('Critical','Medium','High')
This is all fine and dandy, works great - but I end up with 16 such update statements for 2 tables together, so I have a really large and ugly block of code (and a lot of duplication also since I have almost identical code for 2 tables).
Is there a way I can actually do all the updates in one single statement for each table?
Something like the below, except that it should execute each of the conditions instead of only one -
UPDATE OriginDetails
SET ErrMsg = ErrMsg +
(CASE WHEN Origin NOT IN ('Pre-Design','Design','Development')
THEN '|Invalid Origin'
WHEN Status NOT IN ('In Review','Approved')
THEN '|Unrecognized Status'
WHEN Priority NOT IN ('Critical','Medium','High')
THEN '|Priority check failed'
END)
Any ideas/direction are appreciated. Thanks.
Something like this should work well (and only requires you to type the values a single time):
UPDATE OriginDetails
SET ErrMsg = ErrMsg +
CASE WHEN Origin NOT IN ('Pre-Design','Design','Development') THEN '|Invalid Origin' ELSE '' END
+ CASE WHEN Status NOT IN ('In Review','Approved') THEN '|Unrecognized Status' ELSE '' END
+ CASE WHEN Priority NOT IN ('Critical','Medium','High') THEN '|Priority check failed' ELSE '' END
And here is the SQL Fiddle.
Good luck.
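The concatenated-CASE pattern above is easy to check end to end. The original is T-SQL; this sketch uses SQLite via Python's `sqlite3` (so `||` instead of `+` for concatenation) against the mock data from the question.

```python
# Verify the concatenated-CASE validation: each CASE contributes its error
# fragment independently, so one UPDATE covers all three checks per row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE OriginDetails
                (Origin TEXT, Status TEXT, Priority TEXT, ErrMsg TEXT DEFAULT '')""")
conn.executemany("INSERT INTO OriginDetails VALUES (?,?,?,'')", [
    ("Testing", "In Review", "Low"),
    ("Design", "Initiated", "Medium"),
    ("Prod", "Declined", "Critical"),
])

conn.execute("""
UPDATE OriginDetails
SET ErrMsg = ErrMsg
  || CASE WHEN Origin   NOT IN ('Pre-Design','Design','Development') THEN '|Invalid Origin'        ELSE '' END
  || CASE WHEN Status   NOT IN ('In Review','Approved')              THEN '|Unrecognized Status'   ELSE '' END
  || CASE WHEN Priority NOT IN ('Critical','Medium','High')          THEN '|Priority check failed' ELSE '' END
""")

rows = conn.execute("SELECT Origin, ErrMsg FROM OriginDetails").fetchall()
for origin, err in rows:
    print(origin, repr(err))
```

Each row ends up with exactly the fragments for the checks it fails, e.g. the "Testing" row collects both the Origin and Priority messages. Note that in T-SQL, if ErrMsg can be NULL, `ErrMsg + ...` yields NULL, so initialize the column to '' (or wrap it in ISNULL) first.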
How about this:
UPDATE OriginDetails
SET ErrMsg = ErrMsg +
(CASE
WHEN Origin NOT IN ('Pre-Design','Design','Development') and Status NOT IN ('In Review','Approved') and Priority NOT IN ('Critical','Medium','High')
THEN '|Invalid Origin|Unrecognized Status|Priority check failed'
WHEN
Origin NOT IN ('Pre-Design','Design','Development') and Status NOT IN ('In Review','Approved') and Priority IN ('Critical','Medium','High')
THEN '|Invalid Origin|Unrecognized Status'
END)
You can add all the remaining combinations of cases as I did.