stata odbc sqlfile

I am trying to load data from a database (either MS Access or SQL Server) using odbc sqlfile. The code seems to run without any error, but I am not getting any data. I am using the following command: odbc sqlfile("sqlcode.sql"), dsn("mysqlodbcdata"). Note that sqlcode.sql contains just one SQL statement with SELECT. The thing is that the same SQL code returns data with odbc load, exec(sqlstmt) dsn("mysqlodbcdata"). Can anyone suggest how I can use odbc sqlfile to import data? This would be a great help for me.
Thanks
Joy

odbc sqlfile doesn't load any data. It just executes the SQL (and displays the results when the loud option is specified) without loading anything into Stata. That's somewhat counterintuitive, but true. The reasons are somewhat opaquely explained in the pdf/dead-tree manual entry for the odbc command.
Here's a more helpful answer. Suppose you have your SQL file named sqlcode.sql. You can read it into Stata (as long as it's not too long, where "too long" depends on your flavor of Stata). Basically, -file read- reads the SQL code line by line, storing the result in a local macro named exec. Then you pass that macro as an argument to the -odbc load- command:
Updated Code to Deal with Some Double-Quote Issues
Cut and paste the following code into a file called loadsql.ado, which you should put in a directory where Stata can see it (like ~/ado/personal). You can find such directories with the -adopath- command.
program define loadsql
*! Load the output of an SQL file into Stata, version 1.3 (dvmaster#gmail.com)
version 14.1
syntax using/, DSN(string) [User(string) Password(string) CLEAR NOQuote LOWercase SQLshow ALLSTRing DATESTRing]
#delimit ;
tempname mysqlfile exec line;
/* read the SQL file line by line, accumulating the statement in `exec' */
file open `mysqlfile' using `"`using'"', read text;
file read `mysqlfile' `line';
while r(eof)==0 {;
    local `exec' `"``exec'' ``line''"';
    file read `mysqlfile' `line';
};
file close `mysqlfile';
/* hand the accumulated statement to -odbc load-, passing the options through */
odbc load, exec(`"``exec''"') dsn(`"`dsn'"') user(`"`user'"') password(`"`password'"') `clear' `noquote' `lowercase' `sqlshow' `allstring' `datestring';
end;
/* All done! */
The syntax in Stata is
loadsql using "./sqlfile.sql", dsn("mysqlodbcdata")
You can also add all the other odbc load options, such as clear. Obviously, you will need to change the file path and the odbc parameters to reflect your setup. This code should do the same thing as -odbc sqlfile("sqlfile.sql"), dsn("mysqlodbcdata")-, except it actually loads the data.
I also added the functionality to specify your DB credentials like this:
loadsql using "./sqlfile.sql", dsn("mysqlodbcdata") user("user_name") password("not12345")

For "--XYZ" style comments, add something like this inside the read loop (assuming you don't otherwise have "--" in your SQL code); a sketch of the combined loop follows below:
if strpos(`"``line''"', "--") > 0 {;
    local `line' = substr(`"``line''"', 1, strpos(`"``line''"', "--")-1);
};
I had to post this as an answer, otherwise the formatting would've been all messed up, but it's obviously referring to Dimitriy's code.
(You could also define a local macro holding the position of the "--" string to make your code a little cleaner.)
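Putting the two snippets together, the modified read loop inside loadsql would look like this (each line is trimmed before being appended to the accumulated statement):
while r(eof)==0 {;
    /* strip any trailing "--" comment from the line just read */
    if strpos(`"``line''"', "--") > 0 {;
        local `line' = substr(`"``line''"', 1, strpos(`"``line''"', "--")-1);
    };
    local `exec' `"``exec'' ``line''"';
    file read `mysqlfile' `line';
};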

Related

Replacing a word in a db2 sql file causes "DSNC105I : End of file reached while reading the command" error

I have a dynamic SQL file in which the value of TBCREATOR changes according to a parameter.
I use a simple Python script to change TBCREATOR=<variable here> and write the result to an output SQL file.
Calling this file using db2 -td# -vf <generated sql file> gives
DSNC105I : End of file reached while reading the command
Here is the file in which I need the TBCREATOR variable replaced:
CONNECT to 204.90.115.200:5040/DALLASC user *** using ****#
select REMARKS from sysibm.SYSCOLUMNS WHERE TBCREATOR='table' AND NAME='LCODE'
#
Here is the python script:
#!/usr/bin/python3
# ------ replace table value with schema name
# print(list_of_lines)
fin = open("decrypt.sql", "rt")
# output file to write the result to
fout = open("decryptout.sql", "wt")
for line in fin:
    fout.write(line.replace('table', 'ZXP214'))
fin.close()
fout.close()
After decryptout.sql is generated, I call it using db2 -td# -vf decryptout.sql
and get the error given above.
What's irritating is that I have another SQL file containing exactly the same data as decryptout.sql, and it runs smoothly with the db2 -td# -vf ... command. I used the Unix command cmp to compare the generated file with the one I wrote by hand (with the variable ZXP214 already replaced), and there are no differences. What is causing this error?
Here is the file (which executes without error) that I compare the generated output with:
CONNECT to 204.90.115.200:5040/DALLASC user *** using ****#
select REMARKS from sysibm.SYSCOLUMNS WHERE TBCREATOR='ZXP214' AND NAME='LCODE'
#
I found that, specifically on the https://ibmzxplore.influitive.com/ challenge, if you are using the Java db2 command and working in the Zowe USS system (Unix System Services of z/OS), there is a conflict of character sets. I believe the system will generally create files in EBCDIC format, whereas if you do
echo "CONNECT ..." > syscat.clp
the resulting file will be tagged as ISO8859-1 and will not be processed properly by db2. Instead, go to the USS interface and choose "create file", give it a folder and a name, and it will create the file untagged. You can use
ls -T
to see the tags. Then edit the file to give it the commands you need, and db2 will interoperate with it properly. Because you are creating the file with Python, you may be running into similar issues. When you open the new file, use something like
open(input_file_name, mode="w", encoding="cp1047")
This makes sure the file is opened as an EBCDIC file.
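Putting that together with the original script, a minimal sketch (assuming cp1047 is the EBCDIC code page your system expects; depending on how decrypt.sql was created, the input file may need an explicit encoding too):
#!/usr/bin/python3
# rewrite the schema name, emitting the output as EBCDIC (cp1047)
# so the java db2 command can process it
with open("decrypt.sql", "rt") as fin, \
     open("decryptout.sql", "wt", encoding="cp1047") as fout:
    for line in fin:
        fout.write(line.replace('table', 'ZXP214'))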
If you are using the Db2-LUW CLP (command line processor) that is written in C/C++ and runs on Windows/Linux/Unix, then your syntax for CONNECT is not valid.
Unfortunately, your question is ambiguously tagged, so we cannot tell which Db2 server platform you actually use.
For Db2-LUW with the classic C/C++ db2 command, the syntax for a type-1 CONNECT statement does not allow a connection string (or partial connection string) as shown in your question. For the Db2-LUW db2 CLP, the target database must be defined externally (i.e. not inside the script), either via the legacy actions of catalog tcpip node ... combined with catalog database ..., or in the db2dsdriver.cfg configuration file as plain XML.
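For illustration, the legacy catalog approach looks roughly like this (the node name and credentials here are made up; the host and port are taken from your CONNECT line):
db2 catalog tcpip node mynode remote 204.90.115.200 server 5040
db2 catalog database DALLASC at node mynode
db2 connect to DALLASC user myuser using mypassword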
If you want to use connection strings, you can use the clpplus tool, which is available with some Db2-LUW client packages and is present on currently supported Db2-LUW servers. It lets you use Oracle-style scripting with Db2. Refer to the online documentation for details.
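For example, something like this (a sketch from memory, so verify the exact syntax against the docs; the credentials are placeholders):
clpplus myuser/mypassword@204.90.115.200:5040/DALLASC @decryptout.sql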
If you are not using the classic C/C++ db2 command, and are instead using the emulated CLP written in Java that is only available with z/OS USS, then you must open a ticket with IBM support for that component, as that is not a matter for Stack Overflow.

How to provide vsdbcmd deploy command line target dbschema sql command variables?

The Visual Studio (2010) GUI provides an option for specifying a second command variable file for the target. However, I can't find this option in the command-line implementation, vsdbcmd.exe.
Running vsdbcmd deploy from dbschema to dbschema with only the source model's command variables given results in objects that use the variables being treated as changed, producing an incorrect update script.
The command I currently use:
vsdbcmd.exe /a:deploy /dd:- /dsp:sql /model:Source.dbschema /targetmodelfile:Target.dbschema /p:SqlCommandVariablesFile=Database.sqlcmdvars /manifest:Database.deploymanifest /DeploymentScriptFile:UpdateScript.sql /p:TargetDatabase="DatabaseName"
What I'm looking for is /p:TargetSqlCommandVariablesFile, if such a thing exists...
The resulting script is the same as running a GUI compare without specifying the sqlcmd vars for the target.
I found what looks like full documentation for VSDBCMD.EXE at this link.
I think you may be looking for something like:
/p:SqlCommandVariablesFile=Filepath
In the end, I found no info on the possibility of doing what I required; I checked the vsdbcmd libraries with ILSpy for hidden parameters and didn't find any.
I reached my goal by parsing the dbschema files for both target and current and writing the cmd variable values directly into them, then doing the compare on the modified dbschemas. This approach no longer allows changing sqlcmd vars in the resulting script (the values are already baked into the code), but that was deemed an acceptable loss.
Not the most beautiful solution, but so far I have had no issues with it.
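For anyone attempting the same workaround: dbschema files are XML, and sqlcmd variables typically appear in them as $(Name) tokens, so the substitution step can be as simple as the following sketch (the file and variable names here are hypothetical):
# bake sqlcmd variable values directly into a .dbschema file
variables = {"DatabaseName": "MyDb", "DefaultDataPath": "C:\\Data\\"}

with open("Target.dbschema", "rt", encoding="utf-8") as f:
    text = f.read()

# replace each $(Name) token with its literal value
for name, value in variables.items():
    text = text.replace("$(%s)" % name, value)

with open("Target.baked.dbschema", "wt", encoding="utf-8") as f:
    f.write(text)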

configure_file works wrong for sqlite3 database

Here is what I am doing:
#configure_file(${CMAKE_CURRENT_SOURCE_DIR}/xdb.db3 ${complex_BINARY_DIR}/) <-- works wrong
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/armd.conf ${complex_BINARY_DIR}/)
Both files are copied there properly, but when I try to use the copied xdb.db3, my program and my SQLite editor say "xdb.db3 is not sqlite database or encrypted".
How should I copy the SQLite database, and why can't I do it with configure_file?
Try adding the COPYONLY flag to configure_file. Without it, configure_file treats the input as text and substitutes variable references like @VAR@ and ${VAR}, which can corrupt a binary file such as a SQLite database.
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/xdb.db3 ${complex_BINARY_DIR} COPYONLY)

Reading a text file with SSIS with CRLF or LF

I'm running into an issue where I receive a text file that has LFs as the EOL. Sometimes they send the file with CRLFs as the EOL instead. Does anyone have any good ideas on how I can make SSIS accept either one as the EOL?
It's a very easy conversion with Notepad++ to change it to whatever I need; however, that's manual, and I want it to be automatic.
Thanks,
EDIT: I fixed it (though not perfectly) by running Swiss File Knife before the dataflow.
If the line terminators are always one or the other, I'd suggest setting up 2 File Connection Managers, one with the "CRLF" row delimiter and the other with the "LF" row delimiter.
Then, create a boolean package variable (something like @IsCrLf) and scope it to your package. Make the first step in your SSIS package a Script Task in which you read in a file stream and attempt to discover what the line terminator is (based on what you find in the stream); the detection logic is sketched below. Set the value of your variable accordingly.
Then, after the Script Task in your Control Flow, create 2 separate Data Flows (one for each File Connection Manager) and use a Precedence Constraint set to "Expression and Constraint" on the connectors to specify which Data Flow to use, depending on the value of the @IsCrLf variable.
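The detection itself is simple; here is the idea as a Python sketch (the real Script Task would be the C# or VB.NET equivalent, and the file name is a placeholder):
# peek at the first line in binary mode; readline keeps the terminator
def uses_crlf(path):
    with open(path, "rb") as f:
        return f.readline().endswith(b"\r\n")

# set the @IsCrLf package variable from this result, e.g.:
print(uses_crlf("incoming.txt"))  # True -> route to the CRLF Data Flow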
How about a Derived Column with the REPLACE operation after your file source to change the CRLFs to LFs?
I second the OP's vote for Swiss File Knife.
To integrate that, I had to add an Execute Process Task:
However, I have a bunch of packages that run For-Each-File loops, so I needed some BIML; maybe this'll help the next soul.
<ExecuteProcess Name="(EXE) Convert crlf for <#= tableName #>"
                Executable="<#= myExeFolder #>sfk.exe">
    <Expressions>
        <Expression PropertyName="Arguments">
            "crlf-to-lf " + @[User::sFullFilePath]
        </Expression>
    </Expressions>
</ExecuteProcess>

How to Store BLOB data in Sqlite Using Tcl

I have a Tcl/Tk application with an SQLite back-end. I pretty much understand the syntax for inserting, manipulating, and reading string data; however, I do not understand how to store pictures or files in SQLite with Tcl.
I do know I have to create a column that holds BLOB data in SQLite. I just don't know what to do on the Tcl side of things. If anyone knows how to do this or has a good reference to suggest, I would really appreciate it.
Thank you,
Damion
In my code, I basically open the file in binary mode, load its content into a Tcl variable, and stuff that into the SQLite db. So, something like this...
# load the file's contents
set fileID [open $file RDONLY]
fconfigure $fileID -translation binary
set content [read $fileID]
close $fileID
# store the data in a blob field of the db
$db eval {INSERT OR REPLACE INTO files (content) VALUES ($content)}
Obviously, you'll want to season to taste, and your table will probably contain additional columns...
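Getting the file back out is just as short; a sketch assuming the same files table and a rowid $id you have on hand (the output file name is made up):
# fetch the blob into a Tcl variable and write it back to disk
$db eval {SELECT content FROM files WHERE rowid = $id} {
    set out [open restored.bin wb]
    puts -nonewline $out $content
    close $out
}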
The incrblob command looks like what you want: http://sqlite.org/tclsqlite.html#incrblob
The "incrblob" method
This method opens a TCL channel that can be used to read or write into a preexisting BLOB in the database. The syntax is like this:
dbcmd incrblob ?-readonly? ?DB? TABLE COLUMN ROWID
The command returns a new TCL channel for reading or writing to the BLOB. The channel is opened using the underlying sqlite3_blob_open() C-language interface. Close the channel using the close command of TCL.
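A minimal usage sketch to go with that (the files table is from the earlier answer and the file name is made up; note the BLOB must already exist at the right size, which is what zeroblob() arranges):
# reserve space for the blob, then stream a file into it
set size [file size photo.png]
$db eval {INSERT INTO files (content) VALUES (zeroblob($size))}
set rowid [$db last_insert_rowid]

set blobChan [$db incrblob files content $rowid]
fconfigure $blobChan -translation binary
set fileChan [open photo.png rb]
puts -nonewline $blobChan [read $fileChan]
close $fileChan
close $blobChan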