How to extract data with Actual Column Size rather than Fixed Column Size in SYBASE by ISQL query - isql

The isql command executes the SQL file and generates a text file. Each result column is padded to the fixed (declared) size of the column rather than the actual size of the data.
e.g.
The Table "STUDENT" has columns
"FirstName" varchar(10)
"LastName" varchar(10)
ISQL Command :-
isql -UUserID -PPassword -SDatabase1 -DUserID -iName.sql -b -s -w2000 -oName.txt
When I execute the SELECT query (Name.sql) through the isql command, it results in:
Actual :-
FirstName |LastName
JOHN______|DOE_______
Note : "_" is blank spaces
Expected :-
FirstName|LastName
JOHN|DOE
I googled and found a few links, but they were not helpful to me:
https://docs.faircom.com/doc/isql/32422.htm
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.infocenter.dc30191.1550/html/utility/utility14.htm
Installed SYBASE version : 15.7.0

After researching, I learned that Sybase isql has this limitation: result columns are padded to the fixed size of the column rather than the actual size of the data.
There are other options available, such as temporary tables/views, that can produce the desired data.
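One such option (a sketch only, not necessarily what the poster did) is to build the delimited line inside the SELECT itself, so isql has a single string expression to print instead of two padded columns:

-- trim the padding in SQL; table/column names are the ones from the question
SELECT rtrim(FirstName) + '|' + rtrim(LastName)
FROM STUDENT

Depending on the isql version, the combined expression may still be right-padded to its maximum possible length, but any remaining blanks are confined to the end of the line, where they are easy to strip.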
I ended up writing a utility that does the job for me.
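The utility itself isn't shown; as a rough equivalent, a sed post-processing pass over the isql output (assuming '|' is the column separator and that data values do not themselves end in blanks) would be:

sed -e 's/ *|/|/g' -e 's/ *$//' Name.txt > Name_trimmed.txt

The first expression removes the run of blanks in front of every separator, the second removes trailing blanks at the end of each line.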

Related

Write results of SQL query to multiple files based on field value

My team uses a query that generates a text file over 500MB in size.
The query is executed from a Korn Shell script on an AIX server connecting to DB2.
The results are ordered and grouped by a specific field.
My question: Is it possible, using SQL, to write all rows with this specific field value to its own text file?
For example: All rows with field VENDORID = 1 would go to 1.txt, VENDORID = 2 to 2.txt, etc.
The field in question currently has 1000+ different values, so I would expect the same amount of text files.
Here is an alternative approach that gets each file directly from the database.
You can use the DB2 export command to generate each file. Something like this should be able to create one file:
db2 export to 1.txt of DEL select * from table where vendorid = 1
I would use a shell script or something like Perl to automate the execution of such a command for each value.
Depending on how fancy you want to get, you could just hardcode the extent of vendorid, or you could first get the list of distinct vendorids from the table and use that.
This method might scale a bit better than extracting one huge text file first.
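If you script it, a minimal ksh sketch (the database name mydb and the table name MYTABLE are assumptions) could look like this:

#!/bin/ksh
# one export file per distinct VENDORID value
db2 connect to mydb
for id in $(db2 -x "select distinct vendorid from MYTABLE order by vendorid")
do
    db2 "export to ${id}.txt of DEL select * from MYTABLE where vendorid = ${id}"
done
db2 connect reset

The -x flag makes the CLP print only the data values, so the loop variable receives just the vendorid on each iteration.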

ignore bcp right truncation

I have a file with stock information, such as ticker and stock price. The file was loaded into a database table using freebcp. The stock price format in the file is like 23.125, while the stock price column in the table is decimal(28, 2). freebcp loaded the data without any problem by dropping the last digit: 23.125 was stored as 23.12. We are now using Microsoft SQL Server's bcp utility (version 11.0) to load the data. However, bcp treats loading 23.125 into decimal(28, 2) as an error (## Row 783, Column 23: String data, right truncation ##) and rejects the record.
I don't want to modify the input file, because many columns in the file would need to be fixed by removing their last digit.
Is there any way to configure bcp or Microsoft SQL Server to ignore the right-truncation error?
A common workaround back in the day is to BCP into a secondary/staging table, then INSERT ... SELECT (column list) into the base table with the necessary conversion. Another option is to use the OPENROWSET Bulk Rowset Provider, which lets you cast/convert as needed.
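A rough sketch of the staging-table approach (all object names here are made up for illustration):

-- 1) bulk copy the file into a staging table whose columns are plain varchar, e.g.
--    bcp StockDb.dbo.Prices_stage in prices.csv -S <server> -T -c -t ,
CREATE TABLE dbo.Prices_stage (Ticker varchar(20), Price varchar(40));

-- 2) convert while copying into the real table;
--    ROUND(..., 2, 1) truncates 23.125 to 23.12 the way freebcp did,
--    whereas a plain CAST to decimal(28, 2) would round it to 23.13
INSERT INTO dbo.Prices (Ticker, Price)
SELECT Ticker, ROUND(CAST(Price AS decimal(28, 3)), 2, 1)
FROM dbo.Prices_stage;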
I encountered this error today and fixed it by using the -m parameter (SQL Server version 15):
bcp dbo.<table> in <csv file> -S <server> -d <db> -U <user> -P <psw> -m 999999 -q -c -t ,
Reference: https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15#m
Note:
The -m option also does not apply to converting the money or bigint data types.

DB2 SQL query returns some type of converted results when exporting to file

I have a shell script which queries a DB2 database and exports the output to a file. When I run the SQL statement without exporting, I get the following:
su - myid -c 'db2 connect to mydb;db2 -v "select COL1 from MYTABLE"; db2 connect reset;'
Sample Output
COL 1
x'20A0E2450080000'
x'50D24520E100GDS00'
x'10H0EFJ10080000'
x'50A0GH0080000'
x'80RHE1008B0000'
x'70A50E1F4008000'
x'10F329EF09BB0'
But when I export my results using the exact same query, I get the following:
su - myid -c 'db2 connect to mydb;db2 -v "EXPORT TO '/tmp/query_results.out' OF DEL MODIFIED BY COLDEL: select COL1 from MYTABLE"; db2 connect reset;'
Sample Output
hôª"
"xàÓ °á
"èÅ °á
hôª"
"é# °á
hôª"
"é« °á
hôª"
"éÅ °á
hôª"
"""ÒYá  á
hôª"
"#sYá  á
hôª"
I'm assuming this is due to the single quote characters. Because they are both preceded by another character, I have not been able to escape them with '\'. I've also attempted to run the substr function within the query, but I still get the same result, only shorter. I'm sure there must be something I am overlooking, so after several days of trying on my own (and failing), I'm turning to you guys. Any help would be greatly appreciated.
Edit: Just wanted to add that my actual select statement includes more than one column, and the others are displayed correctly. So out of several columns, only one is displaying bad data.
"I'm assuming this is due to the single quote characters" -- No. This particular column contains binary data, either BLOB or VARCHAR FOR BIT DATA. If it is BLOB, specify LOBS TO in the EXPORT command, this way BLOBs will be written to binary files. If it is VARCHAR FOR BIT DATA, you can either convert it to BLOB on export (export to ... lobs to ... select blob(your_column)...) or export it as hex(your_column), depending on what you're planning to do with the export later.
Another alternative for VARCHAR FOR BIT DATA would be to export your table using the IXF format instead of DEL, which will preserve binary strings.
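For example, either of these (sketches only; they reuse the database, table, and column names from the question) keeps the binary column readable in the export:

# export the column as a hex string inside the delimited file
su - myid -c 'db2 connect to mydb; db2 -v "EXPORT TO /tmp/query_results.out OF DEL MODIFIED BY COLDEL: SELECT hex(COL1) FROM MYTABLE"; db2 connect reset;'

# or convert it to BLOB and have the values written to LOB files
su - myid -c 'db2 connect to mydb; db2 -v "EXPORT TO /tmp/query_results.out OF DEL LOBS TO /tmp/lobs/ MODIFIED BY COLDEL: SELECT blob(COL1) FROM MYTABLE"; db2 connect reset;'

The file paths are left unquoted here so they do not clash with the outer single quotes of the su -c string.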

Moving results of T-SQL query to a file without using BCP?

What I want to do is output some query results to a file. Basically, when I query the table I'm interested in, my results look like this:
HTML_ID HTML_CONTENT
1 <html>...
2 <html>...
3 <html>...
4 <html>...
5 <html>...
6 <html>...
7 <html>...
The field HTML_CONTENT is of type ntext, and each record's value is 500+ characters of HTML content.
I can create a cursor to move each record's content to a temp table or whatever.
But my question is this: instead of a temp table, how would I move this data out without using BCP?
BCP isn't an option, as our sysadmin has blocked access to sys.xp_cmdshell.
Note: I want to store each record's HTML content in individual files.
My version of sql is: Microsoft SQL Server 2008 (SP1) - 10.0.2531.0
You can make use of SSIS to read the table data and output the content of the table rows as files. The Export Column transformation, available within the Data Flow Task of SSIS packages, can help you do that.
Here is an example: The Export Column Transformation
MSDN documentation about the Export Column transformation.
This answer would have worked until you added the requirement for Individual Files.
You can run the SQL from the command line and dump the output into a file. The following utilities can be used for this:
SQLCMD
OSQL
Here is an example with SQLCMD with an inline query
sqlcmd -S ServerName -E -Q "Select GetDate() CurrentDateAndTime" > output.txt
You can save the query to a file (QueryString.sql) and use -i instead
sqlcmd -S ServerName -E -i QueryString.sql > output.txt
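If the one-file-per-row requirement has to be met from the command line as well, a rough batch-file sketch (the table name MyHtmlTable is an assumption; -y 0 lifts the display-width limit for ntext columns) would be:

@echo off
rem write each row's HTML_CONTENT to a file named after its HTML_ID
for /f %%i in ('sqlcmd -S ServerName -E -h -1 -W -Q "SET NOCOUNT ON; SELECT HTML_ID FROM MyHtmlTable"') do (
    sqlcmd -S ServerName -E -h -1 -y 0 -Q "SET NOCOUNT ON; SELECT HTML_CONTENT FROM MyHtmlTable WHERE HTML_ID = %%i" > %%i.html
)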
Edit
Use SSIS
Create a package
Create a variable called RecordsToOutput of type Object at the package level
Use an EXECUTE SQL task and get the dataset back into RecordsToOutput
Use a For-Each loop to go through the RecordsToOutput dataset
In the loop, create a variable for each column in the dataset (give it the same name)
Add a Data Flow task
Use an OLE DB source and use a SQL statement to create one row (with data you already have)
Use a Flat File destination to write out the row
Use expressions on the Flat File connection to change the name of the destination file for each row in the loop

Unable to update the table of SQL Server with BCP utility

We have a database table that we pre-populate with data as part of our deployment procedure. Since one of the columns is binary (it's a binary serialized object) we use BCP to copy the data into the table.
So far this has worked very well. However, today we tried this technique on a Windows Server 2008 machine for the first time and noticed that not all of the rows were being populated correctly. Out of the 31 rows that are normally inserted as part of this operation, only 2 actually had their binary column populated correctly; the other 29 simply had null values in their binary column. This is the first time we've seen an issue like this, and it is the same .dat file that we use for all of our deployments.
Has anyone else ever encountered this issue before or have any insight as to what the issue could be?
Thanks in advance,
Jeremy
My guess is that you're using -c or -w to dump as text, and it's choking on a particular combination of characters it doesn't like and subbing in a NULL. This can also happen in Native mode if there's no format file. Try the following and see if it helps. (Obviously, you'll need to add the server and login switches yourself.)
bcp MyDatabase.dbo.MyTable format nul -f MyTable.fmt -n
bcp MyDatabase.dbo.MyTable out MyTable.dat -f MyTable.fmt -k -E -b 1000 -h "TABLOCK"
This'll dump the table data as straight binary with a format file, NULLs, and identity values to make absolutely sure everything lines up. In addition, it'll use batches of 1000 to optimize the data dump. Then, to insert it back:
bcp MySecondData.dbo.MyTable in MyTable.dat -f MyTable.fmt -n -b 1000
...which will use the format file, data file, and set batching to increase the speed a little. If you need more speed than that, you'll want to look at BULK INSERT, FirstRow/LastRow, and loading in parallel, but that's a bit beyond the scope of this question. :)