I'm currently having issues running SQL code through pypyodbc in Python. The ODBC connection points to Netezza, where the SQL is executed.
import pypyodbc

db = pypyodbc.connect(connection_string)
cursor = db.cursor()
sql = '''
Create external table TABLE1 (
col1,
col2,
col3
)
using (
dataobject('C:\\file.txt')
delimiter '|'
quotedvalue 'DOUBLE'
requiresquotes TRUE
nullvalue ''
escapechar '`'
y2base 2000
encoding 'internal'
remotesource 'ODBC'
logdir 'C:\\log'
);
Create table TABLE1_MAILBOX as (
select * from TABLE1
) distribute on random;
'''
cursor.execute(sql)
db.commit()
The first block of code, which creates the external table, works fine; it's the second block, which creates TABLE1_MAILBOX, where things go wrong. It seems to miss lines from TABLE1. For example, if the dataobject text file has 5000 lines, TABLE1 has 5000 lines, but TABLE1_MAILBOX has about 4750 lines.
However, if I run this code directly in Netezza, it works just fine and each table has the correct number of lines.
Not sure why it does this when run through pypyodbc in Python. Could it be a glitch, or am I doing something wrong?
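One thing that may be worth checking (a sketch, not a confirmed fix): some ODBC drivers behave differently when several statements are sent in one execute call. Splitting the script and executing each statement separately, with a commit in between, makes the statement boundary explicit. The split_statements helper below is a naive splitter introduced purely for illustration; it does not handle semicolons inside string literals or comments.

```python
def split_statements(script):
    """Naively split a SQL script on semicolons.

    Only a sketch: semicolons embedded in string literals
    or comments are not handled.
    """
    return [s.strip() for s in script.split(";") if s.strip()]

# Usage against the connection from the question (hypothetical):
# for stmt in split_statements(sql):
#     cursor.execute(stmt)
#     db.commit()
```

If the row counts match when the statements are run one at a time, that points at how the driver handles multi-statement batches rather than at the SQL itself.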
I have a table with 100 columns. I want to write select * from table and have NULL column values replaced with blanks. I don't want to include all 100 columns in my select statement and write:
select
isnull(col1,''),
isnull(col2,''),
...
isnull(col100,'')
from table
Check this out:
DECLARE @TableName VARCHAR(MAX) = 'ASC_LOT_TBL'
DECLARE @SchemaName VARCHAR(MAX) = 'dbo'
DECLARE @ColumnList VARCHAR(MAX)
SELECT @ColumnList = ISNULL(@ColumnList + ',', '') +
CASE WHEN DATA_TYPE LIKE '%char' THEN 'ISNULL(' + COLUMN_NAME + ','''') AS ' + COLUMN_NAME
WHEN DATA_TYPE = 'int' THEN 'ISNULL(' + COLUMN_NAME + ',''0'') AS ' + COLUMN_NAME
ELSE COLUMN_NAME END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @TableName
AND TABLE_SCHEMA = @SchemaName
ORDER BY ORDINAL_POSITION
SELECT 'SELECT ' + @ColumnList + ' FROM ' + @SchemaName + '.' + @TableName
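The same code-generation idea can be sketched outside the database. Here is a hedged Python version that builds the column list from (name, type) pairs mirroring what INFORMATION_SCHEMA.COLUMNS would return; the column names and the choice of 0 as the int default are illustrative assumptions:

```python
def build_column_list(columns):
    """Build a SELECT column list, wrapping char/int columns in ISNULL.

    `columns` is a list of (name, data_type) tuples, standing in for
    rows from INFORMATION_SCHEMA.COLUMNS.
    """
    parts = []
    for name, data_type in columns:
        if data_type.endswith("char"):
            # char-like columns: NULL becomes an empty string
            parts.append("ISNULL({0},'') AS {0}".format(name))
        elif data_type == "int":
            # int columns: NULL becomes 0 (an assumption for this sketch)
            parts.append("ISNULL({0},0) AS {0}".format(name))
        else:
            parts.append(name)
    return ", ".join(parts)

print(build_column_list([("col1", "varchar"), ("col2", "int"), ("col3", "date")]))
# ISNULL(col1,'') AS col1, ISNULL(col2,0) AS col2, col3
```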
Well, you can go over all the suggestions above at the SQL/script level, or change the design of your table to disallow NULL and use a default value of '' (in MSSQL, it will be ('')).
This way, when a new entry is inserted, that column will store the value '', not NULL.
And in that case, you can select * from table and get "blank" in return instead of NULL.
You need to choose between the script approach and the table-design approach, one way or another.
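To illustrate the design-level approach, here is a small sketch using Python's sqlite3 module, standing in for MSSQL purely for demonstration: with NOT NULL DEFAULT '', an insert that omits the column stores '' rather than NULL, so a plain select already returns blanks.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, col1 TEXT NOT NULL DEFAULT '')")
conn.execute("INSERT INTO t (id) VALUES (1)")  # col1 omitted on purpose
row = conn.execute("SELECT col1 FROM t WHERE id = 1").fetchone()
print(repr(row[0]))  # empty string, not None
```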
X/Y problem aside, to answer your specific question:
To write the query, you can use a regular-expression search and replace in your editor, to avoid editing each line a hundred times.
Just select all the concerned lines with the columns names after the SELECT,
col1,
col2,
col3,
...
Open search and replace (Ctrl + H) with regular expression mode activated in your favorite editor, search for (\S+), (the trailing comma is part of the pattern), replace with ISNULL($1, ''), and apply the replacement to the selection only.
Illustration in SSMS :
It works, for instance, in Visual Studio, SQL Server Management Studio, Notepad++, etc.
(provided there is no space in your column names; otherwise please leave a comment specifying what your column names look like, and we can work out a quick solution)
The last line needs to be edited manually, but you saved 99 edits anyway ;).
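The same search-and-replace can be sketched with Python's re module; as above, the pattern assumes column names with no spaces:

```python
import re

# A few column lines, as they would appear after the SELECT
lines = "col1,\ncol2,\ncol3,"

# Wrap each name in ISNULL(..., ''); \1 is the captured column name
wrapped = re.sub(r"(\S+),", r"ISNULL(\1, ''),", lines)
print(wrapped)
# ISNULL(col1, ''),
# ISNULL(col2, ''),
# ISNULL(col3, ''),
```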
You should also be able to create a view and then SELECT from that view; this way you would only need to write this once (not tested, though, and I'm not sure whether there could be performance issues in your use case).
I am using SQL Server Management Studio 2017 and have a table on my local machine. Let's say its name is Email. Inside the Email table there is a column named Language whose value can be 'en' or 'zh'. Besides this Language column, there are some more columns.
Now my colleague has the same table on a different machine, but he only has the rows where Language is 'en'. He doesn't have the rows where Language is 'zh' yet.
How can I use this software to generate a SQL script that inserts into his table only the rows where Language = 'zh', so that both tables end up the same?
To be clearer: I need to send him an INSERT script that also contains the data values, so that he can use it to INSERT into the database on his local machine.
Or is there any other way?
Is this what you want?
insert into colleague.dbo.email ( . . . )
select . . .
from my.dbo.email
where language = 'zh';
There are a couple of ways. The first is quick and requires no code, but you need to remove the rows you don't want manually; the other is more flexible, but you have to code it. There are also third-party tools that can do this for you.
If you just want to do this once, you can do the following (this is from a different version of SSMS, but it should be similar):
Right click on the database, select Tasks, then Generate Scripts. Pick the table you want, click Next, then in the advanced settings find "Types of data to script" and change it to Data only. This should generate a script to load all the data from your table; you would need to edit it to load just the rows you want.
Another option is to write a script that basically generates the data you want yourself. You would need to set it up to start with, and adjust it if you ever change the table format, but you can edit the selection script so that it only gets the data you want. I'm sure you could write a script that would create this automatically for you, but I've never needed one, so I haven't thought about it.
Something like this one
With baseRows as
(
--This bit gets the data you want
Select E.*
, ROW_NUMBER() over(ORDER BY ID) as RowNo
From dbo.Email E
Where E.Language = 'zh' -- whatever selection you need here
)
, selectRows as
(
--This bit creates the data select statments to set the data to import
Select
case when BR.RowNo = 1 then '' else 'Union All ' end
+ 'Select '
+ convert(varchar(10), BR.userID) + ' as userID, ' -- required integer example
+ case when BR.backupID is null then 'NULL' else CONVERT(varchar(10), BR.backupID) end + ' as backupID, ' -- nullable integer example
+ '''' + BR.Name + ''' as name, ' -- required nvarchar example
+ case when BR.groupname IS null then 'NULL' else '''' + BR.groupname + '''' end + ' as groupname, ' --nullable varchar example
+ CONVERT(varchar(2), BR.isActive) + ' as isActive' --bit example; no trailing comma on the last column
as SQL
from baseRows BR
)
Select
--This creates the insert command row (top row) of the final query
'Insert into Email (
userID
, backupID
, name
, groupName
, IsActive
)' as SQL
UNION ALL
Select SQL from selectRows --and adds the data to the following rows
If you run this script, the output would be the script you are looking for to load the data into another machine.
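For comparison, the same row-to-script idea can be sketched in Python. This is a hedged, simplified version: the table and column names are made up, NULL handling and quoting are reduced to the basics, and real code would need to cover every column type you use.

```python
def rows_to_insert(table, columns, rows):
    """Generate an INSERT ... SELECT ... UNION ALL script from rows.

    Strings are quoted with embedded single quotes doubled (SQL-style),
    None becomes NULL, and numbers pass through as-is.
    """
    def literal(value):
        if value is None:
            return "NULL"
        if isinstance(value, str):
            return "'" + value.replace("'", "''") + "'"
        return str(value)

    selects = [
        "SELECT " + ", ".join(literal(v) for v in row)
        for row in rows
    ]
    return ("INSERT INTO {} ({})\n".format(table, ", ".join(columns))
            + "\nUNION ALL ".join(selects))

script = rows_to_insert("Email", ["userID", "groupname"], [(1, "zh-users"), (2, None)])
print(script)
```

Running it on the two sample rows yields one INSERT header followed by a SELECT per row, joined with UNION ALL, which is exactly the shape the T-SQL generator above produces.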
Right click on the database name (the database containing the table) ---> Tasks ---> Generate Scripts ---> Next ---> Select specific database objects ---> select the table ---> Save to file = ADVANCED ---> click on Advanced ---> Types of data to script = "Data only" ---> give the path (where to store the insert commands).
I am using the copy command to dump the data of a table in PostgreSQL to a txt file. I run the following command in psql:
\copy (select * from TableName) to 'C:\Database\bb.txt' with delimiter E'\t' null as '';
Now in the bb.txt file I see some special characters that are not in the table itself. The database has been configured with UTF8 encoding.
For example: when I run the copy query above, a special character shows up in the row with ID=5. But if I run the same copy query restricted to that row (select * from tablename where ID=5), the special character is not there:
\copy (select * from TableName where ID=5) to 'C:\Database\bb.txt' with delimiter E'\t' null as '';
This happens on a Windows machine. Can someone tell me where these special characters are coming from?
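To see exactly which bytes are involved, a quick diagnostic sketch in Python can list every non-ASCII byte with its offset; that usually makes it clear whether the characters are a UTF-8 BOM, multi-byte UTF-8 sequences, or something else. The file path is the one from the question.

```python
def find_non_ascii(data):
    """Return (offset, byte_value) pairs for every non-ASCII byte."""
    return [(i, b) for i, b in enumerate(data) if b > 127]

# Usage on the actual dump (path from the question):
# with open(r"C:\Database\bb.txt", "rb") as f:
#     print(find_non_ascii(f.read()))

# Demonstration on a small UTF-8 sample: é encodes as two bytes
sample = "caf\u00e9\t5\n".encode("utf-8")
print(find_non_ascii(sample))
```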
I have to read a txt file that has more than 1 million records and insert them into a table I already have in SQL Server 2005. It worked great using this method:
set @variable='
insert into table(row1)
select * from OPENROWSET(''MSDASQL'',''Driver={Microsoft Text Driver (*.txt; *.csv)};
DEFAULTDIR='+@path+';rowDTERMINATOR = \n;'',''SELECT * FROM '+@fileName+''')
'
EXEC(@variable)
The problem is that some records have a ',' in the middle of the client's name, for example:
david,steven abril pulecio
and this method inserts that into the table like this:
|david|
So how can I use this method (or maybe bulk insert) and avoid it cutting the records when they have a ','?
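The underlying issue is that a bare comma-delimited format cannot distinguish delimiters from data; the usual fix is a format where fields containing the delimiter are quoted. As a sketch of the idea (using Python's csv module rather than the Text Driver), a quoted field keeps its embedded comma intact:

```python
import csv
import io

# Two records; the first has a comma inside a quoted field
raw = '1,"david,steven abril pulecio"\n2,maria lopez\n'
rows = list(csv.reader(io.StringIO(raw)))
print(rows)
# [['1', 'david,steven abril pulecio'], ['2', 'maria lopez']]
```

On the SQL Server side, the Microsoft Text Driver takes its format rules (delimiter, quoting) from a schema.ini file placed next to the data file, so declaring a quoted CSV format there, or switching to a delimiter that never appears in the data, are the directions to explore.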
I am unloading the results of a query to a file. I need the result records to be displayed in a single line. For example,
unload to file.unl select col from table where col = "test";
The resulting file.unl would be like
test|
test|
....
test|
But what I would like to get is,
test|test|test|....|test|
Any guidance or ideas are appreciated.
Thanks
You are probably aware you can declare DELIMITER "|" in the SQL or via the DBDELIMITER environment variable. As far as I know, there is no way to change the default record terminator from NEWLINE to anything else.
So I suspect you will need to either write a custom function that iterates through the results and appends them to a variable which you then write out to a file, or write a separate piece of script that you call via RUN that pivots the data from rows to columns, eg:
UNLOAD TO file.unl SELECT col FROM table WHERE ...;
RUN "/usr/bin/perl -ni -e 'chomp; print' file.unl";
I've assumed you're running on Unix and have perl handy. If you don't, you might find the iterator suggestion more practical than mucking about with installing perl on Windows, particularly if you have limited control over machines this code gets run on.
Update
My 4GL is very rusty, but I was describing something very basic, e.g.:
DEFINE command CHAR(10100)
DEFINE long_string CHAR(10000)
DEFINE col1 CHAR(100) -- size is a guess; match your column
DECLARE curs1 CURSOR FOR
SELECT col FROM table WHERE ...
FOREACH curs1 INTO col1
LET long_string = long_string CLIPPED || col1 CLIPPED || '|'
END FOREACH
LET command = "echo '" || long_string CLIPPED || "' > file.unl"
RUN command
It's 15 years or more since I wrote any 4GL, so treat that as pseudo-code at most, please. There might be a better way to write directly to a file, I honestly can't remember.
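If perl isn't handy, the same chomp-and-print pivot can be done with a few lines of Python (a sketch; the file name is the one from the question):

```python
def pivot_to_one_line(lines):
    """Join unloaded rows into a single line, like perl's `chomp; print`."""
    return "".join(line.rstrip("\n") for line in lines)

# Usage on the unload output (hypothetical):
# with open("file.unl") as f:
#     one_line = pivot_to_one_line(f)

print(pivot_to_one_line(["test|\n", "test|\n", "test|\n"]))
# test|test|test|
```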
CREATE Table #Table (
PKey INT Primary key,
Field VARCHAR(10)
)
INSERT INTO #Table
select 1, 'ABS1' UNION ALL
select 2, 'ABS2' UNION ALL
select 3, 'ABS3'
DECLARE @results VARCHAR(MAX) = ''
select
@results = COALESCE(
@results + '|' + Field,
@results
)
from #Table
SELECT @results
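The accumulation trick above can be mirrored in Python to see what it produces. This sketch follows only the non-NULL path of the COALESCE (a NULL Field would leave the accumulator unchanged); note the leading separator, which the T-SQL version shares because the variable starts as '':

```python
results = ""
for field in ["ABS1", "ABS2", "ABS3"]:
    # mirrors: results = results + '|' + Field
    results = results + "|" + field
print(results)
# |ABS1|ABS2|ABS3
```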