Format output in BTEQ in Teradata - SQL

I am using the BTEQ script below to fetch data:
.SET SEPARATOR '|';
.SET TITLEDASHES OFF;
.SET NULL AS '';
.EXPORT REPORT FILE = /app2/test.txt;
sel
emp_id,
float_perc,
CAST( 0 AS DECIMAL(18,2) ) AS var
from emp;
I am getting this output:
5|.99|.00
4|.78|.00
But we want the output in this format:
5|0.99|0.00
4|0.78|0.00
Can anyone please help with this?
Can we replace |. with |0. in Unix (Sun OS) with sed or tee?
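For example, would something along these lines work (an untested sketch; the path matches the EXPORT above, and test_fixed.txt is just a placeholder name)?
sed 's/|\./|0./g' /app2/test.txt > /app2/test_fixed.txt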

Can you use a FORMAT?
sel
emp_id,
float_perc (FORMAT '9.99'),
CAST( 0 AS DECIMAL(18,2) FORMAT '9.99') AS var
from emp
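One caveat: a picture like '9.99' only has room for a single digit before the decimal point, so a value of 10 or more would overflow. If float_perc can get that large, a wider zero-suppressed picture keeps the leading zero for sub-1 values while fitting bigger ones, e.g. (the width here is an arbitrary, untested guess):
float_perc (FORMAT 'Z(15)9.99')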

Related

Export query result to an XML file - removing carriage return from XML result file?

I'm keen to export a SQL Server query result to an XML file, but I seem to get carriage returns in the resulting file.
What approach should I take to remove the carriage returns from the XML results file?
What I have tried is:
DOS command:
sqlcmd -S HSL-PC0242 -U sa -P PasswordX -i "D:\SQL\auditlog_query1.sql" -C -o "D:\SQL\auditlog_query1_out.xml"
D:\SQL\auditlog_query1.sql contains:
SELECT
A.*
FROM
H2PenguinDev.[dbo].[AuditLog] A
JOIN H2PenguinDev.dbo.ImportProviderProcesses IPP ON IPP.ImportType = 'Z'
AND A.OperatorID = IPP.OperatorID
AND A.AuditTypeID in ( '400','424','425' )
WHERE
A.[PostTime] >= IPP.StartTime
AND A.[PostTime] <= dateadd(second, 90, IPP.StartTime)
FOR XML PATH('Record'), ROOT('AuditLog')
This seems to work, though there is a 2 GB output limit, which is fine for this case.
You can open the resulting XML in Excel, and/or view it with the Notepad++ XML plugin's pretty print option.
Note the requirement for a global ## temp table rather than a local # temp table name: the bcp command below connects in its own session, which cannot see a session-local temp table.
SELECT A.MyXML
INTO ##AuditLogTempTable
FROM
(SELECT CONVERT(nvarchar(max),
(
SELECT
A.*
FROM
[dbo].[AuditLog] A
JOIN ImportProviderProcesses IPP ON IPP.ImportType = 'Z'
AND A.OperatorID = IPP.OperatorID
AND A.AuditTypeID in ( '400','424','425' )
WHERE
A.[PostTime] >= IPP.StartTime
AND A.[PostTime] <= dateadd(second, 90, IPP.StartTime)
FOR XML PATH('Record'), ROOT('AuditLog')
)
, 0
) AS MyXML
) A
EXEC xp_cmdshell 'bcp "SELECT MyXML FROM ##AuditLogTempTable" queryout "D:\bcptest1.xml" -T -c -t,'
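If stray carriage returns do remain in the exported file, they are easy to strip afterwards. A minimal sketch for a Unix-like system (file names are placeholders); on Windows, any editor with regex replace can do the same:
tr -d '\r' < bcptest1.xml > bcptest1_clean.xml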

How to pass arguments in IN clause of select statement as parameter having multiple values?

I am writing a script in Unix where I am trying to implement the following:
1) Connect to a database
2) Run a select query and fetch the results into a file for validation
Now I have written the following:
#!/bin/bash
file="./database.properties"
if [ -f "$file" ]
then
echo "$file found."
. $file
echo "User Id = " ${userName}
echo "user password = " ${password}
echo "schema = " ${schema}
sqlplus -S ${userName}/${password}@${schema} <<EOF
set feedback off trimspool on
spool workflow_details.txt
SELECT WORKFLOW_NAME, START_TIME, END_TIME, (END_TIME-START_TIME)*24*60 as TIME_TAKEN
FROM schema1.table1
WHERE WORKFLOW_NAME IN ('argument1,argument2,argument3,argument4')
AND WORKFLOW_RUN_ID IN (SELECT MAX(WORKFLOW_RUN_ID) FROM schema2.table3
WHERE WORKFLOW_NAME IN ('argument1'));
spool off
exit
EOF
else
echo "$file not found."
fi
The requirement is that the values I am using in the IN clause (argument1, argument2, etc.) are kept in a file, and the script should be modified so that the arguments are fetched from that file and placed in the IN clause, comma-separated. The number of arguments is dynamic. How do I modify the code?
In short, I need to fetch the arguments for the IN clause at run time from a file holding the argument details. The file is a single column of arguments.
As mentioned in my comments, you need to use a collection to fulfil your requirement. See the demo below, with explanation inline.
In PLSQL
-- Declare a nested table of NUMBER. You can declare it with the type of your argument1, argument2, ...
Create or replace type var is table of number;
/
DECLARE
v_var var := var ();
v_num number;
BEGIN
--Fetch rows into the collection
SELECT * BULK COLLECT INTO
v_var
FROM (
SELECT 1 FROM dual
UNION ALL
SELECT 2 FROM dual
);
--Print the collection's values
FOR rec IN 1..v_var.count LOOP
dbms_output.put_line(v_var(rec) );
END LOOP;
--Using the collection in a WHERE clause.
Select count(1)
into v_num
from dual where 1 Member of v_var; -- this is how you use the collection of numbers in place of an IN clause
dbms_output.put_line(v_num );
END;
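To actually see the dbms_output lines when running the demo in SQL*Plus, turn server output on first (a standard SQL*Plus setting, not shown above):
set serveroutput on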
In your case (Unix script):
#!/bin/bash
# read from the file and prepare the "in clause" <-- put a loop here to read through the file; see the sketch after this answer
in_clause="'argument1','argument2'" # prepare your in_clause (string values need their own quotes)
file="./database.properties"
if [ -f "$file" ]
then
echo "$file found."
. $file
echo "User Id = " ${userName}
echo "user password = " ${password}
echo "schema = " ${schema}
sqlplus -S ${userName}/${password}@${schema} <<EOF
set feedback off trimspool on
spool workflow_details.txt
SELECT workflow_name,
       start_time,
       end_time,
       ( end_time - start_time ) * 24 * 60 AS time_taken
FROM schema1.table1
WHERE workflow_name IN ( $in_clause ) -- use the in_clause built above
AND workflow_run_id IN (SELECT MAX(workflow_run_id) FROM schema2.table3 WHERE workflow_name IN ( 'argument1' )
);
spool off
exit
EOF
else
echo "$file not found."
fi
PS: Not tested
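For the read-from-file step left as a comment in the script, one hedged way to build in_clause (assuming a file arguments.txt with one argument per line) is to wrap each line in quotes and join them with commas:
in_clause=$(sed "s/.*/'&'/" arguments.txt | paste -s -d, -)
echo $in_clause # e.g. 'argument1','argument2','argument3'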

Postgres copy to TSV file with header

I have a function like so -
CREATE OR REPLACE FUNCTION ind (bucket text)
RETURNS TABLE (
first character varying(100),
last character varying(100)
) AS $body$
BEGIN
RETURN QUERY
select
fname as first,
lname as last
from all_records;
END;
$body$ LANGUAGE plpgsql;
How do I output the results of select ind ('Mob') into a tsv file?
I want the output to look like this -
first last
MARY KATHERINE
You can use the COPY command. Example:
COPY (select * from ind('Mob')) TO '/tmp/ind.tsv' CSV HEADER DELIMITER E'\t';
The file '/tmp/ind.tsv' will contain your data.
Postgres doesn't allow COPY with a header for the plain tab-separated text format, for some reason.
If you're using a Linux-based system, you can do it with a script like this:
# create a file with a tab-delimited column list (use \t between each column name)
echo -e "user_id\temail" > user_output.tsv
# now you can append the results of your query to that file by copying to STDOUT
psql -h your_host_name -d your_database_name -c "\copy (SELECT user_id, email FROM my_user_table) to STDOUT;" >> user_output.tsv
Alternatively, if your script is long and you don't want to pass it with the -c option, you can use the same approach from a .sql file; use --quiet to avoid notices being written into your file:
psql --quiet -h your_host_name -d your_database_name -f your_sql_file.sql >> user_output.tsv
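The CSV-with-tab-delimiter trick from the first answer also works client-side through \copy, which gives you the header and data in one go (a sketch, untested, with the same placeholder connection details):
psql -h your_host_name -d your_database_name -c "\copy (SELECT user_id, email FROM my_user_table) to 'user_output.tsv' with (format csv, header, delimiter E'\t')"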

Pass variables from shell script to SQL

I have an SQL statement file 1.sql:
set pages 0
set head off
set feed off
select $1 from
(
select $1 from user_tab_partitions
where table_name = 'test'
order by partition_position desc
)
where rownum = 1;
and I would like to execute the same SQL statement in a shell script, 1.sh:
#!/bin/ksh
username="test"
passwrd="testpass"
partition_name=$1
partition_position=$2
PARTNAME=`sqlplus -s $username/$passwrd << EOT
@1.sql $1
exit
EOT`
echo $PARTNAME
PARTPOS=`sqlplus -s $username/$passwrd << EOT
@1.sql $2
exit
EOT`
echo $PARTPOS
So, basically what I'm doing is executing the same SQL but with different inputs, and I don't know how to pass these variables from the shell script to the SQL script.
What should I change in my code?
Thanks for your time!
/Hesi
You need to change your SQL script from $1 to &1. The $1 will only work if you embed the SQL into the actual here document in the script.
set pages 0
set head off
set feed off
select &1 from
(
select &1 from user_tab_partitions
where table_name = 'test'
order by partition_position desc
)
where rownum = 1;
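With &1 in place, 1.sql can also take its argument directly on the sqlplus command line instead of through a here document; a sketch using the question's own variables (1.sql should end with an exit so sqlplus returns control):
PARTNAME=`sqlplus -s $username/$passwrd @1.sql $partition_name`
echo $PARTNAME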

Oracle PL/SQL: Import multiple delimited files into a table

I have multiple files (f1.log, f2.log, f3.log, etc.).
Each file holds data in a ;- and =-delimited format (records are delimited by ; and fields by =), e.g.
data of f1:
1=a;2=b;3=c
data of f2:
1=p;2=q;3=r
I need to read all these files and import data into table in format:
filename  number  data
f1        1       a
f1        2       b
f1        3       c
f2        1       p
[...]
I am new to SQL. Can you please guide me on how to do this?
Use SQL*Loader to get the files into a table. Assuming you have a table created a bit like:
create table FLOG
(
FILENAME varchar2(1000)
,NUM varchar2(1000)
,DATA varchar2(1000)
);
Then you can use the following control file:
LOAD DATA
INFILE 'f1.log' "str ';'"
truncate INTO TABLE flog
fields terminated by '=' TRAILING NULLCOLS
(
filename constant 'f1'
,num char
,data char
)
However, you will need a different control file for each input file. This can be handled by generating the control file dynamically with a shell script. A sample shell script:
cat >flog.ctl <<_EOF
LOAD DATA
INFILE '$1.log' "str ';'"
APPEND INTO TABLE flog
fields terminated by '=' TRAILING NULLCOLS
(
filename constant '$1'
,num char
,data char
)
_EOF
sqlldr <username>/<password>@<instance> control=flog.ctl data=$1.log
Saved as flog.sh, it can then be run like this:
./flog.sh f1
./flog.sh f2
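If there are many log files, the per-file invocations generalize to a small loop (a sketch, assuming all the logs match the f*.log pattern):
for f in f*.log
do
./flog.sh "${f%.log}" # strip the .log suffix to pass just the base name
done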