Bash script to import multiple CSV files into a MySQL database using the LOAD DATA LOCAL INFILE command - sql

I have multiple CSV files stored in one folder, and I need to loop over that folder, pick up each CSV file and load it into a database table.
The script needs to be written in Bash with parameterized fields such as InputFolderPath (to loop over the CSV files), DatabaseConnection, SchemaName and TableName, and then pass these fields to the
LOAD DATA LOCAL INFILE command.

This worked for me,
# Loop over every CSV in the folder and load each one with LOAD DATA LOCAL INFILE
for f in /var/www/path_to_your_folder/*.csv
do
  mysql -e "use database_name" -e "
  load data local infile '"$f"' into table your_table_name FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (column_name1, @date_time_variable1, column_name3)
  SET some_column_name_which_contains_date = STR_TO_DATE(@date_time_variable1, '%d-%m-%Y');" -u your_mysql_username_here -p --local-infile=1
  echo "Done: '"$f"' at $(date)"
done
This script will prompt for the MySQL password.
I am using this script on EC2 + Ubuntu.
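Since the question asks for parameterized fields, below is a minimal sketch of the same loop with the folder, connection, schema, table and user pulled out into positional parameters. The script name and variable names are my own assumptions, not part of the original answer; adjust them to your environment.
#!/bin/bash
# load_csvs.sh (hypothetical name): load every CSV in a folder into one table
INPUT_FOLDER="$1"   # e.g. /var/www/path_to_your_folder
DB_HOST="$2"        # database connection host
SCHEMA_NAME="$3"    # schema / database name
TABLE_NAME="$4"     # target table
DB_USER="$5"        # MySQL user (the password is prompted for by -p)

for f in "$INPUT_FOLDER"/*.csv
do
  mysql --local-infile=1 -h "$DB_HOST" -u "$DB_USER" -p -e "
  LOAD DATA LOCAL INFILE '$f' INTO TABLE $TABLE_NAME
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"'
  LINES TERMINATED BY '\r\n' IGNORE 1 LINES;" "$SCHEMA_NAME"
  echo "Done: $f at $(date)"
done
Called, for example, as ./load_csvs.sh /var/www/path_to_your_folder localhost my_schema my_table my_user; it will ask for the password once per file unless you store the credentials in ~/.my.cnf.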

Related

Is it possible to run BCP without mentioning the input file name?

BULK INSERT NECCOI_DB.dbo.ALL_Stores
FROM 'C:\TestingforAutomation\AllStores.csv'
WITH
(
FIELDTERMINATOR=',',
ROWTERMINATOR='\n',
FIRSTROW=2
)
My bat file is like this:
bcp NECCOI_DB.dbo.ALL_Stores in C:\TestingforAutomation\AllStores.csv -Uvm -PMadhu#9515 -SBLLT-5CD124JQHQ -c -F2 -t ","
TIMEOUT /T 60
It is working fine without any issues, but this script has to work without mentioning the CSV file name, meaning it should work with any CSV file...
So I have changed my script to this:
bcp NECCOI_DB.dbo.ALL_Stores in C:\TestingforAutomation\*.csv -Uvm -PMadhu#9515 -SBLLT-5CD124JQHQ -c -F2 -t ","
TIMEOUT /T 60
It's giving an error like this:
SQLState = S1000, NativeError = 0 Error = [Microsoft][ODBC Driver 17
for SQL Server]Unable to open BCP host data-file
Can anyone guide me to fix this? Thanks in advance.
There is no way to do this within the BCP command itself.
To bulk copy many files, you'll need to write code that cycles through the list of files you want loaded and builds a new bcp or BULK INSERT statement for each file to be loaded; a sketch follows below.
Neither the BCP utility nor the BULK INSERT T-SQL command can accept wildcards.
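For example, in a Windows batch file the loop could look like the sketch below. It reuses the server, credentials and folder from the question; treat it as an illustration rather than a tested script.
@echo off
rem Run one bcp load per CSV file in the folder
for %%f in (C:\TestingforAutomation\*.csv) do (
    echo Loading %%f ...
    bcp NECCOI_DB.dbo.ALL_Stores in "%%f" -Uvm -PMadhu#9515 -SBLLT-5CD124JQHQ -c -F2 -t ","
)
TIMEOUT /T 60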

Loading data into Netezza using the nzload utility

I am trying to load data into a Netezza database using the "nzload" utility. The control file is as below and it works without any issues.
Is there a way to provide multiple data files as the input in a single control file?
DATAFILE C:\Karthick\data.txt
{
Database test1
TableName test
Delimiter '%'
maxErrors 20
Logfile C:\Karthick\importload.log
Badfile C:\Karthick\inventory.bad
}
$ cat my_control_file
datafile my_file1 {}
datafile my_file2 {}
datafile my_file3 {}
datafile my_file4 {}
# Below, I specify many of the options
# on the command line itself ... so I don't have
# to repeat them in the control file.
$ nzload -db system -t my_table -delim "|" -maxerrors 10 -cf my_control_file
Load session of table 'MY_TABLE' completed successfully
Load session of table 'MY_TABLE' completed successfully
Load session of table 'MY_TABLE' completed successfully
Load session of table 'MY_TABLE' completed successfully
Yes, you can specify multiple data files in a single control file. Those data files can be loaded into the same table or into different tables. See an example at https://www.ibm.com/docs/en/psfa/7.2.1?topic=command-nzload-control-file
The following are two data files, "/tmp/try1.dat" and "/tmp/try2.dat", to be loaded into a table "test" in the "system" database:
[nz@nps ]$ cat /tmp/try1.dat
1
2
[nz@nps ]$ cat /tmp/try2.dat
3
4
The following control file defines two "DATAFILE" blocks, one for each data file.
[nz@nps ]$ cat /tmp/try.cf
DATAFILE /tmp/try1.dat
{
Database system
TableName test
Delimiter '|'
Logfile /tmp/try1.log
Badfile /tmp/try1.bad
}
DATAFILE /tmp/try2.dat
{
Database system
TableName test
Delimiter '|'
Logfile /tmp/try2.log
Badfile /tmp/try2.bad
}
Load the data using the "nzload -cf" option and verify that the data is loaded.
[nz@nps ]$ nzload -cf /tmp/try.cf
Load session of table 'TEST' completed successfully
Load session of table 'TEST' completed successfully
[nz@nps ]$ nzsql -c "select * from test"
A1
----
2
3
4
1
(4 rows)
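If there are many data files, the control file itself can be generated rather than hand-written. A minimal bash sketch, assuming the data files live under /tmp/data and all load into the same "test" table (both assumptions for illustration only):
#!/bin/bash
# Build one DATAFILE block per .dat file, then run nzload once with the generated control file
CF=/tmp/generated.cf
> "$CF"
for f in /tmp/data/*.dat
do
  cat >> "$CF" <<EOF
DATAFILE $f
{
Database system
TableName test
Delimiter '|'
Logfile $f.log
Badfile $f.bad
}
EOF
done
nzload -cf "$CF"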

How can I run a .sql file that has multiple .sql files inside it using SnowSQL?

I want to figure out how to run multiple SQL files in one go. Suppose I have this test.sql file, which references file1.sql, file2.sql, file3.sql and so on, along with some DML/DDL.
use database &{db};
use schema &{sc};
file1.sql;
file2.sql;
file3.sql;
create table snow_test1
(
name varchar
,add1 varchar
,id number
)
comment = 'this is snowsql testing table' ;
desc table snow_test1;
insert into snow_test1
values('prachi', 'testing', 1);
select * from snow_test1;
Here is what I run in PowerShell:
snowsql -c pp_conn -f ...\test.sql -D db=tbc -D sc=testing;
Is there any way to do this? I know it is possible in Oracle, but I want to do this using SnowSQL. Please guide me. Thanks in advance!
You can run multiple files in a single call:
snowsql -c pp_conn -f file1.sql -f file2.sql -f file3.sql -D db=tbc -D sc=testing;
You might need to put the additional DML/DDL in a file.
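In other words, move the inline statements from test.sql (the create table, desc table, insert and select) into their own file, say file4.sql (a name made up here for illustration), and pass it as one more -f flag:
snowsql -c pp_conn -f file1.sql -f file2.sql -f file3.sql -f file4.sql -D db=tbc -D sc=testing;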
I have tried referencing the .sql files with !source inside my test.sql file and it's working:
!source file1.sql;
!source file2.sql;
!source file3.sql;
....
Also, I ran the same command in PowerShell using one .sql file and it is working.

Export an Amazon MySQL database to an Excel sheet

I have an EC2 instance with a MySQL database on it, and there are multiple tables with huge amounts of data that I want to export into an Excel sheet on my local system; somewhere on S3 would also work. How can I achieve this?
Given that you installed your own MySQL instance on an EC2 node, you should have full access to MySQL's abilities. I don't see any reason why you can't just do a SELECT ... INTO OUTFILE here:
SELECT *
FROM yourTable
INTO OUTFILE 'output.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Once you have the CSV file, you may transfer it to a box running Excel, and use the Excel import wizard to bring in the data.
Edit:
Based on your comments below, it might be the case that you need to carefully select an output path and location to which MySQL and your user have permissions to write.
Another way to export CSV files from RDS MySQL, without getting "Access denied for user '<databasename>'@'%' (using password: YES)", is with the following command:
mysql -u username -p --database=dbname --host=rdshostname --port=rdsport --batch -e "select * from yourtable" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > yourlocalfilename.csv
The secret is in this part:
--batch -e "select * from yourtable" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > yourlocalfilename.csv
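Since the question mentions multiple tables, the same pipeline can be wrapped in a small bash loop, producing one CSV per table. A rough sketch; the table list and connection details are placeholders to replace:
#!/bin/bash
# Export each table to its own quoted CSV file; -p prompts for the password on every
# iteration, so consider putting the credentials in ~/.my.cnf instead
TABLES="table1 table2 table3"
for t in $TABLES
do
  mysql -u username -p --database=dbname --host=rdshostname --port=rdsport \
    --batch -e "select * from $t" \
    | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > "$t.csv"
done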

Exporting a SQL table into a CSV file using a Windows batch script

I am trying to create a Windows batch file to export data from a SQL file to a CSV file.
I have a SQL file at %MYHOME%\database\NET-DB.sql which contains data like this:
NET-DB.sql
insert into net_network (id, a_id, alias, address, domain, mask) values('NET_10.10.1.0_10', 1, 'Local Network', '10.10.1.0', '', '255.255.252.0');
What I have tried so far to export the data from the net_network table into a CSV file from my .bat file is this command:
export.bat
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
COPY net_network TO '%MYHOME%\net\CSV-EXPORT_FILE.csv' DELIMITER ',' CSV HEADER;
pause
Since that does not work for me, what should be the correct approach for this implementation? Any help will be much appreciated.
Use SQLCMD
You need to modify the code to make it work in your environment, but here goes.
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
cd "C:\your path\to\sqlcmd"
sqlcmd -S YourDBServer -d DB_NAME -E -Q "select id, a_id, alias, address, domain, mask from net_network" -o "CSV-EXPORT-FILE.csv" -s"," -w 255
Some explanations:
-S The database server to connect to.
-d Name of the database to connect to.
-Q Query to run, can also be insert, delete, update etc.
-o select the output file
-s"," separated by comma
-w column width; this has to be as big as your largest column's character count.
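Tying that back to the folder the question creates, a combined export.bat might look like the sketch below; the server name, database name and column width are assumptions to adapt to your environment:
@echo off
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
rem Export the table to a comma-separated file inside %MYHOME%\net
sqlcmd -S YourDBServer -d DB_NAME -E ^
  -Q "select id, a_id, alias, address, domain, mask from net_network" ^
  -o "%MYHOME%\net\CSV-EXPORT_FILE.csv" -s"," -w 255
pause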