Insert data into a table with values imported from CSV - SQL

I have a list of data in an Excel/CSV file. How can I import it and insert it into an already created table? May I know the SQL query for importing the Excel data and inserting it into the table? Any help would be appreciated.
Screenshots show my Excel list saved as .CSV, the table description, and the new table with 0 records (13 columns, the same as the Excel list).

DB2 has the IMPORT and LOAD commands for getting external data into tables. In your case I would recommend IMPORT.
This would be the general statement:
import from yourfile of del insert into mytable
The documentation describes so-called "file type modifiers" that can be used to tell DB2 about the format of the data, the code page, the separators between columns, and more.
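For example, if the delimited file is pipe-separated, UTF-8 encoded and has a header row, the call could look roughly like this (the path, table name and chosen modifiers are placeholders; adjust them to your data):

import from /tmp/mylist.csv of del
    modified by coldel| codepage=1208
    skipcount 1
    messages import.msg
    insert into mytable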

Related

How to import daily csv data into table with generated columns postgres

I'm new to PostgreSQL and am looking for some guidance and best practice.
I have created a table by importing data from a csv file. I then altered the table by creating multiple generated columns like this:
ALTER TABLE master
ADD office VARCHAR(50)
GENERATED ALWAYS AS (CASE WHEN LEFT(location,4)='Chic' THEN 'CHI'
ELSE LEFT(location,strpos(location,'_')-1) END) STORED;
But when I try to import new data into the table I get the following error:
ERROR: column "office" is a generated column
DETAIL: Generated columns cannot be used in COPY.
My goal is to be able to import new data each day to the table and have the generated columns automatically populate in order to transform the data as I would like. How can I do so?
CREATE TEMP TABLE master (location VARCHAR);
ALTER TABLE master
ADD office VARCHAR
GENERATED ALWAYS AS (
CASE
WHEN LEFT(location, 4) = 'Chic' THEN 'CHI'
ELSE LEFT(location, strpos(location, '_') - 1)
END
) STORED;
--INSERT INTO master (location) VALUES ('Chicago');
--INSERT INTO master (location) VALUES ('New_York');
COPY master (location) FROM $$d:\cities.csv$$ CSV;
SELECT * FROM master;
Is this the structure and the behaviour you are expecting? If not, please provide more details regarding your table structure, your importable data and your importing commands.
Also, when you try to import the CSV file, the columns may not be mapped properly, or the delimiter may not be set correctly. Try to specify each column in the exact order in which it appears in your CSV file.
https://www.postgresql.org/docs/12/sql-copy.html
Note: d:\cities.csv contains:
Chicago
New_York
EDIT:
If the column positions are mixed up between the table and the CSV, the following approach may come in handy:
1. create temporary table tmp (csv_column_1 <data_type>, csv_column_2 <data_type>, ...); (including ALL csv columns)
2. copy tmp from '/path/to/file.csv';
3. insert into master (location, other_info, ...) select csv_column_3 as location, csv_column_7 as other_info, ... from tmp;
Importing data through an intermediate table may slow things down a little, but it gives you a lot of flexibility; a concrete sketch of the pattern follows.
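For instance, assuming a CSV whose columns are (id, name, location) while the target table only needs location (the column names and path here are purely illustrative):

-- staging table mirrors the CSV layout exactly
CREATE TEMP TABLE tmp (
    id       INTEGER,
    name     VARCHAR,
    location VARCHAR
);

-- load every CSV column into the staging table
COPY tmp (id, name, location) FROM '/path/to/file.csv' CSV;

-- copy over only the columns the target table needs;
-- the generated office column on master is computed automatically
INSERT INTO master (location)
SELECT location
FROM tmp;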
I was getting the same error when importing into PG from a CSV. I found that even though my column was generated, I still had to have it in the imported data, just left empty. The import worked fine once the column name was in the file and mapped to my DB column name.

import two columns from a text file into a sql table

I have a table in SQL Server with some columns, and a text file. I need to import two columns from the text file into the SQL table (the two target columns already exist in the table; the remaining columns should not be inserted). How can I do it?
Use the SQL Server Import Wizard and just ignore the columns in the mapping that are not required.
See the link below.
SQL Server Management Studio (SSMS) provides the Import Wizard task which you can use to copy data from one data source to another. You can choose from a variety of source and destination data source types, select tables to copy or specify your own query to extract data, and save your work as an SSIS package. In this section we will go through the Import Wizard and import data from an Excel spreadsheet into a table in a SQL Server database.
https://www.mssqltips.com/sqlservertutorial/203/simple-way-to-import-data-into-sql-server/
FOR CSV
The data in the CSV file (testcsv.txt):
Name,Class
Prabhat,4
Prabhat1,5
Prabhat2,6
THE QUERY
CREATE TABLE CSVTest (Name varchar(100), Class varchar(10))
GO

BULK INSERT CSVTest
FROM 'C:\New folder (2)\testcsv.txt'
WITH
(
    FIRSTROW = 2,              -- skip the Name,Class header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO

--Check the content of the table.
SELECT *
FROM CSVTest
GO

--Drop the table to clean up database.
DROP TABLE CSVTest
GO
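If the target table has more columns than the two you want to load and you prefer plain T-SQL over the wizard, one option, sketched here with made-up file, column and table names, is to bulk insert the whole file into a staging table and then copy just the two columns across:

-- staging table mirrors the text file layout
CREATE TABLE StagingFile (ColA varchar(100), ColB varchar(100), ColC varchar(100))
GO

BULK INSERT StagingFile
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO

-- move only the two columns the real table needs
INSERT INTO TargetTable (Column1, Column2)
SELECT ColA, ColB
FROM StagingFile
GO

DROP TABLE StagingFile
GO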

Sql bulk insert calculate values at insert

So data can be imported into SQL Server from .csv files.
Import CSV file into SQL Server
I'm trying to use this to import test data into a database, and I want the dates to be up to date. Currently we use .sql files with getdate(), so after inserting, the dates are all newly generated. But when inserting from a .csv file with bulk insert, the literal string 'getdate()' ends up in the table instead. The dates are only an example; I need different rows to be calculated differently, so one date might get 5 added to it, another 10.
Although BULK INSERT does not let you specify function calls, you can work around the problem by changing your table definition: add a default constraint to your date column and do not insert anything into it through BULK INSERT. This ensures that SQL Server fills the column by calling GETDATE():
ALTER TABLE MyTable ADD CONSTRAINT
DF_MyTable_MyDateColumn_GetDate DEFAULT GETDATE() FOR MyDateColumn
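If different rows need different adjustments (5 days here, 10 there), one option, sketched with assumed column and table names, is to load a per-row offset from the file into a staging table and compute the final date while copying the rows across:

-- staging table matches the csv layout: a value plus a per-row day offset
CREATE TABLE StagingData (SomeValue varchar(100), OffsetDays int)

BULK INSERT StagingData
FROM 'C:\data\testdata.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- compute each row's date while copying into the real table
INSERT INTO MyTable (SomeValue, MyDateColumn)
SELECT SomeValue, DATEADD(DAY, OffsetDays, GETDATE())
FROM StagingData

DROP TABLE StagingData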

Handling Empty Strings in SQL Bulk Insert

I am using SQL Bulk Insert to insert data into a temporary table.
My table has a column XYZ defined as varchar NOT NULL, and I want rows where the XYZ value in the delimited file is empty to be written to the error file. Currently SQL bulk insert treats an empty value as a zero-length string and inserts it into the table.
The delimited file looks like this:
Col1|XYZ|Col2
abc||abc
abc||abc
abc|abc|abc
I tried using CHECK_CONSTRAINTS in the bulk insert and created a check constraint on the XYZ column (XYZ <> ''), but rather than writing the offending row to the error file, this causes the entire bulk insert to fail.
Please help.
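One workaround, sketched here with assumed table, column and file names, is to bulk insert into a staging table that allows empty strings and then split the rows afterwards:

-- staging table accepts the data exactly as it appears in the file
CREATE TABLE StagingXYZ (Col1 varchar(100), XYZ varchar(100), Col2 varchar(100))

BULK INSERT StagingXYZ
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

-- rows with a real XYZ value go to the target table
INSERT INTO TargetTable (Col1, XYZ, Col2)
SELECT Col1, XYZ, Col2
FROM StagingXYZ
WHERE XYZ <> ''

-- rows with an empty XYZ go to an error table instead of an error file
INSERT INTO ErrorRows (Col1, XYZ, Col2)
SELECT Col1, XYZ, Col2
FROM StagingXYZ
WHERE XYZ = ''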

sql dump of data based on selection criteria

When extracting data from a table (schema and data), I can right-click the database and go to Tasks -> Generate Scripts, and it gives me all the data from the table, including the CREATE script, which is good.
This, however, gives me all the data in the table. Can this be changed so that it gives me only some of the data, e.g. only rows after a certain dtmTimeStamp?
Thanks,
I would recommend extracting your data into a separate table using a query and then using Generate Scripts on that table. Alternatively, you can extract the data separately into a flat file using the Export Data wizard (include your column headers and use comma separators with double-quote field delimiters).
To make a copy of your table:
SELECT Col1, Col2
INTO CloneTable
FROM MyTable
WHERE Col3 = @Condition
(Thanks to @MarkD for adding that)