Inserting more than 1000 rows from Excel into SQL Server - sql

I'm new to SQL, but what is the best way to insert more than 1000 rows from an Excel document into my database (SQL Server 2008)?
For example I'm using the below query:
INSERT INTO mytable(companyid, category, sub, catalogueref)
VALUES
('10197', 'cat', 'sub', '123'),
('10197', 'cat2', 'sub2', '124')
This is working fine, but there is a limit of 1000 rows per insert and I have 19,000 records, and I don't really want to write 19 separate insert statements. One more question: the company id is always the same, so is there a better way than writing it 19,000 times?

Just edit the data in Excel or another program to create N insert statements, with a single insert per statement; that way there is no limit on the number of inserts. For example...
INSERT INTO table1 VALUES (6696480,'McMurdo Station',-77.846,166.676,'Antarctica','McMurdo')
INSERT INTO table1 VALUES (3833367,'Ushuaia',-54.8,-68.3,'America','Argentina')
...19,000 statements later...
INSERT INTO table1 VALUES (3838854,'Rio Grande',-53.78769,-67.70946,'America','Argentina')

Microsoft provides an import wizard with SQL Server. I've used it to migrate data from other databases and from spreadsheets. It is pretty robust and easy to use.
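If you would rather stay in T-SQL than use the wizard, a rough sketch of reading the sheet directly is shown below. It assumes the ACE OLE DB provider is installed, ad hoc distributed queries are enabled, and the file path, sheet name, and header column names (C:\data\companies.xlsx, Sheet1) are hypothetical:
-- Sketch only: provider, file path, sheet name, and header columns are assumptions.
INSERT INTO mytable (companyid, category, sub, catalogueref)
SELECT companyid, category, sub, catalogueref
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\data\companies.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]'
);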

There are several options: the Import Wizard, which Erik suggests, or SSIS, which is another good one.
Read here:
Import Excel spreadsheet columns into SQL Server database
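If you can save the sheet as a CSV file first, a minimal BULK INSERT sketch is another route (the file path and delimiter options here are assumptions, and the file must be readable by the SQL Server service):
-- Sketch only: path and options are assumptions; FIRSTROW = 2 skips a header row.
BULK INSERT mytable
FROM 'C:\data\companies.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);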

Ok, it's a late answer but I ran into this very same problem and found a solution that worked twice as fast as #Nur.B's solution for a 7K-row insertion.
Note: whenever possible prefer to use TRANSACTIONS when dealing with large amounts of data.
INSERT INTO mytable(companyid, category, sub, catalogueref)
SELECT '10197', 'cat', 'sub', '123' UNION ALL
SELECT '10197', 'cat2', 'sub2', '124' UNION ALL
-- ... other N-thousand rows
SELECT '10197', 'catN-1', 'subN-1', '12312' UNION ALL
SELECT '10197', 'catN', 'subN', '12313'; -- don't add "UNION ALL" after the last line
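To act on the note about transactions above, a minimal sketch of wrapping the whole batch in one explicit transaction (T-SQL) could look like this:
BEGIN TRANSACTION;

INSERT INTO mytable(companyid, category, sub, catalogueref)
SELECT '10197', 'cat', 'sub', '123' UNION ALL
SELECT '10197', 'cat2', 'sub2', '124';
-- ... remaining rows ...

COMMIT TRANSACTION;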

You should be able to insert using multiple transactions -
BEGIN TRANSACTION
Insert into mytable(companyid,category,sub,catalogueref)
values
('10197','cat','sub','123'),
('10197','cat2','sub2','124')
...998 more rows...
COMMIT TRANSACTION
go
BEGIN TRANSACTION
Insert into mytable(companyid,category,sub,catalogueref)
values
('10197','cat','sub','123'),
('10197','cat2','sub2','124')
...998 more rows...
COMMIT TRANSACTION

INSERT INTO mytable(companyid, category, sub, catalogueref)
SELECT companyid, category, sub, catalogueref FROM (VALUES
('10197', 'cat', 'sub', '123'),
('10197', 'cat2', 'sub2', '124')
-- ...more than 1000 rows...
) AS sub (companyid, category, sub, catalogueref);
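Since the question mentions that companyid is the same for every row, a small variation of the derived-table form selects the constant once instead of repeating it 19,000 times (a sketch, keeping the question's sample values):
INSERT INTO mytable(companyid, category, sub, catalogueref)
SELECT '10197', category, sub, catalogueref
FROM (VALUES
('cat', 'sub', '123'),
('cat2', 'sub2', '124')
-- ...more rows...
) AS v (category, sub, catalogueref);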

Related

Using IDs from table and insert into secondary table

Currently I am trying to write a SQL statement that selects from one table and then uses the primary key to insert into a secondary table. I am having a hard time figuring out how I would do it. This is the select and insert I have right now; the first select will return multiple results. I need to run this nightly, so I need to make it dynamic.
SELECT ParentTypeId FROM Config_OrderParentQueueType
INSERT INTO [dbo].[Config_OrderParentQueueTypeNotes]
([ParentTypeId]
,[NoteDate]
,[NoteText]
,[NoteSubmittedById])
VALUES
(This is the ID I need to insert from the select
,GETDATE()
,'Default Note'
,6)
I have tried to mess with rowcount but the IDs are not always sequential. Appreciate any help in advance on how I would do this.
Use INSERT .. SELECT:
INSERT INTO [dbo].[Config_OrderParentQueueTypeNotes]
([ParentTypeId]
,[NoteDate]
,[NoteText]
,[NoteSubmittedById])
SELECT ParentTypeId, getdate(), 'Default Note', 6
FROM Config_OrderParentQueueType
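Since the question mentions running this nightly, a hedged variation (assuming each ParentTypeId should only ever get one default note, which the question does not state) filters out IDs that already have a note:
INSERT INTO [dbo].[Config_OrderParentQueueTypeNotes]
([ParentTypeId]
,[NoteDate]
,[NoteText]
,[NoteSubmittedById])
SELECT t.ParentTypeId, GETDATE(), 'Default Note', 6
FROM Config_OrderParentQueueType AS t
WHERE NOT EXISTS (SELECT 1
                  FROM [dbo].[Config_OrderParentQueueTypeNotes] AS n
                  WHERE n.ParentTypeId = t.ParentTypeId);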

How can I INSERT data into two tables simultaneously with only one SQL script in DB2?

How would I insert into multiple tables with one SQL script in DB2?
For example, insert a row into T1 DOCK_DOOR and then insert into T2 DOCK_DOOR_LANE multiple times based on the dock_door_sysid from the first table.
My first approach was the following: I was attempting to use a WITH clause with three inserts. On the other hand, doing two inserts on the second table is not an option if this can be automated with one insert.
Thanks for any feedback.
SQL example:
WITH ins AS (
INSERT INTO DBF1.DOCK_DOOR (DOCK_DOOR_SYSID,DOOR_NUMBER,DOOR_NAME,DOCK_SYSID,DOOR_SEQ,ENCRYPTION_CODE,RFID_ENBLD_FLAG,LANES_COUNT,CMNT_TEXT,CREATE_TS,CREATE_USERID,UPDATE_TS,UPDATE_USERID,VER_NUMBER,ACTIVE_FLAG,STATUS_SYSID,DOOR_TYPE_SYSID)
VALUES (nextval for DBF1.DOCK_DOOR_SEQ,'026','DOOR025',61,25,NULL,'N','2',NULL,current timestamp,'SQL_INSERT',current timestamp,'SQL_INSERT',0,NULL,1723,1142)
RETURNING door_number,dock_door_sysid),
ins2 AS (
INSERT INTO SIT.DOCK_DOOR_lane (DOCK_DOOR_LANE_SYSID,DOOR_LANE_ID,DOCK_DOOR_SYSID,LANE_ID,CREATE_TS,CREATE_USERID,UPDATE_TS,UPDATE_USERID,VER_NUMBER)
VALUES (nextval for DBF1.DOCK_DOOR_LANE_seq,door_number||''||'A',dock_door_sysid,'A',current timestamp,'SQL_INSERT',current timestamp,'SQL_INSERT',0)
SELECT door_number,dock_door_sysid FROM DBF1.DOCK_DOOR
RETURNING door_number,dock_door_sysid)
INSERT INTO DBF1.DOCK_DOOR_lane (DOCK_DOOR_LANE_SYSID,DOOR_LANE_ID,DOCK_DOOR_SYSID,LANE_ID,CREATE_TS,CREATE_USERID,UPDATE_TS,UPDATE_USERID,VER_NUMBER)
VALUES (nextval for DBF1.DOCK_DOOR_LANE_seq,door_number||''||'B',dock_door_sysid,'B',current timestamp,'SQL_INSERT',current timestamp,'SQL_INSERT',0)
SELECT door_number,dock_door_sysid FROM DBF1.DOCK_DOOR;
Table 1 = dock_door
Table 2 = Dock_door_lane
You could do it with a trigger on the dock_door table.
However, if you're on a recent version of IBM i, you might be able to make use of a data change table reference.
Your statement would look something like this:
insert into dock_door_lane
select <....>
from final table (insert into dock_door <...>)
I'm not sure it will work, as this article indicates that, at least a couple of years ago, DB2 for i didn't support the secondary insert required.
This old SO question also seems to confirm that at least at v7.1, the double insert isn't supported.
If I get a chance, I'll run a test on a 7.2 system Monday.
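For reference, a more concrete (and untested, per the caveat above) sketch of that data change table reference against the question's tables, with most of the NOT NULL and audit columns omitted for brevity, might look like:
-- Sketch only: columns are trimmed down; the real tables need their remaining required columns supplied.
INSERT INTO DBF1.DOCK_DOOR_LANE (DOCK_DOOR_LANE_SYSID, DOOR_LANE_ID, DOCK_DOOR_SYSID, LANE_ID)
SELECT NEXT VALUE FOR DBF1.DOCK_DOOR_LANE_SEQ,
       d.DOOR_NUMBER || 'A',
       d.DOCK_DOOR_SYSID,
       'A'
FROM FINAL TABLE (
    INSERT INTO DBF1.DOCK_DOOR (DOCK_DOOR_SYSID, DOOR_NUMBER, DOOR_NAME, DOCK_SYSID)
    VALUES (NEXT VALUE FOR DBF1.DOCK_DOOR_SEQ, '026', 'DOOR025', 61)
) AS d;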

Difference between INSERT INTO and INSERT ALL INTO

While I was inserting some records into a table, I found the following:
INSERT INTO T_CANDYBAR_DATA
SELECT CONSUMER_ID,CANDYBAR_NAME,SURVEY_YEAR,GENDER,1 AS STAT_TYPE,OVERALL_RATING
FROM CANDYBAR_CONSUMPTION_DATA
UNION
SELECT CONSUMER_ID,CANDYBAR_NAME,SURVEY_YEAR,GENDER,2 AS STAT_TYPE,NUMBER_BARS_CONSUMED
FROM CANDYBAR_CONSUMPTION_DATA;
79 rows inserted.
INSERT ALL
INTO t_candybar_data VALUES (consumer_id,candybar_name,survey_year,gender,1,overall_rating)
INTO t_candybar_data VALUES (consumer_id,candybar_name,survey_year,gender,2,number_bars_consumed)
SELECT * FROM candybar_consumption_data
86 rows inserted.
I have read somewhere that INSERT ALL INTO automatically unions, so why is this difference showing?
The problem is that your queries are different: your first combines the two SELECTs with UNION, which removes duplicate rows, while your second (the INSERT ALL) inserts every row the SELECT returns, so they naturally insert different numbers of rows (a sketch at the end of this answer illustrates this). As far as what INSERT ALL is versus a straight INSERT:
INSERT can be used for inserting new records into a single table.
INSERT ALL can be used for inserting new records into multiple tables based on a query condition.
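For illustration, a minimal sketch of a conditional multi-table INSERT ALL (the target tables here are hypothetical):
INSERT ALL
    WHEN overall_rating >= 4 THEN
        INTO top_rated_bars (candybar_name, overall_rating) VALUES (candybar_name, overall_rating)
    WHEN overall_rating < 4 THEN
        INTO other_bars (candybar_name, overall_rating) VALUES (candybar_name, overall_rating)
SELECT candybar_name, overall_rating
FROM candybar_consumption_data;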
So your assumption as stated here:
I have read somewhere that INSERT ALL INTO automatically unions, so why
is this difference showing?
is incorrect. INSERT ALL doesn't have anything to do with UNION in any way. That said, you might be mixing it up with UNION ALL, as explained here:
The SQL UNION ALL operator is used to combine the result sets of 2 or
more SELECT statements. It returns all rows from the query (even if
the row exists in more than one of the SELECT statements).
Each SELECT statement within the UNION ALL must have the same number
of fields in the result sets with similar data types.
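To illustrate the UNION vs. UNION ALL point: the 79 vs. 86 difference comes from UNION removing duplicate rows, so rewriting the first statement with UNION ALL, as sketched below, should insert the same 86 rows as the INSERT ALL version:
INSERT INTO T_CANDYBAR_DATA
SELECT CONSUMER_ID,CANDYBAR_NAME,SURVEY_YEAR,GENDER,1 AS STAT_TYPE,OVERALL_RATING
FROM CANDYBAR_CONSUMPTION_DATA
UNION ALL
SELECT CONSUMER_ID,CANDYBAR_NAME,SURVEY_YEAR,GENDER,2 AS STAT_TYPE,NUMBER_BARS_CONSUMED
FROM CANDYBAR_CONSUMPTION_DATA;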

Inserting multiple rows into Oracle

In the discussion about multiple-row inserts into Oracle, two approaches were demonstrated:
First:
insert into pager (PAG_ID,PAG_PARENT,PAG_NAME,PAG_ACTIVE)
select 8000,0,'Multi 8000',1 from dual
union all select 8001,0,'Multi 8001',1 from dual
Second:
INSERT ALL
INTO t (col1, col2, col3) VALUES ('val1_1', 'val1_2', 'val1_3')
INTO t (col1, col2, col3) VALUES ('val2_1', 'val2_2', 'val2_3')
INTO t (col1, col2, col3) VALUES ('val3_1', 'val3_2', 'val3_3')
.
.
.
SELECT 1 FROM DUAL;
Could anyone argue the preference of using one over another?
P.S. I didn't do any research myself (even explanation plan), so any information or opinion would be appreciated.
Thanks.
From a performance point of view, these queries are identical.
UNION ALL won't hurt performance, since Oracle evaluates the UNION'ed query only when it needs it; it doesn't cache the results first.
The SELECT syntax is more flexible, in the sense that you can more easily manipulate the SELECT query if you want to change something.
For instance, this query:
insert into pager (PAG_ID,PAG_PARENT,PAG_NAME,PAG_ACTIVE)
select 8000,0,'Multi 8000',1 from dual
union all select 8001,0,'Multi 8001',1 from dual
can be rewritten as
INSERT
INTO pager (PAG_ID,PAG_PARENT,PAG_NAME,PAG_ACTIVE)
SELECT 7999 + level, 0, 'Multi ' || (7999 + level), 1
FROM dual
CONNECT BY
level <= 2
By replacing 2 with the appropriate number, you can get any number of rows you want.
In case of INSERT ALL, you would have to duplicate the destination table description, which is less readable if you need, say, 40 rows.
The INSERT ALL method has a problem with inserting a larger number of rows into a table.
I recently wanted to insert 1130 rows into a table with a single SQL statement. When I tried to do this with the INSERT ALL method, I got the following error:
ORA-24335 - cannot support more than 1000 columns
When I used INSERT INTO .. UNION ALL .. approach everything went fine.
Btw. I didn't know about the UNION ALL method before I found this discussion :)
I would suspect solution 1 is a bit of a hack that works, and is probably less efficient than the designed alternative, INSERT ALL.
INSERT ALL is really designed for you to insert many rows into more than one table as the result of a select, e.g.:
INSERT ALL
INTO t1 (c1, c2) VALUES (q1, q2)
INTO t2 (x1, x2) VALUES (q1, q3)
SELECT q1, q2, q3 FROM t3
If you want to load thousands of rows and they are not in the database already, I don't think this is the best way to do it. If your data is in a file, you want to look at external tables or SQL*Loader to efficiently insert the rows for you.
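As an illustration of the external-table route, a rough sketch is shown below. It assumes the data has been exported to a CSV file named pager.csv and that an Oracle directory object data_dir pointing at its folder already exists; both names are hypothetical.
-- Sketch only: directory object and file name are assumptions.
CREATE TABLE pager_ext (
    pag_id     NUMBER,
    pag_parent NUMBER,
    pag_name   VARCHAR2(100),
    pag_active NUMBER
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
    )
    LOCATION ('pager.csv')
);

INSERT INTO pager (PAG_ID, PAG_PARENT, PAG_NAME, PAG_ACTIVE)
SELECT pag_id, pag_parent, pag_name, pag_active FROM pager_ext;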
I tried some tests and the fastest solution seems to be:
insert into pager (PAG_ID,PAG_PARENT,PAG_NAME,PAG_ACTIVE)
select 8000,0,'Multi 8000',1 from dual
union all select 8001,0,'Multi 8001',1 from dual
buffering between 300 and 400 rows (I tried with ODBC; this value could depend on its configuration).
The statement utilizing the UNION ALL has theoretically a small performance disadvantage as it has to union the results of all statements before the insert can happen.
The INSERT ALL doesn't have this disadvantage as the final result can already be processed line-by-line.
But practically the optimizer inside Oracle should make the difference negligible and it is up to your preferences which way you choose.
In my opinion, INSERT ALL is the more human-readable of the two, while the UNION ALL variant is the one taking less space when such an insert is generated automatically.
If you have more than 1000 insert statements, put them all in a .sql file, open it in Toad or SQL Developer, and execute it. All records will get inserted.
You should consider an array insert: the SQL stays easy, but it needs some client-side coding to set up the array parameters. This is the way to minimize network traffic if a few hundred inserts need to be done in one batch.

How might I insert an array of data into a database table?

I am developing an attendance management program, used to maintain the absence record of a student. Users of this software will need to enter various dates, updated once a month: for instance, a list of dates on which a student was absent for that particular month would be entered, and my program must then store them in a database, with each date added as a new row in the appropriate table.
I have the dates stored in arrays internally; how might I transfer these into the database? How should I proceed?
You have not mentioned the database system being used, so my reply is general in nature. The usual way to do this is to run multiple insert statements one after another:
INSERT INTO Table1 (FirstColumn, SecondColumn)
VALUES ('a', 'b');
INSERT INTO Table1 (FirstColumn, SecondColumn)
VALUES ('c', 'd');
INSERT INTO Table1 (FirstColumn, SecondColumn)
VALUES ('e', 'f');
GO
The trick is to use the UNION ALL statement:
INSERT INTO Table1 (FirstColumn, SecondColumn)
SELECT 'a', 'b'
UNION ALL
SELECT 'c', 'd'
UNION ALL
SELECT 'e', 'f'
GO
Versions of SQL Server prior to 2008 support only these methods. But SQL Server 2008, and MySQL 3.22 and above, support the row constructor method as well:
INSERT INTO Table1 (FirstColumn, SecondColumn)
VALUES ('a', 'b'),
       ('c', 'd'),
       ('e', 'f')
GO
Now you can use any of the above methods to iterate through your array and add individual attendance rows to the database.
foreach($arrayName as $arrayValue) {
// run your query here!
}
for example:
$myArray = array('apple','orange','grape');
foreach($myArray as $arrayFruit) {
$query = "INSERT INTO `Fruits` (`FruitName`) VALUES ('" . $arrayFruit . "')";
mysql_query($query, $connection);
}
Does that make sense / fit what you were thinking?
Do you want to store the dates seperately so you can juggle with them, query them, etc.?
Or do you just want to store the array as is?
If you want to store the dates separately you may want to create a table with an FK to students, a column for date and a column for the nature of the date, like absence, late, ...
Then you would indeed store the individual dates in that table: by iterating if you must, but with one of Cerbrus' solutions if you can. It is not recommended to have DB queries within loops.
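A minimal sketch of such a table (the table and column names are hypothetical; adjust the types and the FK to your actual students table and DBMS):
-- Sketch only: names and types are assumptions.
CREATE TABLE student_attendance (
    student_id  INT         NOT NULL,   -- FK to the students table
    event_date  DATE        NOT NULL,   -- the date being recorded
    event_type  VARCHAR(20) NOT NULL,   -- 'absence', 'late', ...
    PRIMARY KEY (student_id, event_date, event_type),
    FOREIGN KEY (student_id) REFERENCES students (student_id)
);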
If you just need to store that array somewhere, you can serialize it and store the serialized string in a text or varchar column.
What language and what type of database are you using? Is this a web application? A desktop application? We can't help you without more information.
Depending on your situation, any of the above solutions could work. Or you could even load all of the attendance records at once as a CSV or XML document. Perhaps a little more research on your part will help you ask a more useful question?
Iterate over the array and execute insert SQL for each date.