How to add more rows to an existing DB Table - sql

I'm currently updating an existing DB table.
The table has 14924 rows, and I'm trying to insert new data that requires 15000 rows.
When running my Query, I'm getting this error message:
There are fewer columns in the INSERT statement than values specified
in the VALUES clause. The number of values in the VALUES clause must
match the number of columns specified in the INSERT statement.
Is there a way to add the additional 76 rows as needed?
I'm using MSSMS (Microsoft SQL Server Management Studio)
Query I'm running:
Insert INTO [survey].[dbo].[uid_table] (UID)
VALUES ('F32975648JX2','F32975681JX2',..+14998 more)
Should I clear the column first by setting it to NULL?
What I'm trying to do is add all the VALUES to the UID column.
My columns are currently set up as follows:
UID | Email | Name | Title | Company | Address1 | Address2 | DateCreated |
All columns are set to NULL except for UID, which already contains values like the above. I just need to replace the old values with the new ones, but I'm getting the error stated above.

To insert more than one value into a column, you need to write the INSERT statement in this format:
Insert INTO [survey].[dbo].[uid_table] (UID)
VALUES ('F32975648JX2'),
('F32975681JX2'),
..+14998 more)
Also note that the maximum number of rows that can be constructed by inserting rows directly in the VALUES list is 1000, so you have to break the INSERT statement into batches of 1000 rows per INSERT.
To insert more than 1000 rows, use one of the following methods:
Create multiple INSERT statements
Use a derived table
Bulk import the data by using the bcp utility or the BULK INSERT statement (a minimal sketch follows the derived-table example below)
Derived table approach
INSERT INTO [survey].[dbo].[uid_table] (UID)
SELECT 'F32975648JX2'
UNION ALL
SELECT 'F32975681JX2'
UNION ALL
..+14998 more
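If the new UIDs are already in a file, the BULK INSERT route avoids the 1000-row limit entirely. A minimal sketch, assuming the values sit one per line in a hypothetical file C:\data\uids.txt and are loaded through a one-column staging table first:
CREATE TABLE #uid_staging (UID varchar(20));   -- hypothetical staging table and length

BULK INSERT #uid_staging
FROM 'C:\data\uids.txt'                        -- hypothetical path, one UID per line
WITH (ROWTERMINATOR = '\n');

INSERT INTO [survey].[dbo].[uid_table] (UID)
SELECT UID FROM #uid_staging;                  -- copy all staged UIDs across in one statement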

Your problem is in your INSERT statement.
An example is:
INSERT INTO table (col1, col2, col3,...)
VALUES(valCol1, valcol2, valcol3...)
Ensure that the number of columns (col1, col2, col3...) is the same as the number of values in the VALUES clause (valCol1, valcol2, valcol3...): 3 columns and 3 values in this case.
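For example, against the uid_table from the question (the Email and Name values here are only placeholders), three listed columns need exactly three values per row:
INSERT INTO [survey].[dbo].[uid_table] (UID, Email, Name)
VALUES ('F32975648JX2', 'a@example.com', 'Alice'),   -- 3 columns, 3 values
       ('F32975681JX2', 'b@example.com', 'Bob');     -- each row must match the column list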

Related

Could insert statements match data from another table

I am trying to do an insert on a table from another table with matched values and IDs.
Let's say table1 has values for names and IDs, like John and 55.
I am trying to only insert the 55 into table2, since John is already in table2 but is just missing his ID.
I know I can do an UPDATE statement to set the value for John to 55, but my tables have over 3000 values, and it would be hard to do them one at a time.
Is there any way I can write a query to enter a value into the other table as long as the names match?
What I have tried so far:
insert into desired_table (id,version,source_id,description,r_id)
SELECT HI_SEQUENCE.nextval,'0', select min (id)
from table
where name in (select name from table2 where table2_name is not null),
table2_name,
table2.r_id from table2 where name is not null;
The issue with this statement is that it inserts multiple values, but it only inserts them where the min ID is.
Is there any way I can adjust this and have it pull more than one ID?
Use a MERGE statement (https://learn.microsoft.com/en-us/sql/t-sql/statements/merge-transact-sql?view=sql-server-ver15)
MERGE INTO Table1 d
USING Table2 s
ON d.name = s.name
WHEN MATCHED THEN UPDATE
    SET d.age = s.age
WHEN NOT MATCHED THEN INSERT
    (col1, col2)
    VALUES (s.col1, s.col2);
You might want a trigger to automate the above task.
CREATE TRIGGER sample AFTER INSERT ON Table1
FOR EACH ROW
BEGIN
    UPDATE table2 SET table2.age = :NEW.age
    WHERE table2.id = :NEW.id;
END;
Got this working by generating insert statements and running them with INSERT ALL.
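For reference, a minimal sketch of that INSERT ALL form, assuming Oracle (the :NEW and nextval syntax above suggests it); the rows and values here are only illustrative:
INSERT ALL
    INTO desired_table (description, r_id) VALUES ('John', 55)   -- one generated row
    INTO desired_table (description, r_id) VALUES ('Jane', 56)   -- another generated row
SELECT * FROM dual;                                               -- dummy subquery required by INSERT ALL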

insert data and avoid duplication by checking a specific column

I have a local db into which I'm trying to insert multiple rows of data, but I do not want duplicates. I do not have a second db that I'm inserting from; I have an SQL file. The structure of the db I'm inserting into is this:
(db)artists
(table)names-> ID | ArtistName | ArtistURL | Modified
I am trying to do this insertion:
INSERT names (ArtistName, Modified)
VALUES (name1, date),
(name2, date2),
...
(name40, date40)
The question is, how can I insert data and avoid duplication by checking a specific column to this list of data that I want inserted using SQL?
Duplicate what? Duplicate name? Duplicate row? I'll assume no dup ArtistName.
Have UNIQUE(ArtistName) (or PRIMARY KEY) on the table.
Use INSERT IGNORE instead of INSERT.
(No LEFT JOIN, etc)
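A minimal sketch of that approach, assuming MySQL (INSERT IGNORE is MySQL syntax) and the names table from the question; the values are placeholders:
ALTER TABLE names ADD UNIQUE KEY uq_artist (ArtistName);  -- reject duplicate names at the table level

INSERT IGNORE INTO names (ArtistName, Modified)
VALUES ('name1', '2020-01-01'),                            -- hypothetical values
       ('name2', '2020-01-02');                            -- rows whose ArtistName already exists are skipped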
I ended up following the advice of @Hart CO a little bit by inserting all my values into a completely new table. Then I used this SQL statement:
SELECT ArtistName
FROM testing_table
WHERE NOT EXISTS
    (SELECT ArtistName FROM names WHERE
     names.ArtistName = testing_table.ArtistName)
This gave me all my artist names that were in my data and not in the name table.
I then exported to an sql file and adjusted the INSERT a little bit to insert into the names table with the corresponding data.
INSERT IGNORE INTO `names` (ArtistName) VALUES
*all my values from the exported data*
Here (ArtistName) could be any of the columns returned. For example, (ArtistName, ArtistUrl, Modified), as long as the values returned from the export have 3 values per row.
This is probably not the most efficient, but it worked for what I was trying to do.

Insert Multiple Rows into Table from a Table

I have a SQL Server 2008 database. The database has a stored procedure which receives two strings as parameters. One parameter is used to build a temp table which will usually only have 1 or 2 rows but theoretically could have more.
For each row in the temp table, I need to insert a row into a different table that consists of the other parameter and the contents of the temp table. Is there a way to do this without a cursor?
I've tried variations on the following:
Pseudo code:
procedure InsertLinks(@Key varchar(36), @LinkKey varchar(36))
tempLinks Table = getLinks(@LinkKey)
Insert into MyTable (Key, LinksTo) Values (@Key, Select LinksTo From tempLinks)
The VALUES clause is messed up: you have a single value, a comma, and then a table expression. That's not valid.
The following should work just fine:
INSERT INTO MyTable (Key, LinksTo)
SELECT @Key, LinksTo
FROM tempLinks

Insert Command error

I want to write an insert command in SQL 2005.
I have 10 Columns, some of them can be null.
I use this command:
Insert Into TableName Values(x,y)
Since the others can be null, I don't include them in the command.
Because the number of nullable columns differs, I can't supply exact NULL values.
But I got this error: Column name or number of supplied values does not match table definition.
What can I do?
1 - Accept some of the past answers to your questions.
2 - Specify which fields you are inserting. In a 5-column table, you can say:
INSERT INTO Table (col2, col4)
VALUES (col2value, col4value)
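For instance, with a hypothetical five-column table where col1, col3 and col5 allow NULL, listing only the columns you have values for is enough; the omitted nullable columns fall back to NULL (or their defaults):
INSERT INTO TableName (col2, col4)
VALUES ('some value', 42);   -- col1, col3 and col5 are left NULL (or take their defaults)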

large insert in two tables. First table will feed second table with its generated Id

A question about how to program the following query in T-SQL:
Table 1
I insert 400,000 mobile phone numbers into a table with two columns: the number to insert and an identity id.
Table 2
The second table is called SendList. It is a list with 3 columns: an identity id, a list id, and a phonenumberid.
Table 3
It is called ListInfo and contains the PK list id and info about the list.
My question is how I should, using T-SQL:
Insert the large list of phone numbers into table 1, and insert the ids generated by that insert into table 2, in an optimized way. It can't take a long time; that is my problem.
Greatly appreciated if someone could guide me on this one.
Thanks
Sebastian
What version of SQL Server are you using? If you are using 2008, you can use the OUTPUT clause to insert multiple records and output all the identity values to a table variable. Then you can use this to insert into the child tables.
DECLARE @MyTableVar table(MyID int);
INSERT MyTable (field1, field2)
OUTPUT INSERTED.MyID
INTO @MyTableVar
SELECT Field1, Field2 FROM MyOtherTable WHERE field3 = 'test'
-- Use the identity values captured in the table variable to insert into the child table
INSERT MyChildTable (myID, field1, field2)
SELECT MyID, 'test', getdate() FROM @MyTableVar
I've not tried this directly with a bulk insert, but you could always bulk insert into a staging table and then use the process described above. Inserting groups of records is much, much faster than one at a time.
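A rough sketch of that staging-table variant, with hypothetical file path, table and column names; the OUTPUT clause then works exactly as above:
-- 1. Load the raw numbers into a staging table (hypothetical path and layout)
CREATE TABLE PhoneStaging (PhoneNumber varchar(20));

BULK INSERT PhoneStaging
FROM 'C:\data\phonenumbers.txt'
WITH (ROWTERMINATOR = '\n');

-- 2. Insert into the phone number table and capture the generated identity ids
DECLARE @NewIds table (PhoneId int, PhoneNumber varchar(20));

INSERT PhoneNumbers (PhoneNumber)
OUTPUT INSERTED.Id, INSERTED.PhoneNumber INTO @NewIds
SELECT PhoneNumber FROM PhoneStaging;

-- 3. Feed the captured ids into the SendList table
INSERT SendList (ListId, PhoneNumberId)
SELECT 1, PhoneId FROM @NewIds;   -- 1 stands in for the hypothetical list id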