I have a SQL script I am supposed to run that starts with:
BULK INSERT #BridgeVendors
FROM 'D:\projects\databases\Scripts\Release\6.7.1\BridgeVendors.csv'
WITH ( FIELDTERMINATOR=',', FIRSTROW = 2 )
The first time I ran it, the path name pointed to an incorrect path so it didn't execute properly, but now I can't run it again because I get the error:
There is already an object named '#BridgeVendors' in the database.
How do I UNDO or DELETE this "Object" that was a BULK INSERT??
You just need to drop the table :)
drop table #BridgeVendors
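If the script may be re-run after a failed attempt, a guarded drop at the top (before the CREATE TABLE #BridgeVendors that the script presumably contains) avoids the error in the first place. A minimal sketch, assuming SQL Server, where local temp tables live in tempdb:
-- Drop the temp table only if a previous run left it behind
IF OBJECT_ID('tempdb..#BridgeVendors') IS NOT NULL
    DROP TABLE #BridgeVendors;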
I'm trying to create a temporary table to save some codes, but when I try to insert a code it throws me the following error as if the table did not exist:
can't format message 13:796 -- message file C:\Windows\firebird.msg
not found. Dynamic SQL Error. SQL error code = -204. Table unknown.
TEMPCODES. At line 1, column 13.
These are the lines that I try to run:
create global temporary table TEMPCODES
(
codigo varchar(13)
)
on commit delete rows;
insert into TEMPCODES values('20-04422898-0');
Why can't it find the table if I'm creating it before?
In Firebird, you cannot use a database object in the same transaction that created it. You need to commit before you can use the table.
In other words, you should use:
create global temporary table TEMPCODES
(
codigo varchar(13)
)
on commit delete rows;
commit;
insert into TEMPCODES values('20-04422898-0');
Also, it is important to realise that global temporary tables (GTT) are intended as permanent objects. The idea is to create a GTT once, and then use it whenever you need it. The content of a GTT is only visible to the current transaction (on commit delete rows) or to the current connection (on commit preserve rows). Creating a GTT on the fly is not the normal usage pattern for GTTs.
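For illustration, once the TEMPCODES table above exists and has been committed, any later transaction can use it directly with no further DDL; a short sketch:
-- No create needed in later sessions: the GTT definition is already part of the schema
insert into TEMPCODES values ('20-04422898-0');
select * from TEMPCODES;
commit; -- with ON COMMIT DELETE ROWS the table is empty again after this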
I have a rather simple DB with a column called File, and I need to remove the first 7 characters of each row, and replace with a new string. I thought I had the code sorted, but I am getting error "SQLite3 Error 19 - UNIQUE constraint failed: MGOFile.File."
My table name is MGOFile, and the column is File. This is a simple select statement on the first few rows; the left column is the raw data, and the right is what I need the resulting rows to look like...
I query my table using this:
SELECT
    File,
    'T:\' || substr(File, 8, 2000) as File
FROM
    MGOFile
WHERE
    File like 'M:\_TV%';
I then tried updating using this:
UPDATE MGOFile
SET File = 'T:\' || substr(File, 8, 2000)
WHERE File like 'M:\_TV%';
But here is where my error comes in: this fails with the UNIQUE constraint error quoted above.
I am sure I am doing something simple wrong. I have done plenty of Googling, but all the responses are over my head; this is the most advanced SQL I have tried to do!
Any ideas on how I can update these strings with some simple SQLite?
Since checking for duplicates doesn't appear to detect the issue, capturing the values at the time the error occurs may assist. Do you have triggers by any chance? These will sometimes propagate an error that is then reported as belonging to the table that fired the trigger.
As such, consider adding a table to log such data, along with a BEFORE UPDATE trigger to actually record the information at run time. To stop the logged information being undone by a rollback, OR FAIL needs to be used on the UPDATE.
Important: as the statement will not be rolled back, any updates made before the failure will remain applied. It is suggested that the above is used on a test database.
-- The code
DROP TABLE IF EXISTS lastupdated;
-- Create the logging table
CREATE TABLE IF NOT EXISTS lastupdated (counter, lastfile_before, lastfile_after, id_of_the_row);
-- Initialise it so it's plain to see if nothing has been done
INSERT INTO lastupdated VALUES(0,'nothing','nothing',0);
-- Add the Trigger to record the debugging information BEFORE the update
CREATE TRIGGER IF NOT EXISTS monitorupdateprogress
BEFORE UPDATE ON MGOFile
BEGIN
UPDATE lastupdated SET counter = counter +1, lastfile_before = old.File, lastfile_after = new.File, id_of_the_row = old.rowid;
END
;
UPDATE OR FAIL MGOFile -- OR FAIL will halt but NOT ROLLBACK
SET File = 'T:\' || substr(File, 8, 2000)
WHERE File like 'M:\_TV%';
SELECT * FROM lastupdated; -- will not run if there is a fail but should be run after the fail
Assuming the update fails, this would record:
the number of the failing update in the counter column;
the value in the File column before the change in the lastfile_before column;
the value that the File column would have been updated to in the lastfile_after column;
the rowid of the failing row in the MGOFile table (this assumes that MGOFile is not a table defined using WITHOUT ROWID).
If the table was defined WITHOUT ROWID, you could change the last assignment to , id_of_the_row = 0; the value would then be meaningless anyway.
Testing/Results: the version of the above that was used for testing is :-
-- Solely for testing the code below
DROP TABLE IF EXISTS MGOFile;
CREATE TABLE IF NOT EXISTS MGOFile (File TEXT PRIMARY KEY);
-- Some testing data
INSERT INTO MGOFile VALUES
('M:\_TV/9-1-1.so2e09.web.x264-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x265-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x277-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x278-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x279-tbs[eztv].mkv'),
('M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x280-tbs[eztv].mkv')
;
SELECT substr(File,170,8) FROM MGOFile GROUP BY Substr(File,8,170) HAVING count() > 1;
-- The code
DROP TABLE IF EXISTS lastupdated;
-- Create the logging table
CREATE TABLE IF NOT EXISTS lastupdated (counter, lastfile_before, lastfile_after, id_of_the_row);
-- Initialise it so it's plain to see if nothing has been done
INSERT INTO lastupdated VALUES(0,'nothing','nothing',0);
-- Add the Trigger to record the debugging information BEFORE the update
CREATE TRIGGER IF NOT EXISTS monitorupdateprogress
BEFORE UPDATE ON MGOFile
BEGIN
UPDATE lastupdated SET counter = counter +1, lastfile_before = old.File, lastfile_after = new.File, id_of_the_row = old.rowid;
END
;
SELECT * FROM MGOFile;
UPDATE OR FAIL MGOFile -- OR FAIL will halt but NOT ROLLBACK
SET File = 'T:\' || substr(File, 8, 170) -- <<<<<<<<<<<<<<<<<<<< truncate reduced to force UNIQUE constraint
WHERE File like 'M:\_TV%';
SELECT * FROM lastupdated; -- will not run if there is a fail
When the above is run, the message is :-
UPDATE OR FAIL MGOFile -- OR FAIL will halt but NOT ROLLBACK
SET File = 'T:\' || substr(File, 8, 170) -- <<<<<<<<<<<<<<<<<<<< truncate reduced to force UNIQUE constraint
WHERE File like 'M:\_TV%'
> UNIQUE constraint failed: MGOFile.File
> Time: 0.094s
Running SELECT * FROM lastupdated; returns :-
counter = 6
lastfile_before = M:\_TV/9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x278-tbs[eztv].mkv
lastfile_after = T:\9-1-1.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x266-tbs[eztv].mkv.so2e09.web.x27
id_of_the_row = 6
In the above contrived example the issue can easily be determined (albeit that the duplicate search also found the same issue): the error is on the 6th row, which contains mkv.so2e09.web.x278-tbs[eztv] but was truncated by the update to .mkv.so2e09.web.x27, making it a duplicate of the 5th row, which has .mkv.so2e09.web.x277-tbs[eztv] and was also truncated to .mkv.so2e09.web.x27.
P.S. Have you tried using just
UPDATE MGOFile
SET File = 'T:\' || substr(File, 8)
WHERE File like 'M:\_TV%';
i.e. removing the truncation.
The error seems quite clear to me. You are changing the file name to a name that is already in the table.
You can identify the duplicates by running:
SELECT f.*
FROM MGOFile f
WHERE EXISTS (SELECT 1
              FROM MGOFile f2
              WHERE f2.File = 'T:\' || substr(f.File, 8, 2000)
             ) AND
      f.File LIKE 'M:\_TV%';
I don't know what you want to do about the duplicate.
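If simply skipping the colliding rows is acceptable, one option is SQLite's UPDATE OR IGNORE, which leaves any row whose new value would violate the constraint untouched and carries on with the rest. A sketch only; whether the duplicates should instead be merged or deleted first is a separate decision:
-- Skip any row whose renamed value would collide with an existing File value
UPDATE OR IGNORE MGOFile
SET File = 'T:\' || substr(File, 8, 2000)
WHERE File LIKE 'M:\_TV%';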
I have a local database myDB and a server database serverDB that is linked to myDB as [serverDB].serverDB.dbo. I want to upload a 50,000-row table from a .csv file on my computer to the serverDB. I tried this:
bulk insert #temp from 'filename'
insert into [serverDB].serverDB.dbo.tablename select * from #temp
and it takes ages. I found out that the INSERT INTO creates a connection for each row, so it looks like it's not an option in this case. Then I tried
bulk insert [serverDB].serverDB.dbo.tablename from 'filename'
and I get the error Invalid object name 'tablename' even though this table exists in the [serverDB].serverDB database. Does anyone know how I can make SQL "see" the table [serverDB].serverDB.dbo.tablename?
I am using the following code to Bulk insert a CSV file:
BULK
INSERT CustomSelection
FROM 'c:\asd\a1.csv'
WITH
(
FIRSTROW =2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRE_TRIGGERS
)
GO
I have the FIRE_TRIGGERS option set, but the trigger is still not executing.
The trigger works for sure, because if I manually insert into the table it executes. Any help to solve this?
During a bulk-import operation, your trigger will be fired only once, because the bulk import is considered a single statement that affects multiple rows of data.
Your trigger should be able to handle a set of rows instead of a single row. This may be why your manual insert test works fine while your bulk import does not.
Section C of this MSDN article shows you how to create an INSERT trigger that handles multiple rows of data: http://msdn.microsoft.com/en-us/library/ms190752.aspx
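For illustration, a set-based trigger works against the whole inserted pseudo-table rather than assuming exactly one row. A minimal sketch; the AuditLog table, its columns, and the Id column on CustomSelection are made up for the example:
-- Hypothetical set-based AFTER INSERT trigger: handles one or many inserted rows
CREATE TRIGGER trg_CustomSelection_Insert
ON CustomSelection
AFTER INSERT
AS
BEGIN
    INSERT INTO AuditLog (SourceId, LoggedAt)
    SELECT i.Id, GETDATE()      -- one logged row per inserted row
    FROM inserted AS i;
END;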
Hope it helps.
I need to read the CSV file information row by row: if the customer in the file exists in the Customer table, insert the row into the detail table; otherwise insert it into the error table. So I can't use a plain bulk insert.
How can I read the records from the CSV file one by one? How do I give the path?
A straight bulk insert is not going to work here.
One option is to use an INSTEAD OF INSERT trigger to selectively put the row in the correct table, and then use your normal BULK INSERT with the option FIRE_TRIGGERS.
Something close to:
CREATE TRIGGER bop ON MyTable INSTEAD OF INSERT AS
BEGIN
INSERT INTO MyTable
SELECT inserted.id,inserted.name,inserted.otherfield FROM inserted
WHERE inserted.id IN (SELECT id FROM customerTable);
INSERT INTO ErrorTable
SELECT inserted.id,inserted.name,inserted.otherfield FROM inserted
WHERE inserted.id NOT IN (SELECT id FROM customerTable);
END;
BULK INSERT MyTable FROM 'c:\temp\test.sql'
WITH (FIELDTERMINATOR=',', FIRE_TRIGGERS);
DROP TRIGGER bop;
If you're importing files regularly, you can create a table (ImportTable) with the same schema, set the trigger on that and do the imports to MyTable through bulk import to ImportTable. That way you can keep the trigger and as long as you're importing to ImportTable, you don't need to do any special setup/procedure for each import.
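A sketch of that setup, carrying over MyTable, ErrorTable and customerTable from the example above; the ImportTable column types and the file path are assumptions:
-- Permanent staging table with the same shape as MyTable
CREATE TABLE ImportTable (id INT, name VARCHAR(100), otherfield VARCHAR(100));
GO
-- The routing trigger now lives on the staging table and stays there
CREATE TRIGGER bop_import ON ImportTable INSTEAD OF INSERT AS
BEGIN
    INSERT INTO MyTable
    SELECT inserted.id, inserted.name, inserted.otherfield FROM inserted
    WHERE inserted.id IN (SELECT id FROM customerTable);
    INSERT INTO ErrorTable
    SELECT inserted.id, inserted.name, inserted.otherfield FROM inserted
    WHERE inserted.id NOT IN (SELECT id FROM customerTable);
END;
GO
-- Every import is then just a bulk insert into the staging table
BULK INSERT ImportTable FROM 'c:\temp\test.csv'
WITH (FIELDTERMINATOR=',', FIRE_TRIGGERS);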
CREATE TABLE #ImportData
(
CVECount varchar(MAX),
ContentVulnCVE varchar(MAX),
ContentVulnCheckName varchar(MAX)
)
BULK INSERT #ImportData
FROM 'D:\test.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
select * from #ImportData
-- Here you can write your script to read the data row by row
DROP TABLE #ImportData
Use bulk insert to load into a staging table and then process it line by line.
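A sketch of that line-by-line step, reusing the #ImportData staging table from the code above; the Customer match column (CustomerCode) and the DetailTable/ErrorTable targets are assumptions:
-- Hypothetical row-by-row processing of the staged rows
DECLARE @cve VARCHAR(MAX), @vulnCve VARCHAR(MAX), @checkName VARCHAR(MAX);

DECLARE staged_cursor CURSOR FOR
    SELECT CVECount, ContentVulnCVE, ContentVulnCheckName FROM #ImportData;

OPEN staged_cursor;
FETCH NEXT FROM staged_cursor INTO @cve, @vulnCve, @checkName;

WHILE @@FETCH_STATUS = 0
BEGIN
    IF EXISTS (SELECT 1 FROM Customer WHERE CustomerCode = @cve) -- assumed match column
        INSERT INTO DetailTable (CVECount, ContentVulnCVE, ContentVulnCheckName)
        VALUES (@cve, @vulnCve, @checkName);
    ELSE
        INSERT INTO ErrorTable (CVECount, ContentVulnCVE, ContentVulnCheckName)
        VALUES (@cve, @vulnCve, @checkName);

    FETCH NEXT FROM staged_cursor INTO @cve, @vulnCve, @checkName;
END;

CLOSE staged_cursor;
DEALLOCATE staged_cursor;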