I am using SQL Server 2008 and I have a table with two columns: one for a row of text and the second for the name of the text file. I am using the command below. It works fine for inserting the rows of text, but is there any way to also insert the file name test.txt with all the rows?
BULK INSERT viewFile FROM 'C:\test.txt' WITH (ROWTERMINATOR ='\n')
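One common workaround (sketched here with an assumed staging table and assumed viewFile column names, so adjust to the real schema) is to bulk insert the lines into a staging table first and then copy them into viewFile with the file name supplied as a literal:

-- Stage the raw lines from the file (staging table name is an assumption)
CREATE TABLE #stageLines (lineText varchar(max));

BULK INSERT #stageLines FROM 'C:\test.txt' WITH (ROWTERMINATOR = '\n');

-- Copy into the real table, adding the file name (column names are assumptions)
INSERT INTO viewFile (lineText, fileName)
SELECT lineText, 'test.txt'
FROM #stageLines;

DROP TABLE #stageLines;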
Related
I have a CSV file which consists of one column. I want to insert (not import) all the elements in that column into a database table. I know that if I wanted to insert a few elements, I could use the statement below to insert them individually.
INSERT INTO table (column_name)
VALUES (element1);
But is there a method that I can insert all the elements at once?
You can just comma-separate the values, like below. I sometimes use Excel to do the formatting if you have a lot of values.
INSERT INTO table
(column_name)
VALUES
(element1), (element2)
I am using SQL Bulk Insert to insert data into a temporary table.
My table has a column XYZ which is defined as varchar NOT NULL. If the XYZ value in the delimited file is empty, I want that row to be written to the error file. Currently BULK INSERT treats it as a zero-length string and inserts it into the table.
The delimited file looks like this:
Col1|XYZ|Col2
abc||abc
abc||abc
abc|abc|abc
I tried using CHECK_CONSTRAINTS in the BULK INSERT statement and created a check constraint on the XYZ column in the table (XYZ <> ''), but rather than writing the offending row to the error file, it causes the entire BULK INSERT to fail.
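The setup described above looks roughly like this (table, constraint, and file names are placeholders):

-- Check constraint that rejects empty strings in XYZ
ALTER TABLE dbo.StagingTable
ADD CONSTRAINT CK_StagingTable_XYZ_NotEmpty CHECK (XYZ <> '');

BULK INSERT dbo.StagingTable
FROM 'C:\data.txt'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2,
    CHECK_CONSTRAINTS,
    ERRORFILE = 'C:\data_errors.txt'
)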
Please help.
I have one table with 3 fields
id_Complex | fileLine | date
The field id_Complex is an ID generated by my program; it is the same for every line of a given file and only changes when another file is processed. fileLine is just a line from the file, and date is the date the line was recorded.
Right now, my program does one insert into the database for each line read from the file.
I want to know whether it is possible to do a bulk insert that only fills one specific column of the table: I would just send the id_Complex to SQL Server, and SQL Server would then do the insert with the id_Complex I sent, the lines of the file, and the date.
How can I do that bulk insert?
Is it possible to do this: a bulk insert where one column has a predefined value?
In your program, you should process the input file and generate a temp file that already contains the correct id_Complex, and then bulk insert that temp file.
After the insert, just delete the temp file.
If I understand what you are asking, you could create a temporary table TempTable and do a bulk insert into it. Then perform an UPDATE from TempTable joining to your permanent table by id_Complex. You can also set the date in this UPDATE statement. Finally, clear out the temporary table.
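A minimal sketch of that staging-table idea, assuming SQL Server and copying the rows across with INSERT ... SELECT rather than an UPDATE (the permanent table name, column types, and file path are assumptions):

-- Stage the raw lines; the file contains only the line text
CREATE TABLE #FileLines (fileLine varchar(max));

BULK INSERT #FileLines FROM 'C:\input.txt' WITH (ROWTERMINATOR = '\n');

-- Copy into the permanent table, supplying the predefined id_Complex and the date
DECLARE @id_Complex int = 12345;  -- value generated by the program; the type is an assumption

INSERT INTO dbo.ComplexFileLines (id_Complex, fileLine, [date])
SELECT @id_Complex, fileLine, GETDATE()
FROM #FileLines;

DROP TABLE #FileLines;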
Alternatively, you could bulk import the file into a temporary table, delete the old permanent table, and rename the temporary table as the permanent table.
I want to bulk insert columns of a csv file to specific columns of a destination table.
Description: the destination table has more columns than my CSV file, so I want the CSV columns to go to the right target columns using BULK INSERT.
Is this possible? If yes, then how do I do it?
I saw the tutorial and code at - http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
and http://www.codeproject.com/Articles/439843/Handling-BULK-Data-insert-from-CSV-to-SQL-Server
BULK INSERT dbo.TableForBulkData
FROM 'C:\BulkDataFile.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
They don't show how to control which columns the data is inserted into.
Yes, you can do this. The easiest way is to create a view that selects from the target table, listing the columns you want the data to go to, in the order they appear in the source file. Then BULK INSERT into your view instead of directly into the table.
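For example, if the CSV supplies only two of the table's columns (Col1 and Col2 are assumed names), the view-based approach looks roughly like this:

-- View exposing only the columns present in the CSV, in the same order as the file
CREATE VIEW dbo.vTableForBulkData
AS
SELECT Col1, Col2
FROM dbo.TableForBulkData;
GO

BULK INSERT dbo.vTableForBulkData
FROM 'C:\BulkDataFile.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);

The columns of dbo.TableForBulkData that are not in the view need to allow NULLs or have defaults (or be identity/computed) for the insert through the view to succeed.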
I have a table with 3 columns, 2 of which are auto-incrementing. I have a text file with various content that I would like to insert into the 3rd column. In that text, values are separated as such:
"value1", "value2", "value3", "etc",
How would I go about inserting all those values into a specific column, while being able to use the above format for my initial content (or something similar that I could produce with a "replace all"), hopefully with a single command in phpMyAdmin?
Let me know if my question is not clear!
Thanks in advance
Use a regular expression to convert each value into a full INSERT query on a separate line:
INSERT INTO mytable (column3) VALUES ('value1')
Something like this should do it:
Match: "\([^"]*\)",
Replace with: INSERT INTO mytable (column3) VALUES ('\1') \n
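Applied to the sample line above, that find-and-replace would produce one statement per value:

INSERT INTO mytable (column3) VALUES ('value1')
INSERT INTO mytable (column3) VALUES ('value2')
INSERT INTO mytable (column3) VALUES ('value3')
INSERT INTO mytable (column3) VALUES ('etc')

To run them all in one go in phpMyAdmin you will need a statement separator, so it may be worth ending the replacement with ('\1'); instead of ('\1').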
First, if at all possible, I would get that text file into the right format, which would be one column with each value on a separate line. Otherwise, you need to first write a function to split the data into some kind of one-column temp table and then insert by joining to the table. If the text file is in a reasonable format instead of a comma-delimited mess, then you can insert it directly in a bulk operation in most databases.
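For the MySQL/phpMyAdmin case above, a one-value-per-line file can be bulk loaded into a single column roughly like this (the file path is a placeholder, and the MySQL server must have permission to read the file):

-- values.txt contains exactly one value per line
LOAD DATA INFILE '/path/to/values.txt'
INTO TABLE mytable
LINES TERMINATED BY '\n'
(column3);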