BULK INSERT with Excel from a VARBINARY(MAX) field - sql

BULK INSERT Communication.Message
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\temp\Messages201101.XLS', [messages$])
How do I take the above and instead read Message.XLS from a VARBINARY(MAX) field named FileBytes in a table named Attachments.FileContents? I already know how to stage the file in the table by various methods -- I just do not know how to run a BULK INSERT from a VARBINARY(MAX) field.

The Jet driver can't open a VARBINARY that contains the bytes of your file. The documentation on this MSDN page doesn't talk about opening/mounting anything except files. You would have to take the bytes out of FileBytes, write them to a file, and then use that file in your OPENROWSET statement.
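One hedged sketch of that file round-trip, assuming xp_cmdshell is enabled and a hypothetical Id key on Attachments.FileContents (note that bcp's native mode may prepend length bytes, so a format file might be needed for a byte-exact copy):

```sql
-- Hypothetical sketch: export the varbinary to disk, then bulk-load the file.
DECLARE @cmd varchar(4000) =
    'bcp "SELECT FileBytes FROM Attachments.FileContents WHERE Id = 1" ' +
    'queryout "C:\temp\Messages201101.XLS" -T -N -S ' + @@SERVERNAME;
EXEC master..xp_cmdshell @cmd;    -- writes the file on the server

INSERT INTO Communication.Message
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\temp\Messages201101.XLS', [messages$]);
```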

Related

Save output into Varbinary(max) format

Using this SQL statement, I pull a locally stored Excel sheet into SQL Server.
SELECT *
FROM OPENROWSET( 'Microsoft.ACE.OLEDB.12.0','Excel 12.0;Database=C:\SampleTestFiles\outputFile.xlsx;HDR=YES', [Sheet1$])
And I get the complete Excel sheet contents, with multiple rows, in the output console.
Now, I want to save the Excel file into a database column of datatype varbinary(max).
Please can anyone let me know how to do that?
Use application code like C# to read the file and then insert the binary data into your varbinary column in the database.
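Alternatively, a pure T-SQL option is OPENROWSET in BULK mode with SINGLE_BLOB, which reads the whole file into one varbinary(max) value (the table and column names below are hypothetical):

```sql
-- Read the entire .xlsx as a single blob and store it in a varbinary(max) column.
-- dbo.ExcelFiles / FileBytes are hypothetical names.
INSERT INTO dbo.ExcelFiles (FileBytes)
SELECT BulkColumn
FROM OPENROWSET(BULK N'C:\SampleTestFiles\outputFile.xlsx', SINGLE_BLOB) AS f;
```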

BULK INSERT from VARBINARY containing CSV file

I have a CSV file that needs to be BULK INSERTed in my database.
The actual scheme is:
Client generates file01.csv
Client moves file01.csv into the shared folder \\SERVERNAME\Sharing, that points to C:\Data in the Server
Client tells database the file is called file01.csv
Server BULK INSERTs C:\Data\file01.csv into the final table
Server removes the file01.csv from its queue
(It'll be deleted later)
The Windows shared folders are a bit buggy and unstable, so I want to make it a bit different:
Client generates file01.csv
Client inserts file01.csv in VARBINARY(MAX) column
Server simulates the CSV from the VARBINARY and BULK INSERTs it into the final table
(without generating any file in the server side)
The only way I found to make the second option happen is:
Server generates temp.csv from the VARBINARY data
Server BULK INSERTs temp.csv into the final table
(It'll be deleted later)
Is there a way to use a VARBINARY variable instead of a file in the BULK INSERT?
Or if it isn't possible, is there a better way to do this?
(Searched Google for an answer and found only how to read a VARBINARY value from a CSV file, so my question may be a duplicate)
One way you can do what you describe is to create an SSIS package (or console app for that matter) with a script task that reads the Varbinary column into a single-column DataTable, parses into a "final-table-formatted" data table, and then does the bulk insert. The whole process would be in-memory.
You could insert the varbinary(max) data into a FileTable. SQL Server would actually write the data to a local file, which you could then process with BULK INSERT.
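A hedged sketch of that FileTable route (the FileTable, source table, and filter names are hypothetical, and FILESTREAM/FileTable must be enabled on the instance):

```sql
-- Insert the CSV bytes into a FileTable; SQL Server exposes them as a real file.
INSERT INTO dbo.CsvDropZone (name, file_stream)   -- dbo.CsvDropZone is a FileTable
SELECT N'file01.csv', FileBytes
FROM dbo.Uploads
WHERE Id = 1;

-- Build the file's UNC path and BULK INSERT it via dynamic SQL
-- (BULK INSERT requires a literal path).
DECLARE @path nvarchar(400) = FileTableRootPath(N'dbo.CsvDropZone') + N'\file01.csv';
DECLARE @sql nvarchar(max) =
    N'BULK INSERT dbo.FinalTable FROM ''' + @path +
    N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC sp_executesql @sql;
```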

Getting error while bulk insertion

I am getting an error when I try to do a bulk insert:
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.xlsx'
WITH (FIRSTROW = 6, FIELDTERMINATOR ='\t', ROWTERMINATOR ='\\n' )
The error that I am getting is:
Bulk load data conversion error (truncation) for row 6, column 2 (Oracle Company Code).
The column in Excel has the value 470 and the database column is varchar(10).
So what could be the reason for the error?
The Issue
BULK INSERT may not work with .xlsx files; try converting the .xlsx file to a .csv file first if you want to use BULK INSERT.
1st Solution - using OPENROWSET
Try using OPENROWSET with Microsoft.ACE.OLEDB.12.0 provider:
Insert into <rawdatatable>
select * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=D:\SSIS\FileToLoad.xlsx;HDR=YES',
'SELECT * FROM [Sheet1$]')
OR
SELECT * INTO Data_dq
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0; Database=D:\Desktop\Data.xlsx', [Sheet1$]);
2nd Solution - using OPENDATASOURCE
SELECT * INTO Data_dq
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
'Data Source=D:\Desktop\Data.xlsx;Extended Properties=Excel 12.0')...[Sheet1$];
References
Import/Export Excel (.Xlsx) or (.Xls) File into SQL Server
Import data from Excel to SQL Server or Azure SQL Database
How to Bulk Insert from XLSX file extension?
Replace "\n" with "0x0a" as ROWTERMINATOR and try again.
Or also
ROWTERMINATOR = '''+cast (0x0000 as char(1))+'''
Let me know if it works.
Check also this.
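For example, with the hex row terminator (this assumes the workbook has first been re-saved as a tab-delimited text file, since BULK INSERT can't parse .xlsx directly; the .txt filename is hypothetical):

```sql
-- Re-saved workbook as tab-delimited text before loading.
BULK INSERT #tbl_InterCompanyUploadDetail_Staging
FROM '\\SVRP03546008461D\QA\UploadTemplatewithvalidation.txt'
WITH (FIRSTROW = 6, FIELDTERMINATOR = '\t', ROWTERMINATOR = '0x0a');
```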
I doubt that BULK INSERT supports XLSX files. If XLSX were supported, FIELDTERMINATOR and ROWTERMINATOR would not be needed.
XLSX is a zip file, so I suspect (but am not sure) that XLSX is not supported and you are getting the truncation error because BULK INSERT is reading it as a plain text file, pulling in long runs of text up to the next FIELDTERMINATOR.
To confirm, try increasing the length of the column to a few thousand characters and run BULK INSERT again; if you get garbage characters, it is reading the file as plain text. The garbage should look the same as when you open the same .xlsx file in Notepad or Notepad++.
You can't bulk load XLSX into SQL Server. You CAN convert the XLSX to a tab delimited text file and bulk load.
If this is a one-off operation I would recommend converting to text first (but beware how Excel exports certain types like dates and large numbers). Or you can use the Import/Export wizard (https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql)
If this is a process you need to repeat I would create an SSIS script.

Wildcard character in field delimiter - reading .csv file

A .csv file gets dumped nightly onto my FTP server by an external company. (Thus I have no control over its format, nor will they change it as it's used by several other companies.)
My mission is to create a job that runs to extract the info from the file (and then delete it) and insert the extracted data into a SQL table.
This is the format of the info contained within the .csv file:
[Message]Message 1 contains, a comma,[Cell]27747642512,[Time]3:06:10 PM,[Ref]144721721
[Message]Message 2 contains,, 2 commas,[Cell]27747642572,[Time]3:06:10 PM,[Ref]144721722
[Message],[Cell]27747642572,[Time]3:06:10 PM,[Ref]144721723
I have a SQL Server 2012 table with the following columns:
Message varchar(800)
Cell varchar(15)
Time varchar(10)
Ref varchar(50)
I would like to use something like the SQL bulk insert (see below) to read from the .csv file and insert into the SQL table above.
BULK INSERT sms_reply
FROM 'C:\feedback.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
The delimiter in the .csv file is not common. I was wondering if there was a way for me to use a wildcard character when specifying the delimiter. e.g. '[%]'?
This would then ignore whatever was between the square brackets and extract the info in between.
I cannot simply use the commas to delimit the fields as there could be commas in the fields themselves as illustrated in the .csv example above.
Or if anyone has any other suggestions, I'd really appreciate it.
TIA.
My approach would be:
Bulk load the .csv file into a staging table, with every data column declared as varchar.
Do a SQL update to get rid of the brackets and their contents.
Do whatever other processing is required.
Insert the data from your staging table into your production table.
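A sketch of that staging approach in T-SQL (the staging table name is hypothetical, and the parsing assumes the message text never contains the literal tag ',[Cell]'):

```sql
-- Load each line whole into a one-column staging table.
CREATE TABLE dbo.sms_staging (line varchar(1000));

BULK INSERT dbo.sms_staging
FROM 'C:\feedback.csv'
WITH (ROWTERMINATOR = '\n', TABLOCK);   -- no FIELDTERMINATOR: one column per row

-- Carve each tagged field out of the line.
INSERT INTO dbo.sms_reply (Message, Cell, [Time], Ref)
SELECT
    SUBSTRING(line, 10, CHARINDEX(',[Cell]', line) - 10),          -- after '[Message]'
    SUBSTRING(line, CHARINDEX('[Cell]', line) + 6,
              CHARINDEX(',[Time]', line) - CHARINDEX('[Cell]', line) - 6),
    SUBSTRING(line, CHARINDEX('[Time]', line) + 6,
              CHARINDEX(',[Ref]', line) - CHARINDEX('[Time]', line) - 6),
    SUBSTRING(line, CHARINDEX('[Ref]', line) + 5, 50)              -- rest of the line
FROM dbo.sms_staging;
```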
I've managed to overcome this by using a handy function I found here, which comes from an article located here.
This function returns each line in the .csv file and then I use string manipulation to extract the fields between the "[variable_name]" delimiters and insert directly into my SQL table.
Clean, simple and quick.
NOTE: You have to enable OLE Automation on your SQL server:
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'Ole Automation Procedures', 1;
GO
RECONFIGURE;
GO

Create initial data sqlscripts with binary data, sqlserver 2005

With a TFS 2008 team build type we create a ClickOnce setup for a review version of our application.
Within this build type we create an initial database with some data, so tests can start right away.
Now I need to put some binary data (Word files) into our SQL insert scripts.
How can I create an initial script with the binary data?
I can't put a binary string into a script, or can I?
Thanks a lot.
Edit:
Found a solution with OPENROWSET:
INSERT INTO TestTable (Doc) SELECT * FROM OPENROWSET(BULK N'C:\File.jpg', SINGLE_BLOB) as ImageToInsert
Looks like you already found a solution that works for you, which just bumped this question, but you can put binary data in a script.
Just use a 0x.... literal to denote binary data.
INSERT INTO TestTable(Doc)
VALUES (0x474946383961500...)
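To generate such a script from existing rows (handy in a build step), you can have SQL Server emit the 0x... literal itself. Note this relies on CONVERT style 1, which includes the 0x prefix but requires SQL Server 2008 or later; on 2005 the undocumented fn_varbintohexstr would be needed instead:

```sql
-- Emit one INSERT statement per row, rendering the varbinary as a 0x... literal.
SELECT 'INSERT INTO TestTable (Doc) VALUES (' +
       CONVERT(varchar(max), Doc, 1) + ');'
FROM TestTable;
```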