Is it possible to use SQL Server BULK INSERT without a file?

Curious if this is possible: The app server and DB server live in different places (obviously). The app server currently generates a file for use with SQL Server BULK INSERT.
This requires both the DB and the app server to be able to see the location, and it makes configuration more difficult in different environments.
What I'd like to know is: is it possible to bypass the file system in this case? Perhaps I can pass the data to SQL Server and have it generate the file?
I'm on SQL Server 2008, if that makes a difference.
Thanks!

I don't think you can do that with SQL Server's bcp tool, but if your app is written in .NET, you can use the System.Data.SqlClient.SqlBulkCopy class to bulk insert rows from a DataTable (or any source you can read through an IDataReader), with no intermediate file involved.

From the documentation on BULK INSERT:
BULK INSERT
[ database_name. [ schema_name ] . | schema_name. ] [ table_name | view_name ]
FROM 'data_file'
The FROM 'data_file' clause is not optional, and data_file is specified as follows:
'data_file'
Is the full path of the data file that contains data to import into the specified table or view. BULK INSERT can import data from a disk (including network, floppy disk, hard disk, and so on).
data_file must specify a valid path from the server on which SQL Server is running. If data_file is a remote file, specify the Universal Naming Convention (UNC) name. A UNC name has the form \\SystemName\ShareName\Path\FileName. For example, \\SystemX\DiskZ\Sales\update.txt.
Your application could do the insert directly using whatever method meets your performance needs.
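For example, one such method on SQL Server 2008 is a table-valued parameter, which lets the app stream its rows to a stored procedure with no file at all. A minimal sketch, assuming a simple two-column schema (all names here are hypothetical):

-- Hypothetical table type; adjust the columns to your real schema.
CREATE TYPE dbo.SalesRowType AS TABLE
(
    SaleId int           NOT NULL,
    Amount decimal(10,2) NOT NULL
);
GO

-- The app passes its rows as @Rows (from .NET, a SqlParameter
-- with SqlDbType.Structured); no file is written anywhere.
CREATE PROCEDURE dbo.ImportSales
    @Rows dbo.SalesRowType READONLY
AS
BEGIN
    INSERT INTO dbo.Sales (SaleId, Amount)
    SELECT SaleId, Amount
    FROM @Rows;
END

SqlBulkCopy (above) generally scales better for very large batches; a table-valued parameter keeps everything in one round trip and is handy for small-to-medium row counts.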

Related

Using Microsoft Access program to upload images to be stored in SQL Server database

I have an Access back-end that is going to be converted to SQL Server. The front-end will stay the same using Access. The issue I am having is how SQL Server handles images differently than MS Access.
Currently, a user adds a picture to the record via the attachment data type which, to my understanding, isn't possible in SQL Server. I saw the image data type is deprecated which leaves varbinary(MAX) and/or filestream as the options.
I want to go with storing the images in the filesystem as the size is greater than 256KB, but I'm not finding any documentation about accomplishing that with an Access front-end.
Consider running an MS Access pass-through query to upload the user's image. Specifically, pass the file name into a SQL query as shown in the MSDN docs for large-value data types. For this, the user will need permission to run OPENROWSET(BULK ...) (the ADMINISTER BULK OPERATIONS permission), and the image file must be readable from the server, so use a UNC path if it lives on the client machine.
INSERT INTO myTable (myImageColumn, ...other columns...)
SELECT myPicData.*, ...other values...
FROM OPENROWSET(BULK 'C:\Path\To\Image.jpg', SINGLE_BLOB) AS myPicData
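If you do go the FILESTREAM route mentioned in the question, here is a minimal sketch (table and column names are hypothetical; FILESTREAM must be enabled on the instance and the database needs a FILESTREAM filegroup):

CREATE TABLE dbo.Pictures
(
    -- FILESTREAM requires a uniqueidentifier ROWGUIDCOL column with a unique constraint
    PictureId   uniqueidentifier ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    PictureData varbinary(max) FILESTREAM NULL
);

The same OPENROWSET insert shown above works unchanged against a FILESTREAM column; SQL Server stores the bytes in the file system for you while you keep querying them as varbinary(max).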

BULK INSERT from a different server

I am using T-SQL with SQL Server 2012.
Here is my question: I have a text file on server 10.10.10.1. I want to create a temporary table on server 10.10.10.2 from the previous text file.
According to my research, BULK INSERT only works with directories local to the server (e.g., C:).
Is there a way to do the following?
BULK INSERT dbo.#G2
FROM '10.10.10.1.C:\File\textfile.txt'
WITH
(
CODEPAGE = '1252',
FIELDTERMINATOR = ';',
CHECK_CONSTRAINTS
)
Thank you for your help.
You can read data from a file share (\\computer\share\folder\file), but the SQL Server process has to have access. Since SQL Server generally runs under a local service account, it can only access shares that allow anonymous access (i.e., anyone can read the content of the share).
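For example, a sketch reusing the options from your question (the share name is hypothetical; the SQL Server service account on 10.10.10.2 must be able to read it):

-- Run on 10.10.10.2; the file lives on 10.10.10.1 behind a share
BULK INSERT dbo.#G2
FROM '\\10.10.10.1\FileShare\textfile.txt'
WITH
(
    CODEPAGE = '1252',
    FIELDTERMINATOR = ';',
    CHECK_CONSTRAINTS
)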
Better to upload to a folder on the server, but of course that means sharing a writable folder from the database server. While controllable (e.g., a dedicated partition, controlled ACLs), it is still not ideal.

SQL Server - Copying data between tables where the Servers cannot be connected

We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I would then run a SELECT * FROM myspecialstoragetable and get a .sql file they can ship to me, which we would then import into our copy of the DB with some job or SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
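A rough sketch of the two bcp invocations (server, database, and table names are placeholders; -n uses SQL Server's native format and -T a trusted connection):

rem Customer side: export the table in native format
bcp CustomerDb.dbo.source_table out source_table.dat -n -S CustomerServer -T

rem Your side: import the shipped file into the blank copy
bcp OurDb.dbo.source_table in source_table.dat -n -S OurServer -T

Your wrapper program only needs to build these command lines and run them, so the end user never sees bcp itself.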
Since it is a function within your application, in what language is the application front-end written? If it is .NET, you can use Data Transformation Services (DTS) in SQL Server to do a sample export. In the last step, you can save the steps into a VB/.NET module. If necessary, modify this file to change table names etc., and integrate this DTS module into your application. While doing the sample export, export to a suitable format such as CSV or Excel, whichever format you will be able to import into a blank database.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data in the desired format. He can then mail such a file to you.
If your application is not written in .NET, whichever language it is written in will have options to read data from SQL Server and dump it to a CSV or delimited text file. If it is a primitive language, you may have to do it by looping through the records, concatenating the fields of each one, and writing the result to a file.
XML would be far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it at your location. Also, XML is not really suited if the number of records is very large.
You are probably thinking of a .sql file, as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of SQL Server's interface are used for table structures/DDL rather than for generating INSERT statements containing each record's hard values.

Copy blob data into an on-premise SQL table

My problem statement is that I have a CSV blob and I need to import that blob into a SQL table. Is there a utility to do that?
I was thinking of one approach: first copy the blob to the on-premise SQL Server machine using the AzCopy utility, and then import that file into the SQL table using the bcp utility. Is this the right approach? I am also looking for a one-step solution to copy a blob into a SQL table.
Regarding your question about the availability of a utility which will import data from blob storage to a SQL Server, AFAIK there's none. You would need to write one.
Your approach seems OK to me, though you may want to write a batch file or something like that to automate the whole process. In this batch file, you would first download the file to your computer and then run the BCP utility to import the CSV into SQL Server. Other alternatives to writing a batch file are:
Do this thing completely in PowerShell.
Write some C# code which uses the storage client library to download the blob and, once the blob is downloaded, start the BCP process from your code.
To pull a blob file into an Azure SQL Server, you can use this example syntax (this actually works, I use it):
BULK INSERT MyTable
FROM 'container/folder/folder/file'
WITH (DATA_SOURCE = 'ds_blob', BATCHSIZE = 10000, FIRSTROW = 2);
MyTable has to have columns identical to the file's (or it can be a view over a table that yields identical columns)
In this example, ds_blob is an external data source which needs to be created beforehand (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql)
The external data source needs to use a database scoped credential, which uses an SAS key that you need to generate beforehand from blob storage (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql)
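A rough sketch of that one-time setup (the account URL, credential name, and SAS token are placeholders):

-- A master key must exist in the database before a scoped credential can be created
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- The SAS token goes in SECRET, without the leading '?'
CREATE DATABASE SCOPED CREDENTIAL cred_blob
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

CREATE EXTERNAL DATA SOURCE ds_blob
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://myaccount.blob.core.windows.net',
    CREDENTIAL = cred_blob
);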
The only downside to this method is that you have to know the file name beforehand - there's no way to enumerate the blobs from inside SQL Server.
I get around this by running PowerShell inside Azure Automation that enumerates the blobs and writes them into a queue table beforehand.

Import a CSV file with Access frontend into SQL Server

Background:
In my company we have many CSV files which have to be imported into SQL Server. The CSV files contain multidimensional market simulations stored in EAV form (2 columns and 10^6 to 10^10 rows). Their size varies, but it is not unusual for it to exceed 500 MB.
Until now, these files were imported by a database administrator via SSMS into SQL Server.
Every importation should get an ImportationID and a Timestamp. Doing this manually is time consuming and error prone for the database administrator.
Thus, an Access front end was created to let every user easily import a CSV file into the server after making a selection in a Listbox.
Now I am faced with the problem of importing the CSV file through the Access interface.
Problem:
Here are the options I have considered, none of which are possible:
Pass some T-SQL command to the SQL Server, as listed here (not allowed by Access)
Import the CSV line by line with a VBA loop (takes too long for 10^6 to 10^10 rows)
Import the CSV file into the Access database and then export the table to the SQL Server (the 2 GB size limit of Access makes this impossible)
Is there any other option to perform this task using Access?
One possible solution is as follows. Your Access frontend has a form that accepts three values: file name/location, ImportationID, and Timestamp. After the user enters this data, the 'Go' or 'Submit' button fires a stored procedure on the SQL Server database that accepts these 3 values as parameters.
The stored procedure will issue a BULK INSERT (or another of the commands you linked to) to get the CSV into the database, and then manipulate and transform the data according to your business rules (setting ImportationID and Timestamp correctly, for example).
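A minimal sketch of such a procedure (the procedure name, staging table, and delimiters are hypothetical; note that BULK INSERT does not accept a variable for the file path, so the statement has to be built dynamically):

CREATE PROCEDURE dbo.usp_ImportCsv
    @FilePath      nvarchar(260),
    @ImportationID int,
    @Timestamp     datetime
AS
BEGIN
    -- BULK INSERT cannot take @FilePath directly, so build the command as a string
    DECLARE @sql nvarchar(max) =
        N'BULK INSERT dbo.StagingTable FROM ''' +
        REPLACE(@FilePath, '''', '''''') +
        N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
    EXEC sys.sp_executesql @sql;

    -- Stamp the freshly loaded rows per the business rules
    -- (assumes the staging table has these two extra columns)
    UPDATE dbo.StagingTable
    SET ImportationID   = @ImportationID,
        ImportTimestamp = @Timestamp
    WHERE ImportationID IS NULL;
END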
This is something that a database developer (or maybe a database admin) should be able to set up, and any validation or security constraints can be enforced on the database.