I need to create a procedure that will save some data to my file server (as a .csv).
Something like this:
--Note: I need permissions to access "0.0.0.0" address
--Note: I need to create a new file, and not replace an existing one
Select Name, Email From users => save to '\\0.0.0.0\c\userData'
I tried this:
--Note: "1.1.1.1" is the server where my database is, not sure if its the parameter I should use on ServerName
Exec xp_cmdshell
'bcp " Name, Email From users" queryout "\\0.0.0.0\c\userData" -c -t , -S "1.1.1.1" -T'
Got the following errors:
SQLState = S1000, NativeError = 0; Error = [Microsoft][ODBC Driver 13 for SQL Server]Unable to open BCP host data-file
I also saw that you can use something like this:
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0','Text;Database=D:\;HDR=YES;FMT=Delimited','SELECT * FROM [FileName.csv]')
SELECT Field1, Field2, Field3 FROM DatabaseName
But it has the downside that the file has to exist already.
Q: So how can I achieve this: export data to a file on a different machine, with folder-specific permissions/credentials, and create a new file rather than overwrite an existing one?
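One way to approach this is a minimal sketch along these lines (assumptions: xp_cmdshell is enabled, the database is named MyDb, and whatever Windows account xp_cmdshell runs under has write access to the \\0.0.0.0 share; the timestamp suffix is just one way to guarantee a brand-new file on every run):
DECLARE @stamp varchar(20) = CONVERT(varchar(8), GETDATE(), 112) + '_' +
                             REPLACE(CONVERT(varchar(8), GETDATE(), 108), ':', '');
-- Unique target name so an existing file is never overwritten
DECLARE @target varchar(300) = '\\0.0.0.0\c\userData_' + @stamp + '.csv';
DECLARE @cmd varchar(1000) =
    'bcp "SELECT Name, Email FROM MyDb.dbo.users" queryout "' + @target + '" -c -t, -S 1.1.1.1 -T';
EXEC master..xp_cmdshell @cmd;
Note that the file is written by the Windows account the bcp process runs as (for xp_cmdshell that is normally the SQL Server service account or its proxy account), so that is the account that needs permission on the share; a SQL login passed with -U/-P does not change who touches the file.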
I am trying to export my SQL Server query results into a folder in .txt format (this is for an automated job)
I know the equivalent in MySQL works with INTO OUTFILE. Does anyone know the best way to do this in SQL Server 2008 Management Studio?
SELECT DISTINCT RTRIM (s1.SGMNTID) AS 'AccCode',RTRIM (s1.DSCRIPTN) AS 'CodeDesc', CASE
WHEN s1.SGMTNUMB = '1' THEN '1'
WHEN s1.SGMTNUMB = '2' THEN '2'
WHEN s1.SGMTNUMB = '3' THEN '110'
WHEN s1.SGMTNUMB = '4' THEN '4'
WHEN s1.SGMTNUMB = '5' THEN '120'
END AS 'AccountType_id',
CASE WHEN s1.SGMTNUMB = '2'
THEN LEFT(s1.SGMNTID, 2)
ELSE 'DEFAULT'
END AS 'AccGroupName'
FROM GL40200 s1
UNION
SELECT REPLACE ([ACTNUMBR_1]+'-'+ [ACTNUMBR_2]+'-'+ [ACTNUMBR_3]+'-'+[ACTNUMBR_4]+'-'+ [ACTNUMBR_5],' ', '') AS 'AccCode',
'' AS 'CodeDesc',
'0' AS 'AccountType_id',
'Default' AS 'AccGroupName'
FROM GL00100 a
INTO OUTFILE 'C:\Users\srahmani\verian/myfilename.txt'
You do this in the SSMS application, not in the SQL itself.
In the toolbar select:
Query --> Results To --> Results To File
Then execute the SQL statement, and it will prompt you to save the output to a file with an .rpt extension, which you can open in a text editor.
Another way is from the command line, using osql:
OSQL -S SERVERNAME -E -i thequeryfile.sql -o youroutputfile.txt
This can be run from a BAT file and scheduled with Windows Task Scheduler under an authenticated Windows user.
You can use the bcp utility. To copy the result set from a Transact-SQL statement to a data file, use the queryout option. The following example copies the result of a query into the Contacts.txt data file. It assumes that you are using Windows Authentication and have a trusted connection to the server instance on which you are running the bcp command. At the Windows command prompt, enter:
bcp "<your query here>" queryout Contacts.txt -c -T
You can use BCP by calling it directly as an operating system command in a SQL Agent job.
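As a rough sketch of that (job name, query, and output path are placeholders; the SQL Server Agent service account needs write access to the target folder):
USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Nightly export';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Nightly export',
    @step_name = N'bcp queryout',
    @subsystem = N'CmdExec',   -- runs the command line directly, no xp_cmdshell needed
    @command   = N'bcp "SELECT name FROM master.sys.databases" queryout "C:\Exports\databases.txt" -c -T';
EXEC dbo.sp_add_jobserver @job_name = N'Nightly export', @server_name = N'(local)';
Attach a schedule with sp_add_jobschedule to run it unattended.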
You can use Windows PowerShell to execute a query and output it to a text file:
Invoke-Sqlcmd -Query "Select * from database" -ServerInstance "Servername\SQL2008" -Database "DbName" > c:\Users\outputFileName.txt
The BCP utility can also be used from a .bat file, but be cautious of escape sequences (i.e. quotes within the command must be escaped as \") and use the appropriate switches.
.bat Example:
C:
bcp "\"YOUR_SERVER\".dbo.Proc" queryout C:\FilePath.txt -T -c -q
:: Add PAUSE here if you'd like to see the completed batch
-q MUST be used in the presence of quotations within the query itself.
BCP can also run stored procedures if necessary. Again, be cautious: temporary tables must be created prior to execution; otherwise, consider using table variables.
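For example, a sketch of a procedure written around a table variable, with the matching bcp call shown as a comment (the procedure, table, and database names here are made up):
CREATE PROCEDURE dbo.ExportContacts
AS
BEGIN
    SET NOCOUNT ON;  -- keep "rows affected" messages out of the output stream
    DECLARE @out TABLE (FirstName nvarchar(50), LastName nvarchar(50));

    INSERT INTO @out (FirstName, LastName)
    SELECT FirstName, LastName
    FROM dbo.Contacts;

    SELECT FirstName, LastName FROM @out;
END
GO
-- bcp "EXEC MyDb.dbo.ExportContacts" queryout C:\Exports\Contacts.txt -T -c -q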
This is quite simple to do, and the answer is available in the other answers here. For those of you who are viewing this:
select entries from my_entries where id='42' INTO OUTFILE 'bishwas.txt';
Below is an example of the BCP Statement.
I'm not accustomed to using BCP, so your help and candor are greatly appreciated.
I am using it with a format file as well.
If I execute it from the CMD prompt it works fine, but from SQL I get the error.
The BCP statement is all on one line and the SQL Server Agent is running as Local System.
The SQL server, and script are on the same system.
I ran exec master..xp_fixeddrives
C,45589
E,423686
I've tried output to C and E with the same result
EXEC xp_cmdshell 'bcp "Select FILENAME, POLICYNUMBER, INSURED_DRAWER_100, POLICY_INFORMATION, DOCUMENTTYPE, DOCUMENTDATE, POLICYYEAR FROM data.dbo.max" queryout "E:\Storage\Export\Data\max.idx" -fmax-c.fmt -SSERVERNAME -T
Here is the format file, max-c.fmt:
10.0
7
1 SQLCHAR 0 255 "$#Y#$" 1 FILENAME
2 SQLCHAR 0 40 "" 2 POLICYNUMBER
3 SQLCHAR 0 40 "" 3 INSURED_DRAWER_100
4 SQLCHAR 0 40 "" 4 POLICY_INFORMATION
5 SQLCHAR 0 40 "" 5 DOCUMENTTYPE
6 SQLCHAR 0 40 "" 6 DOCUMENTDATE
7 SQLCHAR 0 8 "\r\n" 7 POLICYYEAR
Due to formatting in this post, the last column of the format file is cut off, but it reads SQL_Latin1_General_CP1_CI_AS for each column other than DOCUMENTDATE.
Does the output path exist? BCP does not create the folder before trying to create the file.
Try this before your BCP call:
EXEC xp_cmdshell 'MKDIR "E:\Storage\Export\Data\"'
First, rule out an xp_cmdshell issue by doing a simple 'dir c:*.*';
Check out my blog on using BCP to export files.
I had problems on my system where the path to BCP.EXE could not be found.
Either change the PATH environment variable or hard-code the path.
Example below works with Adventure Works.
-- BCP - Export query, pipe delimited format, trusted security, character format
DECLARE @bcp_cmd4 VARCHAR(1000);
DECLARE @exe_path4 VARCHAR(200) =
    ' cd C:\Program Files\Microsoft SQL Server\100\Tools\Binn\ & ';
SET @bcp_cmd4 = @exe_path4 +
    ' BCP.EXE "SELECT FirstName, LastName FROM AdventureWorks2008R2.Sales.vSalesPerson" queryout ' +
    ' "C:\TEST\PEOPLE.TXT" -T -c -q -t0x7c -r\n';
PRINT @bcp_cmd4;
EXEC master..xp_cmdshell @bcp_cmd4;
GO
Before I changed the path to \110\ for SQL Server 2012 and the database name to [AdventureWorks2012], I was getting an error.
After making the changes, the code works fine from SSMS. The service is running under NT AUTHORITY\Local Service. The SQL Server Agent is disabled. The output file was created.
Please check whether the file is open in another application or program.
If it is, bcp.exe cannot overwrite the existing file contents.
In my case, I solved the problem in the following way.
My command was:
bcp "select Top 1000 * from abc.dbo.abcd" queryout FileNameWithDirectory -c -t "|" -r "0x0a" -S 192.111.1.111 -U xx -P xxxxx
My FileNameWithDirectory was too long, like "D:\project-abc\R&D\abc-608\FilesNeeded\FilesNeeded\DataFiles\abc.csv".
I changed it to a simpler path, like "D:\abc.csv".
Problem solved.
So I guess the problem occurred because the file path was too long, and thus the file could not be found.
If it works from the command line but not from the SQL Agent, I think it is an authentication issue.
The SQL Server Agent runs under a service account. Make sure that account has the ability to read the format file and generate the output file.
Also, make sure the account has the ability to execute the xp_cmdshell stored procedure.
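If xp_cmdshell itself turns out to be the blocker, here is a minimal sketch for enabling it and granting execute to the job's login (the login name is a placeholder, and enabling xp_cmdshell has security implications):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
-- Let a specific non-sysadmin login call it (run this in master)
GRANT EXECUTE ON xp_cmdshell TO [DOMAIN\AgentProxyUser];
Non-sysadmin callers also need a proxy credential set up with sp_xp_cmdshell_proxy_account, if I remember correctly.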
Write back with your progress ...
I received this after I shared my output folder, even when there were no files open.
I created a new, unshared folder for output and all was fine.
(might help someone ;-))
In my case the fix was simply running in administrator mode.
This error can be due to insufficient write permissions to the target folder.
This is a common issue, since the user writing the query might have access to a folder, but the SQL Server Agent or logged-in server account which actually invokes bcp.exe may not.
Destination path has to already exist (except for file name).
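On SQL Server 2008 R2 SP1 and later (if memory serves) you can check which accounts the engine and Agent actually run under directly from T-SQL, which is the quickest way to see whose permissions matter:
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services;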
Remove no_output from your command, if you use it of course:
SET @sql = 'BCP ....'
-- Instead of this:
EXEC master..xp_cmdshell @sql, no_output
-- Use this:
EXEC master..xp_cmdshell @sql
In case anyone else runs into the same problem: I had ...lesPerson" queryout' rather than ...lesPerson" queryout '
If your code is writing the data file, and then reading it with BCP, make sure that you CLOSE THE DATA FILE before trying to read it!
Failure to do so gives: 'Unable to open host data-file'.
Python example:
import csv

# Management of a temporary bulk insert file (these are methods of a larger class).
def openBulkInsertFile(self):
    self.bulkInsertFile = open('c:/tmp/bulkInsertContent.txt', 'w', newline='')
    self.csvWriter = csv.writer(self.bulkInsertFile)

def closeBulkInsertFile(self):
    self.bulkInsertFile.close()
When running this from a SQL job, the command runs as the account the SQL (Express) service is logged in with, so you should give that user write permission on the folder where the batch writes its output.
This usually happens only with bcp; when using type commands, the command runs under the computer (Administrator) account and completes without a problem.
So if you have a long command in your job, just look at the bcp parts.
I am attempting to create a batch file in Windows that will take a user's input and pass it along to a .sql file containing the following query, so that I can set a siteid, as in the following SQL query:
exec sp_addlinkedserver [sqlserver1]
select * from [sqlserver1].onesource.dbo.admsites where siteid = '123'
I then want to take the results of this query, particularly the admsiteid, and insert that into the originatorid (using another .sql file):
Use Onesource
update OSCsettings set originatorid = 'whatever-the-admsiteid-is'
How would I go about passing along these variables?
Use sqlcmd with the -v command-line option:
-v var = "value"
You can specify multiple variables in the list.
See:
http://msdn.microsoft.com/en-us/library/ms162773.aspx
and
http://msdn.microsoft.com/en-us/library/ms188714.aspx
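As a sketch for this case (the file name and variable name are made up), the .sql file references the value with sqlcmd's $(...) scripting-variable syntax, and the batch file hands it in with -v:
-- update_originator.sql; invoked from the batch file roughly as:
--   sqlcmd -S SERVERNAME -E -i update_originator.sql -v AdmSiteId="%admsiteid%"
USE Onesource;
UPDATE OSCsettings
SET originatorid = '$(AdmSiteId)';
Getting admsiteid out of the first query and into the %admsiteid% batch variable is typically done with a FOR /F loop over a sqlcmd call, but the -v hand-off above is the part the links describe.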
We have a few huge databases (20GB+) which mostly contain static lookup data. Because our application executes joins against tables in these databases, they have to be part of each developer's local SQL Server (i.e. they can't be hosted on a central, shared database server).
We plan on copying a canonical set of the actual SQL Server database files (*.mdf and *.ldf) and attach them to each developer's local database.
What's the best way to find out the local SQL Server instance's data directory so we can copy the files to the right place? This will be done via an automated process, so I have to be able to find and use it from a build script.
It depends on whether a default path is set for data and log files or not.
If the path is set explicitly under Properties => Database Settings => Database default locations, then SQL Server stores it at Software\Microsoft\MSSQLServer\MSSQLServer in the DefaultData and DefaultLog values.
However, if these parameters aren't set explicitly, SQL Server uses the Data and Log paths of the master database.
Below is a script that covers both cases. It is a simplified version of the query that SQL Server Management Studio runs.
Also, note that I use xp_instance_regread instead of xp_regread, so this script will work for any instance, default or named.
declare @DefaultData nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', @DefaultData output
declare @DefaultLog nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', @DefaultLog output
declare @DefaultBackup nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'BackupDirectory', @DefaultBackup output
declare @MasterData nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer\Parameters', N'SqlArg0', @MasterData output
select @MasterData=substring(@MasterData, 3, 255)
select @MasterData=substring(@MasterData, 1, len(@MasterData) - charindex('\', reverse(@MasterData)))
declare @MasterLog nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer\Parameters', N'SqlArg2', @MasterLog output
select @MasterLog=substring(@MasterLog, 3, 255)
select @MasterLog=substring(@MasterLog, 1, len(@MasterLog) - charindex('\', reverse(@MasterLog)))
select
    isnull(@DefaultData, @MasterData) DefaultData,
    isnull(@DefaultLog, @MasterLog) DefaultLog,
    isnull(@DefaultBackup, @MasterLog) DefaultBackup
You can achieve the same result by using SMO. Below is a C# sample, but you can use any other .NET language or PowerShell.
// Requires the SMO assemblies and these namespaces:
// using System.Data.SqlClient;
// using Microsoft.SqlServer.Management.Common;
// using Microsoft.SqlServer.Management.Smo;
using (var connection = new SqlConnection("Data Source=.;Integrated Security=SSPI"))
{
    var serverConnection = new ServerConnection(connection);
    var server = new Server(serverConnection);
    var defaultDataPath = string.IsNullOrEmpty(server.Settings.DefaultFile) ? server.MasterDBPath : server.Settings.DefaultFile;
    var defaultLogPath = string.IsNullOrEmpty(server.Settings.DefaultLog) ? server.MasterDBLogPath : server.Settings.DefaultLog;
}
It is so much simpler in SQL Server 2012 and above, assuming you have default paths set (which is probably always the right thing to do):
select
InstanceDefaultDataPath = serverproperty('InstanceDefaultDataPath'),
InstanceDefaultLogPath = serverproperty('InstanceDefaultLogPath')
Even though this is a very old thread, I feel like I need to contribute a simple solution.
Any time you know where in Management Studio a parameter you want to access for an automated script is located, the easiest way is to run a quick Profiler trace on a standalone test system and capture what Management Studio is doing on the back end.
In this instance, assuming you are interested in finding the default data and log locations you can do the following:
SELECT
SERVERPROPERTY('instancedefaultdatapath') AS [DefaultFile],
SERVERPROPERTY('instancedefaultlogpath') AS [DefaultLog]
I stumbled across this solution in the documentation for the Create Database statement in the help for SQL Server:
SELECT SUBSTRING(physical_name, 1, CHARINDEX(N'master.mdf', LOWER(physical_name)) - 1)
FROM master.sys.master_files
WHERE database_id = 1 AND file_id = 1
For the current database you can just use:
select physical_name from sys.database_files;
To specify another database, e.g. 'Model', use sys.master_files:
select physical_name from sys.master_files where database_id = DB_ID(N'Model');
As of SQL Server 2012, you can use the following query:
SELECT SERVERPROPERTY('INSTANCEDEFAULTDATAPATH') as [Default_data_path], SERVERPROPERTY('INSTANCEDEFAULTLOGPATH') as [Default_log_path];
(This was taken from a comment at http://technet.microsoft.com/en-us/library/ms174396.aspx, and tested.)
Various components of SQL Server (Data, Logs, SSAS, SSIS, etc) have a default directory. The setting for this can be found in the registry. Read more here:
http://technet.microsoft.com/en-us/library/ms143547%28SQL.90%29.aspx
So if you created a database using just CREATE DATABASE MyDatabaseName it would be created at the path specified in one of the settings above.
Now, if the admin / installer changed the default path, then the default path for the instance is stored in the registry at
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\[INSTANCENAME]\Setup
If you know the name of the instance then you can query the registry. This example is SQL 2008 specific - let me know if you need the SQL2005 path as well.
DECLARE @regvalue varchar(100)
EXEC master.dbo.xp_regread @rootkey='HKEY_LOCAL_MACHINE',
    @key='SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLServer\Setup',
    @value_name='SQLDataRoot',
    @value=@regvalue OUTPUT,
    @output = 'no_output'
SELECT @regvalue as DataAndLogFilePath
Each database can be created in its own location, overriding the server setting, when you issue the CREATE DATABASE DBName statement with the appropriate parameters. You can find that out by executing sp_helpdb:
exec sp_helpdb 'DBName'
Keeping it simple:
use master
select DB.name, F.physical_name from sys.databases DB join sys.master_files F on DB.database_id=F.database_id
This will return all databases with their associated files.
From the GUI: open your server properties, go to Database Settings, and see Database default locations.
Note that you can drop your database files wherever you like, though it seems cleaner to keep them in the default directory.
Small nitpick: there is no data folder, only a default data folder.
Anyway, to find it, assuming you want to install for the first default instance:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\Setup\SQLDataRoot
If there's a named instance, MSSQL.1 becomes something like MSSQL10.INSTANCENAME.
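If you would rather not hard-code the instance-specific key name, xp_instance_regread rewrites instance-relative registry paths for you, so a sketch like this should return SQLDataRoot for whichever instance you are connected to:
DECLARE @dataRoot nvarchar(512);
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\Setup',
    N'SQLDataRoot',
    @dataRoot OUTPUT;
SELECT @dataRoot AS SQLDataRoot;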
You can find default Data and Log locations for the current SQL Server instance by using the following T-SQL:
DECLARE @defaultDataLocation nvarchar(4000)
DECLARE @defaultLogLocation nvarchar(4000)
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultData',
    @defaultDataLocation OUTPUT
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultLog',
    @defaultLogLocation OUTPUT
SELECT @defaultDataLocation AS 'Default Data Location',
       @defaultLogLocation AS 'Default Log Location'
Expanding on "splattered bits" answer, here is a complete script that does it:
@ECHO off
SETLOCAL ENABLEDELAYEDEXPANSION
SET _baseDirQuery=SELECT SUBSTRING(physical_name, 1, CHARINDEX(N'master.mdf', LOWER(physical_name)) - 1) ^
FROM master.sys.master_files WHERE database_id = 1 AND file_id = 1;
ECHO.
SQLCMD.EXE -b -E -S localhost -d master -Q "%_baseDirQuery%" -W >data_dir.tmp
IF ERRORLEVEL 1 ECHO Error with automatically determining SQL data directory by querying your server&ECHO using Windows authentication.
CALL :getBaseDir data_dir.tmp _baseDir
IF "%_baseDir:~-1%"=="\" SET "_baseDir=%_baseDir:~0,-1%"
DEL /Q data_dir.tmp
echo DataDir: %_baseDir%
GOTO :END
::---------------------------------------------
:: Functions
::---------------------------------------------
:simplePrompt 1-question 2-Return-var 3-default-Val
SET input=%~3
IF "%~3" NEQ "" (
:askAgain
SET /p "input=%~1 [%~3]:"
IF "!input!" EQU "" (
GOTO :askAgain
)
) else (
SET /p "input=%~1 [null]: "
)
SET "%~2=%input%"
EXIT /B 0
:getBaseDir fileName var
FOR /F "tokens=*" %%i IN (%~1) DO (
SET "_line=%%i"
IF "!_line:~0,2!" == "c:" (
SET "_baseDir=!_line!"
EXIT /B 0
)
)
EXIT /B 1
:END
PAUSE
I would have done a backup/restore, simply because it's easier and supports versioning. Reference data especially needs to be versioned in order to know when it started taking effect. A detach/attach won't give you that ability. Also, with backups you can continue to provide updated copies without having to shut down the database.
Alex's answer is the right one, but for posterity here's another option: create a new empty database. If you use CREATE DATABASE without specifying a target dir you get... the default data / log directories. Easy.
Personally however I'd probably either:
RESTORE the database to the developer's PC, rather than copy/attach (backups can be compressed, exposed on a UNC), or
Use a linked server to avoid doing this in the first place (depends how much data goes over the join)
P.S. 20 GB is not huge, even in 2015. But it's all relative.
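For what it's worth, a sketch of that backup/restore hand-off (database name, share, and logical file names are placeholders; RESTORE FILELISTONLY will tell you the real logical names, and WITH COMPRESSION depends on edition):
-- On the source server
BACKUP DATABASE LookupData
TO DISK = N'\\fileshare\builds\LookupData.bak'
WITH COMPRESSION, INIT;

-- On the developer's machine
RESTORE DATABASE LookupData
FROM DISK = N'\\fileshare\builds\LookupData.bak'
WITH MOVE N'LookupData'     TO N'C:\SQLData\LookupData.mdf',
     MOVE N'LookupData_log' TO N'C:\SQLData\LookupData_log.ldf',
     REPLACE;
The MOVE targets are where the default-data-path queries earlier in this thread come in handy for the build script.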
SELECT DISTINCT dbo.GetDirectoryPath(filename) AS InstanceDataPaths
FROM sys.sysaltfiles WHERE filename like '%.mdf' and filename not like '%\MSSQL\Binn\%'
SELECT DISTINCT dbo.GetDirectoryPath(filename) AS InstanceLogPaths
FROM sys.sysaltfiles WHERE filename like '%.ldf' and filename not like '%\MSSQL\Binn\%'
You can download the detailed SQL script from the post "how to find the data directory for a SQL Server instance".
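dbo.GetDirectoryPath is a helper function defined in that downloadable script; if you just want the two queries above to run, a minimal stand-in (my own sketch, not the author's version) could look like this:
CREATE FUNCTION dbo.GetDirectoryPath (@filepath nvarchar(520))
RETURNS nvarchar(520)
AS
BEGIN
    -- Strip the file name, keeping everything up to (but not including) the last backslash
    RETURN LEFT(@filepath, LEN(@filepath) - CHARINDEX('\', REVERSE(@filepath)));
END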
You will get the default location of user databases with this query:
declare @DataFileName nVarchar(500)
declare @LogFileName nVarchar(500)
set @DataFileName = (select top 1 RTRIM(LTRIM(name)) FROM master.sys.master_files where database_id > 4 AND file_id = 1) + '.mdf'
set @LogFileName = (select top 1 RTRIM(LTRIM(name)) FROM master.sys.master_files where database_id > 4 AND file_id = 2) + '.ldf'
select
    (SELECT top 1 SUBSTRING(physical_name, 1, CHARINDEX(@DataFileName, LOWER(physical_name)) - 1)
     FROM master.sys.master_files
     WHERE database_id > 4 AND file_id = 1) as 'Data File',
    (SELECT top 1 SUBSTRING(physical_name, 1, CHARINDEX(@LogFileName, LOWER(physical_name)) - 1)
     FROM master.sys.master_files
     WHERE database_id > 4 AND file_id = 2) as 'Log File'