bcp Utility write to remote server?

Following some recommendations, I use the bcp utility to write a SQL Server table to a .csv file so I can later transfer the data to an Informix table with the same structure.
My SQLServer Stored Procedure :
ALTER PROCEDURE [dbo].[TestCVS]
AS
BEGIN
declare @sql nvarchar(4000)
select @sql = 'bcp "select * from ML..gmp4vacationbalance" queryout c:\ss\Tom.csv -c -t, -T -S' + @@servername
exec master..xp_cmdshell @sql
END
I have four questions concerning this procedure:
1- How can I make this procedure write to a remote server instead of the local server (@@servername)? It's not secure to let that specific server access my SQL Server.
2- How can I filter the query with specific conditions?
Say I want to write a query like this:
select * from ML..gmp4vacationbalance where balance_date = @date AND emp_num = @empNum
3- When I execute the procedure, the third column in the output appears corrupted.
Why does it appear corrupted like this? It's a varchar description written in Arabic.
4- When I want to delimit by pipe | instead of comma, like this:
select @sql = 'bcp "select * from ML..gmp4vacationbalance" queryout c:\ss\Tom.csv -c -t| -T -S' + @@servername
I get the following error:

Question 1: Writing from a remote server
I assume you meant to say "from", not "to". Specify the server name in the form ServerName\InstanceName instead of using @@servername. You will need permissions on the other server, since you are using a trusted connection (-T).
Question 2: How to add parameters to the BCP statement
BCP is a command-line utility, not part of Transact-SQL, so you can't pass parameters to it directly. What you can do is format the command line you execute: convert your @date and @empNum parameters to strings and concatenate them into the rest of the SQL string for the command line.
Question 3: Wrong characters in output
Instead of -c, use -w to output Unicode characters.
Question 4: Pipe not working
A common problem with BCP: simply quote the pipe, like this: -t"|", to make it the field terminator.
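Putting the four answers together, a sketch of the revised procedure might look like this (the parameter handling, date format, and the remote server name MyRemoteServer\MyInstance are illustrative, not from the question):

```sql
ALTER PROCEDURE [dbo].[TestCVS]
    @date datetime,
    @empNum int
AS
BEGIN
    DECLARE @sql nvarchar(4000);
    -- Concatenate the parameter values into the command line (question 2),
    -- use -w for Unicode output (question 3) and quote the pipe (question 4).
    SELECT @sql = 'bcp "select * from ML..gmp4vacationbalance '
        + 'where balance_date = ''' + CONVERT(varchar(10), @date, 120) + ''' '
        + 'and emp_num = ' + CAST(@empNum AS varchar(10)) + '" '
        + 'queryout c:\ss\Tom.csv -w -t"|" -T -S MyRemoteServer\MyInstance';
    EXEC master..xp_cmdshell @sql;
END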

Related

Using xp_cmdshell with variable in SQL Server

I want to use xp_cmdshell to ping servers. I created a procedure, but I have a little problem: I need to select the server IP from a table that already exists.
I created a cursor to get the server IP from the table, but I don't know how to use the @ip varchar variable with the ping command.
This syntax didn't work:
execute xp_cmdshell 'ping @ip'
You cannot reference variables directly within xp_cmdshell, so you have to concatenate the value when building the command. I recommend reading: https://msdn.microsoft.com/en-us/library/ms175046.aspx
In your example, you would do something like:
DECLARE @cmd nvarchar(4000);
SET @cmd = 'ping ' + @ip;
EXEC xp_cmdshell @cmd;
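A fuller sketch with the cursor the question mentions (the table dbo.Servers and its ip column are hypothetical names for illustration):

```sql
DECLARE @ip varchar(50), @cmd nvarchar(4000);

DECLARE ip_cursor CURSOR FOR
    SELECT ip FROM dbo.Servers;  -- hypothetical table holding the server IPs

OPEN ip_cursor;
FETCH NEXT FROM ip_cursor INTO @ip;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build the full command string first, then pass the variable to xp_cmdshell
    SET @cmd = 'ping ' + @ip;
    EXEC xp_cmdshell @cmd;
    FETCH NEXT FROM ip_cursor INTO @ip;
END

CLOSE ip_cursor;
DEALLOCATE ip_cursor;
```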

sql server: how to execute a .sql from another query?

I have a .sql script with a lot of action queries that work on some staging tables. This script needs to be run twice with some other commands in-between i.e.:
Load the staging table from source A
Use do_stuff.sql to process it
Move the results somewhere.
Repeat Steps 1-3 for source B.
The brute-force approach would be to just copy and paste do_stuff.sql as needed. While this would technically work, is there a better way?
I'm hoping there's a command like RunThisSQL 'C:\do_stuff.sql' that I haven't discovered yet.
Update
Well, it's been about 5 years and I just re-discovered this old question. I did this recently: I made a cursor to loop through a master table. For each record in that master table, the script runs an inner script using variables set from the master table.
https://www.mssqltips.com/sqlservertip/1599/sql-server-cursor-example/
If you use Visual Studio, you can create a "SQL Server Database" project. Within the project you can create a post-deployment script that lets you execute your *.sql files like this:
/*
Post-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be appended to the build script.
Use SQLCMD syntax to include a file in the post-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the post-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
see also. http://candordeveloper.com/2013/01/08/creating-a-sql-server-database-project-in-visual-studio-2012/
Try using xp_cmdshell.
EXEC xp_cmdshell 'sqlcmd -S ' + @ServerName + ' -d ' + @DBName + ' -i ' + @FileName
xp_cmdshell and concatenation do not play together nicely, often resulting in an "Incorrect syntax near '+'" error. So, further to Jeotic's solution above, you will need to put the entire string you pass to xp_cmdshell into a variable, including quotes around anything that may contain a space (e.g. filepath\filename). This is mentioned in the Microsoft documentation for xp_cmdshell here. Other issues you will have to contend with are the default SQL Server setup, which has xp_cmdshell disabled, as outlined here, and granting permission to non-system administrators to use xp_cmdshell, outlined here. The documentation generally advises against giving xp_cmdshell rights to too many people, since it can be a vehicle for those with malicious intent, but if, like me, you have few and trustworthy database users, then it seems like a reasonable solution. One last thing that requires correct configuration is the SQL Server Agent, as outlined here: the documentation notes that SQL Agent is responsible for background scheduling (such as backups) and execution of command-line statements.
DECLARE
    @Server nvarchar(50)
    ,@Database nvarchar(50)
    ,@File nvarchar(100)
    ,@cmd nvarchar(300);
SET @Server = 'server_name';
SET @Database = 'database_name';
SET @File = 'C:\your file path with spaces';
SET @cmd = 'sqlcmd -S ' + @Server + ' -d ' + @Database + ' -i "' + @File + '"';
EXEC xp_cmdshell @cmd;
There are some security issues with enabling xp_cmdshell in SQL Server. Instead, you can create a CLR stored procedure that executes the content of the file passed to it. Such a CLR stored procedure serves this one purpose, unlike xp_cmdshell, which can do anything the command prompt can.
issues with enabling xp_cmdshell
Create CLR stored procedure

xp_cmdshell Native Error 208, BCP in SQL Server 2008 R2

I've been trying to take the result of a large, multiply-joined SELECT statement and email the query result as a CSV file.
I have the query correct and the emailing down, but I'm having trouble automating the export of the result as a CSV file. From what I've been reading, the best bet for auto-exporting query results is a tool called BCP.
I attempted to use BCP like this in Management Studio:
USE FootPrint;
DECLARE @sql VARCHAR(2048);
DECLARE @dir VARCHAR(50);
SET @dir = 'C:\Users\bailey\Desktop';
SET @sql = 'bcp "SELECT TOP 10 * FROM datex_footprint.Shipments" queryout "' + @dir + '" -c -t, -T';
EXEC master..xp_cmdshell @sql;
FootPrint is the name of a specific database, and datex_footprint a schema.
(This is not the real query, just a test one).
When I run this, the error I get is:
"SQLState=S0002, NativeError = 208"
"Error = [Microsoft][SQL Server Native Client 10.0][SQL Server] Invalid object name 'datex_footprint.Shipments'."
I am 100% positive that datex_footprint.Shipments is the correct schema\table access for the data I'm trying to test on.
Does anyone see what I'm missing or doing wrong in trying to export this result to a CSV file? Specifically, though, I'm trying to automate this process. I know how to export results into a CSV file, but I want to do it in T-SQL so I can automate the generation of the file by time of day.
Any help would be appreciated!
[SOLVED]
I figured out what I was doing wrong: I was not identifying the view in its complete form. I was using "schema.table", instead of "database.schema.table".
Also, I added an "-S" + @@SERVERNAME flag; this tells the bcp utility to use the server SQL Server is currently connected to for the query.
The correct code to generate a CSV file of a SELECT-query's results in T-SQL, SQL Server 2008 is:
DECLARE @sql VARCHAR(8000);
SELECT @sql = 'bcp "SELECT * FROM FootPrint.datex_footprint.Shipments" queryout "C:\Users\bailey\Desktop\FlatTables\YamotoShipping.csv" -c -t, -T -S' + @@SERVERNAME;
exec master..xp_cmdshell @sql;
So once I added "FootPrint." to identify the database, it worked.
NOTE: I'm running SQL Server 2008 R2.
I got the same error but resolved it in a different way: I added a default database to my login, and then BCP started looking for the tables in my default database and processed the files.
After searching and reading the bcp documentation, I found that when copying data from a temp table you should use a global temp table (##) instead of a local one (#), because bcp runs on its own connection and a local temp table in tempdb is not visible to it, so it won't be able to copy the data to the destination file.
Example:
DECLARE @OutputFilePath nvarchar(max);
SET @OutputFilePath = 'C:\OutputData';
DECLARE @ExportSQL nvarchar(max);
SET @ExportSQL = N'EXEC xp_cmdshell ''bcp "SELECT * FROM LW_DFS_DIT.dbo.##Mytemptable" queryout "' + @OutputFilePath + '\OutputData.txt" -T -c -t''';
EXEC (@ExportSQL)
Hope this helps.

xp_cmdshell Query Length Too Large

All, I need to write a data set from a large SQL table to a .txt file. To do this I have chosen to use xp_cmdshell. The query I have been using to create the Data.txt file is
declare @sql varchar(8000)
select @sql = 'bcp "SELECT /*Lots of field names here*/ ' +
'FROM [SomeDatabase]..TableName WHERE /*Some Long Where Clause*/" ' +
'queryout "M:\\SomeDir\\SomeOtherDirectory\\Data.txt" -c -t -T -S' + @@servername
exec master..xp_cmdshell @sql
The problem I am having is that the SELECT query I am using exceeds the 1024-character limit imposed by the command line. To get around this, I have decided to use sqlcmd to execute the query I need from a file, eliminating the error with the query length. I have tried the following:
DECLARE @DBName VARCHAR(255)
DECLARE @cmd VARCHAR(8000)
SET @DBName = 'SomeDatabase'
SET @cmd = 'SQLCMD -E -S (localhost) -d ' + @DBName +
' -i "M:\\SomeDir\\SomeOtherDirectory\\tmpTestQuery.sql"'
EXEC master..xp_cmdshell @cmd
where 'tmpTestQuery.sql' holds the long query I want to execute, but I get the following errors
HResult 0x2AF9, Level 16, State 1
TCP Provider: No such host is known.
NULL
Sqlcmd: Error: Microsoft SQL Server Native Client 10.0 : A network-related or instance-
specific error has occurred while establishing a connection to SQL Server.
Server is not found or not accessible. Check if instance name is correct and
if SQL Server is configured to allow remote connections.
For more information see SQL Server Books Online..
Sqlcmd: Error: Microsoft SQL Server Native Client 10.0 : Login timeout expired.
NULL
I have remote connections enabled.
I would like to know what I am doing wrong, and if there is another way around the problem I am having with the query length when using xp_cmdshell?
Thanks for your time.
Note. This query will eventually be called from C#, so the plan was to write the very long query to a temporary .txt file, execute it using the method outlined and delete when finished.
One way to get around the BCP limitation is to wrap the complex query in a view or stored procedure, then have the BCP command query that object.
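For example (the view name and columns below are made up for illustration; the long field list and WHERE clause stand in for the ones in the question):

```sql
-- Wrap the long query in a view once...
CREATE VIEW dbo.vw_ExportData
AS
SELECT /* lots of field names here */ col1, col2
FROM [SomeDatabase]..TableName
WHERE /* some long where clause */ col1 IS NOT NULL;
GO

-- ...then the bcp command line stays short regardless of the query's length
DECLARE @sql varchar(8000);
SELECT @sql = 'bcp "SELECT * FROM SomeDatabase.dbo.vw_ExportData" '
    + 'queryout "M:\SomeDir\SomeOtherDirectory\Data.txt" -c -t -T -S' + @@servername;
EXEC master..xp_cmdshell @sql;
```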
Your SQLCMD may not work because of the brackets around localhost. Try:
...
SET @cmd = 'SQLCMD -E -S localhost -d ' + @DBName +
...
You can insert the desired data into a global temp table (##temp_table) and then use it as the source:
declare @sql varchar(8000)
select @sql = 'bcp "SELECT * FROM ##temp_table" ' +
'queryout "M:\\SomeDir\\SomeOtherDirectory\\Data.txt" -c -t -T -S' + @@servername
exec master..xp_cmdshell @sql

Write rows to CSV from SQL Server Trigger

We need to write a trigger so that when rows meeting certain conditions are inserted into a SQL Server table, they are written to a Windows flat file in CSV format.
Are there any commands short of running a xp_cmdshell that would allow us to do this?
Any reason you wouldn't instead do a scheduled process with SSIS?
Depending on your transaction rate, I'd be hesitant to put this in a trigger, since it means evaluation and possible export on every record entered. If you get a high frequency, you could shoot yourself in the foot...
Even if the transaction rate is fairly low, you could still have problems like blocking/locking if the physical writes take long. You also introduce several possible classes of errors (file I/O errors, a failed write means the trigger fails which means the insert fails, etc.).
With a scheduled process you only take a hit on an infrequent basis and you don't potentially lock your table while the trigger is doing something external.
Use Bulk Copy Program (BCP) to create CSV files. For example:
BCP master..sysobjects out c:\sysobjects.txt -c -t, -T -S<servername>
The basic format of the BCP command for creating a CSV file is as follows:
BCP <table or view> out <file name> <switches>
The switches used here are:
-c Output in ASCII with the default field terminator (tab) and row terminator (CRLF)
-t, override the field terminator with ","
-T use a trusted connection; -U and -P may be used for username/password instead
-S connect to this server to execute the command
Here's another example:
declare @sql varchar(8000)
select @sql = 'bcp master..sysobjects out
c:\bcp\sysobjects.txt -c -t, -T -S' + @@servername
exec master..xp_cmdshell @sql
Here's a description of how to use BCP: http://msdn.microsoft.com/en-us/library/ms162802.aspx