Problem with BCP writing to .txt file from SQL

I'm using SQL Server 2008 and trying to run this BCP command, but it never creates the file.
-- Export query
DECLARE @qry2 VARCHAR(1000)
SET @qry2 = 'SELECT * FROM #SkippedProductsTable'
-- Folder we will be putting the file in
DECLARE @incomingfolder VARCHAR(1000)
SET @incomingfolder = 'c:\Logs'
DECLARE @bcpCommand VARCHAR(2000)
SET @bcpCommand = 'bcp "'+@qry2+'" queryout "'+@incomingfolder+'\SkippedProducts-'+CAST(@StoreMatchCode AS VARCHAR)+'-'+'.txt" -c -T'
PRINT @bcpCommand
EXEC MASTER..xp_cmdshell @bcpCommand, no_output
The created command looks like:
bcp "SELECT * FROM #SkippedProductsTable" queryout "c:\Logs\SkippedProducts-1330-.txt" -c -T
Can anyone suggest what could be going wrong? I've never used BCP before and I'm not really sure where to start looking.
As a start, I know that the folder definitely exists at that location.

I think the problem is the SELECT.
You are SELECTing from a temporary table that only exists in the session that created it. bcp opens its own connection to SQL Server, so when it runs the query there is nothing for it to select from.
Temporary tables and table variables only persist for the context they are created in, so even if you create one in a query, any dynamic SQL, sub-procedure, or external process (like bcp) spawned from that query won't be able to see it.
See this for more info.
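One way around this, if you can change how the data is staged, is to put the rows somewhere that bcp's separate connection can see, such as a global temporary table or a permanent table. A rough sketch (table names are taken from the question; the staging step itself is an assumption, not tested against the original procedure):
-- Stage the rows where another session (bcp) can see them
SELECT * INTO ##SkippedProductsTable FROM #SkippedProductsTable

DECLARE @bcpCommand VARCHAR(2000)
SET @bcpCommand = 'bcp "SELECT * FROM ##SkippedProductsTable" queryout "c:\Logs\SkippedProducts.txt" -c -T'
EXEC master..xp_cmdshell @bcpCommand

-- Clean up once the export has finished
DROP TABLE ##SkippedProductsTable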

Related

bcp outfile not found

I'm trying to run the script below, but it returns null. When I run the DOS command, it generates the file normally.
DECLARE @str VARCHAR(1000)
SET @str = 'bcp "Select * FROM WDG.dbo.Facilidade" queryout "w:\xyzTable.txt" -S "WDG-NOTE24\MSSQLWDG" -T -c -t ; '
EXEC xp_cmdshell @str
GO
I need the query data returned in a txt file separated by ';'.
Thanks.
Given the error in your comments, follow the link below; as I described earlier, it is an access problem on your folder.
Remember the permission should be given to your SQL Server service account, not to yourself.
BCP unable to open BCP host access
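If that is the case, one hedged fix is to grant the SQL Server service account rights on the export folder; the folder path and the service account name below are only examples for a default instance, so substitute your own:
EXEC master..xp_cmdshell 'icacls "C:\ExportFolder" /grant "NT Service\MSSQLSERVER":(OI)(CI)M'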
I managed to make it work. I found the command below, which tests whether SQL Server has access to the directory.
For some reason it did not accept the old path, so I created another folder on another disk and it worked.
EXEC master..xp_cmdshell 'DIR C:\sql'
Thank you very much for the help.
I have a problem: at the end of my import file there is the text '--END--', and when the BULK INSERT runs it raises an unexpected end-of-file message.
Is there a parameter I can use so that the import finishes when it finds that text?
declare @sql varchar(max)
set @sql = 'BULK INSERT Temp_Facilite FROM ''' + @@FullPath + ''' WITH (FIRSTROW = 2, CODEPAGE = ''RAW'', FIELDTERMINATOR = '';'', ROWTERMINATOR = ''0x0A'', MAXERRORS = 3, KEEPNULLS );'
exec (@sql)
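BULK INSERT does not have an option to stop at a marker row (LASTROW needs a fixed row number, not a piece of text), so one workaround is to strip the trailing '--END--' line before running the BULK INSERT. A rough sketch via xp_cmdshell and PowerShell, where the file path is only an example standing in for your @@FullPath value:
DECLARE @strip VARCHAR(1000)
SET @strip = 'powershell -NoProfile -Command "(Get-Content ''C:\import\facilidade.txt'') | Where-Object { $_ -ne ''--END--'' } | Set-Content ''C:\import\facilidade.txt''"'
EXEC master..xp_cmdshell @strip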

Passing parameters to BCP command in SSIS execute SQL task

First of all, can I pass parameters for the path in a BCP command?
I wrote this query in an Execute SQL Task:
EXEC xp_cmdshell 'bcp "SELECT * FROM TLC.dbo.World_Hosts" queryout "C:\Users\akshay.kapila\Desktop\TLC\Foreachlearn\Dest\?.txt" -T -c -t '
I have put ? in the path, but I want specific countries there instead. They are held in the variable "Country". I am using a Foreach Loop, and each iteration should create a file for that specific value, e.g. Aus.txt or In.txt.
Can I do it this way? If not, how can I pass a variable value to the path in the BCP command?
You can set SQLSourceType to Variable in your Execute SQL Task.
Create a variable to hold your bcp command; it may look like:
"EXEC xp_cmdshell 'bcp \"SELECT ''abc'' as output\" queryout \"" + @[User::strFileName] + "\" -T -c -t '"
Here @[User::strFileName] is the dynamic file name you want to generate.
Then in the Execute SQL Task, change SQLSourceType to Variable, and select the variable you just created.
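Applied to the question, the expression for that command variable might look something like the sketch below. It assumes a string variable User::Country populated by the Foreach Loop and a Country column to filter on, neither of which is confirmed in the question:
"EXEC xp_cmdshell 'bcp \"SELECT * FROM TLC.dbo.World_Hosts WHERE Country = ''" + @[User::Country] + "''\" queryout \"C:\\Users\\akshay.kapila\\Desktop\\TLC\\Foreachlearn\\Dest\\" + @[User::Country] + ".txt\" -T -c -t'"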

SQL Server Calling a stored procedure from another stored procedure at the command line

I have been playing around with database backup automation scripts and in particular the one at this link:
http://support.microsoft.com/kb/2019698
I got everything working fine and even added automated compression using 7zip, logging, and, with the help of VBScript, a scheduled email notification. However, even without all that, you can see this is a bit heavy. It's now easily reaching 400 lines of code.
I am really not comfortable having all my stuff in one block like this and I want to separate it out. So I can have, say, a compression file called BackupCompress.sql and a log file called BackupLogReport.sql, all of which would be called from inside the main Backup.sql script.
The Backup.sql script is in turn run from a Backup.bat file which is set to run in the scheduler.
All of this works like a charm. But I am at a loss as to how to call BackupCompress.sql from within BackupLogReport.sql and pass in parameters and get a return value.
In the Backup.bat file I use this command to spin everything up and pass parameters to it:
SQLCMD -S %SQLDATABASE% -d master -i %BACKUP_FOLDER%\Backup.sql -v Param1="%Param1%"
In the Backup.sql file I get those parameters simply by:
DECLARE @Param1 NVARCHAR(256) = '$(Param1)'
from then on as my script runs it uses whatever I want to pass in.
I tried using standard sql stored procedure logic to call another procedure like this:
EXEC BackupCompress.sql
@AnotherParam = @Param1
I also tried:
EXECUTE sp_executesql BackupCompress.sql @Param1
Finally I tried:
SET @cmd = 'SQLCMD -S ' + @@ServerName + ' -d master -i $(BACKUP_FOLDER)\BackupCompress.sql -v Param1 = ' + @Param1
EXEC xp_cmdshell @cmd, no_output
but it doesn't work, and my files, which were being compressed before, simply don't get compressed. I get no error message; everything else continues to work fine.
EDIT: I was getting an error message on the last one, but I fixed it. However, I still don't get my little zip file. I even put PRINTs into the file to see if it was actually being executed, but it does not seem to be.
EDIT2: Another option I have tried almost works, but I can't figure out how to pass parameters from within one SQL file to the other. As a result it generates an error saying it can't find the file, because it's treating the path as a literal string instead of the variable value I want to pass.
:!!SQLCMD -S @@ServerName -d master -i @CFG_BACKUP_PATH\BackupCompress.sql -v Param1 = @Param1
xp_cmdshell can return values. These values can be captured into a table variable that you could use to "see" the results, and perhaps determine where the problem lies:
DECLARE @cmd VARCHAR(255);
DECLARE @Param1 NVARCHAR(256) = '$(Param1)';
DECLARE @Results TABLE
(
ResultsText NVARCHAR(MAX)
);
SET @cmd = 'SQLCMD -S ' + @@ServerName + ' -d master -i $(BACKUP_FOLDER)\BackupCompress.sql -v Param1 = ' + @Param1;
-- For a first test, overwrite the command with a simple DIR to prove the output capture works:
SET @cmd = 'DIR \';
INSERT INTO @Results (ResultsText)
EXEC xp_cmdshell @cmd;
SELECT *
FROM @Results;
You need to ensure xp_cmdshell is enabled for the instance, by executing:
EXEC sp_configure 'xp_cmdshell',1;
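If it has never been enabled before, the full sequence also needs 'show advanced options' and a RECONFIGURE, roughly:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;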

How to pass parameters to SQL script via Powershell

I have a PowerShell script that calls a SQL script. This is currently working, but inside my SQL script I have some hard-coded parameters that I would like to pass to it via PowerShell.
So this is the snippet from the PowerShell script:
function ExecSqlScript([string] $scriptName)
{
$scriptFile = $script:currentDir + $scriptName
$sqlLog = $script:logFileDir + $scriptName + "_{0:yyyyMMdd_HHmmss}.log" -f (Get-Date)
$result = sqlcmd -S uk-ldn-dt270 -U sa -P passwordhere3! -i $scriptFile -b | Tee-Object -FilePath $sqlLog
if ($result -like "*Msg *, Level *, State *" -Or $result -like "*Sqlcmd: Error:*")
{
throw "SQL script " + $scriptFile + " failed: " + $result
}
}
try
{
ExecSqlScript "restoreDatabase.sql"
}
catch
{
# Some error handling here
}
And this is from the SQL
USE MASTER
GO
DECLARE @dbName varchar(255)
SET @dbName = 'HardCodedDatabaseName'
So I want to pass the value for dbName, any ideas?
You could take advantage of sqlcmd's scripting variables. Those can be used in the script file and are marked with $(). Like so:
-- Sql script file
use $(db);
select someting from somewhere;
When calling sqlcmd, use the -v parameter to assign variables. Like so,
sqlcmd -S server\instance -E -v db ="MyDatabase" -i s.sql
Edit
Mind the Sql syntax when setting variables. Consider the following script:
DECLARE @dbName varchar(255)
SET @dbName = $(db)
select 'val' = @dbName
As passed to SQL Server, it looks like this (Profiler helps here):
use master;
DECLARE @dbName varchar(255)
SET @dbName = foo
select 'val' = @dbName
This is obviously invalid syntax, as SET @dbName = foo won't make much sense. The value ought to be within single quotes, like so:
sqlcmd -S server\instance -E -v db ="'foo'" -i s.sql
Just in case someone else needs to do this... here is a working example.
PowerShell script:
sqlcmd -S uk-ldn-dt270 -U sa -P 1NetNasdf£! -v db = "'DatabaseNameHere'" -i $scriptFile -b | Tee-Object -filepath $sqlLog
Note the -v switch to assign the variables
And here is the MS SQL:
USE MASTER
GO
if db_id($(db)) is null
BEGIN
EXEC('
RESTORE DATABASE ' + $(db) + '
FROM DISK = ''D:\DB Backup\EmptyLiveV5.bak''
WITH MOVE ''LiveV5_Data'' TO ''C:\Program Files (x86)\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\LiveV5_' + $(db) + '.MDF'',
MOVE ''LiveV5_Log'' To ''C:\Program Files (x86)\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\LiveV5_' + $(db) + '_log.LDF'', REPLACE,
STATS =10')
END
Note: You do not have to assign the scripting variable to a normal SQL variable like this:
SET @dbName = $(db)
You can just use it directly in your SQL code. Happy coding.
Here's a full example using a different PowerShell approach. Here I'm using a specific script to reset local databases for a clean development environment.
## Reset the Database
$resetScript= "C:\ResetSite\resetDatabases.sql"
Write-Host "Resetting the DB - Running $($resetScript)"
$connectionString = "Server = localhost; Database = 'master'; UID = myusername; PWD = mypassword"
# Create variables & params
$sqlCmdVariables = @(
"Database=$($siteConfig.db_name)",
"UserName=$($siteConfig.db_username)",
"UserPassword=$($siteConfig.db_user_password)"
)
$sqlCmdParameters = @{
InputFile = $resetScript
QueryTimeout = 1800
ConnectionString = $connectionString
Variable = $sqlCmdVariables
}
# Invoke
Invoke-SqlCmd @sqlCmdParameters
The .sql file then uses the parameters passed in, the same way @nmbell mentions.
-- Declare the vars
DECLARE @Database nvarchar(100), @UserName nvarchar(100), @UserPassword nvarchar(100)
-- Set the vars
SET @Database = '$(Database)' -- database name
SET @UserName = '$(UserName)' -- SQL login and database username
SET @UserPassword = '$(UserPassword)' -- login password
... more stuff here.. use the vars like normal
This is partly derived from this blog post but modified slightly to use a file rather than an inline query.
Adjusting vonPryz's answer to use:
SET @dbName = '$(db)'
means you can pass in the parameter from the command line in a more natural form as
sqlcmd -S server\instance -E -v db ="foo" -i s.sql
The SqlCmd variable still substitutes correctly.
I know this is an old question, but I do have a better way that is much easier if you only need a small amount of data from PowerShell (or even a large amount, as long as all you want is text), and you work mainly in SQL for your scripting like I do:
1: Start the PowerShell script from SQL using xp_cmdshell, and insert the results into a one-column table that allows NULLs, e.g.:
DECLARE @Results TABLE (Line varchar(1000) NULL)
INSERT @Results
EXEC master.dbo.xp_cmdshell '"powershell.exe C:\PowershellScripts\MyScript.ps1 MyParams"'
2: During your PowerShell script, for anything you want to pass back to SQL, simply use "Write-Output", e.g:
Write-Output $returned_data
You can do this as many times as you want. If you have 10,000 values to pass back to SQL, then you could use write-output 10,000 times.
So in the above example once the "MyScript.ps1" PowerShell script finishes running, all of the output will be in the #Results table variable, ready to be used, queried, imported into individual variables, whatever you want really.
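To illustrate the consuming side (a sketch; the table name follows the example above, and filtering out xp_cmdshell's trailing NULL row is just one way you might clean the output):
-- Read the captured PowerShell output back out of the table variable
SELECT Line
FROM @Results
WHERE Line IS NOT NULL    -- xp_cmdshell appends a NULL row at the end of its output

-- Or pull a single returned value into a scalar variable
DECLARE @FirstValue varchar(1000)
SELECT TOP (1) @FirstValue = Line FROM @Results WHERE Line IS NOT NULL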

My bcp hangs after creating an empty file

I am trying to drop a file into a directory on the local machine (same machine running SQL Instance). The content of the table I am trying to drop out is in xml format.
i.e. table = xmlOutFiles, fieldName = xmlContent; fieldName contains essentially varchar(max) data that is to become the xml file we need.
When the bcp command is executed, it seems to create the file in the @dest location, size = 0 bytes, and then the process running from within SSMS just sits there waiting for something!
I cannot do anything with that empty file, like delete it, unless I use Task Manager to kill the bcp.exe process.
I have tried multiple combinations of the bcp flags, etc.
Running the bcp command from a system prompt, replacing the "@vars", seems to work, but I really need it to be part of my SQL trigger script and function.
Assistance appreciated!!!!
Select @dest = (Select filename from xmlOutfiles)
Select @cmd = 'bcp "Select xmlContent from ProcureToPay.dbo.XmlOutFiles" queryout '+@dest+' -x -w -T -S' + @@servername
Exec xp_cmdshell @cmd
I have tried executing with -T and -Uusername -Ppassword parameters, etc.
This command works from the DOS prompt:
bcp "Select xmlContent from Procure.To.Pay.dbo.XmlOutFiles" queryout c:\temp\test.xml -x -w -T S<myservernameHere>