Ant <sql> and <exec> sqlcmd - Different Outputs

I have a .sql file that is executed using Ant. When I run it with the <sql> task I receive a different output than when I call "sqlcmd" via <exec>.
sql tag output:
[sql] Executing resource: C:\SqlTesting\TestScriptDependencies\Executor.sql
[sql] Failed to execute: Use Library Exec CallSelectSP
[sql] com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Libraty.dbo.libraryDocumentType'.
[sql] 0 of 1 SQL statements executed successfully
exec tag output:
[exec] First SP
[exec] Msg 208, Level 16, State 1, Server MyPC-PC, Procedure getFirstDocumentType, Line 3
[exec] Invalid object name 'Libraty.dbo.libraryDocumentType'.
[exec] Second SP
[exec] Msg 208, Level 16, State 1, Server MyPC-PC, Procedure badSP, Line 3
[exec] Invalid object name 'Libraty.dbo.libraryDocumentType'.
And this is the .sql file.
Print 'First SP'
Exec getFirstDocumentType
Print 'Second SP'
Exec badSP
Go
I wonder if there is a way to make the <sql> task reproduce the same output as the <exec> task.
Thanks.

Looks like the first one is submitting the whole script as a single batch via JDBC, whereas the second is sending each SQL statement via sqlcmd. Hence the print statements succeed (and result in synchronized output, which is not always guaranteed with print; raiserror(str, 10, 1) with nowait; is the only guarantee of timely messaging) and both stored procedure calls are attempted, each producing its own SQL error.
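One way to get closer to the sqlcmd behaviour (a sketch only, assuming you configure the <sql> task to split batches on GO and keep going after failures, e.g. delimiter="go" delimitertype="row" onerror="continue") is to restructure the script so each call is its own batch and to use raiserror ... with nowait instead of print:
-- One call per batch, so each EXEC succeeds or fails on its own.
raiserror ('First SP', 10, 1) with nowait;
Exec getFirstDocumentType
Go
raiserror ('Second SP', 10, 1) with nowait;
Exec badSP
Go
Even then the formatting may not match exactly, since the JDBC driver surfaces informational messages as warnings rather than plain console output.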

Related

Trying to import Datapump dump into remote Oracle Server via SQL

I am trying to import a Data Pump dump file into a remote server via SQL*Plus.
So, local machine -> remote Oracle server.
The dump file is on the local machine on an SMB share.
I have written a little script for that operation; however, it fails with errors.
The Data Pump directory exists (SMB share on the local machine; the DIRECTORY object points to said share).
My script:
DECLARE
ind NUMBER; -- Loop index
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
logfileName VARCHAR2(30);-- Name of the Logfile
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
BEGIN
h1 := DBMS_DATAPUMP.OPEN('IMPORT','FULL',NULL,'IMPORT_XYDESCRIPTOR');
DBMS_DATAPUMP.ADD_FILE(h1,'dump.dmp','DPIMP_REMOTE',NULL,DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE,1);
DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','APPEND');
DBMS_DATAPUMP.START_JOB(h1);
[... more stuff...]
However, I get the following error when I execute the script as SYSDBA:
DECLARE
*
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4056
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4307
ORA-06512: at line 15
line 15 is this line:
DBMS_DATAPUMP.ADD_FILE(h1,'dump.dmp','DPIMP_REMOTE',NULL,DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE,1);
I don't really get why I receive an invalid argument value error; the call is exactly as the Oracle documentation dictates.
Can someone help me figure out what I am doing wrong?
Maybe what I am trying isn't even possible?
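Not an answer as such, but since DBMS_DATAPUMP does all of its file I/O on the database server, the DPIMP_REMOTE directory object has to resolve to a path the remote server itself can read. A quick sanity check (a sketch, run as a suitably privileged user) is:
-- Confirm the directory object exists and note the path it resolves to
-- (the path is interpreted on the database server, not on the client).
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DPIMP_REMOTE';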

Error passing in SSIS package variables from job command line in SSMS

I'm trying to have a job pass in a variable for multiple emails through the command line in SQL Server Management Studio; however, it is throwing an error for a specific /SET call. FYI, User::gEmailTo is a string variable in the package.
In my job creation script, I have the following -
--Declaration
:setvar EmailTo "josh@test.com; mike@test.com; earl@test.com; mark@test.com; bill@test.com"
--Syntax throwing the error:
'/SET "\package.Variables[User::gEmailTo].Value";"$(EmailTo)" '
The output in the command line for the job in SSMS shows:
/SET "\"\package.Variables[User::gEmailTo].Value\"";"\"josh#test.com; mike#test.com; earl#test.com; mark#test.com; bill#test.com\""
The error
Argument ""\package.Variables[User::gEmailTo].Value;josh#test.com; mike#test.com; earl#test.com; mark#test.com; bill#test.com"" for option "set" is not valid. The command line parameters are invalid. The step failed.

SQL Server database project pre- and post-deployment script

I've added an extra column to a table, which I want to initialize using a query in the post-deployment script. Unfortunately I can't seem to write a query that can be run every time, so I'm looking for a way to check in the pre-deployment script whether the column exists and pass an argument or variable to the post-deployment script, which will then run the initialization query once.
Attempt 1: I tried setting a sqlcmd var in the pre-deployment script but the following syntax isn't allowed:
IF COL_LENGTH('dbo.Table','NewColumn') IS NULL
:setvar PerformInitQuery 1
Attempt 2: I've also tried using a normal variable in the pre-deployment script:
DECLARE @PerformInitQuery BIT = 0
IF COL_LENGTH('dbo.Table','NewColumn') IS NULL
SET @PerformInitQuery = 1
And accessing it in the post-deployment script:
IF @PerformInitQuery = 1
BEGIN
:r ".\DeploymentScripts\PerformInitQuery.sql"
END
This last attempt seemed to work when publishing the project from Visual Studio, but not on our build server, which uses SqlPackage.exe to publish the generated .dacpac file to the database.
Error SQL72014: .Net SqlClient Data Provider:
Msg 137, Level 15, State 2, Line 12
Must declare the scalar variable "@PerformInitQuery"
You could try using a temp table to hold values you wish to pass from the pre- to the post-deployment script:
/*
Pre-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be executed before the build script.
Use SQLCMD syntax to include a file in the pre-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the pre-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
select 'hello world' as [Col] into #temptable
picked up in the post-deployment script:
/*
Post-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be appended to the build script.
Use SQLCMD syntax to include a file in the post-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the post-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
declare @var nvarchar(200)
select @var = [Col] from #temptable
print @var
hello world
Update complete.
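Applying the same idea to the original question, a sketch (reusing the COL_LENGTH check from the question and a hypothetical #DeployFlags temp table name) would be a pre-deployment script such as:
-- Pre-deployment: record whether the new column still needs initializing.
-- (#DeployFlags is just an illustrative name.)
IF OBJECT_ID('tempdb..#DeployFlags') IS NOT NULL DROP TABLE #DeployFlags;
SELECT CASE WHEN COL_LENGTH('dbo.Table', 'NewColumn') IS NULL THEN 1 ELSE 0 END AS PerformInitQuery
INTO #DeployFlags;
and then in the post-deployment script:
-- Post-deployment: only run the init script when the flag was set.
IF EXISTS (SELECT 1 FROM #DeployFlags WHERE PerformInitQuery = 1)
BEGIN
    :r ".\DeploymentScripts\PerformInitQuery.sql"
END
Bear in mind that :r is expanded when the deployment script is assembled, so the included file is always inlined; only its execution is conditional, and it must not contain GO separators or the IF block will be broken up.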

What's wrong with this code?

I'm new to SSMS, and I'm trying to execute this code:
use master
create database SQL20145Db2
on primary
( name = Sql2015Data2, filename='C:\Program Files (x86)\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Sql2015Data2',
size=4MB, MaxSize=15, FileGrowth= 20%
)
log on
(Name=Sql2015Log2, filename='C:\Program Files (x86)\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Sql2015Log2',
size= 1MB,MaxSize=5Mb,filegrowth=1MB
)
But the messages pane displays this error:
Msg 5123, Level 16, State 1, Line 2
CREATE FILE encountered operating system error 5(failed to retrieve text for this error. Reason: 15105) while attempting to open or create the physical file 'C:\Program Files (x86)\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Sql2015Data2'.
Msg 1802, Level 16, State 4, Line 2
CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
This error happens because the registry values DefaultData and DefaultLog (which correspond to the default data directory) are either empty or do not exist.
See documentation for more information.
Most of the time, these registry values appear not to exist because they actually need to be accessed as an administrator. So, to fix this issue, simply run whatever application you are using to execute the SQL as an administrator.
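If you want to see where the instance expects its data and log files (and whether defaults are configured at all), a small check you could run first is sketched below; SERVERPROPERTY has supported these two properties since SQL Server 2012, so treat the version as an assumption:
-- Instance-level default file locations; NULL means no default is configured.
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;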

Calling sql file from shell script with parameters

I have developed the following code to call a SQL file from a shell script, testshell.sh:
#!/usr/bin/env ksh
feed=`sqlplus -s uname/pwd <<-EOF
@test.sql 'Test_VAl'
/
exit;
EOF`
echo $feed;
My SQL file is test.sql, which contains the following:
Declare
attributeName varchar2(255):=&1;
BEGIN
DBMS_OUTPUT.put_line (attributeName);
END;
I am getting the following error during execution:
old 3: attributeName varchar2(255):=&1;
new 3: attributeName varchar2(255):=Test_VAl;
attributeName varchar2(255):=Test_VAl; test.sql testshell.sh
ERROR at line 3:
ORA-06550: line 3, column 31: PLS-00201: identifier 'TEST_VAL' must be declared
ORA-06550: line 3, column 16: PL/SQL: Item ignored
ORA-06550: line 5, column 25: PLS-00320: the declaration of the type of this expression is incomplete or malformed
ORA-06550: line 5, column 3: PL/SQL: Statement ignored
Please tell me how to fix this issue.
If your substitution variable is a string then you need to quote it where it's used, not when it's passed in. At the moment it doesn't have quotes, so it's treated as an object identifier, and there is no matching object or variable, hence the error.
So your SQL script would be:
set verify off
DECLARE
attributeName varchar2(255):='&1';
BEGIN
DBMS_OUTPUT.put_line (attributeName);
END;
/
Of course you don't need to define a local variable but I assume you're experimenting with simple cases for now.
The set verify off stops the old and new messages being displayed. Those are useful for debugging but otherwise are usually just noise.
Then you can call it with:
feed=`sqlplus -s uname/pwd <<-EOF
@test.sql Test_VAl
exit;
EOF`
Or if you include the exit in the script you can do:
feed=`sqlplus -s uname/pwd @test.sql Test_VAl`
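One extra detail worth noting: the DBMS_OUTPUT text only shows up in $feed if server output is enabled, so a version of test.sql with the exit built in (a sketch of the whole file, for the shorter invocation above) might look like this:
-- test.sql: expects one argument, prints it, then exits SQL*Plus.
set verify off
set serveroutput on
DECLARE
  attributeName varchar2(255) := '&1';
BEGIN
  DBMS_OUTPUT.put_line(attributeName);
END;
/
exit;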