Passing variables between batch file and .sql file for MS SQL Server

I am attempting to create a batch file in windows that will take a user's input, and pass that along to a sql file containing the following query, so that I can set a siteid, like in the following sql query:
exec sp_addlinkedserver [sqlserver1]
select * from [sqlserver1].onesource.dbo.admsites where siteid = '123'
I then want to take the result of this query, specifically the admsiteid, and insert it into the originatorid using another .sql file:
Use Onesource
update OSCsettings set originatorid = 'whatever-the-admsiteid-is'
How would I go about passing along these variables?

Use sqlcmd with the -v command-line option:
-v var = "value"
You can specify multiple variables in the list.
See:
http://msdn.microsoft.com/en-us/library/ms162773.aspx
and
http://msdn.microsoft.com/en-us/library/ms188714.aspx
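For example, a minimal sketch tying the two pieces together (the .sql file names are made up; the server, database and table names come from the question, and it assumes you can connect to sqlserver1 directly rather than through the linked server). sqlcmd replaces $(siteid) and $(admsiteid) in the scripts with the values passed via -v:
get_admsiteid.sql (hypothetical name):
SET NOCOUNT ON;
SELECT admsiteid FROM onesource.dbo.admsites WHERE siteid = '$(siteid)';
update_originator.sql (hypothetical name):
USE onesource;
UPDATE OSCsettings SET originatorid = '$(admsiteid)';
run.bat:
@echo off
set /p SITEID=Enter the site id:
REM Capture the single value returned by the first script (-h -1 suppresses headers, -W trims trailing spaces)
for /f "usebackq delims=" %%i in (`sqlcmd -S sqlserver1 -E -h -1 -W -i get_admsiteid.sql -v siteid="%SITEID%"`) do set ADMSITEID=%%i
REM Pass the looked-up admsiteid into the second script
sqlcmd -S sqlserver1 -E -i update_originator.sql -v admsiteid="%ADMSITEID%"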

Related

Pass byte[] as parameter to sql insert script

I am trying to upload the binary[] of a Zip folder to my database. I used Get-Content -Encoding Byte -ReadCount 0 to read the data into a variable. I want to use this variable in an INSERT statement. Unfortunately, sqlcmd doesn't like the size of the variable, and gives me this error:
Program 'SQLCMD.EXE' failed to run: The filename or extension is too longAt line:1 char:1.
I have tried using the -Q option to run the query, and also -i to run a sql file.
DECLARE @data varbinary(MAX)
SET @data = '$(data_stuff)'
INSERT INTO MyTable
(v1,v2,v3,v4,v5)
VALUES
(v1,v2,v3,v4,@data)
sqlcmd -S servername -E -i .\file.sql -v data = "$binarydata"
Is there a workaround for doing this?
In a SQL query/batch/.sql file, binary/varbinary/image literal data values must be in hexadecimal format with a 0x prefix:
INSERT INTO tableName ( binaryColum ) VALUES ( 0x1234567890ABCDEF )
I don't know what the maximum length of a binary literal is, but I suspect things might stop working, or be very slow, if you exceed a few hundred kilobytes.
I recommend using ADO.NET directly via PowerShell, which will also let you use binary parameter values (SqlParameter): How do you run a SQL Server query from PowerShell?
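For example, a rough sketch of that approach (the zip path, connection string, table and column names below are placeholders, not taken from the question):
# Read the file bytes and pass them as a varbinary(max) parameter instead of a literal
$bytes = [System.IO.File]::ReadAllBytes("C:\temp\archive.zip")
$connection = New-Object System.Data.SqlClient.SqlConnection("Data Source=servername;Initial Catalog=MyDatabase;Integrated Security=SSPI")
$command = New-Object System.Data.SqlClient.SqlCommand("INSERT INTO MyTable (v1, v2, v3, v4, v5) VALUES (@v1, @v2, @v3, @v4, @data)", $connection)
$command.Parameters.AddWithValue("@v1", "a") | Out-Null
$command.Parameters.AddWithValue("@v2", "b") | Out-Null
$command.Parameters.AddWithValue("@v3", "c") | Out-Null
$command.Parameters.AddWithValue("@v4", "d") | Out-Null
# Size -1 means varbinary(MAX)
$command.Parameters.Add("@data", [System.Data.SqlDbType]::VarBinary, -1).Value = $bytes
$connection.Open()
$command.ExecuteNonQuery() | Out-Null
$connection.Close()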

Generated KML file from SQL query save to local drive

My SQL query generates XML output:
select 'TEST.kml' as name,
(select 'TEST' as name, (
select (
select top 10 issue as name,
null as description,
null as 'Point/coordinates',
(
select
null as altitudeMode,
Coordinates as 'coordinates'
for xml path('Polygon'), type)
from Mapping for xml path('Placemark'), type))
for xml path ('Line') , type)
for xml path ('Doc'), root('kml'))
I want to save the output of the query as an .xml file on a local drive. Please advise.
Not the most elegant way, but it is possible to use the bulk copy program (bcp) and xp_cmdshell to do this. A few things first: xp_cmdshell is blocked by default by SQL Server as part of the security configuration, so you will need to enable it, and bcp requires access to the directory in which you want to create the file.
To enable xp_cmdshell you'll need to run sp_configure and RECONFIGURE; 'show advanced options' has to be enabled first, because xp_cmdshell is an advanced option:
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
GO
Then you can run the following:
EXEC xp_cmdshell 'bcp "SELECT * FROM [Database].dbo.[Table] FOR XML AUTO,
ELEMENTS" queryout "C:\test.xml" -c -T'
Just add your query into it and make sure you add [] around your table names.
The Microsoft documentation covers xp_cmdshell and bcp in more detail.
Using bcp is a solid choice, especially when working with large data sets. Alternatively, you can try SQL Server Management Studio's Export Data wizard.
Open the wizard: right-click the database name, then Tasks, then Export Data.
When the wizard opens, click Next.
Choose SQL Server Native Client, the SQL Server instance, the database name and the authentication method.
Choose where to save the data.
Choose how to get the data (in your case, write a SQL query).
Paste the query.
Review the remaining settings and click Finish.
To save the results of a remote query to a local file, you could use a Powershell script like this example:
$connection = New-Object System.Data.SqlClient.SqlConnection("Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI")
$command = New-Object System.Data.SqlClient.SqlCommand(@"
select 'TEST.kml' as name,
(select 'TEST' as name, (
select (
select top 10 issue as name,
null as description,
null as 'Point/coordinates',
(
select
null as altitudeMode,
Coordinates as 'coordinates'
for xml path('Polygon'), type)
from Mapping for xml path('Placemark'), type))
for xml path ('Line') , type)
for xml path ('Doc'), root('kml'));
"@, $connection);
$connection.Open();
$command.ExecuteScalar() | Out-File -FilePath "C:\KmlFiles\YourFile.kml";
$connection.Close();
The script can be executed from a command prompt by saving the script to a file with a ".ps1" extension and using a command like:
powershell -ExecutionPolicy RemoteSigned -File "C:\PowershellScripts\ExampleExport.ps1"
This command can be scheduled using a Windows Task Scheduler task to automate the export. Alternatively, schedule using a SQL Server agent job with a Powershell or CmdExec step.

How to pass a query stored in a variable to a sql file. Shell script

I am creating a shell script where I have saved entries from a text file into an array. Those values are properly stored and show the correct contents. One of those entries contains a simple query and I want to pass it to a sql file. With that sql query I want to save the results into a text file.
Here is the part of the code that calls the .sql file to run the SQL script:
PURGE_SITES=purge_site.txt
logmsg "USERID - $PURGES_SITE" n
QUERY=${Unix_Array[4]}
echo $QUERY
sqlplus -s $USER/$PASS <<EndSQL
@purges_sites.sql $PURGE_SITES '$QUERY'
EXIT SQL.SQLCODE
EndSQL
For now, the query stored in ${Unix_Array[4]} is "select -1 from dual".
Here are the contents of the .sql file:
set echo off ver off feed off pages 0
accept fname prompt 'Loading Sites...'
spool &1;
&2
/
spool off
It gives me an error and reads &2 as "&2" instead of the query saved in the variable. However, when I edit the .sql file and add something before &2, it will display the correct data from the variable.
Here is the output:
select -1 from dual
File Name===> results.txt
select -1 from dual
Loading Sites...SP2-0042: unknown command "&2" - rest of line ignored.
SP2-0103: Nothing in SQL buffer to run.
Here is the output if I add something before &2.
select -1 from dual
File Name===> results.txt
select -1 from dual
Loading Sites...select * from table_table select -1 from dual
*
ERROR at line 1:
ORA-00933: SQL command not properly ended
I typed in select * from table_table before &2.
So it is actually retrieving the value from the variable, but something needs to come before it in order for it to be passed correctly.
Is there a system execute command in Oracle that will execute a query? &2 just by itself is not allowed.
Won't this help you?
PURGE_SITES=purge_site.txt
logmsg "USERID - $PURGES_SITE" n
QUERY=${Unix_Array[4]}
echo $QUERY
# FRAME YOUR QUERY, PROMPTING USER IN SHELL ITSELF AND SEND TO SQLPLUS DIRECTLY
# BEWARE SQL INJECTION POSSIBLE
# YOU CAN REDIRECT THE SQLPLUS OUTPUT TO A FILE LIKE THIS, NO SPOOL NEEDED
sqlplus -s $USER/$PASS <<EndSQL >> $OUTPUT_FILE
set echo off ver off feed off pages 0
$QUERY
/
EXIT SQL.SQLCODE
EndSQL

SQL Server : export query as a .txt file

I am trying to export my SQL Server query results into a folder in .txt format (this is for an automated job)
I know the equivalent in MySQL works with INTO OUTFILE. Does anyone know the best way to do this in SQL Server 2008 Management Studio?
SELECT DISTINCT RTRIM (s1.SGMNTID) AS 'AccCode',RTRIM (s1.DSCRIPTN) AS 'CodeDesc', CASE
WHEN s1.SGMTNUMB = '1' THEN '1'
WHEN s1.SGMTNUMB = '2' THEN '2'
WHEN s1.SGMTNUMB = '3' THEN '110'
WHEN s1.SGMTNUMB = '4' THEN '4'
WHEN s1.SGMTNUMB = '5' THEN '120'
END AS 'AccountType_id',
CASE WHEN s1.SGMTNUMB = '2'
THEN LEFT(s1.SGMNTID, 2)
ELSE 'DEFAULT'
END AS 'AccGroupName'
FROM GL40200 s1
UNION
SELECT REPLACE ([ACTNUMBR_1]+'-'+ [ACTNUMBR_2]+'-'+ [ACTNUMBR_3]+'-'+[ACTNUMBR_4]+'-'+ [ACTNUMBR_5],' ', '') AS 'AccCode',
'' AS 'CodeDesc',
'0' AS 'AccountType_id',
'Default' AS 'AccGroupName'
FROM GL00100 a
INTO OUTFILE 'C:\Users\srahmani\verian/myfilename.txt'
You do this in the SSMS application, not in the SQL itself.
In the toolbar select:
Query --> Results To --> Results To File
Then execute the SQL statements, and it will prompt you to save the results to a file with an .rpt extension, which you can open in a text editor.
Another way is from the command line, using osql:
OSQL -S SERVERNAME -E -i thequeryfile.sql -o youroutputfile.txt
This can be used from a BAT file and scheduled as a Windows task running under a user that can authenticate to the server.
You can use bcp utility.
To copy the result set from a Transact-SQL statement to a data file,
use the queryout option. The following example copies the result of a query into the Contacts.txt data file. The example assumes that you are using Windows Authentication and have a trusted connection to the server instance on which you are running the bcp command. At the
Windows command prompt, enter:
bcp "<your query here>" queryout Contacts.txt -c -T
You can also call BCP directly as an operating system command in a SQL Agent job step.
You can use Windows PowerShell to execute a query and output it to a text file:
Invoke-Sqlcmd -Query "Select * from database" -ServerInstance "Servername\SQL2008" -Database "DbName" > c:\Users\outputFileName.txt
The BCP utility can also be used from a .bat file, but be cautious with escape sequences (i.e. quotes "" must be used in conjunction with the -q switch) and the appropriate flags.
.bat Example:
C:
bcp "\"YOUR_SERVER\".dbo.Proc" queryout C:\FilePath.txt -T -c -q
REM Add PAUSE here if you'd like to see the completed batch
-q MUST be used in the presence of quotations within the query itself.
BCP can also run Stored Procedures if necessary. Again, be cautious: Temporary Tables must be created prior to execution or else you should consider using Table Variables.
This is quite simple to do, and the answer is covered in other questions. For those of you who are viewing this:
select entries from my_entries where id='42' INTO OUTFILE 'bishwas.txt';

Generate a Properties File using Shell Script and Results from a SQL Query

I am trying to create a properties file like this...
firstname=Jon
lastname=Snow
occupation=Nights_Watch
family=Stark
...from a query like this...
SELECT
a.fname as firstname,
a.lname as lastname,
b.occ as occupation...
FROM
names a,
occupation b,
family c...
WHERE...
How can I do this? I am only aware of using spool to write a CSV file, which won't work here.
These property files will be picked up by shell scripts to run automated tasks. I am using Oracle DB.
Perhaps something like this?
psql -c 'select id, name from test where id = 1' -x -t -A -F = dbname -U dbuser
Output would be like:
id=1
name=test1
(For the full list of options: man psql.)
Since you mentioned spool I will assume you are running on Oracle. This should produce a result in the desired format, which you can spool straight away.
SELECT
'firstname=' || firstname || CHR(10) ||
'lastname=' || lastname || CHR(10) -- and so on for all fields
FROM your_tables;
The same approach should be possible with all database engines, if you know the correct incantation for a literal new line and the syntax for string concatenation.
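For example, a minimal SQL*Plus sketch of the spooling part (the output path is made up, and the literals from the question stand in for the real SELECT with its joins and WHERE clause):
set heading off feedback off pagesize 0 trimspool on
spool /tmp/character.properties
-- replace the literals below with your real columns, joins and WHERE clause
SELECT 'firstname='  || 'Jon'          || CHR(10) ||
       'lastname='   || 'Snow'         || CHR(10) ||
       'occupation=' || 'Nights_Watch' || CHR(10) ||
       'family='     || 'Stark'
FROM dual;
spool off
exit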
It is possible to do this from your command-line SQL client, but as STTLCU notes it might be better to get the query to output something "standard" (like CSV) and then transform the results with a shell script. Otherwise, because a lot of the features you would use are not part of any SQL standard, they would depend on the database server and client application. Think of this step as sort of the obverse of ETL, where you clean up the data you "unload" so that it is useful for some other application.
There are certainly ways to build this into your query application: e.g. if you use something like perl DBI::Shell as your client (which allows you to connect to many different servers using the DBI module) you can jazz up your output in various ways. But here you'd probably be best off if you could send the query output to a text file and run it through awk.
Having said that ... here's how the Postgresql client could do what you want. Notice how the commands to set up the formatting are not SQL but specific to the client.
~/% psql -h 192.168.2.69 -d cropdusting -u stubblejumper
psql (9.2.4, server 8.4.14)
WARNING: psql version 9.2, server version 8.4.
Some psql features might not work.
You are now connected to database "cropdusting" as user "stubblejumper".
cropdusting=# \pset border 0 \pset format unaligned \pset t \pset fieldsep =
Border style is 0.
Output format is unaligned.
Showing only tuples.
Field separator is "=".
cropdusting=# select year,wmean_yld from bckwht where year=1997 AND freq > 13 ;
1997=19.9761904762
1997=14.5533333333
1997=17.9942857143
cropdusting=#
With the psql client the \pset command sets options affecting the output of query results tables. You can probably figure out which option is doing what. If you want to do this using your SQL client tell us which one it is or read through the manual page for tips on how to format the output of your queries.
My answer is very similar to the two already posted for this question, but I try to explain the options, and try to provide a precise answer.
When using Postgres, you can use the psql command-line utility to get the intended output:
psql -F = -A -x -X <other options> -c 'select a.fname as firstname, a.lname as lastname from names as a ... ;'
The options are:
-F : Use '=' sign as the field separator, instead of the default pipe '|'
-A : Do not align the output; so there is no space between the column header, separator and the column value.
-x : Use expanded output, so column headers are on left (instead of top) and row values are on right.
-X : Do not read $HOME/.psqlrc, as it may contain commands/options that can affect your output.
-c : The SQL command to execute
<other options> : Any other options, such as connection details, database name, etc.
You have to choose whether you want to maintain such a file from the shell or from PL/SQL. Both solutions are possible and both are correct.
Because Oracle has to read and write the file, I would do it from the database side.
You can write data to file using UTL_FILE package.
DECLARE
fileHandler UTL_FILE.FILE_TYPE;
BEGIN
fileHandler := UTL_FILE.FOPEN('test_dir', 'test_file.txt', 'W');
UTL_FILE.PUTF(fileHandler, 'firstname=Jon\n');
UTL_FILE.PUTF(fileHandler, 'lastname=Snow\n');
UTL_FILE.PUTF(fileHandler, 'occupation=Nights_Watch\n');
UTL_FILE.PUTF(fileHandler, 'family=Stark\n');
UTL_FILE.FCLOSE(fileHandler);
EXCEPTION
WHEN utl_file.invalid_path THEN
raise_application_error(-20000, 'ERROR: Invalid PATH FOR file.');
END;
Example's source: http://psoug.org/snippet/Oracle-PL-SQL-UTL_FILE-file-write-to-file-example_538.htm
At the same time, you can read from the file using an Oracle external table.
CREATE TABLE parameters_table
(
parameters_coupled VARCHAR2(4000)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY test_dir
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
FIELDS
(
parameters_coupled VARCHAR2(4000)
)
)
LOCATION ('test_file.txt')
);
At this point you have your data in a table with one column holding the coupled parameter and value, i.e. 'firstname=Jon'.
You can read it from Oracle.
You can read it from any shell script because it is plain text.
Then it is just a matter of a query, i.e.:
SELECT MAX(CASE WHEN INSTR(parameters_coupled, 'firstname=') = 1 THEN REPLACE(parameters_coupled, 'firstname=') ELSE NULL END) AS firstname
, MAX(CASE WHEN INSTR(parameters_coupled, 'lastname=') = 1 THEN REPLACE(parameters_coupled, 'lastname=') ELSE NULL END) AS lastname
, MAX(CASE WHEN INSTR(parameters_coupled, 'occupation=') = 1 THEN REPLACE(parameters_coupled, 'occupation=') ELSE NULL END) AS occupation
FROM parameters_table;