Why does a database query using sqlcmd produce an empty file while the same query works in Management Studio?

I recently wrote a large MSSQL script which worked until the server environment changed and some queries were no longer allowed. So I had to make a .bat file which executes this query from the command line.
I get no error or anything else; I just get a file with no entries. But I get plenty of rows if I run the same query in Management Studio.
Does anybody see where the mistake in my command line is?
I inserted some line breaks to make the code easier to read. Everything except the PAUSE command is on one line in the batch file.
EDIT: I figured out that the problem is the LIKE operator in the last WHERE clause. If I take the LIKE operator out, it works. It has nothing to do with the % character; it really is the LIKE operator. Does anybody know how to fix that?
sqlcmd -S connection\string -U user -P password -d dbName -s";" -Q "SET NOCOUNT ON;
SELECT [per_nummer] as EmployeeID,
[per_id] as System_nr,
[per_pid] as PID,
[per_anrede] as Gender,
[per_vname] as Vorname,
[per_name] as Name,
[per_telEx] as Telefon,
[per_email] as Email,
[per_instradierungHauptort] as Instradierung,
[per_gebnr] as Gebaeudenummer,
(SELECT mobileTelephoneNumber FROM [dbName].[dbo].[import_zuko_GLDAP_DUMP] WHERE UserID = per_pid ) as Mobile,
[per_business_area] as Business_Area,
(SELECT csgdivision FROM [dbName].[dbo].[import_zuko_GLDAP_DUMP] WHERE UserID = per_pid ) as Division,
NULL as Bereich,
(SELECT csgCompany FROM [dbName].[dbo].[import_zuko_GLDAP_DUMP] WHERE UserID = per_pid ) as Firma,
[per_sprache] as Korespondenzsprache,
(SELECT roomnumber FROM [dbName].[dbo].[import_zuko_GLDAP_DUMP] WHERE UserID = per_pid ) as Bueronummer,
[per_floor] as Etage,
(SELECT convert(varchar, convert(date,[per_eintrittsdatum]), 104)) as Eintritt_per,
(SELECT convert(varchar, convert(date,[per_austrittsdatum]), 104)) as Austritt_per,
CONVERT(VARCHAR(10), GETDATE(), 104) AS letzte_mutation,
per_lm as Linemanager,
(SELECT TOP 1 per_pid FROM [dbName].[dbo].[person] WHERE per_nummer = master_table.per_lm) AS lm_pid,
(SELECT convert(varchar, convert(date,[per_lm_von]), 104)) as lm_von,
(SELECT convert(varchar, convert(date,[per_lm_bis]), 104)) as lm_bis,
NULL FROM [dbName].[dbo].[person] as master_table
WHERE
[per_nummer] IS NOT NULL AND per_pidStatus = 'A' AND
([per_pid] LIKE('A%')OR [per_pid] LIKE('F%')OR [per_pid] LIKE('W%'))"
-w 1000 -W -o "\\servername\G$\path\to\file.csv"
PAUSE

Different default options
Have you read Microsoft's sqlcmd Utility documentation carefully?
Near the top it contains this important note:
Because different default options may apply, you might see different behavior when you execute the same query in SQL Server Management Studio in SQLCMD Mode and in the sqlcmd utility.
Therefore the empty file could be caused by different default options.
Space between an option and its value
The documentation also states near the top:
Currently, sqlcmd does not require a space between the command line option and the value. However, in a future release, a space may be required between the command line option and the value.
Double quotes inside an argument string are often problematic; see the answer on Why double quotes should be always only at beginning and end of an argument string?
Therefore I suggest following Microsoft's advice and changing -s";" to -s ";" with a space character after -s, as is already done for all the other options.
Escaping percent signs
Have you tried escaping each % in the query string with an additional %, i.e. writing %% three times in the query string?
The command-line interpreter cmd.exe can interpret the string between two % signs as a reference to an environment variable and, since there is no such environment variable, remove everything between the two % signs from the command line.
Update: It turned out that the unescaped percent signs did indeed result in a wrong query string and therefore in an empty results file.
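For illustration, here is a shortened sketch of the command line with the percent signs doubled (and with a space after -s, as suggested above); the column list is trimmed and only the WHERE clause in question is shown in full:
sqlcmd -S connection\string -U user -P password -d dbName -s ";" -w 1000 -W -o "\\servername\G$\path\to\file.csv" -Q "SET NOCOUNT ON; SELECT [per_nummer] as EmployeeID FROM [dbName].[dbo].[person] as master_table WHERE [per_nummer] IS NOT NULL AND per_pidStatus = 'A' AND ([per_pid] LIKE 'A%%' OR [per_pid] LIKE 'F%%' OR [per_pid] LIKE 'W%%')"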
Command with path and extension
General hint:
It is always advisable to specify commands like sqlcmd in batch files with their full name, i.e. with path and file extension, as this makes executing the application independent of the values of the environment variables PATH and PATHEXT.
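For example (the installation path below is only an assumption; use whatever path sqlcmd actually has on the machine that runs the batch file):
"C:\Program Files\Microsoft SQL Server\110\Tools\Binn\SQLCMD.EXE" -S connection\string -U user -P password -d dbName -s ";" -Q "..." -o "\\servername\G$\path\to\file.csv"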

Related

Pass byte[] as parameter to sql insert script

I am trying to upload the binary contents (byte[]) of a Zip folder to my database. I used Get-Content -Encoding Byte -ReadCount 0 to read the data into a variable. I want to use this variable in an INSERT statement. Unfortunately, sqlcmd doesn't like the size of the variable and gives me this error:
Program 'SQLCMD.EXE' failed to run: The filename or extension is too long At line:1 char:1.
I have tried using the -Q option to run the query, and also -i to run a sql file.
DECLARE @data varbinary(MAX)
SET @data = '$(data_stuff)'
INSERT INTO MyTable
(v1,v2,v3,v4,v5)
VALUES
(v1,v2,v3,v4,@data)
sqlcmd -S servername -E -i .\file.sql -v data = "$binarydata"
Is there a workaround for doing this?
In a SQL query/batch/.sql file, binary/varbinary/image literal data values must be in hexadecimal format with a 0x prefix:
INSERT INTO tableName ( binaryColum ) VALUES ( 0x1234567890ABCDEF )
I don't know what the maximum length of a binary literal is, but I suspect things might stop working, or become very slow, if you exceed a few hundred kilobytes.
I recommend using ADO.NET directly via PowerShell, which will also let you use binary parameter values (SqlParameter): How do you run a SQL Server query from PowerShell?
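As an aside, an alternative the answer above does not mention: if the zip file is readable from the SQL Server machine and you have bulk-load permissions, you can avoid passing the binary data on the command line entirely and let the server read the file itself with OPENROWSET ... SINGLE_BLOB (the file path and the other column values below are placeholders):
INSERT INTO MyTable (v1, v2, v3, v4, v5)
SELECT 'v1', 'v2', 'v3', 'v4', BulkColumn
FROM OPENROWSET(BULK N'C:\path\to\archive.zip', SINGLE_BLOB) AS blobData;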

SQL Server : export query as a .txt file

I am trying to export my SQL Server query results into a folder in .txt format (this is for an automated job).
I know the equivalent in MySQL works with INTO OUTFILE. Does anyone know the best way to do this in SQL Server 2008 Management Studio?
SELECT DISTINCT RTRIM (s1.SGMNTID) AS 'AccCode',RTRIM (s1.DSCRIPTN) AS 'CodeDesc', CASE
WHEN s1.SGMTNUMB = '1' THEN '1'
WHEN s1.SGMTNUMB = '2' THEN '2'
WHEN s1.SGMTNUMB = '3' THEN '110'
WHEN s1.SGMTNUMB = '4' THEN '4'
WHEN s1.SGMTNUMB = '5' THEN '120'
END AS 'AccountType_id',
CASE WHEN s1.SGMTNUMB = '2'
THEN LEFT(s1.SGMNTID, 2)
ELSE 'DEFAULT'
END AS 'AccGroupName'
FROM GL40200 s1
UNION
SELECT REPLACE ([ACTNUMBR_1]+'-'+ [ACTNUMBR_2]+'-'+ [ACTNUMBR_3]+'-'+[ACTNUMBR_4]+'-'+ [ACTNUMBR_5],' ', '') AS 'AccCode',
'' AS 'CodeDesc',
'0' AS 'AccountType_id',
'Default' AS 'AccGroupName'
FROM GL00100 a
INTO OUTFILE 'C:\Users\srahmani\verian/myfilename.txt'
You do this in the SSMS application, not in the SQL.
In the toolbar select:
Query --> Results To --> Results To File
Then execute the SQL statements and it will prompt you to save the output to a text file with an .rpt extension. Open the results in a text editor.
Another way is from the command line, using osql:
OSQL -S SERVERNAME -E -i thequeryfile.sql -o youroutputfile.txt
This can be used from a BAT file and scheduled under a Windows user that can authenticate against the server.
You can use the bcp utility.
To copy the result set from a Transact-SQL statement to a data file, use the queryout option. The following example copies the result of a query into the Contacts.txt data file. The example assumes that you are using Windows Authentication and have a trusted connection to the server instance on which you are running the bcp command. At the Windows command prompt, enter:
bcp "<your query here>" queryout Contacts.txt -c -T
You can use BCP by calling it directly as an operating system command in a SQL Agent job.
You can use Windows PowerShell to execute a query and output it to a text file:
Invoke-Sqlcmd -Query "Select * from database" -ServerInstance "Servername\SQL2008" -Database "DbName" > c:\Users\outputFileName.txt
The BCP utility can also be used in the form of a .bat file, but be cautious of escape sequences (i.e. quotes inside the query must be escaped with \" as in the example below) and the appropriate flags.
.bat Example:
C:
bcp "\"YOUR_SERVER\".dbo.Proc" queryout C:\FilePath.txt -T -c -q
REM Add PAUSE here if you'd like to see the completed batch
-q MUST be used in the presence of quotations within the query itself.
BCP can also run Stored Procedures if necessary. Again, be cautious: Temporary Tables must be created prior to execution or else you should consider using Table Variables.
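For example, a sketch of running a stored procedure through bcp (dbo.usp_ExportContacts is a placeholder procedure name; -T assumes Windows Authentication as in the earlier example):
bcp "EXEC DbName.dbo.usp_ExportContacts" queryout C:\ProcOutput.txt -T -c -q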
This is quite simple to do, and the answer is available in other answers. For those of you who are viewing this:
select entries from my_entries where id='42' INTO OUTFILE 'bishwas.txt';

Generate a Properties File using Shell Script and Results from a SQL Query

I am trying to create a properties file like this...
firstname=Jon
lastname=Snow
occupation=Nights_Watch
family=Stark
...from a query like this...
SELECT
a.fname as firstname,
a.lname as lastname,
b.occ as occupation...
FROM
names a,
occupation b,
family c...
WHERE...
How can I do this? I am only aware of using spool to a CSV file, which won't work here.
These property files will be picked up by shell scripts to run automated tasks. I am using Oracle DB.
Perhaps something like this?
psql -c 'select id, name from test where id = 1' -x -t -A -F = dbname -U dbuser
Output would be like:
id=1
name=test1
(For the full list of options: man psql.)
Since you mentioned spool I will assume you are running on Oracle. This should produce a result in the desired format, which you can spool straight away.
SELECT
'firstname=' || firstname || CHR(10) ||
'lastname=' || lastname || CHR(10) -- and so on for all fields
FROM your_tables;
The same approach should be possible with all database engines, if you know the correct incantation for a literal new line and the syntax for string concatenation.
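For instance, on SQL Server the equivalent would use CHAR(10) and the + concatenation operator (table and column names taken from the question):
SELECT 'firstname=' || 'lastname=' -- sketch below uses SQL Server syntax
SELECT 'firstname=' + a.fname + CHAR(10) +
       'lastname='  + a.lname   -- and so on for all fields
FROM names a;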
It is possible to do this from your command-line SQL client, but as STTLCU notes it might be better to get the query to output something "standard" (like CSV) and then transform the results with a shell script. Otherwise, because a lot of the features you would use are not part of any SQL standard, they would depend on the database server and client application. Think of this step as sort of the obverse of ETL, where you clean up the data you "unload" so that it is useful for some other application.
For sure there are ways to build this into your query application: e.g. if you use something like perl DBI::Shell as your client (which allows you to connect to many different servers using the DBI module) you can jazz up your output in various ways. But here you'd probably be best off if you could send the query output to a text file and run it through awk.
Having said that, here's how the PostgreSQL client could do what you want. Notice how the commands to set up the formatting are not SQL but specific to the client.
~/% psql -h 192.168.2.69 -d cropdusting -u stubblejumper
psql (9.2.4, server 8.4.14)
WARNING: psql version 9.2, server version 8.4.
Some psql features might not work.
You are now connected to database "cropdusting" as user "stubblejumper".
cropdusting=# \pset border 0 \pset format unaligned \pset t \pset fieldsep =
Border style is 0.
Output format is unaligned.
Showing only tuples.
Field separator is "=".
cropdusting=# select year,wmean_yld from bckwht where year=1997 AND freq > 13 ;
1997=19.9761904762
1997=14.5533333333
1997=17.9942857143
cropdusting=#
With the psql client, the \pset command sets options affecting the output of query result tables. You can probably figure out which option is doing what. If you want to do this using your own SQL client, tell us which one it is, or read through its manual page for tips on how to format the output of your queries.
My answer is very similar to the two already posted for this question, but I try to explain the options, and try to provide a precise answer.
When using Postgres, you can use the psql command-line utility to get the intended output:
psql -F = -A -x -X <other options> -c 'select a.fname as firstname, a.lname as lastname from names as a ... ;'
The options are:
-F : Use '=' sign as the field separator, instead of the default pipe '|'
-A : Do not align the output; so there is no space between the column header, separator and the column value.
-x : Use expanded output, so column headers are on left (instead of top) and row values are on right.
-X : Do not read $HOME/.psqlrc, as it may contain commands/options that can affect your output.
-c : The SQL command to execute
<other options> : Any other options, such as connection details, database name, etc.
You have to choose whether you want to maintain such a file from the shell or from PL/SQL. Both solutions are possible and both are correct.
Because Oracle has to both read from and write to the file, I would do it from the database side.
You can write data to file using UTL_FILE package.
DECLARE
fileHandler UTL_FILE.FILE_TYPE;
BEGIN
fileHandler := UTL_FILE.FOPEN('test_dir', 'test_file.txt', 'W');
UTL_FILE.PUTF(fileHandler, 'firstname=Jon\n');
UTL_FILE.PUTF(fileHandler, 'lastname=Snow\n');
UTL_FILE.PUTF(fileHandler, 'occupation=Nights_Watch\n');
UTL_FILE.PUTF(fileHandler, 'family=Stark\n');
UTL_FILE.FCLOSE(fileHandler);
EXCEPTION
WHEN utl_file.invalid_path THEN
raise_application_error(-20000, 'ERROR: Invalid PATH FOR file.');
END;
Example's source: http://psoug.org/snippet/Oracle-PL-SQL-UTL_FILE-file-write-to-file-example_538.htm
At the same time you can read from the file using an Oracle external table.
CREATE TABLE parameters_table
(
parameters_coupled VARCHAR2(4000)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY test_dir
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
FIELDS
(
parameters_coupled VARCHAR2(4000)
)
)
LOCATION ('test_file.txt')
);
At this point you can write data to your table, which has one column holding a parameter coupled with its value, e.g. 'firstname=Jon'.
It can be read by Oracle.
It can also be read by any shell script, because it is plain text.
Then it is just a matter of a query, e.g.:
SELECT MAX(CASE WHEN INSTR(parameters_coupled, 'firstname=') = 1 THEN REPLACE(parameters_coupled, 'firstname=') ELSE NULL END) AS firstname
, MAX(CASE WHEN INSTR(parameters_coupled, 'lastname=') = 1 THEN REPLACE(parameters_coupled, 'lastname=') ELSE NULL END) AS lastname
, MAX(CASE WHEN INSTR(parameters_coupled, 'occupation=') = 1 THEN REPLACE(parameters_coupled, 'occupation=') ELSE NULL END) AS occupation
FROM parameters_table;

SQLCMD use LIKE '%@%'

I'm trying to run a query using SQLCMD.EXE and have trouble with the LIKE portion.
WHERE email LIKE '%%@%%'
I think it is an error with the cmd prompt rather than SQLCMD.EXE, since I get the error:
Syntax error "@%'"
I am running this via Notepad++ (NppExec) pointing to the bat file like so:
H:\scripts\SQL.bat "$(CURRENT_WORD)"
This causes the query to be wrapped in double quotes before being used by the SQLCMD.EXE call. The SQLCMD.EXE call then runs in the bat file like so:
SQLCMD.EXE -U user -P %pass% -S %server% -Q %sql% -d %table%
It works perfectly on any query I use aside from this LIKE '%%@%%' part.
UPDATE
I've done a few more tests and think I have narrowed it down to being a problem with the % and the @.
So queries like these work fine:
SELECT name FROM table WHERE name LIKE 'test'
SELECT name FROM table WHERE name LIKE 'test%'
SELECT name FROM table WHERE name LIKE '%%test'
But these will cause errors:
SELECT name FROM table WHERE name LIKE '%test'
SELECT name FROM table WHERE name LIKE '%test%'
This is fine since I am OK with doubling the % in my queries, but I've tried %%@% and %%@%% and they throw errors: Syntax error "@'"" or Syntax error "@%'"", respectively.
Also, the reason for the variables is that I included some logic so it can detect table names and run for different servers and databases.
Here is the bat file
set sql=%1
iff %@index[%sql%,sur_] GT -1 THEN
SET SERVER=server1
SET table=tablename
SET pass=password
else
SET SERVER=server2
SET table=tablename
SET pass=password
endiff
SQLCMD.EXE -U usr -P %pass% -S %server% -Q %sql% -d %table%
The weird syntax is due to the command being run through TCC/LE (see here).
I'm not quite sure what your reasoning is for doubling up the %s, but it looks like your intent is to find values in the email column that contain @. If so, you can try rewriting the clause as such:
WHERE CHARINDEX('@', email) > 0
If it's the @ symbol that is tripping things up, use CHAR(64) instead.
WHERE CHARINDEX(CHAR(64), email) > 0
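In the context of the .bat file from the question, the CHARINDEX form has the added benefit that the query string itself no longer contains any % characters for cmd.exe or TCC to mangle. A simplified sketch with the query inlined (the users table name is a placeholder):
SQLCMD.EXE -U user -P %pass% -S %server% -d %table% -Q "SELECT email FROM users WHERE CHARINDEX(CHAR(64), email) > 0"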
When running a query with sqlcmd, I found that the % symbol is removed. Let's say your query is:
SELECT name FROM table WHERE name LIKE 'test%'
sqlcmd will read your query as
SELECT name FROM table WHERE name LIKE 'test'
So sqlcmd will not filter your results. Use %% in the query instead:
SELECT name FROM table WHERE name LIKE 'test%%'
and you will get the result
SELECT name FROM table WHERE name LIKE 'test%'
I have tested this on SQL Server 2005 and 2008.

sqlcmd - How to get around column length limit without empty spaces?

I'm trying to use sqlcmd on a Windows machine running SQL Server 2005 to write a query to a CSV file. The command-line options we typically use are:
-l 60 -t 300 -r 1 -b -W -h -1
However, the columns are getting truncated at 256 bytes. In an attempt to circumvent this, I tried using this command line option in place of -W:
-y 8000
This captures the full fields, but the problem with this method is that the file balloons from just over 1 MB to about 200 MB due to all the extra space (I realize 8000 is probably overkill, but it will probably have to be at least 4000, and I'm currently only working with a small subset of data). The -W option typically eliminates all this extra space, but when I try to use the two together it tells me they're mutually exclusive.
Is there a way to get sqlcmd around this limit, or does anyone know if another program (such as bcp or osql) would make this easier?
Edit:
Here are the code snippets we're using to get the field that's being truncated (similar code is used for a bunch of other fields):
SELECT ALIASES.AliasList as complianceAliases,
...
LEFT OUTER JOIN (Select M1.ID, M1.LIST_ID,stuff((SELECT '{|}' + isnull(Content2,'')+' '+isnull(Content3,'')+' '+isnull(Content4,'')+' '+isnull(Content5,'')+' '+isnull(Content6,'')+' '+isnull(Content7,'')
FROM fs_HOST3_TEST_web.ISI_APP_COMP_MULTI M2 with (nolock)
WHERE M1.LIST_ID = M2.LIST_ID and M1.ID = M2.ID and M1.TYPE = M2.TYPE
FOR XML PATH('')
),1,1,'') as AliasList
FROM fs_HOST3_TEST_web.ISI_APP_COMP_MULTI M1 with (nolock)
WHERE M1.LIST_ID = 2001 AND M1.TYPE = 'Aliases'
GROUP BY m1.list_id,m1.ID,m1.Type) as ALIASES
ON ALIASES.LIST_ID = PAIR.COMP_LIST_ID AND ALIASES.ID = PAIR.COMP_ID
I ended up solving this by using the "-y0" argument. It still left a bunch of whitespace, but it looks like each field was only padded out to the length of its longest piece of data.
I then ran the output through a program that removed repeating spaces and that solved all of the problems.
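For reference, a sketch of the adjusted command line with -y 0 in place of -W, based on the options listed in the question (connection options and file names are placeholders):
sqlcmd -S servername -d dbName -E -i query.sql -l 60 -t 300 -r 1 -b -h -1 -y 0 -o output.csv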