We were experiencing problems with PowerShell and SQLCMD: when there were spaces in the -v parameter variable, PowerShell wouldn't run the command.
e.g.
sqlcmd ... -v VAR="Some space"
Has anyone experienced this before or know how to fix the problem?
Thanks,
B
The syntax above works at the PS command line but fails within a script.
We struggled a long time with how to make this work. One of our very clever QA guys finally came up with the following:
$variableWithSpaces="one two three"
$mySqlCmd = "sqlcmd -E -S $dbServer -i $script -v var=```"$variableWithSpaces```" "
Invoke-Expression $mySqlCmd
Plug-ugly, but it works.
PowerShell will actually pass the parameter to the program as "VAR=Some space". Maybe sqlcmd stumbles over this. By using
VAR=`"Some space`"
instead it will get passed as VAR="Some space". Maybe that resolves the problem.
I have a file that contains several SQL queries.
Can I somehow run them via isql? (I'm doing the calls from a Bash script, so no access to Perl DBI or JDBC.)
I tried piping them into the isql command via echo /my/file | isql -my-other-parameters, but that didn't work.
Yes.
If you're running ISQL in interactive mode, you can load the entire contents of the file using the :r my-filename command from the > prompt.
From a Bash script it's also possible, but you need to carefully make sure that:
The SQL file you are piping in has a go statement at the end. That is a VERY common cause of issues like the one you mentioned.
That statement has a newline after it.
From a script, you can do it 2 ways: Pass on STDIN via a pipe/redirect; OR, pass in the file name via isql's -i parameter
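A minimal sketch of both approaches, using placeholder credentials and file names with the same Sybase-style isql flags that appear later in this thread:
# the last line of my-queries.sql must be "go", followed by a newline
# Option 1: pass the file on STDIN via a redirect
isql -U myuser -P mypassword -S myserver < my-queries.sql
# Option 2: pass the file name via isql's -i parameter
isql -U myuser -P mypassword -S myserver -i my-queries.sql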
In my case it took isql -n for a piped query to work.
isql -U $DB_USR -P $DB_PWD -S $DB_PATH -D $DB_NAME -w 500 < $FILE
I am learning shell scripting. I have created a shell script whose function is to log into the DB and run a .sql file. Following are the contents of the script:
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
echo "Running SQL Dump - auto_qa_db_sync"
\i auto_qa_db_sync.sql
After running the above script, I get the following error
./autoqa_script.sh: 39: ./autoqa_script.sh: /i: not found
Following one article, I tried reversing the slash, but it didn't work.
I don't understand why this is happening, because when I run the SQL file manually, it works properly. Can anyone help?
Pass the SQL file to the client with its -f option, all in one invocation:
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production and run script"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT -f auto_qa_db_sync.sql
The lines you put in a shell script are (more or less, let's say for now) equivalent to what you would type right at the Bash prompt (the one ending with '$', or '#' if you're root). When you execute a script (a list of commands), each command runs after the previous one terminates.
What you wanted to do was run the client and issue a "\i auto_qa_db_sync.sql" command in it.
What you did was run the client and then, after the client terminated, issue that command in Bash.
You should read about Bash pipelines - they are the way to run programs and feed text into them. Following your original idea for solving the problem, you'd write something like:
echo '\i auto_qa_db_sync.sql' | $DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
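An equivalent sketch using a here-document (assuming, as the question implies, that $DB_PATH points at the psql client), which is handy if you want to send more than one command:
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT <<'EOF'
\i auto_qa_db_sync.sql
EOF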
Hope that helps you understand.
While trying to write a script, I found an interesting issue with cat today. If I do the following at the command line, everything works properly:
var=$(ssh user@server "cat /directory/myfile.sh")
echo $var > ~/newfile.sh
This works and I have a script file with all the proper formatting and can run it. However, if I do the EXACT same thing in a script:
#!/bin/sh
var=$(ssh user@server "cat /directory/myfile.sh")
echo $var > ~/newfile.sh
The file is mangled with carriage returns and weird formatting.
Does anyone know why this is happening? My goal is to ultimately cat a script from a server and run it locally on my machine.
EDIT
I now know that this is happening because of my invoking #!/bin/sh in my shell script. The command line works because I'm using zsh and it is preserving the formatting.
Is there a way to cat back the results regardless of the shell?
As you seem to have figured out, word splitting is off by default on zsh, but on in sh, bash, etc. You can prevent word splitting in all shells by quoting the variable:
echo "$var" > ~/newfile.sh
Note that echo appends a newline to its output by default, which you can suppress (on most echo implementations and builtins) with -n.
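If the final goal is just to fetch the remote script and run it locally, here is a minimal sketch that sidesteps the variable (and its word splitting) altogether, reusing the host and paths from the question:
#!/bin/sh
# write the remote file to disk verbatim, then execute it
ssh user@server "cat /directory/myfile.sh" > ~/newfile.sh
sh ~/newfile.sh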
Currently working with KDE 3.5.
Here is what I would eventually like to do to help my workflow:
Have a script that:
Opens multiple konsole shells
Renames each shell
This is what I have so far:
#!/bin/tcsh -fv
set KPID = `ps -ef | grep konsole | grep -v grep | awk '{print $2}' | tr "\n" " "`
dcop konsole-$KPID konsole newSession
The dcop command works just fine on the command line (substituting the actual pid for the variable), but when I run it through the script, it gives an 'object not accessible' error. No other errors are present.
I've made sure permissions are ok (777) and even added sudo with it, but no luck.
As for the second part, again I have it working on the command line:
dcop $KONSOLE_DCOP_SESSION renameSession "name"
This, however, only works for the active (working) shell, and I am not sure how to get it to work for the others. I have not put this part in the script yet, as I am still working on the first part. Any suggestions would be great.
Thanks.
If it's a script, it doesn't need to be tcsh. See http://www.grymoire.com/Unix/CshTop10.txt
But if you want to pass $KPID into your script, use $1 in your script (argument #1), and call it with:
script $KPID
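For example, a minimal Bash sketch of the whole idea; the handling of the return value is an assumption, based on KDE 3's konsole printing the DCOP object of the session that newSession creates:
#!/bin/bash
# usage: ./newsession.sh <konsole-pid> <session-name>
KPID=$1
NAME=$2
# assumption: newSession prints something like "session-2"
SESSION=$(dcop "konsole-$KPID" konsole newSession)
dcop "konsole-$KPID" "$SESSION" renameSession "$NAME"
Called, per the answer above, as: ./newsession.sh $KPID "my shell".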
I have foo.sql as:
print 'foo=$(foo)'
Then I have in foo.cmd the following shell script:
sqlcmd -i foo.sql -v foo="c:\path"
Running foo.cmd prints:
foo=\path
How do I escape the "c:"? Is the DOS shell eating it, or is it sqlcmd?
cmd's argument delimiters include the equal sign. I've seen in other cases (such as bjam.exe) that the entire parameter sequence has to be quoted to work properly.
Try this:
sqlcmd -i foo.sql -v "foo=c:\path"
If it still strips the "c:" portion, I'd focus on sqlcmd. I don't personally have it installed to test with. This is based solely on experience with similar situations.
OK, my mistake. The above does work.
What I did wrong was: sqlcmd -i foo.sql -v foo='c:\path'
(single quotes, since I tried to pass the value as a '...' SQL string). That won't work; it will chop the c:.
Using another shell can cause this. I just had this when running sqlcmd via PowerShell. Switching to cmd.exe made it work fine.
Use double quotes to escape the ":" and single quotes so that SQL treats the variable value as a string, e.g.
sqlcmd -S . -d myDb -i .\test.sql -v pathToFile = "'D:\Temp\temp\My.csv'"
Escape the backslash,
sqlcmd -i foo.sql -v foo="c:\\path"
It's actually your shell eating the \