Failure to connect to derby database using ij from .bat file - sql

I have a java application from which I need to delete and insert data into a local JDBC derby database. I'm trying to execute a SQL script that does this using the ij utility. I've written a batch file to handle this.
C:
C:\Progra~1\Sun\JavaDB\bin\ij.bat
connect 'jdbc:derby:D:\Documents and Settings\user\My Documents\mydatabase';
run "D:\Documents and Settings\user\sqlscript.sql";
disconnect;
exit;
When I run the batch file, the command prompt will execute up to line 2. The ij utility will load in the command prompt, but then the rest of the commands won't be run. I've tested each line by hand and it works fine (as does the SQL script). Is there anything I need to add to the batch file to make the last 4 lines execute? Thanks.

Put these commands in a file called 'commands.txt'
connect 'jdbc:derby:D:\Documents and Settings\user\My Documents\mydatabase';
run "D:\Documents and Settings\user\sqlscript.sql";
disconnect;
exit;
then run ij as follows from your batch file:
C:\Progra~1\Sun\JavaDB\bin\ij.bat commands.txt
You might have to add the exact path to commands.txt if it is not in your current folder.

Related

Executing script even if they are failures in yml file

Hi all, I am trying to set up a REST API pipeline in AWS CodeBuild, using a custom Newman docker image. One of the build commands is expected to fail, but I want the remaining commands to execute as well; however, the shell stops executing the other commands when the Newman command fails. How do I execute the other commands in the yml file?
One simple way would be:
Create a custom shell script (mycommand.sh) that wraps the command which can cause the error, so that a failure is caught and not propagated (shell has no try/catch, but an `if` test or an `||` fallback serves the same purpose and keeps the script's exit status at zero).
In your CodeBuild yml file, under the commands section, just execute ./mycommand.sh
Source:
https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-cmd.html
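A minimal sketch of such a wrapper script, assuming the file is called mycommand.sh (here `false` stands in for the real Newman invocation, which is an assumption):

```shell
#!/bin/sh
# mycommand.sh - run a command that is allowed to fail without
# failing the whole build step.
newman_failed=0
# "false" stands in for the real (failing) Newman command
false || newman_failed=1   # record the failure instead of propagating it
echo "newman step failed: $newman_failed"
# later build commands still execute, because the || branch exited 0
echo "remaining commands run"
```

In the buildspec's commands section you would then run `chmod +x mycommand.sh` followed by `./mycommand.sh`; the build step exits 0 even when the wrapped command fails.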

Bat file to run a sql query on a schedule through Task Scheduler

I am trying to run a .sql script on a schedule. I have created a batch file to run the script. The script runs fine in SQL Server Management Studio and also when I run the batch file content through cmd.
Contents of the batch file:
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
The sql script is supposed to update a stat table. When I run it through cmd and refresh the stat table, the numbers are updated. But when I run this batch file through Task Scheduler, the only action that seems to be performed is running C:\Windows\SYSTEM32\cmd.exe
The task is stated to be completed successfully but the sql query is just not run.
I am not too experienced with Task Scheduler. Any help here would be very much appreciated. Thanks!
Note: I am not intending to use SQL Server Agent
If you have not done so, you need to set the location in Task Scheduler (TS). In at least some versions of TS, this can only be done when you create a basic task, not from the more general "Create Task..." option. Ensure that all the paths in the batch file are absolute or are based in this location.

Execute External commands Fitnesse/DBFit from within a Test page

I would like to execute external commands in my test from Fitnesse/DBFit from within a Test page.
For example, as part of a Unit Test:
run a remote ssh command towards host ABC and copy a file with win cp command
run a DB2 load command to import the data in the database from the file
run a stored procedure
verify results
Can anyone advise regarding step 1)?
There have been a few implementations of a CommandLineFixture - google commandlinefixture and you'll find them.

Can you write sql commands using a .bat file?

I'm designing a process to create a list of fuzzy duplicates for my colleagues. I have automated most of the process and have used a .bat file to open SQLite. However, the only way I can find to run the code is to manually type:
.read file_name.sql
into the command prompt. Is there a way I could open and run the file with the commands prewritten, like a .bat file? For example:
cd sqlite -- enter directory with sqlite3 inside of it. DOS command
sqlite3 --to open the sqlite3 application DOS command
.read file_name.sql -- SQLite command
Thanks in advance, sorry if the question is trivial.
You can give a command as parameter to the sqlite3 tool:
sqlite3 mydatabasefile ".read file_name.sql"
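As a sketch, the whole flow can go into one script; the database file and SQL file names below are placeholders:

```shell
#!/bin/sh
# create a throwaway SQL script (stands in for file_name.sql)
cat > file_name.sql <<'EOF'
CREATE TABLE t (x INTEGER);
INSERT INTO t VALUES (41), (1);
SELECT SUM(x) FROM t;
EOF
# pass the .read command as an argument; sqlite3 executes it and
# exits, so no interactive session is needed
result=$(sqlite3 mydatabasefile.db ".read file_name.sql")
echo "$result"   # prints 42
```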
You can find information on what you're looking to do with the SQLCMD tool. It allows SQL to be run from the command line.

Teradata client on Unix Solaris

I deploy some .bteq and .sql scripts on a Teradata database. For this I use a client on my desktop called BTEQWin, version 13.10.0.03.
I get the .bteq/.sql files from a version control tool like PVCS/SVN, and once the files are in my workspace folder all I do is drag and drop them from the Windows browser into the BTEQWin client (which I connect to a database prior to the drag/drop, for running those scripts).
Now, I have to automate this whole process in UNIX.
I have written a shell KSH/BASH script which gets all the .bteq/.sql files for a given TAG/LABEL in the version control tool into a given UNIX folder. Now, all I need to do is pass these files one by one (I'll take care of the order) to the Teradata client.
My question: what client do I need to ask the Unix admin team to install on the Unix server, so that I can run something like the following:
someTeraDataCommand -u username -p password -h hostname -d database -f filenametoexectue | tee output_filename.log
Where someTeraDataCommand is the client / executable which will let me run Teradata scripts (like I was doing with BTEQWin on my desktop in a GUI session). Other parameters can be the username, the password, which database to connect to on which server, and which file to run, or the file can be passed to the command using the "<" operator on the command line.
Any idea which client?
Assuming the complete Teradata Tools and Utilities package is installed on your UNIX server (which will have the connectivity tools to talk to Teradata), you should have access to bteq from the command line. Something like this:
bteq < script_file > output_file
Your script file should contain a .LOGON statement to establish the connection:
.LOGON yourTDPID/your_account,your_pw
You might also need to use other commands to set your default database or non-default session values.
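For instance, a minimal script_file might look like this (the TDPID, credentials and database name are placeholders):

```
.LOGON mytdpid/myuser,mypassword;
DATABASE mydb;
SELECT CURRENT_DATE;
.QUIT 0;
```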
Another option would be to combine the SQL and call to BTEQ in a Korn shell script:
#!/usr/bin/ksh
##############
SHELL_NAME=`basename $0`
PRG_NAME=`basename ${SHELL_NAME} .ksh`
LOG_FILE=${PRG_NAME}.log
OUT_FILE=${PRG_NAME}.out
#
bteq <<EOBTQ > ${LOG_FILE} 2>&1
.LOGON {TDPID}/{USERID},{PWD};
--.RUN file=${LOGON}
/* Add your SQL/BTEQ commands here */
.QUIT 0;
EOBTQ
Edit
The double hyphen indicates a single-line comment. Typically on UNIX you do not leave your password in plain text in a KSH script. The .RUN command would instead reference a text file, kept in a suitably secured location, containing the .LOGON {TDPID}/{USERID},{PWD}; command.
The .RUN command in BTEQ allows you to reference another text file containing a series of valid BTEQ commands that you want to run in the current BTEQ script.
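As a sketch in shell, with a placeholder TDPID and credentials, and with the actual bteq call commented out since it only exists where the Teradata utilities are installed:

```shell
#!/bin/sh
# create the logon file readable only by its owner (mode 600)
umask 077
cat > ./logon.btq <<'EOF'
.LOGON mytdpid/myuser,mypassword;
EOF
# the KSH script would then reference the file instead of embedding
# the password:
# bteq <<EOBTQ > script.log 2>&1
# .RUN file=./logon.btq
# /* SQL/BTEQ commands here */
# .QUIT 0;
# EOBTQ
ls -l ./logon.btq
```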
The easiest way to set up the Solaris TTU is to request root sudo and run an interactive installation with the defaults as root. That would cure all client issues.