SQL*Loader on Unix - giving error SQL*Loader-500: Unable to open file (ABC_CTL.dat)

We are executing the SQL*Loader command from a shell script on RH Linux. The command passes both the control file and the data file as command-line parameters. The syntax used is given below (sensitive data masked):
sqlldr userid=$connstring control=/local/abc-1.2.3/instances/www.abc.com/apps/int/script/bin/ABC_CONTROL.ctl data=$f log=/local/abc-1.2.3/instances/www.abc.com/apps/int/script/logs/ABC.log bad=/local/abc-1.2.3/instances/www.abc.com/apps/int/script/logs/ABC.bad
The data file name is passed as a dynamic variable in a for loop so that multiple files can be processed. The data files have the extension *.app, and the path is /local/abc-1.2.3/instances/www.abc.com/apps/int/script/input.
We have verified that the $f variable correctly points to the data file, and we have also verified the file permissions. We tried changing the directory paths as well.
Still, the script fails with the errors below: **SQL*Loader-500: Unable to open file (ABC_CTL.dat)**, SQL*Loader-553: file not found, SQL*Loader-509: System error: No such file or directory.
The same script runs with the exact same syntax on another server. Please suggest any solutions.
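
For reference, a minimal sketch of the loop described above; $connstring and $f come from the question, while the base-directory variable is illustrative. One detail worth noting: if the data= value ever arrives empty or mangled, SQL*Loader falls back to resolving the data file from the control file's INFILE clause, appending the default .dat extension when none is given, which would explain a name like ABC_CTL.dat in the error.

```bash
#!/bin/sh
# Sketch of the invocation loop (paths shortened into a variable for clarity).
BASE=/local/abc-1.2.3/instances/www.abc.com/apps/int/script

for f in "$BASE"/input/*.app; do
  # Quoting $f guards against word-splitting on unexpected characters.
  sqlldr userid="$connstring" \
         control="$BASE"/bin/ABC_CONTROL.ctl \
         data="$f" \
         log="$BASE"/logs/ABC.log \
         bad="$BASE"/logs/ABC.bad
done
```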

Related

CommandLine Execution from SQL Server not finding CMD variable

I am trying to get a SQL Server Agent job to execute the following code at the command line. It sets a variable to the server name of the SQL Server instance so that it can reach the UNC path and run the executable. This single line works fine when I run it on the server at the command line over RDP, but fails when run in SQL Server Agent as a step of type Operating system (CmdExec). The line is:
FOR /F "usebackq" %i IN (`hostname`) DO SET _exeCall=\\%i\test\test1.exe && %_exeCall%
The results from the log are:
C:\Windows\system32>SET _exeCall=\\testbox\test\test1.exe && _exeCall
'%_exeCall%' is not recognized as an internal or external command,
operable program or batch file.
The machine is correctly named testbox, and the path and executable file are correct, but it does not appear to understand that it should execute the variable _exeCall. This works properly when executed at the command line on the server, but not from SQL Server's CmdExec step.
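
For what it's worth, the log matches how cmd parses that line: %_exeCall% is expanded when the whole line is parsed, before SET has run, so the second half executes an empty or literal value. A minimal sketch of one common workaround (an assumption, not from the thread) is to run the UNC path directly in the FOR body instead of going through a variable:

```bat
REM Sketch: skip the intermediate variable entirely; the FOR body can run
REM the UNC path built from %i directly.
FOR /F "usebackq" %i IN (`hostname`) DO \\%i\test\test1.exe
```

(As a one-line CmdExec command the single %i from the question is correct; inside a .bat file it would need to be %%i.)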

'gunzip' is not recognized as an internal or external command, operable program or batch file. System command 'gunzip' failed

I am trying to analyse my raw GNSS data on the GNSS Analyser app from here https://github.com/google/gps-measurement-tools. The installation guide includes the following step:
4.2 gunzip installation
The automatic ftp code inside GnssAnalysis will download ephemeris zip files, and attempt to unzip them using gunzip.
Download gzip.exe from here http://ftp.gnu.org/gnu/gzip/gzip-1.9.zip
Extract the files from the zip file, rename gzip.exe to gunzip.exe
Move gunzip.exe to somewhere in your Windows path (type path in the Windows Command Prompt to see what your path is; typically you will find a directory C:\Windows\system32 and you can put gunzip.exe there.)
However, in the download I can't find a gzip.exe file, so I tried renaming the gzip.c and gzip.h files instead. That did not work, and I got the error above when attempting to process my own raw data.
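The gzip-1.9.zip archive at that URL is a source distribution, which is why it contains gzip.c and gzip.h rather than a compiled gzip.exe; a prebuilt Windows binary is needed for the rename step to work. As a quick sanity check (not from the guide), you can confirm from a Command Prompt that a real gunzip.exe resolves on the path before launching the tool:

```bat
where gunzip
gunzip --version
```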
I have just tried this and succeeded in importing a DB from a backup file:
gzip -d < C:\Users\my-user\Downloads\my-db-backup.sql.gz | mysql -u root -p MY_DB_NAME

How to delete remote file using Kettle Pentaho

I have a directory on a remote Linux machine where files are archived and kept for a certain period of time. I want to delete a file from the remote (Linux) machine using a Kettle transformation, based on some condition.
If the file does not exist, the job should not throw any error; but if the file does exist at the remote location, the job should delete it, or raise an error if it fails for some other reason, e.g., a permission issue.
Here, the file name will be retrieved as a variable from previous steps of the transformation, and the directory path of the archived files will be a fixed one.
How can I achieve this in a Pentaho Kettle transformation?
Make use of the "Run SSH commands" utility to pass commands to your remote server.
Assuming you do a rm -f /path/file, it won't error for a non-existent file.
You can capture the output and perform error handling as well (a Filter Rows step to trigger the appropriate course of action); see the sketch below.
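A sketch of what the "Run SSH commands" entry could execute; ${FILENAME} is an illustrative stand-in for the variable retrieved in the earlier transformation steps, and the archive path is a placeholder:

```bash
# rm -f exits 0 when the file is absent, so a missing file is not treated as
# an error; real failures (e.g. permission denied) still return non-zero and
# can be caught from the step's output.
rm -f /path/to/archive/"${FILENAME}"
```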
Or you can mount the remote directory on the machine where Kettle runs, and try to delete the file as a regular local one.
Using SSH is, I think, non-trivial: it takes a lot of experimenting to find out the error types and a way to distinguish them; it might be an error with the SSH connection or an error deleting the file.

invoke-sqlcmd failing after Automated SQL install with powershell

I'm using PowerShell 4 to install SQL Server 2014. Everything goes OK except at the very end, where I have a function that runs a script from a .sql file using Invoke-Sqlcmd. I get the following error:
"The term 'invoke-sqlcmd' is not recognized as the name of a cmdlet, function, script file..."
If I try and import the sqlps module I get:
The specified module 'sqlps' was not loaded because no valid module file was found in any module directory.
But here's the kicker: if I open a separate PowerShell terminal, IT WORKS THERE :/ and it continues to fail in the initial terminal.
I'm trying to understand why this is, so any assistance would be greatly appreciated. I'd like to avoid writing a run-once reboot script.
Thanks,
Dan
The existing PowerShell session isn't aware of the SQL modules that were just installed. Take a look at the environment variable $env:PSModulePath. Compare the new shell's variable to the existing one and you should see a missing path like ...\Microsoft SQL Server\110\Tools\PowerShell\Modules\.
As a work-around, modify the path to include the module directory. Like so,
$env:PSModulePath += ";c:\some\path\to\sql"
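
Putting that together, a sketch for the installing session (the module path is illustrative; SQL Server 2014's tools typically land under a ...\120\Tools\PowerShell\Modules directory, and the exact location depends on edition and architecture):

```powershell
# Append the newly installed module directory to this session's search path,
# then import sqlps so Invoke-Sqlcmd resolves without opening a new shell.
$sqlModules = 'C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules'
if ($env:PSModulePath -notlike "*$sqlModules*") {
    $env:PSModulePath += ";$sqlModules"
}
Import-Module sqlps -DisableNameChecking
```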

Import .prpt file in Pentaho Server using Command Line

I want to upload a .prpt (Pentaho report file) to the Pentaho BI Server. I am using the following command:
./import-export.sh --import --url=https://server/pentaho/ --username=user --password=pass --source=file-system --type=files --charset=UTF-8 --path=/public--file-path=/home/kishan/folder/Clients/abc/Daily_Reports/Prpt/xyz.prpt --logfile=/home/user/upload.log --permission=true --overwrite=true --retainOwnership=true
So, I want to pick up the file located at the file-path value above and upload it to the BI server in the public folder. However, I am getting the following error:
CommandLineProcessor.ERROR_0001 - Missing Arguments: file-path
Why is it saying this even though I have this argument in the command above?
I got it working. Here's the command that worked for me:
./import-export.sh --import --url=http://localhost:8080/pentaho --username=admin --password=password --charset=UTF-8 --path=/public --file-path=/home/some_directoryN/Daily_Reports/Prpt/xyz.prpt --logfile=/home/kishan/upload.log --permission=true --overwrite=true --retainOwnership=true
So, for this to work, the --file-path argument must point to the report file on the file system where you run the command, and --path is the location on the BI server where you want the report uploaded. (In the original command above there is no space between --path=/public and --file-path, which is likely why the parser reported file-path as missing.)
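
As a usage sketch building on that answer (directory names are illustrative, flags reused from the command above), the upload can be wrapped in a small loop to publish every report in a directory:

```bash
# Upload every .prpt in the local Prpt directory to /public on the BI server.
for f in /home/kishan/folder/Clients/abc/Daily_Reports/Prpt/*.prpt; do
  ./import-export.sh --import --url=http://localhost:8080/pentaho \
    --username=admin --password=password --charset=UTF-8 \
    --path=/public --file-path="$f" \
    --logfile=/home/kishan/upload.log \
    --permission=true --overwrite=true --retainOwnership=true
done
```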