I have an application where I manually (1) go in and back up certain files, and (2) copy those files over via WinSCP.
I would like to use two scripts/cron jobs to automate these tasks.
Here are the steps/commands I use to back up the files:
sudo as admin
cd to directory
export PATH=$PATH:/<filePath>/
./export-<appData>.sh Backups/ /<filePath>/
Once that is complete, I log in to WinSCP and look in the directory where the data is backed up. I then copy the files over to the Backups shared directory manually.
Any help with creating a script/cron job to automate this process would be greatly appreciated.
Thank You!
You can create a shared drive in Windows that is accessible through Samba on Linux; then write a script that copies the files from your Linux server to the Windows server.
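As a rough sketch of what that script might look like (the share name, mount point, and credentials file below are placeholders, and the export command is copied from your steps), something along these lines could be run from cron:
#!/bin/sh
# Hypothetical backup-and-copy script -- adjust paths, share name, and credentials
export PATH=$PATH:/<filePath>/

# Run the application's export, as in the manual steps
cd /<filePath>/
./export-<appData>.sh Backups/ /<filePath>/

# Mount the Windows "Backups" share over CIFS/Samba (needs cifs-utils) and copy
MOUNTPOINT=/mnt/backups
mount -t cifs //windows-server/Backups "$MOUNTPOINT" -o credentials=/root/.smbcreds
cp -r /<filePath>/Backups/* "$MOUNTPOINT"/
umount "$MOUNTPOINT"
A crontab entry such as 0 1 * * * /usr/local/bin/backup-copy.sh (run as a user allowed to mount the share) would then schedule it nightly.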
I hope this will be useful.
Does anyone know if it's possible to have Rundeck (or another open source scheduler) kick off a job based on a file being detected on its filesystem?
For ProActive, an open source scheduler, we developed a small script that lets you do exactly that: it watches a folder for changes over a selected period.
Directory Monitoring Script
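If you just want the general idea, a minimal polling loop in shell (purely illustrative, not the ProActive script itself; the watched folder and interval are placeholders) could look like this:
#!/bin/sh
# Poll a watched folder every 60 seconds and report files not seen before
WATCH_DIR=/data/incoming
SEEN_LIST=/tmp/seen_files.txt
touch "$SEEN_LIST"
while true; do
    for f in "$WATCH_DIR"/*; do
        [ -f "$f" ] || continue
        if ! grep -qx "$f" "$SEEN_LIST"; then
            echo "$f" >> "$SEEN_LIST"
            echo "New file detected: $f -- kicking off the job"
            # replace this echo with the command that triggers your scheduler job
        fi
    done
    sleep 60
done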
Let me know if you have any issue.
I am using Pentaho Community Edition 5.4.0. Let me explain my requirement simply:
1) I have my jobs and transformations on my local Windows machine and I would like to execute them on my client's machine, so I installed the same Pentaho Community version 5.4.0 there. For remote execution I have heard about the Carte.bat service. I searched for the installation procedure and the configuration settings for remote execution, but I did not get a clear idea of how it works. Please give me a clear, step-by-step procedure for running the jobs remotely on my client's machine.
2) Is it possible to schedule those jobs and transformations in Pentaho Community Edition 5.4.0? If it is possible, please explain how.
Thanks and Regards
Dhamodharan.
Install Jenkins
https://wiki.jenkins-ci.org/display/JENKINS/Installing+Jenkins
At the very least, read up on what environment variables are available in Jenkins; it is pretty handy to know them.
Download PDI (Kettle) from http://pentaho.com and unzip it into any suitable directory.
Configure the executables and PDI variables as described here:
How to configure Database connection for production environment in Pentaho data integration Kettle transformation
Start Jenkins and log in to the admin panel. Create a new job; in the Build section add an "Execute shell" step and put these lines in the text area:
cd $WORKSPACE
kitchen.sh -file=main.kjb
Done.
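If you want to pass Jenkins build variables into the job, Kitchen also accepts named parameters; RUN_ID below is just an illustrative parameter name that would have to exist in the job:
cd $WORKSPACE
# pass the Jenkins build number into a job parameter called RUN_ID
kitchen.sh -file=main.kjb -param:RUN_ID=$BUILD_NUMBER -level=Basic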
There are a lot of Jenkins plugins.
You can add post-build actions:
notify by email
archive/publish the results
... and so on
Jenkins is worth using if it already exists in your infrastructure for other purposes; otherwise Carte will be enough.
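If Carte is enough, starting a slave server on the client machine is a one-liner; the port below is just an example value:
# Start a Carte slave server listening on all interfaces, port 8081
cd /opt/R1/data-integration
./carte.sh 0.0.0.0 8081
Spoon (or Kitchen) on your own machine can then be pointed at that slave server for remote execution.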
The variables are configured in .bashrc and .bash_profile (the user should be the same one Jenkins runs as):
#.bashrc
export KETTLE_HOME=/opt/R1/data-integration
export KETTLE_JNDI_ROOT=$KETTLE_HOME/simple-jndi
export PATH=$PATH:$KETTLE_HOME
To force .bashrc to be evaluated on SSH login, add this to .bash_profile:
#.bash_profile
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
Then run:
source ~/.bashrc
and afterwards restart Jenkins (not from the admin panel, but the service itself, so the new environment is picked up).
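To confirm that the Jenkins user actually picked up the variables, you can temporarily add something like this to the top of the "Execute shell" step:
# Check that the Kettle environment is visible to the Jenkins build
echo "KETTLE_HOME=$KETTLE_HOME"
echo "KETTLE_JNDI_ROOT=$KETTLE_JNDI_ROOT"
which kitchen.sh || echo "kitchen.sh is not on PATH"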
Hello guys, I'm using Kettle Spoon 5.0.1 and I can't find the Schedule perspective, or any perspective other than Data Integration. Can you help me, please?
If you need scheduling in CE, you have to do it yourself. Fortunately this isn't terribly hard. Just use an external scheduler to launch transforms and jobs via Pan and Kitchen.
For example, I use pgagent since I have PostgreSQL. So I set up my schedules and launch my job as follows:
set KETTLE_PASSWORD=Encrypted (Some pwd)
"C:\Program Files\pentaho\Daily_Jobs.bat" > "C:\Program Files\pentaho\Daily_Jobs.log"
And then Daily_Jobs.bat is this:
cd /d "C:\Program Files\pentaho\"
Kitchen.bat /rep:"ETL_PROD" /job:"Daily Jobs" /dir:"/Finished" /user:Brian /level:Basic
This runs the job "Daily Jobs" in the "/Finished" directory of my PDI repository as user "Brian", with basic logging, using the password stored in KETTLE_PASSWORD. These are Windows batch files, but .sh scripts on Linux work just as well. No, they don't have to be in two separate batch files (I forget why I did it that way originally).
While I use pgagent, any scheduler that can launch batch files/shell scripts should work.
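On Linux the rough equivalent would be a small shell script launched from cron; this is only a sketch, with the repository, job, directory, and user names taken from my Windows example and the paths being assumptions:
#!/bin/sh
# daily_jobs.sh -- launch the "Daily Jobs" job via Kitchen on Linux
# export KETTLE_PASSWORD here as in the Windows example if you rely on it
cd /opt/pentaho/data-integration
./kitchen.sh -rep=ETL_PROD -job="Daily Jobs" -dir=/Finished -user=Brian -level=Basic > /tmp/daily_jobs.log 2>&1
A crontab entry like 0 2 * * * /opt/pentaho/daily_jobs.sh then runs it nightly at 02:00.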
I am new to WinSCP and am attempting to create a script file that will eventually be used with SSIS to download files from an SFTP site. A lot of the literature WinSCP provides explains the file downloading and uploading portions. For the time being, I just want to create a script to test the connection first and will build from there.
So far I have saved the connection in WinSCP and have the following. The code below does not seem to function at all, and I am not sure where else to go as I am still reading about WinSCP scripting. Is there a way, or can someone point me in a direction, to see if I am in fact connecting through the script?
option batch on
option confirm off
open username@address
exit
Not sure what SSIS is (sorry), but I can tell you how I'd set it up from a Windows batch file if that helps:
If you are open to using different software, consider using Cygwin. It mimics a Linux shell, so Linux users on Windows have a lot of Linux utilities handy. That being said, there are some commands which can run on Windows straight from the command prompt (and are thus batchable). What you'd need to do:
1) Install Cygwin.
2) Create a "passwordless" login (using ssh-rsa authentication). To do this, start your Cygwin terminal and use the commands "ssh-keygen" and "ssh-copy-id" (more on that later).
3) Now you can run "sftp" from the DOS command prompt (it does not require the Cygwin terminal) and sftp to your account. No password is required because of step 2).
A few follow-up notes:
What can run from the DOS command prompt and what must be run from the Cygwin terminal?
If you go to the "bin" directory of Cygwin (for me it's c:\cygwin\bin) you can see all the Cygwin utilities. Anything with an "exe" extension can be run from the DOS command prompt; anything without must be started from the Cygwin terminal.
How to set up ssh-rsa authentication?
You can pretty much google "ssh login without password" and pull up a lot of results. This is a common way of setting up logins from one Linux system to another, and you would use the same steps with Cygwin on Windows. My instructions are here:
http://geekswing.com/geek/unix/how-to-ssh-login-without-a-password-using-ssh-keygen-quick-tutorial/
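In short, the key setup and a scripted transfer from the Cygwin terminal boil down to this (the hostname and the batch file name are placeholders):
# Generate an RSA key pair; accept the defaults and leave the passphrase empty
ssh-keygen -t rsa
# Copy the public key to the remote account so future logins skip the password
ssh-copy-id user@sftp.example.com
# sftp then works non-interactively, e.g. in batch mode with a file of commands
sftp -b commands.txt user@sftp.example.com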
Storing session settings in the WinSCP GUI and trying to access them from a WinSCP script running in SSIS is generally a bad idea. I believe there's no example or guide on the WinSCP site that would suggest doing that.
WinSCP stores its configuration in the registry under the HKEY_CURRENT_USER hive. SSIS typically runs under a dedicated system account that has its own HKEY_CURRENT_USER hive and won't see the GUI configuration.
For details see the WinSCP FAQ about your problem:
https://winscp.net/eng/docs/faq_scheduler
The best you can do is isolate your script from the configuration by using a session URL with the open command, instead of the stored site name.
See also https://winscp.net/eng/docs/scripting#configuration
Your actual problem may be something else entirely, but that's hard to guess as you have not shared any details, such as an error message, a log file, etc.
I am using the 7zip standalone .exe to unzip a file, via the Execute Process Task. I have tested this over and over again on multiple machines and I know it works (at least in debug mode/Visual Studio). I have uploaded the package to the server and created a job that calls said package from the Package Store. The package is not able to find the .exe no matter where I put it.
My first thought was to put the .exe on the C:\ drive, which failed. I have also failed in my attempts to place the .exe on a network location that the account the package is running under has full control over.
Basically, has anybody else had issues getting the Execute Process Task to find an executable when the package is uploaded to the server?
The error message is
Can't find 7za.exe in directory C:\7zip
I'll risk a downvote for being wrong, but I believe you have a permission issue.
You say it runs fine on other servers from BIDS; try it without BIDS. Call it from a command line on a box where it works.
dtexec.exe /file C:\HereComesTheUnzipper.dtsx
If that works, then repeat the step on the troublesome server. RDC into the box and try again
dtexec.exe /ser localhost /sq HereComesTheUnzipper
If that still works, then you are looking at an issue with the job. What account is the SQL Agent service running as? Is the SSIS job step running under a particular set of credentials? If so, is it a SQL Server login (which wouldn't map to anything on the physical box)? Regardless of your answer, the resolution will be to ensure the account has access to:
7z.exe
whatever scratch area 7zip may use while unpacking files (I assume %temp%)
the output folder (C:\bin\7z.exe -e e:\data\MyThing.7z)