We have call recordings posted to an FTP server every day. As a backup, I want to copy those recordings from the FTP server to a local drive on a daily basis. Is it possible to copy the entire folder as a whole, paste it to my local drive, and rename it?
Thanks in advance.
BIDS in SQL Server 2008 has an FTP Task, but I never use it. There are also several third-party extensions for SFTP and the like, but I don't use those either.
[Edit: Since the above links are to CodePlex, which is shutting down, I'll provide more detail. The first link is to a project titled "SSIS SFTP Task Control Flow Component" and the second is to a project titled "SSIS Extensions - SFTP Task, PGP Task, Zip Task". As of June 2017 the projects do not appear to have an obvious rehosting.]
I tend to use WinSCP with a script file. It's certainly the least Microsoft-y approach, but, in my experience, it's more foolproof. This has the advantage of working for FTP, FTPS, and SFTP. It has the disadvantage that you're storing the password in plain text. You can store it in the registry with WinSCP's session manager, but I believe that's stored per-user, which makes for lots of possible fun when you have the SQL Agent using an SSIS proxy account.
Create your script file with the commands you want to use. Be sure to specify option batch abort and option confirm off. You also might want to specify option failonnomatch, but I haven't tested that one yet myself since I'm still on a slightly older version that doesn't support that option.
Mine tend to look like this:
# Set batch settings
option batch abort
option confirm off
# Connect
open sftp://user:password@sftp.server.com -hostkey="ssh-rsa 2048 00:11:22:33:44:55:66:77:88:99:aa:bb:cc:dd:ee:ff"
# Change the local directory
lcd "M:\SSIS\eSchoolPlus to Clever\Output"
# Transfer files
put -nopreservetime -nopermissions -transfer=ascii file1.csv
put -nopreservetime -nopermissions -transfer=ascii file2.csv
put -nopreservetime -nopermissions -transfer=ascii file3.csv
put -nopreservetime -nopermissions -transfer=ascii file4.csv
put -nopreservetime -nopermissions -transfer=ascii file5.csv
close
exit
The options on put are due to how the remote server works for the script I happened to grab. get works about the same.
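For a download scenario like the original question's (pulling recordings from an FTP server into a local folder), a minimal sketch might look like this; the server name, paths, and wildcard are assumptions, not values from this thread:
# Set batch settings
option batch abort
option confirm off
# Connect (plain FTP this time, so no host key is involved)
open ftp://user:password@ftp.server.com
# Local folder to copy the recordings into
lcd "D:\Backup\CallRecordings"
# Download everything in the remote recordings folder
get /recordings/*
close
exit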
Then you use an Execute Process Task. The executable is C:\Program Files (x86)\WinSCP\WinSCP.exe and the Arguments I use are:
-console -script="X:\Path\To\Script\WinSCPScript.txt" -xmllog="X:\Path\To\Script\WinSCP-!S-!Y-!M-!D-!T.log" -xmlgroups
That creates a daily log of just the WinSCP information. The XML logs are, IMO, more readable and useful than the older log format, which contains a lot of mostly-debugging information.
You can also go the route of using the WinSCP .NET library from a Script Task, but that's obviously a lot more effort. They do have a HOWTO for that, however.
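A minimal sketch of that route, assuming the WinSCP .NET assembly (WinSCPnet.dll) is referenced; the host, credentials, fingerprint, and paths below are placeholders, not values from this thread:
Imports WinSCP
Public Module RecordingBackup
    Public Sub Main()
        ' Placeholder connection details; substitute your own.
        Dim options As New SessionOptions()
        options.Protocol = Protocol.Sftp
        options.HostName = "sftp.server.com"
        options.UserName = "user"
        options.Password = "password"
        options.SshHostKeyFingerprint = "ssh-rsa 2048 00:11:22:33:44:55:66:77:88:99:aa:bb:cc:dd:ee:ff"
        Using session As New Session()
            session.Open(options)
            ' Pull every file in the remote folder down to a local backup folder.
            Dim result As TransferOperationResult = session.GetFiles("/recordings/*", "D:\Backup\Recordings\")
            result.Check() ' Throws if any single transfer failed.
        End Using
    End Sub
End Module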
Yes - you would use a File System Task and set the Operation to Copy Directory (before Microsoft renamed them with Windows 95, folders were called directories). You can change the name in the destination connection where you choose the destination folder, and you can use an expression to add a date if you want (see the sketch below).
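For instance, a hedged example of an expression you could put on the destination connection's ConnectionString property to stamp the copied folder with today's date (the base path is an assumption):
"D:\\Backup\\Recordings_" + (DT_WSTR, 4)YEAR(GETDATE()) + RIGHT("0" + (DT_WSTR, 2)MONTH(GETDATE()), 2) + RIGHT("0" + (DT_WSTR, 2)DAY(GETDATE()), 2)
That evaluates to something like D:\Backup\Recordings_20170630.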
https://msdn.microsoft.com/en-us/ms140185.aspx
Here are some good step-by-step instructions if you need them, since Microsoft's own documentation is vague and has no examples:
http://oops-solution.blogspot.com/2011/11/file-operation-copy-move-rename-and.html
How can I make sure that a file uploaded through SFTP (on a Linux-based system) stays locked during the transfer so an automated system will not read it?
Is there an option on the client side? Or server side?
The SFTP protocol supports locking since version 5. See the specification.
You didn't specify what SFTP server you are using, so I'm assuming the most widespread one, OpenSSH. OpenSSH supports SFTP version 3 only, so it does not support locking.
Anyway, even if your server supported file locking, most SFTP clients/libraries won't support SFTP version 5. And even if they do, they won't support the locking feature. Note that the lock is explicit; the client has to request it.
There are some common workarounds for the problem:
As suggested by @user1717259, you can have the client upload a "done" file once an upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file to its final name once the upload finishes. Make your automated system ignore the ".filepart" files (see the sketch after this list).
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some SFTP servers have this functionality built in. For example, ProFTPD with its HiddenStores directive (courtesy of @fakedad).
A gross hack is to periodically check the file's attributes (size and timestamp) and consider the upload finished if they have not changed for some time interval.
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell when you have downloaded an incomplete file.
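In WinSCP script terms, the rename workaround might look like the sketch below; the file names are assumptions:
# Upload under a temporary name, then rename atomically
put data.csv data.csv.filepart
mv data.csv.filepart data.csv
Newer WinSCP versions can also do this automatically with put -resumesupport=on, which transfers to a .filepart file and renames it on completion.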
A typical way of solving this problem is to upload your real file, and then to upload an empty 'done.txt' file.
The automated system should wait for the appearance of the 'done' file before trying to read the real file.
A simple file-locking mechanism for SFTP is to first upload a file to a directory (folder) where the reading process isn't looking. You can make an alternate folder using the sftp> mkdir command. Upload the file to the alternate directory instead of the ultimate destination directory. Once the sftp> put command completes, do a move (in the OpenSSH sftp client the command is rename):
sftp> rename alternate_path/filename destination_path/filename
Since a rename within the same filesystem just switches file pointers, it is atomic, so it is an effective lock.
So, as part of my daily jobs, I have to transfer one file from each customer's server to our internal server, and any responses back.
Each customer effectively has one file up and one file down each day.
I have an SFTP server here that I can use and is already used manually for a few sites.
I'm looking to automate as many sites as possible using batch files on a scheduled task.
Initially, I'm looking at automating the internal side of the process.
We simply have a requests folder that needs to import from the SFTP server (then delete the original on the SFTP server) and a responses folder which needs to be copied to a 'sent' folder and then exported to the SFTP server (also deleting the original).
On the SFTP server I have a "to site" and a "from site" folder. Each file is site-specific, followed by a variable, so SiteNameImport.<variable> and SiteNameExport.<variable>
EDIT:
I'm asking this as I'm a novice at scripting and basically have no idea what to do.
I've tried reading the automation guide on WinSCP website but a lot of it means nothing to me.
FileZilla doesn't support automation; you're better off with WinSCP. They have some scripting examples here, as well as any other information you'll need to build the script's functionality. You'll just need to add the specifics (like deleting sent files and so on). CuteFTP is another solution you can script with, but I believe you have to pay for a licence. For glue code I suggest VBScript; examples for VBScript can be found here.
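As a starting point, here is a hedged sketch of the internal side described above; every folder, file, and host name is an assumption based on the question, and the mapping of the "to site" / "from site" folders to directions is a guess:
rem transfer.bat - copy responses to 'sent', then run the WinSCP script
copy "C:\Data\Responses\SiteNameExport.*" "C:\Data\Sent\"
"C:\Program Files (x86)\WinSCP\WinSCP.com" /script="C:\Scripts\sitename.txt"
And sitename.txt, the WinSCP script it runs (add a -hostkey switch to the open command, as in the script earlier in this thread):
option batch abort
option confirm off
open sftp://user:password@your.sftp.server
# Import requests, deleting the originals on the SFTP server
get -delete "/from site/SiteNameImport.*" "C:\Data\Requests\"
# Export responses, deleting the local originals after upload
put -delete "C:\Data\Responses\SiteNameExport.*" "/to site/"
close
exit
Schedule transfer.bat with Task Scheduler and you have the automated daily run.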
My .NET app updates a certain .xls file located in a shared folder inside the domain. Using VB.NET, how can I check whether that file is being used by another user and unshare/close it, so my code can update the file properly?
NOTE: The .exe file (my app) will be executed on the server where the shared folder is.
Windows and NTFS are designed to prevent this scenario. Unless you:
Get the Excel instance that is holding the .xls file open to close it
Shut down the server's network connection, assuming the Excel file is opened by client PCs - a highly undesirable step to take
Discover the PC on which the Excel.exe instance is running and use the Windows taskkill command to kill all Excel.exe instances on that machine (since you won't necessarily know which one has the file open) - another undesirable step to take
Have a look at this answer, but it uses third-party utilities and doesn't explain how they do what they do, though you may be able to Process.Start them.
you just can't get Windows/NTFS to release the file lock.
A possible alternative, which I'm not sure will work with an .xls (as opposed to an .xlsx), is to make the file a shared workbook, though this has some limitations.
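As for the "check if the file is being used" part of the question, a minimal VB.NET sketch: try to open the file exclusively and treat failure as "locked". The UNC path is hypothetical, and note this only detects the lock; as explained above, it cannot break one:
Imports System
Imports System.IO
Public Module LockCheck
    ' Returns True if another process still holds the file open.
    Public Function IsFileLocked(ByVal path As String) As Boolean
        Try
            ' Ask for exclusive access; this throws if anyone else has the file open.
            Using fs As New FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None)
            End Using
            Return False
        Catch ex As IOException
            Return True
        End Try
    End Function
    Public Sub Main()
        If IsFileLocked("\\server\share\report.xls") Then
            Console.WriteLine("File is in use; try again later.")
        End If
    End Sub
End Module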
I would like to make a complete backup of my whole Joomla 1.5 based site from time to time. How would this ideally be done? Are there any common pitfalls? Note that I only have FTP access to the hosting server. Is there a step-by-step tutorial somewhere? I am using the latest JoomGallery and Kunena 1.0.9 (Legacy mode).
Maybe there is a good way to automate this?
There are two parts of the backup you have to worry about: the database and the files.
The first part is the database. It can be backed up using something like phpMyAdmin. If you don't have this available on your server already, it's not too hard to upload and get it going yourself. From there, you can just Export the entire database to a gzip file.
The second part is the code and uploaded files. The code base shouldn't change too often, so you could probably just make one backup of it. There are a number of ways to do this. The simplest is to just download the entire folder via FTP, though if you're on Linux, I'm sure someone will know a single command line to get all the changed files (rsync?).
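If you do have SSH access to the host (the question mentions FTP only, so this may not apply), something along these lines would mirror only the changed files; the host and paths are placeholders:
rsync -avz user@yourhost:/path/to/joomla/ /local/backup/joomla/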
The database is the main thing you have to worry about though: everything else should be able to be rebuilt just by reinstalling.
I think this: http://www.joomlapack.net/ is what you need. I use it myself and it works like a charm, both for backups and for moving my Joomla installations from developer sites to the live site.
Get an FTP synchronisation tool and keep an up-to-date copy of your site locally. Then you could run the batch script
mysqldump -hhost -uuser -p%1 schema > C:\backup.sql
to create a backup of your mysql tables at various points in time.
edit
You would have to have MySQL Server installed on your local machine, with the path to its bin directory in your PATH, in order to run the mysqldump command without much hassle. -p%1 takes the password from the first command-line argument, as you wouldn't want to store passwords in your batch script.
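For instance, a hedged batch sketch that stamps each dump with the current date (the %date% substring offsets are locale-dependent, so adjust them for your system):
@echo off
rem Build a YYYYMMDD stamp from %date% (assumes a DD/MM/YYYY locale)
set stamp=%date:~6,4%%date:~3,2%%date:~0,2%
mysqldump -hhost -uuser -p%1 schema > C:\backup-%stamp%.sql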
If you only have FTP access you are in a bit of a problem, as besides all the files you'll also have to back up the database. Without access to the database, a full backup won't do you any good.
Whatever backup strategy you choose, be sure it can handle UTF-8 correctly. Joomla 1.5 stores all content as UTF-8, even when the database charset is set to 'iso-8859-1', so when the backup solution detects the database charset, characters like € or é will come out as mojibake such as â‚¬ or Ã© - not really what you'll want.
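With the mysqldump approach above you can sidestep that detection by forcing the connection charset explicitly; for example:
mysqldump --default-character-set=utf8 -hhost -uuser -p%1 schema > C:\backup.sql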
I absolutely endorse using JoomlaPack - it works great. The optional remote tools allow you to initiate the backup from a Windows desktop machine; it performs the backup and downloads it. The remote tool has a scheduler, and you can also set it off to back up and download a list of sites.
JoomlaPack also provides a file, kickstart.php, which you copy to your empty server account along with the backup to automate the restore procedure. You do have to create an empty database with phpMyAdmin or similar, and you are given the opportunity to supply the database parameters (host, database, username, password) during the process.
One pitfall I did run into with this, though, is that some common components can have absolute URLs in their configuration, e.g. SOBI2 and VirtueMart. It's then just a matter of finding the appropriate configuration file, editing it, and re-uploading it.
Another problem was that one archive file (either ZIP or their JPA format) got a filename with a "?" character in it (from a Linux server), and this caused a bit of a problem when installing locally on a Windows WAMP stack: the extract process on the ZIP file failed, and that stopped the process from completing cleanly.
I suggest using the automatic backup service at http://www.everlive.net
Update:
OK, here is some more information. EverLive.net is a website where you can create a free account. Enter your website details and you are ready to take your backups with just one click. Restore is also possible in the same way.
Further, you can use the automatic backup option to take backups at defined intervals. Other than that, you can use the website health check service to inform you if your website is not available.