So every day I download multiple .zip files from an SFTP server. Every day our client also uploads new .zip files to this SFTP server, but is not willing to delete the old files.
So I end up downloading the same files from the last few days plus the files that got uploaded today.
I tried a lot but didn't have any success.
This is my short script right now (which downloads way too many files and eats up my storage space):
open sftp://user:password@sftp-server.com/ -hostkey=*
synchronize local D:\Test\Download /sftp-server/PDF-files/
I couldn't find an option to download files by date, so maybe you can help me further.
Also important, the .zip files are named:
"name_clientname_YYYYMMDD_NumberOfUploads.zip"
I tried to add
*%TIMESTAMP#yyyymmdd%*.zip
at the end of the path of the files, but that didn't work out.
Don't use synchronize if you are deleting the old files from your local copy. Select files based on timestamp instead.
From the WinSCP site: How do I transfer new/modified files only?
The appropriate get syntax (close to what you tried) seems to be something like:
open sftp://user:password@sftp-server.com/ -hostkey=*
get -filemask="*.zip>today" /remote-folder/* D:\local-folder\
where the filemask constraint is as specified in: https://winscp.net/eng/docs/file_mask#size_time
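Since the date is also embedded in the file names, a mask on the name itself using WinSCP's %TIMESTAMP% syntax may work as well. A sketch, assuming the naming pattern from the question (credentials and paths are placeholders):
open sftp://user:password@sftp-server.com/ -hostkey=*
get -filemask="*_%TIMESTAMP#yyyymmdd%_*.zip" /sftp-server/PDF-files/* D:\local-folder\
exit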
I can store files in the specified storage location via the GUI, and I can see the files are in the storage location.
When I try to download them using the GUI, I get this every time:
{"error":{"code":3,"message":"Unauthorized request","class":"Directus\\Exception\\UnauthorizedException","file":"\/var\/www\/directus\/src\/helpers\/app.php","line":287}}
When I try the links from the File library, I get the same error.
I found some old topics concerning a "_" project. I do not see any "_" entries in my project.php configuration.
Everyone has read permissions for the storage directory.
The rest of the system appears to run without error.
Check the folder on the server and what has been set; the default should be the standard uploads layout.
Then, if you want to access the 300x300 version, the URL should look like:
domain.com/public/uploads/Directus/generated/w300,h300,fcrop,q80/file-name.jpg
My goal is to save a BigQuery table locally to be able to perform some analyses. To save it locally, I tried to export it to Google Cloud Storage as a CSV file. Alas, the dataset is too big to move as one file, so it is split into many different files, looking like this:
exampledata.csv000000000000
exampledata.csv000000000001
...
Is there a way to put them back together again in Google Cloud Storage? Maybe even change the format to CSV?
My approach was to download them and try to change them manually. Clicking on one does not work, as it saves it as a BIN file, and it is also very time consuming. Furthermore, I do not know how to assemble them back together.
I also tried to get them via the gsutil command, and I was able to save them on my machine, but as zipped files. When unzipping with WinRAR, it gives me exampledata.out files, which I do not know what to do with. Additionally, I am clueless as to how to put them back together into one file.
How can I get the table to my computer, as one file, and as a CSV?
The computer I am working with runs on Ubuntu, but I need to have the data on a Google Virtual Machine, using Windows Server 2012.
Try using the following to merge all the files into one from the Windows command prompt:
copy *.cs* merged.csv
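Alternatively, if you would rather merge the parts while they are still in Google Cloud Storage, gsutil has a compose command that concatenates objects (it is limited to 32 source objects per call, so a very large export may need several rounds). A sketch with a placeholder bucket name; note that if each part carries its own header row, the headers will repeat in the result:
gsutil compose gs://your-bucket/exampledata.csv0000* gs://your-bucket/exampledata.csv
gsutil cp gs://your-bucket/exampledata.csv .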
I suggest saving the file as a GZIP file; then you can download it from Google Cloud easily as a BIN file. If you export the split files from BigQuery as follows:
Export Table -> CSV format, compression as GZIP, URI: file_name*
Then you can combine them back with the following steps:
In Windows:
Add .zip at the end of all these files' names.
Use 7-Zip to unzip the first .zip file, the one named "...000000000000", and it will automatically detect all the rest of the .zip files. This is just like the normal way of unzipping a split .zip file.
In Ubuntu:
I failed to unzip the files following the methods I could find on the internet. I will update the answer if I figure it out.
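If the downloaded parts are ordinary gzip files (which a GZIP-compressed BigQuery export should produce), something like this may do it on Ubuntu; again, each part can carry its own CSV header row, which would then be repeated in the output:
zcat exampledata.csv0000* > exampledata.csv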
I'm trying to create an updater for my app in VB.NET. No, I do not want to use ClickOnce; it sucks because I have to deal with managing self-signed certs, etc.
I know the code to check for new update files:
http://pastebin.com/ZjYBWABu
I also know the code for specifying where those files download to. The issue is I don't want to just download one .exe; I want to download all the latest build files, which I would have uploaded to my server, taken from the Bin\Release folder of my project.
Then, when the updater downloads the files to a directory, it would go to the directory of the application and somehow overwrite/replace all the files that have changed... maybe by using a hash or something?
I do not know how to proceed with this. What I do know is this.
The updater and the main app would have to be separate, so that the updater could do the replacing while the app is closed and doesn't get file-in-use errors. After the updater has finished, it would then start up the main app from the new .exe.
Would appreciate help here thank you guys.
I am currently working on a project for which I have to implement a similar approach for updates. The project is lengthy; it will take some time to finish. But this is how I have planned to apply the updates:
There will be two main parts of the application: the Launcher (main application program) and the Updater (to download files from the server, replace the old ones with the new ones, and then launch the new file).
The application will have the option to check for updates manually and also to check for updates on startup.
If an update is available, it asks the user to apply the update now or later.
If the user selects to apply the update now, then the Updater application is executed in a separate process and the Launcher application is closed from within the code in the Launcher. I have the following approaches in mind to launch another program from within the first one and then exit (see the sketch after this list):
Execute the Updater directly from within the Launcher using Process.Start.
If that causes problems, then as a second approach, launch the command prompt from Process.Start, execute the other program (the Updater) from the command prompt, close the command prompt, and then exit the Launcher.
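A minimal sketch of the first approach, assuming a WinForms Launcher and an Updater.exe sitting next to it (both names are placeholders):
Imports System.Diagnostics
Imports System.IO
Imports System.Windows.Forms

Module LauncherSketch
    Sub StartUpdaterAndExit()
        ' "Updater.exe" is an assumed name; it is expected next to the Launcher.
        Dim updaterPath As String =
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Updater.exe")
        Process.Start(updaterPath)  ' run the Updater in its own process
        Application.Exit()          ' close the Launcher so its files can be replaced
    End Sub
End Module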
The Updater application then downloads all the relevant files from the server, and upon completion the old application files are replaced with the new ones.
Update availability information from the server will include the new Version_No of the application. To provide all files for the update, I will compress (zip) all of them into a single file named Application.Version_No (as given by the server).
Upon download completion, decompress (unzip) them into a folder with the same name, Application.Version_No.
After decompressing, all the files in this (Application.Version_No) folder will be copied to the Bin folder of the application (a sketch of this step follows below).
The new application Launcher file is executed in a separate process, and the Updater application is closed from within the code in the Updater.
I have NOT yet tried this scenario as currently my focus is on completing the main application, but surely this must work.
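To illustrate the decompress-and-copy steps above, here is a minimal sketch assuming .NET 4.5's ZipFile class; all paths and names are placeholders:
Imports System.IO
Imports System.IO.Compression ' needs a reference to System.IO.Compression.FileSystem

Module UpdaterSketch
    ' Hypothetical unzip-and-copy step; zipPath and binFolder are placeholders.
    Sub ApplyUpdate(zipPath As String, binFolder As String)
        Dim stagingFolder As String =
            Path.Combine(Path.GetTempPath(), Path.GetFileNameWithoutExtension(zipPath))
        ZipFile.ExtractToDirectory(zipPath, stagingFolder)
        For Each sourceFile As String In Directory.GetFiles(stagingFolder)
            Dim target As String = Path.Combine(binFolder, Path.GetFileName(sourceFile))
            File.Copy(sourceFile, target, True) ' overwrite the old files
        Next
    End Sub
End Module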
UPDATE:
Another approach to checking for updates is to use a bootstrap-like application as the startup. It will be the main entry point of the program. Upon execution it will check for updates; if there are none, the Launcher is executed, otherwise it will download the files, replace the old ones, and then execute the new / updated Launcher.
For copying / overwriting the files:
One approach is to include only those files in the compressed (zip) file which need to replace old ones, and then, after the download completes, either decompress them directly into the Bin folder, or decompress them into a designated folder and then copy all of them to the Bin folder.
As another approach, which seems somewhat lengthy, an additional helper file (XML, text, or any other format) could be prepared for the download.
This helper file contains information about the updated files, like the version number of each file, the location where each is to be copied, etc.
The files may be downloaded to a specific folder named after the new application version.
After downloading all the required files to that folder, process each file mentioned in the helper file: compare the version of every old file with the newly downloaded one, and if the downloaded file is newer, replace the old one in the folder mentioned in the helper file.
As a step in between, all the downloads may be verified prior to copying and replacing.
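For illustration, such a helper file might look something like this; every element and attribute name here is made up:
<Update Version="1.2.0">
  <!-- one entry per updated file: its version and where it belongs -->
  <File Name="MainApp.exe" Version="1.2.0" CopyTo="Bin" />
  <File Name="Helper.dll" Version="1.1.3" CopyTo="Bin" />
</Update>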
Built an updater that ships with a daemon. Main project here:
https://github.com/UVLabs/dotNetUpdatify
There should be a way to eliminate the use of the daemon; if I figure it out I will update.
I think we have a problem in our FTP scripts that pull files from a remote server to a local machine. I couldn't find an answer in the product's knowledge base or scripting documentation.
We are doing an MGET *.* and then an MDELETE *.* immediately after it. I think what is happening is that, while we are copying files from the server, additional files are copied into the same directory, and then the delete command deletes everything from the server. So we end up deleting files we never copied down.
Is there a straightforward way to delete only the files that were copied, or is it going to be some sort of hack job where we generate a dynamic delete script based on what we actually copied down?
Answers that are product specific would be much appreciated!
Here were the options that I came up with and what I ended up doing.
Rename the extension on the server, copy the renamed files, and then delete the renamed files. This could not work because there is no FTP rename command that works with wildcards (the Windows rename command does, by the way).
Move the files to a subdirectory on the server, copy the files from that location, and then delete them from the remote location. This could not work because there is no FTP command to move the files on the remote server.
Copy the files down in one script and SHELL a batch file on the local side that dynamically builds a script to connect to the server and delete the files that were copied down. This is the solution I ended up using to solve this problem.
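For what it's worth, a sketch of that last approach using the stock Windows ftp.exe client from a batch file; the host, credentials, and paths are placeholders:
rem Build a delete script from the files that actually arrived locally, then run it.
echo open ftp.example.com> delete.ftp
echo user myuser mypassword>> delete.ftp
echo cd /remote/dir>> delete.ftp
for %%F in (C:\local\dir\*.*) do echo delete %%~nxF>> delete.ftp
echo bye>> delete.ftp
ftp -n -s:delete.ftp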
An MSBuild project copies its output to a directory on a server. Each day, only a few files change and most have an older creation date.
I can FTP this to a remote server with MSBuild tasks. But how can I do this FTP and only copy the few files that have changed?
To do this you'll need something that will manage the sync for you - that is, something that will keep track of which file is where and update accordingly.
We have used FTPSync to do the file-sync bit very tidily for a number of sites.
From MSBuild you can call an external program, so putting the two together will probably work, provided you are consistently syncing from the same location (otherwise it's going to be more interesting!).
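Putting the two together might look something like this in the project file; the ftpsync.exe command line below is a made-up placeholder, so check the tool's actual switches:
<Target Name="DeployViaSync" AfterTargets="Build">
  <!-- hypothetical command line; $(OutputPath) is the project's build output -->
  <Exec Command="ftpsync.exe $(OutputPath) ftp://user:password@server/site/" />
</Target>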