I am using SPSS Statistics 24 and am looking for a way to upload a file to an FTP server. I have been searching online but can't find any examples.
Many thanks,
Art
Art,
There is nothing in SPSS Statistics to upload/download files to/from an FTP server. Are you sure you aren't thinking of saving the file (*.sav) locally and then using FTP client software of some kind to upload it? If you have previously mounted the FTP resource to your filesystem (e.g. mapped a drive, or use Google Drive, iCloud Drive, Dropbox, OneDrive, etc.), then the normal GET and SAVE operations should apply. But no, SPSS Statistics itself is not an FTP client.
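If you do end up scripting the upload step outside of SPSS, a few lines of Python with the standard ftplib module can act as that FTP client. This is just a minimal sketch; the host, credentials, and paths are placeholders you would substitute with your own:

# Minimal sketch: upload a locally saved .sav file to an FTP server.
# Host, credentials, and paths below are placeholders.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:
    ftp.login(user="username", passwd="password")
    with open(r"C:\data\mydata.sav", "rb") as f:
        ftp.storbinary("STOR mydata.sav", f)  # binary mode for .sav files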
I hope this helps,
-ddwyer
My work uses WinSCP for SFTP transfers. We have some data coming in this way each week, and I would like to get it into an S3 bucket. We want to automate the transfer with a cron job or something similar.
I know there are AWS tools, but they cost money and money can't be spent. We also do not have an ETL tool like Alteryx, otherwise I would use it. Nothing on the internet gives much detail about transferring files from an SFTP server to another server; mostly I read about transferring from a server to a local machine.
Below is the code I have found.
Can these WinSCP commands be used to transfer to an S3 bucket somehow at the 'put' step? (I cannot use the code generator other posts have mentioned because I do not have access to our AWS account or any buckets yet.) This is all about proving a concept.
# Connect to SFTP server using a password
open sftp://user:password@example.com/ -hostkey="ssh-rsa 2048 xxxxxxxxxxx...="
# Upload file (THIS IS WHERE I WOULD WANT S3 PATH SYNTAX)
put d:\examplefile.txt /home/user/
# Exit WinSCP
exit
Once I have this command, we can then create a Windows scheduled task, from what I have read. This would automate getting the file where we need it, and we can then do more with it than the SFTP server allows.
If I understand the question correctly, you are asking how to transfer files directly from an SFTP server to S3 using a script running on yet another machine.
It's not possible (unless AWS has a feature for that, but then it won't be free). You have to download the files from the SFTP server and then upload them to S3.
With WinSCP scripting, you can do it with a script like:
open sftp://username:password@sftp.example.com/
get /sftp/path/*
exit
open s3://accesskey:secretkey@s3.amazonaws.com/
put * /bucket/
exit
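If you would rather run the same download-then-upload round trip from a single Python script (for example, under cron), it looks roughly like this with paramiko and boto3. This is a sketch, not a drop-in solution: the host, credentials, staging folder, and bucket name are placeholders, and boto3 needs AWS credentials configured on the machine:

# Sketch: download files from an SFTP server, then upload them to S3.
# Host, credentials, staging folder, and bucket name are placeholders.
import os
import boto3
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin the real host key in production
ssh.connect("sftp.example.com", username="user", password="password")
sftp = ssh.open_sftp()

s3 = boto3.client("s3")  # assumes AWS credentials are already configured for boto3
os.makedirs("staging", exist_ok=True)

for name in sftp.listdir("/sftp/path"):
    local = os.path.join("staging", name)
    sftp.get("/sftp/path/" + name, local)     # download from the SFTP server
    s3.upload_file(local, "my-bucket", name)  # upload to the S3 bucket

sftp.close()
ssh.close()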
You can mount your bucket on your server with s3fs (a FUSE filesystem), like a normal hard disk.
I'm trying to set up a connection between a Google Drive folder and an S3 bucket, but I'm not sure where to start.
I've already created a sort of "Frankenstein process", but it's easy to use only for me, and sharing it with my co-workers is a pain.
I have a script that generates a plain text file and saves it into a Drive folder. To upload, I've installed Drive File Stream to sync it to my Mac; then all I did was create a Python 3 script, with the boto3 library, to upload the text file into different S3 buckets depending on the file name.
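For reference, the bucket-routing part of a script like that can stay very small. Here is a hypothetical boto3 sketch; the prefix-to-bucket mapping and the Drive File Stream path are made up for illustration:

# Hypothetical sketch: upload a text file to an S3 bucket chosen by file name prefix.
# The prefix-to-bucket mapping and the Drive path are made up for illustration.
import os
import boto3

BUCKETS = {"sales_": "sales-bucket", "ops_": "ops-bucket"}  # prefix -> bucket name

def upload(path):
    name = os.path.basename(path)
    for prefix, bucket in BUCKETS.items():
        if name.startswith(prefix):
            boto3.client("s3").upload_file(path, bucket, name)
            return bucket
    raise ValueError("no bucket rule matches " + name)

upload("/Volumes/GoogleDrive/My Drive/exports/sales_report.txt")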
I was thinking that I could create a Lambda to process the file into the S3 buckets, but I cannot work out how to create the connection between Drive and S3. I would appreciate it if someone could give me some advice on how to start with this.
Thanks
If you simply want to connect Google Drive and AWS S3, there is a service named Zapier which provides different types of integrations without writing a line of code. For more details, you can check this link out:
https://zapier.com/apps/amazon-s3/integrations/google-drive
Dropbox has again banned our public folder because we exceeded the daily limit. That is very stressful for us, so I'm looking for other options to share our media files with our users.
Our site is hosted in a Digital Ocean droplet: 2 GB Memory / 40 GB Disk / SFO1 - Ubuntu LEMP on 14.04
Our media files are in a folder in our Dropbox Pro account.
Is there some way to cut/copy the files from our Dropbox account and paste them into our DigitalOcean account?
Thanks in advance!!
Not sure where exactly you want to migrate, but these links should be useful for you (or someone else in the future):
Dropbox client for Linux - this is a tutorial on how to use the Dropbox client with DigitalOcean and sync files between the server and Dropbox.
Mount a DigitalOcean Spaces instance - this tutorial shows how to mount your DO Spaces storage on your DO Droplet using s3fs.
Configure backups to DigitalOcean Spaces - this one describes how to configure s3cmd to exchange your files between the server and Spaces.
You could use the info above to, e.g., download your entire Dropbox data using the Dropbox client, then create a Spaces instance. Next, you would mount Spaces with s3fs and just move the data "inside" your Droplet onto the newly mounted filesystem. Or use another server for the download and upload to Spaces with s3cmd (if network speed and disk space are your constraints on the primary server). Of course, it might be enough to just download the data with the Dropbox client and keep it on your server without external Spaces, if your HDD/SSD is big enough.
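Since Spaces exposes an S3-compatible API, a small script is another option besides s3fs or s3cmd. A sketch with Python and boto3, where the endpoint region, keys, Space name, and local folder are all placeholders:

# Sketch: upload files to a DigitalOcean Space via its S3-compatible API.
# Endpoint region, keys, Space name, and local folder are placeholders.
import os
import boto3

spaces = boto3.client(
    "s3",
    endpoint_url="https://sfo2.digitaloceanspaces.com",  # your Space's region endpoint
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

folder = "/home/user/dropbox-export"
for name in os.listdir(folder):
    path = os.path.join(folder, name)
    if os.path.isfile(path):
        spaces.upload_file(path, "my-space", name)  # same call as for plain S3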
These links might be useful as well when writing your own scripts; they help to automate backups to Dropbox using "on the fly" DO instances. I haven't tried this, though.
Let's assume I have a Dropbox Pro account which gives 1 TB of storage, and that storage is fully occupied with data. If my local machine's storage is less than 1 TB, can anyone please explain the behavior of "my local Dropbox folder"?
I know that all the data cannot be downloaded to my desktop (local Dropbox folder) due to lack of storage. I have the following questions:
1. What will happen if I access a file which is not in the local Dropbox folder? Will it be downloaded?
2. Which files are stored locally, out of all the files in cloud storage?
3. Does Dropbox take file access patterns into account?
Thank you in advance
You can't access a file which is not in the local Dropbox folder.
As for which files are stored locally: Dropbox has an option called "selective sync", where the user determines which folders should be kept locally out of all the folders in cloud storage. No, it does not consider file access patterns; if your local storage is not sufficient, it gives a message stating that.
I am looking to have users upload files to an FTP server using a "pre-configured" FTP client. By that I mean: the FTP client's connection settings have already been set in the client they have downloaded. Ideally, users should be able to just drag and drop the files into a simple window and have the file upload.
I have found two applications which allow me to do this:
"FTP Maker" (softhing.com/ftpmaker.html)
This allows you to configure the FTP connection details and add a logo. You then hit a button which generates the "uploader application". This can then be distributed to users, who have to configure nothing. While this works, it doesn't have as many features as...
"FTP Uploader Creator" (devzerog.com)
Same as above, except the application can zip files before uploading, can resume uploads, and can also send an email after the upload has completed. These are handy features I wish FTP Maker had. The issue with this application is that its developer seems to have gone out of business and only the thirty-day trial is available...
Another application is "FTPcreator" (ftpcreator.com). Unfortunately this is a little outside my price range.
I am also aware of options such as Dropbox, FTPbox, etc.
Do you know of any super-lite FTP clients which I could pre-configure before sending out to users? Ideally it should have the features of "FTP Uploader Creator" above. I believe this sort of tool might be used in IT to allow users to send files directly to an FTP server.
I know I could do this through a webpage. However, the files will be particularly large, and well over my hosting's Apache upload file size limit.
I have spent literally hours looking for alternatives! Any help would be greatly appreciated!