Dropbox again banned my public folder because we exceeded the daily bandwidth limit. That is very stressful for us, so I'm looking for other options to share our media files with our users.
Our site is hosted on a DigitalOcean droplet: 2 GB memory / 40 GB disk / SFO1 - Ubuntu 14.04 with LEMP.
Our media files are in a folder in our Dropbox Pro account.
Is there some way to cut or copy the files from our Dropbox account and paste them into our DigitalOcean account?
Thanks in advance!!
Not sure where exactly you want to migrate, but these links should be useful for you (or someone else in the future):
Dropbox client for Linux - this is a tutorial on how to use Dropbox client with DigitalOcean and sync files between server and Dropbox.
Mount DigitalOcean Spaces instance - this tutorial will allow you to mount your DO Spaces storage to your DO Droplet using s3fs.
Configure backups to DigitalOcean Spaces - this one describes how to configure s3cmd to exchange your files between server and Spaces.
You could use the info above to, e.g., download your entire Dropbox data using the Dropbox client, then create a Spaces instance. Next you would mount Spaces with s3fs and just move the data "inside" your Droplet onto the newly mounted filesystem. Or use another server for the download and upload to Spaces with s3cmd (if network speed and disk space are constraints on the primary server). Of course, it might be enough to just download the data with the Dropbox client and keep it on your server without external Spaces, if your HDD/SSD is big enough.
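As a rough sketch, the download-then-mount approach could look like the following. The Space name `my-space`, the region endpoint, the mount point, and the key values are all placeholders, not taken from the tutorials above:

```shell
# Install s3fs and store the Spaces access key pair
# (ACCESS_KEY:SECRET_KEY are placeholders).
sudo apt-get install -y s3fs
echo "ACCESS_KEY:SECRET_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the Space; DO Spaces speaks the S3 protocol, so s3fs only
# needs the Spaces endpoint for your region.
mkdir -p /mnt/media
s3fs my-space /mnt/media \
    -o passwd_file=$HOME/.passwd-s3fs \
    -o url=https://sfo2.digitaloceanspaces.com \
    -o use_path_request_style

# Move the data the Dropbox client synced into the mounted Space.
mv ~/Dropbox/media/* /mnt/media/
```

Note that s3fs translates each filesystem operation into API calls, so the final `mv` of many files can take a while.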
This link might be useful as well when writing your own scripts; the scripts there help to automate backups to Dropbox using "on the fly" DO instances. I haven't tried this myself, though.
Related
We're currently moving our setup from a few VPSes to DigitalOcean. Our setup includes a staging site, which has a replica of our live DB; however, all e-mails are caught using MailCatcher, and it has its own storage location for assets, as we don't want to delete production assets when we delete a user on the staging system.
Therefore the question is: is there a way to replicate all the assets from one DO Space to another? Basically the idea is that if someone uploads an image on the prod site, it will be uploaded to the prod media bucket and should be replicated to the stage media bucket. Same when someone deletes a file in the prod media bucket (though I could live without this).
The only option I've found to achieve this so far is to use Rclone on a VPS with a cron job, as described by DO here; however, I hope there is a more "elegant" solution.
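For reference, the rclone-plus-cron approach is only a couple of lines. The remote names `prod`/`stage` and the bucket names below are placeholders you would define with `rclone config`:

```shell
# One-way replication: copy new/changed assets from the prod Space
# to the stage Space. "copy" never deletes on the destination;
# use "sync" instead if deletions should propagate too.
rclone copy prod:prod-media-bucket stage:stage-media-bucket --verbose

# Example cron entry to run the replication hourly:
# 0 * * * * /usr/bin/rclone copy prod:prod-media-bucket stage:stage-media-bucket
```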
I don't think there is a way to do it natively like S3's cross-region replication, so rclone or something custom is your best bet. You can always open-source it to help the community.
I'd like to use restic to do cloud backups.
I have a lot of GBs on pCloud, and I want to back up there. I work on a macOS laptop (with a GUI, that is). I also have the pCloud client installed on this laptop.
Question: restic's docs describe connecting to pCloud using rclone. However, I already have the pCloud client installed, and the pCloud Drive folder is mounted.
Does rclone provide any advantages over direct use of pCloud Drive folder for backups?
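For concreteness, the two options differ only in the repository path. Repository locations below are placeholders (the pCloud Drive mount point in particular varies):

```shell
# Option A: restic's rclone backend, assuming an rclone remote
# named "pcloud" has been configured (rclone config).
restic -r rclone:pcloud:restic-repo init
restic -r rclone:pcloud:restic-repo backup ~/Documents

# Option B: treat the mounted pCloud Drive folder as a plain
# local repository.
restic -r "/Volumes/pCloudDrive/restic-repo" init
restic -r "/Volumes/pCloudDrive/restic-repo" backup ~/Documents
```

One consideration: the rclone backend talks to pCloud's API directly, so backups don't depend on the Drive mount being connected and responsive, whereas option B routes every repository read and write through the local mount.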
Thanks!
Given the popularity of hosting static sites from AWS S3 buckets it would be great to be able to do that from Cloud9 too.
Is there any way I can set up an FTP-based workspace that uses an S3 bucket as the source?
Transmit and other FTP apps have the ability to work directly with an S3 bucket. I did try setting up an FTP workspace in Cloud9 using the following:
Host: s3.amazonaws.com
Username: My-Access-Key
Password: My-Secret-Key
I know it was a long shot, and I have since read confirmation that Amazon doesn't allow simple FTP access to buckets like that.
Any ideas if this is possible?
FTP workspaces on Cloud9 are actually being phased out, so I'd recommend using the mounting feature described in this blog post to mount an FTP source: https://c9.io/site/blog/2014/12/ftp-sftp-mounting-beta
Unfortunately, S3 doesn't support the FTP protocol, so this would have to be a new feature. Luckily we're opening up our SDK to be able to implement features like this. If you're interested in contributing please email us via https://support.c9.io
Codeanywhere (https://codeanywhere.com) does this now. However, you'll have to shell out $7 to $10/month for that capability.
But then again, like Cloud9 (which I'm a big fan of), you get a bunch of features on the Codeanywhere IDE.
I was disappointed when Cloud9 discontinued its efforts on S/FTP. Codeanywhere seems to be taking on the cloud-storage issue head on by handling access to S3, FTP, SFTP, Google Drive and others.
I am about to install Mavericks, and before I do that I am going to reformat my MacBook Air. I use Dropbox and have about 15 GB of (small) files on it (mainly documents/ebooks).
My question is: is it possible to back up my Dropbox folder now, reformat my SSD, and install Dropbox again? After that I would replace the Dropbox folder with my backup, without getting Dropbox confused (it might think they are new files, so Dropbox could re-upload them and/or download the same files again).
Does anyone have any experience with this?
It's fine to do this; I have done it myself, but not on OS X.
The Dropbox client will index the files it finds on your computer and compare them to the ones already in your account (on the server). I believe it uses some kind of hash function to do this: the client creates a small hash value for each file, and that value is compared to the value on the server. If the values match, the client assumes the file is the same and does not re-upload it. However, if you have thousands of files, this can take some time.
Source: https://www.dropbox.com/help/1941/en - "The application will index the files and see that they are the same files in your account."
If you want to do it: when you install Dropbox again, sign in to your account, let it create the Dropbox folder, and then click "Pause Syncing" so that it doesn't start downloading everything. Then copy the backed-up Dropbox files into the new Dropbox folder and resume syncing.
I'm developing a website on EC2 and I have a LAMPP server in the default /opt/lampp folder. The thing is, I store all the images related to the website, including users' profile images, there (/opt/lampp/htdocs). I doubt this is the most efficient way. I have links to the images in my MySQL database.
I actually have no idea what Amazon EBS and Amazon S3 are; how can I utilize them?
EBS is like an external USB hard drive. You can easily access its content from the filesystem (e.g. mounted under /mnt/).
S3 is more like API-based cloud storage. You'll have much more work to integrate it into your system.
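For your image use case, a minimal S3 sketch could look like this. The bucket name `my-site-media` is a placeholder, and this assumes the AWS CLI is installed and configured with credentials (`aws configure`):

```shell
# Upload the existing images from the web root to a bucket,
# marking them publicly readable so browsers can fetch them.
aws s3 sync /opt/lampp/htdocs/images s3://my-site-media/images --acl public-read

# Each image is then reachable at a URL like:
#   https://my-site-media.s3.amazonaws.com/images/profile/42.jpg
# so your MySQL table would store that URL instead of a local path.
```

The payoff is that images no longer consume droplet/instance disk and survive rebuilding the server; the cost is updating the upload code to push to S3 instead of writing to htdocs.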
You have a pretty good summary here:
http://www.differencebetween.net/technology/internet/difference-between-amazon-s3-and-amazon-ebs/
Google has a lot of info about this.