Where to begin with managing web servers and business document file management / backups?

I've inherited a couple of web servers - one Linux, one Windows - with a few sites on them, nothing too essential. I'd like to test out setting up back-ups for the servers to both a local machine and a cloud server, then also use the cloud server to access business documents, with the local machine serving as a back-up for those documents.
I'd like to be able to access all data wherever I am via an internet connection. I imagine it running as follows:
My PC <--> Cloud server - access by desktop VPN or Web UI
My PC <--> Web Servers - via RDP, FTP, Web UI (control panels) or SSH
My PC <--> Local Back-up - via RDP, FTP, SSH or if I'm in the office, Local Network
Web servers --> Local Back-up - nightly via FTP or SSH
Cloud Server --> Local Back-up - nightly via FTP or SSH
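Concretely, I imagine each nightly pull being a cron job on the back-up machine, something along these lines (the hostname and paths are made up):

```sh
# On the local back-up machine, 02:30 nightly: pull the web root
# from the Linux web server over SSH; only changed files transfer.
30 2 * * * rsync -az --delete -e ssh backup@webserver1.example.com:/var/www/ /srv/backups/webserver1/www/
```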
Does that make sense? If so, what would everyone recommend for a cloud server and also how best to set up the back-up server?
I have a couple of spare PCs that could serve as local back-up machines - would that work? I'm thinking they'd have to be online 24/7.
Any help or advice given or pointed to would be really appreciated. Trying to understand this stuff to improve my skill set.
Thanks for reading!

Personally, I think you should explore using AWS's S3. The better (S)FTP clients can all handle S3 (Cyberduck, Transmit, etc.), the API is friendly if you want to write a script, there is a great CLI suite that you could use in a cron job, and there are quite a few custom solutions to assist with the workflow you describe; s3tools is one of the better-known ones. The web UI is fairly decent as well.
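As a sketch of the cron-job approach - the bucket name and paths here are hypothetical, and it assumes the AWS CLI is installed and configured with credentials:

```sh
# Nightly at 03:00: mirror the local backup directory into S3.
# `aws s3 sync` only uploads new or changed files; --delete also
# removes remote copies of files deleted locally.
0 3 * * * /usr/local/bin/aws s3 sync /srv/backups/ s3://my-backup-bucket/backups/ --delete
```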
Automating the entire lifecycle like you described would be a fairly simple process. Here's one process for Windows, another general tutorial, another for Windows, and a quick review of some other S3 tools.
I personally use a similar workflow with S3/Glacier that's fully automated, versions backups, and migrates them to Glacier after a certain timeframe for long-term archival.
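The Glacier migration part is just a lifecycle rule on the bucket. Roughly like this - the bucket name, prefix, and 30-day window are illustrative:

```sh
# Transition everything under backups/ to Glacier 30 days after
# creation. The bucket name is a placeholder.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-backup-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-to-glacier",
      "Status": "Enabled",
      "Filter": {"Prefix": "backups/"},
      "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}]
    }]
  }'
```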

Related

How to set up Netdata to monitor my website performance?

So I have installed netdata on my machine using this tutorial https://www.how2shout.com/how-to/how-to-install-netdata-on-windows-10-wsl.html
I opened it in my browser at the provided address, 127.0.0.1:19999, but it only monitors and reports the performance of my local machine (the laptop I'm using).
I own a website, so I tried entering my website's IP with :19999 appended, but of course that did not work.
I'd like to set it up so I can measure live performance from my website.
Any idea how I can do that?
Your website runs on a server that your hosting provider owns. To use Netdata, it would have to already be installed on your provider's hosting infrastructure, or you would need sufficient (effectively administrator) access to the hosting server (or servers) to install it yourself, which many hosting services are unlikely to grant. If your provider manages your website hosting, you likely don't need Netdata to monitor your website performance; monitoring is considered part of what you pay for.
On the other hand, if you are managing your own cloud infrastructure, it should be easy (and a good idea) to install Netdata to monitor any website servers that run on top of it.
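In that case the install is essentially a one-liner. A sketch for a typical Linux host using Netdata's kickstart script (check the Netdata docs for the currently recommended command):

```sh
# Install Netdata via the official kickstart script, then browse to
# http://<server-ip>:19999/. Don't expose that port publicly -
# firewall it to your own IP or tunnel it over SSH.
bash <(curl -Ss https://my-netdata.io/kickstart.sh)
```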

How to rapidly publish web role cloud service, uploading only binaries, avoiding wholly restarting the VM?

Possible ways to accomplish it:
Creating dedicated WCF service for this purpose (currently my favorite option)
Using the REST API?
Azure PowerShell?
Explanation:
Publishing a web-role cloud-service takes about 10 minutes. That's much too long during development - I try to do as much as I can offline, unit-test-ish and modular, but it's impossible to avoid development cycles on the VM entirely.
Apparently, the long time is mostly a result of the machine being wholly restarted, so I'm trying to find an automatic solution, like uploading and installing the binaries.
What is the best way to accomplish it?
What do you think? Would it cut at least 50% of the publishing time?
Do you expect any critical problems?
The solutions proposed below are definitely against best practices and should NEVER-EVER be used in a production environment.
If your objective is to quickly test your changes in your development environment, there are two ways you can go about it.
Enable RDP and copy files manually: enable Remote Desktop on your web role, then copy your modified binaries or other files directly into the appropriate folders on the VM.
Use Web Deploy: this only works for web roles in your project, but you can enable Web Deploy on your web roles and use it for faster deployments. Please see this link for more details on how to use this feature: https://msdn.microsoft.com/en-us/library/azure/ff683672.aspx.
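Once Web Deploy is enabled, the push itself is a single msdeploy.exe call from the dev box. A rough sketch, run from a Windows command prompt - the service address, IIS site name, and credentials are all placeholders (check IIS Manager on the role instance for the actual site name):

```
msdeploy.exe -verb:sync ^
  -source:contentPath="C:\build\MyWebRole" ^
  -dest:contentPath="MyWebRole_IN_0_Web",computerName="https://myservice.cloudapp.net:8172/msdeploy.axd",userName="deployuser",password="deploypass",authType="Basic" ^
  -allowUntrusted
```

Only changed files are transferred and the VM is not restarted, which is where the time savings come from.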

Amazon S3 WebDAV access

I would like to access my Amazon S3 buckets without third-party software, simply through the WebDAV functionality available in most operating systems. Is there a way to do that? It is important to me that no third-party software is required.
There are a number of ways to do this. I'm not sure about your situation, so here they are:
Option 1: Easiest: You can use a 3rd party "cloud gateway" provider, like http://storagemadeeasy.com/CloudDav/
Option 2: Set up your own "cloud gateway" server
Set up a dedicated server or virtual server to act as a gateway. Using Amazon's own EC2 would be a good choice.
Set up software that mounts S3 as a drive. Two I know of on Windows: (1) CloudBerry Drive http://www.cloudberrylab.com/ and (2) WebDrive (http://webdrive.com). For Linux, I have never done it, but you can try https://github.com/s3fs-fuse/s3fs-fuse (see the mount sketch after the notes below).
Set up a WebDAV server like CrushFTP. (It comes to mind because it's stable, cheap, and works on any OS.) Another option is IIS, but I personally find it harder to set up securely for WebDAV.
Set up a user in your WebDAV server (i.e. CrushFTP or IIS) with access to the mapped S3 drive.
Possible snag: Assuming you're using Windows, to start your services automatically and have this work, you may need to set up both services to use the same Windows user account (Services->(Your Service)->[right-click]Properties->Log On tab). This is because the S3 mapping software might not map the S3 drive for all Windows users. Alternatively, you can use FireDaemon if you get stuck on this step to start the programs as a service all under the same username.
Other notes: I have experience using WebDrive under pretty heavy loads, and it seems to work well. Under tons of pounding (I'm talking thousands of files per hour being added to a 5 TB WebDrive) it started to crash Windows. But I'm not sure if you are going that far with it. Also, if you're using EC2, you may not have that issue since it was likely caused by a huge transfer queue in memory and EC2 will have faster transit to S3 and keep the queue smaller.
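For the Linux variant mentioned in the drive-mounting step, a minimal s3fs-fuse mount looks roughly like this (bucket name and mount point are placeholders):

```sh
# Store credentials as ACCESS_KEY:SECRET_KEY, readable only by you.
echo 'AKIAEXAMPLE:secretkey' > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the bucket; files then appear under /mnt/s3 and can be
# exposed by whatever WebDAV server you run on top of it.
mkdir -p /mnt/s3
s3fs my-backup-bucket /mnt/s3 -o passwd_file=~/.passwd-s3fs
```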
I finally gave up on this idea and today I use Rclone (https://rclone.org) to synchronize my files between AWS S3 and different computers. Rclone has the ability to mount remote storage on a local computer, but I don't use this feature. I simply use the copy and sync commands.
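For the record, the two commands I mean are along these lines - the remote name "s3" and the paths assume you've already run rclone config:

```sh
# One-way upload: copy new/changed local files into the bucket
# without deleting anything on the remote.
rclone copy /srv/documents s3:my-backup-bucket/documents

# True mirror: make the local directory match the bucket exactly
# (this deletes local files that no longer exist remotely).
rclone sync s3:my-backup-bucket/documents /srv/documents
```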
S3 does not support WebDAV, so you're out of luck!
Also, S3 does not support hierarchical namespaces, so you can't directly map a filesystem onto it.
There is an example Java project for putting a WebDAV server over Amazon S3 here: https://github.com/miltonio/milton-aws

Web UI to manage machines on the network

I'm looking for a platform with Web UI access that allows me to do the following:
Maintain a list of computers and add / remove based on their IP address.
Provide the SSH information for each machine.
Monitor whether the machines are up (ping?).
Restart the machines from the web UI, using the SSH information on the backend of the application.
I'm close to starting to build such an app myself, since I can't seem to find anything like it on the internet. Any clues as to whether such an application exists?
You might want to take a look at MeshCentral: https://meshcentral.com/ - you can add systems that you are managing and do some remote operations.
http://info.meshcentral.com/: MeshCentral is open source and is both a peer-to-peer technology with a wide array of uses and a web service targeted at remote monitoring and management of computers and devices. Users can manage all their devices from a single web site, no matter where the computers are located or whether they are behind routers or proxies.
If you are looking for source code, you could take a look at the "Open Manageability Developer's Toolkit" at http://opentools.homeip.net/open-manageability. This tool was built for managing systems with Intel Active Management Technology, but it does a lot of what you are looking for. You can download the source and see whether you can use any of it if you decide to write your own UI.
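If you do end up rolling your own, note that the operations you listed are thin wrappers around ping and SSH, so the backend is small. A rough sketch of what the web app would shell out to (the host list and user are hypothetical):

```sh
# For each managed machine: check reachability, then reboot it over
# SSH. A web UI would essentially just drive commands like these.
while read -r host; do
  if ping -c 1 -W 2 "$host" > /dev/null; then
    echo "$host is up; rebooting"
    ssh "admin@$host" 'sudo reboot'
  else
    echo "$host is unreachable"
  fi
done < machines.txt
```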

Pushing code to Windows servers in a scriptable way

I'm looking for a good way to push code quickly and securely to my company's Windows web servers for release deployments.
I have a *nix background and in the past have always used rsync in conjunction with ssh for such tasks because it is quick, secure, and scriptable.
Right now our deployment process is very manual: it requires logging into each server over Remote Desktop and using TortoiseHg to pull code from our main repo onto the server (obviously this requires the web server to have credentials for the central Hg repo). Needless to say, this process is very human, and accordingly error-prone, not to mention tedious and slow. We also have several servers that we use internally for dev staging, the QA team, etc.
What I would like to know is
1) Is there a straightforward way to do this with rsync & ssh (and Cygwin or PowerShell)?
2) What is the most accepted way to script pushing code to Windows boxes?
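For reference, this is the kind of one-liner I'd use between *nix boxes and am hoping to replicate (host and paths are made up):

```sh
# Mirror the built release onto the web server over SSH; only
# changed files go across the wire.
rsync -avz --delete -e ssh ./build/ deploy@webserver:/var/www/mysite/
```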
Thanks,
Jamie
Check out Jon Tørresdal's blog series on No-Click Web Deployment part 1 and part 2.