git / github and web server deployment configuration - apache

I'm running an Apache web server and was wondering what's the best way to deploy changes (from github) to the web server?
/var/www/ right now is only writable by root.
Should I have my git project directly in /var/www/? (so it creates /var/www/.git/?)
However, commands I'd need to run (e.g. sudo git push) wouldn't work, since my SSH keys aren't available under sudo.
Would I be better off making /var/www/ writable by myself (and not just root)? Or should I add ssh keys to the root user? Or should I do something else entirely?
Thanks.

I use rsync to sync the contents of my local machine with the server, and if you're just deploying to one server, it's pretty simple (and Capistrano is overkill). I put the following aliases in ~/.bash_profile:
alias eget='rsync -avie ssh matt@example.com:sites/example.com/www/ ~/Projects/example/example.com/www/ --exclude .DS_Store --exclude ".git*" --delete-after'
alias edep='rsync -avuie ssh ~/Projects/example/example.com/www/ matt@example.com:sites/example.com/www/ --exclude .DS_Store --exclude ".git*" --delay-updates --delete-after'
Then, from the git repo on my local machine, I do:
git commit -am 'commit some changes'
git pull --rebase # pull any new changes from remote (--rebase prevents an unnecessary merge commit.)
eget -n # confirm which files I've changed
If it looks fishy, I can run eget without the -n and then just do a git diff -w. Then I can do git checkout -- path/to/file for the files whose local changes I want to keep, and commit the changes that were on the server that I didn't have yet. This would only happen if the files on the server are changing in some way other than through deployments; otherwise, you know that your local version is always more up to date than the files on the server, so you don't have to worry about overwriting things on the server that you don't yet have locally. Continuing:
edep -n # just see what files will be deployed/updated/etc.
edep # looks good. Deploy for real.
Done!
Check out the rsync(1) Mac OS X Manual Page for more info.
Another option is to use the Git post-receive hook. But, you'll have to install Git on the server to do that. Also, I recommend putting the .git directory outside of your public www directory for security & cleanliness reasons. You can do this with the Git core.worktree configuration option. For example, from ~/git/example.com.git, do git init --bare; git config core.worktree ~/sites/example.com/. That makes ~/git/example.com.git like the .git dir for ~/sites/example.com/.
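For example, a minimal post-receive hook for that layout might look like this (a sketch; the paths assume the ~/git/example.com.git and ~/sites/example.com example above):
#!/bin/sh
# ~/git/example.com.git/hooks/post-receive (make it executable with chmod +x)
# Force-checkout the latest master into the work tree every time someone pushes.
GIT_WORK_TREE=$HOME/sites/example.com git checkout -f master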

Create a central repository, use Git's branching to create different branches for different purposes, and never serve all of your repository publicly, nor your .git directory (since that's the same as serving to the public everything you ever did with the code or ever put in the repository). Off the top of my head, here are the steps I recommend, from my own experience:
Create a central/hub repository for the code (optional, but recommended; even better is using github.com as your central repository). You can then check out local copies for local deployments, e.g. when you want to recreate the site on your laptop. Not necessary, but very convenient, and it makes sure your site is portable. You can have a staging repository and staging branch for development purposes. You can also have a repo and branch for production purposes.
Create an explicitly public directory in the repository that is not the root of the repository: e.g. create a /www/ or /served/ or /public/ directory within the repository. This is the stuff that will be publicly available and indexable by search engines, so be careful what goes in there. Assume that anything that goes in there is public knowledge, cached for eternity, and will be the target of security vulnerability attacks (because that could easily be the truth).
Create the dev repository: git clone the central repository on the server (e.g. cd /home/tchal then git clone git@github.com:tchalvak/ninjawars.git), though ideally in a folder that has shared permissions for your developer group.
Create a symbolic link for your development site: cd /var/www/, then ln -s /url/to/shared/repository/public/ nickNameForDevSiteHere, creating a symbolic link to only the served/public files of the site, giving you a simple development-level site (optional, but recommended). In that manner, the dev site can easily be accessible via some IP and a nickname, e.g. http://10.0.1.123/publicdevelopmentsitenickname, without the need for a real domain name.
Specify the live & deployed code commit. You may well want to create a live-branch for whatever code is currently "live"; just be aware that this branch will probably have to be forcibly overwritten periodically, e.g. git branch -f live-branch && git push -f origin live-branch. Consider it a snapshot of your code, not a branch that will stay stable.
When you're sure that the dev site has been tested well enough, either deploy the live-branch code manually, via a custom deploy script, or use a distinct repository with the live-branch checked out in it, serving only the explicitly public content, similar to the dev site.
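If you go the custom deploy script route, a minimal sketch could look like the following (this assumes a dedicated production clone at /srv/greatdomain, matching the VirtualHost below, and the live-branch convention above; adjust paths and names to your setup):
#!/bin/sh
# Run on the server to bring the production clone up to date with live-branch.
cd /srv/greatdomain || exit 1
git fetch origin
git checkout live-branch
# live-branch may have been force-pushed, so hard-reset rather than merge.
git reset --hard origin/live-branch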
Create a VirtualHost in Apache for the domain name. For example, you could use something like:
<VirtualHost *>
ServerName greatdomain.com
ServerAlias www.greatdomain.com
DocumentRoot /srv/greatdomain/www/
</VirtualHost>
That's a huge topic, so if you're not clear on all of the details, I recommend getting into further research of setting up a virtualhost in apache.
Point your DNS for the domain name at the ip of the server.
In summary, you can pretty easily use git to deploy all of your code using specific-to-each-deployment-type branches. Won't help with syncing, for example, databases between deployments, but that is a step that you could figure out after you have things running, as a second tier of deploying a site, and do it manually in the meantime.


I want to turn my localhost server into a real website

I created an application that runs on a localhost server using expressjs. And I also bought a domain.
I'm wondering if there is a way to take that localhost server and turn it into a real shared server.
I tried once to use a hosting service like hostgator but I still don't know how I can turn the express app into a real website.
I have no experience with any web development services so please don't tell me to use ....... whatever because I will have no idea what that is.
For one thing, it's not clear how your website actually works: if it's only Express, does it generate HTML, or is it purely JSON passed to browser clients via GET requests? (To each their own.)
There are many options for how you might do this. One of the best is to first make sure your server runs on Docker; find a tutorial on YouTube/Google/Stack Overflow/blogs on how to run your Express server with Docker. If you do that, you can deploy it to a container service from Google/Amazon/DigitalOcean. If that seems hard to you, there are other options.
Presumably you run your server with something like npm start. This guide can show you how to do essentially that, but on a cloud computer.
Before you begin, make sure that your locally working server is checked in to a cloud Git provider like GitHub, GitLab, Bitbucket, etc.
Amazon AWS and Google Cloud both offer free tiers: hosting is free for a certain amount of time (AWS: one year) or up to a certain amount of credit (Google Cloud). These two seem like viable places to start.
If you find an option you'd like, you'll need to:
create an account
Create a server (choose a cheap one, especially initially, e.g. a micro/small instance).
Find a tutorial on how to "SSH" into that server (which basically means remotely controlling the terminal on that server). Google actually makes this fairly easy; there's a big button that says "SSH" next to each server.
Once you've logged into that computer, you'll be able to run the same commands you probably normally run on your home computer:
The computer you'll be getting is likely a virtual Linux machine, probably something like Ubuntu. Find a tutorial on how to get Git and Node installed there (it's something like sudo apt-get update && sudo apt-get install git nodejs npm).
Once you have Git and Node, try mkdir www and cd into it: mkdir www && cd www (this isn't critical, but it is conventional).
Copy the link that allows you to "Clone your repo using HTTPS" (there's a button near the top of your GitHub (or other host's) repo page that provides it), then run git clone with that URL. You'll need to enter your GitHub credentials.
Now all the files that you had on your computer are on this new computer.
Next you'll probably have to run npm i to install your dependent npm packages. (This assumes you properly used .gitignore to keep node_modules out of the repository.)
Now you should be able to run your code as usual: npm run start
If all those steps work, you'll want something that keeps the server running "forever", like forever (https://www.npmjs.com/package/forever, installed with npm i -g forever) or, even better, pm2 (https://www.npmjs.com/package/pm2); either will let you run your Express server continuously.
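For example, with pm2 the setup is roughly this (a sketch; it assumes your package.json has a "start" script and that my-express-app is just an example name):
sudo npm i -g pm2                              # install pm2 globally
pm2 start npm --name my-express-app -- start   # run "npm start" under pm2; pm2 restarts the app if it crashes
pm2 save                                       # remember the current process list
pm2 startup                                    # prints a command you can run so pm2 starts on boot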
Finally, you'll need to configure the server on AWS/Google/whatever service you're using to forward traffic coming in on ports 80 and 443 to port 3000 and to allow inbound traffic. This differs depending on the service you chose, so find a tutorial for doing just that part.
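The exact steps are provider-specific (security groups on AWS, firewall rules on Google Cloud), but a generic Linux-level sketch, assuming the app listens on port 3000, looks like this:
# Redirect incoming HTTP traffic on port 80 to the app's port 3000.
sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 3000
# Port 443 would additionally need TLS termination, usually via a reverse proxy such as nginx.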
This will only allow people across the internet to see your service on an AWS URL or a Google URL, but it's a good chance to make sure everything works perfectly. Once you're happy with everything, associate your purchased domain with that special AWS/Google domain. You can do that on the AWS side, or on the GoDaddy/Namecheap/wherever-you-bought-your-domain side.
For the Docker option, you can download the AWS CLI tools and upload your built Docker container to AWS and have it available. Find a tutorial to do that.
Essentially your question is very broad, so I sometimes glossed over details, but this is essentially what you have to do.

SVN+Apache: Where to store repositories

I have been keeping all my Subversion repositories on my local computer for a while, but now I decided to move them to my web hosting server. It's an apache server and the hosting company has set up svn. My question is, where should I store my svn repositories. I originally stored them in the public_html/ directory, but (I'm certainly no security expert) I think only publicly available web content should be stored there. On the other hand, if I try storing the repositories in ~/var/svn/ then my subversion client (Eclipse) says "no element found". How do other people store their repositories with regards to Apache? Thank you.
You can select any physical location for your repository collection, because the logical path is defined later inside Apache (a Location container plus SVNPath or SVNParentPath). You only have to:
select a big partition (a repository may require a lot of space)
remember to chown/chmod the repo dirs, so that the Apache process can read and write the repository files
/var (e.g. /var/repos/) is a good candidate for the repo root
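For reference, the Apache side of that is typically a Location block like the following (a sketch; it assumes mod_dav_svn is loaded, the repositories live under /var/repos, and the auth file path is only an example):
<Location /svn>
DAV svn
SVNParentPath /var/repos
AuthType Basic
AuthName "Subversion repositories"
AuthUserFile /etc/apache2/dav_svn.passwd
Require valid-user
</Location>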
Typically, you would store data in an appropriate subdirectory of /var (not ~/var), that is if you have access to it. E.g. /var/svn
We haven't adopted SVN for our web source yet, but have been using CVS. Our solution was to simply prevent Apache from serving the CVS store using IndexIgnore
IndexIgnore .??* *~ *# HEADER* README* RCS CVS *,v *,t
You may be able to do something similar, using something based upon .svn.
It will be difficult if you are using shared web hosting, because you need root access to create an svn group and to create the repository structure. You also need to install an Apache module such as dav_svn. You will need to create a VirtualHost (sites-available) in order to serve your repos under a specific domain name.
There are a lot of tutorials on the interwebs -> http://www.debuntu.org/how-to-subversion-svn-with-apache2-and-dav/ (for Ubuntu)
Me, I rent a Virtual Private Server to host my SVN.

Rails and Git push, Git pull: logs return same commit, changes aren't made

I've got a local branch (master), a GitHub repo (origin), and another remote repo (server). I've set up my remote so locally I can type git push server master and push the changes to server/master.
When I type git log -1 locally and on the server they return the same commit, but none of the changes I made locally are visible on the server.
I deployed my app with Capistrano so redeploying it makes the changes visible immediately, but I don't want to have to redeploy every time I make a change.
Any idea what's going on here? I'm rather new to Git. Hopefully it's something easy to fix.
It sounds to me like you may be looking at a different directory than the one your web server is pointed to.
When I set up Capistrano:
cap deploy:setup
#then
cap deploy
it creates a directory structure similar to:
/releases (each deploy gets its own random-number directory)
/current (a symlink to the latest release)
/shared
None of these folders are tied to Git, which makes me think your web server may not be pointed to the same directory that you're using with Git.
--
You may find cap deploy is preferable as you'll be able to see the output if there are any issues.
I'm not a huge expert, but the above is how I've set up Rails with Capistrano.
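For reference, with that layout the Apache vhost would typically point at the current symlink's public directory, something like this (a sketch; names and paths are examples):
<VirtualHost *:80>
ServerName myapp.example.com
DocumentRoot /var/www/myapp/current/public
</VirtualHost>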
Normally origin is the name used for the GitHub remote, not master. You can check what remotes you have by doing git remote show. If you want even more detail on a remote, use git remote show origin (or whatever your remote is called, if it is not origin). This will give you a list. I suspect that what you have is actually two local branches (master and server). Try doing git push origin server. This will take your server branch and put it on GitHub.
Alternatively
If you are trying to combine the changes in your server branch with your master branch, then use git checkout master followed by git merge server. This will merge the changes from the server branch into your master branch, and you can then upload to GitHub via git push origin master.
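A sketch of that sequence, assuming master and server are both local branches as described:
git checkout master      # switch to your master branch
git merge server         # merge the server branch's changes into master
git push origin master   # upload the updated master to GitHub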

mounting a git http repository using davfs

I'm trying to convert some projects at work from Subversion to Git. The projects are websites, and our current Subversion setup uses davfs to mount the repository and point Apache's document root there. This way Apache in dev runs the code currently checked into the SVN repository.
mount:
mount.davfs http://code.repository/svn/site.com /mnt/davfs/site.com
httpd.conf:
ServerName site.com
DocumentRoot /mnt/davfs/site.com
I'm looking for a way to mimic this setup with git. But, from what I understand, mounting a git repository (yes, our git repo is accessed over http) this way will result in the git repository internals showing up as the docroot and not the code itself.
example:
ls /mnt/davfs/gitrepository
Parent Directory
HEAD
branches/
config
description
hooks/
info/
objects/
refs/
Does anyone know if there is a way to achieve the desired effect?
Thanks!
If you want to be able to browse the code, you should be using something like gitweb. If you want to push/pull from the repo, then the internals should be showing up as the docroot.
In a bare repository (the kind that you would use for such a central repo, since you generally don't want to push to non-bare repos), there is no actual checkout of the code files on disk, the only things in that bare repo are the "git internals".
If you want to get a copy of the code on the server out of the repository, you probably want to use git archive - possibly in a post-receive hook if you want it to run every time new code is pushed to the repository. See the following man pages for details:
http://www.kernel.org/pub/software/scm/git/docs/git-archive.html
http://git-scm.com/docs/githooks
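A sketch of that post-receive approach (paths are examples; it assumes the bare repository and the document root are on the same machine):
#!/bin/sh
# hooks/post-receive in the bare repository: export a clean copy of master into the docroot.
git archive master | tar -x -C /var/www/site.com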
Well, the Git repo is another beast. If you want to browse source code, you need something like gitweb. Mind you, if you just pointed at an SVN repo, you'd be looking at internals too; the /mnt/davfs/site.com mount is probably served by Apache's mod_dav_svn, which does something similar to what gitweb would do.
You'll want to look at gitweb or its competition. Gitweb is IMHO the simplest to set up:
https://git.wiki.kernel.org/index.php/Gitweb
For sharing your repository (to make it clonable, for example), just serve the tree as static HTTP pages (as the docroot directly), because davfs will not (reliably) make it possible for others to push to your repo anyway.
Pushing would be done using the Smart HTTP server, git-daemon, or over SSH.
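If you do serve a bare repository as static pages over plain ("dumb") HTTP, its info/refs has to be kept up to date; a sketch (path is an example):
cd /path/to/repository.git                      # the bare repository being served over HTTP
git update-server-info                          # regenerates info/refs so dumb-HTTP clients can discover refs
mv hooks/post-update.sample hooks/post-update   # the stock sample hook just reruns update-server-info after each push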

Structuring a central Git remote to serve production Apache sites

I've been referring to http://toroid.org/ams/git-website-howto as a starting point for a production web server running on a managed VPS. The VPS runs cPanel and WHM, and it will eventually host multiple client websites, each with its own cPanel account (thus, each with its own Linux user and home directory from which sites are served). Each client's site is a separate Git repository.
Currently, I'm pushing each repository via SSH to a bare repo in the client's home folder, e.g. /home/username/git/repository.git/. As per the tutorial above, each repo has been configured to checkout to another directory via a post-receive hook. In this case, each repo checks out to its own /home/username/public_html (the default DocumentRoot for new cPanel accounts), where the files are then served by Apache. While this works, it requires me to set up (in my local development environment) my remotes like this:
url = ssh://username@example.com/home/username/git/repository.git/
It also requires me to enter the user's password every time I push to it, which is less than ideal.
In an attempt to centralize all of my repositories in one folder, I also tried pushing to /root/git/repository.git as root and then checking out to the appropriate home directory from there. However, this causes all of the checked-out files to be owned by root, which prevents Apache from serving the site, with errors like
[error] [client xx.xx.xx.xx] SoftException in Application.cpp:357: UID of script "/home/username/public_html/index.php" is smaller than min_uid
(which is a file ownership/permissions issue, as far as I can tell)
I can solve that problem with chown and chgrp commands in each repo's post-receive hook--however, that also raises the "not quite right" flag in my head. I've also considered gitosis (to centralize all my repos in /home/git/), but I assume that I'd run into the same file ownership problem, since the checked-out files would then be owned by the git user.
Am I just approaching this entire thing the wrong way? I feel like I'm completely missing a third, more elegant solution to the overall problem. Or should I just stick to one of the methods I described above?
It also requires me to enter the user's password every time I push to it, which is less than ideal
It shouldn't be necessary if you publish your public SSH key to the destination account's .ssh/authorized_keys file.
See also locking down ssh authorized keys for instance.
But also see the official reference, the Pro Git book's "Setting Up the Server".
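For example, a sketch of publishing the key from your local machine (assuming OpenSSH, with username and example.com standing in for the cPanel account and host):
ssh-copy-id username@example.com
# Or, if ssh-copy-id isn't available:
cat ~/.ssh/id_rsa.pub | ssh username@example.com 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'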