mounting a git http repository using davfs - apache

I'm trying to convert some projects at work from subversion to git. The projects are websites, and our current subversion setup uses davfs to mount the repository and point Apache's document root there. This way, Apache in dev runs the code currently checked into the svn repository.
mount:
mount.davfs http://code.repository/svn/site.com /mnt/davfs/site.com
httpd.conf:
ServerName site.com
DocumentRoot /mnt/davfs/site.com
I'm looking for a way to mimic this setup with git. But, from what I understand, mounting a git repository (yes, our git repo is accessed over http) this way will result in the git repository internals showing up as the docroot and not the code itself.
example:
ls /mnt/davfs/gitrepository
Parent Directory
HEAD
branches/
config
description
hooks/
info/
objects/
refs/
Does anyone know if there is a way to achieve the desired effect?
Thanks!

If you want to be able to browse the code, you should be using something like gitweb. If you want to push/pull from the repo, then the internals should be showing up as the docroot.
In a bare repository (the kind that you would use for such a central repo, since you generally don't want to push to non-bare repos), there is no actual checkout of the code files on disk, the only things in that bare repo are the "git internals".
If you want to get a copy of the code on the server out of the repository, you probably want to use git archive - possibly in a post-receive hook if you want it to run every time new code is pushed to the repository. See the following man pages for details:
http://www.kernel.org/pub/software/scm/git/docs/git-archive.html
http://git-scm.com/docs/githooks
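For illustration, a minimal post-receive hook along those lines might look like the following; the branch name and target directory are assumptions and would need to match your setup:
#!/bin/sh
# hooks/post-receive in the bare repository (must be executable)
# export the latest commit on master into the web docroot; the target directory must already exist
git archive master | tar -x -C /var/www/site.com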

Well, the git repo is another beast. If you want to browse source code, you need something like gitweb. Mind you, if you just pointed to an svn repo directly, you'd be looking at internals too; /mnt/davfs/site.com is probably hosted with Apache's mod_dav_svn, which does something similar to what gitweb would do.
You'll want to look at gitweb or its competition. Gitweb is IMHO the simplest to set up:
https://git.wiki.kernel.org/index.php/Gitweb
For sharing your repository (e.g. to make it clonable), just serve the tree as static HTTP pages (as the docroot directly), because davfs will not (reliably) make it possible for others to push to your repo anyway.
Pushing would be done using the Smart HTTP server, git-daemon, or over ssh.
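If you go the static/dumb-HTTP route, a minimal sketch (assuming the bare repository sits under Apache's docroot and is readable by it) is to enable the sample post-update hook so clients can fetch over plain HTTP:
# inside the bare repository on the server
mv hooks/post-update.sample hooks/post-update   # the sample just runs "git update-server-info"
git update-server-info                          # run once by hand so info/refs exists
# clients can then clone read-only with something like:
git clone http://code.repository/gitrepository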

Related

can't open fossil repo over web

I've been struggling for a couple of days with this problem, but can't seem to fix it. I think I'm almost there... but not quite :(
This is where I am at.
I'm on a headless debian server, running virtualmin / webmin for creating my domains / users etc. I don't know if this will mess things up, but I'm happy to modify the config files manually (via webmin or via ssh/vim).
I am attempting to run fossil as a CGI service under Apache.
It's an internal site, named homeserver.net. I can reach the default pages just fine, and add and create links etc. as I want to.
Please note that the solution to my problem is at the end of the question.
So the files are located on disk at the following path, which tallies up with my Apache document root:
/home/homeserver/www
I would like to run fossil with both the internal site and, later on, any dev work that I practice on, in separate files. So I created a new directory for these repositories.
/home/homeserver/repos/web/site.fossil
/home/homeserver/repos/dev/ [no repository yet!]
Reading the instructions on the fossil page, I have created a short CGI file called 'fos_repo.cgi' that reads as follows:
#!/usr/bin/fossil
directory: /home/homeserver/repos
notfound: http://www.homeserver.net/site404.htm
when I open the link to
www.homeserver.net/cgi-bin/fos_repo.cgi
I get redirected to the 404 page that I have written. So the script is clearly being read and working.
From reading the fossil pages I understand that I should be able to use the following link to open/access the repo.
www.homeserver.net/cgi-bin/repos/web/site
I'm not sure why this isn't working...
so far I have tried the following.
I opened the repository from the cli, and had the server run in the background
fossil server site.fossil &
I thought maybe the file should have been inside the main repo directory, not inside a subdirectory, so I moved it... it now lives in
/home/homeserver/repos/site.fossil
I tried creating an alias to the file in apache
Alias /home/homeserver/repos/web/site.fossil /home/homeserver/www/repos
When I browse to
www.homeserver.net/repos/site
I get nothing, but going to
www.homeserver.net/repos/site.fossil
will attempt to download the file (which is a binary)
so I think I'm getting somewhere, but I'm not sure what I'm missing.
I've used fossil before, but I ran it as a local server, and started it up as and when I needed it.
I'm running it like this so that I can eventually push the site out to a live VPS (maybe even finish up hosting the fossil site on the VPS also).
PS: I really liked fossil when I used it before, and loved the whole integrated wiki and bug tracker, and the fact that I could simply copy the file to my external drive to do a backup. Personally I don't really want to change to something else, but if I have to....
thanks in advance.
David
Edit: trying other options.
So I thought I would try the single-repository method shown on the fossil page, and adjusted my CGI script accordingly.
Now when I navigate to www.homeserver.net/cgi-bin/fos_repo.cgi I get the following message returned:
SQLITE_CANTOPEN: cannot open file at line 30276 of [f5b5a13f73]
SQLITE_CANTOPEN: os_unix.c:30276: (21) open(/home/homeserver/repos)
However, if I ssh to the server and start it manually with
fossil server site.fossil
I can get to the server with www.homeserver.net:8081
So I either have a problem with my SQLite usage under Apache, or something else is wrong. Please help.
Solution
So for reasons of simplicity I've decided that using a single cgi file for each repo is what I am going to go with.
My initial directory structure was as follows:
/home/homeserver/www
/home/homeserver/www/repos
/home/homeserver/www/repos/web # for web site development
/home/homeserver/www/repos/dev # for other development
I think part of my problem was that I was hoping that, by having the directory: option point to my repos/ location, fossil would find the site.fossil file (located in repos/web) and the dev.fossil file (located in repos/dev).
Obviously this didn't work.
The reason I wanted it to look like this was for separation of the information on my system.
For some reason I had decided that pointing fossil at repos/ would give me a nice fossil-style front page and links to my repositories automatically. However, after having used the directory: version and getting the following error message
Unable to find or open the project repository
I realised that I was still going to need to write my front page to the repositories, and that my expectation was a little too much.
So I've decided to run with a single CGI file pointing to each repo that I need to make; a sketch of such a file follows.
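For reference, each per-repository CGI script ends up looking roughly like this (the interpreter path and repository path are whatever applies on your system, and the file must be executable by Apache):
#!/usr/bin/fossil
repository: /home/homeserver/repos/web/site.fossil
One such file per repository in cgi-bin, and each repo is then reached through its own script.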
Instead of
www.homeserver.net/cgi-bin/repos/web/site
try
www.homeserver.net/cgi-bin/repos.cgi/index
Reading your (very long) question again, I suggest trying
www.homeserver.net/cgi-bin/fos_repos.cgi/index

Issue Listing multiple SVN repositories from client

I've already set up Apache to manage svn requests.
Basically the structure of the svn related directory is this:
/Repository
-----OneRepo
-----TheOtherRepo
Repository is a "normal" directory, while OneRepo and TheOtherRepo are svn repositories.
I've used the SVNParentPath and SVNListParentPath directives, and if I go to localhost/Repository/ (with my browser) I can see all my repositories.
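For context, that part of the configuration is roughly the following sketch (assuming mod_dav_svn is loaded; the paths are illustrative):
<Location /Repository>
DAV svn
SVNParentPath /Repository
SVNListParentPath On
</Location>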
Now, if I try to access a single repository (for example: OneRepo) from a client (in my case Cornerstone but Versions is the same) everything works fine.
The problem is that I would like to access the repository listing from the client so that I have a big "folder" with all my projects in it. Does it make sense?
So, instead of writing http://192.168.x.x/Repository/OneRepo in my client (and it works) I would like to write http://192.168.x.x/Repository/ and view a listing of project and so checkout whatever project I would like to.
Is that possible?
Thanks
This works only in an HTTP browser, so your standard SVN client (command line, TortoiseSVN, etc.) cannot list your repositories.

git / github and web server deployment configuration

I'm running an Apache web server and was wondering what's the best way to deploy changes (from github) to the web server?
/var/www/ right now is only writable by root.
Should I have my git project directly in /var/www/? (so it creates /var/www/.git/?)
However, when I need to run commands (e.g. sudo git push), they wouldn't work (since my SSH keys are not available under sudo).
Would I be better off making /var/www/ writable by myself (and not just root)? Or should I add ssh keys to the root user? Or should I do something else entirely?
Thanks.
I use rsync to sync the contents of my local machine with the server, and if you're just deploying to one server, then it's pretty simple (and Capistrano is overkill). I put the following aliases in ~/.bash_profile:
alias eget='rsync -avie ssh matt@example.com:sites/example.com/www/ ~/Projects/example/example.com/www/ --exclude .DS_Store --exclude ".git*" --delete-after'
alias edep='rsync -avuie ssh ~/Projects/example/example.com/www/ matt@example.com:sites/example.com/www/ --exclude .DS_Store --exclude ".git*" --delay-updates --delete-after'
Then, from the git repo on my local machine, I do:
git commit -am 'commit some changes'
git pull --rebase # pull any new changes from remote (--rebase prevents an unnecessary merge commit.)
eget -n # confirm which files I've changed
If it looks fishy, I could do eget without the -n and then just do a git diff -w. Then, I could do git checkout -- path/to/file for the files I want to keep my changes for. Then, I commit the changes that were on the server that I didn't get yet. This would only happen if the files on the server are changing in a different way than from deployments. Otherwise, you know that your local version is always more up to date than the files on the server and so don't have to worry about overwriting things on the server that you don't yet have on your local. Continue...
edep -n # just see what files will be deployed/updated/etc.
edep # looks good. Deploy for real.
Done!
Check out the rsync(1) Mac OS X Manual Page for more info.
Another option is to use the Git post-receive hook. But, you'll have to install Git on the server to do that. Also, I recommend putting the .git directory outside of your public www directory for security & cleanliness reasons. You can do this with the Git core.worktree configuration option. For example, from ~/git/example.com.git, do git init --bare; git config core.worktree ~/sites/example.com/. That makes ~/git/example.com.git like the .git dir for ~/sites/example.com/.
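As a sketch of that approach (the paths are the example ones above, and the branch is assumed to be master), the post-receive hook in the bare repository can simply force a checkout into the work tree on every push:
#!/bin/sh
# hooks/post-receive in ~/git/example.com.git (must be executable)
GIT_WORK_TREE=$HOME/sites/example.com git checkout -f master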
Create a central repository, use git's branching to create different branches for different purposes, and never serve your whole repository publicly; in particular, never serve your .git directory publicly (since that's the same as serving to the public everything you ever did with the code or put in the repository). Off the top of my head, here are the steps I recommend, from my own experience:
Create a central/hub repository for the code. (optional, but recommended. Even better is using github.com for your central repository). You can then check out local copies for local deployments, e.g. when you want to recreate the site on your laptop. Not necessary, but very convenient and makes sure your site is portable. You can have a staging repository and staging branch for development purposes. You can also have a repo and branch for production purposes.
Create an explicitly public directory in the repository that is not the root of the repository, e.g. a /www/ or /served/ or /public/ directory within the repository. This is stuff that will be publicly available and indexable by search engines, so be careful what goes in there. Assume that anything that goes in there is public knowledge, cached for eternity, and will be the target of security vulnerability attacks (because that could easily be the truth).
Create the dev repository: git clone the central repository on the server (e.g. cd /home/tchal, then git clone git@github.com:tchalvak/ninjawars.git), ideally in a folder that has shared permissions for your developer group.
Create a symbolic link for your development site: cd /var/www/, then ln -s /path/to/shared/repository/public/ nickNameForDevSiteHere, creating a symbolic link to only the served/public files of the site and giving you a simple development-level site (optional, but recommended). That way, the dev site can easily be accessed via an IP and a nickname, e.g. http://10.0.1.123/publicdevelopmentsitenickname, without the need for a real domain name.
Specify the live & deployed code commit. You may well want to create a live-branch for whatever code is currently "live"; just be aware that this branch will probably have to be forcibly overwritten periodically, e.g. git branch live-branch followed by git push -f origin live-branch. Consider it a snapshot of your code, and not a branch that will stay stable.
When you're sure that the dev site has been tested well enough, either deploy the live-branch code manually, via a custom deploy script, or use a distinct repository with the live-branch checked out in it, serving only the explicitly public content, similar to the dev site.
Create a virtual host in Apache for the domain name. For example, you could use something like:
<VirtualHost *>
    ServerName greatdomain.com
    ServerAlias www.greatdomain.com
    DocumentRoot /srv/greatdomain/www/
</VirtualHost>
That's a huge topic, so if you're not clear on all the details, I recommend further research into setting up a virtual host in Apache.
Point your DNS for the domain name at the ip of the server.
In summary, you can pretty easily use git to deploy all of your code using a branch specific to each deployment type. It won't help with syncing, for example, databases between deployments, but that is a step you can figure out after you have things running, as a second tier of deploying a site, and do manually in the meantime.

Structuring a central Git remote to serve production Apache sites

I've been referring to http://toroid.org/ams/git-website-howto as a starting point for a production web server running on a managed VPS. The VPS runs cPanel and WHM, and it will eventually host multiple client websites, each with its own cPanel account (thus, each with its own Linux user and home directory from which sites are served). Each client's site is a separate Git repository.
Currently, I'm pushing each repository via SSH to a bare repo in the client's home folder, e.g. /home/username/git/repository.git/. As per the tutorial above, each repo has been configured to checkout to another directory via a post-receive hook. In this case, each repo checks out to its own /home/username/public_html (the default DocumentRoot for new cPanel accounts), where the files are then served by Apache. While this works, it requires me to set up (in my local development environment) my remotes like this:
url = ssh://username@example.com/home/username/git/repository.git/
It also requires me to enter the user's password every time I push to it, which is less than ideal.
In an attempt to centralize all of my repositories in one folder, I also tried pushing to /root/git/repository.git as root and then checking out to the appropriate home directory from there. However, this causes all of the checked-out files to be owned by root, which prevents Apache from serving the site, with errors like
[error] [client xx.xx.xx.xx] SoftException in Application.cpp:357: UID of script "/home/username/public_html/index.php" is smaller than min_uid
(which is a file ownership/permissions issue, as far as I can tell)
I can solve that problem with chown and chgrp commands in each repo's post-receive hook--however, that also raises the "not quite right" flag in my head. I've also considered gitosis (to centralize all my repos in /home/git/), but I assume that I'd run into the same file ownership problem, since the checked-out files would then be owned by the git user.
Am I just approaching this entire thing the wrong way? I feel like I'm completely missing a third, more elegant solution to the overall problem. Or should I just stick to one of the methods I described above?
It also requires me to enter the user's password every time I push to it, which is less than ideal
It shouldn't be necessary if you publish your public SSH key to the destination account's ".ssh/authorized_keys" file.
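For example, from your local machine (the user and host here are illustrative):
ssh-copy-id username@example.com
# or, if ssh-copy-id isn't available:
cat ~/.ssh/id_rsa.pub | ssh username@example.com 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'
After that, pushes over SSH to that account no longer prompt for a password.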
See also locking down ssh authorized keys for instance.
But also the official reference Pro Git Book "Setting Up the Server".

Using Bazaar to handle Website Versioning

I imagine this is a pretty basic question but I haven't been able to find an answer anywhere.
I develop websites. In the past I've handled all the live files manually and it stinks, of course. I've been hoping Bazaar could add some power and organization to the way we work.
Right now, I work with a local server on my laptop and want to gracefully push data onto the live server. Currently, I'm doing the following:
Local machine:
bzr push sftp://user@server/path/to/project/BZR/live
On server:
rm -r /path/to/project/live
bzr branch /path/to/project/BZR/live
Is there any way to get the local files live from the push?
Otherwise, is a branch to the live path correct?
Is there any way to get Bazaar to just update changed files in the live path, so that I don't have to delete /live each time?
Right now I have to manually edit .htaccess with each upload. If I didn't have to delete /live, I imagine I could tell bzr to ignore it and all would take care of itself.
Thanks for your help!
-Nicky
Check the bzr-upload plugin, and also the push-and-update plugin.
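With the bzr-upload plugin installed, a sketch of the workflow would be (the target path is illustrative):
bzr upload sftp://user@server/path/to/project/live
The plugin's upload command pushes the working-tree contents rather than the branch internals, and subsequent runs should only transfer files changed since the last upload, so there is no need to delete /live each time.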