Is there a way to back up all installed applications/RPMs/packages (even repositories) as-is, with the same exact versions and patches, in one script that can re-install them on a fresh bare-bones server of the same specs?
Note: I can't do any image or Clonezilla tricks.
Note: there is some 3rd-party software that is not offered by any repos ... the solution should include a backup of these packages (preferably all of them).
thanks!
As noted in a comment, you can back up the RPM database, but that is only one part of replicating your configuration on another server:
RPM's database records almost all of the information regarding the packages you have installed. Using the database, you could in principle write a script that uses cpio or pax to append all of the files known to the RPM database to a suitably large archive area. rpm -qa gives a list of packages, and rpm -ql package gives a list of files for the given package.
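For instance, a minimal sketch of that idea in shell (the /backup paths are arbitrary choices, and files marked %ghost in a package may not exist on disk, so expect some cpio warnings):

# Save the full package list, versions included, for later reinstallation
rpm -qa > /backup/package-list.txt
# Append every file the RPM database knows about to one cpio archive
rpm -qa | while read pkg; do
    rpm -ql "$pkg"
done | cpio -o > /backup/rpm-files.cpio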
However, RPM's database does not necessarily record files created by package %pre and %post scripts.
Likewise, it does not record working data (such as a MySQL database) that may be in /var/lib.
To handle those last two cases, you are going to have to do some analysis of your system to ensure that you do not leave something behind. rpm -qf pathname can tell you which package owns a given file or directory. What you have to do is check for cases where RPM does not know who owns it.
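A rough way to hunt for such unowned files, assuming /etc and /var/lib are the areas you care about:

# Print files that no installed package claims ownership of
find /etc /var/lib -type f | while read f; do
    rpm -qf "$f" > /dev/null 2>&1 || echo "unowned: $f"
done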
I am trying to use bzr for installing OpenERP. The problem is that I have a very slow internet connection.
When I try "sudo bzr branch lp:openobject-addons/7.0 addons" it takes too much time and sometimes the connection is broken. My questions are:
How can I resume the process after the connection breaks? Every time I repeat the command I get the error "folder already exists..."
Is there any way I can restore a local backup of the files and folder structure, compare those files/folders with the files on the server, and upgrade only the changed files/folders via bzr? This could be a solution for my slow internet connection.
If I successfully download all the files from a branch, which command should I use later to check whether any files have changed on the server, and if so, how can I pull in those changes?
Thank you very much
Best regards
Paulo
What takes a lot of time and bandwidth is not transferring the OpenERP addons files themselves, but the repository containing the whole versioning history. It has grown quite big over the years, due to the number of commits as well as the daily translation updates exported by Launchpad.
Answering your points one by one:
If you don't actually need the revision history, you can grab a "lightweight checkout" of the addons instead of a full checkout, by using this command:
bzr checkout --lightweight lp:openobject-addons/7.0 addons
It will be much faster but will only get the files, not the history. You'll still be able to use bzr pull to grab the latest changes from upstream. See also the doc about bzr checkout.
Now if you still want a full checkout you can use the trick of grabbing only a few hundred revisions at a time (there are about 9000 in addons 7.0 right now), so you can resume at any time even after a timeout:
$ bzr branch lp:openobject-addons/7.0 addons -r 100 # grab first 100 revs
$ cd addons
$ bzr pull -r 1000
$ bzr pull -r 2000
$ bzr pull -r 3000
$ ...
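If you'd rather script that, a loop like this (the 1000-revision step is arbitrary) can simply be re-run after every timeout and will pick up where it left off:

$ cd addons
$ for rev in $(seq 1000 1000 9000); do bzr pull -r $rev || break; done
$ bzr pull   # finally catch up to the latest revision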
There's no easy way to bootstrap a full addons checkout otherwise, unless you manage to perform a full checkout on another machine or internet connection, in which case you should be able to simply transfer the directory (most importantly the .bzr directory it contains) to any other machine.
In order to see the difference between a local branch/checkout and another repository you can use bzr missing, for example bzr missing lp:openobject-addons/7.0. You can then grab the latest changes from that repository (provided it is compatible with yours) using bzr pull.
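A typical session might look like this:

$ cd addons
$ bzr missing lp:openobject-addons/7.0   # list revisions you don't have yet
$ bzr pull lp:openobject-addons/7.0      # fetch them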
Now you should really have a look at the bzr documentation in order to get more information about the typical use cases. The documentation also contains a "bzr cheat sheet" that may help you.
Unfortunately I don't think you can resume an interrupted bzr branch operation.
OpenERP's official website does provide source code nightly builds, but they use a different structure. I'd recommend you ask a friend who has a faster Internet connection to bzr branch the source code repositories and transfer them to you.
You could then do bzr pull to get the latest changes and merge them.
How to save/restore Sublime Text 2 configs/plugins to migrate to another computer? states that, to backup a Sublime Text 2 installation, a user should preserve the ~/Packages/User directory (from the user's local data folder on whatever OS they're using).
However, http://andrew.hedges.name/blog/2012/01/19/sublime-text-2-more-sublime-with-a-drop-of-dropbox and most other walkthroughs for using Dropbox to sync Sublime's settings specify three directories: ~/Packages, ~/Installed Packages and ~/Pristine Packages.
What is the functional difference between backing up just ~/Packages/User and backing up those three directories?
I think that Packages/User is the one in which you are supposed to put settings (according to Sublime's official and unofficial documentation). However, some people put them in the other folders from time to time.
The Dropbox advice may be a hedge against poor practice.
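For what it's worth, the Dropbox walkthroughs linked in the question boil down to something like this (shown for Linux; the Sublime Text 2 data path differs per OS, so treat the paths as assumptions):

# Move the settings into Dropbox once, then symlink them back into place
mv ~/.config/sublime-text-2/Packages/User ~/Dropbox/sublime/User
ln -s ~/Dropbox/sublime/User ~/.config/sublime-text-2/Packages/User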
From here:
Installed Packages is:
You will find this directory in the data directory. It contains a copy of every sublime-package installed. Used to restore Packages.
These are the packages installed as sublime-packages. I don't think Package Control uses this, but if you install something as a sublime-package you may want to keep it.
Pristine Packages is:
You will find this directory in the data directory. It contains a copy of every shipped and core package. Used to restore Packages.
So essentially it is a set of .sublime-package files used to restore things if you break something.
Packages is:
The packages used by Sublime Text, either installed as part of Sublime itself or as plugins.
User is:
The user directory is your personal directory, containing configurations, additional snippets, etc.
Below are my personal views on what to save, so feel free to ignore it if you would like.
I would have to agree with the post saying to just save the User directory, as Package Control will grab all of the plugins in its list if they aren't already installed. I didn't see this mentioned in that post, but you can also add repositories (by specifying a URL) to Package Control, which allows you to install packages that were never submitted to Package Control but are still hosted somewhere. One argument I can see for saving the complete Packages directory is if you are using plugins that aren't hosted anywhere (though these could probably be moved to the Packages directory without any problems).
Installed Packages and Pristine Packages are used to restore packages, so I wouldn't think they would be needed, but I'm sure there is some use case where they are.
Anyway, I realize I got a bit off topic at the end there, but I hope everything before that helps clarify things.
I would like to use one RPM spec to build subpackages for different environments (live, testing, developer) covering the same files: a package called name-config-live, one called name-config-testing, and one called name-config-developer, each containing the same paths but with the configs corresponding to the environment it is named after.
As an example, let's say on all environments I have a file called /etc/name.conf; on testing I want it to contain "1", on development "2", and on live "3". Is it possible to do this in the same spec, given that the subpackage generation only happens at the end, not in the order I enter it? (And hopefully not with %post -n.)
I tried using BuildRoot, but it seems that's a global attribute.
I don't think there's a native way; I would do a %post like you had noted.
However, I would do this (similar to something I do with an internal-only package I develop for work); a sketch of the %post scriptlet follows the list:
- Ship three separate files: /etc/name.conf-developer, /etc/name.conf-live, etc.
- Have all three packages provide a virtual package, e.g. name-config
- Have the main package require name-config
  - This will make rpm, yum, or whatever require that at least one of them be installed in the same transaction
- Have all three packages conflict with each other
- Have each config package's %post (and possibly %verify) symlink /etc/name.conf to the proper config
  - This also helps show the user what is happening
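A minimal sketch of what that last scriptlet might contain, using the name-config-live subpackage from the example (scriptlet bodies are plain shell; -f replaces any symlink left by a previously installed variant):

# %post of the hypothetical name-config-live subpackage
ln -sf /etc/name.conf-live /etc/name.conf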
Cons:
- It's a little hackish
- rpm -q --whatprovides /etc/name.conf will say the file is not owned by any package
I use the deb file for production and the source for development. Is this the correct way to do things?
I think that the deb might have certain optimizations (.pyo or .pyc files) for the production environment.
But since I have to move my custom modules one at a time to production, I find it increasingly difficult.
The actual addons path is here:
(1) /usr/share/pyshared/openerp/addons
But the init.d script points to:
(2) /usr/lib/pymodules/python2.7/openerp/addons
In some modules the __init__.py is in (1) (e.g. web_rpc), and for some it's in (2) (e.g. hr).
What is the actual difference between
http://nightly.openerp.com/6.1/nightly/src/
and
http://nightly.openerp.com/6.1/nightly/deb/
I haven't tried the deb files, because we use the Ubuntu all-in-one script from openerpappliance.com. It downloads the source from Launchpad and then runs the deployment scripts for you. It will also do updates after you've installed.
We're very happy with the 5.0 version, but we haven't tried the 6.1 version, yet.
What you can do with 6.1 is give multiple addons paths in your config file, comma-separated (e.g. addons_path = /path/one,/path/two). Alternatively, you can keep your customized modules wherever you want and just put a link (shortcut) to them in the server's addons folder; see the sketch below. This will give you flexibility.
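For example, a sketch of the symlink approach (both paths are placeholders for your own layout):

# Keep the customized module in your own tree and link it into the
# server's addons directory
ln -s ~/dev/custom_addons/my_module /usr/lib/pymodules/python2.7/openerp/addons/my_module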
Thank you.
When I am on my department's server, I cannot use commands such as "apt-get install nethack". I have to build nethack from source to get it working, or at least so I have been told. I cannot understand the reason. Why do I need to build things from source? Why is the use of commands such as "apt-get" forbidden? And why do I not need root access to build from source?
apt-get is a system-level command that installs packages for all users.
If you download and compile, you are only creating local "copies" of the binaries, not system-wide ones. If you tried to complete the install process with make install, this would most likely fail because you do not have sufficient privileges to install the program for all users' access (the same reason you can't run apt-get install).
When you compile a program from source, you can give it the option '--prefix=~/'. This causes it to install relative to your own home directory (so binary programs typically end up in '~/bin', man pages in '~/man', etc.). This poses no problems because you already have permission to write there.
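As a sketch, here is the usual pattern for an autotools-style project (NetHack's own build system differs in its details, so take this as the general shape rather than its exact steps):

tar xzf some-program.tar.gz    # hypothetical tarball name
cd some-program
./configure --prefix="$HOME"   # install under your home directory
make
make install                   # no root needed: files land in ~/bin, ~/man, ...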
Apt-get, on the other hand, installs packages into the global filesystem ('/bin/', '/usr/bin/', etc.), which can impact other users and so, quite rightly, requires administrative access.
If you want to build some program yourself, you can use the command
apt-get source app-name
This will work even if you are not root, since it only fetches the source code for app-name and puts it in the current directory. That is easier than having to track down the source yourself, and there is a better chance of getting it to work, since you download the version that should work on your system.
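For example (assuming deb-src entries are enabled in your sources list; the package name is illustrative):

$ apt-get source nethack-console   # fetch and unpack into the current directory
$ cd nethack-console-*/
$ # then build per the project's instructions, e.g. with the --prefix trick above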
Alternatively you should bug your sysadmin to install the programs you need, since it is his job (and if you need them, chances are that the rest of your team does too).
Because apt-get will install a program system-wide.
The locations to which apt-get writes installed files (/bin, /usr/bin, ...) are restricted to root access. I imagine that when you build from source you're not executing the install step of the build. You're going to need to set a prefix for the installation so that the files end up somewhere you can write. This thread talks a bit about setting prefixes for apt-get, and you'll probably want to set your prefix to something like
~/software/
and then add the resulting bin directories to your PATH.
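For example, assuming the ~/software prefix above:

export PATH="$HOME/software/bin:$PATH"   # add to ~/.bashrc to make it permanent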