I would like to know where a package gets installed from when we write a Puppet manifest for a package resource with the attribute below.
ensure => installed,
I'm aware that Puppet has providers (under /usr/lib/ruby/vendor_ruby/puppet/provider/package) such as yum, pip, gem, and apt, which send the request to the software installation tools responsible for getting this done. However, I would like to know from which repository these packages are installed. For example, we configure a local yum repository when we want to install packages on a server via the yum command.
Is it over the internet, or should the Puppet nodes be configured and connected to a local repository? Or does Puppet come with pre-configured repositories for these installation tools by default?
I would appreciate a good explanation in this regard. Thanks in advance.
Whether the Puppet nodes should be connected to the internet is a question you need to answer yourself, in terms of how you would like your architecture to look. Puppet only runs package install commands that would also work if you ran them yourself in a shell, such as yum install x; it doesn't matter whether it's over the internet or not.
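To illustrate, here is a minimal sketch of such a resource (httpd is a hypothetical package name): Puppet only declares *that* the package should be installed; *where* the bits come from is decided entirely by the node's own repository configuration.

```puppet
# Hypothetical example: Puppet asks the platform's package provider
# (yum here) to install the package; yum then resolves 'httpd' from
# whatever repositories are configured on the node itself.
package { 'httpd':
  ensure   => installed,
  provider => 'yum', # normally inferred from the OS; shown for clarity
}
```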
It would work either way, and it depends on your repository configuration.
For example, the yum repository configuration lives under /etc/yum.repos.d/.
The files there show where yum will look for the packages you are trying to install.
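As a sketch, a repo file in that directory looks roughly like this (the repo name and URLs are hypothetical); the baseurl is where yum will actually fetch packages from, whether that is an internet mirror or a local one:

```ini
# /etc/yum.repos.d/internal.repo (hypothetical example)
[internal-repo]
name=Internal mirror
baseurl=http://repo.example.com/centos/7/os/x86_64/
enabled=1
gpgcheck=1
gpgkey=http://repo.example.com/RPM-GPG-KEY-example
```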
I'm using zef to install the Twitter module with zef install Twitter
I'm getting the following error:
Enabled fetching backends [git path curl wget] don't understand git://github.com/raku-community-modules/Twitter You may need to configure one of the following backends, or install its underlying software - [pswebrequest]
I've googled 'pswebrequest' and I'm not getting anything of note.
I've tried installing HTTP::Tinyish, one of the dependencies, to check that my zef installation is working properly, and that was fine.
Any pointers on how to fix this?
The source-url should end with .git if it's a git repository, as the backends use extensions to figure out how and what is needed to fetch and extract various types of packaged distributions.
You could open a pull request to add the .git to the source-url, or you could work around it manually via:
zef install https://github.com/raku-community-modules/Twitter.git
I have a Cloud9 workspace with a small app running. Currently I am trying to start the Rethink service and it's not working. I installed rethink using
npm install rethink -g --save
and I then type
rethink
to try and start the server, and it tells me
bash: rethink: command not found
What am I doing wrong? It seems pretty straightforward, yet it's not working.
Not everything can be installed via npm; as its name suggests (Node Package Manager), it's mainly for Node.js packages and programs. RethinkDB is primarily written in C++, and its tooling is written in Python.
The rethink npm package is an ODM (Object Document Mapper, similar to an ORM), one of the few Node.js packages that allow your Node.js program to interface with the database. Refer to it here: https://www.npmjs.com/package/rethink
To install RethinkDB on Cloud9, follow the instructions for installing RethinkDB on Ubuntu here: https://www.rethinkdb.com/docs/install/ubuntu/. You only need to follow the section that says "With binaries".
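For reference, the "With binaries" steps amounted to roughly the following at the time (the repository URL and key location may have changed since, so treat this as a sketch and prefer the official page):

```
# Add the RethinkDB apt repository (verify the URL against the official docs)
source /etc/lsb-release
echo "deb https://download.rethinkdb.com/apt $DISTRIB_CODENAME main" | \
    sudo tee /etc/apt/sources.list.d/rethinkdb.list
wget -qO- https://download.rethinkdb.com/apt/pubkey.gpg | sudo apt-key add -

# Install and start the server
sudo apt-get update
sudo apt-get install rethinkdb
rethinkdb   # note the server binary is 'rethinkdb', not 'rethink'
```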
I don't know how it got this bad. I'm a web developer, and I use Ubuntu, and here are just some of the package managers I'm using.
apt-get for system-wide packages
npm for node packages
pip for python packages
pip3 for python 3 packages
cabal for haskell packages
composer for php packages
bower for front-end packages
gem for ruby packages
git for other things
When I start a new project on a new VM, I have to install seemingly a dozen package managers from a dozen different places, and use them all to create a development environment. This is just getting out of control.
I've discovered that I can basically avoid installing and using pip/pip3 just by installing python packages from apt, like sudo apt-get install python3-some-library. This saves from having to use one package manager. That's awesome. But then I'm stuck with the Ubuntu versions of those packages, which are often really old.
What I'm wondering is, is there a meta-package manager that can help me to replace a few of these parts, so my dev environment is not so tricky to replicate?
I had a thought to make a package manager to rule them all for that very reason. I never finished it, though; too much effort is required to stay compatible. Each package manager has a huge community supporting its upkeep.
The best advice I have is to try to reduce your toolchain for each type of project. Ideally you shouldn't need to work in every language you know on each project. How many of your projects use both Python 2 and Python 3 simultaneously?
Keep using apt for your system packages and install git with it. From there try to stick to one language per project. AFAIK all of the package managers you listed support installing packages from git. The languages you mentioned all have comparable sets of tooling, so use the toolchain available for the target language.
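For example, pip and npm can both install a package directly from a git URL (the repository shown here is hypothetical), which lets git serve as the common distribution channel:

```
# Hypothetical repository URL, for illustration only
pip install git+https://github.com/example/project.git
npm install git+https://github.com/example/project.git
```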
I worked with a team that was using composer, npm, bower, bundler, maven, and a tar.gz file for frontend SPAs, because those were the tools they knew. On top of all that, they were using Vagrant simply as a deployer. We reconsidered our toolchain, described our needs, and realized they could be expressed in a single language once we adopted appropriate tooling for the task at hand.
I have recently started using the Atom editor. It's pretty great so far. I am planning to install it on several other machines.
How can I replicate the config and the list of installed packages from my current machine to other machines? Is there a config that I can export and import on the other machines?
Use Git to version control your config file (~/.atom/config.cson), and any other config files (dotfiles) you may have.
You can then host your Git repository for free somewhere like GitHub, and retrieve it on other computers simply by running git clone https://github.com/{username}/{repo}.
You can then keep it up to date using git push (to upload changes) and git pull (to download changes).
To track installed packages as well, you will need to run:
apm list --installed --bare > ~/.atom/package.list
And add that file to Git also. To restore, use:
apm install --packages-file ~/.atom/package.list
You can use the apm command to save/restore installed packages.
To export packages (package names only):
apm list --installed --bare > ~/Gdrive/backup.txt
To import packages:
apm install --packages-file ~/Gdrive/backup.txt
On Linux, apm is available if you install Atom from the .deb file.
On OS X: open Atom → Install Shell Commands.
On Windows: apm is in C:\Users\YOUR_NAME\AppData\Local\atom\bin
atom-package-sync is a package that I created a couple weeks ago. It works a little bit like the synchronization of Google Chrome, you just login and it syncs your packages and settings automatically across all your Atom instances.
I plan to release the source code for the server side in the coming weeks and add an export feature for alternative backups.
This question was already answered (if I understood you correctly) in how to sync packages and settings for multiple computers in the GitHub Atom editor.
You might find the answer in a blog post I wrote: How to synchronize Atom between computers. I hope it helps.
On OSX/macOS:
Open Terminal on the computer which has the settings you want to preserve / sync to others.
Move your ~/.atom folder to Dropbox or another synced service (~ represents your /Users/<your_username> folder), like so:
mv ~/.atom ~/Dropbox/atom
Open terminal, and make a symlink that connects the place Atom expects its config to be (~/.atom), to your synced folder, like so:
ln -s ~/Dropbox/atom ~/.atom
On other computers you want to use these settings, open Terminal and run:
rm -rf ~/.atom && ln -s ~/Dropbox/atom ~/.atom
(This deletes the .atom folder and adds the symlink in one line.)
With this method, your settings are automatically in sync on each computer, no need to manually update anything.
The only potential bug I have noticed can occur if your settings specify a font which another computer does not have. Installing the font on that computer fixes it. All packages, themes, and settings installed by Atom are automatically there.
This same method can be used for many apps (WebStorm, Sublime Text, iTunes are a few examples).
The Atom package manager supports starring packages, either online (through atom.io/packages and atom.io/themes) or on the command line using
apm star <packagename>
or
apm star --installed
to star all of your installed packages.
Starred packages can then be easily installed using:
apm stars --install
Note that starring packages requires logging in to atom.io using your github account.
Install a package called sync-settings using the Atom package installer.
Create a GitHub personal access token.
Create a secret Gist for your ~/.atom/config.cson file.
On your primary Atom computer, navigate to Packages > Synchronize Settings > Backup.
On target machines, install sync-settings, then use the Restore function from Synchronize Settings.
For any packages whose dependencies you had to install with pip, you will need to run pip on the target machines as well; otherwise, you're good to go.
I have a fresh installation of RHEL6 in front of me. For some reason there are no repo files in /etc/yum.repos.d/
What are the default repos to use so I can install packages?
To download software, updates, and security errata, your system should be registered with RHN.
For further information, have a look at these pages:
How do I access RHN Classic to download software, updates, and security errata?
Red Hat Network (RHN) Frequently Asked Questions
What is Yum and how do I use it?
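As a sketch, registration on RHEL 6 is done with one of the following commands (the username is a placeholder); once the system is registered and subscribed, the Red Hat repositories become available to yum:

```
# RHN Classic registration (interactive)
sudo rhn_register

# Or, with Red Hat Subscription Management:
sudo subscription-manager register --username YOUR_RHN_USER
sudo subscription-manager attach --auto
```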