Do you run Yeoman/Grunt inside your Vagrant VM (virtual machine)?

So I want to start using Yeoman (Grunt/RequireJS/Bower), but I was wondering whether this could be done from inside the VM, or whether it would be better for my workflow to have it installed on my host machine (OS X)? As far as I know you need to have a couple of dependencies like Node.js.
Is this a subjective thing or is there a guideline?

As @matt-cooper said, it's a subjective thing.
Personally, I run it on my host, because that's where Git and my IDE live, and I consider Yeoman etc. to be development tools that belong outside the backend code. Inside my VM, on the other hand, I expect things to mirror my deployment server, which doesn't need to meet the same requirements as Yeoman.

This is purely a subjective thing... you can do either.
If you are only ever going to use one VM, then you could install Grunt etc. on either the VM or the host and use it there; installing on the VM would mean that you have to SSH into it each time you want to run a Grunt command, though (see the sketch below).
If, however, you are going to have more than one VM set up, then you might be better off installing Grunt etc. on your host machine rather than having to maintain multiple versions.
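A minimal sketch of the SSH route, assuming your project lives in Vagrant's default /vagrant synced folder and defines a grunt build task:

vagrant ssh -c "cd /vagrant && grunt build"   # one-off command, no open SSH session needed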

Related

How to create and share a VM environment for development

I am working on a college project along with a group of people. Our goal is to add features to an already existing application that runs on the web. Currently, I'm in the process of getting the source code to run on my machine. This consists of cloning a bunch of repos, installing MySQL and some (very old and outdated :-| ) versions of Python, and running some scripts. The process sounds straightforward, but it isn't; there are a lot of dependencies that need to be met for the code to run, which means I need to spend a lot of time looking at error logs trying to figure out which package is missing and needs to be installed or downgraded. But that's not the point of this question.
I'd like to make it easier for people to pick up the project in the future and work on it without having to spend hours just to get the code to compile. I'd like to get the project set up on a Linux VM (something I know how to do using VirtualBox) and then somehow share (?) that VM so that other people can simply set it up and be able to immediately have the code compiling (something that I don't know how to do, or if it is even possible).
Additionally, I'd like to be able to do all the coding on the host OS if possible, and only do the compiling/running on the VM (something I also don't know how to do). I would like some help/pointers with all the "I don't know" 's, as I don't know much about VM's other than how to set one up using VirtualBox.
You can use Vagrant to automate the provisioning of the VM, and set up all your tools and dependencies using Docker.
There are many good tutorials and sample Vagrantfiles online to get you started. There is a learning curve involved, but it is well worth the effort. Many companies use Vagrant to quickly provision dev environments.
Vagrant can automatically download a specific distro/version of a VM from the web if one is not already installed locally. It can also provision a Docker container, in which you can install any required dependencies, tools, etc. You can store the Vagrantfile, Dockerfile, scripts, etc. in GitHub for easy access by your colleagues. All they would have to do is install Vagrant and run vagrant up from the command line.
If you want to write code on the host machine and compile/test it on the VM, you will need to set up a shared folder in the VM using Guest Additions (see here). Be VERY careful with line endings if you are writing in Windows and running in Linux. You can set up the shared folder with Vagrant as well (see here).
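To make this concrete, here is a minimal sketch of such a setup written as a bootstrap script; the box name and the provisioning script are assumptions, not part of the question:

cat > Vagrantfile <<'EOF'
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"              # any distro/version you need
  config.vm.synced_folder ".", "/vagrant"        # edit on the host, run in the VM
  config.vm.provision "shell", path: "setup.sh"  # hypothetical script installing MySQL, Python, etc.
end
EOF
vagrant up    # downloads the box if missing, then provisions it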

How do you use a repl with Vagrant?

I'm looking at simplifying the initial developer setup at my company by using Vagrant. On the surface, it seems pretty nice: I write a Vagrantfile once and commit it, and then new developers just install VirtualBox and Vagrant, git clone our project's source code, and type vagrant up, and they have a running web app, with all the dependencies handled automatically.
The one piece that I'm not sure about is the repl. It's common to run the command to start a shell with the web server's environment, for experimentation or testing or debugging or whatever. (I mean something like rails console. I'm sure every web framework has something similar.)
How do Vagrant users typically do this? Do you just keep a vagrant ssh window open, and run your repl in there? It seems awkward to have to use (potentially) a different window (and operating system) for just this one thing. But in order to run it natively, I'd need to install the whole development environment natively, which defeats the purpose of Vagrant in the first place.
Am I overthinking this? Is there some other practice that people typically use for this?
I think you are overthinking this a bit -- most modern development requires an open command prompt or three, and having it be SSH'd into a different box isn't really much different from running it locally in many cases.
Another angle for some things -- like code and scaffolding generation -- is to run those on the local box. Since there is that shared folder, the output will land on the server and you don't need to switch environments.
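If the extra window still bothers you, a one-off invocation from the host also works; a sketch, assuming a Rails app in the default /vagrant synced folder:

vagrant ssh -c "cd /vagrant && bundle exec rails console"   # REPL inside the VM, launched from the host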

What are the use cases for Docker on real projects?

I have read what Docker is, but I'm having a hard time finding real scenarios for using it.
It would be great to see your usages here.
I'm replicating the production environment with it. On every commit to the project, Jenkins builds the binaries, then I deploy them there, launch the required daemons, and run the integration tests, all in a very short time (a few seconds on top of the time the integration tests themselves take). Having no need to boot, and little overhead on memory/CPU/disk, is great for that kind of thing.
I could extend that use to development (just adding a volume where the code from my git repository resides, at least for scripting languages) to have the production environment with the code I'm actually editing, at a fraction of what VirtualBox would require.
I also needed to test how to integrate some 3rd-party code that modified the DB into a production system. I cloned the DB in one container, installed the production system in another, launched both, and iterated on the integration until I got it right, going back to zero to try again in seconds; this was faster, cheaper, and more scriptable than doing it with VMs plus snapshots.
I also run several desktop browser instances in containers, each with its own plugins, cookies, and data storage kept separate. The Docker repository example for desktop integration is a good start for this, and I'm planning to test subuser to extend this kind of usage.
I've used Docker to implement a virtualized build server which any user could ask to run a build off their personal git branch in our canonical environment.
Each SSH connection made to the server was connected to a new container, ensuring that all builds were isolated from each other (a major pain point in the past), that the container's state couldn't be corrupted (since changes were all isolated to that single instance), and that even developers on platforms such as Windows, where Docker (and other tools in our canonical build environment) couldn't be run locally, would be able to run builds.
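A sketch of that per-build isolation idea; the image name, branch, and build command are hypothetical stand-ins:

docker run --rm \
  -v "$PWD":/src -w /src \
  canonical-build-env:latest \
  sh -c "git checkout my-feature-branch && make"
# --rm throws the container away afterwards, so no build can corrupt the environment for the next one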
We use it for the following:
We have a Jenkins container which we can use to bring up our Jenkins server. We mount the workspace using volumes, so we can migrate the server easily just by copying the files and launching the container somewhere else (a run-command sketch follows this list).
We use a Jetty container to easily deploy our war files in our production and development environment.
We use a whole host of other monitoring tools such as Uptime which we have containers for so that we can bring them up and down on various hosts with a single command.
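For instance, the Jenkins setup described above might look roughly like this, using a modern named volume; the image tag and volume name are assumptions:

docker run -d --name jenkins \
  -p 8080:8080 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts
# Jenkins state lives in the jenkins_home volume, so migrating the server
# is just a matter of moving that volume and re-running this command elsewhere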
I use docker to build and test our software on several different Linux distributions (RHEL 4/5/6/7, Ubuntu 12.04, 14.04).
Docker makes it easy and fast to create minimalistic and consistent build environments.
Docker gives you the benefits that other virtualization solutions give you, at a fraction of the resources needed.
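A sketch of what such a multi-distro build loop could look like; the image names and build script are placeholders for your own environments:

for image in centos:6 centos:7 ubuntu:12.04 ubuntu:14.04; do
  docker run --rm -v "$PWD":/src -w /src "$image" ./build.sh
done
# the same source tree is built in a clean, consistent container per distro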

How to remotely develop software?

Suppose I have a server that runs on Linux on which I would like to develop software (mainly OCaml, C/C++ and Java).
Is there a way to "remote develop" these things? I mean an IDE that allows me to modify files remotely (they are then uploaded when modified and saved) and to compile through SSH (basically invoking make or omake).
I was looking for something that makes this process transparent to the developer, without having to do things by hand. I'm used to Eclipse, so I wonder if a plugin to achieve this exists, or if there are other choices?
Mind that the local machine may not be able to build the software I intend to build (for example, for OCaml), so it should rely on the remote connection alone.
Thanks in advance
You can use X11 forwarding. Even if you are connecting from a Windows machine.
If you are on Linux, connecting with ssh -Y might work right out of the box for you:
ssh -Y user@your_server
eclipse &
Well, the simplest idea I can think of, though it is rather brute force, would be to just open up a file share on the server and then edit the files directly through Eclipse.
If that doesn't work, then for Java at least you could make use of Maven to do some of those tasks. I am less certain about invoking Make, though.
I think your answer is IDE-centric.
KDE's ioslaves support access over both SFTP and SSH (using fish, which uses a Perl script uploaded to the remote machine). I believe Gnome also has a virtual file system (gvfs) which supports remote filesystem access.
My recommendation, therefore, is to choose an IDE which supports a virtual filesystem that can operate over SSH/SFTP and allows you to specify the build command. You would then only need to specify the build command which would get its output from the remote make command (for example, vim has a makeprg option which can be set to any arbitrary command).
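For example, the build command could be a one-liner over SSH; the hostname and path here are hypothetical (in vim you could wire this up with something like :set makeprg=ssh\ buildhost\ 'cd\ /path/to/project\ &&\ make'):

ssh buildhost 'cd /path/to/project && make'   # the editor captures this output for its error list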
Depending on how 'remote' this is; why not ssh in and run the IDE remotely over X?
Using a build tool (Hudson for example) you could put a build agent on your remote server, check your changes into your repository as normal, and have it do a build when you check in changes (it will either do a repository hook or poll for changes, probably). Your build process will be the same, it will simply be automated. :-)
emacs has tramp, which lets you both open and save remote files, and open a shell on a remote system. Working with tramp is almost exactly like working with local files, except for the filename. To open 'foo.c' on the machine 'bork' as user 'joe', I open it with the standard emacs commands, giving it the pathname /joe@bork:foo.c
I use vim for remote development. (Well, I use vim also non-remote.)
If building is the problem, have you thought about simply using an automated build system where you commit to svn and the system then automatically builds the software? I've heard many good things about these sorts of systems, although I haven't quite tried any out myself.
As for remote development, a SVnDAV solution might be reasonable. It basically commits your every save and is completely transparent to the text editor you're using. However a probably much nicer solution would simply be to use a networked drive/directory and edit files remotely. On all unix-based systems this should work completely transparently to both the developer and the text editor.
Your choice of IDE will have the most impact on the answer to "can I?". If your IDE of choice is CLI-based, then you can always just SSH in, fire up screen (so that your CLI session persists across SSH sessions), and have at it!
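A sketch of that workflow; the hostname and session name are arbitrary:

ssh user@devbox
screen -DR dev   # reattach to the 'dev' session if it exists, create it otherwise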
Use vim or emacs, since they will offer you speed. I know there is a learning curve associated with these editors, but once you get comfortable in either of them, you will be able to work as well as with Eclipse or any other IDE.
If you already have a Linux server, then I would suggest setting up a simple VPN server. I have done this in the past and it works pretty well. This way you can connect and modify/build your files with any "local" OS. I did this because I use a Mac, a PC, and Linux through various parts of the day and in multiple locations, so the VPN allowed me to edit files remotely without having to allow file sharing over the internet.
There are plenty of tutorials about how to achieve this, even if you are newer to Linux. I use Ubuntu Server on my Linux box, and here is the tutorial I have used:
http://www.ubuntugeek.com/howto-pptp-vpn-server-with-ubuntu-10-04-lucid-lynx.html
Netbeans 7.3 has a new feature which addresses your problem (and mine). Here's the tutorial.
https://netbeans.org/kb/docs/cnd/remotedev-tutorial.html
Note: I realize it has been 3 years since this question was asked, so the answer may be irrelevant to @Jack now.
One IDE that supports exactly your language set is Nuclide. It adds some packages to Atom and is used internally at Facebook exactly as you have described: full-fledged remote development in C++, Java, and OCaml.
If a friendly file editor is enough for you, then I'd recommend Jupyter:
Super fast installation
Built-in server/file editor that starts with one command
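A sketch of that setup, assuming Python and pip are already on the server:

pip install notebook
jupyter notebook --ip=0.0.0.0 --no-browser   # reachable from your local browser over the network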

How to automate development environment setup? [closed]

Every time a new developer joins the team, or the computer a developer is using changes, the developer needs to do a lot of work to set up the local development environment to make the current project work. As a Scrum team we are trying to automate everything, including deployment and tests, so what I am asking is: is there a tool or a practice to automate local development environment setup?
For example, to set up my environment, first I had to install Eclipse, then SVN, Apache, Tomcat, MySQL, and PHP. After that I populated the DB and had to make minor changes in the various configuration files, etc. Is there a way to reduce this labor to one click?
There are several options, and sometimes a combination of these is useful:
automated installation
disk imaging
virtualization
source code control
Details on the various options:
Automated Installation
Tools for automating the installation and configuration of a workstation's various services, tools, and config files:
Puppet has a learning curve but is powerful. You define classes of machines (development box, web server, etc.), and it then does what is necessary to install, configure, and keep the box in the proper state. You asked for one click, but Puppet by default is zero-click, as it checks your machine periodically to make sure it is still configured as desired. It will detect when a file or mode has been changed, and fix the problem. I currently use this to maintain a handful of Red Hat Linux boxes, though it's capable of handling thousands. (It did not support Windows as of 2009-05-08.) A minimal manifest sketch follows this list.
Cfengine is another one. I've seen this used successfully at a shop with 70 engineers using RedHat Linux. Its limitations were part of the reason for Puppet.
SmartFrog is another tool for configuring hosts. It does support Windows.
Shell scripts. RightScale has examples of how to configure an Amazon EC2 image using shell scripts.
Install packages. On a Unix box it's possible to do this entirely with packages, and on Windows MSI may be an option. For example, RubyWorks provides you with a full Ruby on Rails stack, all by installing one package that in turn installs other packages via dependencies.
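To give a flavour of the Puppet approach mentioned above, here is a minimal manifest sketch; the package and service names are placeholders, and puppet apply runs it on the local box:

cat > site.pp <<'EOF'
package { ['subversion', 'apache2', 'mysql-server']:
  ensure => installed,
}
service { 'apache2':
  ensure  => running,
  require => Package['apache2'],
}
EOF
puppet apply site.pp   # declare the desired state; Puppet makes it so and keeps it so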
Disk Images
Then of course there are also disk-imaging tools for storing an image of a configured host, so that it can be restored to another host. As with virtualization, this is especially nice for test boxes, since it's easy to restore things to a clean slate. Keeping things continuously up to date is still an issue: is it worth making new images just to propagate a configuration-file change?
Virtualization is another option, for example making copies of a Xen, VirtualPC, or VMWare image to create new hosts. This is especially useful with test boxes, as no matter what mess a test creates, you can easily restore to a clean, known state. As with disk imaging tools, keeping hosts up-to-date requires more manual steps and vigilance than if an automated install/config tool is used.
Source Code Control
Once you've got the necessary tools installed/configured, doing builds should be a matter of checking out what's needed from a source code repository and building it.
Currently I use a combination of the above to automate the process as follows:
Start with a barebones OS install on a VMWare guest
Run a shell script to install Puppet and retrieve its configs from source code control (a sketch of such a script follows this list)
Puppet to install tools/components/configs
Check out files from source code control to build and deploy our web application
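A hypothetical version of the bootstrap script from step 2; the repository URL and paths are placeholders:

#!/bin/sh
yum -y install puppet                        # or apt-get install puppet, depending on the distro
svn checkout http://svn.example.com/puppet /etc/puppet
puppet apply /etc/puppet/manifests/site.pp   # hand off to Puppet for steps 3 and 4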
I stumbled across this question and was very surprised that no one has mentioned Vagrant yet.
As Pete TerMaat and others have mentioned, virtualization is a great way to manage and automate development environments. Vagrant basically takes the pain away from setting up these virtual boxes.
Within minutes you can have a completely fresh copy of your favourite Linux distro up and running, and provisioned exactly the same way your production server is.
No more fighting with OSX or Windows to get PHP, MySQL, etc. installed. All software lives and runs inside the virtual machine. You can even SSH in with vagrant ssh. If you make a mistake or break something, just vagrant destroy it, and vagrant up to start over fresh.
Vagrant automatically creates a synced folder to your local file system, meaning you don't need to develop within the virtual machine (i.e. using Vim). Use whatever your editor of choice is.
I now create a new "Vagrant box" for almost every project I do. All my settings are saved into the project repository, so it's easy to bring on another team member. They simply have to pull the repo, and run vagrant up, and they are literally ready to go.
This also makes it much easier to handle projects that have different software requirements. Maybe you have some projects that rely on PHP 5.3, but some newer ones that run PHP 5.4. Just install the version you want for that project.
Check it out!
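To show how little onboarding is left, a sketch of what a new team member runs; the repository URL is a placeholder:

git clone https://github.com/example/project.git
cd project
vagrant up     # downloads the box, provisions it, syncs the project folder
vagrant ssh    # a shell inside the fully configured environment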
One important point is to set up your projects in source control such that you can immediately build, deploy and run after checkout.
That means you should also checkin helper infrastructure, such as Makefiles, ant buildfiles etc., and settings for the tools, such as IDE project files.
That should take care of the setup hassle for individual projects.
For the basic machine setup, you could use a standard image. Another option is to use your platform's tools to automate installation. Under Linux, you could create a meta-package that depends on all the packages you need. Under Windows, a similar thing should be possible using MSI or the like.
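Under Debian/Ubuntu, one way to build such a meta-package is the equivs tool; a sketch, with the package name and dependency list as assumptions:

equivs-control dev-env.ctl        # writes a template control file
# edit dev-env.ctl to set at least:
#   Package: acme-dev-env
#   Depends: subversion, apache2, mysql-server, php5, eclipse
equivs-build dev-env.ctl          # produces something like acme-dev-env_1.0_all.deb
sudo dpkg -i acme-dev-env_1.0_all.deb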
Edit:
Ideally, instead of checking in helper infrastructure, you check in the information that allows the build to generate the helper infrastructure. This is the approach taken by e.g. the GNU build system (autotools etc.), or by Maven. This is even more elegant, because you can (theoretically) generate infrastructure for any (supported) build environment, thus you are not bound to e.g. one specific IDE, and settings in the helper infrastructure (paths etc.) don't need to duplicate the main project settings.
However, this is also a more complex approach, so if you can't get it to work, I believe checking in stuff like IDE files directly is acceptable.
I like to use Virtual PC or VMware to virtualize the development environment. This provides a standard "dev environment" that can be shared among developers. You don't have to worry about software that the user might add to their system that could conflict with your development environment. It also gives me a way to work on two projects whose development environments can't both be on one system (e.g. using two different versions of a core technology).
Use puppet to configure both your development and production environment. Using a top-notch automation system is the only way to scale your ops.
There's always the option of using virtual machines (see e.g. VMWare Player). Create one environment and copy it over for each new employee with minimal configuration needed.
At a prior place we had everything (and I mean EVERYTHING) in SCM (ClearCase, then SVN). When a new developer came in, they installed ClearCase or SVN and sucked down the repository. This also handles the case where you need to update a particular lib/tool, as you can just have the dev teams update their environment.
We used two repos for this, so code and tools/config lived in separate places.
I highly recommend Blueprint from DevStructure. It's open-source and your use case is actually the exact reason we originally wrote the software. Our goals have somewhat changed, but it still is the perfect tool for what you are describing. In short, you can create reusable server configs - dead simple configuration management. I hope this helps!
https://github.com/devstructure/blueprint (Blueprint # Github)
I've been thinking about this myself. There are some other technologies that you could throw into the mix. Here's what I'm currently setting up:
PXE based pre-seeded installation images (Debian Squeeze). You can start up a bare-metal machine (or new virtual appliance) and select the image from the PXE boot menu. This has the major advantage of being able to install your environment on physical machines (in addition to virtual appliances).
Someone already mentioned Puppet. I use CFEngine but it's a similar deal. Essentially your configuration is documented and centralized in policy files which are continually enforced by an agent on the client.
If you don't want a rigid environment (i.e. developers may choose a combination of tool-sets), you can roll your own .deb packages so new devs can type sudo apt-get install acmecorp-eclipse-env or sudo apt-get install acmecorp-intellij-env, for example.
Slightly off-topic, but if you run a Debian based environment (i.e. Ubuntu), consider installing apt-cacher (package proxy). In addition to saving bandwidth, it will make your installations much faster (since packages are cached on your local network).
If you're using OS X and working with Rails, I'd suggest either:
https://github.com/platform45/let-there-be-light
https://github.com/thoughtbot/laptop
If you use machines in a standard configuration, you can image the disk with a fresh, perfectly configured install; that's a very popular approach in many corporations (and not just for developers, either). If you need separately configured OSs, you can tar-bz2 all the added and changed files once a configured OS is turned into your desired setup, and just untar it as root to recreate your desired environment from scratch.
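A sketch of the tar-bz2 overlay idea; the directory list is a placeholder for whatever you added or changed on top of the base install:

tar -cjf dev-env.tar.bz2 /opt/tools /etc/apache2 /home/template   # on the configured reference machine
sudo tar -xjf dev-env.tar.bz2 -C /                                # on a fresh install of the same base OS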
If you're using a Linux flavor, you've probably got a package-management system: think .rpm for Fedora/Red Hat, or .deb for Ubuntu/Debian. Many of the things you describe already have packages available: SVN, Eclipse, etc. You could roll your own packages for company-specific software, create a repository (perhaps only available on the local network), and then your setup could be reduced to a single bash script which would add the company repo to /etc/apt/sources.list (Debian/Ubuntu) and then call a command like:
/home/newhire$ apt-get update && apt-get install [your complete package list]
You could use Buildbot to then automate regular builds of the company packages that change often.
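That single bash script might look roughly like this; the repo URL and meta-package name are hypothetical:

#!/bin/sh
echo "deb http://repo.example.internal/debian stable main" | \
  sudo tee /etc/apt/sources.list.d/acme.list
sudo apt-get update
sudo apt-get install -y acme-dev-tools   # one meta-package pulls in the whole tool set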
Try out DevScript at http://nsnihalsahu.github.io/devscript .
It's one command, like:
devscript lamp, devscript laravel, or devscript django. It finishes in around a few minutes, depending on the speed of your internet connection.