Using a Mini/Parallels virtual environment for development?

My main Windows development PC, running Vista Business, is getting a little long in the tooth and I'm considering a replacement. It's a dual-screen setup with 24" and 19" monitors. My laptop is a reasonably new MacBook, and I've been using Parallels to run a virtual XP system on it for some time and have been pretty impressed.
The new Mac minis appear to have dual-screen support, and I'm wondering about replacing the PC box with a fully specced (4 GB RAM, 2.3 GHz processor) Mac mini and running Parallels VM environments for development. On the face of it this has a lot of attractions (clean development environments, etc.), but I'd appreciate any advice from anyone who has taken a similar approach: is it feasible and robust enough for general day-to-day development work?
My specific development requirements are that all my clients are predominantly PC based, and over two thirds of my development work is now web based anyway. However, I do have a couple of legacy Delphi 6 systems (and I'm considering a Delphi 2009 upgrade) and one .NET 1.1 Windows Mobile application. I also have a considerable number of Access/SQL Server applications I look after, although these are generally coded directly on clients' machines and the need to replicate them locally is rare; it would, however, be useful. There's also the possibility of Win32-based development in the future, and I tend to do a lot with 3D and OpenGL technologies. For non-coding applications I also run Maya and Photoshop on the PC fairly regularly.

I bought an aluminum MacBook a couple of months ago, added RAM to bring it to 4 GB, and I've successfully virtualized Windows XP with 3D acceleration using VMware Fusion. I'm currently developing web applications using PHP and JavaScript, some other projects with Java, and playing around a little bit with XNA 3.1, all on my MacBook... so if this works for me, it could work for you too!

You're probably not going to want to do much graphics work through Parallels' emulated video adapter.

Related

Multi-platform development from one computer

I am planning to build a new development computer for both Windows & Linux platforms. On Windows, my development would be primarily in .NET/C#/IIS/MSSQL Server. On Linux—preferably Ubuntu—my development would be in Ruby and Python.
I am thinking of buying a laptop with Windows 7 pre-installed with 4GB RAM, Intel Core 2 Duo, and 320 GB HD; running 2 VMs for both Windows and Linux development with the host OS as my work station. Of course, I would be running DBs and web servers on the respective platforms.
Is this a typical setup? My only concern is running two VMs side by side; I'm not sure this configuration would be optimal. An alternative would be to do my Windows development on the host Windows 7 OS. What are your thoughts?
I really like using VMs for development, because it makes it really easy to maintain different configurations, make backups, test comms between machines, experiment, and so forth.
Linux VMs work pretty well. Windows in a VM on Windows, however, can be a resource hog. You probably want more than 4 GB on the laptop.
If you're not going to be switching between the two platforms frequently, I would recommend repartitioning your hard drive after you get your machine, and installing Windows in one partition and Linux in the other. Doing things that way is usually simpler, in that you don't need the overhead of the VMs.
Sounds like you will get 15 minutes out of this laptop's battery, maybe 20.
Speaking from experience, you will prefer a desktop plus a "more mobile" laptop. You can do this without spending more than you had budgeted (remember you can skip the monitor on the desktop), but that will probably get you slightly lower specs in exchange for the flexibility and a laptop you really can take with you. I recommend spending slightly more than you would on the single laptop, and remember you do get two machines out of it.
You can network between them (e.g. use remote desktop programs from the laptop to connect to VMs on the desktop).
In my particular case, about 6 years ago I needed a new machine that could be used for on-location photography and had the power, screen-size, disk space, etc. to run Photoshop and other tools (e.g. batch processing ~900 largish images was one use case). I got a beefy laptop, which worked great for that, but the battery quickly died and never had much lifetime in the first place. The system has always been more of a "slightly-easier-to-move desktop" than a laptop, and it sounds like you'd rather have a real laptop.

Using laptop as a second programming monitor

The joys of multimonitor programming are countless, I think there are about 5 blog posts on Coding Horror on the topic alone!
I often code in Windows on my main machine, and have my Mac laptop set up to the side. I use the Mac both to compile Mac builds but also as my "reference web browser". There's no KVM or anything.
However a casual conversation at a conference led me to the question, could I use two independent machines to share windows? Literally move some windows from one machine to another, so I could use one PC's display as "overflow" from the other.
Some googling shows that this is possible in some situations, for sure:
Synergy and Maxivista
My question is whether any programmers have tried such a setup. We have unique needs especially with multiple text windows and editors, and this kind of tool may be a huge win or a huge hassle.
This solution feels like a combination of easy KVM switching AND multiple monitors... it sounds like a programming dream! So advice, or especially reports of actual experience in a programming environment, would be greatly useful before I invest in the rather complex setup.
Followup:
Sounds like I'm asking for something that doesn't exist! It's kind of a combination of a software KVM and VNC. But the VNC would need to break out the app windows and allow individual manipulation (like that MaxiVista commercial tool, which is Vista only).
Thanks for all the feedback. Looks like there's demand for a cool app if anyone has the drive to be first in this new niche!
Synergy doesn't allow you to move windows between machines (that would require a silly amount of work behind the scenes), but it does allow you to share a keyboard and mouse between two machines so they "appear" to be all one machine, but actually run separately.
I personally use Input Director, as I found it more stable than Synergy. I have my laptop with an external monitor to the right, and my desktop to the left as an Input Director slave. My desktop runs a different OS and is basically my guinea pig box for testing stuff, and for anything I need to keep running when I leave the office. Cut and paste is pretty seamless, so I can quite happily fire up an RDP session to a server on my desktop and cut and paste SQL scripts from that to my laptop.
It's a very useful thing to have if you have a few physical boxes and monitors kicking around :)
I've actually managed to use a spare notebook as a second monitor for a desktop PC. This allows you to move windows to the second PC, but not vice versa.
The solution should work with basically any OS.
The only requirement is a spare VGA (or DVI-I/DVI-A) port on the server PC.
1. Make a dummy VGA plug: http://www.overclock.net/t/384733/the-30-second-dummy-plug (this will also work for a DVI-I/DVI-A port plus a DVI-VGA adapter).
2. Let your OS detect the virtual monitor. It will be detected as a very generic monitor, so you can set up any resolution; set it to the slave PC's resolution.
3. Use any remote control software to connect from the slave to the server PC, and set it to display only the "virtual" monitor.
That's all. Your slave PC is a second monitor for the server PC.
I've used this on Windows 7 + TeamViewer. I additionally set up Mouse Without Borders (Microsoft's Synergy analog) to be able to use the slave PC with the same mouse and keyboard, though this is not required if you intend to make it monitor-only.
Xdmx - Distributed Multihead X Project (Linux only)
Provides a native X display on external machines, without the cons of VNC.
The following is not exactly what you want, but pretty close:
You can start a VNC server on the Windows machine, which will let you "export" its graphical screen.
Then, unplug the monitor from the Windows machine and use it as an external monitor for your Mac laptop instead.
There, on your Mac, you just connect to the VNC session using Chicken of the VNC, which will give you the graphical screen content of the Windows machine as a Mac window (interactively, so you can actually control the Windows machine as if you were working on it directly). You can put that on the external monitor, and you can also put other windows there, so you really have a shared environment.
I believe this solution also lets you copy and paste content from the Windows screen to Mac windows and vice versa.
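If you ever want to drive that exported screen from a script instead of a GUI client, a VNC automation library can attach to the same server. A minimal sketch, assuming the Python vncdotool library; the host address, port, and password are placeholders for your own setup:

    # Minimal sketch using vncdotool (pip install vncdotool); the host
    # address, port, and password below are placeholders.
    from vncdotool import api

    # Connect to the VNC server that the Windows machine is exporting.
    client = api.connect('192.168.1.50::5900', password='secret')

    # Grab a screenshot of the shared desktop to confirm the session works.
    client.captureScreen('windows-desktop.png')

    # Cleanly stop the background connection thread.
    api.shutdown()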
I use MaxiVista on WinXP while programming. It works fantastically and lets me add a third screen to my multi-monitor configuration.
There is hope here for Windows users: http://virtualmonitor.github.io/ It looks like a work in progress and only supports Windows 2000 - Windows 7, but he's looking for help with Windows 7 - 8.
Unfortunately, Synergy doesn't currently allow moving windows across screens. It only forwards mouse and keyboard events from one set of physical devices to different computers.
Yes, and I love it. It allows you to get past 2 screens on a laptop, and I really find 3 a great amount.
If your main machine is a Mac you want ScreenRecycler. You can then use monitors on other Mac, Windows, and Linux machines (anything with a VNC client). You will want something better than the Mac's crappy window management though. I suggest Many Tricks' Moom and Witch.
On Windows, as #LachlanG said, MaxiVista works great. And it supports adding monitors from Windows, Mac, and Linux machines.
I am reusing my old laptop as a second monitor to see the live preview while coding. I am using SpaceDesk, which is free.
I use Barrier, an open source fork of Synergy. It's a little hard to use but works really well. (To find it, just search Google for 'barrier github'.)

What are the key use cases for use of virtualization in software development?

What are the key use cases for the use of virtualization -- that is, running one or more "virtual PCs" using software such as VMware and Microsoft Virtual PC -- for software development?
Also -- are there other instances/uses of virtualization that aren't covered by my definition above (use of a tool like MS Virtual PC or VMWare), and that are useful to developers?
My impetus for asking is this StackOverflow comment by Metro Smurf asserting "You'll wonder how you ever developed without it!", regarding use of virtualization.
(Please include just one use case per response. Thanks!)
Application testing in multiple environments is one obvious use of virtualization that I'm aware of. Testing your application on other operating systems (without requiring additional physical computers to do so), as well as testing that involves software that generally only allows you to install a single version on a given machine (such as the Internet Explorer browser; running both IE6 and IE7 on the same machine is not an officially supported configuration), are good candidates for virtual machine usage.
If your build server is running in a VM, you can take a snapshot of it for every software release, in order to be 100% sure that you can recreate the build environment (in case you want to make patches to old releases, for example).
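A minimal sketch of how that bookkeeping could be scripted, assuming VMware's vmrun command-line tool; the .vmx path and the release-<version> naming convention are placeholders, not anything from the original answer:

    # Hedged sketch: per-release snapshots of a build-server VM via VMware's
    # vmrun CLI. The .vmx path and snapshot naming are placeholders.
    import subprocess

    BUILD_VM = "/vms/build-server/build-server.vmx"  # hypothetical path

    def snapshot_release(version: str) -> None:
        """Record the exact build environment used for a release."""
        subprocess.run(
            ["vmrun", "-T", "ws", "snapshot", BUILD_VM, f"release-{version}"],
            check=True,
        )

    def restore_release(version: str) -> None:
        """Bring back the build environment of an old release for patching."""
        subprocess.run(
            ["vmrun", "-T", "ws", "revertToSnapshot", BUILD_VM,
             f"release-{version}"],
            check=True,
        )

    snapshot_release("1.4.2")  # e.g. right after shipping 1.4.2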
If you take snapshots of your development environment (and back them up), it can be very easy to resume productivity if your computer breaks down. When your machine dies right before your release, being able to resume immediately with all your tools installed and configured can be a lifesaver.
The simplest case which applies to my current situation is that we have a complex client-server environment and with virtualization every developer can very quickly get a baseline set of operating systems to deploy their local build to and verify end to end functionality.
Locally you have your dev box, and N client boxes which get re-initialized as fresh OSes each time you want to try a build. Essentially it's the test environment equivalent of a 'make clean' where even the client workstation gets replaced with a new OS.
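As a rough illustration, here is what that "make clean" for the test environment could look like when scripted against VirtualBox's VBoxManage CLI; the VM and snapshot names are made up:

    # Hedged sketch: roll a set of client VMs back to a clean baseline
    # snapshot before an end-to-end test run (VBoxManage; names are
    # placeholders).
    import subprocess

    CLIENT_VMS = ["client-xp-1", "client-xp-2", "client-vista-1"]
    BASELINE = "clean-install"

    for vm in CLIENT_VMS:
        # Power off if running; a VM that is already stopped just errors
        # out, which we ignore.
        subprocess.run(["VBoxManage", "controlvm", vm, "poweroff"],
                       stderr=subprocess.DEVNULL)
        # Revert to the baseline snapshot, then boot headless for the run.
        subprocess.run(["VBoxManage", "snapshot", vm, "restore", BASELINE],
                       check=True)
        subprocess.run(["VBoxManage", "startvm", vm, "--type", "headless"],
                       check=True)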
Quickly distributing environments between team members is a very nice use case for virtualization, especially if you have a lot of various components, tools, etc. This can save you a ton of time with new hires, contractors, or other individuals who need an environment quickly.
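One way to hand such an environment over is to package a configured VM as an appliance. A small sketch, assuming VirtualBox's VBoxManage CLI; the VM and file names are placeholders:

    # Hedged sketch: export a configured dev VM as an OVA appliance that a
    # new hire can import in one step (VBoxManage; names are placeholders).
    import subprocess

    def export_dev_env(vm: str = "dev-baseline",
                       out: str = "dev-env.ova") -> None:
        """Package the VM, its disks and settings, into one portable file."""
        subprocess.run(["VBoxManage", "export", vm, "-o", out], check=True)

    def import_dev_env(ova: str = "dev-env.ova") -> None:
        """Recreate the packaged environment on another machine."""
        subprocess.run(["VBoxManage", "import", ova], check=True)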
Many presenters use a VM for presentations. It allows them to revert immediately to reset the presentation for the next day, transfer all presentation materials quickly between computers, and not have to show attendees their messy My Documents folder.
Using virtualization for sales activities is also a great use case. You can take a snapshot at a particular time that you can save as your demo baseline. Then once you run through the demonstration and change the data, etc. you can restore back to your previous baseline for future demonstrations. You can also capture multiple baselines and pick and choose which baseline best fits the upcoming demo.
Test environments. If you have more than one setup that a system needs to be targeted for (e.g. Windows & Linux, XP & Vista) then a machine with lots of RAM and VMware (or one of the others) is a good way to manage the environments.
Another is developing on one system and targeting another. For example, at one point I did some J2EE work on a workstation running Linux where the client was IE 5.5. A VM with Windows 2000 and IE 5.5 let me test the application.
Reasons I use virtual machines for development:
Isolate different development environments.
Testing environments.
Easy recovery due to computer hardware failure/upgrade.
Ability to "roll-back" changes to your development environment if something corrupts it.
Currently, I am using VirtualBox for my VM setup. I used to use VirtualPC, but I REALLY hated not having any type of "snapshot" feature (like VMware and VirtualBox have).
We develop software for use in our SaaS application. Our production environment has a large number of servers, and their software environment needs to be absolutely predictable; we can't have ANYTHING extra installed on, or absent from, our development machines.
Moreover, our application requires a number of different server types in order to function properly (at least 7 last time I counted); mostly they can't be installed on the same (virtual) machine - at least, not without violating the "same software as production" requirement.
In order to have a consistent environment, it's necessary to use VMs. I don't know how anyone ever manages without them.
Snapshots and rollbacks are nice too, but I use them only occasionally (really useful during installation / upgrade tests).
Suppose you're developing a new version of your software, and checking that the upgrade from the previous version works correctly... how long does it take to do a test cycle without being able to roll back the box? Do you have to reinstall the OS and then the old version? Can you guarantee that the uninstall really uninstalls everything?
Being able to test/retest your deployment process is a huge savings.
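A sketch of what one such rollback-based upgrade test cycle might look like, again assuming VMware's vmrun CLI; the VM path, snapshot name, guest credentials, and installer command are all hypothetical:

    # Hedged sketch: repeatable upgrade testing by reverting the target VM
    # to a snapshot of the old version before each attempt. vmrun is real;
    # every name below is a placeholder.
    import subprocess

    VMX = "/vms/upgrade-target/upgrade-target.vmx"  # hypothetical path

    def vm(*args: str) -> None:
        # -gu/-gp pass guest OS credentials for in-guest commands
        # (placeholders).
        subprocess.run(
            ["vmrun", "-T", "ws", "-gu", "admin", "-gp", "secret", *args],
            check=True,
        )

    def upgrade_cycle() -> None:
        # Roll the box back to a snapshot with the old version installed.
        vm("revertToSnapshot", VMX, "v1-installed")
        vm("start", VMX)
        # Run the new installer inside the guest (hypothetical path/flags),
        # then run whatever verification you need, and repeat at will.
        vm("runProgramInGuest", VMX, "C:\\installers\\setup-v2.exe", "/S")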
Developing Add-Ins for different versions of Microsoft Office (using Visual Studio Tools for Office).
My main work machine has Office 2007. When I work with Add-Ins for Office 2003 I use a virtual machine with Visual Studio and Office 2003.
I'm surprised that nobody has mentioned the VMware record/replay feature (awesome video demo), which is great for debugging.
I have a headless server running ESXi which runs various machines for building installers (so I don't have to give up processing power on my desktop), automated testing (the server is faster than any desktop), and various test environments (about 20 different configurations) so that the support team can easily jump onto a configuration that closely matches a customer's system.
When you have one really beefy server running VMs that can be shared between support, test, and dev teams, you introduce huge cost savings. In all, we're running ~25 VMs on ESXi (dual quad-core Xeon 2.5 GHz + 8 GB RAM) shared between 5-10 people. Some of the developers use Virtual PC, while I use VMware Workstation on my desktop. All of the Mac users here use VMware Fusion as well.
I am surprised that no one has mentioned the benefit of increased security by isolating, for example, the database server and web server in different VMs.
Some server applications can use VMs too. When one VM is not heavily used, the server can allocate its resources to other VMs.
Some sort of test environment: if you are debugging malware (either writing it or developing a remedy against it), it is not clever to use the real OS. The only possible disadvantage is that viruses can detect that they are being run under virtualization. :( One of the ways they can do so is that VM engines emulate only a finite set of hardware.

Pros and Cons of Developing on a VM on a PC

I recently built myself a semi-beefed-up PC (Q9450, 8 GB DDR2-1066, 1 TB HDD, dual 8600GTs, Vista Ultimate, and dual 22" monitors) and I'm evaluating whether I should develop in a VPC/VMware session on top of Vista or not.
One benefit I can see is that I can run the same VM on my Vista laptop, so my development environment is the same on any of my machines. I also plan on purchasing an MBP before the end of the year as well.
Found a couple of articles online that semi-help here.
Any other thoughts would be really appreciated.
For web development I like to have the server part separated out into a VM. My current setup is a MacBook Pro with several Debian VMs inside. I like the isolation aspect of it: I can try new software on the servers and have the ability to revert them back if something gets messed up.
I do the programming via a network share (Samba) in TextMate on the host system.
Another advantage of a VM is having a cleanly installed base. I use my desktop and laptop for lots of things aside from development. You never know when a piece of software you install is going to conflict, or if the little tweaks and whatnot you play around with are going to trash your OS. Reinstalling and reconfiguring all your tools so they are exactly the way you want them can take quite some time. If you have a backup of your development VM image, you can mess up your PC as much as you want but still be able to code without downtime. It also allows you to run Windows/Visual Studio/etc. on a box that you would otherwise prefer Linux or Mac OS on.
You can also make multiple copies of the same image and use each one for a separate project (see the sketch below).
Being able to transition between a laptop/desktop/server/remote connection, and always be in the same environment is also very helpful.
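For that "one copy per project" approach, full clones keep each project isolated. A hedged sketch against VirtualBox's VBoxManage CLI, with made-up VM names:

    # Hedged sketch: stamp out an independent full clone of a base dev image
    # for each project (VBoxManage; VM names are placeholders).
    import subprocess

    def clone_for_project(base_vm: str, project: str) -> None:
        subprocess.run(
            ["VBoxManage", "clonevm", base_vm,
             "--name", f"dev-{project}",
             "--register"],  # register the clone so it shows up in the GUI
            check=True,
        )

    clone_for_project("dev-baseline", "client-acme")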
One problem I found (at least when using VMware Server) is that no matter how fast your machine is, the screen refresh rate is still only around ~30 Hz. That makes for a slightly unpleasant experience after using it for a while.
Where I'm working now, I use a VM for all of my development because I don't have admin rights on my base copy of XP.
Pros:
I like using VMs because they give you some flexibility: you can switch between machines, have programs running on both, and have a cool environment to work in.
Cons:
You have to boot up multiple operating systems. This takes time, memory and resources.
Clipboard operations on VMs can be interesting at times. Sometimes copying to the clipboard does not work, or gets mixed up between VMs (using VMware).
File operations can be interesting when you plug in USB drives and other external devices. VMs sometimes see the devices, sometimes not.
If your VM image becomes corrupt, you can easily lose everything in it... unless it is backed up.
It's great for presenting development talks, you can revert to a snapshot and give the talk from the exact same starting point each time.
Bulk up the RAM on your future MacBook Pro if VMware will be used. I haven't (yet), and the performance with several other (Mac-side) apps open really starts to feel sluggish.
All the best.
I was doing some work with Visual Studio recently with a Windows XP VM on Linux, and somehow the guys who made the VM (VMware) made the Windows machine actually run faster. We did some time tests to make sure, and it wasn't major, but a few things (autocomplete, for example) really did pop up faster.
If you are on Windows, Virtual PC is pretty decent for development work. VMware Virtual Server is not really designed for use as a desktop, and you will get very tired of it with any prolonged use. Sun's VirtualBox is another option competing with Virtual PC. VMware also has a workstation product, but it is not free.
Typically, I do development on the real desktop (non-virtual) and then deploy or test to virtual machines which I can snapshot and roll back easily.
For a long time, we were developing on very early versions of Visual Studio 2005 and the associated .Net bits that went along with it. To protect our real machines from the various problems associated with pre-release software, we did all of our development work inside virtual machines. It worked amazingly well. I've been considering moving back to that model as it makes upgrading the physical hardware a snap (not to mention making it easier to deal with hardware failures by just replacing the entire machine): you just copy the VM image over.
On my current machine (a Core 2 Duo with 4 GB of RAM), the performance drop when running one VM is barely noticeable. Running two VMs, however, is painful.
I also can't figure out how to get VMware Server to work well across two monitors.
I wouldn't want to develop in a VM so much as test things in a VM. For instance, it might be nice to set up a couple of VMs to emulate an n-tier architecture, or a client-server setup, or simply to test code on multiple OSes.
It depends what you are developing and in what language.
VMs tend to take a fairly hard hit on disk access, so compiling may slow down significantly, especially for large C/C++ projects. I'm not sure whether this would be as much of an issue with .NET/Java.
If you are doing anything that is graphics intensive (3D, video, etc) then I would steer clear of a VM too.
I don't know how useful it is as a development platform unless you are doing something that ties into software you don't want installed on your regular working machine, or that depends on some state you need to be able to reset on a regular basis. It can also be handy when you are working with code that risks crashing your computer, as it will at least only crash your VM.
It is brilliant for testing different configurations and setups, and working with installers and so on; that is where virtualisation really shines as far as I am concerned. Being able to roll things back whenever you need to, and run through stuff repeatedly, is amazingly useful for identifying problems before your end users run into them.
While doing development at home, I have to VPN into my company to be able to use the collaborative tools that are on the intranet. I also have a desktop + laptop that are hooked together through Synergy.
The problem that I have is that our VPN software wants things to be so secure that it will force all network routing through the VPN gateway -- even if I'm using additional NICs to network my desktop and laptop through a separate private network. The end result is that I can't use Synergy between my desktop and laptop and VPN into my company at the same time.
The solution suggested to me by a co-worker was to setup a VM instance on my desktop and use that for all my VPN needs. Works like a charm!
Speaking from personal experience developing java in an Ubuntu VM on Windows 7, I've found this to be quite productive. Mainly because my local IT support on the ground supports Windows 7, so I can do things like access all the local file shares and printers in Windows, and then config my Ubuntu VM to my heart's content.
Huge productivity benefits around remote access and desktop sharing. Windows allowed me to very quickly and easily use tools like logmein.com and join.me to access my machine from home and to share the VM's desktop with other people in the company (both work seamlessly with the VM in a nearly full-screen window). Neither of these services is supported on Linux, and I wouldn't want to deal with all the associated VNC/X setup and network config on Ubuntu.
My machine is fairly beefy: quad core, with 16 GB RAM, 8 GB of it for the VM. Java dev in the VM is pretty quick.

How practical is Virtual PC on a personal development machine?

Is Virtual PC practical on a home personal development computer? I do some custom .NET programming at home, and I was wondering whether, in terms of performance and overall use, Virtual PC is useful. Do applications inside a Virtual PC session run slower? It will help me with my personal dev machine. Would you recommend any other products?
In my estimation virtual machines are one of the best tools that a developer can have. I have my base dev machine, and on it I run VPCs for different platforms to test installations and application functionality. For web development I keep VPCs running each of the major browsers that I support, so I can continually test my websites on various browsers. I even still maintain an old VB6 app, and I have replicated my old VB6 build environment in a VPC image. Make sure you have lots of RAM. My machine runs with 4 GB and that works well for most everything I need. I also have SourceGear Vault set up for source code management. I have the clients loaded on the various VPCs that I use for development, and they all check data in and out from my central SQL Server box. It works great.
It really depends on what your home computer is like. I've used VPC to test different versions of Visual Studio (e.g. to make sure that a solution is VS2005 compatible, and to check out VS2010).
I wouldn't want to use it all the time, but then I am working on a laptop. Given a really meaty multicore home desktop (preferably with hardware virtualization support, of course, and lots of memory) it could be reasonably practical for day-to-day use.
VMware Player is free, and some people find it faster; I haven't used it enough to compare the two properly myself. If you'll spend a lot of time "in" the VM, it would probably be worth giving both a proper test drive.
VPC is a very good choice. I use it to test deployments and for presentation purposes.
If you have a PC with a new Intel chip and at least 2 gigs of RAM it actually works just as fast as a regular PC would :).
I recommend 4 gigs of RAM though; they're cheap as hell these days and it really matters.
I've had some success with this; I had to develop some older .NET 1.1 software on Vista, which wasn't supported. I had to run XP in a Virtual PC container in order to get the project done.
The biggest issue was available RAM; I'd recommend maxing out your home PC to use as much as it can. This will likely be less than 4 GB unless you're running a 64-bit OS. I found that getting an extra gig of RAM made life much better. RAM is cheap right now, so I'd start there if it didn't work well enough at first.
Yes, applications will run slower, but the hit isn't as big as you might expect. It is pretty reasonable to do development on a virtual machine. Obviously the performance is relative to how fast your computer is; a multicore machine will do nicely.
If you develop drivers or core routines, where every mistake can and usually will result in a crash, a VM is the best thing you can use.
I tried Virtual PC and VMWare. They are both pretty good for such stuff.
Virtual PC should be fast enough, unless your driver or code is really time-sensitive. A cross-platform, free alternative to Virtual PC is VirtualBox.
If you've got a VirtualPC license already, by all means use it. If not, you might have a look at Sun's VirtualBox. It's Free/Libre and cross-platform. I use it to run windows and linux on mac os x and linux and have been quite happy with it.
You can run your dev tooling natively on your pick of OS, and use VMs to test on other environments. Get lots of memory if you're going to do this, say 2 GB or more, if you haven't already.
AMD chips have some facilities (nested page tables, etc.) that improve VM performance. Second-generation Opterons and some Athlon 64 chips support this for reasonable money. You can even get brand-name hardware like an HP XW4550 with this sort of chip fairly cheaply. I'm not sure to what extent Intel has caught up with this yet.
Assuming your host machine has enough raw power, a virtual machine works fine. I have a 2.5 GB RAM, 2 GHz dual-core work laptop and don't want to install VS2008 on it for personal development, so I have a virtual machine for that. I've given it 1 GB of dedicated memory at the moment and it runs great, no problems. If needed I'll up the RAM allocation, but for now I'm happy.
Hope this helps :-)
I use VirtualBox for all development and find the performance much better than VPC. My machine is about a two-year-old dual core with 4 GB RAM, and performance is not noticeably slower than running natively. The virtual machines are Vista and the host OS is Windows 2008. I would definitely recommend using virtual machines, as creating a fresh new machine for a new project is very easy.
I have a Toshiba notebook with 2 GB of RAM. I am wondering if it's worth installing VirtualBox and using it to browse the web, run Quicken, do some small dev work, etc. How would I install a Windows OS in a VirtualBox virtual session? Are there good tutorials out there? Would 2 GB of RAM be enough to run virtual sessions on a notebook computer with the following configuration:
2 GB of RAM
Intel Pentium 4 CPU
60 GB HDD