Why did my LINQPad just deactivate itself?

I'm using LINQPad v4.45.05 on a Windows Server 2012 virtual machine. I'm on an isolated network, so I had to perform an "offline activation" (which was a complete nightmare). Today I rebooted my VM and now my LINQPad is no longer activated. Is there any way to recover my activation? Why did this happen, and how do I prevent it from happening again?

Here are some general things to check:
LINQPad activations on virtual machines are tied both to the virtual machine and to the underlying hardware, so if the latter changes, the activation may become invalid.
When activating, choose the option to activate all users (rather than the current user) unless you plan to always use the same login.
Activations on Azure VMs are not fully supported on versions prior to v4.47.05, which is still in beta at the time of writing.

Related

Virtual PC 2007 as a programming environment

I'd like to create a VM in Virtual PC 2007 for use as a development environment/sandbox for an existing ASP.NET application in Visual Studio 2005/SQL Server 2005 (and VSS for source control).
I'm thinking that I need to create a 'base' copy of the environment (with the OS, Visual Studio, and SQL Server), and then copy that to a 'work' version that I do actual development in. I would be sharing this VM with one or two other developers who would be working on different parts of the app.
Is this a good idea? What is the best way to get my app/databases in and out of the VM and the changes I make into VSS? Is it just a copy from the host location to the VM share and back again? How do I keep everything synchronized?
Thanks!
I would seriously suggest the following:
Use a "server" solution, rather than a desktop solution. That's far more reasonable if you want to share the VM environment with other developers.
Use VMware's products rather than Microsoft's.
From these two points it follows that you should use VMware ESX Server and related products. If you don't want to or can't invest money in it, there's a free version of the product: http://www.vmware.com/go/getesxi/, though I've never used it.
Whether you choose the enterprise version of ESX Server or the free one, I suggest you get your organization's IT department involved.
It's not a bad idea, if you think there's a need for it.
I do something similar when I need to develop a Windows App because it's just nice to have a clean environment. That way I don't accidentally add a reference to something that's not necessarily included in the .NET Framework. It forces me to install any 3rd party components as I'm developing and documenting. This way, I can anticipate prerequisites, and ensure that I have them documented before I load software to a user's PC and wonder why it doesn't work.
Just make sure the PC it's hosted on can handle the additional load. My main dev PC has a dual-core processor and 4GB of RAM. I devote 2GB to any virtual PC I plan on using as a development environment so that I don't hit too much of a performance snag.
As for keeping everything synchronized, you will want to use some sort of source control (as you should even in a normal environment). (I like SVN with TortoiseSVN as my client of choice, but there are plenty of alternatives.) Just treat the virtual PCs as if they were normal PCs. Make sure they can access the network, so you can all access your source code repository.
You can use the snapshot feature (or whatever it is called) - changes to the "system" are saved to a delta file so that you can easily revert to an earlier state of the virtual PC. It has some performance penalty. This way you don't have to keep base and work copies.
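To give an idea of that snapshot/revert workflow, here is a minimal sketch using VirtualBox's VBoxManage CLI (Virtual PC's undo disks achieve a similar effect); the VM name "dev-sandbox" and the snapshot name are made up for the example.

    # Minimal sketch: take a snapshot before experimenting, revert afterwards.
    # Assumes VirtualBox is installed and a VM named "dev-sandbox" exists.
    import subprocess

    VM = "dev-sandbox"

    def take_snapshot(name):
        # Record the current state of the VM as a named snapshot (a delta file).
        subprocess.run(["VBoxManage", "snapshot", VM, "take", name], check=True)

    def revert_to(name):
        # The VM must be powered off (or saved) before a snapshot can be restored.
        subprocess.run(["VBoxManage", "controlvm", VM, "poweroff"], check=False)
        subprocess.run(["VBoxManage", "snapshot", VM, "restore", name], check=True)
        subprocess.run(["VBoxManage", "startvm", VM, "--type", "headless"], check=True)

    if __name__ == "__main__":
        take_snapshot("clean-baseline")   # before hacking away
        # ... experiment, install things, break things ...
        revert_to("clean-baseline")       # discard everything and start fresh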
I use Virtual PCs for all of my Windows development. The company I work for has legacy products in FoxPro and current products in .NET so I have 2 environments set up:
1 - Windows XP with FoxPro and VSS - I can access VSS directly from this image and the code never enters other machines in my network (I work remotely)
2 - Windows 7 with VS2008 and all the associated bits and pieces needed to develop our .NET software (including TFS). This is the machine I use every day - I have a meaty desktop PC, so I am able to give the VPC 4GB of RAM and it runs as fast as a 'normal' PC.
I have my VPCs running in VirtualBox and it is just as good as the other offerings. A previous answer mentioned VMware ESX, which is an excellent product for large-scale deployment, but if you want a server solution then VMware Server is free and is a nice virtualisation platform.
If you are looking at ways to experiment with changes and still want to use VPC then undo disks are excellent - you fire up the machine, hack away to your heart's content, and when you shut down you can choose to save or discard the entire session.
For me Virtual PCs are an excellent way to quickly set-up / tear down development environments and I would struggle to return to using a single machine for all my work.

Online product demo environment for Windows applications

I'm looking for a way to allow potential customers to try my application before they buy it.
The product is a windows forms application that requires an SQL Server database to operate.
Although I have a functional demo that the customer can install on their network, I want to make it easier for them by having them "play" with it in my environment.
I remember Microsoft had (has?) something similar. I was testing Visual Studio a few years ago in a virtual environment where I was connecting to a server at Microsoft.
They set up the environment this way so that when a user logs off, his actions are rolled back. Or to explain it better: when a user logs in, he starts with a new, clean environment.
So any projects I've created testing Visual Studio were lost after I logged off.
Any suggestions?
Thanks.
Some solutions that come to mind:
Provide remote access
You could provide access to a running instance of your application via some sort of remote connection protocol, e.g. via RDP or via VNC.
For example, there is a Java VNC client which can run as a Java applet; you could put that on a webpage and have it connect to a VNC session you host on your servers.
Or use Windows Terminal Server, and allow connection via RDP.
Both solutions of course have the drawback that people need to open the appropriate ports, if they are behind a firewall. There might be ways around that, however (e.g. you can run VNC over HTTP).
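If firewalls are a concern, a quick reachability check of the standard ports can save a support call. The sketch below is only an illustration; the hostname is a placeholder, and it assumes the default ports of 3389 for RDP and 5900 for VNC.

    # Rough sketch: check whether the usual RDP/VNC ports on a demo host are
    # reachable from the prospective customer's network.
    # "demo.example.com" is a placeholder hostname.
    import socket

    HOST = "demo.example.com"
    PORTS = {"RDP": 3389, "VNC": 5900}

    for name, port in PORTS.items():
        try:
            with socket.create_connection((HOST, port), timeout=5):
                print(f"{name} port {port}: reachable")
        except OSError as err:
            print(f"{name} port {port}: blocked or unreachable ({err})")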
VM image
A completely different solution: Provide a ready-to-run VM image (for VMware, VirtualBox or similar) of your application, including server and everything. You would need a demo version of your app though, plus getting redistribution rights for all the proprietary components (Windows OS, SQL Server) might get hairy.
Offer videos
Often people do not really need to actually use the app; they are mainly interested to see how it works. So maybe it is enough to host videos of the app in operation. That allows you to put in some advertising for your features, and lets you show the users what they might miss when testing on their own.

Terminal Services Servers Information

I was wondering if anyone had some good general information on Windows Terminal Services and how it works.
I'm wondering:
If a DLL is loaded into memory, is it available to all users or reloaded for each user? (Or does it depend on something else?)
A specification for an example server and how many concurrent users it supports under general use.
Any issues with them? I used one a while ago and we had sporadic freezes for a few seconds every so often. This could not be tracked down to a network issue; someone suggested it had something to do with remote printers being attached to the system. It really annoyed users and seemed to happen most often when opening the Start menu.
Is 2008 a big upgrade in terms of performance over 2003?
Terminal Services is essentially a server hosting multiple "remote desktop" sessions.
DLLs are not shared between sessions; each user's processes load their own instance, just like ordinary processes (Windows may share the read-only code pages in physical memory, but per-process data is separate).
You need a special license if you want to use it on a standard Windows server.
I suggest removing all the printers when you want to use it (you can also disable them on the client side), but that's not a big issue.
2008 is far better for performance and security, but you'd also need more recent RDP clients.

What are the key use cases for use of virtualization in software development?

What are the key use cases for the use of virtualization -- that is, running one or more "virtual PCs" using software such as VMware and Microsoft Virtual PC -- for software development?
Also -- are there other instances/uses of virtualization that aren't covered by my definition above (use of a tool like MS Virtual PC or VMWare), and that are useful to developers?
My impetus for asking is this StackOverflow comment by Metro Smurf asserting "You'll wonder how you ever developed without it!", regarding use of virtualization.
(Please include just one use case per response. Thanks!)
Application testing in multiple environments is one obvious use of virtualization that I'm aware of. Testing your application on other operating systems (without requiring additional physical computers to do so), as well as testing that involves software that generally only allows you to install a single version on a given machine (such as the Internet Explorer browser; running both IE6 and IE7 on the same machine is not an officially supported configuration), are good candidates for virtual machine usage.
If your build server is running in a VM, you can make a snapshot of it for every software release in order to be 100% sure that you can recreate the build environment (in case you want to make patches to old releases, for example).
If you set up snapshots of your development environment (and back them up), it can be very easy to resume productivity if your computer breaks down. When your machine dies right before your release and you can resume immediately with all your tools installed and configured, it can be a lifesaver.
The simplest case which applies to my current situation is that we have a complex client-server environment and with virtualization every developer can very quickly get a baseline set of operating systems to deploy their local build to and verify end to end functionality.
Locally you have your dev box, and N client boxes which get re-initialized as fresh OSes each time you want to try a build. Essentially it's the test environment equivalent of a 'make clean' where even the client workstation gets replaced with a new OS.
Quickly distributing environments between team members is a very nice use case for virtualization, especially if you have a lot of various components, tools, etc. This can save you a ton of time with new hires, contractors, or other individuals who need an environment quickly.
Many presenters use a VM for presentations - it allows them to revert immediately to reset the presentation for the next day, transfer all presentation materials quickly between computers, and avoid showing attendees their messy My Documents folder.
Using virtualization for sales activities is also a great use case. You can take a snapshot at a particular time that you can save as your demo baseline. Then once you run through the demonstration and change the data, etc. you can restore back to your previous baseline for future demonstrations. You can also capture multiple baselines and pick and choose which baseline best fits the upcoming demo.
Test environments. If you have more than one setup that a system needs to be targeted for (e.g. Windows & Linux, XP & Vista) then a machine with lots of RAM and VMware (or one of the others) is a good way to manage the environments.
Another is developing on one system and targeting another. For example, at one point I did some J2EE work on a workstation running Linux where the client was IE 5.5. A VM with Windows 2000 and IE 5.5 would let me test the application.
Reasons I use virtual machines for development.
Isolate different development environments.
Testing environments.
Easy recovery in case of computer hardware failure/upgrade.
Ability to "roll-back" changes to your development environment if something corrupts it.
Currently, I am using VirtualBox for my VM setup. I used to use VirtualPC, but I REALLY hated not having any type of "snapshot" feature (like VMware and VirtualBox have).
We develop software for use in our SaaS application. Our production environment has a large number of servers, and their software environment needs to be absolutely predictable; we can't have ANYTHING extra installed on, or absent from, our development machines.
Moreover, our application requires a number of different server types in order to function properly (at least 7 last time I counted); mostly they can't be installed on the same (virtual) machine - at least, not without violating the "same software as production" requirement.
In order to have a consistent environment, it's necessary to use VMs. I don't know how anyone ever manages without them.
Snapshots and rollbacks are nice too, but I use them only occasionally (really useful during installation / upgrade tests).
Suppose you're developing a new version of your software, and checking that the upgrade from the previous version works correctly... how long does it take to do a test cycle without being able to roll back the box? Do you have to reinstall the OS and then the old version? Can you guarantee that the uninstall really uninstalls everything?
Being able to test/retest your deployment process is a huge savings.
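As a sketch of what such a repeatable cycle can look like with snapshot rollback - assuming a VirtualBox VM named "upgrade-test" with a snapshot "v1.0-installed" that already has the previous release on it; the installer names are placeholders:

    # Repeatable upgrade-test cycle: reset to "old version installed", then test.
    # VM name, snapshot name and installer file names are hypothetical.
    import subprocess
    import time

    VM = "upgrade-test"
    BASELINE = "v1.0-installed"

    def vbox(*args, check=True):
        return subprocess.run(["VBoxManage", *args], check=check)

    def reset_and_test(installer):
        start = time.time()
        vbox("controlvm", VM, "poweroff", check=False)   # ignore error if already off
        vbox("snapshot", VM, "restore", BASELINE)        # back to the old release
        vbox("startvm", VM, "--type", "headless")
        # Run the upgrade installer and your verification suite inside the guest
        # here (via your automation tool of choice); omitted in this sketch.
        print(f"{installer}: environment reset in {time.time() - start:.0f}s")

    for build in ["setup-2.0.0.exe", "setup-2.0.1.exe"]:
        reset_and_test(build)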
Developing Add-Ins for different versions of Microsoft Office (using Visual Studio Tools for Office).
My main work machine has Office 2007. When I work with Add-Ins for Office 2003 I use a virtual machine with Visual Studio and Office 2003.
I'm surprised that nobody has mentioned the VMware record/replay feature (awesome video demo), which is great for debugging.
I have a headless server running ESXi which runs various machines for building installers (so I don't have to give up processing power on my desktop), automated testing (the server is faster than any desktop) and various test environments (about 20 different configurations) so that the support team can easily jump onto a configuration that closely matches a customer's system.
When you have one really beefy server running VMs that can be shared between support, test and dev teams, you introduce huge cost savings. In all, we're running ~25 VMs on ESXi (dual quad-core Xeon 2.5GHz + 8GB RAM) shared between 5-10 people. Some of the developers use Virtual PC, and I use VMware Workstation on my desktop. All of the Mac users here use VMware Fusion as well.
I am surprised that no one has mentioned the benefit of increased security by isolating, for example, the database server and web server in different VMs.
Some server applications can use VMs too. When one VM is not used much, the server can allocate its resources to other VMs.
Some sort of test environment: if you are debugging malware (either writing it or developing a defense against it), it is not clever to use the real OS. The only possible disadvantage is that viruses can detect that they are being run under virtualization. :( One way they can do this is that VM engines only emulate a finite set of hardware.

Setting up a development environment INSIDE a virtual machine

Here's the problem. I use around three different machines for development. My partner is using two. We have to go through the same freaking setup procedure on all five machines to get to work.
Working with a php project here, so:
Install and configure PDT, a PHP debugger, and some version of XAMPP.
Then possibly install an SVN client, and any other tools.
Again, on each of the five machines.
What if, instead, we did all of this once, in a virtual machine that is set up with the same stack, same versions, as the production server? Then each of us could grab a copy of the VM image, run that image on each of the five machines and do all of our development in that VM. Put Eclipse, Apache, MySQL, the works, all in that VM.
The only negative of this approach (and please correct me if it's not the only one) is performance. Is it really that big of an issue though? The slowest machine out of the five is a Samsung NC10 powered by an Intel Atom 1.6 GHz processor.
Do you think this is possible and practically usable? Or am I crazy?
I use a VM for development (running on my laptop) and have never had performance problems. Another approach that you could take would be to image the drive in the state that you want. Use Acronis or Ghost to re-image each machine when you need to. It only takes about 5-10 minutes to restore an image on any modern PC.
I use a VM for all my "work" as it keeps it away from my "play". This setup allows me to use the office VPN without exposing my whole machine to the office environment (which I trust about as much as the internets. ;-) Also I don't have to worry about messing up my development environment by trying games or other software. My work VM is currently running inside VirtualBox but I have used VMware in the past. I have only noticed performance issues when using graphics-intensive programs like Webex or the Terminal Server Client.
It can certainly be done. What turns me off is the size of the VM image, which would normally be several GBs. Having it on a network share means it can take longer to transfer than your current setup process takes. I guess an external hard drive would be the easiest way to move it around.
Performance wouldn't be an issue with any web development.
I have to ask why your current machines need to be "re-imaged" each time you sit down for work?
If you're using Windows you'll probably want to use SYSPREP on the master image so that the 'mini-setup' runs when you boot up the virtual machines for the first time.
Otherwise, from Windows' point of view, the machines have the exact same SID, hostname and other things - and running multiple machines with the same SID on the same network can cause tons of headaches. Even more so if you want them to communicate with each other.
I've run WebSphere for zSeries on a VMware virtual machine with no problem, and WebSphere is more resource-intensive than any PHP stack. I find that having a multi-core machine, or at least hyper-threading, makes it run a lot faster.
With VMware, disk operations are slower. For PHP development I doubt it would be a problem, but you'd definitely notice it if you were compiling a large C++ project. There is also Sun's VirtualBox which is free, and the latest version is rather nice (but I haven't looked at how slow disk operations are yet).
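If you want to check whether virtualized disk I/O would actually hurt your workflow, a quick-and-dirty comparison like the one below can help; run it once inside the VM and once on the host. The file size and path are arbitrary, and the numbers are only indicative.

    # Rough sequential-write benchmark: run inside the VM and on the host,
    # then compare the throughput figures.
    import os
    import time

    PATH = "disk_test.tmp"
    SIZE_MB = 256
    CHUNK = b"\0" * (1024 * 1024)   # 1 MB of zeros

    start = time.time()
    with open(PATH, "wb") as f:
        for _ in range(SIZE_MB):
            f.write(CHUNK)
        f.flush()
        os.fsync(f.fileno())        # make sure data actually hits the (virtual) disk
    elapsed = time.time() - start
    os.remove(PATH)

    print(f"Wrote {SIZE_MB} MB in {elapsed:.1f}s ({SIZE_MB / elapsed:.0f} MB/s)")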
I am using that idea in practice. Virtual machines are generally great for development.
To run on multiple operating systems and multiple separate development environments.
Preserve older development environments for later support.
Can be easily backed up; when a hard drive crashes there's no need to start from the beginning.
Can be copied from one developer to another, so everyone doesn't have to do the tedious installations and configurations.
Downsides are:
Virtual machines are slower, so you need more powerful computers than you would otherwise. I would recommend having at least 4GB of RAM (but preferably more like 16GB), fast multi-core processors and fast hard drives.
When copying Windows virtual machines, each copy in use should have its own product key. When you make a copy, it needs to be activated with a new product key.
Did you think about a software configuration manager like Ansible, Chef or Puppet? With such software, automating these tasks is very easy. It can even create a fresh VM and then configure it.