So, I'm setting up a virtualized environment running on Hyper-V and I'm looking to see if there's a video conferencing system that can be set up on a virtual server, preferably on Windows Server 2008 R2. I'm not sure if this is possible, but is there something that can be done through IIS Media Services?
You could install the GNU Gatekeeper on the server so H.323 video-conferencing terminals can connect.
But beware that virtual machines usually don't provide the same consistent latency that physical servers do, so whatever system you end up using, avoid proxying RTP through the virtual machine.
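If you do go the GnuGk route, the gatekeeper can be told to handle only call signalling and to leave the RTP media flowing directly between the endpoints instead of through the VM. A minimal, illustrative gnugk.ini sketch (the IP address is made up, and option names should be checked against the manual for your GnuGk version):

```ini
; Illustrative gnugk.ini fragment - check option names against your GnuGk version
[Gatekeeper::Main]
; address the gatekeeper listens on inside the VM
Home=192.168.1.10

[RoutedMode]
; route H.225/H.245 call signalling through the gatekeeper
GKRouted=1
H245Routed=1

[Proxy]
; do not proxy RTP media through this (virtual) machine
Enable=0
```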
You can try TrueConf Server; as far as I remember it works on Hyper-V. Be sure that your server is not overloaded, otherwise any video conferencing server software will not work properly.
OK, I know what Hyper-V basically is.
Simply put, a virtual machine host. Well, good for testing applications and for development usage.
OK, so far so good on the understanding. And here is the main question:
Why would you install servers inside Hyper-V on a real server?
Isn't running a server OS directly on the real machine somehow better for performance than running it in a virtual environment?
For example, a database server. Why install it in a virtual machine rather than on the real machine?
One example of its use would be to create the perfect developer environment if you want to run many different versions of SQL Server on the same physical box.
SQL Server 2005 isn't supported on Windows 10, so a virtual server running Windows Server 2003 is a better place to house it; Windows Server 2008 for SQL Server 2008, and so on.
This also gives you the flexibility to allocate resources to different VMs and prioritise RAM for the instance you're currently developing against, giving you server-level options with the client tools running on the host OS as intended.
Check out this blog post on setting up such a dev environment.
http://www.purplefrogsystems.com/paul/2016/05/using-hyper-v-and-powershell-to-create-the-perfect-developer-workstation/
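As a rough illustration of the resource-allocation point (assuming the Hyper-V PowerShell module that ships with Server 2012+ or a Windows 8/10 Pro workstation with Hyper-V enabled; the VM name, paths and sizes below are made up):

```powershell
# Illustrative sketch only - names, paths and sizes are assumptions

# A Generation 1 VM to host SQL Server 2005 on Windows Server 2003
New-VM -Name "DEV-SQL2005" -Generation 1 -MemoryStartupBytes 2GB `
       -NewVHDPath "D:\VMs\DEV-SQL2005\os.vhdx" -NewVHDSizeBytes 60GB `
       -SwitchName "DevSwitch"

# Use dynamic memory, but weight RAM towards the instance you're working against
Set-VMMemory -VMName "DEV-SQL2005" -DynamicMemoryEnabled $true `
             -MinimumBytes 1GB -StartupBytes 2GB -MaximumBytes 8GB -Priority 80
```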
I am new to Hyper-V and Server Core but I am stumped as to how to install a guest OS from an ISO using only PowerShell.
I have downloaded the Hyper-V Server ISO and installed it on my server. It only installs Server Core and does not give me the option of a full GUI. I configured its network settings, etc., and everything looks OK, so Server Core installed properly and the Hyper-V feature is enabled. I can use PowerShell to create a VM with a VHDX and attach my guest OS ISO to it, but when I start the VM there is no console UI to install the OS.
How are you supposed to install a guest OS with no console interface to set up the OS?
Note: there is no option under this configuration to enable the OS GUI, as some posts have suggested.
First, please don't confuse "Server Core" with "Hyper-V Server". "Server Core" is an installation mode of Windows. Among other things, it can be converted to GUI mode, which is why people keep telling you to just turn the GUI on. Hyper-V Server looks like Server Core but it is not Server Core.
For your actual problem, you're not going to find a simple out-of-the-box solution. You could work up a complete unattended installation process. You could set up a Windows Deployment Services server and have it install via PXE boot. I think some of the third-party Hyper-V management solutions allow you to connect to the console of a VM from within the local Hyper-V Server.
Hyper-V Server was designed with headless operation in mind. It was expected that you would use it to configure and perform maintenance on the management operating system and, if desired, the virtual machines as containers. The guest operating systems themselves were not really meant to be managed from within Hyper-V Server. What you're expected to do is use a full GUI machine, whether another copy of Windows Server or a Windows desktop operating system running the Remote Server Administration Tools, to remotely connect to Hyper-V Server and manage its VMs.
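To make the remote-management approach concrete, here is a rough sketch (all VM names, paths and the host name are made up): you create the VM and attach the ISO from PowerShell on the Hyper-V Server, then open its console from a GUI machine that has the Hyper-V management tools installed.

```powershell
# Run on the Hyper-V Server itself, or remotely via Enter-PSSession / Invoke-Command
New-VM -Name "GUEST01" -Generation 2 -MemoryStartupBytes 4GB `
       -NewVHDPath "C:\VMs\GUEST01\os.vhdx" -NewVHDSizeBytes 80GB -SwitchName "LAN"
Add-VMDvdDrive -VMName "GUEST01" -Path "C:\ISO\guest-os.iso"
Set-VMFirmware -VMName "GUEST01" -FirstBootDevice (Get-VMDvdDrive -VMName "GUEST01")
Start-VM -Name "GUEST01"

# From a Windows machine with the Hyper-V management tools (RSAT) installed,
# open the VM's console window so you can click through the OS installer
vmconnect.exe HYPERV-HOST GUEST01
```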
Hello all. I have backed up about 30 servers using disk2vhd, and now I have built the first of many Hyper-V servers. I did not realize it is all command line. I did download CoreConfigurator, and that does have some of the functionality I have been looking for. My question is: how do I get the VHD files running as virtual machines? It's all command line; I tried mounting the VHDs via VBS and have not been able to. Any help on this would be great!
Thanks!
If you are using Server Core, you may be able to do everything from the command line, but I always prefer to have one computer running a non-Server Core version of Windows 2008 as the management server. You load up Hyper-V Manager on the non-Core box and manage your Hyper-V server from there.
Having no "management" servers or desktops on your network will, IMO, be a big pain for administration.
Using Hyper-V Manager you can quickly load the VHDs as VMs.
So load up Hyper-V Manager on a desktop PC on your local network and use its connect option to connect to your Server Core box. (Make sure the firewall settings on Server Core are OK, using CoreConfigurator.)
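If you do want to script it rather than click through Hyper-V Manager, attaching an existing disk2vhd image to a new VM looks roughly like this. Note this assumes the Hyper-V PowerShell module, which is built in from Server 2012 onwards; on Hyper-V Server 2008 R2 you would do the equivalent in Hyper-V Manager or via WMI. The VM name, path and switch name are made up.

```powershell
# Illustrative sketch: boot an existing disk2vhd image as a new VM
New-VM -Name "P2V-SERVER01" -Generation 1 -MemoryStartupBytes 4GB `
       -VHDPath "D:\disk2vhd\SERVER01.vhd" -SwitchName "LAN"
Start-VM -Name "P2V-SERVER01"
```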
I'd like to import a real Windows Server 2008 server as a Hyper-V Virtual Server on another Windows Server 2008 instance.
Anyone have any idea how to do this?
I'm looking at the System Center Virtual Machine Manager 2008 but it doesn't seem to import Windows Server 2008 - nor is it free.
Is there some other workaround (i.e. import the image into VMWare first, then convert to Hyper-V)?
Please help.
Regards,
Randall
While testing disaster recovery, I was pleasantly surprised (and impressed) that the built-in Windows Server Backup restored to Hyper-V without a hitch.
This was on production hardware, with hardware RAID 5 and such, so I expect it would work with slightly less exotic kit as well.
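For reference, the backup side of that can be scripted with the built-in wbadmin tool; the restore is then done by booting the VM from the Windows Server install media and using the recovery environment's Complete PC Restore / Bare Metal Recovery option pointed at the backup location. A rough example (the share path is made up):

```powershell
# Take a full-server backup of the physical machine to a network share
# (wbadmin ships with the Windows Server Backup feature; run elevated)
wbadmin start backup -backupTarget:\\backupsrv\p2v\server01 -allCritical -quiet
```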
I know from personal experience that using VMware's converter works to take an image of the system. You can then import the VMware image you created into Hyper-V.
When I was testing the beta version of Hyper-V, this was the only reliable method I found to import a physical system into a Hyper-V environment.
It seems crazy to double-convert something, but it worked!
Currently running Server 2003 but am looking at reinstalling in the near future due to a change of direction with the domains. Should I take this opportunity to install Windows Server 2008 instead?
I would love to play with new technology and the server is only for a small home business so downtime/performance issues aren't really a concern.
I am no expert on Windows Server revisions, but the only major new feature of Server 2008 I can think of is Hyper-V. I would try Server 2008 just for Hyper-V, as this hypervisor is supposedly much faster than VMware and Virtual PC, and is compatible with Virtual PC virtual disks.
One rule that has served me very well over the years is: Do not upgrade infrastructure components just for the sake of upgrading. If it works well, leave it be. You mentioned that some downtime isn't a big deal, but if the server is actually used then there is a chance it can become a big deal unexpectedly. Why not simply get (or build) a new machine and play with the new operating system there? That way you get the best of both worlds.
There is no Exchange Server 2008. Exchange has always been tightly integrated with IIS which tends to bind it to a specific version of Windows. However, Exchange Server 2007 SP1 can be installed on Windows Server 2008.
Exchange Server 2003, however, cannot run on Windows Server 2008 and I do not believe there are any plans to do so in a future service pack.
Note that Exchange Server 2007 requires x64 hardware running a 64-bit OS on a production system. The days of booting with /3GB are past; it simply does not provide enough virtual address space for current large databases. Exchange's long-running virtual memory fragmentation problem has not been fixed; it has just been given more virtual address space to work in.