So we had a new server installed at work the other day. It's a quad-core Xeon (I don't know the exact processor model), and we have Windows Server Standard running in a Hyper-V virtualized environment.
Now, my impression of virtualization is that it can be, and often is, slow (depending on the hardware and software being used). But I'm a bit puzzled as to why (other than for cost purposes when expansion is needed) a virtualized environment would be "faster" or better than a normal server. I don't think that's the case, is it (or can it be)?
I would think that two separate servers running Windows Server 2008 Standard would be faster than one server running them virtualized, right? At this point in time we only have one virtualized server, but in the future (5 years or so) we may want to add another.
What's the big deal with virtualization and efficiency? I can see the cost benefits in regards to expansion, and that I understand, but I just don't quite get where people are saying it's "better" from a performance standpoint. I mean, we actually had to get two (if I am correct, and I could be wrong) licenses for Server 2008 Standard: one for the primary OS, and then another license for the virtualized environment.