Cisco Study Finds 40 Percent of IT Workers are Clueless About Virtualization

12 Comments

stinger2048

I wonder whether Cisco really spoke to the senior leadership in charge of networking, or to the senior leadership for Wintel and the like. In the Fortune 100 company I work for, Cisco would never reach out and speak to someone in the Wintel, mainframe, or iSeries arena. To me, most people who understand virtualization understand networking, since they need to know about the Nexus virtualized switch; but it makes sense that Cisco management does not really understand virtualization.


Azmadeus

I have worked in virtualization for the past 8 years, specifically with VMware in a large enterprise environment. I will be the first to admit that when you add in hardware costs, licensing, and support, the savings you get from virtualization aren't much, and in some cases it can even cost more. However, what makes virtualization VERY cool is the turnaround time for standing up new workstations/servers and the recovery of those machines. Throw in High Availability and Distributed Resource Scheduler and that is where virtualization shines. Now that everyone (from an enterprise perspective) is moving to a "cloud" infrastructure model, it makes better use of resources within a virtual environment. With the many ways to access these environments, people aren't constrained to sitting at a desktop, in a cube, at an office somewhere. I am a HUGE fan of virtualization as a whole, but it isn't for everyone/every company.


Hey.That_Dude

Not that surprising. Most colleges hardly teach anything real-world anymore. They just shovel math and ten-year-old drivel down our throats so that we might have enough experience not to cause a meltdown.
Example: my university used the same chip to teach microcontrollers for 14 years. They only got ARM Cortex-M3s within the last semester or two. They're sad little things, but at least they have more than two registers.
NOW, with this newfound controller, I've seen many projects in later classes (after the students have supposedly been taught and learned quite a bit) where the CPU component is overlooked because they simply use the M3. Not a bad idea; however, no one checks the datasheets. I've seen way too many projects that say "we'll use double floats." Not only does that processor have no floating-point hardware, it's a 32-bit core, so 64-bit doubles end up in slow software emulation. These are sad days ahead of us, because our schools are so ingrained in theory that they forget to mention the things to look for in REAL LIFE.
I'd love to give more examples, but I'd fill up pages. Long story short: we paid too much for college, especially when we learn more in the field while getting paid...


wumpus

There is a lot to be said for learning on a chip where you can actually tell what is going on. I've heard claims of "using X means indeterminate execution time" on desktop CPUs. Learning on a 14-year-old microcontroller, you would know what the requirements are to determine execution time (and can count it up to the clock cycle, then check the datasheet to determine which nanosecond the pins change). You wouldn't make the assumption that you could ever know what a desktop CPU is doing at any one cycle.

Some of this is due to the nature of the business. When I studied EE, one of the most important ideas in computer architecture was microcoding [I'm seriously dating myself here]. Designers would try microcoding once (the previous approach was hardwired control) and refuse to use any other system. By the time I graduated, the Verilog/VHDL revolution was in full swing and the only thing anybody used microcode for was obscure x86 compatibility. There was also that prof who insisted that nobody would ever use more than 2 register arguments (but we did learn the basics of RISC design at one point).

On the flip side, I heard equal horror stories of computer science students learning nothing but Java coding. Learn the principles and you can learn the next technology. If the students didn't learn virtualization, I can't blame the school (no matter how important it was to the IBM 360 line). If the students can't *understand* virtualization, the school should have its accreditation yanked.

Finally, on the whole I have to agree with you. Here's how it works in EE.
What you learn: Lots of analysis. Very little design. What little design you do is all about making something and barely touches specs, reliability, documentation, etc.
What you need: Lots of design. In practice, the fact that engineers can typically make clever designs is more of a problem than an asset (in the "you can't debug your most clever designs" sense). What you really need is the ability to find the *real* specs (a superset of the given specs), meet all the specs religiously, follow the documentation* specs religiously, and make certain all the documentation is correct. Since manufacturing builds to the documentation, the docs *are* your product: it doesn't matter how well your prototype worked if you made a typo in the wrong place and manufacturing couldn't build the first run.

Of course, that's STEM. God help the humanities students.

* documentation here means parts lists, part specification, engineering drawings (schematics, gerber plots, mechanical drawings, solidworks whatever), and similar. Engineers can typically get away with their casual attitude toward user manuals and anything that isn't needed to build the device.


JeyNyce

I've known about virtualization for years, but try explaining to your boss how it can save money; good luck with that. I have a VM setup at home running my network.


jgottberg

This is shocking, if not disturbing.


Bullwinkle J Moose

heh heh

he signed his name Moron

Oh wait..

what?


Bullwinkle J Moose

That's nothing...

Over 90% of YOU think that NSA spyware platforms 7 & 8 are more secure than illegal copies of XP, because XP lets you block all the back doors to Microsoft and everyone else while 7 & 8 leave all the doors open
(For security reasons of course)

Now who's the moron?

MORON


lordfirefox

Provide proof and cite sources, or just STFU. Either way.


wumpus

He's claiming it is possible to secure XP. Think about it. I'm an old hand at bashing MS "security" and can't imagine stooping to that level of trolling.


Bullwinkle J Moose

I am the source of my research, so I know the data is accurate

Go away Troll


Innomasta

Someone wheel this derp away