Recently I wrote a guest editorial for Virtual Strategy Magazine, although I have to admit I wasn’t made aware of my goofy picture – look away I’m hideous – until the article was published. You can find the full contents at Virtual Strategy Magazine
Consolidation is the major benefit or “killer app” for server/data center virtualization. Standardization is the major benefit or “killer app” for client-side virtualization.
As I was pondering the challenges of current systems management processes, researching the latest and greatest from the client-side virtualization vendors, and talking to a lot of large organizations, I was trying to find that one thing that explained the operational benefits of client-side virtualization. There is more than one, but it really does come down to standardization; allow me to explain… Continue reading
To address the increasing cost and complexity of managing dynamic IT environments, organizations are trying to understand how to adopt virtualization technologies. The value proposition and “killer app” are quite clear in the data center; far less attention has been given to the opportunities for endpoint virtualization. There are multiple methods to address client-side virtualization – hosted virtual desktops (HVD), bare-metal hypervisors, local and streaming virtual workspaces, and a range of options that layer on top of and between them all, such as application virtualization, portable personalities, and virtual composite desktops – yet there is still a tremendous amount of confusion, and even more misconceptions about the benefits of client-side virtualization than there were with server virtualization. The major architectural flaw in almost all of these solutions is that they remain very back-end and infrastructure heavy, which undercuts the promised reductions in cost and complexity.
Unlike server virtualization, which drove adoption from the bottom up – from the hypervisor and then up through the rest of the stack – adoption of endpoint virtualization technologies is moving top down, starting with single applications within an existing OS. Application virtualization adoption will accelerate over the next 12-18 months, with Gartner life cycle management analysts suggesting that it will be included in the majority of PC life cycle RFPs in 2010 and beyond. Workspace/desktop virtualization will follow over the next 24-36 months, as will the endpoint virtualization infrastructures. The adoption of both will align with organizations’ desktop refresh cycles. Considering that the average refresh cycle runs 3-5 years, and that many organizations are looking at desktop refreshes to support Vista – which probably only has about 10% market adoption – and Windows 7, it is conceivable that we will see accelerated adoption of desktop and infrastructure virtualization over the next 24-36 months as organizations rethink their current systems management processes and technologies.
Let’s look at the 4 client/desktop virtualization models I believe will become the most prevalent over the next 3-5 years… Continue reading
I had an interesting conversation with a peer recently that started with a statement he made that “innovation was all but dead in security”. The implication was that we had done all we could do and that there was very little more that would be accomplished. Of course I felt this was an overly simplistic and narrow view, not to mention that it completely ignores the rather dramatic impact changes in computing infrastructures will have over the next 5-10 years and beyond.
How have enterprise architectures evolved over the past 10 years, and how will they continue to evolve? Simply put, we are pushing more of our computing assets, and the infrastructure that supports them, out into the Internet / cloud. It began with mobile computing devices, remote offices, and telecommuters, and is now moving into aspects of the traditional internal infrastructure, such as storage, application / service delivery, and data management. This has forced IT to, in some cases, radically redefine the technologies and processes it implements to provide even the basics of availability, maintenance, and security. How does an IT organization maintain the health and availability of the evolving enterprise while securing the environment? How does it ensure visibility into and control over an increasingly complex and opaque infrastructure? Continue reading
Thanks to VMware you can barely turn around today without someone using the V-word, and with every aspect of the English language, and some from ancient Sumeria, now beginning with V, it will only get worse. There is no question that virtualization holds a lot of promise for the enterprise, from decreased cost to increased efficiency, but between the ideal and the reality lies a chasm of broken promises, mismatched expectations, and shady vendors waiting to gobble up your dollars and leave a trail of misery and despair in their wake. To help you avoid the landmines, I give you the top myths, misconceptions, half-truths, and outright lies about virtualization.
Virtualization reduces complexity (I know what server I am. I’m the server, playing a server, disguised as another server)
It seems counterintuitive that virtualization would introduce management complexity, but the reality is that all the security and systems management requirements facing enterprises today do not disappear simply because an OS is a guest within a virtual environment; in fact, they increase. Not only does one need to continue to maintain the integrity of the guest OS (configuration, patch, security, application and user management, and provisioning), one also needs to maintain the integrity of the virtual layer. The problem is that this is done through disparate tools managed by FTEs (full-time employees) with disparate skill sets. Organizations also move from a fairly static environment in the physical world, where it takes time to provision a system and deploy the OS and associated applications, to a very dynamic environment in the virtual world, where managing guest systems – VMsprawl – becomes an exercise in whack-a-mole. Below are some management capabilities that VMware shared/demoed at VMworld.
- VDDK (Virtual Disk Development Kit) – Allows one to apply updates by mounting an offline virtual machine as a file system and then performing file operations against it. This ignores the fact that file operations are a poor replacement for systems management tasks such as applying patches: the method won’t work with Windows patch executables, nor with RPM patches, which must execute to apply.
- Offline VDI – The virtual machine can be checked out to a mobile computer in anticipation of a user going on the road and being disconnected from the data center. Unfortunately, the data transfers, including the diffs, are very large, and one needs to be aware of the impact on the network.
- Guest API – Allows one to inspect the properties of the host environment, but this is limited to the hardware assigned to the virtual machine.
- vCenter – Management framework for viewing and managing a large set of virtual machines across a large set of hardware; a separate management framework from what IT will use to manage physical environments.
- Linked Clones – Among other things, this allows multiple virtual machine images to serve as a source for a VM instance; however, without a link to the parent, clones won’t work.
- Virtual Machine Proliferation – Since it is so easy to take a snapshot of a machine and to provision a new machine simply by copying another and tweaking a few key parameters (like the computer name), machines get made by the ton. Keeping track of the resulting virtual machines – VMsprawl – is a huge problem. Additionally, disk utilization is often underestimated, as the number of these machines and their snapshots grows very quickly.
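The disk-utilization point in that last bullet is easy to wave away until you do the arithmetic. Here is a back-of-the-envelope sketch; every figure in it is a hypothetical assumption for illustration, not a VMware measurement:

```python
# Back-of-the-envelope estimate of disk consumed by VM sprawl.
# All sizes below are hypothetical assumptions, not measured values.

def sprawl_disk_gb(vms, base_image_gb=20.0, snapshots_per_vm=3,
                   avg_snapshot_gb=4.0):
    """Rough total disk footprint for a fleet of full-clone VMs,
    each carrying a few forgotten snapshots."""
    per_vm = base_image_gb + snapshots_per_vm * avg_snapshot_gb
    return vms * per_vm

# 25 "quick test" VMs that nobody cleaned up:
print(sprawl_disk_gb(25))   # 25 * (20 + 3*4) = 800.0 GB
# Double the fleet and the bill doubles with it:
print(sprawl_disk_gb(50))   # 1600.0 GB
```

The point is not the particular numbers but the shape: the footprint grows linearly in machines times snapshots, and both multiplicands grow quietly when provisioning is a copy-and-rename away.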
Want to guess how many start-ups will be knocking on your door to solve one or more of the above management issues?
Virtualization increases security (I’m trying to put tiger balm on these hackers’ nuts)
Customers drawn to virtualization should be aware that it adds another layer that needs to be managed and secured. Data starts moving around in ways it never did before, because virtual machines are simply files that can be moved anywhere. Static security measures like physical security and network firewalls don’t apply in the same way and need to be augmented with additional security measures, which will increase both cost and complexity. Network operations, security operations, and IT operations will inherit management of both the physical and the virtual systems, so their jobs get more complicated in some ways and simpler in others.
Again, it would seem counterintuitive that virtualization doesn’t increase security, but the reality is that virtualization adds a level of complexity to organizational security: new attack vectors in the virtual layer, a lack of security built into virtual environments, and, compounding both, a need for expertise in securing virtual environments – skills that are sadly lacking in the industry.
The Hoff has written extensively about virtualization security and securing virtual environments (here) – they are different, yet equally complex and hairy – and nowhere will you find a better overall resource to help untangle the Tet Offensive of virtualization security, or of securing virtual environments, than the Hoff.
Virtualization will not require specialization (A nutless monkey could do your job)
What is really interesting about the current state of virtualization technology in the enterprise is the amount of specialization required to effectively manage and secure these environments. Not only will one need to understand, at least conceptually, the dynamics of systems and security management, one will also need to understand the technical implementations of the various controls, the use and administration of the management tools, and, of course, follow what is a very dynamic evolution of technology in a rapidly changing market.
Virtualization will save you money today (That’s how you can roll. No more frequent flyer bitch miles for my boy! Oh yeah! Playa….playa!)
Given the current economic climate, the CFO is looking for hard-dollar savings today. Virtualization has shown itself to provide more efficient use of resources and faster time to value than traditional environments; the reality, however, is that reaching the promised land requires an initial investment in time, resources, and planning if one is to realize the benefits. Here are some areas where virtualization may provide cost savings, and some realities about each of them:
- Infrastructure consolidation – Adding big iron and removing a bunch of smaller machines may look like an exercise in cost-cutting, but remember you still have to buy the big iron, hire consultants to help with the implementation, acquire new licenses, deploy stuff, and of course no one is going to give you money for the machines you no longer use.
- FTE reduction – Consolidating infrastructure should allow one to realize a reduction in FTEs, right? The problem is that now you need FTEs with different skill sets, such as how to actually deploy, manage, and secure these virtual environments, which now require separate management infrastructures.
- Decrease in licensing costs – Yes, well, no; it depends on whether you want to pirate software, which is actually easier in virtual environments. With virtual sprawl, software asset and license management just jumped the complexity shark.
- Lower resource consumption – See the above references to complexity, security, and FTEs. One area where virtualization will have an immediate impact, however, is power consumption and support of green IT initiatives – though being green can come at a cost.
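To see why the hard-dollar pitch above needs scrutiny, it helps to put the consolidation claim and its offsetting costs in one equation. A minimal sketch follows; every dollar figure is a made-up placeholder, not a benchmark, so plug in your own numbers:

```python
# Illustrative first-year consolidation math. Every default below is a
# hypothetical assumption for illustration only -- not vendor pricing.

def first_year_net_savings(servers_retired, cost_per_old_server=4_000,
                           big_iron_cost=60_000, consulting=25_000,
                           new_licenses=15_000,
                           power_savings_per_server=400):
    """Net first-year savings (can be negative) of consolidating N
    physical servers onto one virtualization host."""
    avoided_refresh = servers_retired * cost_per_old_server
    power = servers_retired * power_savings_per_server
    upfront = big_iron_cost + consulting + new_licenses
    return avoided_refresh + power - upfront

print(first_year_net_savings(20))  # 80000 + 8000 - 100000 = -12000
print(first_year_net_savings(30))  # 120000 + 12000 - 100000 = 32000
```

With these invented inputs, retiring 20 servers loses money in year one and retiring 30 barely breaks into the black, which is the whole point of the section: the savings are real, but they arrive after the big iron, the consultants, and the licenses have been paid for.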
Virtualization won’t make you rich, teach you how to invest in real estate, help you lose weight, or grow you a full head of hair; it won’t make you attractive to the opposite sex, nor will it solve all your problems. It can improve the efficiency of your operating environment, but it requires proper planning, expectation setting, and careful deployment. There will be an initial, in some cases substantial, investment of capital, time, and resources, as well as an ongoing effort to manage the environment with new tools and to train employees in new skills. Many will turn to consulting companies, systems integrators, and service providers that will help them implement solutions that generate a quick payback with virtually no risk and position the organization to take advantage of available and emerging real-time infrastructure enablers designed to closely align business needs with IT resources.
As Les Grossman said in Tropic Thunder “The universe….is talking to us right now. You just gotta listen.”