I had an interesting conversation with a peer recently that started with his statement that “innovation is all but dead in security.” The implication was that we had done all we could do and that very little more would be accomplished. Of course, I felt this was an overly simplistic and narrow view, one that completely ignores the rather dramatic impact that changes in computing infrastructure will have over the next 5-10 years and beyond.
How have enterprise architectures evolved over the past 10 years, and how will they continue to evolve? Simply put, we are pushing more of our computing assets, and the infrastructure that supports them, out into the Internet/cloud. It began with mobile computing devices, remote offices, and telecommuters, and is now moving into aspects of the traditional internal infrastructure, such as storage, application/service delivery, and data management. This has forced IT, in some cases, to radically redefine the technologies and processes it implements just to provide the basics of availability, maintenance, and security. How does an IT organization maintain the health and availability of the evolving enterprise while securing the environment? How does it ensure visibility into, and control over, an increasingly complex and opaque infrastructure?
IT Enterprise Architecture Circa 2000 – Organizations primarily manage static computing devices that live within the corporate network and primarily access corporate assets; they focus on perimeter security: “keep the bad guys out.”
IT Enterprise Architecture Circa 2009 – Organizations must manage and secure a growing globally distributed, remote, and mobile computing environment, all accessing corporate assets housed within the corporate network; they tend to focus on data center and critical infrastructure security and, for the most part, abandon the management and security of mobile computing devices to fate and luck.
IT Enterprise Architecture Circa 2012 – Organizations must manage and secure a large, complex, globally distributed, remote, and mobile computing environment, accessing corporate assets housed both within the corporate network and in a third-party service provider's infrastructure; the loss of visibility and control again forces them to look at how they can better maintain the health and security of their mobile computing environment – the endpoints that require access to corporate resources housed inside the corporate network and in the “cloud.”
Mobile Computing and the “Consumerization” of IT
In the good ol’ days, like 3-4 years ago, IT had more freedom to say no to pretty much anything. As soon as they raised the flaming sword of FUD, or tossed around outrageous costs associated with trying to manage said “thing,” said “thing” was quickly denied. But it is a new dawn, a dawn filled with shiny new gadgetry, social media “drum circles,” and all forms of privacy-invading digital communication. IT has been forced to shift from overlords of the word “no” and “sorry, that is against policy” to “uh, yeah, sure you can use that shiny new 3G iPhone to access corporate email, I mean hell, you are the CEO.” The reality is that a mobile sales force using mobile devices to access a SaaS CRM application in real time while on the road, or an SE quickly resolving a demo problem over IM, is powerful and important to productivity, and IT must stop limiting the use of new technologies and instead look to enable and support them. The problem is that most IT organizations simply cannot manage a mobile computing environment. In fact, the entire NAC market (NAC is dead, btw) was an industry response to this inability: instead of trying to implement a method to manage the unmanageable, the industry decided it would be best to simply block access to corporate resources until the asset could be determined to meet a base level of security – really, seriously, that was a good idea? IT must learn to manage a mobile computing environment as simply an extension of the corporate network, or it will face severe problems with availability, support costs, and security incidents in the coming years.
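To make the NAC idea concrete, here is a minimal sketch of the admission logic described above: quarantine any endpoint until it reports a baseline security posture. Every name, field, and threshold here is hypothetical and purely illustrative; real NAC products do this with 802.1X, agents, and policy servers rather than a single function.

```python
# Hypothetical sketch of NAC-style admission control: block access to
# corporate resources until the endpoint meets a base level of security.
from dataclasses import dataclass


@dataclass
class Endpoint:
    hostname: str
    patch_level: int         # hypothetical installed patch-bundle version
    antivirus_running: bool
    firewall_enabled: bool


REQUIRED_PATCH_LEVEL = 42    # made-up baseline for illustration


def admission_decision(ep: Endpoint) -> str:
    """Return 'allow' if the endpoint meets the baseline, else 'quarantine'."""
    meets_baseline = (
        ep.patch_level >= REQUIRED_PATCH_LEVEL
        and ep.antivirus_running
        and ep.firewall_enabled
    )
    return "allow" if meets_baseline else "quarantine"


# A road warrior whose laptop has stale patches gets quarantined,
# which is exactly the "block first, manage never" pattern criticized above.
laptop = Endpoint("road-warrior-01", patch_level=40,
                  antivirus_running=True, firewall_enabled=True)
print(admission_decision(laptop))
```

The point of the sketch is the design flaw, not the code: the decision happens at connection time, so the endpoint's actual health between checks is still unmanaged.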
Desktop and Client Virtualization
Wouldn’t a majority of systems management and security problems be solved in this new world if we implemented a virtual desktop “container” on all corporate assets and maintained the health and security of the container – couldn’t we just go back to the thin-client model?
It sounds like a beautiful world, a big giant virtual world filled with roses, butterflies, and perfect SLA adherence. A virtual container would be placed on a bare-metal device, or adjacent to an existing OS, and voilà – the enterprise has a “clean” environment that can be swapped out, updated, flushed, and quickly returned to homeostasis. Lower costs, greater efficiencies, and better security – well, not exactly. In some cases the cost of implementing virtual desktops is up to 10x the cost of managing physical environments, with almost zero improvement in security or operational efficiency. Let me explain the requirements for implementing VMware virtual desktops. First you need a bunch of ESX servers to host the virtual images – you will need a lot of these, so get ready to acquire some heavy iron. Next you will need a large SAN to store all the various images the enterprise requires. Then you will need the VDI (virtual desktop infrastructure) components: Virtual Center and a whole lot of virtual desktop managers, which act as “concentrators and routers” between the client and the desktop image repository. This is an overly simplistic view, but to recap: you need a lot of iron, VMware ESX, Virtual Center, VDI, and the virtual desktop managers. You also need FTEs to manage this new infrastructure, and of course they must possess the expertise to make it all work. Oh, and btw, this doesn’t support remote, mobile computing environments very well, nor will it magically improve the poor patching, security, or other traditional IT practices that organizations already struggle with. I am planning a much longer and more detailed post that will bring the awesome to the suck that is desktop virtualization, but in the meantime I really needed to get that off my chest – desktop virtualization is not the magic bullet you are looking for.
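The component list above lends itself to a back-of-envelope cost comparison. The sketch below just totals the shared VDI infrastructure (hosts, SAN, licenses, staff, thin clients) and amortizes it per seat against a physical desktop. Every number is a made-up placeholder – plug in your own quotes; the point is only that the VDI side has several large line items that the physical side does not.

```python
# Hypothetical per-seat cost comparison, physical desktops vs. VDI.
# All figures are illustrative placeholders, not vendor pricing.
SEATS = 1000

physical_per_seat = 900              # desktop hardware + management tooling

# VDI side: shared infrastructure amortized across all seats
esx_servers = 20 * 15_000            # assumed host count * assumed unit cost
san_storage = 250_000                # image repository
vdi_licenses = SEATS * 150           # VDI / Virtual Center / desktop managers
admin_fte = 2 * 120_000              # extra staff to run the new stack
thin_clients = SEATS * 300           # client-side hardware

vdi_total = esx_servers + san_storage + vdi_licenses + admin_fte + thin_clients
vdi_per_seat = vdi_total / SEATS

print(f"physical: ${physical_per_seat}/seat, VDI: ${vdi_per_seat:.0f}/seat")
```

Even with these charitable placeholder numbers the VDI column comes out higher per seat, and none of those line items buys you better patching or security practice – it just relocates the same work.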
Cloud Computing
Riding the hype train to funky town, I couldn’t resist an opportunity to mention cloud computing – of course, it is relevant to the topic. The movement to “cloud computing” is straining, and will continue to strain, IT organizations, forcing them back to looking at how to secure mobile computing devices – maybe the Jericho Forum guys had it right all along – well, except for the whole “burn your firewall” thing – that was just silly! Cloud computing holds tremendous promise, leading IT toward the land of “dynamic and agile infrastructure,” but along the way IT must pass through the dark forest of limited-to-no visibility and near-zero control. When we allow services to be delivered by a third party, we lose control over how that party secures and maintains the health of its environment, and you simply can’t enforce what you can’t control. Now, the “experts” will tell you otherwise, convince you that their model is 100% secure and that you have nothing to fear – then again, those experts don’t lose their jobs if you fail. This doesn’t mean that cloud computing isn’t important or that it shouldn’t be invested in, but organizations should start with non-critical applications and slowly come to understand how and what needs to be done before they get sucked into the “cloud.”
Bottom Line: IT infrastructures are evolving, and organizations that are unable to take advantage of new technologies and innovations will fall behind competitively. Unfortunately, adopting and addressing these changes requires IT to implement the technologies and processes that support them – but isn’t that exactly what we all get paid for?