I had an interesting conversation with a peer recently that started with his statement that “innovation is all but dead in security.” The implication was that we had done all we could do and that very little more would be accomplished. Of course, I felt this was an overly simplistic and narrow view, not to mention one that completely ignores the rather dramatic impact that changes in computing infrastructure will have over the next 5-10 years and beyond.
How have enterprise architectures evolved over the past 10 years, and how will they continue to evolve? Simply put, we are pushing more of our computing assets, and the infrastructure that supports them, out into the Internet / cloud. It began with mobile computing devices, remote offices, and telecommuters, and is now moving into aspects of the traditional internal infrastructure, such as storage, application / service delivery, and data management. This has forced IT, in some cases, to radically redefine the technologies and processes it implements to provide even the basics of availability, maintenance, and security. How does an IT organization maintain the health and availability of the evolving enterprise while securing the environment? How does it ensure visibility into, and control over, an increasingly complex and opaque infrastructure?
IT Enterprise Architecture Circa: 2000 – Organizations primarily manage static computing devices that sit within the corporate network and primarily access corporate assets; the focus is perimeter security: “keep the bad guys out”
IT Enterprise Architecture Circa: 2009 – Organizations must manage and secure a growing, globally distributed, remote, and mobile computing environment, all accessing corporate assets housed within the corporate network; they tend to focus on data center and critical infrastructure security and, for the most part, relegate the management and security of mobile computing devices to fate and luck.
IT Enterprise Architecture Circa: 2012 – Organizations must manage and secure a large, complex, globally distributed, remote, and mobile computing environment, all accessing corporate assets housed within the corporate network as well as corporate assets/resources housed and maintained in a third-party service provider’s infrastructure; the loss of visibility and control again forces them to look at how they can better maintain the health and security of their mobile computing environment – the endpoints that require access to corporate resources housed both inside the corporate network and in the “cloud”
Mobile Computing and the “Consumerization” of IT
In the good ol’ days, like 3-4 years ago, IT had more freedom to say no to pretty much anything. As soon as they raised the flaming sword of FUD, or tossed around outrageous costs associated with trying to manage said “thing,” said “thing” was quickly denied. But it is a new dawn, a dawn filled with shiny new gadgetry, social media “drum circles,” and all forms of privacy-invading digital communication. IT has been forced to shift from being the overlords of “no” and “sorry, that is against policy” to “uh, yeah, sure you can use that shiny new 3G iPhone to access corporate email, I mean hell, you are the CEO.” The reality is that a mobile sales force using mobile devices to access a SaaS CRM application in real time while on the road, or an SE quickly resolving a demo problem through the use of IM, is powerful and important to productivity, and IT must stop limiting the use of new technologies and instead look to enable and support them. The problem is that most IT organizations simply cannot manage a mobile computing environment. In fact, the entire NAC market (NAC is dead, btw) was an industry response to the inability of IT to manage mobile computing environments: instead of trying to implement a method to manage the unmanageable, the industry decided it would be best to simply block access to corporate resources until the asset could be determined to meet a base level of security. Really, seriously, that was a good idea? IT must learn to manage a mobile computing environment as simply an extension of the corporate network, or it will face severe problems with availability, support costs, and security incidents in the coming years.
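To make the NAC pattern concrete, here is a minimal sketch of the “block until the asset meets a base level of security” logic described above. The attribute names and thresholds are hypothetical placeholders, not taken from any real NAC product; real posture checks involve agents, 802.1X, and far more attributes.

```python
# Illustrative sketch of the NAC "posture check" pattern: deny network
# access until the endpoint proves a baseline security level.
# All attributes and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class EndpointPosture:
    av_signatures_age_days: int  # how stale the antivirus signatures are
    patch_level_current: bool    # OS patches up to date?
    firewall_enabled: bool       # host firewall running?


def admit(posture: EndpointPosture) -> bool:
    """Grant network access only if the endpoint meets the baseline."""
    return (
        posture.av_signatures_age_days <= 7
        and posture.patch_level_current
        and posture.firewall_enabled
    )


# A road-warrior laptop that has been off the network for a month:
laptop = EndpointPosture(av_signatures_age_days=30,
                         patch_level_current=False,
                         firewall_enabled=True)
print(admit(laptop))  # the non-compliant mobile endpoint is simply blocked
```

The point of the sketch is the shape of the approach: it says nothing about *fixing* the endpoint, which is exactly the “manage the unmanageable by refusing to manage it” criticism made above.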
Desktop and Client Virtualization
Wouldn’t a majority of systems management and security problems be solved in this new world if we implemented a virtual desktop “container” on all corporate assets and maintained the health and security of the container – couldn’t we just go back to the thin-client model?
It sounds like a beautiful world, a big giant virtual world filled with roses, butterflies, and perfect SLA adherence. A virtual container would be placed on a bare-metal device, or adjacent to an existing OS, and voila, the enterprise has a “clean” environment that can be swapped out, updated, flushed, and quickly returned to homeostasis. Lower costs, greater efficiencies, and better security – well, not exactly. In some cases the cost of implementing virtual desktops is up to 10x the cost of managing physical environments, with almost zero improvement in security or operational efficiency. Let me explain the requirements for implementing VMware virtual desktops. First you need a bunch of ESX servers that can host the virtual images – you will need a lot of these, so get ready to acquire some heavy iron. Next you will need a large SAN to store all the various images the enterprise requires. Then you will need the VDI (virtual desktop infrastructure) components: Virtual Center and a whole lot of virtual desktop managers, which act as “concentrators and routers” between the client and the desktop image repository. This is an overly simplistic view, but to recap: you need a lot of iron, VMware ESX, Virtual Center, VDI, and the virtual desktop managers. You also need FTEs to manage this new infrastructure, and of course they must possess the expertise to make it all work. Oh, and btw, this doesn’t support remote, mobile computing environments very well, nor will it magically improve poor patching, security, or other traditional IT practices that organizations already struggle with. I am planning a much longer and more detailed post that will bring the awesome to the suck that is desktop virtualization, but in the meantime I really needed to get that off my chest – desktop virtualization is not the magic bullet you are looking for.
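The cost argument above comes down to simple amortization: the VDI stack front-loads a large fixed infrastructure cost (ESX hosts, SAN, VDI brokers, licenses) that only pays off at scale. A back-of-the-envelope sketch, with every dollar figure an assumed placeholder rather than real pricing:

```python
# Back-of-the-envelope per-seat cost comparison: physical desktops vs. a
# VMware-style VDI stack. All dollar figures are assumptions for
# illustration only -- plug in your own numbers.

def per_seat_cost(fixed_infrastructure: float,
                  per_seat: float,
                  seats: int) -> float:
    """Fixed central costs amortized across seats, plus per-seat cost."""
    return fixed_infrastructure / seats + per_seat


seats = 500

# Physical desktops: no central stack, all cost is per machine.
physical = per_seat_cost(fixed_infrastructure=0,
                         per_seat=900,          # desktop + management (assumed)
                         seats=seats)

# VDI: heavy iron, SAN, Virtual Center, desktop managers up front.
vdi = per_seat_cost(fixed_infrastructure=1_500_000,  # assumed stack cost
                    per_seat=400,                    # thin client + image mgmt
                    seats=seats)

print(f"physical: ${physical:,.0f}/seat, VDI: ${vdi:,.0f}/seat")
```

With these assumed numbers VDI comes out at several times the physical per-seat cost; the fixed infrastructure only amortizes favorably at much larger seat counts, which is one way the “up to 10x” cases arise.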
Cloud Computing
Riding the hype train to funky town, I couldn’t resist the opportunity to mention cloud computing; of course, it is relevant to the topic. The movement to “cloud computing” is straining, and will continue to strain, IT organizations, forcing them back to looking at how to secure mobile computing devices – maybe the Jericho Forum guys had it right all along – well, except for the whole “burn your firewall” thing – that was just silly! Cloud computing holds tremendous promise, leading IT toward the land of “dynamic and agile infrastructure,” but along the way they must pass through the dark forest of limited-to-no visibility and near-zero control. When we allow services to be delivered by a third party, we lose all control over how they secure and maintain the health of their environment, and you simply can’t enforce what you can’t control. Now the “experts” will tell you otherwise, convince you that their model is 100% secure and that you have nothing to fear; then again, those experts don’t lose their jobs if you fail. This doesn’t mean that cloud computing isn’t important or that it shouldn’t be invested in, but organizations should start with non-critical applications and slowly come to understand how and what needs to be done before they get sucked into the “cloud”
Bottom Line: IT infrastructures are evolving, and those organizations that are unable to take advantage of new technologies and innovations will fall behind competitively. Unfortunately, adopting and addressing these changes requires IT to implement the technologies and processes that support them – but isn’t that exactly what we all get paid for?
Great article and nice insight with the virtual container. The move to real virtualization, from disk to application, can happen today and provides key direction toward better SLAs and stronger security and services. Costs of published desktops, virtual PCs, and application virtualization are actually dramatically less than physical system implementations. We’ve been providing this environment for 8 years and continue to win business on TCO and ROI calculations along with superior service.
The key is to architect the overall corporate solution starting with the cloud and moving out. Large locations have specific needs, many of which can be served centrally and out of the cloud. Mid-sized locations have other needs that, again, are served well centrally. Small and remote-worker locations are also an easy fit for the cloud. This centralization, and the focus on IT as a service (not just SaaS), really provides a fantastic, cost-effective solution and delivers on the promises today.
We’ve been providing this to customers for years, leveraging the cloud and high-usage virtualization to dramatically reduce our customers’ costs while increasing uptime, security, and overall IT service at great savings to them. It’s here now.
Good article, but I wanted to add another consideration regarding cloud computing and privacy. A large concern for those of us who deal with HIPAA and FERPA compliance is the safety of sensitive data in the cloud.
From a compliance standpoint, knowing where your data physically resides can be a serious concern. I found an article by Kristen Mathews:
“…companies also will need to consider other privacy concerns when computing in the cloud, such as the possibility that data stored with another entity may be subject to subpoena and disclosed to the government of the jurisdiction where the cloud servers are located, perhaps without the company’s permission or knowledge. “
Great point – I neglected to mention it in my post, but it is definitely top of mind. I know that Hoff and others have been pointing to the problems of compliance initiatives with the “cloud” model. It isn’t radically different, I would imagine, from how companies have to deal with auditors when they offshore a process to a foreign country – say, for example, medical billing or credit card processing. Again, the data is leaving the organization, and the organization loses visibility and control into its use, storage, and transmission.
I would guess that we will see more significant, high-visibility breaches associated with a company using a 3rd party.
Pingback: Friday Summary, February 20, 2009 | securosis.com
The Jericho Forum guys have _never_ said “burn your firewall”.
What they have been saying all along is that your firewall no longer provides the security that people associate mentally with having one there, and that all levels of security need to be considered all the way down to the data level.
To address this they did a lot of work on defining their Collaboration Oriented Architectures to help organisations understand this.
Their focus now is on emerging cloud computing models – check out their vision at https://www.opengroup.org/jericho/vision.htm which also refers to privacy & civil liberty.
No, Shane, they never said “burn your firewall,” nor did they actually say “firewalls are dead,” at least not directly – but the hype created by saying “they are obsolete and offer no protection,” along with their various marketing collateral, was pretty damn close. This is really neither here nor there in the context of my post, though; I was just using it to illustrate a point.
These diagrams are great! Do you mind if I use them elsewhere, and provide attribution to you?
The popularity of cloud computing has led most companies to use SaaS accounting software nowadays.
Pingback: Client-Side Virtualization Episode II: Standardization, Attack of the Clones and Desktops Reloaded « Amrit Williams Blog