
Posts Tagged ‘Microsoft’

“I am not a number, I am a free man”

IDC reported that we generated and replicated 1.8 zettabytes – that’s 1.8 trillion gigabytes – of data in 2011. To give you a sense of scale, you would need to stack CDs from the Earth to the Moon and back again – twice – to represent that amount of data, and it’s expected to grow 50x by 2020. Interesting factoid: through April of 2011 the Library of Congress had stored 235 TB of data. In 2011, 15 out of 17 sectors in the US had more data per company than the Library of Congress, and much of that data is about you.
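The stack-of-CDs comparison roughly checks out – here is a back-of-the-envelope sketch, assuming a 700 MB disc about 1.2 mm thick and the standard Earth–Moon distance (figures not from the IDC report):

```python
# Back-of-the-envelope check of the CD-stack comparison.
# Assumptions (not from the source): 700 MB per CD, 1.2 mm per disc,
# 384,400 km Earth-Moon distance.

ZETTABYTE = 10**21                      # bytes
data = 1.8 * ZETTABYTE                  # bytes generated/replicated in 2011
cd_capacity = 700 * 10**6               # bytes per CD
cd_thickness = 0.0012                   # metres per disc
earth_moon = 384_400_000                # metres, one way

discs = data / cd_capacity              # roughly 2.6 trillion CDs
stack = discs * cd_thickness            # height of the stack in metres
round_trips = stack / (2 * earth_moon)  # Earth-to-Moon-and-back trips

print(f"{discs:.2e} discs, stack {stack / 1000:,.0f} km, ~{round_trips:.1f} round trips")
```

With those assumptions the stack comes out at around four round trips – the same order of magnitude as the quoted two, with the difference easily explained by a different assumed disc capacity or thickness.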

Facebook is preparing for a highly anticipated IPO next spring that could value the company at $100 billion – yes, a hundred billion. Twitter is valued at $10 billion, and social media companies are pulling massive valuations. In terms of data, roughly 4 billion pieces of content are shared on Facebook every day, and Twitter registered 177 million tweets per day in March of 2011. The success of these companies, and many others, is trade in a human commodity. There is an inherent value in your tweet, your wall post, becoming mayor of some DC cafe, or posting your location to wherever people post those things, but the real value is simply in your existence as a number in a sea of other 1s and 0s.

We are entering a world where every aspect of our lives, short of those thoughts we hold deep, will be processed, indexed, analyzed and archived forever. What we search for, our online activity, where and how we drive, what we buy and when and how often, our health, financial, and personal records – all digitized for quick sale to the highest bidder. Never before have we had the ability to implement systems that handle massive volumes of disparate data at a velocity that can only be described as break-neck, and with this ability comes the inevitable misuse.

The commercial implications for companies seeking access to this depth and breadth of customer intelligence are clear, but this same information, federated with the analysis of unstructured video, picture, voice and text data, in the hands of our government – or one that meant us harm – is truly frightening.

Social media is an interesting experiment in applying a large-scale operant conditioning chamber to a mass population: the law of effect is a retweet, a friending, being listed on a top-x most influential list, or whatever else elicits the desired response. We leap head first off the cliff of technology and only concern ourselves with the implications when they become a problem for us.

The irony is that in our search for identity and individuality in an increasingly digital world we have willingly surrendered that which we used to hold so dear – our privacy.

May future generations forgive us.

Read Full Post »

Recently I wrote a guest editorial for Virtual Strategy Magazine, although I have to admit I wasn’t made aware of my goofy picture – look away I’m hideous – until the article was published. You can find the full contents at Virtual Strategy Magazine


Read Full Post »

Michal Zalewski, a security researcher at Google, recently wrote a guest editorial for ZDNet entitled “Security Engineering: Broken Promises”. The article lays out a series of issues with the security industry, specifically looking at an inability to provide any suitable frameworks for software assurance or code security.

We have in essence completely failed to come up with even the most rudimentary, usable frameworks for understanding and assessing the security of modern software; and spare for several brilliant treatises and limited-scale experiments, we do not even have any real-world success stories to share. The focus is almost exclusively on reactive, secondary security measures: vulnerability management, malware and attack detection, sandboxing, and so forth; and perhaps on selectively pointing out flaws in somebody else’s code. The frustrating, jealously guarded secret is that when it comes to actually enabling others to develop secure systems, we deliver far less value than could be expected.


Read Full Post »

The rising tide of mobile computing, driven by the introduction of consumer devices such as the iPhone and iPad, is crashing against the shores of many an IT shop. Most IT organizations have lived on a diet of corporate policy restrictions and liberal use of the word “No!” – unfortunately, their time has come.

Read Full Post »

Beijing, China – April 1, 2010 – The Chinese government announced that effective immediately all US based technology firms and associated products and services will be banned from all Chinese government and state-run agency IT environments. The ban is expected to include critical infrastructure, such as military, finance, utilities, and healthcare as well as education, retail and manufacturing companies.

Read Full Post »

I recently had an opportunity to discuss desktop virtualization with Bill Brenner from CSO Online – you can listen to the podcast (here). You can also listen to the most recent Beyond the Perimeter podcast, which focuses on desktop virtualization (here).

Read Full Post »

We all know that IT security and operations are becoming a more challenging and untenable problem day by day – see “Top 10 Reasons Your Security Program Sucks and Why You Can’t Do Anything About It”. The reality is that we continue to build on top of inherently insecure and fundamentally weak foundations, such as the operating systems and routing infrastructures that power much of the global economy.

We need an alternative to the current computing paradigms that all organizations struggle with.


Read Full Post »


Not too long ago I embarked on creating a podcast series that would provide more regularity than the blog. Beyond the Perimeter has been a tremendous amount of fun, and as we just posted our 50th podcast I wanted to reflect on some of the highlights and the wonderful guests we have been honored to have join us.

Beyond the Perimeter iTunes subscription

Beyond the Perimeter Direct XML Feed


Read Full Post »


Systems and security management is difficult, ineffective, costly and becoming ever more so in increasingly distributed, heterogeneous, complex, and mobile computing environments…

  • 98% of all external attacks take advantage of poorly administered, misconfigured, and unmanaged systems (Source: Verizon Data Breach Investigations Report 2009)
  • A locked down and well managed PC can cost 42% less than an unmanaged one (Source: Gartner – The Total Cost of Ownership: 2008 Update)
  • The direct costs incurred in a “somewhat managed” PC are only slightly lower than the direct costs of an unmanaged PC, because of expenses to maintain underutilized or dysfunctional management systems (Source: Gartner – The Total Cost of Ownership: 2008 Update)

The benefits provided by server virtualization are being realized as server consolidation has enabled cost reduction and efficiencies in data center/server management. This is of course leading many to ask the question “why can we not virtualize our desktops as well?”

Read Full Post »


Consolidation is the major benefit or “killer app” for server/data center virtualization. Standardization is the major benefit or “killer app” for client-side virtualization.

As I was pondering the challenges of current systems management processes, researching the latest and greatest from the client-side virtualization vendors, and talking to a lot of large organizations, I was trying to find that one thing that explained the operational benefits of client-side virtualization. There is more than one, but it really does come down to standardization – allow me to explain…

Read Full Post »


To address the increasing cost and complexity of managing dynamic IT environments, organizations are trying to understand how to adopt virtualization technologies. The value proposition and “killer app” are quite clear in the data center; however, less attention has been given to the opportunities for endpoint virtualization. There are multiple methods to address client-side virtualization – hosted virtual desktops (HVD), bare-metal hypervisors, local and streaming virtual workspaces, and a range of options that layer on top of and between them all, such as application virtualization, portable personalities, and virtual composite desktops – yet there is still a tremendous amount of confusion and even more misconceptions about the benefits of client-side virtualization than with server virtualization. The major architectural flaw in almost all of these solutions is that they remain very back-end and infrastructure heavy, which reduces the benefits of cost reduction and lower complexity.

Unlike server virtualization, which drove adoption from the bottom up – that is, from the hypervisor and then up through the other stacks – adoption of endpoint virtualization technologies is moving top down, starting with single applications within an existing OS. Application virtualization adoption will accelerate over the next 12-18 months, with Gartner life cycle management analysts suggesting that it will be included in the majority of PC life cycle RFPs in 2010 and beyond. Workspace/desktop virtualization will follow over the next 24-36 months, as will the endpoint virtualization infrastructures. The adoption of both will align with organizations’ desktop refresh cycles. Considering the average refresh cycle is between 3-5 years, and considering that many are looking at a desktop refresh to support Vista – although it probably only has about a 10% market adoption – and Windows 7, it is conceivable that we will begin seeing accelerated adoption of desktop and infrastructure virtualization over the next 24-36 months as organizations rethink their current systems management processes and technologies.

Let’s look at the 4 client/desktop virtualization models I believe will become the most prevalent over the next 3-5 years…

Read Full Post »


So apparently the latest version of the Qualys Laws of Vulnerability Report has Qualys jumping to some pretty outrageous claims about how cloud computing – invented by Qualys, according to Courtot (insert cute smiley here) – can secure IT more effectively, or allow people to stop patching, or some such nonsense (thanks to Hoff for the heads up).

Anyway, the logic flaw goes something like this…

Read Full Post »


They’re back!

It has been a while since we had a good old-fashioned, highly publicized, hysteria-inducing, globally distributed, mass-infecting worm. The AV vendors (here) and (here) must be ecstatic that 2009 is really turning out to be the year of the largest security incidents since the beginning of forever, as I predicted it would be back in January (here). Of course, you could make that prediction every year for the next 20-30 years and experience an 80%+ success rate; it’s like predicting that as social media becomes ubiquitous we will experience more social media related security threats, or that as the economic condition worsens it will drive even more financially motivated cybercrime, buoying an already burgeoning digital black market, or that there will be more high-profile data breaches – all no-brainers.

Read Full Post »


So apparently a group of technologists and vendors working under the cloak of digital darkness drew out a pentagram and locked arms as they called out to Cthulhu to manifest and drive out those that would oppose their ultimate aims of total and complete world domination. Domination brought about through a set of cloud computing solutions that would revolutionize antiquated IT infrastructures and deliver agility, scalability, and operational efficiencies through an open platform at a really, really good price. Blood was spilled, virgins were killed, and apparently an “open” cloud-computing manifesto was drafted.

Read Full Post »

I had an interesting conversation with a peer recently that started with a statement he made that “innovation was all but dead in security”. The implication was that we had done all we could do and that there was very little more that would be accomplished. Of course I felt this was an overly simplistic and narrow view, not to mention that it completely ignores the rather dramatic impact changes in computing infrastructures will have over the next 5-10 years and beyond.

How have enterprise architectures evolved over the past 10 years, and how will they continue to evolve? Simply put, we are pushing more of our computing assets, and the infrastructure that supports them, out into the Internet/cloud. It began with mobile computing devices, remote offices, and telecommuters, and is now moving into aspects of the traditional internal infrastructure, such as storage, application/service delivery, and data management. This has forced IT to, in some cases, radically redefine the technologies and processes they implement to provide even the basics of availability, maintenance and security. How does an IT organization maintain the health and availability of the evolving enterprise while securing the environment? How does it ensure visibility into and control over an increasingly complex and opaque infrastructure?

Read Full Post »


Few things can evoke more uncertainty and doubt than fear (here)…

The threat of cybercrime is rising sharply, experts have warned at the World Economic Forum in Davos.

Online theft costs $1 trillion a year, the number of attacks is rising sharply and too many people do not know how to protect themselves, they said.

Online theft costs $1 trillion a year? We have certainly come a long way since the Dark Avenger first crafted his polymorphic virus in the late 80s, but $1 trillion a year? Seriously? Where the hell did that figure come from? To give you some perspective of size, total US GDP is about $14 trillion – and that includes EVERYTHING.

But it gets worse…

“2008 was the year when cyber warfare began.. it showed that you can bring down a country within minutes,” one panelist said.

Cyber warfare began in 2008 – between which countries? It showed you can bring down a country within minutes? Seriously, bring down a country, really, are you kidding? Is this some kind of sick world economic forum humor or just sheer ignorance?

So people are unable to browse to YouTube, or update Facebook, or download Goth porn, or make their way over to my blog and up my readership – these things are all terrible, no question, but bring down a country? I can hear the threats now: “Either your country surrenders or we will DoS you back to 1995” – it just doesn’t have the same kick as “bomb you back to the stone age”, does it?

There is no question that we have a problem: the increased reliance on technology, the ubiquitous nature of broadband connectivity, and more digital commerce all create an environment that will breed crime. I believe that awareness is important – people should understand the dynamics and risks inherent in this new digital environment – but FUD doesn’t work; it drives up hysteria that then crashes into ambivalence. FUD is the drug of the security industry, and apparently many are addicted.

Read Full Post »


Well friends we are nearing the end of another year and closing in on the first decade of the century. As we prepare for the onslaught of 2009 predictions I thought it would be appropriate to look back on all that is FAIL in the world of technology over the past decade so we can learn, grow and laugh at someone else’s expense. So I give you the top 10 worst technology failures of the last decade…

10. Oakley MP3 sunglasses (The Death of Cool)

MID MORNING

A WAGNERIAN ARIA plays, a crystalline TENOR SOLO haunting in its beauty consumes an executive board room

FADE IN:

Oakley Executive #1

“Let’s take one of the hottest sunglass brands and combine them with one of the hottest consumer gadgets and make millions”

All

“Yeah, we will make millions, let’s do it”

This is definitely a case of two things that do not go well together, sort of like Mentos and soda, or Symantec and innovation. Aside from the logistical issue of having to wear sunglasses to listen to music, there is simply no way to look cool wearing a pair of dork specs – and honestly, who buys a pair of Oakleys if they don’t want to look cool?

Full disclosure: I owned stock in Oakley and was actually quite happy that they signed a contract with the Army; when they released the oil drum model I was sure the stock would skyrocket. Oh well.

9. The Original DIVX (Making Betamax look genius)

In an awe-inspiring moment of fail, Circuit City (here), the now gasping-for-air consumer electronics chain, made an attempt to corner the movie rental industry with the introduction of the Digital Video Express (DIVX) format. The concept was simple: you – the consumer – pay them $4 for a disc that is only viewable for 48 hours and only on a DIVX player. After 48 hours it became as useless as silicone thigh implants, unless you coughed up an additional $3-4 for another 48 hours.

8. HD DVD (The FAIL of a new generation)

The now obsolete high-definition video format introduced by Toshiba lost the HD format wars to Blu-ray. I would love to weave a David and Goliath story that touched the four corners of the entertainment industry, spin a tale of how the Xbox and PS3 were instrumental in the success of one and the demise of the other, or how tech-savvy consumers, battle-hardened from decades of format evolution, were able to understand the nuances of quality, cost, storage capacity and available content. I would have loved to post that the porn industry won the battle, but in the US they actually standardized on HD DVD. So how did HD DVD lose? Purely conjecture on my part, but it would appear that Sony simply out-biz-dev’d Toshiba, scoring retailer and major studio support and amassing a larger collection of movie titles.

7. The Millennium Bug (Y2Fail)

Billed as the technology equivalent of “The Day After” (here), a movie depicting the devastating effects of a nuclear holocaust, the Y2K, or Millennium, bug was supposed to result in a total technology breakdown. It was feared that planes would fall from the sky, critical services would cease to function and the world’s power grids would go dark. I remember I was working at McAfee at the time (here), and as the clock moved closer to New Year’s the office was crawling with reporters hungry for a front row seat to digital Armageddon. Of course, nothing happened, and all the doomsayers were forced to take down their sandbags and unload their automatic rifles – to some this was a really disappointing turn of events; for others it marked the most visible technology FUD fail of all time.

6. Windows ME (Mistake Edition)

I would have said Bob, but that fail was so 1995. Windows ME (here), dubbed the slowest, buggiest, and most unstable operating system ever released, has won top honors as the worst Microsoft OS to date. The biggest flaw in Windows ME, and earlier versions of the Windows OS, was a lack of memory protection. This problem was exacerbated in Windows ME as Microsoft attempted to introduce a broad set of new capabilities – new system utilities like system/virus restore, media support, automatic updates and a new TCP/IP stack – all of which allowed Microsoft to achieve a whole new level of stability fail.

5. The Sony BMG Rootkit (Meine kleine digitale Parasiten)

In what has probably become the epic DRM (digital rights management) fail of all time (here), Sony BMG implemented a copy protection scheme that was distributed through music CDs to consumer desktops, essentially installing a nearly undetectable rootkit that collected user information and sent it to the Borg collective. It was eventually detected, and there was a major backlash from the security industry. Sony is still in the middle of fending off class-action lawsuits as a result.

4. Second Life (Give us your marginal, your dispossessed, your virtually lost)

Second Life (here) is the internet-based virtual world created by Linden Lab, in which virtual “residents” roam a 3D virtual world, virtually interacting with each other and virtually trading virtual money, called Linden dollars. No mythical creatures, no battle axes, no quests, no zombies, no explosions – really no point to it at all. What kind of folks spend their time in a virtual world? Well, according to Linden Lab chairman of the board Mitch Kapor…

the earliest wave of pioneers in any new disruptive platform, the marginal and the dispossessed are over represented, not the sole constituents by any means but people who feel they don’t fit, who have nothing left to lose or who were impelled by some kind of dream, who may be outsiders to whatever mainstream they are coming from, all come and arrive early in disproportionate numbers.

Just massive amounts of time spent doing nothing “virtually” with a group of marginal and dispossessed individuals. Really? Seriously? Is this for real? Perhaps they should change the name to secondlife.com.

3. Windows Vista (Windows ME Take 2)

Windows Vista (here), the successor to Windows XP, was supposed to herald a new era of Windows security, stability, and functionality; unfortunately it failed on at least two of those fronts, with widespread incompatibility and performance issues. In one of the oddest enterprise software ad campaigns to date, Microsoft unveiled “Mojave”, the “Ha! I tricked you, it really is just Vista” experiment (here) – call it what you will, fail is as fail does. Microsoft is now looking to fast-track the release of Windows 7, which is the final nail in the Vista coffin. The folks over at ZDNet have a nice writeup on the top 5 reasons Vista failed (here).

2. The Internet bubble and dot com bomb of the early decade (E Pluribus, deficio)

I loved the 90s – back then you could get your money for nothing and your chicks for free – but like every wild party, someone has to deal with the massive hangover the next day (here), and that hangover was the sudden reality that more than half of the dot-com companies were not only poorly managed with ridiculous valuations, but were based on business models that seemed to be developed by third graders. Seriously, not just one company that sells pet food over the internet, but 5? Remember when the market cap for Amazon was greater than the entire addressable market they served – not only the digital marketplace but brick and mortar included? I know, I know, greed trumps common sense, as we are experiencing with the sudden, although not unexpected, mortgage collapse and financial crisis, but didn’t someone think to ask: “Seriously, you are willing to invest $20m in my company if I add a .com after the name? That’s just stupid.”

1. The paperless office

No greater fail in our lifetime has had the impact of the myth of the paperless office. It has driven an entire industry in the PC and shaped a new generation of technical gadgetry and digital fail; from ebooks to digital document management systems, the paperless office has been a myth of epic proportions. Now, I wasn’t around in the 40s, which is when I believe the term was coined, but I imagine there was far less paper floating around then than there is now, and there seems to be no letup in the tsunami of felled trees and charred Brazilian rainforest that fuels our appetite to print everything, even if a new inkjet print cartridge costs more than a week’s worth of groceries.

Of course this is just one analyst’s opinion (and an entire market of data), so let me know – what did I get wrong and what did I miss?

Read Full Post »

Ray Ozzie, Microsoft Chief Software Architect and creator of Lotus Notes, announced Windows Azure today during the Windows PDC (Professional Developers Conference) event in Los Angeles (here). Azure coincidentally sounds an awful lot like du Jour, as in “technology hype du Jour”

Windows Azure, previously code-named “Red Dog”, is a hosted suite of services, including a highly scalable virtualization fabric (a what?), scalable storage, and an automated service management system. It is pretty close to Amazon’s EC2 (Elastic Compute Cloud) web services platform, except for the whole “only Microsoft” thing. Hoff was on the ball and posted his thoughts earlier today (here).

Look, when I’m forced into vendor lock-in in order to host my applications and I am confined to one vendor’s datacenters without portability, that’s not ” the cloud” and it’s not an “open architecture,” it’s marketing-speak for “we’re now your ASP/XaaS service provider of choice.”

You can “experience” Azure (here); also check out Manuvir Das, a director on the Windows Azure team, explaining the Windows “Cloud OS” (here), or Steve Marx’s presentation, Azure for Developers (here).

You can read my previous thoughts on cloud-computing (here) and (here)

Read Full Post »

Google recently “leaked” a cartoon providing information on their upcoming browser, named “Chrome” (here) and (here) – personally, I will be impressed when the movie comes out with a guest appearance by Stan Lee. There has already been a tremendous amount of discussion and opinion on the ramifications of such a release, most of it centering on Google taking aim at Internet Explorer. Hoff believes this signals Google’s entry into the security market (here); obviously the acquisitions of GreenBorder and Postini and the release of Google Safe Browsing were clear signals that security was a critical part of the equation. But what is most important here, and seems to be missed by much of the mainstream media, is that Google is creating the foundation to render the underlying Microsoft PC-based operating system obsolete and deliver the next evolutionary phase of client computing. Hoff pointed this out in his earlier post (here).

So pair all the client side goodness with security functions AND add GoogleApps and you’ve got what amounts to a thin client version of the Internet.

A highly portable, highly accessible, secure, thin-client-like, cloud computing software-as-a-service offering that in the next 5-10 years has the potential to render the standard PC-based operating system virtually obsolete – couple this with streaming desktop virtualization delivered through the Internet and we are quickly entering the next phase of the client computing evolution. You doubt this? OK, ask yourself a question: if Google is to dominate computing through the next decade, can it be done on the browser battlefield of old, fought in the same trench-warfare-like manner experienced during the early browser wars between Microsoft and Netscape? Or will it require a much larger land grab? And what is larger than owning the desktop – fixed or mobile, physical or virtual, enterprise or consumer – regardless of the form it takes?

On another note, I recently posted the “7 Greatest Ideas in Security” (here); notice that many of them have been adopted by Google in the development of Chrome, including:

  • Security as part of the SDL – designed from scratch to accommodate current needs (stability, speed, and security); also introduces fuzzing and automated testing using Google’s massive infrastructure.
  • The principle of least privilege – Chrome is essentially sandboxed, limiting the possibility of drive-by malware and other vectors of attack that use the browser to infect the base OS or adjacent applications; the browser’s rendering processes cannot read or write the file system. Of course social engineering still exists, but Google has an answer for that too, providing their free Google Safe Browsing capabilities to automatically and continuously update a blacklist of malicious sites. Now they just need to solve the ecosystem problem of plug-ins bypassing the sandbox security model.
  • Segmentation – multiple processes, each with their own memory and global data structures, not to mention the sandboxing discussed above.
  • Inspect what you expect – Google’s task manager provides visibility into how various web applications are interacting with the browser.
  • Independent security research – a fully open source browser that you can guarantee will be put through the research gauntlet.
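The multi-process model in that list can be sketched in a few lines. This is a toy illustration only – the page names and crash condition are invented, and Chrome’s real sandbox additionally strips the renderer’s OS privileges – but it shows the core idea: a crash in one page’s process cannot corrupt the “browser” or its siblings.

```python
# Toy sketch of process-per-site isolation: each "renderer" runs in its
# own OS process; the parent only observes exit codes and shares no memory.
import subprocess
import sys

def renderer_code(page: str) -> str:
    # Hypothetical renderer: 'evil.example' triggers a crash (exit 1).
    return f"import sys; sys.exit(1 if {page!r} == 'evil.example' else 0)"

def browse(pages):
    results = {}
    for page in pages:
        # Launch each renderer as a separate process...
        proc = subprocess.run([sys.executable, "-c", renderer_code(page)])
        # ...the "browser" merely records the exit code; the crash is contained.
        results[page] = "crashed" if proc.returncode != 0 else "ok"
    return results

print(browse(["good.example", "evil.example", "also-good.example"]))
```

The “evil” renderer dies alone; the other two pages render normally, which is exactly the failure containment the bullet describes.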

Read Full Post »

It is easy to criticize; in fact, many have built entire careers on the foundation of “Monday morning quarterbacking”. Not only is it human nature to look for improvements at the detriment of old ideas, it is also far more humorous to point out what is wrong than to espouse the virtues of what works.

I recently posited what I believed to be the “11 Worst Ideas in Security” (here), but to every yin a yang, to every bad a good, to every Joker a Dark Knight, for the purpose of finding balance, I give to you the 7 Greatest Ideas in Information Security…

7. Microsoft and Security as part of the SDL (Lord Vader finds your lack of faith disturbing)

The greatest flaw in information security is that we try to build security on top of a fundamentally weak foundation. Whether we are talking about the core routing infrastructure, the open standards and protocols that drive it, or the operating systems themselves, the majority of the information security industry is squarely aimed at resolving issues of past incompetence. Nowhere has this been more apparent than in the decade-plus of vulnerabilities found in Microsoft products. Crappiness exists in other products and is not an attribute solely patented by Microsoft; they just happen to power everything from my Mom’s computer to the Death Star, so when they fail it is almost always epic.

The Microsoft SDL (here) and the work that folks like Michael Howard (here) have done to make security a critical aspect of the SDL are not only admirable, they are inspiring. To have witnessed a company the size of Microsoft essentially redesign internal processes to address what was seen as a fundamental deficiency, and then develop those process changes into thought leadership, sets an example for all of us, small business and world-dominating enterprise alike. Implementing security as part of the SDL and utilizing concepts such as threat modeling to identify weaknesses and eradicate them before releasing code to the public is arguably one of the greatest ideas in security.

6. The Principle of Least Privilege (Not all of us can know Zarathustra)

Since Saltzer and Schroeder formulated the concept as part of computing, we have been striving to achieve it. It is neither new nor novel, but it is critical to how we design computing systems and how we develop and implement security controls. It contradicts our Nietzschean side to feel that constraints and rules are important for the common man but shouldn’t apply to us personally; nevertheless, nothing should be afforded more privilege than needed, and this is one of the “laws of security”.
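The principle is easy to sketch in code. A hypothetical example – the `ReadOnlyFile` wrapper and the log parser are invented for illustration, not any real API: instead of handing a component a read/write file handle, hand it a narrow, read-only capability and nothing more.

```python
# Least privilege in miniature: the log parser is granted exactly the
# privilege it needs (reading) and nothing else.
import io

class ReadOnlyFile:
    """Wraps a file-like object, exposing only read()."""
    def __init__(self, f):
        self._f = f

    def read(self, n=-1):
        return self._f.read(n)
    # deliberately no write(), no truncate(), no access to the raw handle

def count_errors(log) -> int:
    # Even a compromised parser can only read through this capability.
    return log.read().count("ERROR")

log = ReadOnlyFile(io.StringIO("INFO boot\nERROR disk\nERROR net\n"))
print(count_errors(log))
```

If the parser is ever exploited, the damage is bounded by the capability it holds – the essence of Saltzer and Schroeder’s rule.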

5. Segmentation (Your Mendelian trait is in my algorithmic reasoning)

Segmentation of duties, of networks, of memory, of code execution – of anything and everything that should never mix. Combine a lack of segmentation with a failure to implement the principle of least privilege and you turn a simple browser-based buffer overflow into a highly damaging payload that can easily replicate throughout the Internets. For us to truly realize improvements in security – defined as fewer successful security incidents, real and imagined, and marked by an increase in visibility and control over all of our computing systems – segmentation of everything is an ideal to strive for.

4. Inspect what You Expect (Question everything)

Also known as “trust but verify”, as used by the Gipper in his dealings with the Russians during the Cold War. Trust is important, but it is even more important to validate that trust. One of the most significant changes every software developer can make today, whether they are developing COTS or internal applications, is to allow security personnel to inspect that the application is functioning, being accessed, and managed according to the controls the organization expects. From networking to applications to users to virtualization to quantum anything, this principle must extend across every layer and concept of computing, today and tomorrow.

3. Independent Security Research (So, I’ve been playing with something…no not that)

The ridiculous vulnerability disclosure debate aside, independent security research has had a significant benefit on the security industry. The best example is the recent DNS vulnerability that has been discussed, dissected, and covered ad nauseam. Since its disclosure it has not only provided more awareness of the fundamental flaws in core infrastructure protocols like DNS and assisted in the implementation of countermeasures, it has actually driven government policy, as the OMB (Office of Management and Budget) recently mandated the use of DNSSEC for all government agencies (here) – sweet!
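Rough arithmetic shows why the interim countermeasure deployed for that DNS flaw – randomizing the UDP source port in addition to the 16-bit query ID – mattered while DNSSEC rolled out. The port-space figure below is an approximation, since the exact entropy varies by resolver implementation:

```python
# A blind cache-poisoning attacker must match the resolver's query
# parameters. Randomizing the source port multiplies the search space.
txid_space = 2**16          # possible 16-bit transaction IDs
port_space = 2**16 - 1024   # usable ephemeral ports (approximation)

print(f"ID only:       1 in {txid_space:,}")
print(f"ID + src port: 1 in {txid_space * port_space:,}")
```

Guessing goes from one in about 65 thousand to one in about 4 billion per forged packet – a stopgap, not a fix, which is why the DNSSEC mandate matters.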

2. Cryptography and Cryptanalysis (From Bletchley with Love)

From the Greek historian Polybios to the German surrender in May of 1945 to ECHELON, cryptography and cryptanalysis have played a major role in our lives. They have shaped the outcome of wars and changed foreign and domestic policy. Cryptography is becoming the cornerstone of the highly distributed, intermittently connected world of technical gadgetry we live in, and can make the difference between coverage on the front page of the Wall St. Journal and a brief mention in a disgruntled employee’s blog. Although I wouldn’t argue that encryption as a technology is without flaw, the theory and practice of hiding information, and its dance partner code breaking, continue to drive some of the greatest advances in information security.
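Polybios’ contribution, for the curious, was the Polybius square: each letter becomes a row/column pair in a 5x5 grid, with I and J sharing a cell in the classical convention. A minimal sketch:

```python
# Polybius square: letters map to (row, column) digit pairs in a 5x5 grid.
SQUARE = "ABCDEFGHIKLMNOPQRSTUVWXYZ"   # 25 letters; J folds into I

def encode(text: str) -> str:
    out = []
    for ch in text.upper().replace("J", "I"):
        if ch in SQUARE:               # skip spaces and punctuation
            i = SQUARE.index(ch)
            out.append(f"{i // 5 + 1}{i % 5 + 1}")
    return " ".join(out)

def decode(pairs: str) -> str:
    out = []
    for p in pairs.split():
        r, c = int(p[0]), int(p[1])
        out.append(SQUARE[(r - 1) * 5 + (c - 1)])
    return "".join(out)

print(encode("POLYBIOS"))
```

Trivially breakable today, of course, but it is a two-millennia-old ancestor of the substitution systems cryptanalysis grew up breaking.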

1. Planning, Preparation, and Expectation Setting (Caution: Water on Road, may make road slippery)

Yes, a bit of a yawner, but since the beginning of forever more failures, more disastrous outcomes and more security incidents have resulted from a lack of proper planning, preparation and expectation setting than from all the exploits of all the hackers of all the world combined. As an analyst it became shockingly clear to me that the majority of failed technology deployments were not the result of a failure in the technology, but of poor planning, a lack of preparation and little to no expectation setting; the entire “trough of disillusionment” is riddled with the waste of mismatched technological expectations. The greatest idea in security is not sexy, funny, or terribly enlightened, but it is simple, achievable, repeatable and can be implemented today – plan, prepare and set the proper expectations.

Some may argue that something has been forgotten or that the order is wrong, but I would argue that we must learn to develop securely, implement the proper security controls, verify the functioning of those controls, leverage the research of the greater community, ensure that what cannot be protected is hidden, and, from beginning to end, properly plan, prepare, and set the right expectations. These are the greatest ideas in security, and if we learn to embody these principles we will move the industry forward, as opposed to constantly feeling like we can only clean up the incompetence that surrounds us.

Read Full Post »
