The Future of VA

Alan Shimel and Mitchell Ashley recently hosted a podcast panel on the future of VM (here). I was asked to participate but was unavailable due to travel; however, given that I have been involved in this area for so long and know most of the panelists personally, I thought I would post my thoughts.

To be clear, the panel was discussing the future of vulnerability assessment, which is just one technology within the vulnerability management process. The term vulnerability management has been thoroughly misused by the vendor community; there are dozens, probably hundreds, of companies claiming to provide VM technology when in reality they only offer some small subset of functions, such as policy mapping, risk prioritization, vulnerability assessment, patch management, security configuration management, or security event management. Managing vulnerabilities requires process to fill technology gaps; technology alone will not solve gaps in process.

VA is not new. Dan Farmer and Wietse Venema did pioneering work with SATAN (Security Administrator Tool for Analyzing Networks) in the mid-1990s, which was followed by commercial offerings from ISS (Internet Scanner), McAfee (CyberCop, formerly Ballista), and others. At one time ISS had the greatest market share in commercial VA, but a lack of product innovation and a focus on their Proventia line allowed other commercial scanning vendors, such as Foundstone (now McAfee), Qualys, and nCircle, to unseat them as the incumbent. Today Nessus is the most pervasive scanning technology and maintains a large user base; however, it is primarily used as a utility scanner rather than being widely deployed within the enterprise, unless it is combined with an enterprise front end such as the one offered by Tenable.

In my travels and discussions with hundreds of large enterprises I have yet to find anyone who thinks assessing the environment is a bad thing, yet the revenue of VA vendors and the lack of interest in adopting the technology suggest the market needs to undergo significant change and adapt, or face an untimely (or timely, depending on where you sit) demise. The VA market is stagnant, and the vendors are trying to expand their product offerings or reposition themselves under an even broader and less understood term: risk management.

Let’s start with the technological challenges VA faces, which include…

– The data output is not actionable. Remediation guidance needs to be oriented to the operations team, which couldn’t care less about 30+ distinct IE vulnerabilities when the corresponding action to fix them all is to update to the latest patched version of IE. Vulnerability assessments are useless unless the organization acts on the data; unfortunately, security cannot take action in the enterprise on its own, and resolution requires desktop, server, or network support folks from the operations team. Security people rarely have budget for VA, and operations folks want to spend their money elsewhere.
– VA lacks contextual awareness; it is unable to understand mitigating network or security controls that can prevent the exploit, only that a vulnerable condition may be present.
– VA lacks breadth of coverage, although it covers more device types than agent-based scanners. VA only looks at the OS and some commercial applications against a database of known vulnerabilities; it lacks visibility into web applications, non-COTS or internally developed code, database vulnerabilities, wireless or VoIP vulnerabilities, etc.
– VA is limited by issues of space and time: it can only scan so much within a certain window, and it is blind to all activity that occurs between scans, such as devices coming on and offline, or scan subversion. To hide from a scanner, simply determine when a scan will occur and then disable whatever badness you do not want the company to know about (here).
– An improperly configured scanner will produce false positives and can potentially disrupt services.
– VA has limited integration with other network and security devices, which limits its usefulness in driving value in the greater security eco-system.

I could go on, but you get the point: VA faces a lot of challenges…
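To illustrate the first challenge above, the gap between raw scanner output and something operations can act on comes down to aggregation: collapsing dozens of per-CVE findings into one work item per host and fix. The sketch below assumes a deliberately simplified, hypothetical finding format (real scanners emit far richer records); it only demonstrates the collapse from per-vulnerability noise to per-action tasks.

```python
from collections import defaultdict

# Hypothetical scanner output: (host, cve_id, remediation_action).
# The CVE identifiers are illustrative placeholders.
findings = [
    ("desktop-01", "CVE-2006-3280", "Upgrade IE to latest patched version"),
    ("desktop-01", "CVE-2006-3450", "Upgrade IE to latest patched version"),
    ("desktop-01", "CVE-2006-3637", "Upgrade IE to latest patched version"),
    ("server-07",  "CVE-2006-3439", "Apply MS06-040 patch"),
]

def remediation_report(findings):
    """Group findings so operations sees one action per host, not one row per CVE."""
    grouped = defaultdict(set)
    for host, cve, action in findings:
        grouped[(host, action)].add(cve)
    # One work item per (host, action), with the CVEs that the action resolves.
    return {key: sorted(cves) for key, cves in grouped.items()}

report = remediation_report(findings)
for (host, action), cves in sorted(report.items()):
    print(f"{host}: {action} (resolves {len(cves)} CVEs)")
```

Here the operations team receives two tasks instead of four vulnerability rows, which is the orientation the bullet above argues for.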

VA does offer a wealth of data that can be leveraged to improve other network and security technologies. Using end-point intelligence to improve security defenses is one of the areas where VA can play an integral role in the security eco-system. Sourcefire uses VA data through RNA to improve their IDS/IPS; when I first joined nCircle we delivered a “target-aware” IDS that would auto-tune based on the information provided by the active scanner. ISS has done some work with its Fusion technology (EOL now, I believe) and extended VA data integration with their IPS through their MSSP virtual-patch offering. NAC is virtually useless (actually it is useless in the real world too, but that is a different post) without understanding the state of unmanaged end-points, and VA can help here; similarly, the value of SIEM (here) is dramatically reduced without correlating VA data.
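The “target-aware” correlation described above can be sketched in a few lines: an IDS alert for a given exploit is escalated only if the scanner has reported the target host as actually vulnerable to the corresponding CVE. The data shapes and function names below are hypothetical, not any vendor’s API; this is a minimal sketch of the auto-tuning idea, not a product implementation.

```python
# Hypothetical VA inventory: target IP -> set of open CVEs reported by the scanner.
va_data = {
    "10.0.0.5": {"CVE-2006-3439"},  # host the scanner found vulnerable
    "10.0.0.9": set(),              # host the scanner found patched
}

def prioritize(alert, va_data):
    """Tune an IDS alert using scanner context:
    'high' when the target is known-vulnerable to the exploited CVE,
    'low' when the scanner says it is not vulnerable,
    'medium' when the host was never scanned (no context to tune with)."""
    vulns = va_data.get(alert["target"])
    if vulns is None:
        return "medium"  # no VA coverage: cannot auto-tune, keep for an analyst
    return "high" if alert["cve"] in vulns else "low"

assert prioritize({"target": "10.0.0.5", "cve": "CVE-2006-3439"}, va_data) == "high"
assert prioritize({"target": "10.0.0.9", "cve": "CVE-2006-3439"}, va_data) == "low"
```

The same lookup is what makes SIEM correlation and NAC posture checks more useful: the VA data supplies the context the other control lacks.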

VA as a stand-alone product offering cannot sustain itself in the market; VA will become commoditized, and the intelligence it gathers will be used to drive more effective security offerings. In the near term the vendors will reposition themselves to offer more security configuration and risk management, but they will not be able to overcome market dynamics, which have relegated VA scanning to a commoditized utility rather than a strategic technology.

4 thoughts on “The Future of VA”

  1. Nice post on this Amrit, I hope you don’t mind all these comments, but I just found your blog and am reading these for the first time.

    I think this post makes the mistake that many do of focussing on one particular facet of security technology by itself. Of course, VA by itself is a dying technology. Companies are realising that the data it provides is a snapshot of their vulnerabilities at that point in time, something that provides very little useful information on devices whose configurations change many times a day, like desktops, servers and laptops.

    Where it does provide good value is on devices that have limited changes, things like network devices. These devices can be scanned less frequently and the information provided is still relevant.

    So my point is that security is a holistic practice; the VA vendors need to realise this and partner with, purchase, or ally themselves with the other components of the security eco-system (did I just use a Cisco term? I need to get that in check!). The work you did with nCircle was a case in point. The two products (IDS and VA) by themselves didn’t provide an effective solution, but together?

    We are seeing the growth of security management solutions that provide integrated eco-systems to maintain policy across any device.

    McAfee are moving towards it with their recent moves and acquisitions, Symantec as well. BigFix has a solution that many think will start to move in this direction, and vendors like LANDesk and Criston will eventually catch up or be snatched up.

    What this means for the user (or at least for the user who is willing to invest) is a single platform to manage vulnerabilities across all devices, a platform that is policy driven based upon location or any other factor and a platform that recognises these polices need to be enforced in real time.

    I look forward to seeing who delivers.

  2. Pingback: Effective Vulnerability Management (Part 1) « Observations of a digitally enlightened mind

  3. Pingback: Moving Security through Visibility to Implemeting Operational Controls « Amrit Williams Blog
