Alan Shimel and Mitchell Ashley recently hosted a podcast panel on the future of VM (here). I was asked to participate but was unavailable due to travel; however, given that I have been involved in this area for so long and know most of the panelists personally, I thought I would post my thoughts.
To be clear, the panel was discussing the future of vulnerability assessment (VA), which is just one technology within the vulnerability management process. The term "vulnerability management" has been thoroughly misused by the vendor community; there are dozens – probably hundreds – of companies claiming to provide VM technology when in reality they offer only some small subset of functions, such as policy mapping, risk prioritization, vulnerability assessment, patch management, security configuration management, or security event management. Managing vulnerabilities requires process to fill technology gaps; technology alone will not solve gaps in process.
VA is not new. Dan Farmer and Wietse Venema did pioneering work with SATAN (Security Administrator Tool for Analyzing Networks) in the mid-90's, which was followed by commercial offerings from ISS (Internet Scanner), McAfee (CyberCop, formerly Ballista), and others. At one time ISS had the greatest market share for commercial VA, but a lack of product innovation and a focus on their Proventia line allowed other commercial scanning vendors, such as Foundstone (now McAfee), Qualys, and nCircle, to unseat them as the incumbent. Today Nessus is the most pervasive scanning technology and maintains a large user base; however, it is primarily used as a utility scanner rather than being widely deployed within the enterprise, unless it is paired with an enterprise front end such as the one offered by Tenable.
In my travels and discussions with hundreds of large enterprises I have yet to find anyone who thinks assessing the environment is a bad thing, yet the revenue of VA vendors and the lack of interest in adopting the technology suggest the market needs to undergo significant change and adapt, or face an untimely (or timely, depending on where you sit) demise. The VA market is stagnant, and the vendors are trying to expand their product offerings or reposition themselves under an even broader and less understood term: risk management.
Let’s start with the technological challenges VA faces, which include…
– The data output is not actionable. Remediation guidance needs to be oriented to the operations team, which couldn't care less about 30+ distinct IE vulnerabilities when the corresponding action to fix them all is to update to the latest patched version of IE. Vulnerability assessments are useless unless the organization acts on the data; unfortunately, security cannot take action in the enterprise on its own, and resolution requires desktop, server, or network support folks from the operations team. Security people rarely have budget for VA, and operations folks want to spend their money elsewhere.
– VA lacks contextual awareness; it cannot see mitigating network or security controls that would prevent an exploit, only that a vulnerable condition may be present.
– VA lacks breadth of coverage, although it covers more than agent-based scanners. VA only checks the OS and some commercial applications against a database of known vulnerabilities; it lacks visibility into web applications, non-COTS or internally developed code, database vulnerabilities, wireless or VoIP vulnerabilities, etc.
– VA is limited by issues of space and time: it can only scan so much within a given window and is blind to all activity that occurs between scans (devices coming on and off line, scan subversion). Want to hide from a scanner? Determine when a scan will occur, then disable whatever badness you do not want the company to know about (here).
– Improperly configuring a scanner will produce false positives and can potentially disrupt services.
– VA has limited integration with other network and security devices, which limits its usefulness in driving value in the greater security ecosystem.
I could go on but you get the point, VA faces a lot of challenges…
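The first challenge above – output that is not actionable – is essentially an aggregation problem: collapse many per-vulnerability findings into the handful of remediation actions the operations team actually needs to perform. A minimal sketch of that idea in Python (the field names, vulnerability IDs, and fix strings are hypothetical illustrations, not any vendor's format):

```python
from collections import defaultdict

# Hypothetical findings as a scanner might report them: many distinct
# vulnerability IDs that all resolve to the same remediation action.
findings = [
    {"host": "10.0.0.5", "vuln_id": "VULN-101", "fix": "Upgrade IE to the latest patched version"},
    {"host": "10.0.0.5", "vuln_id": "VULN-102", "fix": "Upgrade IE to the latest patched version"},
    {"host": "10.0.0.7", "vuln_id": "VULN-101", "fix": "Upgrade IE to the latest patched version"},
    {"host": "10.0.0.9", "vuln_id": "VULN-250", "fix": "Apply the latest IIS cumulative patch"},
]

def actionable_report(findings):
    """Collapse per-vulnerability findings into per-action work items."""
    actions = defaultdict(set)
    for f in findings:
        actions[f["fix"]].add(f["host"])
    # One line item per remediation action, with the hosts it covers.
    return {fix: sorted(hosts) for fix, hosts in actions.items()}

for fix, hosts in actionable_report(findings).items():
    print(f"{fix}: {len(hosts)} host(s) -> {', '.join(hosts)}")
```

Four raw findings become two work orders – the kind of rollup an operations team can actually schedule.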
VA does offer a wealth of data that can be leveraged to improve other network and security technologies. Using end-point intelligence to improve security defenses is one of the areas where VA can play an integral role in the security ecosystem. Sourcefire uses VA data through RNA to improve their IDS/IPS; when I first joined nCircle we delivered a "target-aware" IDS that would auto-tune based on the information provided by the active scanner. ISS has done some work with its Fusion technology (EOL now, I believe) and extended VA data integration with their IPS through their MSSP virtual patch offering. NAC is virtually useless (actually it is useless in the real world too – different post, though) without understanding the state of unmanaged end-points – VA can help here. Similarly, the value of SIEM (here) is dramatically reduced without VA data to correlate against.
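The "target-aware" tuning idea above can be sketched simply: de-prioritize alerts whose exploit cannot succeed against the target host, based on what an active scanner knows about that host. This is an illustrative sketch, not any product's actual logic; all names and data are hypothetical.

```python
# What the VA scanner knows about each host (OS, listening services).
asset_db = {
    "10.0.0.5": {"os": "windows", "services": {80, 445}},
    "10.0.0.8": {"os": "linux", "services": {22, 80}},
}

def prioritize(alert):
    """Return 'high' only when the target plausibly matches the exploit."""
    host = asset_db.get(alert["dst"])
    if host is None:
        return "medium"  # unknown host: cannot rule the alert out
    if alert["target_os"] != host["os"]:
        return "low"     # exploit targets an OS the host is not running
    if alert["dst_port"] not in host["services"]:
        return "low"     # the targeted service is not even listening
    return "high"

# A Windows SMB exploit aimed at a Linux host is noise, not signal.
alert = {"dst": "10.0.0.8", "dst_port": 445, "target_os": "windows"}
print(prioritize(alert))  # -> low
```

The same lookup is what makes SIEM correlation worthwhile: an event stream plus asset state separates exploit attempts that matter from those that cannot succeed.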
VA as a stand-alone product offering cannot sustain itself in the market; VA will become commoditized, and the intelligence it gathers will be used to drive more effective security offerings. In the near term the vendors will reposition themselves to offer more security configuration and risk management, but they will not be able to overcome market dynamics, which have relegated VA scanning to the status of a commoditized utility rather than a strategic technology.