The number of published vulnerabilities is increasing, affecting everything from networking devices to large enterprise applications and databases. This isn’t groundbreaking news, but it does raise the question: Why can’t we build more secure, less vulnerable software?
Back in the gold rush of the ’90s, market share was king. Quality was an afterthought, in direct conflict with time-to-market goals, and let’s face it: software quality was really bad. Then all that buggy software started to fall over and the market got angry; quality became an evaluation criterion ranked as high as functionality, which forced software vendors to address the issue. Quality assurance and proper software development finally got the attention they deserved. A huge industry formed, technologies were developed to automate software testing, and eventually large companies such as IBM (Rational), HP (Mercury), and Microsoft were selling suites of automated QA testing tools alongside other software development technologies. QA professionals gained a level of respect they had never had, and some companies even created VP of Quality positions. Software is still buggy, but quality has certainly improved since the mid-’90s.
Around the late ’90s and early 2000s, hackers began taking advantage of poorly coded applications en masse, and then worms and other widespread, fast-acting malware appeared, mostly targeting Microsoft products. The full might of the market’s ire was aimed directly at the company in Redmond; it was not a pleasant time to be responsible for Microsoft security. Since then, Microsoft has done pioneering work in defining secure application development and threat modeling, and in creating a culture that supports security throughout an application’s life cycle. To Microsoft’s credit, it made significant changes in how it develops software, changes that would not have been possible without top management driving security down through the ranks of the organization, itself a direct result of market forces.
Security has to be seen as a quality issue, and the market has to demand it. We remain at the mercy of organizations that are unable to deliver secure software, and unless we make software security an evaluation criterion as important as functionality or quality, the flow of vulnerabilities will only increase. Do not let vendors that are unable to properly implement security best practices and tools in their SDLC put your organization at risk!
Unfortunately, software will never be 100 percent defect-free or 100 percent secure, nor do most organizations have Microsoft’s resources. All organizations, however, can do a better job of limiting the breadth, depth, and frequency of software vulnerabilities: integrating security best practices and tools into the SDLC, running security source-code scanning tools early in the development cycle, scanning externally facing web applications for vulnerabilities, working with security researchers through a formal program, and including security capabilities in the requirements. In short, taking security seriously should be a priority for every organization that develops applications, commercial or otherwise.
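To make the idea of early source-code scanning concrete, here is a deliberately tiny sketch of the kind of pattern check such tools automate, written in Python. The rule names and regular expressions are illustrative assumptions, not any real product’s rule set; commercial and open-source scanners perform far deeper data-flow and taint analysis than simple pattern matching.

```python
import re

# Toy illustration of a security source-code scan run early in the SDLC.
# These rules are illustrative assumptions only; real static-analysis
# tools go well beyond regex matching.
RISKY_PATTERNS = {
    "eval-call": re.compile(r"\beval\s*\("),
    "shell-exec": re.compile(r"\bos\.system\s*\("),
    "hardcoded-password": re.compile(r"password\s*=\s*['\"]"),
}


def scan_source(source: str) -> list:
    """Return (line_number, rule_name) pairs for each risky line found."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for rule, pattern in RISKY_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, rule))
    return findings


if __name__ == "__main__":
    sample = 'password = "hunter2"\nresult = eval(user_input)\n'
    for lineno, rule in scan_source(sample):
        print(f"line {lineno}: {rule}")
```

The point of wiring even a simple check like this into the build is that findings surface while the developer is still in the code, when fixes are cheapest, rather than after deployment when a researcher or attacker finds them first.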