Posted in Politics, Security, tagged Air Force, Center for Strategic and International Studies, CSIS, Cyber Command, cyber security, DC3, digital forensics, DoD, DoD Cyber Crime Center, FBI, National Security Agency, network security, President Obama, SANS Institute, US Cyber Challenge, US Cyber Policy on August 4, 2009 |
As part of the administration's continuing efforts to actually do something tangible to improve the security posture of US critical infrastructure, and to better deal with a severe lack of technical talent, the Center for Strategic and International Studies (CSIS) announced the US Cyber Challenge (here), a program to identify and develop 10,000 cyber security specialists.
One of the fundamental deficiencies of the current US critical infrastructure protection programs (there are many of them) is the astonishing lack of qualified technical security specialists. This program aims to develop the next generation of technically advanced cyber warriors and security specialists.
The United States Cyber Challenge
The US Cyber Challenge is a national talent search and skills development program. Its purpose is to find 10,000 young Americans with the interest and skills to fill the ranks of cyber security practitioners, researchers and warriors. Some will, we hope, become the top guns in cyber security. The program will nurture and develop their skills, and enable them to get access to advanced education and exercises, and where appropriate, enable them to be recognized by employers where their skills can be of the greatest value to their nation.
Improving our private and public sector security posture will be an ongoing process as we adopt new technology innovations and as the dynamic global environment shifts between hostile and friendly actors. Recruiting the next generation of technically advanced security specialists, and developing today the skills to deal with tomorrow's threats, is key to ensuring we have a population of talent to enable the continued growth and prosperity of the United States and its citizens. Like so many times in our history, the hopes of an aging nation rest on the shoulders of America's youth.
Posted in Security, tagged Auditing, BigFix, cloud computing, Gartner, IDS, Intrusion detection, Intrusion prevention, IPS, McAfee, Monitoring, nCircle, network security, Risk, threats, Virtualization, vulnerabilities, Vulnerability Assessment on December 22, 2008 |
Quick thought for the day. Most technologies in the security world move through a predictable cycle of adoption. First an organization implements a solution to gain visibility into the scope of the problem (VA, IDS, DLP/CMF, SIEM). Then, once it becomes apparent that the problem is vast and overwhelming, it moves to operationally implement technical controls to protect the environment and to enforce organizational policies. When this switch-over occurs, the adoption of the pure visibility tools becomes eclipsed by the control tools. This doesn't mean that the visibility tools are ineffective; it generally means that the scope of the problem is understood well enough that an organization can effectively implement controls, and it also means that the problem has successfully moved from the security team to the operations team. You can apply this same logic to any segment of security and to any new technology, including cloud computing, virtualization, and all the little shiny objects in between.
Examples of this movement from visibility to control include intrusion detection, vulnerability assessment, and content monitoring and filtering. Let's look at VA. Its initial use was to determine the scope of the 'exposure' problem, that is, to scan the environment against a database of known vulnerabilities to determine the extent of exposure. Unfortunately, the volume of output was very high and was presented in a format that was not easily consumable or actionable by the IT operations team. What exactly does one expect a server admin to do with 300 pages of vulnerability data? There were also inherent issues of fidelity. The use of VA tools shifted to targeted scans to determine what needed to be patched, which resulted in the operational implementation of patch management technologies, whose market adoption soon overtook that of vulnerability assessment tools. There was also the pressure of auditors looking for the implementation of technical controls, and although vulnerability assessments were viewed as an important first step, without the work-flow and controls to address the volume of vulnerability data they proved less effective in improving operational security than was originally thought.
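The core of that initial "visibility" use of VA can be sketched in a few lines: compare what is deployed against a database of known-vulnerable versions. This is a deliberately simplified illustration — real scanners fingerprint services over the network and consume feeds like the NVD — and the inventory, hosts, and version-to-CVE mappings below are assumptions for the example, not real scan data.

```python
# Minimal sketch of a vulnerability-assessment check: compare an
# inventory of installed software against a database of known-vulnerable
# versions. Hosts, versions, and CVE mappings here are illustrative only.

KNOWN_VULNS = {
    ("openssh", "4.3"): ["CVE-2006-5051"],
    ("apache", "2.2.8"): ["CVE-2008-2364"],
}

def scan(inventory):
    """Return a list of (host, package, version, cve) findings."""
    findings = []
    for host, packages in inventory.items():
        for name, version in packages:
            for cve in KNOWN_VULNS.get((name, version), []):
                findings.append((host, name, version, cve))
    return findings

inventory = {
    "web01": [("apache", "2.2.8"), ("openssh", "5.1")],
    "bastion": [("openssh", "4.3")],
}

for host, name, version, cve in scan(inventory):
    print(f"{host}: {name} {version} vulnerable to {cve}")
```

Even this toy version shows the operational problem described above: the scan produces findings, but nothing in it tells the server admin what to do about them.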
It became clear that vulnerability management needed to cross the chasm to become an operationally actionable tool; without remediation capabilities the organization would always be under a mountain of vulnerabilities, and the use of the technology would linger in the trough of disillusionment. Security configuration management met that need: it allowed an organization to define the desired configuration state of an environment against industry best practices (NIST, DISA, CIS, etc.) and then to operationally implement technical controls to identify non-compliant devices and enforce policy. Security configuration management also had the benefit of providing a common language between the security, audit, and operations teams. I wrote about this in a series of posts (here), (here), and (here).
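The configuration-management approach is also easy to sketch: compare each device's actual settings against a desired baseline and report the deviations. The setting names below are loosely modeled on CIS-style benchmark items, and the devices and values are hypothetical; a real implementation would enforce the baseline, not just report on it.

```python
# Minimal sketch of security configuration management: compare each
# device's actual settings to a desired baseline and report deviations.
# Setting names, devices, and values are hypothetical examples.

BASELINE = {
    "password_min_length": 12,
    "ssh_root_login": "no",
    "firewall_enabled": True,
}

def audit(devices):
    """Return {device: [(setting, actual, expected), ...]} for deviations."""
    report = {}
    for name, config in devices.items():
        deviations = [
            (setting, config.get(setting), expected)
            for setting, expected in BASELINE.items()
            if config.get(setting) != expected
        ]
        if deviations:
            report[name] = deviations
    return report

devices = {
    "db01": {"password_min_length": 8, "ssh_root_login": "no",
             "firewall_enabled": True},
    "web01": {"password_min_length": 12, "ssh_root_login": "no",
              "firewall_enabled": True},
}

for device, deviations in audit(devices).items():
    for setting, actual, expected in deviations:
        print(f"{device}: {setting} = {actual!r}, expected {expected!r}")
```

Unlike the raw vulnerability report, the output here is directly actionable — each line names the device, the offending setting, and the value it should have — which is what lets the work move from the security team to the operations team.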