Operational Security Testing

Competency Model and Job Performance Model Development

David McGuire
Vice Chair, OST Panel
Senior Security Engineer
Veris Group
Introduction

• David McGuire
  ◦ Senior Security Engineer with Veris Group
  ◦ Lead penetration tester for multiple Federal and commercial clients
  ◦ Trainer at the Black Hat Technical Security Conference
  ◦ Vice Chair of the NBISE Operational Security Tester competency
    development panel
  ◦ Previously, senior technical lead for a large DoD Red Team
  ◦ About 12 years of experience in information security, with about 9
    in penetration testing and exploitation
  ◦ Numerous penetration testing related certifications
    (CISSP, CREST CCT, OSCP, CEH, GPEN, GWAPT)

• Opinions expressed here are my own
Introduction


• Veris Group
  ◦ Management and technology services firm with a core focus on information
    security
  ◦ Conducts penetration tests and develops penetration testing programs for
    various Federal agencies
  ◦ Clients include:
    - Carnegie Mellon University (CMU) Software Engineering Institute (SEI)
    - Department of Justice
    - Department of Homeland Security
    - Department of the Treasury
    - Department of Defense
    - Social Security Administration
    - Multiple commercial customers, including Fortune 500 companies
Today's agenda


1. Challenges in the penetration testing field
2. Overview of the NBISE and OST panel
3. How the NBISE and the OST panel are approaching the
   problem differently
4. Current draft OST competency model
5. Future OST panel work
6. Improving competency now
What do these have in common?
What do all of these have in common?

• Relatively unsophisticated penetrations that are conducted on a far
  larger scale than nation-state threats
Security Assessments Don't Match Real-World Threats
[Chart: technical difficulty of attack (vertical axis) plotted against level of effort, cost, and timeframe of assessment (horizontal axis). Vulnerability assessments cover unsophisticated threats at the low end; the largest percentage of threats (hacktivists, most criminal organizations, etc.) sits in the middle; red teaming addresses nation-state level threats at the high end.]
Challenges In The Field



• How we conduct penetration testing, red teaming, or blue
  teaming as an industry is poorly defined
• Therefore, the skills testers need to conduct successful
  assessments are poorly defined
• Official organizations have come to require penetration
  testing (or something similar), but have not attempted to
  define it
• To date, the industry has been unwilling to standardize
  terminology, assessment types, and competency definitions
Challenges In The Field



• A wide range of education and training exists that does not
  conform to any standard methodology
• Methods for assessing real-world competency are inadequate
  ◦ Primarily based on certifications today
• Practitioners in the field still largely eschew organization and
  rigidity
• Practitioners trying to gain experience generally must sign on
  with well-known organizations
Community Wide Efforts for Improving
Competency




• Council for Registered Ethical Security Testers (CREST)
  ◦ Organization designed to provide a robust certification mechanism for
    individuals and companies to guarantee quality assurance
  ◦ Certifications are designed to provide a demonstrable level of
    expertise and have a reputation for being very difficult
  ◦ Currently only active in the U.K.
Community Wide Efforts for Improving
Competency




• Penetration Testing Execution Standard (PTES)
  ◦ A community-driven standard methodology for penetration testing
  ◦ The intent is to provide both penetration testers and consumers with a
    common language and scope
  ◦ Still in its infancy, but has some momentum
  ◦ This course follows as much of the PTES methodology as currently
    possible
Introduction to the NBISE (National Board of Information Security Examiners)

• Mission is to improve the potential and performance of the cyber
  security workforce
• Seeks to develop assessment instruments to reliably predict future
  performance and aptitude for cyber security jobs
• Leadership:
  ◦ Michael Assante, CEO (former CSO of NERC)
  ◦ David Tobey, Director of Research
• Board of Directors:
  ◦ Franklin Reeder, Co-Founder of CIS
  ◦ Alan Paller, Founder of SANS
  ◦ Karen Evans, National Director of U.S. Cyber Challenge
  ◦ James Lewis, Program Director, CSIS
  ◦ Richard Schaeffer, Former Director, NSA IAD
Introduction to the NBISE

• Working to define competency for various job categories in the
  information security field (based on need)
• Works as a catalyst to form panels of experts in the categories
• Categories identified so far:
  ◦ Operational Security Testing
  ◦ Smart Grid
  ◦ Advanced Threat Response
  ◦ Secure Coding
• Provides the framework for the panels to define competency in their
  categories
• Advanced Defender Aptitude and Performance Testing and
  Simulation (ADAPTS) Program
  ◦ Brings the work of the various panels to academic institutions
How is the NBISE Approaching Competency
Differently?

• Context elicitation through vignettes and responsibilities
• Competencies defined at novice, apprentice, journeyman, and
  expert levels for multiple roles (organizational language)
• Creating profiles for technical and operational skills
• By defining assessments at the task level, we create reusable
  libraries to assess multiple roles (see the sketch after this list)
• ADAPTS validation and extension of the job performance model
• Standards based on validated curricula, assessment, and
  simulation libraries
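As a rough illustration of the task-level approach, here is a minimal Python sketch (the task IDs and role profiles are invented for illustration, not the panel's actual data) showing how assessment items defined once per task can be reused to assemble assessments for several OST roles.

```python
# Illustrative sketch only: hypothetical task IDs and role profiles, not the
# OST panel's actual data. The point is that task-level assessment items are
# defined once and reused across the role profiles that reference them.

# Reusable library of task-level assessment items, keyed by task ID.
TASK_LIBRARY = {
    "recon-01": "Identify targets for potential exploitation",
    "exploit-03": "Establish control of Windows hosts",
    "postex-02": "Survey the post-exploitation environment",
    "report-01": "Explain results of attacks to clients",
}

# Role profiles only reference task IDs; the same item can appear in
# several roles without being redefined.
ROLE_PROFILES = {
    "Penetration Tester": ["recon-01", "exploit-03", "report-01"],
    "Red Teamer": ["recon-01", "exploit-03", "postex-02"],
    "Blue Teamer": ["postex-02", "report-01"],
}


def build_assessment(role: str) -> list[str]:
    """Assemble the assessment items for a role from the shared task library."""
    return [TASK_LIBRARY[task_id] for task_id in ROLE_PROFILES[role]]


if __name__ == "__main__":
    for role in ROLE_PROFILES:
        print(f"{role}: {build_assessment(role)}")
```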
How is the NBISE assessing competency
differently?


• Assessing achievement vs. aptitude
  ◦ Objective: categorizing vs. comparing/ranking
  ◦ Precision: related to skill vs. a measure of skill
  ◦ Validity: content vs. predictive
  ◦ Inference: past vs. future performance
How is the NBISE assessing competency
differently?


• Knowledge exams vs. constructed-response exams
  ◦ Good enough (safe) vs. mastery (beyond the norm)
  ◦ Pass (cut-off score) vs. profile (heat map); a sketch follows this list
  ◦ Expertise: experience vs. consistently high performance
  ◦ Raise the bar separating people vs. raise the bar of overall
    competency
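To make the contrast concrete, here is a minimal sketch with hypothetical task categories and scores (not NBISE data): the same results reduced to a single pass/fail cut-off versus kept as a per-category profile.

```python
# Illustrative sketch only: hypothetical task categories and scores.
# Contrasts a single pass/fail cut-off with a per-category skill profile
# (the "heat map" view), which keeps strengths and gaps visible.

scores = {
    "Manage the project": 0.82,
    "Identify critical vulnerabilities": 0.64,
    "Penetrate targets": 0.91,
    "Exploit vulnerabilities": 0.58,
    "Communicate impact": 0.73,
}

CUTOFF = 0.70  # the traditional exam collapses everything to one number

overall = sum(scores.values()) / len(scores)
verdict = "PASS" if overall >= CUTOFF else "FAIL"
print(f"Pass/fail view: overall={overall:.2f} -> {verdict}")

print("Profile view:")
for category, score in scores.items():
    band = "strong" if score >= 0.80 else "adequate" if score >= CUTOFF else "gap"
    print(f"  {category:<36} {score:.2f}  {band}")
```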
Model of Defining Competency?


[Diagram: competency plotted along three dimensions. Knowledge (understanding of strategy or procedure) runs from shallow to deep, ability (transfer across domains) from narrow to broad, and skill (consistency of performance) from inconsistent to consistent. Markers show where a novice, apprentice, journeyman, and master fall along these dimensions.]
What is the OST Panel?


• Formed to establish competency models for Operational Security
  Testers at various experience levels
• Categories that fall under Operational Security Testing include:
  ◦ Penetration Testers
  ◦ Red Teamers
  ◦ Blue Teamers
• The end goals of the panel are to:
  ◦ Provide trainers and educational institutions a set of standards to train
    against
  ◦ Design a set of assessments to measure how individuals match up
    against the competency models
Operational Security Testing (OST)




• Identified 126 technical tasks across the three categories
Competency Model Development



Overview of Methodology
Participation Rates So Far

• 68 total responses to date
• 31 interested in volunteering
• 17 respondents have experience working in commercial
  pen testing and red/blue teams

          Experience base          Respondents
          Commercial Pen Testing   31
          Red Team                 31
          Blue Team                26
Current Draft OST Competency Model
Constructs (Goal & Task Categories)

• Manage the project
  ◦ Develop project plan
  ◦ Monitor project plan
• Identify the critical vulnerabilities
  ◦ Identify critical vulnerabilities
  ◦ Analyze and map critical vulnerabilities
  ◦ Develop and execute mitigation strategy
• Penetrate targets
  ◦ Identify targets to penetrate
  ◦ Analyze targets to penetrate
  ◦ Develop and execute penetration strategy
• Exploit vulnerabilities
  ◦ Infrastructure
  ◦ Web and applications
  ◦ Other
  ◦ Perform tasks in a safe and lawful fashion
• Mitigate vulnerabilities
  ◦ Identify resources
  ◦ Plan and document actions
  ◦ Communicate actions
• Understand and demonstrate impact
  ◦ Specify target-specific impact
  ◦ Determine implications/plan response
  ◦ Communicate impact
• Educate team and clients
  ◦ Educate team
  ◦ Educate clients
OST Critical Differentiation Analysis

[Matrix: tasks sorted into a two-by-two grid by task criticality (low/high) and task differentiation (low/high), with the top five tasks listed in each quadrant.]
Examples of High Criticality / High Differentiation Tasks

• Develop a mission/attack plan to assess the security posture of a client network
• Explain results of attacks to clients
• Recognize when tools provide inaccurate data
• Identify targets for potential exploitation
• Establish control of Windows hosts
• Identify major client attack targets and assets
• Identify specific vulnerabilities on specific hosts (see the sketch after this list)
• Understand and attack standard data protection mechanisms
• Survey the post-exploitation environment to develop situational awareness
• Demonstrate robust post-exploitation capabilities
• Analyze data on compromised machines for strategic value
• Educate others and translate cyber risk to operational risk
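For a sense of the low-level building blocks behind a task like "identify specific vulnerabilities on specific hosts", here is a deliberately minimal, stdlib-only Python sketch that checks a few common TCP ports and records any service banner. The target host and port list are placeholders, it is nowhere near real assessment tooling, and it should only be run against systems you are authorized to test.

```python
# Minimal illustrative sketch, not real assessment tooling: a stdlib-only TCP
# service survey that connects to a few common ports and records any banner.
# The host and port list are placeholders; only run this against systems you
# are authorized to test.
import socket

COMMON_PORTS = [21, 22, 25, 80, 443, 445, 3389]  # placeholder port list


def survey_host(host: str, ports=COMMON_PORTS, timeout=2.0) -> dict[int, str]:
    """Return a mapping of open port -> banner ('' if the service stayed quiet)."""
    results: dict[int, str] = {}
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout) as sock:
                sock.settimeout(timeout)
                try:
                    banner = sock.recv(256).decode(errors="replace").strip()
                except socket.timeout:
                    banner = ""  # port is open but the service did not speak first
                results[port] = banner
        except OSError:
            continue  # closed, filtered, or unreachable
    return results


if __name__ == "__main__":
    target = "scanme.example.org"  # placeholder; substitute an authorized target
    for port, banner in survey_host(target).items():
        print(f"{port}/tcp open  {banner}")
```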
Causal model of job performance



[Diagram: causal model of job performance, positioning apprentices, journeymen, and experts, with differentiation in task performance separating experts from the other levels.]
Workforce Development Framework


[Diagram: workforce development framework linking a Competency Model, Content Model, Object Model, and Development Model. A SCORM/CMI-5 XML library ties the models to assessment objects (TestLets), training objects (CourseLets), and simulation objects (SimLets). Adapted from Ostyn (2005), "Competency Data for Training Automation".]
Future of Utilizing the Competency Models

• Modular assessment or test design
  ◦ Adaptive, goal-driven test sequences assess aptitude, knowledge, skill,
    and ability to perform a specific task
  ◦ Validated pools of test scenarios/questions for constructing
    randomized, equivalent test forms (see the sketch after this list)

• Modular simulation exercises
  ◦ Validated simulation routines for modeling, practicing, and assessing
    specific skilled behavior

• Modular curriculum design
  ◦ Validated content packages for accomplishing specific learning
    objectives
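As an illustration of drawing randomized but equivalent test forms from a validated pool, the sketch below (hypothetical items and quotas, not an NBISE instrument) samples the same number of items per task category for every form, so forms differ in content but match in coverage.

```python
# Illustrative sketch only: hypothetical item pool and per-category quotas.
# Each form draws the same number of items per task category from the
# validated pool, so forms are randomized but equivalent in coverage.
import random

# Validated pool: each item is tagged with the task category it assesses.
ITEM_POOL = {
    "reconnaissance": ["R1", "R2", "R3", "R4"],
    "exploitation": ["E1", "E2", "E3", "E4", "E5"],
    "post-exploitation": ["P1", "P2", "P3"],
    "reporting": ["C1", "C2", "C3"],
}

QUOTA_PER_CATEGORY = 2  # every form gets the same coverage per category


def build_form(seed: int) -> list[str]:
    """Build one randomized test form; equal quotas keep forms equivalent."""
    rng = random.Random(seed)  # seeded so a given form can be reproduced
    form: list[str] = []
    for items in ITEM_POOL.values():
        form.extend(rng.sample(items, QUOTA_PER_CATEGORY))
    return form


if __name__ == "__main__":
    for seed in (1, 2, 3):
        print(f"Form {seed}: {build_form(seed)}")
```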
Potential Performance Realization Framework



[Diagram: potential performance realization framework. Organization-focused products (team challenge modules, collaborative learning modules, scenario modules, team profiles, team scoring, performance evaluation) and individual-focused products (test modules, exercise modules, problem-based learning modules, skill profiles, individual development plans) both draw on a shared core of job definition and job analysis, the competency model, use and misuse cases, best practices, curriculum syllabus, scored exams, exploratory factor analysis, and critical incident analysis, feeding proficiency and performance ratings, proficiency and performance exams, and ultimately performance prediction.]
NBISE IA Curriculum Development Program


• Participate in research teams producing Competency Models, Content
  Models, Object Models, and Development Models as a fellow of NBISE's
  Information Assurance Research Society
• Pilot testing of simulations and content tailored to the IA
  curriculum
• Pilot testing of assessments for students at entry, at testing points, and at
  completion of courses and the overall program, providing aptitude, skill profile,
  and personal development plan recommendations
• Pilot testing of proficiency-based and performance-based assessments for
  student career planning and overall program assessment
What Can You Do Today to Improve
Competency?

• Educate yourself in supporting technologies:
  ◦ Computer science
  ◦ Networking
  ◦ Operating systems

• Take training in ethical hacking / penetration testing courses
  ◦ Understand that the content reflects the author's perspective; many other
    perspectives are out there

• Follow prominent testers and read as many books as possible
• If you have the experience, help the community better define
  the tasks and skills that make up the OST field
Questions?


Contact:
David McGuire
Vice Chair, NBISE OST Panel
Veris Group
dmcguire@verisgroup.com