The CATS Hackathon: Creating and Refining Test Items for Cybersecurity Concept Inventories

For two days in February 2018, 17 cybersecurity educators and professionals from government and industry met in a "hackathon" to refine existing draft multiple-choice test items, and to create new ones, for a Cybersecurity Concept Inventory (CCI) and a Cybersecurity Curriculum Assessment (CCA) being developed as part of the Cybersecurity Assessment Tools (CATS) Project. We report on the results of the CATS Hackathon, discussing the methods we used to develop test items, tracing the evolution of a sample test item through this process, and offering suggestions to others who may wish to organize similar hackathons. Each test item comprises a scenario, a question stem, and five answer choices. During the Hackathon, participants organized into teams to (1) generate new scenarios and question stems, (2) extend CCI items into CCA items and generate new answer choices for the new scenarios and stems, and (3) review and refine draft CCA test items. The CATS Project provides rigorous, evidence-based instruments for assessing and evaluating educational practices; these instruments can help identify pedagogies and content that are effective in teaching cybersecurity. The CCI measures how well students understand basic concepts in cybersecurity---especially adversarial thinking---after a first course in the field. The CCA measures how well students understand core concepts after completing a full cybersecurity curriculum.



Related Research

  • 04/10/2020: Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity. "We reflect on our ongoing journey in the educational Cybersecurity Asses..."
  • 11/22/2020: Development of Rubrics for Capstone Project Courses: Perspectives from Teachers and Students. "This study attempted to develop fair, relevant, and content-valid assess..."
  • 09/10/2019: Investigating Crowdsourcing to Generate Distractors for Multiple-Choice Assessments. "We present and analyze results from a pilot study that explores how crow..."
  • 12/20/2018: Kappa Learning: A New Method for Measuring Similarity Between Educational Items Using Performance Data. "Sequencing items in adaptive learning systems typically relies on a larg..."
  • 11/20/2021: Exploring Language Patterns in a Medical Licensure Exam Item Bank. "This study examines the use of natural language processing (NLP) models ..."
  • 06/22/2021: Face Identification Proficiency Test Designed Using Item Response Theory. "Measures of face identification proficiency are essential to ensure accu..."
  • 04/06/2022: Do They Accept or Resist Cybersecurity Measures? Development and Validation of the 13-Item Security Attitude Inventory (SA-13). "We present SA-13, the 13-item Security Attitude inventory. We develop an..."
