Integrating Hackathons into an Online Cybersecurity Course

Cybersecurity educators have widely introduced hackathons to facilitate the acquisition of practical knowledge in cybersecurity education. Introducing such events into cybersecurity courses can provide valuable learning experiences for students. The nature of the hackathon format encourages a learning-by-doing approach, and the hackathon outcomes can serve as evidence of students' knowledge, capability and learning gains. Prior work on hackathons in education has mainly focused on collocated hackathon events in the traditional classroom setting. These hackathons often took place as a one-off event at the end of the course. However, one-off hackathon events at the end of a course might not be sufficient to improve learning. Instead, we focus on analyzing the integration of a series of online hackathon events into an online cybersecurity course and explore how this integration can address issues of online education by encouraging collaboration and developing a practical understanding of the delivered course through solving real-world challenges. We evaluate interventions to foster learning and analyze their effect on collaboration and learning gains for students in the course. Our findings indicate that students attribute learning benefits to the introduced interventions, which supported teamwork and collaboration, maintained student participation and interest in the course, and encouraged learning-by-doing.


1. Introduction

Cybersecurity education has seen increased interest in literature and practice because of the alarming rate of cybersecurity breaches due to existing security vulnerabilities (fowler2016data; key2017automotive), the reported lack of security experts or shortage of cybersecurity workers to defend against cyber attacks (boopathi2015learning; weiss2015teaching) and the increase of information and communication technologies in everyday use (venter2019cyber). These factors point to a need to improve existing cybersecurity curricula to foster practical cybersecurity knowledge. To that end, researchers have proposed expanding the practical aspect of cybersecurity education. One approach that has gained momentum in this regard is educational hackathons organized to support cybersecurity education (weiss2015teaching; kharchenko2016university; boopathi2015learning). Sadovykh et al. (sadovykh2019hackathons) and Steglich et al. (steglich2020hackathons) report how adding hackathons to an educational curriculum fosters student familiarity with different technologies and supports students in adopting problem-solving practices. Hackathons are time-bounded events during which participants with diverse backgrounds form teams and work on projects of interest to them (pe2019designing). Hackathons began as competitive coding events and have since proliferated into various domains, including corporations (moe2021improving; pe2020corporate; komssi2015hackathons), entrepreneurship (medina2021supporting; richter2021digital; bubbar2019promoting), civic engagement (hartmann2018innovation; lodato2016issue; henderson2015getting), communities (nolte2020support; pe2019understanding; huppenkothen2018hack) and others. Educational settings, in particular, have been found to benefit from the introduction of hackathons because they encourage students to practice the concepts learned in the classroom (gama2018hackathons; porras2019code).
Consequently, educators have adopted hackathons in traditional education settings related to computer science, software engineering, and STEM (Seidametova2022hack).

Current works mainly focus on one-off events at the end of the course. However, the learning process cannot reach its full potential, especially in online education, unless students are encouraged to frequently practice what they learn (dhawan2020online). Introducing a one-off opportunity at the end of the course to practice what they learned may not be effective. For students to excel in their respective courses, they need to be cognitively and socially engaged throughout their course activities, since learning is a social and cognitive process (tinto2011taking). A way to achieve this is through collaborative problem solving and active learning approaches, integrated throughout the course, to engage students cognitively in the coursework, develop their learning skills, and enhance academic achievement and engagement (unal2021effect). Thus, we can carefully design and introduce hackathons at multiple points in the course to provide the intended benefits of improved student interest and engagement, increased interaction between instructor and student, and the opportunity to practice or apply the theoretical concepts taught. Our proposed hackathon design covers actions employed to address issues with the learning process of an online format by introducing interventions.

Additionally, prior works have focused on in-person hackathon events in the context of traditional courses (la2017engineering; tandon2021using). The COVID-19 pandemic has forced education online, and as a result, online instruction and collaboration are gaining popularity to support the learning and innovation process for students (dhawan2020online); but challenges exist. Such challenges are associated with current technological constraints (download errors, installation issues, audio and video problems, etc.), reduced student interest and engagement in online study, a lack of a sense of belonging and connectedness, distractions at home, and time management challenges for students (dhawan2020online; gama2021online). Additionally, an online course can be entirely theoretical, which does not prompt students to practice alongside the instructor, leading to inadequate two-way interaction between instructor and student that discourages an effective learning process.

In this paper, we thus investigate:

How can different interventions at online educational hackathons influence learning about cybersecurity?

Building on prior work on hackathon interventions that foster learning in cybersecurity (abasi2020developing), we explore these hackathon interventions in an online-led cybersecurity course. Specifically, the questions guiding this research are: RQ1: How do the interventions contribute to teamwork? RQ2: How do the interventions contribute to learning? We introduced and integrated a series of online educational hackathons throughout an online-led cybersecurity course to answer these research questions. We designed hackathon interventions to support teamwork and collaboration, maintain student participation and interest in the course, and encourage learning-by-doing within the online context. Our findings indicate that the hackathon interventions contributed to learning gains, fostered teamwork, helped students maintain interest in the course topic and supported participation.

We organized the rest of our paper as follows: First, we explore the research gap and hackathon design aspects for learning in Section 2, then explain the setting of the hackathon format and the introduced interventions in Section 3. In Section 4, we analyze the students' perception of the contribution of the interventions to teamwork and collaboration (RQ1) during these events. Lastly, we discuss how the hackathon interventions contribute to student learning (RQ2) in Section 5.

2. Background

In this section, we discuss hackathons in cybersecurity education in Section 2.1, the design of the hackathon interventions introduced in our research in Section 2.2 and prior work on educational hackathons in Section 2.3.

2.1. Hackathons and Cybersecurity Education

Hackathons have been widely introduced to facilitate training and cybersecurity awareness within the computer science and software engineering communities (kharchenko2016university; boopathi2015learning; weiss2015teaching; foley2018science). However, hackathons in cybersecurity typically focus on the investigation of a topic, tool or technology within the cybersecurity domain and the development of research outcomes, security artefacts and prototypes  (kharchenko2016university; boopathi2015learning; weiss2015teaching; foley2018science).

Kharchenko et al. (kharchenko2016university) presented a series of hackathons to encourage cybersecurity research, development and university-industry cooperation. The first hackathon covered presentations and brainstorming to develop a platform for testing the cybersecurity features of industrial programmable logic controllers (PLCs), forming the basis for future hackathons. Subsequent hackathons covered cybersecurity training, idea generation for securing field-programmable gate array (FPGA)-based PLCs, startup development for security testing services, and further actions to support research at the university. Foley et al. (foley2018science) discussed the outcome of a hackathon organized for researchers to secure cyber-physical systems (CPS) using transverse use-cases on shared CPS testbed platforms. Researchers were able to explain ongoing research in securing CPS using the testbeds as a case, learn and teach each other about the technology and their research, and then develop prototypes. Lastly, Weiss et al. (weiss2015teaching) reported the design and experience of using an interactive cybersecurity scenarios framework to teach security analysis and support the computer science curriculum. The paper proposed scenarios to be used by educators to nurture the development of security analysis skills in students to complement theoretical security concepts and specific software tools taught within the school’s educational curriculum. Though linked to universities and an educational curriculum, these papers (foley2018science; kharchenko2016university; weiss2015teaching) were not designed to support a specific cybersecurity course.

But similar to our proposal, Boopathi et al. (boopathi2015learning) introduced a training course to learn about cybersecurity using a format that introduces a hackathon much like a capture-the-flag (CTF) competition. Boopathi et al. presented the application of this training format to support graduate and undergraduate level curricula. Here, learning happens in three rounds representing the three aspects of the training format. The learning round introduced cybersecurity concepts. The jeopardy round tested the participants' knowledge by administering questions to solve as assignments. The interactive round tested the gained knowledge in a real-world scenario through gamification. Boopathi et al. thus introduced a hackathon as part of the third round – but only as a one-off event. Instead, we propose the integration of hackathons into an online cybersecurity course, not as a one-off event but as a series of events introduced at strategic points of the course to provide ample opportunity for students to learn by doing.

2.2. Hackathon design aspects for learning

Online-led instruction and course delivery can introduce challenges in student engagement, collaboration, and even learning within the course (tinto2011taking). Thus, online educational hackathons should be designed to promote active, experiential learning through dealing with real-world problems while increasing student interest and engagement within the course (horton2018project).

Hackathon interventions can serve to promote engagement, innovation, teamwork, and problem-solving (rennick2018engineering; abasi2020developing). Based on a prior study on developing hackathon interventions to foster learning (abasi2020developing), we can outline the following design aspects in our research that inform interventions suitable for our case. At a typical hackathon event, organizers tend to devote the early part of any hackathon to problem-set development and planning (stoyanov2007effect). Working on a problem set can require single or group participation, with each participant having the necessary skill or experience to solve the posed problem. Thus, groups or teams of two or more participants contribute to enhancing learning through working together to solve the developed problem, complete needed tasks and learn new concepts (abasi2020developing). However, working in teams can pose challenges, such as the clarity of the team goal, effectiveness of the team process, and level of team participation. To alleviate these side effects, educators should introduce team management interventions. Team management interventions improve teamwork and encourage learning by enhancing collaborative team power.

Problem-solving, even in hackathons, requires that students apply known concepts to develop suitable solutions to set problems (stoyanov2007effect) and the opportunity to acquire new information relevant to carrying out a specific project. Thus, domain-specific knowledge encourages learning by problem-solving using the hackathon format. This intervention in the educational setting is strongly supported as courses readily provide domain knowledge through lectures and related lecture materials.

Lastly, lectures and lecture materials provide passive learning, where students are not required to be actively involved. Thus, introducing an opportunity for feedback encourages an environment of shared inquiry between students and educators to foster learning and creativity and to enhance active and collaborative learning activities (phillips2005strategies). Feedback given by educators acting as mentors supports students in finding their own solutions, and findings in the context of educational mentoring attribute increased self-confidence to a positive feedback experience (nolte2020support).

2.3. Related Works

Studies have presented results or experiences concerning educational hackathons. For example, Tandon et al. (tandon2021using) and Mtsweni et al. (mtsweni2015stimulating) investigated how educational hackathons can increase student interest levels. Tandon et al. explored how educational hackathons can increase interest in STEM education and showed positive results in growing participant interest and improving participant knowledge levels (tandon2021using). The research suggested benefits from providing an unintimidating environment in which students learn by doing and have the collective creative liberty to determine solutions to problems, an opportunity not usually present in the classroom. However, this study focuses on collocated hackathons in a traditional classroom course and does not account for the case of online education and collaboration. Mtsweni et al. also presented findings showing that the hackathon approach can stimulate and maintain students' interest in computer science in a distance teaching mode. The key elements of the proposed hackathon approach covered collaboration, networking, mentoring, hands-on engagement projects, and community involvement (mtsweni2015stimulating). The authors conducted hackathons in a hybrid format (physical or online); however, they did not introduce hackathons into a course curriculum.

Hackathons should also foster a rapid learning process for educational courses. La Place et al. (la2017engineering) showed how hackathons foster rapid learning-by-doing for engineering students. The research work documented the perceived learning process of the participants and the actions of their learning methods to determine how hackathons can help improve project-based learning courses in engineering to bridge the gap between self-directed active learning and formal learning. According to La Place et al. (la2017engineering), hackathon attributes to improve learning-by-doing include creative freedom and motivation for problem-solving, team collaboration opportunities, proper team management to achieve the team goals and the constraint of time to complete the hackathon tasks. Although the research discusses valuable attributes of the hackathon format that was beneficial to improve rapid learning, it also focuses on collocated hackathons in a traditional classroom course.

Recently, some research works have explored remote educational hackathons. For example, Steglich et al. (steglich2021online) explored how intense collaboration takes place between student teams in an educational hackathon to produce a technological solution and develop professional skills in the online context. The study presents findings that the most valuable skills enabling effective student teams were communication, collaboration, initiative, and creativity/innovation (steglich2021online). However, this study did not account for learning gains in the educational hackathon but rather for professional skills and collaboration. Gama et al. (gama2021online) also presented an experience report on how an online educational hackathon can be used as a resource to engage students in the development of their semester project. The paper describes the students' perception of the hackathon approach used and how it helped to create an intense collaborative experience while giving the sense of being virtually collocated (gama2021online). However, the authors introduced hackathon interventions at the end of the course and did not focus on increasing engagement or learning during the course instruction. Lastly, Goodman et al. (goodman2020learn) proposed a framework for online educational hackathons by generalizing two frameworks for hackathons and CTFs, respectively. The approach focused on three main stages, Learn, Apply, and Reinforce/Share, where educators teach students or allow students to self-learn new material, apply the gained knowledge and skills to tackle a specific problem (self-defined or introduced by educators), and share/present the work done (individually or in teams) at the end of the hackathon. However, the work lacked details or concrete usage of this framework in a course setting.

Our work thus differs from prior studies in that we integrated multiple online hackathon events into a cybersecurity course instead of holding a typical one-off hackathon event at the end of the course. Additionally, we evaluate the students' perception of the hackathon format and the designed interventions integrated into the course, the progression of team aspects during these hackathon events, and how these contribute to students' learning gains.

3. Empirical Method

To answer our research questions (RQ1, RQ2), we conducted an action research study (lewin1946action), applying the proposed interventions at a series of hackathon events integrated into an educational course to foster learning. We conducted the study over six (6) months, spanning three (3) course-integrated hackathon events. We describe the proposed interventions supporting the hackathon events in Section 3.1, the course design and timeline of the hackathon events in Section 3.2, our data collection activities in Section 3.3 and our analysis procedures in Section 3.4.

3.1. Proposed Interventions

Based on prior work (sadovykh2019hackathons; abasi2020developing), we developed and introduced three interventions to stimulate collaborative problem solving and encourage learning during the hackathon events – the lecture, feedback and team management plan interventions. We based these interventions on the design aspects previously discussed in Section 2.2. They are in line with the framework for online educational hackathons proposed by Goodman et al. (goodman2020learn).

First, as part of the cybersecurity course, we provided lecture materials suitable for online instruction. This involved preparing video lectures that students could watch online at their convenience and introducing discussion sessions and lecture live streams to help students understand the lecture materials before the hackathon events. We introduced the lecture intervention to enable students to learn about basic concepts and techniques and to inspire students to reflect on their selected use-case and how the security concepts introduced may apply to it. The lecture intervention also provided students with the base knowledge needed to attempt the tasks during the hackathon event.

We organized ample team interaction with course instructors through online feedback sessions during each hackathon event and feedback on write-ups submitted after each hackathon event. We introduced the feedback intervention sessions to provide students with the opportunity to receive expert feedback and answers to questions regarding the hackathon tasks. Students were also required to submit write-ups at the end of each hackathon, and the course instructors provided feedback on them. Since we designed closely related hackathon events, we expected that the feedback provided on the previous hackathon's tasks would benefit the upcoming hackathon.

Lastly, as the students formed teams for the hackathon events, we prepared the team management plan intervention documents to help teams plan tasks, work together and complete tasks more efficiently. The team management plan aimed to help students document task assignments, assign responsibilities to the hackathon tasks, and specify deadlines for completing the hackathon tasks. The team management plan also aimed to support cooperative working relationships, task organization and assignment, or team leadership.

Figure 1. Timeline of activities

3.2. Setting

This section introduces the course design and how we integrated hackathon events into the course. Based on the hackathon planning kit (nolte2020organize), we set up the core components of the hackathon events.

3.2.1. Cybersecurity Course Design

We introduced our hackathon format as part of a cybersecurity course that teaches students the principles of secure software design from a security risk-aware perspective. The main goal of the course was the security risk management of software systems. The course covered how to ensure the security of software system assets, security requirements engineering and modelling, and understanding major security controls, like role-based access control and cryptography, fundamental to secure software design. As a learning outcome of completing the course, students should confidently identify causes and consequences of (a lack of) system and software security, master techniques to address system and software security problems, and elicit and justify the introduction and application of security requirements and controls.

There are three (3) important aspects of ensuring secure software system design concerning security risk management and following the security risk management method used in the course: (i) asset-related concepts, (ii) risk-related concepts, (iii) risk-treatment related concepts. These major aspects led to the context of each of our hackathon events. To promote active learning and achieve the course’s learning outcome, we proposed hackathon events covering the three (3) aspects of ensuring risk-aware secure software design. This included hackathons focusing on the asset-related, risk-related and risk treatment-related concepts taught in the lectures. The proposed format enabled us to easily introduce and integrate hackathon events to allow students to understand the theoretical knowledge and apply it in real-world cases.

3.2.2. Timeline of events

This section covers the timeline of the major events throughout the hackathon-integrated course, as outlined in Fig. 1.

Before starting the course, we prepared two use-cases on which the students would run a security risk analysis at the hackathons, using knowledge gained during the course. The use-cases covered two (2) software-intensive systems – a Bike Sharing System and an Autonomous Vehicle Parking System – reflecting real-world scenarios to guide the students' learning process. The Bike Sharing System provides bike-sharing services to its users through its major components: Smart Bike (SB), Bike Share Website (BWA), and the Bike Mobile Application (BMA). The Autonomous Vehicle Parking System provides parking services to its users through its major components: Autonomous Vehicle (AV), Parking Service Provider (PSP), and Parking Lot Terminal (PLT). We prepared use-case information, including UML diagrams showing the interactions between system components and textual descriptions explaining the system functions in as much detail as possible. We also validated the gathered use-case information with the system stakeholders to confirm that it reflects the real-world software system (to a reasonable degree). We also prepared the proposed interventions as discussed in Section 3.1.

Once the course began (see Fig. 1), we formally introduced the course and its learning outcomes, the course design and the hackathon format to the students. We requested the students to form teams of three (3) or four (4) members and select a preferred use-case for the team. The students freely formed teams and selected their preferred use-case to be analyzed during the hackathon events. Additionally, we provided the first round of lectures introducing the analysis of asset-related concepts of software systems.

We organized the first hackathon event (Hackathon 1 in  Fig. 1) about two (2) weeks after the start of the course. We provided the hackathon tasks for students to analyze the asset-related concepts of the chosen use-case with knowledge provided by the lecture intervention – lectures and lecture resources offered. The students completed the tasks in teams and submitted a security asset analysis document as their hackathon task report. The students also participated in presentation sessions to discuss the outcome of their hackathon tasks. We provided feedback to the students concerning the presented work and the submitted report after the hackathon event and offered feedback on an ad-hoc basis based on requests by the students or teams.

We organized the second hackathon event (Hackathon 2 in Fig. 1) about a month after the first hackathon event. Before the second hackathon, we provided the second round of lectures, which covered analyzing the risk-related concepts of software systems. We then provided hackathon tasks for the students to analyze the risk-related concepts of their chosen use-case, building on their asset analysis from the first hackathon event. We introduced the team management plan intervention at this event to support teamwork and collaboration and to help the teams achieve their hackathon tasks more efficiently. The students also participated in online feedback (intervention) sessions with the educators to discuss progress or challenges with their hackathon tasks. At the end of the hackathon event, we asked the teams to submit their team management plan document alongside their task report, documenting their completed risk-related analysis, as an output of the hackathon event. We provided additional feedback during presentation sessions and written feedback on the task report submitted after the hackathon event.

We organized the third and final hackathon event (Hackathon 3 in Fig. 1) about a month after the second hackathon event. Before this event, we provided the third round of lectures, which covered analyzing the risk treatment-related concepts of software systems and emphasized the modelling of role-based access controls. The hackathon tasks for the third event thus covered an analysis of the risk treatment-related concepts of the chosen use-case, building on the risk analysis from the second hackathon event. We provided multiple online feedback sessions to discuss progress or challenges with the hackathon tasks. The students completed and submitted a cumulative task report covering the hackathon tasks from all three (3) hackathon events, representing a security analysis of the use-case they selected (Bike Sharing System or Autonomous Vehicle Parking System). As an output of the hackathon tasks, the teams provided in the report a proposal for a secure software design following the asset-related, risk-related, and risk treatment-related analysis of the selected use-case. This report counted toward the students' grade at the end of the course.

3.3. Data collection

After each hackathon event, we conducted a post-hackathon questionnaire using pre-existing instruments that we adapted for our study. Table 4 in Appendix A shows the scales we utilized for our questionnaire instruments. We collected student perception of team familiarity only at the first hackathon event, because continual collection was unnecessary for this data point as each member remained in the same team for all hackathon events. We collected student perception of learning only at the final hackathon event, allowing the students to respond retrospectively to the hackathon events and the course. The questionnaire also covered the students' perception of the usefulness of and satisfaction with the three interventions and their contribution to team properties such as team familiarity, goal clarity, team efficiency, and team collaboration, thus answering RQ1. We also collected data on the students' perception of the learning achieved to answer RQ2. In addition, we asked open-ended questions in the questionnaire to provide more contextual information about the students' perception of the team experience and to evaluate how the different interventions benefited the teamwork and the learning process (RQ1, RQ2).

3.4. Analysis procedure

After the cybersecurity course was completed, we analyzed the data collected from each hackathon event questionnaire and qualitatively analyzed the open-ended questions to support arguments and provide potential explanations for our analysis of the questionnaire scales, answering research questions RQ1 and RQ2. We selected responses from six (6) teams for our data analysis based on team size (between three (3) and four (4) members), course grade outcome, and which teams provided the most complete responses to the questionnaires. We selected and grouped the highest-, middle-, and lowest-scoring teams by course grade outcome. Our selection resulted in three (3) major grade groups with two (2) teams per group. The selected team characteristics are summarized in Table 1.

Grade selection Teams Participants
High grade Team A A01, A02, A03, A04
Team B B01, B02, B03, B04
Middle grade Team C C01, C02, C03, C04
Team D D01, D02, D03, D04
Low grade Team E E01, E02, E03, E04
Team F F01, F02, F03
Table 1. Team characteristics

3.4.1. Pre-processing

We first prepared the collected data, replacing the 5-point scale answers with corresponding numbers from 1 to 5, and accounted for negatively worded scales by reversing them accordingly. The questions were also coded for easy reference and use in further preparation stages and analysis.
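As an illustration, this pre-processing step can be sketched as follows. The label-to-number mapping and the function name are our own assumptions for illustration, not taken from the study's materials:

```python
# Hedged sketch of the pre-processing step: mapping 5-point Likert labels
# to the numbers 1..5 and reverse-scoring negatively worded items.
# The label set and function name are illustrative assumptions.

LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def encode(answer: str, negative: bool = False) -> int:
    """Map a Likert label to 1..5; flip the scale for negatively worded items."""
    score = LIKERT[answer]
    return 6 - score if negative else score
```

Reverse-scoring (6 minus the raw score on a 5-point scale) keeps all items pointing in the same direction before they are summed into a scale.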

Figure 2. Data points collected for analysis from selected teams.

3.4.2. Reliability

We tested the reliability of the variables using Cronbach’s Alpha. We created summated scales by calculating a composite score from each set of Likert-type items and gave each a human-readable name; for example, we named the summated scale of questions measuring how well the student participants knew each other before the hackathon activities the Team familiarity scale. We use these summated scales, rather than individual question items, for further analysis, as Cronbach’s alpha does not provide reliability estimates for single items (gliem2003calculating). We present the reliability results for the selected sample in Table 2. All Cronbach’s Alphas were higher than the commonly recommended value of 0.70, which corroborates reliability.
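The composite scoring and reliability check can be sketched in a few lines. This is a minimal illustration with toy data, not the study's actual analysis pipeline:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of per-item response lists, all covering the same
    respondents in the same order.
    """
    k = len(items)
    respondents = list(zip(*items))  # rows = respondents
    item_vars = [statistics.variance(i) for i in items]
    total_var = statistics.variance([sum(r) for r in respondents])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def summated_scale(items):
    """Composite score per respondent: mean of that respondent's item scores."""
    return [statistics.mean(r) for r in zip(*items)]

# Toy data: three items of one scale, four respondents (scores 1-5).
items = [[4, 5, 3, 4],
         [4, 4, 3, 5],
         [5, 5, 2, 4]]

alpha = cronbach_alpha(items)       # ~0.82 for this toy data
composites = summated_scale(items)  # one composite score per respondent
```

A scale whose alpha clears the chosen threshold is then analyzed via its composite scores rather than its individual items.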

3.4.3. Descriptive analysis

Having found our scales reliable, we grouped individual responses to those scales into their respective teams and grouped each team into its grade group for additional analysis. The descriptive statistics used for our data items are the median for central tendency and the interquartile range as a measure of statistical dispersion. These methods represent the data collected for all teams and aided the extraction of valuable observations. We report the median and interquartile range values per team (see Table 3).
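The per-team descriptive statistics can be computed as in the following sketch; the team name, scale name, and scores are illustrative:

```python
import statistics

def median_iqr(values):
    """Median and interquartile range of a list of composite scale scores."""
    med = statistics.median(values)
    q1, _, q3 = statistics.quantiles(values, n=4, method="inclusive")
    return med, q3 - q1

# Hypothetical composite goal-clarity scores for one team,
# pooled across the three hackathon events.
team_a_goal_clarity = [5.0, 4.5, 5.0, 5.0, 4.5, 5.0]
m, iqr = median_iqr(team_a_goal_clarity)
print(f"n={len(team_a_goal_clarity)} M={m} IQR={iqr}")
```

Reporting the median with the IQR is the conventional choice for ordinal Likert-type data, where means and standard deviations can be misleading.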

Cronbach’s Alpha (α)
Team familiarity 0.77
Team goal clarity 0.76
Team participation 0.88
Team process 0.91
Feedback outcome 0.83
Lecture outcome 0.95
Team management plan outcome 0.91
Learning outcome 0.89
Table 2. Scale reliability.
Grade selection Team Descriptive statistic Team familiarity* Team goal clarity Team participation Team process Lecture Feedback Team management plan Learning**
High Team A n 4 12 12 12 12 12 8 4
M 3.50 5.00 4.25 4.50 4.00 4.25 3.75 4.00
IQR 0.63 0.50 0.56 0.50 0.56 0.63 0.50 0.13
Team B n 4 11 11 11 11 11 7 3
M 3.00 4.00 4.00 4.00 4.00 4.00 4.00 5.00
IQR 0.13 0.75 0.75 0.75 0.25 0.00 0.25 0.25
Middle Team C n 2 9 9 9 9 9 7 4
M 2.00 4.50 4.00 4.00 3.00 4.00 3.00 4.00
IQR 0.00 0.50 0.25 0.00 0.50 0.50 0.25 0.25
Team D n 2 8 8 8 8 8 5 3
M 2.00 4.25 4.25 4.00 4.00 4.00 3.00 4.00
IQR 0.00 0.63 0.75 0.75 0.31 0.00 0.50 0.00
Low Team E n 4 11 11 11 11 9 7 3
M 2.00 5.00 5.00 5.00 4.00 4.00 3.00 4.00
IQR 0.25 0.00 0.00 0.00 0.75 0.50 0.75 0.25
Team F n 2 6 6 6 6 5 2 2
M 1.00 4.00 4.50 4.00 4.00 4.00 4.50 4.50
IQR 0.00 0.19 0.69 0.19 0.38 1.00 0.25 0.25
Table 3. Calculated cumulative data points from all three (3) hackathon events (response count n, median M, and interquartile range IQR) used in qualitative analysis. Median and interquartile range values are from responses given on a 5-point scale.
* We collected team familiarity data once, at the first hackathon event. Continual data collection was unnecessary for this data point as each member remained in the same team for all hackathon events.
**Learning data was collected once at the last hackathon event.

4. Findings

This section outlines the students’ perception of each learning intervention for each team and the differences between teams in their team properties and learning process. The participants were cybersecurity students from different backgrounds with varying levels of prior cybersecurity knowledge; we did not examine their backgrounds in this paper. The majority of participants who reported their age (n=23) were under fifty (50) years old, with an average age of twenty-eight (28) years.

In general, we observed a positive perception of the team properties, the introduced interventions, and the learning outcome at the end of the course. The medians of student responses to most data points remained above the midpoint of our 5-point Likert scale (see Fig 2 and Table 3). However, this was different for team familiarity, suggesting lower familiarity between team members. Comparing the box plots in Fig 2, the data suggests that, overall, the students largely agreed on how well the team properties and the interventions benefited them and on their learning gains. In this section, we analyze the collected data to better understand the students’ perception of the team properties and the contribution of the interventions to the team properties that foster teamwork in Section 4.1 to answer RQ???. We also discuss the interventions’ contributions to learning in Section 4.2 to answer RQ???.

4.1. Perception of Team Properties

By analyzing the responses, we observed that the teams formed for the hackathon events mainly comprised individuals who were not very familiar with each other, as the teams reported low values of team familiarity (as seen in Fig 2). This observation can be due to various issues, including the diversity of the students taking the course or trouble meeting before the course or before team formation due to the COVID-19 pandemic. However, high grade teams possessed an above-average and relatively higher perception of team familiarity than teams in the other grade groups (see Team Familiarity in Fig 3 and Table 3), suggesting that familiarity among team members was beneficial to effective teamwork and collaboration on the hackathon tasks. Student B01, from a high grade team – Team B, reported to “already know teammates from the previous semester” (B01).

Figure 3. Team properties per grade group.

All teams reported a higher than average perception of team process. We observed a rise in team process between the first and second hackathon event and a drop at the third hackathon event (see Fig 4). The rise in perceived team process effectiveness may be due to several aspects, including the team members getting to know each other, learning from the experiences of the first hackathon event to improve coordination, and effects of the introduced interventions. The drop in perceived team process effectiveness in completing hackathon tasks, as seen in Fig 4, may be due to the natural progression of the cybersecurity course and hackathon task difficulty, alongside the other classes students take in parallel during the semester. Student C02, from a middle grade team – Team C, reported that by the third hackathon event it was “very hard to coordinate team efforts when each and every course requires team work and has different members; … context switching in human brain does not allow such load” (C02). However, even with this drop, perceived team process effectiveness remained above the scale midpoint.

Figure 4. Team properties at each hackathon event.

All teams reported a higher than average perception of team goal clarity. The teams also reported a relatively higher perception of team goal clarity compared to the other team measures (i.e., team process, team participation). We observed a notable rise in team goal clarity between the first and second hackathon event and relative stability by the third hackathon event (see Fig 4). The rise and subsequent stability of the perceived team goal clarity values may be due to the teams learning from the experiences of previous hackathon events to redefine their goals and gain more from the introduced interventions, such as the team management plan, which helped with coordinating duties and responsibilities.

Lastly, all teams reported a higher than average perception of team participation. We saw a drop in team participation between the first and second hackathon event, although even with the drop, the perception of team participation remained above average. Team participation then rose at the third hackathon event to the same level as at the first (see Fig 4). This may be due to the need for increased participation within teams at the first hackathon event to initially define the team goals and tasks, and again at the third hackathon event to compile the final security analysis report; less participation may simply have been necessary at the second hackathon event.

4.2. Perception of Learning

Students reported positive learning experiences in responses to the open-ended questions, and the analyzed teams reported a higher than average perception of learning. High grade teams reported higher values of achieving the course learning outcome than the middle and low grade groups (see Learning Outcome in Fig 5). Student B01 added that the hackathon tasks contributed to learning as they “allowed me to understand some concepts in-depth or concepts that seemed clear in class but were not in reality” (B01). We also observed that the interventions contributed to perceived learning. Student A02 commented on the team management plan intervention, stating that the “team management plan was not easy, however, I believe it provides a positive impact in enhancing my knowledge” (A02). Student B01 also added that “the lectures contributed to learning” (B01) and that “the lecture resources provided additional context to class lectures” (B01). However, student F01 complained that the lectures had “too much content and it gets confusing at times” (F01). Regarding the feedback intervention, student B01 noted that the online feedback given helped most for learning as it allowed the team to “ask for explanations” (B01) and to “better understand some concepts that seemed clear but were not” (B01).

Figure 5. Intervention and learning measures per grade group.

4.3. Perception of Interventions

We analyzed the students’ perception of each intervention – lectures, feedback, team management plan based on the student grade groups (see  Fig 5) to answer RQ???.

4.3.1. Lecture Intervention

All teams reported a higher than average perception of the usefulness of the lecture intervention. High grade teams, however, reported higher values for the lecture intervention than the other teams, followed by low grade teams and then middle grade teams (see Lecture Outcome in Fig 5). At the first hackathon event, Student A01 commented on how Team A worked with the lecture intervention. Student A01 stated that although it was personally easy to follow the lectures and lecture resources, “the most difficult part is to work with the team, as everyone has a different understanding of the lecture” (A01). As such, Student A01 asserted that the team had issues with “correctly completing parts of the hackathon tasks due to these misunderstandings of the lecture” (A01). Student F01 indicated about the lecture content that “sometimes it feels that there is too much content and it gets confusing at times” (F01), and Student C02 corroborated this, stating that “some of the lecture topics needs to be reworked (due to the difficulty of the topics)” (C02). At the second hackathon event, student E01 commented that the “lectures have touched nicely the aspects that are expected by the team to work on in the hackathon” (E01) but also noted that “…in some cases, it seems we had misunderstood lecture materials” (E01). Student B01, however, added that for the second hackathon event, “without the lectures and reading resources it was quite unfeasible completing the hackathon tasks” (B01). At the third hackathon event, and in retrospect of the hackathon events completed, Student B01 repeated that “the lectures and reading resources were useful in understanding the hackathon tasks” (B01), thus encouraging task completion within the team and an improved team process.

4.3.2. Feedback Intervention

High grade teams reported higher values for the feedback intervention than the other teams, followed by middle grade teams and then low grade teams (see Feedback Outcome in Fig 5). For written feedback, Student A03 noted that “it felt vague and rushed” (A03), further clarifying that members of Team A “needed to ask for clarification on every single point individually as to what the feedback comment meant” (A03). Student D02 also noted that “receiving feedback was good, and it helped, but there might be a more effective way of doing it in the future for both students and lecturers” (D02). However, student B01 highlighted how the written feedback intervention was helpful for Team B, stating that it “allowed us to find and correct some inconsistencies with the previous work” (B01) and “correct many errors before the submissions” (B01). Student B01 nevertheless preferred the online feedback sessions, stating that “the opportunity to get explanations in real-time about the feedback was better” (B01). Student B01 also clarified that for Team B, the online feedback was more useful “because having a review of the current work at sessions instead of general comments allowed us to find some errors” (B01). Student A02 corroborated this, stating that Team A “used the online feedback sessions to double-check that we got the tasks right” (A02), and student C01 also remarked on how the “feedback sessions and feedback helped a lot” (C01) for the team.

4.3.3. Team management plan Intervention

All teams reported a higher than average perception of the team management plan. High grade teams reported higher values for the team management plan intervention than the other teams, followed by low grade teams and then middle grade teams (see Team management plan Outcome in Fig 5). At the first hackathon event, we observed a low perception of team familiarity and other issues with team aspects. Student A03 commented on problems with the team experience and organization, noting that “it was difficult to plan and find the time when all of us were free to work together” (A03). Student B01 (Team B) reported that although the team “worked quite well” (B01), they “were disorganized” (B01) for the first hackathon as they attempted to “divide the tasks in a fair way at the beginning” (B01). We introduced the team management plan intervention at the second hackathon event. Student E01 noted that “communication and organization of the team was very effective” (E01). Student B01 added that the introduction of the team management plan “was useful to keep track of who is doing what and to see the progress” (B01). However, Team C and Team D (from the middle grade group) reported the lowest values for the team management plan intervention. Student D01 noted that although the team management plan was “beneficial for clearly defining smaller tasks…, the added value from the management plan was minimal. However, it didn’t hinder us or add to our workload” (D01).

5. Discussion

In this section, we discuss the students’ perception of how the interventions benefited teamwork (RQ???) and contributed to student learning (RQ???).

5.1. Teamwork and Collaboration

The introduced interventions showed effects on teamwork and collaboration on the hackathon tasks. We found that the lecture intervention contributed to the students’ perceived effectiveness in completing the hackathon tasks within the team, especially when the lectures and lecture resources were understandable and applicable to the use-case and the hackathon tasks. The lectures also provided the fundamental conceptual security knowledge that was crucial to achieving the hackathon tasks. The introduction of the team management plan intervention could have improved the team process and the teams’ effectiveness in working on the hackathon tasks. We noticed that, given the observed perception of team familiarity (see Section 4.1), coordinating tasks within a team could be challenging and sometimes time-consuming, as team members were learning to collaborate for the first time. The team management plan provided sections that aided the definition and coordination of hackathon tasks, the assignment of responsibility, and the setting of deadlines; Team B (from the high grade group) and Team F (from the low grade group) benefited the most. Although responses from Team C and Team D (from the middle grade group) indicate that the team management plan provided minimal contribution (see Team management plan Outcome in Fig 5), it neither hindered the students in their hackathon tasks nor placed an additional workload on the teams. Lastly, we saw how the feedback interventions contributed to teamwork by providing the opportunity to quickly clear up misunderstandings within the team about the lectures and the previous and current hackathon tasks. The feedback intervention improved the teams’ effectiveness in completing hackathon tasks correctly. We also saw the advantages of the online feedback sessions over written feedback: this synchronous form of feedback helped teams handle hackathon tasks more quickly, resolving roadblocks through expert guidance in a shorter period.

Our findings thus address the gap identified by Gama et al. (gama2021online) by designing a hackathon approach that stimulates student engagement throughout the online course through the hackathon interventions and their introduction at strategic points during the course.

5.2. Learning Outcome

The students reported a positive learning experience in this course. Additionally, we observed that the hackathon format integrated into the course improved learning, allowing the students to practice concepts introduced during the course and get feedback that helps improve knowledge in the domain and their process.

The lecture intervention provided in-depth information on the security concepts needed for the course. Students reported learning by applying the security concepts and practices explained in the lectures to their hackathon tasks. Additionally, the hackathon tasks were crafted with the course curriculum and lecture paths in mind, increasing applicability and the opportunity to learn by doing. The feedback intervention showed particular usefulness for learning, as it allowed teams to discuss possible misunderstandings and errors found in past hackathon task reports and make corrections for future hackathon tasks. The online feedback sessions especially allowed students to discuss current hackathon tasks to prevent repeating past mistakes or introducing new errors into the task outcomes. The team management plan intervention also improved the teams’ collaborative power, which improved teamwork and encouraged rapid learning through working together to complete hackathon tasks correctly.

Although all teams reported positive benefits of the interventions for learning, we can observe how the interventions contributed to learning by contrasting the learning experiences of teams in the high grade group with those of teams in the middle and low grade groups. Teams in the high grade group reported higher benefits from the interventions than the other teams. Responses from members of Team A and Team B (from the high grade group) showed that they could take advantage of the provided interventions: applying the lecture resources to understand and complete the hackathon tasks, using the feedback sessions to discuss the written feedback points and other task-related questions, and asking questions concerning past hackathon tasks completed. The teams also used the team management plan to work more effectively, thereby positively enhancing knowledge. These benefits are also evident in that teams in the high grade group reported higher learning outcomes than students in the low and middle grade groups.

Our findings address the gaps in Steglich et al. (steglich2021online) by introducing interventions that support collaboration between students in teams alongside interventions to support learning gains. Through our approach, the hackathon format supported learning gains at multiple points in the course, not just at the end of the course, as done in Gama et al. (gama2021online). Additionally, we address the gaps in La Place et al. (la2017engineering) and Tandon et al. (tandon2021using) to demonstrate how the introduced interventions support learning-by-doing through educational hackathons in an online context.

5.3. Suggested Improvements

Our findings on the lecture intervention indicate that an understanding (or lack of understanding) of the lectures and lecture resources can affect the team process and output. When team members do not understand the lectures or lecture resources, the team process suffers: students will likely spend time understanding task-related lectures and lecture resources before working on the tasks. Additionally, in a bid to provide as much information as required for learning, the lectures drew some complaints of having too much content, which can be confusing and affect the students’ learning process. To improve the lecture intervention when following this hackathon approach, we suggest balancing the quantity of theory provided against its applicability to the hackathon tasks, and providing ample opportunity for feedback sessions where educators can further discuss the lectures and how they relate to the hackathon tasks. Secondly, we expect teams formed during online education and hackathon events to have low team familiarity levels; hackathon interventions must account for this and work towards improving collaboration regardless of familiarity levels. Also, introducing more interesting real-life use-cases, and possible hands-on collaboration with industry in the use-cases provided, can help boost student and team interest and participation.

5.4. Limitations

There are certain limitations associated with this study design. We developed specific interventions and studied teams participating in the hackathon events as part of a particular cybersecurity course. First, it is not possible to generalize our findings beyond the context of our specific course, since another study on a different course and with different use-cases might yield different results. However, the point of our work is not generalization but rather to evaluate the interventions, report findings, and provide suggestions on how to handle hackathon integration in online-led courses. Second, our sample may include selection bias based on our choice of teams for analysis. However, we selected a cross-section of student teams covering high, middle, and low grade levels and meeting other requirements such as team size and questionnaire response completeness. Third, two of the three researchers conducting the study were involved in the hackathon planning, execution, and course grading, which can bias the findings reported by the students. Additionally, the post-hackathon questionnaire for all three hackathon events was not anonymous, potentially introducing further bias to the reported findings. However, one researcher had no involvement in executing the hackathon and refrained from interfering during the hackathon and the course until analysis of the collected data began. We began our analysis only after all three hackathon events and the cybersecurity course were completed, to prevent bias in grading the students involved in our analysis based on how they reacted to our intervention style. Since we compare within the students, such bias would affect all students equally. Furthermore, we abstained from making causal claims in our analysis; instead, we provide a rich description of the students’ and teams’ observed behaviour and reported perceptions. Lastly, there might be bias in reporting and analyzing the open-ended questions; however, we did not generalize or draw final conclusions but used the responses as potential explanations for our findings.

6. Concluding Remarks

This paper reported findings from an action research study of six (6) teams at a series of educational hackathons integrated into an online cybersecurity course. The study aimed to show how educators can support teamwork and collaboration, maintain student participation and interest, and encourage learning-by-doing throughout the course through educational hackathons in an online context. We introduced the lecture, feedback and team management plan interventions to achieve our hackathon goals.

Our findings indicate that these interventions helped the students achieve the course learning outcomes through knowledge sharing in the lectures, guidance on applying the knowledge gained primarily through feedback, and improved efficiency in completing the given hackathon tasks through the team management plan. Collectively, these interventions improved the teams’ collaborative power and maintained interest and participation in the online course, thereby addressing challenges faced with online instruction. Our results also point to suggestions useful for future iterations of the hackathon format.


Appendix A Appendix

Team familiarity (based on Filippova et al.(filippova2017diversity)), anchored between not at all and completely.
I know my team members well.
I have collaborated with some of my team members before.
I have socialized/met with some of my team members outside of work/school before.
Team process (based on Bhattacherjee(bhattacherjee2001understanding)), anchored between 1 and 5.
I am satisfied with the work completed in my project.
I am satisfied with the quality of my team’s output.
My ideal outcome coming into my team was achieved.
My expectations towards my team were met.
Perceived satisfaction with team process (based on Filippova et al.(filippova2017diversity)), anchored between strongly disagree and strongly agree.
(1) Inefficient to (5) Efficient
(1) Uncoordinated to (5) Coordinated
(1) Unfair to (5) Fair
(1) Confusing to (5) Easy to understand
Team goal clarity (based on Nolte et al. (nolte2018you)), anchored between strongly disagree and strongly agree.
I was uncertain of my duties and responsibilities in this team.
I was unclear about the goals and objectives for my work.
I was unsure how my work relates to the overall objectives of my team.
Perception of team participation and voice (based on Nolte et al.(nolte2018you)) anchored between strongly disagree and strongly agree.
Everyone had a chance to express her/his opinion.
The team members responded to the comments made by others.
The team members participated very actively during our collaboration.
Overall, the participation of each team member was effective.
Perception of the usefulness of the interventions (based on Sauro(sauro2011measuringu)) anchored between strongly disagree and strongly agree.
Using the [intervention] enabled me to accomplish tasks more quickly.
Using the [intervention] improved my team’s performance.
Using the [intervention] increased my productivity in the hackathon.
Using the [intervention] enhanced my effectiveness in my team.
Using the [intervention] made it easier to complete my [hackathon] solution.
I found the [intervention] useful in my team.
Learning outcome measured students’ perception of achieving the course’s learning outcomes, perceived learning process, and learning through problem-solving; anchored between strongly disagree and strongly agree.
The hackathon events allowed me the opportunity to design secure systems and software.
The hackathon activities made my learning experience more productive.
The lectures given were geared to promote my understanding.
There were enough opportunities during the course to find out if I clearly understood the course material.
The [interventions] given were appropriate and geared to promote my understanding.
Table 4. Post-Hackathon Questionnaire Instrument