1. Introduction
Throughout the history of computing, programming has been a largely physically static activity. But technologies previously inaccessible to most users are now growing rapidly. Today, 78% of Americans are familiar with VR, up from 45% in 2015 (Greenlight Insights, 2018). As a result, experiences traditionally created for desktops are now appearing in VR, e.g., training (Bertram et al., 2015) and automobile design (Lawson et al., 2016). Researchers argue that VR increases immersion (Hussein and Nätterdal, 2015), which in turn increases engagement and learning (Dede, 2009). VR might be especially useful for teaching programming because spatial navigation in VR helps reduce extraneous cognitive load and increase germane cognitive focus on learning content compared to text on a screen (Lee and Wong, 2014). Further, VR allows users to experience a sense of self-presence in the environment (Ratan, 2013), which facilitates an embodied-cognitive learning experience (Shin, 2017; Melcer, 2018) through which users interact with the learning content more intuitively (Steed et al., 2016), potentially augmenting learning outcomes (Cheon and Grant, 2012). Nonetheless, only a handful of environments for programming exist in VR. In this article, we describe a programming game in virtual reality that we created called Hack.VR (trailer video: https://youtu.be/3Mp6ISjD1mg; walkthrough video: https://youtu.be/TGc8H3Nw-3M).
2. Programming in VR
Existing environments for programming in VR can be seen in Figure 1. These include VR-OCKS and Cubely, block-based VR programming environments (Segura et al., 2019; Vincur et al., 2017), and Imikode, a multiple choice code snippet selection environment in VR (Oyelere and Cavalli-Sforza, 2019). Other significant projects include NeosVR (NeosVR, 2020), a shared social universe that features powerful programming tools for VR world creation, and Primitive (Primitive, 2020), an “immersive development environment” enabling 3D code visualization. In the indie game Glitchspace (Glitchspace, 2016), players use a visual programming language to solve puzzles. These environments for programming in VR have been developed for education (Cubely, Imikode, VR-OCKS), for modifying virtual worlds (NeosVR), for code visualization (Primitive), and for entertainment (Glitchspace).
Importantly, Hack.VR was created specifically to teach object-oriented programming (OOP), in contrast to the highly procedural approaches of the systems above. OOP encapsulates logic into objects. This paradigm translates naturally to VR, where 3D objects can each contain internal programming that is abstracted from observers. Hack.VR is the first system for learning OOP in VR, and will serve as a testbed for research studies. This testbed may also be useful for studying other aspects, e.g., help facilities, embellishment, the player avatar, and feedback, and their resulting effects on VR programming (Frommel et al., 2017; Kao, 2020a; Hicks et al., 2019; Kao, 2020b, 2019b; Kao and Harrell, 2017b; Birk et al., 2016; Kao and Harrell, 2017a; Kao, 2019c; Kao and Harrell, 2018, 2016b, 2015a, 2015b; O’Rourke et al., 2014; Kao, 2019a; Kao and Harrell, 2016a).
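This encapsulation idea can be sketched in a few lines. The following is a hypothetical Python illustration only; Hack.VR is not implemented this way, and the `Door` class, its keycode, and all names here are invented for exposition:

```python
# Hypothetical illustration (not Hack.VR's actual code): a 3D object whose
# internal logic is encapsulated, exposing only an input/output interface,
# just as Hack.VR abstracts method bodies away from the player.

class Door:
    """A door object in the scene; players see only try_open's input and output."""

    def __init__(self, keycode: int):
        self._keycode = keycode   # internal state, hidden from observers
        self._open = False

    def try_open(self, code: int) -> bool:
        """Visible interface: takes a code, returns whether the door opened."""
        self._open = (code == self._keycode)  # hidden internal logic
        return self._open

door = Door(keycode=4242)
print(door.try_open(1111))  # False: wrong code, the door stays shut
print(door.try_open(4242))  # True: the door opens
```

The player interacts only with the visible signature (one input, one boolean output), mirroring how a Hack.VR node exposes a method call while hiding its body.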
3. The Game
3.1. Engine
In Hack.VR, a program is a set of nodes. See Figure 2. Nodes contain typical programming constructs, e.g., primitive values, objects, arithmetic operators, conditionals, event handlers, and function calls. Nodes communicate through data flow, and may have both inputs and outputs depending on the node type. For example, see Figure 3. Nodes can also represent entire method calls, the details of which are abstracted from the player except for input and output. Because the goal of Hack.VR is to teach the player OOP, the inner workings of the methods themselves are intentionally abstracted away (players cannot see the code) so that the player can concentrate on higher-level representations. The engine also supports extensions. For example, once a new function has been defined in the engine, a node can call it. To reduce complexity, players in Hack.VR use designer-embedded nodes to solve each puzzle instead of creating their own nodes. While the engine supports on-the-fly node creation, the UI does not currently support this. Node-based programming, like any other type of programming, can lead to execution errors. For example, a NOT node expects a boolean input (true or false) and outputs the inversion of its input. If a numerical value is instead used as input to a NOT node, this results in an error. While such input is valid in some procedural languages (in C, 0 is false and any nonzero value is true), implicit conversions from numerical values to boolean types are not allowed in many object-oriented languages (e.g., Java, C#). When an error is detected in the program, this is indicated by a red tube. See Figure 4 for examples.
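The NOT-node error described above can be sketched as a minimal node graph. The following is illustrative Python under our own assumptions, not the engine's actual implementation; the class names and `NodeError` exception are invented:

```python
# Minimal sketch (not Hack.VR's engine code) of data-flow nodes where a
# NOT node type-checks its input before evaluating, as the engine does.

class NodeError(Exception):
    """Raised on a type mismatch; shown in-game as a red tube."""

class ValueNode:
    """A node holding a primitive value; output only."""
    def __init__(self, value):
        self.value = value
    def evaluate(self):
        return self.value

class NotNode:
    """Expects a boolean input; outputs its inversion."""
    def __init__(self, input_node):
        self.input_node = input_node
    def evaluate(self):
        result = self.input_node.evaluate()
        if not isinstance(result, bool):
            # Unlike C, no implicit numeric-to-boolean conversion is allowed.
            raise NodeError(f"NOT expects a boolean, got {type(result).__name__}")
        return not result

print(NotNode(ValueNode(True)).evaluate())   # False
try:
    NotNode(ValueNode(7)).evaluate()         # numeric input: an error
except NodeError as e:
    print("error:", e)
```

Wiring `ValueNode(7)` into `NotNode` raises the error rather than silently coercing 7 to true, matching the Java/C# semantics the game teaches.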
Similarities can be drawn between Hack.VR and other programming paradigms inherently predisposed to visualization, e.g., flow-based programming (Morrison, 1994). Hack.VR is inspired by node graph systems: Unreal Engine 4’s Blueprints (Epic Games, 2020), Unity Playmaker (Hutong Games LLC, 2019), and shader editors (Shader Graph (Unity Technologies, 2020) and Amplify Shader Editor (Amplify Creations, 2020)).
3.2. Art Style and Gameplay
Hack.VR is set in a futuristic sci-fi world. See Figure 12. In Hack.VR, the player holds two “guns” that take the place of their VR controllers. See Figure 5. Hack.VR is compatible with both HTC Vive and Oculus headsets. Hack.VR’s controls are found in Appendix A. Using these controls, Hack.VR challenges players with object-oriented programming puzzles. Hack.VR consists of 17 different puzzles, each building upon concepts from prior puzzles. See Figures 10 and 11 for short descriptions of the puzzles.
3.3. Design Process
The design process followed a spiral approach of design → implementation → evaluation. Iterations grew in complexity and refinement over several cycles for each part of the game. Feedback was solicited from designers, developers, and playtesters. Comments affirmed positive design choices (e.g., “I like that you can see the physical buttons and physical door miniaturized in the node tree”) and highlighted potential improvements (e.g., “When I’m connecting things, it’s hard to tell what connector I’m working with; maybe highlight the current selected connector?”). A typical early prototype can be seen in Figure 13.
4. Conclusion
In this article we described Hack.VR, a programming game in virtual reality. We created a VR programming language that is highly visual while remaining semantically rich. Hack.VR is inspired by the possibilities of programming in VR. Imagine these evocative scenarios:
- Programming an infinite stairwell taking you into the clouds.
- Programming a robot carrying you across vast deserts, rolling hills, and tundras.
- Reconfiguring and reprogramming the mechanical parts in your gun to enhance your capabilities.
Given the great potential for VR to enhance learning outcomes (Lee and Wong, 2014; Shin, 2017; Steed et al., 2016; Cheon and Grant, 2012), we expect that Hack.VR might help teach programming concepts more effectively than similar, non-immersive tools. Although assessment research should be conducted to confirm this expectation empirically, from a perspective that spans research, design, and play, there is reason to be excited about what the coming decade will bring for programming in VR.
References
- Amplify Creations (2020) Amplify Creations. 2020. Amplify Shader Editor. (2020). https://assetstore.unity.com/packages/tools/visual-scripting/amplify-shader-editor-68570
- Bertram et al. (2015) Johanna Bertram, Johannes Moskaliuk, and Ulrike Cress. 2015. Virtual training: Making reality work? Computers in Human Behavior 43 (2015), 284–292. https://doi.org/10.1016/j.chb.2014.10.032
- Birk et al. (2016) Max V Birk, Cheralyn Atkins, Jason T Bowey, and Regan L Mandryk. 2016. Fostering Intrinsic Motivation through Avatar Identification in Digital Games. CHI (2016). https://doi.org/10.1145/2858036.2858062
- Cheon and Grant (2012) Jongpil Cheon and Michael M Grant. 2012. The effects of metaphorical interface on germane cognitive load in web-based instruction. Educational Technology Research and Development 60, 3 (2012), 399–420.
- Dede (2009) Chris Dede. 2009. Immersive interfaces for engagement and learning. (2009). https://doi.org/10.1126/science.1167311
- Epic Games (2020) Epic Games. 2020. Blueprints Visual Scripting. (2020). https://docs.unrealengine.com/en-US/Engine/Blueprints/index.html
- Frommel et al. (2017) Julian Frommel, Kim Fahlbusch, Julia Brich, and Michael Weber. 2017. The Effects of Context-Sensitive Tutorials in Virtual Reality Games. (2017), 367–375. https://doi.org/10.1145/3116595.3116610
- Glitchspace (2016) Glitchspace. 2016. Glitchspace. (2016). https://store.steampowered.com/app/290060/Glitchspace/
- Greenlight Insights (2018) Greenlight Insights. 2018. New Consumer Data Finds VR Headset Usage Expected To Increase In 2019, According To Greenlight Insights. (2018). https://greenlightinsights.com/new-consumer-data-finds-vr-headset-usage-expected-increase-2019-according-greenlight-insights/
- Hicks et al. (2019) Kieran Hicks, Kathrin Gerling, Graham Richardson, Tom Pike, Oliver Burman, and Patrick Dickinson. 2019. Understanding the effects of gamification and juiciness on players. IEEE Conference on Computational Intelligence and Games, CIG (2019). https://doi.org/10.1109/CIG.2019.8848105
- Hussein and Nätterdal (2015) Mustafa Hussein and Carl Nätterdal. 2015. The benefits of virtual reality in education: A comparison study. (2015).
- Hutong Games LLC (2019) Hutong Games LLC. 2019. Playmaker. (2019). https://assetstore.unity.com/packages/tools/visual-scripting/playmaker-368
- Kao (2019a) Dominic Kao. 2019a. Exploring the Effects of Growth Mindset Usernames in STEM Games. American Education Research Association (2019).
- Kao (2019b) Dominic Kao. 2019b. Infinite Loot Box: A Platform for Simulating Video Game Loot Boxes. IEEE Transactions on Games (2019). https://doi.org/10.1109/tg.2019.2913320
- Kao (2019c) Dominic Kao. 2019c. The Effects of Anthropomorphic Avatars vs. Non-Anthropomorphic Avatars in a Jumping Game. In The Fourteenth International Conference on the Foundations of Digital Games.
- Kao (2020a) Dominic Kao. 2020a. Exploring Help Facilities in Game-Making Software. In ACM Foundations of Digital Games.
- Kao (2020b) Dominic Kao. 2020b. The effects of juiciness in an action RPG. Entertainment Computing 34, November 2018 (2020), 100359. https://doi.org/10.1016/j.entcom.2020.100359
- Kao and Harrell (2015a) Dominic Kao and D. Fox Harrell. 2015a. Exploring the Impact of Role Model Avatars on Game Experience in Educational Games. The ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play (CHI PLAY) (2015).
- Kao and Harrell (2015b) Dominic Kao and D. Fox Harrell. 2015b. Mazzy: A STEM Learning Game. Foundations of Digital Games (2015).
- Kao and Harrell (2016a) Dominic Kao and D. Fox Harrell. 2016a. Exploring the Effects of Encouragement in Educational Games. Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI 2016) (2016).
- Kao and Harrell (2016b) Dominic Kao and D. Fox Harrell. 2016b. Exploring the Impact of Avatar Color on Game Experience in Educational Games. Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI 2016) (2016).
- Kao and Harrell (2017a) Dominic Kao and D. Fox Harrell. 2017a. MazeStar: A Platform for Studying Virtual Identity and Computer Science Education. In Foundations of Digital Games.
- Kao and Harrell (2017b) Dominic Kao and D. Fox Harrell. 2017b. Toward Understanding the Impact of Visual Themes and Embellishment on Performance, Engagement, and Self-Efficacy in Educational Games. The annual meeting of the American Educational Research Association (AERA) (2017).
- Kao and Harrell (2018) Dominic Kao and D. Fox Harrell. 2018. The Effects of Badges and Avatar Identification on Play and Making in Educational Games. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI’18.
- Lawson et al. (2016) Glyn Lawson, Davide Salanitri, and Brian Waterfield. 2016. Future directions for the development of virtual reality within an automotive manufacturer. Applied Ergonomics 53 (2016), 323–330. https://doi.org/10.1016/j.apergo.2015.06.024
- Lee and Wong (2014) Elinda Ai Lim Lee and Kok Wai Wong. 2014. Learning with desktop virtual reality: Low spatial ability learners are more positively affected. Computers and Education (2014). https://doi.org/10.1016/j.compedu.2014.07.010
- Melcer (2018) Edward F Melcer. 2018. Learning with the body: Understanding the Design Space of Embodied Educational Technology. Ph.D. Dissertation. New York University Tandon School of Engineering.
- Morrison (1994) J. Paul Morrison. 1994. Flow-based programming. In Proc. 1st International Workshop on Software Engineering for Parallel and Distributed Systems. 25–29.
- NeosVR (2020) NeosVR. 2020. NeosVR. (2020). https://store.steampowered.com/app/740250/Neos
- O’Rourke et al. (2014) E O’Rourke, Kyla Haimovitz, Christy Ballweber, Carol S. Dweck, and Zoran Popović. 2014. Brain points: a growth mindset incentive structure boosts persistence in an educational game. Proceedings of the 32nd annual ACM conference on Human factors in computing systems - CHI ’14 (2014), 3339–3348. http://dl.acm.org/citation.cfm?id=2557157
- Oyelere and Cavalli-Sforza (2019) Solomon Sunday Oyelere and Violetta Cavalli-Sforza. 2019. Imikode: A VR Game to Introduce OOP Concepts. Proceedings of the 19th Koli Calling International Conference on Computing Education Research (2019).
- Primitive (2020) Primitive. 2020. Primitive. (2020). https://primitive.io/
- Ratan (2013) Rabindra Ratan. 2013. Self-presence, explicated: Body, emotion, and identity extension into the virtual self. In Handbook of research on technoself: Identity in a technological society. IGI Global, 322–336.
- Segura et al. (2019) Rafael J. Segura, Francisco J. del Pino, Carlos J. Ogáyar, and Antonio J. Rueda. 2019. VR-OCKS: A virtual reality game for learning the basic concepts of programming. Computer Applications in Engineering Education August (2019). https://doi.org/10.1002/cae.22172
- Shin (2017) Dong Hee Shin. 2017. The role of affordance in the experience of virtual reality learning: Technological and affective affordances in virtual reality. Telematics and Informatics (2017). https://doi.org/10.1016/j.tele.2017.05.013
- Steed et al. (2016) Anthony Steed, Ye Pan, Fiona Zisch, and William Steptoe. 2016. The impact of a self-avatar on cognitive load in immersive virtual reality. In Proceedings - IEEE Virtual Reality. https://doi.org/10.1109/VR.2016.7504689
- Unity Technologies (2020) Unity Technologies. 2020. Shader Graph. (2020). https://unity.com/shader-graph
- Vincur et al. (2017) Juraj Vincur, Martin Konopka, Jozef Tvarozek, Martin Hoang, and Pavol Navrat. 2017. Cubely: Virtual reality block-based programming environment. Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST Part F1319, 2 (2017). https://doi.org/10.1145/3139131.3141785
Appendix A
Hack.VR Controls