RadVR: A 6DOF Virtual Reality Daylighting Analysis Tool

07/02/2019
by Mohammad Keshavarzi, et al., UC Berkeley

This work introduces RadVR, a virtual reality daylighting analysis tool that allows simultaneous analysis of qualitative immersive renderings and assessment of quantitative data from physically correct daylighting simulations in a 6DOF virtual environment. With an end-to-end workflow and integration with commonly used modeling software, the system takes a 3D model and material properties as input and allows user-designers to (1) perform physically based daylighting simulations powered by the Radiance engine, (2) study sunlight penetration at different hours of the year by navigating through time, (3) interact with a 9-point-in-time matrix of the nine most representative times of the year, and (4) visualize, compare and analyze daylighting simulation results using integrated tools in virtual reality. User experiments comparing the system with a conventional 2D-display daylight analysis tool, DIVA for Rhino, show that RadVR outperforms it in spatial understanding tasks, navigation and sun position analysis.



1 Introduction

In the past decade, with the rise of consumer-friendly and affordable hardware, Virtual Reality has taken on a larger role in the AEC community by conveying a sense of scale and depth to the various stakeholders of building projects. Studies suggest that immersive environments - which comprise, but are not limited to, visual immersion - can enable better spatial understanding of virtual prototypes compared to 2D and non-immersive 3D representations [Schnabel_Kvan_2003, Paes_Arantes_Irizarry_2017], enhance collaboration and team engagement among stakeholders [Bassanino_Wu_Yao_Khosrowshahi_Fernando_Skjærbæk_2010, Berg_Vance_2016, Fernando2013], and also predict human-building interactions, providing designers and researchers with reliable user behavior data [Kuliga_Thrash_Dalton_Hölscher_2015, Adi_Roberts_2014, Heydarian_Carneiro_Gerber_Becerik-Gerber_2015, Heydarian_Pantazis_Carneiro_Gerber_Becerik-Gerber_2015].

It is important to highlight the impact of elevated spatial perception and its effect on user and task performance in collaborative activities. Methods for measuring spatial perception and presence within immersive environments are complex to develop and have been intensively explored in the cognitive and computer sciences. Witmer and Singer [Witmer_Singer_1998] pioneered this subject; by developing Presence Questionnaires, they argue that a consistent positive relationship can be found between presence and task performance in virtual environments. Similar questionnaires have since been applied in several studies in AEC and related fields [Castronovo_Nikolic_Liu_Messner_2013, Kalisperis_Muramoto_Balakrishnan_Nikolic_Zikic_2006], with Faas et al. specifically investigating whether immersion and presence can produce better architectural design outcomes in the early stages of design [Faas_Bao_Frey_Yang_2014].

Various studies have outlined the impact of immersive environments on user and task performance in collaborative projects. For skill transfer and decision making in particular, Waller et al. show that sufficient exposure to a virtual training environment can eventually surpass a real-world training environment [Waller_Hunt_Knapp_1998]. Safety training in construction can also benefit from the adoption of such systems [Sacks_Perlman_Barak_2013, HaiyanXie_Tudoreanu_Weishi_2006] due to their ability to promote high levels of involvement that can optimize the learning process [Faas_Bao_Frey_Yang_2014]. Heydarian et al. conclude that users perform similarly in daily office activities (object identification, reading speed and comprehension) within immersive virtual environments and benchmarked physical environments [Heydarian_Carneiro_Gerber_Becerik-Gerber_Hayes_Wood_2015]. Moreover, other studies have investigated how occupant navigation in buildings is enhanced within IVEs compared to 2D screens, with some suggesting significant improvement when using virtual reality headsets [Robertson_Czerwinski_van_Dantzich_1997, Ruddle_Payne_Jones_1999], while others indicate no significant differences [Mizell_Jones_Slater_Spanlang_2002, Sousa_Santos_Dias_Pimentel_Baggerman_Ferreira_Silva_Madeira_2009].

Such capabilities of immersive environments have been broadly investigated for collaborative review purposes, which usually take place in the last phases of the design process. In these phases, critical analysis and design reviews are considered important activities: they have great impact on the cost, speed and quality of the project, and they are made while the ability to influence the overall quality of construction is still high [Eastman_Teicholz_Sacks_Liston_2011]. Commercial software such as Unity Reflect, Autodesk Revit Live and IrisVR enables virtual walkthroughs and facilitates visualization of conventional 3D and BIM file formats, covering the usability gap of modifying building designs within software and allowing 3D data from the BIM ecosystem to be automatically integrated [Faraj_Alshawi_1997]. However, positive impacts such as decreased project time and cost can be expected from any process or tool that enables the finalization of design decisions earlier in the project life cycle.

Figure 1: Workflow of RadVR. The system takes a 3D model with material properties as input and provides a virtual reality environment with daylighting simulation tools available.

Nevertheless, immersive visualization can also be implemented earlier in the design process to assist decision making, by allowing users to modify their design and receive quantifiable feedback from simulated building performance metrics. Building performance simulation in immersive environments - whether visualizing previously simulated values or performing simulations through the user interface itself - has been explored in multiple studies in the AEC realm. For building performance visualization, Nytsch-Geusen et al. developed a VR simulation environment using bi-directional data exchange between Unity and Modelica/Dymola [Nytsch-Geusen_Ayubi_Möckel_Rädler_Thorade_2016]. Rysanek et al. developed a workflow for managing building information and performance data in VR with equirectangular image labeling methods. For augmenting data on existing buildings, Malkawi et al. developed a Human-Building Interaction system implementing augmented reality to visualize CFD simulations [Malkawi_Srinivasan_2005]. Augmented and virtual reality interfaces have also been applied to structural investigations and finite element method simulations; however, to expedite the simulation process and achieve real-time interaction for complex geometry, artificial neural networks (ANN) and approximation methods are applied in the studies of Hambli et al. [Hambli_Chamekh_Bel_Salah_2006].

In the domain of building performance, daylighting design draws on visual and photometric properties to enhance occupant experience and control visual discomfort. The importance of daylighting design, however, is not limited to quantitative metrics; it integrates extensively with geometry and visual factors. This property has been a cornerstone of daylight research, with previous studies proposing tools for objective-driven daylight form-finding [Caldas_Santos_2016] and optimization [Caldas_Norford_2002], merging the spatial and visual qualities that daylight can offer with numeric, goal-oriented generative design strategies.

For immersive environments, previous work has studied daylight performance as an end-user tool and occupant input method in virtual reality. In this regard, Heydarian et al. explore the lighting preferences of users through the users' control of blinds and artificial lights inside a virtual environment [Heydarian_Pantazis_Carneiro_Gerber_Becerik-Gerber_2015]. Rockcastle et al. use virtual reality headsets to collect subjective evaluations of rendered daylit architectural scenes [Rockcastle_Chamilothori_Andersen_2017], and Chamilothori et al. experiment with the effect of façade patterns on the perceptual impressions and satisfaction of a simulated daylit space in virtual reality [Chamilothori_Wienold_Andersen_2018]. In the field of lighting, Jones developed a GPU-accelerated version of Radiance [Ward_J._1994] for global illumination simulation with parallel multiple-bounce irradiance caching, allowing much faster renderings for VR environments [Jones_2017]. However, this method currently provides pre-rendered equirectangular images instead of 6 degrees-of-freedom (6DOF) renderings, resulting in a limited sense of presence and scale.

1.1 Current Limitations of Daylighting Design within Immersive Environments

Although virtual immersive environments have been widely used in various design and engineering tasks, some important limitations of the current state of the technology can result in critical drawbacks in design decision making. In daylighting design in particular, where the user highly depends on visual feedback and rendered information, graphical and display limitations can pave the way to misleading visual representations. It is therefore vital to identify and address these limitations as part of the design of any virtual reality tool. For daylight simulation and graphics rendering, ray tracing has been a widely accepted method in computer graphics and radiometric simulations. Following the rendering equation introduced by Kajiya [Kajiya_1986], many ray-tracing methods have been developed to simulate light behavior and optical effects. Tools for simulating daylighting performance metrics, such as Radiance and Velux, take advantage of such ray-tracing techniques and have been validated through numerous studies. As a result, these tools are broadly used in building performance design and analysis, assisting architects and building engineers in evaluating daylight behavior in different phases of the design process.

However, implementing ray-tracing methods [Cook_Porter_Carpenter_1984] in virtual environments is highly challenging due to current limitations in graphics processing, which make it infeasible to produce physically correct renderings at high frame rates. To provide 6 degrees of freedom (6DOF) and avoid user discomfort within an immersive environment, the rendered information displayed on virtual reality HMDs must update at a frame rate of at least 90 Hz to match the pose and field of view of the user. Rendering at such high frequencies requires graphical computation power that current conventional GPUs are unable to provide. In addition to updating with the pose estimate, the wide field of view experienced in virtual environments requires high-resolution output, adding complexity and computational load to the rendering process.

Therefore, non-physically-based methods such as depth-buffered triangle rasterization have been widely implemented in real-time rendering applications such as game engines [Gregory_2018], which are currently the main platforms for virtual reality development. These methods, which are biased towards scene optimization for fast processing, do not calculate global illumination and limit their computation to local interpolation models such as the Blinn-Phong model [Blinn_1977], the Gouraud model [Gouraud_1971] and flat shading.
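For reference, local models of this kind shade each surface point independently of all other surfaces. A standard formulation of the Blinn-Phong model is:

$$
I = k_a\, i_a + k_d\,(\mathbf{N}\cdot\mathbf{L})\, i_d + k_s\,(\mathbf{N}\cdot\mathbf{H})^{\alpha}\, i_s,
\qquad
\mathbf{H} = \frac{\mathbf{L}+\mathbf{V}}{\lVert\mathbf{L}+\mathbf{V}\rVert},
$$

where N is the surface normal, L the light direction, V the view direction, the k terms are material coefficients and the i terms light intensities. Since every term depends only on the local point and the light source, light reflected from other surfaces (i.e., global illumination) never enters the result.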

However, as light bounces are limited in such methods, they cannot produce accurate illuminance values for a given viewpoint; ambient lighting of surfaces is not achieved, and calculation is mainly limited to the shadows and occlusion of a scene. Many methods have been introduced to bypass this limitation [Williams_1983, Gers_Felix_Jürgen_Schmidhuber_1984, Segal_Foran_1992] by applying visual-illusion techniques and preprocessed rendering systems such as mipmapping and texture mapping, in which prebaked light textures are mapped to the corresponding geometry in the scene, decreasing the real-time rendering load of the model. However, in applications where lighting conditions are constantly changing, such methods cannot be used due to the dynamic nature of the light sources. In addition to graphical output, display limitations of current HMD systems can also decrease the fidelity required for daylighting design and decision making. Although prototypes of high-dynamic-range displays have been studied extensively, current consumer HMD hardware such as the Oculus Rift and the HTC Vive has a maximum measured brightness that does not exceed 150 cd/m². Therefore, although immersive virtual environments have been shown to convincingly evoke presence and enhanced awareness within virtual spaces, relying on visual information that is misleadingly rendered and displayed due to limited hardware can be counterproductive for the design process. Hence, it is important to inform the user of possible errors and mismatches in photometric values through an extended visualization medium, allowing the user to compare and analyze rendered information against actual quantitative values in the form of common daylighting metrics.

To address this challenge, this work proposes an end-to-end 6DOF virtual reality tool, RadVR, that uses Radiance [Ward_J._1994], a physically based ray-tracing system developed by Greg Ward, as its calculation engine. RadVR enables simultaneous analysis of qualitative immersive presence and quantitative, physically correct daylighting calculations by allowing designers to overlay simulation data on spatial immersive experiences. The simulation accuracy can be customized by the user, from the limited geometrical properties of direct light to progressively accurate daylighting simulations at higher resolution. With an end-to-end system architecture, RadVR integrates with 3D modeling software in conventional 2D environments, such as Rhino3D, and provides an immersive virtual reality framework for the user-designer to simulate and explore various daylighting strategies. Approaching the topic from a different perspective, this work proposes that rather than evaluating end-user experience on behalf of occupants, the user-designer can perform the evaluation based on predefined metrics.

One of the main contributions of this work is establishing a stable bi-directional data pipeline between Unity3D and third-party building performance simulation tools. While many building performance simulation engines have no native GUI and are accessed through console-based systems, a virtual reality GUI allows building performance designers to conduct pre-construction analysis of various performance metrics in 1:1 scale immersive environments. Moreover, with the integration of VR design methodologies in CAD-based software, such analysis can be applied in earlier stages of design, all within immersive environments and without the need to transfer back and forth between 2D platforms.

2 Methods

Figure 2: RadVR plug-in for Grasshopper, a visual programming language for Rhinoceros 3D commonly used by many architects. Using the Assign Material component, different material types (glazing, plastic, translucent, electrochromic glazing, etc.) can be applied to the modeled geometry and exported directly to RadVR.

2.1 System Architecture

Figure 1 shows an overview of RadVR's end-to-end processing pipeline. The system takes semantic 3D geometry as input and automatically converts it to an Octree scene format with the corresponding properties and materials. When RadVR is run within the virtual environment, the Radiance engine performs an initial simulation in the background to prepare the primary scene in VR. This loads the entire geometry, with its defined materials, into VR, allowing the user-designer to explore, simulate and review the multiple daylighting functions of the tool. From this moment on, the software (which runs on a game engine) integrates with the Radiance simulation engine in a bi-directional manner to perform various simulations.

First, we describe the core issues in the design of the system architecture in the following subsections: semantic 3D geometry input, Octree preparation, Radiance integration and game engine implementation. Second, we discuss the different functionalities of RadVR: simulation types, visualization and output metrics. Finally, we describe the design approach of RadVR's user interaction and the implementation of the different modules of the system.

2.1.1 Semantic Geometry Input

As the daylighting performance of a building is highly dependent on the material properties of the target space, the procedure for importing geometry should be intertwined with semantic material selection to achieve correct simulation results. In addition, daylighting studies may happen in early stages of the building design process, when material and finishing selections have not yet been made; therefore, practices such as BIM modeling cannot assist in data extraction from the model. To address this limitation, the authors developed a RadVR plug-in for Grasshopper - a visual programming environment for Rhinoceros3D commonly used by architects - in which the user can directly assign the corresponding material to each geometry prepared in the 3D modeling environment. In addition, parametric geometries processed by other Grasshopper components and plug-ins can also serve as input to the RadVR plug-in. The RadVR plug-in is responsible for preparing the required data files for both the game engine and the simulation engine (Radiance) in two separate target directories. This method allows the user to interact with one unified input module on the 2D platform before transferring to the virtual immersive environment, and it can serve as a bridge between 3D modeling and parametric practices and the performance-analysis virtual reality system. Moreover, the plug-in provides a predefined material list from which the user can choose materials and modify the main parameters of each material by adding additional components to the Grasshopper pipeline.
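As an illustration, the sketch below shows the kind of Radiance material and geometry definitions such an exporter writes out; the file contents follow the standard Radiance scene description format, while the class, method and material names are hypothetical:

```csharp
using System.IO;

public static class RadExporter
{
    // Writes a generic Radiance "plastic" material (diffuse RGB reflectance,
    // specularity, roughness) and one polygon that references it.
    public static void WriteWall(string path)
    {
        using (var w = new StreamWriter(path))
        {
            // void plastic <name> 0 0 5 <R> <G> <B> <spec> <rough>
            w.WriteLine("void plastic wall_mat");
            w.WriteLine("0");
            w.WriteLine("0");
            w.WriteLine("5 0.5 0.5 0.5 0.0 0.0");
            w.WriteLine();

            // <material> polygon <name> 0 0 <3*n> followed by n xyz vertices
            w.WriteLine("wall_mat polygon wall_0");
            w.WriteLine("0");
            w.WriteLine("0");
            w.WriteLine("12");
            w.WriteLine("    0 0 0");
            w.WriteLine("    4 0 0");
            w.WriteLine("    4 0 3");
            w.WriteLine("    0 0 3");
        }
    }
}
```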

Figure 3: Changing the time of the year using virtual reality touch controllers. Pressing up/down modifies the month of the year, and pressing left/right modifies the hour of the day.

2.1.2 Octree Preparation

To prepare the input building for daylighting simulation within virtual reality, the system labels each instance of the input geometry with the corresponding material property and assigns a global sky condition to the scene. This information is stored in an Octree format, which contains a geometry reference, a material reference and a generated sky. The system uses the material information provided by DIVA for Rhino as benchmark properties for generic and specific materials. The material file follows the basic reflection model, contains transparent, translucent and opaque properties, and can be modified by advanced users of the system if needed. The system uses a clear sky model for its simulations.
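A minimal sketch of this preparation step is shown below, assuming the standard Radiance command-line tools gensky and oconv are on the path; the file names and class are hypothetical, and a complete sky file would also wrap the gensky output in a skyfunc glow/source definition:

```csharp
using System.Diagnostics;

public static class OctreeBuilder
{
    // month/day/hour: simulation time; latDeg: latitude; lonDegWest: longitude
    // in degrees west (the Radiance convention).
    public static void Build(int month, int day, double hour,
                             double latDeg, double lonDegWest)
    {
        // gensky produces a CIE sky description; +s selects a sunny (clear)
        // sky with a sun source, matching the clear sky model used here.
        Run($"gensky {month} {day} {hour} -a {latDeg} -o {lonDegWest} +s > sky.rad");

        // oconv freezes sky, materials and geometry into a single octree.
        Run("oconv sky.rad materials.rad geometry.rad > scene.oct");
    }

    static void Run(string command)
    {
        // Invoked through the native command console, as RadVR does for Radiance.
        var psi = new ProcessStartInfo("cmd.exe", "/C " + command)
        {
            CreateNoWindow = true,
            UseShellExecute = false
        };
        using (var p = Process.Start(psi)) { p.WaitForExit(); }
    }
}
```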

2.1.3 Radiance Integration

For ray-tracing-based daylight simulation, RadVR integrates Radiance [Ward_J._1994] as its calculation engine. Radiance is a validated daylighting simulation tool, developed by Greg Ward, that consists of a collection of console-based programs. The system uses rtrace, which simulates radiance or irradiance values at individual sensors. These sensors may form a grid over a work plane, or they may represent individual view directions for the pixels of an image. Instead of calculating color values for the output pixels of a scene, however, rtrace sensors can be distributed over a wide range of space, covering target locations with multiple directions in an efficient manner. This minimizes computation time by limiting ray-tracing calculations to specific targets and avoids computing a large image as a one-directional array. When a simulation is triggered in RadVR, a designated C# script communicates with rtrace through the native command console. The required input for each simulation is provided according to the virtual state of the user and the time of year defined in the RadVR GUI. The rtrace simulation runs as a background process, without the user seeing the simulation console. Once the simulation is complete, a virtual window notifies the user, and the scene is updated with the simulation visualization. The results are stored in memory and can later be parsed and visualized on demand.
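The exchange with rtrace can be sketched as follows - a simplified illustration rather than RadVR's actual script, with hypothetical class and file names. Each sensor line carries a position and a direction, and the returned RGB irradiance is converted to illuminance with the standard Radiance luminous-efficacy weighting:

```csharp
using System;
using System.Diagnostics;
using System.Globalization;

public static class RtraceRunner
{
    // sensorLines: one "x y z dx dy dz" entry per sensor.
    // Returns the illuminance in lux at each sensor.
    public static double[] RunIlluminance(string[] sensorLines, int ambientBounces)
    {
        // -I+ requests irradiance, -h suppresses the header,
        // -ab sets the ambient-bounce count; scene.oct is the compiled octree.
        var psi = new ProcessStartInfo("rtrace", $"-h -I+ -ab {ambientBounces} scene.oct")
        {
            RedirectStandardInput = true,
            RedirectStandardOutput = true,
            UseShellExecute = false,
            CreateNoWindow = true            // background process, no visible console
        };

        using (var p = Process.Start(psi))
        {
            foreach (var line in sensorLines)
                p.StandardInput.WriteLine(line);
            p.StandardInput.Close();

            var lux = new double[sensorLines.Length];
            for (int i = 0; i < lux.Length; i++)
            {
                // rtrace returns one "R G B" irradiance triplet (W/m^2) per sensor.
                var rgb = p.StandardOutput.ReadLine()
                    .Split(new[] { ' ', '\t' }, StringSplitOptions.RemoveEmptyEntries);
                double r = double.Parse(rgb[0], CultureInfo.InvariantCulture);
                double g = double.Parse(rgb[1], CultureInfo.InvariantCulture);
                double b = double.Parse(rgb[2], CultureInfo.InvariantCulture);

                // Standard Radiance conversion to illuminance: 179 lm/W efficacy.
                lux[i] = 179.0 * (0.265 * r + 0.670 * g + 0.065 * b);
            }
            p.WaitForExit();
            return lux;
        }
    }
}
```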

2.1.4 Game Engine Implementation

As discussed in Section 1.1, the ability to output high-frequency renderings in an efficient manner is the main objective of modern game engines. RadVR uses the Unity3D game engine and its libraries as its main development platform. Like many other game engines, Unity is not capable of real-time ray tracing for virtual reality applications and implements a variety of rasterization methods to output biased renderings. For material visualization, a library of Unity material files was manually developed that visually corresponds to the properties of the materials the user-designer pre-defined in the Grasshopper plug-in. These properties can be modified later in RadVR and are visually updated at runtime.

2.2 RadVR User Modules

2.2.1 Direct Sunlight Position Analysis

One important aspect of daylight analysis is understanding the relationship between time, sun location and building geometry. Hence, a control function over time that results in the correct movement of the sun in the sky, based on the building location (latitude and longitude), sheds direct sunlight on the building and thus renders visible the sun's geometric relation to the building volume and its direct-lighting effect throughout the year. In RadVR, an interactive 3D version of the stereographic sun path diagram was developed using the NOAA Sunrise/Sunset and Solar Position Calculators. These calculations are based on equations from Astronomical Algorithms by Jean Meeus [Meeus_1998]. Each arc represents a month of the year, and each analemma represents an hour of the day.

A C# script was implemented to translate the NOAA equations into functions that operate within the Unity3D environment. This script calculates the zenith and azimuth of the sun based on longitude, latitude and the time of year, and controls the position and rotation components of a directional light object in the VR environment. These inputs are accessible from the implemented GUI of the program, both through user interface menu options and through VR controller input.
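As an illustration of the underlying computation, the following simplified sketch derives the sun's elevation and azimuth from a declination/hour-angle approximation (Cooper's formula); the actual RadVR script implements the more precise NOAA/Meeus equations, and all names here are hypothetical. The resulting angles can then drive the rotation of the directional light object.

```csharp
using System;

public static class SunPosition
{
    // dayOfYear: 1..365, solarHour: local solar time in hours (0..24),
    // latDeg: site latitude in degrees. Returns elevation/azimuth in degrees.
    public static (double elev, double azim) Compute(int dayOfYear, double solarHour, double latDeg)
    {
        const double deg2rad = Math.PI / 180.0;

        // Approximate solar declination (Cooper's formula), in degrees.
        double decl = 23.45 * Math.Sin(deg2rad * 360.0 / 365.0 * (284 + dayOfYear));

        // Hour angle: 15 degrees per hour away from solar noon.
        double hourAngle = 15.0 * (solarHour - 12.0);

        // Elevation from the standard solar altitude formula.
        double sinElev = Math.Sin(deg2rad * latDeg) * Math.Sin(deg2rad * decl)
                       + Math.Cos(deg2rad * latDeg) * Math.Cos(deg2rad * decl)
                       * Math.Cos(deg2rad * hourAngle);
        double elev = Math.Asin(sinElev) / deg2rad;

        // Azimuth measured clockwise from north.
        double cosAzim = (Math.Sin(deg2rad * decl) - sinElev * Math.Sin(deg2rad * latDeg))
                       / (Math.Cos(Math.Asin(sinElev)) * Math.Cos(deg2rad * latDeg));
        cosAzim = Math.Max(-1.0, Math.Min(1.0, cosAzim));   // guard rounding errors
        double azim = Math.Acos(cosAzim) / deg2rad;
        if (hourAngle > 0) azim = 360.0 - azim;             // afternoon: western half

        return (elev, azim);
    }
}
```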

To avoid non-corresponding arcs throughout the months, the representative day of each month differs, as follows: January 21, February 18, March 20, May 21, June 21, July 21, August 23, September 22, October 22, November 21, December 22. In addition, the monthly arcs are color coded by season: the winter solstice (December 22 in the northern hemisphere, June 21 in the southern hemisphere) is visualized in blue, and the summer solstice arc (June 21 in the northern hemisphere, December 22 in the southern hemisphere) is color coded in orange. The monthly arcs in between correspond to a gradient of blue and orange based on their seasonality.

Figure 4: The 9 point-in-time matrix in RadVR. When each date in the matrix is chosen, the sun position instantly updates to construct the corresponding shadows and daylighting effects.

The observer location is set to the center eye point (the midpoint between the virtual left and right eyes). Hence, the sun location and diagram in RadVR correctly update based on the user's head location, whether as a result of turning the head in place or of virtual locomotion within the immersive environment. This feature also allows users to determine whether direct sun illumination is visible from the observer's specific point in space throughout the year: any portion of the sun-path diagram visible through the building openings surrounding the observation point indicates direct sunlight penetration at the corresponding times of the visible diagram.

The user controls the time of day through two different input methods. The first is using the VR controllers and changing the time by moving the joystick. Moving the joystick from left to right increases the time of day on a constant day of the year, and vice versa. This joystick input is designed to mimic the mainly horizontal movement of the sun from sunrise to sunset. In contrast, moving the joystick from down to up increases the day of the year at a constant time of day, resulting in movement along the corresponding analemma in the sun path diagram. The speed of the movement can be adjusted through the RadVR settings, allowing users to control their preferred sun-path movement for the intended daylight analysis.

Moreover, to adjust the time in hourly steps and avoid the smooth transition in minutes, a SnapTime function was implemented to assist user-designers in altering the time-of-day controls. This function also extends to the day of the year, with snaps happening on the 21st of each month only. SnapTime allows users to quickly and efficiently round the time of the year to hourly and monthly values for sunlight analysis.
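A minimal sketch of such a snapping helper (with hypothetical names), rounding to whole hours and snapping the date to the 21st of the current month:

```csharp
using System;

public static class SnapTimeUtil
{
    // Rounds a continuously varying time to the nearest whole hour and
    // snaps the day of the year to the 21st of its month.
    public static DateTime Snap(DateTime t)
    {
        int hour = (int)Math.Round(t.Hour + t.Minute / 60.0);
        hour = Math.Min(hour, 23);                 // avoid rolling past midnight
        return new DateTime(t.Year, t.Month, 21, hour, 0, 0);
    }
}
```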

The second input method is the immersive GUI and menu controls. The RadVR menu is the main portal of the graphical user interface, and the time of the year can be adjusted with the corresponding month, day, hour and minute sliders available in the Level 1 tab.

As the time of day and date of the year are changed by the user, lighting conditions and shadows are updated on the corresponding building model. In many cases, however, the user wants to locate the position of the sun relative to the building, but due to the specific geometry of the model, the sun is blocked by solid obstructions. To resolve this issue, a "Transparent Mode" function is implemented in the workflow, which adds a see-through effect to the model while the sun position is being changed. Solid and translucent materials are all replaced with a transparent material to achieve this effect.
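A minimal Unity sketch of such a material swap (class and field names hypothetical): the original materials are cached when the mode is enabled and restored when it is toggled off.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class TransparentMode : MonoBehaviour
{
    public Material seeThroughMaterial;   // assigned in the Inspector
    readonly Dictionary<Renderer, Material[]> saved = new Dictionary<Renderer, Material[]>();

    public void Enable()
    {
        foreach (var r in FindObjectsOfType<Renderer>())
        {
            saved[r] = r.sharedMaterials;             // remember the originals
            var swapped = new Material[r.sharedMaterials.Length];
            for (int i = 0; i < swapped.Length; i++)
                swapped[i] = seeThroughMaterial;      // every slot becomes see-through
            r.sharedMaterials = swapped;
        }
    }

    public void Disable()
    {
        foreach (var kv in saved)
            kv.Key.sharedMaterials = kv.Value;        // restore original materials
        saved.Clear();
    }
}
```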

2.2.2 The 9 Point-in-Time Matrix

In addition to the manual configuration of time, assessing a 9 point-in-time matrix is also a useful method in daylight studies. Analyzing the morning (9:00 am), noon (12:00 pm) and afternoon (3:00 pm) conditions for the solstices and equinoxes is a fast way to evaluate and compare typical sun movement throughout the year. User-designers access the RadVR version of the point-in-time matrix through the corresponding UI menu, which contains 9 captioned buttons representing the 9 points in time. Clicking a button updates the time in the surrounding environment, moving the sun position, shadows, etc. In contrast to the conventional 9 point-in-time matrix, where a single viewpoint of the building is rendered at 9 different times of the year to form a 3x3 matrix of rendered viewpoints in one frame, RadVR's 9 point-in-time matrix is a set of nine 360-degree 6DOF viewpoints that are individually accessed and updated through the 3x3 user interface shortcut. Therefore, these times can be evaluated in a much wider field of view covering all surroundings, not just one specific camera angle. This may result in a more comprehensive daylight comparison of the building's space, as user-designers can simultaneously identify the geometrical properties of daylight in multiple viewports of the building. However, not being able to view all renderings in one frame can be seen as a drawback compared to the conventional 9 point-in-time matrix. For an in-depth comparative analysis of the conventional 2D point-in-time matrix versus RadVR, see the user experiment results in Section 4.
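The mapping behind the 3x3 shortcut can be sketched as follows (hypothetical names; here the two solstices and one equinox day, taken from the representative-day list above, form the rows, with 9:00, 12:00 and 15:00 as the columns):

```csharp
using System;

public static class NinePointMatrix
{
    // Representative days (winter solstice, equinox, summer solstice) and the
    // three hours that form the matrix columns.
    static readonly (int month, int day)[] Dates = { (12, 22), (3, 20), (6, 21) };
    static readonly int[] Hours = { 9, 12, 15 };

    // Maps a clicked cell (row = date, column = hour) to the time to load.
    public static DateTime TimeForCell(int row, int col, int year)
        => new DateTime(year, Dates[row].month, Dates[row].day, Hours[col], 0, 0);
}
```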

2.2.3 Quantitative Simulations

While qualitative renderings of the daylit scene are produced directly by the game engine rendering pipeline, physically correct quantitative simulations of conventional daylighting metrics are achieved by triggering Radiance simulations through the front-end user interface of RadVR. By defining simulation settings such as the simulation type, sensor array resolution and ambient bounce count through user-centric interaction modules, the user can run, visualize, compare and navigate different types of daylighting simulations within the virtual immersive environment of RadVR. The different components of the quantitative simulation front-end modules are explained below.

A general approach for daylight simulation is to define an array of planar sensors with a shared vector direction and measure the illuminance values each sensor receives from the light sources in the scene. For daylighting simulation, the sun location and sky conditions define the lighting environment; therefore, the time of the desired simulation and its corresponding sky model are applied as input parameters of the simulation.

To construct the sensor arrays in RadVR, a floating transparent plane - the Simulation Plane - is instantiated when the user is in simulation mode. This Simulation Plane follows the user through the virtual space during all types of virtual locomotion (teleporting, touchpad walking, flying), allowing the user to place it based on their own position in space. The size and height of the Simulation Plane can be adjusted using the corresponding sliders. This interaction was designed to let the VR user locate the simulation sensors wherever they intend in the virtual environment, from a user-centric point of view. In contrast to conventional 3D modeling software, which uses bird's-eye views and orbiting transformations as its main navigation interactions, immersive experiences and their corresponding virtual viewpoints are most effective when designed around human-scale experiences and user-centric interactions. Therefore, instead of expecting the user to use flying locomotion and accurate point selection to construct the simulation plane, the plane automatically adjusts its position and height based on the user him- or herself.

Moreover, the spatial resolution of the sensor array - in other words, the distance between sensor points - can be adjusted by the user in both the X and Y directions. This property allows the user-designer to control the simulation time for various testing scenarios or to allocate different sensor resolutions to different locations in the space. If studying a certain area of the virtual space requires more resolution, the user can adjust the simulation plane size, height and sensor grid spacing accordingly, while modifying the same parameters for another simulation that can later be overlaid or visualized in the same virtual space.
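A minimal sketch of how such a sensor array could be serialized into rtrace input lines, given the plane's center, height, size and per-axis spacing (hypothetical names; sensors face upward, following the work-plane convention):

```csharp
using System.Collections.Generic;
using System.Globalization;

public static class SensorGrid
{
    // Generates "x y z dx dy dz" lines for rtrace: a horizontal grid centered
    // on (centerX, centerY) at the given height, with per-axis spacing.
    public static List<string> Build(float centerX, float centerY, float height,
                                     float sizeX, float sizeY,
                                     float spacingX, float spacingY)
    {
        var lines = new List<string>();
        for (float x = -sizeX / 2f; x <= sizeX / 2f; x += spacingX)
            for (float y = -sizeY / 2f; y <= sizeY / 2f; y += spacingY)
                lines.Add(string.Format(CultureInfo.InvariantCulture,
                    "{0} {1} {2} 0 0 1",          // direction (0,0,1): straight up
                    centerX + x, centerY + y, height));
        return lines;
    }
}
```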

In addition to sensor resolution, the ambient bounce count of the light-source rays is another important factor determining the accuracy of ray-tracing simulations. While the default value for RadVR simulations is 2 ambient bounces per simulation, this parameter can be increased through the corresponding UI slider to improve the accuracy of scene illumination. However, such an increase impacts processing time exponentially, a trade-off the user can adjust based on the objective of each simulation instance.

The time, and the corresponding sun location, of each simulation is based on the latest time settings controlled by the user in the RadVR runtime. By using the touchpad controller to navigate the month of the year and hour of the day, or by accessing any of the timestamps of the 9-point matrix, the user can set the time of year for the simulation. Moreover, longitude and latitude values can be accessed through the RadVR menu, allowing comparative analysis across different locations.

2.2.4 Visualization of Simulation Results

Figure 5: Visualization of a Daylight Factor (DF) simulation within RadVR. Values are plotted at the location of each sensor node. A three-color gradient palette is implemented, where blue is the minimum value, yellow the median, and red the maximum value. The range of the heatmap can be modified through the RadVR menus.

After the completion of the simulation, results are plotted on the corresponding simulation plane as a heatmap, with each sensor located at the center of a colored cell. RadVR implements a three-color gradient palette in which blue (RGB 0,0,255) represents the minimum value, yellow (RGB 255,255,0) the middle value, and red (RGB 255,0,0) the maximum value. For point-in-time illuminance simulations, the minimum and maximum values are extracted from the simulation results, whereas for Daylight Factor simulations the minimum and maximum bounds are set to 0 and 10, respectively, as default values. The user can later modify the minimum and maximum bounds of the visualization through the corresponding range slider in the RadVR simulation menu.
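The three-color mapping can be sketched as two linear interpolations over the normalized value (a minimal Unity sketch with hypothetical names):

```csharp
using UnityEngine;

public static class Heatmap
{
    // Maps a sensor value to the blue-yellow-red palette over [min, max].
    public static Color ValueToColor(float value, float min, float max)
    {
        float t = Mathf.InverseLerp(min, max, value);   // normalize and clamp to 0..1
        return t < 0.5f
            ? Color.Lerp(Color.blue, Color.yellow, t * 2f)          // lower half
            : Color.Lerp(Color.yellow, Color.red, (t - 0.5f) * 2f); // upper half
    }
}
```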

3 User Studies

The goal of the user experiments was to develop a comparative analysis of how general daylighting study activities in RadVR perform, in terms of user experience, compared to a conventional desktop tool - in our case DIVA for Rhino, which also uses Radiance as its simulation engine [Jakubiec_Reinhart_2011]. The study used as its initial basis the design work produced by architecture students studying daylighting during a graduate-level course - ARCH 240: Advanced Topics in Energy and Environment - at UC Berkeley's Department of Architecture. Sixteen students participated. The experiment was conducted in three phases: 1. departing from the architectural daylighting design task the students had previously completed in DIVA for Rhino, with the goal of achieving efficient daylight performance metrics; 2. conducting daylight analysis in RadVR; 3. completing an exit survey comparing the two tools on daylight analysis activities.

During their previous ARCH 240 assignment, students had been asked to design a 25m x 40m swimming pool facility in San Francisco with a variable building height. The goal of the design was to achieve a coherent and well-defined daylight concept for the building that addresses both the diffuse and the direct components of light. Students were instructed to consider relevant daylight strategies, including top lighting, side lighting, view out, relation to solar gains, borrowed light, and materials. The modeling of this design task took place in Rhino3D, with which all students had prior experience.

Students had subsequently used the DIVA for Rhino tool to assess and refine the daylighting strategies implemented in the design task phase. Daylight Factor analysis and 9 point-in-time matrix visualizations had been conducted in this phase and reported as part of the assignment deliverables. The simulation plane for the Daylight Factor analysis was positioned at 0.8 meters above the ground floor, with grid nodes of 60m x 60m and an ambient bounce count of 6.

For the experiments carried out in this work, the Rhino models that students had previously developed in their ARCH 240 assignments were imported into RadVR by the authors, using the designated RadVR Grasshopper component and selected manual configurations. Materials were added based on the design choices the students had made. Some models included more detailed textured materials, which students had allocated from a set of prefabricated materials.

Daylight analysis in RadVR was conducted in two sections. In the first, students aimed to understand the relationship between the sun, time and the building. Users initially navigated and inspected their designed buildings using the allocated locomotion functions; both teleportation and flying were used in this phase. To change the time of the year, students used the time controllers, producing the corresponding movement of the sun location and letting them observe the effect of daylight on their buildings. The change of time and sun location was also explored using the 9 point-in-time matrix.

In the second section, a Daylight Factor simulation of the building was executed through the Level 2 menu. The simulation plane was positioned 0.8 meters above the ground floor, with grid nodes of 1m x 1m. An ambient bounce count of 2 was chosen to reduce computation time. After the simulation, users navigated through the results while evaluating their building design and identifying the key elements affecting the simulation results. Visualization bins were accessed through the Level 2 menu to change the gradient (Section 2.2.4) and narrow its range down to a preferred domain. Before each section, a brief tutorial on how to use the software was given by the authors. Each user spent approximately 4 minutes in each of the two sections.

Upon completion, an exit survey was conducted to evaluate the user experience, mainly focused on comparing the performance of RadVR and DIVA for Rhino. The survey was divided into three parts: first, evaluating the understanding of the relationship between the sun, time and building, covering the activities in section 1 of the VR experiment; second, the user experience of navigating through simulation results in virtual reality, covering section 2 of the VR experiment; and third, an overall evaluation (comfort, learning curve) of RadVR compared to DIVA for Rhino.

Each question covered a specific activity in general daylight analysis, and students were asked to rate on a linear 5-point scale how the two tools compared on that activity. To prevent confusion, the words RadVR and DIVA for Rhino were colored differently, and the comparison adjectives (significantly, slightly, same) were displayed in bold.

Figure 6: User experiments with RadVR while performing Daylight Factor simulations on designed spaces.

4 Results

Figure 7.1 illustrates the survey results on understanding the relationship between time, the sun and the building. 82% of responses concluded that RadVR helped users accomplish the queried tasks better, 16% indicated that DIVA for Rhino performed better, and 2% indicated that both tools performed the same. The responses show that navigating the time of year and perceiving the sun location are significantly better supported in RadVR's VR environment than on a 2D screen.

As time navigation is achieved with a simple interaction on the VR controller joystick, many students found it efficient to slide through the hours of the day and the days of the year. Direct sunlight penetration changes smoothly throughout the day, and users were able to fully perceive the sunlight variation throughout the year in a short amount of time. Although it was not part of the user experiment, some subjects pointed out a lack of quality in the rendering of diffuse light, which is quite noticeable in the RadVR experience.

Figure 7: User surveys

In addition to navigating time from a specific point of view, the added ability to move around the building was found useful for understanding the daylight features and elements of buildings. Subjects commented on the misperception of scale when designing within 2D environments and on how the immersive VR experience provides this vital quality through an understanding of scale and of certain building elements. However, as locomotion was limited to basic teleporting, and the flying function operated at minimum speed to avoid nausea in potential users, some subjects acknowledged difficulty in moving around the virtual space compared to the established zoom and pan of 2D screens.

As mentioned in Section 2.2.2, the 9-point matrix in RadVR is not a grid of 9 rendered images of different times of the year, but a matrix of 9 buttons that, upon clicking, change the time of year of the surrounding environment. Comparison therefore only takes place when switching from one time to another. The responses support the hypothesis that comparison mostly happens between two daylighting conditions at a time, so that spatial and visual memory can carry the comparison. In addition, the fact that users are not limited to one view in space and can constantly move around and compare different instances of time was mentioned as an advantage over 2D still renderings.

Figure 7.2 shows the results of the second survey section, regarding simulations. The questions centered on understanding the relationship between Daylight Factor simulation results and the building. 82% of the answers generally preferred RadVR as a simulation visualization tool, 16% indicated that DIVA for Rhino performs better, and 2% indicated that both tools perform the same.

With the simulation plane located below eye level, subjects were observed to instantly locate over-lit or under-lit areas that fell outside the preferred 4-6% daylight factor range and virtually teleport towards them to inspect the building elements (side openings, skylights, etc.) that affected the results. In some cases, participants accessed the gradient-change feature from the Level 2 menu to change the default range (0%-10%) to custom values (for example, 2%-4%) in order to narrow down their objective results.

Figure 7.3 reports the usability experiences in this experiment. 74% of the subjects reported that RadVR was easier to learn than DIVA for Rhino. As mentioned in Section 2.2.1, the change-of-time and teleporting functions are designed with minimal interactions, requiring only moving or pressing the controller joystick. Other functions are accessed through immersive menus which, thanks to the wide field of view compared to 2D screens, can contain most of the needed GUI in a single window, so less effort is spent navigating menu hierarchies. During the experiment, if a subject asked how to perform a specific function, the authors assisted verbally while the subject had the headset on. It is important to note that user interface task performance was not measured in this experiment; only individual user experience feedback was recorded through the survey.

All responses indicated that RadVR was a more enjoyable experience than DIVA for Rhino. However, as 67% of subjects had never experienced 6DOF virtual reality before, enjoyment may have been triggered by the fact that many students were experiencing VR as a new and engaging platform. Many subjects seemed to enjoy walking and navigating in their building with smooth 6DOF technology. Some students were inspired by the added value of understanding scale, which they could obtain in such an experience. In addition, the real-time update of sunlight and shadows while navigating the time of day or year provided engaging moments for users during the experience.

5 Conclusion

This work introduces a 6DOF virtual reality daylight analysis tool, RadVR, for daylighting-based design and simulation, that allows qualitative immersive renderings to be analyzed simultaneously with quantitative, physically correct daylighting calculations. With a user-centric interaction design approach and an end-to-end workflow, RadVR allows users to 1) observe direct sunlight penetration through different hours of the year and navigate changes in sunlight patterns related to time, latitude and longitude, 2) interact with a 9-point matrix of illuminance calculations for the nine most representative times of the year, 3) simulate, visualize and compare Radiance ray-tracing simulations of point-in-time illuminance and daylight factor directly through the system, and 4) access various simulation settings for different analysis strategies through the front-end virtual reality user interface.

User experiments comparing the system with a conventional 2D-display daylight analysis tool, DIVA for Rhino, show that RadVR outperforms it in spatial understanding tasks, navigation and sun position analysis. In addition, users report that they could better identify which building elements impact simulation results than with the 2D-display analysis tool. Moreover, users found the system more comfortable to use, easier to learn and a better alternative as an effective daylighting teaching tool.

However, despite the spatial immersion and presence generated by the proposed tool, several limitations remain. Given the rasterization rendering pipeline of the system and the limited graphics power of current real-time rendering systems, many spatial qualities of the illuminated spaces cannot be captured, resulting in flat renderings and unrealistic qualitative outputs. This limitation is most visible when indirect lighting strategies are implemented, since the biased rendering methods used in game engines cannot capture large numbers of ray bounces in real time. Moreover, reading large-scale heatmap results from a human-scale point of view has proven difficult, with the visualized work plane usually set at 0.8 m and the eye height at 1.7 m. Users reported that this limitation was partly resolved when a flying locomotion mechanism was used, allowing them to observe results from a bird's-eye view; however, repositioning to the right point of view was time consuming compared to the fast orbit interactions of 3D modeling environments. Still, after identifying over-lit or under-lit areas through analysis of the simulation results, users were able to teleport to the exact location and investigate which element had caused the underperformance from a closer, human-centric point of view.

Future work on the development of this tool falls into three main categories. First, improving qualitative graphics by implementing state-of-the-art rendering shaders, specifically for each generic building material, to achieve partial ambient bounces in the scene, and by updating the current rendering pipeline with recent GPU-based real-time solutions to achieve improved global illumination. Second, additional daylighting metrics and climate-based simulations can be implemented in the system; given the console-based architecture, other building performance simulations, such as energy and CFD simulations, can also be integrated in the future. Third, improving data visualization by exploring data representation formats that align with 3D immersive spaces. Such an approach can take advantage of visual properties such as stereoscopic depth, gaze and color maps to enhance user comprehension of simulated data. Moreover, visualization strategies can be further explored to help users maintain a connection between the quantitative values from the simulation engine and the qualitative visual outputs from the VR rendering engine.

References