VISUALIZING ARCHITECTURAL LIGHTING:
Creating and Reviewing Workflows Based on Virtual Reality Platforms
By
Hang Li
Presented to the
FACULTY OF THE
SCHOOL OF ARCHITECTURE
UNIVERSITY OF SOUTHERN CALIFORNIA
In partial fulfillment of the
Requirements for the Degree
MASTER OF BUILDING SCIENCE
August 2017
COMMITTEE
Karen M. Kensek, LEED AP BD+C, Assoc. AIA
Associate Professor of the Practice of Architecture
USC School of Architecture
kensek@usc.edu
(213)740-2081
Marc E. Schiler, FASES
Professor
USC School of Architecture
marcs@usc.edu
(213)740-4591
Douglas Noble, FAIA, Ph.D.
Associate Professor
USC School of Architecture
dnoble@usc.edu
(213)740-2723
ACKNOWLEDGEMENTS
I would first like to express my deepest gratitude to my committee chair, Professor Karen Kensek, who offered
excellent guidance on my project. She provided clear logic, creative ideas, and detailed comments, guiding me in the
right direction. Her positive attitude toward work, patience with details, and well-organized time management will
inspire me for my whole life.
Besides my thesis chair, I would like to thank the rest of my committee, Professor Marc Schiler and Professor
Douglas Noble. They helped me greatly in improving the project and the written document and provided me with great
support and encouragement. I am very grateful for their life guidance and advice on my future career.
I would also like to thank the whole MBS family. It is you who made my life here enjoyable and colorful. I am
proud of our group and all our achievements.
Finally, thank you, my dear Mom and Dad. Thank you for providing moral and emotional support throughout my
life. I could not have finished this thesis without you. I am also grateful to my other family members and friends who have
supported me along the way.
ABSTRACT
Virtual reality (VR) can be used to create interactive, immersive experiences, for example, for visualizing the world
with Google Earth VR and participating in VR computer games. It is also being used in other industries that rely on
three dimensional models for communication. Architecture is one of the professions that requires interaction between
the real world and the digital domain. Some architects are trying to introduce VR into their daily workflow. However,
there are many programs and platforms for designers to choose from with different strengths and weaknesses. Lighting
designers can also benefit from the immersive and interactive features of VR both in communicating with clients and
using VR tools in their own design process. VR technology could reduce the workload and make presentations more
efficient and interesting. VR could also allow clients to change lighting fixtures and dim the lights interactively
while experiencing a real-time walkthrough. Different workflows for integrating VR into the lighting designer's
design process should be based on the ultimate use intended and issues of accuracy (e.g. illuminance), type of
visualization (illumination values, false color, rendered view, etc.), the level of interactivity needed (stationary viewer,
full movement in the space, ability to interact with objects and light sources), compatibility with other software, and
cost.
Eight workflows that incorporated virtual reality outcomes were developed for lighting design. These were evaluated
based on visualization, compatibility, accuracy, and interactivity through surveys and case studies, and then two new
workflows were developed based on these.
Lighting simulation software programs (AGI32, Autodesk Revit, 3ds MAX, DIALux, RADIANCE from Ecotect, and
RELux Vision) were first assessed to determine which of them might be part of the final workflow. Previous research has
indicated differences in simulated results between lighting simulation programs, but for VR use, the illuminance results
from each of the programs appear identical within an accepted range. However, on other issues such as
compatibility, user interface, and visualization, the programs differ from each other. Based on the evaluation
results, 3ds MAX is better for rendering alone, while AGi32 and Grasshopper-based Radiance in Rhino have
better simulation data visualization overall, including false color images, renderings, and illumination level calculation.
In terms of interactivity, Unreal Engine 4, Unity 3D, and Autodesk Stingray have similar potential for VR
editing and scene visualization. However, Unreal Engine 4 has a more compatible platform for development. Fuzor
was found to have many of the desired features; it was incorporated into one of the workflows and used as the
benchmark for comparing the other eight workflows.
One modified phone-based workflow and one computer-based workflow were proposed for lighting designers to
achieve illuminance value visualization or false color visualization in a VR mode, and a real project was used as a
reference model for evaluation.
Hypothesis
AGI32, Revit Dynamo, Revit to vCAD is a better phone-based workflow than the others tested, and AGI32, Revit
Dynamo, Revit, 3ds MAX to Unreal Engine 4 is a better computer-based workflow for lighting designers to visualize
interactive lighting simulation, in terms of cost, interactivity, compatibility, visualization, and accuracy, than the other
selected workflows: AGI32-Mobile VR Station, Elumtools-Mobile VR Station, 3ds MAX-Cardboard
Camera, Revit-vCAD, Revit-Fuzor (phone-based), Revit-Fuzor, Revit-Enscape, and 3ds MAX-Unreal Engine 4.
CONTENTS
ACKNOWLEDGEMENTS
ABSTRACT
CHAPTER 1: INTRODUCTION
1.1 An Introduction to VR
1.1.1 Virtual reality
1.1.2 Virtual reality applications in architecture and design
1.2 An Introduction to Architecture Lighting Design
1.2.1 Typical architectural lighting design
1.2.2 VR aided design
1.3 Thesis Development
1.3.1 Significance of study
1.3.2 Abstract, hypothesis, and research objectives
1.3.3 Chapter structure
1.4 Chapter Summary
CHAPTER 2: BACKGROUND OF VIRTUAL REALITY IN LIGHTING INDUSTRY
2.1 Typical Lighting Design Workflows
2.1.1 Programming phase
2.1.2 Schematic design phase
2.1.3 Design development phase
2.1.4 Contract documents phase
2.1.5 Bidding and negotiation phase
2.1.6 Construction phase
2.1.7 Post-occupancy evaluation phase
2.2 Lighting Simulation
2.2.1 Lighting simulation history
2.2.2 Lighting simulation algorithms
2.2.3 Lighting simulation programs
2.2.4 Visualization of lighting simulation
2.2.5 Accuracy evaluation methods
2.3 VR Platform
2.3.1 Computer-based VR software
2.3.2 Computer-based VR hardware
2.3.3 Phone-based VR software
2.3.4 Phone-based VR hardware
2.4 Chapter Summary
CHAPTER 3: METHODOLOGY
3.1 Describe the Workflows
3.1.1 General lighting simulation VR workflow
3.1.2 Specific workflows
3.1.3 Workflow summary
3.2 Evaluate the Workflows: Criteria and Methods
3.2.1 Evaluation criteria
3.2.2 Evaluation
3.3 Discussion of Workflow Development
3.4 Chapter Summary
CHAPTER 4: VR SIMULATION RESULTS CASE STUDY
4.1 Case Study Reference Building
4.2 VR Workflows Results
4.2.1 AGi32-Mobile VR Station
4.2.2 Elumtools-Mobile VR Station
4.2.3 Autodesk 3ds Max-Mobile VR Station
4.2.4 Autodesk Revit-vCAD
4.2.5 Revit-Fuzor phone based
4.2.6 Revit-Fuzor computer based
4.2.7 Revit-Enscape
4.2.8 3ds MAX-Unreal Engine 4
4.3 Chapter Summary
CHAPTER 5: RESULTS ANALYSIS & DEVELOPING NEW WORKFLOWS
5.1 Workflows Evaluation
5.1.1 Cost evaluation
5.1.2 Interactivity evaluation
5.1.3 VR visualization evaluation
5.1.4 Compatibility evaluation
5.1.5 Benchmarking accuracy evaluation
5.1.6 User based evaluation
5.1.7 Design based evaluation
5.2 Develop New Workflows
5.2.1 Phone-based VR workflow platforms selection
5.2.2 Phone-based VR workflow development
5.2.3 Computer-based VR workflow platforms selection
5.2.4 Computer-based VR workflow development
5.2.5 Workflows evaluation
5.3 Chapter Summary
CHAPTER 6: CONCLUSIONS AND FUTURE WORK
6.1 VR Aided Lighting Design
6.2 Evaluation Metrics
6.3 Workflow Evaluation Summary
6.3.1 Cost
6.3.2 Interactivity
6.3.3 Visualization
6.3.4 Compatibility
6.3.5 Accuracy to the benchmark
6.4 Proposed New Workflows
6.5 Future Work
6.5.1 Evaluation methods
6.5.2 Software improvement
6.5.3 Improve the VR environment
6.5.4 Workflow development
6.5.5 Hardware improvement
6.6 Conclusion
REFERENCE
BIBLIOGRAPHY
CHAPTER 1: INTRODUCTION
Lighting, both natural lighting and electrical lighting, is a very important part of architecture. People involved in the
lighting industry have many chances to present lighting design renderings, new lighting products, and lighting control
systems to owners, architects, and contractors. Preparing this presentation material is very time-consuming:
representatives have to carry heavy lighting fixtures to each company to introduce their new products, and
architects or lighting designers have to reformat renderings of each perspective of the building to show the visual
effects of the space. Virtual reality (VR) technologies could help designers design and present new projects
more efficiently. This chapter will introduce the history and terminology related to VR and
demonstrate how VR could benefit different users.
1.1 An Introduction to VR
Virtual reality can be applied to architectural lighting design, and lighting designers need to better understand what
VR is and how it applies within the workflow of their design process.
1.1.1 Virtual reality
Virtual reality is a three-dimensional immersive simulated environment that can be interacted with by a user (Sherman,
2003).
The earliest version of virtual reality is the 360-degree mural (or panoramic painting) of the nineteenth century,
which was intended to fill the viewer's entire field of vision, creating a feeling of immersion in the event or
scene (VRS, 2017). A large panoramic painting at the Gettysburg Civil War Museum curves around the viewers; it
has recently been updated with 3D dioramas to further the illusion of reality. Panorama is a term describing a
wide-angle view of anything, usually presented in paintings, drawings, photos, or films. The oldest panorama found
is a painting in Pompeii dated as early as 20 A.D. (Grau, 2003). The history of the panoramic photograph is almost
as long as that of photography itself. In the beginning, photographers wanted to show city scenes and skylines, which
is difficult to achieve with a single view from a normal camera, so they made panoramas by processing
photos and placing a series of images next to each other. With the development of photographic
technology, more and more specialized panoramic cameras were invented, from the 1840s
(Grau, 2003) to the panoramic settings on digital cameras today.
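The early practice of assembling a panorama by placing a series of views next to each other can be sketched in a few lines. This is only an illustrative toy, with images represented as 2D lists of pixel values; real stitching software additionally aligns overlapping regions and blends the seams.

```python
def naive_panorama(views):
    """Concatenate equally sized views side by side, the way early
    photographers assembled panoramas from a series of prints.
    Each view is a 2D list (rows of pixel values); all views must
    share the same height.  No alignment or blending is done."""
    height = len(views[0])
    # Join row y of every view into one long row of the panorama.
    return [sum((view[row] for view in views), []) for row in range(height)]
```

For example, two 2x2 views joined this way yield one 2x4 panorama.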
In 1838, Charles Wheatstone's research demonstrated that the two eyes perceive different two-dimensional
images, which the brain processes into a three-dimensional scene (Bowers, 2001). Stereoscopy is the term
describing three-dimensional vision processed by the brain when two slightly different views are perceived by two
eyes (Figure 1). It is also defined as a technique to establish a three-dimensional effect creating an illusion of depth
(Liddell and Scott, 1889). The principle of stereoscopy is to present two offset images separately to the left and right
eye of the viewer (Figure 2). There are many ways to achieve this goal such as shutter systems, polarization systems,
interference filter systems, and color anaglyph systems (often red and blue) with anaglyph glasses (Figure 3).
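The color-anaglyph approach can be illustrated with a short sketch (a hypothetical example, with images as 2D lists of (r, g, b) tuples): the red channel is taken from the left-eye view and the green and blue channels from the right-eye view, so red/cyan glasses deliver each offset image to the matching eye.

```python
def make_anaglyph(left, right):
    """Combine two offset views (same size) into a red-cyan anaglyph."""
    # Red channel from the left-eye view; green and blue (cyan) from
    # the right-eye view.  Anaglyph glasses then filter the combined
    # image so each eye sees only its own offset view.
    return [
        [
            (left[y][x][0], right[y][x][1], right[y][x][2])
            for x in range(len(left[0]))
        ]
        for y in range(len(left))
    ]
```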
Figure 1 Different views perceived by two eyes (Reeve and Flock, 2010)
Figure 2 Present two offset images to left and right eyes Figure 3 Color anaglyph systems with anaglyph glasses
In 1849, the lenticular stereoscope was invented by David Brewster (Werge, 1890). Later, the popular View-Master
stereoscope (Figure 4) was developed for virtual tourism by William Gruber (Gruber, 2015).
Figure 4 Gruber Viewmaster stereoscope (McKAY, 1953)
In 1929, the Link Trainer, the first flight simulator, was created, providing a safer way to train pilots (Kelly, 1970).
The Link Trainer was a non-digital version of virtual reality that included physical instruments and a radio system for
determining an airplane's position. In the 1930s, virtual reality proper was predicted by a science fiction story, which
contained the idea of goggles and a walkthrough experience in a fictional world (Weinbaum, 2007). In the mid-1950s,
Sensorama, a virtual reality device featuring a stereoscopic display, stereo speakers, smell generators, and a
vibrating chair, was built (Pimentel & Teixeira, 1993). In 1960, the Telesphere Mask, the first head-mounted display
(HMD), was invented by Morton Heilig (Heilig, 1960). In 1961, the first motion-tracking head-mounted display was
developed by two Philco Corporation engineers (Comeau & Bryan), incorporating a video screen for each eye and a
magnetic tracking system linked to a closed-circuit camera (Kiyokawa, 2007). In 1965, the concept of the "Ultimate
Display" was described by Ivan Sutherland; it included a virtual world viewed through an HMD, computer software for
creating the virtual world in real time, and the ability for users to interact with virtual objects (Sutherland, 1963). In
1987, the term virtual reality was coined by Jaron Lanier, founder of the visual programming lab (VPL Research),
which was started in the corner of Lanier's cottage (Barbara, 2000). A range of virtual reality gear was developed and
sold at that time.
Recent VR systems can be categorized as smartphone-based VR and computer-based VR (Brockwell, 2016). The
beginning of the 21st century has seen rapid development of virtual reality because of advanced computer technology
(fast processors) together with small, powerful smartphones. Companies like Google have released
products such as Google Cardboard (Figure 5), a very simple headset driven by a smartphone. With digital imaging
software, it is very easy for users to merge images into a seamless panorama. People can use smartphones to take
panoramic photos and view the photos in an interactive way. Using the Cardboard Camera app, a 360-degree view of
the studio space in the USC School of Architecture was easily captured and viewed with Google Cardboard (Figure 6).
Figure 5 Google Cardboard Figure 6 A 360 view of the studio space in USC School of Architecture
The high-tech version of the virtual reality headset uses electronic goggles, such as the Oculus Rift, which has a
stereoscopic OLED (organic light-emitting diode) display and a positional tracking system. An OLED display is made
with organic compounds that emit light in response to an electric current. Unlike an LED (light-emitting diode) display,
each pixel on an OLED display lights itself up independently of the others, and when a pixel is shut off, it is completely
off. A positional tracking system includes a stationary infrared sensor that defines a 3D space for users
to walk around in. The video game industry has continued to push the development of virtual reality technology,
together with other industries such as the architecture, engineering, and construction (AEC) field. With the benefit
of better headsets, people in the architectural industry are exploring the use of VR in the process of design,
presentation, and management. The most important elements defining current VR technology are 3D stereo (usually
showing a separate image to each eye), walkthrough navigation, and motion tracking.
Walkthrough navigation is a VR characteristic allowing people to move through a viewport or a continuous animation
and navigate a virtual 3D space (Autodesk, 2017). A walkthrough usually comes with a navigation cursor showing
directional arrows (forward, back, left, and right), sometimes replaced by the keyboard commands W, S, A, and D.
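The conventional W, S, A, D bindings amount to a small mapping from keys to movement directions in the ground plane. A minimal sketch follows; the axis convention here is an assumption, since game engines differ.

```python
# W/S move along the view axis, A/D strafe sideways (assumed axes:
# +y is forward, +x is to the viewer's right).
WASD = {
    "W": (0, 1),   # forward
    "S": (0, -1),  # back
    "A": (-1, 0),  # strafe left
    "D": (1, 0),   # strafe right
}

def step(position, key):
    """Return the new (x, y) position after one key press."""
    dx, dy = WASD[key]
    return (position[0] + dx, position[1] + dy)
```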
Motion tracking is the process of transmitting people's positions using sensors (VRS, 2017). Motion tracking
technology has been widely used to help people receive real-time feedback when they move their heads or move around
(Morris, 1973). Motion tracking systems can be broken down into optical and non-optical systems. Optical methods
usually use two or more cameras to record an object's movement; the object being tracked is typically equipped with
highly reflective markers (VRS, 2017). Non-optical methods are sensor based, with the processed motion
data transmitted directly to the computer. The most popular VR equipment, such as the HTC Vive, Oculus Touch, and
SteamVR controller, all use non-optical motion tracking (Roetenberg, 2013).
Over the development history of virtual reality, the types of VR can be summarized as still stereo images,
still panorama images, still stereo panorama images, interactive panorama images, interactive stereo panorama images,
walkthrough animations, stereo walkthrough animations, modifiable walkthrough animations, and modifiable stereo
walkthrough animations. Among these, motion-tracking stereo panoramas, motion-tracking walkthrough animations,
and modifiable walkthrough animations are the main scope of the thesis (Table 1).
Table 1 VR types and thesis scope

                   Still   Motion-tracking   Walkthrough   Non-walkthrough   Modifiable   Non-modifiable
Stereo image       √       ----              ----          ----              ----
Panorama           ----    ----              ----          ----
Stereo panorama    √       ----              ----          ----              ----
Animation          ----    √                 √             √
Stereo animation   ----    √                 √             √
Interactivity in virtual reality can be defined as how well people receive feedback from the virtual world and, in the
best case, also change things within the virtual world directly. Interactivity depends on many factors such as interaction
speed, virtual outcomes resulting from user actions, and stimulation of the natural senses. The main keys to evaluating
interactivity are the possibility of navigation, environment modification, and motion tracking. For example, in
the architectural industry, a still stereo rendering could be the least interactive form of VR (Figure 7), because there
is no way to interact with the image, for example by walking to a different location. People can get a 360-degree
overview through an interactive panorama image, which is better than a still panorama image in terms of interactivity
because of its motion tracking ability. Another type of VR is an interactive walkthrough model with the ability
to modify the scene. One method combines both: a pre-set animation through a 3D model of an airport was developed,
but the user could also look around in 360 degrees at any location along the path (Knudsen, 2017).
Figure 7 A spherical image
Four classes of interactivity can be defined: observing, looking around, acting in, and communicating with the virtual
world. From least to most interactive, these can be ordered as follows: still stereo image (observing),
interactive stereo panorama (looking around), motion-tracking walkthrough (looking around + acting in), and scene-
modifiable motion-tracking walkthrough (acting and feedback).
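This ordering can be expressed as an ordered enumeration; a minimal sketch (the names are illustrative, not drawn from any VR API):

```python
from enum import IntEnum

class Interactivity(IntEnum):
    """Four classes of VR interactivity, least to most interactive."""
    OBSERVING = 1       # still stereo image
    LOOKING_AROUND = 2  # interactive stereo panorama
    ACTING_IN = 3       # motion-tracking walkthrough
    COMMUNICATING = 4   # scene-modifiable motion-tracking walkthrough
```

Because IntEnum members compare as integers, the classes can be ranked directly (e.g. `Interactivity.OBSERVING < Interactivity.ACTING_IN`).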
1.1.2 Virtual reality applications in architecture and design
The Marxent company has been trying to transform the 3D format into more flexible AR (augmented reality) or VR
(Wire, 2016). It is much easier for users to view a project through YouTube 360 or a VR headset, as well as to share
and manage data. More and more companies are interested in transforming paper-based business brochures into
the immersive formats of AR or VR. Some well-known companies like Simmons Bedding Company and Azek have
used VisualCommerce to demonstrate virtual scenes to consumers. Virtual reality represents the real world in a
simulated one, while augmented reality presents the real-time world together with visualized objects and environments,
which can be either virtual or real (Jonathan, 1992).
• Architects
Architects can benefit greatly from VR, especially from the immersive experience and 3D depth perception, which provide
designers with a real feeling of the space. Although 3D renderings can mimic the appearance of a project,
people still need to process the images and imagine the real scale and spatial effect. VR, however, brings architects to a new level
of viewing a project by allowing them to walk through a 1:1 scale virtual world and to modify details. BNIM has
already incorporated VR technology in its project presentations (Figure 8). The project shown is a six-story
collaborative office hub with retail and restaurant suites on the ground floor, targeting net zero energy and LEED Platinum.
BNIM also provides a still version that can be viewed through Google Cardboard (BNIM website, 2017). VR
technology also gives architects a competitive advantage: it shows clients a better visual impression of the design and
better interaction with it, which may improve the chance of winning the commission.
Figure 8 VR presentation (BNIM website, 2017)
• Structural engineer
Incorporating Revit, Dynamo, and Enscape, Sgambelluri of John A. Martin & Associates (JAMA) Structural
Engineers (Autodesk, 2016) has visualized structures and structural details through a VR platform (Figure 9).
Another example from JAMA uses an avatar in a VR environment, walking along the structural beams of a roof to
find gaps in the 3D model.
Figure 9 VR presentation for structural engineers (Sgambelluri, 2016)
• Urban design
Virtual reality is also extremely useful for cultural heritage and urban planning (Ceconello, 2008). A model of Piazza
Cordusio located in the historical center of Milan was created to evaluate the re-design of the site (Figure 10).
Figure 10 VR for urban planning (Ceconello, 2008)
• Construction
McCarthy Building Companies has been designing and constructing hospitals, laboratories, and education facilities
for over 150 years. It is using virtual reality to improve its design and building process. At its office in Roseville,
Calif., McCarthy has been using VR technology for several years. In 2012, McCarthy built its own building
information modeling (BIM) Cave, which uses projection technology and 3D glasses allowing multiple users to see
what a hospital room or office space will look like (Gaudiosi, 2015). Making any type of changes once a building is
under construction is extremely expensive and time-consuming. By using VR, McCarthy offers clients the ability to
make changes for free well before actual construction has begun. The company currently uses a very powerful gaming
computer with an Oculus VR headset and an Xbox controller. McCarthy Building Companies employs virtual
reality to construct hospitals (Figure 11) and tries to integrate AR in the construction process (McCarthy Building
Companies, 2015). Even the use of scaled mock-up models of buildings is being replaced by VR (Gaudiosi, 2015).
Figure 11 VR for construction (McCarthy Building Companies, 2015)
• Lighting manufacturer representatives
Like all product representatives, lighting manufacturer representatives have to carry heavy product samples to
architectural firms, lighting design firms, and electrical engineering firms for presentations. It is therefore of great
help to set up a virtual showroom so that representatives can present their products through VR. Delta Light Company has
created a web-based VR showroom to present its products (Figure 12) (Delta Light VR website, 2017). Instead of
bringing the whole family of products, representatives can take only one typical physical product to show the
material and quality. Because the VR presentation closely matches the physical products, customers
can see all the details of the whole product series and easily experience the outcome of different fixtures.
Figure 12 VR for lighting agencies (Delta Light VR website, 2017)
• Clients/owners
VR is very beneficial to owners. Instead of reviewing all the sophisticated 2D drawings and matching each space to
3D renderings through imagination, owners can interactively explore a project in VR. Owners can easily look around
and check all the details, as well as change schemes and even adjust the lighting controls. With the help of
VR technology, owners can easily point out issues and give specific feedback to the designers. Corgan, a
leading architecture and design firm, has invested in and developed VR technology that allows clients to be fully
immersed in design projects (Figures 13-14) (Corgan website, 2017).
Figure 13 Corgan VR (Corgan website, 2017) Figure 14 Corgan VR (Corgan website, 2017)
1.2 An Introduction to Architectural Lighting Design
Architectural lighting design is a process that involves many opportunities for visual presentation throughout each of
the design phases. Schematic design, design development, and construction documentation are the three main
phases that involve the most visualization and design modification. It is important to understand what a designer's
work is in each stage and how VR technology could assist the design.
1.2.1 Typical architectural lighting design
Architectural lighting concerns both natural and electrical lighting, combining architecture design, interior
design, and electrical engineering (Julian, 1983). The scope of architectural lighting design includes targeting the
required light level based on activities within the space, achieving aesthetic requirements by using appropriate colors
and brightness, and creating visual interest for the space (Fontenelle, 2008).
The typical workflow for lighting designers includes three main phases: schematic design, design development, and
construction documentation. Chapter 2.1 provides a more detailed description of the lighting designer workflow.
1. During the schematic design phase, lighting designers should present one or more relatively complete
concepts and schemes (Figure 15). During this time, lighting designers will make basic mock-ups or
modify computer models received from architects, preparing for rendering and lighting simulation.
Lighting designers will also identify potential fixtures and show basic lighting calculation results,
demonstrating code compliance and fulfilling client requirements (Steffy, 2002).
Figure 15 Christ Cathedral schematic design (Francis Krahe & Associates, 2014)
2. In the design development stage, lighting designers will receive feedback from architects or clients
(Figure 16). According to the review comments, they should provide reasonable recommendations and
appropriate solutions. Lighting designers will also coordinate with other building systems, such as MEP
systems, furniture systems, and other architectural features, and make related modifications (University
of Colorado Boulder Facilities Management website, 2017).
Figure 16 Harvard University district energy facility design development (Francis Krahe & Associates, 2017)
3. During construction documentation, all the design work should be finalized (Figure 17). It is not unusual
for clients to still adjust the design in this stage, so lighting designers should leave
flexibility for adjustments. Lighting designers should also submit all the drawing layouts together with
lighting calculation results and specifications for lighting fixtures (AIA website, 2017).
Figure 17 Harvard University district energy facility construction document (Francis Krahe & Associates, 2017)
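The basic lighting calculation mentioned in the schematic design step is often first estimated with the lumen (zonal-cavity) method before detailed software simulation. A minimal sketch of that average-illuminance estimate follows; the function name and all numeric values are illustrative assumptions, not figures from this thesis:

```python
# Lumen-method estimate of average maintained illuminance on the workplane.
# All numeric values below are illustrative assumptions.

def average_illuminance(num_fixtures, lumens_per_fixture,
                        utilization_factor, maintenance_factor, area_m2):
    """E = (N * F * UF * MF) / A, in lux (lumens per square meter)."""
    return (num_fixtures * lumens_per_fixture *
            utilization_factor * maintenance_factor) / area_m2

# Example: 12 luminaires at 4000 lm each lighting a 100 m^2 open office.
E = average_illuminance(12, 4000, 0.6, 0.8, 100.0)
print(E)  # -> 230.4 (lux)
```

A real schematic-design check would then compare this value against the code-required level for the space's activity.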
1.2.2 VR aided design
Computer-aided architectural design (CAAD) is common in the architectural industry. Almost all drawing layouts and
submittals are generated using computers, and even the AIA examination is supported by computers (Qaqish, 2001).
Under these circumstances, virtual reality aided design (VRAD) can be defined as computer-aided design using the
methods of virtual reality (Donath and Regenbrecht, 1996). VRAD is a newly emerging way for designers to evaluate
and present their projects. They can change the components in a project efficiently and compare the differences
in effect. Some lighting manufacturers have already applied VR on their websites to help clients
experience the design with their products (DELTALIGHT, 2017).
As explained earlier, the typical workflow for lighting designers includes three main phases: schematic design, design
development, and construction documentation. Each of these phases can incorporate virtual reality.
1. During the schematic design phase, a lighting designer usually views rendering results as 2D images
or a simple 3D model, which provides a basic impression of the project. Lighting designers
usually select lighting fixtures based on their knowledge and working experience. For complicated spaces
with sophisticated geometry, designers should follow the simulation results from lighting software to
identify the fixtures. It is time consuming to get an updated, processed, realistic rendering. With the help of
VR technology, designers can modify the fixtures and walk through the room as well as check the
illuminance level (Figure 18). In VR mode, it is also efficient for lighting designers to compare the outcomes
of different options such as mounting types, mounting heights, and fixture types. Also, in VR mode,
designers may control the light level or change the fixtures interactively with a one-to-one scale, fully immersive
experience. This can help them to modify or adjust the fixtures more efficiently. There is a lot of
interaction between designers and clients during the design process. In the scheme-selection meeting,
designers can show all the potential options to clients through VR, helping them to establish a real feeling
of the project.
Figure 18 Lighting schematic design
2. In the design development stage, it is easier for lighting designers to discuss design details with
architects and owners on VR platforms than to communicate through traditional 2D drawings or 3D
rendered models. For architects and owners, VR provides a flexible method for reviewing the
design work. Instead of marking up the 2D drawings, architects can export the 3D scenes and make
the mark-ups on them (Figures 19-20). When lighting designers receive the feedback from architects and
owners, they can correct mistakes or coordinate with the new design proposal directly through VR
platforms.
Figure 19 VR PC mark-up (Fuzor, 2017) Figure 20 VR mobile mark-up (Fuzor, 2017)
During meetings with architects and interior designers, lighting designers can present the project
interactively and deliver detailed information more efficiently through a VR platform. Design firms like
Decorilla have employed VR to communicate with clients (Figure 21) (Decorilla, 2017). The firm is an online-
only business. After receiving the dimensions of the room from clients, designers send back a digital
proposed design model. The client can view the furnished room through a VR application on a smartphone.
Figure 21 Design development (Decorilla, 2017)
3. During the construction documents phase, it is convenient for lighting designers to examine the
detailed 3D models and 2D drawings. For example, the Harvard University district energy
facility project is filled with different types of mechanical, electrical, and plumbing systems (Figure
22). There are so many huge machines and pipes throughout the space that it is hard for a lighting designer
to determine whether a fixture is located in the correct place – for example, whether a fixture sits
inside a pipe or conflicts with the machines.
Figure 22 Construction document (Francis Krahe & Associates, 2017)
However, through VR platforms, lighting designers can walk into the project at real scale and check
for all the potential mistakes.
1.3 Thesis Development
This section explains the significance of the study; presents the abstract, hypothesis, and research objectives; defines the
scope of work; and summarizes the contents of the thesis chapters.
1.3.1 Significance of study
In the architectural industry, virtual reality technology has high potential to change architecture and help people
better visualize and experience designs and spaces. In the lighting design field, it is helpful for designers to know
how to get a project visualized on a VR platform. Through step-by-step descriptions, it is easier for lighting designers
to understand the general workflow of virtual reality aided design as well as each of the specific VR workflows. Based
on the comparison of the selected programs, it is meaningful to provide guidance to help designers choose software
programs based on specific goals. A VR design methodology is proposed (Figure 23) for both smartphones and desktop
computers, which helps lighting designers to better examine and modify the design in a more efficient way.
Figure 23 Proposed lighting design method
With the two newly proposed workflows, lighting designers can visualize illumination levels as well as photorealistic
renderings, which helps to establish a complete loop for VR aided lighting design (Figure xx).
1.3.2 Abstract, hypothesis, and research objectives.
The abstract describes the scope and goal of the research. The hypothesis is validated through a case study together
with simulation results from selected programs. Besides the evaluation of the programs, a new workflow was created
to help lighting designers better visualize the design project on a VR platform.
Abstract
Virtual reality (VR) can be used to create interactive, immersive experiences, for example, for visualizing the world
with Google Earth VR and participating in VR computer games. It is also being used in other industries that rely on
three-dimensional models for communication. Architecture is one of the professions that requires interaction between
the real world and the digital domain. Some architects are trying to introduce VR into their daily workflow. However,
there are many programs and platforms for designers to choose from with different strengths and weaknesses. Lighting
designers can also benefit from the immersive and interactive features of VR both in communicating with clients and
using VR tools in their own design process. VR technology could reduce the workload and make presentations more
efficient and interesting. VR could also help clients change the lighting fixtures, dim the lights interactively, and
experience a real-time walkthrough. Different workflows for integrating VR into the lighting designer's
design process should be based on the ultimate use intended and issues of accuracy (e.g. illuminance), type of
visualization (illumination values, false color, rendered view, etc.), the level of interactivity needed (stationary viewer,
full movement in the space, ability to interact with objects and light sources), compatibility with other software, and
cost.
Eight workflows that incorporated virtual reality outcomes were developed for lighting design. These were evaluated
based on visualization, compatibility, accuracy, and interactivity through surveys and case studies, and two new
workflows were then developed based on the results.
Lighting simulation software programs (AGI32, Autodesk Revit, 3ds MAX, DIALux, RADIANCE from Ecotect, and
RELux Vision) were first assessed to determine which of them might be part of the final workflow. Previous research has
indicated differences in simulated results between lighting simulation software, but for VR use, the illuminance results
from each of the programs appear identical within an accepted range. However, with respect to other issues such as
compatibility, user interface, and visualization, each of the programs differs from each other. Based on the evaluation
results, 3ds MAX is better for rendering alone. However, AGi32 and Grasshopper-based Radiance in Rhino have
better simulation data visualization overall, including false color images, renderings, and illumination level calculation.
In terms of interactivity, Unreal Engine 4, Unity 3D, and Autodesk Stingray have similar potential for VR
editing and scene visualization, but Unreal Engine 4 has a more compatible platform for development. Fuzor
was discovered to have many of the desired features; it was incorporated into one of the workflows and used as the
benchmark for comparing the other eight workflows.
One modified phone-based workflow and one computer-based workflow were proposed for lighting designers to
achieve illuminance value visualization or false color visualization in VR mode, and a real project was used as a
reference model for evaluation.
Hypothesis
The AGI32-Revit Dynamo-Revit-vCAD workflow is a better phone-based workflow, and the AGI32-Revit Dynamo-Revit-3ds
MAX-Unreal Engine 4 workflow is a better computer-based workflow, for lighting designers to visualize
interactive lighting simulation in terms of cost, interactivity, compatibility, visualization, and accuracy than the other
selected workflows: AGI32-Mobile VR Station, Elumtools-Mobile VR Station, 3ds MAX-Cardboard
Camera, Revit-vCAD, Revit-Fuzor (phone-based), Revit-Fuzor, Revit-Enscape, and 3ds MAX-Unreal Engine 4.
Objectives
Based on the hypothesis and the goal, three objectives have been set:
• Help lighting designers understand how to incorporate VR in different design phases and how to visualize
lighting simulation results in virtual reality.
• Describe and evaluate several workflows for lighting designers to employ and determine which workflow is
better than the others in specific circumstances. Then recommend two workflows: one smartphone-based,
the other desktop computer-based.
• Develop one workflow to visualize illuminance values of the lighting environment on a VR platform.
1.3.3 Chapter structure
Chapter 1 presented an overview of virtual reality, lighting design, and the potential intersection of the two. Chapter
2 reviews previous research concerning VR aided lighting design and the outcomes of new simulation methods.
Information about the comparison between different software programs and the contrast between VR aided design
and traditional design is detailed in Chapter 3. The methodology is also described in Chapter 3, which outlines
exactly how the study was completed, what assumptions were made, and the rationale behind each step of the research.
The analysis and results of the data collection are presented in Chapter 4, which outlines what data was collected
and any interesting trends that emerged. Chapter 5 presents the analyzed and interpreted results of the study.
Chapter 6 presents the limitations and conclusions of the study, tying the research back to the broader ideas
presented in the introduction and discussing some potential ways that future research could expand upon the findings.
1.4 Chapter Summary
Virtual reality has been explored for more than a century and is beginning to be used by architects, structural
engineers, urban designers, lighting manufacturers, and clients. VR aided design helps lighting designers
visualize design details and communicate with architects and clients. VR is also beneficial across all the lighting design
phases. Utilizing VR technology not only makes the design process more efficient but also helps designers identify
design issues that are hard to detect in traditional 2D drawings and 3D models.
CHAPTER 2: BACKGROUND OF VIRTUAL REALITY IN LIGHTING INDUSTRY
Chapter 2 introduces the background of VR technology as applied to the architectural industry and different viewpoints
on VR from professional architects. Previous evaluations of lighting software, including Radiance, Ecotect, AGI32,
Elumtools, DIALux, Relux, Inspirer, Lightscape, Autodesk Revit, Autodesk 3ds Max, and Velux Daylight Visualizer,
are also listed in Chapter 2, together with some simulation research for lighting.
2.1 Typical Lighting Design Workflows
In the professional lighting industry, the design workflow is straightforward in outline but sophisticated in practice,
because there are many small tasks such as searching cut sheets, determining the total fixture wattage in a specific
area, and documenting specification submittals. Generally, though, essential lighting design tasks can be assigned to
these standard stages: programming, schematic design, design development, contract documents, bidding and
negotiation, construction, and post-occupancy evaluation (Figure 24).
Figure 24 Design process
John E. Reichardt discussed seven phases of lighting design in his guidebook, The Lighting Design Process. All
seven phases are summarized below, together with some additional literature review (Reichardt, 2010).
2.1.1 Programming phase
The programming phase is the first step, in which lighting designers consider the whole project, including client
preferences and occupant needs (Parshall, 2011). Space functions, codes, color of light, color rendering index,
budget and costs, and energy codes should all be considered in this phase. The programming phase is a preparation stage
for designers to make a reasonable plan for the whole design cycle according to the schedule. Designers should also
consider whether the project is targeting any special goals, such as Leadership in Energy and Environmental Design
(LEED), as this may affect the lighting design (Hershberger, 1974).
2.1.2 Schematic design phase
In the schematic design phase, the lighting designer should provide one or several design schemes together with some
preliminary fixture selections as well as lighting control systems. During this stage, designers usually present ideas
through some previous design projects and some photos or rendering images (Borson, 2014). The lighting designer
will also establish the basic lighting layout together with lighting simulation results using lighting calculation
programs such as AGI32. Designers should also conduct a preliminary energy and economic analysis (Steffy, 2001).
To finalize the conceptual design, all the preliminary considerations should be documented in drawings, 3D computer
models, and specifications and presented to the architects or clients for review.
2.1.3 Design development phase
The architect will approve or modify the design details based on their reviews and the meeting feedback from clients.
Coordinating with other building systems such as HVAC, designers will modify the lighting layout to avoid
conflicts with other systems and develop more detailed specifications (AIA, 2007). As the design develops,
illumination levels and other electrical issues should be brought into compliance with the code. All the design work and
finalized calculations should be updated in the official documents for clients to review (Winchip, 2017).
2.1.4 Contract documents phase
Coordinated with the architecture and HVAC systems, all the lighting design details and an updated report of probable
cost should be finalized in the contract documents phase (CAM website, 2017). The lighting designers should be
flexible with design modifications based on any change to the building systems or even to the space itself. At the
end of this stage, designers should submit the completed design layouts together with illumination level calculation
results and final specifications for the clients to review and approve. Depending on the special goals and requirements
of each project, the lighting design should also comply with the specific certification requirements.
2.1.5 Bidding and negotiation phase
During the bidding phase, lighting designers assist clients in evaluating which electrical sub-contractor is most
suitable based on price, experience, and availability (Klinger, 2006). Additionally, designers also review and
confirm the final contract the client receives from the contractor of choice.
2.1.6 Construction phase
During the construction phase, lighting designers should answer contractors' inquiries regarding the lighting design's
intent, design details, and specifications. Responsibilities of lighting designers during construction include reviewing
submittals, conducting commissioning through field visits, answering requests for information (RFIs), solving field
problems, and documenting construction issues (HMH website, 2017).
2.1.7 Post-occupancy evaluation phase
Post-occupancy evaluation is the process of determining how successful the delivery and design of the project were,
where there is potential for further improvement, and what lessons can be learned for future projects. Lighting quality
can be evaluated by brightness, light distribution, color, sense of comfort, daylight, energy cost, and aesthetics
(Dilouie, 2006). It is also helpful for designers to follow up with the post-occupancy evaluation to understand
the difference between design assumptions and real operation (Mango, 2015).
2.2 Lighting Simulation
Lighting simulation comprises rendering and calculation. Lighting simulation rendering can be divided into
two categories: photorealistic rendering, which aims at real-life appearance, and predictive rendering (Ward and
Shakespeare 1998; Moeck and Selkowitz 1996). Lighting simulation calculation is a software-based process that
provides a fast and accurate assessment of light levels for a specific environment (Bryan, 2002). Lighting simulation
is one of the crucial steps in lighting design practice; it helps lighting designers evaluate the illumination level of a
design and determine the proper lighting fixtures. Simulation can be time-consuming, and choosing the correct
software can be difficult.
2.2.1 Lighting simulation history
Lighting simulation has been explored for a long time, since it involves many complex calculations over large
quantities of lighting distribution data. Compared to hand calculation, it is much easier to
perform complex simulations with computers (Hirata et al. 1999). Some of the first programs that could
accomplish lighting simulation and data visualization can be traced back to the 1970s (Hirata et al. 1999). In the
same decade, building on past research, attempts at integrating daylighting and artificial lighting calculations
were being made (Plant and Archer 1973).
The early lighting simulation programs had many limitations regarding simplified geometries, daylighting
analysis, and output accuracy (Svendenius and Pertola 1995). Since the 1980s, significant progress has been made in the
field of computer graphics. Computer methods were improved for both the calculation and representation of light
falling on arbitrary volumes (Nakamae and Tadamura 1995). The method of physically-based rendering was
developed based on light and material properties (Kniss, 2003). Global illumination is the general name for a process
that simulates not only direct light but also takes indirect lighting into account. Simulation programs began to
accommodate complex geometries, daylighting, and better calculation algorithms, but a full solution to the global
illumination problem remained a work in progress (Ulbricht, 2006). With the development of computer graphics,
architectural representation, the game industry, the film industry, and lighting research all benefited directly
from the improvements. However, much remained to be improved in terms of fully realistic rendering and
accurate calculations (Wilkie et al. 2009).
2.2.2 Lighting simulation algorithms
Lighting simulation algorithms are instructions to solve lighting distribution problems based on physical and
mathematical theories. With the development of computer graphics, shading algorithms became widely accepted;
early ones attributed a single color to a single defined polygon, with the color determined by the incident lighting on
the surface. This type of algorithm helps to present shape and orientation but is far from realistic rendering.
Common algorithms include cosine shading, Gouraud and Phong shading, Radiosity, and ray tracing. Texture
mapping is also important.
Cosine shading
Lambert's cosine law states that the illuminance falling on a surface varies with the cosine of the light's
angle of incidence (Taylor, 2000). As the light's angle increases, the illuminance level of the area being illuminated
decreases, because the same light spreads out over a larger perceived area (Ryer, 1997) (Figure 25).
Figure 25 Lambert’s cosine law (IESNA lighting handbook, 2006)
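Combined with the inverse-square law, Lambert's cosine law gives the basic point-by-point illuminance calculation used throughout lighting software. A minimal sketch (the function name and values are illustrative, not from the thesis):

```python
import math

def illuminance_point_source(intensity_cd, distance_m, incidence_deg):
    """Inverse-square law combined with Lambert's cosine law:
    E = (I / d^2) * cos(theta), in lux."""
    return (intensity_cd / distance_m ** 2) * math.cos(math.radians(incidence_deg))

# A 1000 cd point source seen from 2 m away:
print(illuminance_point_source(1000, 2.0, 0))   # -> 250.0 lux at normal incidence
print(illuminance_point_source(1000, 2.0, 60))  # -> ~125 lux, half of normal incidence
```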
Gouraud and Phong shading
Henri Gouraud and Bui Tuong Phong helped develop shading algorithms using interpolation over
the rendered facets, which makes surfaces look more continuous and smoother. Gouraud shading interpolates
illumination values computed at the vertices over the area of a polygon to produce continuous shading of surfaces.
Phong shading interpolates the normal vectors over the area of a polygon, which preserves the highlights caused by
non-diffuse reflections (Dutre, 2006). This produces better specular highlights. Because of their different strengths
and weaknesses in different situations, both algorithms are still widely used in computer graphics (Dutre, 2006).
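The difference between the two interpolation orders can be shown at the midpoint of a single edge; this sketch (all names and values are illustrative) shades it both ways with a simple diffuse model:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lambert(normal, light_dir):
    """Diffuse intensity: clamped dot product of unit normal and light direction."""
    return max(0.0, sum(a * b for a, b in zip(normalize(normal), normalize(light_dir))))

light = (0.0, 0.0, 1.0)
n0, n1 = (1.0, 0.0, 1.0), (-1.0, 0.0, 1.0)  # vertex normals at the edge's ends
t = 0.5                                      # interpolation parameter: the midpoint

# Gouraud: shade at the vertices, then interpolate the resulting intensities.
gouraud = (1 - t) * lambert(n0, light) + t * lambert(n1, light)

# Phong: interpolate the normals, then shade the interpolated normal.
interp_n = tuple((1 - t) * a + t * b for a, b in zip(n0, n1))
phong = lambert(interp_n, light)

print(round(gouraud, 3), round(phong, 3))  # -> 0.707 1.0
```

Phong shading recovers the full intensity at the midpoint (where the interpolated normal faces the light directly), which is why it preserves highlights that Gouraud shading averages away.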
Radiosity
Radiosity was developed and introduced by researchers at the Cornell University Program of Computer Graphics in
1984. This method improves the simulation of diffuse surfaces. The surfaces are divided into smaller pieces called
patches (Figure 26). The total amount of light at each patch is calculated, including light bouncing off or hitting other
surfaces. Radiosity can handle complex reflection models except for specular or glossy surfaces. It is very time-
consuming and requires a lot of computer storage to simulate complex models and non-diffuse materials with
Radiosity (Bickford, 2008).
Figure 26 Radiosity figure (Chan, 2012)
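The per-patch computation can be sketched as an iterative solve of the radiosity equation B_i = E_i + rho_i * sum_j(F_ij * B_j), where each pass adds one more light bounce. The two-patch scene, reflectances, and form factors below are illustrative values, not data from the Cornell work:

```python
# Minimal radiosity solve for a toy two-patch scene (illustrative values).
emission   = [10.0, 0.0]           # E_i: patch 0 is a light source
reflect    = [0.0, 0.8]            # rho_i: patch reflectances
formfactor = [[0.0, 0.2],          # F[i][j]: fraction of light leaving i reaching j
              [0.5, 0.0]]

B = emission[:]                    # start with emitted light only
for _ in range(50):                # each pass accounts for one more bounce
    B = [emission[i] + reflect[i] * sum(formfactor[i][j] * B[j]
         for j in range(len(B))) for i in range(len(B))]

print([round(b, 3) for b in B])    # -> [10.0, 4.0]
```

Patch 1 ends up carrying 0.8 * 0.5 * 10 = 4 units of reflected light; with thousands of patches and glossy materials, this linear system is what makes radiosity storage- and time-intensive.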
Ray tracing
Ray tracing is one of the most popular algorithms used in computer graphics rendering, especially for creating photo-
realistic images (Figure 27). The ray tracing algorithm determines the position of the viewpoint and calculates each
ray coming from the viewpoint as it hits or reflects off different surfaces. Ray tracing is capable of simulating
transparent, translucent, and specular surfaces, although it is also a time-consuming process (Kay, 1979). It is limited
in simulating indirect light and can be more efficient when combined with Radiosity techniques (Wu, 2006).
Figure 27 Raytracing (Chan, 2012)
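The core geometric step of the algorithm is intersecting each viewing ray with the scene's surfaces; for a sphere this reduces to solving a quadratic, as in this illustrative sketch (names and values are assumptions, not from the thesis):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None.
    Solves |o + t*d - c|^2 = r^2 for the smallest positive t."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c           # a == 1 because the direction is unit length
    if disc < 0:
        return None                   # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Camera at the origin looking down +z toward a unit sphere at z = 5:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
print(ray_sphere((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0))  # -> None (miss)
```

A full ray tracer repeats this test for every pixel's ray, then spawns reflection, refraction, and shadow rays at each hit point, which is where the cost accumulates.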
Texture mapping
A major breakthrough toward more realistic simulation was the method of texture mapping. It makes it possible to
index a texture map and attach its colors to local surface coordinates (Figure 28). Once the color of points on a surface
could change according to the texture map, it was easy to vary other attributes in the same way, such as bump,
displacement, and environment maps. Texture mapping is still one of the core techniques used in rendering (Wang, 2016).
Figure 28 Texture mapping
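Indexing a texture map from surface (u, v) coordinates can be sketched as a nearest-neighbor lookup; the 2x2 checker texture below is an illustrative assumption:

```python
# Nearest-neighbor texture lookup: map a surface (u, v) in [0, 1] x [0, 1]
# to a color stored in a small row-major grid of RGB texels.
texture = [[(255, 255, 255), (0, 0, 0)],
           [(0, 0, 0), (255, 255, 255)]]

def sample(tex, u, v):
    """Scale (u, v) to texel indices, clamping to the grid edges."""
    h, w = len(tex), len(tex[0])
    x = min(w - 1, int(u * w))
    y = min(h - 1, int(v * h))
    return tex[y][x]

print(sample(texture, 0.1, 0.1))  # -> (255, 255, 255): upper-left texel
print(sample(texture, 0.9, 0.1))  # -> (0, 0, 0): upper-right texel
```

Bump, displacement, and environment mapping reuse exactly this lookup, only storing normals, offsets, or surrounding-scene colors in the grid instead of surface colors.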
2.2.3 Lighting simulation programs
Radiance, Ecotect, AGI32, Elumtools, DIALux, Relux, Inspirer, Lightscape, Autodesk Revit, Autodesk 3ds Max, and
Velux Daylight Visualizer are popular lighting simulation programs used in the research field and the lighting industry.
Some of them are no longer being developed, but it is important to learn how each of the programs works and what their
strengths and weaknesses are.
Radiance
Radiance is a suite of programs for the analysis and visualization of lighting, originally written by Greg Ward
(Luebkeman, 2006). The entire package was developed by researchers over nine years at Lawrence Berkeley
Laboratory and the Ecole Polytechnique Federale de Lausanne in Switzerland (Ward, 1994). The input files for
Radiance specify geometries, materials, luminaires, time, date, and sky conditions (Radsite, 2017). The primary
advantage of Radiance is that there are few limitations on the geometries or materials it can simulate, and simulation
results can be displayed as color images, numerical values, and contour plots. However, its user interface is difficult
to use, and it is almost always accessed through a host program. Radiance has been used by architects and engineers to
predict illumination levels, visual quality, and effects of designed spaces, and by researchers to evaluate artificial
lighting or daylighting technologies or both. Radiance was the first lighting program to be rigorously validated
(Reinhart, 2006). Rhino Grasshopper, with the Ladybug and Honeybee plugins, is one software platform for the
Radiance lighting simulation engine (Figure 29).
Figure 29 Radiance platform (Grasshopper website, 2017)
Ecotect
Ecotect, a building simulation program created by Square One Research, is a building performance analysis
tool that can perform lighting calculations, energy simulations, and thermal simulations. The program is compatible
with Radiance and provides illuminance values over a customized analysis grid or a rendered image output. The
integration of Ecotect and Radiance was more widely accepted in academic research than in professional lighting design.
Ecotect has since been discontinued (Autodesk, 2016).
AGI32
AGI32 is a lighting simulation tool used for daylight and artificial lighting, developed and distributed by
Lighting Analysts, Inc. AGI32 integrates both raytracing and Radiosity algorithms to produce lighting calculations
and photorealistic images. AGI32 remains one of the most widely accepted commercial lighting programs today,
especially in North America (Reinhart et al. 2006). Elumtools, the first lighting calculation plug-in for
Autodesk Revit, is developed by the same company, and the two programs share similar functions and lighting
algorithms (Ashdown, 2010).
DIALux
DIALux is free software that has been used for studies on energy-efficient lighting (Ryckaert et al. 2010). Developed
by DIAL GmbH, DIALux is widely used for indoor and outdoor electric lighting calculations. It can also import
photometric data files directly from manufacturers. The most commonly used photometric data format in North America is
the IES file. IES stands for the Illuminating Engineering Society, the largest lighting organization in
America; IES defined the standard for storing photometric data in an electronic form that is usually called an
IES file (Photometric & Optical Testing Services, 2017). By integrating with the external renderer POV-Ray, DIALux
can generate photorealistic images together with calculation results. DIALux has some limitations in the
geometry shapes and sky choices it accepts.
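Under the hood, an LM-63 IES file is mostly a block of whitespace-separated numbers that follows the TILT line. A rough, simplified parsing sketch follows; the sample string is synthetic, the slicing assumes the common TILT=NONE case, and several header fields (photometric type, units, luminous dimensions) are skipped over rather than interpreted:

```python
def parse_ies(text):
    """Parse the numeric photometric block of an LM-63-style IES string.

    Simplified sketch: skips keyword header lines and assumes TILT=NONE.
    """
    lines = text.strip().splitlines()
    tilt = next(i for i, line in enumerate(lines) if line.startswith("TILT"))
    numbers = " ".join(lines[tilt + 1:]).split()
    lumens = float(numbers[1])   # rated lumens per lamp
    n_vert = int(numbers[3])     # count of vertical angles
    n_horiz = int(numbers[4])    # count of horizontal angles
    # numbers[0] = lamp count, [2] = candela multiplier, [5..9] = type/units/size,
    # numbers[10..12] = ballast factor, future use, input watts
    body = [float(x) for x in numbers[13:]]
    vertical = body[:n_vert]
    horizontal = body[n_vert:n_vert + n_horiz]
    candela = body[n_vert + n_horiz:]
    return {"lumens": lumens, "vertical": vertical,
            "horizontal": horizontal, "candela": candela}

# A tiny synthetic example: one lamp, three vertical angles, one horizontal angle
ies_text = """IESNA:LM-63-2002
[TEST] synthetic example
TILT=NONE
1 1000 1 3 1 1 2 0.5 0.5 0
1.0 1.0 60
0 45 90
0
800 600 200"""

photometry = parse_ies(ies_text)
```

The candela table recovered this way is what programs like DIALux, AGI32, and 3ds Max use to distribute a fixture's light in the simulated scene.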
Relux
Relux, by Relux Informatik, is primarily used for electric lighting design; it is free but not open source. Relux includes links
to manufacturers' photometric files, can be used for both interior and exterior spaces, and has been used to
evaluate comfort conditions in traditional architecture (Ruggiero et al. 2009). Relux uses a combination of
radiosity and a modified Radiance raytracing algorithm for its calculations and can produce photorealistic images by integrating
Radiance through the Relux Vision interface.
Inspirer
Inspirer is a commercial lighting simulation program developed by Integra and distributed mainly in Japan. Inspirer
employs a raytracing algorithm and has been evaluated by a few studies; utilizing the DEPT technique, it can
perform lighting simulations and generate rendered images (Drago and Myszkowski, 2001).
Lightscape
Lightscape was the world's earliest Radiosity rendering package, developed by Donald Greenberg at the Cornell
University Department of Computer Graphics in 1991 (McFarland, 2007). Lightscape contributed substantially to the development of
rendering technology and was used in many rendering products. Autodesk purchased the rights to
Lightscape and incorporated it into the mental ray simulation engine, which used a photon-map algorithm
similar to Radiosity combined with raytracing. It supported Illuminating Engineering Society data files for luminaires and was
compatible with other software products such as 3ds Max (Autodesk, Inc 2010a).
Autodesk Revit
Building information modeling (BIM) is a management process and a standard involved in building design,
construction, and operation (Kensek, 2014). Autodesk Revit is a BIM software program for professionals to design,
build, and operate high-performance buildings (Autodesk, 2017). Revit is a tool that works for architects, MEP
engineers, and structural engineers with the functions of modeling, rendering, energy simulation, and documentation.
Autodesk 3ds Max
Autodesk 3ds Max is a professional 3D computer graphics program specialized in creating models,
rendering images, and making 3D animations; it is used in industries ranging from games to architecture (Erik, 2012).
A comparative evaluation of 3ds Max against Radiance has been performed to validate the simulation accuracy of 3ds Max
(Reinhart, 2009).
Velux Daylight Visualizer
Velux Daylight Visualizer is lighting simulation software focused mainly on daylighting design and analysis.
It has been tested and validated against CIE test cases, and its simulation times have been evaluated for the
purposes of building design and analysis (Labayrade, 2009).
Discussion:
Of the tools described above, AGI32 and Radiance remain in use for lighting simulation in the professional
industry. Some of the other programs implement similarly advanced algorithms but are limited in either
artificial lighting capability or continued development. Based on their continued, high-quality
development of algorithms, visualization, and compatibility, AGI32, Elumtools, and 3ds Max were chosen
for developing the VR lighting workflows.
2.2.4 Visualization of lighting simulation
Visualization refers to methods of displaying data. With the development of digital technologies such as computer
graphics and the improvement of software and hardware, computer images, drawings, and animations became
the main media of architectural visualization. Architectural visualization has developed step by step, from
hand-drawn images to computer renderings and now modern VR technologies (Wu, 2006). Current
visualization methods are achieved through advanced computer graphics techniques for geometry, materials,
and lighting.
Architectural visualization can be divided into detailed output features, including illuminance contours, illuminance
values, photorealistic renderings, walkthrough animations, interior solar studies, material refraction and transparency,
and shadows (Roy, 2000). Important outputs for evaluating a lighting simulation include photorealistic renderings,
illuminance contours, false color renderings, and illuminance values (Figures 30-33).
Figure 30 Photorealistic rendering image
Figure 31 Illuminance contours
Figure 32 False color rendering
Figure 33 Illuminance values
Illuminance describes the total luminous flux incident on a unit area. In SI units, illuminance is measured in
lumens per square meter, also called lux. In the North American lighting industry, the most accepted unit for illuminance is the
footcandle (lumens/ft²), the illuminance on a surface one foot away from a one-candela light source. For example,
the average maintained illuminance level in a typical high school classroom should be around 50 footcandles (Stelzer, 1994).
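The two units differ only by the area conversion: since 1 ft² = 0.09290304 m², one footcandle equals about 10.7639 lux. A small helper makes the relationship concrete (function names are illustrative):

```python
# 1 ft^2 = 0.09290304 m^2, so 1 lm/ft^2 (footcandle) = 10.7639 lm/m^2 (lux)
LUX_PER_FOOTCANDLE = 10.7639

def footcandles_to_lux(fc):
    """Convert an illuminance in footcandles to lux."""
    return fc * LUX_PER_FOOTCANDLE

def lux_to_footcandles(lux):
    """Convert an illuminance in lux to footcandles."""
    return lux / LUX_PER_FOOTCANDLE
```

By this conversion, the 50-footcandle classroom target corresponds to roughly 538 lux.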
Illuminance contours are contour lines drawn on top of Calculation Zones showing constant values (Figure
31). They are most often associated with illuminance, but are based on whatever metric is used to define the Calculation Zone
(Visual, 2012).
False color is a method of displaying an image in colors mapped to a recorded variable rather than its true colors.
A false-color image describes an object in colors related to some variable and thus differs from a true-color
photograph. This method is often used in lighting renderings to give viewers a quick idea of the lighting distribution
by matching different illuminance levels to corresponding colors or gray levels (E-light, 2017).
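The mapping itself is just a transfer function from an illuminance value to a color. A minimal sketch of the idea, using an invented blue-to-green-to-red ramp (real tools use calibrated scales with a legend, so the ramp and its range here are assumptions):

```python
def false_color(illuminance, max_value=100.0):
    """Map an illuminance value to RGB on a simple blue-green-red ramp.

    Values at or above max_value clamp to red; purely illustrative.
    """
    t = max(0.0, min(illuminance / max_value, 1.0))  # normalize to [0, 1]
    if t < 0.5:
        return (0.0, 2 * t, 1.0 - 2 * t)             # blue fades to green
    return (2 * t - 1.0, 2.0 - 2 * t, 0.0)           # green fades to red
```

Applying this function to every pixel of a rendered illuminance map yields a false-color rendering like Figure 32.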
Discussion:
Besides 2D drawings and 3D models, illuminance values, photorealistic renderings, false color renderings, and
illuminance contours are the most important visualization outputs considered by lighting designers. In addition,
walkthrough VR rendered models, luminaire source photometry, interactive 2D navigation maps in VR, and real-time
lighting adjustment are also considered VR visualization evaluation criteria.
2.2.5 Accuracy evaluation methods
A set-point reference is needed in order to validate a simulation program. Several evaluation strategies have been
identified, including a mathematical model reference, a real building reference, a scaled model reference, and a
benchmark for comparison (Roy, 2000). For validating artificial lighting simulation, the real building reference or
benchmark strategy is more reasonable and feasible: it is difficult to build a scaled
model with real artificial fixtures, and mathematical modeling can provide only a simple primary
validation. AGI32, DIALux, Radiance, and Relux have been evaluated against
analytical and manual calculations (Shikder, 2009). The results from all four programs
are almost identical to the manual calculations (Figure 34). The test was based on a
simple case study with simple room geometry and a few fixtures.
Figure 34 Comparison of illuminance level simulation with analytical results (Shikder, 2009)
Discussion:
Manual calculation can serve as the benchmark for evaluating the accuracy of software programs if the case
study is very simple. For a complex project, however, manual calculation is too time-consuming. One
of the best strategies is to designate one program as the benchmark and to compare the simulation results from the
other programs against it.
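For simple rooms, the manual benchmark is typically the lumen method, which estimates average horizontal illuminance from total lamp lumens, a coefficient of utilization (CU), and a light loss factor (LLF). A sketch with purely illustrative values (the fixture count, lumen output, CU, and LLF below are assumptions, not measured data):

```python
def lumen_method_illuminance(n_fixtures, lumens_per_fixture, cu, llf, area_ft2):
    """Average illuminance in footcandles by the lumen method:
    E = (N * lamp lumens * coefficient of utilization * light loss factor) / area.
    """
    return n_fixtures * lumens_per_fixture * cu * llf / area_ft2

# Illustrative example: six 3000-lumen fixtures over a 288 ft^2 room
avg_fc = lumen_method_illuminance(6, 3000, cu=0.6, llf=0.8, area_ft2=288)
```

With these assumed values the method predicts 30 footcandles, a figure that a software result for the same room could then be checked against.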
2.3 VR Platform
VR platforms are suites of tools for exploring VR, including phone-based software and hardware and
computer-based software and hardware. Phone-based VR software includes Cardboard Camera, Mobile VR
Station, and vCAD. Phone-based hardware refers to modern stereoscopes such as Google Cardboard and Samsung's Gear
VR. Computer-based VR software includes Unity 3D, Unreal Engine 4, Fuzor, Enscape, and Autodesk Stingray.
Computer-based VR hardware includes the HTC Vive, Microsoft's HoloLens, and the Oculus Rift. The VR platforms listed
above are among the most accepted in the architectural industry for exploring VR, and more are being developed.
The proposed workflows are established with platforms selected from among them.
2.3.1 Computer-based VR software
Unity 3D
Unity 3D is a cross-platform 3D engine that provides users with a flexible development environment for building video games
(Unity website, 2017). The Unity game engine is helpful for bringing 3D models into a virtual reality environment. Currently,
professionals in the architecture, engineering, and construction (AEC) industry can take a Revit
model, bring it into Unity, and create a VR experience; the VR models can be adjusted and customized with knowledge
of Unity. Unity gives designers and engineers great flexibility to create interactive walkthroughs that provide
a realistic sense of the simulated virtual environment (Creighton, 2010). Unity supports 3D game and VR
development for mobile, desktop, web, and console platforms (Unity website, 2017).
Unreal Engine 4
Unreal Engine 4 is a suite of game development tools made by game developers for creating games and virtual
reality environments (Unreal Engine website, 2017). It supports everything from 2D mobile games to
immersive VR platforms and provides a user-friendly interface. It is free for anyone to explore visualizing
spaces and rendering architectural models in immersive environments with a deep, powerful developer toolset (Nutt, 2015).
Because Unreal Engine's source code is available, developers can adapt existing features and create new ones to meet
their application's needs (Mol, 2008). Unreal Engine has been used for video games as well as in the 3D film industry,
training simulations, and rendering visualizations. With access to the complete C++ source code, thousands of
individuals and companies have built careers around skills developed using Unreal Engine 4 (Unreal Engine website,
2017).
Fuzor
Fuzor was developed by Kalloc Inc. It can be used for visualization and for assisting clients in managing the building lifecycle
process (Fuzor website, 2017). Fuzor can transform Revit or SketchUp models into virtual reality experiences,
allowing users to improve their designs and manage the building information. With its unique ability to synchronize
with Revit, it gives designers great flexibility to modify and validate their projects, such as changing
furniture locations, materials, or even dimming light sources.
Enscape
Enscape is a Revit plug-in that creates a VR walkthrough experience (Enscape, 2017). Because it is tied to
Revit, users can change materials and geometry coming from Revit during the VR simulation.
Combined with VR hardware, clients can experience the virtual world as well as adjust the models.
Autodesk Stingray
Autodesk Stingray is a 3D game engine for real-time design visualization and game development (Autodesk, 2017).
Catching up with mainstream game engines Unreal and Unity, Stingray provides interactive VR experiences for games
and design visualization and is compatible with BIM-informed 3D visualizations in Maya and 3ds Max. It is a powerful
3D game engine with development tools to help game makers and designers create virtual worlds (Kleckner, 2015).
2.3.2 Computer-based VR hardware
HTC Vive
The HTC Vive, developed by HTC and Valve, is a virtual reality headset that must be connected to a powerful computer.
The Vive system includes a pair of motion controllers and a camera embedded in the front of the headset (Lamkin, 2016).
Using room-scale VR (about 5 meters by 5 meters), the HTC Vive allows those in the AEC industry to experience walkthrough
VR with motion tracking and 360˚ coverage. The wireless controllers are used to simulate natural interactions in 3D,
and the headset is tracked by base stations mounted on walls or tripods (Plasencia, 2015).
Microsoft’s Hololens
Microsoft's HoloLens is still in its development stage, but its conceptual release has already introduced various ways
in which the technology can be applied to different industries (McBride, 2016). The HoloLens uses augmented
reality to place three-dimensional objects within a real space (versus virtual reality, which creates a fully virtual
experience), using light to create holographic images (Microsoft website, 2017). It delivers a mixed reality
of digital and real worlds, allowing people to present holograms in a physical environment. With Microsoft
HoloLens, holographic computing becomes possible, enabling people to visualize and work with digital content
naturally (Davies, 2015). For the architectural industry, the HoloLens can extend interaction with 3D models beyond
2D computer screens, creating new ways to visualize and manage complex projects.
Oculus Rift
The Oculus Rift is a lightweight headset that allows users to look around a virtual world (Desai, 2014). The
Rift screen displays two adjacent images, zoomed and re-shaped for each eye, thereby creating a
stereoscopic 3D image (Goradia, 2014). The Rift headset is traditionally seen as a tool for gamers (Nield, 2016),
but it could be used more in the architectural lighting profession; more and more
architecture companies are experimenting with the Rift for visualization and design (Fairs,
2015).
2.3.3 Phone-based VR software
Google Cardboard Camera
Cardboard Camera is a VR application for both Android phones and iPhones that works with Google Cardboard. It
allows people to create stereo 3D, 360-degree photos using a mobile device's camera (Perez, 2016). Cardboard Camera
can also record sound for VR photos, adding another dimension to the immersive experience (Betters,
2016).
Mobile VR station
Mobile VR Station is a media player designed for virtual reality with head tracking; it works mainly with
Google Cardboard. With the application, people can view photos, panoramas, and videos projected onto a virtual screen
floating in space (Fuller, 2017). Mobile VR Station supports standard 2D content as well as 3D side-by-side or over-under
content (Rohit, 2016).
vCAD
The vCAD application is a platform that presents computer models (CAD or BIM) in a VR environment. It also allows
people to walk through the models in a controllable VR mode (Honkonen, 2017). The application combines an average
smartphone with an inexpensive virtual reality (VR) head-mounted display. AutoCAD, Rhino, Revit, and SketchUp
Pro are currently supported. A model is uploaded to vCAD, where the VR version is processed; users then
download the converted architectural model to a smartphone for VR viewing with a head-mounted display
such as Google Cardboard or Samsung Gear VR (vCAD website, 2017).
2.3.4 Phone-based VR hardware
Google Cardboard
Google Cardboard consists of a folded cardboard holder and lenses, a modern version of the stereoscope, and is
inexpensive (Branstetter, 2015). By pairing Google Cardboard with a compatible smartphone application such
as vCAD, one can experience 3D models using the phone's internal rotational tracking sensors. At a price around $15,
Google Cardboard is relatively cheap VR hardware, though it is limited to static or navigable pre-rendered views
(Konnikova, 2015).
Samsung Gear VR
Samsung Gear VR is a virtual reality headset compatible mainly with Samsung phones. The headset includes a
touchpad and a back button on the side, together with a proximity sensor. The Samsung phone acts as the display and
processor, while the Gear VR acts as the controller, enabling rotational tracking and measurement (Samsung
website, 2017). Samsung Gear VR delivers a large field of view, smooth images, and reflection. Samsung Gear VR
can be used for BIM VR visualization and for picture-based construction use (KIM, 2015).
Discussion:
Phone-based VR platforms are more portable and flexible for presenting a project than computer-based VR
platforms; however, computer-based VR is more editable and presents the project more interactively. Google
Cardboard was chosen for the workflow development because of its low cost and high flexibility. The HTC Vive was chosen
because it is compatible with most VR computer software, and its wireless controllers and base stations enable a
walkthrough experience.
With regard to VR software, Stingray, Unity 3D, and Unreal Engine 4 are three similar game development platforms;
Unreal Engine 4 was chosen because its source code is available. Fuzor and Enscape were chosen because of their compatibility
with Revit. Mobile VR Station was chosen because it does not limit image formats, and vCAD because it is
compatible with most 3D formats and provides navigable walkthroughs of rendered models.
2.4 Chapter Summary
According to previous research and the opinions of professional architects, both simulation methods and VR
technology are becoming mature and widely accepted in the architectural industry and could also be used more
specifically by lighting designers. The evaluation strategy of setting a benchmark to compare different workflows
will be implemented in the following chapters. Software that is under continuous development and widely accepted,
such as AGI32, Elumtools, Revit, and 3ds Max, will be validated together with the related VR platforms Mobile VR Station,
vCAD, Fuzor, Enscape, and Unreal Engine 4. VR experiences will be developed in both a phone-based workflow and a
computer-based workflow to achieve more flexible and editable VR lighting visualization.
CHAPTER 3: METHODOLOGY
Chapter 3 introduces the general methodology for lighting simulation on a VR platform, including descriptions of several
potential workflows, how they will be evaluated, and how a new workflow will be developed. This chapter also
describes how to create interactive visualizations of lighting simulations through each of the software programs and
discusses the potential of the workflows to achieve customized VR experiences for lighting designers, such as VR
illuminance level visualization (Figure 35).
Figure 35 Methodology
3.1 Describe the Workflows
A square room with a red carpet and a simple fixture was used as a case study for each workflow. The bright red floor
was chosen as a check that reflections and light bouncing were being calculated by the rendering software. This case
study was taken through eight workflows to introduce the different editing methods of seven software programs: AGI32,
ElumTools, Autodesk 3ds Max, Autodesk Revit, Fuzor, Enscape, and Unreal Engine 4.
3.1.1 General lighting simulation VR workflow
The general lighting simulation VR workflow includes modeling, VR editing, and visualizing (Figure 36).
Figure 36 General workflow
1) Modeling
In the modeling phase, designers need to prepare both space models and lighting fixture models. Space modeling
includes the room geometry, the materials of the enclosure system, furniture, and equipment. Lighting fixture modeling
includes both the fixture geometry and its materials. For example, the reference room has dimensions of 24 feet (length),
12 feet (width), and 10 feet (height) with a red carpet floor, acoustic ceiling, and white-finished walls (Figure 37). There are
several ways to create the 3D models: each selected software program is capable of geometry modeling and material editing,
and geometry created in one program can be used in the others.
Figure 37 Reference room
2) VR editing
The second stage is VR editing, which includes setting up the lighting environment, setting up the VR environment, and
running the simulation.
a) Setting up the lighting environment includes selecting fixtures, matching the IES photometric file, adjusting parameters,
and locating the fixtures.
• Select fixtures
Fixtures can be selected from a lighting manufacturer's website, for example, the Lumenfacade from
Lumenpulse Group.
• Import IES file and fixture geometry
IES stands for the Illuminating Engineering Society, the largest lighting organization in America,
which defined the standard for storing photometric data in an electronic form that is
usually called an IES file. Each program has an import command to bring an IES file into
the program; some programs need a plug-in to import the IES file.
• Adjust the parameters
The parameters are loss factor, power, configuration, and photometric source direction. Some
programs need a plug-in to adjust the IES file.
• Locate fixtures
Use the move command to adjust the location of the fixtures and the rotate command to finalize the
orientation of the fixture geometry.
b) Setting up the VR software environment includes setting up the camera viewport, selecting VR scenes, and selecting
control methods.
c) All the programs run a simulation of the space, but different lighting algorithms might be used.
3) Visualizing
The final stage is visualization that includes VR platform editing and visualizing in VR.
a) VR platform editing
• Panorama VR (phone based)
Step 1: Import selected images into Photoshop
Step 2: Merge images into a panoramic photograph
Step 3: Create a seamless loop with a Photoshop action
Step 4: Trim away unwanted top and bottom edges
Step 5: Save the file as a TIFF or JPEG image
Step 6: View the VR image in a VR application
Panorama viewer app: Mobile VR Station
• Walkthrough VR (phone based and computer based)
Step 1: Set up VR hardware
Step 2: Import the project into VR platform: Vcad, Enscape, Fuzor, Stingray, Unreal Engine 4, Unity3D
b) VR platform visualizing
Visualizing in VR is the process of importing images or 3D models into the VR platform and viewing them through the
VR hardware.
3.1.2 Specific workflows
There are eight specific workflows. They are
1) AGi32-Mobile VR station,
2) Elumtools-Mobile VR station,
3) 3ds MAX-Mobile VR station,
4) Revit-vCAD,
5) Revit-Fuzor (phone-based),
6) Revit-Fuzor (computer-based),
7) Revit-Enscape
8) 3ds MAX-Unreal Engine 4.
Each of them has the steps of
a) modeling,
b) VR editing, and
c) visualizing, as described in the general workflow description (Section 3.1.1).
1) AGi32 to VR Platform (Mobile VR Station/ PSViewer for Google Cardboard)
The AGi32-Mobile VR Station workflow includes modeling and lighting simulation in AGi32, processing the
rendered images in Photoshop, and then visualizing the stereo image through Mobile VR Station (Figure 38).
Figure 38 AGi32 to Mobile VR Station workflow
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 39-42)
Figure 39 Create the room Figure 40 Create the room
Figure 41 Change the floor material to red carpet Figure 42 Change the floor material to red carpet
b) VR editing
Select the fixture from Lumenpulse Group, Lumenfacade: LOG HO-120-48-35K-30x60-SI-NO_G1504031-R1
(Figure 43-47).
Figure 43 Import IES file and fixture geometry Figure 44 Adjust the parameters
Figure 45 Locate fixtures
Figure 46 Calculate Figure 47 Calculate
c) Visualizing in VR (Figure 48-50).
Figure 48 Export the selected scenes from AGI32
Figure 49 Use Photoshop to process the images into spherical image
Figure 50 Import the spherical image into Mobile VR Station
2) Elumtools to VR Platform (Mobile VR Station/ PSViewer for Google Cardboard)
The Elumtools-Mobile VR Station workflow includes making the models in Revit, simulating lighting with Elumtools
(a plug-in to Revit), processing the renderings with Photoshop, and then visualizing the stereo image in Mobile VR
Station (Figure 51).
Figure 51 Elumtools to Mobile VR Station workflow
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 52-54).
Figure 52 Create the room
Figure 53 Change the floor material to red carpet
Figure 54 Create compound ceiling
Modeling the fixtures (Figure 55).
Figure 55 Model the fixture
b) VR editing
Select the fixture from Lumenpulse Group, Lumenfacade: LOG HO-120-48-35K-30x60-SI-NO_G1504031-R1
(Figure 56-58).
Figure 56 Import IES file and fixture geometry
Figure 57 Adjust the parameters Figure 58 Locate fixtures
Lighting Simulation (Figure 59-60).
Figure 59 Calculate Figure 60 Calculate
c) Visualizing
Processing the images (Figure 61-62).
Figure 61 Export the selected scenes from AGI32
Figure 62 Use Photoshop to process the images into spherical image
Visualizing in Mobile VR Station (Figure 63).
Figure 63 Import the spherical image into Mobile VR Station
3) 3ds MAX to VR Platform (PSViewer for Google Cardboard)
3ds MAX can export 360˚ panoramas directly after the models are set up, and Google Cardboard Camera can
view the panoramas in VR mode (Figure 64).
Figure 64 3ds MAX to Mobile VR Station workflow
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 65-67)
Figure 65 Create the room
Figure 66 Change the floor material to red carpet
Figure 67 Create compound ceiling
Modeling the fixtures (Figure 68).
Figure 68 Model the fixture
b) VR editing
Setting up light (Figure 69).
Figure 69 Locate fixtures
Setting up VR environment in 3ds MAX (Figure 70-73).
Figure 70 Setting up physical camera
Figure 71 Adjust FOV to 90 degree Figure 72 Adjust exposure gain target to 9-10 EV
Figure 73 Export panorama from 3ds MAX
c) Visualizing
Import the panorama image to Mobile VR Station and view the image (Figure 74).
Figure 74 View panorama through Mobile VR Station
4) Revit to VR Platform (vCAD)
The workflow of Revit-vCAD includes modeling in Revit and visualizing VR in vCAD (Figure 75).
Figure 75 Revit to vCAD
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 76-78)
Figure 76 Create the room
Figure 77 Change the floor material to red carpet
Figure 78 Create compound ceiling
Modeling the fixtures (Figure 79).
Figure 79 Model the fixture
b) VR editing
Setting up light (Figure 80).
Figure 80 Locate fixtures
c) Visualizing
Visualizing in vCAD (Figure 81-82).
Figure 81 Converting Revit to VR
Figure 82 Visualizing Revit in vCAD
5) Revit to VR Platform (Fuzor phone based)
The Revit-Fuzor phone-based workflow includes modeling in Revit, exporting a Fuzor file from Fuzor, and visualizing VR
in the Fuzor phone application (Figure 83).
Figure 83 Revit to Fuzor (Phone based)
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 84-86)
Figure 84 Create the room
Figure 85 Change the floor material to red carpet
Figure 86 Create compound ceiling
Modeling the fixtures (Figure 87)
Figure 87 Model the fixture
b) VR editing
Setting up light: Select the fixture from Lumenpulse Group, Lumenfacade: LOG HO-120-48-35K-30x60-SI-
NO_G1504031-R1 (Figure 88-90).
Figure 88 Import IES file and fixture geometry
Figure 89 Adjust the parameters: Loss factor, power, configuration, photometric source direction
Figure 90 Locate fixtures
Setting up VR environment: export VR file from Fuzor (Figure 91).
Figure 91 Export phone-based VR file from Fuzor
c) Visualizing
Visualizing in Fuzor application (Figure 92).
Figure 92 Visualizing Revit model in Fuzor phone application
6) Revit to VR Platform (Fuzor computer based)
The Revit-Fuzor workflow includes modeling in Revit and visualizing VR in Fuzor (Figure 93).
Figure 93 Revit to Fuzor workflow
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 94-96)
Figure 94 Create the room
Figure 95 Change the floor material to red carpet
Figure 96 Create compound ceiling
• Modeling the fixtures (Figure 97).
Figure 97 Model the fixture
b) VR editing
Setting up light: Select the fixture from Lumenpulse Group, Lumenfacade: LOG HO-120-48-35K-30x60-SI-
NO_G1504031-R1 (Figure 98-100).
Figure 98 Import IES file and fixture geometry
Figure 99 Adjust the parameters Figure 100 Locate fixtures
Setting up VR software environment (Figure 101).
Figure 101 Setting up rendered mode
c) Visualizing
Setting up VR hardware (Figure 102).
Figure 102 Setting up VR hardware
Visualizing (Figure 103).
Figure 103 Visualizing Revit model in Fuzor
7) Revit to Enscape
The Revit-Enscape workflow includes modeling in Revit, editing the VR environment, and visualizing VR in Enscape (Figure 104).
Figure 104 Revit to Enscape workflow
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 105-107)
Figure 105 Create the room
Figure 106 Change the floor material to red carpet
Figure 107 Create compound ceiling
• Modeling the fixtures (Figure 108).
Figure 108 Model the fixture
b) VR editing
Setting up light: Select the fixture from Lumenpulse Group, Lumenfacade: LOG HO-120-48-35K-30x60-SI-
NO_G1504031-R1 (Figure 109-111).
Figure 109 Import IES file and fixture geometry
Figure 110 Adjust the parameters Figure 111 Locate fixtures
Setting up VR software environment (Figure 112).
Figure 112 Setting up Enscape settings
c) Visualizing
Setting up VR hardware (Figure 113).
Figure 113 Setting up VR hardware
Visualizing (Figure 114).
Figure 114 Visualizing Revit model in Enscape
8) 3ds MAX to Unreal Engine 4
The workflow of 3ds MAX to Unreal Engine 4 includes creating the space in Revit (modeling), setting up the lighting
in 3ds MAX, setting up the VR environment in Unreal Engine 4 (VR editing), and setting up the VR hardware and viewing
the project through Unreal Engine 4 (visualizing) (Figure 115).
Figure 115 3ds MAX to Unreal Engine 4 workflow
a) Modeling
Space Modeling: length: 24’, width: 12’, height: 10’ (Figure 116-118)
Figure 116 Create the room
Figure 117 Change the floor material to red carpet
Figure 118 Create compound ceiling
• Modeling the fixtures (Figure 119).
Figure 119 Model the fixture
b) VR editing
Setting up light (Figure 120-121).
Figure 120 Locate fixtures
Figure 121 Import geometry into 3ds MAX
Attach photometric data file (Figure 122-123).
Figure 122 Attach photometric data file Figure 123 Select the modified IES file
Setting up VR environment (Figure 124-137).
Figure 124 Create a new Unreal file Figure 125 Default starting interface
Figure 126 Default blueprint shooting coding
Figure 127 Deactivate blueprint shooting coding
Figure 128 Default gun geometry Figure 129 Delete gun geometry
Figure 130 Default crosshair coding Figure 131 Deactivate the crosshair coding
Figure 132 VR mode without guns and crosshair
Figure 133 Create new scene map Figure 134 Drag sky light and directional light into the scene
Figure 135 Import FBX geometry Figure 136 Organize Geometry
Figure 137 Re-edit photometric file
c) Visualizing
Setting up VR hardware (Figure 138).
Figure 138 Setting up VR hardware
Visualize in Unreal Engine 4 (Figure 139).
Figure 139 Unreal Engine 4
3.1.3 Workflow summary
The phone-based lighting VR workflows were AGi32-Mobile VR Station, Elumtools-Mobile VR Station, 3ds MAX-Cardboard
Camera, Revit-vCAD, and Revit-Fuzor (phone-based). The computer-based workflows were Revit-Fuzor (computer-based),
Revit-Enscape, and 3ds MAX-Unreal Engine 4. Each of them followed the steps of modeling, VR editing, and visualizing.
For most phone-based workflows, lighting simulation is done during the VR editing step; for computer-based workflows,
however, simulation can run in real time during visualization.
3.2 Evaluate the Workflows: Criteria and Methods
Five criteria were used to evaluate each VR workflow for lighting designers: cost, interactivity, visualization,
compatibility, and accuracy. Efficiency was considered but rejected as a criterion. A case study also provided a basis
for comparing and evaluating the workflows.
3.2.1 Evaluation criteria
The cost criterion is judged on the dollar cost of the software involved in each workflow. Cheaper is better.
Interactivity refers to the level of user engagement with the software. Across the workflows, it was divided into
four classes: interactive stereo image (observing), interactive stereo panorama (looking around), motion tracking
walkthrough (looking around + acting in), and scene-modifiable motion tracking walkthrough (acting and feedback).
Instead of viewing only one direction, people can look around in a 360° view through a panorama, so an interactive
stereo panorama is better than a stereo image. A motion tracking walkthrough lets people navigate through the project
as well as look around, which is better than looking around from a fixed spot, so a motion tracking walkthrough is
better than an interactive panorama. A scene-modifiable motion tracking walkthrough lets people add customized
features to the VR environment and get better feedback from the virtual world, so it is better than a plain motion
tracking walkthrough.
Visualization scores how lighting information is portrayed in the VR environment, covering eight main characteristics:
walkthrough VR rendered models, luminaire-based photorealistic renderings, illuminance values, illuminance contours,
false color renderings, luminaire source photometrics, a VR interactive 2D navigational map, and real-time light
changing effects. Visualization qualities such as color effects, material reflections, and the lighting environment
are also considered in the evaluation. Illuminance values may matter more to lighting designers than a VR interactive
2D navigational map during the design process, but it is hard to measure how much more important one characteristic
is than another. To keep the evaluation simple, each of the eight visualization characteristics is therefore weighted
equally: the more characteristics a workflow achieves, the better its visualization score.
The compatibility criterion describes how well data can be imported or exported from one program to another, covering
geometry, luminaire source files (fixture IES files), material properties, and workflow inner compatibility
(compatibility between the modeling software and the VR software within a workflow). If a VR workflow can read
geometry and materials directly from Revit with no re-editing, it is ranked as highly compatible for those
characteristics. If a workflow can read a fixture IES file directly, it is likewise considered highly compatible.
Within a workflow, if no extra process is needed to take the project from the modeling software to the VR software,
it is ranked as highly compatible. Geometry, material, luminaire source, and workflow inner compatibility are
weighted equally in the compatibility evaluation.
Accuracy could be validated through several strategies: mathematical calculation, comparison with a real reference
building or physical models, or comparison with a benchmark. Mathematical calculation is hard to achieve, especially
for complex geometries, and building a real space would be too time consuming and expensive. AGI32, one of the most
popular lighting simulation programs in North America (Byrne, 2014), was therefore selected as the benchmark. Light
levels can be compared using illuminance values, and simulation renderings from the different programs were compared
for reflection, color, and texture effects. Results closer to AGI32 were given higher marks.
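For a single fixture, a spot check against a benchmark's illuminance report can use the inverse-square cosine law, E = I(θ)·cos(θ)/d², where I(θ) is the candela value from the IES file, d the distance to the calculation point, and θ the incidence angle. A minimal sketch, with illustrative numbers rather than values from the case study:

```python
import math

def illuminance(candela, distance_m, incidence_deg=0.0):
    """Point-source illuminance (lux) via the inverse-square cosine law."""
    return candela * math.cos(math.radians(incidence_deg)) / distance_m ** 2

# A 1000 cd beam hitting a surface 2 m away at normal incidence:
E = illuminance(1000, 2.0)   # 1000 / 2^2 = 250 lux
```

This only holds for point-like sources at a distance well beyond the fixture size; the simulation programs add inter-reflection on top of this direct component.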
Efficiency is a criterion evaluates how time efficient the workflow is for different. But different people have different
familiarity with the programs, so it is not easy to record the time cost for different people achieving the whole
workflow. Even for one person, it is impossible for one to have the same familiarity of all the tested programs. This
criterion was judged to be too difficult to measure and was not used.
3.2.2 Evaluation
Listing all the simulation and test results makes the differences between the workflows apparent. Each workflow was
analyzed against the criteria of cost, interactivity, visualization, compatibility, and accuracy, and each of the
five criteria was scored on an 8-point scale. The 8-point scales do not carry equal weight across criteria; for
lighting designers, for example, accuracy may matter more than compatibility, so 8 points in accuracy can be worth
more than 8 points in compatibility.
A user-based case study comparison was then made from the ranking metrics. Cost, interactivity, visualization,
compatibility, and accuracy were evaluated separately, and the 8 workflows were ranked by each criterion. Each
workflow can be beneficial in different design phases, and comparing the workflows from the perspective of a real
lighting designer through a case study is necessary for the evaluation. A survey of professionals in industry would
also be an efficient way to learn whether a workflow suits real practice; this was not completed.
3.3 Discussion of Workflow Development
Per the evaluation, the strengths and weaknesses of each workflow were identified. One phone-based workflow and one
computer-based workflow were then developed and determined to be the better workflows. For further development, the
ability to visualize illuminance values in a walkthrough VR was added to the two new workflows. From the perspective
of lighting designers, each criterion was assigned a different weight in order to determine the better workflow and
evaluate the two new workflows.
3.4 Chapter Summary
It is necessary for lighting designers to know the workflows that achieve a VR experience as well as visualize
lighting simulation results. The workflows AGi32-Mobile VR Station, Elumtools-Mobile VR Station, 3ds MAX-Cardboard
Camera, Revit-vCAD, Revit-Fuzor (phone-based), Revit-Fuzor (computer-based), Revit-Enscape, and 3ds MAX-Unreal
Engine 4 were described using the case study of a simple box. An eight-point ranking system was created to evaluate
cost, interactivity, visualization, compatibility, and accuracy, helping people see which workflow is stronger in
which criteria. Although the score for each criterion does not weigh the same, weighting factors can be applied to
each of the five criteria, depending on the user's preferences and requirements, to obtain a total score for final
evaluation.
Chapter 4: VR SIMULATION RESULTS CASE STUDY
To evaluate and determine a better workflow for lighting VR simulation, a bar lounge was introduced as a case study.
The eight workflows (AGI-Mobile VR Station, Elumtools-Mobile VR Station, 3ds MAX-Cardboard Camera, Revit-vCAD,
Revit-Fuzor (phone-based), Revit-Fuzor (computer-based), Revit-Enscape, and 3ds MAX-Unreal Engine 4) were developed,
and all the simulation results are shown in this chapter. They are evaluated to determine how each specific VR
workflow benefits the design and which VR workflow is the better choice for lighting designers.
4.1 Case Study Reference Building
The reference building is one of the design projects from the USC Arch 577 lighting class taught by Professor Lauren
Dandridge in the spring 2016 semester. It is a bar and lounge project that includes a bar area, a cigar room, a
lounge area, and bathrooms (Figures 140-147).
Figure 140 Lounge area
Figure 141 Cigar room
Figure 142 Bar area
Figure 143 Floor Plan
Figure 144 Original Reflected Ceiling Plan
Figure 145 Proposed Reflected Ceiling Plan with highlighted lighting fixtures
Figure 146 East Elevation
Figure 147 West Elevation
All the workflows were tested with fixture IES files from the manufacturer Lumenpulse. The interior in each workflow
was also designed in the same style, including a wooden floor, brick walls, and a wooden grooved ceiling.
4.2 VR Workflows Results
The VR workflow results were developed and tested based on the workflow descriptions and the case study reference
building. Each workflow is presented with its detailed operation process as well as its VR visualization results. As
described in detail in Chapter 3.1, the general lighting simulation VR workflow includes modeling, VR editing, and
visualizing (Figure 148).
Figure 148 General workflow
4.2.1 AGi32-Mobile VR Station
a) Modeling
Space Modeling (Figures 149– 153)
Figure 149 Import CAD file as the model reference
Figure 150 Create the room in AGI Figure 151 Model the room geometry
Figure 152 Edit room material Figure 153 Select the corresponding material to each surface
b) VR editing (Figures 154-157)
Select the fixture from Lumenpulse Group, Lumenfacade: LOG HO-120-48-35K-30x60-SI-NO_G1504031-R1
Figure 154 Import IES file and fixture geometry Figure 155 Adjust the parameters
Figure 156 Locate fixtures
Figure 157 Calculate
c) Visualizing in VR (Figures 158-161)
Figure 158 Use Photoshop to process the images into spherical image
Figure 159 Import the spherical image into Mobile VR Station
Figure 160 Visualize in 180 degree
Figure 161 Visualize the illuminance value and photometric
4.2.2 Elumtools-Mobile VR Station
a) Modeling
Space Modeling (Figures 162-165)
Figure 162 Create the room in Revit
Figure 163 Change the floor material
Figure 164 Edit material structure Figure 165 Select the corresponding material
Modeling the fixtures geometry (Figure 166)
Figure 166 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figures 167-173)
Figure 167 Locate fixtures
Figure 168 Attach photometric data file Figure 169 Select IES file
Figure 170 Initialize Revit with Elumtools
Figure 171 Check the luminaire data
Figure 172 Create calculation points
Figure 173 Calculate the project
c) Visualizing
Process the images into a panorama (Figure 174)
Figure 174 Process images in Photoshop
Import the panorama into Mobile VR Station (Figures 175-177)
Figure 175 Import the panorama
Figure 176 Visualize in Mobile VR Station
Figure 177 Visualize the illuminance value
4.2.3 Autodesk 3ds Max-Mobile VR Station
3ds MAX could export 360 panoramas directly after setting up the models. And Google Cardboard Camera could
view the panoramas in VR mode.
a) Modeling
Space Modeling (Figures 178-181)
Figure 178 Create the room in Revit
Figure 179 Change the floor material
Figure 180 Edit material structure Figure 181 Select the corresponding material
Modeling the fixtures geometry (Figure 182)
Figure 182 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figure 183)
Figure 183 Locate fixtures
Attach photometric data file (Figure 184-185)
Figure 184 Attach photometric data file Figure 185 Select the modified IES file
Setting up VR environment in 3ds MAX (Figure 186-189)
Figure 186 Setting up physical camera
Figure 187 Adjust FOV to 90 degree Figure 188 Adjust exposure gain target to 9-10 EV
Figure 189 Export panorama from 3ds MAX
c) Visualizing
Import the panorama image to Mobile VR Station and view the image (Figure 190)
Figure 190 View panorama through Mobile VR Station
4.2.4 Autodesk Revit – vCAD
a) Modeling
Space Modeling (Figure 191-194)
Figure 191 Create the room in Revit
Figure 192 Change the floor material
Figure 193 Edit material structure Figure 194 Select the corresponding material
Modeling the fixtures geometry (Figure 195)
Figure 195 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figure 196)
Figure 196 Locate fixtures
c) Visualizing
Converting Revit to VR in vCAD (Figure 197)
Figure 197 Converting Revit to VR
Visualizing in vCAD (Figure 198-200)
Figure 198 VR Bar in vCAD
Figure 199 VR Bar in vCAD
Figure 200 VR Bar in vCAD
4.2.5 Revit-Fuzor phone based
a) Modeling
Space Modeling (Figure 201-204)
Figure 201 Create the room in Revit
Figure 202 Change the floor material
Figure 203 Edit material structure Figure 204 Select the corresponding material
Modeling the fixtures geometry (Figure 205)
Figure 205 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figure 206)
Figure 206 Locate fixtures
Attaching the photometric data to the fixtures (Figure 207-208)
Figure 207 Attach photometric data file Figure 208 Select IES file
Setting up VR environment: export VR file from Fuzor (Figure 209)
Figure 209 Export phone-based VR file from Fuzor
c) Visualizing
Import Fuzor file to Fuzor phone application (Figure 210)
Figure 210 Import Fuzor file
Visualizing VR in Fuzor (Figure 211-214)
Figure 211 View the project in Fuzor
Figure 212 View the project in Fuzor
Figure 213 View the project in Fuzor
Figure 214 Show objects information in Fuzor
4.2.6 Revit-Fuzor computer based
a) Modeling
Space Modeling (Figure 215-218)
Figure 215 Create the room in Revit
Figure 216 Change the floor material
Figure 217 Edit material structure Figure 218 Select the corresponding material
Modeling the fixtures geometry (Figure 219)
Figure 219 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figure 220)
Figure 220 Locate fixtures
Attaching the photometric data to the fixtures (Figure 221-222)
Figure 221 Attach photometric data file Figure 222 Select IES file
Setting up VR environment (Figure 223)
Figure 223 Setting up time in Fuzor
c) Visualizing
Setting up VR hardware (Figure 224)
Figure 224 Setting up VR hardware
Visualizing in Fuzor (Figure 225-229)
Figure 225 Visualize the project in Fuzor
Figure 226 Visualize the project in Fuzor
Figure 227 Visualize the project in Fuzor
Figure 228 Visualize the project in Fuzor
Figure 229 Visualize the photometric web in Fuzor
4.2.7 Revit-Enscape
a) Modeling
Space Modeling (Figure 230-233)
Figure 230 Create the room in Revit
Figure 231 Change the floor material
Figure 232 Edit material structure Figure 233 Select the corresponding material
Modeling the fixtures geometry (Figure 234)
Figure 234 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figure 235)
Figure 235 Locate fixtures
Attaching the photometric data to the fixtures (Figure 236-237)
Figure 236 Attach photometric data file Figure 237 Select IES file
Setting up VR environment (Figure 238-239)
Figure 238 Setting up time in Revit
Figure 239 Set up Enscape features
c) Visualizing
Setting up VR hardware (Figure 240)
Figure 240 Setting up VR hardware
Visualizing in Enscape (Figure 241-244)
Figure 241 Visualize in Enscape
Figure 242 Visualize in Enscape
Figure 243 Visualize in Enscape
Figure 244 Visualize in Enscape
4.2.8 3ds MAX-Unreal Engine 4
a) Modeling
Space Modeling (Figure 245-248)
Figure 245 Create the room in Revit
Figure 246 Change the floor material
Figure 247 Edit material structure Figure 248 Select the corresponding material
Modeling the fixtures geometry (Figure 249)
Figure 249 Model the fixture geometry in Revit
b) VR editing
Setting up light (Figure 250)
Figure 250 Locate fixtures
Import Revit file into 3ds MAX (Figure 251)
Figure 251 Import Revit into 3ds MAX
Attach photometric data file (Figure 252-253)
Figure 252 Attach photometric data file Figure 253 Select the modified IES file
Setting up VR environment (Figure 254-268)
Figure 254 Create a new Unreal file Figure 255 Default starting interface
Figure 256 Default blueprint shooting coding
Figure 257 Deactivate blueprint shooting coding
Figure 258 Default gun geometry Figure 259 Delete gun geometry
Figure 260 Default crosshair coding Figure 261 Deactivate the crosshair coding
Figure 262 VR mode without guns and crosshair
Figure 263 Create new scene map
Figure 264 Drag sky light and directional light into the scene
Figure 265 Import FBX geometries Figure 266 Organize geometries
Figure 267 Re-edit photometric data
Figure 268 Re-edit materials in Unreal Engine 4
c) Visualizing
Setting up VR hardware (Figure 269)
Figure 269 Setting up VR hardware
Visualizing in Unreal Engine 4 (Figure 270-271)
Figure 270 Visualize in Unreal Engine 4 Figure 271 Visualize in Unreal Engine 4
4.3 Chapter Summary
Based on the case study and workflow descriptions, all eight workflows (AGi32-Mobile VR Station, Elumtools-Mobile
VR Station, 3ds MAX-Mobile VR Station, Revit-vCAD, Revit-Fuzor (phone-based), Revit-Fuzor (computer-based),
Revit-Enscape, and 3ds MAX-Unreal Engine 4) were demonstrated using a more complex room. From the detailed workflow
processes and the VR results, each workflow is ranked against the criteria of cost, interactivity, visualization,
compatibility, and accuracy in Chapter 5.
CHAPTER 5: RESULTS ANALYSIS & DEVELOPING NEW WORKFLOWS
Based on the results shown in Chapter 4, the criteria of cost, interactivity, visualization, compatibility, and
accuracy are used to evaluate the workflows AGI-Mobile VR Station, Elumtools-Mobile VR Station, 3ds MAX-Mobile VR
Station, Revit-vCAD, Revit-Fuzor (phone based), Revit-Fuzor (computer based), Revit-Enscape, and 3ds MAX-Unreal
Engine 4. Per the evaluation analysis results, one new phone-based workflow and one new computer-based workflow are
developed (Figure 272).
Figure 272 Chapter 5 Scope
5.1 Workflows Evaluation
To evaluate a VR workflow, the potential criteria are cost, interactivity, visualization, compatibility, and
accuracy. People need to know what is required to explore VR in the design process, how well they can receive
feedback from a virtual world, what they can visualize in VR, and whether the VR platform is compatible with
commonly used modeling platforms.
5.1.1 Cost evaluation
Per the purchase prices on the official website of each software and hardware product, the cost of the different
workflows ranges from $910 to $5,299 (Table 2).
Table 2 VR workflow cost
Workflow | Simulation software fee | VR software fee | Google Cardboard | HTC Vive | Yearly total cost
AGI - Mobile VR Station | $895 | $0 | $15 | --- | $910
Revit-Elumtools - Mobile VR Station | $2,000/year + $849 | $0 | $15 | --- | $2,864
3ds MAX - Cardboard Camera | $2,000/year + $1,470/year | $0 | $15 | --- | $3,485
Revit - vCAD | $2,000/year | $10/month | $15 | --- | $2,135
Revit - Fuzor (phone based) | $2,000/year | $2,500/year | $15 | --- | $4,515
Revit - Fuzor (computer based) | $2,000/year | $2,500/year | --- | $799 | $5,299
Revit - Enscape | $2,000/year | $679/year | --- | $799 | $3,478
3ds MAX - Unreal Engine 4 | $2,000/year + $1,470/year | $0 | --- | $799 | $4,269
Considering all the software and hardware involved in the VR workflows, Revit-Fuzor (computer based) costs the most,
and AGI32-Mobile VR Station costs the least (Figure 273).
Figure 273 Total cost rank
In most design firms, Revit has already been purchased and is used in daily workflows, so it is not an extra
investment for VR. The cost of each workflow can therefore exclude the Revit fee, especially for design firms
(Table 3).
Table 3 VR workflow cost without Revit fee
Workflow | Simulation software fee | VR software fee | Google Cardboard | HTC Vive | Yearly total cost
AGI - Mobile VR Station | $895 | $0 | $15 | --- | $910
Revit-Elumtools - Mobile VR Station | $849 | $0 | $15 | --- | $864
3ds MAX - Cardboard Camera | $1,470/year | $0 | $15 | --- | $1,485
Revit - vCAD | $0 | $10/month | $15 | --- | $135
Revit - Fuzor (phone based) | $0 | $2,500/year | $15 | --- | $2,515
Revit - Fuzor (computer based) | $0 | $2,500/year | --- | $799 | $3,299
Revit - Enscape | $0 | $679/year | --- | $799 | $1,478
3ds MAX - Unreal Engine 4 | $1,470/year | $0 | --- | $799 | $2,269
Without the Revit fee, Revit-Fuzor (computer based) still costs the most, and Revit-vCAD costs the least (Figure 274).
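The yearly totals in Table 3 are simple sums once monthly fees are annualized. A sketch of that tally for three of the workflows, using the prices listed in the table:

```python
# Annualize and sum the per-workflow costs from Table 3 (Revit fee excluded).
# Prices are those listed in the table; "/month" fees are multiplied by 12.
workflows = {
    "Revit - vCAD":              {"sim": 0, "vr_monthly": 10, "hw": 15},
    "Revit - Enscape":           {"sim": 0, "vr_yearly": 679, "hw": 799},
    "3ds MAX - Unreal Engine 4": {"sim": 1470, "hw": 799},
}

def yearly_total(w):
    return (w.get("sim", 0) + w.get("vr_yearly", 0)
            + 12 * w.get("vr_monthly", 0) + w.get("hw", 0))

totals = {name: yearly_total(w) for name, w in workflows.items()}
# Revit - vCAD: 0 + 12 * $10 + $15 = $135 per year
```

The hardware entry counts the headset as a first-year purchase; in later years only the subscription fees recur.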
Figure 274 VR workflow cost without Revit fee
5.1.2 Interactivity evaluation
Interactivity includes four classes: interactive stereo image (observing), interactive stereo panorama (looking
around), motion tracking walkthrough (looking around + acting in), and scene-modifiable motion tracking walkthrough
(acting and feedback) (Table 4 and Figure 275). A workflow earns from two to eight points based on the highest level
of interactivity it achieves.
Table 4 Interactivity evaluation
Workflow | Stereo image (2 pts) | Stereo panorama (4 pts) | Motion tracking walkthrough (6 pts) | Modifiable walkthrough (8 pts) | Total points
AGI - Mobile VR Station | √ | --- | --- | --- | 2
Revit-Elumtools - Mobile VR Station | √ | --- | --- | --- | 2
3ds MAX - Mobile VR Station | --- | √ | --- | --- | 4
Revit - vCAD | --- | --- | √ | --- | 6
Revit - Fuzor (phone based) | --- | --- | √ | --- | 6
Revit - Fuzor (computer based) | --- | --- | --- | √ | 8
Revit - Enscape | --- | --- | √ | --- | 6
3ds MAX - Unreal Engine 4 | --- | --- | --- | √ | 8
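The scoring rule above (two points per interactivity class, counting only the highest class a workflow reaches) can be sketched as:

```python
# Points per interactivity class; a workflow scores only its highest class.
LEVELS = ["stereo image", "stereo panorama",
          "motion tracking walkthrough", "modifiable walkthrough"]

def interactivity_points(highest_level):
    """Map a workflow's highest interactivity class to its 2/4/6/8 score."""
    return 2 * (LEVELS.index(highest_level) + 1)

score = interactivity_points("modifiable walkthrough")   # 8 points
```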
Figure 275 Interactivity rank graph
5.1.3 VR visualization evaluation
Visualization scores how lighting information is portrayed in the VR environment including eight main characteristics:
walkthrough VR rendered models, luminaire-based photorealistic renderings, illuminance values, illuminance
contours, false color renderings, luminaire source photometric, VR interactive 2D navigational map and real-time light
changing effects (Table 5 and Figure 276). These terms are defined in Chapters 1 and 2.
Table 5 Visualization evaluation
Characteristic (1 point each) | AGI-MVS | Elumtools-MVS | 3ds MAX-MVS | vCAD | Fuzor (phone) | Fuzor (computer) | Enscape | UE4
Walkthrough VR rendered models | --- | --- | --- | √ | √ | √ | √ | √
Customized luminaire renderings | √ | √ | √ | --- | --- | √ | √ | √
Illuminance values | √ | √ | --- | --- | --- | --- | --- | ---
Illuminance contours | √ | √ | --- | --- | --- | --- | --- | ---
False color renderings | √ | √ | --- | --- | --- | --- | --- | ---
Luminaire source photometric | √ | --- | --- | --- | --- | √ | --- | ---
VR interactive 2D navigational map | --- | --- | --- | --- | --- | √ | √ | ---
Real-time lighting changing | --- | --- | --- | --- | --- | √ | --- | ---
Total points | 5 | 4 | 1 | 1 | 1 | 5 | 3 | 2
(MVS = Mobile VR Station; UE4 = 3ds MAX-Unreal Engine 4; vCAD, Fuzor, and Enscape are Revit-based workflows)
Figure 276 Visualization rank graph
5.1.4 Compatibility evaluation
Compatibility covers four main characteristics: geometry, luminaire source files (fixture IES files), material
properties, and workflow inner compatibility. Each of the four characteristics is considered equally important and
is worth 2 points (Table 6). For example, if a workflow can read Revit or AutoCAD geometry with no need for extra
editing, it earns the full 2 points for the geometry characteristic. It is important to note that individual
evaluators can assign different point values to the categories, and each category (like compatibility) can also be
given its own weighting factor; this is just one method of evaluating the workflows.
Table 6 Compatibility evaluation
Characteristic (points) | AGI-MVS | Elumtools-MVS | 3ds MAX-MVS | vCAD | Fuzor (phone) | Fuzor (computer) | Enscape (trial) | UE4
Compatible with Revit CAD geometry (1) | √ | √ | √ | √ | √ | √ | √ | √
No need for re-editing geometry (1) | --- | √ | √ | √ | √ | √ | √ | √
Compatible with Revit materials (1) | --- | √ | √ | √ | √ | √ | √ | √
No need for re-editing materials (1) | --- | √ | --- | √ | √ | √ | √ | ---
Compatible with IES data files (1) | √ | √ | √ | --- | --- | √ | √ | √
No need for re-editing IES data files (1) | √ | --- | √ | --- | --- | --- | --- | √
VR inner workflow compatibility (2) | --- | --- | √ | √ | √ | √ | √ | √
Total points | 3 | 5 | 7 | 6 | 6 | 7 | 7 | 7
(MVS = Mobile VR Station; UE4 = 3ds MAX-Unreal Engine 4)
According to the compatibility evaluation, 3ds MAX-Mobile VR Station, Revit-Fuzor (computer based), Revit-Enscape,
and 3ds MAX-Unreal Engine 4 share the highest score at 7 points, while AGI-Mobile VR Station has the lowest score at
3 points (Figure 277).
Figure 277 Compatibility rank graph
5.1.5 Benchmarking accuracy evaluation
Unlike the visualization criterion, the accuracy evaluation takes AGi32 as a benchmark to see how closely the
lighting simulation in each workflow can match the results of the AGI-Mobile VR Station workflow (Table 7 and
Figure 278). If a workflow's simulation effect is similar to the result from AGi32, the workflow receives two points
in that category.
The illuminance criterion is based on both the illuminance value report and the light distribution in the rendered
environment. For example, AGi32 can use the IES (photometric web) file for rendering and generate an illuminance
value report. The Elumtools workflow can also read the IES file and visualize illuminance values, so it has the same
capability and gets 2 points. The Revit-vCAD workflow cannot read the customized IES file and does not generate
illuminance values, so it gets zero points in this criterion.
Color, reflection, and texture effects are evaluated based on the simulation algorithms discussed in Chapter 2 and
the rendering results from the case study. If a workflow's simulation algorithm is similar to the AGI workflow's,
and its rendering results appear similar to the AGI results, the workflow gets 2 points in each category. All the
characteristics were scored by professionals from the design field.
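The "rendering results appear similar" judgment was made by professionals; one way to make such a comparison repeatable would be a mean pixel difference between a workflow's rendering and the benchmark rendering. A minimal sketch, with small synthetic arrays standing in for the actual renderings:

```python
# Mean absolute pixel difference between two same-size grayscale renderings,
# as a rough proxy for visual similarity (2x2 stand-in images shown here).
def mean_abs_diff(img_a, img_b):
    flat_a = [p for row in img_a for p in row]
    flat_b = [p for row in img_b for p in row]
    return sum(abs(a - b) for a, b in zip(flat_a, flat_b)) / len(flat_a)

benchmark = [[200, 180], [120, 90]]    # AGI32 rendering (illustrative values)
candidate = [[190, 185], [115, 95]]    # workflow rendering under test
diff = mean_abs_diff(benchmark, candidate)   # lower = closer to the benchmark
```

A pixel metric ignores perceptual effects such as tone mapping, so it would complement, not replace, expert scoring.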
Table 7 Accuracy evaluation
Characteristic (points) | AGI-MVS | Elumtools-MVS | 3ds MAX-MVS | vCAD | Fuzor (phone) | Fuzor (computer) | Enscape | UE4
Illuminance (2) | 2 | 2 | 1 | 0 | 0 | 1 | 1 | 1
Color effect (2) | 2 | 2 | 2 | 1 | 1 | 1 | 2 | 2
Reflection effect (2) | 2 | 2 | 2 | 0 | 0 | 1 | 1 | 1
Texture effect (2) | 2 | 2 | 1 | 1 | 1 | 1 | 1 | 1
Total points | 8 | 8 | 6 | 2 | 2 | 4 | 5 | 5
(MVS = Mobile VR Station; UE4 = 3ds MAX-Unreal Engine 4)
Figure 278 Accuracy rank graph
According to the simulation results, the Elumtools workflow is closest to the AGI workflow, and Revit-vCAD and
Revit-Fuzor (phone based) differ the most from the AGi32 benchmark.
Texture effect is an exception judged subjectively: it is obvious whether simulation results show texture or not.
The AGI workflow does not show texture very well, so it is set to 1 point by default, and workflows that present
texture better receive higher points (Table 8 and Figure 279).
Table 8 Accuracy evaluation with subjective judgment
Characteristic (points) | AGI-MVS | Elumtools-MVS | 3ds MAX-MVS | vCAD | Fuzor (phone) | Fuzor (computer) | Enscape | UE4
Illuminance value (2) | 2 | 2 | 1 | 0 | 0 | 1 | 1 | 1
Color effect (2) | 2 | 2 | 2 | 1 | 1 | 1 | 2 | 2
Reflection effect (2) | 2 | 2 | 2 | 0 | 0 | 1 | 1 | 1
Texture effect (2) | 1 | 1 | 2 | 2 | 2 | 2 | 2 | 2
Total points | 7 | 7 | 7 | 3 | 3 | 5 | 6 | 6
(MVS = Mobile VR Station; UE4 = 3ds MAX-Unreal Engine 4)
Figure 279 Subjective accuracy evaluation
Per the subjective judgment, the AGI, Elumtools, and 3ds MAX to Mobile VR Station workflows share the highest
accuracy level, while Revit-vCAD and Revit-Fuzor (phone based) receive the lowest accuracy scores.
5.1.6 User based evaluation
User-based evaluation takes the perspective of either lighting designers or clients. For these examples, it is
assumed that lighting designers consider accuracy and compatibility the most important criteria in real practice,
while clients find interactivity and visualization more helpful for reviewing projects in VR. A third option was
also considered.
1) Lighting designers
From the perspective of lighting designers, compatibility and accuracy are assumed to be more important. As
mentioned previously, other users can create their own weights for the evaluation matrices. Here, accuracy and
compatibility are weighted twice as heavily as interactivity and visualization to evaluate which workflow is better
for lighting designers (Table 9 and Figure 280).
Table 9 Lighting designer’s workflow evaluation
Criterion (weight) | AGI-MVS | Elumtools-MVS | 3ds MAX-MVS | vCAD | Fuzor (phone) | Fuzor (computer) | Enscape | UE4
Interactivity (1) | 2 | 2 | 4 | 6 | 6 | 8 | 6 | 8
Visualization (1) | 5 | 4 | 1 | 1 | 1 | 5 | 3 | 2
Compatibility (2) | 3 | 5 | 7 | 6 | 6 | 7 | 7 | 7
Accuracy (2) | 7 | 7 | 7 | 3 | 3 | 5 | 6 | 6
Total points | 27 | 30 | 33 | 25 | 25 | 37 | 35 | 36
(MVS = Mobile VR Station; UE4 = 3ds MAX-Unreal Engine 4)
Figure 280 Lighting designer’s workflow evaluation rank graph
Results show that the Revit - Fuzor (computer based) workflow is the best among the chosen workflows from the perspective of lighting designers, with 3ds MAX - Unreal Engine 4 second, one point behind.
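These weighted totals are plain weighted sums of the raw criterion scores. A short Python sketch (the data layout and function name are illustrative, not part of the thesis workflows) reproduces the Table 9 arithmetic and can be reused for any weighting:

```python
# Weighted workflow scoring. Raw criterion scores per workflow, in the
# column order used by the evaluation tables: AGI - Mobile VR Station,
# Revit-Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station,
# Revit - vCAD, Revit - Fuzor (phone), Revit - Fuzor (computer),
# Revit - Enscape, 3ds MAX - Unreal Engine 4.
SCORES = {
    "interactivity": [2, 2, 4, 6, 6, 8, 6, 8],
    "visualization": [5, 4, 1, 1, 1, 5, 3, 2],
    "compatibility": [3, 5, 7, 6, 6, 7, 7, 7],
    "accuracy":      [7, 7, 7, 3, 3, 5, 6, 6],
}

def weighted_totals(weights):
    """Return one weighted total per workflow column."""
    n = len(SCORES["accuracy"])
    return [sum(weights[c] * SCORES[c][i] for c in SCORES) for i in range(n)]

# Lighting-designer weights: accuracy and compatibility count double.
designer = weighted_totals({"interactivity": 1, "visualization": 1,
                            "compatibility": 2, "accuracy": 2})
print(designer)  # [27, 30, 33, 25, 25, 37, 35, 36], matching Table 9
```

Swapping in the client weights (interactivity and visualization doubled) reproduces the Table 10 totals in the same way.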
2) Clients
For clients, interactivity and visualization are assumed to be more important for understanding the project, so their weights are set to twice those of compatibility and accuracy (Table 10 and Figure 281).
Table 10 Client's workflow evaluation
(AGI = AGI - Mobile VR Station; Elum = Revit-Elumtools - Mobile VR Station; Max = 3ds MAX - Mobile VR Station; vCAD = Revit - vCAD; FuzP = Revit - Fuzor, phone based; FuzC = Revit - Fuzor, computer based; Ensc = Revit - Enscape; UE4 = 3ds MAX - Unreal Engine 4)

Characteristics (weight)   AGI  Elum   Max  vCAD  FuzP  FuzC  Ensc   UE4
Interactivity (2)            2     2     4     6     6     8     6     8
Visualization (2)            5     4     1     1     1     5     3     2
Compatibility (1)            3     5     7     6     6     7     7     7
Accuracy (1)                 7     7     7     3     3     5     6     6
Total points                24    24    24    23    23    38    31    33
Figure 281 Client's workflow evaluation rank graph
Results show that the Revit - Fuzor (computer based) workflow is the best among the chosen workflows based on the sample evaluation metrics.
3) Another option
Gideon Susman of BuroHappold Consulting Engineers, at a review of this project, said that only visualization and accuracy were important for him, with accuracy three times as important as visualization. The results of his evaluation would be different (Table 11 and Figure 282).
Table 11 Other workflow evaluation
(AGI = AGI - Mobile VR Station; Elum = Revit-Elumtools - Mobile VR Station; Max = 3ds MAX - Mobile VR Station; vCAD = Revit - vCAD; FuzP = Revit - Fuzor, phone based; FuzC = Revit - Fuzor, computer based; Ensc = Revit - Enscape; UE4 = 3ds MAX - Unreal Engine 4)

Characteristics (weight)   AGI  Elum   Max  vCAD  FuzP  FuzC  Ensc   UE4
Interactivity (0)            2     2     4     6     6     8     6     8
Visualization (1)            5     4     1     1     1     5     3     2
Compatibility (0)            3     5     7     6     6     7     7     7
Accuracy (3)                 7     7     7     3     3     5     6     6
Total points                26    25    22    10    10    20    21    20
Figure 282 Another workflow evaluation rank graph
Results show that AGI - Mobile VR Station becomes the best workflow under these extreme weights on accuracy and visualization, while the Fuzor, Enscape, and Unreal Engine 4 workflows score similarly in this option.
5.1.7 Design based evaluation
As mentioned previously, other users might have different weighting factors for what is important for them. These are
just examples.
1) Schematic design phase
During the schematic design phase, interactivity and the possibility of modifying the project are the most important criteria, because there are more chances to change the design ideas (Table 12).
Table 12 Schematic design phase evaluation
(AGI = AGI - Mobile VR Station; Elum = Revit-Elumtools - Mobile VR Station; Max = 3ds MAX - Mobile VR Station; vCAD = Revit - vCAD; FuzP = Revit - Fuzor, phone based; FuzC = Revit - Fuzor, computer based; Ensc = Revit - Enscape; UE4 = 3ds MAX - Unreal Engine 4)

Characteristics (weight)   AGI  Elum   Max  vCAD  FuzP  FuzC  Ensc   UE4
Interactivity (2)            2     2     4     6     6     8     6     8
Visualization (1)            5     4     1     1     1     5     3     2
Compatibility (1)            3     5     7     6     6     7     7     7
Accuracy (1)                 7     7     7     3     3     5     6     6
Total points                19    20    23    22    22    33    28    31
2) Design development phase
When the main idea is almost settled, it is more important to provide visual information for designers to check the project. So in the design development phase, visualization is the most important criterion (Table 13).
Table 13 Design development phase evaluation
(AGI = AGI - Mobile VR Station; Elum = Revit-Elumtools - Mobile VR Station; Max = 3ds MAX - Mobile VR Station; vCAD = Revit - vCAD; FuzP = Revit - Fuzor, phone based; FuzC = Revit - Fuzor, computer based; Ensc = Revit - Enscape; UE4 = 3ds MAX - Unreal Engine 4)

Characteristics (weight)   AGI  Elum   Max  vCAD  FuzP  FuzC  Ensc   UE4
Interactivity (1)            2     2     4     6     6     8     6     8
Visualization (2)            5     4     1     1     1     5     3     2
Compatibility (1)            3     5     7     6     6     7     7     7
Accuracy (1)                 7     7     7     3     3     5     6     6
Total points                22    22    20    17    17    30    25    25
3) Contract document phase
When it comes to the contract document phase, every drawing and design detail should be correct, so accuracy is the most important criterion in VR aided design, helping designers catch detailed mistakes (Table 14).
Table 14 Contract document phase evaluation
(AGI = AGI - Mobile VR Station; Elum = Revit-Elumtools - Mobile VR Station; Max = 3ds MAX - Mobile VR Station; vCAD = Revit - vCAD; FuzP = Revit - Fuzor, phone based; FuzC = Revit - Fuzor, computer based; Ensc = Revit - Enscape; UE4 = 3ds MAX - Unreal Engine 4)

Characteristics (weight)   AGI  Elum   Max  vCAD  FuzP  FuzC  Ensc   UE4
Interactivity (1)            2     2     4     6     6     8     6     8
Visualization (1)            5     4     1     1     1     5     3     2
Compatibility (1)            3     5     7     6     6     7     7     7
Accuracy (2)                 7     7     7     3     3     5     6     6
Total points                24    25    26    19    19    30    28    29
5.2 Develop New Workflows
Two additional workflows were developed based on the selected eight workflows. According to the evaluation, one phone-based workflow, Revit - vCAD, was chosen for its high interactivity and compatibility as well as its low cost. One computer-based workflow, 3ds MAX - Unreal Engine 4, was also chosen because it has high potential for further development in the aspects of visualization and interactivity.
The two new workflows were developed based on the vCAD workflow and the Unreal Engine 4 workflow described in Chapter 3. Unlike the general workflows, the new workflows needed more editing during the process of preparing the AGi32 calculation results, transforming the results into 3D geometry, and importing the 3D geometry of the illuminance values into modeling platforms. This is because the completed VR environment includes 3D text for the illuminance values.
1) Generating AGi32 calculation results (Figure 283-284).
Figure 283 Calculate lighting in AGi32
Figure 284 Calculation results from AGi32
2) Transforming results into 3D geometry (Figures 285 – 291)
Figure 285 Create text 3D geometry
Figure 286 Create calculation points
Figure 287 Use Dynamo to create Excel sheet
Figure 288 Rewrite the Excel sheet with AGi32 calculation results
Figure 289 Use Dynamo to read the new Excel sheet to change the text of geometry
Figure 290 Change the 3D geometry text
Figure 291 Check the 3D geometry
3) Importing 3D number geometry into modeling platforms
When the new geometry is created, it is ready to be imported into the selected workflows for further development.
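The Excel and Dynamo steps above boil down to pairing each calculation point with its AGi32 illuminance value and formatting that value as label text for the 3D text geometry. A minimal Python sketch of that pairing logic is below; the CSV layout and function name are assumptions, and in practice this logic would sit in a Dynamo Python Script node feeding the text-geometry nodes:

```python
import csv
import io

def read_illuminance(csv_text):
    """Parse rows of x, y, z, lux into ((x, y, z), label) pairs."""
    pairs = []
    for row in csv.reader(io.StringIO(csv_text)):
        if not row or row[0].startswith("#"):
            continue  # skip blank lines and comment rows
        x, y, z, lux = (float(v) for v in row)
        pairs.append(((x, y, z), f"{lux:.0f} lx"))
    return pairs

# A two-point sample grid at workplane height (values are invented).
sample = "# x, y, z, illuminance\n0,0,0.76,312.4\n1,0,0.76,298.7\n"
labels = read_illuminance(sample)
print(labels[0])  # ((0.0, 0.0, 0.76), '312 lx')
```

Each resulting label string would then replace the placeholder text of the 3D text geometry at its calculation point.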
5.2.1 Phone-based VR workflow platforms selection
AGi32, Excel, Revit Dynamo, Revit, and vCAD are the selected software for phone-based VR workflow development. AGi32 creates the illuminance value reports, and Dynamo works as a medium to transform the illuminance data into Revit geometry for visualization in vCAD.
5.2.2 Phone-based VR workflow development
The new phone-based VR workflow follows the same steps as the Revit - vCAD workflow, except that the simulation results are transformed from numbers into 3D geometry. After the simulation data is transformed, the only additional step is to import the 3D geometry of the illuminance values into Revit (Figures 292-293).
Figure 292 View the illuminance level in vCAD
Figure 293 View the illuminance level in vCAD
5.2.3 Computer-based VR workflow platforms selection
AGi32, Excel, Revit Dynamo, Revit, 3ds MAX, and Unreal Engine 4 are the selected software for computer-based VR workflow development. AGi32 creates the illuminance value reports, and Dynamo works as a medium to transform the illuminance data into Revit geometry for visualization in Unreal Engine 4.
5.2.4 Computer-based VR workflow development
The new computer-based VR workflow follows the same steps as the 3ds MAX - Unreal Engine 4 workflow, except that the simulation data is transformed from numbers into 3D geometry. After the simulation data is transformed, the only additional step is to import the 3D geometry into 3ds MAX (Figures 294-296).
Figure 294 Visualize illuminance value in Unreal Engine 4
Figure 295 Visualize illuminance value in Unreal Engine 4
Figure 296 Visualize illuminance value in Unreal Engine 4
5.2.5 Workflows evaluation
Table 15 Workflow overall evaluation

Workflow                              Interactivity  Visualization  Compatibility  Accuracy  Total points
AGI - Mobile VR Station                     2              5              3            7          17
Revit-Elumtools - Mobile VR Station         2              4              5            7          18
3ds MAX - Mobile VR Station                 4              1              7            7          19
Revit - vCAD                                6              1              6            3          16
Revit - Fuzor (phone based)                 6              1              6            3          16
Revit - Fuzor (computer based)              8              5              7            5          25
Revit - Enscape                             6              3              7            6          22
3ds MAX - Unreal Engine 4                   8              2              7            6          23
New phone-based workflow                    6              2              6            4          18
New computer-based workflow                 8              3              7            7          25
5.3 Chapter Summary
Ten workflows, including two new ones, were evaluated by the criteria of cost, interactivity, visualization, compatibility, and accuracy: AGI - Mobile VR Station, Revit-Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station, Revit - vCAD, Revit - Fuzor (phone based), Revit - Fuzor (computer based), Revit - Enscape, 3ds MAX - Unreal Engine 4, the new phone-based workflow, and the new computer-based workflow. Each criterion was considered separately for workflow evaluation. For the user-based and design-based evaluations, the criteria were given different weights according to different assumed user priorities.
CHAPTER 6: CONCLUSIONS AND FUTURE WORK
VR is very beneficial to designers, owners, and engineers for visualizing design projects, but choosing the appropriate workflow and platforms from so many options can also be frustrating. Based on the case study, workflow descriptions, and evaluation, eight selected workflows (AGI - Mobile VR Station, Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station, Revit - vCAD, Revit - Fuzor (phone based and computer based), Revit - Enscape, and 3ds MAX - Unreal Engine 4) together with two newly proposed workflows were developed and ranked by the criteria of cost, interactivity, visualization, compatibility, and accuracy. The preferred workflow may differ depending on the user and the design process.
The possibility of further evaluation methods, such as physical testing, surveys, and game engine evaluation, together with a new category of efficiency evaluation, is discussed in the future work section. Possible improvements to the interactivity, visualization, and accuracy of the software are described. Opportunities for reducing VR motion sickness, developing more workflows, and improving hardware are also mentioned.
6.1 VR Aided Lighting Design
Traditional lighting design follows a linear workflow of modeling, simulation, and visualization. In real practice, this linear strategy is time-consuming and generates a lot of repeated work. For example, to choose the appropriate fixtures, lighting designers do a lighting calculation through modeling, simulation, and visualization. To present the design outcomes to clients, they may then create a new model in another workflow, modeling and simulating again and processing the rendered images. If the clients ask to see other design options, designers must re-edit the model, rerun the simulation, and reprocess the renderings.
In VR aided lighting design, instead of setting up a new model and reprocessing the renderings, modeling, simulation, visualization, and modification happen within one loop. Whenever the project changes, lighting designers only need to modify the changed items and update the design parameters to achieve the design requirements (Figure 297).
Figure 297 VR aided lighting design
During the schematic design phase, interactivity and the possibility of modifying the project are the most important criteria, because there are more chances to change the design ideas. When the main idea is almost settled, it is more important to provide visual information for designers to check the project, so in the design development phase visualization is the most important criterion. When it comes to the contract document phase, every drawing and design detail should be correct, so accuracy is the most important criterion in VR aided design, helping designers catch detailed mistakes. Statements like these (if true) could help decide weighting factors for evaluating the workflows.
6.2 Evaluation Metrics
The cost criterion is judged on the dollar cost of the software involved in each workflow; cheaper is better. The Revit fee is excluded from the evaluation because most design firms already use Revit, so it should not be counted as an extra cost of a VR workflow.
Interactivity refers to the level of user engagement with the software. Across the workflows, it was divided into four classes, ranked from lower to higher: interactive stereo image (observing), interactive stereo panorama (looking around), motion tracking walkthrough (looking around + acting in), and scene-modifiable motion tracking walkthrough (acting and feedback). Each level of interactivity is assigned different points: interactive stereo image, 2 points; interactive stereo panorama, 4 points; motion tracking walkthrough, 6 points; and scene-modifiable motion tracking walkthrough, 8 points. Each workflow was awarded points based on the highest level of interactivity it achieved.
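The level-to-points rule can be expressed as a small lookup; this Python sketch is illustrative only, using the level names from the text:

```python
# Interactivity scoring: a workflow earns the points of the highest
# interactivity level it achieves (levels and points as defined above).
INTERACTIVITY_POINTS = {
    "interactive stereo image": 2,
    "interactive stereo panorama": 4,
    "motion tracking walkthrough": 6,
    "scene-modifiable motion tracking walkthrough": 8,
}

def interactivity_score(levels_achieved):
    """Score the highest achieved level; 0 if none apply."""
    return max((INTERACTIVITY_POINTS[lvl] for lvl in levels_achieved),
               default=0)

print(interactivity_score(["interactive stereo panorama",
                           "motion tracking walkthrough"]))  # 6
```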
Visualization scores how lighting information is portrayed in the VR environment, covering eight main characteristics: walkthrough VR rendered models, luminaire-based photorealistic renderings, illuminance values, illuminance contours, false color renderings, luminaire source photometrics, a VR interactive 2D navigational map, and real-time light changing effects. Illuminance values may matter more to lighting designers than a VR interactive 2D navigational map during the design process, but it is hard to measure how much more important one characteristic is than another for visualization. So, to keep the evaluation simple, each of the eight visualization characteristics is considered equally important and worth one point. The more characteristics a workflow achieves, the more points it receives; the score ranges from zero to eight.
The compatibility criterion describes how well data can be imported or exported from one program to another, covering geometry, luminaire source files (fixture IES files), material properties, and workflow inner compatibility (compatibility within a workflow between the modeling software and the VR software). If a workflow can read the geometry directly from Revit with no need for re-editing, it earns two points; the same rule applies to material properties, to fixture IES files, and, within a workflow, to taking the project from the modeling software to the VR software with no extra process. The characteristics of geometry, material, luminaire source, and workflow inner compatibility are considered equally important for the compatibility evaluation.
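This two-points-per-characteristic rule can be sketched as below. Because several workflows score odd compatibility totals in the evaluation tables, the sketch also assumes a one-point partial credit for characteristics that need some re-editing; that convention is an assumption made here, not stated in the text:

```python
# Compatibility scoring sketch: two points per characteristic read
# directly with no re-editing, and (assumed) one point per characteristic
# supported only with re-editing.
def compatibility_score(full, partial=()):
    """full/partial: sets drawn from {"geometry", "material",
    "ies_file", "inner_compatibility"} (illustrative names)."""
    return 2 * len(set(full)) + len(set(partial) - set(full))

score = compatibility_score(full={"geometry", "material", "ies_file"},
                            partial={"inner_compatibility"})
print(score)  # 7
```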
Accuracy was validated by comparison with a benchmark, AGi32. It is evaluated with the criteria of illuminance effect, reflection effect, color effect, and texture effect. The illuminance effect is defined by the ability to generate illuminance values and to create the lighting environment based on a customized luminaire IES file. Simulation algorithms and rendering results from the different software programs were used to evaluate the reflection, color, and texture effects from a professional's perspective.
6.3 Workflow Evaluation Summary
It is beneficial to provide people with the ranking results, helping them choose the appropriate workflow based on different criteria. For clients who wish to get more feedback and information from VR platforms, interactivity and visualization are the more important criteria (Figure 298).
Figure 298 Workflow and evaluation
6.3.1 Cost
Workflows ranked with the Revit fee, from lower cost to higher cost, are AGI - Mobile VR Station, Revit - vCAD, Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station, Revit - Enscape, 3ds MAX - Unreal Engine 4, Revit - Fuzor (phone based), and Revit - Fuzor (computer based).
Workflows ranked without the Revit fee, from lower cost to higher cost, are Revit - vCAD, Elumtools - Mobile VR Station, AGI - Mobile VR Station, Revit - Enscape, 3ds MAX - Mobile VR Station, 3ds MAX - Unreal Engine 4, Revit - Fuzor (phone based), and Revit - Fuzor (computer based).
6.3.2 Interactivity
Workflows ranked from higher to lower interactivity are 3ds MAX - Unreal Engine 4, Revit - Fuzor (computer based), Revit - Enscape, Revit - Fuzor (phone based), Revit - vCAD, 3ds MAX - Mobile VR Station, AGI - Mobile VR Station, and Elumtools - Mobile VR Station.
Among the workflows, 3ds MAX - Unreal Engine 4 and Revit - Fuzor (computer based) have the same interactivity; Revit - Enscape, Revit - Fuzor (phone based), and Revit - vCAD have the same interactivity; and AGI - Mobile VR Station and Elumtools - Mobile VR Station have the same interactivity.
6.3.3 Visualization
Workflows ranked from higher to lower visualization are Revit - Fuzor (computer based), AGI - Mobile VR Station, Elumtools - Mobile VR Station, Revit - Enscape, 3ds MAX - Unreal Engine 4, Revit - Fuzor (phone based), Revit - vCAD, and 3ds MAX - Mobile VR Station.
Among the workflows, Revit - Fuzor (computer based) and AGI - Mobile VR Station are at the same level of visualization, and Revit - Fuzor (phone based), Revit - vCAD, and 3ds MAX - Mobile VR Station share the same level of visualization.
6.3.4 Compatibility
Workflows ranked from higher to lower compatibility are Revit - Fuzor (computer based), Revit - Enscape, 3ds MAX - Unreal Engine 4, 3ds MAX - Mobile VR Station, Revit - Fuzor (phone based), Revit - vCAD, Elumtools - Mobile VR Station, and AGI - Mobile VR Station.
Revit - Fuzor (computer based), Revit - Enscape, 3ds MAX - Unreal Engine 4, and 3ds MAX - Mobile VR Station have similar compatibility, and Revit - Fuzor (phone based) and Revit - vCAD have similar compatibility.
6.3.5 Accuracy to the benchmark
Setting the AGI workflow as a benchmark, workflows ranked from the closest simulation results to the most different are Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station, Revit - Enscape, 3ds MAX - Unreal Engine 4, Revit - Fuzor (computer based), Revit - Fuzor (phone based), and Revit - vCAD. Revit - Enscape and 3ds MAX - Unreal Engine 4 have similar accuracy, as do Revit - Fuzor (phone based) and Revit - vCAD.
According to the subjective evaluation, workflows ranked from highest to lowest accuracy are AGI - Mobile VR Station, Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station, Revit - Enscape, 3ds MAX - Unreal Engine 4, Revit - Fuzor (computer based), Revit - Fuzor (phone based), and Revit - vCAD. AGI - Mobile VR Station, Elumtools - Mobile VR Station, and 3ds MAX - Mobile VR Station have similar accuracy; Revit - Enscape and 3ds MAX - Unreal Engine 4 have similar accuracy; and Revit - Fuzor (phone based) and Revit - vCAD have similar accuracy.
6.4 Proposed New Workflows
Both the new phone-based workflow and the new computer-based workflow achieve illuminance value visualization, which improves their visualization scores.
The AGi32, Excel, Revit Dynamo, Revit, and vCAD workflow is the cheapest workflow that provides illuminance values, walkthrough rendered models, and high compatibility (Figure 299).
Figure 299 New phone-based workflow
The AGi32, Excel, Revit Dynamo, Revit, 3ds MAX, and Unreal Engine 4 workflow is the overall best among the ten selected workflows. It scores the highest in the criteria of interactivity, compatibility, and accuracy, and its visualization is improved by adding illuminance value visualization (Figure 300).
Figure 300 New computer-based workflow
6.5 Future Work
There are aspects of the methodology and research that could be improved with future work, including improving the simulation results and interactivity of the existing workflows and adding more visualization features. It may also be possible to achieve dimming and fixture modification functions in real-time VR mode. More characteristics and criteria could be considered for further evaluation. Other categories of future work include evaluation methods, software improvement, an improved VR environment, workflow development, and hardware improvement.
6.5.1 Evaluation methods
More evaluation methods could be considered to compare VR workflows with new evaluation criteria, such as more
objective metrics, physical tests, and surveys.
Physical test
A physical test would be very beneficial for validating the accuracy of the software. It would be possible to find a room with the same dimensions as the reference box described in Chapter 3, measure the illuminance values in the real world with the same lighting fixture, and compare them with the software simulation results (Figures 301-302).
Figure 301 Reference box Figure 302 Lumenpulse fixture
Game engine evaluation
Unity 3D, Unreal Engine 4, and Autodesk Stingray could be evaluated in terms of cost, interactivity, visualization, compatibility, and accuracy. More evaluation categories could be included, and better methods of objectifying the evaluation criteria could be applied.
Efficiency evaluation
Workflow efficiency could be evaluated by time cost and complexity. The more time a workflow takes to achieve the final VR visualization, the less efficient it is; likewise, the harder and more complex a workflow is to master, the less efficient it is. Efficiency could be tested with students of the same year with similar academic backgrounds, or with designers with similar working experience.
Surveys
A survey of professionals in the architectural industry could collect opinions of each workflow. Evaluation categories such as interactivity, compatibility, visualization, and accuracy could be included in the survey, along with questions about each workflow's efficiency and how easy it is to master.
6.5.2 Software improvement
Interactivity
It is possible to add new features such as dimming the lights and changing fixtures in Unreal Engine 4. Unreal Engine 4 has a light intensity modification function, and it is possible to connect the light intensity command to interactivity code so that people can walk through the project virtually and hit a switch button to control the light level in real time (Figures 303-304).
Figure 303 Lighting intensity command
Figure 304 Coding in Unreal Engine 4
More visualization opportunities
Among the chosen workflows, only AGi32, Elumtools, and Fuzor can generate illuminance values. The extra editing method lets the two new workflows show illuminance values based on AGi32 calculation results, but they cannot generate illuminance values independently. It would be beneficial to visualize illuminance values together with false color renderings and illuminance (isolux) contours.
Accuracy validation
One efficient way to improve software accuracy is to improve the lighting simulation algorithm, such as raytracing, radiosity, or a customized combination of algorithms. Accuracy could be validated by comparison with a physical model, the benchmark, or both. Brightness, lighting contrast, and the pixel counts of different goggles, computer displays, and phone screens could also be considered as accuracy evaluation criteria. A more in-depth study should compare the actual values and the viewing capabilities of the goggles, and examine how one might not only validate the values but also calibrate the views in VR mode so that the visualization is as close to the real world experience as possible.
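Such a comparison could start from a simple error metric between paired readings. The sketch below (the illuminance values are invented for illustration) computes the mean relative error of a workflow's simulated values against the AGi32 benchmark or physical measurements:

```python
# Mean relative error between benchmark and simulated illuminance values,
# expressed as a percentage. Both lists must sample the same points.
def mean_relative_error(benchmark, simulated):
    if len(benchmark) != len(simulated):
        raise ValueError("point grids must match")
    errors = [abs(sim - ref) / ref for ref, sim in zip(benchmark, simulated)]
    return 100.0 * sum(errors) / len(errors)

agi32 = [310.0, 295.0, 402.0, 388.0]   # benchmark lux values (invented)
other = [322.4, 289.1, 410.0, 370.6]   # hypothetical workflow results
print(round(mean_relative_error(agi32, other), 1))  # 3.1
```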
6.5.3 Improve the VR environment
Similar to motion sickness, VR sickness occurs when the visualization does not match the body's motion, causing dizziness, fatigue, and nausea. It is possible to adjust the view angle of the VR scene to reduce sickness. There are also other methods that help people synchronize the visualization with body motion, allowing them to move in the real world while visualizing the virtual world without getting motion sickness.
6.5.4 Workflow development
More workflows could be developed, such as Revit - Stingray, 3ds MAX - Unity 3D, and Dynamo - Unreal Engine 4. A better VR aided design workflow for lighting designers could be proposed, and more interactive functions, such as dimming and changing lighting fixtures in real time, could be added to newly proposed workflows.
6.5.5 Hardware improvement
Processing speed
It is important to improve processing speed for rendering and calculation. Processing speed can be limited by the simulation algorithms: usually, the more sophisticated the algorithms a program uses, the more time it needs for processing. Processing speed also relies on hardware such as the CPU (central processing unit), GPU (graphics processing unit), and RAM (random access memory). Specialized video cards are recommended for a better VR experience.
Devices
To experience full virtual reality, people need to wear headsets such as the HTC Vive, Oculus Rift, and Microsoft HoloLens. The aspects of portability, comfort, controllability, and display resolution could all be improved.
Cost
Most VR devices cost over a hundred dollars, and some over a thousand. This pricing is hard for individual users or small companies to accept, so it is important to improve manufacturing technology to make VR devices more affordable.
6.6 Conclusion
A general VR workflow has been described and developed together with eight specific workflows: AGI - Mobile VR Station, Elumtools - Mobile VR Station, 3ds MAX - Mobile VR Station, Revit - vCAD, Revit - Fuzor (phone based), Revit - Fuzor (computer based), Revit - Enscape, and 3ds MAX - Unreal Engine 4. A case study of the reference building was conducted, and all the chosen workflows were evaluated and ranked with sample evaluation metrics based on the criteria of cost, interactivity, visualization, compatibility, and accuracy. Two newly proposed workflows were developed based on the evaluation results, achieving the visualization of illuminance values in the VR environment. Possible improvements to the current study, as well as future work on this topic, were also discussed. Professionals in the architectural industry, especially lighting designers, can benefit from the descriptions of the VR workflows and the workflow evaluations to create lighting VR environments and choose appropriate software packages.
REFERENCES
Ashdown, Ian. "Colored light sources place new demands on lighting design software." Laser focus world 46, no. 10
(2010).
Autodesk: "Using Walkthrough Navigation." Autodesk Support & Learning. February 7, 2016. Accessed April 30,
2017. https://knowledge.autodesk.com/support/3ds-max/learn-
explore/caas/CloudHelp/cloudhelp/2016/ENU/3DSMax/files/GUID-5B06CB73-DEA7-465E-9C6F-
B8E7D454A7FB-htm.html.
McKenna, Barbara. "Talking technology: A Q&A with the inventor of virtual reality." UC Santa Cruz, January 10, 2000.
Betters, Elyse. "Google Cardboard Camera app: What you need to know about VR photos." Pocket-lint. Accessed
April 28, 2017. http://www.pocket-lint.com/news/136092-google-cardboard-camera-app-what-you-need-to-
know-about-vr-photos.
Bickford, Jen. "Lighting software tools." Architectural Lighting Magazine (2008).
Bowers, Brian. Sir Charles Wheatstone. London: Institution of Electrical Engineers, 2001.
Borson, Bob. "Schematic Design – This isn’t “Architecture”." Life of an Architect. November 09, 2016. Accessed
April 26, 2017. http://www.lifeofanarchitect.com/schematic-design-this-isnt-architecture/.
Branstetter, Gillian. "Cardboard is everything Google Glass never was." The Kernel. Accessed April 28, 2017.
http://kernelmag.dailydot.com/issue-sections/staff-editorials/13490/google-cardboard-review-plus/.
Bryan, Harvey, and S. Autif. "Lighting/daylighting analysis: a Comparison." In PROCEEDINGS OF THE SOLAR
CONFERENCE, pp. 521-526. AMERICAN SOLAR ENERGY SOCIETY; AMERICAN INSTITUTE OF
ARCHITECTS, 2002.
Brockwell, Holly. "Forgotten genius: the man who made a working VR machine in 1957." TechRadar. April 03, 2016.
Accessed April 30, 2017. http://www.techradar.com/news/wearables/forgotten-genius-the-man-who-made-a-
working-vr-machine-in-1957-1318253/2.
Ceconello, Mauro, and Davide Spallazzo. "Virtual reality for enhanced urban design." In 5th INTUITION
International Conference Proceedings. 2010.
Chan, Ying-Chieh, and Athanasios Tzempelikos. "A hybrid ray-tracing and radiosity method for calculating radiation
transport and illuminance distribution in spaces with venetian blinds." Solar energy 86, no. 11 (2012): 3109-3124.
Corgan VR, "Virtual Reality is Transforming How We Design – Corgan", Corgan website, Accessed April 26, 2017.
https://www.corgan.com/story/virtual-reality-transforming-design/
Creighton, Ryan Henson. Unity 3D game development by example: A seat-of-your-pants manual for building fun,
groovy little games quickly. Packt Publishing Ltd, 2010.
Davies, Chris, "This is Windows Holographic on HoloLens (and it looks insane)." SlashGear. Accessed April 28,
2017. https://www.slashgear.com/this-is-windows-holographic-on-hololens-and-it-looks-insane-29381414/.
Delta Light VR, "IMMERSE YOURSELF IN THE WORLD OF DELTA LIGHT." Delta Light VR. Accessed April
26, 2017. http://www.deltalight.com/vr/.
Decorilla VR "How to Preview Your Interior Design in Virtual Reality." Decorilla. August 1, 2016. Accessed April
26, 2017. https://www.decorilla.com/online-decorating/how-to-preview-your-interior-design-in-virtual-reality/
Desai, Parth Rajesh, Pooja Nikhil Desai, Komal Deepak Ajmera, and Khushbu Mehta. "A review paper on oculus rift-
a virtual reality headset." arXiv preprint arXiv:1408.1173 (2014).
DiLouie, Craig. Advanced lighting controls: energy savings, productivity, technology and applications. The Fairmont
Press, Inc., 2006.
Dutre, Philip, Philippe Bekaert, and Kavita Bala. Advanced global illumination. CRC Press, 2016.
Enscape. "Enscape™: Architectural real-time rendering plugin for Revit and SketchUp." Enscape3d. Accessed April
28, 2017. https://enscape3d.com/.
Fairs, Marcus. "Virtual reality architecture will be 'more powerful than cocaine'." Dezeen. October 21, 2015.
Accessed April 28, 2017. https://www.dezeen.com/2015/04/27/virtual-reality-architecture-more-powerful-
cocaine-oculus-rift-ty-hedfan-olivier-demangel-ivr-nation/.
Fontenelle, Ciro Vidal. "The importance of lighting to the experience of architecture." Architecture Quality Issues,
KTH Royal Institute of Technology, Stockholm, Sweden (2008).
Fuller, Michael. "Mobile VR Station® on the App Store." App Store. March 27, 2017. Accessed April 28, 2017.
https://itunes.apple.com/us/app/mobile-vr-station/id959820493?mt=8.
Fuzor, "Interested in a Web Demo? Schedule a Demonstration." Fuzor - The Best VR for AEC! Accessed April 28,
2017. https://www.kalloctech.com/.
Gaudiosi, John. "How virtual reality is changing the construction industry." How virtual reality is changing the
construction industry | Fortune.com. August 26, 2015. Accessed April 26, 2017.
http://fortune.com/2015/08/25/mccarthy-construction-vr/.
Goradia, Ishan, Jheel Doshi, and Lakshmi Kurup. "A review paper on oculus rift & project morpheus." International
Journal of Current Engineering and Technology 4, no. 5 (2014): 3196-3200.
Grasshopper Website. "Ladybug Tools." Grasshopper. Accessed April 28, 2017.
http://www.grasshopper3d.com/group/ladybug.
Grau, Oliver. Virtual Art: from illusion to immersion. MIT press, 2003.
Gruber, William. "Gruber, William E." The Writers Directory, edited by Laura Avery, 33rd ed., vol. 2, St. James Press,
2015.
Gunnarsson, Erik, and Magnus Olausson. "Implementing a render plugin for 3ds Max using the Autodesk RapidRT
path tracer." (2012).
Heilig, Morton. "Stereoscopic-television apparatus for individual use." U.S. Patent 2,955,156, issued October 4, 1960.
HMH, "Boulder Colorado Architects - HMH Architecture Interiors." HMH Architecture Interiors - Modern Architect
- Boulder, Colorado. Accessed April 26, 2017. http://hmhai.com/.
Hirata, Mitsugu, Hideki Gama, and Hajimu Nakamura. "Improvement of accuracy in lighting simulation by flux
transfer method." In Proceedings of 6th international IBPSA conference. 1999.
Honkonen, Christina. "Tour the Super Bowl Stadium in Virtual Reality with vCAD." Business Wire. January 24,
2017. Accessed April 28, 2017. http://www.businesswire.com/news/home/20170124005564/en/Tour-Super-
Bowl-Stadium-Virtual-Reality-vCAD.
IES, "Photometric & Optical Testing Services - Your Measure of Confidence in Independent Light Measurement."
Pro Lite Technology. Accessed April 28, 2017. http://www.photometrictesting.co.uk/.
Julian, Warren G., and Harold James Turner. "Lighting: basic concepts." Department of Architectural Science,
University of Sydney, 1983.
Kay, Douglas Scott. "Transparency, refraction and ray tracing for computer synthesized images." Master's thesis,
Cornell University, Ithaca, NY, 1979.
Kelly, Lloyd L. The Pilot Maker. New York: Grosset & Dunlap, 1979.
Kensek, Karen, and Douglas Noble. Building information modeling: BIM in current and future practice. John Wiley
& Sons, 2014.
Kleckner, Stephen. "Autodesk’s Stingray may be a big threat to Unity and Unreal in the game-engine wars."
VentureBeat. August 03, 2015. Accessed April 28, 2017. https://venturebeat.com/2015/08/03/autodesks-stingray-
may-be-a-big-threat-to-unity-and-unreal-in-the-game-engine-wars/.
Kniss, Joe, Simon Premoze, Charles Hansen, Peter Shirley, and Allen McPherson. "A model for volume lighting and
modeling." IEEE Transactions on Visualization and Computer Graphics 9, no. 2 (2003): 150-162.
Kim, Monica. "The Good and the Bad of Escaping to Virtual Reality." The Atlantic. February 18, 2015. Accessed
April 28, 2017. https://www.theatlantic.com/health/archive/2015/02/the-good-and-the-bad-of-escaping-to-
virtual-reality/385134/.
Kiyokawa, Kiyoshi. "A wide field-of-view head mounted projective display using hyperbolic half-silvered mirrors."
In Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on, pp. 207-
210. IEEE, 2007.
Klinger, Marilyn, and Marianne Susong. "The construction project: phases, people, terms, paperwork, processes."
American Bar Association, 2006.
Knudsen, Steven (Gensler). Lecture at USC, March 2, 2017, showing a VR example for the new LAX midfield
concourse.
Konnikova, Maria. "Virtual Reality Gets Real." The Atlantic. September 21, 2015. Accessed April 28, 2017.
https://www.theatlantic.com/magazine/archive/2015/10/virtual-reality-gets-real/403225/.
Liddell, Henry George, and Robert Scott. An intermediate Greek-English lexicon: founded upon the seventh edition
of Liddell and Scott's Greek-English lexicon. Harper & Brothers, 1896.
Labayrade, Raphaël, Henrik Wann Jensen, and Claus Jensen. "Validation of Velux Daylight Visualizer 2 against CIE
171: 2006 test cases." In Proceedings 11th International IBPSA Conference, International Building Performance
Simulation Association, pp. 1506-1513. 2009.
Lamkin, Paul. "Phil Chen: A state of the VR nation with the HTC Vive founder." Wareable. November 17, 2016.
Accessed April 28, 2017. https://www.wareable.com/meet-the-boss/phil-chen-the-founder-of-the-htc-vive-on-
all-things-vr-3523.
Luebkeman, Chris, and Alvise Simondetti. "Practice 2006: Toolkit 2020." In Intelligent Computing in Engineering
and Architecture, pp. 437-454. Springer Berlin Heidelberg, 2006.
Mango, Jennifer. "The value of the post-occupancy evaluation." Laboratory Design News. June 13, 2016. Accessed
April 26, 2017. https://www.labdesignnews.com/article/2015/08/value-post-occupancy-evaluation.
McBride, Sarah. "With HoloLens, Microsoft aims to avoid Google's mistakes." Reuters. May 23, 2016. Accessed
April 28, 2017. http://www.reuters.com/article/us-microsoft-hololens-idUSKCN0YE1LZ.
McFarland, Jon. Mastering Autodesk VIZ 2008. John Wiley & Sons, 2007.
Moeck, Martin, and Steven E. Selkowitz. "A computer-based daylight systems design tool." Automation in
construction 5, no. 3 (1996): 193-209.
Mól, Antônio Carlos A., Carlos Alexandre F. Jorge, and Pedro M. Couto. "Using a game engine for VR simulations
in evacuation planning." IEEE computer graphics and applications 28, no. 3 (2008): 6-12.
Morris, J. R. W. "Accelerometry—a technique for the measurement of human body movements." Journal of
Biomechanics 6, no. 6 (1973): 729-736.
Microsoft. "Detail of light reflecting on lens of HoloLens." Microsoft HoloLens. Accessed April 28, 2017.
https://www.microsoft.com/en-us/hololens.
Nakamae, Eihachiro, Xueying Qin, and Katsumi Tadamura. "Rendering of landscapes for environmental
assessment." Landscape and Urban Planning 54, no. 1 (2001): 19-32.
Nield, David. "Oculus shows off the finished Rift headset (and Oculus Touch controllers), available in 2016." New
Atlas - New Technology & Science News. June 11, 2015. Accessed April 28, 2017. http://newatlas.com/oculus-
rift-consumer-version/37989/.
Pena, William M., and Steven A. Parshall. Problem seeking: An architectural programming primer. John Wiley &
Sons, 2012.
Perez, Sarah. "Google’s new iOS app Cardboard Camera lets you snap VR photos to share with friends." TechCrunch.
September 12, 2016. Accessed April 28, 2017. https://techcrunch.com/2016/09/12/googles-new-ios-app-
cardboard-camera-lets-you-snap-vr-photos-to-share-with-friends/.
Pimentel, Ken, and Kevin Teixeira. "Virtual reality through the new looking glass." Intel/Windcrest, (1993).
Plant, C. G. H., and D. W. Archer. "A computer model for lighting prediction." Building Science 8, no. 4 (1973): 351-
361.
Plasencia, Diego Martinez. "XRDS: One step beyond virtual reality." XRDS, an ACM Publication. November 01,
2015. Accessed April 28, 2017. http://xrds.acm.org/article.cfm?aid=2809921.
QaQish, K. Ra'Ed. "Exploiting Tools of Evaluation to Improve CAAD Teaching Methods." In Computer Aided
Architectural Design Futures 2001.
Radsite, "Radiance - A Validated Lighting Simulation Tool." Radsite. Accessed April 28, 2017. https://www.radiance-
online.org/.
Reichardt, John E. "The Lighting Design Process." Wiley & Wilson, 2010.
Rohit. "Top 3 Free VR Video Players for iPhone to Play SBS 3D & 360 Videos." TechnOrange. February 11, 2016.
Accessed April 28, 2017. http://www.technorange.com/2016/02/top-3-free-vr-video-players-for-iphone-to-play-
sbs-3d-360-videos/.
Ruggiero, Francesco, Rafael Serra Florensa, and Antonella Dimundo. "Re-interpretation of traditional architecture for
visual comfort." Building and Environment 44, no. 9 (2009): 1886-1891.
Ryckaert, Wouter R., Catherine Lootens, Jonas Geldof, and Peter Hanselaer. "Criteria for energy efficient lighting in
buildings." Energy and Buildings 42, no. 3 (2010): 341-347.
Ryer, Alex. Light measurement handbook. Newburyport, MA: International Light, 1997.
Samsung VR, "Samsung Gear VR with Controller." The Official Samsung Galaxy Site. Accessed April 28, 2017.
http://www.samsung.com/global/galaxy/gear-vr/.
Svendenius, Nils, and Peter Pertola. "Searching for useful lighting design software: Developing technical
specifications based on real needs." In Right Light Proceedings: the 3rd European Conference on Energy-
Efficient Lighting, pp. 19-24. 1995.
Shikder, Shariful. "Evaluation of four artificial lighting simulation tools with virtual building reference."
In Proceedings of the 2009 Summer Computer Simulation Conference, pp. 430-437. Society for Modeling &
Simulation International, 2009.
Sherman, William R., and Alan B. Craig. Understanding Virtual Reality: Interface, Application, and Design. Morgan
Kaufmann Publishers, 2003.
Sgambelluri, Marcello. "High-Tech Structural Engineering: Using New Technologies to Enhance Your Workflows."
Autodesk University, 2016.
Steffy, Gary. Architectural lighting design. John Wiley & Sons, 2002.
Stelzer, Tim, and W. F. Long. "Automatic generation of tree level helicity amplitudes." Computer Physics
Communications 81, no. 3 (1994): 357-371.
Sutherland, Ivan. "Sketchpad: a man-machine graphical communication system." PhD diss., MIT, 1963.
University of Colorado Boulder, "Facilities Management." Home | Facilities Management | University of Colorado
Boulder. Accessed April 26, 2017. http://www.colorado.edu/fm/.
Unity 3D, "Game Engine." Unity. Accessed April 28, 2017. https://unity3d.com/.
Unreal Engine 4, "Make Something Unreal." What is Unreal Engine 4. Accessed April 28, 2017.
https://www.unrealengine.com/what-is-unreal-engine-4.
Ulbricht, Christiane, Alexander Wilkie, and Werner Purgathofer. "Verification of physically based rendering
algorithms." In Computer Graphics Forum, vol. 25, no. 2, pp. 237-255. Blackwell Publishing Ltd, 2006.
vCAD. "VCad App - CAD to Virtual Reality - Immersive CAD Visualization by ImmersaCAD." VCad - CAD to
Virtual Reality. Accessed April 28, 2017. http://www.vcad.com/.
Virtual Reality Society. "Virtual Reality." Accessed April 30, 2017. https://www.vrs.org.uk/.
Wang, Xiaolong, and Abhinav Gupta. "Generative image modeling using style and structure adversarial networks."
In European Conference on Computer Vision, pp. 318-335. Springer International Publishing, 2016.
Ward, Gregory J. "The RADIANCE lighting simulation and rendering system." In Proceedings of the 21st annual
conference on Computer graphics and interactive techniques, pp. 459-472. ACM, 1994.
Ward, Greg, and Rob Shakespeare. Rendering with Radiance: the art and science of lighting visualization. 1998.
Werge, John. The evolution of photography. Arno Press, 1973.
Weinbaum, Marvin G. "Counterterrorism, Regional Security, and Pakistan’s Afghan Frontier." US House of
Representatives Armed Services Committee Hearing (2007).
Wlosinski, Brandon. "Virtual Reality in Practice: Q&A with Brandon Wlosinski, BNIM Manager of Virtual Design
and Construction." BNIM. November 04, 2016. Accessed April 26, 2017. http://www.bnim.com/blog/virtual-
reality-practice-qa-brandon-wlosinski-bnim-manager-virtual-design-and-construction.
Wilkie, Alexander, Andrea Weidlich, Marcus Magnor, and Alan Chalmers. "Predictive rendering." In ACM
SIGGRAPH ASIA 2009 Courses, p. 12. ACM, 2009.
Wu, Hao. "Virtual reality-improving the fidelity of architectural visualization." PhD diss., 2006.
BIBLIOGRAPHY
Alvarado, Rodrigo Garcia, and Tom Maver. "Virtual reality in architectural education: defining possibilities." Acadia
Quarterly 18, no. 4 (1999): 7-9.
Carroll, William L. "Daylighting simulation: methods, algorithms, and resources." Lawrence Berkeley National
Laboratory (1999).
Gouraud, Henri. Computer display of curved surfaces. No. UTEC-CSc-71-113. UTAH UNIV SALT LAKE CITY
COMPUTER SCIENCE DIV, 1971.
Guglielmetti, Rob, Dan Macumber, and Nicholas Long. "OpenStudio: an open source integrated analysis platform."
In Proceedings of the 12th conference of international building performance simulation association. 2011.
Hershberger, Robert G., and Robert C. Cass. "Predicting user responses to buildings." Man-Environment Interactions:
Evaluations and Applications (Part 2), Stroudsburg: Dowden, Hutchinson and Ross (1974).
Hitchcock, Robert J., and William L. Carroll. "DElight: A daylighting and electric lighting simulation
engine." International Building Performance Simulation Association, IBPSA BS (2003).
Steuer, Jonathan. "Defining virtual reality: Dimensions determining telepresence." Journal of Communication 42, no.
4 (1992): 73-93.
Karlen, Mark, James R. Benya, and Christina Spangler. Lighting design basics. John Wiley & Sons, 2012.
Kieferle, Joachim, and Uwe Woessner. "BIM Interactive-About combining BIM and Virtual Reality-A Bidirectional
Interaction Method for BIM Models in Different Environments." (2015).
Lányi, Cecília Sik. "Lighting in Virtual Reality." JAMPAPER 1, no. IV (2009).
McKay, Herbert C. "Three Dimensional Photography Principles Of Stereoscopy." (1951).
Meseth, Jan. "Towards predictive rendering in virtual reality." PhD diss., University of Bonn, 2008.
Ochoa, Carlos E., Myriam BC Aries, and Jan LM Hensen. "State of the art in lighting simulation for building science:
a literature review." Journal of Building Performance Simulation 5, no. 4 (2012): 209-233.
Reinhart, Christoph F., and Pierre-Felix Breton. "Experimental validation of 3ds Max® Design 2009 and Daysim
3.0." Building Simulation (2009).
Reinhart, Christoph, and Annegret Fitz. "Findings from a survey on the current use of daylight simulations in building
design." Energy and Buildings 38, no. 7 (2006): 824-835.
Rosenberg, Robin S., Shawnee L. Baughman, and Jeremy N. Bailenson. "Virtual superheroes: Using superpowers in
virtual reality to encourage prosocial behavior." PloS one 8, no. 1 (2013)
Roy, Geoffrey G. "A comparative study of lighting simulation packages suitable for use in architectural
design." School of Engineering, Murdoch University, Perth, Australia (2000).
Slusallek, Philipp, Marc Stamminger, Wolfgang Heidrich, J-C. Popp, and H-P. Seidel. "Composite lighting
simulations with lighting networks." IEEE Computer Graphics and Applications 18, no. 2 (1998): 22-31.
Souha, Tahrani, Jallouli Jihen, Moreau Guillaume, and Woloszyn Philippe. "Towards a Virtual Reality Tool for
Lighting Communication and Analysis in Urban Environments." (2005).
Ubbelohde, M. Susan, and Christian Humann. "Comparative evaluation of four daylighting software
programs." Proceedings of the ACEEE Summer Study on Energy Efficiency in Buildings, Pacific Grove, CA (1998): 23-28.
Business Wire. "Leading Virtual Reality and Augmented Reality Company Marxent Wins 2016 Gold Edison Award
for Innovation in Virtual Reality." The Hitchhiker's Guide to Dayton Tech & Startups. April 29, 2016. Accessed
April 30, 2017. http://www.daytontechguide.com/in-the-news/2016/4/29/leading-virtual-reality-and-augmented-
reality-company-marxent-wins-2016-gold-edison-award-for-innovation-in-virtual-reality.
Winchip, Susan M. Fundamentals of lighting. Bloomsbury Publishing USA, 2017.