DATA VISUALIZATION IN VR/AR:
Static Data Analysis in Buildings
by
Danqi Meng
A Thesis Presented to the
FACULTY OF THE USC SCHOOL OF ARCHITECTURE
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
MASTER OF BUILDING SCIENCE
May 2023
Copyright 2023 Danqi Meng
ACKNOWLEDGMENTS
I would like to thank my parents, who supported my studies at USC and always encouraged me.
I would like to thank my thesis chair, Professor Karen Kensek, for her patient guidance through each
stage of the process.
I would like to thank my thesis committee members, Professor Eric Hanson, Professor Douglas
Noble, and Professor Joon-Ho Choi, who gave valuable advice from their professional perspectives.
I also want to give my heartfelt thanks to everyone who has helped and supported me in
this thesis. Thank you.
Committee members
Karen M. Kensek, LEED BD+C, DPACSA
Professor of Practice, USC, School of Architecture
kensek@usc.edu
Eric Hanson, B.A.
Associate Professor of Cinematic Practice, USC, School of Cinematic Arts
hanson@usc.edu
Douglas Noble, Ph.D., FAIA
Professor, USC, School of Architecture
dnoble@usc.edu
Joon-Ho Choi, Ph.D., LEED AP, Assoc. AIA
Associate Professor, Building Science, USC, School of Architecture
joonhoch@usc.edu
TABLE OF CONTENTS
ACKNOWLEDGMENTS
ABSTRACT
Chapter 1. INTRODUCTION
1.1 Data Visualization
1.2 Virtual Reality and Augmented Reality
1.2.1 Virtual Reality
1.2.2 Augmented Reality
1.2.3 Benefits of VR and AR in Buildings
1.2.4 Software Used to Create VR and AR
1.3 BIM Based Real-time Data Monitoring
1.3.1 BIM
1.3.2 Environmental Sensors
1.4 Summary
Chapter 2. LITERATURE REVIEW
2.1 Data Visualization in a Building’s Indoor Environment
2.1.1 History of Data Visualization
2.1.2 Methods of Data Visualization
2.1.3 Architecture and Facility Management
2.1.4 Using Data Visualization in Architecture Area
2.2 Virtual Reality and Augmented Reality Used in Scientific Visualization
2.2.1 History of VR and AR
2.2.2 VR and AR in Visualization
2.2.3 Potential in Future Building Industry
2.3 Summary
Chapter 3. METHODOLOGY
3.1 Digital Geography
3.1.1 Revit Model
3.1.2 Define Materials
3.2 Data Collection
3.3 Visualization
3.4 User Interface
3.4.1 Hands
3.4.2 Canvas
3.4.3 Buttons
3.4.4 Interface Setting
3.5 Scripts
3.5.1 Toggle
3.5.2 Button
3.5.3 Slider
3.5.4 Menu
3.6 Package the Project
3.7 Summary
Chapter 4. USER EXPERIENCE VR
4.1 Prepare the Files
4.1.1 Create a Revit File
4.1.2 Create a Spreadsheet with the Data
4.1.3 Unity Project File
4.2 Revit to 3ds Max to Unity
4.2.1 Export the Revit model as an FBX File
4.2.2 FBX File Import into Unity
4.3 Import CSV File to Unity
4.4 Experience in VR
4.4.1 Enabling VR
4.4.2 Set Up Headset
4.4.3 Set Up Player’s View
4.4.4 Implementation of Button for VR in Unity
4.5 Trying Out All the Features
4.6 Summary
Chapter 5. USER EXPERIENCE AR
5.1 Unity Project Experience in AR
5.2 Prepare the Files
5.2.1 Difference Files with VR
5.2.2 Create a Spreadsheet with the Data
5.2.3 Script
5.2.4 Unity Project File
5.3 Import CSV File to Unity
5.4 Enable AR in Unity
5.4.1 Register Vuforia
5.4.2 Download the SDK and Import it into Unity
5.4.3 Add and Set Up ARCamera
5.4.4 Set Up Buttons
5.4.5 Add Identification Marks
5.4.6 Build and Run the Project
5.5 Using All the Features
5.6 Summary
Chapter 6. CONCLUSION AND FUTURE WORK
6.1 Discussion
6.1.1 Background
6.1.2 VR Workflow
6.1.3 AR Workflow
6.1.4 Custom Scripts
6.2 Limitations and Improvements
6.2.1 Unity
- Visual Effect
- Experience Effect
- Data Collection
- Data Visualization
6.2.2 VR Limitations
- Users Experience
- Project Presentation
- Physical Comfort
6.2.3 AR Limitations
- Outcome Not Stable
- Direction Not Clear
6.3 Validation and Future Work
6.4 Summary
REFERENCES
APPENDIX
Scripts of data visualization for VR
Script for data visualization for AR
Scripts of Map button
Scripts for start button
Script of data visibility
Scripts of timestamp
ABSTRACT
The depiction of data through the use of typical graphics, such as infographics, charts, and even
animations, is known as data visualization. These informational visual representations make
complex data relationships and data-driven insights simple to comprehend. Virtual reality (VR)
and augmented reality (AR) are two ways to perceive data directly through virtual imagery.
These potentially powerful and innovative multidimensional data visualization
tools can also provide an easy and natural way to visualize and explore collaborative data. This
data can come from a variety of sources. Especially appropriate to portray in VR and AR is data
that is associated with specific locations or spaces. For example, one can collect data on
temperature, humidity, and particle concentration at different locations in a building and then
visually represent the data in the form of numbers, graphs, charts, and other methods. One can
view the data at different times by swiping to change the date and time of the
data. In addition, with appropriate programming, different types of data can be toggled with a
button. WSP's office in downtown Los Angeles was chosen as a case study. The indoor
environmental condition was recorded every minute. Temperature and relative humidity data
were collected by a HOBO MX 1102 sensor, and particle density, measured as the
concentration of PM 2.5 in the office, was gathered by a PA-II-SD air quality sensor from PurpleAir.
The spatial model and data were processed by Unity to visualize different positions at different
times. The sensors were used as props to collect real-time data so that people can walk into the
room and see the temperature and other conditions of the building from the physical sense and
data. Through all these, it is possible to monitor the indoor environmental conditions of a room
over a period of time to test the comfort level of the room for its occupants. The data was transformed into
both VR (virtual reality) and AR (augmented reality) environments. Both VR and AR could be
beneficial as two techniques for people to have a better understanding of the indoor environment
quality. It also provides a tool for people to choose the most suitable location for themselves
within the office.
Research Objectives
• Make the data of the indoor environment easier and clearer for people to understand.
• Find ways to visualize data using AR/VR based on sensor measurements in a building.
• Create an intuitive system for evaluating the indoor conditions of buildings.
Hypothesis
Data on the temperature, humidity, air particle count, and other conditions of a building can be
visualized in 3D formats for people through VR or AR methods in a static process.
Keywords: data visualization, BIM, virtual reality, augmented reality
Chapter 1. INTRODUCTION
This chapter describes general information about data visualization, virtual reality and
augmented reality (including their comparison), and BIM-based real-time data monitoring.
1.1 Data Visualization
Thanks to the continuous advancement of modern technology and the processing capability of various
software and systems, data are no longer mere numbers or spreadsheets; they can be shown visually
in many forms to facilitate the analysis of their patterns and trends. Turning those numbers into a
wide variety of graphs has become a vital tool for developing a more detailed and direct
understanding of the data.
Data visualization is the graphic depiction of data. It enables decision-makers to view analysis
given in a visual manner so they can understand intricate topics or spot novel patterns (Donalek,
2014). With interactive visualization, it is possible to delve deeper into graphs and charts to find
more specific information (Figure 1.1). Charts and graphs make it simpler to visualize enormous
volumes of complex data than spreadsheets and reports due to the way the human brain processes
information (Li et al., 2015).
Figure 1.1 Basic data visualization formats (Oetting, 2022)
Data visualization can convey concepts quickly and easily; it can also convey a large
amount of information in a single picture, with visual comparisons and adjustments. Some
companies provide additional methods of visualizing data, such as the aid of artificial intelligence and
machine learning to help parse through and analyze large datasets. The BadVR project, for
example, visualizes geospatial smart city data in VR, combining several kinds of graphs in a
single view (Figure 1.2). The velocity of wind and its comfort level for
pedestrians can be shown in graphs to give a comparison with the ideal result (Figure 1.3). In
addition, the data can also be visualized in plan to see the distribution across the area (Figure 1.4). The
same data can also be shown in a 3D model to demonstrate the relationship between wind velocity
and building height. One series of data can thus show different information through these different
kinds of data visualization methods (Figure 1.5) (AECmagazine, 2019).
Figure 1.2 Screenshot of BadVR of geospatial data visualization (Borders, 2018)
Figure 1.3 The relative velocity of the wind tunnel test (AECmagazine, 2019)
Figure 1.4 The velocity and comfort level of wind tunnel shown in plan
(AECmagazine, 2019)
Figure 1.5 Same data of wind velocity and comfort level in 3D (AECmagazine, 2019)
Visualization of data has been an important topic in scientific research in recent years. The role of
visualization is to guide domain scientists to a better and more intuitive understanding of their data.
In the past decade, a large number of ensemble visualization techniques have been introduced with
encouraging results (Cecchini et al., 2019). Scientific visualization has always been an important
process in engineering to keep track of large-scale simulation data and provide intuitive,
understandable graphics and models to display useful data. Organizing data into an understandable
format and showing trends and anomalies facilitates the telling of stories. A strong visualization
highlights important information while reducing data noise. Information can be communicated
more effectively, intuitively, and clearly through visualization. It provides the most accurate visual
perception of the facts (Cruz et al., 2021).
The need for data visualization has been widely investigated and discussed in many fields.
Similarly, in the architectural area, the importance of data visualization to support design decisions
in the early design phase has been demonstrated. In architecture, the visualization of data is
particularly helpful in visualizing and communicating ideas and analyzing information and data
(Han et al., 2022). Visual representations have played an essential role in analyzing the factors that
influence the architectural design as a factor. Since the invention of computers and the
development of computer simulation and analysis, their applicability has been further expanded
(Guo et al., 2017). Using the same method of visualizing data, information about the building's
interior and exterior environment, such as temperature and light levels, can be displayed
graphically. For example, sunlight and its radiation data at a location can be shown in a wheel chart (Figure
1.3). The influence and level of radiation on surrounding buildings can be visualized in colors
(Figure 1.4).
Figure 1.3 A sunlight hour analysis of radiation in a wheel graph (Vakshoor, 2018)
Figure 1.4 Sunlight Hour Analysis for a Sydney Project using Rhino, Grasshopper
and Ladybug (Vakshoor, 2018)
1.2 Virtual Reality and Augmented Reality
“Augmented reality (AR) and virtual reality (VR) bridge the digital and physical worlds. They
allow people to take in information and content visually, in the same way, users can take in the
world” (AR & VR Home, 2021). The field of architecture is being revolutionized by augmented
reality and virtual reality, which allow designers to use 3D models to see their concepts. Both
AR and VR offer the opportunity to show proposed structures to clients, because the 3D models
help clients visualize what the design is expressing, determine what they want, and
use the models for design analysis (Henze, 2022). When designers and clients discover problems earlier, it
reduces the need for revisions and rework later in the construction process (Hall, 2018). With AR,
one can add a digital layer to the viewer’s physical surroundings. One can explore and engage with
a virtual reality (VR) environment. Both technologies have significant promise for the growth of
methods and options in building and architecture.
1.2.1 Virtual Reality
Virtual reality (VR) is an interactive computer simulation (usually used with a 3D digital model)
that senses the location and actions of participants and replaces or enhances feedback to one or
more of the senses, giving a sense of mental immersion or presence in the simulation (Sherman,
2003). People may sense things that are only digital because it is a simulated experience that can
be significantly different from or identical to the real world (William R. 2003). It has been shown
that virtual reality can play an important role in the discovery of domains whose primary
dimensions are spatial (Sherman, 2003).
A typical virtual reality device contains at least a head mounted display, a set of sensors, and a set
of computational components that are assembled in the device. The screen is used to display a
simulated image projected on the user's eyes, the sensors are used to sense the user's rotation angle
and sometimes location, and the computing components collect information from the sensors to
determine what image to display on the screen (Metcalfe, 2018) (Figure 1.5).
Figure 1.5 A man using VR headset and hand controllers (Metcalfe, 2018)
VR technology already has some uses in the architectural industry, where architects using VR will
be able to visualize the dimensions of architectural work, among other things (Hole, 2021). In addition,
since the machines use these same models to cut the pieces, they will know exactly what is being
sent to the construction site, making production and design more efficient. Floor plans, 3D
renderings, and models are often used to convey the idea of a particular space in a design, but even
these methods may not be effective in conveying the idea to the client. In this case, it would be
helpful if the client could truly visualize the design, allowing them to explore a virtual
representation of a particular room, floor, or architectural design as a whole (Natephra, 2019)
(Figure 1.6).
Figure 1.6 Concept of using of VR in architecture area (Grant, 2017)
In theory, placing the client in a virtual, detailed representation of the architectural design makes
the feedback process more direct. They can see what they like and dislike about certain elements
of the design clearly. Clients are often not familiar with standard professional graphics and models
of the architecture profession. They may have difficulty understanding and viewing floor plans or
3D models. Providing clients with an interactive AR/VR environment that mimics real life can
make comprehension easier and faster. This means less time spent going back and forth to revise
the design and waiting for further feedback. Real-time changes can also be made in the virtual
world, giving clients a sense of specific aesthetic features, such as wall color, lighting, and even
furniture (TMDStudio, 2017). There are many applications for virtual reality both within
architecture and without (Table 1-1).
Table 1-1 Applications of VR

Automotive Industry: BMW and Jaguar Land Rover use VR for early design and engineering reviews,
checking the visual design of vehicles and catching problems of actual use before actual
manufacturing (Thompson, 2022) (Figure 1.7.1).
Figure 1.7.1 JLR are using VR to hold engineering reviews (Thompson, 2022)

Healthcare: Osso VR enables surgeons to interact with medical devices in VR and perform surgical
exercises on virtual bodies to help improve familiarity with new devices and proficiency with
implanted equipment (Ugolik, 2019) (Figure 1.7.2).
Figure 1.7.2 Osso VR provides a surgical training and assessment tool (Ugolik, 2019)

Tourism: VR Expeditions 2.0 allows users to travel a digital world (RobotLab, 2022) (Figure 1.7.3).
Figure 1.7.3 Screenshot of VR Expeditions 2.0 (RobotLab, 2022)

Real Estate: The Matterport 3D camera provides a realistic scan of a building, which people can then
visit in VR before visiting the real site (Matterport, 2022) (Figure 1.7.4).
Figure 1.7.4 Matterport 3D VR building (Matterport, 2022)

Architecture: In the UK, the BBC has a TV show, "Your Home Made Perfect," which features two rival
architects showing designs to homeowners in VR before they are built in reality (BBC, 2019)
(Figure 1.7.5).
Figure 1.7.5 BBC TV show Your Home Made Perfect (BBC, 2019)

Interior Design: Flipspaces uses VR to provide users with a 3D visualization of the interior of their
home or work space, from lighting to ventilation, color schemes and products (Sharma, 2022)
(Figure 1.7.6).
Figure 1.7.6 Interior design from Flipspaces shown in VR (Sharma, 2022)

Entertainment: VR is changing the way media is created. XR offers real-time animation and motion
capture, enabling creators to build interactive animation shows or live-stream animation
performances via VR (Georgiadis, 2022) (Figure 1.7.7).
Figure 1.7.7 Flipside Studio on Steam (Georgiadis, 2022)
Virtual reality (VR) applications created for the needs of architects and their clients can be used
in the area of architecture to give people a lifelike impression of being in a building that is about
to be constructed. As a result, the user can get a more accurate sense of the building's size,
ambiance, and space before anything is really built. Clients can also concentrate on a real visual
experience to get a true grasp of the entire structure and the chance to see many interior design
variants and in some cases, be able to move furniture or edit the space directly (Anett, 2018)
(Figure 1.8).
Figure 1.8 Screenshot of the use of VR in interior design (Racz, 2016)
In addition to this, it is also possible to improve the interior lighting model in VR
to provide realistic lighting and manipulate it, making VR a useful design tool. Using Radiance
lighting models in a model of a room with different light levels can result in faster, more
efficient, more realistic lighting models and interactive applications to design interior lighting
systems more effectively (Vrex, 2022) (Figure 1.9).
Figure 1.9 VR application in Visa Lighting (Visa Lighting, 2022)
Images do not have to be realistic. They can also show data, for example, by using false color
(Figure 1.10) (Figure 1.11).
Figure 1.10 The rendered model in Enscape (Dell'Oglio, 2022)
Figure 1.11 A false color image in Enscape to show the illuminance levels of the
interior of the model (Dell'Oglio, 2022)
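The same idea can be carried over to a game engine: a measured value such as illuminance or temperature can be remapped to a color gradient and applied to a surface. Below is a minimal sketch in Unity-style C#; the blue-to-red gradient and the 18 to 28 °C range are illustrative assumptions, not Enscape's implementation or the thesis project's code.

using UnityEngine;

// Minimal false-color sketch: remap a measured value to a blue-to-red gradient.
public static class FalseColor
{
    public static Color Map(float value, float min, float max)
    {
        float t = Mathf.InverseLerp(min, max, value);   // normalize the value to 0..1
        return Color.Lerp(Color.blue, Color.red, t);    // low values = blue, high values = red
    }
}

// Example use on a GameObject's material (hypothetical values):
// GetComponent<Renderer>().material.color = FalseColor.Map(temperature, 18f, 28f);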
VR can help people step into a 3D model of the project and share the experience. Everyone
involved in the project gets an intuitive understanding of the proposed solution, regardless of
their role, the stage of the project and their professional background (Vrex, 2022) (Figure 1.12).
It can improve collaboration, planning, visualization of data and 3D models, enable clearer
communication between stakeholders, and lead to better coordinated outcomes between all trades
for better project quality (Schiavi, 2022).
Figure 1.12 Virtual meeting in Vrex software (Vrex, 2022)
1.2.2 Augmented Reality
Augmented reality (AR), often used in conjunction with the term virtual reality in the literature,
allows users to view additional computer-generated information in real-world scenarios. Unlike
virtual reality, augmented reality is an overlay of digital content on the real scene (Figure 1.13).
Figure 1.13 CityViewAR showing virtual building on-site in AR view (Lee, 2012)
As mobile technology advances and becomes more intertwined with every aspect of everyday life,
augmented reality (AR) is getting closer to full fruition (Newman, 2016, Gillis, 2022). Pokémon
Go provides a perfect example of what could be the trigger to begin widespread development and
adoption of this technology (Newman, 2016). It has brought augmented reality from a niche
technology to the mainstream by allowing millions of people to roam the physical world to capture
virtual characters in a game (Hsu, 2016). Pokémon GO effectively uses augmented reality (AR)
technology in the hands of millions of smartphone users, where Pokémon are positioned on a map
of the real world, and when arriving at that location, users see them in the "real world" through
their phone screens and then catch them. The game opened the window for augmented reality to
be embraced. Pokémon GO encouraged players to walk around, socialize, and even make friends,
making it easy for people to embrace the new technology with a low-invasive approach to gaming,
thus driving the development and use of AR in various industries (Newman, 2016) (Figure 1.14).
Figure 1.14 Pokemon GO on mobile phone (Holly, 2017)
AR is defined as a system that combines real and virtual worlds, allows for real-time interaction,
and accurately registers virtual and real things in 3D. The sensory data that is superimposed can
be useful. This experience is so completely integrated with the real world that it appears to be a
realistic component of the setting. In contrast to virtual reality, which totally replaces the user's
real-world environment with a simulated one, augmented reality modifies one's continuous
perspective of a real-world environment (Wang, 2009). Augmented reality technology
superimposes computer-generated images onto a view of the real world seen through a headset or
mobile device. Microsoft's HoloLens technology takes augmented reality and uses it for everything
from gaming to collaborative documentation and analytics (Hosanagar, 2016) (Figure 1.15).
Figure 1.15 Microsoft Hololens use in repair (Microsoft, 2018)
A real environment is superimposed with computer-generated objects in augmented reality, which
places the objects with variable degrees of interactivity after identifying specific real-world
characteristics (Delgado, 2015). Accessibility is AR's primary benefit. Compared to VR, AR
requires less time and money; everyone's smartphone can be used as an AR viewing
device, lowering the cost and barriers to use. Additionally, being able to see one's surroundings while
examining objects is the most natural and comfortable experience. One of AR's main
drawbacks is also what makes it appealing: viewing virtual objects on a smartphone
or tablet can restrict the experience if not done appropriately. The limitations of
the AR experience lie in the fact that fully immersive HMDs, which enable AR applications to
span the field of view, are currently quite expensive and rather impractical for the general public
(Dallasega et al., 2020) (AR & VR home, 2022). AR can produce an augmented workspace where
people can interact with digital information in a virtual space (Henze, 2022) (Figure 1.16).
Figure 1.16 AR software working interface (Emerson, 2020)
Current methods of visualizing architecture, the design process, building construction procedures,
and project management systems are all expected to be improved by augmented reality technology
(Dallasega et al., 2020). The large amount of data and information that must be accessed by different
stakeholders, especially in the architecture and design industry, where extensive design,
engineering, and management data must be shared, creates situations where AR
methods are most promising, by bringing the relevant staff into an augmented workspace (Paine, 2018).
It is an interactive experience of a real-world environment in which objects that exist in the real
world are augmented with computer-generated perceptual information, sometimes across multiple
sensory modalities, including vision, hearing, touch, somatosensory, and smell (Abdelalim, 2017;
Cipresso et al., 2018) (Table 1-2).
Table 1-2 AR applications

Repair and maintenance: For automotive motors and MRI machines, repair and maintenance personnel
use AR headsets and glasses while performing their work, providing them with useful information in
the field, suggesting potential fixes, and pointing out potential trouble areas (Reflekt, 2022)
(Figure 1.17.1).
Figure 1.17.1 Interface of a mobile application for automotive motor repair (Reflekt, 2022)

Design and modelling: The use of AR allows architects, engineers and design professionals to see how
their designs might look at an actual site and even make virtual changes on site (Souzza, 2019)
(Figure 1.17.2).
Figure 1.17.2 AR technology in architecture (Gamma AR)

Retail: Motorcycle brand Harley-Davidson developed an AR app that allows shoppers to customize their
motorcycles in-store, using the app to see what colors and features they might like
(Harley-Davidson, 2022) (Figure 1.17.3).
Figure 1.17.3 Harley-Davidson motorcycle design software (Harley-Davidson, 2022)

Business Logistics: Shipping company DHL has been using smart AR glasses in some of its warehouses,
with the lenses showing workers the shortest route within the warehouse to locate and pick a certain
item that needs to be shipped (Paine, 2018) (Figure 1.17.4).
Figure 1.17.4 Using AR glasses in a DHL warehouse (Paine, 2018)

Hearing aids: Computer-aided technology using AI and AR can create a smart hearing aid that filters
out other voices (Bajarin, 2021) (Figure 1.17.5).
Figure 1.17.5 Build AR headset to supercharge hearing (Bajarin, 2021)

Environment interaction through touch: Touch sensors can also be used with AR (Vu, 2019)
(Figure 1.17.6).
Figure 1.17.6 Discriminate between touching and hovering over the surface (Metaio, 2014)
The construction sector is progressively beginning to adopt mobile apps, and augmented reality
wearables in construction have become more prevalent. Throughout the construction process, the
construction industry has been strengthened and improved by the employment of these open
development toolkits (Georgiadis, 2022). Any stage of the construction process can benefit from
the use of augmented reality. Additionally, AR makes it much simpler to verify that everything
is proceeding as expected. This makes it simple to steer clear of errors and to verify that all
requirements have been met once the project has been completed (Doyle, 2018).
Augmented reality can minimize rework and damage caused by downtime during construction. In
the construction industry, completing projects on time is particularly important to ensure that
budgets are met. Augmented reality in the construction industry makes it easier to adjust
construction schedules and plan logistics for the coming weeks as needed. This eliminates the need
for rework, and AR in construction promotes employee safety (TeamViewer, 2022).
Augmented reality in the construction industry makes it possible to show results before the job is
done. Misunderstandings can be detected more easily. Paper-based timelines and delays in
communication are also eliminated as AR overlays help identify problems faster. This also makes
it easier to check for problems as they arise (SPOT, 2020).
The Microsoft HoloLens is a transparent lens augmented reality (AR) headset that offers an
augmented reality experience. According to Microsoft, HoloLens is a "wholly untethered, see-
through holographic computer." Users of HoloLens can interact with 3D models as if they were
real objects in their surroundings. Augmented reality allows users to interact with virtual objects
mixed into their surroundings, rather than in a purely virtual space. Its use in real life is also very
widespread. It can help people get easier remote guidance, with step-by-step instructions
from experts about things like home repairs. Visual diagrams are actually displayed in the space
around the user, indicating exactly what needs to be done next (Delgado, 2015) (Figure 1.16).
In addition, Microsoft HoloLens uses AR technology to gamify tasks in people's sports and
games, for example, replacing the world around the user of the device with a fun, interactive,
scrolling landscape while jogging on a treadmill. By turning monotonous tasks into games,
people can make the process of exercise more fun. It also opens up many more possibilities for the game
industry, where users can get a fully immersive gaming experience (Delgado, 2015) (Figure
1.18).
Figure 1.18 Workers using Hololens on a construction site (Corke, 2019)
Another example is the use of augmented reality to study dynamic weather data. AR can
dynamically represent changes in weather data, giving a visual impression (Heinrich, 2008).
Product developers at The Weather Company realized that augmented reality could dramatically
change the way broadcast meteorologists produce weather forecasts. They realized the potential to
elevate weather forecasting from a traditional linear presentation to a virtual 3D immersive
experience to differentiate weather presentations and help broadcasters gain and retain viewers
(Convey, 2005) (Figure 1.19).
Figure 1.19 An example of using AR in weather visualization (NewscastStudio, 2018)
Based on the visual representation of weather, AR technology also has the potential to play a role
in visualizing building science data. Some of the factors affected by the weather can also be
expressed in augmented reality, for example, daylighting conditions, solar radiation, and indoor
air quality; these weather file-based factors can all represent changes in the building as
visually perceivable expressions. This will enable people to evaluate the sustainability of a
building or site more clearly and effectively across all aspects of its design and use, in both
static and real-time data analysis.
1.2.3 Benefits of VR and AR in Buildings
In the construction sector, virtual reality has been found to be quite beneficial. First of all, VR can
be utilized for training purposes, such as safety training or equipment training; as in the
education sector, it can be used to train workers
in the construction business. VR can be utilized for pre-construction planning as well as for
inspecting site conditions and design layouts, among other things. VR enables higher accuracy and
teamwork by being used at every stage of a project, from design to final inspection. A virtual
model of a project can be made using VR technology and then utilized to plan and carry out work
(Thompson, 2022). Construction organizations may reduce communication breakdowns and
delays, as well as avoid costly errors, by utilizing virtual reality technology. The usage of virtual
reality technology in the construction industry is growing as a means of facilitating interaction
between field workers and stakeholders. It enables all project participants to fully immerse
themselves in a virtual setting and view the building project from a different angle (Zgoda, 2020).
All parties participating in the project will likely be able to communicate and understand each
other better as a result. Workers on the job site can receive training using VR technology, allowing
them to become familiar with the environment before work even starts. As a result, virtual reality
is an effective tool that can enhance teamwork and communication on construction projects. This
enhances construction delivery quality and significantly lowers the cost of rework in building
projects. By identifying errors early on, virtual reality can help avoid costly delays and changes.
In addition, by providing a more comprehensive view of a construction project, BIM can help managers
identify potential problems before they occur. At the same time, virtual reality provides
an immersive experience that can help clients better
understand complex construction projects (Zgoda, 2020) (Figure 1.19).
Virtual reality is a wholly digital experience that can only be enjoyed in a small area. Furthermore,
it may incorporate auditory and physical aspects of the outside environment, such as temperature
and motion (Virtual Reality Society, 2017).
is presented to our senses in a way that causes us to feel as though we are truly there. It is a
technically challenging effort that involves a variety of technologies and takes into account human
senses and cognitive processes. Both fun and useful things can be done with it.
Technology has become more affordable and accessible. One can expect to see more innovative
uses of technology in the future, perhaps because of the possibilities of virtual reality and the
growth of the Metaverse (Figure 1.20). It has already become a trend to imagine a virtual
world in which people can live their daily lives. The Metaverse is a hypothetical iteration of the
Internet, a single, pervasive, immersive virtual world facilitated through the use of virtual reality
and augmented reality headsets, a network of 3D virtual worlds focused on social connections
(Matt, 2021) (Clark, 2021). The Gartner Technology Maturity Curve demonstrates the hype
cycle of disruptive innovation technologies to watch. According to the 2022 chart, the metaverse
concept will require more than the next decade of development and research to reach public
acceptance and widespread adoption (TheGhostHowls, 2022).
Figure 1.20 A concept photo of Metaverse (Eyraud, 2022)
With the help of virtual reality technology, users can experience everything they can think of,
regardless of reality. The greatest benefit of VR is the ability to design any environment while
having total control over every aspect of it. This allows the developer to design an experience that
is impractical or unachievable in the real world. The main benefit also serves as the biggest
drawback. To produce a relaxing and useful VR experience, every component, even the smallest,
must be developed digitally (TMDStudio, 2017). In order to prevent the user from becoming
disoriented or ill while in the environment, special attention must be paid to factors like navigation
and visual update speed of the head mounted displays, which are not always necessary in
augmented reality.
A VR simulator is designed not only to provide operators with a real-world experience through
VR goggles and an actuator workbench, but also to record operator movements, providing data that
highlights areas of success and improvement. This in turn allows trainers to quickly assess and
address any areas of weakness. Operators can learn and practice virtually before using the
equipment, which maximizes worker proficiency and avoids losses and accidents caused by
mistakes (Figure 1.21) (Doyle, 2018).
Figure 1.21 A man using VR training simulator (Doyle, 2018)
Similar to virtual reality, augmented reality has the same advantages for applications in the
construction industry due to its visualization capabilities. In addition, in order to increase the efficacy
and efficiency of workers' duties, augmented reality (AR) has been proposed as a mechanism to
improve the process of extracting information from building information models (Chu et al., 2018)
(Figure 1.22).
Figure 1.22 Augmented reality in workspace (3rockAR, 2022)
For architects and designers, AR provides much of its value in the early stages of design, as a tool
to experiment with form and attract the support of clients and stakeholders. It can help non-
designers understand how any architectural ensemble will work: the proportions of the space, its
orientation on the site, the views it provides, and the mix of material finishes. Not only does it make
the design visible, but it also allows anyone, such as the client, to "walk the site" long before it is built (Mortice,
2014) (Figure 1.23).
Figure 1.23 Building material providers can offer immersive customer experience
with AR (Solanki, 2020)
1.2.4 Software Used to Create VR and AR
In general, recent VR and AR software and projects have been developed using private engines or,
increasingly, existing connected game software development kits such as Unreal Engine and
Unity3D (Theoktisto, 2015). Users may combine all of their assets using a game engine, which
also offers developers access to a robust 3D element editor. The use and development of VR and
AR have sub-fields that meet the needs of different application areas, and software is produced
based on those needs. Some software is designed so that users can experience compiled data in a virtual
setting through visualization. These technologies give consumers the ability to view analytics in a
way that allows them to completely comprehend what the statistics are trying to say (Solanki,
2020).
1.2.4.1 Unity 3D
Unity 3D is a cross-platform game engine made for designing games and for many other
purposes. It is a multi-platform game development tool with a fully integrated professional game
engine that includes a scene manager, lighting mapper, physics engine, scripting engine, and
rendering engine. Games and interactive 3D and 2D experiences like training simulations, medical
and structural visualization are typically made with Unity3D (Li et al., 2019) (Brodkin, 2022). A
Unity3D program in its entirety is made up of various scenes. A number of models act as Game
Objects in each scene, and scripts regulate their behavior. The camera renders and controls what
viewers see in the scene (Ni et al., 2010) (Li et al., 2019). In comparison to traditional development,
3D software may be created in a fraction of the time thanks to Unity's user-friendly interface and
thoughtful architecture (Jerald, 2014). Unity is currently fully supported by a number of
virtual reality enterprises due to its accessibility and simplicity (Jerald, 2014).
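As a concrete illustration of the structure described above, below is a minimal sketch of a Unity C# script (a MonoBehaviour); the class name and values are illustrative examples, not taken from the thesis project. Attached to a GameObject in a scene, the script's Start() and Update() methods regulate that object's behavior while the camera renders the result.

using UnityEngine;

// Minimal sketch: a script (MonoBehaviour) attached to a GameObject in a Unity scene.
// Unity calls Start() once when the scene loads and Update() every rendered frame;
// the camera then renders whatever the scripts have placed or changed in the scene.
public class SpinningDataMarker : MonoBehaviour
{
    public float degreesPerSecond = 30f;   // editable in the Unity Inspector

    void Start()
    {
        // Runs once when the scene containing this GameObject loads.
        Debug.Log("Marker initialized at " + transform.position);
    }

    void Update()
    {
        // Runs every frame; here it slowly spins the GameObject it is attached to,
        // for example a 3D marker that will later display sensor data.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}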
1.2.4.2 Unreal Engine
Unreal Engine is another game engine that also supports the development of VR and AR programs.
Unreal Engine 4 helps users easily and quickly develop virtual reality and augmented reality-based
applications, build them, and publish them to stores (G2Score, 2022). The Unreal Engine AR Framework
provides a framework for building augmented reality applications on iOS and Android handheld
platforms using Unreal Engine. It provides a single development path for both platforms, allowing
developers to build augmented reality apps for both platforms using a single code path. The
Handheld AR Blueprint template provides a complete sample project demonstrating the
augmented reality features available in Unreal Engine (Kushnir, 2018).
1.3 BIM Based Real-time Data Monitoring
Today, BIM and real-time data are becoming central topics in the construction and engineering
industry, and they represent powerful new tools for designing and managing facilities. Building
monitoring and real-time data can represent the solution to many important challenges (Digregorio,
2020).
1.3.1 BIM
BIM is a term that refers to both building information modeling and building information
management. Thanks to this highly collaborative process that incorporates architects, engineers,
real estate developers, builders, manufacturers, and other construction experts, a structure or
building can be planned, developed, and constructed within a single 3D model (Lorek, 2022). Real-
time building information is included in dynamic BIM to reflect current conditions, including
transferring real-time data and responding to unique scenarios. This can be done
without putting users in an uncomfortable or dangerous situation, and facilities can be
restored at low cost. Combining operational data with design data already present in the
model is a useful reference for designers and researchers working on various projects (Lee, 2018).
Because BIM can transmit and respond to building data in real time, building model data can be
imported into other software for further analysis and processing, which in turn benefits
data analysis and visualization.
1.3.2 Environmental Sensors
Environmental sensors are frequently used to track indoor comfort factors like air quality, humidity,
and particulate matter.
HOBO MX 1102 is an example that can collect indoor temperature and relative humidity (Figure
1.24). It is an integrated sensor for CO2, humidity and temperature measurements in indoor
environments and can wirelessly transmit recorded readings to Android devices, iPhones, and
iPads using Bluetooth Smart LE communications. The measurement ranges are 0 to 5,000 ppm,
-20°C to 70°C, and 1% to 70% RH, with accuracies of ±50 ppm, ±0.21°C, and 2% RH, and it can
store up to 84,650 measurement readings with date and time stamps (Onset, 2022).
Figure 1.24 HOBO MX 1102 (Onset, 2022)
As for particle data, the PA-II-SD air quality sensor from PurpleAir can be used
(Figure 1.25). It measures PM2.5 concentrations in real time for residential, commercial, or
industrial use. Built-in Wi-Fi enables the air quality measurement device to transfer data to the
PurpleAir map where it is stored and made available to any smart device (PurpleAir, 2022).
Figure 1.25 PA-II-SD air quality sensor (PurpleAir, 2022)
Sensors can offer occupants and users trustworthy information to give immediate feedback on how
comfortable or habitable the indoor environment is; data can also be stored for longer term access
to help in determining trends.
After receiving the data from the sensors, understanding how to apply efficient data visualization
techniques is crucial since it can make it simpler and faster for users to comprehend and analyze
the data. In the general statistical process of data, traditional two-dimensional graphs and charts
are the most common methods used by researchers, such as bar charts, line graphs, etc. (Natephra,
2019). Static visualizations provide users with the tools to extract meaning from data more
efficiently, but they also have significant limitations.
A sample of the data collected by the sensor and exported to an Excel sheet is shown (Figure
1.26). These data can be visualized as line graphs (Figure 1.27).
Figure 1.26 Sample of data from sensor in Excel
Figure 1.27 Graph of sample data from sensor
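As an illustration of how such an export could later be read inside Unity, below is a minimal C# sketch; the file path and the column order (timestamp, temperature, relative humidity) are assumptions for the example, and the actual sensor export format may differ.

using System;
using System.Globalization;
using System.IO;
using UnityEngine;

// Minimal sketch of parsing a sensor CSV export in Unity.
// Assumed (hypothetical) columns: timestamp, temperature in °C, relative humidity in %.
public class SensorCsvReader : MonoBehaviour
{
    public string csvPath = "Assets/Data/sensor_export.csv";  // hypothetical path

    void Start()
    {
        string[] lines = File.ReadAllLines(csvPath);
        for (int i = 1; i < lines.Length; i++)               // skip the header row
        {
            string[] cells = lines[i].Split(',');
            if (cells.Length < 3) continue;                  // ignore malformed rows

            DateTime stamp = DateTime.Parse(cells[0], CultureInfo.InvariantCulture);
            float temperature = float.Parse(cells[1], CultureInfo.InvariantCulture);
            float humidity = float.Parse(cells[2], CultureInfo.InvariantCulture);

            Debug.Log(stamp + ": " + temperature + " °C, " + humidity + " %RH");
        }
    }
}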
For example, 2D charts cannot efficiently visualize how data changes under different conditions
due to the limited variables that can be considered; they can only present certain data types, and
they are inefficient at presenting real-time sensor data, such as air temperature, indoor air quality,
and wind speed, that are needed to make quick decisions and take action to solve problems. As the
size of the data set and the complexity of the available data continue to grow, static information
visualization methods become less useful and need to be improved; interactive
tasks cannot be accomplished with static visualization methods (Abdelalim, 2017).
1.4 Summary
This chapter described general information about data visualization, virtual reality and
augmented reality (including their comparison), and BIM-based real-time data monitoring.
Starting with a 3D model or BIM, adding sensors, and then placing the model and data into a
VR or AR environment allows for a different way of visualizing data. A large amount of data about
the different properties of the building can be obtained through sensors. Based on the
understanding of modern scientific uses of data visualization and its technologies, data visualization
has played an important role in the scientific research process. It helps people have a better
understanding of numbers and provides a much easier method for analyzing them.
results of data visualization, where large and complex data is represented graphically, with AR,
one can overlay numbers on our surrounding world. One can explore and engage in a virtual reality
(VR) environment. These two technologies have a great deal of promise for improving procedures
and opening new possibilities in the building and construction industry.
Chapter 2. LITERATURE REVIEW
Chapter 2 describes some research done on data visualization in a building’s indoor environment
and virtual reality and augmented reality used in scientific visualization.
2.1 Data Visualization in a Building’s Indoor Environment
In the era of "big data," effective data visualization is a crucial component of the discovery process.
It serves as a link between the quantitative data's content and human intuition, making it a crucial
part of the scientific process that turns data into knowledge and understanding (Donalek et al.
2014). In the data mining process, visualization is crucial for guiding the selection of the
appropriate algorithms and for identifying and removing inaccurate data from the study.
Visualization is the primary bridge between the content of data and human intuition, and it can be
argued that one cannot really understand or develop intuition about data without it. Visualization must
therefore be an integral part of any data mining process (Wang et al. 2019).
Figure 2.1 Global comfort data visualization tools (Ličina, 2018)
2.1.1 History of Data Visualization
Before the 17th century, the concept of data visualization mainly existed in the field of maps,
showing landmarks, cities, roads, and resources. Maps have existed for at least 10,000 years in
some form, with their original uses being for land ownership, navigation, and plain human
curiosity. In the past, information about the world would have been etched into stone or clay using
eyewitness reports and extensive guesswork. Through the ages, tools like the compass and sextant
allowed for more exact measurements and more accurate maps, and the printing press made it
possible to make them in large quantities. The world map created in 1569 by Flemish geographer
Gerardus Mercator represented a significant improvement in our ability to represent the surface of
the spherical Earth on a flat piece of paper (Farnworth, 2020). The first visual representation of statistical data is attributed to the Flemish astronomer Michael Florent van Langren in 1644 (Figure 2.2). Twelve known estimates of the longitude difference between Toledo and Rome are displayed on a one-dimensional line graph along with the names of the astronomers who made each estimate. Thematic charting started in the 18th century. Many of the charts used frequently today, including line, bar, circle, and pie charts, are due to William Playfair (Farnworth, 2022).
Figure 2.2 First data visualization graph (Friendly, 2008)
Other types of statistical graphs, such as scatter plots, contour plots, time series plots, and bar
charts, were created during this time. With the development of computer processing in the second
part of the 20th century, statisticians were able to gather and store larger and larger volumes of
data, visualize it quickly and easily, and apply it to a vast array of specialized subjects (Friendly,
2008) (Few et al., 2007). It is now possible to examine data in unique and increasingly inventive
ways because of a variety of software tools (Friendly, 2008).
2.1.2 Methods of Data Visualization
Researchers have created a large number of methods to better visualize the indoor environment of a building. Particle tracing is a common method for characterizing airflow; it gives much insight into the three-dimensional behavior of air within spaces.
The Department of Civil Engineering and Architecture of the University of Pavia published a 3D
platform for energy simulation of building assets (Cecchini, 2019). The workflow for
implementing a 3D GIS-based web platform that can store, process, and display data about
building assets and their energy use is presented in this study. It is compatible with a building
information model. The procedure begins with easily accessible data about the built environment
and employs a common data format and classification scheme to construct a repeatable model. The
3D model is created semi-automatically by fusing point clouds produced by national survey
activities using light detection and ranging with the 3D GIS data from the municipality. The
building's energy performance certificate contains a set of thermal performance and energy
statistics. An application was run in a building complex owned by the University of Pavia to test
and validate this approach. After completing the definition of the model and its representation in
a network environment, a comparative energy analysis of different buildings is shown. In this case
study, an integrated implementation on a web platform, based on 3D GIS, was used to implement a city model that analyzes energy use and consumption in a building complex and to provide a 3D navigation model that makes a large data set easier to understand intuitively. In contrast to the traditional presentation of digital information in tabular form, the 3D model supports the visual representation of urban energy consumption data through images (Figure 2.3).
Figure 2.3 3D model and histogram representing the thermal and electrical energy
demand of the building stock.
2.1.3 Architecture and Facility Management
After several years of use, a building object may work and look significantly different from its
initial design, depending on how it is utilized, maintained, updated, renovated, and eventually
destroyed and recycled. Some of the problems that develop during several years of use include the
lack of acceptable maintenance principles and the use of inappropriate materials (Devetakovic,
2007). BIM-based facility management technology is a very important application area of BIM in
cities. It provides a process for minimizing corrective maintenance and establishing maintenance
information about the causes of equipment failures and the related operations required (Yam,
2001). Facilities management is a multidisciplinary profession that makes it easier to work and
conduct processes by integrating people, places, processes and technologies to ensure the proper
functioning of the physical environment (FMLink, 2007). TCHO is a company that specializes in
using state-of-the-art tools and materials to produce "high-tech chocolate." They use a mixed
reality testing system in their factory for everything from the sourcing of cocoa beans to the
production, delivery and distribution of chocolate. This combination of mobile, social, mixed and
virtual technologies can help create industrial systems that improve collaboration between
geographically distant people and places (Kimber, 2010).
2.1.4 Using Data Visualization in Architecture Area
Data visualization is used in the construction industry to reduce stress levels and boost health and
productivity. Researchers can start establishing relationships between environmental elements,
time, and occupant stress levels by collecting data on heart rate, body temperature, and time-related
questionnaires. It helps to analyze the data visually in order to find these correlations. The
researcher can start making adjustments or changes to the environment once the study is complete,
and then they can observe the resulting consequences. The correct data may be gathered by using
sensors in the surroundings and time-based tracking to help designers make the built environment healthier and less stressful and to make occupants more productive (Lehman, 2022). In the
construction sector, data visualization is utilized to lower stress levels and improve production. By
gathering information on heart rate, body temperature, and time-related questionnaires, researchers
can begin identifying connections between ambient factors, time, and occupant stress levels.
Finding these associations requires a visual analysis of the data. Once the study is over, the
researcher can begin altering the environment, and they can then watch the effects take place. In
order to create a constructed environment that is healthier, less stressful, and has the highest level
of productivity for its occupants, the appropriate data may be acquired by using sensors in the
environment and time-based tracking (IBM, 2021). As an example of data visualization methods in architecture, temperature and relative humidity data from data loggers can be animated and visualized while incorporating the three-dimensional geometry of a building (Figure 2.4) (Baker, 2007).
Figure 2.4 The program for data visualizing in building (Baker, 2007)
Architectural visualization improves understanding between designers and clients, facilitates clear
communication of design ideas, and enables architects and designers to collaborate and communicate more effectively. 2D designs can be difficult for people to read, which can cause a
lot of trouble, and with architectural visualization people can create a complete simulation of how
the product will look in the real world (Rose, 2019).
2.2 Virtual Reality and Augmented Reality Used in Scientific Visualization
Augmented reality and virtual reality allow architects to take their designs to a whole new plane,
opening far-reaching opportunities for the architectural industry. The construction industry has
made steady progress toward more technologically advanced processes, and AR and VR build on
existing technologies to provide construction teams with more information and help them plan
better.
2.2.1 History of VR and AR
The concept of VR began in the mid-1960s. The first VR/AR headset in history was developed by Harvard professor Ivan Sutherland and his student, Bob Sproull. It allowed the user to perceive a virtual world as if it looked, felt, and sounded real, and in which the user could perform actual activities (Sutherland, 1965). The first immersive 3D simulator appeared in 1962, when Morton Heilig created the Sensorama, which provided a realistic experience with stimulus sounds, smells, and touch, as well as wind (Heilig, 1962). Later, Ivan Sutherland described the "ultimate display," which added interactive graphics, not offered by the Sensorama, to sound, smell, and haptic feedback. Many video game companies have since improved the development and quality of VR devices,
such as the Oculus Rift or HTC Vive, which offer a wider field of view and lower latency. In the
early 1990s, Boeing created the first prototype of an AR system to show employees how to set up a wiring tool (Carmigniani et al., 2011). Feiner et al. developed the first mobile AR system, and other researchers created ARQuake, a mobile AR video game (Thomas et al., 2000). In 2009, other AR applications, such as AR Toolkit and SiteLens, were developed to add virtual information to the user's physical environment. In 2011, Total Immersion developed DFusion, an AR system for design projects (Maurugeon, 2011). Finally, in 2013 and 2015, Google developed Google Glass and Microsoft developed the HoloLens, both of which began usability testing in applications in several domains. Development is continuing, making both VR and AR systems better and cheaper.
2.2.2 VR and AR in Visualization
Virtual reality and augmented reality are widely utilized in several steps during construction and
design in the architecture area. Previous research has shown the advantages of VR and AR in providing a more interactive method for people.
VR as a new assessment tool in psychological research has higher validity than traditional methods.
Studies have shown that there is no significant difference between emotions triggered by virtual
and natural conditions (Chirico, 2019), but the use of audiovisual VR is more effective in
stimulating human perception and can amplify the human senses more than conditions without artificial sensory input or with independent input (Naef et al., 2022).
The Department of Architecture at Mahasarakham University proposes a method for an automated
live sensor data visualization of building indoor environment conditions using a VR system based
on an integration of environmental sensors (Figure 2.5) (Natephra et al. 2019). BIM-based real-
time sensor data visualization is an integrated system that uses BIM models, sensor data, and VR
environments to monitor indoor conditions. Sensor values are sent from the controller to the BIM server and stored in the model. The geometry and sensor data are then visualized, and the data can be read with a VR headset once the sensor data has been visualized (Cecchini et al., 2019).
Figure 2.5 Screenshot of the main interface of a prototype system (Cecchini, 2019)
The advantages of integrating BIM, sensor data (air quality), and VR for operations and
maintenance include the opportunity to support and track indoor conditions in real-time,
proactively manage the indoor environment, better monitor and store sensing data, and identify
measured parameters that may affect indoor comfort conditions. By utilizing VR, it is possible for
users to visualize and monitor the current condition of the room and detect possible problems in
real time. A limitation is that results such as temperature and humidity are only shown as plain numbers, which means that although the data is visualized, it is still hard for people to gain a clear and easy understanding of what those numbers mean for comfort conditions. Although the developed system connects sensors to BIM and VR and enables users to check air temperature, humidity, and light levels through the HMD, only digital values can be visualized. To help users easily understand comfort conditions, sensor readings should be presented in VR as virtual charts and graphs. The thermal comfort analysis of the building is limited to temperature and humidity through digital values retrieved from sensors. Airflow rate and mean radiant temperature could also be integrated with the developed system (Hill, 2020).
This technology has also been used in the study of weather. Weather data is recorded as static values, but it changes constantly over time. AR can dynamically represent changes in weather data and give a visual impression of them, and broadcasters harness augmented reality forecasting technology to engage viewers and keep the audience watching longer. AR technology can engage viewers with dynamic TV weather graphics and 3D images of storms and atmospheric events. It also has the
ability to integrate traffic data into live presentations to help viewers understand travel conditions
before they begin their commute (Heinrich, 2008).
AR is a technology that combines real and virtual content while allowing real-time interaction and
registration in three dimensions. Thus, AR systems are considered next-generation, reality-based interfaces that complement the real world with virtual objects that seem to coexist in the same space as the real world. In addition, the system generates interactive graphics, responds to
user input in real time, and tracks their position to maintain the relative position of the virtual
image and the real world. For example, AR can be used with sensors to help visualize proposed architecture, monitor indoor conditions, and assess indoor thermal comfort (Natephra et al. 2019).
One could also evaluate those indoor environmental variables that have a significant impact on
indoor comfort and assist the user to identify the problem in real time. The collected information
on indoor environmental conditions can be stored in the BIM database for further analysis in order
to discover knowledge from historical data on the indoor conditions of the facility. In addition,
visualization of computational fluid dynamics simulations on mobile devices based on augmented
reality improves the understanding and discussion of indoor thermal environments during the
design process.
However, the use of AR-based mobile devices to understand the indoor thermal environment still
encounters some problems due to limited computational power and lack of effective interaction
methods (Hosokawa, 2016). To address this, the Department of Civil Engineering of Tsinghua University established an integrated client-server approach, proposed new mobile-friendly data formats, developed interactive cross-sectional view selection and time-step animation methods, proposed an approach to intuitive interaction with AR environments, and developed a prototype system using the Unity3D engine and a Tango tablet (Figure 2.6) (Lin et al, 2019). The research focused on augmented reality's ability to visualize the thermal environment on mobile devices.
Figure 2.6 Tube view of CFD airflow on the real world (Lin et al, 2019)
Overlaying virtual data onto a real-world scene with the developed TangoCFD client involves four
steps. The first step is to place markers by touch, which represent the four points used for the
transformation calculation. Based on the created points, virtual data such as BIM, point clouds, and flow lines are then loaded and positioned in the appropriate locations. Next, a CFD simulation is loaded, and the location of the subject is sent to the server to download the corresponding profile data and visualize it. A limitation is that this approach is much better suited to monitoring data such as airflow and particles in the air; it is not able to show changes in temperature and humidity (Natephra et al. 2019).
Virtual reality technology is sometimes used in the development of construction projects. Building
information models serve as lifecycle tools for buildings, including as much information as possible for further applications; some research uses BIM tools to bring CFD visualization into VR and reports on the evaluation and analysis of the results (Yan, 2020).
A visual lifecycle management tool was developed to show and control indoor conditions through augmented reality and building information modeling systems. The system makes it possible to visualize the temperature and other parameters of a building or room and show them in augmented reality on mobile devices (Maile et al., 2007) (Figure 2.7).
Figure 2.7 Temperature shown in AR (Wagner, 2019)
By introducing mobile-friendly data formats and pre-processing large amounts of data, the computational load on the mobile client can be kept under careful control. At the same time, simple
interaction with virtual data overlaid in the real world is achieved through the framework of the
proposed data location method. Thus, by developing prototypes based on AR-ready devices and
data servers, better visualization can be provided for different types of data (Figure 2.8) (Lin et al.,
2019).
Figure 2.8 Virtualization of data in Augmented Reality (Hayden, 2018)
2.2.3 Potential in Future Building Industry
AR/VR innovations are becoming an integral part of the construction industry; especially in the face of external shocks like COVID-19, the ability to operate remotely plays a very important role when in-person work is not possible. While only major construction companies have so far fully upgraded to AR/VR, the industry is poised to become more dynamic and competitive in this direction as the technology becomes more affordable and as 5G networks evolve (Haritonova, 2022). AR applications can view and interpret the outside world in real time and
project images onto it, bringing significant benefits to the construction industry. 3D planning can
help project teams avoid mistakes and solve problems before construction begins. Teams can
quickly see where everything is built, from pipes to ducts, columns, windows, and access points.
This facilitates changes before, during, and after the project. Architecture faces increasing
demands for construction efficiency, which puts additional pressure on architects and designers to identify potential problems early in the planning phase of the construction process. At the same time, virtual reality technology provides convenient planning and rendering tools. It is widely used by developers to test realistic 3D renderings of architectural designs. In addition, VR allows companies to make a more compelling case when bidding on new projects: instead of static renderings or 2D models, it can take stakeholders on a virtual tour of the future space (Haritonova, 2022).
2.3 Summary
Chapter 2 described some research done on data visualization in a building’s indoor environment
and virtual reality and augmented reality used in scientific visualization. The above analysis of existing studies shows that data visualization of indoor environmental conditions in buildings is a topic researchers have paid attention to. Visualization models in various forms of expression, from 2D to 3D, have been studied to represent many kinds of data about a building's interior environment. However, it was also found that not all data are suited to the same kind of representation. VR and AR, a popular topic in
recent years, are also being developed to the maximum extent for their functions and utilization in
the building and construction process. Based on what is known from past research, they can be
combined with BIM to enable real-time data monitoring through sensors, while representing the
data on visual images. There is still opportunity for more research on VR and AR for real-time
monitoring of environmental data inside buildings.
Chapter 3. METHODOLOGY
This chapter describes the workflow and methodology of the project including how to achieve
digital geography, collect data, user interface, and packaging the project.
In order to visualize the building interior environment and its data in VR and AR and make them perceptible to users, Unity was chosen as the primary software to achieve this goal. A methodology was developed that included modeling two indoor spaces, collecting data on temperature, humidity, and particulate matter at different locations in these indoor spaces, and visualizing these data in VR and AR. One can then visually check the indoor environmental conditions against a list of times and dates, and the user can use a button or slider to adjust the date and time to see the indoor climate conditions at different points in time.
There are six main steps in the workflow (Figure 3.1).
Figure 3.1 Methodology workflow
1. Digital Geography. Build a BIM model and define all the positions of the sensors. The sample room is a corner office on the 3rd floor of Watt Hall (Los Angeles, CA). 3ds Max is used as a tool to define all the materials of the model.
2. Data Collection. Collect data on the indoor environment of the office in three different aspects; two kinds of sensors are used, and all the data are cleaned and exported as CSV files.
3. Visualization. Visualize the data as cubes and spheres, programming them to use different color expressions.
4. Experience in VR. Set up the VR headset and controllers in Unity. Set up the player's view and interface for users.
5. Experience in AR. Use AR Foundation to set the primary location to help align the AR model with the real world.
6. Package all the files and projects, and supply guidance to users.
These steps are demonstrated for an office space used by WSP, also in Los Angeles, CA, in Chapter
4.
3.1 Digital Geography
A sample room with sample data was chosen to test the methodology. It is an office on the 3rd
floor of Watt Hall at USC in Los Angeles, California (Figure 3.2). Before using the real data of
the WSP office space, this simple sample room was used to try to improve the whole project
research process and script programming. Such a process helped to identify problems ahead of
time, as well as find solutions in advance.
Figure 3.2 Floor plan of the office at 3rd floor of Watt Hall
3.1.1 Revit Model
Revit was used as the main software to create the digital model. First, based on the floor plan of
the Watt Hall office provided by the School of Architecture, a 3d model was created using Revit,
including all the interior furnishings and air conditioning and other equipment (Figure 3.3).
Figure 3.3 Revit model of the corner office in Watt Hall
At the same time, the locations of the sensors were marked in the model to reflect the effects of
different locations and different interior furnishings on the indoor environmental data (Figure 3.4).
Figure 3.4 The position of sensors in the room.
Then, based on the plan, the model was relocated to the center of the frame, and the locations of the sensors were set up and marked on the plan. The position of each sensor was set at 30 inches above the floor, which is the approximate height of a desk in a working area. SteeringWheels, a tool specifically for navigating and orienting the model in different views, was used to define the center of the view of the model and prepare for the import into the game engine software.
In order to import the model into Unity, the model files are exported from Revit as an FBX file. FBX (.fbx) is a format for exchanging 3D geometry and animation data. It can be used with different programs to open, edit, and output high-fidelity 2D and 3D files and is used in movie, game, augmented reality, and virtual reality development. Revit exports an FBX file that can be read directly by Unity 3D. Unity 3D can import FBX files into Assets, where one can develop model elements such as game objects, scripts, and other project files. FBX is one of the formats supported by Unity 3D and can also be exported from Revit. As a result,
using FBX format files enables the conversion from modeling in Revit to importing the model into Unity for subsequent operations.
3.1.2 Define Materials
When importing Revit's FBX model files into Unity, the model's materials are sometimes lost
(Figure 3.5). This process sometimes results in losing materials and textures afterwards, so it is
important to find methods to import the materials associated with the geometry into Unity.
Figure 3.5 A model in Unity without materials
One of the methods is to set the material for each part of the model by creating and using a
material in Unity. To create a new material in Unity, it is possible to use both materials and
shaders to define the appearance of the scene. Since the FBX model exported from Revit appears in the hierarchy as a package after importing into Unity, opening the whole package shows that each element of the original model is a separate part (Figure 3.6). Unity then allows the corresponding parts of the model to be assigned various types of materials, such as glass, iron, wood, and plastic (Figure 3.7).
Figure 3.6 Separate parts of imported model
Figure 3.7 Create materials in Unity
However, the disadvantage of this method of material setting is that first of all, although the
material of each part of the building can be set individually, only one material can be set for one
part, which cannot truly reflect the details of the building interior. In this case, especially when
using VR glasses, it will lack realism. Besides, although this method can make a blank model
colorful, it is very time-consuming and inefficient. It requires defining various materials and then matching them to each part of the model. This is feasible for small and simple models but can be very time-consuming for future large office models.
Another method uses 3ds Max. By importing the Revit scene into 3ds Max, the materials and textures of the model are loaded. The new FBX file created by 3ds Max can then be fully imported into Unity, retaining both the geometry and the materials.
It is also important to reset the pivot point of the model in 3ds Max by moving the model to a central point shown on the grid. This ensures that when the model is later imported into Unity for subsequent operations, it is located in the middle of the entire interface, which reduces complexity. Make sure all materials in the scene are Standard materials, export the scene as an FBX file, and activate the highlighting settings (Figure 3.8).
Figure 3.8 3Ds Max to reset the pivot of the model
3.2 Data Collection
Data pertaining to the sample model was collected at six different locations in the small office. For
the final case study, temperature, humidity, and particle density will be collected by different sensors and stored as CSV files in Excel.
In the first experiment, Onset's HOBO wireless Bluetooth logger MX1102 was used to record
temperature, relative humidity, and carbon dioxide data, collecting data at different locations in
the building and outputting the data via Bluetooth to the computer side for integration (Figure 3.9).
Simultaneously, the PurpleAir PA-II air quality monitor was used to detect data on indoor air
quality, especially the concentration of particulate matter, also through the sensor's built-in Wi-Fi
enabling the air quality measurement device to transmit data to the PurpleAir real-time map, where
the data is stored and made available to any smart device (Figure 3.10).
Figure 3.9 HOBO MX 1102 (Onset, 2022)
Figure 3.10 PA-II-SD air quality sensor (PurpleAir, 2022)
These three types of data were collected continuously for two hours at the same ten-minute
intervals to serve as environmental data for the sample model and for subsequent simulation
operations. Since the data obtained from the sensors was collected 10 times per second and the
data required for the experiment was collected every 10 minutes, the collected data was
reprocessed and filtered in an Excel sheet to obtain the final data table (Figure 3.11) (Figure 3.12).
Figure 3.11 Sample data from the sensor
Figure 3.12 The sample graph of data from sensor
After the data had been collected and processed into a usable form, it was saved in Excel. Then, in order to import the table data into Unity for reading, the table was saved in CSV format, and a CSV table-reading script was written to enable data reading in Unity. The three series of data are saved in separate CSV files for further study (Figure 3.13).
Figure 3.13 CSV files of collected data
3.3 Visualization
A C# script was written in Unity directly (using the Create menu at the top of the Project panel)
in order to visualize the data in Unity (Figure 3.14).
Figure 3.14 Code of CSVDataProject
Taking the temperature data for 2 hours as an example, the CSVDataProject script reads the data in the file and reflects it in the height and color of the cubes in the model in real time. When the project in Unity plays, CSVDataProject runs with it, and the original cubes are changed based on the data from the CSV file.
With this script, it is possible to first read the table data from the CSV file and then perform the next steps. The initial height of a cube in Unity is 1 meter; assuming that an indoor temperature of 45 degrees Celsius corresponds to a cube height of 3 meters, the data read from the CSV file is transformed into cube heights in equal proportion (Figure 3.15).
Figure 3.15 The bar of temperature based on sample data
At the same time, in order to give people a more intuitive feel for the indoor environment data and to convey both the comfortable temperature range and the current temperature at the corresponding time and location, the script also determines the comfort level of the temperature data and makes the cube appear in different colors. A reading of 18 degrees Celsius or below is shown in blue, indicating a temperature below the human comfort range; 18-26 degrees is a relatively comfortable temperature, shown in green; and anything above this range is above the comfort range and is shown in red.
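As a rough illustration of the behavior described above, the following C# sketch reads a temperature CSV, maps each value to a cube height using the 45 degrees Celsius = 3 meters proportion, applies the blue/green/red comfort colors, and advances one row per spacebar press. The file path, the one-column-per-cube layout, and all names are illustrative assumptions; the actual CSVDataProject script (Figure 3.14) may be organized differently.

using System.Globalization;
using System.IO;
using UnityEngine;

// Sketch only: reads one row of the temperature table and maps each value
// to the height and color of a cube. Not the thesis script itself.
public class TemperatureCubeSketch : MonoBehaviour
{
    public Transform[] cubes;                               // one cube per sensor location
    public string csvPath = "Assets/DataTemperature.csv";   // assumed file location
    private string[] rows;
    private int currentRow = 1;                              // row 0 is assumed to be the header

    void Start()
    {
        rows = File.ReadAllLines(csvPath);
        ApplyRow(currentRow);
    }

    void Update()
    {
        // Pressing the spacebar advances to the next row of the table.
        if (Input.GetKeyDown(KeyCode.Space) && currentRow < rows.Length - 1)
        {
            currentRow++;
            ApplyRow(currentRow);
        }
    }

    void ApplyRow(int row)
    {
        // Columns after the timestamp are assumed to hold one temperature per cube.
        string[] cells = rows[row].Split(',');
        for (int i = 0; i < cubes.Length && i + 1 < cells.Length; i++)
        {
            float t = float.Parse(cells[i + 1], CultureInfo.InvariantCulture);

            // 45 degrees Celsius corresponds to a 3 m tall cube, scaled in equal proportion.
            float height = t / 45f * 3f;
            cubes[i].localScale = new Vector3(1f, height, 1f);

            // Comfort bands from the text: <= 18 C blue, 18-26 C green, above 26 C red.
            Color c = t <= 18f ? Color.blue : (t <= 26f ? Color.green : Color.red);
            cubes[i].GetComponent<Renderer>().material.color = c;
        }
    }
}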
In summary, when running this program, the cube in the Unity scene will change height and
color based on the data in the table to reflect the indoor temperature conditions and comfort level
at that location (Figure 3.16). The script can also step through the table: by pressing the spacebar on the keyboard, people can change which row of data is read, providing a way to subsequently add buttons that change the date and time in the interface.
Figure 3.16 Sample of temperature data visualize in VR headset
Using a similar approach, indoor air humidity will be expressed using the size and transparency
of the capsule (Figure 3.17) and indoor particulate matter will be represented as the size and
color of spheres (Figure 3.18). All the data are shown in the same scene to allow a more visual comparison of the indoor environments at different locations in the same space.
Figure 3.17 Sample of humidity data visualization
Figure 3.18 Sample of number of particle data visualization
3.4 User Interface
The start scene is an essential component of any experience, as it provides an introduction and
entry point to the immersive world. By including an introduction and start button on this page,
users can easily understand how to begin their journey, and it sets the tone for the rest of the
experience.
The start scene is a separate scene in Unity that serves as an introduction to the Unity project,
before transitioning to the main scene for data visualization and modelling (Figure 3.19). This
scene can be built using various components, such as buttons, text objects, and images, which
can be easily customized using the Unity Editor. The image can be added as a background using
the UI toolkit, while the text can be used to provide users with an overview of the project and
prepare them for the virtual experience. The start button can also be scripted to allow for a
seamless transition from the start scene to the main scene. It plays an important role in setting the
tone and providing an entry point for the project.
Figure 3.19 Start scene
A project's interface is the set of elements that users interact with to navigate their environment
and control their experience. It gives the user a basic understanding of the project and lets them
know how to get started and make it run smoothly. Unity UI is a UI toolkit for developing user
interfaces for games and applications. It uses GameObjects to create UI systems, using components and the Game view to arrange, position, and style user interfaces. When the whole project is used by people, there should be a simple introduction that gives an overview of what is going on. It also helps users know where to start (Figure 3.20) (Figure 3.21).
Figure 3.20 Interface of selecting date and time in VR
Figure 3.21 Interface of data series selecting in VR
3.4.1 Hands
Hands are a vital tool for users to feel a sense of realism in most VR and AR experiences. They help people interact with options and objects in virtual reality and augmented reality, and they help develop a first-person feel for the experience.
The XR controller setup helps standardize the control, movement, and commands of the VR hand controllers. Through it, the controllers can be represented by 3D hand models and turned into prefabs so that they can be easily imported. The hand model from the Oculus website is used as the 3D model imported into this project (Figure 3.22).
Figure 3.22 3D hand model imported as controller
In order to make any button work and cooperate with the controller, the following steps have to be done.
When the buttons are created, a line is set up to track where the hand is pointing and to make all of these buttons work with the controller. The LineRenderer component provides a way to draw this hand-tracking line. The following code sets the initial points of the LineRenderer to its starting position (Figure 3.23).
Figure 3.23 Code for line renderer
To help users see where they are pointing, it is necessary to keep the LineRenderer aligned with a raycast, which is done in code (Figure 3.24).
Figure 3.24 Code for LineRenderer alignment
A pair of hands, with a rendered line showing where each hand is pointing, is then shown in the frame.
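A minimal C# sketch of this pointer idea is given below: the LineRenderer starts at the controller, and its end point follows a raycast so it stops at whatever the hand is pointing at. The field names and ray length are illustrative assumptions, not taken from the thesis scripts shown in Figures 3.23 and 3.24.

using UnityEngine;

// Sketch only: keeps a LineRenderer aligned with a ray fired from the hand controller.
[RequireComponent(typeof(LineRenderer))]
public class HandPointerSketch : MonoBehaviour
{
    public float maxLength = 10f;   // assumed default ray length in meters
    private LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
        // Initial points: both ends of the line start at the controller position.
        line.SetPosition(0, transform.position);
        line.SetPosition(1, transform.position);
    }

    void Update()
    {
        // Keep the line aligned with a raycast fired forward from the hand.
        Vector3 end = transform.position + transform.forward * maxLength;
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxLength))
        {
            end = hit.point;   // stop the line at whatever the ray hits, such as a button
        }
        line.SetPosition(0, transform.position);
        line.SetPosition(1, end);
    }
}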
3.4.2 Canvas
A canvas is a component that can be created in the Hierarchy menu as a GameObject. The canvas can be seen as the workspace of the whole interface system; it is the area that all UI elements should be inside (Figure 3.25).
Figure 3.25 A canvas added into the project
A cube scaled to (1,1,1) is added as a cube primitive through 3D Object in the Hierarchy. When the canvas is dropped in as a child of the cube, the canvas can be controlled by the cube. It becomes the base of the interaction: when users click a button, something is shown on the screen. The positional relationship between canvas and cube is set by changing the X, Y, and Z positions in the Rect Transform. In Unity's UI, a simple button can be added by selecting it in the Hierarchy menu, and the name of the button is set in its Text component. All the names of the individual choices are added this way. The width of each button is set to 100 and the height to 33, and the location of the button can be set freely with the Rect Transform tool.
3.4.3 Buttons
In a Unity project, the start button is usually used to launch the main features of the program or
game (Figure 3.26). Users can switch from the opening or start scene to the primary scene using
this graphical element. The start button is generally made with Unity's UI system and positioned
over a background image or video. To configure the start button, the button GameObject has a script connected to it that waits for a click event. The script activates a function that
starts the transition from the start scene to the main scene as soon as the button is pressed (Figure
3.27). People can modify this function to carry out any number of tasks or routines, like loading
game assets. It allows users to easily navigate between different scenes and modes of operation.
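A possible sketch of this start-button behavior is shown below, assuming the main scene is named DataVisualization and that Unity's SceneManager is used, as the LoadSceneAsyn.cs script listed in Section 3.7 suggests; the public method is meant to be assigned to the button's OnClick event in the Inspector.

using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch only: loads the main scene when the start button is clicked.
public class StartButtonSketch : MonoBehaviour
{
    public string mainSceneName = "DataVisualization";   // assumed scene name

    // Assign this method to the Button's OnClick() list in the Unity Editor.
    public void OnStartClicked()
    {
        SceneManager.LoadScene(mainSceneName);
    }
}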
Figure 3.26 The start button to control and switch to the main scene
Figure 3.27 Script to identify the start button
A user interface button is a feature that enables interaction with the application through clicking
or tapping. Events may be triggered by it. Buttons can be programmed to respond to various
input devices and show various graphics or text. Each button is backed by a script that makes something happen when the button is clicked. The buttons in the interface include a map control, a button to go back to the last page, a time controller, and data selection (Figure 3.28) (Figure 3.29).
Figure 3.28 Interface of data series selecting in VR
Figure 3.29 Interface of data series selecting in VR
The appearance of each button under the canvas is set up through its material in different colors. Several buttons are set up: the Map button opens the plan of the room so the user can see where they are; the 'Go Back' button returns to the last step to choose a different time and data series; and the 'Temperature', 'Humidity', and 'Particle' buttons choose the data series to visualize.
By including a button UI element in the scene structure, a button can be made (Figure 3.30). The inserted button should be placed inside the canvas, since the canvas has already been made. When the scene is
being played, the button's fundamental features—such as detecting when the mouse is hovering
over it and changing color when pressed—are already present.
Figure 3.30 A basic button that input into Unity project
For a button to be genuinely useful in the UI, it must have some functionality, so a script must be
written to give it all the necessary properties. The OnClick property can be set up to define what will happen when the button is clicked (Figure 3.31).
Figure 3.31 Script to make the button click available
The toggle is a UI component in Unity that allows users to toggle between two states, typically
an "on" and "off" state. In the context of controlling map visibility, a toggle can be used to turn
the visibility of the map on or off depending on the user's preference. The function is achieved
by creating a toggle game object in the Unity Editor and attaching a script to the object that
listens for a toggle event. Once the toggle is clicked, the script will trigger a function that toggles
the visibility of the map game object. The visibility of the map can be toggled using the
SetActive method of the game object, which sets the active state of the game object to either true
or false. It provides users with an accessible way to control the visibility of different elements in
the scene.
The image of the floorplan of the office is imported as a map into Unity's Project window, and a flat object is created in the scene (Figure 3.32). A toggle button was added to the scene and
was set to open initially (Figure 3.33). A script was created to toggle the visibility of the JPG
image (Figure 3.34). A variable was created in the script for the map image and a function was
written using SetActive() to toggle its visibility. The map image variable was assigned in the
script. Finally, the toggle was tested to make sure it worked as expected.
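A minimal sketch of this toggle logic, with illustrative field names, might look like the following; the actual MapDisplay script (Figure 3.34) also handles display properties such as size and offsets.

using UnityEngine;
using UnityEngine.UI;

// Sketch only: a listener on the Toggle shows or hides the floor-plan object with SetActive.
public class MapToggleSketch : MonoBehaviour
{
    public Toggle mapToggle;       // the UI toggle placed on the canvas
    public GameObject mapImage;    // the imported floor-plan image object

    void Start()
    {
        // Start in the state the toggle shows, then react to every change.
        mapImage.SetActive(mapToggle.isOn);
        mapToggle.onValueChanged.AddListener(isOn => mapImage.SetActive(isOn));
    }
}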
Figure 3.32 The image of floorplan as map
Figure 3.33 Set up inspector of Map Display script
Figure 3.34 Script of Map Display
A slider is a type of button. After the LineRenderer pointer is set up for button interaction, the slider has to be set to a specific value between the maximum and minimum values (Figure 3.35). It is a helpful tool for letting users change the time. A simple script is written to change specific settings of the slider (Figure 3.36).
Figure 3.35 Slider element in UI toolkit
Figure 3.36 Script for slider setting
In the slider properties, set the maximum value in the whole-number ("Integer") setting to 24 for all the hours of the day. Setting the color of the text through its properties produces a more pronounced color. Following the same procedure, the slider GameObject is dragged to the new slot; when the scene plays, the slider value can be changed from 1 to 24 by moving the handle.
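Expressed in code, the slider setup described above might look roughly like this; the label object and its text format are assumptions added for illustration.

using UnityEngine;
using UnityEngine.UI;

// Sketch only: a whole-number slider from 1 to 24 representing the hour of the day,
// with a label that updates as the handle moves.
public class HourSliderSketch : MonoBehaviour
{
    public Slider hourSlider;
    public Text hourLabel;

    void Start()
    {
        hourSlider.wholeNumbers = true;   // integer values only
        hourSlider.minValue = 1;
        hourSlider.maxValue = 24;         // all the hours of the day
        hourSlider.onValueChanged.AddListener(v => hourLabel.text = v + ":00");
    }
}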
Unity's UI system is a tool that helps in making the interface, and it can also help link all of these buttons and sliders so that they work together.
The VisualElement class is the base for all nodes in the visual tree (Figure 3.37).
Figure 3.37 The VisualElement class
The VisualElement base class contains properties such as styles, layout data, and event handlers. Visual elements can have children and descendant visual elements, which may include different kinds of child elements like Label, Checkbox, and Slider. The appearance of visual elements can be customized through inline styles and stylesheets. This makes it possible to link all these kinds of elements so that they work together.
A dropdown menu is a user interface element in Unity that allows the user to select an option
from a list of choices (Figure 3.38). It is a type of UI widget that can be added to a canvas and
customized to fit the needs of the application or game. When a user clicks on the dropdown
menu, a list of options will be displayed, and people can select one of the options by clicking on
it.
Figure 3.38 Dropdown menu of months
The options are customized in the Inspector of the dropdown menu (Figure 3.39). One menu should be set up as Months and another copied from it as Dates. The list of options for the dropdown
menu can be set up in the Unity Editor by adding OptionData objects to the Dropdown
component. Each OptionData object represents one option in the dropdown menu and contains
the text that will be displayed for that option (Figure 3.40).
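The same list of options could also be filled from a script rather than in the Inspector; the following sketch is illustrative, assuming the option texts are month names.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Sketch only: each OptionData entry becomes one selectable item in the dropdown.
public class MonthDropdownSketch : MonoBehaviour
{
    public Dropdown monthDropdown;

    void Start()
    {
        var options = new List<Dropdown.OptionData>();
        string[] months = { "January", "February", "March", "April", "May", "June",
                            "July", "August", "September", "October", "November", "December" };
        foreach (string m in months)
        {
            options.Add(new Dropdown.OptionData(m));   // text displayed for this option
        }
        monthDropdown.ClearOptions();
        monthDropdown.AddOptions(options);
    }
}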
Figure 3.39 Options setting in dropdown inspector
Figure 3.40 Dropdown after the options are applied
The month dropdown menu chooses the month, the date dropdown chooses the date, and the slider is specifically for the time of day (Figure 3.41). These three input elements cooperate to create a timestamp that selects the data for that time in the spreadsheet. The TimestampSelector script is
attached to the Canvas object by selecting the Canvas object in the Hierarchy panel. The
dropdown menus and sliders are customized by adding options and adjusting the values (Figure
3.42).
Figure 3.41 Buttons to control the time
Figure 3.42 Methods for UI elements to select time
A script generates a timestamp controlled by the three inputs; the timestamp is then used to read the corresponding data in the spreadsheet (Figure 3.43).
Figure 3.43 Script for timestamp selector
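A simplified sketch of how the three inputs might be combined into one timestamp string is given below; the year value and the exact output format are assumptions made for illustration, and the real TimestampSelector script is the one shown in Figure 3.43.

using UnityEngine;
using UnityEngine.UI;

// Sketch only: combines the month dropdown, the date dropdown, and the hour slider
// into a timestamp string that can be matched against the Date-Time column of the CSV files.
public class TimestampSelectorSketch : MonoBehaviour
{
    public Dropdown monthDropdown;   // options assumed to map to months 1-12
    public Dropdown dateDropdown;    // options assumed to map to dates 1-31
    public Slider hourSlider;        // 1-24, as set up above
    public int year = 2023;          // assumed fixed year

    public string BuildTimestamp()
    {
        int month = monthDropdown.value + 1;   // dropdown values start at 0
        int date = dateDropdown.value + 1;
        int hour = (int)hourSlider.value;
        // Produces a string such as "2023/03/01 14:00".
        return string.Format("{0:D4}/{1:D2}/{2:D2} {3:D2}:00", year, month, date, hour);
    }
}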
3.4.4 Interface Setting
With all these elements combined, when the user enters the VR system interface with glasses and
equipment on, they can see the location of the two handles and a brief text introduction to
explain the location and the purpose of the experiment. The main interface of the VR project
then comes up, where the user can first select the specific date and time of the data they want to visualize from the date function menu in front of them (Figure 3.44).
Figure 3.44 Interface of selecting date and time in VR
In the upper left corner of the line of sight is a button that can be clicked with the handle. After tapping it, the whole office floor plan and the user's current location are shown, and the user can freely walk around to reach different locations and observe different data (Figure 3.45).
Figure 3.45 Interface of the map button in VR
In the upper right corner of the interface is the back button, which allows people to go back to
the previous screen at any time. After selecting the date and time through the menu, three different parameters will be displayed on the screen: temperature, humidity, and airborne particulate content (Figure 3.46).
Figure 3.46 Interface of the data selection in VR
Any data that the user wants to see is selected through the buttons on the handle. Each kind of data should be listed in a separate spreadsheet that includes all sensor positions and times (Figure 3.47). The user can select single or multiple data series at the same time, and repeated button clicks will
cancel the selection. Again, there is a back button in the upper right corner, and the user can return at any time to change the data type and date to look at different data. The slider is a tool used to
change the time of the data based on the date. The time of the day can be changed through the
slider to check the indoor environment conditions at the time.
Before the data are read and change the height and color of the 3D objects, all of them are shown in white and at their original size (1,1,1) at the locations where the sensors are placed (Figure 3.48). The tested data are organized into different spreadsheets, each standing for a different aspect of the indoor environment. Each should be listed by sensor and time period, and each column in the spreadsheet corresponds to the temperature at one of the cube locations. While the project plays, the 3D objects change based on the data (Figure 3.49).
Figure 3.47 Scene of the office before played in VR
Figure 3.48 Example of the temperature data in spreadsheet for each cube
Figure 3.49 Corner office and its data of temperature shown in VR
As for the Map toggle, it is given a function that shows the floor plan of the whole office, marks the real-time position of the user, and gives a direction. This function is realized by a script that defines the visibility of the map. The script also lets the user customize the display properties of the map, such as its width, height, x and y offsets, and the file path to the image (Figure 3.50).
Figure 3.50 Floorplan to show on the screen
Based on this, the position of the user is marked by a symbol showing their location; in addition, all the tested positions where the sensors are set can also be checked on it (Figure 3.51). The real-time location following is realized using the x and z axes of the camera in the XR Rig.
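A rough sketch of this location-following idea is shown below; the scale factor and the offset between the room coordinates and the plan image are assumptions that depend on how the floor plan was drawn.

using UnityEngine;

// Sketch only: the x and z position of the XR Rig camera is mapped onto the floor-plan
// image so the marker follows the user; the height (y) is ignored for a plan view.
public class MapMarkerSketch : MonoBehaviour
{
    public Transform headsetCamera;     // the camera inside the XR Rig
    public RectTransform marker;        // marker image placed on the floor-plan UI
    public float metersToPixels = 20f;  // assumed scale between the room and the plan image
    public Vector2 planOriginOffset;    // assumed pixel offset of the room origin on the plan

    void Update()
    {
        Vector2 planPosition = new Vector2(headsetCamera.position.x, headsetCamera.position.z);
        marker.anchoredPosition = planOriginOffset + planPosition * metersToPixels;
    }
}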
Figure 3.51 Markers to show the current position of the user
3.5 Scripts
To give all of those buttons their specific jobs in the project, several scripts are written to enable different kinds of actions in Unity.
3.5.1 Toggle
This script is for the Map toggle to show the plan of the office and can control its visibility
through clicking the toggle (Figure 3.52).
Figure 3.52 Script for map toggle
3.5.2 Button
This is the basic script that makes a button clickable (Figure 3.53). Specific scripts for every single button for the different parts are shown in the Appendix.
Figure 3.53 Scripts for button
Start button to switch from start scene to main scene (Figure 3.54).
Figure 3.54 Script for load scene
3.5.3 Slider
Scripts are also used to create sliders (Figure 3.55).
Figure 3.55 Scripts for sliders
3.5.4 Menu
Scripts can also make a three-part menu: menu, dropdown and buttons (Figure 3.56).
Figure 3.56 Scripts for menu in Unity
3.6 Package the Project
To package a project in Unity, it is necessary to make sure that the project is complete and free of errors and that all assets and resources are in their proper locations. When the project is created, Unity stores a lot of metadata about the assets. To transfer the assets into a different project and preserve all this information, they should be exported as a Custom Package (Figure 3.57). The package can then be opened by any user to check and experience it (Figure 3.58).
Figure 3.57 Export the package
Figure 3.58 Export all Assets
In the Build Settings under the File menu (Figure 3.59), the target platform is chosen as PC, Mac & Linux Standalone for further use (Figure 3.60). Once the platform is selected, various settings for the project can be specified, such as the application name, icon, and orientation. Make sure to fill in all the required fields in the settings. With all of these settings finished, the Build button starts building the project; the output folder and the name of the built application are then specified.
Figure 3.59 The Build setting to build the project
Figure 3.60 Build and export the final project
3.7 Summary
This chapter described the workflow and methodology of the project, including how to achieve the digital geography, collect data, build the user interface, and package the project. The workflow is described and shown through a sample room. The model created in Revit is imported into Unity through 3ds Max to define the materials. Additionally, the process of visualizing data from an Excel spreadsheet in a virtual reality and augmented reality scene is established.
Ten scripts were written:
LoadSceneAsyn.cs The purpose of the script is to load the main scene from the start scene. It
helps to switch between the two scenes.
TemperatureVisualization.cs, HumidityVisualization.cs, ParticleVisualization.cs
The purpose of these data visualization scripts is to read the data in the CSV table files and then use the data as a basis for converting it into 3D objects of different sizes, so that the user can see the environmental quality of each location in the scene.
MapDisplay.cs The script helps control the visibility of the floorplan and marks the real-time location of the user.
ToggleTextAndImageVisibility.cs This script also controls visibility; it uses a toggle to control the visibility of the guide in the main scene.
Timestampselector.cs The month dropdown menu, date dropdown menu, and time slider work together in this script to generate a timestamp for the data visualization scripts.
ToggleCubeVisibility.cs, ToggleCapsuleVisibility.cs, ToggleSphereVisibility.cs
Toggle object visibility scripts are used to choose which one or more data series can be
shown in the scene.
See Appendix A for additional information about VR versus AR in Unity.
Chapter 4 will show the detailed workflow for using VR for data visualization for the WSP office
in the form of a “getting started guide.”
Chapter 4. USER EXPERIENCE VR
This chapter provides guidance for users so that they can look at a room and its associated Excel data spreadsheet in VR. It describes how the model from Revit is imported into Unity, the method of visualizing the data, and the strategy for experiencing it in VR and AR. It can be used as a "getting
started guide” for VR visualization of data. Chapter 5 is similar, but it is about AR visualization
of data.
Figure 4.1 Workflow for users to work with the project
4.1 Prepare the Files
For this example, the user will need seven files to get started: a Revit 3D model and its 3ds Max export, three CSV files, and two Unity files with the data objects already created. These sample files have been
provided:
WSP model.rvt
WSP model with material.max
DataTemperature.csv
DataHumidity.csv
DataCO2.csv
StartScene.unity
DataVisualization.unity
The script files are already loaded into the Unity project, so one does not have to open or load them while using the project, but they must remain in the project file package for it to work.
4.1.1 Create a Revit File
To get started with working on this project, a Revit 3D file should be created. The example shown
is the office from WSP company in Downtown LA (Figure 4.2). The floor plan is shown (Figure
4.3), and a model of the office was created in Revit (Figure 4.4).
Figure 4.2 A photograph inside the WSP office
Figure 4.3 Floor plan of WSP office
Figure 4.4 3D model of WSP office
4.1.2 Create a Spreadsheet with the Data
To make sure the data collected from the sensors can be read correctly by the scripts in the Unity project, the data should be cleansed and collated in the spreadsheet. Each column in the spreadsheet should show a single sensor's results. Because of how the scripts read the data, the temperature, humidity, and particle results need to be separated into three different spreadsheet files so that they work with the different scripts.
The example shown is from Shreya Santodia, who is working on a project about indoor environmental quality relative to human experience in the WSP office in Downtown LA. The spreadsheet holds the data for temperature, humidity, and particles at a specific seat over the period (Figure 4.5).
Figure 4.5 Example of spreadsheet of tested data
It has data collected at these locations (Figure 4.6).
Figure 4.6 Floor plan with position of sensors in the office
Figure 4.7 Samples of three CSV files for temperature, humidity, and particles
The layout of the data in the three spreadsheets (temperature, humidity, and particles) should follow the specific format shown in Figure 4.7 to make sure the project works. To ensure the data can be read and generate a 3D object in the correct position, the spreadsheet should follow this format: yyyy-MM-dd hh:mm, x position, z position, data; otherwise, the data will show incorrect results due to reading the wrong number. Each sensor position should be exported into a separate CSV file.
The spreadsheets should be saved as CSV so that they can be read in Unity, and they should be named DataTemperature.csv, DataHumidity.csv, and DataCO2.csv.
4.1.3 Unity Project File
The Unity file includes these objects and scripts already set up (Figure 4.8). Users only need to open it and combine it with the FBX file of the model and the CSV files of the data to make it work, as the following steps will explain.
Figure 4.8 The finished Unity project provided
4.2 Revit to 3ds Max to Unity
The 3D model file can be imported into Unity through the FBX format (Figure 4.9).
Figure 4.9 The 3D model of the office in Revit
There is a problem that the materials will be lost in this process as mentioned in 3.1.2. To deal
with this, the FBX file of the model should be exported through 3ds Max to make sure the textures
are fixed. In this process, the position of the model can be relocated and centered.
4.2.1 Export the Revit model as an FBX File
Open a 3D view of the Revit file. The parts that are set as visible in Revit will be exported; invisible parts cannot be saved into the FBX file even though they exist in the model. Because of this, unnecessary items in the model can be set as invisible to make the scene much clearer. Then export the model as an FBX file.
The FBX file format passes rendering information to 3ds Max, including lights, render appearances,
sky settings, and material assignments for the 3D view.
Import the newly created FBX file based on the Revit model into 3ds Max (Figure 4.10). Some unnecessary parts, such as level labels and cameras, will appear in the model. These cause some inconvenience while working in Unity, so they should be deleted in 3ds Max before further work (Figure 4.11).
Figure 4.10 Original FBX file shown in 3ds Max
Figure 4.11 Parts of the imported model from Revit
To convert the materials from Revit to this 3ds Max project, click Rendering in the menu, choose the mental ray daylighting render preset, and load it (Figure 4.12); some of the materials in the original Revit model should then be shown. Make sure no textures are lost. If needed, copy the textures to a single folder and use them, and relink the path in the Asset Tracking window in 3ds Max; this can be done by right-clicking the texture and choosing Set Path or Browse in the context menu (Figure 4.13).
Figure 4.12 Setting the render of 3ds Max
Figure 4.13 Model that with material texture in 3ds Max
Make sure all of these materials in the original model are Standard materials, which can be checked
in Slate Material Editor (Figure 4.14). The Slate Material Editor is a visual node-based material
editor that allows users to create and edit complex materials and textures by connecting different
nodes that represent various material properties and maps.
Figure 4.14 Redefine the material in Scene Converter
Put the model in the center by setting up the pivot of the model. This also helps to make working
in Unity easier (Figure 4.15) (Figure 4.16).
Figure 4.15 Reset the pivot of the model
Figure 4.16 Center the model in 3ds Max
Then the scene can be re-exported as an FBX file with the Triangulate option in Geometry and the Embed Media option selected (Figure 4.17).
Figure 4.17 Set and Export FBX file
4.2.2 FBX File Import into Unity
Open the sample Unity project provided (DataVisualization.unity). Copy all the textures from the
3ds Max scene to the Assets folder of the Unity project. Copy the exported 3DS Max FBX file to
the same location. Set the materials in the Inspector menu to Standard and Use External Materials (Figure 4.18). Finally, drag and drop the FBX file into Unity's Hierarchy window. After that, the model file should be imported successfully into the Unity project (Figure 4.19). Unity does, however, have limitations in expressing detailed material textures, which will not look exactly the same as the rendered model.
Figure 4.18 Redefine the materials in Unity
129
Figure 4.19 Model in Unity after the materials are defined
4.3 Import CSV File to Unity
The three CSV files exported from Excel can be loaded directly into Unity and saved in the Assets
folder (Figure 4.20).
The temperature CSV file should have columns with the following contents: "Date-Time", "x
position", "z position", and "Temperature (°C)". Similarly, the humidity file should have the columns
"Date-Time", "x position", "z position", and "Humidity (%)", and the CO2 spreadsheet should contain
the columns "Date-Time", "x position", "z position", and "CO2 (ppm)". The script reads the file and
splits each line by commas, so the values for each column must be separated by commas. The date-time
column should be formatted as "yyyy/MM/dd hh:mm" with the time in 24-hour format. If the file is not
located in the same directory as the Unity project, the file path in the script may need to be
modified to match the location of the CSV file. The original Excel file must be closed when running
the project in Unity, or the CSV file cannot be found from the file path.
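As a rough illustration of how such a file might be read, the following minimal C# sketch parses one
of the three CSV files; the class name, the Record type, and the use of "HH" for 24-hour parsing are
assumptions made for this example, not the thesis script itself (the actual scripts are shown in the
figures and in the Appendix).

using System.Collections.Generic;
using System.Globalization;
using System.IO;
using UnityEngine;

// Minimal sketch of reading one of the CSV files described above.
// The file layout is: Date-Time, x position, z position, value.
public class CsvReaderSketch : MonoBehaviour
{
    public struct Record
    {
        public System.DateTime time;
        public float x;
        public float z;
        public float value;
    }

    public List<Record> LoadRecords(string path)
    {
        var records = new List<Record>();
        string[] lines = File.ReadAllLines(path);
        // Skip the header row, then split each data line on commas.
        for (int i = 1; i < lines.Length; i++)
        {
            string[] cells = lines[i].Split(',');
            if (cells.Length < 4) continue;
            Record r;
            // "HH" parses the 24-hour date-time column described above.
            r.time = System.DateTime.ParseExact(
                cells[0].Trim(), "yyyy/MM/dd HH:mm", CultureInfo.InvariantCulture);
            r.x = float.Parse(cells[1], CultureInfo.InvariantCulture);
            r.z = float.Parse(cells[2], CultureInfo.InvariantCulture);
            r.value = float.Parse(cells[3], CultureInfo.InvariantCulture);
            records.Add(r);
        }
        return records;
    }
}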
Figure 4.20 Imported CSV files in Unity
4.4 Experience in VR
In the previous steps, the import of the model and the visualization of data in Unity were completed
by importing the required plug-ins and scripts. The next set of steps is to view the project in VR,
starting with how to set up the headset and link the device to Unity (Figure 4.21).
Figure 4.21 Workflow for VR section in Unity
4.4.1 Enabling VR
Enabling VR in a Unity project is a straightforward process that can be accomplished by
following a few simple steps.
131
Before following the steps in Unity, install the appropriate driver and software for the VR headset
on the computer. Different headsets have different driver and software requirements, so be sure to
follow the instructions provided by the headset manufacturer.
To enable VR in Unity, navigate to Player Settings in the Edit menu, click the Player option, select
Other Settings, and check the Virtual Reality Supported option; Unity will then be able to build
VR projects (Figure 4.22).
Figure 4.22 Check the box of Virtual Reality to enable VR (Unity, 2022)
When VR is enabled in Unity, the view is rendered automatically to the head-mounted display, and all
cameras in the scene are able to render directly to it. The view is adjusted automatically.
132
4.4.2 Set Up Headset
Open the scene named DataVisualization.unity under the Scene folder in the project. Once the file is
opened, select Build Settings in the File menu to set up the project. Set the platform to PC, Mac &
Linux Standalone.
Figure 4.23 Build the platform to Windows, Mac and Linux
Next, enable VR support. Select Project Settings in the Edit menu, add XR Plugin Management, and
install it (Figure 4.24). Open the Windows tab and select OpenXR to install the XR plugin and enable
VR support in Unity (Figure 4.25) (Figure 4.26).
133
Figure 4.24 Project Setting menu
134
Figure 4.25 Add XR Plugin Management in Project Settings
Figure 4.26 Select the supported VR Plugin
Fix the error reported after enabling OpenXR by adding an interaction profile: click the + button
in the Interaction Profiles list (Figure 4.27). The sample project was tested with the Samsung Gear
VR headset. The Oculus Touch controllers are compatible with the Samsung Gear VR headset, and the
"Oculus Touch" profile includes mappings for the buttons, triggers, and thumbsticks on the
controllers.
135
Figure 4.27 Set up the interaction profile
If needed, the input settings can be adjusted in the Input Manager under Project Settings to test
and modify the VR experience with the Oculus Touch controllers in the Unity Editor or on a Samsung
Gear VR device (Figure 4.28).
136
Figure 4.28 Input manager to set up the controllers
137
Open the Window menu at the top and choose Package Manager (Figure 4.29).
Figure 4.29 Package Manager
138
Install OpenXR Plugin and XR Interaction Toolkit (Figure 4.30) (Figure 4.31).
Figure 4.30 Install OpenXR plugin
Figure 4.31 Install XR Interaction Toolkit
139
Import the Starter Assets sample of the XR Interaction Toolkit as a basis for the controllers
(Figure 4.32).
Figure 4.32 Import Starter Assets of XR Interaction
140
If the VR headset is attached to the computer properly, the scene should now be able to be
manipulated by the user (Figure 4.33).
Figure 4.33 Scene for users to see in a blank workspace in Unity
Open the loaded sample default interaction Starter Assets (Figure 4.34).
Figure 4.34 Imported starter assets
141
Open XRI Default Left Controller.preset. In the Inspector on the right side, click Add to
ActionBasedController default. Click the Add button for each of the presets in the XR Interaction
Toolkit. This automatically populates the correct controls for locomotion, snap turning, and
teleporting (Figure 4.35).
When working with a Unity project, imported scripts will automatically work without requiring
any additional setup. These scripts will be functional as soon as the Unity project is launched. Any
errors within the scripts will be highlighted and reported within the project, indicating the specific
problem that needs to be addressed.
142
Figure 4.35 Add default assets
143
Navigate to the Preset Manager in Project Settings; all of the presets that were added are listed
there. The move function of the VR headset's controller can be set up as needed in the XR Default
Input Actions (Figure 4.36).
Figure 4.36 Set up the buttons on the controller to move
The button on the controller used to select can also be set up (Figure 4.37).
Figure 4.37 Set up the buttons on the controller to select
144
With all the default content imported and active, open the Preset Manager in Project Settings
(Figure 4.38).
Figure 4.38 Preset manager of project setting
In the Preset Manager, type ActionBasedController in the filter to match it with the preset, and
differentiate the right and left controllers by writing Right and Left in the filter column
(Figure 4.39).
Figure 4.39 Match the ActionBasedController with the preset
145
Then set the left- and right-hand grips in the XR Default Input Actions. Match each left-hand turn
action with the left hand and each right-hand turn action with the right hand (Figure 4.40)
(Figure 4.41). The scene will show a line from the center after the controllers are set up
(Figure 4.42).
Figure 4.40 Match the XR Default with the left and right hand action
146
Figure 4.41 Match the XR controller action
Figure 4.42 Scene for users after the controllers are set up
147
With the controllers set up and VR enabled in the project, the headset can be plugged into the
computer. The scene on the screen can then be moved with the headset, and users see in the headset
what is shown on the computer screen (Figure 4.43).
Figure 4.43 VR headset is plugged in and set up
4.4.3 Set Up Player’s View
Create a new empty GameObject named VR Rig in the Hierarchy (Figure 4.44); it will exist in the game
world as the player's viewpoint. Set all coordinates of this object to (0, 0, 0) (Figure 4.45).
148
Add a pre-built component named XR Rig (Figure 4.46).
Figure 4.44 Create an XR Rig
Figure 4.45 Set up the start position of camera
149
Figure 4.46 Add an XR Rig component to VR Rig
Then create another new empty GameObject, also set to (0, 0, 0), and rename it Camera Offset. Create
a new Camera object under the Camera Offset; this camera is the actual eyes of the player
(Figure 4.47). The view that the player experiences in VR is the screen captured by this camera
(Figure 4.48).
Figure 4.47 Part of Hierarchy for XR Rig
150
Figure 4.48 Add a camera in VR Rig
Since it is used as the player's eyes in VR, the Camera object needs to be linked with the tracking
settings, so add the Tracked Pose Driver component to the VR Camera object to automatically
associate the camera with the headset's position and pose (Figure 4.49). The first-person view that
follows the camera is then in place (Figure 4.50).
Figure 4.49 Set up Tracked Pose Driver component
Make sure that the Player GameObject is at a comfortable height for the user. This can vary
depending on the type of VR experience, but a good rule of thumb is to have the player's eyes at
around 1.6 meters (5.2 feet) off the ground.
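As a minimal sketch of this rule of thumb, the eye height could also be set from a small script,
assuming the Camera Offset object is assigned in the Inspector; the class and field names here are
illustrative only and are not part of the thesis project.

using UnityEngine;

// Minimal sketch: place the camera offset at roughly standing eye height.
// The 1.6 m value matches the rule of thumb mentioned above and can be tuned.
public class EyeHeightSetup : MonoBehaviour
{
    public Transform cameraOffset;   // assign the Camera Offset object in the Inspector
    public float eyeHeight = 1.6f;   // metres above the rig origin

    void Start()
    {
        if (cameraOffset != null)
        {
            cameraOffset.localPosition = new Vector3(0f, eyeHeight, 0f);
        }
    }
}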
151
Figure 4.50 Capsule as first-person aspect
The reason for using a camera and a capsule as the eyes and body in a first-person view in Unity is
to simulate the experience of being inside the game world as if the user were the player character.
The camera represents the player's view, while the capsule represents the player's body (Figure
4.51). The player controls the movement and orientation of the capsule, and the camera moves and
rotates accordingly, giving the illusion of the player moving and looking around the game world.
This can be done by dragging the VR Camera GameObject from the Hierarchy window and dropping it onto
the Player GameObject (Figure 4.52) (Figure 4.53).
152
Figure 4.51 The capsule imported as body
Figure 4.52 Camera for the play view
153
Figure 4.53 Scene of first-person sight
To make the player's view follow their movements properly, turning should also be set up. There are
two turning methods available in the XR Toolkit: continuous and snap. Continuous turning allows the
user to change direction in one fluid motion, but it can cause dizziness and nausea in some people
after a while. The snap method is the better choice here: it rotates the player by discrete angles,
making it easier for the brain to process rotation in the virtual world in the absence of a
proprioceptive counterpart.
To use snap turning, click Add Component in the XR Rig's Inspector window and look for "Snap Turn
Provider (Action Based)". The references "Left Hand Snap Turn Action" and "Right Hand Snap Turn
Action" must be filled out. Disable the Left Hand Snap Turn Action so that only one hand is used for
snap turning (Figure 4.54).
154
Figure 4.54 Setting up Snap Turn
Once the player's view is set up and the results in the Game View are satisfactory, the VR headset
can be plugged in to test the experience in the actual headset. This allows users to ensure that
the player's view is properly aligned with the VR headset and that the experience is seamless and
immersive. Make sure the headset is connected and properly set up, and that it has been selected as
the active device in Unity, before building and running the project. Note that the player's view may
need to be adjusted slightly once the VR headset is plugged in, as the physical dimensions and
orientation of the headset may differ slightly from the virtual representation in the Unity Editor.
4.4.4 Implementation of Button for VR in Unity
Add one more component to the XR Rig so it can be operated: click Add Component in the Inspector of
the XR Rig and add the Input Action Manager component (Figure 4.55).
155
Figure 4.55 Input action manager
Put the XR Default element into the Input Action Manager (Figure 4.56).
Figure 4.56 Load the default action into action assets
Since all click events in Unity are ultimately implemented through the OnPointerClick method of the
IPointerClickHandler interface, click events can be added with the Button component. Write a script
whose method, when clicked, changes which rows of the table are read (Figure 4.57) and attach it to
the object in the Hierarchy. There are three parameters under OnClick in the Button component: first
select the object the script is attached to, then select the method to be executed, and finally
click it to execute the ListClick method (Figure 4.58).
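A minimal sketch of this wiring, with ListClick standing in as a placeholder for the method that
changes which rows of the table are read, might look like the following; the class and field names
are illustrative only, and the same connection can also be made in the Inspector through the OnClick
list as described above.

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of wiring a UI Button to a method such as ListClick.
public class ListClickButton : MonoBehaviour
{
    public Button targetButton;   // assign the Button in the Inspector

    void Start()
    {
        if (targetButton != null)
        {
            targetButton.onClick.AddListener(ListClick);
        }
    }

    public void ListClick()
    {
        Debug.Log("Button clicked: switch the data rows being read here.");
    }
}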
156
Figure 4.57 Script of button
Figure 4.58 Execute ListClick
157
With all of these settings finished, the Unity project linked with the VR headset and controllers
should be set up with all of the buttons working, letting users experience being inside the room
and visualize the data in it.
The scripts and the CSV files of the data are already imported into the Unity project file, so users
can simply use it without importing any extra files. Each data series in the project is handled by
a corresponding script, which also helps reduce the complexity of the experience.
4.5 Trying Out All the Features
With all of the VR settings finished and the headset plugged in, the Unity project is ready to be
experienced. When the user opens the project file, it starts with the Start Scene, which gives a
simple introduction to the project and its aim (Figure 4.59). Using the controller to click the
button switches to the main scene.
158
Figure 4.59 Start scene of the project
In the main scene of the Unity project, users start from the entrance of the office (Figure 4.60).
The interface contains several components, including a Map toggle, a Month dropdown, a Date
dropdown, a Time slider, a GoBack button, Temperature, Humidity, and CO2 toggles, and a Guide
toggle. The toggles and button work with a single click, while the dropdowns need to be clicked and
then operated with the slider inside the menu.
159
Figure 4.60 Main scene after clicked the start button
In the default scene of the office, reached from the start scene, the map is invisible. If needed,
clicking the Map toggle makes the floor plan appear in the scene. A marker of the current location
is also shown on the map (Figure 4.61).
Figure 4.61 Interface of map in VR
160
If the user is not familiar with what the buttons are for, clicking the Guide toggle at the right
corner shows a simple explanation of how to use them on the screen. It can be closed at any time by
clicking the Guide toggle again (Figure 4.62).
Figure 4.62 Guide to give introduction to the function of buttons
Due to the limitation of the data, if the chosen date and time have no data in the file, a "NO DATA"
warning is shown to tell the user to change to another time or date (Figure 4.63) (Figure 4.64).
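A minimal sketch of this behaviour, assuming the parsed data is kept in a dictionary keyed by the
date-time string, could look like the following; the class and field names are illustrative only
and stand in for the actual thesis script.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of the "NO DATA" behaviour: if the selected date-time key is
// not present in the parsed data, a warning label is shown instead of objects.
public class NoDataWarning : MonoBehaviour
{
    public Text warningLabel;                                  // UI Text that shows "NO DATA"
    public Dictionary<string, List<float>> dataByTime =
        new Dictionary<string, List<float>>();                 // filled by the CSV reader

    public bool TryShowData(string selectedTime, out List<float> values)
    {
        if (dataByTime.TryGetValue(selectedTime, out values))
        {
            warningLabel.text = "";
            return true;                                       // caller generates the 3D objects
        }
        warningLabel.text = "NO DATA";
        return false;
    }
}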
161
Figure 4.63 Scene if there are no data at the selected time
Figure 4.64 Scene if there are no data at the selected time and closed guide
162
If the user wants to choose a different date, this can be done by clicking the arrow of the dropdown
menu with the controller and clicking again to select a specific month and date (Figure 4.65)
(Figure 4.66). The time slider is easy to control by pressing and moving it.
Figure 4.65 Choose the month in dropdown menu
Figure 4.66 Change the date and time by date dropdown and time slider
163
If the selected time can be found in the table file, the data visualization script will run and
several 3D objects will be generated to represent the data (Figure 4.67).
Figure 4.67 3D objects can be seen in the scene
The user can control which types of data are read and displayed by clicking the different toggles,
and one or more data types can be chosen (Figure 4.68) (Figure 4.69) (Figure 4.70). The specific
values of the data that are read are displayed around the generated 3D objects.
164
Figure 4.68 Data of temperature and CO2 particle visible
Figure 4.69 Data of all three series of indoor environment visible
165
Figure 4.70 Data of humidity visible only
Likewise, when a time is selected that has no corresponding data, an indication of no data appears
on the screen, and no 3D objects representing the data are generated in the model (Figure 4.71).
Figure 4.71 "NO DATA" shown at a time that has no result
166
As the person's position moves, opening the map shows that the marked position moves with it
(Figure 4.72) (Figure 4.73). People can identify the direction of the locations where data is
generated based on this map, for easier observation (Figure 4.74) (Figure 4.75).
Figure 4.72 The current location of user shown on the map
167
Figure 4.73 Data of temperature shown on the scene
Figure 4.74 All three kinds of data shown in another position of the office
168
Figure 4.75 Trying the project with VR headset
4.6 Summary
This chapter used the model of the WSP office to give guidance for using the sample model and data
to visualize the indoor environmental conditions in VR. Instructions were provided for creating all
the files needed if a user wanted to use their own 3D model with associated data.
In the VR experience of this project, users are able to control the time of the data being read
through buttons and to choose which of the three spreadsheets of data they want to observe. At the
same time, the position of the user is indicated by an icon that moves with them, to give more
information. In addition, a guide is given in Unity to help people understand how to operate the
project. When there is no corresponding data in the table, "NO DATA" is displayed. With the prepared
files, including the 3D Revit model and the exported 3ds Max file, three spreadsheets of data, and
two Unity files, users can set it up and experience the data through a VR headset.
170
Chapter 5. USER EXPERIENCE AR
This chapter introduces how users can experience the data visualization of the indoor environment
through AR. It describes how the same Unity project used in VR can also be used for AR applications
through the following steps: Vuforia setup, AR camera setup, AR platform setup, and computer graphic
simulations. Many of the steps overlap with those in Chapter 4, which focuses on the VR setup and
experience.
Figure 5.1 Workflow diagram for experiencing the project
171
Figure 5.2 AR in the WSP office
5.1 Unity Project Experience in AR
The Unity project that was already created can also be expressed in the form of AR. Through the
cooperation of Vuforia and Unity, the existing data visualization is displayed as AR on the screen
of mobile devices or AR devices, overlaid on the real-life camera view (Figure 5.3).
172
Figure 5.3 Workflow of Augment reality in Unity
The main difference between creating VR and AR experiences in Unity is the way digital content
is presented to the user. In VR, the user is fully immersed in a digital environment and
experiences the content through a headset that covers the eyes and ears. Digital content is usually
presented in 3D and users can interact with it using a controller or other input device. The main
goal is to create a whole new world or environment for the user to explore and interact with.
But in AR, digital content is superimposed on the real world through the mobile device's camera.
Users can see both the real world and the digital content at the same time. The user can interact
with the digital content through the mobile device and using touch or other input methods to
enhance the user's real-world experience.
5.2 Prepare the Files
For this example, three files are needed to get started: a CSV data file and two Unity files with
the data objects already created. The software used is Autodesk Revit, 3ds Max, Excel, and Unity.
These sample files have been provided:
Data.csv
ARDataVisualization.unity
StartScene.unity
5.2.1 Difference Files with VR
Based on the original Unity project, it is possible to extend its functionality and make it even more
engaging by transforming it into an augmented reality experience. By leveraging the power of
Vuforia and Unity, the existing data visualization can be presented in a fascinating and immersive
way on the screen of mobile devices or AR devices, seamlessly blending with the real world.
Unlike the traditional visualization models that are limited to a fixed framework, AR breaks the
barriers and allows the data to be presented in different ways and in different locations. This
provides an unparalleled level of flexibility and enhances the user experience to a great extent. In
this project, a new Unity file was created, specifically dedicated to the AR aspect of the
visualization. The need for the building model was eliminated, and the primary focus was shifted
towards visualizing the data in the most optimal way possible.
By utilizing AR, people can stand in a realistic indoor environment and observe the values of
location-based data in a highly interactive and engaging manner. With AR, the user can enjoy a
truly immersive experience by seeing the data overlaid onto the real world, bringing it to life and
making it more accessible and understandable. AR is a powerful tool that adds a new dimension
to data visualization and opens up new possibilities for engaging with data.
174
The utilization of AR technology has expanded the data visualization beyond the constraints of a
traditional model framework, allowing for the representation of diverse data in a range of
locations. To achieve this, a new Unity file was created, solely devoted to the AR aspect of the
visualization. By eliminating the need for the building model, the emphasis was shifted towards
visualizing the data in the most effective way possible.
5.2.2 Create a Spreadsheet with the Data
To ensure that the data collected from sensors is read accurately by the Unity project script, the
data must be properly cleaned and collated in a spreadsheet. The process of cleaning and collating
the data for the AR experience differs from that of the VR experience: in the AR project, the CSV
files are merged into a single spreadsheet, and an additional column is added to identify the
position of each reading.
The example shown is from Shreya Santodia, who is working on a project about indoor environmental
quality relative to human experience in the WSP office in Downtown LA. The spreadsheet contains the
temperature, humidity, and carbon dioxide data at specific seats during the measurement period
(Figure 5.4).
175
Figure 5.4 Example of spreadsheet of tested data
With the Vuforia engine, assigning distinct target images to different locations would be all that
is necessary to observe the indoor environmental quality data at different moments within a given
location, simply by selecting different times. However, in this project, importing a target image
for each of the 27 locations would be excessively complicated. As an alternative, all the data has
been aggregated, and readings are selected using a dropdown menu for ease of study. The combined
CSV file consists of columns for date and time in yyyy-MM-dd hh:mm:ss format as well as temperature,
humidity, and CO2.
Because all the data is merged in the AR experience, it is essential to add a column that identifies
the position of each reading to prevent the script from misreading the data. By including the
position, data can be located by position and time stamp, allowing accurate interpretation.
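A minimal sketch of such a lookup, assuming a simple "position|date-time" string key, could look
like the following; the class name and key format are illustrative assumptions, not the thesis
script.

using System.Collections.Generic;

// Minimal sketch: each row of the merged AR spreadsheet is keyed by position
// AND time stamp so readings from different seats cannot be confused.
public class MergedDataIndex
{
    private readonly Dictionary<string, float[]> byPositionAndTime =
        new Dictionary<string, float[]>();

    public void Add(string position, string dateTime,
                    float temperature, float humidity, float co2)
    {
        byPositionAndTime[position + "|" + dateTime] =
            new float[] { temperature, humidity, co2 };
    }

    public bool TryGet(string position, string dateTime, out float[] values)
    {
        return byPositionAndTime.TryGetValue(position + "|" + dateTime, out values);
    }
}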
Data was collected at the locations shown in the floor plan (Figure 5.5).
Figure 5.5 Floor plan with position of sensors in the office
177
The layout of the data in the spreadsheet should follow the specific format shown in Figure 5.6 to
make sure the project works. The temperature, humidity, and carbon dioxide data should be entered
in order into the correct columns, or the results will be wrong because the wrong numbers are read
(Figure 5.6). The data visualization script finds the corresponding date-time and position in the
CSV file and then determines the generated 3D objects' scale and appearance.
Figure 5.6 Sample of CSV file
Unlike the VR project, where the temperature, humidity, and particle data were separated into three
spreadsheet files for the scripts to read, the AR project uses a single file. To ensure the data can
be read and a 3D object generated in the correct position, each row of the spreadsheet should follow
the order Position, yyyy/MM/dd hh:mm:ss, temperature, humidity, CO2; otherwise the data will show
incorrect results because the wrong numbers are read.
The spreadsheet should be saved as a CSV file so that it can be read in Unity, and it should be
named Data.csv to be ready for use.
5.2.3 Script
The program reads the information from the CSV file and builds a dictionary that maps a date-time
string to a list of three values (Figure 5.7). A Unity dropdown UI element is then populated with
the dates and times available in the data. When the user chooses a date and time from the dropdown,
the script takes the corresponding list of values from the dictionary and sets the colors and sizes
of three cubes in the Unity scene. The three data types represented by the cubes are temperature,
humidity, and carbon dioxide concentration; the values read from the spreadsheet determine the color
and scale of the cubes.
The full script will be given in the Appendix.
179
Figure 5.7 The script for data visualizing
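As a rough illustration of the behaviour just described, a minimal sketch might look like the
following; the field names, file path, column order, and scaling divisors are assumptions made for
this example rather than the values used in the thesis script shown above.

using System.Collections.Generic;
using System.IO;
using System.Linq;
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: parse the CSV into a dictionary keyed by date-time, fill a
// Dropdown with the available times, and resize three cubes on selection.
public class DataDropdownSketch : MonoBehaviour
{
    public Dropdown timeDropdown;
    public Transform temperatureCube, humidityCube, co2Cube;
    public string csvPath = "Assets/Data.csv";

    private readonly Dictionary<string, float[]> data = new Dictionary<string, float[]>();

    void Start()
    {
        foreach (string line in File.ReadAllLines(csvPath).Skip(1))
        {
            string[] c = line.Split(',');
            if (c.Length < 5) continue;
            // Assumed column order: position, date-time, temperature, humidity, CO2.
            data[c[1].Trim()] = new float[]
            {
                float.Parse(c[2]), float.Parse(c[3]), float.Parse(c[4])
            };
        }
        timeDropdown.ClearOptions();
        timeDropdown.AddOptions(data.Keys.ToList());
        timeDropdown.onValueChanged.AddListener(OnTimeSelected);
    }

    void OnTimeSelected(int index)
    {
        float[] v = data[timeDropdown.options[index].text];
        // Scale cube heights by the readings (divisors here are arbitrary).
        temperatureCube.localScale = new Vector3(0.2f, v[0] / 10f, 0.2f);
        humidityCube.localScale    = new Vector3(0.2f, v[1] / 20f, 0.2f);
        co2Cube.localScale         = new Vector3(0.2f, v[2] / 500f, 0.2f);
    }
}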
Another script controls the visibility of a 3D object in Unity, represented by the cube variable,
based on the state of a toggle UI element (Figure 5.8). When the script starts, it sets the
visibility of the cube based on the initial state of the toggle: if the toggle is on, the cube is
visible, and if the toggle is off, the cube is hidden. This lets the user select which data series
they want to visualize.
Figure 5.8 Script for the data visibility control
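A minimal sketch of this toggle behaviour, with illustrative field names that stand in for the
thesis script, could look like the following.

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: a Toggle shows or hides the cube for one data series.
public class CubeVisibilityToggle : MonoBehaviour
{
    public Toggle seriesToggle;   // e.g. the Temperature toggle
    public GameObject cube;       // the cube for that data series

    void Start()
    {
        cube.SetActive(seriesToggle.isOn);                    // match the initial state
        seriesToggle.onValueChanged.AddListener(cube.SetActive);
    }
}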
5.2.4 Unity Project File
The Unity file includes these objects and scripts already set up (Figure 5.9). Users only need to
open it and combine it with the CSV file of the data, as the following steps will explain.
181
Figure 5.9 Finished Unity project provided
5.3 Import CSV File to Unity
The CSV file exported from Excel can be loaded directly into Unity and saved in the Assets folder
(Figure 5.10).
Figure 5.10 Imported CSV file
182
5.4 Enable AR in Unity
AR is enabled by transforming the Unity project into an AR application. Through the Vuforia engine,
the 3D objects or animations can be detected through an image tracker captured by the camera; the
target objects appear upon detection of a target image, and the application can be built and
executed on a supported device (Figure 5.11).
Figure 5.11 Workflow to enable AR in Unity through Vuforia
5.4.1 Register Vuforia
Vuforia is a platform specifically for AR development. It enables developers to create AR
experiences for mobile devices and smart glasses. It uses computer vision technology to recognize
and track real-world objects, such as images, and to overlay digital content onto them in real
time. This image registration capability enables developers to position and align virtual objects
with actual objects, using 3D models and other material, while viewing them through the camera of a
mobile device. It gives the virtual object the appearance of being a component of an actual scene.
Vuforia and Unity work together so that developers can create AR applications with ease. Unity
provides a wide range of tools and features that allow developers to create 3D models, animations,
and other interactive content that can be displayed in the AR environment. Vuforia, on the other
hand, provides the computer vision technology and APIs that enable Unity to recognize and track
real-world objects.
To use AR in Unity, first register on the Vuforia website; after successful registration the site
jumps to the License Manager page (Figure 5.12). Click the "Add License Key" button to create a
license, click Next, agree to the terms and conditions, and then click the "Confirm" button to
confirm the registration of the application. Vuforia is used for presenting the 3D models or
animations in the Unity project in AR format; it visualizes the 3D objects generated from the data
so they can be seen in the real world through the camera. Advanced computer vision capabilities can
be added to the project, enabling it to recognize images, objects, and spaces, with intuitive
options for configuring the application to interact with the real world.
184
Figure 5.12 License Manager of Vuforia
5.4.2 Download the SDK and Import it into Unity
Next, find Download for Unity on the download page of the Vuforia website, and download the
SDK for Unity. Click the "Import" button to import the downloaded SDK into the Unity project
(Figure 5.13).
185
Figure 5.13 Import SDK file into Unity
5.4.3 Add and Set Up ARCamera
Create a new scene and drag the ARCamera preset from the Prefabs folder to the scene (Figure
5.14), then copy the License Key information from the Vuforia webpage to the App License Key
field of the Vuforia Behaviour script (Figure 5.15).
186
Figure 5.14 Imported ARCamera prefab
Figure 5.15 Copied license key to the script of Vuforia
5.4.4 Set Up Buttons
When using Vuforia in Unity, buttons can still be used in a similar manner to trigger events or
actions within an AR application. To use buttons with Vuforia, a script is created that handles the
button's actions and interactions with the AR environment. This script can be attached to a
GameObject, which is then associated with the target image. When the target image is detected by
Vuforia, the associated GameObject and its script are activated, allowing the button to function
within the AR environment.
A script was written to make sure the dropdown menus and toggles work in AR as well (Figure 5.16).
Figure 5.16 Script for controlling the buttons in AR
The difference when using buttons in an AR environment is that the user's interaction with the
button may not be as straightforward as in a traditional game or application. The user may need to
move the device's camera around to align the button with the target image before the UI elements
can be interacted with.
5.4.5 Add Identification Marks
Image targets are images that the Vuforia engine can detect and track. By comparing natural features
extracted from camera images with a database of known target resources, the engine detects and
tracks the images (Figure 5.17). A target image can be seen as a less restrictive QR code that is
scanned and detected by the camera for AR. The project is then shown on devices based on the target
image. The sample file is called TargetImage.jpg (Figure 5.18).
Figure 5.17 An example of how a target image works
189
Figure 5.18 An example of how the target image works in the WSP office
The ability to recognize and track arbitrary images is a significant advantage for AR applications.
This feature means developers do not have to create and distribute custom markers paired with
specific applications. Instead, an image is incorporated into the AR application, and a feature map
is analyzed and stored for matching real-world image captures. Note that the target image should
have specific characteristics for the AR recognition to work effectively.
190
A target image, when using Vuforia with a Unity project, provides a visual reference for the AR
application to recognize and track in the real world. A target image is typically a static 2D image
that is used as a trigger for displaying additional digital content in the AR environment. Since the
AR recognition captures the image through the lens, the target image should meet some requirements:
it should be high contrast, whether monochrome or color; it should have high resolution and sharp
details, at a minimum of 250 x 250 pixels, to help the tracking; symmetrical and repeated patterns
should be avoided, as they may not be tracked well; and a flat surface pattern works better than a
3D object, which can be difficult for the camera to detect. Use the provided image TargetImage.jpg
(Figure 5.19) for this example.
Figure 5.19 Sample of target image in the project
After the image file is chosen, open the Target Manager page of the Vuforia website and click the
"Add Database" button, fill in a name, and select the type as "Device". Click on the name of the
newly created database; the Add Target interface appears, with settings as follows (Figure 5.20).
Figure 5.20 Add target through Vuforia website
Set Width to 1 and name the target, then click the "Add" button. Back on the list page, there will
be a 5-star rating for the target in the database. Click the Download Dataset (All) button on the
page, select Unity Editor as the development platform, and download the resource (Figure 5.21).
Import the downloaded resource into the Unity project, then select ARCamera in the Hierarchy view
and check Load XX Database and Activate under the Database Load Behaviour script (Figure 5.22).
Drag and drop the ImageTarget prefab from the Prefabs folder into the scene, click the Type
drop-down list under the Image Target Behaviour script, set the type to Predefined, and select the
previously created database and target in the Database and ImageTarget drop-down lists respectively.
Figure 5.21 Download the resource of dataset
193
Figure 5.22 Import Unity package
To represent the generated data as 3D objects in an augmented reality setting, a target image is
placed in front of the camera for recognition. Once it is recognized, three default cubes appear on
the screen and move in conjunction with the target image. Simultaneously, dropdown menus appear on
the screen, allowing the user to select the desired location and time. These dropdowns work together
to control the data that is read from the table and to generate 3D objects based on the selected
criteria. Add the objects generated in Unity and the 3D objects representing the data as child
objects of ImageTarget. After this, they are displayed on the device as AR based on the imported
recognition markers.
194
5.4.6 Build and Run the Project
After objects have been added to the image target, the project can be built and run to see the AR
experience in action. To accomplish this, the desired platform should be selected by going to File
and select Build Settings, and then the Build and Run button should be clicked (Figure 5.23).
Figure 5.23 Build the Unity project
When the project is run on a camera-enabled device, the target image should be pointed at to see
the AR experience. The objects that were added to the image target in the Unity scene will appear
in the real world and be tracked to the target image.
195
5.5 Using All the Features
Once the target image is imported into the Unity project and the Vuforia engine, the project is easy
to use: scan the target image with the camera, and the built 3D objects and buttons appear on the
screen over the real-world background captured by the camera (Figure 5.24). To make the Unity
project work in AR, the user only needs to run the Unity project and open the target image on the
screen of any device or print it out. Make sure the image can be captured by the camera so that it
can be recognized.
Figure 5.24 Using the project in WSP office
Once recognized, three default cubes will appear on the screen and move in conjunction with the
target image. Simultaneously, a dropdown menu will appear on the screen, allowing the user to
select the desired location and time. These dropdowns work together to control the data that is
read from the table and generate 3D objects based on the selected criteria (Figure 5.25) (Figure
5.26).
196
Figure 5.25 Recognized target image in the office
Figure 5.26 Default cubes generated before the date and time are selected
In addition, three toggles, labeled temperature, humidity, and CO2, are available to control the
visibility of the data, similar to the VR project. Prior to running the Unity project, the screen will
be black (Figure 5.27), but upon starting the project, the size and color of the cube will remain
constant until a new selection is made in the dropdown menu (Figure 5.28) (Figure 5.29).
197
Figure 5.27 Screen when the project is not run
Figure 5.28 Selecting the position and time
198
Figure 5.29 Result of the data in office
By providing a user-friendly interface with intuitive controls, the system ensures that users can
easily select and view the desired data in AR form. This allows for a more immersive experience
when working with the data, enabling users to gain deeper insights and a better understanding of
the information being presented.
When the Unity project is running, the user can select their current location from the dropdown
menu and choose the desired time for observation. Upon selection, 3D objects of varying heights
and colors will appear on the screen, with the objects' color and height scaling rules similar to
those of the VR project. Additionally, specific values of the data will be displayed on the screen
(Figure 5.30).
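As a rough illustration of such a scaling rule, a reading can be mapped to a cube height and to a
colour between blue (low) and red (high); the range limits and sizes below are illustrative
assumptions, not the values used in the thesis scripts.

using UnityEngine;

// Minimal sketch of mapping a sensor reading to a cube's height and colour.
public static class ValueMapping
{
    public static void Apply(Transform cube, Renderer cubeRenderer,
                             float value, float min, float max)
    {
        float t = Mathf.InverseLerp(min, max, value);          // 0..1 within the range
        float height = Mathf.Lerp(0.2f, 2.0f, t);              // taller cube = higher reading
        cube.localScale = new Vector3(0.2f, height, 0.2f);
        cubeRenderer.material.color = Color.Lerp(Color.blue, Color.red, t);
    }
}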
199
Figure 5.30 Selecting the data series want to show
The time can be selected by clicking the dropdown menu on the screen; due to the limited measurement
period, each position only has around 10 sets of data recorded (Figure 5.31).
Figure 5.31 Changing the time
If the user needs to focus on one or two specific series of indoor environmental factors, this can
easily be done by clicking the toggles of the series that are not needed. The corresponding cubes
and data are then hidden (Figure 5.32).
200
Figure 5.32 Show the result of humidity and CO2 only
When the user moves to another location, they simply select the new location corresponding to the
spreadsheet and choose the time; a new set of data is then displayed on the screen (Figure 5.33).
The specific location numbers and positions are marked in the spreadsheet and correspond to the map.
Figure 5.33 New set of data at another position
Similarly, it is also possible to select which data to show by clicking the toggles (Figure 5.34).
201
Figure 5.34 Show the result of temperature and CO2 only
This AR visualization of data, when viewed in conjunction with the actual surrounding
environment, can assist in analyzing the reasons for the data features, providing a means for
further analysis of the indoor environment. Modifying the dropdown option to observe the data
at other times is also very convenient, as the objects will change to reflect the new data. This
enables users to observe the changes in the data and gain a better understanding of the
information being presented. Overall, the system provides a powerful tool for analyzing and
understanding indoor environmental data, with a user-friendly interface and intuitive controls.
5.6 Summary
This chapter explained how AR works in the same Unity project, building on the VR process: the Unity
project experience in AR, preparing the files, importing the CSV file into Unity, enabling AR in
Unity, and using all the features. With the steps and workflow provided, users can also experience
the data visualization of indoor environmental quality in AR format.
202
The AR project enables users to review the data for each tested seat individually while physically
present in the office at each tested position. This is done simply by using the camera to scan the
target image. Via dropdown menus, people can choose the position they wish to examine as well as a
specific date and time. The cubes are displayed on the screen with the corresponding scale and
color, and the user may select which data series they wish to analyze. Users can easily change
their choice of position and time by using the dropdown menus and moving to another position.
203
Chapter 6. CONCLUSION AND FUTURE WORK
This chapter summarizes and discusses the overall process of the project including discussion of
the workflow, Unity limitations, VR limitations, AR limitations, validation, and future work.
6.1 Discussion
Topics about the entire process are discussed, based on the background, the workflows of the VR and
AR visualization, and all of the custom scripts written.
6.1.1 Background
Several methods of data visualization can be accomplished by starting with a 3d model and then
taking sensor data in the form of csv files to combine them into a VR or AR environment.
Sensors can gather a lot of information about a building's various characteristics. Data
visualization is crucial to the process of doing scientific research. It provides an easier approach
to studying numbers and helps in improving people's knowledge of them. People can interact and
explore data in a virtual reality setting or using augmented reality to superimpose digital data
onto a real space. These technologies have a lot of potential for streamlining processes and
creating new opportunities in the building and construction sector.
Based on the analysis of previous studies, it is clear that academics have given consideration to
the study of data visualization of indoor environmental conditions in buildings. Data on the inner
environment of buildings have been analyzed using visualization models in different forms of
expression, from 2D to 3D. It was also found that not all data are suited to the same kind of image
for portrayal. The full potential of VR and AR, a hot topic in recent years, is also being developed
for application in the building and construction process. They can be integrated with BIM to provide
real-time data monitoring through sensors and to depict the data as visual images, according to
previous research. Further study of VR and AR for in-building real-time environmental data
monitoring is still possible, and digital twins could take advantage of these methods of
visualization.
6.1.2 VR Workflow
A methodology for visualizing a Revit model and CSV data in Unity was created to achieve the data
visualization (Figure 6.1). A sample room is used to illustrate and demonstrate the project's
workflow. To specify the materials, the Revit model is imported into Unity via 3ds Max. In addition,
a method for visualizing data from an Excel spreadsheet in an augmented reality and virtual reality
setting was developed. Scripts play a key role in the project's success. The data visualization
script reads the data from the CSV file and uses it as a basis for creating 3D objects of various
sizes, allowing the user to observe the environmental quality of the area in the scene. The button
script acts as the control base in this project and is used to fulfill various functions, including
opening panels, going forward and backward between steps, and selecting data. The slider control
script is used to change the selected time of day in the interface to represent different data. The
menu script is similar to the slider: it also controls the time at which data is read to represent
the results of the data visualization at that time, and it works in conjunction with the slider.
205
Figure 6.1 Workflow for data visualization
To make the VR work with users' own files, user guidance is provided once all the necessary
information is prepared in the Unity project (Figure 6.2). The model of the WSP office in Downtown
LA is used to demonstrate how it works. The data files were prepared in CSV format. The Revit model
needs to be exported as an FBX file and modified using 3ds Max to ensure it can be imported into
Unity correctly. Once this is done, the VR-enabled settings in Unity are used to set up the headset
and controllers. A capsule is used to provide the first-person perspective for users. The turning
function is also set up so that the view follows the user's movements smoothly.
206
Figure 6.2 Requirement to prepare for the project
The clickable controllers are enabled to allow users to click buttons and control the project.
With the prepared files, including the Revit model, a data spreadsheet, and the Unity project,
users can set up and experience their own models and data to visualize indoor environmental
conditions in a VR headset (Figure 6.3) (Figure 6.4). Once the project file is set up, users
receive a simple introduction explaining what the project is and how to start. Users can open the
map to check their position and the location of the sensors. The data buttons can be clicked to
choose which data to view. Users can walk around with the headset and check different data in
different rooms. People can also jump to the points with sensors on the map (Figure 6.5) (Figure
6.6).
Figure 6.3 Workflow for VR experience
207
Figure 6.4 Experiencing the project with VR headsets
Figure 6.5 Result on the screen when there is no dataset
208
Figure 6.6 Result of data visualization in VR
6.1.3 AR Workflow
There are additional steps to make AR work in a Unity project based on the previous VR process
(Figure 6.7). Vuforia is a platform specifically for AR development. It tracks 3D and flat picture
objects in real time using computer vision technologies. This image registration feature enables
developers to position and align virtual things to actual objects using 3D models and other
material while viewing them through the camera of a mobile device. It gives the virtual object
the appearance of being a component of an actual scene. Therefore, with the Vuforia engine, it is
only necessary to assign different target images to different locations, and it is possible to
observe the indoor environmental quality data at different moments of the location by selecting
different times.
209
Figure 6.7 Requirement to prepare for the project
The process of making the Unity project to be experienced in AR includes the installation of
Vuforia Engine, setting up the application of AR in Unity, and choosing and setting the image
target (Figure 6.8). With the steps and workflow provided, users can also have experience in data
visualization of the indoor environment quality in AR format. After the AR setup is completed,
the Unity project can recognize the target image through the camera, so that the 3D object
representing the size of the data can be represented on the screen in the office where it is located.
At the same time, similar to VR, the selection of the visualized data can be done by clicking on
the buttons on the screen. Thus, the user can be in the building where the data is collected and at
the same time observe the indoor environmental quality at that location.
Figure 6.8 Workflow for AR experience
210
The AR visualization of the data is experienced in the office of WSP in Downtown LA (Figure
6.9). A target image is placed in front of the camera for recognition purposes (Figure 6.10).
Figure 6.9 Use the project and scan the target image in the office
211
Figure 6.10 The AR project working in the office
Once recognized, three default cubes will appear on the screen and move in conjunction with the
target image. Simultaneously, a dropdown menu will appear on the screen, allowing the user to
select the desired location and time (Figure 6.11). These dropdowns work together to control the
data that is read from the table and generate 3D objects based on the selected criteria. Three
toggles, labeled temperature, humidity, and CO2, are available to control the visibility of the
data, similar to the VR project.
212
Figure 6.11 The result of data shown in AR at current position
When the Unity project is running, the user can select their current location from the dropdown
menu and choose the desired time for observation (Figure 6.12). Upon selection, 3D objects of
varying heights and colors will appear on the screen, with the objects' color and height scaling
rules similar to those of the VR project. Additionally, specific values of the data will be
displayed on the screen.
Figure 6.12 Temperature and CO2 result shown only in the office
213
6.1.4 Custom Scripts
The project involves several custom scripts, which play a critical role in ensuring that the project
functions as intended and provides the desired features and capabilities. The reading and
visualizing of the spreadsheet data, the visibility control and selection of data series, the guide,
and the map are all achieved by cooperation between scripts and UI elements. All the scripts are in
Appendix A.
6.2 Limitations and Improvements
In conclusion, this project successfully monitors indoor conditions to test for human comfort and
presents the data in a clear way using AR and VR techniques, allowing users to navigate through
the scene and select different locations and times to view the data.
The project has several limitations, including rendering limitations in the 3D model, UI and data
collection issues, and limitations in user experience for both VR and AR.
6.2.1 Unity
At the preparation stage of the project, the Unity part could have several limitations which can be
improved in different aspects.
- Visual Effect
In the 3D model of the WSP office, the rendering of the model is not realistic enough (Figure 6.13).
The whole model could be further detailed and rendered with more materials added. Although the model
is more realistic in the Revit file, many materials and components could not be fully displayed once
the model was converted to Unity through 3ds Max. Therefore, the user cannot relate the actual
office to the model very clearly during the experience, which affects the experience and the overall
visual effect. In order to improve the realism of the models imported from Revit to Unity, the
following measures were considered. First, the original Revit model could be optimized for Unity use
by reducing the number of polygons, removing unnecessary geometry, and ensuring that the model is
scaled appropriately; more detailed modeling and rendering in Revit and 3ds Max would also help
identify the materials more specifically. This would improve the performance of Unity projects.
Second, high-quality textures could be used to enhance the realism of the models: while some models
and materials cannot be imported directly, textures with normal maps, ambient occlusion, and
specular maps can be used to create realistic materials. Third, lighting is critical to creating a
realistic experience; Unity's lighting tools can add realistic lighting to the models to create
believable shadows and reflections. These changes would improve the model and enhance the user
experience (Figure 6.14).
215
Figure 6.13 The limitation of the material of the model
Figure 6.14 Detailed rendered model (Sebastien, 2020)
- Experience Effect
During the operation of the data visualization in Unity, it was found that the UI is not smooth
enough to click and use, and there is no feedback on the toggles, dropdown menus, and other elements
in the project, so there is no clear indication of whether the process is going smoothly, which can
cause some trouble in use. Animation is an effective way to provide click feedback in Unity when the
user experience is lacking. By creating an animation that triggers when a button is clicked, a
change in color or size can be produced that enhances the overall experience. Another approach is to
use particle effects, which offer visual feedback and thus improve the user's perception of the
click action. Sound effects are also a useful tool for providing auditory feedback when a button is
clicked. By incorporating a sound effect, users are informed that the button has been clicked,
resulting in an improved experience and overall finish for the Unity project.
- Data Collection
The time span of the collected base data is relatively short, and the sensor stayed in the same
location for only a short period, so during actual use there is no very intuitive, visible change in
the data visualization (Figure 6.15). Since this data must be collected by the sensor, organized in
advance, and then imported into the Unity file for visualization, and the data itself changes from
moment to moment due to factors such as weather and environment, the results only have a limited
role as a reference and record. If the data could be collected in real time by the sensors and
uploaded to a tabular file on a laptop, the user could experience the indoor environmental quality
at that moment in time, which would give more meaning and reference value to people.
217
Figure 6.15 Lack of datasets for each position
- Data Visualization
The data visualization method in the Unity project uses the height of columns and the number of
bubbles to show the data, which is not intuitive enough for users to understand what the data is
actually like. Although the viewer can grasp the data based on color, transparency, and so on, the
presentation is still confined to a few variations on conventional charts, making it insufficiently
straightforward and in need of a concise explanation. Given the visualization possibilities of
virtual reality and augmented reality, data on indoor air quality could be presented in a more
straightforward way. By connecting the locations where the sensors are placed, values could be
estimated for every location throughout the office, so that the indoor air quality of every position
in the building could be visualized rather than being limited to the sensor points. It would also be
an effective way to observe the impact of office interior design and layout, such as the location
of doors and windows, on indoor air quality.
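As a minimal sketch of this idea, values at an arbitrary point could be estimated by
inverse-distance weighting of the nearby sensor readings; this is only one possible interpolation
scheme and is not part of the thesis project.

using UnityEngine;

// Minimal sketch: estimate a reading at any point in the office plan by
// inverse-distance weighting the values of the surrounding sensors.
public static class SensorInterpolation
{
    public static float EstimateAt(Vector2 point, Vector2[] sensorPositions, float[] sensorValues)
    {
        float weightedSum = 0f, weightTotal = 0f;
        for (int i = 0; i < sensorPositions.Length; i++)
        {
            float d = Vector2.Distance(point, sensorPositions[i]);
            if (d < 0.001f) return sensorValues[i];            // point is on a sensor
            float w = 1f / (d * d);                            // closer sensors count more
            weightedSum += w * sensorValues[i];
            weightTotal += w;
        }
        return weightedSum / weightTotal;
    }
}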
6.2.2 VR Limitations
There are several limitations when trying the VR in a headset, including the user experience, the
way the project is presented, and the user's physical comfort.
218
- Users Experience
The user experience in a VR headset while running the project can be improved in various ways.
First, the current method of trying to locate available data to visualize through a button can be
time-consuming and frustrating; it would be much more efficient to mark the available times and
locations with a marker for easy access. Second, navigating in a VR model can be time-consuming and
can cause dizziness if not done correctly; to address this, a function to select a location and move
directly to that point could be implemented. Furthermore, while guides are available for users to
understand the purpose of the buttons on the controllers, they may not fully comprehend how to use
them. Incorporating a visual aid identifying which button on the controller corresponds to which
function would therefore enhance the overall user experience in the VR headset.
- Project Presentation
Similar to the limitations mentioned for the visual effects of the project, users cannot have a
fully immersive experience during VR use. The construction of the model still has some limitations:
in VR devices it is not fully immersive and will occasionally produce a feeling of dizziness, which
is also a problem that needs to be adjusted. The working interface of VR can be spread across the
entire 360-degree frame, so it can be set up with richer content; however, the resolution of VR
headsets is relatively low, and even clear UI elements may look pixelated (Figure 6.16). This means
that text can be hard to read and straight lines show a high degree of blending. Improvements should
be made to avoid large blocks of text and highly detailed UI elements. To make the VR project more
attractive, interactive elements such as animations, physics, and sound effects can be added to the
model to enrich the VR experience as a whole, giving people an intuitive sense of the interior
environment they are in through different forms and achieving the purpose of the project.
Figure 6.16 Sample of UI setting in VR (Ramotion, 2023)
- Physical Comfort
In the process of experiencing and using VR headsets to see the data visualized in the model, the
physical comfort of the experience also needs to be improved. During testing and adjustment, it was
found that users may experience fatigue or nausea because the environment they are watching is
moving while their body is stationary, which gives people a bad feeling and can make the experience
meaningless (Figure 6.17). In order to improve the comfort level of the project, audio and lighting
cues can be added to guide the user to use it properly and avoid discomfort. One should test the VR
project on different devices, critically evaluate the results, and fix the problems to improve
performance.
Figure 6.17 People feel dizzy while using VR headsets (Ku, 2019)
6.2.3 AR Limitations
Similar to VR, the AR part of the project also has potential limitations in how the data is
presented, as well as drawbacks specific to the AR experience: the outcome is not stable and the
direction of movement is not clear.
- Outcome Not Stable
While AR provides the project's visual effects, its outcome is not always stable. Although the
target image may be easy to track at times, there are instances where tracking is lost, causing the
generated 3D objects to disappear from the user's view (Figure 6.18). This results in unnecessary
frustration and wasted time, because the user may need to track the target image again to restore
the AR experience. Loss of tracking can be caused by various factors, such as changes in lighting,
movement of the target image, or interference from other objects. It is critical to optimize the AR
environment with these factors in mind to minimize the possibility of losing tracking. The
limitation can be reduced by implementing techniques such as lighting estimation, feature point
detection, and motion tracking to improve the stability of the AR experience. In addition, providing
feedback to users when tracking is lost can improve their experience by reducing frustration and
indicating the appropriate action to resume tracking.
Figure 6.18 Unstable data while using AR project
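As an illustration of tracking-lost feedback, the sketch below assumes Unity's AR Foundation image tracking, which may differ from the SDK used in this project; it simply shows a warning panel whenever the tracked image stops being actively tracked. The manager and hint-panel references are assumptions.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch: show a "tracking lost" hint whenever the target image is not actively tracked.
public class TrackingLostFeedback : MonoBehaviour
{
    public ARTrackedImageManager imageManager;  // AR Foundation image tracking manager
    public GameObject trackingLostHint;         // UI panel asking the user to re-aim at the target

    void OnEnable()  { imageManager.trackedImagesChanged += OnChanged; }
    void OnDisable() { imageManager.trackedImagesChanged -= OnChanged; }

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.updated)
        {
            // Limited or None means the pose is stale and the generated 3D objects may drift or vanish.
            bool lost = image.trackingState != TrackingState.Tracking;
            trackingLostHint.SetActive(lost);
        }
    }
}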
- Direction Not Clear
In contrast to VR projects, where target positions can be easily indicated, AR experiences often
require users to locate specific positions on their own. This lack of guidance can be challenging
and may result in a less than optimal experience. To improve the usability of AR, implementing a
directional indicator (such as an arrow) or an AR map to guide users to the intended location would
be an effective solution (Figure 6.19). With a directional indicator or map, users can navigate to
their destination without getting lost or confused, which enhances the overall experience and gives
the user a sense of direction. Clear directions also make the AR experience more accessible to users
with different levels of expertise.
Figure 6.19 Directions for AR to clarify for users (Nilga, 2021)
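A simple form of such guidance is an on-screen arrow that keeps pointing from the device toward the intended location. The sketch below is a generic Unity illustration under that assumption; the camera, target, and arrow references are hypothetical.

using UnityEngine;

// Minimal sketch: rotate an arrow (placed in front of the AR camera) so it points toward the target location.
public class DirectionArrow : MonoBehaviour
{
    public Transform arCamera;   // the device camera transform
    public Transform target;     // the location the user should walk to
    public Transform arrow;      // arrow model shown in front of the camera

    void Update()
    {
        Vector3 toTarget = target.position - arCamera.position;
        toTarget.y = 0f;  // guide the user in the horizontal plane only
        if (toTarget.sqrMagnitude > 0.001f)
            arrow.rotation = Quaternion.LookRotation(toTarget, Vector3.up);
    }
}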
6.3 Validation and Future Work
The main goal was to visualize sensor-collected data in both a VR and an AR environment so that it
could be presented in a more intuitive and clear way. This was partially accomplished: the data is
genuinely visualized as objects, and users can identify whether a value is higher or lower than a
reasonable range. However, the clarity of the data is still not sufficient, and it is essential to
consider the target audience and their level of familiarity with the data.
The virtual reality and augmented reality approach can help one visualize multidimensional data,
present high-density information, and provide context for a more complete understanding of the
problem. VR and AR make it easier to perceive differences between data points than ordinary charts
do, making the data more readable and intuitive to experience. For example, people who do not know
what a given humidity value means can understand the current humidity situation through the size and
transparency of the capsules in this project. Similarly, if 18 to 23 degrees Celsius is taken as the
comfortable temperature range for the human body, users can tell directly from the color whether the
current temperature is high or low, which allows them to grasp the data quickly, improves their
spatial perception, and conveys the scale of the data. In this sample project, however, the
visualization is not yet presented in a way that is easily understandable to individuals who are
unfamiliar with the meaning of the figures displayed. As a result, users may only be able to discern
whether the data is higher or lower without fully comprehending the significance of the
visualization.
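The kind of color mapping described above can be expressed in a few lines. The sketch below is a hypothetical C# example, not the project's own mapping, that turns a temperature into blue below 18 °C, green within the 18 to 23 °C comfort band, and red above it.

using UnityEngine;

// Minimal sketch: map a temperature reading to a color around an 18-23 degree C comfort band.
public static class TemperatureColor
{
    public static Color FromCelsius(float tempC, float comfortLow = 18f, float comfortHigh = 23f)
    {
        if (tempC < comfortLow)
            // Colder than comfortable: fade from green toward blue over roughly 10 degrees.
            return Color.Lerp(Color.green, Color.blue, Mathf.InverseLerp(comfortLow, comfortLow - 10f, tempC));
        if (tempC > comfortHigh)
            // Warmer than comfortable: fade from green toward red over roughly 10 degrees.
            return Color.Lerp(Color.green, Color.red, Mathf.InverseLerp(comfortHigh, comfortHigh + 10f, tempC));
        return Color.green;  // within the comfort range
    }
}

A capsule's material color could then be set with, for instance, TemperatureColor.FromCelsius(21.5f).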
VR and AR allow both temporal and spatial observation of the data, which is difficult and often very
complex to do with traditional charts. For example, users can change the time to observe
environmental conditions at different times in the same location, or simply move to another position
to analyze how different locations in the office affect environmental quality. As a result, data
reading can be made easier, and as much information as feasible can be included using VR and AR
technologies. For data visualization, VR provides detailed visualizations because it creates a 3D
environment that users can explore and interact with, which can make complex data sets easier to
understand. It also provides enhanced interactivity, allowing users to gain a deeper understanding
of the data by exploring different scenarios and seeing how variables affect it. By walking around
and examining the data from different perspectives and angles, users can develop an understanding of
the relationships between data points and how they relate to the real world. AR technology overlays
the data on the real world, making it easier to visualize data in real-world contexts. In addition,
because AR can be accessed through smartphones and tablets, the result is portable and easily
accessible without requiring expensive equipment. This method of data visualization can help people
make decisions effectively and quickly.
The VR and AR projects created here have not yet been tested by users, so it is not known whether
problems will arise in their use. In future work, participants could be invited to try the two
projects separately, and questionnaires could be distributed so that they can assess the realism of
the scenarios and whether VR and AR, as channels for data visualization, give them a data-based
understanding of indoor air conditions. Based on the results of this user experience, further
improvements can be made, such as improving comfort and smoothness of use.
In addition, building on the existing visualization of indoor air quality, evaluating the data and
giving users a more direct result would make the visualization more straightforward and effective.
In reality, most people do not have a professional understanding of air quality, and a simple
numerical value does not tell them whether the air quality is good or bad. Visualizing the data
helps to a certain extent, but it would be more intuitive and effective if the data could be used to
assess indoor air quality and give further recommendations.
Existing data were used for the visualization, but these data also came from sensors. For future
work, the sensors could be used to collect real-time data and transfer it to a CSV file; the data
could then be processed and imported into Unity for visualization and shown to users in VR and AR.
Real-time data would be more reliable than past data, because climate conditions and structural
changes inside the building may affect indoor air quality. Users could then not only see the data
visually but also feel it in real life.
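If real-time readings were appended to a CSV file as suggested, Unity could poll that file and refresh the visualization. The sketch below is a minimal illustration under that assumption; the file path, the "timestamp,sensorId,value" column layout, and the ApplyReading hook are hypothetical.

using System;
using System.IO;
using System.Collections;
using UnityEngine;

// Minimal sketch: periodically re-read a CSV of "timestamp,sensorId,value" rows
// and hand each reading to the visualization.
public class LiveCsvReader : MonoBehaviour
{
    public string csvPath = "Assets/Data/live_readings.csv";  // hypothetical path
    public float pollSeconds = 60f;

    IEnumerator Start()
    {
        while (true)
        {
            if (File.Exists(csvPath))
            {
                foreach (string line in File.ReadAllLines(csvPath))
                {
                    string[] cols = line.Split(',');
                    if (cols.Length < 3 || !float.TryParse(cols[2], out float value)) continue;
                    ApplyReading(cols[1], DateTime.Parse(cols[0]), value);
                }
            }
            yield return new WaitForSeconds(pollSeconds);
        }
    }

    // Hypothetical hook: update the object that represents this sensor.
    void ApplyReading(string sensorId, DateTime time, float value) { /* update scale, color, label */ }
}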
3D modeling can take a lot of time before any work in VR can begin; 3D scanning might be a method
that saves time. 3D scanning is a process that captures the shape and texture of a physical object
using specialized hardware and software. It can be used to create highly accurate digital models of
real-world objects, which can then be imported into Unity for use in games, simulations, and other
interactive experiences. The scanned model can be saved in a format that Unity supports and used as
the model whose environmental quality is to be visualized (Figure 6.20).
Figure 6.20 3D scanning model use in Unity (Makino, 2021)
In that way, the program would be more widely available, making it accessible to more people who are
concerned about indoor air quality. The AR part could be further developed into a mobile application
(Figure 6.21). Users would select the data type and date in the app and then see the data visualized
in the live camera scene. As the person moves around with the phone in hand, the image changes with
their location to show the data at that specific position.
Figure 6.21 Application of AR data visualization
6.4 Summary
Building information models digitally represent the physical and functional characteristics of a
building or infrastructure asset. Used in conjunction with VR and AR, they provide a more immersive
and interactive experience for data visualization. In this project, the BIM model was used to create
an immersive simulation of the building in VR, allowing users to explore and interact with the data
within the building in real time. On this basis, any building-related data can be presented through
models and virtual reality technology, giving people an immersive experience and a more intuitive
understanding of the data.
Visualization of indoor air quality data in VR and AR has the potential to allow people to better
understand invisible characteristics of spaces in real time. By leveraging VR and AR
technologies, it is possible to create immersive and interactive environments that allow users to
explore the environmental quality of a given space from various perspectives and angles.
Visualization of indoor air quality data in VR and AR can be a powerful tool for improving the
overall understanding of the air quality of indoor spaces. By enabling individuals to explore and
interact with air quality data in real-time, VR and AR can help to promote healthy indoor
environments and provide users with the information they need to make informed decisions
about improving indoor air quality.
Digital twins are virtual replicas of physical objects that can be used for simulation, monitoring,
and analysis. AR and VR can be used to visualize and interact with digital twins, providing a
more immersive and intuitive way to understand complex systems and data. Building on this project, a
link to digital twin technology would allow data such as the building's interior environment to be
combined with a virtual replica of the building, becoming part of the digital twin's data so that
the building can be explored and documented in different forms.
AR can be used to superimpose information from the digital twin onto the real world, allowing a
better understanding of how a physical object or system works in real time. At the same time,
data collected by sensors, such as the building's internal environment, can be represented in the
same form, giving users a more intuitive and in-depth understanding of building information.
VR can serve as a tool to create fully immersive digital twin simulations that allow users to
explore and interact with a virtual copy of a physical object or system. With a VR headset, users
can explore and discover the impact of multiple factors acting together on a building within its
digital twin.
Data visualization techniques in VR and AR can provide a more comprehensive understanding
of complex systems and data. The combination of a digital twin with AR and VR can provide
powerful tools for visualizing and analyzing complex systems and data, giving users a deeper
understanding of how these systems work and how they can be optimized for performance and
efficiency.
REFERENCES
Abdelalim, A., O’Brien, W., & Shi, Z. (2017). Data visualization and analysis of energy flow on
a multi-zone building scale. Automation in Construction, 84, 258–273.
https://doi.org/10.1016/J.AUTCON.2017.09.012.
AEC Magazine. (2022, October 24). KPF and SimScale develop wind analysis tool. AEC
Magazine. Retrieved February 1, 2023, from https://aecmag.com/simulation/kpf-and-simscale-
develop-wind-analysis-tool/
Anett, M. (2018). Virtual reality and its impact on the architectural industry. In 2018 4th
International Conference on Education and Training Technologies (ICETT) (pp. 77-80). IEEE.
doi: 10.1109/ICETT.2018.8464630.
AR & VR Home. (2022). What is AR? AR & VR Home. Retrieved from
https://arvrhome.com/what-is-ar/
Bajarin, T. (2021, August 12). How AI and AR Will Revolutionize Hearing Aids. Time.
Retrieved from https://time.com/6097261/hearing-aids-ai-ar/
Baker, T. A. (2007). An open-source program to animate and visualize the recorded temperature
and relative humidity data from dataloggers including the building's three-dimensional geometry
(Doctoral dissertation, University of Southern California). ProQuest Dissertations Publishing.
1447038.
BBC. (2019, September 6). Your Home Made Perfect: BBC Two’s first virtual reality property
show returns with new presenter Angela Scanlon. Retrieved from
https://www.bbc.co.uk/mediacentre/latestnews/2019/your-home-made-perfect
Borders, S. (2018). BadVR: Immersive Data. Data Visualization Catalogue. Retrieved from
https://datavizcatalogue.com/blog/suzanne-borders-badvr-immersive-data/
Brodkin, J. (2022). How Unity3D Became a Game-Development Beast. Retrieved 29 September
2022, from https://insights.dice.com/2013/06/03/how-unity3d-become-a-game-development-be
Carmigniani, J., Furht, B., Anisetti, M., Ceravolo, P., Damiani, E., & Ivkovic, M. (2011).
Augmented reality technologies, systems and applications. Multimedia Tools and Applications,
51(1), 341-377. doi: 10.1007/s11042-010-0660-6.
Cecchini, C., Magrini, A., & Gobbi, L. (2019). A 3d platform for energy data visualization of
building assets. IOP Conference Series: Earth And Environmental Science, 296(1), 012035. doi:
10.1088/1755-1315/296/1/012035
Chirico, A. (2019). Virtual reality in psychology: A review of present research and future
directions. Frontiers in Psychology, 10, 1-17. doi: 10.3389/fpsyg.2019.01473
Chu, C., Cheng, Y., Wei, C., & Lin, C. (2018). Integrating augmented reality with building
information modeling for building maintenance. Automation in Construction, 88, 165-175. doi:
10.1016/j.autcon.2017.12.008
Cipresso, P., Giglioli, I. A. C., Raya, M. A., Riva, G., & Gaggioli, A. (2018). The past, present,
and future of virtual and augmented reality research: A network and cluster analysis of the
literature. Frontiers in Psychology, 9, 2086. doi: 10.3389/fpsyg.2018.02086
Clark, M. (2021, December 10). The Metaverse: What It Is, Where to Find it, Who Will Build It,
and Fortnite's Role. PCMag. Retrieved from https://www.pcmag.com/news/the-metaverse-what-
it-is-where-to-find-it-who-will-build-it-and-fortnites
Convey, M. (2005). Immersive Weather. Broadcast Engineering, 47(7), 48-49.
Corke, T. (2019). HoloLens helps construction workers visualize the job site. Microsoft.
Retrieved from https://news.microsoft.com/features/hololens-helps-construction-workers-
visualize-the-job-site/
Cruz, R., Guimarães, T., Peixoto, H., & Santos, M. F. (2021). Architecture for Intensive Care
Data Processing and Visualization in Real-time. Procedia Computer Science, 184, 923–928.
https://doi.org/10.1016/J.PROCS.2021.03.115
Dallasega, P., Revolti, A., Sauer, P. C., Schulze, F., & Rauch, E. (2020). BIM, Augmented and
Virtual Reality empowering Lean Construction Management: a project simulation game.
Procedia Manufacturing, 45, 49–54. https://doi.org/10.1016/J.PROMFG.2020.04.059
Delgado, J. M. (2015). Augmented reality: Applications, challenges and future trends. In
Proceedings of the IEEE Global Engineering Education Conference (EDUCON) (pp. 474-478).
IEEE. doi: 10.1109/EDUCON.2015.7096067.
Dell'Oglio, S. (2022). Real-time rendering with Enscape. Retrieved from
https://www.archdaily.com/958417/real-time-rendering-with-enscape
Devetakovic, M. (2007). Designing for maintenance: Some implications. Journal of Quality in
Maintenance Engineering, 13(3), 252-261.
DiGregorio, R. (2020). BIM and real-time data in the construction industry. Retrieved from
https://www.archdaily.com/933376/bim-and-real-time-data-in-the-construction-industry
Donalek, C., Djorgovski, S. G., Cioc, A., Wang, A., Zhang, J., Lawler, E., ... & Norris, J. (2014).
Immersive and collaborative data visualization using virtual reality platforms. In 2014 IEEE
International Conference on Big Data (Big Data) (pp. 609-614). IEEE. doi:
10.1109/BigData.2014.7004282.
Doyle, M. (2018). Augmented reality and the construction industry. Constructible. Retrieved
from https://constructible.trimble.com/construction-industry/augmented-reality-and-the-
construction-industry
Emerson, A. (2020). AR software working interface. Medium. Retrieved from
https://medium.com/@austinjemerson/ar-software-working-interface-1d271e0f8d06
Eyraud, R. (2022, March 2). The metaverse: Full of hope or full of hype? Agnostic Networks.
Retrieved February 2, 2023, from https://www.agnonet.com/the-metaverse-full-of-hope-or-full-
of-hype/
Farnworth, R. (2020). A Short History of Data Visualisation. Retrieved 30 September 2022, from
https://towardsdatascience.com/a-short-history-of-data-visualisation-de2f81ed0b23
Few, S., Harrell, A., & Rogers, C. (2007). Show me the numbers: Designing tables and graphs to
enlighten. Analytics Press.
FMLink. (2007). What is facilities management? FMLink. Retrieved from
https://fmlink.com/articles/what-is-facilities-management/
Friendly, M. (2008). Milestones in the history of thematic cartography, statistical graphics, and
data visualization. Cartography and Geographic Information Science, 35(1), 5-16.
G2Score. (2022). Unreal Engine 4 Reviews. Retrieved from
https://www.g2.com/products/unreal-engine-4/reviews
Gamma AR. (n.d.). Gamma AR: Augmented Reality for Industry. Retrieved from https://gamma-
ar.com/
Georgiadis, P. (2022). How VR is changing the way media is created. Forbes. Retrieved from
https://www.forbes.com/sites/petergeorgiadis/2022/01/06/how-vr-is-changing-the-way-media-is-
created/?sh=7dd8f9f46e1e
Gillis, S. (2022). Augmented reality: A game-changer for ecommerce? Business.com. Retrieved
from https://www.business.com/articles/augmented-reality-ecommerce/
Guo, H., Yu, Y., & Skitmore, M. (2017). Visualization technology-based construction safety
management: A review. Automation in Construction, 73, 135–144.
https://doi.org/10.1016/J.AUTCON.2016.10.004
Haritonova, A. (2022). How augmented and virtual reality are reshaping the construction
industry. Inverse. Retrieved from https://www.inverse.com/innovation/how-augmented-and-
virtual-reality-are-reshaping-the-construction-industry
Harley-Davidson. (n.d.). Harley-Davidson® app. Retrieved from https://www.harley-
davidson.com/us/en/tools/mobile-apps/harley-davidson-app.html
Han, B., & Leite, F. (2022). Generic extended reality and integrated development for
visualization applications in architecture, engineering, and construction. Automation in
Construction, 140, 104329. https://doi.org/10.1016/J.AUTCON.2022.104329
Hayden, M., Novitskiy, N., & Zakharov, Y. (2018). Virtualization of Data for Augmented
Reality Applications. In 2018 IEEE 4th World Forum on Internet of Things (WF-IoT) (pp. 270-
275). IEEE. doi: 10.1109/WF-IoT.2018.8355186.
Heinrich, M. (2008). Augmented Reality in TV Broadcasting. In Mixed and Augmented Reality
(ISMAR), 2008 7th IEEE/ACM International Symposium on (pp. 157-158). IEEE. doi:
10.1109/ISMAR.2008.4637364.
Henze, R. (2022). The use of augmented and virtual reality in architecture. Retrieved from
https://www.architectural-review.com/essays/the-use-of-augmented-and-virtual-reality-in-
architecture/10050616.article
Heyliger, M. (1962). Simulator. United States Patent Office.
Hill, J. (2020). Displaying carbon footprint data in AR. Retrieved 16 September 2022, from
https://jordanelizahill.medium.com/displaying-carbon-footprint-data-in-ar-bf881e08e46
Hole, T. (2021). The Future of Architecture Is in Virtual (VR) and Augmented Reality (AR) —
Born to Engineer. Retrieved 16 September 2022, from https://www.borntoengineer.com/the-
future-of-architecture-is-in-virtual-vr-and-augmented-reality-ar
Hosanagar, K. (2016). Microsoft’s HoloLens: An AR headset with limitless potential. Forbes.
Retrieved from https://www.forbes.com/sites/kartikhosanagar/2016/04/07/microsofts-hololens-
an-ar-headset-with-limitless-potential/?sh=5f5d5d3368ba
Hosokawa, K., Hasegawa, H., & Kunii, T. L. (2016). Indoor thermal environment visualization
using AR-based mobile devices. Energy and Buildings, 130, 365-373. doi:
10.1016/j.enbuild.2016.08.057.
Hsu, T. (2016). How Pokémon Go brought augmented reality to the mainstream. The Verge.
Retrieved from https://www.theverge.com/2016/7/22/12254538/pokemon-go-augmented-reality-
mainstream-app-history
IBM. (2021). Improving occupant well-being and productivity with indoor environmental
quality. IBM. Retrieved from https://www.ibm.com/thought-leadership/institute-business-
value/report/well-being-and-productivity-in-the-workplace/
Jerald, J. (2014). The VR Book: Human-Centered Design for Virtual Reality. ACM Press.
Kimber, D. (2010). The social media revolution: Exploring the impact on industrial marketing.
Journal of Business & Industrial Marketing, 25(6), 475-480.
Ku, A. (2019, January 29). Motion sickness in VR. Medium. Retrieved April 6, 2023, from
https://medium.com/digitaldetox-co-uk/motion-sickness-in-vr-c7c38d30bbe9
Kushnir, L. (2018). Augmented Reality: Create AR apps with the Unreal SDK. Retrieved from
https://www.pluralsight.com/guides/augmented-reality-create-ar-apps-with-unreal-sdk
Lee, G. (2012). CityViewAR: Experiencing virtual city models in situ with augmented reality on
mobile devices. Journal of Computing in Civil Engineering, 27(5), 482-493. doi:
10.1061/(ASCE)CP.1943-5487.0000198
Lee, S. H. (2018). Integrating Building Information Modeling and Internet of Things: A Review
and Future Directions. Journal of Computing in Civil Engineering, 32(2), 04017049. doi:
10.1061/(ASCE)CP.1943-5487.0000704
Lehman, M. (2022). How data can make building design better for occupants. Building Design +
Construction. Retrieved from https://www.bdcnetwork.com/sponsored/how-data-can-make-
building-design-better-occupants
Li, X., Cheng, J., Goh, Y. M., & Wai, A. A. (2019). Virtual Reality-Based Interactive Game
Design Learning Platform. In Proceedings of the 2019 3rd International Conference on
Management Science and Innovative Education (MSIE 2019). Atlantis Press.
Li, X., Kuroda, A., Matsuzaki, H., & Nakajima, N. (2015). Advanced aggregate computation for
large data visualization. 2015 IEEE 5th Symposium on Large Data Analysis and Visualization
(LDAV). https://doi.org/10.1109/ldav.2015.7348086
Lin, M., Yan, C., Chen, K., & Zhang, X. (2019). Interactive cross-sectional view selection and
time-step animation in augmented reality for building information models. Journal of Computing
in Civil Engineering, 33(6), 04019029. doi: 10.1061/(ASCE)CP.1943-5487.0000896.
Lorek, L. A. (2022). The rise of building information modeling (BIM) in construction. Retrieved
from https://www.constructionexec.com/article/the-rise-of-building-information-modeling-bim-
in-construction
Lühr, S. (2020, July 14). Top 5 learnings for visualizing data in Augmented Reality (AR).
Medium. https://medium.com/@sebastianluehr/top-5-learnings-for-visualizing-data-in-
augmented-reality-7103f5caf12f
Maile, T., Rödelsperger, S., & Haberl, M. (2007). Indoor environmental visualization using
augmented reality and building information modeling. In 2007 Conference on Human System
Interactions (pp. 93-98). IEEE. doi: 10.1109/HSI.2007.4294545.
Makino, H. (2021, November 11). Tips for developing AR apps with Unity Mars by capturing
and importing real space. Medium. Retrieved April 6, 2023, from
https://makihiro.medium.com/tips-for-developing-ar-apps-with-unity-mars-by-capturing-and-
importing-real-space-f6923e0c9337
Matt, J. (2021, November 10). What is the Metaverse, and Will You Want to Live There?
Medium. Retrieved from https://jasonmatt.medium.com/what-is-the-metaverse-and-will-you-
want-to-live-there-8a06c2d97987
Matterport. (2022). Capture reality in 3D with Matterport. Retrieved from https://matterport.com/
Metaio. (2014). Augmented Reality: Touch the Virtual World. Retrieved from
https://www.metaio.com/products/augmented-reality/
Maurugeon, L. (2011). Total Immersion develops DFusion, an augmented reality system for
design. Retrieved from https://www.dexigner.com/news/24344
Metcalfe, T. (2018). What is VR? The devices and apps that turn the real world virtual. Retrieved
18 September 2022, from https://www.nbcnews.com/mach/science/what-vr-devices-apps-turn-
real-world-virtual-ncna857001
Microsoft. (2018). Microsoft HoloLens: Transforming the game with mixed reality. Retrieved
from https://www.microsoft.com/en-us/hololens/resources
Mortice, Z. (2014). Building Visualization. Architectural Record, 202(2), 78-83.
Naef, M., Rummel, J., Spinath, B., & Roebers, C. M. (2022). Virtual reality assessment: A new
tool to measure children’s cognitive development? Journal of Cognitive Education and
Psychology, 21(1), 30-48. doi: 10.1891/1945-8959.21.1.30
Natephra, W., & Motamedi, A. (2019). BIM-based live sensor data visualization using virtual
reality for monitoring indoor conditions.
NewscastStudio. (2018, June 7). AccuWeather uses AR to visualize weather in 3D. Retrieved
from https://www.newscaststudio.com/2018/06/07/accuweather-ar-weather-visualization/
Newman, D. (2016). How augmented reality will change mobile marketing forever. Forbes.
Retrieved from https://www.forbes.com/sites/danielnewman/2016/06/14/how-augmented-reality-
will-change-mobile-marketing-forever/?sh=60e2eb2c7916
Ni, T. Y., Shu, Y., & Hsiao, S. W. (2010). The interactive virtual experience of museum artifact.
In 2010 IEEE Virtual Reality Conference (VR) (pp. 299-300). IEEE.
Oetting, J. (2022, August 11). 14 best types of charts and graphs for data visualization [+ guide].
HubSpot Blog. Retrieved February 1, 2023, from https://blog.hubspot.com/marketing/types-of-
graphs-for-data-
visualization?hubs_content=blog.hubspot.com%2Fmarketing%2Fauthor%2Fjami-
oetting&hubs_content-cta=null
Onset. (2022). HOBO MX1102 CO2/Temperature/RH Data Logger. Retrieved from
https://www.onsetcomp.com/products/data-loggers/mx1102
Paine, J. (2018). The benefits of augmented reality in architecture. Retrieved from
https://medium.com/@j.paine/the-benefits-of-augmented-reality-in-architecture-4ad6fb92c703
PurpleAir. (2022). PurpleAir PA-II: The New Standard in Air Quality Monitoring. Retrieved
from https://www2.purpleair.com/products/purpleair-pa-ii
Racz, I. (2016). Virtual reality in interior design: The use of VR as a new tool for architects and
interior designers. Procedia Engineering, 161, 716-721. doi: 10.1016/j.proeng.2016.08.650.
Reflekt. (2022). Industrial Augmented Reality Applications: Use Cases and Examples. Retrieved
from https://www.re-flekt.com/industrial-augmented-reality-use-cases/
RobotLab. (2022). VR Expeditions 2.0. Retrieved from https://www.robotlab.com/vr-
expeditions-2-0/
Rose, M. (2019). 5 benefits of architectural visualization. G2. Retrieved from
https://learn.g2.com/architectural-visualization
Schiavi, A. (2022). The Benefits of Virtual Reality in Construction. Retrieved from
https://www.autodesk.com/redshift/virtual-reality-in-construction/
Sebastein, L. (2020, March 12). Creating immersive, photorealistic VR experiences with the high
definition render pipeline. Unity Blog. Retrieved April 7, 2023, from
https://blog.unity.com/technology/creating-immersive-photorealistic-vr-experiences-with-the-
high-definition-render
Sharma, S. (2022). Flipspaces: Changing the face of interior design with virtual reality.
Entrepreneur India. Retrieved from https://www.entrepreneur.com/article/374678
Sherman, W. R., & Craig, A. B. (2003). Introduction to What Is Virtual Reality? Understanding
Virtual Reality, 2–3. https://doi.org/10.1016/B978-1-55860-353-0.50018-5
Souzza, M. (2019). Augmented reality and the future of architecture. ArchDaily. Retrieved from
https://www.archdaily.com/907952/augmented-reality-and-the-future-of-architecture
Solanki, D. (2020). Augmented Reality (AR) & Virtual Reality (VR) in Business: Prospects,
Potential and Pitfalls. International Journal of Management, Technology, and Social Sciences
(IJMTS), 5(2), 152-163.
SPOT. (2020). Augmented Reality for Construction. Retrieved from https://spot.ar/augmented-
reality-construction
Sutherland, I. (1965). The Ultimate Display. Proceedings of the IFIP Congress, 506-508.
TeamViewer. (2022). Augmented Reality for Construction. Retrieved from
https://www.teamviewer.com/en/use-cases/industry/augmented-reality-for-construction/
TheGhostHowls. (2022, January 25). Gartner includes the metaverse in the hype cycle for the
first time. The Ghost Howls. Retrieved from https://skarredghost.com/2022/01/25/gartner-
includes-the-metaverse-in-the-hype-cycle-for-the-first-time/
Theoktisto, V. (2015). Virtual Reality: Opportunities and Challenges for Marketers. Athens
Journal of Business & Economics, 1(3), 271-292.
Thomas, B. H., Close, B., Donoghue, J., Squires, J., & Hertz, M. (2000). ARQuake: An
outdoor/indoor augmented reality first person application. Proceedings of the 2nd International
Symposium on Wearable Computers, 139-146. doi: 10.1109/ISWC.1998.729544
Thompson, M. (2022). How BMW and Jaguar Land Rover are using VR. VR World. Retrieved
from https://www.vrworld.com/technology/how-bmw-and-jaguar-land-rover-are-using-vr/
TMD Studio. (2017). How Virtual Reality and Augmented Reality Will Change the Future of
Interior Designing. Retrieved from https://tmdstudio.co/how-virtual-reality-and-augmented-
reality-will-change-the-future-of-interior-designing/
Ugolik, T. (2019). Osso VR Uses Virtual Reality to Train Surgeons. Medical Device and
Diagnostic Industry (MD+DI). Retrieved from https://www.mddionline.com/surgical/osso-vr-
uses-virtual-reality-train-surgeons
Vakshoor, P. (2023, January 20). Ladybug tools. Food4Rhino. Retrieved February 1, 2023, from
https://www.food4rhino.com/en/app/ladybug-tools
Virtual Reality Society. (2017). What is virtual reality? Retrieved from
https://www.vrs.org.uk/virtual-reality/what-is-virtual-reality.html
Visa Lighting. (2022). Virtual Reality. Retrieved from https://www.visalighting.com/virtual-
reality
Vrex. (2022). Radiance lighting models for efficient and realistic interior lighting design.
Retrieved from https://vrex.com/radiance-lighting-models-for-efficient-and-realistic-interior-
lighting-design/
Vu, K. (2019). How do touch controllers work in VR? Virtual Perceptions. Retrieved from
https://www.virtualperceptions.com/2019/01/28/how-do-touch-controllers-work-in-vr/
Wagner, M., Rich, S., & Dörner, R. (2019). Augmented Reality for Indoor Climate
Visualization. In Proceedings of the 2019 11th International Conference on Computer and
Automation Engineering (pp. 22-26). ACM. doi: 10.1145/3310836.3310846.
Wang, J., Hazarika, S., Li, C., & Shen, H.-W. (2019). Visualization and visual analysis of
ensemble data: A survey. IEEE Transactions on Visualization and Computer Graphics, 25(9),
2853-2872. doi: 10.1109/TVCG.2018.2853721.
Wang, X. (2009). Augmented reality in architecture and design: potentials and challenges for
application. International Journal of Architectural Computing, 7(2), 309-326.
Wang, X., Liu, Q., Liu, J., & Li, J. (2019). A review on big data visualization techniques. In
2019 IEEE International Conference on Big Data (Big Data) (pp. 1129-1136). IEEE.
William, R. (2003). From perception to consciousness: Searching with Anne Treisman. Annual
Review of Psychology, 54(1), 1-25. doi: 10.1146/annurev.psych.54.101601.145124
Yam, R. C., Guan, X., & Pun, K. F. (2001). Maintenance models for a medical equipment
management system. International Journal of Quality & Reliability Management.
Yan, C., Lin, M., & Chen, K. (2020). Bringing CFD visualization into VR with BIM tools.
Building Simulation, 13(1), 49-63. doi: 10.1007/s12273-019-0588-8.
Zgoda, J. (2020). Benefits of virtual reality technology in construction. Retrieved from
https://constructionexec.com/article/benefits-of-virtual-reality-technology-in-construction
3rockAR. (2022). Augmented Reality in Workspace. Retrieved from
https://www.3rockar.com/augmented-reality-in-workspace/
APPENDIX
8.1 Scripts of data visualization for VR
8.2 Script of data visualization for AR
8.3 Script of map button
8.4 Script of start button
8.5 Script of data visibility
8.6 Script of timestamp
Scripts of data visualization for VR
The script reads the data in the CSV file based on the date and time selected with the dropdown
menu and slider, then generates a 3D object with a corresponding scale and color at the position.
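A minimal, hypothetical sketch of this idea, assuming the CSV has already been parsed into a dictionary keyed by timestamp; the field names and the 0-100 value range are assumptions and not taken from the project's own script.

using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: pick a reading by the dropdown date and slider hour,
// then drive one capsule's scale and color from that value.
public class VrDataCapsule : MonoBehaviour
{
    public Dropdown dateDropdown;   // options like "2023-01-15"
    public Slider hourSlider;       // 0-23
    public Transform capsule;       // capsule placed at the sensor position

    // Parsed CSV data: timestamp -> reading (filled elsewhere; hypothetical).
    public Dictionary<DateTime, float> readings = new Dictionary<DateTime, float>();

    public void Refresh()
    {
        DateTime key = DateTime.Parse(dateDropdown.options[dateDropdown.value].text)
                               .AddHours(Mathf.RoundToInt(hourSlider.value));
        if (!readings.TryGetValue(key, out float value)) return;

        // Scale and color encode the magnitude of the reading.
        float t = Mathf.InverseLerp(0f, 100f, value);
        capsule.localScale = Vector3.one * Mathf.Lerp(0.2f, 1.5f, t);
        capsule.GetComponent<Renderer>().material.color = Color.Lerp(Color.blue, Color.red, t);
    }
}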
Script for data visualization for AR
The script reads the data based on the selected position and the date-time dropdown menu, searches
the spreadsheet for the matching row, and generates cubes based on the data.
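A minimal sketch of the cube-generation step, assuming the value for the selected position and time has already been looked up; the cube is parented to the tracked marker so it stays anchored. The names and scaling factors are hypothetical.

using UnityEngine;

// Minimal sketch: spawn a data cube anchored to the tracked target image,
// with height and color driven by the looked-up value.
public class ArDataCube : MonoBehaviour
{
    public Transform imageTarget;   // transform of the tracked marker
    public GameObject cubePrefab;   // simple cube prefab

    public void ShowValue(float value, float maxValue)
    {
        float t = Mathf.InverseLerp(0f, maxValue, value);
        GameObject cube = Instantiate(cubePrefab, imageTarget);          // parent to the marker
        cube.transform.localPosition = new Vector3(0f, 0.05f + 0.1f * t, 0f);
        cube.transform.localScale = new Vector3(0.05f, 0.1f + 0.2f * t, 0.05f);  // taller cube = higher value
        cube.GetComponent<Renderer>().material.color = Color.Lerp(Color.green, Color.red, t);
    }
}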
Scripts of Map button
The script uses a toggle to open a map based on a picture and to control the visibility of the map.
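A minimal sketch of such a toggle, assuming the map is a picture placed on a UI panel; the object names are hypothetical.

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: show or hide a map image when a UI toggle changes.
public class MapToggle : MonoBehaviour
{
    public Toggle mapToggle;       // checkbox in the UI
    public GameObject mapPanel;    // panel containing the map picture

    void Start()
    {
        mapPanel.SetActive(mapToggle.isOn);
        mapToggle.onValueChanged.AddListener(isOn => mapPanel.SetActive(isOn));
    }
}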
Scripts for start button
The script switches from the start scene to the main scene of the project.
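The scene switch itself can be a single call, as in this minimal sketch; the scene name is hypothetical.

using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch: load the main scene when the start button is pressed.
public class StartButton : MonoBehaviour
{
    public string mainSceneName = "MainScene";  // hypothetical scene name

    public void OnStartPressed()
    {
        SceneManager.LoadScene(mainSceneName);
    }
}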
Script of data visibility
The script controls the visibility of the cubes that represent different data series, so that the
data for individual factors can be shown or hidden.
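A minimal sketch, assuming each data series (temperature, humidity, PM2.5) is grouped under one parent object; the group names are hypothetical.

using UnityEngine;

// Minimal sketch: show or hide all cubes belonging to one data series.
public class SeriesVisibility : MonoBehaviour
{
    public GameObject temperatureGroup;
    public GameObject humidityGroup;
    public GameObject pm25Group;

    public void SetTemperatureVisible(bool visible) { temperatureGroup.SetActive(visible); }
    public void SetHumidityVisible(bool visible)    { humidityGroup.SetActive(visible); }
    public void SetPm25Visible(bool visible)        { pm25Group.SetActive(visible); }
}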
Scripts of timestamp
The script supports the data visualization scripts by combining the values from the buttons and
dropdowns into a single timestamp for further use.
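A minimal sketch of combining the selected date and time into one timestamp that the lookup code can use; the date format is an assumption.

using System;

// Minimal sketch: combine a dropdown date and a slider hour/minute into one timestamp key.
public static class TimestampBuilder
{
    // dateText like "2023-01-15", hour 0-23, minute 0-59.
    public static DateTime Build(string dateText, int hour, int minute)
    {
        DateTime date = DateTime.ParseExact(dateText, "yyyy-MM-dd", null);
        return date.AddHours(hour).AddMinutes(minute);
    }
}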
ABSTRACT
The depiction of data through typical graphics, such as infographics, charts, and even animations, is known as data visualization. These visual representations make complex data relationships and data-driven insights simple to comprehend. Virtual reality (VR) and augmented reality (AR) are two ways to enable the direct perception of data through virtual imagery. These potentially powerful and innovative multidimensional data visualization tools can also provide an easy and natural way to visualize and explore collaborative data. Such data can come from a variety of sources, and data associated with specific locations or spaces is especially appropriate to portray in VR and AR. For example, one can collect data on temperature, humidity, and particle concentration at different locations in a building and then represent the data visually in the form of numbers, graphs, charts, and other methods. One can view the data at different times by swiping to change the date and moment, and, with appropriate programming, different types of data can be toggled with a button. WSP's office in downtown Los Angeles was chosen as a case study. The indoor environmental conditions were recorded every minute. Temperature and relative humidity data were collected by a HOBO MX1102 sensor, and the particle density was collected as the concentration of PM2.5 in the office, gathered by a PA-II-SD air quality sensor from PurpleAir. The spatial model and data were processed in Unity to visualize different positions at different times. The sensors were used as props to collect real-time data so that people can walk into the room and perceive the temperature and other conditions of the building both physically and through the data. Through all of this, it is possible to monitor the indoor environmental conditions of a room over a period of time to test the comfort level of the room for humans. The data were transformed into both a VR (virtual reality) and an AR (augmented reality) environment. Both VR and AR could be beneficial techniques for people to gain a better understanding of indoor environmental quality. The project also provides a tool for people to choose the most suitable location for themselves within the office.