BUILDING INFORMATION MODELING BASED DESIGN REVIEW AND FACILITY MANAGEMENT: Virtual Reality Workflows and Augmented Reality Experiment for Healthcare Project

by ZHAN YANG

A Thesis
SCHOOL OF ARCHITECTURE
UNIVERSITY OF SOUTHERN CALIFORNIA

In Partial Fulfillment of the Requirements for the Degree
MASTER OF BUILDING SCIENCE

AUGUST 2018

COMMITTEE CHAIR:
Karen M. Kensek, LEED AP BD+C
Associate Professor of the Practice of Architecture
USC School of Architecture
kensek@usc.edu
(213) 740-2081

COMMITTEE MEMBER #2:
Margaret Moser
Assistant Professor of Practice
Interactive Media and Games Division
USC School of Cinematic Arts
mmoser@cinema.usc.edu

COMMITTEE MEMBER #3:
Eric Hanson
Associate Professor of the Practice of Cinematic Arts
USC School of Cinematic Arts
hanson@usc.edu
(213) 821-4387

ACKNOWLEDGEMENTS

Yang wishes to thank his parents, Jianfeng Jin and Yilong Yang, for supporting his education, and Christopher Nguyen from Kalloc Studios for his help in developing these ideas.

ABSTRACT

Building information modeling (BIM) is often used in the AEC (architecture/engineering/construction) industry throughout the building's lifecycle, including design and operation and maintenance (O&M). Virtual reality (VR) can be used for real-time simulation of a user's physical presence in a 3D interactive environment. Augmented reality (AR) can be used to overlay virtual information onto the real environment. Both can augment BIM capabilities. A healthcare patient room was selected for demonstration because of its future potential, as part of a complex building type, for VR based design and integrated facility management (IFM). Three VR workflows (cinematic VR, Revit + Unity, and Revit + Fuzor) were explored for immersive project presentation through VR storytelling, interactive design review, and management of O&M data. An AR test (Unity + Vuforia) was conducted to visualize related information for onsite IFM support after recognizing the object.

In the first workflow, a linear VR film was made with Revit, Maya, Mettle Skybox, HandBrake, Redshift, and After Effects to present the project through the Oculus Rift head mounted display (HMD) as a seated VR experience. In the second workflow, parameter data including the equipment's manufacturer, cost, and website address was added to a project file in Revit. The geometry was exported to Unity using Maya as a go-between, and the data was extracted from the Revit schedule to a text file. In Unity, the data was parsed based on a unique identifier and its data structure, then linked back to the equipment through C# scripting on a panel based Unity user interface. Through customized programming, the outcome was a room in VR where equipment information could appear on canvas panels in response to the user's control input in real time. The HTC Vive, another HMD, was used to provide users with a walk-through VR experience. Although Unity is widely used in the game and film industries and a free version is available, extra time and effort is spent on model modification in external software such as Maya and on model import and export. Hence, the third workflow was created, comprising Revit, a database of the equipment in Excel, Fuzor, and a customized link written in Python using the Fuzor Application Program Interface (API). Fuzor was chosen because it is a game engine based solution specifically tailored to the building industry.
Moreover, other features already existing in Fuzor are beneficial for IFM, including an embedded system of annotation, category customization, and colorization of layers to quickly differentiate various types of building information. Fuzor also offers bi-directional synchronization with the Revit model and supports a VR environment that requires extra effort to establish in Unity. This synchronization with the Revit model, without the need for additional software, makes Fuzor suitable for IFM, since live BIM updates are needed during O&M. An XML tool was developed to link the Excel sheet to the Revit and Fuzor models instead of using the inherent Revit parameters as in the second workflow. Packaged into a Python script, this tool can automate both the data extraction and the import of the data into Fuzor's parameter window. The tool has immediate application potential because the Excel file can be an IFM database in a real project, such as a maintenance or repair logging schedule.

A simple test based on Vuforia and Unity was conducted as a very preliminary exploration of AR. After several trials, including use of the object target in Vuforia, the model target was applied to recognize the shape of a laptop based on the corresponding pre-loaded 3D model and to overlay the related information on top of that laptop through a webcam connected to the computer. This application can be used on site in facility management to better support the visualization of useful information, including training videos.

All three VR workflows and the AR test were evaluated and analyzed, followed by an overall conclusion. Finally, recommendations for future work were made, including further AR workflow development based on the AR test, a toolkit to smooth the transition from Revit to Unity, and exploration of other VR and AR based workflows.

HYPOTHESIS

Several BIM based VR workflows and a simple AR test can be developed and used to facilitate design review and facility management in buildings; a patient recovery room can be used for demonstration.

RESEARCH OBJECTIVES

1. To apply BIM based VR and AR in both the design and the facility management stages of an AEC project.
2. To establish a cinematic VR workflow including Revit, Maya, and After Effects with Jaunt Player and Oculus Rift to achieve linear storytelling in VR for design presentation.
3. To establish an interactive VR workflow including Revit, Maya, and Unity with VRTK and HTC Vive to achieve real-time interaction in VR with the ability to visualize data from Revit.
4. To establish an interactive VR workflow including Revit, Excel, and Fuzor with Python and HTC Vive to achieve real-time interaction in VR with the ability to automate the integration of an external database into Fuzor.
5. To explore interactive AR including Unity and Vuforia with a webcam to achieve real-time interaction in AR with the ability to recognize a facility object and overlay its associated data on top of it.

CONTENTS

ABSTRACT
HYPOTHESIS
RESEARCH OBJECTIVES
CHAPTER 1: INTRODUCTION
  Chapter Introduction
  1.1 Building Information Modeling (BIM)
  1.2 Virtual Reality (VR) and Augmented Reality (AR)
    1.2.1 Fundamental Terminology
    1.2.2 Virtual Reality (VR)
    1.2.3 Framerate and VR Hardware
    1.2.4 Augmented Reality (AR)
    1.2.5 AR Toolkit Selection
  1.3 3D Game Engine
    1.3.1 Comparison of BIM based VR Solutions
    1.3.2 Unity as an Example of a Game Engine
    1.3.3 Fuzor, Fuzor API, and XML Database
  1.4 Integrated Facility Management (IFM)
  1.5 Design Review of Healthcare Facility
  Chapter Summary
CHAPTER 2: BACKGROUND AND LITERATURE REVIEW
  Chapter Introduction
  2.1 Topic Related Research Papers on VR
    2.1.1 Virtual Reality for Integrated Facility Management
    2.1.2 Virtual Reality for Design Review of Healthcare Facility
    2.1.3 Game Engine for Building Information Modeling
    2.1.4 Game Engine for Design Review of Healthcare Facility
  2.2 Example Model Transfer between BIM and Game Engine Platform
  2.3 Augmented Reality for Integrated Facility Management
  Chapter Summary
CHAPTER 3: METHODOLOGY
  Chapter Introduction
  3.1 Overview
  3.2 Workflow 1: Cinematic VR through Oculus Rift
    3.2.1 SEPS Revit Model Exported to Maya
    3.2.2 Maya Rendering for Stereo Panorama
    3.2.3 Jaunt Player for Viewing Stereo Panorama via Oculus Rift
  3.3 Workflow 2: Unity based Interactive VR through HTC Vive
    3.3.1 Revit-Unity based Information Visualization for Integrated Facility Management
    3.3.2 Basic Room-scale VR via Steam VR and HTC Vive
    3.3.3 VRTK Application for Interactive VR
  3.4 Workflow 3: Fuzor based Interactive VR through HTC Vive
    3.4.1 VR Solution of Fuzor
    3.4.2 Excel Database
  3.5 AR Test: Vuforia based AR through Object Target
  Chapter Summary
CHAPTER 4: RESULTS
  Chapter Introduction
  4.1 Workflow 1 Result: Cinematic VR through Oculus Rift
    4.1.1 Video Frame Preparation via Maya Scene Rendering
    4.1.2 Video Creation
    4.1.3 Format Conversion for Video File
    4.1.4 Final VR Video
  4.2 Workflow 2 Result: Unity based Interactive VR through HTC Vive
    4.2.1 Final VR Result
  4.3 Workflow 3 Result: Fuzor based Interactive VR through HTC Vive
    4.3.1 Fuzor for Design Review and Facility Management
    4.3.2 Automation of Excel Data Extraction and Export to Fuzor using REST Calls
  4.4 Test 2 of Workflow 4: Vuforia based AR through Model Target
  Chapter Summary
CHAPTER 5: DISCUSSION
  Chapter Introduction
  5.1 Discussion of the Research Process
    5.1.1 Research Process Discussion of the Cinematic VR Workflow
    5.1.2 Research Process Discussion of the Unity VR Workflow
    5.1.3 Research Process Discussion of the Fuzor VR Workflow
    5.1.4 Research Process Discussion of the Vuforia AR Test
  5.2 Application Discussion
    5.2.1 Application Discussion of the Cinematic VR Workflow
    5.2.2 Application Discussion of the Unity VR Workflow
    5.2.3 Application Discussion of the Fuzor VR Workflow
    5.2.4 Application Discussion of the Vuforia AR Workflow
  5.3 Evaluation
    5.3.1 Evaluation of the Cinematic VR Workflow
    5.3.2 Evaluation of the Unity VR Workflow
    5.3.3 Evaluation of the Fuzor VR Workflow
    5.3.4 Evaluation of the Vuforia AR Test
    5.3.5 Overall Evaluation
  Chapter Summary
CHAPTER 6: CONCLUSION AND FUTURE WORK
  Chapter Introduction
  6.1 Conclusion
  6.2 Future Work
    6.2.1 Future Work of Existing Research
    6.2.2 Goals for VR/AR in Design Review and Facility Management
  Chapter Summary
REFERENCES

CHAPTER 1: INTRODUCTION

Chapter Introduction

Five key concepts and system components described in Chapter 1 are building information modeling (BIM), virtual reality (VR) and augmented reality (AR), the 3D game engine, design review of a healthcare space via experience based design (EBD), and integrated facility management (IFM).

1.1 Building Information Modeling (BIM)

Building information modeling (BIM) is a methodology to manage the necessary building data in a digital format throughout the building's life cycle (Penttilä, 2006). It creates an integrated solution for a building project by combining architecture, structure, mechanical, electrical, plumbing, and other trades on the same platform (Dossick & Neft, 2009). Additionally, BIM can be used as a base for scheduling, cost estimating, quality control, accessibility, logistics, sustainability, maintainability, acoustics, and energy simulation (Aouad, Lee, & Wu, 2006). The value of BIM is that it can help create a more optimal workflow, assist collaboration for higher work efficiency, and help with life-cycle management of a building for a better built environment (Professional Constructor Central, 2012).
For example, BIM information can be used to assure that tasks during construction are accomplished on time and that the expected safety standard and project quality are achieved (McGraw-Hill Construction, 2008). Autodesk's Revit is a widely used BIM software program that can be used with several VR programs. Briefly, BIM can be regarded as a data and knowledge warehouse covering the building lifecycle. The main benefits of using BIM include reduced capital costs, improved accuracy in contract documentation, improved estimation during bidding and procurement, and enhanced understanding of the building, among others (Wang et al., 2013).

1.2 Virtual Reality (VR) and Augmented Reality (AR)

Virtual reality (VR) is used to immerse users in a completely digital world, while augmented reality (AR) is applied to superimpose virtual content onto the real world. VR uses a head mounted display (HMD) to immerse the viewer fully in the virtual environment while blocking observation of the real world. In contrast, an HMD used for AR can be regarded as a pair of smart glasses, like Google Glass, that overlays virtual content onto part of the view while the real world remains visible. In short, VR creates the experience from a virtual world, while AR augments the observation of the real world. Hence, in the AEC industry, VR is used more in the design phase while AR is used more in the construction and operations and maintenance (O&M) phases (Campbell, 2017), although this depends on the specific situation (Figure 1.1). Different types of VR and AR can be summarized based on the level of immersion (Figure 1.2).

Figure 1.1: Comparison Diagram of VR and AR Applied in Different Stages of an AEC Project (drawing based on Fauvel, 2017)

Figure 1.2: Summary of Different Types of VR and AR based on Level of Immersion (drawing based on Fauvel, 2017)

1.2.1 Fundamental Terminology

Both VR and AR aim to alter the observation and perception of the viewer/user. One important term is panorama, also called a 360-degree view. A normal camera captures an image within a certain range of view at a given shooting angle. After taking multiple images at different angles and stitching them together, one combined image, called a panoramic image, can be obtained that shows the whole surroundings from a single observation point. Viewing can be done either by wearing an HMD, to be immersed inside the entire view, or with a mouse on a computer screen. Another important term is stereo, the 3D perception formed in the human brain after combining two offset pictures presented separately to the viewer's left and right eyes (Eric, 2017). Panorama and stereo are key to understanding the different kinds of VR and AR. Many forms of media can be used to present and experience their combinations, including IMAX, full-dome projection, and HMDs.

These basic terms are essential to understanding the basis of VR and AR, especially VR, and their combinations generate different forms of presentation. The basic sequence of VR formation can be described as follows. The first, minimal element of this systematic formation can present a still scene in 360 degrees so that the viewer can look in any direction toward the surrounding scene (Figure 1.3).
In normal practice, the stitching process is based on six pictures: four taken in the four horizontal directions and two taken facing straight up and down, respectively (Hanson, 2017). This scene is also called a panorama, which can become a stereo panorama after introducing two cameras side by side. A real indoor application stitched from 9 raw images is also shown (NVIDIA, 2018).

Figure 1.3: Stereo Panorama Decomposition as the First Segment in VR Production Pipelines (NVIDIA, 2018)

Stereo panoramas can be placed inside a fixed-path animation to make a linear film with the added ability for the user to stop at any location and view the surroundings. Adding locations generates a 3D animation in which the view angle at any observation position along the pre-fixed path can be controlled freely. For example, along a predetermined path, the viewer can fly above or walk through the building in VR mode to experience a tour. With the addition of time, 360 videos can be made, in which a moving or still camera records a dynamic scene rather than a still one (Figure 1.4). The example shown is a 360 video where the camera's movement and the conversation between the two people are dynamic; the user can rotate the view angle through 360 degrees at any time. The outcome of the first workflow is a similar 360 video. The final VR deliverable should have a minimum resolution of 3840 × 1920, the 2:1 aspect ratio matching the equirectangular projection's 360 degrees horizontal by 180 degrees vertical coverage, for a clear, high-quality result (Hanson, 2017). Some advanced cameras, such as the Jaunt camera, are capable of making a 360 video (Jaunt, 2018), although Jaunt's stitching process, like the equipment itself, is expensive. At this stage, the VR takes the form of a linear film and cannot achieve real-time interaction.

Figure 1.4: Decomposition of 360 Video as the Second Segment in VR Production Pipelines (Sarconi, 2017)

1.2.2 Virtual Reality (VR)

Virtual reality (VR) is a virtual environment generated as a 3D digital model and accessed through multiple hardware platforms, including stereoscopic glasses. A viewer's actions in the real world are tracked and reflected in the simulated 3D environment (Brouchoud, 2016). VR offers a visualization method that helps people intuitively understand an environment in a fully immersive way (Donalek et al., 2014), with little effort and no extra cost for the client (Arch Virtual, 2014). Although VR is usually based on a single user's experience, environments have lately been created for collaboration and cooperative tasks (Peters et al., 2016). VR can be used to help users become more aware of a situation (Endsley, 1995), add interactivity that simulates the real world, and enrich media content (Klein & Militello, 2001). Immersive collaboration in VR applications can achieve a high degree of communication and collaboration among multiple players via text chat, voice communication, and interaction with shared components; for Unity developers, it takes less than five minutes to add this functionality to a VR application (Brouchoud, 2016). These VR characteristics and features match the nature of the architecture, engineering, construction (AEC) industry well. Nevertheless, cumbersome headsets and isolation from the real world are two big hurdles for VR to overcome in the future (Virtual Xperience Inc., 2017).
1.2.3 Framerate and VR Hardware

Framerate in VR, expressed in frames per second (fps), is defined as the number of times per second a rendered image is shown on the screen. If this value does not reach roughly 90 fps, the simulated environment cannot keep up with the user's real activity (head rotation and movement), and the resulting lag leads directly to motion sickness (Peter et al., 2016; Patrao, Pedro, & Menezes, n.d.); at 90 fps, the rendering budget is only about 11 ms per frame. When complicated 3D models, such as architectural models exported from BIM software like Revit, ArchiCAD, or SketchUp, or other large graphics are displayed, the hardware may be unable to sustain an adequate framerate (Peter et al., 2016). Teleporting functionality developed in VR toolkits has reduced motion sickness to some extent, although some people still feel uncomfortable. A head mounted display (HMD) is a device worn on the head with a display optic in front of one eye (for 2D presentation) or each eye (for stereo display); consequently, HMD selection is extremely significant, and the factors above must be taken into consideration. Four typical HMDs are compared (Table 1.1). It is worth noticing that, unlike the Samsung Gear VR and Google Cardboard, which are used just for visualization, the HTC Vive and Oculus Rift are devices with which VR can go beyond passive architectural visualization: the environment can be manipulated, such as by moving furniture around and sketching 3D ideas in VR (Brouchoud, 2016). Based on this table, both the Oculus Rift and the HTC Vive were selected for a better immersive and interactive experience, although the HTC Vive is big, heavy, and expensive (Virtual Xperience Inc., 2017). Since interactive VR is the key outcome to achieve later, web-based VR (WebVR) was not considered: its control method is largely limited to the mobile phone or gaze in most cases (Zhong & Kevin, 2017). For example, Forge Viewer WebVR, a typical WebVR framework, cannot even add an additional 3D object to the scene (Zhong, 2017).

Table 1.1: VR Head Mounted Display (HMD) Comparison (Brouchoud, 2016)

HTC Vive
  Pros: movement ability; deep immersion; little motion sickness; full interaction based on input devices
  Cons: long time spent in setup and teardown; many pieces of equipment and a large space needed

Oculus Rift
  Pros: easy to set up; more portable relative to the HTC Vive; comfortable experience; built-in audio headsets and microphone; full interaction based on input devices
  Cons: insufficient fps may sometimes cause motion sickness

Google Cardboard
  Pros: convenient and portable; cheap; easy to adopt
  Cons: limited to static pre-rendered views; more lag, with a higher chance of motion sickness; lack of position tracking, leading to motion sickness

Samsung Gear VR
  Pros: portable, no need for a high-end PC; more processing power than Cardboard
  Cons: lack of position tracking, leading to motion sickness; low visual fidelity; insufficient fps may sometimes cause motion sickness; requires a phone, with battery drainage and overheating

1.2.4 Augmented Reality (AR)

Augmented reality (AR) is a user's view of a real-world scene superimposed with computer-generated virtual content (Milgram & Kishino, 1994). Its hardware components include a processor, display, sensors, and input devices (Aripaka, 2016).
Devices include the Sony SmartEyeglass, Recon Jet, Epson Moverio BT-300, Vuzix M300, and Google Glass (Friedman, 2016). AR can be categorized into marker based AR and markerless AR. Marker based AR achieves accurate recognition by matching against a preloaded reference; markerless AR instead relies on algorithms that understand the environment from a limited set of extracted features, with comparatively more computational effort. Marker based AR uses markers to detect the precise pose at which virtual objects can be placed. In Vuforia, a commonly used AR platform, the marker target can be a model, image, cylinder, or object (Vuforia, 2018). The most common example of this type of AR in the AEC industry is shown (Figure 1.5): the AR content is a BIM model whose superposition location is determined by an AR marker, a printed CAD sheet of the corresponding floor plan.

Figure 1.5: 3D Experience of Building Plans via AR (Busta, 2015)

The other type is markerless AR, which can either be location based directly (Paula, 2017) or rely on extracted features to locate the position (Chi, Kang, & Wang, 2013). Examples are the Pokemon Go game, based on GPS, and the common ability of Apple ARKit apps to place virtual objects on top of horizontal planes. The main AR applications in the AEC industry are shown (Table 1.2).

Table 1.2: Main AR Applications in the AEC Industry

Architects
  Application: overlay BIM onto the real-world context for design visualization and collaboration.
  Example: overlay a building model at 1:1 scale on the site to carry out site analysis for the building.

Contractors
  Application: overlay BIM over an "as-built" project site to support layout, inspections, and quality control.
  Example: overlay an MEP layout to guide the assembly of building service systems on site.

Owners
  Application: overlay the "as-built" BIM on the real facility to reveal hidden conditions in O&M.
  Example: overlay the temperature distribution diagram on the energy system to diagnose the building.

Technicians
  Application: overlay training material onto equipment to guide repair and maintenance.
  Example: overlay a video tutorial onto a condensing boiler to guide cleaning of solid deposits.

Manufacturers
  Application: overlay products onto the real environment for visualization, selection, and procurement.
  Example: overlay furniture models to support purchase decisions, like IKEA's AR app.

AR offers a complementary solution to VR; the real and virtual worlds are blended together in AR, although the immersion level is greatly reduced compared to VR (Aripaka, 2016). By distributing information, AR can also increase the efficiency of onsite work by reducing dependency on a centralized location (Zhong, 2017). Markerless SLAM (simultaneous localization and mapping) based AR is the most advanced and accurate AR type, making it hard for the user to distinguish between virtual and real objects, provided the rendering is done well enough. Many powerful tools are available for AR development, of which Apple's ARKit is a typical example. It can understand the scene, with abilities including detecting planes and estimating the lighting so that correct light is applied to virtual objects. Moreover, visual inertial odometry (VIO) allows accurate tracking of the surrounding world without additional calibration (Apple, 2017).
The power of ARKit can also be combined with other advanced frameworks such as Metal, Vision, SpriteKit, and CoreML to achieve more in an application (Tillage, 2017).

1.2.5 AR Toolkit Selection

Due to the high cost and bulk of the Microsoft HoloLens (Porter & Heppelmann, 2017), an HMD based AR solution was not considered. Instead, a mobile phone based AR solution was selected so that no additional hardware is needed. There are various tools for building AR mobile apps. Vuforia, Apple ARKit, and Google ARCore are the three tools used by CO Architects (N. Miller, personal communication, December 14, 2017); they are also the only three listed in the book Augmented Reality for Developers (Billinghurst, 2017). These tools are based on the SLAM (markerless) technique mentioned before. The final selection also depends on the application requirements. For instance, Google ARCore offers a web-based AR solution that is platform neutral, although it is currently experimental (Prismetric, 2017). Object recognition would be important for IFM. ARKit alone lacks that feature, as it can only detect flat surfaces; this is confirmed by an application in which object recognition was achieved only by integrating ARKit with CoreML, Apple's machine learning (ML) system (Odom, 2017). Most Vuforia apps place content on specific objects using Vuforia's inherent AI (artificial intelligence) capabilities, including object recognition, while most ARKit apps place content relative to the environment, such as a plane, as mentioned before (Unite Austin, 2017). Consequently, Vuforia was selected, also in consideration of Vuforia Fusion, which can draw on the capabilities of ARCore and ARKit in addition to its own to provide the best available AR experience automatically (Vuforia, 2017).

Two Vuforia functionalities can be used. The first is the object target: after scanning an AR target that meets specific requirements, the scan file is used as the recognition base. The requirements are that the object be relatively small (able to fit on top of a normal table), be a rigid body without moving parts, and not be shiny, pliable, or transparent. The scanning should be executed using a specialized scanner app (currently supported only on Android phones) under even, moderate lighting without shadows cast by other surrounding objects. The second is the model target, the latest functionality under development. Because a CAD file in common formats such as OBJ or FBX contains higher-accuracy geometry information, this method uses it to recognize the target based on its shape; since the model and the real object match each other, no scanning is needed (Vuforia, 2018). For instance, in a Vuforia sample project, an STL file provided on the NASA website for 3D printing supplies the corresponding model for that CAD file (NASA, 2017).
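In Unity, both target types report their tracking state to scripts attached to the target's GameObject. As a minimal sketch, patterned after the DefaultTrackableEventHandler in Vuforia's Unity samples (the infoPanel object is an assumed addition, not part of the samples), related information can be shown only while the target is recognized:

using UnityEngine;
using Vuforia;

// Shows an information panel only while the Vuforia target (object or model
// target) is being detected or tracked; hides it when tracking is lost.
public class EquipmentInfoHandler : MonoBehaviour, ITrackableEventHandler
{
    public GameObject infoPanel;          // assumed canvas with manufacturer, cost, etc.
    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        infoPanel.SetActive(found);
    }
}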
1.3 3D Game Engine

A game engine is a computer game development application that includes elements such as powerful 3D rendering, physics, collision, a graphical user interface, artificial intelligence, sound, and event management (Eberly, 2007; Fritsch & Kada, 2004). A game engine enables the user to manipulate the virtual environment with real-time individual control, with more advanced software allowing multi-user collaboration. For example, in the AEC industry, after the design is fully experienced, the behavior of components within the facility can be simulated and understood, extending the power of BIM, which originally contains only static information such as geometric and attribute data. Behavior information can be illustrated by door animation: door elements in the facility can be animated to slide or swing open based on the door type and hinge direction.

1.3.1 Comparison of BIM based VR Solutions

Common software options for Revit based VR solutions are summarized in this section (Table 1.3).

Table 1.3: Software Comparison of BIM based VR Solutions (The Revit Kid, 2017)

Autodesk 360 Rendering: easy to make 360 panoramas; no walk-through ability
V-Ray: easy to make 360 panoramas; no walk-through ability
Lumion: easy to make 360 panoramas and navigate; no Vive support
IrisVR: easy to make VR and navigate, able to mark up the model; low rendering quality
Enscape3D: easy to make VR and navigate, able to mark up the model; high rendering quality
Unity: high rendering quality, able to customize the interaction; difficult to use
Fuzor: high rendering quality, able to customize the interaction; easy to use

1.3.2 Unity as an Example of a Game Engine

Unity is a commonly used 3D game engine; 90% of Oculus games and VR experiences are made with Unity (Gaudiosi, 2016). Its asset store, containing a library of high quality assets (pre-made objects, materials, animation, special effects), reduces development time and cost (Gaudiosi, 2016). It also provides visual consistency for projects, which is difficult to accomplish if assets are obtained from various sources (Brouchoud, 2016), and assets can be reused among multiple projects. Unity models can be embedded quickly and simply in a standalone application or a web browser, facilitating design collaboration because work can be shared without handling account registration and download problems (Brouchoud, 2010). Unity is also platform neutral, running on Android, iOS, and Windows Phone mobile devices, which encourages wide adoption of AR and VR in game-like environments (Donalek et al., 2014). A comparison of the two most common game engines (Table 1.4) supports selecting Unity because it is easier to use than Unreal. Unity3D currently supports the scripting languages C# and UnityScript (Handel, Gümüş, Papoutsis, & Amann, 2016).

Table 1.4: Game Engine Comparison (Peter et al., 2016)

Scripting language -- Unreal: C++, Blueprint; Unity: C#, JavaScript
Source available -- Unreal: open source; Unity: negotiable
Content store -- Unreal: high quality, limited offering; Unity: various qualities
Documentation -- Unreal: good; Unity: excellent
User base community -- Unreal: medium, increasing rapidly; Unity: huge
Ease of use -- Unreal: moderate; Unity: easy

The Unity user interface is shown (Figure 1.6). In addition to the built-in Asset Store window, there are six windows in total (project, console, inspector, game, scene, and hierarchy) constituting the main part of the user interface.

Figure 1.6: Unity User Interface
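Since Unity scripting here is done in C#, a minimal sketch of the kind of data handling used later in Workflow 2 is shown below: it parses a Revit schedule exported as tab-delimited text and indexes each row by the element's unique identifier. The file name, its location, and the column layout are illustrative assumptions, not the thesis's exact export format:

using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Loads the exported schedule and maps each Revit unique ID to its row of
// fields (e.g., manufacturer, cost, website), so a selected piece of
// equipment can look up the data to show on its canvas panel.
public class EquipmentDatabase : MonoBehaviour
{
    public readonly Dictionary<string, string[]> records =
        new Dictionary<string, string[]>();

    void Awake()
    {
        string path = Path.Combine(Application.streamingAssetsPath, "equipment.txt");
        string[] lines = File.ReadAllLines(path);
        for (int i = 1; i < lines.Length; i++)    // skip the header row
        {
            string[] fields = lines[i].Split('\t');
            if (fields.Length > 1)
                records[fields[0]] = fields;      // key: Revit unique ID
        }
    }
}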
1.3.3 Fuzor, Fuzor API, and XML Database

For the AEC industry specifically, Fuzor offers VR functionality and bi-directional synchronization with Revit, providing a Revit add-on that converts the Revit model into a Fuzor model directly. Unlike the Enscape3D add-in for Revit, this VR scene is not just for visualization: basic real-time VR interactions, including free movement, addition or deletion of Revit families, and visualization of Revit family parameter information, can already be achieved in Fuzor. Moreover, Fuzor can perform basic simulations for an AEC project, including 4D construction visualization, financial analysis, rain ingression evaluation, and CCTV simulation. Changes made in the Revit model are automatically propagated to the Fuzor model and vice versa; apart from the model itself, there is no need to export and import parameter data between Revit and Fuzor (Kalloc Studios, 2018). Several basic and important building simulations can be done in Fuzor, although these can also be done in Unity (S. Case, personal communication, December 4, 2017). It is worth noting that many important AEC programs, including Revit, SketchUp, and Rhino, work smoothly with Fuzor and that Fuzor is suitable for IFM, although Fuzor is not free while a limited version of Unity is (Kalloc Studios, 2018). Both Fuzor and Unity are worth studying as two different options.

XML is a markup language that can be used to describe the information of a Fuzor model. The Fuzor API is the protocol through which websites and applications can launch, query, and send commands to Fuzor; it can be regarded as an expanded user interface for Fuzor (Kalloc Studios, 2018). A REST (Representational State Transfer) API is an API that uses HTTP requests to get, put, post, and delete data (TechTarget, 2018); the Fuzor REST API is one part of the Fuzor API.
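Workflow 3 packages its Excel-to-Fuzor link as a Python script; purely to illustrate the REST pattern, and staying in C# for consistency with the other sketches here, a call pushing one parameter value to a running Fuzor instance might look like the following. The route, port, and XML payload are invented placeholders, not the actual Fuzor REST routes:

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sends one parameter value from the Excel-derived database to Fuzor over
// HTTP. Endpoint and payload shape are hypothetical; the real ones are
// defined by the Fuzor REST API documentation.
class FuzorRestClient
{
    private static readonly HttpClient http = new HttpClient();

    public static async Task PushParameterAsync(string elementId, string name, string value)
    {
        string url = $"http://localhost:8080/api/elements/{elementId}/parameters";
        var payload = new StringContent(
            $"<Parameter name=\"{name}\">{value}</Parameter>",
            Encoding.UTF8, "application/xml");
        HttpResponseMessage response = await http.PostAsync(url, payload);
        response.EnsureSuccessStatusCode();   // fail loudly if the call is rejected
    }
}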
1.4 Integrated Facility Management (IFM)

According to the International Facility Management Association (IFMA), facility management is a profession that encompasses multiple disciplines to ensure the functionality of the built environment by integrating people, place, process, and technology (Professional Constructor Central, 2012). Integrated facility management (IFM) is essential in the lifespan of a building project because operations and maintenance (O&M) in IFM occupies the longest period: more than 85% of the total cost can be attributed to maintaining the normal working condition of a building (Teicholz, 2004). When as-built drawings are not available, data re-entry may be necessary for facilities managers; fieldwork efficiency can be improved by up to 20% by offering suitable information support (Lee & Akin, 2009). Around 50% of the total task time is accounted for by activities related to documentation, including finding the appropriate paperwork and then reading it (Teicholz, 2004). Both processes are time consuming because building operating data is often not digitally associated with the building model; static information, including annotated drawings, manuals, and photos, has normally been stored as paper documents (Hou & Wang, 2011), which can be hard to comprehend in time, particularly in emergency situations (Akcamete, Liu, Akinci, & Garrett, 2011). A BIM based VR and AR solution can identify a facility component automatically and visualize information specific to the target equipment across three groups: geometric, operation, and maintenance information (Figure 1.7). Lack of communication is also an important contributor to FM inefficiency, which multiuser shared VR and AR environments can help address. A National Institute of Standards and Technology (NIST) report estimated the cost of inadequate information transfer throughout the entire building lifecycle at $15.8 billion per year, with two-thirds occurring in FM (GCR, 2004).

Figure 1.7: High-level Categories of Equipment-specific O&M Information (Lee & Akin, 2011)

Maintenance can be categorized into reactive maintenance, conducted to fix rather than prevent an issue; preventative maintenance, performed regularly to reduce the chance of failure; and predictive maintenance, an intelligent version of preventative maintenance whose timing is based on accumulated data (Selezan & Mao, 2016).

1.5 Design Review of Healthcare Facility

Healthcare facilities are specialized spaces with critical considerations for patient safety and comfort (Figure 1.8). The design needs feedback from clients, architects, engineers, facility managers, healthcare staff, and even future patients. This is useful at the beginning stages of design and later as the design progresses and gains detail, but it can be difficult to accomplish because the stakeholders mentioned have different levels of skill in interpreting architectural drawings. Yet design review remains important (Fu & East, 1999; Shiratuddin, 2009). Normally, it is performed in four steps. First, the plans and specifications are reviewed. Second, potential design errors found from experience and available sources are recorded. Third, the corresponding review comments are generated and shared with both the designers and the other reviewers. Fourth, the necessary redesign incorporating the changes is conducted.

Figure 1.8: Example of Modern Operating Room Design (Dunston, Arns, & McGlothlin, 2007)

Healthcare quality, including staff fatigue and effectiveness in delivering care, patient safety, and stress and recovery, is strongly related to the physical environment (Ulrich, Quan, Zimring, Joseph, & Choudhary, 2004). Creating a better physical environment can be enhanced through experience based design (EBD) in the initial and detailed design reviews, providing better solutions for visualization, information management, and collaboration (Ulrich, Quan, Zimring, Joseph, & Choudhary, 2004).

EBD improves the quality of feedback in design reviews. It can help reveal architectural errors, find incompatibilities between an equipment location and other physical constraints, and establish better design solutions in the early project stage. Done digitally (for example, in a VR environment), it might reduce costs versus building a full-size mockup of a room, which is standard practice for expensive and complex rooms. In EBD, other tasks can also benefit from use of the 3D model: assessing the mobility of equipment and the activity limitations of occupants, configuring patient beds in different arrangements, and inspecting indoor dimensions. Through simulation, one can also evaluate architectural characteristics for infection control and compare different light sources (Kumar, Hedrick, Wiacek, & Messner, 2011; Dunston et al., 2007).

Chapter Summary

Key concepts including BIM, VR, AR, the 3D game engine, IFM, and design review of healthcare facilities were introduced in this chapter, forming the foundation for determining the research objectives and constructing the research content.
The important software and hardware were selected, forming the basis for establishing and comparing the workflows and experiments in the following chapters. These parts are components of the integrated system proposed later as an innovative solution applied especially to healthcare projects.

CHAPTER 2: BACKGROUND AND LITERATURE REVIEW

Chapter Introduction

The literature review examines the key elements of the first chapter in greater depth and reveals their potential connections and integration. The four topic related research papers in the first section guide the construction of the VR based system later in the research; the system's elements were introduced in the previous chapter. The second section shows potential problems in converting a Revit model to a Unity model and the use of Autodesk 3DS Max as go-between software to solve them, offering direct guidance for establishing the later workflows by developing a bridge from the BIM model to the game engine model as a core part of the entire system. The final section covers AR application in IFM based on the standard operating procedure (SOP) in the IFM department of Jones Lang LaSalle (JLL), a real estate firm.

2.1 Topic Related Research Papers on VR

The topic related research papers on VR introduced in the following parts show how the system elements above can be integrated, reveal the advantages and disadvantages of such a system, and provide guidance on research methodology and on limitations to be solved later. BIM enabled FM was utilized to enhance communication with a more intuitive presentation, combining the model and data together after supplementing the original BIM database with FM related information; the addition of a communication mechanism via the game engine solved the original shortcoming of the BIM based VR solution (Shi et al., 2016). VR was demonstrated as an interactive tool for spatial design evaluation of a patient room; that paper also indicated the effort required to shrink VR development time, including developing a library of common objects (Dunston et al., 2007). BIM is part of this solution, although material conversion is an accompanying problem whose advanced solutions, such as developing a customized tool, are not covered in that paper. Simulation and training purposes could both be satisfied by a special framework integrating BIM and a game engine (Wu & Kaushik, 2015), partially indicating why IFM implementation can be improved through the system in this thesis. Training and design review for healthcare facilities could benefit from a game engine based VR environment (Kumar et al., 2011).

2.1.1 Virtual Reality for Integrated Facility Management

Researchers explored the shortcomings of a BIM based VR solution and how to improve it by introducing an advanced communication mechanism, including gesture, text, and voice, so that real time interactions among remote stakeholders were enabled in the same VR environment (Shi, Du, Lavy, & Zhao, 2016). The game engine is the key to this improvement: Unity 3D was used to generate the interactive environment, Photon Unity Networking (PUN), a free tool with a flexible API, was applied to implement the cross-platform multiuser function, and a commercial script called Autodesk Material Convert (AMC) was used to complete the material conversion in the workflow.
The system was decomposed into two fundamental elements: a BIM based game engine and a multiuser network that can be regarded as a plugin (Figure 2.1).

Figure 2.1: Framework of the Multiuser Shared Immersive VR for FM Communication (Shi et al., 2016)

2.1.2 Virtual Reality for Design Review of Healthcare Facility

A VR patient room model was developed with a deep level of interactivity to support evaluation of the designed space (Dunston et al., 2007). Users could wander throughout the room to check spatial, acoustic, and visual conditions, functionality, and aesthetic effect. Mobile equipment and furniture in the room could be grabbed and rearranged within the space; for instance, the user opened one of the dresser drawers near the bed in that VR environment (Figure 2.2). Additionally, the overall environment could also be changed, including the lighting levels, the outdoor environment viewed from the window, and the room size. Three research goals were proposed: first, to enable users to move the walls until they are satisfied with the plan layout; second, to develop a library of common objects for rapidly populating VR mock-ups of hospital units; and third, to build the competence to quickly generate compelling VR mock-ups for hospital projects (Dunston et al., 2007).

Figure 2.2: User Opening Drawer for Clearance Check (Dunston et al., 2007)

2.1.3 Game Engine for Building Information Modeling

A special framework integrating BIM and a game engine was utilized to enhance design communication and review for meaningful feedback on spatial problems, design logic, and overall satisfaction. Similar applications in the AEC industry based on the same framework are widely acknowledged, including safety education and training in construction and evacuation simulation in case of emergency (Wu & Kaushik, 2015).

2.1.4 Game Engine for Design Review of Healthcare Facility

3D game engines were combined with the EBD method for healthcare facilities to establish a systematic solution for design review based on specific scenarios in an interactive VR environment (Kumar et al., 2011). Training in the healthcare industry could benefit from this system; for example, a nurse could be trained with the model to build familiarity with operating complicated medical equipment. The graphical user interface (GUI) of the designed prototype shows how users interact with the system (Figure 2.3).

Figure 2.3: Graphical User Interface of the EVPS Prototype Application (Kumar et al., 2011)

2.2 Example Model Transfer between BIM and Game Engine Platform

Game engines, including Unity, lack editing capacities inside the game editor and have no actual modeling features other than a few simple shapes (M. Moser, personal communication, March 14, 2018). Third-party 3D applications can help remedy this problem. A 3D model used in Unity3D is often imported from Revit, SketchUp, or other BIM applications and used as a basis for creating the VR application, followed by introducing materials and lighting. Finally, the VR application is published as a file that can be accessed and experienced by anyone with the appropriate VR hardware (Brouchoud, 2016).
There are many workflows between CAD and BIM software and Unity (Figure 2.4).

Figure 2.4: Geometry Exchange between CAD and BIM Software and Unity (Boeykens, 2014)

The direct Revit-to-Unity workflow needs to be revised because material color and texture are lost from the FBX model exported by Revit: materials in Revit can only be read by Autodesk software (Kumar et al., 2011). Unless scripts are used to create the necessary UV coordinates, the basic elements of a two-dimensional texture coordinate system (Autodesk Knowledge Network, 2014), this is not easy to fix inside Unity. Manually assigning materials with textures is likewise not applicable because the geometry imported into Unity lacks the necessary UV coordinates (Turbakiewicz, 2016). The material information is absent, with no individual assignment of materials to meshes (Figure 2.5).

Figure 2.5: Example of the FBX Structure in Unity (Boeykens, 2015)

Hence, an additional step is introduced into the workflow: the model is first brought into 3DS Max for rendering and then re-exported as an FBX file, resulting in a much better model containing element hierarchy information from which material assignments and UV coordinates can be acquired (Philipp, 2012). In this updated workflow, if the model is changed in 3DS Max, for example by adding extra details or context, Unity will automatically pick up the updates after most raw file types are simply saved in 3DS Max's format to the project's assets folder (Brouchoud, 2010). This workflow can be applied later for BIM and game engine based VR integration (Figure 2.6).

Figure 2.6: Process Flow of Revit-Unity Integration (Handel et al., 2016)

It is important to have a hierarchy of objects in the model so that scripts can be attached to several objects with similar attributes at the same time, for better model management and an accelerated development process (Kumar et al., 2011). After structuring this data, coding techniques and programming strategies need to be implemented that distribute template scripts to the corresponding family of objects. For instance, objects belonging to a door family can have a swing animation script and a trigger script attached to enable them to open in close proximity to a reviewer's First-Person Controller (FPC), as in the sketch below.
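A minimal sketch of such a template script, assuming the FPC is tagged "Player", the door object pivots at its hinge, and its collider is set as a trigger (the angle and speed values are arbitrary choices, and a real version would read the hinge direction from the family data):

using UnityEngine;

// Template script for a door family: swings the door open when the
// reviewer's FPC enters the trigger volume, and closes it again on exit.
[RequireComponent(typeof(Collider))]   // the collider must have "Is Trigger" enabled
public class DoorSwing : MonoBehaviour
{
    public float openAngle = 90f;   // degrees
    public float speed = 120f;      // degrees per second
    private float target;
    private float current;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) target = openAngle;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) target = 0f;
    }

    void Update()
    {
        float next = Mathf.MoveTowards(current, target, speed * Time.deltaTime);
        transform.Rotate(Vector3.up, next - current);   // rotate about the hinge axis
        current = next;
    }
}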
For instance, indoor navigation can use AR to quickly guide the assigned technician to the destination spot on site to handle a work order efficiently. Both this ease of navigation and the information visualization mentioned before are the two typical AR applications for IFM (Royal BAM Group, 2018). Since O&M tasks are normally performed in environments where technologies like GPS or Wi-Fi are not always available, other types of tracking mechanisms should be selected for the AR systems. The Apple ARKit offers this tracking ability, and its app "World Brush" uses this latest technology (Figure 2.7). This free app can be used for part of IFM because of its markup functionality, which can also be seen in the Fuzor application later. The markup is spatially fixed regardless of the viewer's perspective, so the paintings are effectively planted into and merged with the surrounding real world. A painting can be shared with other members of the IFM team on site (Apple, 2017). Consequently, if users find issues such as broken equipment, this tool can be used to take a note with markup and send it to the IFM team to report the issue for resolution. The painted markup is vital for technicians to accurately locate the place and component, so they do not need to spend extra time and effort searching for the target during O&M. Similar applications can be used directly in the building industry (Figure 2.8).

Figure 2.7: World Brush App (Pathak, 2017)

Figure 2.8: D-Measures App (Milideas De Decoracion, 2015)

Information visualization based on object recognition is the outcome to be achieved for the AR part of the research, which is why the object target and model target methods of Vuforia were selected in the previous chapter. Both methods will be presented later. This AR research is a simple test due to time limitations, compared to the three complete VR workflows. In the Vuforia AR configuration, ARCamera and ObjectTarget are the essential components (Vuforia, 2018).

Chapter Summary

Through describing previous research work, the connections among the key terms above are clearly illustrated, including the relationships between VR and IFM, VR and healthcare experience-based design (EBD), BIM and game engines, healthcare EBD and game engines, and AR and IFM. The real applications of these combinations are described. Potential software problems and their solutions are further specified to guide the research later, based on the basic software selection made earlier. For example, the model conversion problem between BIM and game engine platforms and its solution are illustrated with the specific example of the Revit - 3ds Max - Unity workflow. Additionally, information visualization is identified as a connection between the VR and AR workflows, and the corresponding functionality of the AR software (Vuforia) is therefore studied.

CHAPTER 3: METHODOLOGY

Chapter Introduction

This chapter presents the methodology used for the research, expressed through several workflow diagrams (except for the AR test). Sub-sections of this chapter correspond to the key parts of each workflow. They also contain the initial results, which were significant in developing the research methodology; full results are discussed in Chapter 4. The three VR workflows are cinematic VR through the Oculus Rift, Unity based interactive VR through the HTC Vive, and Fuzor based interactive VR through the HTC Vive. The one AR test is Vuforia based interactive AR.
3.1 Overview

The first three workflows serve the same project objective: using BIM based VR to facilitate IFM and potential design configuration modifications based on EBD during the design stage of the building. For the VR workflows, the final result is viewed with a head mounted display (HMD); functionality is more critical than aesthetics such as lighting and materials. The final test is BIM based AR for information visualization based on object detection, aimed particularly at the IFM stage. The first three workflows are VR work, while the final test is AR work. Workflow 1 creates a linear film to present the project. Game engines were applied in workflows 2 and 3 to achieve real-time control through a room-scale VR experience; the game engines are Unity and Fuzor respectively, for comparison. It should be noted that the hardware usage is not exclusive; for instance, the Oculus Rift could also be used for the Unity and Fuzor workflows. The initial AR result is a test rather than a workflow, so no workflow diagram corresponds to it.

3.2 Workflow 1: Cinematic VR through Oculus Rift

The first workflow was about making a VR film without a game engine. The Oculus Rift was used. Autodesk Maya was the main platform for integrating the video ingredients, starting with basic models exported from Revit and ending with a cinematic VR file (Figure 3.1).

Figure 3.1: Workflow 1 of Cinematic VR through Oculus Rift with Partial Steps for Initial Results (© Autodesk, Redshift, Oculus, Jaunt, Adobe, METTLE, HandBrake)

The information panels were made in Adobe Photoshop (PS). Redshift within Maya was responsible for rendering frames, after which Adobe After Effects (AE) produced a linear video. AE was used to combine frames into a video with another round of rendering, and also to create effects for better transitions between frames. The frame output from Maya could be rendered in either mono or stereo mode, the stereo mode being especially time-consuming to render. A single frame was rendered first, and the test procedure was conducted via Jaunt Player (shown at the top right corner of the workflow diagram). Mettle, an AE plugin, was used to play the final VR video after connecting to the Oculus Rift; it was also used to preview the video from AE before the final AE rendering. Compared to Mettle, Jaunt Player is a standalone VR player; however, Jaunt Player can also be used to play and present the final VR video, either through the Oculus Rift or on the computer screen. It should also be noted that the final video produced from AE can be published to other platforms such as mobile phones and websites. In the initial result, a typical Revit model was found and processed for export into Maya. A single stereo panorama without materials and lighting was generated and viewed through the Oculus Rift via Jaunt Player. As another part of the initial result, the necessary ingredients for the later video making were prepared, such as the pre-made information panel from PS.

3.2.1 SEPS Revit Model Exported to Maya

This part corresponds to the following section of the workflow diagram (Figure 3.2).

Figure 3.2: SEPS Revit Model Exported to Maya in Workflow 1

A medical patient room from the SEPS website was selected as the final demonstration project.
Although its Revit model can be downloaded (Figure 3.3), its associated data, such as manufacturer information, is not available because the Veterans Administration and the Department of Defense, who use SEPS for their projects, are not allowed to have any manufacturer information in their building models until procurement. Hence, the generic models from MIL-STD-1691 were used instead (Military Standard 1691, 2017), and the data in the Excel database was gathered from the internet to simulate the real IFM application scenario, in which that Excel file would be replaced by real IFM related data; such data is not available in a normal Revit model but resides in an external database, normally in Excel format (Figure 3.4). Sample model information of Name, Manufacturer, URL, and Cost for three pieces of equipment was used for the test scenario. In a real application, IFM related data normally refers to maintenance schedules, replacement lists, lists of equipment to be regularly inspected, warranties, O&M manuals, training videos, change orders, specifications including model and serial numbers, service contact information, photos, etc. Data retrieval, change management, and tracking costs and work activity are critical for IFM (Selezan & Mao, 2016). In this step, some information panels were made in Photoshop, which are presented in the results later. The Revit model was exported into Maya.

Figure 3.3: The Revit Model of a Patient Room based on the EPS OOHR Template (Military Standard 1691, 2017)

Figure 3.4: Excel Database of the Three Pieces of Equipment within the Final Research Model

3.2.2 Maya Rendering for Stereo Panorama

This part is about the following section of the workflow diagram (Figure 3.5).

Figure 3.5: Maya Rendering for Stereo Panorama in Workflow 1

Autodesk Maya 2017 was selected as the go-between software (Figure 3.6). Several pan cameras were added to the model for key frames, and the green camera positioned in the center of the model was used to make the first stereo panorama for testing. The first stereo panorama from the center camera view was rendered and exported to a JPG file with a length-to-width ratio of 2:1, in which the same two images serve the two eyes separately to create a stereoscopic environment (Figure 3.7). Parts of the image show the distortion characteristic of 360 pictures; the only part without distortion is the middle. The JPG file can be regarded as the 2D format of a 360 image after unfolding it from a sphere to a 2D plane. Special players like Jaunt Player play this 2D format back as a spherical image with a complete surrounding view. This can be regarded as the standard currency for cinematic VR, professionally called the equirectangular or latlong format. Other 2D panorama formats also exist, depending on the unfolding method; for example, the fisheye format is another typical one (Hanson, 2017).

Figure 3.6: The User Interface of Autodesk Maya 2017, including the FBX Model in Wireframe with Cameras

Figure 3.7: Maya Rendered Stereo Panorama from the Central View inside the Project Room Model

3.2.3 Jaunt Player for Viewing Stereo Panorama via Oculus Rift

This part is about the following section of the workflow diagram (Figure 3.8).
Figure 3.8: Jaunt Player for Viewing Stereo Panorama via Oculus Rift in Workflow 1

Jaunt Player is software for 2D and 3D display of 360º videos and side-by-side stereo display; it was used with the Oculus Rift for a quick VR experience (Jaunt, 2012). Based on the previous comparison between the Oculus and the Vive, the Oculus is more portable and faster to set up. At this stage, only a linear video was to be created rather than a truly interactive experience, so the Oculus Rift HMD was selected for a seated VR experience, while the HTC Vive HMD would later be selected for the walk-through, real-time VR part. Although the Oculus Rift has since begun to support room-scale VR as well, its maximum allowed dimension is smaller than that of the HTC Vive, so the HTC Vive was selected for a better room-scale experience (Lang, 2016). Hence, the two VR experiences within the two workflows could be comprehensively evaluated at the end. Note that a linear video here means the video is produced in a predetermined sequence, so viewers cannot change the frame order while playing it. The main window of Jaunt Player displayed the project model, still without materials, on the computer screen, and the same content could be viewed from the Oculus Rift for a more immersive experience and a more stereoscopic sensation if the stereo mode rather than the mono mode was selected (Figure 3.9). The presentation is a 360 video of a still scene at a fixed position; the observed perspective can be rotated per input from either the computer's mouse or the Oculus Rift headset, meaning that, to better observe the surrounding space from any predetermined point of view, the user can either pause the video and rotate the view or rotate the view while the video is playing.

Figure 3.9: The User Interface and Main Window of Jaunt Player for 360º Videos Display of the Project Model

3.3 Workflow 2: Unity based Interactive VR through HTC Vive

The second workflow is the Unity based workflow for creating experience based VR through the HTC Vive with interactive information visualization from VR input (Figure 3.10). There are two key objectives. The first is to generate a VR scene achieving real-time control of a patient room, involving not only simple collision, door opening, and object movement but also complicated interactions like parametric control of drawer opening. The core aim of this first objective is to simulate a real room so that the user can perform real manipulation; this is critical for design review of the space in VR. Building on the first objective, interactive data visualization in the VR environment for IFM information support was explored further; this second objective is discussed in Chapter 4. Although materials are lost going directly from Revit to Unity, they are well preserved after exporting the Maya model into Unity, given that the materials are manually added within Maya due to the inevitable material loss from Revit to Maya. In the initial result, the interactive information visualization is achieved by a mouse click; the control method is then converted to VR input. Moreover, the visualization is modified to accommodate the HMD.
Figure 3.10: Workflow 2 of Unity based VR through HTC Vive with Partial Steps for Initial Results (© Autodesk, Unity, HTC, Vive, VRTK)

3.3.1 Revit-Unity based Information Visualization for Integrated Facility Management

This part achieves the interactive information visualization that forms the foundation of information management, the core of IFM. The initial result shows the Revit to Unity part, without full development of the VR environment (Figure 3.11). Although it is only an initial result, it demonstrates Unity's potential for IFM to some extent.

Figure 3.11: Revit-Unity based Information Visualization for Integrated Facility Management in Workflow 2

3.3.1.1 Basic Information Transfer from Revit to Unity

Revit is a typical BIM application; what differentiates it from other 3D modeling software like SketchUp is that Revit is the database for a project, containing not only the geometry but also information about the objects. Although it has already been demonstrated that Revit geometry can be exported to Unity in the FBX file format, the other information was not transferred. For healthcare facility management, family parameters within the Revit file were used for a basic demonstration; the transferred information would later be visualized in the VR environment through Unity. The first step of the process was to use Revit to create a schedule based on the parameters, a straightforward procedure given Revit's fundamental capacity. For simplicity, a furniture item with a basic box shape was created with three family parameters: Manufacturer, Cost, and URL (Figure 3.12). The schedule was customized for these family parameters, which in this case were type parameters of the furniture family. To enable the information transfer, this schedule first had to be assembled into a database accessible from outside software like Unity. Various file formats could be used; in this case, a txt text file format was selected with a certain data structure pattern: a fixed data organization order, the semicolon as field delimiter, and the quotation mark as text qualifier. The file was generated by exporting the schedule from the Revit file, followed by copying and pasting the file into the Unity Assets folder.

Figure 3.12: Autodesk Revit Interface showing the Sample Box Model with its Type Properties and Schedules

Next, Unity was used to visualize this information. Since the main task of this initial experiment step was to access the data, the default Main Camera in Unity was selected as the temporary place to hold and display the information; hence, a script was written and attached to that object, the Main Camera in this case (Figure 3.13). The script can be decomposed into four parts. The first part, lines 1 to 3, deploys particular scripting classes that are already inside Unity and ready to be used for special functionality. The second part, lines 5 to 7, establishes the necessary variables. Then, the four lines of code from lines 10 to 13 perform initialization, defining the relationships between these variables and assigning contents to certain variables; this third part is the key to accessing the data in Unity. Once that is done, the final part of the code displays the data on the graphical user interface (GUI), which is updated frame by frame.
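Since the script itself appears only as a screenshot (Figure 3.13), a minimal sketch of the same idea is reproduced below. It assumes the exported schedule was saved as a file named schedule.txt in the Assets folder; the file name, field contents, and on-screen layout are illustrative assumptions rather than the exact thesis code.

using System.IO;
using UnityEngine;

// Minimal loadTxt-style sketch: read a schedule exported from Revit as a
// semicolon-delimited text file and draw it on the GUI every frame.
public class LoadTxtSketch : MonoBehaviour
{
    public string fileName = "schedule.txt"; // hypothetical file in Assets
    private string scheduleText = "";

    void Start()
    {
        // Application.dataPath points at the Assets folder in the editor.
        string path = Path.Combine(Application.dataPath, fileName);
        if (File.Exists(path))
        {
            scheduleText = File.ReadAllText(path);
        }
    }

    void OnGUI()
    {
        // OnGUI is redrawn frame by frame, as described above.
        GUI.Label(new Rect(10, 10, 600, 200), scheduleText);
    }
}

Attached to the Main Camera, this reproduces the behavior described above: the schedule text appears at a fixed position on the game view as soon as play mode starts.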
The camera contained within the FBX model imported from Revit also had to be deleted, because it conflicted with the existing Main Camera and only one camera was needed for presentation.

Figure 3.13: loadTxt.cs Script in C#

The result was the data from Revit shown in Unity (Figure 3.14). The facility information was displayed at the top left of the game view screen. This visualization was initialized once, meaning the information stayed static while the game was running: there was no trigger event to control when it appeared, no correspondence between particular objects and their information, and nothing could be done to manipulate the box. The next step was to solve those issues.

Figure 3.14: The Game View of Initial Testing in Loading External Text File into Unity

3.3.1.2 Interactive Information Visualization in Unity from Multiple Revit Files

Given that the main BIM information, both model geometry and object data, could be accessed and transferred from Revit to Unity, further effort went into achieving interactive visualization in VR, enabling the Unity user to see the FM information right above any piece of the facility in space once the user clicked it with the mouse. Compared to the previous basic case, the information was associated with its own model and visualized on the user's request rather than statically displayed on top of the whole screen. A collider was added to the object to be clicked before triggering an event, which is the foundation of object interaction in Unity.

To simulate a real situation where one of the Revit models was missing, one model was built directly in Unity, which also simplified the exporting process. This step was based on the previous one, so most of the code was reused (Figure 3.15). Additional parts of the script are shown in marks of different colors. Just like Start, Update, and OnGUI, OnMouseDown (shown with the purple mark) is another MonoBehaviour message serving as an input for the event system. Hence, the additional part of the code transfers the data accessed in the previous step to a specific UI element, a canvas component acting as the display screen for this case, and displays that UI element in a certain way only when the user clicks the mouse in Unity. That certain way is to assign the position of the clicked object to the canvas, achieved by the final line of the script marked in yellow; in that line of code, the two gameObjects refer to different variables. Note that the public variables of type Text and Canvas can only be used if UnityEngine.UI is enabled, shown with red marks. The additional parts shown in green marks delete the initial display of the panel to ensure the information panel is shown only on a mouse click; a minimal sketch of this click-to-show mechanism appears below.

Figure 3.15: ShowTextOnClick.cs and loadTxt.cs Scripts in C#

After developing the code, there were two issues to be solved, so two parallel experiments were conducted before combining them into the final solution. The two problems were that the canvas (the object where the data from Revit would be displayed) could not be precisely attached to the imported FBX box model, while it accurately matched the cylinder model built directly in Unity, and that the text inside the panel was not displayed well at first within the canvas.
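The click-to-show mechanism referenced above can be sketched as follows. This is a simplified stand-in for ShowTextOnClick.cs: the field names and placeholder data are illustrative, it assumes a world-space canvas, and it requires a Collider on the object so that OnMouseDown fires.

using UnityEngine;
using UnityEngine.UI;

// Sketch of click-triggered information display: clicking the object moves
// the canvas to the object's position, fills it with the equipment data,
// and makes it visible.
public class ShowTextOnClickSketch : MonoBehaviour
{
    public Canvas infoCanvas;   // world-space canvas/panel/text hierarchy
    public Text infoText;       // Text component inside the panel
    public string equipmentData = "Manufacturer; Cost; URL"; // placeholder

    void Start()
    {
        // Hide the panel until the user asks for it (the green-marked part).
        infoCanvas.gameObject.SetActive(false);
    }

    void OnMouseDown()
    {
        // Fires only if this GameObject carries a Collider.
        infoText.text = equipmentData;
        infoCanvas.transform.position = transform.position; // yellow-marked idea
        infoCanvas.gameObject.SetActive(true);
    }
}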
Both problems were solved, as shown in the red and blue blocks respectively (Figure 3.16).

Figure 3.16: The Two Parallel Experiments of the Canvas System in Unity

The solution to the first issue can be seen from the improved canvas display (Figure 3.17). After multiple attempts, it was found that the first problem could be solved by extracting the geometry child component from the FBX parent object into a new GameObject. The solution to the canvas display issue, however, was much more complicated, too much so to present in full here. As shown in that figure, the canvas system is a hierarchy of three layers with their own components, which in this case are canvas, panel, and text; consequently, they are tightly related to each other. The solution was based on the Unity User Manual (Unity, 2017); two parts of that article were used, namely the method of fitting to the size of text and the method of fitting to the size of a UI element with child text. After fixing these two issues together, the desired canvas display was achieved in Unity (Figure 3.17).

Figure 3.17: The Result of the Canvas System in Unity

The figure also shows the inspector window of the panel, in which two built-in scripts, Horizontal Layout Group and Content Size Fitter, were applied for canvas display adjustment. In contrast, only the Content Size Fitter was used for the Text component of the canvas system, which is not shown in the figure. The relative size of the panel with respect to the content zone could be adjusted by changing public variables such as Left and Right inside the Horizontal Layout Group script, shown in pink marks. The other pink marks on the left of the screen show the canvas system and the relative positions among its three layers, which are controlled by the Rect Transform shown in the previous figure. Note that the sizes of the two canvases were the same, although they appear different because of their different positions relative to the main camera in the scene.

3.3.1.3 Interactive Information Visualization in Unity from a Single Revit File

In a real situation, the facilities in a single room or in multiple rooms might be contained in multiple Revit files or within a single Revit file. More complicated models, the two pieces of medical equipment in a single Revit file downloaded from the SEPS2BIM website (SEPS2BIM, 2017), were tried next. Basic materials were imported from the Unity Asset Store and attached to these models. This time the locations of the canvases automatically matched their corresponding models, because the two models came from the same FBX hierarchy. Data manipulation was the only additional part to be solved on top of the case already developed. Since the whole schedule contained facility information for more than one piece of equipment, and the display for a selected piece should extract only the related information, the solution was to manipulate the data after parsing the file. The file is shown in Notepad (Figure 3.18). Three parameters other than Name were the same as before, while the Name parameter, a default in the downloaded Revit model, was added this time. The reason was that the Name parameter was the ideal identifier for retrieving only the information corresponding to the selected object.
However, it was found that the names of the imported model pieces were not identical to the Name parameter (which is also the name in the Revit model); rather, the Revit name of each facility is repeated inside the name of the Unity model within the FBX group (Figure 3.19). Although this made the solution less straightforward, the method was still apparent, because the first five digits could be used as the unique marker instead of the whole name.

Figure 3.18: The Exported Specialty Equipment Schedule

Figure 3.19: Names of FBX Imported Model Pieces in the Unity Hierarchy Window

Compared to the previous scripts, more C# features were applied, shown in the top area of the whole script (Figure 3.20). For example, the StreamReader class used later in the data processing part of the code comes from the System.IO namespace. The only changed part concerns data manipulation, shown within the brace mark. More variables were needed to support the more complex logic. The identifier mentioned before was derived from the original object name by keeping only its first five characters, stored in the variable objectName. After opening the StreamReader, a while loop ensured a thorough line-by-line reading of the file; within it, two parallel if branches collected the fixed first (header) line and any line containing the string objectName. The lines were combined by concatenation with "\n" separators, keeping them on separate lines rather than joined into one. Script lines starting with print had no effect on the result, but printed the output after each step to the console window to verify its valid execution, a testing process vital to the successful implementation of the whole script. As usual, the script was attached as a component to the game object on which it executes.

Figure 3.20: The Script of Chapter 3.3.1.3

In the result, the white background comes from the addition of a base panel to simulate the ground (Figure 3.21). The positions of the camera and light were modified for better presentation, shown in the left scene window. The basic material and the imported schedule were added to the Assets folder of the project, shown in the top two blue circle marks on the right. The Aspect Ratio Fitter script was introduced to further adjust the canvas panel for better information presentation.

Figure 3.21: The Result of Interactive Information Visualization in Unity based on a Single Revit File

3.3.2 Basic Room-scale VR via Steam VR and HTC Vive

Basic Steam VR and the HTC Vive were used to develop an initial VR scene allowing users to walk through and look around the room model without materials. In this first test, no VR toolkit had been used, so no interaction could be performed (Figure 3.22). However, this was a significant step: it tested the HTC Vive as the VR HMD and established the key functionality of Steam VR that would later be accessed through an SDK (Software Development Kit).

Figure 3.22: Basic Room-scale VR via Steam VR and HTC Vive in Workflow 2

3.3.2.1 HTC Vive and Steam VR

The HTC Vive setup was the final step before the start of the VR experience, although a VR simulator could later be applied without setting up the HTC Vive; the final work, clearly, would be VR presented through the HTC Vive. The setup guidance was provided by USC IT Services (Kaurloto, 2017), based on the HTC Vive product instructions.
The first step is to test the working condition of the equipment (Figure 3.23).

Figure 3.23: Steam VR - HTC Vive Room Setup Drawing based on Lily (2016)

Steam VR is the software necessary to support the HTC Vive, which connects to a PC; the HTC Vive is a component of the Steam VR platform. The headset's movement is tracked by a pair of lighthouses, laser-emitting base stations placed at the corners of a room with a maximum separation of 5 m (16 ft). The tracking mechanism is essentially that sensors arranged around the surfaces of both the controllers and the headset receive the lasers emitted from the two lighthouses. On the controllers, the two main components are the haptic thumb pads for feedback control and a trigger on the back as the primary input button (PCGamer, 2015).

3.3.2.2 Roam in Room-Scale VR

Steam VR has a room-scale VR ability that allows the user to look and walk around the 3D model. The experiment workplace is shown with the necessary equipment (Figure 3.24).

Figure 3.24: HTC Vive Workstation

The Unity setup is shown in this screenshot, with the Camera (Eye)'s coordinates controlled by the position of the headset (Figure 3.25). The Unity display window could also be viewed from the HMD. This initial result allowed the user to walk around the room while looking around freely; however, no real obstacles existed yet and no interaction had been scripted. Although the inside of the patient room is hard to make out in this figure due to the lack of materials, it was clear enough in VR to give a full sense of the space's dimensions and internal layout. It is worth noting that movement at this stage came from the tracked HMD only, so input from the two controllers could not yet control movement.

Figure 3.25: The Unity Interface shown with the Model when the HMD is connected and Steam VR is running

3.3.3 VRTK Application for Interactive VR

To further bring in interaction and VR control, the VRTK (Virtual Reality Toolkit) was tried in the second part of the initial test (Figure 3.26). For design review of an architectural space with furnishings, it is important that objects behave in some ways like they would in real life; for example, a drawer should open the correct amount to show whether it conflicts with other furniture. To achieve realistic drag interaction within certain movement limitations that simulate different real situations, a cabinet with a drawer was made.

Figure 3.26: VRTK Application for Interactive VR in Workflow 2

3.3.3.1 Parametric Control for Drawer Simulation

The model carried several parameters, but it still could not satisfy the requirements of experience-based design (EBD), because one of the most important conditions for achieving EBD is that the parameters follow the design logic so that the real situation can be simulated. Obviously, the drawer should be prevented from being pulled all the way out and dropped to the ground, so the distance over which the drawer can be moved should be limited. The limitation could be a percentage of the drawer depth, with the actual value depending on various factors; within that distance, the drawer can be moved freely in a fixed direction.

3.3.3.2 Virtual Reality Toolkit (VRTK)

VRTK (Virtual Reality Toolkit) is a free tool used to facilitate VR game development through Unity.
The VR simulator of VRTK is important for VR development: it gives the player a way to play the game under development with mouse and keyboard, without setting up the HMD and VR space, saving the effort and time spent on VR equipment setup. The most significant elements of VRTK are described in the following parts for a better understanding of the gaming mechanisms behind it, from which further development was possible (VRTK, 2017).

The controller is the input from the player. A Rigidbody should be applied to the controller to enable physical collision. Players should be able to receive timely feedback when a touch is effective, before a further action such as pulling the trigger is performed; Rumble on Touch is used to control the shake strength and duration of the real controller. The VRTK_InteractableObject script for interactable objects requires a Rigidbody and a Collider. In the grab interaction, the options include whether an object is grabbable, droppable, precision snapped, etc. In grab mechanics there are three types: Fixed Joint, with simple snap, rotation snap, and precision snap, normally used for grabbing objects; Spring Joint, used for manipulating objects with a certain level of flexibility; and Tracked Object, used for tracking an object's movement to the controller's movement. For instance, Tracked Object works well for door manipulation due to its 1:1 control. In use interactions, Pointer Activates Use controls whether the pointer emitted from the controllers can initiate actions on interactable objects, after which Hold Button to Use determines the next actions when the controller button is pressed. A part of the scripts, written in C#, reveals how highlighting by toggling is achieved (Figure 3.27). In addition to VRTK_InteractableObject, the VRTK_InteractTouch script is also important in this tool. It too has multiple components; for example, Global Touch Highlight Color is responsible for highlighting the interactable objects touched, and its color can be adjusted. VRTK_InteractGrab is another script, which simulates throwing a grabbed object away after the controller button is released (VRTK, 2017).

Figure 3.27: Toggle Highlight Scripts of VRTK in C# Language

There is a boundary in VR that constrains the movement of players. The VR boundary indicates the real boundary by highlighting virtual walls in the VR scene; these virtual walls also block users from moving further. However, the activity range in VR can be larger than the real boundary, because the user can always travel using the touchpad or teleport within the VR scene.

3.3.3.3 Initial Test of Unity-VRTK

VRTK has built-in functions that facilitate VR development, so it is essential to utilize and modify the suitable scripts for customized development. For the simple case of the cabinet with a drawer mentioned before, there are multiple ways to control the action of pulling the drawer out of the cabinet. To achieve the most realistic simulation, a Configurable Joint was selected. The two most important parts responsible for accomplishing this were the limited and locked motions and the linear limits controlling the maximum amount or percentage of the distance the drawer could be pulled out (Figure 3.28); a minimal sketch of this joint setup appears below. Apart from the Configurable Joint and VRTK_InteractableObject shown in the figure, the Transform, Box Collider, Rigidbody, VRTK_SpringJointGrabAttach, and VRTK_SwapControllerGrabAction components were necessary to achieve the desired objective.
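The joint setup referenced above can also be expressed in a short script instead of through the Inspector. This is a minimal sketch assuming a drawer 0.2 units deep that slides along its local X axis; the component names are standard Unity physics APIs, while the specific values mirror the 50% design condition described next.

using UnityEngine;

// Minimal sketch: constrain a drawer with a ConfigurableJoint so it can
// slide only along X, up to a linear limit (0.1 units = 50% of depth).
public class DrawerJointSetup : MonoBehaviour
{
    public Rigidbody cabinetBody;  // Rigidbody on the cabinet holding the drawer
    public float pullLimit = 0.1f; // 50% of a 0.2-unit drawer depth

    void Start()
    {
        ConfigurableJoint joint = gameObject.AddComponent<ConfigurableJoint>();
        joint.connectedBody = cabinetBody;

        // Allow limited travel on X only; lock every other degree of freedom.
        joint.xMotion = ConfigurableJointMotion.Limited;
        joint.yMotion = ConfigurableJointMotion.Locked;
        joint.zMotion = ConfigurableJointMotion.Locked;
        joint.angularXMotion = ConfigurableJointMotion.Locked;
        joint.angularYMotion = ConfigurableJointMotion.Locked;
        joint.angularZMotion = ConfigurableJointMotion.Locked;

        // The linear limit caps how far the drawer can be pulled out.
        SoftJointLimit limit = joint.linearLimit;
        limit.limit = pullLimit;
        joint.linearLimit = limit;
    }
}

Changing pullLimit to 0.2 reproduces the 100% design condition, which is exactly the parametric control the drawer test was meant to demonstrate.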
However, it should be emphasized that the result was not controlled by Revit parameters but was done in VRTK; it could also have been done directly in Unity, without using VRTK.

Figure 3.28: Specific Child Scripts used inside the VRTK_InteractableObject Parent Script

In total, four results were displayed in both the Scene and Game windows simultaneously, corresponding to two design conditions, whose linear movement limitations were 50% and 100% of the maximum drawer depth respectively, and, within each design condition, the two extreme situations of 100% open and closed. Linear limits of 0.2 and 0.1 units corresponded to the 100% and 50% movement limitations respectively (Figure 3.29).

Figure 3.29: Preliminary Result of a Designed Cabinet with Single Drawer through Unity for VR Usage

Among the four aligned results, there are three parts of the user interface: the Scene, Game, and Inspector windows (left to right). The Scene window is where the game is designed, while the Game window is where the game is played, tested, and shown. In this case, the Scene window was also used to interact with the game object, although it is feasible to manipulate the drawer entirely in the Game window. It was found that controlling in the Scene window rather than the Game window was more convenient, because in the Game window it was hard to locate the handle in three dimensions, although it was much easier with the HTC Vive controller. This operation is not the norm but was done for the presentation, because changes made in the Game window will not affect the real game shown in the Scene window, which can be regarded as an interface of the game editor. In the middle Game window, navigation and interaction modes can be switched between; the green and red dots in the Game window indicate that the screenshot is in the interaction mode, as these two dots represent and simulate the left and right HTC Vive controllers.

The Inspector window is used to add functionality to specific game objects. Tools and scripts, including VRTK, can be added and modified in this part. For example, if a customized script is needed to tailor the functionality of a game object, Add Component at the bottom can be clicked to add a new script, although the new script cannot be edited directly in Unity; MonoDevelop is launched for further editing. MonoDevelop is the open source integrated development environment (IDE) supplied with Unity (Unity, 2017), a cross-platform IDE for multiple programming languages including C#, the language used here (MonoDevelop, 2017). The user interface of MonoDevelop on the Windows operating system is shown (Figure 3.30); the code presented is the VRTK_SpringJointGrabAttach.cs mentioned before. This IDE has valuable features including advanced text editing and an integrated debugger, beneficial to code writing. The code also reveals another significant ability of the Inspector window in Unity: three float variables, BreakForce, strength, and damper, were public and are accessed through the Inspector window (Figure 3.30). The strength determines how quickly the drawer can be pulled, in a positive relationship; the damper removes the oscillation effect, meaning a low value can be used to achieve loose pulling and vice versa.
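The mechanism at work here is simply that Unity exposes public fields in the Inspector. A minimal sketch (the default values below are placeholders, not the thesis settings):

using UnityEngine;

// Public fields appear in the Inspector and can be tuned live in play mode,
// mirroring the three tunable values in VRTK_SpringJointGrabAttach.
public class SpringTuningSketch : MonoBehaviour
{
    public float breakForce = 1500f; // force at which the spring joint breaks
    public float strength = 500f;    // higher values pull the drawer more quickly
    public float damper = 50f;       // low values give loose, oscillating pulls
}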
Public values can be directly overridden to test changes while the game is playing, so there is no need to interrupt play and switch to MonoDevelop for modification. Public variables can also be accessed through scripts, which can likewise be adjusted in the Unity Inspector window. Because of these variable settings, real situations could be simulated and validated in the VR simulator. The drawer sliding felt considerably natural rather than rigid. When a player tried to pull further than the movement limitation, the drawer started to get stuck while vibrating; if the player kept trying, a limit stage was reached, followed by the breaking of the junction between the cabinet and the drawer.

Figure 3.30: MonoDevelop User Interface with VRTK_SpringJointGrabAttach and Corresponding Unity Inspector

3.4 Workflow 3: Fuzor based Interactive VR through HTC Vive

The third workflow developed was a Fuzor based interactive VR with bidirectional synchronization with Revit (Figure 3.31). The effort of constructing a VR platform in Unity on top of the BIM database can be saved by Fuzor, a game engine customized for AEC usage. From the Revit add-in version of Fuzor, the Revit model can be opened in the standalone version of Fuzor and kept synchronized with the Fuzor model; hence, time and effort are saved in importing, exporting, and modifying the model. This synchronization with the Revit model, without the need for additional software, makes Fuzor suitable for IFM, since live BIM updates are needed during O&M. It takes more time and effort to establish a fully interactive VR environment through the Unity workflow; in comparison, this can be achieved automatically through a single click in Fuzor. Although Fuzor can also offer an AR solution with the HoloLens, this was not covered, as the headset is expensive.

There are two parts to the Fuzor workflow (Figure 3.31). The first was to demonstrate the inherent power of Fuzor in VR applications for IFM through the same project demonstration, based on the Revit model. One of the most important capacities to be proven is Revit parametric information visualization in Fuzor VR, because this matches the corresponding part of the result to be accomplished in the Unity workflow. The second part of the workflow included Revit, a database of the equipment in Excel, Fuzor, and a customized link written against the Fuzor API with a customized Python script. An XML tool was developed to link the Excel sheet to the Revit and Fuzor models instead of using the inherent Revit parameters as in the first part. This tool has real application potential, because the Excel file can be an IFM database in a real project, such as a maintenance schedule or repair log.

Figure 3.31: Workflow 3 of Fuzor based VR through HTC Vive with Partial Steps for Initial Results (© Autodesk, Fuzor, HTC, Vive, Python, Anaconda, PyCharm, Microsoft)

3.4.1 VR Solution of Fuzor

Figure 3.32: VR Solution of Fuzor in Workflow 3

This part of the workflow relied on Fuzor's ability to work directly with Revit (Figure 3.32). After installing Fuzor, an additional Fuzor logo is added to the default Revit user interface (Figure 3.33). With the model open in Revit, it can be opened in Fuzor from the Launch Fuzor button in Revit.

Figure 3.33: Fuzor Plugin in Revit User Interface

The Revit model was opened in Fuzor, and the default functionalities of Fuzor could be seen (Figure 3.34).
Like the previous initial results, there were no materials on the objects yet, although Fuzor will bring most of the materials from Revit into Fuzor.

Figure 3.34: Fuzor User Interface with Separate Functionality Windows Added

The same operations could be done in both the normal Fuzor user interface and the Fuzor VR environment. When the VR mode was initiated, the windowed user interface of Fuzor showed the same scene as the VR view. The following screenshots were captured directly in VR rather than in the desktop interface. There are many render modes that can be viewed in VR. One of them is the Light Visualization Map render mode, which is beneficial for basic lighting analysis. From daytime to nighttime, the different lighting conditions of the indoor environment could be clearly compared through this render mode (Figure 3.35). The left-hand side shows the early morning situation, where sunlight enters the indoor space, while the right-hand side represents a night situation, where potential glare may occur due to the single ceiling light source contrasting with the overall dark environment. Based on this analysis, related design modifications could be made to improve the lighting environment in both the design review and O&M phases of the project.

Figure 3.35: Light Visualization Map Render Mode of Fuzor showing Different Indoor Lighting Environments

To create a dynamic model and achieve more realistic simulation, Fuzor also has more advanced simulation capabilities, although these could also be developed in Unity. For instance, water flowing out of the faucet could be simulated dynamically via the Water Spawn functionality (Figure 3.36). The simulation was created in the VR environment, without the need to switch back to the desktop. The left-hand side of the figure shows the source of the water spawn, represented by the little yellow wireframe cube placed around the faucet head so that water could be spawned from that place to reflect the real situation; the right-hand side of the figure shows the result.

Figure 3.36: Fuzor Simulation of Water Spawned from the Faucet in VR Environment

More functionalities could be achieved in Fuzor VR (Figure 3.37). The top left image shows that Revit families, including furniture and equipment, can be added, deleted, moved, and rotated from the equipment warehouse. The bottom left one shows that their property information from the Revit family parameters can be visualized, with their location presented in Cartesian coordinates X, Y, and Z. The top right one shows that the dimension between any two points can be measured directly in the VR environment, so a quality assurance procedure can be conducted. The bottom right one shows that problems can be marked up in the VR environment and saved, so that they can later be seen by other reviewers and modified accordingly. These functionalities are essential at the O&M stage of the project. Among these four images, only the top left one shows the daytime view of the room, while the rest show the nighttime view.

Figure 3.37: Fuzor VR Functionalities including Family Editor, Property Visualization, Measurement, and Markup

3.4.2 Excel Database

Section 3.4.2 corresponds to the following section of the workflow diagram (Figure 3.38).

Figure 3.38: Excel Database in Workflow 3

Both the geometric data (such as the room components) and the Revit parameters for the objects (such as the equipment name) were easily transferred from Revit to Fuzor.
The final part of the third workflow was to connect external databases to Fuzor, so that customized project information such as IFM data, which is not normally contained in Revit, can be integrated into Fuzor alongside the original BIM data from Revit. The database was the same Excel datasheet used before, except that a new column with the object ID number for Fuzor API identification was added (Figure 3.39). The structure of the data is key to understanding the script later. The parameter label is given by the head of each column, such as Name, Manufacturer, URL, and Cost, while the parameter value is the cell value of the Excel spreadsheet, determined by both the value in the first column (ID) and the parameter label. The inherent column before the first column, numbered 1 to 4, is called the index and is not used later. Only one pair of data could be added to Fuzor at a time, as shown in the result later; one pair of data consists of a parameter label and its corresponding parameter value for one object identified by its unique identifier, the ID in this case.

Figure 3.39: Modified Excel Database of the Three Pieces of Equipment within the Final Research Model

3.5 AR Test: Vuforia based AR through Object Target

A very preliminary exploration of AR was tried to understand its potential applicability to FM and understand more clearly its differences from VR. As mentioned previously, there are two potential methods in the Vuforia-Unity workflow, object target and model target. Both are used to recognize an object (this could be an object like a laptop or even an entire room) so that information can be overlaid on top of it, which is useful for IFM. This section shows the initial steps of the object target method, while section 4.4 describes the completion of the model target method. When no digital model of the object is available, the object can be scanned into a database in the Vuforia project folder and used later as the object target in Unity-Vuforia's object recognition. The complete steps are creating an object data file with the Vuforia Object Scanner, followed by uploading the object data file to the Vuforia Target Manager. After the device database is packaged, it can be downloaded and imported into the Vuforia project developed in Unity. Other tools can also be considered, such as Eclipse and Xcode (Vuforia, 2018). The trial processes presented are primarily about using the Vuforia Object Scanner. The scanner works on certain types of Android phone; a Xiaomi phone with the Android operating system was used. A pool toy was used for the experiment because of the multiple requirements on target objects: no actual medical equipment could be used, and very few daily objects could satisfy the requirements, which is why the model target method was applied later to allow a wider range of objects. For instance, a Dell laptop could not be scanned into the database. The pool toy experiment was ultimately successful.

First, the app was installed on the Android phone. The scan target, a paper printed in advance, served as the scanner base on which the potential target was placed to be scanned. During the entire process, the relative position between the object target and the scan target should remain the same. The Cartesian coordinate system is shown in the first screenshot (Figure 3.40).
After placing the object, the scanning process was initiated, and a dome was shown as the indication of the scanning area. The scanning process was done once the dome was sufficiently covered with green, after enough visual features, presented as feature points, had been extracted; these are the green dots whose count is shown on the screen. Hence, the stage when the scanning process was completed could be detected (Figure 3.41). After the scanning, a test procedure can be conducted to see whether the target can be tracked (Figure 3.42). In the test mode, a virtual box tightly followed the free movement of the toy, indicating a successful scanning process: the object target could be accurately detected and tracked. When the test was finished, an interface was presented with the detailed parameters of the scanned database, including the visual feature points extracted (Figure 3.43). From this window, the next action can be either going back to the test or continuing the scanning if the green coverage mentioned before is not sufficient, meaning the number of feature points acquired is not yet enough. The testing process is always necessary because a certain coverage percentage can guarantee the success of tracking and detection, and 100% is not always necessary. Consequently, the scanning duration can be reduced, since the scanning process can always be interrupted and tested to see whether enough scanning has been done.

Figure 3.40: The Position of the Target Object on Top of the Scan Target

Figure 3.41: Scanning Process of Vuforia Object Scanner

Figure 3.42: Testing Process of Vuforia Object Scanner

Figure 3.43: Guidance Window of Vuforia Object Scanner

After completing the scanning, it was determined that the object target technique did not seem to be appropriate. Chapter 4, section X-X describes the results from the model target exploration, where a laptop was used.

Chapter Summary

This chapter covered the methodology of each workflow. The corresponding initial results are presented to initiate the development process and to partially validate the feasibility of each workflow. The initial result of the first workflow is primarily about creating a stereo panorama after exporting the Revit model to Maya. The second workflow includes developing a C# script in Unity, using the VRTK simulator to conduct the basic parametric control in Unity, and applying Steam VR to test the HTC Vive with Unity. For the third workflow, applying Fuzor to test its basic functionalities in VR was covered in this chapter. For the AR test, the object target method in Vuforia was attempted to recognize an object and display the associated data; it was decided to proceed with the model target exploration instead.

CHAPTER 4: RESULTS

Chapter Introduction

Based on the workflow diagrams and the initial steps with initial results presented in Chapter 3, this chapter shows the final results of the workflows. The workflow diagrams are modified from the previous ones to show the additional steps taken to achieve the final results. For the AR exploration, a new test using the model target method of Vuforia instead of the object target is performed to achieve object recognition as a conceptual demonstration of the basis for AR applied to IFM.

4.1 Workflow 1 Result: Cinematic VR through Oculus Rift

After testing the process of rendering a single stereo image from Maya without any materials assigned in chapter 3.2, the Maya model was modified.
After rendering the necessary frames through Maya with Redshift, AE was used to combine them and conduct the final rendering, after adjustments were made in AE to smooth the transitions among key frames. Before rendering the final video file, Mettle was applied to check the result (Figure 4.1).

Figure 4.1: Workflow 1 of Cinematic VR through Oculus Rift with Entire Steps for Final Results (© Autodesk, Redshift, Oculus, Jaunt, Adobe, METTLE, HandBrake)

4.1.1 Video Frame Preparation via Maya Scene Rendering

Section 4.1.1 is about the first part of the workflow diagram (Figure 4.2).

Figure 4.2: Video Frame Preparation via Maya Scene Rendering in Workflow 1

After adding materials, establishing lighting, adding pan and render cameras for key frames, adding additional frames for scene changes, and setting up the surrounding environment for the model, the final footage was prepared and ready to be rendered, given that the middle frames would be interpolated linearly and automatically via Maya's animation ability. The external view was established by a spherical 360 image (Figure 4.3).

Figure 4.3: External View Setup via Spherical Mapping from a Latlong Panorama

One Maya model view of an example frame to be rendered under a designed lighting condition is shown; the left and right sides compare the effect with and without materials (Figure 4.4). The final rendering was executed with specific render settings (Figure 4.5). In total, 1,400 frames were rendered via Redshift at the normal VR resolution of 3840 × 1920.

Figure 4.4: Final Maya Scene View under Designed Lighting Condition, with and without Materials

Figure 4.5: Maya Render Settings

4.1.2 Video Creation

This section is represented by the following part of the workflow diagram (Figure 4.6).

Figure 4.6: Video Creation in Workflow 1

Adobe After Effects (AE) with the Mettle SkyBox VR plugin was used to composite the entire image sequence and conduct the final rendering. The rendering status is shown in the AE user interface (Figure 4.7); as seen at the bottom of the figure, the mov file format was selected, a standard practice for handling VR video (Hanson, 2017). In the Project window at the top-left corner of the user interface, an additional image can be seen in the render queue along with the previous 1,400 frames; that picture was rendered separately from Maya to show the final scene, with the light dimmed at night, as the end of the video. Mettle SkyBox VR was used to preview the VR video before the final rendering process was initiated (Figure 4.8).

Figure 4.7: User Interface of Adobe After Effects with the Final Rendering Status shown

Figure 4.8: Mettle SkyBox VR Player opened in Adobe After Effects for Previewing the Final VR Film

4.1.3 Format Conversion for Video File

Section 4.1.3 is represented by the following part of the workflow diagram (Figure 4.9).

Figure 4.9: Format Conversion for Video File in Workflow 1

The original video file in mov format was so large that file compression was necessary. HandBrake, a video transcoder tool, was applied to change the format of the file and reduce its size (Figure 4.10). Based on the standard settings shown, the outcome was an mp4 file ready to be played in Jaunt Player (Hanson, 2017). The duration of the final video was 1 minute.
Figure 4.10: HandBrake Setting for Final Video Transcoding

4.1.4 Final VR Video

This section relates to the following part of the workflow diagram (Figure 4.11).

Figure 4.11: Final VR Video in Workflow 1

Six key frames in Jaunt Player were used to show example VR content from the final video. The first is the whole-project description panel at the beginning of the video (Figure 4.12). In the second, the middle drawer fails to be fully pulled out of the cabinet because of an obstructing chair in the room, with the information panel shown when the user touches the mobile cart's drawer with the controller (Figure 4.13). The third shows the movement of that chair allowing the drawer to open fully (Figure 4.14). The fourth is the information panel for the stretcher, displayed when the stretcher is touched (Figure 4.15). The fifth is the perspective view of a patient lying on that stretcher (Figure 4.16). The sixth is that patient's view at night (Figure 4.17). Unlike the real-time VR presented later, where positional tracking is allowed, the camera positions for the frames in this cinematic VR were fixed.

Figure 4.12: The First Example Frame of the Final VR Video Played via Jaunt Player

Figure 4.13: The Second Example Frame of the Final VR Video Played via Jaunt Player

Figure 4.14: The Third Example Frame of the Final VR Video Played via Jaunt Player

Figure 4.15: The Fourth Example Frame of the Final VR Video Played via Jaunt Player

Figure 4.16: The Fifth Example Frame of the Final VR Video Played via Jaunt Player

Figure 4.17: The Sixth Example Frame of the Final VR Video Played via Jaunt Player

4.2 Workflow 2 Result: Unity based Interactive VR through HTC Vive

After testing the HTC Vive with Steam VR and the VRTK with its VR simulator, the interactive VR environment was established for the final room model (Figure 4.18). The interactive information visualization was achieved by further modifying the script to integrate it into the VRTK class structure, based on the initial result in the previous part. The modification logic was to change from mouse input control to VR input control, which could be laser pointing at an object or directly touching it; both were applied later. However, VR input need not be limited to these forms, and more interaction methods could be explored.

Figure 4.18: Workflow 2 of Unity based Interactive VR through HTC Vive with Entire Steps for Final Results (© Autodesk, Unity, HTC, Vive, VRTK)

4.2.1 Final VR Result

Section 4.2.1 relates to the following part of the workflow diagram (Figure 4.19).

Figure 4.19: Final VR Result in Workflow 2

4.2.1.1 Interactive VR Room Environment

A VR simulator was used first to show the final VR environment, which could achieve almost the same effect as full VR except for the perception and the presence of the controller models in the scene. In the VR simulator result (Figure 4.20), apart from the interactive door opening scenario, which had no counterpart in the first workflow, the rest presented the same scenario as the first workflow, showing the potential problem of the space layout. Although more interactive components, such as gravity and colliders as basic examples, had been included in the final build of the VR game from Unity, the other interactions are not shown, for workflow comparison. After testing in the simulator, the same content was presented in the real VR environment as the final part of this section of results (Figure 4.21).
In both environments, the chair was highlighted when it was touched by the controllers.

Figure 4.20: The Example Interactions of the Simulated VR Game Played via the VR Simulator of VRTK

Figure 4.21: The Example Interactions of the VR Game Played via HTC Vive

4.2.1.2 Final VR Result with Interactive Information Visualization
This part presents the outcome in the VR environment directly. The specific VRTK components applied are not shown again here because they were already covered in the previous section. Thus, the main modification to be presented concerns the core script (Figure 4.22). Three parts of the script were copied from the previous script, and two modifications were made to it: inheriting the system behavior from VRTK_InteractableObject and removing the final line of code. That line was removed because, in the VR environment, the canvas should not be placed at the position of the touched object but at a fixed location within the user's view, so that its text can be read clearly; after multiple trials, manual setup of the canvas proved to be the most efficient way to achieve this. The inheritance was the key to transferring the mouse click control to VR control using VRTK: the class Information_Visualization was derived from the class VRTK_InteractableObject so that the behavior of VRTK_InteractableObject could be reused, extended, and modified by the new class (Microsoft, 2015). In the final result (Figure 4.23), the information was overlaid on the VR screen upon laser pointing or controller touch. As before, the information could be IFM related for quick information extraction and visualization.

Figure 4.22: The Core Script of Section 4.2.1.2
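Figure 4.22 is reproduced in the thesis only as a screenshot, so the following C# sketch is a hypothetical reconstruction of its core idea rather than the actual script. It assumes VRTK 3.x, where VRTK_InteractableObject exposes overridable touch events (the exact method signatures vary slightly between VRTK versions), and it assumes the canvas and text fields are assigned manually in the Inspector, matching the fixed canvas placement described above.

    using UnityEngine;
    using UnityEngine.UI;
    using VRTK;

    // Hypothetical reconstruction of the Information_Visualization class:
    // inheriting from VRTK_InteractableObject replaces the earlier
    // mouse-click handling with VRTK's laser-point and touch events.
    public class Information_Visualization : VRTK_InteractableObject
    {
        public Canvas infoCanvas;     // world-space canvas at a fixed, readable position
        public Text infoText;         // text element on that canvas
        [TextArea]
        public string equipmentInfo;  // e.g. manufacturer, cost, and URL from the Revit export

        public override void StartTouching(VRTK_InteractTouch currentTouchingObject)
        {
            base.StartTouching(currentTouchingObject);
            infoText.text = equipmentInfo;  // show this object's data
            infoCanvas.enabled = true;
        }

        public override void StopTouching(VRTK_InteractTouch previousTouchingObject)
        {
            base.StopTouching(previousTouchingObject);
            infoCanvas.enabled = false;     // hide the panel when contact ends
        }
    }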
Figure 4.23: Final Workflow 2 Result of the Unity based Interactive VR through HTC Vive

4.3 Workflow 3 Result: Fuzor based Interactive VR through HTC Vive
Although Fuzor's VR capacity had been demonstrated in the Fuzor section of Chapter 3, a new Fuzor API was developed (C. Nguyen, personal communication, January 2, 2018) so that extra information outside the Revit parameters, stored in an Excel file, could be added to Fuzor automatically. Fuzor's ability can thus be greatly extended to enable IFM usage, with a customized Python script parsing the external data from Excel and executing the Fuzor API to add new parameters (Figure 4.24).

Figure 4.24: Workflow 3 of Fuzor based VR through HTC Vive with Entire Steps for Final Results (© Autodesk, Fuzor, HTC, Vive, Python, Anaconda, PyCharm, Microsoft)

4.3.1 Fuzor for Design Review and Facility Management
Section 4.3.1 is graphically represented by this section of the workflow diagram (Figure 4.25).

Figure 4.25: Fuzor for Design Review and Facility Management in Workflow 3

Two basic functionalities (movement for design review and markup for IFM) could already be achieved in the Fuzor desktop version; the VR version is not repeated here, since its VR ability had already been proven (Figure 4.26). The external walls were hidden for easier manipulation of the model.

Figure 4.26: Movement for Design Review and Markup for IFM in Desktop Fuzor

4.3.2 Automation of Excel Data Extraction and Export to Fuzor using REST Calls
This section is graphically represented by this part of the workflow diagram (Figure 4.27).

Figure 4.27: Automation of Excel Data Extraction and Export to Fuzor using REST Calls in Workflow 3

The logic of the whole process, comprising two parts (extracting the Excel data and exporting the data to Fuzor using REST calls), was as follows. First, with the Fuzor file running, the object whose parameters were to be updated from the external Excel file was selected, followed by running the first Fuzor REST API, http://localhost:45190/selected, to query data from Fuzor, in this case the object information (Kalloc Studios, 2018). Fuzor responds to such actions with XML data, as described earlier. Apart from queries, the other typical kind of action is execution, whose API is used later to add parameters. The REST API adopts the uniform format http://localhost:45190/query?parameter_list, where 'query' is a general term to be changed for the situation at hand: it is replaced with 'execute' to execute a command, as in the next API, and it was replaced with 'selected' in the previous call. After the call was run, the information was shown as XML on a new webpage (Figure 4.28). The object ID appears below the line '<name>__fuzor_objectid</name>' and is the first value in that structure.

Figure 4.28: Online XML File of Object Information including ID

The next step was to run the second API in the format http://localhost:45190/execute?createcustomparameter=<ObjectID>:<Parameter_Label>:<Parameter_Value>. For instance, to add the cost information of the stretcher, the command was http://localhost:45190/execute?createcustomparameter=13931801347516435145:Cost:3982. After this API was run, the new BIM parameter was added automatically; only a blank webpage was returned, which indicates that the command was executed successfully (Figure 4.29).

Figure 4.29: Blank Webpage Returned from the 2nd Fuzor REST API

These steps were included in a Python script to automate the whole process of parsing the key information out of the Excel document, handling the REST calls, and sending the information back to Fuzor in real time (Figure 4.30). The script and its explanation are shown in the PyCharm user interface, where the upper part is the editor window and the lower part is the output window. PyCharm is an Integrated Development Environment (IDE) for Python, a commonly used programming language (JetBrains, 2018). Python can be set up with the Anaconda platform, a Python distribution providing numerous packages as well as a virtual environment manager (Anaconda, 2018).

Figure 4.30: Python Script for Automating Extracting the Excel Data and Exporting to Fuzor using REST Calls
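The full script in Figure 4.30 is shown only as a screenshot; the condensed sketch below captures its essential logic using the two REST calls documented above. It assumes the requests and openpyxl packages (both available through Anaconda), that the spreadsheet's first two columns hold parameter labels and values, and that the object ID is wrapped in a <value> element in the XML reply; the Excel file name is a placeholder.

    import requests
    from openpyxl import load_workbook

    FUZOR = "http://localhost:45190"

    # 1. Query the currently selected object; in Fuzor's XML reply the value
    #    following <name>__fuzor_objectid</name> is the object's unique ID.
    xml_text = requests.get(FUZOR + "/selected").text
    after_marker = xml_text.split("<name>__fuzor_objectid</name>", 1)[1]
    object_id = after_marker.split("<value>", 1)[1].split("</value>", 1)[0]

    # 2. Read label/value pairs from the external IFM spreadsheet
    #    (the two-column layout is an assumption of this sketch).
    sheet = load_workbook("ifm_data.xlsx").active
    for label, value in sheet.iter_rows(min_row=2, max_col=2, values_only=True):
        # 3. Execute the documented call that adds a custom BIM parameter;
        #    Fuzor returns a blank page when the command succeeds.
        requests.get(FUZOR + "/execute?createcustomparameter="
                     + "{}:{}:{}".format(object_id, label, value))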
The result of adding the new parameters was shown within the red block (Figure 4.31). Customized BIM parameters can also be added and deleted manually through the UI. A new window, outlined in yellow, appeared after clicking the annotation button inside the window outlined in blue. The IFM related information could then be typed in, as seen in the figure: typical IFM data such as the work order category (O&M in this case), the worker assigned to the job (here, fixing one foot of the stretcher), the work description shown in the 'Issue' window, the deadline for resolving the issue, and a markup image added as a separate attachment for better description. With this information and the essential BIM data added before, the basic IFM goal could be achieved. To sum up, the Fuzor API and a Python script were used to extend the database beyond the BIM model, based on the external Excel datasheet. Hence, if the level of development (LOD) of the BIM is high enough, Fuzor can be used as an advanced IFM tool with VR capability, and its model can serve as a living file holding a precise snapshot of the full space.

Figure 4.31: Result of the Modification towards the Workflow 3 for Integrated Facility Management

4.4 Test 2 of Workflow 4: Vuforia based AR through Model Target
Unlike the Object Target in Vuforia, which recognizes objects via visual features, the Model Target recognizes objects by shape, based on corresponding CAD models pre-imported into the database; therefore, no scanning process is required to generate the target database. Although the Model Target feature is still being developed, it can cover a wider range of objects, even large ones such as vehicles. Some medical facilities and equipment are large and therefore difficult to scan, which is another reason this method is more appropriate for healthcare projects (Vuforia, 2018). The entire process has several steps. First, the 3D model is imported into the Vuforia Model Target Generator, which plays a role similar to the scanner mentioned in the earlier Object Target test. Second, the target is generated after setting the desired detection position and adjusting the model scale. The desired position is shown in an automatically created model preview, which later serves as the alignment guide for determining when the virtual content is activated, namely when the camera view aligns with the target model in that position. The scale adjustment is necessary because a match in dimensions between the real object and the model target is a precondition for this to work (Vuforia, 2018). Finally, the target is imported into Unity for further development (Vuforia, 2018). The overall process is almost identical to that of the Object Target: both can be summarized in two steps, target generation and Unity development with the target.

The Vuforia Model Target Generator was downloaded and installed on a Windows computer. A Dell Inspiron 13 7000 Series was the real laptop, and a similar 3D model was downloaded for testing (Turbosquid, 2018). The digital model could have been created in many other software programs, but an existing 3D model was used instead. Two similar laptop models were tested. Two windows of the generator user interface are shown with the two models: the CAD Model View (Figure 4.32) and Detection Position (Figure 4.33) windows, respectively. Only part of the model in the CAD Model View was needed, so an adjustment was made to include only that part, with the dimensional units changed to achieve an approximate match in size.

Figure 4.32: CAD Model View Window of the Vuforia Model Target Generator

Figure 4.33: Detection Position Window of the Vuforia Model Target Generator

The generated target was imported into Unity for further development. An important part of customizing the AR experience was placing the custom content as a child of the ModelTarget game object. In this case, a simple canvas with the text of the model's name was intended to be shown in the AR view, so that canvas was placed as the child (Figure 4.34).

Figure 4.34: The Unity Interface with ARCamera and ModelTarget inside the Scene for AR Development
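Vuforia's Unity samples include a DefaultTrackableEventHandler that toggles such child content automatically; the C# sketch below shows the essential logic in a trimmed form, assuming the Vuforia 7-era Unity API. It would be attached to the ModelTarget game object, with the canvas assigned in the Inspector.

    using UnityEngine;
    using Vuforia;

    // Show the child canvas (e.g. the text "Laptop") only while the
    // model target's shape is being detected or tracked by the camera.
    public class ModelTargetLabel : MonoBehaviour, ITrackableEventHandler
    {
        public GameObject infoCanvas;   // child canvas holding the label text
        private TrackableBehaviour trackable;

        void Start()
        {
            trackable = GetComponent<TrackableBehaviour>();
            trackable.RegisterTrackableEventHandler(this);
            infoCanvas.SetActive(false);            // hidden until recognized
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            bool found = newStatus == TrackableBehaviour.Status.DETECTED
                      || newStatus == TrackableBehaviour.Status.TRACKED
                      || newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
            infoCanvas.SetActive(found);            // overlay appears while aligned
        }
    }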
The selected scene was built and packaged into an apk file to be run on the Xiaomi phone (Figure 4.35).

Figure 4.35: Unity Build Setting Window for Deployment towards Android Platform

After sending the apk file to the Xiaomi phone and installing the corresponding Unity app from it, the results from running the app with both models are displayed (Figure 4.36). A successful result would be the corresponding virtual object appearing on the screen at the laptop's position: in this example, the word "Laptop" overlaying the screen of the laptop, as a simple stand-in for the IFM data that would be displayed in a real application. However, as shown in the result, the expected text was not displayed, only the guide view aligned with the real laptop, meaning that the experiment failed.

Figure 4.36: AR App Running on the Xiaomi Phone

The Xiaomi phone might be a factor in this failed result, as it is an old model. A webcam was used instead, both for the demonstration and as the solution, and after this change the test succeeded (Figure 4.37). The name "Laptop" was displayed automatically when the shape of the laptop was detected by the camera, after the guide view and the view captured by the camera were aligned with each other. The virtual content disappeared once the camera was moved away from the original alignment. Consequently, a concept demonstration of AR shape recognition for onsite IFM information support was achieved. This approach suits a building project with no or few repeated facilities, such as a complex patient room: data about an object could be shown, or links to a video on how to fix the equipment, or even graphics showing how to use the medical instruments. It is not suitable for, say, an office building with numerous furniture items and equipment of the same shape; in such cases, QR or bar codes might still be necessary for AR information visualization.

Figure 4.37: The Test Result of the Desktop AR

Chapter Summary
This chapter described the three final VR workflows, the exploration using AR, and the results from the studies. For the first workflow, a VR film was produced to show the project content via the Oculus Rift. For the second workflow, based on the Unity game engine, a VR game was developed for a patient room with an interactive environment where information panels are visualized via VR control input on the HTC Vive. For the third workflow, the Fuzor game engine was used to develop a similar interactive VR environment, with the addition of an automatic link, developed as a Python script using the Fuzor API, to expand the database from a separate Excel file. Finally, an AR test was conducted via the Model Target feature of Vuforia and Unity and demonstrated with a webcam linked to the laptop.

CHAPTER 5: DISCUSSION
Chapter Introduction
This chapter includes three parts: discussion of the research process, application discussion, and evaluation. Each part is divided into four sections for the four projects: the three VR workflows and the AR test.

5.1 Discussion of the Research Process
The research processes are discussed to better understand the VR workflows and the AR test. The differences among them are compared and evaluated, and the relationships among them are specified.
5.1.1 Research Process Discussion of the Cinematic VR Workflow
The initial steps of the cinematic VR workflow include the model modification from Revit to Maya and a test via Jaunt Player to verify that a proper panorama can be obtained before continuing with the rest of the rendering process; these can be regarded as the first stage of the research. The model conversion process, including the material addition, is reused later in the Unity VR workflow. Although the process of this workflow is relatively straightforward, the trial stage is important because of the long time later consumed in rendering the entire frame sequence. The frame connection, final rendering, and file format conversion are the final steps of this workflow and build on the first steps. Consequently, a VR film is the sole outcome of this sequential development process. The animation path is predetermined for clear VR storytelling, and aesthetic qualities are evaluated in establishing the light levels, material assignments, exterior environment, and so on.

5.1.2 Research Process Discussion of the Unity VR Workflow
The initial steps of this workflow can likewise be regarded as its first stage. The Unity VR workflow is decomposed into relatively small problems: the initial steps tackle these individual problems, and the final steps integrate the solutions. Hence, this workflow builds on the methodology, rather than on the products, of the cinematic VR workflow. The sequence of the process is as follows: Unity C# scripting, VRTK code application, the VR simulator combining Unity and VRTK, and the final integration with Steam VR and the HTC Vive. The testing approach progressed from a simple box to the furniture models, and from separate Revit files to a combined Revit file. Because of this progressive research method, the results of the initial steps can already be applied in practice; for example, even without the HTC Vive and Steam VR, the VRTK VR simulator running in the Unity game engine can be used for design review and facility management. Due to the complexity of this workflow, trial processes are necessary during development, though they can be skipped once the workflow is established.

5.1.3 Research Process Discussion of the Fuzor VR Workflow
The initial steps of the Fuzor VR workflow should be considered the first part of the research. They are mainly about using Fuzor's VR functionality, in the course of which the synchronization between Revit and Fuzor is also demonstrated. The final part of the research was conducted in the desktop mode of Fuzor instead of the VR mode. Apart from using the same Fuzor model, these two parts of the research are separate from each other; consequently, both of Fuzor's interfaces, VR and desktop, are demonstrated. The second part of the research is mainly about writing a Python script that assembles all the commands in a single place and automates the process without manually opening other tools such as a web browser or an Excel table. The Python script includes prompt text that guides the user to first select the object in Fuzor so that the unique identifier of the selected object can be obtained, and a user input request asking for the name of the parameter whose data is to be added to Fuzor. The Python script therefore represents the final part of the research.
5.1.4 Research Process Discussion of the Vuforia AR Test
The initial steps of the AR test used the Object Target feature of Vuforia; the method was later changed to the Model Target. Hence, the initial steps can be regarded as a case study, and the real research starts with the final steps. The methodology change did not result from a failed outcome but from the discovery of a more suitable method. In the Model Target test, the mobile based AR failed to provide the desired result while the webcam based AR succeeded. It can therefore be inferred that the Xiaomi phone used, an old model, may lack the ability to support this latest functionality; for example, its camera may not be able to focus automatically during operation. Compared to the VR workflows, this AR test is comparatively simple.

5.2 Application Discussion
The application potential is discussed to reveal the research impact. Moreover, realistic application scenarios are described to show the situations where these workflows and the test can be applied.

5.2.1 Application Discussion of the Cinematic VR Workflow
The VR storytelling behind the VR film created in the cinematic VR workflow is important for project presentation. This cinematic VR is similar to a walkthrough or flythrough animation generated from Revit or Revit Live, and producing it may therefore seem comparatively inefficient. However, it can bring more effects and customized content to the film, which is the advantage gained for the additional time and effort. For example, the project description is presented on a panel at the beginning of the film, and it disappears after a period estimated from the approximate time a viewer needs to read through the content. Unlike the full interaction controlled by the users in the other VR workflows, here the producer controls the storytelling to emphasize certain content; the viewers cannot interact and can only watch the film passively, so this kind of VR is linear in time. Although the VR games in the other workflows can guide users through specific tasks as another, more interactive way of presenting the project, the cinematic VR presentation offers clients a passive viewing option, which is more time-efficient because the time users would otherwise spend exploring the scene is saved.

5.2.2 Application Discussion of the Unity VR Workflow
The interactive Unity based VR tool is clearly valuable for the design phase of a project. In a complex project like a medical space, layout problems can be readily detected among various design solutions through VR simulation. Crowded situations can be encountered when space utilization is initially pushed toward its maximum, and the mobility of much of the equipment within a medical space is another aspect underlining the importance of this application. These scenarios are illustrated by the result of this workflow. Basic information visualization, in addition to benefiting IFM, is also essential for the design phase, because quickly distinguishing different medical products from their information panels directly informs the considerations in their space arrangement. Since the BIM content is separated into the model and the data, each exported to the Unity platform individually, external Excel data containing O&M information can also be brought into the asset folder and visualized in the VR scene, just like the BIM data that was exported to a text file and included in the Unity database (see the sketch below). Since data visualization is the most important aspect of IFM, this workflow can also benefit facility management.
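As a sketch of how such external data can sit alongside the exported BIM data, the following hypothetical C# helper loads a tab-separated schedule export (the file name and column order are assumptions, not the thesis's exact format) and indexes each row by its unique identifier so that an interactable object can look up its own entry when touched.

    using System.Collections.Generic;
    using System.IO;
    using UnityEngine;

    // Loads a tab-separated Revit schedule (or Excel-derived) export placed in
    // Assets/StreamingAssets and maps each unique ID to its full data row.
    public static class ScheduleDatabase
    {
        public static readonly Dictionary<string, string[]> Rows =
            new Dictionary<string, string[]>();

        public static void Load(string fileName = "equipment_schedule.txt")
        {
            string path = Path.Combine(Application.streamingAssetsPath, fileName);
            foreach (string line in File.ReadAllLines(path))
            {
                string[] fields = line.Split('\t');  // e.g. ID, manufacturer, cost, URL
                if (fields.Length > 1)
                    Rows[fields[0]] = fields;        // key the row by its unique ID
            }
        }
    }

A touch handler like the one sketched in Section 4.2 could then fill its canvas text from ScheduleDatabase.Rows[id].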
5.2.3 Application Discussion of the Fuzor VR Workflow
Because healthcare projects are highly dynamic during the O&M phase, the interactive Fuzor based VR tool can be used by external consultants or project team members who cannot easily access the site: they can detect O&M issues in the VR environment, and the site team can then be informed to verify them and make the corresponding modifications if necessary. Moreover, the site team might need to make decisions based on instructions from the management team, which might not be on site at the time. A virtual project document thus allows the management team to make quick decisions without spending the time and effort to come to the site every time their judgment is requested by onsite technicians. In a warehouse storing replacement components, a technician can consult the VR model, with its information about which objects need to be carried to the site, to reduce the number of round trips between the site and the warehouse.

A typical O&M scenario can be demonstrated by the result of the Fuzor based VR workflow. From the markup images collected inside Fuzor's database folder, related issues are identified for a specific piece of equipment, such as a foot of the stretcher. A customized parameter (usually stored in Excel files) supplements the necessary information over the whole life cycle of the project, from which important information for repair or maintenance (serial numbers, warranty information, the O&M history, etc.) can be obtained for the facilities inside the building. A work order can then be established in the annotation window, and its status can be tracked continuously within the Fuzor system until the issue is fixed. Design review can also be done in the Fuzor VR workflow because an interactive VR environment is inherently accessible through Fuzor VR. Changing the quantity of equipment and moving equipment can easily be performed in Fuzor VR after a design problem is detected in the immersive environment, and the change is automatically synchronized to the Revit file and vice versa. It should be noted, however, that the Excel IFM data added to Fuzor cannot yet be synchronized back to Revit. Revit parameter visualization in VR is another inherent characteristic of Fuzor VR, which is important for both design and facility management.

5.2.4 Application Discussion of the Vuforia AR Workflow
The AR test overlays information directly onto the real object, so it is suitable for onsite usage. It is a concept demonstration showing that IFM related information, including training videos, can be associated with the real object, so that the conventional user manual and technical specification can be replaced. There are many types of AR that can be applied in real situations, such as bar code based and QR code based AR; neither model targets nor object targets rely on such additional images.
5.3 Evaluation
In this part, the workflows and the test are evaluated against several criteria: efficiency, interactivity, ease of use, compatibility, and customization. The comparison can guide the selection of a specific workflow for early design and IFM.

5.3.1 Evaluation of the Cinematic VR Workflow
This is not an efficient workflow for the AEC industry, because the development process is time consuming and the special effects are not normally required. The workflow lacks interactivity. However, it is relatively simple to use thanks to the many online tutorials on the software and hardware involved, and the software and hardware used are compatible with each other. It is highly customizable because of the software's functionality, though it still requires a certain level of proficiency in operating all of the software.

5.3.2 Evaluation of the Unity VR Workflow
The efficiency of this workflow is not high, due to both the time and effort consumed in development and the export and import processes between the BIM (Revit) and game engine (Unity) platforms. The workflow is highly interactive and customizable through custom programming. Although there is large community support, it is still a difficult workflow requiring programming knowledge in C# and familiarity with both Unity and the VR tools. The other difficulty is VR programming itself: although VRTK provides a good tool for avoiding much of it, VRTK is still being developed, so customization outside VRTK requires advanced programming ability. Revit is not directly compatible with Unity because materials are lost in transfer, which is why Maya is used as a go-between.

5.3.3 Evaluation of the Fuzor VR Workflow
This is an efficient workflow because of the real-time synchronization between Fuzor and Revit, which eliminates import and export. The VR capabilities, including BIM parameter visualization, are intrinsic to Fuzor, so no additional development time and effort are needed, and the workflow is also interactive. The workflow is easy to use within Fuzor. However, writing an automatic execution tool using the Fuzor API in Python is not easy; it requires knowledge of XML, the Fuzor API, and Python programming. The Fuzor API is currently limited, so the level of customization in this workflow is also relatively low: for example, the VR inputs are fixed to certain buttons of the HTC Vive controller, whereas these can easily be remapped in Unity with VRTK. The synchronization between Revit and Fuzor is bi-directional and in real time, so the two are compatible with each other; for example, materials remain the same. With the latest Fuzor API, which can be used to add outside data to Fuzor, other databases in formats like Excel are also compatible with Fuzor.

5.3.4 Evaluation of the Vuforia AR Test
The AR test is efficient in the sense that its 3D model can be used directly as the reference to be matched later for object recognition. It is also interactive, although the interaction can only be controlled via the relative position between the camera and the object. Similar to the first workflow, no programming is needed, so the test is also relatively easy. It should be noted that this evaluation cannot be compared directly with those of the VR workflows, because every workflow is more complicated than this simple AR test, which serves as a basic concept demonstration.
There are some compatibility problems because only certain types of mobile phone can be used as the terminal device, as demonstrated by the initial failed result. The customization of this test is largely limited by the state of development of Vuforia's Model Target, the newest target type in the Vuforia system.

5.3.5 Overall Evaluation
Three workflows were described for using VR for design review and facilities management: cinematic VR, Unity based interactive VR, and Fuzor based interactive VR. The first workflow creates an immersive experience for the user to understand the environment. It requires knowledge of and access to many software programs, including Autodesk Maya and Jaunt Player. Although no programming is needed, its development is time consuming and work intensive. While the final product provides only a pre-determined animation path for the user, it does allow some interaction in the form of panoramic views. The lack of full interactivity in the VR environment is a weakness of this workflow.

The second and third workflows can both be used for both design review and facility management. Both utilize a game engine based platform to achieve real-time control and accomplish interactive data visualization, and both can transfer the needed information from the Revit model, in this example a patient room in a healthcare facility, to the VR model. Their main difference is the base software used. The Unity based workflow allows a high degree of customization and interactivity within the VR environment; however, this involves C# programming and the ability to use the VRTK SDK. The third workflow with Fuzor, thanks to its bi-directional synchronization with Revit and easy access to a VR environment, has many built-in features that make it easier to implement an FM solution; however, the Fuzor API is currently rather limited for developing customized applications.

The AR test is a Vuforia based AR test for onsite FM information visualization. Although the Object Target of Vuforia could also be used, the Model Target of Vuforia is more suitable for a real healthcare project due to the size of the subjects and the time consumed by the object scanning process. This test is mainly about utilizing Vuforia for a concept demonstration, so no programming is needed.

Chapter Summary
This chapter discussed the three VR workflows and the AR test in three sections: research process discussion, application discussion, and evaluation. The first section discusses the research methodology, summarizing the research approach applied to each of the projects; the research logic is also important for guiding future research, although future work is covered separately in Chapter 6. The second section describes the real scenarios where each of the projects can be applied, so that the research impact can be understood; the application limitations are also analyzed there, making it a guide for real applications. The final section evaluates these projects, illustrating their advantages and disadvantages so that the workflows and the test can be compared with each other against certain criteria.

CHAPTER 6: CONCLUSION AND FUTURE WORK
Chapter Introduction
This chapter covers two parts: the conclusion and future work. The conclusion summarizes the research and its impact.
The future work is divided into two parts: future work to improve the methodology, and recommendations of larger goals for VR and AR in design review and facility management.

6.1 Conclusion
Fundamental knowledge gained through a literature search, trying different software programs, and learning about building information modeling (BIM), virtual reality (VR) and augmented reality (AR), 3D game engines, design review via experience based design (EBD), and integrated facility management (IFM) provided the foundation for the three workflows and one test developed. BIM is the common ingredient of the system in all the VR workflows. VR, AR, and game engines are the techniques used in the system to achieve the desired outcomes. EBD and IFM are the two project stages targeted by the workflows and the test, with healthcare serving as a typical project type suitable as their content. Three workflows were developed, as shown in the workflow diagrams (Figures 6.1, 6.2, 6.3).

Figure 6.1: Workflow 1 of Cinematic VR through Oculus Rift (© Autodesk, Redshift, Oculus, Jaunt, Adobe, METTLE, Handbrake)

Figure 6.2: Workflow 2 of Unity based Interactive VR through HTC Vive (© Autodesk, Unity, HTC, Vive, VRTK)

Figure 6.3: Workflow 3 of Fuzor based VR through HTC Vive (© Autodesk, Fuzor, HTC, Vive, Python, Anaconda, PyCharm, Microsoft)

The three VR workflows were described for using VR in design review and facilities management. The first was cinematic VR, intended for viewing only; the second and third were interactive VR using Unity and Fuzor respectively. BIM contains sufficient project information in terms of both model and data, so an Autodesk Revit patient room model was used as the base of the workflows and to emphasize the potential uses of VR for healthcare projects. The workflows showed simplified examples, but the real-life potential solutions are interactive manipulation for spatial design overview and interactive information visualization for integrated facility management. Since what the Unity based interactive VR workflow achieved through customization, transferring the model and data from Revit into Unity and writing C# scripts for real-time interaction in VR, is inherent in Fuzor, the Fuzor based VR workflow instead demonstrated the automatic extraction of Excel data and its importation into Fuzor via a Python script using the Fuzor API. It is then possible to integrate IFM related data into the VR platform in addition to the original BIM data from Revit. Since the BIM database in Unity can also be modified to include IFM related data, both workflows are able to utilize game engine based VR to achieve real-time control and accomplish interactive visualization of IFM data on top of the existing BIM data. The final AR test was presented separately to display related information on top of the real object in the space, using the shape recognition ability provided by Vuforia based on a corresponding preloaded obj file. In this way, the information management in VR described above was extended to the real situation in AR for facility management on site. Unity was used in two of the projects, demonstrating its capacity for VR and AR development in the AEC industry. A summary table shows the four projects described (Table 6.1).
Table 6.1: Workflow and Test Summary

Project     BIM   VR   AR   Game Engine   EBD   IFM
Film VR      x    x                        x
Unity VR     x    x          x             x
Fuzor VR     x    x          x                   x
Unity AR               x     x                   x

(BIM = building information modeling, VR = virtual reality, AR = augmented reality, EBD = experience based design, IFM = integrated facility management.)

6.2 Future Work
Future work has two aspects. The first is improving the existing research; the second concerns the overall goals of applying VR and AR in the practice of design review and facility management.

6.2.1 Future Work of Existing Research
Each of the existing workflows and the AR exploration can be improved.

6.2.1.1 Future Work of the Cinematic VR Workflow
Different VR terminals can be tested to compare performance. The content can also be changed; for instance, sound design can be added for a more immersive experience, and training content can be included in the VR film.

6.2.1.2 Future Work of the Unity VR Workflow
The BIM-to-game-engine model transfer can be further investigated with other software, such as Autodesk 3ds Max, replacing Autodesk Maya as the go-between tool. Model preparation may be optimized both in quality, through polygon optimization, and in efficiency, by writing customized scripts to solve the material conversion problem that currently can only be solved with commercial tools. A more complete game could be built as an IFM tool for the facility management team; it could be linked to a mobile app or email system to receive input from real IFM actions, and it could include multiple scenes for different rooms and places. Additionally, an avatar system with dedicated communication mechanisms could be introduced for a multi-player game, and training scenarios could be included as well.

6.2.1.3 Future Work of the Fuzor VR Workflow
Since the automatic addition of the Excel data into Fuzor was conducted in the desktop version of Fuzor, future work can verify it in the VR mode. After the successful addition of the external database to Fuzor, it is worth studying how to achieve the same for Revit, given that the Excel IFM database added to Fuzor cannot currently be synchronized back to Revit. Moreover, future effort can be made to open a webpage or a PDF directly in Fuzor VR from a URL link inside it. The ability to access multiple kinds of IFM databases directly in Fuzor's VR interface is essential for Fuzor to qualify as a practical IFM tool.

6.2.1.4 Future Work of the AR Test
The direct continuation of this AR test could be achieving the same result with multiple objects instead of one. The AR part can be studied further as the tools develop, because the tools on the market are still maturing. It is feasible to develop an AR tool for IFM because of the match between onsite O&M demands and AR's capacity to operate in the real environment. For instance, it has been mentioned that Vuforia Fusion is useful for unlocking the power of other platforms like ARKit and ARCore, so that functionality is worth exploring. Vuforia's other functionalities, such as the QR and bar code based targets, can also be investigated, and more promising AR hardware like the Microsoft HoloLens can be applied. Moreover, with Unity, a complete AR application with the necessary functionalities, such as wayfinding, problem locating, and a reporting system,
can be built for IFM and tested in real situations. Since the AR test is not BIM based, a tool can be developed in the future to link the BIM and IFM databases with the Vuforia database.

6.2.2 Goals for VR/AR in Design Review and Facility Management
For both AR and VR, user experience is always the key criterion in any evaluation. As mentioned before, cumbersome HMDs remain a restriction on the development of these technologies, especially for VR. Thus, VR and AR on mobile phones, including web based implementations, should be considered in future work. Moreover, mixed reality (MR), the combination of VR and AR, can be researched; for instance, AR can be used as a gateway into VR experiences with a more natural transition. Additionally, more advanced techniques, including the Internet of Things (IoT), cloud computing, robotics, blockchain, machine learning (ML), and artificial intelligence (AI), can be incorporated into future solutions for design review and facility management. For instance, preventative maintenance can be achieved with machine learning and big data, and IoT can be used to connect real and virtual objects. It should also be noted that supply chain management is an essential supporting component of an IFM system; artificial intelligence, for example, can be used to predict potential problems and arrange resources in advance to achieve lean management with higher efficiency.

Chapter Summary
Virtual reality provides an immersive environment, which is important for design review because it simulates the real environment. With additional data, it can also be used for better informed design and facilities management. Building information modeling is widely used in the building industry, and its files can provide both geometry and data. With some scripting, a game engine can be used to bring and link these together, and additional external data can augment the original BIM data. Eventually, a complete database can be managed in the VR scene, which benefits integrated facilities management. The proposed workflows connect the 3D model and its data more closely, so that the virtual space can be transformed from a replica of a real project into a living project document suitable for both design and facility management. Augmented reality adds information onto the real world. Unlike VR, which separates the real and virtual parts of the project, AR brings them together, making it suitable on site; VR is more appropriate when the site cannot easily be accessed. Information visualization is essential to the success of a building project, especially for complex projects like healthcare facilities that require intensive information support for decision making. Both VR and AR provide efficient and immersive ways of visualization and experience.

REFERENCES

Akcamete, A., Liu, X., Akinci, B., & Garrett, J. H. (2011). Integrating and Visualizing Maintenance and Repair Work Orders in BIM: Lessons Learned from a Prototype. Proc. of the 11th International Conference on Construction Applications of Virtual Reality, Weimar, Germany.

Anaconda. (2018). What is Anaconda. Retrieved from https://www.anaconda.com/what-is-anaconda/

Aouad, G., Lee, A., & Wu, S. (Eds.). (2016). Constructing the Future: nD Modelling. Taylor & Francis.

Apple. (2017). Introducing ARKit. Retrieved from https://developer.apple.com/arkit/

Apple. (2017). World Brush. Retrieved from https://itunes.apple.com/us/app/world-brush/id1277410449?mt=8

Arch Virtual. (2014). Learn How Arch Virtual Uses Unity to Visualize the Design Projects of Their Clients. Retrieved from https://unity3d.com/showcase/case-stories/arch-virtual
Aripaka, R. B. (2016). Augmented Reality Indepth Insights. Retrieved from https://www.linkedin.com/pulse/augmented-reality-indepth-insights-raveendra-babu-aripaka

ASIS NEWS. (2017). ASSA ABLOY Showcases Emerging Technologies with Mobile, Augmented Reality, Building Information Modeling. SDM. Retrieved from https://www.sdmmag.com/articles/94368-assa-abloy-showcases-emerging-technologies-with-mobile-augmented-reality-building-information-modeling

Autodesk Knowledge Network. (2014). Introduction to UV Mapping. Retrieved from https://knowledge.autodesk.com/support/maya/learn-explore/caas/CloudHelp/cloudhelp/2015/ENU/Maya/files/UV-mapping-overview-Introduction-to-UV-mapping-htm.html

Billinghurst, M. (2017). Developing AR and VR Experiences with Unity. University of South Australia. Retrieved from https://www.slideshare.net/marknb00/developing-ar-and-vr-experiences-with-unity

Boeykens, S. (2014). Getting BIM Data into Unity. CAD, BIM & 3D. Retrieved from http://cad-3d.blogspot.gr/2014/06/getting-bim-data-into-unity-part-0.html

Boeykens, S. (2015). Getting BIM Data into Unity. CAD, BIM & 3D. Retrieved from http://cad-3d.blogspot.com/2015/02/getting-bim-data-into-unity-part-2-revit.html

Brouchoud, J. (2010). Welcome to My Home: First Attempts at Using Unity3D for Architectural Visualization. Arch Virtual. Retrieved from http://archvirtual.com/2010/01/08/welcome-to-my-home-first-attempts-at-using-unity3d-for-architectural-visualization/

Brouchoud, J. (2016). VR Device Options for Architects. Trombly International. Retrieved from http://www.hypergridbusiness.com/2016/06/vr-device-options-for-architects/

Brouchoud, J. (2016). What's Your VR + Architecture Strategy. Retrieved from https://www.linkedin.com/pulse/whats-your-vr-architecture-strategy-jon-brouchoud

Busta, H. (2015). Three Augmented and Virtual Reality Apps for Design and Construction. Retrieved from http://www.architectmagazine.com/technology/products/three-augmented-and-virtual-reality-apps-for-design-and-construction_o

Chi, H., Kang, S., & Wang, X. Y. (2013). Research Trends and Opportunities of Augmented Reality Applications in Architecture, Engineering, and Construction. Automation in Construction, 33, 116–122.

Donalek, C., Djorgovski, S. G., Cioc, A., Wang, A., Zhang, J., Lawler, E., Yeh, S., Mahabal, A., Graham, M., Drake, A., Davidoff, S., Norris, J. S., & Longo, G. (2014). Immersive and Collaborative Data Visualization Using Virtual Reality Platforms. 2014 IEEE International Conference on Big Data.

Dossick, C. S., & Neft, G. (2009). Organizational Divisions in BIM Enabled Commercial Construction. ASCE Journal of Construction, Engineering and Management, 136, 459–467.

Dunston, P. S., Arns, L. L., & McGlothlin, J. D. (2007). An Immersive Virtual Reality Mock-Up for Design Review of Hospital Patient Rooms. Purdue University.
Eberly, D. H. (2007). 3D Game Engine Design: A Practical Approach to Real-Time Computer Graphics (2nd ed.). San Francisco, CA: Morgan Kaufmann; Elsevier Science.

Endsley, M. R. (1995). Toward a Theory of Situation Awareness in Dynamic Systems. Human Factors: The Journal of the Human Factors and Ergonomics Society, 37(1), 32–64.

Fauvel, C. (2017). Darmstadt Germany: Forge AR-VR-MR Experiments. Forge DevCon 2017. Retrieved from https://www.slideshare.net/Autodesk/forge-devcon-2017-darmstadt-germany-forge-arvrmr-experiments

Friedman, E. (2016). Augmented and Virtual Reality for Architecture, Engineering and Design. BrainXchange LLC. Retrieved from https://brainxchange.events/augmented-virtual-reality-architecture-engineering-design/

Fritsch, D., & Kada, M. (2004). Visualization Using Game Engines. In ISPRS Commission 5. Istanbul, Turkey.

Fu, M. C., & East, E. W. (1999). The Virtual Design Review. Computer-Aided Civil and Infrastructure Engineering, 14, 25–35.

Gaudiosi, J. (2016). Why Valve's Partnership with Unity Is Important to Virtual Reality. Retrieved from http://fortune.com/2016/02/11/valves-partners-with-unity/

GCR, N. (2004). Cost Analysis of Inadequate Interoperability in the US Capital Facilities Industry. National Institute of Standards and Technology (NIST).

Handel, O., Gümüş, E., Papoutsis, E., & Amann, J. (2016). Dynamic Visualization of Pedestrian Simulation Data. Technical University of Munich.

Hanson, E. (2017). CTAN 502A: VR Production Pipelines. University of Southern California.

Hou, L., & Wang, X. (2011). Experimental Framework for Evaluating Cognitive Workload of Using AR System in General Task. Proceedings of the 28th International Symposium on Automation and Robotics in Construction, Seoul, Korea.

JAUNT. (2012). Jaunt Player. Retrieved from https://support.jauntvr.com/hc/en-us/categories/201201366-Jaunt-Player

Jaunt. (2018). A Complete Cinematic VR Solution. Retrieved from https://www.jauntvr.com/technology/

JetBrains. (2018). PyCharm. Retrieved from https://www.jetbrains.com/pycharm/

Kalloc Studios. (2018). Fuzor 2018 API.

Kalloc Studios. (2018). Fuzor, Got AEC Apps: Design, Build, Succeed. Retrieved from https://www.kalloctech.com/index.jsp

Kaurloto, D. (2017). HTC Vive Setup Guide. University of Southern California IT Service.

Klein, G., & Militello, L. (2001). Some Guidelines for Conducting a Cognitive Task Analysis. Advances in Human Performance and Cognitive Engineering Research, 1, 163–199.

Kumar, S., Hedrick, M., Wiacek, C., & Messner, J. I. (2011). Developing an Experienced-Based Design Review Application for Healthcare Facilities Using a 3D Game Engine. Journal of Information Technology in Construction, 16, 85–104.

Lang, B. (2016). Oculus Rift and HTC Vive Roomscale Dimensions Compared. Retrieved from https://www.roadtovr.com/oculus-touch-and-htc-vive-roomscale-dimensions-compared-versus-vs-visualized/

Lee, S., & Akin, Ö. (2009). Shadowing Tradespeople: Inefficiency in Maintenance Fieldwork. Automation in Construction, 18(5), 536–546.

Lee, S., & Akin, Ö. (2011). Augmented Reality-Based Computational Fieldwork Support for Equipment Operations and Maintenance. Automation in Construction, 20, 338–352.

Lily, P. (2016). HTC Vive Review. Retrieved from https://www.wareable.com/vr/htc-vive-review

McGraw-Hill Construction. (2008). Building Information Modeling Trends Smart Market Report. New York.

Microsoft. (2015). C# Programming Guide. Retrieved from https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/index
Milgram, P., & Kishino, F. (1994). A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems, E77-D(12), 1321–1329.

Milideas De Decoracion. (2015). D-Photo Measures an App to Save Really Useful Measurements. Retrieved from https://www.milideas.net/d-photo-measures-una-app-para-guardar-medidas-realmente-util

Military Standard 1691. (2017). Product Search Page. Retrieved from https://ms1691.facilities.health.mil/milstd1691/#/

MonoDevelop. (2017). Cross Platform IDE for C#, F# and More. Retrieved from https://docs.unity3d.com/Manual/MonoDevelop.html

NASA. (2017). Viking Lander – STL. NASA. Retrieved from https://nasa3d.arc.nasa.gov/detail/viking-lander

NVIDIA. (2018). VRWorks - 360 Video. Retrieved from https://developer.nvidia.com/vrworks/vrworks-360video

Odom, J. (2017). ARKit & Machine Learning Come Together Making Smarter Augmented Reality. Next Reality. Retrieved from https://next.reality.news/news/apple-ar-arkit-machine-learning-come-together-making-smarter-augmented-reality-0179507/

Pathak, K. (2017). The Best iOS 11 ARKit Apps for iPhone and iPad You Should Try. iPhone Hacks. Retrieved from http://www.iphonehacks.com/2017/09/best-ios-11-arkit-apps-iphone-ipad.html

Patrao, B., Pedro, S., & Menezes, P. (n.d.). How to Deal with Motion Sickness in Virtual Reality. Retrieved from http://scitecin.isr.uc.pt/Proceedings/Papers/EPCGI/17.pdf

Paula. (2017). Augmented Reality 101: All about the Technology behind ARCore and ARKit. Retrieved from https://www.wikitude.com/blog-augmented-reality-google-arcore-arkit-apple/

PCGamer. (2015). SteamVR — Everything You Need to Know. Retrieved from http://www.pcgamer.com/steamvr-everything-you-need-to-know/

Penttilä, H. (2006). Describing the Changes in Architectural Information Technology to Design Complexity and Free Form Expression. Journal of Information Technology in Construction, 11, 395–408.

Peter, E., Heijligers, B., Kievith, J. D., Razafindrakoto, X., Oosterhout, R. V., Santos, C., Mayer, I., & Louwerse, M. (2016). Design for Collaboration in Mixed Reality: Technical Challenges and Solutions. 978-1-5090-2722-4.

Philipp, M. (2012). The Power of Interoperability: Bringing Revit Models into Unity. Retrieved from http://www.studica.com/blog/the-power-of-interoperability-bringing-revit-models-into-unity

Porter, M. E., & Heppelmann, J. E. (2017). The Battle of the Smart Glasses. Retrieved from https://hbr.org/2017/11/a-managers-guide-to-augmented-reality

Prabhu, S. (2017). Augmented Reality SDK Comparison: Parameters to Select the Right AR SDK. ARreverie Technology. Retrieved from http://www.arreverie.com/blogs/augmented-reality-sdk-comparison-parameters/

Prismetric. (2017). Google ARCore Vs Apple ARKit – The Release to Make Augmented Reality Mainstream. Retrieved from https://www.prismetric.com/google-arcore-vs-apple-arkit-release-make-augmented-reality-mainstream/

Professional Constructor Central. (2012). World Workplace and BIM. IFMA. Retrieved from https://buildinginformationmanagement.wordpress.com/2012/11/02/ifma-2012-world-workplace-and-bim/

Royal BAM Group. (2018). Virtual Reality (VR) & Augmented Reality (AR). Retrieved from http://www.bamireland.ie/divisions/bam-fm/innovation/virtual-reality-and-augmented-reality.htm
Sarconi, P. (2017). How to Shoot a 360 Video. Retrieved from https://www.wired.com/2017/02/shoot-360-video/

Selezan, D., & Mao, C. (2016). Integration of BIM and Facility Maintenance: What Does the FM Crew Really Need? Autodesk University 2016.

SEPS2BIM. (2017). BIM: Equipment Objects, Space Templates & Departments Healthcare Space and Equipment Planning. Retrieved from http://seps2bim.org/

Shi, Y. M., Du, J., Lavy, S., & Zhao, D. (2016). A Multiuser Shared Virtual Environment for Facility Management. Procedia Engineering, 145, 120–127.

Shiratuddin, M. F. (2009). A Framework for Design Review in a Virtual Environment: Using Context Aware Information Processing. VDM Verlag. ISBN-13: 978-3639172850.

TechTarget. (2018). RESTful API. Retrieved from http://searchmicroservices.techtarget.com/definition/RESTful-API

Teicholz, E. (2004). Bridging the AEC Technology Gap. IFMA Facility Management Journal.

The Revit Kid. (n.d.). Virtual Reality and Revit in the Real-World. Retrieved from http://therevitkid.blogspot.com/2017/03/virtual-reality-and-revit-in-real-world.html

Tillage, T. (2017). ARKit Fundamentals in iOS 11. Retrieved from https://www.captechconsulting.com/blogs/arkit-fundamentals-in-ios-11

Turbakiewicz, J. (2016). Advanced Methods of Data Processing and Visualization for Building Information Modeling. Poznan University of Technology. Retrieved from http://buildingsmart.pl/mgr/2016Turbakiewicz.pdf

Turbosquid. (2018). 3D Models for Professionals. Retrieved from https://www.turbosquid.com/

Ulrich, R., Quan, X., Zimring, C., Joseph, A., & Choudhary, R. (n.d.). The Role of the Physical Environment in the Hospital of the 21st Century: A Once-in-a-Lifetime Opportunity. Funded by the Robert Wood Johnson Foundation.

Unite Austin. (2017). Intro to Vuforia AR Integration in Unity 2017.2. Unity. Retrieved from https://www.youtube.com/watch?v=yIvQSrPEtIY

Unity. (2017). Unity Community. Unity Technologies. Retrieved from https://unity3d.com/community

Unity. (2017). Unity User Manual (2017.1). Retrieved from https://docs.unity3d.com/Manual/MonoDevelop.html

Unity. (2017). Unity User Manual (2017.1). Retrieved from https://docs.unity3d.com/Manual/HOWTO-UIFitContentSize.html

Virtual Xperience Inc. (2017). CBS – Virtual Reality Check. Retrieved from https://www.virtual-xperience.com/cbs-virtual-reality-check/

VRTK. (2017). VRTK Documentation. Retrieved from https://vrtoolkit.readme.io/docs

Vuforia. (2017). Do Model Target Needs Exact Same CAD Model or Geometry Can Vary. Retrieved from https://developer.vuforia.com/forum/model-targets/do-model-target-needs-exact-same-cad-model-or-geometry-can-vary

Vuforia. (2017). Vuforia: Features. PTC Inc. Retrieved from https://www.vuforia.com/features.html

Vuforia. (2018). Introduction to Model Targets in Unity. Retrieved from https://library.vuforia.com/content/vuforia-library/en/articles/Solution/introduction-model-targets-unity.html

Vuforia. (2018). Model Targets User Guide. PTC Inc. Retrieved from https://library.vuforia.com/articles/Solution/model-targets-user-guide.html
Vuforia. (2018). Object Recognition. PTC Inc. Retrieved from https://library.vuforia.com/content/vuforia-library/en/articles/Training/Object-Recognition.html

Wang, X. Y., Love, P. E. D., Kim, M. J., Park, C., Sing, C., & Hou, L. (2013). A Conceptual Framework for Integrating Building Information Modeling with Augmented Reality. Automation in Construction, 34, 37–44.

Wu, W., & Kaushik, I. (2015). Design for Sustainable Aging: Improving Design Communication through Building Information Modeling and Game Engine Integration. Procedia Engineering, 118, 926–933.

Zhong, W. (2017). More with Forge Viewer WebVR: Part 1 Support Extra three.js Mesh in WebVR. Retrieved from https://forge.autodesk.com/blog/more-forge-viewer-webvr-part-1-support-extra-threejs-mesh-webvr

Zhong, W. (2017). New Experience of AR/VR under the Forge Platform. Forge DevCon 2017, Autodesk University.

Zhong, W., & Kevin, V. (2017). Different Tools to Implement Your WebVR Application. Forge DevCon 2017, FDC126343.
Abstract
Building information modeling (BIM) is often used in the AEC (architecture/engineering/construction) industry throughout the building’s lifecycle, including design, operation and maintenance (O&M). Virtual reality (VR) can be used for real-time simulation of a user's physical presence in a 3D interactive environment. Augmented reality (AR) can be used to overlay virtual information onto the real environment. Both can augment BIM capabilities. A healthcare patient room was selected for demonstration because of its future potential, as part of a complex building type, for VR-based design and integrated facility management (IFM). Three VR workflows (cinematic VR, Revit + Unity, and Revit + Fuzor) were explored for immersive project presentation through VR storytelling, interactive design review, and management of O&M data. An AR test (Unity + Vuforia) was conducted to visualize related information for onsite IFM support after recognizing an object. ❧ In the first workflow, a linear VR film was made with Revit, Maya, Mettle Skybox, Handbrake, Redshift, and After Effects to present the project through an Oculus Rift HMD for a seated VR experience. ❧ In the second workflow, parameter data including the equipment’s manufacturer, cost, and website address was added to a project file in Revit. The geometry was exported to Unity using Maya as a go-between, and the data was extracted from the Revit schedule to a text file. In Unity, the data was parsed based on a unique identifier and its data structure, then linked back to the equipment through C# scripting using Unity's panel-based user interface. Through customized programming, the outcome was a room in VR where equipment information could appear on canvas panels in response to the user's controller input in real time. The HTC Vive, a head-mounted display (HMD), was used to provide users with a walk-through VR experience. ❧ Although Unity is widely used in the game and film industries and a free version is available, extra time and effort are spent on model modification in external software such as Maya and on model import and export. Hence, the third workflow was created, including Revit, a database of the equipment in Excel, Fuzor, and a customized link written in Python using the Fuzor Application Program Interface (API). Fuzor was chosen because it is a game-engine-based solution specifically tailored to the building industry. Moreover, other features already existing in Fuzor are beneficial for IFM, including an embedded system of annotation, category customization, and colorization of layers to quickly differentiate various types of building information. Fuzor also has bi-directional synchronization with the Revit model and supports a VR environment, which requires extra effort to establish in Unity. This synchronization with the Revit model, without the need for additional software, makes it suitable for IFM, as live BIM updates are needed during O&M. An XML tool was developed to link the Excel sheet to the Revit and Fuzor models instead of using the inherent Revit parameters as in the second workflow. Packaged into a Python script, this tool can automate both the data extraction and the import of the data into Fuzor's parameter window. The tool has immediate application potential because the Excel file can serve as an IFM database in a real project, such as a maintenance or repair log.
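As a rough illustration of the second workflow's data link, the C# sketch below parses a schedule export and writes the matching record to a canvas panel. This is not the thesis's actual script: the tab separator, the field order (identifier, manufacturer, cost, URL), and the class and member names are assumptions for illustration only.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: build a lookup table from a tab-separated Revit schedule
// export keyed by each element's unique identifier, then show the matching
// record on a Unity canvas panel when requested.
public class EquipmentInfoPanel : MonoBehaviour
{
    public TextAsset scheduleExport;  // text file exported from the Revit schedule
    public Text panelText;            // Text component on the VR canvas panel

    private readonly Dictionary<string, string[]> records =
        new Dictionary<string, string[]>();

    void Awake()
    {
        // Assumed layout: uniqueId <tab> manufacturer <tab> cost <tab> URL
        foreach (string line in scheduleExport.text.Split('\n'))
        {
            string[] fields = line.TrimEnd('\r').Split('\t');
            if (fields.Length >= 4)
                records[fields[0]] = fields;
        }
    }

    // Call with the ID stored on an equipment GameObject, e.g. from a
    // controller raycast hit, to display its data in real time.
    public void ShowInfo(string uniqueId)
    {
        string[] f;
        if (records.TryGetValue(uniqueId, out f))
            panelText.text = "Manufacturer: " + f[1] +
                             "\nCost: " + f[2] +
                             "\nURL: " + f[3];
    }
}

In use, such a script would be attached to a scene object, with the exported text file and the panel's Text component assigned in the Inspector.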
❧ A simple test based on Vuforia and Unity was conducted as a very preliminary exploration of AR. After several trials, including using the object target in Vuforia, the model target was applied to recognize the shape of a laptop based on the corresponding pre-loaded 3D model and to overlay related information on top of the laptop through a webcam connected to the computer. This approach can be applied on site for facility management to better support the visualization of useful information, including training videos. ❧ All three VR workflows and the AR test were evaluated and analyzed, and overall conclusions were drawn. Finally, recommendations for future work were made, including further AR workflow development based on the AR test, a toolkit to smooth the transition from Revit to Unity, and exploration of other VR- and AR-based workflows.
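For the AR test described above, a Unity handler would typically follow Vuforia's standard ITrackableEventHandler pattern of that SDK generation (Vuforia 6/7 for Unity 2017). The sketch below is an assumed, minimal version that toggles a hypothetical information panel while the laptop model target is tracked; "infoPanel" and the class name are illustrative, not from the thesis.

using UnityEngine;
using Vuforia;

// Minimal sketch: show an information overlay whenever the model target
// (the laptop) is recognized by the webcam, and hide it otherwise.
public class ModelTargetInfoOverlay : MonoBehaviour, ITrackableEventHandler
{
    public GameObject infoPanel;  // hypothetical canvas with equipment data, training video link, etc.

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
        if (infoPanel != null)
            infoPanel.SetActive(false);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // Treat detected, tracked, and extended-tracked states as "visible".
        bool tracked = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED ||
                       newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        if (infoPanel != null)
            infoPanel.SetActive(tracked);
    }
}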
Linked assets
University of Southern California Dissertations and Theses
Conceptually similar
Using building information modeling with augmented reality: visualizing and editing MEP systems with a mobile augmented reality application
CFD visualization: a case study for using a building information modeling with virtual reality
Visualizing architectural lighting: creating and reviewing workflows based on virtual reality platforms
BIM+AR in architecture: a building maintenance application for a smart phone
Data visualization in VR/AR: static data analysis in buildings
Streamlining sustainable design in building information modeling: BIM-based PV design and analysis tools
A BIM-based visualization tool for facilities management: fault detection through integrating real-time sensor data into BIM
Augmented reality in room acoustics: a simulation tool for mobile devices with auditory feedback
Multi-domain assessment of a kinetic facade: determining the control strategy of a kinetic façade using BIM based on energy performance, daylighting, and occupants’ preferences
Building information modeling: guidelines for project execution plan (PxP) for India
Quantify human experience: integrating virtual reality, biometric sensors, and machine learning
Visualizing thermal data in a building information model
Building bridges: filling gaps between BIM and other tools in mechanical design
MM Electrical Tool: a tool for generating electrical single line diagrams in BIM
Revit plugins for electrical engineering improvements in buildings: Lighting power density and electrical equipment placement
Planning in advance for rehabilitation and restoration using BIM and seismic simulation to record and analyze the Japanese House in Los Angeles
Lateral design with mass timber: examination of structural wood in high-rise timber construction
Energy simulation in existing buildings: calibrating the model for retrofit studies
Net zero energy building: the integration of design strategies and PVs for zero-energy consumption
Environmentally responsive buildings: multi-objective optimization workflow for daylight and thermal quality
Asset Metadata
Creator
Yang, Zhan (author)
Core Title
Building information modeling based design review and facility management: Virtual reality workflows and augmented reality experiment for healthcare project
School
School of Architecture
Degree
Master of Building Science
Degree Program
Building Science
Publication Date
07/29/2018
Defense Date
04/26/2018
Publisher
University of Southern California (original), University of Southern California. Libraries (digital)
Tag
3D game engine,augmented reality (AR),building information modeling (BIM),design review of healthcare facility,integrated facility management (IFM),OAI-PMH Harvest,virtual reality (VR)
Format
application/pdf (imt)
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Kensek, Karen (committee chair), Hanson, Eric (committee member), Moser, Margaret (committee member)
Creator Email
yz579tovey@gmail.com, zhanyang@usc.edu
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-c89-40429
Unique identifier
UC11670813
Identifier
etd-YangZhan-6563.pdf (filename), usctheses-c89-40429 (legacy record id)
Legacy Identifier
etd-YangZhan-6563.pdf
Dmrecord
40429
Document Type
Thesis
Rights
Yang, Zhan
Type
texts
Source
University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions
The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name
University of Southern California Digital Library
Repository Location
USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA