Using Building Information Modeling with Augmented Reality: Visualizing and Editing MEP Systems with a Mobile Augmented Reality Application

by

Zeyu Lu

A Thesis Presented to the
FACULTY OF THE USC SCHOOL OF ARCHITECTURE
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
MASTER OF BUILDING SCIENCE

MAY 2020

Copyright 2020 Zeyu Lu

ACKNOWLEDGEMENTS

I would like to express my deep gratitude to my parents, who have supported my three-year study at USC. I really appreciate the valuable help from Professor Karen Kensek (kensek@usc.edu, USC School of Architecture) with defining and developing the topic, encouraging me to be more creative, and teaching me how to accomplish the research by decomposing it into simple parts. I would also like to express my appreciation to Professor Margaret Moser (mmoser@cinema.usc.edu, USC School of Cinematic Arts) for her continued guidance, for supporting me throughout the research, and for solving technical problems in Unity. I also truly want to thank both Professor Vangelis Lympouridis (vangelis@enosisvr.com, USC Viterbi School of Engineering) and Charles S. Dwyer, my USC mentor (Charles.S.Dwyer@usace.army.mil, U.S. Army Corps of Engineers), for their support and advice drawn from their different areas of expertise. Lastly, I want to thank all the students and faculty in our MBS family for supporting and helping me during these two years and throughout the research and development of this thesis.

TABLE OF CONTENTS

Acknowledgements
Table of Contents
List of Tables
List of Figures
Abstract
Chapter 1
1. Introduction
1.1 Building Information Modeling
1.1.1 Building Information Modeling
1.1.2 BIM and CAD
1.1.3 nD environment BIM
1.1.4 BIM maturity levels
1.1.5 Industry standardization and interoperability
1.2 Facility Management
1.2.1 Facility Management
1.2.2 FM Systems
1.2.3 BIM and FM
1.3 VR, AR, MR
1.3.1 Virtual Reality
1.3.2 Augmented Reality
1.3.3 Mixed Reality
1.3.4 Hardware and Software Development Kit (SDK) for AR/MR
1.3.5 Game Engine
1.4 AR/MR implementations
1.5 Development Tool
1.6 Summary
Chapter 2
2. Background and Literature Review
2.1 BIM and FM
2.1.1 The potential of BIM and FM
2.1.1.1 Space analysis and management
2.1.1.2 Renovation/retrofit planning and feasibility
2.1.1.3 Integrate BIM and FM database
2.1.1.4 Real-time localization of building components
2.1.1.5 Personnel Training and Development
2.1.2 Challenges and limitations
2.1.3 BIM and FM Summary
2.2 BIM and AR/MR
2.2.1 Augmented Reality and BIM
2.2.2 Using BIM and AR for construction
2.2.3 Using BIM and AR for architectural design
2.3 BIM, AR, and FM
2.4 Limitations of transferring BIM data to AR/MR
2.4.1 Complex process of transferring BIM data to the AR environment
2.4.2 Poor real-time data exchange between BIM software and the AR environment
2.4.3 Imprecise position tracking to place virtual objects in the real environment accurately
2.4.4 Limited computing speed and system memory on mobile devices
2.5 Current BIM-based AR software on the market
2.5.1 Unity AR tool with BIM data
2.5.2 Unreal 4 VR tool with BIM data
2.6 Summary
Chapter 3
3. Methodology
3.1 Introduction of the workflow
3.1.1 Description of tasks
3.1.2 Software selection
3.2 Building Information Model and BIM data exporting (BIM to AR)
3.2.1 Define BIM objects
3.2.2 Add new parameter type to store object location
3.2.3 Transfer Revit data to Unity
3.2.3.1 Export Revit objects' geometries to FBX for Unity
3.2.3.2 Modify Revit FBX in Autodesk 3ds Max (convert Autodesk materials to standard materials & reset object pivots)
3.2.3.3 Export Revit objects' metadata for use in Unity
3.3 Integrate BIM with Unity for AR development (Create AR in Unity)
3.3.1 Geometry management in Unity
3.3.2 Parsing Revit metadata
3.3.3 Designing Unity User Interface (UI) + user input
3.3.4 Creating interactions
3.3.4.1 Changing objects' position
3.3.4.2 Changing the values of some parameters
3.3.5 Generating Files and Developing the web backend request
3.3.6 Develop iOS AR applications and set tracking images' locations
3.4 Use AR in the real world
3.5 Update BIM
3.6 Summary
Chapter 4
4. Program Development
4.1 Prepare Building Information Model Metadata
4.1.1 Add coordinate information with Dynamo
4.1.2 Export Revit metadata to CSV with Dynamo
4.2 Develop Mobile AR Application: TransBIM
4.2.1 Unity Scripting Overview
4.2.2 Define ScriptableObject
4.2.3 ReadCSV.cs (Parse Revit metadata)
4.2.3.1 Set variables
4.2.3.2 Read the CSV file into an array of strings
4.2.3.3 Find the GameObject's relevant Revit metadata by matching ElementID
4.2.3.4 Store the GameObject's relevant Revit information in a dictionary
4.2.3.5 Create a ScriptableObject based on the GameObject category and assign values
4.2.3.6 Add tags in the Unity project
4.2.3.7 Load the asset file in the mobile application and assign the GameObject a tag
4.2.4 Designing the User Interface
4.2.4.1 Unity "ScrollView" Component
4.2.4.2 Buttons and "ButtonFunctions"
4.2.5 ClickObject.cs
4.2.5.1 Set Variables
4.2.5.2 Assign references (values) to the predefined variables
4.2.5.3 Select (Touch) a GameObject
4.2.5.4 AddDatatoParaDict() function
4.2.5.5 UpdateChangeParaUI()
4.2.5.6 UpdatePropertyWindow()
4.2.5.7 CreateGizmos()
4.2.6 ReferenceObject.cs (Change object's parameter and record the changes)
4.2.7 Create the Text File
4.2.8 Upload Text File to Dropbox
4.2.9 ARFoundation and ARKit (locate the model on site)
4.2.10 Summary
4.3 Update BIM with Modified Metadata
4.4 Summary
Chapter 5
5. Case Study at Watt Hall
5.1 Watt Hall Revit Model and Asset Preparation
5.1.1 Case study scale (Watt Hall third floor)
5.1.2 Revit model and MEP as-built based on point cloud
5.1.3 Prepare the FBX file and CSV file (BIM geometry and metadata)
5.2 Build TransBIM on iOS
5.2.1 Import FBX and CSV file into the TransBIM project
5.2.2 Set model pivot, on-site image, and custom scan images
5.2.3 Run the application in Unity Editor to generate asset files
5.2.4 Create a Dropbox connection
5.2.5 Build the application to iOS with Xcode
5.3 Case Study: Use TransBIM at Watt Hall Third Floor
5.3.1 Welcome panel
5.3.2 Scan the images
5.3.3 Load Model
5.3.4 Filter object
5.3.5 Select object
5.3.6 Change the object's color, hide and unhide objects
5.3.7 View object's parameter
5.3.8 Change object's parameter (movement & sizes)
5.3.9 Upload the changed data to Dropbox
5.4 Update Revit model
5.5 Summary
Chapter 6
6. Discussion and Future Work
6.1 Background and research methodology
6.2 Evaluation of current workflow
6.3 Discussion of results: examination of tool features
6.4 Limitations
6.4.1 Problems with the workflow
6.4.2 Usability of the current prototype
6.4.3 Building components isolated in TransBIM
6.5 Future Work
6.5.1 Code improvement
6.5.2 TransBIM user experience improvement
6.5.3 Real-time BIM synchronization
6.5.4 TransBIM visionary development for FM
6.6 Conclusion
References
Appendix A: ScriptableObject
A.1 ductParameter.cs
A.2 conduitParameter.cs
A.3 wallParameter.cs
A.4 otherParameter.cs
Appendix B: Main Unity Scripts (Unity MonoBehaviour Class)
B.1 ReadCSV.cs
B.2 ClickObject.cs
B.3 ReferenceObject.cs
B.4 TrackedImageInfoMultipleManager.cs
Appendix C
C.1 SwichOver.cs
C.2 AddTag.cs
C.3 AssignParameter.cs
C.4 InventoryControl.cs
C.5 StoreHeight.cs
C.6 UploadFile.cs

LIST OF TABLES
Table 1.1 nD BIM (a portion of the content in Table 1.1 is adapted from Pärn et al., 2017)
Table 1.2 BIM maturity levels (BIM Wiki, 2019)
Table 1.3 Fundamental LOD Definitions
Table 1.4 File formats supported by Autodesk Revit Architecture
Table 1.5 A brief history of VR (Dom, 2018; Gutiérrez A., Vexo, & Thalmann, 2008; Mihelj et al., 2014)
Table 1.6 A list of examples of AR/MR Headsets and Smartglasses
Table 1.7 AR SDK Comparison (https://thinkmobiles.com/blog/best-ar-sdk-review/)
Table 1.8 AR in different industries
Table 2.1 Examples of some mobile tools using BIM for FM
Table 3.1 A list of C# scripts used in the tool development
Table 4.1 A list of C# scripts used in the tool development
Table 4.2 The descriptions of the GameObjects in the Canvas prefab
Table 4.3 All Button functions
Table 4.4 Features of the "ClickObject.cs"
Table 4.5 Variables (Parameters) in the "ClickObject.cs"
Table 4.6 Actions when selecting an object
Table 4.7 Two actions to manipulate the previously selected object
Table 4.8 Add relevant information to orginTsDict
Table 4.9 Add relevant information to ChangeParaDict based on the object's category and family
Table 4.10 UI components in the "Change Parameter window"
Table 4.11 Duplicate UI elements and assign information based on the field types in an asset file
Table 4.12 ReferenceObject.cs Variables
Table 5.1 The summary of TransBIM features
Table 6.1 The summary of TransBIM features
Table 6.2 The summary of Dynamo scripts and features

LIST OF FIGURES

Figure 1.1 Revit disciplines, categories, and family types (Autodesk Revit mechanical advanced sample)
Figure 1.2 Revit duct object parameters
Figure 1.3 BIM maturity levels, U.K. (Tekla, 2019)
Figure 1.4 IFC opened in an IFC Viewer program (BIMer)
Figure 1.5 IFC file in Internet Explorer
Figure 1.6 BIM-BAS integration in EcoDomus FM
Figure 1.7 BIM 360 Glue iPad app allows users to review, annotate, and send notifications to team members
Figure 1.8 Technological components for an AR system (Wang et al., 2009)
Figure 1.9 Markerless Augmented Reality applications
Figure 1.10 Marker-based AR
Figure 1.11 Reality-Virtuality Continuum (Milgram & Kishino, 1994)
Figure 1.12 Face rendering example: (a) the 3D vertices, (b) the triangulated mesh, and (c) the final smoothed mesh
Figure 1.13 Game Engine Comparison (Peter et al., 2016; Yang et al., 2017)
Figure 1.14 Unity User Interface
Figure 1.15 Pokémon Go
Figure 1.16 The hierarchy of namespace, class, and method (left) and the syntax of the ZoomToFit method (right)
Figure 1.17 Replace materials by batch processing in a Revit plugin (Polynine Studio, 2018)
Figure 1.18 An example of creating sheets and setting sheet names and numbers
Figure 1.19 Parametric bridge stadium design with Dynamo
Figure 2.1 Integrating BIM for FM energy analysis and building retrofit (K. Kensek, 2015)
Figure 2.2 Veterans Administration Space and Equipment Planning (SEPS)
Figure 2.3 Comparison between original FM workflow (upper) and BIM-based workflow (lower)
Figure 2.4 Integrated BIM provides FM personnel with various information (Kassem et al., 2015)
Figure 2.5 Interface in EcoDomus with all integrated data (K. Kensek, 2015)
Figure 2.6 Example of BIM-FM system for real-time location reference (Lin et al., 2014)
Figure 2.7 Data and Process Requirements to Support BIM-Enabled Facilities Management
Figure 2.8 A BIM-based AR application (ACCEPT) used in construction (Ratajczak et al., 2019)
Figure 2.9 Examples of tasks showing how a BIM-based AR application can be used for construction
Figure 2.10 Trimble's Hard Hat: visualizing the mixed virtual building systems on site
Figure 2.11 Building planning visualization in the physical environment
Figure 2.12 Placing the 3D models by discipline on 2D drawings
Figure 2.13 Kubity Go puts a linked Revit model on a detected physical surface
Figure 2.14 Identify a physical object and show facility information in AR (J. Wang et al., 2014)
Figure 2.15 A screenshot of the ARWindow scenario
Figure 2.16 A Mobile Augmented Reality method for accessing building information: InfoSPOT
Figure 2.17 Workflow of Revit-Unity (Handel et al., 2016)
Figure 2.18 A BIM-based and Location-based Augmented Reality system (Ratajczak et al., 2019)
Figure 2.19 Recommended file formats for exporting from different software to Unity (Boeykens, 2014)
Figure 2.20 BIM data synchronization (Unity Reflect, 2020)
Figure 2.21 BIM data synchronization (Unity Reflect, 2020)
Figure 2.22 A rendered view in Twinmotion after assigning new materials and changing weather data
Figure 3.1 Interaction between the user, Revit, and Unity 3D (BIM data circle)
Figure 3.2 Methodology
Figure 3.3 Workflow of exporting BIM data
Figure 3.4 Disciplines in Revit (Revit mechanical basic sample project)
Figure 3.5 Necessary Revit objects to be exported
Figure 3.6 Find the location of selected elements (either Line or Point)
Figure 3.7 Adding a new project parameter type to each object based on its location type (line or point)
Figure 3.8 Assign object spatial information to the newly created parameters
Figure 3.9 Results of adding the object's location information
Figure 3.10 Check Revit Element ID by Dynamo
Figure 3.11 FBX hierarchy in Unity
Figure 3.12 Importing the FBX exported from Revit to Unity
Figure 3.13 Base point of the project coordinate system in Revit, Dynamo, and Unity
Figure 3.14 A pipe's pivot shown at the project base point location instead of on the pipe itself
Figure 3.15 Floor rotation based on its "Pivot"
Figure 3.16 Organize FBX file by entities
Figure 3.17 Check or reload objects' materials
Figure 3.18 Convert Autodesk materials to standard materials (1)
Figure 3.19 Convert Autodesk materials to standard materials (2)
Figure 3.20 Comparison of Revit and 3ds Max FBX files in Unity
Figure 3.21 A pipe's pivot shown at the project base point location instead of on the pipe itself
Figure 3.22 Before resetting the object's pivot, the line-based object's pivot is outside of the geometry
Figure 3.23 Reset the object pivot to the center of the object in 3ds Max
Figure 3.24 IFC data opened in Notepad
Figure 3.25 A test of exporting duct information by Revit schedule
Figure 3.26 Export Revit metadata (element ID and parameters) to Excel in Dynamo
Figure 3.27 Results of the exported metadata in Excel
Figure 3.28 Integrate BIM data in Unity (mobile AR application development)
Figure 3.29 Import the FBX from Revit and set the import settings
Figure 3.30 Unpack the imported prefab
Figure 3.31 Add script components to all FBX entities
Figure 3.32 Drag the FBX file prefab into the scene
Figure 3.33 Set the overall pivot point of the Watt Hall prefab at the center of a door to scan the first image
Figure 3.34 Set the tracking images' positions in both a virtual model and a real building
Figure 3.35 Create an empty GameObject
Figure 3.36 Move the empty GameObject to the center of a door
Figure 3.37 Make the empty GameObject the parent of the unpacked FBX model
Figure 3.38 Save the "TransBIM Model" as the final prefab and put it in the ARSessionOrigin
Figure 3.39 Match the Element ID of the metadata and imported geometry
Figure 3.40 Parsed data of one object shown when clicking an object (duct)
Figure 3.41 Overview of designing the application UI system and input functions
Figure 3.42 Revit property window (Left), developed AR tool property window (Right)
Figure 3.43 TextRow prefab
Figure 3.44 InputRow prefab
Figure 3.45 CheckRow prefab
Figure 3.46 OptionRow prefab
Figure 3.47 The prefabs of UI components were duplicated to show the selected object's parameter information
Figure 3.48 Methodology of creating interaction functions
Figure 3.49 Select a conduit and a window to show a controlled line and a host point (objects' geometries hidden)
Figure 3.50 Adding an object's location information to object parameters in Revit
Figure 3.51 A GameObject's Pivot position (Right); a GameObject's Revit Coordinate information (Left)
Figure 3.52 The UI component prefab shows the object's parameter information
Figure 3.53 Change duct size in Revit
Figure 3.54 Scale in non-longitudinal directions
Figure 3.55 Overview of generating text file & web request
Figure 3.56 Overview of developing the AR experience
Figure 3.57 Download ARFoundation, ARKit XR Plugin, and other XR packages in Unity Package Manager
Figure 3.58 Basic AR scene hierarchy
Figure 3.59 Add AR Session and AR Session Origin
Figure 3.60 ARSession and scripts (Left); ARSessionOrigin and scripts (Right)
Figure 3.61 ARSessionOrigin with all the trackable managers (Left); ARSessionOrigin in the scene (Right)
Figure 3.62 Tracked images' locations in Unity World Coordinate
Figure 3.63 Define custom scan images in "ReferenceImageLibrary"
Figure 3.64 First Image and Second Image locations
Figure 3.65 Set tracking images' positions in both the virtual model and the real building
Figure 3.66 Workflow of using the developed application
Figure 3.67 Scan images for locating the BIM model
Figure 3.68 Visualize HVAC information
Figure 3.69 Change duct size (Left), change duct location (Right)
Figure 3.70 Generating modified BIM data and uploading it to Dropbox
Figure 3.71 TransBIM Features
Figure 3.72 Workflow of updating BIM with modified BIM data
Figure 3.73 Methodology
Figure 4.1 Process steps with source code are shown translucent
Figure 4.2 Scripts used for each function
Figure 4.3 Revit element location (a pipe is line-based and a door is host-based)
Figure 4.4 Adding a new project parameter type to each object based on its location type (line or point)
Figure 4.5 Assign object spatial information to the newly created parameters
Figure 4.6 Results of adding objects' locations
Figure 4.7 Export Revit metadata (element ID and parameters) to Excel in Dynamo
Figure 4.8 Results of the exported metadata in Excel
Figure 4.9 Scripts used for each function
Figure 4.10 Add <Components> to the GameObject under the FBX hierarchy
Figure 4.11 Create Scripts in the "Assets\Addin_scripts" folder
Figure 4.12 Attach "Mono Script" components to each Revit Object under the FBX hierarchy
Figure 4.13 Script derived from MonoBehaviour Class
Figure 4.14 Order of Execution for Event Functions (Partial Diagram)
Figure 4.15 The result of creating an asset file that stores the object's Revit metadata for each GameObject
Figure 4.16 The ScriptableObjects for different categories (e.g. ductParameter.cs)
Figure 4.17 Define the ScriptableObject for the duct category
Figure 4.18 Define the ScriptableObject for the conduit category
Figure 4.19 Define the ScriptableObject for objects (not "Walls", "Ducts", "Conduits", and "Pipes")
Figure 4.20 Examples of assets created by the different ScriptableObject types
Figure 4.21 Set fields to store important information
Figure 4.22 CSV "Revit_metadata_1201.csv" in Unity Asset/Resources Folder
Figure 4.23 Read the CSV file only one time and store the information into a static string[]
Figure 4.24 Debug results show an example of data[0] information (first row of CSV)
Figure 4.25 Match the Element ID of the metadata and imported geometry
Figure 4.26 Partial debug results of the static string[] data
Figure 4.27 Extract ElementID from the GameObject's name and assign it to the "ObjectID" variable
Figure 4.28 Find the GameObject's metadata based on matching ElementID
Figure 4.29 An example of a duct object assigned metadata into the parameterArray variable
Figure 4.30 Slice the information from the array "parameterArray" into the dictionary "paraDict"
Figure 4.31 Create an asset file (ScriptableObject) based on category type and assign values to the asset file
Figure 4.32 Create and save the asset file with the object's parameter information based on the object's category
Figure 4.33 AssignParameter.cs defines functions to assign dictionary data to a ScriptableObject instance
Figure 4.34 DictToDuctAsset() function (Partial)
Figure 4.35 Results of each asset file with assigned information
Figure 4.36 Filter objects by categories
Figure 4.37 Add tags in the Unity project
Figure 4.38 "AddTag.cs" and CreateTag() function
Figure 4.39 The results of tags shown in the Unity Tag Manager
Figure 4.40 Tags assigned to each GameObject and all created tags
Figure 4.41 Using the #define directive to call Unity Editor scripts from your game code
Figure 4.42 Load the asset file and find category information from the loaded asset file
Figure 4.43 FindCategory() function to get the GameObject's category from its asset file
Figure 4.44 Result of assigning a tag to each GameObject based on its category name
Figure 4.45 Overview of the TransBIM UI system and input functions
Figure 4.46 The application UI components in the Canvas prefab (1)
Figure 4.47 The application UI components in the Canvas prefab (2)
Figure 4.48 Create the Unity built-in UI component ScrollView
Figure 4.49 <GridLayoutGroup> component in the "ParaContent" of the "InventoryMenu"
Figure 4.50 InventoryControl.cs
Figure 4.51 TextRow prefab (Left) and CheckRow prefab (Right)
Figure 4.52 The UI contents in the "ParaContent" in each "ScrollView" window
Figure 4.53 Automatically generate a corresponding number of UI elements in the "Property window"
Figure 4.54 FilterBox() function in the InventoryControl.cs
Figure 4.55 Automatically generate UI elements based on Categories
Figure 4.56 Turn off the visibility of "Floor" GameObjects
Figure 4.57 Button Component and OnClick() function for "Button Property" (SwitchOver.switchProperty)
Figure 4.58 Button Component and OnClick() function for "Button Category" (SwitchOver.switchCategory)
Figure 4.59 Click "Property Button" to control the visibility of the "Property window"
Figure 4.60 Use the Canvas Prefab to build the TransBIM UI
Figure 4.61 One of the interactions: "select GameObject"
Figure 4.62 Set class variables for "ClickObject.cs"
Figure 4.63 Awake() function to assign values to the predefined variables
Figure 4.64 Get the "InventoryControl" component and the UI components (e.g. "textRow")
Figure 4.65 "Inventory Control" components in the Property Window and Category Window
Figure 4.66 Variables in <ClickObject> with no values (Left); the result of passing the variable values from the InventoryControl to <ClickObject> (Right)
Figure 4.67 The results when duplicating (cloning) the UI component prefabs
Figure 4.68 Find and assign the GameObjects ("ChangeParameter" window and "ParaContent") to variables
Figure 4.69 The GameObjects (contents) under the "ParaContent" of the "ChangeParameter" window
Figure 4.70 Find the parent of each GameObject and attach a script ("ReferenceObject.cs") to it
Figure 4.71 Revit objects have the same parent GameObject
Figure 4.72 The parent GameObject was given only one "ReferenceObject.cs" component
Figure 4.73 Generate Colliders when importing the FBX file in Unity
Figure 4.74 IsPointerOverUIObject() checks whether users click the UI components
Figure 4.75 Touching the UI won't select the virtual object blocked by the UI
Figure 4.76 OnMouseDrag() function
Figure 4.77 The components under the "ParaContent" hierarchy of the InventoryMenu show the object's parameter information
Figure 4.78 When clicking a GameObject, the object's color changes to yellow
Figure 4.79 OnMouseDrag() function Part 3 enables five actions
Figure 4.80 Results of some of the actions enabled by the Part 3 scripts in the OnMouseDrag() function
Figure 4.81 Result of selecting a GameObject not belonging to "Walls", "Ducts", "Pipes", and "Conduits"
Figure 4.82 Scripts to manipulate the previously selected object
Figure 4.83 A duct was selected with highlighted color (Left); the duct, as a previously selected object, had its color changed back (Right)
Figure 4.84 "PreviousObject" and "CurrentObject" in the ReferenceObject.cs
Figure 4.85 AddDatatoParaDict() function
Figure 4.86 The needed information in the asset file
Figure 4.87 "ChangeParaDict" information shown in the "Change Parameter window"
Figure 4.88 The GameObjects (contents) under the "ParaContent" of the "ChangeParameter" window
Figure 4.89 UpdateChangeParaUI() function
Figure 4.90 Update the "ChangeParameter window"
Figure 4.91 UpdatePropertyWindow()
Figure 4.92 Variable field types defined in the ScriptableObject (ductParameter.cs)
Figure 4.93 Generate UI elements with assigned parameter information based on the object's asset file
Figure 4.94 Create a "Gizmos" at a selected object's geometry center to show the GameObject's vertical, longitudinal, and cross-sectional directions
Figure 4.95 The default settings of the Gizmos GameObject
Figure 4.96 Unity Y-axis points up
Figure 4.97 The FBX model was rotated to make the Revit z-axis point up
Figure 4.98 Each object has its local z-axis pointing up, with its x-axis and y-axis in a horizontal plane
Figure 4.99 Putting the Gizmos under a duct's and a pipe's hierarchy, the Gizmos' vertical arrows point up
Figure 4.100 The CreateGizmo() function
Figure 4.101 Creates a new Gizmos instance and puts it as a child of a GameObject (1)
Figure 4.102 Creates a new Gizmos instance and puts it as a child of a GameObject (2)
Figure 4.103 The angle between the object's longitude and the Revit x-axis (object local x-axis in Unity)
Figure 4.104 Calculate the angle by using trigonometric operations
Figure 4.105 Reset the Gizmos' location and rotation
106 Results of the Gizmos showing the object’s longitudinal, cross-sectional, and vertical directions ..... 129 Figure 4. 107 Attach the ReferenceObject.cs to the FBX Model parent ................................................................... 129 Figure 4. 108 Variables and Awake() in ReferenceObject.cs .................................................................................... 130 Figure 4. 109 The UI components in the “Change Parameter Window” ................................................................... 131 Figure 4. 110 MovementChangeEditted() function ................................................................................................... 132 Figure 4. 111 addOriginalDataToDict() function ...................................................................................................... 132 Figure 4. 112 Each object has the same local x-axis, y-axis, and z-axis .................................................................... 133 Figure 4. 113 Using the circular function to get the resolutions of a movement ....................................................... 133 Figure 4. 114 Calculate the radians and movement in x-axis and y-axis ................................................................... 134 Figure 4. 115 Move the object; save the movement information and update object’s Revit coordinate information 134 Figure 4. 116 The updateRevitCoord() ...................................................................................................................... 134 Figure 4. 117 Change movement results ................................................................................................................... 135 Figure 4. 118 The updateDict() function ................................................................................................................... 135 Figure 4. 119 Use “Upload data” button to create Text ............................................................................................. 136 Figure 4. 120 CreateText() function .......................................................................................................................... 136 Figure 4. 121 Generate the text file in the defined path ............................................................................................. 136 Figure 4. 122 The text file result ............................................................................................................................... 136 Figure 4. 123 Upload the file to Dropbox .................................................................................................................. 137 Figure 4. 124 The “UploadUIWindow” is popped up when clicking the“Upload data” button ................................ 137 Figure 4. 125 DropboxSync - upload and download files from Dropbox .................................................................. 137 Figure 4. 126 Go to the Dropbox creating APP page ................................................................................................ 138 Figure 4. 127 Create an app folder on Dropbox ........................................................................................................ 138 Figure 4. 128 Generate accessToken for the app ....................................................................................................... 138 Figure 4. 129 Copy the access token and paste into “DropboxSync Script” ............................................................. 139 Figure 4. 
130 Change the upload local file path in the UploadFile.cs ....................................................................... 139 Figure 4. 131 Results after the text file was uploaded to the Dropbox ...................................................................... 139 Figure 4. 132 AR Image Tracking and Load model .................................................................................................. 140 Figure 4. 133 Download ARFoundation, ARKit XR Plugin and other Unity XR packages ..................................... 140 Figure 4. 134 ARSession (Left) ARSessionOrigin (Right) ....................................................................................... 141 Figure 4. 135 Set trackable images in the “ReferenceImageLibrary”........................................................................ 141 Figure 4. 136 Assign prefabs to the “TrackedImageAndLoadModel.cs” .................................................................. 142 Figure 4. 137 “FirstPoint(Blue)” and “SecondPoint (Green)” prefabs ...................................................................... 142 Figure 4. 138 The model prefab “Watt Hall” ............................................................................................................ 142 Figure 4. 139 Set tracking images’ positions in both the virtual model and the real building ................................... 143 Figure 4. 140 Set the Pivot point at one door’s center ............................................................................................... 143 Figure 4. 141 UpdateARImage() to update the detected images’ location ............................................................... 143 Figure 4. 142 AssignGameObject() to get the tracked images’ locations and vector ................................................ 144 Figure 4. 143 AssignGameObject() to get the tracked images’ locations and vector ................................................ 144 Figure 4. 144 Tracked images’ X-Z Plane-Projection locations in Unity World Coordinate .................................... 145 Figure 4. 145 LoadModel () to locate the model based on the first and second images’ locations ........................... 145 Figure 4. 146 Result of loading the model based on the tracked images’ locations .................................................. 145 Figure 4. 147 Update BIM Model using Dynamo ..................................................................................................... 146 Figure 4. 148 Download text file from Dropbox ....................................................................................................... 147 Figure 4. 149 Use “Change location multiple.dyn” to update the Revit model ......................................................... 147 Figure 4. 150 Read and parse the text file ................................................................................................................. 148 Figure 4. 151 Get the coordinate numeric information .............................................................................................. 148 Figure 4. 152 Generate controlled lines based on the new coordinate to update the object’s location ...................... 149 Figure 4. 153 Result of updating the changed objects’ locations .............................................................................. 149 Figure 4. 154 Core script using in Dynamo and application development ................................................................ 150 Figure 5. 
1 Watt Hall ................................................................................................................................................. 151 Figure 5. 2 A panorama image of the Watt Hall third floor ....................................................................................... 152 Figure 5. 3 Using TransBIM at Watt Hall third floor MBS Conner .......................................................................... 152 Figure 5. 4 Watt Hall Revit Model (Left), Watt Hall Third Floor (Right) without MEP objects .............................. 152 Figure 5. 5 Use BLK 360 scan one spot in the building ............................................................................................ 153 xiii Figure 5. 6 Recap Pro and Leica BLK360 ................................................................................................................. 153 Figure 5. 7 Point cloud in Recap Pro (Desktop) ........................................................................................................ 154 Figure 5. 8 Link RCP file in Revit and merge the model .......................................................................................... 154 Figure 5. 9 Floor plan with wide view range (from high to low) ............................................................................... 154 Figure 5. 10 Floor plan with a short view range around at a high level ..................................................................... 155 Figure 5. 11 Section view provided conduits’ elevations and sizes (Left) Plan view provided conduit’s layout and sizes (Right) ............................................................................................................................................................... 155 Figure 5. 12 Section View Shows the Duct’s Elevations and Sizes .......................................................................... 155 Figure 5. 13 The 3D View of Final Model (MEP disciplines)................................................................................... 156 Figure 5. 14 The 3D View of Final Model (Coordination disciplines) ...................................................................... 156 Figure 5. 15 Export FBX file from Revit ................................................................................................................... 157 Figure 5. 16 Convert object’s materials to standard and reset the pivot points ......................................................... 157 Figure 5. 17 The FBX files exported from Revit and 3ds Max ................................................................................. 157 Figure 5. 18 Result of assigning object’s spatial information to the newly created parameters ................................ 158 Figure 5. 19 Results of the exported metadata in Excel ............................................................................................. 158 Figure 5. 20 TransBIM scene in Unity ...................................................................................................................... 159 Figure 5. 21 Download Unity XR packages .............................................................................................................. 159 Figure 5. 22 Importing the “Object Parameters.csv” into Assets/Resources folder ................................................... 159 Figure 5. 23 FBX file import settings ........................................................................................................................ 
160 Figure 5. 24 Set the overall pivot point of the Watt Hall prefab at the center of a door to scan the first image ........ 160 Figure 5. 25 Set the tracking images’ positions in both a virtual model and a real building ..................................... 161 Figure 5. 26 First Image and Second diagram ........................................................................................................... 161 Figure 5. 27 Define Custom scan Images in “ReferenceImageliabry” ...................................................................... 162 Figure 5. 28 Drag the FBX file prefab into the scene ................................................................................................ 162 Figure 5. 29 Unpack the Prefab ................................................................................................................................. 163 Figure 5. 30 Add script components to all FBX entities ............................................................................................ 163 Figure 5. 31 Create an empty GameObject ................................................................................................................ 163 Figure 5. 32 Move the empty GameObject to the center of a door ............................................................................ 164 Figure 5. 33 Find the door’s pivot location in the Unity world system ..................................................................... 164 Figure 5. 34 Make the “TransBIM Model” as the parent of Watt Hall FBX instance ............................................... 164 Figure 5. 35 Use the “TransBIM Model” in the AR Session Origin ......................................................................... 165 Figure 5. 36 The “TransBIM” model prefab was duplicated in the scene ................................................................. 165 Figure 5. 37 Generated the asset files that store each object’s metadata ................................................................... 166 Figure 5. 38 Create a Dropbox App folder ................................................................................................................ 166 Figure 5. 39 Create an App in Dropbox and generate an access token ...................................................................... 166 Figure 5. 40 Copy the access token to the “DropboxSync” in TransBIM scene ....................................................... 167 Figure 5. 41 TransBIM iOS setting ........................................................................................................................... 167 Figure 5. 42 Project Settings> Player Settings >iOS Settings> Other Setting (1) ..................................................... 168 Figure 5. 43 Project Settings> Player Settings >iOS Settings> Other Setting (2) ..................................................... 168 Figure 5. 44 Build the TransBIM Unity project to an Xcode project ........................................................................ 168 Figure 5. 45 The Xcode project built from Unity ...................................................................................................... 169 Figure 5. 46 Open the Xcode ..................................................................................................................................... 169 Figure 5. 47 Xcode version (Left) and iOS version (Right) ...................................................................................... 170 Figure 5. 
48 Log in Apple Developer Account .......................................................................................................... 170 Figure 5. 49 TransBIM installed in the iPad .............................................................................................................. 170 Figure 5. 50 Using TransBIM to visualize duct on the Watt Hall third floor ............................................................ 171 Figure 5. 51 TransBIM features ................................................................................................................................. 172 Figure 5. 52 A Welcome Panel when opening the application .................................................................................. 173 Figure 5. 53 First and Second Image locations at Watt Hall building ....................................................................... 173 Figure 5. 54 Detecting the first and second image and put balls at the detected images’ locations ........................... 174 Figure 5. 55 Place the images on different doors that share the same centerline (1) ................................................. 174 Figure 5. 56 Place the images on different doors that share the same centerline (2) ................................................. 175 Figure 5. 57 Load the virtual model at the right location by clicking the “Load Model” button ............................... 175 Figure 5. 58 Ducts and conduits were overlaid on the real building .......................................................................... 176 Figure 5. 59 Use the “Filter Objects” to select what objects visible in the AR environment .................................... 176 Figure 5. 60 MEP components were clear to see after using the filter function ........................................................ 176 xiv Figure 5. 61 Select a Duct in the AR environment .................................................................................................... 177 Figure 5. 62 When selecting a new object, the old selected object’s color was changed to the original ................... 177 Figure 5. 63 Use the “Highlight Color” to switch the object’s color ......................................................................... 178 Figure 5. 64 Use the “Eye” and “Closed Eye” icons to unhide and hide objects ....................................................... 178 Figure 5. 65 View selected object’s parameter information ...................................................................................... 178 Figure 5. 66 Type a comment and mark in the Property Window(1) ........................................................................ 179 Figure 5. 67 Type a comment and mark in the Property Window(2) ........................................................................ 179 Figure 5. 68 Move a duct (Left); move a conduit (Right) .......................................................................................... 180 Figure 5. 69 Change the object’s movement by inputting decimal characters .......................................................... 180 Figure 5. 70 Change the duct diameter size ............................................................................................................... 180 Figure 5. 71 Click the “Upload data” button to upload the BIM metadata text file to Dropbox ............................... 181 Figure 5. 
72 The uploaded text file in Dropbox ......................................................................................................... 181 Figure 5. 73 Use “Change location multiple.dyn” (Partial) to update the Revit model ............................................. 181 Figure 5. 74 Change location multiple.dyn ................................................................................................................ 182 Figure 5. 75 The Revit Model before updating.......................................................................................................... 182 Figure 5. 76 The Revit Model after updating ............................................................................................................ 183 Figure 5. 77 Using TransBIM in Watt Hall ............................................................................................................... 183 Figure 5. 78 TransBIM Features ................................................................................................................................ 184 Figure 6. 1 Methodology ........................................................................................................................................... 185 Figure 6. 2 Interaction between the user, Revit, and Unity 3d (BIM data circle) ...................................................... 186 Figure 6. 3 Overall Methodology .............................................................................................................................. 187 Figure 6. 4 Some features of TransBIM .................................................................................................................... 188 Figure 6. 5 Scripts Overview ..................................................................................................................................... 189 Figure 6. 6 Add script components to all FBX entities .............................................................................................. 190 Figure 6. 7 Generated asset files that store the object’s metadata ............................................................................. 190 Figure 6. 8 Unity Reflect ........................................................................................................................................... 191 Figure 6. 9 Upload Revit project to Reflect ............................................................................................................... 191 Figure 6. 10 Develop custom real-time BIM application by using the preview packages in Unity Pro .................... 192 Figure 6. 11 The updated duct in Revit was above the roof ...................................................................................... 192 Figure 6. 12 A type of duct fitting ............................................................................................................................. 193 Figure 6. 13 When moving a duct, the connected duct will automate moving or stretching ..................................... 193 Figure 6. 14 Connection warning if moving improperly ........................................................................................... 193 Figure 6. 15 Moving a duct does not move the connections properly ....................................................................... 194 Figure 6. 16 The connected ducts and fittings can be moved with the changed duct in Revit .................................. 194 Figure 6. 
Figure 6. 17 Extract height information from a string and convert the extracted string to a float ..... 195
Figure 6. 18 Define the ScriptableObject for the duct category (partial scripts) ..... 195
Figure 6. 19 A proposed methodology ..... 196
Figure 6. 20 Using TransBIM on site (Watt Hall) ..... 197
ABSTRACT
Building information modeling (BIM) is a common type of software in the Architecture, Engineering, Construction, and Operation (AECO) industry. Although the benefits and features of BIM have expanded, BIM is still primarily presented as digital data stored on office desktop systems or output as 2D drawings. A gap remains between the digital BIM and the real-world objects it describes. In facility management (FM), traditional facility maintenance is a time-consuming process in which managers search an index sheet with detailed instructions to check building assets’ information. There have been cases of integrating BIM with FM systems to help FM personnel manage the assets and the building operations. Some cloud-based BIM-FM tools even enable FM personnel to work on site by scanning asset references such as barcodes to retrieve needed information. However, these BIM-FM integration tools still require repetitive manual work. One trend for optimizing the efficiency of facility management is using augmented reality (AR) technology to overlay the whole building information model on site, combining virtual information with the real environment. This study intended to build a tool that can be used when there is a demand to modify a building information model on site. Unlike most BIM-AR research and tools, which focus on transferring BIM data unidirectionally from BIM software to a VR/AR platform, a workflow for exchanging BIM data bidirectionally was established. A BIM-based mobile AR application, TransBIM, was developed to visualize, edit, and update the building information model in the AR environment, specifically for MEP systems. TransBIM also enables users to send the changed BIM data back to Revit to update the Revit model. USC Watt Hall and its Revit model were used as a case study to test TransBIM and the proposed workflow. Autodesk Revit, Dynamo, Autodesk 3ds Max, Unity, AR SDKs (ARFoundation and ARKit), and Dropbox were used to develop the mobile AR application (TransBIM) and to realize the entire workflow. The case study showed that TransBIM can overlay the 1:1 scale BIM model on the real building by scanning on-site images. Users can filter objects by category and click the virtual objects to retrieve the relevant BIM information. Most importantly, users can change an object’s location and edit some parameter information, such as MEP components’ (duct/conduit/pipe) sizes, comments, and marks, in TransBIM. The changes made in TransBIM were saved into a text file (.TXT) containing the modified BIM information, and the text file can be uploaded to Dropbox from the mobile device. Finally, the BIM-metadata text file was downloaded on a PC, and a separately developed Dynamo script read the text file to update the Revit model. The Revit model (MEP systems’ locations and parameters) was correctly and instantly updated.
TransBIM realized the transfer of BIM metadata between the BIM environment (Revit) and the AR environment (TransBIM). TransBIM offers an opportunity for FM managers to visualize the existing MEP systems’ BIM asset information and to modify the MEP model and information on site. The Revit model can be automatically updated by reading the changed data. Moreover, TransBIM demonstrates a new approach to managing and updating the Revit model during a building’s operating phase: using AR to modify a building information model on site. TransBIM can be a prototype for developing more sophisticated products that support asset management and renovation design in FM.
Keywords: Building information modeling (BIM), Mobile Augmented Reality (Mobile AR), Unity, Facility Management (FM)
Research Objectives:
• To establish a way to communicate BIM information, including geometric and functional data, between the BIM and AR environments.
• To establish a workflow for integrating BIM with AR using Revit, 3ds Max, Unity, Dropbox, and Dynamo.
• To create bi-directional operability of BIM data between BIM (Revit) and AR (Unity).
• To create a BIM-based AR platform, TransBIM, for facility management and renovation design.
Chapter 1
1. INTRODUCTION
This chapter introduces building information modeling (BIM), facility management (FM), virtual reality (VR), augmented reality (AR) and mixed reality (MR), and the application of AR in the architecture, engineering, and construction (AEC) and non-AEC industries. It gives an overview of each technology and concept to find opportunities to integrate BIM and AR for facility management.
1.1 Building Information Modeling
This section introduces the definitions and features of BIM and discusses the differences between BIM and CAD. Multi-dimensional (nD) information, BIM maturity levels, and the standard file formats that enable BIM interoperability are also explained.
1.1.1 Building Information Modeling
Building Information Modeling (BIM) is a term with multiple definitions. Initially, it was generally deemed an evolutionary technology or set of software compared to CAD (Computer-Aided Design), highlighting and representing the combination of the 3D geometries and the attributes of building elements (Ghaffarianhoseini et al., 2017). Besides being defined as a set of 3D modeling programs, BIM is also considered a concept, a process, an interacting policy, or a variety of activities describing how BIM is applied collaboratively across the architecture, engineering, construction, and facility management (AEC/FM) industry (Eastman, 2011; Ford et al., 1995; Penttilä, 2006). BIM is a collaborative way of working, based on digital technologies, that enables more efficient design, construction, and management (HM Government, 2012). BIM is a methodology that integrates the necessary building information in different BIM platforms to share the BIM database across the associated disciplines throughout the building’s life cycle (Penttilä, 2006). The integrated disciplines involve architecture, structure, mechanical, electrical, plumbing, and other specialties (Dossick & Neff, 2010). With the ability to share BIM data via exchangeable file formats throughout different project phases, BIM enhances the opportunity for architects, engineers, contractors, on-site workers, and other participants to communicate and collaborate efficiently.
In summary, because BIM stresses collaboration among all stakeholders sharing information throughout the phases of a project to produce detailed designs, accurate plans, and efficient management, the benefits of using BIM include the following:
• Reducing errors in design and construction documents
• Improving the accuracy of cost and time estimates, thus saving capital and time
• Improving the coordination of different participants in a project
• Identifying potential construction conflicts and enhancing construction efficiency
• Representing 3D geometry to enable clients and users to better understand the project (X. Wang et al., 2013)
1.1.2 BIM AND CAD
Building Information Modeling (BIM) and traditional 2D or 3D Computer-Aided Design (CAD) are two different technologies. Traditional CAD provides only basic 2D geometric components such as lines, arcs, and circles. Users need to draw the relevant shapes in different views to describe a building component because the geometric elements in CAD are independent in each view (plan, section, and elevation) (Ding, Zhou, & Akinci, 2014). CAD cannot automatically update a modified object’s 2D shape in other views (Ding et al., 2014). In contrast, BIM is an object-based tool with parametric components (Kensek, 2014). The model is considered a database; changes in one part of the drawing (for example, the width of a door in 3D) are updated anywhere that same object appears (for example, the width of the door in the schedule, elevation, plan, etc.). When creating or modifying an object in BIM software, parameters are simultaneously generated or added as attributes belonging to that object. All views, schedules, and dimensions are associated with building information objects. Therefore, changing an object in one view changes the object itself, and all other views related to this object are updated automatically (Kensek, 2014). Autodesk Revit, a commonly used BIM software program, has different families that describe architectural, structural, and system objects. An object can change family types to adopt all features of the selected family type, and its parameters can also be edited (Figure 1.1, Figure 1.2).
Figure 1. 1 Revit disciplines, categories, and family types (Autodesk Revit mechanical advanced sample)
Figure 1. 2 Revit duct object parameters
This object-based feature enables BIM to be a data-rich model that embodies not only all geometries but also the functional properties of the building elements (Ding et al., 2014). In fact, being object-based and data-rich are two outstanding characteristics of BIM.
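This parametric behavior can be illustrated with a short Revit API sketch in C#. The snippet below is a minimal illustration written for this discussion rather than part of any workflow described later; the method name, the element ID argument, and the parameter name “Diameter” are assumptions for demonstration, and actual duct families may expose differently named parameters.

// Minimal illustrative sketch (assumed names): editing one parameter on a
// duct element through the Revit API. Because Revit is object-based, the
// change propagates to every view and schedule that shows the duct.
using Autodesk.Revit.DB;

public static class DuctParameterExample
{
    // "doc" is an open Revit document; "ductId" is assumed to identify a duct instance.
    public static void SetDuctDiameter(Document doc, ElementId ductId, double diameterMm)
    {
        Element duct = doc.GetElement(ductId);

        // "Diameter" is a typical round-duct instance parameter; families may differ.
        Parameter diameter = duct.LookupParameter("Diameter");
        if (diameter == null || diameter.IsReadOnly) return;

        // Revit stores lengths internally in feet, so convert from millimeters.
        double feet = UnitUtils.ConvertToInternalUnits(diameterMm, DisplayUnitType.DUT_MILLIMETERS);

        // All model changes must happen inside a transaction.
        using (Transaction t = new Transaction(doc, "Edit duct diameter"))
        {
            t.Start();
            diameter.Set(feet); // plans, sections, 3D views, and schedules all update
            t.Commit();
        }
    }
}

Because the duct is a single database object, committing this one change is enough to update its representation in every plan, section, 3D view, and schedule; an equivalent edit in CAD would require redrawing the duct’s 2D linework in each affected view.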
1.1.3 nD environment BIM
A building information model is not limited to presenting the physical characteristics (three-dimensional geometries) and functional characteristics of building elements (attributes, also known as parameters in BIM software, such as material properties, geometry volume, geometry area, etc.) (Ding et al., 2014). In fact, there is a term called multi-dimensional (nD) modeling, or nD BIM (Ding et al., 2014). Some researchers regard nD BIM as the extension of BIM functionality (Lee et al., 2005) (Table 1.1). For example, 3D BIM can be expanded to 4D and 5D BIM by incorporating the dimensional aspects of time and cost (Taylor & Bernstein, 2009). Integrating BIM with FM can be classified as 6D modeling. Likewise, nD BIM is where supplementary information is added to 3D BIM information in an nD environment for analysis and simulation purposes (Pärn, Edwards, & Sing, 2017). For example, various functions or add-ins have been added to BIM software to conduct engineering analyses and functions such as “scheduling, costing, quality, accessibility, safety, logistics, crime, sustainability, maintainability, acoustics, and energy simulation” (Ding et al., 2014; X. Wang et al., 2013).
Table 1. 1 nD BIM (a portion of the content in Table 1.1 is adapted from Pärn et al., 2017)
3D (Geometry): 2D and 3D models; 3D includes “geometric presentation, parametric descriptions, and legal regulations associated with the construction of a building.” Relevant stakeholders: design team.
4D (3D + Time): integrates scheduling and phase information into 3D BIM objects to plan and manage the construction process throughout the project. Relevant stakeholders: general contractor and sub-contractors.
5D (3D + Cost): links cost-associated information to 3D BIM objects. Relevant stakeholder: quantity surveyor.
6D (3D + Facility Management): combines facility management records with specific 3D BIM objects for evaluating facilities during the life cycle. Relevant stakeholders: owner, facility manager.
nD (3D + other functions): brings other potential aspects or functions as new dimensional information into 3D BIM objects. Relevant stakeholders: engineers, technical consultants, and other stakeholders.
On the other hand, some researchers argue that nD can be used to define the different maturity levels of BIM (Porwal & Hewage, 2013). The extension to the different dimensions of BIM only works if a high level of interoperability is possible between the software programs.
1.1.4 BIM maturity levels
As the technology and standards of BIM mature and become widely used in the AEC/FM domain, additional related businesses have introduced BIM and worked with it. However, the rates at which disciplines use BIM remain different (Porwal & Hewage, 2013). Therefore, the BIM maturity level was developed to instruct BIM adopters on how to manage the process of change from their “internal organizational interfaces with external supply-base and clients” (Porwal & Hewage, 2013). The United Kingdom and the United States use different prototypes of the BIM maturity level. In the United Kingdom, Mark Bew and Mervyn Richards first developed the BIM maturity model (Porwal & Hewage, 2013). It is also known as iBIM, named after its highest level, or the BIM Wedge, after its shape (Figure 1.3) (Porwal & Hewage, 2013). Although there are different variants based on the prototype, the basic levels are the same. In the BIM maturity model, maturity development is classified into four levels (level 0 through level 3 and beyond level 3) (Table 1.2). The maturity model specifies the requirements for complying with BIM criteria at each level (Porwal & Hewage, 2013).
Figure 1. 3 BIM maturity levels, U.K. (Tekla, 2019)
Table 1. 2 BIM maturity levels (BIM Wiki, 2019)
Level 0: 2D CAD based on two-dimensional objects without data management.
Level 1: May include 2D and 3D conceptual objects in a managed CAD environment.
Level 2: Models with attached data are managed in a 3D environment and separated into different entities by discipline. The involved data may include 4D (construction sequencing) and 5D (cost estimating) information.
Level 3: An integrated BIM model with all disciplines, stakeholders, and dimensions (time, cost, lifecycle information).
Above level 3: Use the data for future possible purposes.
In the United States, the Model Progression Specification (MPS), developed by the American Institute of Architects (AIA), helps AEC practitioners specify and communicate BIM information clearly and reliably across different phases, milestones, and deliverables (Table 1.3) (BIM Forum, 2018). The core of the MPS is the level of detail or level of development (LOD) (Porwal & Hewage, 2013). In fact, level of detail and level of development are two interpretations (BIM Forum, 2018). The level of detail is what detail is included in the element model, while the level of development is the degree to which the element’s geometry and attached information have been provided and may be relied on by the project team (BIM Forum, 2018).
Table 1. 3 Fundamental LOD definitions (adapted from the American Institute of Architects AIA-G203-2013 and AIA-202 element model table, and the BIM Forum 2018 Level of Development Specification Guide, LOD Spec 2018 Guide); columns: level of detail, phase, model content requirements, and examples (supply air duct)
Client requirements: word descriptions and lists of contents; non-geometric representations.
LOD 100 (conceptual geometry): “diagrammatic or schematic model elements; conceptual and/or schematic layout/flow diagram”; 2D representations.
LOD 200 (basic design): generic elements (recognizable placeholders) shown in 3D with approximate information: maximum size, rough quantities, purpose, location, orientation, and approximate access/code.
LOD 300 (detailed designed geometry): specific elements with confirmed information: shape, quantity, size, location, and orientation of specific elements.
LOD 350 (actual geometry and codes): model elements “enhanced beyond LOD 300 by the addition of information regarding interfaces with other building systems.”
LOD 400 (fabrication): a model with sufficient information to be accurate for fabrication/shop drawings; actual information such as quantity, size, shape, location, and orientation.
LOD 500 (as-built): “a field verified representation in terms of size, shape, location, quantity, and orientation” (AIA-G203); an as-built model with operation information.
1.1.5 Industry standardization and interoperability
Because BIM is used by different stakeholders for specific activities at various stages, such as design, construction, procurement, maintenance, and operation, the interoperability of BIM software programs is an issue. Interoperability means that BIM data can be exchanged among multiple BIM programs so that users can communicate and work with the shared information. However, the native file format of one BIM software program usually cannot be opened in another. Even though each BIM program can export and import other formats, data loss sometimes happens when translating a native file format to another one. For example, Revit uses RVT as its native file format and supports a variety of industry standards and file formats (Table 1.4). When using a Revit model to do energy simulation in another BIM energy simulation program, such as IES-VE, the RVT file cannot be opened, and the exported gbXML file loses some information compared to the native Revit file.
Table 1. 4 File formats supported by Autodesk Revit Architecture (retrieved from Autodesk Support: https://knowledge.autodesk.com/support/revit-products/learn-explore/caas/sfdcarticles/sfdcarticles/Standards-and-file-formats-supported-by-Revit.html)
Revit native formats: RVT, RFA, RTE, RFT
CAD formats: DGN, DWF, DWG, DXF, IFC, SAT, and SKP
Image formats: BMP, PNG, JPG, JPEG, and TIFF
Other formats: ODBC, HTML, TXT, and gbXML
To make interoperability easier, standards or protocols in a common language are adopted to exchange data across different software (Revit IFC Manual, 2018). Currently, the Industry Foundation Classes (IFC) standard, established by buildingSMART International, is the most supported data exchange standard for addressing the BIM interoperability issue (Porwal & Hewage, 2013). buildingSMART International has developed the IFC standard into many versions and formats. For example, Autodesk Revit 2017 and higher support three versions of IFC: IFC4, IFC 2x3, and IFC 2x2. Using a higher version is recommended because it provides a better representation of complex geometries (Revit IFC Manual, 2018). However, a limitation of the IFC file is that only specific programs with IFC interpretation functions, such as BIM software or IFC viewers, can read IFC into 3D models with the elements’ metadata (Figure 1.4). While it is possible to open an IFC file in Notepad or Internet Explorer, it is hard to understand from the plain text what the building model looks like and what the values of object parameters are (Figure 1.5). Therefore, although an IFC file stores all the necessary geometric and functional data of a building information model, it is not easy to parse the data and retrieve the needed information.
Figure 1. 4 IFC opened in an IFC viewer program (BIMer)
Figure 1. 5 IFC file in Internet Explorer
1.2 Facility Management
This section explains the main tasks of facility management and compares traditional FM with current FM. The role of FM is increasing, as FM accounts for most of the time (the entire post-construction phase) and cost of a project in its life cycle. The potential of using BIM in FM and some FM systems are also introduced.
1.2.1 Facility Management
Facility management is a process during the operating phase of a project in which owners or facility managers maintain the building equipment and systems to ensure the various functionalities of the building as well as to control the cost during operation (ISO 41011: Facility Management, 2017). The facilities refer to already built or installed components with certain functions serving the buildings (International Facility Management Association, 2019). Therefore, the major job of a facility manager is to check the safe operation and maintain the properties of an organization (Hosseini et al., 2018). The properties include building fabric, decoration, lighting fixtures, the heating, ventilation, and air conditioning (HVAC) system, building control systems (e.g., the fire safety system), the mechanical, electrical, and plumbing (MEP) systems, equipment, and appliances (International Facility Management Association, 2019). Most of these cases are physical maintenance; there are other categories as well, such as cleaning, recycling, security, and pest control. Traditionally, facility management has not been a core job compared to the AEC domains; it only supports ancillary work, such as ensuring a building’s normal operation and future use.
However, the current FM domain has become more important because the business value across the building lifecycle is an important consideration (Nicał & Wodyński, 2016; Wang et al., 2013). Compared to the AEC tasks, FM tasks account for over 80% of a project’s total cost over the whole lifecycle of a building, as well as the longest period in a project’s life cycle (Costin et al., 2012). For example, a project may take 2-5 years to design and construct, but operation and maintenance can last 20 years or more from when the building begins to be occupied (Kensek, 2015). As the owners or facility managers need to sustainably and regularly check the equipment and update information over the years, improving the efficiency of facility management is a critical issue. One way to improve efficiency is to build a facility management system. The conventional “handover process” for building a facility management system is very time-consuming; it can take several weeks to collect useful information from the design and construction drawings and record it in management systems (Nicał & Wodyński, 2016).
1.2.2 FM Systems
This section introduces some FM systems: the computerized maintenance and management system (CMMS), the building automation system (BAS), the engineering document management system (EDMS), and integration software. Each system is independent and has its distinctive purposes and specialties to support FM personnel.
A computerized maintenance management information system (CMMIS) is a software program designed for maintenance management and administrative functions. CMMS systems maintain a database storing the information facility personnel need to index, record, and update building assets. Because it is a computer-based system, a CMMS increases facility personnel’s productivity by reducing the amount of manual paperwork in favor of core activities through the system’s interface (Ramachandra et al., 2012). Some of the general advantages of using a CMMS include controlling work orders, tracking the resources of executed tasks, and providing information to control ongoing activities (Parsanezhad & Dimyadi, 2013).
A building automation system (BAS), or building management system (BMS), is a computer-based control system that controls the mechanical and electrical systems of a building with a series of network-linked software and hardware (Engineering Document Management with Meridian, 2018). It controls building systems such as heating, ventilation, and air conditioning (HVAC), lighting, security, and fire systems. The BAS database records substantial data such as temperature, humidity, flow rate, pressure, power, control signals, and the status of equipment (Xiao & Fan, 2014). As BAS systems become more advanced and collect more detailed facility data, connecting BAS and BIM can yield richer information to help management and decision-making.
An EDMS is a software program for managing engineering documents and drawings. The key functions of an EDMS include document management, work orders, writing comments, and generating photos. An EDMS keeps track of changes and document revisions to guarantee that on-site project teams can access updated, accurate information. One of the benefits of an EDMS is that it manages document versions: even though documents are derived from different sources, for example different versions of CAD files, the documents’ metadata are maintained in the system during document maintenance (Engineering Document Management with Meridian, 2018).
1.2.3 BIM and FM
BIM can be used to make facility management more efficient. A building information model is composed of objects represented by 3D geometries with relevant attributes and functional information, and it can be shared across different programs under certain exchange rules. These indicate two of the key features of BIM: the data-rich model and interoperability. The two features promote a variety of functionalities, such as “quantity take-offs, cost estimating, space and asset management, and performing energy analyses” (Teicholz, 2013), and also make BIM an ideal platform to enhance FM activities (Liu & Issa, 2013). BIM brings the potential for owners and facility managers to perform FM tasks more efficiently. For example, with BIM’s data-rich model, even facility managers and owners who are not trained to read drawings or as-built documents can easily retrieve the needed information from the virtual model of a physical facility asset in BIM (Teicholz, 2013). Moreover, because BIM has become a common tool in AEC, reusing the virtual model and the objects’ affiliated data stored in BIM for facility management can save time and cost. The interoperability of BIM lets the BIM software of different disciplines (architecture, structure, systems, etc.) work together in a collaborative system that stores the shared data. Using BIM in FM lets FM personnel track information across disciplines because a building information model contains architectural, structural, and MEP components. Also, BIM provides the opportunity for facility managers to easily understand components’ spatial locations and relationships by visualizing the building geometries and the attached attributes of the systems and equipment being maintained. When BIM is used throughout the building lifecycle, some of the design and construction information can be quite useful for FM personnel to better understand a building, analyze operating systems, and conduct decision-making.
Using BIM for FM is not much simpler than using BIM in AEC. Generally, there is no established best way to use BIM in FM (Teicholz, 2013). Because projects differ in their “organizational mission, facilities requirements,” and the level of detailed information the facility management organization needs, FM system software and BIM programs can vary. Integration requires an interface, using integration software, to connect different systems and ensure that they do not conflict with each other. When integrating BIM with individual FM systems, the integration software should be capable of combining the information from the isolated FM system software and BIM software into a centralized location (Hunt & Betancur, 2015). Currently, many integration software products are cloud-based and can be accessed from mobile devices such as tablets or smartphones. These cloud-based products optimize on-site activities (e.g., construction or maintenance), where users can access the systems’ information by logging into their accounts via portable devices.
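To make this BIM-to-FM data handover more concrete, the following C# sketch shows one way such integration software could pull asset records out of a Revit model with the Revit API. It is a hedged illustration only, not the actual behavior of any product named here: the chosen category, the use of the “Mark” parameter as an asset tag, and the CSV-style output are assumptions for demonstration.

// Illustrative sketch only: collecting placed mechanical equipment from a
// Revit model and flattening each element into a simple record that an FM
// database could ingest. Field choices are assumptions for demonstration.
using System.Collections.Generic;
using Autodesk.Revit.DB;

public static class FmExportExample
{
    public static List<string> CollectAssetRecords(Document doc)
    {
        var records = new List<string>();

        // Gather all placed mechanical equipment instances (pumps, AHUs, etc.).
        FilteredElementCollector collector = new FilteredElementCollector(doc)
            .OfCategory(BuiltInCategory.OST_MechanicalEquipment)
            .WhereElementIsNotElementType();

        foreach (Element e in collector)
        {
            // "Mark" is a standard Revit instance parameter often used as an asset tag.
            Parameter mark = e.get_Parameter(BuiltInParameter.ALL_MODEL_MARK);
            string tag = (mark != null) ? mark.AsString() : "";

            // One CSV-style line per asset: element id, element name, asset tag.
            records.Add(string.Format("{0},{1},{2}", e.Id, e.Name, tag));
        }
        return records;
    }
}

In a real integration, these records would be pushed to the FM system’s database or a cloud endpoint rather than returned as plain strings.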
Two examples of commercial cloud-based integration software are EcoDomus and BIM 360. EcoDomus software can link BIM with real-time facility operations data (BAS) and facility management software. The most popular products are EcoDomus PM and EcoDomus FM. EcoDomus PM focuses on BIM and facility data management in new construction or building renovation (EcoDomus, 2019). “EcoDomus FM focuses on real-time integration of BIM model with BAS (such as Siemens, Honeywell, Johnson Controls, etc.), with Facility, Maintenance, and Workplace management software (such as Maximo, Unifier, TRIRIGA, ARCHIBUS, Planon, AssetWorks, FAMIS, webTMA, Corrigo, or others), and Geographical Information Systems (like ESRI ArcGIS and Google Earth)” (Figure 1.6) (EcoDomus, 2019).
Figure 1. 6 BIM-BAS integration in EcoDomus FM (retrieved from https://ecodomus.com/portfolio-archive/bim-and-bas/)
Autodesk BIM 360 is a series of cloud-based programs that can easily display BIM data via mobile devices on site. It is compatible with other Autodesk products. BIM 360 contains a series of products for specified purposes to support diverse areas of specialty and project phases. For example, BIM 360 Glue facilitates BIM management, coordination, and collaboration, while BIM 360 Field focuses on field data management (issues, tasks, daily project updates, and purchasing). When uploading an RVT file to BIM 360, users can see models, views (plans, elevations, and sections), and object information in an interface similar to how they would be shown in Revit. Such platforms improve project delivery and help users easily retrieve up-to-date information via mobile devices throughout a project’s lifecycle.
Figure 1. 7 The BIM 360 Glue iPad app allows users to review, annotate, and send notifications to team members (retrieved from https://www.forconstructionpros.com/construction-technology/project-management/blog/11309218/autodesk-bim-360-glue-goes-mobile-with-ipad-app)
1.3 VR, AR, MR
Three technologies, VR, AR, and MR, are described. Some hardware, software development tools, and game engines for making VR/AR/MR are compared.
1.3.1 Virtual Reality
Virtual reality is defined as a computer-generated simulation of three-dimensional images of an environment that someone can interact with via electronic devices (Kim, 2013). It is also defined as a special human experience enabled by being in and perceiving a given environment (Steuer, 1992). Whether VR is considered a computer technology or a human experience, the idea of VR is the same: to make an imaginary scene by computer simulation so that users can have real sensations in the virtual world (Gutierrez et al., 2008). To make the user sense virtual reality, VR has four basic elements: the virtual environment, virtual presence (immersion), sensory feedback (as a response to the user’s actions), and interactivity (Sherman, 2013).
For example, when they move close to a virtual object, the object becomes larger, the sound of the object becomes louder and closer, and even users may feel they are touching the virtual object if they wear haptic devices. As for mental presence, a more realistic virtual environment can give users higher mental satisfaction of being present (Mihelj et al., 2014). 3. Sensory feedback is an indispensable element of virtual reality. The virtual reality system, for example, VR headsets, tracks the user’s position within a given space to simulate the user’s point of view and provide other sensory changes based on their physical location (Mihelj et al., 2014). 4. In virtual reality, computer simulation must respond to users’ actions. To make the computer interactive, either the user can affect the simulated environments, or users can move around and change their viewpoints in that environment (Mihelj et al., 2014). The first practice of VR can be traced to the 1950s. Morton Heilig, who some consider the father of virtual reality, developed and patented a machine called the “Sensorma” in 1957 (Table 1.5). Another important device of VR is the first head-mounted display called the Philco HMD. It can be regarded as the first prototype of telepresence, which did not provide a virtual environment but showed a video from a distant physical world. In 1968, Ivan Sutherland developed the first head-mounted display (HMD) presenting the virtual environment, the Sword of Damocles (Mihelj et al., 2014) (Table 1.5). Several implements of VR were developed during the 1970s and 1980s, and the formal term of VR was established in 1989 (J. Lanier). In the 1990s, one of the well-known milestones of the development of virtual reality was CAVE (Cave Automatic Virtual Environment) (Table 1.5). Since 2000, various applications of VR have been implemented in different industries, and the demand for VR has been promoting the development of VR related devices (Wang et al. 2008). Table 1. 5 A brief history of VR (Dom, 2018; Gutiérrez A., Vexo, & Thalmann, 2008; Mihelj et al., 2014) 1956 The Sensorma developed by Morton Heilig (Mihelj et al., 2014). Sensorma was a virtual bicycle that gave users multiple sensory experience including three- dimensional city view, the city sounds, and feeling of wind and vibrations (Mihelj et al., 2014). [1] 11 1968 Ivan Sutherland developed the Sword of Damocles HMD. It was an eyeglass with two screens for each eye working together to give user an illusionary three- dimensional scene (Mihelj et al., 2014). [2] 1979 McDonnell-Douglas Corporation developed the VITAL helmet, an HMD integrated VR, for military use [2] [2] 1982 VPL Research released its VR equipment, such as VR goggles and gloves, to the public market[2] [2] 1989 NASA developed the Virtual Environment Workstation Project (VIEW) for training astronauts (Gutierrez et al. 2008). [3] 1991 Antonio Medina designed a VR system to drive the Mars robot rovers from Earth [2]. [2] 1992 The first CAVE, projecting virtual environment on walls, was invented by the University of Illinois, Chicago in 1992 (Gutierrez et al. 2008). The CAVE was a room with interior walls as projection screens displaying the projected virtual scenes. [4]. 
After 2000 Various VR head-mounted displays released in the consumer market and applied in different industries such as Google Cardboard, HTC Vive, Oculus Rift, and Samsung Gear VR [5][6][7][8] [5][6][7][8] [1] (Mihelj et al., 2014) [2] https://virtualspeech.com/blog/history-of-vr [3]https://www.nasa.gov/ames/spinoff/new_continent_of_ideas/ ; [4] https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment#/media/File:CAVE_Crayoland.jpg; [5] https://store.google.com/product/google_cardboard ; [6] https://www.vive.com/us/ ; [7] https://www.amazon.com/Oculus-Rift-Virtual-Reality-Headset-Pc/dp/B00VF0IXEY [8] https://www.samsung.com/global/galaxy/gear-vr/ 12 1.3.2 Augmented Reality Augmented Reality (AR) is an emerging technology based on the maturity of VR development (Li et al., 2018). The idea of AR is not users sensing the virtual objects in a virtual environment, but users sensing (usually visualizing) both the user’s real-world surroundings and the virtual objects simultaneously. AR enables additional computer- simulated information overlaying on to a users’ view of the real world via a certain media, such as a smartphone’s screen. This technology can satisfy users’ perception of the spatial connection between the virtual objects and the real-world entities. As a result, the real world is augmented in some way (Chi et al., 2013). Besides augmenting the reality with virtual models or images, other virtual objects used in VR such as voice recordings, animation, and video can also be applied as the simulated information to increase the feedback of AR (Hou et al., 2013). For example, a video source can be put in the real environment as if the video is existing in the air. The biggest difference between VR and AR is where the computer-simulated virtual objects are located. In VR, all virtual objects are isolated from the real world into a virtual environment. This creates a gap between the virtual objects and the real-world objects. AR combines the two worlds so that the user not only senses being in the real world but also accesses the flexible virtual objects (Hou et al., 2013). Users can interact with both virtual and real environments. For example, users can control the virtual components freely or move their perspectives. It requires the AR system to constantly identify the users’ position in the real-world environment and put the to-be simulated virtual components on a specific location. Because the advantage of AR is to see both the real physical world and additional computer-simulated object and information, AR technologies have been successfully implemented into various applications such as design, education, medical treatment, design, manufacturing, military, advertising, and social communication, construction, entertainment. etc. (Kim et al., 2013; Mihelj et al., 2014). Some examples are discussed in Chapter 1.3.4. AR is generally composed of five fundamental technological components: visualization, input device, output device, trackers or tracking devices, and the computing unit (Figure 1.8) (Wang et al., 2013). Figure 1. 8 Technological components for an AR system (Wang et al., 2009) The media representation is what type of virtual objects displaying on the AR screen. It includes expression methods such as text descriptions, animation, and 3D models to convey useful information (Kim et al., 2013). An AR output, also known as a display device, is how the virtual objects are presented from the devices to users. A display screen is the most common output. 
For example, using the HoloLens or AR smart glasses, the screen can show the virtual scene overlaying the real context. Besides enhancing the visual display, the human senses of touch, hearing, and smell are also AR output options. The input device can be any device that constantly communicates with the AR computing system and sends commands, for example, a keyboard, a microphone, or a head-mounted device that recognizes gestures or voice. A computing device can be a phone, a computer, or an AR/MR device like the HoloLens that executes incoming commands and generates virtual graphics. The trackers, also known as the tracking system, are the device cameras or sensors that track the real-world environment so that virtual objects can be placed correctly at physical-world locations; a higher-accuracy tracking system is therefore preferable for detecting the user's location and the real-world environment. In fact, the limitations of sensors and trackers are the most significant factor hindering the effective development and use of AR systems (Kim et al., 2013). Overall, AR is realized by integrating the five elemental technologies (visualization, input device, output device, trackers or tracking devices, and the computing unit): a computing device continuously receives and processes real-time commands from both the input device and the tracking system to generate a simulated virtual representation (feedback) through the AR display device.
There are two types of augmented reality in practice: marker-based AR and markerless AR (Figure 1.9, Figure 1.10). Marker-based AR needs one or more particular markers to decide where the virtual objects are attached to the physical environment. Markers can be anything that a marker-based tracking toolkit can detect, such as pictures, models, objects, and QR codes (Figure 1.10). Some marker-based AR toolkits are ARToolKit, ARCore, and ARTag. The difference between the two types is that marker-based AR detects a marker's location to place the virtual components in the AR environment, while a markerless system detects the physical environment and places the virtual components on the detected surfaces.
Figure 1.9 Markerless augmented reality: the IKEA Place augmented reality application (Retrieved from http://highlights.ikea.com/2017/ikea-place/)
Figure 1.10 Marker-based AR: using the picture (marker) on the door to locate the 1:1 scale BIM model on the real building
As AR technology has advanced over the past decade, markerless AR applications have been created (Kim et al., 2013). Markerless AR is also known as location-based AR. It does not require pre-defined references (markers) for registering and positioning virtual objects but relies on more complex algorithms and computational processes to understand the real environment. Therefore, markerless AR can directly place virtual objects in the real environment based on the objects' designed locations or on the analyzed result of the tracked scene (Chi et al., 2013). In recent years, as many mobile context-awareness methods were developed, markerless AR has been frequently used for mobile augmented reality applications (Kim et al., 2013) (Figure 1.9). SLAM (simultaneous localization and mapping) techniques are among the most advanced AR techniques; they track the user's surroundings in order to localize the device.
When virtual objects are rendered realistically, the combined real-world and virtual-world scene can blur users' views, making it difficult to judge which objects are virtual and which are real.
1.3.3 Mixed Reality
The term Mixed Reality (MR) is somewhat vaguely defined and overlaps with augmented reality; its definition is not as precise as those of virtual reality and augmented reality. Generally, the distinction between VR and AR depends on whether the virtual objects are displayed in a virtual environment or overlaid on the real environment. Some researchers have defined MR as the continuum between a "real environment" and a "virtual environment" (Milgram & Kishino, 1994) (Figure 1.11). In the Milgram and Kishino Reality-Virtuality Continuum, augmented reality refers to a mainly "real environment with some virtual aspects," while Augmented Virtuality (AV) is mainly a "virtual environment with some real aspects" (Milgram & Kishino, 1994). Reality and virtuality respectively mean the purely real environment and the purely virtual environment; everything between the two falls within the bounds of mixed reality.
Figure 1.11 Reality-Virtuality Continuum (Milgram & Kishino, 1994)
In this view, augmented reality can be regarded as a subcategory of mixed reality. For example, some augmented reality superimposes virtual content exactly on the real-world objects it describes; therefore, wherever there is augmented reality, mixed reality may also be considered. The main difference between mixed reality and augmented reality is that mixed reality emphasizes building the relationship between existing objects and virtual objects to enhance the "user's interaction capabilities," while some augmented reality applications only improve a user's experience of the real world with supplementary virtual information (Holz et al., 2011). Some users simply say AR-MR to avoid having to choose one term over the other.
1.3.4 Hardware and Software Development Kit (SDK) for AR/MR
Smartphones and tablets were the first two types of mobile terminal devices to use AR. With more advanced technologies and sensors, smart glasses and AR headsets have emerged, especially since 2017. One distinctive advantage of smartphones and tablets is that they are already widespread among users. Some examples include the Apple iPhone and iPad (iOS), the Samsung Galaxy S Series and Galaxy Tab Series (Android), and the Huawei P-Series. AR-optimized smart glasses and headsets are self-contained computers with complex hardware and software modules to recognize and analyze the user's environment (Table 1.6). These devices present a translucent view of the surroundings combined with a virtual scene: they generate visual and auditory digital data, and users can see the real environment through the glass as well as the digitally generated visual and auditory effects. Some headsets come with Wi-Fi connectivity and no cables, which enables users to walk freely while wearing them. For example, the Microsoft HoloLens 2 supports hand tracking, real-time eye tracking, and voice control, so users can use gestures or other commands to manipulate the virtual holograms (HoloLens, 2019). With advanced sensors such as accelerometers, gyroscopes, and cameras, the HoloLens can track the user's position and map the environment mesh in real time.
Table 1.6 Examples of AR/MR headsets and smart glasses: Microsoft HoloLens 2; Magic Leap One; Meta 2 Augmented Reality; ODG (Osterhout Design Group) AR smart glasses R-7, R-8, and R-9 Series
To create AR/MR applications for these devices, an AR software development kit is needed. An AR/MR software development kit (SDK), or development toolkit, is a collection of tools for developing AR/MR applications. An SDK normally targets specific hardware and operating systems. AR/MR SDKs help developers build their AR/MR applications easily by using the preset algorithms in the packages. For example, ARKit is Apple's AR SDK for the iOS system (iPhones and iPads). It provides developer tools to easily realize necessary AR functions such as "device motion tracking, camera scene capture, advanced scene recognizing and processing, etc." (ARKit, 2019). There are many augmented reality SDKs on the market for mobile AR development, for example, Apple ARKit, Google ARCore, and Vuforia. Most of these development kits support markerless AR. A comparison of AR development toolkits shows some features that are similar and others that differ (Table 1.7).
Table 1.7 AR SDK Comparison (https://thinkmobiles.com/blog/best-ar-sdk-review/)
ARKit / ARKit 2 / ARKit 3. Supported platforms: iOS 11 and later (iPhones and iPads). Specialties:
• multiplayer sharing of the same AR experience (ARKit 2);
• 2D image detection and tracking (images, QR codes, signs);
• recognition of spaces and 3D objects.
ARCore. Supported platforms: Android 7.0 and higher, iOS 11 or higher. Specialties:
• motion tracking: real-time position tracking (camera position relative to surroundings);
• environmental detection: detecting the size and location of horizontal, vertical, and angled surfaces;
• light measuring: estimating real-world lighting conditions.
Vuforia. Supported platforms: Android, iOS, UWP, and Unity Editor. Specialties:
• recognition of different types of visual objects, such as a box, a cylinder, or text, and environment recognition;
• cloud recognition: scan a target (3D object or image), recognize it by searching a local or cloud database, and download online assets.
Wikitude. Supported platforms: Android, iOS, Windows for tablets, smart glasses (Epson Moverio, Vuzix M100, ODG R-7). Specialties:
• environment recognition;
• extended recording and tracking of objects (scan and see augmented objects beyond markers);
• Unity live preview (an AR-view feature in the Unity editor to test SDK features);
• Windows support.
Unity AR Foundation (3.0). Supported platforms: Android 7.0 and higher, iOS 11 or higher. Specialties:
• world tracking: track the device's position and orientation in physical space;
• detection of horizontal and vertical surfaces and physical feature points;
• light estimation: estimates of average color temperature and brightness in physical space;
• face tracking;
• 2D image tracking (ARFoundation, 2020).
AR Foundation is a Unity library providing a multi-platform solution for XR developers (AR Foundation, 2020). It allows developers to use the same code on multiple platforms. Users install the corresponding XR plug-in in Unity for a specific platform build (e.g., the ARKit XR Plugin on iOS or the ARCore XR Plugin on Android), and AR Foundation maps its common API onto the underlying platform SDK when the application is built.
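As a concrete illustration of this multi-platform workflow, the following is a minimal C# sketch, not code from this thesis, of placing a prefab on a detected plane when the user taps the screen. The class name, the modelPrefab field, and the scene setup are illustrative assumptions; the ARRaycastManager calls are part of the AR Foundation library described above.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: instantiate a prefab on a detected plane on tap.
// "modelPrefab" stands in for any imported BIM geometry.
public class TapToPlace : MonoBehaviour
{
    public GameObject modelPrefab;          // assigned in the Inspector
    public ARRaycastManager raycastManager; // component on the AR Session Origin

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the screen touch point against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose; // pose of the nearest plane hit
            Instantiate(modelPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

Because the raycast goes through AR Foundation rather than ARKit or ARCore directly, the same script builds for both iOS and Android once the matching XR plug-in is installed.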
(Eberly, 2007; Fritsch & Kada, 2004). Without using a game engine, game developers would have to hard code all necessary systems such as physical simulation to make game objects move realistically or to make collisions between game elements (Unity Engine, 2019). Another distinct benefit example the game engines provide is to simplify the process of rendering graphics. Without a game engine, creating a smooth surface should go through the process, first creating a bunch of vertexes, then converting the vertexes into the triangulated mesh, and finally smooth the 3D mesh (Figure 1.12). 17 Figure 1. 12 Face Rendering example (a) the 3D vertexes, (b) the triangulated mesh, and (c) final smoothed mesh (Retrieved from https://www.researchgate.net/figure/Rendering-examples-of-the-facea-the-3D-vertexes-b-the-triangulated-mesh- and-c_fig4_257879204) Unity and Unreal are two popular game engines in the Game/XR domain. Unity is a licensed game engine that has become popular in the last seven years (Peters et al., 2016). It gained much favor from small companies as it is easier to start and Unity offered free license tier when Unreal cost a lot. The licensing and cost of both have changed over time. In contrast, the Unreal Engine had been widely used for over two decades only by large game companies due to the high license fee until 2014 to 2015 when Epic open-sourced Unreal. The latest version of Unreal Engine, Unreal 4, is available to the general public (Peters et al., 2016). Both engines are powerful, for example, the latest version of Unity 2019 can build games for more than 25 different platforms such as PS4, Steam, Android, IOS, and etc. (Unity, 2019). There are several differences between Unity and Unreal (Figure 1.14). Figure 1. 13 Game Engine Comparaison (Peter et al., 2016) (Yang et al., 2017) The user interface of Unity is comprised of 6 windows (Figure 1.14): The Hierarchy window contains all GameObjects in the current Scene Window. Users can select and edit these GameObjects such as models, lights, and colliders through the Scene Window. The Game View is the preview of how the game will look like. The Project Window contains all the assets that belong to the project. A Unity asset is an item that can be used in the developing Game or Projects. Assets include the files created outside of Unity, such as 3D models, audio files, images, and the files created within Unity, such as an Animator Controller, an Audio Mixer or a ScriptableObject. The Inspector window shows and manages the properties of any selected GameObject, Asset, or Setting. The console window presents deployment results (errors, warnings, and other messages). 18 Figure 1. 14 Unity User Interface 1.4 AR/MR implementations AR/VR applications are being implemented in different industries. A good example is the popular mobile game, Pokémon Go, where players can capture the Pokémon, virtual creatures, displaying on the player's real-world location (Figure 1.15). Figure 1. 15 Pokémon Go (Retrieved from https://www.imore.com/best-places-use-ar-mode-pokemon-go) Below list some cases using AR in different industries (Table 1.8). Table 1. 8 AR in different industries Education “Human Anatomy Atlas” is an AR application that allows medical students to learn anatomy in depth; providing more training opportunities for medical students; practicing surgeries on virtual patients. Manufactory Boeing uses the Skylight app Upskill for manufacturing. It allows technicians to wear AR glasses to identify the correct wire number using Skylight. 
Results showed that Boeing cut production time by 25% and reduced error rates to nearly zero. (https://rubygarage.org/blog/augmented-reality-in-education-and-training) (https://upskill.io/landing/upskill-and-boeing/)
Space industry: NASA is using HoloLens AR headsets to build its new spacecraft faster. Engineers and workers save time by assembling parts guided by virtual models with embedded information and instructions instead of reading paper instructions. (https://www.technologyreview.com/s/612247/nasa-is-using-hololens-ar-headsets-to-build-its-new-spacecraft-faster/)
Navigation: Yelp offers AR in its app to show virtual directions and guides to places such as cafes and restaurants, overlaid on the real surroundings. (https://www.houstonchronicle.com/techburger/article/The-strange-tale-of-Monocle-the-AR-pioneer-12371889.php)
Architecture industry: AR can help architects visualize and collaborate on a 1:1 scale BIM model on-site for building evaluations. (https://www.youtube.com/watch?v=r0ZmaljQlW8)
Construction industry: Contractors can use AR to see the overlaid as-built model for layout visualization, inspections, and quality control, for example, overlaying the MEP systems' layout as guidance for construction assembly. (https://mixedreality.trimble.com)
1.5 Development Tool
There are many BIM, AR, and other software programs. This section focuses on Unity, AR Foundation, ARKit, Revit, the Revit API, and Dynamo. Unity, AR Foundation, and ARKit can be used to create an iOS mobile AR application. As discussed in Section 1.3, Unity is easy to start with and provides a free personal license for creating games. It supports building games and applications for smartphones (Android or iOS). Unity also provides XR (VR/AR/MR) packages for users to easily build applications with AR features. Using the AR Foundation and ARKit Unity packages downloaded from the Unity Package Manager, an iOS-platform AR application can be made.
Autodesk Revit is a leading BIM software program that can provide BIM data (geometry and building components' functional information) to an AR environment. Using Revit is reasonable because it is one of the most common software programs used in AEC/FM during a project's life cycle. It also offers an API and Dynamo for users to develop custom functional tools. Moreover, Revit can export multiple file formats read by other software, especially other Autodesk software.
The Revit Application Programming Interface (API) is a tool for developing additional functions or features as plug-ins in Revit. It is mainly used for automating repetitive tasks and for extending Revit's functionality in simulation, computational design, and BIM management. The Revit .NET API allows users to write customized algorithms in any .NET-compliant language, including VB.NET, C#, and C++/CLI. Autodesk also provides a Revit Software Development Kit (SDK) with code samples and documents to guide users in using the Revit API, for example, documentation of the Revit API functions and the syntax of namespaces, classes, and methods (Figure 1.16). A namespace is a collection of specified classes, each of which includes multiple methods; a method is the lowest level of the hierarchy, containing a series of executable statements.
Figure 1.16 The hierarchy of namespace, class, and method (left) and the syntax of the ZoomToFit method (right)
A plug-in is a software component added to a software program to provide specific features or functionality.
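To make the namespace-class-method hierarchy concrete, below is a minimal, hedged C# sketch of an external command, an illustrative example rather than code from this thesis, that calls the ZoomToFit method shown in Figure 1.16. The class name is arbitrary, and a real plug-in would still need to be registered in an add-in manifest before Revit loads it.

```csharp
using Autodesk.Revit.Attributes;
using Autodesk.Revit.DB;
using Autodesk.Revit.UI;

// Illustrative external command: zoom the active view to fit.
// UIView is a class in the Autodesk.Revit.UI namespace;
// ZoomToFit() is one of its methods (see Figure 1.16).
[Transaction(TransactionMode.ReadOnly)]
public class ZoomToFitCommand : IExternalCommand
{
    public Result Execute(ExternalCommandData commandData,
                          ref string message, ElementSet elements)
    {
        UIDocument uidoc = commandData.Application.ActiveUIDocument;

        // Find the open UI view that corresponds to the active view.
        foreach (UIView uiView in uidoc.GetOpenUIViews())
        {
            if (uiView.ViewId == uidoc.ActiveView.Id)
            {
                uiView.ZoomToFit();
                break;
            }
        }
        return Result.Succeeded;
    }
}
```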
Using the Revit API, a plug-in can be added to Revit to realize additional custom features and functionality. For example, a simple plug-in, "Find and Replace Materials," allows users to find multiple project materials and replace them with the same material all at once (Figure 1.17) (Polynine Studio, 2018). This saves time and manual processing.
Figure 1.17 Replacing materials by batch processing in a Revit plug-in (Polynine Studio, 2018)
Dynamo is an open-source visual programming tool installed in Revit. Like the Revit API, it provides many functions, but as visual graphic coding nodes (commands): instead of typing scripts, the user works with pre-made nodes. Each node has one or more inputs and outputs and represents a specific function by accessing the Revit API. The user connects the inputs and outputs of different nodes to use the API functions and extend Revit's functionality (Figure 1.18). To create custom nodes, users can download released packages from the Dynamo Package manager or write Python or C# scripts. Some packages include archi-lab, Clockwork, LunchBox, and Rhythm. Compared to using the Revit API with C# in Visual Studio, Dynamo is much easier for non-programmers to learn. Like the Revit API, Dynamo can extend the features of Revit and enhance the efficiency of using it. Some advantages include automating repetitive tasks, accessing building data, designing parametric models, and testing performance or simulations (Figure 1.18, Figure 1.19).
Figure 1.18 An example of creating sheets and setting sheet names and numbers
Figure 1.19 Parametric bridge stadium design with Dynamo
1.6 Summary
This chapter introduced some basic concepts of BIM, FM, and VR/AR/MR, related tools and applications, and the development tools used in this study. BIM is popular and implemented across the AEC/FM industry; its interoperability and data-rich model enhance the efficiency of FM during the building lifecycle. FM is an important job that keeps a building operating normally; it is also time-consuming, as FM is responsible for all physical components. AR's ability to visualize virtual objects superimposed on the real world, together with increasingly accessible AR technology, has prompted many industries (non-AEC/FM and AEC/FM) to apply AR to specific tasks. It is therefore worth considering integrating BIM with AR visualization techniques to improve working efficiency in FM.
Chapter 2
2. BACKGROUND AND LITERATURE REVIEW
Chapter 2 describes specific research on BIM and FM, previous workflows for visualizing BIM data in AR/MR systems, and implementations that integrate BIM, AR/MR, and FM. The purposes, functionalities, and potential problems of BIM-VR/AR systems used for FM are also discussed.
2.1 BIM and FM
This section introduces the benefits of using BIM in FM and examples of BIM-based FM systems. BIM is a good database resource, providing three-dimensional geometry and building asset attributes that can be integrated with FM data. Using BIM can enhance FM in space management, preventative maintenance and retrofit, and efficient input of the FM database into the building information database. The challenges and limitations of integrating BIM and FM are also discussed. Finally, using augmented reality for BIM-based FM is proposed.
2.1.1 The potential of BIM and FM
Compared with the design and construction phases, a building's operation accounts for the most time in its lifecycle.
FM is mainly responsible for ensuring the optimum performance of the building and maintaining the asset sustainably. Owners and facility managers need related building information as a reference to make decisions, evaluate building performance, and execute building maintenance (Nicał & Wodyński, 2016). Because BIM is a platform that shares a database of building information in an understandable model, it has the potential to provide the building information database for FM. Therefore, 6D BIM, which refers to the post-construction phases (operation and maintenance) of a building or to sustainability, has received more and more attention in the AEC/FM industry because of its ability to improve FM efficiency in practice (Nicał & Wodyński, 2016). The benefits of integrating BIM and FM include three aspects: accurate space management, using BIM data for preventative maintenance and retrofit, and efficient input of the FM database into the building information database (Arayici et al., 2012).
2.1.1.1 Space analysis and management
Building space is an inherent asset of the building itself. It is important for owners and facility managers to understand the space before properly utilizing it during the occupancy stage. Space analysis and management include space classification, space re-assignment, underutilized space identification, forecasting space requirements, room allocations, etc. These tasks need spatial information: space numbers, areas (gross, assignable, and non-assignable), volume, room functions, location, and contents (equipment, furniture, lighting fixtures, etc.) (Becerik-Gerber et al., 2012; Nicał & Wodyński, 2016). This spatial information can relatively easily be inserted into the BIM database and then accessed through the software's schedules (K. Kensek, 2015). The traditional method of space analysis uses identifiers in CAD to fetch and show space information; however, it is not only time-consuming to update but also inconsistent in naming conventions. By contrast, BIM provides three-dimensional space visualization and per-instance space attributes, which makes it much easier to manage building space. Moreover, BIM enables FM personnel to trace back previous asset information. For example, to resolve a dispute or examine insurance, FM personnel may need to see and reproduce the original buildings as they were before being remodeled or expanded. BIM thus enables owners and facility managers to undertake space planning based on the BIM model visualization and the attached space attributes (Nicał & Wodyński, 2016).
2.1.1.2 Renovation/retrofit planning and feasibility
Facility managers need to ensure that all building elements are in normal operating condition. It is essential to record and update the existing building information. By checking the database, facility managers can decide whether building equipment needs to be fixed, upgraded, or replaced (Nicał & Wodyński, 2016). BIM provides historical building information for checking preventive maintenance and repair. An accurate building information model of an existing building can be utilized for future refurbishment or demolition. The architecture firm Harley Ellis Devereaux used BIM models to simulate existing buildings' energy performance (energy use intensity, EUI) and give retrofit suggestions for building upgrades (Figure 2.1).
Figure 2.1 Integrating BIM for FM energy analysis and building retrofit (K. Kensek, 2015)
2.1.1.3 Integrate BIM and FM databases
The data-rich BIM model can be used as a large database that stores the FM database for managing assets. The BIM 3D model represents building components' spatial relationships, and the detailed specifications of building equipment can be "exported into spreadsheet files or linked to a 3D model within an FM system" (Kensek, 2015). Because computerized maintenance management software integrates BIM data and FM data, each piece of equipment can be accessed and modified. For example, the Veterans Administration Space and Equipment Planning System (VA-SEPS) integrated BIM, GIS, and facility assets to enhance healthcare facility design and management (Figure 2.2) (Onuma, 2015).
Figure 2.2 Veterans Administration Space and Equipment Planning System (SEPS). Every piece of equipment is attached to BIM (Kensek, 2015)
Moreover, using BIM for FM can combine specific FM information and BIM geometric information in one integrated system/environment. A case study from Northumbria University indicated that an integrated BIM and FM system can save FM personnel time when checking and updating building equipment information (Kassem et al., 2015). In the original workflow, FM personnel had to take on the extra workload of updating the drawings and building information in separate systems (Figure 2.3, upper). With the BIM-based FM system, however, FM information is integrated into Revit (Figure 2.3, lower). Revit can instantly and automatically update schedules, generate elevation views, and directly visualize a three-dimensional data-rich model (Figure 2.4) (Kassem et al., 2015). Moreover, this BIM-based FM tool has potential for room locating or specification checking when the BIM is shared in other BIM programs.
Figure 2.3 Comparison between the original FM workflow (upper) and the BIM-based workflow (lower) (Kassem et al., 2015)
Figure 2.4 Integrated BIM provides FM personnel with various information (Kassem et al., 2015)
One of the difficult issues in combining BIM with FM systems is sharing information between software. Generally, the integration requires integration software to transfer the information from FM systems into the building information model. EcoDomus is one such integration program, providing real-time integration of the Asset Information Model (derived from BIM) with Building Management Systems (some FM systems) (EcoDomus, 2019). For example, USC facility management extracted BIM data into different FM systems using EcoDomus FM. EcoDomus was selected as a central database repository to store building information and FM information from multiple systems, including BIM, CMMS, BAS, EDMS, GIS, etc. (Kensek, 2015). Transferring BIM data to EcoDomus was done with custom Revit plug-ins. The final integrated result for a piece of equipment in EcoDomus contains all data from all systems, including the three-dimensional geometry, the green-highlighted position of the equipment, general equipment information, the equipment's operation and maintenance manual, and condition information, in one interface (Figure 2.5) (K. Kensek, 2015). With EcoDomus combining data from various systems into one integrated interface, FM personnel can save considerable time by checking all information at once instead of switching between programs repeatedly (Kensek, 2015). It also helps FM personnel obtain as much information as possible all at once.
The relevant information about one component, retrieved from different systems, is cross-referenced so that all the information in the interface is readily comprehensible.
Figure 2.5 Interface in EcoDomus with all integrated data (K. Kensek, 2015)
2.1.1.4 Real-time localization of building components
Facility managers must routinely locate and check building components such as equipment, systems, and materials (Nicał & Wodyński, 2016). Responding promptly is critical to detecting problems and finding solutions efficiently. Traditionally, FM relies on paper-based material, such as facility maps and asset lists, and on working experience to understand the materials (Nicał & Wodyński, 2016). However, this method is not feasible where objects are dangerous to access, and it is hard to find the corresponding building elements in the paper-based material. Therefore, many integrations of BIM with FM have been created to help facility managers quickly find the real-time location of building components in the FM system. By integrating BIM with other technologies such as Radio Frequency Identification (RFID) and barcodes, a building component's real-time location can quickly be shown in the BIM-FM system (Nicał & Wodyński, 2016). The facility manager can scan a 2D barcode on-site to get the necessary system information with a 3D model representation from the integrated BIM-FM system (Figure 2.6). This improves FM efficiency through accurate and swift BIM model searching (Lin, Su, & Chen, 2014).
Figure 2.6 Example of a BIM-FM system for real-time location reference (Lin et al., 2014)
2.1.1.5 Personnel Training and Development
Professional facility personnel should be familiar with building assets "when they are newly assigned, when new facilities are built or when existing facilities are renovated" (Becerik-Gerber et al., 2012). Training is required to ensure they can do their jobs well. However, the process of familiarizing facility personnel with building components ("boundaries, divisions, functions, maintenance schedules, and warranty of equipment, etc.") is based mainly on "presentations, site visits, hand-by-hand teaching, and self-learning," which requires substantial time (Becerik-Gerber et al., 2012). With BIM-integrated FM information, FM personnel can walk through building assets by visualizing virtual BIM models, spaces, components, and relevant functional data. Moreover, BIM makes post-training assessment easy to execute (Becerik-Gerber et al., 2012). Instead of being on-site, test questions can be answered using the BIM-FM system, for example, finding a certain piece of equipment's location and checking whether it should be repaired.
2.1.2 Challenges and limitations
There are several benefits to integrating BIM with FM; however, some challenges remain in implementing BIM for FM. The main barriers are cultural change and the interoperability issues between BIM and FM software (Nicał & Wodyński, 2016). The cultural barrier refers to the reluctance and difficulty of FM personnel in learning new technology and changing their ways of working. The second barrier is caused by the diversity of BIM and FM software; some tools, such as EcoDomus, the BIM 360 series, and YouBIM, provide solutions for integrating BIM and FM information. Moreover, the lack of a standardized guide for BIM and FM also makes integration difficult. For example, necessary data can be difficult to move from BIM to the FM system.
A hierarchy of data and process requirements for supporting BIM-to-FM shows what needs to be known in FM (Figure 2.7).
Figure 2.7 Data and Process Requirements to Support BIM-Enabled Facilities Management (Becerik-Gerber et al., 2012)
Besides the technical challenges of interoperability, some BIM-based FM systems are installed on local computers. Although these systems display the building model with data, and users can navigate the model to view pieces of equipment in the virtual model, there is still a gap between the virtual model and the on-site building components. These BIM-based FM systems cannot directly point out where the building components are located in the physical world; they only help FM personnel understand the spatial relationships of building components via the BIM model while quickly providing the required information. A more efficient way to check and update facility operation information may be a mobile application that integrates BIM and FM data. For example, barcodes on components can be scanned on-site to visualize the building information model and other operating information, so FM personnel can stand by the real objects on-site and see the BIM model with FM data. However, manually scanning barcodes is still troublesome when there are many facilities to scan. To both automatically access the BIM model with FM data and visualize real and virtual objects simultaneously on-site, augmented reality has begun to be implemented in the FM industry.
Table 2.1 Examples of mobile tools using BIM for FM
EcoDomus Mobile. Supported operating systems: iOS, Android, and Windows. Application features:
• View BIM data
• Access asset information and documentation by scanning a barcode on-site
• Add work orders and review records
http://ecodomus.com/products/ecodomus-mobile/
BIM 360 Field / BIM 360 Glue. Supported operating system: iOS. Application features:
• Navigate BIM models with object property data
• Select, hide, and reveal objects by discipline
• Make measurements, annotations, notifications, and markup responses to team members
• Clash detection (BIM 360 Glue)
• Scan barcodes to access BIM information for equipment on an iPad; the barcodes are generated in Revit before the model is uploaded to BIM 360 Field/Glue
Use BIM 360 Glue for on-site coordination (https://bim360resources.autodesk.com/customer-case-studies/better-coordination-fewer-clashes-2)
Access assets using on-site barcodes (https://www.autodesk.com/bim-360/field-construction-management-software/)
2.1.3 BIM and FM Summary
Using BIM for FM integrates the digital building information and facility management system databases. The combination of BIM and FM brings together data and software features that can help with several FM tasks. The integration is mainly hindered by interoperability issues between BIM and FM software. Most BIM-based FM systems only present 3D geometry with location descriptions and facility information in a virtual, integrated BIM-FM system that is not connected to the physical building equipment. Even when BIM-based FM systems provide mobile apps, it still takes time to manually scan barcodes. Using augmented reality to integrate BIM and FM information was therefore proposed as a way to visualize both the virtual model and the on-site building components.
2.2 BIM and AR/MR
This section gives an overview of AR and BIM, the benefits of using BIM and AR in construction, architectural design, and facility management, and current commercial BIM-based AR software.
2.2.1 Augmented Reality and BIM
The current visualization method in AEC/FM mainly uses various BIM programs to visualize building models and attributes. Most BIM-related research has focused on how BIM programs can better connect stakeholders to fully utilize BIM information, including 3D BIM (geometries and attributes), 4D BIM (3D + time), 5D BIM (3D + cost), and simulation results throughout design, construction, and maintenance. However, no matter how BIM is developed to support AEC/FM activities, BIM information is mostly shown in BIM software (on PC or mobile platforms) or transferred into 2D drawings. Engineers need to interpret the model and drawings based on their professional experience and their own spatial navigation to do their tasks, for example, finding the location of building components or checking that construction matches the design. To compensate for this inherent limitation of digital BIM programs, researchers found an opportunity to apply AR in the AEC/FM industry, especially in the construction field.
Augmented reality is an interactive experience that combines computer-generated perceptual information with the real-world environment. The principal feature of AR is that it builds a medium through which users visually receive additional computer-generated information while working in the real world. As a result, users see reality overlaid with additional virtual information. The most prevalent applications of AR are "in scientific visualization and gaming entertainment" (Wang et al., 2013). But as AR technology matured and advanced, various VR/AR HMDs and multiple open-source AR software development tools were created; over the past two decades, the Architecture, Engineering, Construction, and Facility Management (AEC/FM) industry has put much effort into bringing the benefits of VR/AR to its work (Du et al., 2016). BIM-AR visualization tools can build the connection between digital BIM information and the corresponding objects and environment in reality (Figure 2.8). This connection helps stakeholders easily and quickly understand design, construction, and operation by seeing the additional information overlaid on the real context; as a result, working efficiency and accuracy rise.
Figure 2.8 A BIM-based AR application (ACCEPT) used in construction (Ratajczak et al., 2019)
2.2.2 Using BIM and AR for construction
AR can potentially benefit eight of the 17 classified tasks in the AEC industry: "layout, excavation, positioning, inspection, coordination, supervision, commenting and strategizing" (Shin & Dunston, 2008). Another study shows that BIM and AR integration benefits six areas on the construction site: "interdependency, spatial site layout collision analysis and management, link digital to physical, project control, procurement: material flow tracking and management, and visualization of design during production" (Wang et al., 2013). The interdependency benefit means that BIM-based AR helps the different specialties work more collaboratively (Wang et al., 2013).
Using AR to display the integrated BIM information can help subcontractors quickly and correctly distinguish their interdependencies and understand the relationships among building components and systems (Wang et al., 2013). Linking digital to physical refers to overlaying (mixing) the 1:1 scale virtual model on the building components. Besides BIM data, other project information such as drawings, instructions, and pictures can also be added in AR to give on-site workers more information for making the right decisions and working efficiently. ACCEPT is a mobile AR application for smartphones and tablets that focuses on using AR to enhance construction activities. Its main features include 3D model navigation, displaying geometric and functional data of selected objects while walking through the buildings, checking construction details and instructions, and reporting construction progress (Figure 2.9).
Figure 2.9 Examples of tasks for which a BIM-based AR application can be used in construction (Ratajczak et al., 2019)
Some firms build custom AR/MR applications for AR/MR headsets. Instead of holding a smartphone or tablet, wearing an AR/MR headset on a construction site offers workers more convenience, as the head-worn device frees their hands. One example is the MR tool Trimble Connect for HoloLens, available in the Microsoft Windows App Store (Figure 2.10). It can help project stakeholders, such as the structural, mechanical, and electrical trades, coordinate, collaborate, and manage the process by visualizing precisely located, 1:1 scale holographic data on-site in the context of the construction environment.
Figure 2.10 Trimble's Hard Hat: visualizing the mixed virtual building systems on site. Accessed from https://mixedreality.trimble.com
2.2.3 Using BIM and AR for architectural design
In addition to the advantages in the construction field, there are benefits to integrating BIM and AR in architectural design. Traditional architectural design mainly uses CAD or BIM software to create building models. Although most programs have realistic rendering effects showing what buildings will look like after construction, there remains a gap between the rendered model and the final built result. For example, it is difficult to assess an overall conceptual design model in the context of the actual site. Some architects build an architectural scale model, a physical representation of the building, to show design ideas to clients and the general public, but building a scale model is time-consuming. An AR-BIM visualization system can help architects evaluate the conceptual design within its environment (J. Wang et al., 2014) (Figure 2.11). With this function, users can change the building color, building height, and building systems according to the environmental context (landscape, surrounding buildings, city planning, etc.) to make a better design (J. Wang et al., 2014). One example from Unity demonstrates an AR application developed using Revit files and Unity. Besides visualizing the entire building on-site to support design, another use of AR in design is to display the virtual building model of one floor, filtered by discipline, on 2D planning documents. This helps stakeholders make design decisions and streamlines the process (Figure 2.12). Some apps, like Kubity Go, link directly from Revit to VR/AR. The AR features of Kubity Go can detect a planar surface and place the linked Revit model on it (Figure 2.13).
Figure 2.11 Building planning visualization in the physical environment (https://www.youtube.com/watch?v=r0ZmaljQlW8)
Figure 2.12 Placing 3D models by discipline on 2D drawings (https://www.youtube.com/watch?v=r0ZmaljQlW8) (left); (https://www.youtube.com/watch?v=BvXgiKNsXXk) (right)
Figure 2.13 Kubity Go placing a linked Revit model on a detected physical surface
2.3 BIM, AR, AND FM
AR and BIM information can improve maintenance productivity by reducing the time spent inquiring about facility information, for example, by displaying an object's data (chair; producer: IKEA; price: $250; tel: 023-65400301) next to the physical object (Figure 2.14). By contrast, the traditional way of facility maintenance is a time-consuming process in which managers search an index sheet with detailed instructions to check product information.
Figure 2.14 Identifying a physical object and showing facility information in AR (J. Wang et al., 2014)
AR has been used by facility managers for daily inspection activities by visualizing the added data in mobile AR applications. Mobile AR could optimize the process for various tasks including "quality control, safety management, scheduling, mocking up spaces for clients, training workers, construction education, and facility management" (Irizarry et al., 2013). As AR technology matures through experimental research and practical applications, low-cost mobile AR is likely to change the AECO industry in the near future (Irizarry et al., 2013). Practical examples of using BIM information and AR technology in FM have demonstrated the benefits of retrieving BIM geometry and functional information in AR and overlaying it on the building systems in the field. A facility manager can easily check the information of a building facility, control the systems, and send reports to the manufacturers (Figure 2.15, Figure 2.16). This process can enhance working efficiency because the BIM geometry is no longer confined to a screen or printed paper; the digital model is merged with the actual, real-world systems. System information, such as names, codes, and operating data, is visible next to the virtual model on the AR user's screen, so users do not need to search for it themselves in FM software.
Figure 2.15 A screenshot of the ARWindow scenario (https://www.youtube.com/channel/UCCEXhTYvD_Slh0c--axwRVw)
Figure 2.16 A mobile augmented reality method for accessing building information: InfoSPOT (Irizarry et al., 2013)
2.4 Limitations of transferring BIM data to AR/MR
Previous research has used VR for "design coordination, project planning, construction education, safety training, and construction operations, facility management and real estate" (Du et al., 2018). Although those VR applications can potentially address issues in AEC/FM, many technical limitations of current VR workflows hinder the practical adoption of VR in the industry (Du et al., 2018). There are at least four major limitations in transferring BIM data to AR:
1. The complexity of importing BIM data into the AR environment
2. Poor real-time data exchange between BIM software and an AR environment
3. Imprecise position tracking for placing virtual objects accurately in the real environment
4. Limited computing speed and system memory
2.4.1 Complexity of moving BIM data to the AR environment
The first critical limitation is the complexity of transferring design data into a VR/AR environment. BIM is the predominantly used software with the necessary data in the AEC/FM domain.
When building a VR platform to visualize the design model and associated data, the detailed BIM-based model is the best resource for VR environment development. However, the conversion process from a BIM standard or exported file to a VR environment has led to problems (Du et al., 2018). The current common way of moving BIM to VR is a time-consuming and complex process, which impedes the implementation of VR in AEC/FM. The typical workflow has three steps (Bille et al., 2014) (Figure 2.17):
1. Opening a BIM model file, such as Revit's native RVT format, in a third-party graphics program (e.g., Autodesk Maya, 3ds Max, etc.)
2. Generating a newly rendered FBX file in that third-party "transitional software"
3. Importing the newly rendered FBX file into a game engine (e.g., Unity 3D or Unreal 4) to develop the VR/AR application
Figure 2.17 Workflow of Revit-Unity (Handel et al., 2016)
When the data are transitioned through third-party software, data are often lost (Du et al., 2016). Moreover, even converting a simple model to the VR environment can take several hours or even days. In fact, the most critical problem is not the cumbersome data transition but the programming and debugging skills required of AEC/FM industry users (Du et al., 2018).
2.4.2 Poor real-time data exchange between BIM software and the AR environment
The second limitation of BIM to VR/AR is that very few current BIM-based VR/AR programs support real-time data exchange between BIM software and an AR/VR environment (Du et al., 2018). Although many experimental studies have successfully demonstrated retrieving and transferring BIM data into a VR/AR environment, their workflows do not support full round-tripping of the BIM data; for example, one cannot export BIM data from VR/AR back to the BIM software. Moreover, most BIM data transfer is a manual process. When there is a change in BIM software such as Revit, users have to manually export and import building geometric data through intermediate rendering software such as 3ds Max or Maya and finally into a game engine like Unity, because the exchanged data are not dynamically synchronized between the BIM environment and the VR/AR environment. One example of a static data exchange method is the transfer between Revit and Unity (Figure 2.18).
Figure 2.18 A BIM-based and location-based augmented reality system (Ratajczak et al., 2019)
Changes that are not on the original drawings happen frequently in modern construction. Although VR/AR helps stakeholders better visualize and obtain BIM data on-site, without real-time BIM-to-VR implementation, changes in BIM platforms cannot be presented interactively in VR/AR devices; as a result, feedback can still be delayed. It has been noted that with traditional approaches to exchanging information, such as the Request for Information (RFI), the average response time is more than 10 business days (Steuer et al., 2013). Therefore, real-time BIM-based VR/AR can improve construction efficiency and reduce unnecessary rework (Du et al., 2018).
2.4.3 Imprecise position tracking to place virtual objects in the real environment accurately
Another limitation of implementing BIM in AR is that it can be difficult to detect the user's position accurately and place the virtual objects exactly in the real environment. Precise positioning is generally based on the user's head position and orientation.
For example, in the facility management domain, users mainly need information at a specific indoor location. However, current tracking technologies, such as GPS and Wireless Fidelity (Wi-Fi), are not as accurate indoors as they are outdoors. The relatively inadequate GPS or wireless network conditions therefore undermine a BIM-AR system for indoor visualization.
2.4.4 Limited computing speed and system memory on mobile devices
BIM is described as a data-rich environment. Transferring the necessary BIM data into a mobile AR system requires significant memory (Irizarry et al., 2013), and it is extraordinarily difficult to bring all the BIM information into a mobile AR application (Irizarry et al., 2013). AR's ability to visualize BIM information is reduced if only a small portion of the information can be processed in time and sent back to the AR system. Therefore, parsing BIM models into lighter geometric files to save AR system memory, and choosing which BIM data to include, are critical. Additionally, as discussed later, Revit and Unity have different defaults (such as the origin point location).
2.5 Current commercial BIM-based AR software
There are two commercial VR/AR tools, built on Unity and Unreal, that can easily transfer BIM geometry and information into a VR/AR environment.
2.5.1 Unity AR tool with BIM data
Unity can support many formats as assets, but for integrating CAD and BIM geometries, FBX and Collada (DAE) are the preferred file formats (Figure 2.19) (Boeykens, 2014).
Figure 2.19 Recommended file formats for moving data from different software to Unity (Boeykens, 2014)
In November 2019, Unity released a new Revit-to-VR/AR add-in called Unity Reflect (Figure 2.20). Unity Reflect is a real-time VR/AR platform that enables users on any platform (mobile or desktop) to visualize the Revit model and information in real time. One outstanding feature is the synchronization of BIM data between Revit and Unity Reflect, so that any change in Revit updates in VR/AR automatically. Unity Reflect also provides a preview package for Unity Pro users to build custom BIM-based VR/AR applications; with this package, the metadata of each object can be attached to the object in Unity (Figure 2.21). However, although the Revit data are synchronized between the two platforms, users can only make changes from Revit, while Reflect only visualizes the updates. Another limitation is that Reflect does not offer 1:1 scale augmented reality that can overlay the virtual object on the location of the corresponding real-world object.
Figure 2.20 BIM data synchronization (Unity Reflect, 2020)
Figure 2.21 BIM data synchronization (Unity Reflect, 2020)
2.5.2 Unreal 4 VR tool with BIM data
Another Revit-VR program, Twinmotion, developed on Unreal Engine 4, was also released in 2019. The Twinmotion Revit plug-in either loads Revit FBX files dynamically through Revit or imports an exported FBX file into Twinmotion. Because Twinmotion runs on the Unreal Engine, it has very high visual quality. Moreover, Twinmotion provides a rich library including weather, lighting, and time-of-day features, and it enables users to change object materials for better visualization (Figure 2.22).
Figure 2.22 A rendered view in Twinmotion after assigning new materials and changing weather data
However, its limitations include BIM data loss and the lack of data synchronization between Revit and the VR environment.
First, because Twinmotion only reads FBX files, the parameter information of Revit objects (length, thickness, cost, etc.) is not carried into Twinmotion. Second, there is no real-time control from Revit over the Twinmotion VR environment: users must either click the Revit Twinmotion plug-in to push an updated FBX file to Twinmotion or import a new FBX file in Twinmotion. Last, Twinmotion does not provide a pipeline back to Unreal Engine 4. Therefore, there is currently no way for users to add the lost BIM metadata to show geometry attributes beneath the virtual objects in Twinmotion.
2.6 Summary
This chapter introduced some basic concepts of BIM and FM and how BIM can contribute to FM tasks. The integration of BIM and AR has powerful potential to help AEC/FM tasks. Research on using BIM and AR in construction, architecture, and facility management was reviewed, and some current commercial BIM-based AR products were explained.
Chapter 3
3. METHODOLOGY
To build a potentially more efficient tool for facility personnel, a BIM-based mobile AR application, TransBIM, was created in which BIM data can be edited and exchanged in two-way communication between the BIM environment and the AR environment. Generally, there are three main workflows that realize the cycle of exchanging BIM data between Revit and the developed mobile app:
1. exporting BIM data (geometry and metadata) from Revit;
2. displaying and editing BIM data in the developed AR application (TransBIM), and uploading the new BIM data to the cloud (Dropbox);
3. updating the Revit model based on the modified BIM data.
This chapter discusses the overall methodology and functionality of the developed BIM-based AR application. Chapter 4 then explains in detail the scripts used in Dynamo and in Unity to develop the mobile app. Chapter 5 gives a case study of USC Watt Hall with usability validation of the developed tool.
3.1 Introduction of the workflow
Existing applications, such as Unity Reflect and Unreal Twinmotion, can visualize BIM data in a VR/AR environment. However, these applications do not focus on building a feedback loop with BIM data; they only allow transferring BIM data from the BIM software to the VR/AR application. While on-site workers use an AR application, they may want to change the building information model. Since current mobile AR applications do not provide BIM data editing capability, changes can only be made by an off-site designer with desktop software like Revit. This process requires communication and response time and may cause misunderstandings. Therefore, it is important that the BIM data circulate through BIM, AR, and users, so that users can change the BIM data in the AR environment on-site and then use the modified data to update the original BIM file (Figure 3.1).
Figure 3.1 Interaction between the user, Revit, and Unity 3D (the BIM data circle)
To enable transferring BIM data back and forth among different environments, the overall process can be categorized into three major phases: BIM data to AR, BIM data in AR, and BIM data in BIM. Each phase involves one environment working with BIM data: Revit; Unity and the developed AR application; and Revit again, respectively.
There are five tasks that allow the data to flow from one environment to another and enable the diverse functions of the mobile AR application (Figure 3.2):
• Prepare the BIM (Building Information Model)
• Export the BIM geometry and metadata (Geometry transfer & Metadata transfer)
• Develop the mobile BIM-based AR application, called TransBIM (Create AR in Unity)
• Use the AR application to visualize and edit BIM data (Use AR in the real world)
• Update the original BIM file in Revit based on the modified BIM data (Update BIM)
Figure 3.2 Methodology
3.1.1 Description of tasks
The first task (Building Information Model) focuses on preparing the building information model's geometry and metadata. The Revit model geometry was exported as an FBX file. Dynamo was used first to add the necessary parameters to the Revit objects and then to export all of the objects' metadata to a text file. The second part (Geometry transfer) addresses modifying the Revit-exported FBX file in Autodesk 3ds Max. The third part (Create AR in Unity) addresses how all information (optimized geometry and metadata) was combined in Unity and how the mobile AR application, with its user interface and interactions, was developed. The fourth part (Use AR in the real world) is to use TransBIM on-site, edit BIM data, and upload the changed BIM metadata in a text file to Dropbox. The last part (Update BIM) is to download the changed BIM metadata from Dropbox and use Dynamo to read the text file and update the Revit model.
3.1.2 Software selection
Because BIM data are transferred through multiple programs, the software must be selected for compatibility with the other software and the overall process. Exporting BIM geometry and metadata, developing a BIM-based AR tool, and updating BIM data are the main cores. Each core addresses different technical issues and can affect the methodology in the other cores because of possible interoperability issues.
For the BIM geometry and metadata export, Revit was chosen as the BIM software for the following reasons. First, Revit is one of the major software programs commonly used in AEC/FM during a project's life cycle. It not only represents 3D geometry but also stores functional data attached to the model; this data-rich model makes Revit an ideal resource for visualizing geometry and parameters in an AR environment. Moreover, Revit has high interoperability with other BIM programs and other Autodesk products: it supports reading and generating multiple file formats, and it offers an API and Dynamo for users to develop custom functional tools. The data-rich model and interoperability thus make it possible to integrate BIM data into another program to make an AR application. Considering the flexibility of Revit's functionality and its accessibility as a student, Revit was chosen as the software for making the building information model.
For the AR development part, a good game engine and AR SDKs make the development process easier. Because the goal is to create a mobile AR application for users to view and alter BIM elements' geometry and parameters, the selected game engine should interoperate with the file format exported from BIM and be able to build the AR application for mobile devices. An educational version of Unity was chosen.
Although Unity and Unreal are the two mainstream game engines, and both have invested in and developed tools for the AEC industry (for example, Unreal Twinmotion and Unity Reflect), Unity is relatively easier to start with than Unreal. Moreover, Unity offers AR SDK plug-ins such as ARKit, ARCore, and AR Foundation for making AR applications for mobile devices with different operating systems. Unity and its XR plug-ins (AR Foundation and ARKit) were used to develop the mobile BIM-based AR application for iOS.

For the BIM data update, the Internet is the channel through which Revit users remotely obtain the modified BIM data file from the mobile device. Dropbox was used to store the modified BIM data from the mobile users' side: a new text file with the modified BIM data was sent to Dropbox and then downloaded for use in Revit as the reference to update the Revit model. Considering that the mobile AR tool is only for research purposes with no commercial use, a personal Dropbox account is easier than building a web server to store the BIM data file, and a personal Dropbox account is free.

Overall, all the software used in the methodology must interoperate with the other software; if any one of these systems or applications fails to support the workflow, the entire loop breaks.

3.2 Building Information Model and BIM data exporting (BIM to AR)
BIM to AR is the workflow of exporting BIM geometry data and metadata from Revit to Unity by using Dynamo, Excel, and Autodesk 3ds Max (Figure 3.3).

Figure 3.3 Workflow of exporting BIM data

3.2.1 Define BIM objects
The building information model is the combination of 3D geometry and functional data of the building components. To enable multiple disciplines to work together and share building information, Revit provides discipline classification when making models. Different specialties can be modeled in multiple separate Revit project files and then integrated with the "Link Revit File" function in Revit. Revit classifies objects automatically into three disciplines:
1. Architectural: walls, doors, windows, floors, roofs, ceilings, furniture, etc.
2. Structural: columns, beams, trusses, foundations, etc.
3. Systems: ducts, pipes, conduits, electrical equipment, lighting fixtures, etc.
Users can set views to show objects of one or all disciplines (Figure 3.4).

Figure 3.4 Disciplines in Revit (Revit mechanical basic sample project)

Each object has a properties box in Revit, which shows the parameters (functional data) of the object. Because the tool is meant to visualize and edit information about the mechanical, electrical, and plumbing (MEP) systems in the AR application as a potential aid to facility management, the MEP objects are the main BIM data to be exported. Because one technical issue of AR is positioning the virtual models correctly in physical space, some architectural objects were also exported from Revit as reference objects for the real-world building (Figure 3.5).

Figure 3.5 Necessary Revit objects to be exported

3.2.2 Add new parameter types to store object location
Because one feature of TransBIM is to change the building components' locations and use the objects' new locations to update the building information model in Revit, the exported metadata should include the coordinates of each object.
However, the default parameter types of each object do not include a parameter that stores the object's position; therefore, before all the metadata of the building components was exported, a group of new parameter types holding the corresponding coordinate location was added to each object. Objects that Revit creates by drawing lines, such as walls, columns, pipes, and conduits, are located by those lines. On the other hand, categories such as windows, doors, and lighting fixtures, which are created by attaching a host point to a surface such as a wall or ceiling, are located by a point. When getting the location of each object in Dynamo, the lines and points are shown in the Dynamo interface (Figure 3.6). To store this spatial information, line-based objects were given new parameter types for the corresponding line information, while host-based objects were given parameter types for the corresponding host point information.

Figure 3.6 Find the location of selected elements (either Line or Point)

Categories such as walls, pipes, ducts, and conduits were given six parameter types to store their lines' locations:
• line start point x, y, z coordinates as startPoint_x, startPoint_y, startPoint_z;
• line end point x, y, z coordinates as endPoint_x, endPoint_y, endPoint_z.
Categories such as columns, doors, lighting fixtures, electrical equipment, generic models, duct and conduit fittings, and air terminals were given three parameter types to store the individual host point location:
• host point x, y, z coordinates as hostPoint_x, hostPoint_y, hostPoint_z (Figure 3.7).

Figure 3.7 Adding new project parameter types to each object based on its location type (line or point)

After the new parameters were added to each category, Dynamo was used to assign the spatial information of each object to the newly created parameters (Figure 3.8, Figure 3.9).

Figure 3.8 Assign object spatial information to the newly created parameters
Figure 3.9 Results of adding the object's location information
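The thesis performs this step with a Dynamo node graph. Purely as an illustration of the same logic, a rough C# Revit API sketch is shown below; it assumes the custom parameters created above already exist as number-type project parameters and that the code runs inside an open Transaction (the method and property names are from the standard Revit API, but this is not the thesis's actual implementation).

```csharp
// Sketch: copy each element's control line or host point into the custom
// parameters (startPoint_x ... hostPoint_z). Must run inside a Transaction.
using Autodesk.Revit.DB;

static class LocationWriter
{
    public static void WriteLocationParameters(Element e)
    {
        switch (e.Location)
        {
            case LocationCurve lc: // line-based: walls, pipes, ducts, conduits
                XYZ s = lc.Curve.GetEndPoint(0);
                XYZ t = lc.Curve.GetEndPoint(1);
                e.LookupParameter("startPoint_x")?.Set(s.X);
                e.LookupParameter("startPoint_y")?.Set(s.Y);
                e.LookupParameter("startPoint_z")?.Set(s.Z);
                e.LookupParameter("endPoint_x")?.Set(t.X);
                e.LookupParameter("endPoint_y")?.Set(t.Y);
                e.LookupParameter("endPoint_z")?.Set(t.Z);
                break;
            case LocationPoint lp: // host/point-based: doors, lighting fixtures, etc.
                XYZ p = lp.Point;
                e.LookupParameter("hostPoint_x")?.Set(p.X);
                e.LookupParameter("hostPoint_y")?.Set(p.Y);
                e.LookupParameter("hostPoint_z")?.Set(p.Z);
                break;
        }
    }
}
```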
3.2.3 Transfer Revit data to Unity
Revit can export many file formats, such as IFC, RVT, gbXML, DWG, and FBX, but not all of them include enough BIM information. IFC is the standard file format for BIM software, but it cannot be read in Unity. Although Unity supports FBX directly, with each object as an entity, an FBX file exported from Revit does not include the model elements' associated functional data. To visualize both the geometry and the attribute information of the building components in AR, the geometry and the functional data (metadata) of the selected Revit objects were exported separately in two file formats (FBX and CSV). These two types of data were combined again in Unity so that users can see both the BIM geometry and the attribute information in the AR environment. This section contains three parts:
3.2.3.1 Export Revit objects' geometries to FBX for Unity
3.2.3.2 Modify the Revit FBX in Autodesk 3ds Max (convert Autodesk materials to standard materials & reset object pivots)
3.2.3.3 Export Revit objects' metadata for use in Unity

3.2.3.1 Export Revit objects' geometries to FBX for Unity
Unity can import an FBX file as an asset. An asset is an item that can be used in the Game or Project being developed. Users can create assets (certain types of files) inside Unity but can also import various supported file types created outside of Unity, such as a 3D model, an audio file, or an image. FBX is one of the file formats Unity supports. All the Revit objects' geometry can be correctly read and shown as independent entities; each entity is named with the same family type name and element ID that it had in Revit (Figure 3.11). For example, a wall instance of the Revit family type "Basic Wall Generic-8 inches" has the element ID "385959" in Revit (Figure 3.10); the wall in the FBX file is named "Basic Wall Generic-8''- [385959]" (Figure 3.11).

Figure 3.10 Check the Revit element ID with Dynamo
Figure 3.11 FBX hierarchy in Unity

The FBX exported from Revit stores a portion of the Revit objects' information, including the objects' hierarchy, meshes, material information, Revit family names, and Revit object IDs. In the Unity Hierarchy window, each GameObject in the FBX hierarchy has a name containing the family name and the Revit object ID. The element ID was used to link the BIM metadata back onto each object. A 3D view (e.g., "3D View: 3RD FLOOR Copy 1") in the FBX hierarchy was originally a 3D-view element (camera) in Revit and is interpreted as a camera object in Unity (Figure 3.11). This type of element was deleted because it conflicted with Unity's existing camera, and only one movable camera in the scene was needed for the AR development.

Although this method of importing Revit geometry into Unity via an FBX file mostly works, the textures used in Revit were not embedded with the FBX into the Unity project. As a result, even though the FBX file contains the material information of each object, the materials are missing because their textures cannot be found in Unity: all the imported geometry displays the same white color, and objects are assigned Unity's default material (Figure 3.12). To make the geometry of the different disciplines' objects more distinguishable, Autodesk 3ds Max, an intermediate rendering program, was used to process the FBX file exported from Revit and generate a new FBX file with material information that Unity can recognize (compatible with the Unity material library).

Figure 3.12 Importing the FBX exported from Revit into Unity

Another problem with using the Revit FBX file in Unity is that every object in the FBX hierarchy has its pivot point at the same location instead of on the object itself. In Revit, there are two coordinate systems, the survey coordinate system and the project coordinate system. Usually a Revit project uses the project coordinate system, in which all Revit objects are positioned relative to the project base point, usually (0,0,0) (Figure 3.13).

Figure 3.13 Base point of the project coordinate system in Revit, Dynamo, and Unity

When the Revit FBX file is transferred into Unity, although Unity shows the Revit entities with the same correct relative positions as in Revit, most elements use the Revit project base point location as their pivot point. For example, the pivot of an object located by a line (walls, pipes, conduits, etc.) is far from that object's geometry (Figure 3.14). An object's pivot point is the control point on which all of its transformations are based. As a result, when moving, rotating, or scaling the object in one of the mobile AR functions, the transformation does not follow the object itself but a point outside of it (Figure 3.15). A floor element was selected as an example to illustrate the issues caused by the disagreement between the pivot and the center point of an imported object.
The floor rotates around the outside pivot (Figure 3.15).

Figure 3.14 A pipe's pivot shown at the project base point location instead of on the pipe itself
Figure 3.15 Floor rotation based on its "Pivot"

Because using the Revit project base point as the pivot point for most objects produces unexpected and undesirable transformations, the location of each object's pivot had to be reset in 3ds Max to support the intended interactive functions.

3.2.3.2 Modify the Revit FBX in Autodesk 3ds Max (convert Autodesk materials to standard materials & reset object pivots)
Autodesk 3ds Max (3ds Max) was used as the bridge software to convert Autodesk materials to standard materials and to reset the objects' pivot locations. 3ds Max offers an FBX add-in to read an imported or linked FBX file; if the FBX file is linked, any changes to the linked file are automatically updated in 3ds Max. When processing the FBX file, geometry can be grouped by Revit material, Revit category, Revit family type, or as independent entities (Figure 3.16).

Figure 3.16 Organize the FBX file by entities

To guarantee that the objects' geometry keeps the same IDs in Unity as in Revit, "Do Not Combine Entities" was selected so that the FBX file is read entity by entity in 3ds Max (Figure 3.16). As a result, each element remained independent and kept the same Revit ElementID it had in Revit. Although 3ds Max attaches textures automatically, it is necessary to check the 3D view in 3ds Max for missing textures. The "Asset Tracking Window" under the "File" icon lets users reload a material's textures by updating the textures' paths (Figure 3.17).

Figure 3.17 Check or reload object materials

Before an FBX file is exported from 3ds Max, all materials should be converted from Autodesk materials to standard materials for use in Unity. The conversion was done with the "Scene Converter," where users can set rules for material conversion (Figure 3.18). All Autodesk materials, such as "Autodesk Generic, Concrete, Metal, Glazing, Wall Paint, Water, etc.," were converted to standard materials; the exceptions are standard materials, physical materials, and missing materials, which do not need conversion. After the scene was converted with these rules, the "Slate Material Editor" showed that all scene materials had been successfully converted to standard materials (Figure 3.19).

Figure 3.18 Convert Autodesk materials to standard materials (1)
Figure 3.19 Convert Autodesk materials to standard materials (2)

The results show that after passing through the go-between software (3ds Max), the objects in the new FBX were distinguishable in Unity, with their corresponding materials (Figure 3.20).

Figure 3.20 Comparison of the Revit and 3ds Max FBX files in Unity

After the materials of the Revit FBX file were converted, the other task to be done in 3ds Max was resetting the pivot position of each object. An object's pivot is its control point: movement, rotation, and scaling all refer to it. For example, when an object is rotated, it orbits around its pivot, and if the pivot is outside of the object's geometry, the object cannot spin around itself. Without resetting each object's pivot, all objects from the Revit FBX file in Unity share the same pivot location, which in Revit is the project base point, usually (0,0,0) (Figure 3.21). This means that for most objects the pivot is not at the object's geometric center but outside of its geometry.
Therefore, 3ds Max was used to reset each object's pivot to the center of the object (Figure 3.22, Figure 3.23). With the pivot repositioned at each object's center, it is much easier for the mobile AR application to manipulate the building components.

Figure 3.21 A pipe's pivot shown at the project base point location instead of on the pipe itself
Figure 3.22 Before resetting the object's pivot, a line-based object's pivot is outside of the geometry
Figure 3.23 Reset the object pivot to the center of the object in 3ds Max

Finally, after the materials were converted and each object's pivot was reset to its geometric center, the overall model in 3ds Max was exported to a new FBX file, which was then imported into Unity. This workflow from Revit to 3ds Max to Unity successfully transferred the geometry and materials to Unity.

3.2.3.3 Export Revit objects' metadata for use in Unity
The workflow of importing Revit geometry into Unity has been described; the next step is to export and convert the Revit objects' metadata. This is somewhat more difficult than exporting the geometric data because Revit does not provide a simple, direct method of exporting metadata. Although BIM metadata can be stored completely or partially in certain file formats, such as RVT and IFC, most file formats carrying BIM data are not supported in Unity. For example, Revit can compile a project's data into an IFC file based on specific rules, or translate an IFC file to generate Revit objects, but Unity has no such functions. Even though an IFC file can be opened in Notepad, the IFC data is mostly coordinates and reference numbers (Figure 3.24); there are no strings or numeric values for specific Revit parameters. As a result, IFC was not used to retrieve the Revit metadata.

Figure 3.24 IFC data opened in Notepad

Any data structure that Unity can read and understand is a candidate. A feasible and simple way to export Revit metadata is to generate Excel files from Revit schedules. However, a Revit schedule can only present the parameters of a group of objects in the same category, such as walls, conduits, or pipes (Figure 3.25).

Figure 3.25 A test of exporting duct information with a Revit schedule

Retrieving information about all the necessary objects (reference objects and objects of interest) would require multiple repetitive passes. It would take a long time to manually set up a schedule for each category (walls, ducts, conduits, etc.) and export the schedules to a collection of Excel files, and manually combining the data from all those files into one final file is error-prone. To avoid mistakes and save time when generating BIM metadata for the selected objects, Dynamo was used to create a custom function that exports the Revit metadata. The Dynamo scripts let users retrieve all metadata of the selected elements and write the data into a .CSV file (Figure 3.26, Figure 3.27). A CSV file (comma-separated values file) is a delimited text file that uses commas to separate values; when a CSV file is opened in Excel, the separate values are placed in separate cells (Figure 3.27). The metadata includes all of each object's native Revit parameters and all the custom parameters, such as "Element ID," "End_point," and "Start_point," created with Dynamo (Chapter 3.2.2). The .CSV file was then imported into Unity for developing the mobile AR application, TransBIM.

Figure 3.26 Export Revit metadata (element ID and parameters) to Excel in Dynamo
Figure 3.27 Results of the exported metadata in Excel
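As a concrete illustration of the layout in Figure 3.27, each CSV row begins with the element's ID and continues with "parameter type: parameter value" pairs. The row below is hypothetical: the ID and the Area value reuse examples that appear elsewhere in this thesis, and the remaining values are invented for illustration.

```
ElementID: 385959,Area: 148 SF,Unconnected Height: 10' 0",startPoint_x: 25.5,startPoint_y: -3.0,startPoint_z: 0.0,endPoint_x: 40.5,endPoint_y: -3.0,endPoint_z: 0.0
```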
3.3 Integrate BIM with Unity for AR development (Create AR in Unity)
This section introduces the workflow of how the exported BIM geometric and functional data were used to make the AR environment in Unity. Six main tasks were accomplished in Unity: managing the FBX geometry; parsing the Revit metadata (.CSV file); designing the mobile AR user interface (UI); creating the interaction functions; generating a text file and developing the web backend request; and setting the tracking-image locations and building the AR app for iOS mobile devices (Figure 3.28). Detailed scripts and explanations for each task are given in Chapter 4.

Figure 3.28 Integrate BIM data in Unity (mobile AR application development)

3.3.1 Geometry management in Unity
This section introduces how the FBX file was imported into Unity and how it is used. There are three steps for working with the FBX asset:
3.3.1.1 Import the FBX into Unity & import settings
3.3.1.2 Unpack the FBX prefab and add scripts to the FBX entities
3.3.1.3 Create a new prefab with all the building components

3.3.1.1 Import the FBX into Unity & import settings
First, the FBX file exported from 3ds Max, with standard materials and reset object pivots, was imported directly into Unity to generate an asset. Unity set the import settings of the FBX file automatically: the default rotation of the model was (-90, 0, 0) to make the Z-axis point up in Unity, and the Scale Factor was 0.0254, which converts the file's units (inches) to meters (Figure 3.29).

Figure 3.29 Import the FBX into Unity and set the import settings

3.3.1.2 Unpack the FBX prefab and add scripts to the FBX entities
The imported FBX file is also a prefab, which is like a prototype or class that can be duplicated to create instances sharing common features. When a GameObject (a building component) inside a prefab is modified, all instances of the prefab change. To realize several functions of the developed AR application, such as letting users choose a single building component and edit its position, the individual objects in the prefab must have scripts attached. Therefore, the prefab (the imported FBX file) was first unpacked so that each entity under it could be manipulated individually (Figure 3.30).

Figure 3.30 Unpack the imported prefab

After the prefab with the Revit geometry was unpacked, three main components were added to each individual GameObject: "ReadCSV.cs", "ClickObject.cs", and a "Mesh Collider" (Figure 3.31).

Figure 3.31 Add script components to all FBX entities

"ReadCSV.cs" is a C# script that parses the imported CSV file and adds the relevant BIM data to each GameObject; it allows the separately exported BIM geometry and BIM metadata to be joined together. "ClickObject.cs" is the script that executes several functions when a user clicks a GameObject in the AR application. For example, the material color of the selected object changes to blue, and when another object is selected, the previously selected object's material changes back to its original color. This script also generates temporary UI components to show the BIM data (object parameters) of the selected object. "Mesh Collider" is a type of collider based on the GameObject's mesh shape; without it, "ClickObject.cs" cannot detect users clicking the components in the AR application. Other components are also attached to the individual building objects in Unity; the scripts, with detailed explanations, are covered in Chapter 4.
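The dependency on the Mesh Collider can be illustrated with a minimal selection sketch: Unity's Physics.Raycast only reports hits on colliders, so a tap can only "find" a building component that has one. The script below is an illustrative simplification, not the thesis's actual ClickObject.cs.

```csharp
// Minimal sketch: convert a screen tap into a ray and report the tapped
// building component. Requires each component to carry a Mesh Collider.
using UnityEngine;

public class TapSelect : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        Ray ray = Camera.main.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // hit.collider is the Mesh Collider of the tapped entity
            Debug.Log("Selected: " + hit.collider.gameObject.name);
        }
    }
}
```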
3.3.1.3 Create a new prefab with all the building components
After the FBX prefab was unpacked and the needed components were added, the next step was to set the model's pivot location. Depending on the building and the model, the pivot point of the FBX file can be at different locations; the unpacked FBX model prefab is shown below (Figure 3.32). Users should therefore set an overall pivot point for the individual FBX model.

Figure 3.32 Drag the FBX file prefab into the scene

TransBIM uses on-site images to anchor the virtual model to the real building. The pivot point of the FBX model represents where the first image should be placed in the real building. To make the on-site image easy to place, the center of a door was chosen as the pivot location. For example, the case study created a "Watt Hall" prefab and used the center of a door as its pivot location (Figure 3.33); the first scan image was accordingly placed at the center of the same door on site (Figure 3.34).

Figure 3.33 Set the overall pivot point of the Watt Hall prefab at the center of a door for scanning the first image
Figure 3.34 Set the tracking images' positions in both the virtual model and the real building

The following explains how the "Watt Hall" FBX model was given a new pivot point at the center of a door; users can follow the same steps to create their own TransBIM model prefabs with custom scan locations. The method for setting the FBX model's pivot point is to give the FBX model a parent GameObject that controls it. First, an empty GameObject was created (Figure 3.35). Users can move the empty GameObject to a location that is suitable as the pivot point of the overall FBX model; the place can be the center of a door, a point on the face of a wall, etc. In the case study example, this empty GameObject was moved to the center of a door (Figure 3.36).

Figure 3.35 Create an empty GameObject
Figure 3.36 Move the empty GameObject to the center of a door

Then the empty GameObject was named "TransBIM Model," and the unpacked Watt Hall FBX instance was placed under the "TransBIM Model" GameObject as a child (Figure 3.37). As a result, the "Watt Hall" FBX model is controlled by its parent ("TransBIM Model"), whose pivot point is at the center of the door (Figure 3.37).

Figure 3.37 Make the empty GameObject the parent of the unpacked FBX model

The "TransBIM Model" was saved as the final prefab and deleted from the scene (Figure 3.38). The new prefab was then used as the instantiated instance in "Tracked Image Info Multiple.cs", a script that enables TransBIM to load the "TransBIM Model" in the AR environment when the application runs. Detailed explanations of "Tracked Image Info Multiple.cs" are given in Chapter 4.

Figure 3.38 Save the "TransBIM Model" as the final prefab, delete it from the scene, and put it in the ARSessionOrigin

In summary, the geometry management process includes importing the FBX file as an asset, unpacking the FBX prefab, adding several components to the individual GameObjects, and building a new prefab with a new pivot location.
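The re-pivoting step above is done by hand in the Unity editor, but the same idea can be expressed in a few lines of code. The sketch below is only illustrative; "doorCenter" is a hypothetical world-space point chosen as the scan location.

```csharp
// Sketch of the manual re-pivoting step in 3.3.1.3, expressed in code:
// wrap the FBX root in an empty parent whose position becomes the pivot.
using UnityEngine;

public static class PivotHelper
{
    public static GameObject WrapWithPivot(GameObject fbxRoot, Vector3 doorCenter)
    {
        var pivot = new GameObject("TransBIM Model"); // empty parent GameObject
        pivot.transform.position = doorCenter;        // pivot sits at the door center
        fbxRoot.transform.SetParent(pivot.transform, true); // keep world position
        return pivot; // save this as the final prefab
    }
}
```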
3.3.2 Parsing Revit metadata
This section introduces how the metadata is visualized and how the data is parsed into pieces of information associated with the objects the data describes (Figure 3.39). In the "BIM to AR" part, the Revit metadata was exported to a CSV file. Like the FBX file, the CSV file was imported into the Unity project. A Unity C# script called "ReadCSV.cs" was created to read the CSV file, parse the data (Revit metadata) into groups by ElementID, and attach each group's information to the corresponding GameObject (building component). Each group contains the data from one row of the CSV file, where the first cell in each row is the unique six- or seven-digit element ID (Figure 3.39). Note that each building geometry in the Unity Hierarchy window is named with the building family type and the unique element ID; the unique element IDs were therefore used as the key references for joining the Revit metadata to the Revit geometry in Unity.

Figure 3.39 Match the element ID of the metadata and the imported geometry

In the geometry management process (Chapter 3.3.1), the "ReadCSV.cs" script was attached to each object in Unity; as a result, each object can read the .CSV file and retrieve its own corresponding metadata. The script also makes each object store the retrieved parameters of the metadata as variables. These variables and values are used for displaying the information in the AR user interface (Figure 3.40). The detailed scripts and explanations are in Chapter 4.

Figure 3.40 Parsed data of one object shown when clicking an object (duct)
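The core of this matching step can be condensed into a short sketch: pull the bracketed Revit element ID out of the GameObject's name, then find the CSV row whose first cell carries that ID. The class and field names below are assumptions for illustration; the thesis's ReadCSV.cs follows the same pattern but is more elaborate.

```csharp
// Condensed sketch of the ElementID-based join between FBX geometry and CSV metadata.
using System.Text.RegularExpressions;
using UnityEngine;

public class MetadataLookup : MonoBehaviour
{
    public TextAsset csvFile; // the imported Revit metadata (.CSV) asset

    void Start()
    {
        // e.g. "Basic Wall Generic-8''- [385959]" -> "385959"
        Match m = Regex.Match(gameObject.name, @"\[(\d+)\]");
        if (!m.Success) return;
        string id = m.Groups[1].Value;

        foreach (string row in csvFile.text.Split('\n'))
        {
            string[] cells = row.Split(',');
            if (cells.Length > 0 && cells[0].Trim() == "ElementID: " + id)
            {
                // cells[1..] hold the "parameter: value" pairs for this element
                Debug.Log(gameObject.name + " has " + (cells.Length - 1) + " parameters");
                break;
            }
        }
    }
}
```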
3.3.3 Designing the Unity user interface (UI) + user input
The user interface of the application, TransBIM, is composed of multiple components, including a scrolling window and several buttons (Figure 3.41).

Figure 3.41 Overview of designing the application UI system and input functions

The scrolling window was created to show the BIM data of the selected GameObject. The buttons are images that trigger different functions, such as uploading the modified BIM data (.TXT) to Dropbox. Because the scroll window shows an object's properties, it was designed with an interface similar to the Revit properties window (Figure 3.42).

Figure 3.42 Revit property window (left), developed AR tool property window (right)

In addition to the property interface's appearance, the data types of the different parameter values were also considered in the designed property window. There are basically four forms of parameter values in the Revit property window:
1. Non-editable: these parameters only display values, in light grey, and do not let users change them; for example, length, area, height.
2. User input: these parameters let users type text, such as comments.
3. Checkbox: these parameters' values are Booleans; users click them on or off to set the parameter true or false, for example, Room Bounding.
4. Fixed options: users click the parameter value to choose among preset options, for example, wall location line (Wall Centerline, Wall Finish Interior, Wall Finish Exterior) or phases.

To make the operability and presentation of the objects' parameters and values in the developed application similar to those described above, four UI components were designed to show the different types of parameters and values. They were made into prefabs and are duplicated when the matching parameter information is shown (Figures 3.43, 3.44, 3.45, 3.46):
• TextRow: only shows the information, with no editable features;
• InputRow: lets users type text in an input field;
• CheckRow: lets users toggle the parameter between true and false;
• OptionRow: lets users choose among preset options.
Each prefab has two white rectangular images, "paraType" and "paraValue," which show an individual BIM parameter's name and value, respectively (Figures 3.43, 3.44, 3.45, 3.46). The detailed scripts are in the "InventoryControl.cs" file explained in Chapter 4. Only a few parameters were tested for updating in the mobile AR: the Revit objects' spatial coordinates (start_point x, y, z; end_point x, y, z) and the duct/pipe/conduit size. Therefore, for most parameters, TextRows were used to present the parameter names and values purely as information.

Figure 3.43 TextRow prefab
Figure 3.44 InputRow prefab
Figure 3.45 CheckRow prefab
Figure 3.46 OptionRow prefab

With these four prefabs, when a GameObject is clicked in the application, the "ClickObject" script calls a function to determine how many parameters the object has and each parameter's data type (e.g., String, Double, Boolean). "ClickObject" then calls a function from "InventoryControl.cs," which creates the UI components (the UI prefabs TextRow, CheckRow, etc.) representing the selected object's parameters in the scrolling window. Each TextRow (Clone) or CheckRow (Clone) is a duplicate of its prefab and shows one piece of parameter information (Figure 3.47). For the detailed scripts and explanations, refer to Chapter 4.

Figure 3.47 The UI component prefabs are duplicated to show the selected object's parameter information
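The prefab-duplication pattern can be sketched in a few lines. The names below are assumptions (the thesis's InventoryControl.cs follows the same idea), and the sketch assumes the row prefab's two Text children appear in left-then-right order.

```csharp
// Sketch: instantiate one TextRow per parameter inside the Scroll Window
// and fill its two columns (parameter name, parameter value).
using UnityEngine;
using UnityEngine.UI;

public class PropertyWindow : MonoBehaviour
{
    public GameObject textRowPrefab; // the TextRow prefab (paraType + paraValue)
    public Transform contentRoot;    // the Scroll Window's content container

    public void ShowParameter(string paraType, string paraValue)
    {
        GameObject row = Instantiate(textRowPrefab, contentRoot);
        Text[] cells = row.GetComponentsInChildren<Text>();
        cells[0].text = paraType;  // left column: parameter name
        cells[1].text = paraValue; // right column: parameter value
    }
}
```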
3.3.4 Creating interactions
This section introduces the main interaction functions of the developed mobile AR tool, TransBIM (Figure 3.48).

Figure 3.48 Methodology of creating interaction functions

TransBIM was designed not only to use AR technology to better visualize BIM geometry and functional information on site but, more importantly, to let users quickly change the BIM data in the AR environment. The tool is meant to make the process more efficient when owners, facility managers, or designers walk through the building and want to update the BIM model or the systems' affiliated information. Instead of sending comments or photos of what should be changed to someone who then changes the model, this mobile AR tool lets users change the BIM data directly. Considering the changes facility personnel are likely to need, the tool was built with two major functions, changing the objects' positions and changing the values of some parameters:
• ducts'/conduits'/pipes' location
• ducts'/conduits'/pipes' size
• comments and mark

3.3.4.1 Changing objects' position
Translating an object is an easy, built-in function in Unity. However, because the final goal is to update the building information model in Revit, Revit has to be able to read the objects' translation changes made in Unity, and the way objects' locations are defined in Revit differs from the way they are defined in Unity. In Revit, walls, columns, pipes, conduits, etc. are created by drawing lines; their locations are controlled by two points (Figure 3.49). Categories such as windows, doors, and lighting fixtures are created by attaching a host point to a surface; each such object's location is based on one point (Figure 3.49).

Figure 3.49 Select a conduit and a window to show a control line and a host point (objects' geometries hidden)

Unlike Revit, all objects in Unity have only one control point, the pivot. Therefore, it was necessary to build a formula that converts the objects' position changes in Unity into changes of the control points in Revit. This is why, before the metadata was exported from Revit, the custom parameters (startPoint x, y, z and endPoint x, y, z) were added and exported with the BIM metadata (Figure 3.50).

Figure 3.50 Adding the object's location information to the object's parameters in Revit

With the added Revit coordinates, a GameObject in Unity has not only its pivot location in the Unity system (Figure 3.51, right) but also its Revit coordinate information (Figure 3.51, left). Translating (moving) an object in Unity changes its position vector (X, Y, Z). When a GameObject is moved in AR, the object's coordinate information is recalculated to the new values used to update the Revit geometry. A script was created to record the relative movement (X, Y, Z) of an object's pivot from the old location to the new location and to add this relative movement to the object's Revit coordinates.

Figure 3.51 A GameObject's pivot position (right); a GameObject's Revit coordinate information (left)

For example, when a GameObject that was drawn by a line in Revit is moved in the tool, TransBIM calculates the object's relative movement and rewrites the values of the object's start point and end point. The new values are calculated by adding the relative translation of the GameObject's pivot to the previous Revit coordinate values in each direction. The recalculated Revit coordinates of the moved objects are one type of modified BIM data stored in the text file that is later uploaded to Dropbox and then used to update the building information model in Revit. The detailed scripts (ReferenceObject.cs) and explanations of moving an object and changing the Revit coordinate information are in Chapter 4.
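The relative-movement idea can be sketched as follows. This is an illustrative simplification of what ReferenceObject.cs does, with assumed names; unit scaling and the axis remapping between Unity (Y-up) and Revit (Z-up) are reduced to a single factor and otherwise omitted for brevity.

```csharp
// Sketch: record the pivot's delta between "begin move" and "end move" and
// add it to the element's stored Revit start/end coordinates, so Revit can
// later redraw the control line at the new location.
using UnityEngine;

public class MoveTracker : MonoBehaviour
{
    // Revit coordinates parsed from the CSV metadata (in Revit's units)
    public Vector3 revitStart, revitEnd;
    Vector3 pivotAtSelection;

    public void BeginMove() => pivotAtSelection = transform.position;

    public void EndMove(float unityToRevitScale)
    {
        Vector3 delta = (transform.position - pivotAtSelection) * unityToRevitScale;
        revitStart += delta; // new startPoint_x/y/z, written to the text file
        revitEnd   += delta; // new endPoint_x/y/z
    }
}
```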
3.3.4.2 Changing the values of some parameters
With the designed UI property window (Chapter 3.3.3), users can quickly view the parameters of the selected building components. The property window contains a collection of two-column UI prefabs ("TextRow, InputRow, CheckRow, OptionRow"). Each two-column prefab contains two pairs of white background images and text, left and right. The C# script "InventoryControl" replaces the default texts with an object's parameter information: the left column shows the object's parameter name, and the right column shows that parameter's value. For example, in the TextRow UI prefab, "Comments" is one of the Revit parameter types shown in the left column, while "This is the parameter value" is the value of the "Comments" parameter shown in the right box (Figure 3.52). The UI components not only present an object's parameter names and values but also provide a portal for changing some of the parameters' information. As described in Chapter 3.3.3, the designed "TextRow, InputRow, CheckRow, OptionRow" prefabs let users change parameter values by typing text, clicking checkboxes, or choosing options.

Figure 3.52 The UI component prefab shows the object's parameter information

The size parameters of ducts, pipes, and conduits were the first three Revit parameter types that could be changed through the UI. By a similar process and method, the values of the mechanical systems' other parameters can also be changed. In Revit, ducts, pipes, and conduits have two types of cross-sections, rectangular and circular, so the size is controlled either by width and height (rectangular) or by diameter (circular). In Revit, an element's parameters and its model geometry are linked: the sizes of ducts, pipes, and conduits can be changed by editing the width/height/diameter values. For example, users can change the size values in Revit, and Revit automatically updates the model to the correct shape based on the current size information (Figure 3.53).

Figure 3.53 Change duct size in Revit

The imported parameter information (BIM metadata) was parsed and attached to the imported FBX model (BIM geometry) based on the unique ElementID. However, changing either of them in Unity does not automatically affect the other. Therefore, a formula was created to connect an object's size parameters with the object's geometry: after a user changes the value of a system's size, the geometry of the changed object is updated correspondingly in Unity. This was implemented by linking the size value to the object's scale value. When the size values change (the object's cross-section changes), the object is scaled in its non-longitudinal directions (Figure 3.54). For example, when a user modifies a circular duct's size from 4 inches (diameter) to 6 inches, the cross-section along the duct should become 6/4 = 1.5 times larger. The changed geometry is not transferred back to Revit to update the element; instead, the size parameter's value is transferred to Revit to update the native Revit element. When an element's parameter is changed properly, Revit automatically changes the element's 3D model correspondingly. The detailed scripts and a description of linking the object's size information to the object's scale factor are in Chapter 4.

Figure 3.54 Scale in non-longitudinal directions

TransBIM uses both the Revit objects' geometry (FBX) and the parameter data (CSV) to present the AR scene, but the application transfers only the changed parameter data back to Revit (such as a pipe's diameter or its new location), relying on Revit to update its own models.
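A minimal sketch of the size-to-scale link is shown below, under two stated assumptions: the names are invented for illustration, and the duct is assumed to run along its local Z axis, so only X and Y are scaled.

```csharp
// Sketch: tie a circular duct's diameter value to its Unity localScale,
// scaling only the non-longitudinal axes.
using UnityEngine;

public class DuctSizeLink : MonoBehaviour
{
    public float diameterInches = 4f;       // parsed from the Revit metadata
    Vector3 baseScale; float baseDiameter;

    void Start() { baseScale = transform.localScale; baseDiameter = diameterInches; }

    public void SetDiameter(float newDiameter)  // called from the size UI row
    {
        float f = newDiameter / baseDiameter;   // e.g. 6 / 4 = 1.5
        transform.localScale = new Vector3(baseScale.x * f, baseScale.y * f, baseScale.z);
        diameterInches = newDiameter;           // value written back to the text file
    }
}
```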
3.3.5 Generating files and developing the web backend request
Two more main functions are built into the TransBIM application: generating a text file and creating a web backend request (Figure 3.55). Any changes made in the AR application are automatically stored in a text file. The text file stores BIM metadata such as the ElementID, the new start point and end point x, y, z coordinates, and other functional information of the modified GameObjects. The text file must finally be used in Revit to update the original BIM data. Because TransBIM is a mobile AR application, a good way to transfer the text file is over the Internet; therefore, a web backend request function was developed so that users can upload the created text file to Dropbox. The detailed scripts and a description of how the text file is generated are in Chapter 4.

Figure 3.55 Overview of generating the text file & web request
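One plausible implementation of the upload request (not necessarily the thesis's exact code) uses Dropbox's HTTP API v2, which accepts a raw file POST; the access token and destination path below are assumptions for illustration.

```csharp
// Sketch: upload the modified-BIM text file to Dropbox via its HTTP API.
using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

public class DropboxUploader : MonoBehaviour
{
    const string kUrl = "https://content.dropboxapi.com/2/files/upload";
    public string accessToken; // personal Dropbox app token (assumed)

    public IEnumerator Upload(string textFileContents)
    {
        byte[] payload = Encoding.UTF8.GetBytes(textFileContents);
        var req = new UnityWebRequest(kUrl, "POST");
        req.uploadHandler = new UploadHandlerRaw(payload);
        req.downloadHandler = new DownloadHandlerBuffer();
        req.SetRequestHeader("Authorization", "Bearer " + accessToken);
        req.SetRequestHeader("Dropbox-API-Arg",
            "{\"path\":\"/modifiedBIM.txt\",\"mode\":\"overwrite\"}"); // hypothetical path
        req.SetRequestHeader("Content-Type", "application/octet-stream");
        yield return req.SendWebRequest();
        Debug.Log(req.responseCode == 200 ? "Uploaded" : "Upload failed: " + req.error);
    }
}
```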
3.3.6 Develop the iOS AR application and set the tracking images' locations
This part introduces how the AR experience was built for mobile devices (iOS) with Unity, the AR Foundation package, and the ARKit XR plug-in. The overall steps include downloading the Unity XR packages, setting up the AR scene hierarchy, and using the ARTrackedImageManager (Figure 3.56). Detailed scripts and explanations are given in Chapter 4.

Figure 3.56 Overview of developing the AR experience

AR Foundation allows developers to build augmented reality for multiple platforms with Unity. It does not implement any AR features itself; it only provides Unity developers an interface to other XR packages for specific devices. To use AR Foundation on a target device, a separate package for that platform is required (e.g., the ARKit XR Plugin on iOS or the ARCore XR Plugin on Android). This study used AR Foundation and the ARKit XR Plugin to build the AR application for iOS. The Unity packages used were AR Foundation, AR Subsystems, ARKit XR Plugin, and XR Management (Figure 3.57).

Figure 3.57 Download AR Foundation, the ARKit XR Plugin, and other XR packages in the Unity Package Manager

After the XR packages were downloaded, a basic AR scene hierarchy was added to the scene, comprising the AR Session and the AR Session Origin (Figure 3.58). These were created by right-clicking in the scene hierarchy in Unity and selecting XR > AR Session and XR > AR Session Origin (Figure 3.59).

Figure 3.58 Basic AR scene hierarchy
Figure 3.59 Add the AR Session and AR Session Origin

The AR Session GameObject is the fundamental component enabling an AR experience (ARFoundation, 2020). It automatically carries two script components, AR Session (Script) and AR Input Manager (Script) (Figure 3.60, left). These scripts let the AR Session control the lifecycle of an AR experience, camera capture, motion processing, and image analysis (Microsoft Documents, 2020). The AR Session Origin GameObject is used to detect trackable features (such as planar surfaces, feature points, and images) and to calculate their final position, orientation, and scale in the Unity scene (Unity world space). By default, the AR Session Origin has only one script component, AR Session Origin (Script) (Figure 3.60, right).

Figure 3.60 ARSession and scripts (left); ARSessionOrigin and scripts (right)

The ARSessionOrigin GameObject can also carry additional components. For example, trackable managers can be added to let the device detect specific physical features (Figure 3.61, left). The trackable managers enable the device to detect and track physical objects in the real world, including planes, point clouds, anchors, environment probes, faces, images, and 3D objects. The developed tool, TransBIM, uses the "AR Tracked Image Manager (Script)" to detect preset on-site images as references to anchor the virtual model on site (Figure 3.61, left).

Figure 3.61 ARSessionOrigin with all the trackable managers (left); ARSessionOrigin in the TransBIM scene (right)

The following explains how the ARSessionOrigin works in TransBIM. When a user opens the application, the AR session begins. The device's starting position, (0, 0, 0), is the origin of the device's coordinate system (the Unity world system) (Figure 3.62). The user then moves around holding the device to detect the on-site images. Through the AR Tracked Image Manager (Script), the pre-defined on-site images are found and given coordinates (x, y, z) in the Unity world system (Figure 3.62). The digital model can then be placed based on the locations of the detected images.

Figure 3.62 Tracked images' locations in Unity world coordinates

TransBIM uses two images as references to anchor the model to the real building. The two digital images were defined in the "Reference Image Library" of the "AR Tracked Image Manager" (Figure 3.63). As a result, TransBIM can detect these images anywhere with the device's camera and give them positions in Unity world space.

Figure 3.63 Define the custom scan images in the "Reference Image Library"

Hard copies of the two images were stuck on site in the real building (Figure 3.64). When the device scans the images, virtual balls are placed at the detected images' centers (Figure 3.64). The tool obtains the images' positions and calculates the vector direction between the two images in Unity world space; the virtual building model can then be instantiated with the correct location and rotation based on those tracked positions and that direction.

Figure 3.64 First image and second image locations

The first image, defined in the "Reference Image Library," was stuck at the center of a door (Figure 3.65, left). It is used to give the position of the overall model; this is why, when the model prefab was created, the model's pivot was reset to the center of the door (Figure 3.65, right). When the camera detects the first on-site image's location, it can anchor the model by merging the model's pivot with the first image's location. However, one image determines only the door's center, not the door's orientation. Therefore, the second image was placed on the surface of the door and used together with the first image to find the door's direction in the Unity world system (Figure 3.64). That direction rotates the model correctly, so the virtual door merges with the physical door and the whole virtual model is superimposed correctly on the real world.

Figure 3.65 Set the tracking images' positions in both the virtual model and the real building

Details of all the above scripts, and of how the image tracking works and how the virtual model is initiated on the tracked images, are in Chapter 4.
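The image-detection step can be sketched with AR Foundation's tracked-image callback. The thesis's TrackedImageAndLoadModel.cs follows this pattern; the class and field names below are assumptions for illustration.

```csharp
// Minimal sketch: react to newly detected reference images and drop a
// marker (the "virtual ball") at each image's tracked pose.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ImageAnchor : MonoBehaviour
{
    public ARTrackedImageManager imageManager;
    public GameObject markerPrefab; // the virtual ball placed on each image

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage img in args.added)
        {
            // img.referenceImage.name distinguishes the first and second image
            Instantiate(markerPrefab, img.transform.position, img.transform.rotation);
        }
    }
}
```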
3.4 Use AR in the real world
Chapter 3.3 explained the core functions of the developed mobile AR application, TransBIM. This section explains the overall steps of using TransBIM on site: the process of using the developed application to change the BIM data and transfer the modified BIM data from the user's side to an online location. A detailed example of using TransBIM is given in Chapter 5. Overall, there are four steps (Figure 3.66):
1. Scan the on-site images to locate the building information model
2. Visualize HVAC information
3. Manipulate objects (change BIM data)
4. Generate a text file of BIM data and upload it to Dropbox

Figure 3.66 Workflow of using the developed application

First, users open the application and scan the on-site images. Using AR Foundation and ARKit, TransBIM detects the on-site images, puts virtual balls at the images' centers, and records each image's location. Based on the detected images' locations, users can click the "Load Model" button to instantiate the virtual prefab model, correctly overlaid on the real world (Figure 3.67).

Figure 3.67 Scan images for BIM model locating

Second, users can walk around the building and click objects in the application to visualize BIM data. All parameter information of a selected object is shown in the "Scroll Window."

Figure 3.68 Visualize HVAC information

Third, parameters such as size information can be changed in the "Scroll Window" (Figure 3.69, left). Users can also move objects to new locations and change some objects' functional data (Figure 3.69, right).

Figure 3.69 Change duct size (left); change duct location (right)

Last, all changes to the BIM data are automatically stored in a text file. Finally, users click the "Upload" button to upload the text file with the modified BIM data to Dropbox (Figure 3.70).

Figure 3.70 Generating the modified BIM data and uploading it to Dropbox

The TransBIM UI and functions are listed below (Figure 3.71). Chapter 5 gives a detailed case study of using TransBIM with each feature.

Figure 3.71 TransBIM features

3.5 Update BIM
This section introduces the workflow of using the modified BIM metadata to update the building information model in Revit (Figure 3.72). Dynamo was used to create the custom tool that reads the modified BIM metadata and updates the building information model. The tasks include downloading the metadata file from Dropbox and running the Dynamo scripts, which automatically change the Revit objects' parameter values and locations based on the downloaded file. The detailed Dynamo scripts are shown in Chapter 4, and a case study of this workflow is presented in Chapter 5.

Figure 3.72 Workflow of updating BIM with modified BIM data by using Dynamo scripts

The uploaded text file was downloaded from Dropbox. Then the Dynamo script "Change location multiple.dyn" was opened to read the text file and update the Revit model (Figure 3.72). The Revit model was updated based on the changed BIM metadata: the objects' parameters, such as the sizes, comments, and mark values typed in TransBIM, were changed in the Revit model, and the Revit objects' locations were updated by reading the new coordinate information from the text file. As a result, the overall workflow of using TransBIM to edit the building information model in the AR environment was realized.

3.6 Summary
This chapter discussed the overall methodology and functionality of the developed BIM-based AR application. Five tasks realize visualizing and editing BIM data (Revit) in the developed mobile AR application (TransBIM) and transferring the modified BIM data back to Revit to update the model (Figure 3.73):
• Prepare the BIM (Building Information Model)
• Export the BIM geometry and metadata (Geometry transfer and Metadata transfer)
• Develop the mobile BIM-based AR application (Create AR in Unity)
• Use the AR application to visualize and edit BIM data (Use AR in the real world)
• Update the original BIM file in Revit based on the modified BIM data (Update BIM)

Figure 3.73 Methodology

The scripts used in the tool development are listed below (Table 3.1).

Table 3.1 A list of the C# scripts used in the tool development
AddTag.cs — A script with a function that checks whether a "Unity tag" exists in the project and adds the tag automatically if it does not. For example, wall objects need to be assigned the "Wall" tag; objects with tags can be filtered by category. (Called by ReadCSV.cs.)
AssignParameters.cs — A script with a function that assigns a dictionary of an object's metadata to the Unity ScriptableObject created for each object. (Called by ReadCSV.cs.)
wallParameter.cs (ScriptableObject) — A class extended from ScriptableObject that defines the fields of wall objects (e.g., "wall height").
ductParameter.cs (ScriptableObject) — A class extended from ScriptableObject that defines the fields of duct objects (e.g., "duct elevation").
conduitParameter.cs (ScriptableObject) — A class extended from ScriptableObject that defines the fields of conduit objects (e.g., "conduit size").
pipeParameter.cs (ScriptableObject) — A class extended from ScriptableObject that defines the fields of pipe objects (e.g., "pipe size").
ReadCSV.cs (MonoBehaviour) — Parses the CSV file and divides the CSV data into groups by ElementID; a group of data is the collection of all parameter information for one object. Stores the group containing only this GameObject's information in a dictionary. Creates a Unity "ScriptableObject" for each object, which is an asset file for storing information, and calls functions from "AssignParameters.cs" to assign the dictionary information to the created asset file.
ClickObject.cs (MonoBehaviour) — After a user clicks a GameObject, triggers functions such as:
• changing the selected object's material color to show that it is selected
• updating the "Property Window" and "Change Parameter Window"
ReferenceObject.cs (MonoBehaviour) — Records the currently selected object and the previously selected object. When a user clicks a new object (the current selection), the script knows what the last selection was, so that "ClickObject" can change the previous object's color back to the original. It also includes functions for changing the ducts'/pipes'/conduits' positions and recording the changes in a dictionary.
SwitchOver.cs (MonoBehaviour) — Contains functions triggered by UI buttons. For example, clicking the "Upload" button displays the upload window; likewise, other buttons trigger other functions in "SwitchOver.cs" to turn the UI interfaces on or off. When a user clicks the "Upload Data" button, a text file with all the changed BIM data is generated.
TrackedImageAndLoadModel.cs — Gets each detected image's location and places spheres at the tracked images' locations; loads the virtual model and places it based on the tracked images' (and thus the spheres') locations.

The detailed Dynamo scripts and the Unity scripts for the mobile AR development are explained in Chapter 4. Chapter 5 gives a case study of how to use the app, with a usability validation.

Chapter 4
4. Program Development
The overall methodology and the core processes were defined in Chapter 3; this chapter gives detailed explanations of how some of the core processes were realized in code. Dynamo and Unity were used to execute the developed scripts for the needed functions. The other, non-coding processes, which are done with specific software instead of written code, were explained in Chapter 3 and are not included in this chapter. These non-coding parts, listed below, are marked in grey in the diagram (Figure 4.1):
• Define the building information model objects in Revit (Chapter 3.2.1)
• Export the Revit objects' geometries to FBX (Chapter 3.2.3.1)
• Modify the Revit FBX in 3ds Max (Chapter 3.2.3.2)
• Use AR (TransBIM) in the real world (Chapter 3.4)

Figure 4.1 Processes with source code are translucent
The tool development is categorized into three parts: preparing the building information model (Dynamo), developing the mobile AR application (Unity), and updating the BIM with the modified metadata (Revit) (Figure 4.2). The first part, "Prepare BIM data," used Dynamo scripts to add new custom parameter information to objects (specifically the objects' coordinate information) and to export the objects' metadata to a CSV file. The second part, "Develop Mobile AR Application—TransBIM," used Unity C# scripts and AR SDKs (AR Foundation & ARKit) to develop a BIM-based AR application with the features described in Chapter 3. The third part, "Update BIM Model," used Dynamo scripts to update the Revit model based on the BIM metadata that was modified in TransBIM. The scripts can be found in Appendix B.1: ReadCSV.cs.

Figure 4.2 Scripts used for each function

4.1 Prepare Building Information Model Metadata
This section explains how the Dynamo file "Revit to excel.dyn" was used to prepare the BIM metadata. The Dynamo script addresses two major functions:
1. adding new types of parameters to store the objects' coordinate information in Revit;
2. exporting all of the objects' parameter information (metadata) from Revit to a CSV file.

4.1.1 Add coordinate information with Dynamo
As described in Chapters 1-3, objects in different categories have different control points, and these control points determine where an object is located in Revit. Objects created by drawing lines (e.g., walls, columns, pipes, conduits) are located by lines defined by a start point and an end point (Figure 4.3). Objects created by hosting on another component (e.g., windows, doors, lighting fixtures) are located by a single point (Figure 4.3). The locations of the control points are represented by coordinates in Revit.

Figure 4.3 Revit element location (a pipe is line-based and a door is host-based)

To store each object's coordinate information, new project parameters were first added in Revit. These newly created parameters were later exported with the other parameters as metadata. Depending on how an object is located (by line or by point), "Revit to excel.dyn" filters the selected objects into two categories, line-based and host-based objects, and creates the relevant parameters (Figure 4.4).

Figure 4.4 Adding new project parameter types to each object based on its location type (line or point)

Categories such as walls, pipes, ducts, and conduits were given six parameter types to store the individual line location:
• line start point x, y, z coordinates as startPoint_x, startPoint_y, startPoint_z;
• line end point x, y, z coordinates as endPoint_x, endPoint_y, endPoint_z.
Categories such as columns, doors, lighting fixtures, electrical equipment, generic models, duct and conduit fittings, and air terminals were given three parameter types to store the individual host point location:
• host point x, y, z coordinates as hostPoint_x, hostPoint_y, hostPoint_z.
After the new parameters were created, the Dynamo scripts assigned the control line/point coordinate information to the object parameters. The scripts filter elements into "one-point based elements" and "two-point based elements." Then the scripts read the host point x, y, z coordinates of each "one-point based element" and assign those values to the hostPoint_x, hostPoint_y, and hostPoint_z parameters of the corresponding element (Figure 4.5).
Likewise, Dynamo reads each "two-point based element's" control line coordinate information (start point x, y, z; end point x, y, z) and assigns the values to the parameters startPoint_x, startPoint_y, startPoint_z, endPoint_x, endPoint_y, and endPoint_z (Figure 4.5).

Figure 4.5 Assign object spatial information to the newly created parameters

With the above two processes, the Revit property window successfully shows the new parameters with the object's coordinate information (Figure 4.6).

Figure 4.6 Results of adding the object's location

4.1.2 Export Revit metadata to CSV with Dynamo
Once the coordinate information for all objects had been added to the Revit project, the metadata of the selected objects was exported to a CSV file. The groups of nodes for this are also in the "Revit to excel.dyn" file. There are five steps to exporting the parameter information to a CSV file (Figure 4.7):
1. Select Revit objects: let users select the objects in Revit.
2. Get object IDs: get each selected object's ElementID (a 6-8-digit number) and format it as the string "ElementID: XXXXXX".
3. Get object parameter values: get all parameter information for the selected objects and format each entry as a "parameter type: parameter value" string in a list (e.g., Area: 148 SF).
4. Object metadata: combine the ElementID and the other parameter information into one list (the object's metadata).
5. Export parameter data to Excel: generate a CSV file from the parameter information list.

Figure 4.7 Export Revit metadata (element ID and parameters) to Excel in Dynamo

Each row in the CSV file contains all of one object's parameter data. In each row, the first column is the ElementID, and the rest of the columns are the other parameter types and values (Figure 4.8).

Figure 4.8 Results of the exported metadata in Excel
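For readers more familiar with code than with node graphs, steps 2-5 above can be approximated with the Revit API. The sketch below is an illustration only (the thesis uses Dynamo, not this code); the method and property names are from the standard Revit API, and AsValueString() may return null for some parameters, which a production version would need to handle.

```csharp
// Sketch: format each element as one CSV row ("ElementID: ..." followed by
// "parameter: value" pairs) and write all rows to a file.
using System.Collections.Generic;
using System.IO;
using Autodesk.Revit.DB;

static class MetadataExport
{
    public static string ToCsvRow(Element e)
    {
        var cells = new List<string> { "ElementID: " + e.Id.IntegerValue };
        foreach (Parameter p in e.Parameters)
        {
            // "parameter type: parameter value", e.g. "Area: 148 SF"
            cells.Add(p.Definition.Name + ": " + p.AsValueString());
        }
        return string.Join(",", cells);
    }

    public static void WriteCsv(IEnumerable<Element> elements, string path)
    {
        var rows = new List<string>();
        foreach (Element e in elements) rows.Add(ToCsvRow(e));
        File.WriteAllLines(path, rows); // one element per row
    }
}
```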
ClickObject.cs (MonoBehaviour): After users click a GameObject, the script triggers functions such as
• changing the selected object's material color to light yellow to show that it is selected
• updating the "Property Window" and "Change Parameter Window"
ReferenceObject.cs (MonoBehaviour): Records the currently selected object and the previously selected object. When users click a new object (the current selected object), the script knows what the last selected object was so that "ClickObject" can change the previously selected object's color back to the original color. It also includes functions for changing duct/pipe/conduit positions and recording the changes in a dictionary.
SwitchOver.cs (MonoBehaviour): A script containing functions triggered by UI buttons. For example, clicking the "Upload button" displays the upload window. Likewise, other buttons trigger other functions in "SwitchOver.cs" to turn the UI interfaces on or off. When users click the "Upload Data" button, a text file with all of the changed BIM data is generated.
TrackedImageAndLoadModel.cs: Gets the detected images' locations and places spheres at the tracked images' locations; loads the model and positions the virtual model based on the tracked images' locations (also the spheres' locations).

4.2.1 Unity Scripting Overview
In Unity, the behavior of a GameObject is controlled by the <Components> attached to it. A GameObject is a container for many different Components. For example, by default, all GameObjects automatically have a <Transform> Component, which defines the position, rotation, and scale of the GameObject. Users can attach other components to a GameObject (Figure 4.10). Each GameObject under the FBX hierarchy has a <Mesh Filter>, a <Mesh Renderer>, and a <Mesh Collider> component:
• the <Mesh Filter> and <Mesh Renderer> define the geometry of the GameObject and how it is drawn;
• the <Material> component defines how the GameObject looks;
• the <Mesh Collider> is the component that enables Unity to do collision detection on the GameObject.
Unity scripts can also be components attached to a GameObject. All GameObjects under the FBX hierarchy were attached with "ReadCSV.cs" and "ClickObject.cs" (Figure 4.10).

Figure 4. 10 Add <Components> to the GameObjects under the FBX hierarchy

Users can create a new script from the Create menu at the top left of the Project panel or by selecting Assets > Create > C# Script from the main menu. All the developed scripts are in the "Assets\Addin_scripts" folder (Figure 4.11). Users can then select a GameObject and add script instances as components of that GameObject.

Figure 4. 11 Create scripts in the "Assets\Addin_scripts" folder

When a script is attached to a GameObject, it can realize features on that GameObject such as triggering game events, modifying other components' properties over time, and responding to user input in any way the script defines. For example, all GameObjects (Revit elements) under the FBX hierarchy were given instances of two scripts: "ReadCSV.cs" and "ClickObject.cs" (Figure 4.12).

Figure 4. 12 Attach "Mono Script" components to each Revit object under the FBX hierarchy

A script that can be attached to a GameObject should be a class extended from the MonoBehaviour class, the base class from which every Unity script derives (Figure 4.13). When extended from the MonoBehaviour class, scripts can make connections with the internal workings of Unity.

Figure 4. 13 Script derived from MonoBehaviour Class
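As a minimal illustration of the pattern in Figure 4.13 (a sketch, not code from the thesis), a script only needs to derive from MonoBehaviour and implement the event functions it cares about; Unity calls those functions itself:

using UnityEngine;

public class ExampleBehaviour : MonoBehaviour
{
    void Start()
    {
        // Called once, just before the first Update() after the script is enabled.
        Debug.Log("Started on " + gameObject.name);
    }

    void Update()
    {
        // Called every frame while the MonoBehaviour is enabled.
    }
}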
A Unity script (derived from MonoBehaviour) is not like the traditional idea of a program where the code runs continuously in a loop until it completes its task. Instead, Unity passes control to a script intermittently by calling certain functions declared within it. Once a function has finished executing, control is passed back to Unity. These functions are known as event functions because they are activated by Unity in response to events that occur during gameplay. Some of the important event functions are
• Start(): called just before the first Update() after the script is enabled
• Update(): called every frame, if the MonoBehaviour is enabled
• Awake(): called when a script is instantiated
• OnEnable(): called whenever a script is enabled
• OnCollisionEnter(): called whenever two colliders hit each other
• OnApplicationQuit(): called when the user quits the application
• OnDisable(): called whenever a script becomes disabled
• OnDestroy(): called when an object is destroyed

Unity executes the event functions in a predetermined order (Figure 4.14). For example, if a script has Awake(), OnEnable(), Start(), and Update() functions and all of them meet their trigger conditions, then according to the order diagram Awake() is called first, OnEnable() is executed before Start(), and Start() is executed before Update().

Figure 4. 14 Order of Execution for Event Functions (Partial Diagram) (https://docs.unity3d.com/Manual/ExecutionOrder.html)

4.2.2 Define ScriptableObject
A ScriptableObject is a type of data container that can store large amounts of data as an asset file (FileName.asset) in the project. Like MonoBehaviour, ScriptableObject derives from the base Unity object; unlike a MonoBehaviour, however, a ScriptableObject cannot be attached to a GameObject. This application uses different types of ScriptableObject as containers, the final place to store an object's metadata from the CSV file exported from Revit. Each GameObject had a corresponding ScriptableObject (asset file) created and saved in the "Assets/Resources/metadataScriptableObject" folder (Figure 4.15). Each ScriptableObject (asset file) stores one GameObject's (Revit element's) metadata (Figure 4.15). Using ScriptableObjects increases the operating efficiency of the developed application, TransBIM: the application can directly access a GameObject's asset file to retrieve the stored information instead of spending time re-reading the CSV file to find the GameObject's corresponding Revit element parameter information at every run time.

Figure 4. 15 The result of creating an asset file that stores the Revit metadata for each GameObject

In TransBIM, there are five types of ScriptableObject classes in total. Each category has its own type of ScriptableObject, and each type defines variable fields indicating what parameters belong to that category. The ScriptableObjects for "Walls", "Ducts", "Conduits", and "Pipes" were respectively defined in "wallParameter.cs", "ductParameter.cs", "conduitParameter.cs", and "pipeParameter.cs" (Figure 4.16). For other categories, such as "Roofs" and "Floors", the ScriptableObject was defined in "otherObjectParameter.cs".

Figure 4. 16 The ScriptableObjects for different categories (e.g. ductParameter.cs)
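A condensed sketch of what such a container class looks like is shown below. The handful of fields here only illustrates the pattern (the actual ductParameter.cs defines many more fields, as described next); field names follow the thesis's naming:

using UnityEngine;

// Condensed sketch of a parameter container in the style of ductParameter.cs.
public class ductParameter : ScriptableObject
{
    // Revit coordinate information
    public float startPoint_x, startPoint_y, startPoint_z;
    public float endPoint_x, endPoint_y, endPoint_z;

    // Text information, displayed but not editable
    public string ElementID;
    public string Category;

    // Editable values
    public float Offset;
    public bool size_lock;   // rendered as a checkbox in the UI
}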
The types of ScriptableObject defined for the duct category (Figure 4.17) and the conduit category (Figure 4.18) illustrate this. In ductParameter.cs, all the parameter types of Revit ducts were defined as variable fields. For example, startPoint_x, startPoint_y, and startPoint_z are the parameters that store the object's Revit coordinate information; parameters such as dimensions (height, width, size) and mechanical parameters (system type, system name, area, etc.) were also included. The variable type of each field depends on the individual parameter and on whether the parameter is editable or only displayed in the developed application. Overall, three types were used: "string" (text information), "float" (decimal numbers), and "bool" (a value that is true or false). Strings provide text information for parameters that users can view but not edit. Floats are numbers, so users can change them to adjust the size or location of the selected object. Booleans are the "true" or "false" values shown as checkboxes that users can switch on and off.

Figure 4. 17 Define the ScriptableObject for the duct category

Figure 4. 18 Define the ScriptableObject for the conduit category

The ScriptableObjects for "Walls", "Ducts", "Conduits", and "Pipes" were defined with detailed parameter information because one of the purposes of the tool is to check the Revit mechanical elements' functional information. For objects that do not belong to those four categories, such as "Roofs", "Floors", "Columns", etc., the scripts define an "otherObjectParameter.cs" ScriptableObject type to store limited general information (Revit coordinate information, ElementID, Category, etc.) (Figure 4.19).

Figure 4. 19 Define the ScriptableObject for objects not belonging to the "Walls", "Ducts", "Conduits", and "Pipes" categories

With a ScriptableObject defined for each category (ductParameter.cs, conduitParameter.cs, etc.), the asset files can be created. Examples show the new asset files for a duct and a conduit GameObject before values were assigned (Figure 4.20). Each GameObject had a corresponding asset file in the "Assets/Resources/metadataScriptableObject" folder (Figure 4.15). These assets were created when executing "ReadCSV.cs".

Figure 4. 20 Examples of assets created by the different ScriptableObject types

4.2.3 ReadCSV.cs (Parse Revit metadata)
"ReadCSV.cs" is one of the major scripts attached to the GameObjects. It addresses the "Parse the Revit metadata" part, i.e. extracting the data from the CSV file and storing the data in an asset file (ScriptableObject) for each GameObject. The workflow of "ReadCSV.cs" includes:
• Read the imported CSV file that was exported from Revit with the metadata of each Revit object (4.2.3.2);
• Parse the CSV file and extract the relevant parameter information of a GameObject based on a matching ElementID (4.2.3.3);
• Store the parameter information of the GameObject in a dictionary, "paraDict" (4.2.3.4);
• Create and save an asset file (ScriptableObject instance) to store the GameObject's parameter information by assigning information from the "paraDict" (4.2.3.5);
• Create tags in the project based on all GameObjects' "Category" parameter information (4.2.3.6);
• Load the asset file of the GameObject and give each GameObject a tag (4.2.3.7).

4.2.3.1 Set variables
ReadCSV.cs is a Unity class extended from MonoBehaviour and is attached to GameObjects.
The script starts by defining several fields (variables of any type declared directly in a class or structure) (Figure 4.21). The fields were set to public (e.g. public ScriptableObject) because the information stored in them needs to be accessed later by other scripts.
• "assetPathAndName": stores the file path used when creating a new asset (ScriptableObject).
• "loadassetPath": stores the file path used when loading the asset file to acquire information.
• "ObjectID": stores the ElementID (a 6-8-digit number) of the GameObject the script is attached to.
• "parameterArray" and "paraDict": respectively an array and a dictionary that temporarily store an object's BIM metadata before the ScriptableObject (asset file) of that GameObject is generated.
• "data": a static string array that stores all the CSV information, one row per string (all objects' metadata).
• "object_SO": a variable used to reference the ScriptableObject (asset file) that belongs to the GameObject. The created ScriptableObject is the final destination of the BIM metadata and is loaded when using the developed AR application.

Figure 4. 21 Set fields to store important information

4.2.3.2 Read the CSV file into an array of strings
After processing the variable declarations, Unity first executes the Start() function in "ReadCSV.cs". The Start() function reads the imported CSV file that stores all the objects' metadata. The CSV file was named "Revit_metadata_1201.csv" and placed in the "Assets/Resources" folder (Figure 4.22).

Figure 4. 22 CSV "Revit_metadata_1201.csv" in the Unity Assets/Resources folder

The data of the CSV file is read and stored in the "data" variable (static string[]) (Figure 4.22). The script first checks whether the value of "data" (static string[]) is null (Figure 4.23). If the value is not null, Unity has already passed the information from the CSV file to "data" during the runtime. Conversely, null means the CSV file has not been read and its metadata has not yet been assigned to the "data" field. In that case, the CSV file is read with "Resources.Load<TextAsset>("Revit_metadata_1201")" into a text. The text is split by line ('\n') into an array stored in "data" (string array) (Figure 4.23).

Figure 4. 23 Read the CSV file only one time and store the information in a static string[]

Because the variable "data" is derived with the Split() function, it is a string array. Each element in the array contains all the information of one row of the CSV file. For example, data[0] stores the information of the first row. One row contains all the metadata of one Revit object (Figure 4.24).

Figure 4. 24 Debug results show the example of data[0] information (first row of CSV)

The reason to make "data" a static variable is that there are multiple (potentially a substantial number of) objects with "ReadCSV.cs" attached. Unity executes all of these "ReadCSV.cs" instances so that each GameObject can generate its own ScriptableObject (.asset) to store its BIM metadata. However, it is not necessary for every instance to read the CSV file again. The static variable "data" therefore makes all the GameObjects share the same data: by reading the CSV file only once, all the GameObjects can extract their relevant BIM metadata from the static string array "data".
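A condensed sketch of this load-once-and-share pattern (simplified from ReadCSV.cs; error handling and the later parsing steps are omitted):

using UnityEngine;

public class ReadCSVSketch : MonoBehaviour
{
    // Static, so it is shared by every instance: the CSV is read only once.
    static string[] data;

    void Start()
    {
        if (data == null)
        {
            // Loads Assets/Resources/Revit_metadata_1201.csv as a TextAsset.
            TextAsset csv = Resources.Load<TextAsset>("Revit_metadata_1201");
            data = csv.text.Split('\n');   // one string per CSV row (one Revit object)
        }
    }
}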
4.2.3.3 Find the GameObject's relevant Revit metadata by matching ElementID
Each GameObject has a name containing the Revit family type and [element ID] (Figure 4.25). The array "data" derived from the CSV also contains the element IDs (Figure 4.26). By matching the ElementID, each GameObject's BIM metadata can be extracted from the "data" array and assigned to the parameterArray (public array) and paraDict (public dictionary) variables of that GameObject.

Figure 4. 25 Match the Element ID of the metadata and the imported geometry

Figure 4. 26 Static string[] "data" partial debug results

The ElementID from a GameObject's name is first extracted and stored in a string variable, "ObjectID" (Figure 4.27). The "ObjectID" is used to find which element in the "data" array contains the GameObject's metadata.

Figure 4. 27 Extract ElementID from the GameObject's name and assign it to the "ObjectID" variable

Then each ElementID in the "data" array is extracted and compared with the "ObjectID". Each element of "data" is a string storing one row of the CSV file, so to get the element ID from one string (one row of information), the string has to be sliced. A "for loop" iterates the slicing over each string (element of the "data" array). Because cells in one row are separated by commas (',') in CSV, in each loop the string is split by commas to get the individual cells of that row. This generates a new array assigned to the string[] "row" variable, where each element represents one cell's information. In each loop, row[0] stores the first cell of the row (e.g. "Element ID : 138949"). The statement "string ElementID = row[0].Substring(row[0].IndexOf(':') + 2);" extracts the Element ID (the 6-8-digit number) (Figure 4.28). If the "ElementID" matches the "ObjectID", then in this loop the array "row" (one row of information from the CSV) holds the metadata for this GameObject. As a result, that matching array is assigned to the parameterArray of the GameObject, and the loop is broken. At runtime, the Unity Inspector shows the metadata of one Revit object stored in parameterArray (Figure 4.29).

Figure 4. 28 Find the GameObject's metadata based on a matching ElementID and assign the matched metadata to the array "parameterArray"

Figure 4. 29 An example of a duct object assigned with metadata in the parameterArray variable

4.2.3.4 Store the GameObject's relevant Revit information in a dictionary
Each element in the parameterArray is a string that represents the information of one cell from the CSV file. The structure of one string is "Revit parameter name : parameter value" (e.g. "Size : 24x20"). Therefore, a dictionary was created for each GameObject to store the pair of information sliced from each string (Figure 4.30). The keys of the dictionary are the parameter types, while the values of the dictionary are the parameters' values. With the dictionary "paraDict" created, it is easier to find the specific value of a parameter; for example, paraDict["Size"] returns the value "24x20". The "paraDict" acts as a temporary container for an object's Revit metadata, because TransBIM uses ScriptableObjects (asset files) as the final place to store each GameObject's Revit metadata permanently. The next section, 4.2.3.5, explains how the "paraDict" data is assigned when creating the asset files.

Figure 4. 30 Slice the information from the array "parameterArray" into the dictionary "paraDict"
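The matching and slicing steps reduce to a few lines of C#. A simplified, self-contained sketch of this logic (condensed from ReadCSV.cs; the guard against malformed rows is added here):

using System.Collections.Generic;

static class MetadataParser
{
    // Find the row whose ElementID matches this GameObject's ObjectID and
    // slice its "name : value" cells into a dictionary.
    public static Dictionary<string, string> Parse(string[] data, string objectID)
    {
        var paraDict = new Dictionary<string, string>();
        foreach (string line in data)                    // one CSV row per string
        {
            string[] row = line.Split(',');              // one cell per element
            if (row[0].IndexOf(':') < 0) continue;       // skip malformed or empty rows
            string elementID = row[0].Substring(row[0].IndexOf(':') + 2);
            if (elementID != objectID) continue;         // not this GameObject's row

            foreach (string cell in row)                 // e.g. "Size : 24x20"
            {
                int sep = cell.IndexOf(':');
                if (sep < 0) continue;
                paraDict[cell.Substring(0, sep).Trim()] = cell.Substring(sep + 1).Trim();
            }
            break;                                       // found: stop looping
        }
        return paraDict;
    }
}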
4.2.3.5 Create a ScriptableObject based on the GameObject category and assign values
In "ReadCSV.cs", after each object has a "paraDict" dictionary with all of its metadata, a collection of asset files (ScriptableObjects) is created as the final place to store the dictionary information. Correspondingly, each GameObject creates one ScriptableObject. The script uses the "CreateAssetandAssignDictValue()" function to create an asset file (ScriptableObject) based on the GameObject's category type and to assign the asset file values from the dictionary that stores that GameObject's Revit element parameter information (Figures 4.31 and 4.32).

Figure 4. 31 Create an asset file (ScriptableObject) based on category type and assign values to the asset file

In the CreateAssetandAssignDictValue() function, the script uses paraDict["Category"] to check the category of the GameObject (Figure 4.32). Depending on the case ("Ducts", "Walls", "Conduits", or "Pipes"), the specific ScriptableObject instance for that category is created (e.g. "ductParameter ductAsset = ScriptableObject.CreateInstance<ductParameter>();"). If the case is not "Walls", "Ducts", "Conduits", or "Pipes", the GameObject belongs to another category, and the default option is to create an "otherObjectParameter" asset file to store limited information about that GameObject (Figure 4.32).

Figure 4. 32 Based on the object's category, create and save the asset file with the object's parameter information

In each case, a function from the "AssignParameter.cs" class is called to assign the dictionary information, i.e. to store the GameObject's Revit metadata in the created ScriptableObject instance (Figure 4.33, Figure 4.34). "AssignParameter.cs" is not a MonoBehaviour; it is a C# static class that defines static functions that can be used in other classes. Some of the static functions are
• public static void DictToDuctAsset (Dictionary<string, string> paraDict, ductParameter ductAsset);
• public static void DictToConduitAsset (Dictionary<string, string> paraDict, conduitParameter conduitAsset);
• public static void DictToPipeAsset (Dictionary<string, string> paraDict, pipeParameter pipeAsset);
• public static void DictToWallAsset (Dictionary<string, string> paraDict, wallParameter wallAsset);
• public static void DictToOtherObjectAsset (Dictionary<string, string> paraDict, otherObjectParameter otherOBAsset);

For conduits, the "DictToConduitAsset" function is called (Figure 4.33); similarly, for ducts, the "DictToDuctAsset" function is called (Figure 4.34). Each function assigns every dictionary key's value to the relevant field in the created ScriptableObject instance (Figure 4.33). After the instance has stored the parameter information from the GameObject's "paraDict", the ScriptableObject instance with the assigned dictionary values is finally saved as an asset in the "Assets/Resources/metadataScriptableObject" folder and named with the ObjectID (ElementID) (Figure 4.35).

Figure 4. 33 AssignParameter.cs defines functions to assign dictionary data to a ScriptableObject instance (DictToConduitAsset() partial)

Figure 4. 34 DictToDuctAsset() (partial)

Figure 4. 35 Results of each asset file with assigned information
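The whole of 4.2.3.5 condenses to the following editor-only sketch (simplified; it relies on the thesis's ductParameter/otherObjectParameter classes and AssignParameter functions, and only two of the five category cases are shown):

using System.Collections.Generic;
using UnityEngine;
using UnityEditor;   // AssetDatabase is editor-only

static class AssetCreationSketch
{
    // Branch on the Revit category, create the matching ScriptableObject,
    // fill it from paraDict, and save it as an asset named by ElementID.
    public static void Create(Dictionary<string, string> paraDict, string objectID)
    {
        string path = "Assets/Resources/metadataScriptableObject/" + objectID + ".asset";
        switch (paraDict["Category"])
        {
            case "Ducts":
                ductParameter ductAsset = ScriptableObject.CreateInstance<ductParameter>();
                AssignParameter.DictToDuctAsset(paraDict, ductAsset);
                AssetDatabase.CreateAsset(ductAsset, path);
                break;
            // "Walls", "Conduits", and "Pipes" follow the same pattern ...
            default:
                otherObjectParameter otherAsset =
                    ScriptableObject.CreateInstance<otherObjectParameter>();
                AssignParameter.DictToOtherObjectAsset(paraDict, otherAsset);
                AssetDatabase.CreateAsset(otherAsset, path);
                break;
        }
        AssetDatabase.SaveAssets();
    }
}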
4.2.3.6 Add tags in the Unity project
After the ScriptableObject was created and assigned the dictionary information, a series of tags was created based on the GameObjects' categories (Figure 4.37). The application assigns each Revit GameObject a tag representing its category. The reason to give each GameObject a tag is to later build a filter function. The filter function lets users choose categories to turn objects visible or invisible in the AR environment (Figure 4.36). When users select a category in the Filter Window, TransBIM can find all objects of that category based on the corresponding tag. This section only explains how the tags were created and assigned to each object; the filter function and examples are explained in Chapter 4.2.4.1.

Figure 4. 36 Filter objects by categories

The scripts first create the tags in the project before assigning a tag to a GameObject. Creating tags is realized by a static function ("CreateTag") in the "AddTag.cs" static class: "AddTag.CreateTag(paraDict["Category"])" (Figure 4.37).

Figure 4. 37 Add tags in the Unity project

The CreateTag(string tag) function in "AddTag.cs" loads all the tags from the Unity TagManager and checks whether the category already has a tag there. If a tag with the same category name is not found, the category tag is added to the project so that the GameObject can be assigned the tag (Figure 4.38). The results of the added tags can be seen in the Unity TagManager window (Figure 4.39).

Figure 4. 38 "AddTag.cs" and the CreateTag() function

Figure 4. 39 The results of tags shown in the Unity Tag Manager

After the tags were added to the project's TagManager, each GameObject could be given a tag based on its category. For example, the roof element was assigned a "Roofs" tag during the run time (Figure 4.40).

Figure 4. 40 Tag assigned to each GameObject and all created tags

4.2.3.7 Load the asset file in the mobile application and assign each GameObject a tag
With the previous functions (4.2.3.1~4.2.3.6), the information in the CSV file was stored in individual assets in the project folder, each asset storing one object's Revit metadata. As a result, when running the developed application (TransBIM), the system can directly use the asset file to retrieve a GameObject's BIM data instead of reading the CSV (Figure 4.41). With the created asset files, the mobile device does not need to run the functions listed below, because they are all transitional methods for transferring the data from the CSV file into the individual ScriptableObject instances (asset files).
• Read the imported CSV file that was exported from Revit with the metadata of each Revit object (4.2.3.2);
• Parse the CSV file and extract the relevant parameter information of a GameObject based on a matching ElementID (4.2.3.3);
• Store the parameter information of the GameObject in the dictionary "paraDict" (4.2.3.4);
• Create and save an asset file (ScriptableObject instance) to store the GameObject's parameter information by assigning information from the "paraDict" (4.2.3.5);
• Create tags in the project based on all GameObjects' "Category" parameter information (4.2.3.6).

To keep these functions from running on the mobile device, Unity platform-dependent compilation was used in "ReadCSV.cs" between line 42 ("#if UNITY_EDITOR") and line 98 ("#endif") (Figure 4.41, Figure 4.42). Platform-dependent compilation defines which platform compiles and executes a section of a script. For example, the scripts between "#if UNITY_EDITOR" (line 42) and "#endif" (line 98) can only be compiled and executed by the Unity editor. As a result, the transitional functions listed above run only in the Unity editor.

Figure 4. 41 Using the #define directive to call Unity Editor scripts from your game code
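The resulting structure of Start() can be sketched as follows (a simplification of ReadCSV.cs, keeping only the compilation boundary and the final load):

using UnityEngine;

public class ReadCSVPlatformSketch : MonoBehaviour
{
    public ScriptableObject object_SO;
    public string loadassetPath;   // e.g. "metadataScriptableObject/138949"

    void Start()
    {
#if UNITY_EDITOR
        // Editor-only "transitional" work: read and parse the CSV, build
        // paraDict, create and save the asset files, create the project tags.
#endif
        // On the mobile device, execution falls through to here:
        // load the asset file that was saved in the editor.
        object_SO = Resources.Load<ScriptableObject>(loadassetPath);
    }
}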
On the mobile device, the application skips lines 42 through 98, i.e. the parts that read the CSV file, split the CSV data, store the data in a dictionary, create new asset files, and generate new project tags. Instead, the mobile system passes over that part of the script and goes straight to the end to load the needed asset file directly (Figure 4.42). The script uses "object_SO = Resources.Load<ScriptableObject>(loadassetPath);" to load the GameObject's corresponding asset file and store it in the "object_SO" variable. Because an asset file stores a GameObject's parameter information, including "Category", the script then calls the "FindCategory" function to find the category information in each loaded asset file (Figure 4.42, Figure 4.43). After finding the category from the loaded asset file, each GameObject is assigned a tag based on its category name (Figure 4.42). An example shows a roof GameObject assigned a "Roofs" tag (Figure 4.44).

Figure 4. 42 Load the asset file and find the category information from the loaded asset file

Figure 4. 43 FindCategory() function to get the GameObject's category from its asset file

Figure 4. 44 Result of assigning a tag to each GameObject based on its category name

4.2.4 Designing the User Interface
This section explains what components are in the TransBIM UI system and how the UI system works (Figure 4.45). The UI comprises two scroll windows (the "Property Window" and the "ChangeParameter Window") and several buttons (Figures 4.45 ~ 4.47). The scroll windows allow users to see an object's parameter information; the buttons allow users to call functions. These UI elements (scroll windows and buttons) were placed in the Canvas (Figure 4.46). The Canvas is a GameObject, and all UI elements are children of such a Canvas; it is the area that all UI elements must be inside. After placing the UI elements in the Canvas, the Canvas was saved as a "Canvas" prefab so that users can drag the prefab into the scene and use the UI components to create their own "TransBIM" application.

Figure 4. 45 Overview of the TransBIM UI system and input functions

Figure 4. 46 The application UI components in the Canvas prefab (1)

Figure 4. 47 The application UI components in the Canvas prefab (2)

A description of each GameObject in the Canvas prefab is given in Table 4.2.

Table 4. 2 The descriptions of the GameObjects in the Canvas prefab
InventoryMenu: A Unity "ScrollView" component; it is the "Property window" in the application. When users click a GameObject, several UI elements are created by scripts and put into the scroll window in a vertical grid layout. Each UI element placed in the InventoryMenu represents one piece of parameter information.
CategoryWindow: Similar to the "InventoryMenu", the "CategoryWindow" is also a scrolling window, but it controls the visibility of GameObjects by category. It is placed at the same location as the "Property Window"; the application allows only one of the two windows to be visible at a time.
ChangeParameter: A Unity "ScrollView" component. It contains a few UI elements that show the editable parameter information of a GameObject.
Through the UI elements in the "ChangeParameter" scroll window, users can change the editable parameter information to manipulate the GameObject in the AR environment.
UploadUIWindow: The interface with buttons and texts to control and show the status when uploading the file that stores the modified BIM data to Dropbox.
ButtonFunctions: A GameObject attached with a MonoBehaviour script called <SwitchOver.cs> containing several functions. The functions are called when users click the corresponding buttons.
Button Property: A Unity "Button" component. Clicking this button calls a function in "ButtonFunctions" to turn the visibility of the "InventoryMenu" (Property Window) on or off.
Button Category: A Unity "Button" component, which calls a function to control the visibility of the "CategoryWindow".
Button Color: A Unity "Button" component. Clicking the button calls a function that controls whether the currently selected GameObject is shown in its original material color or in a default color such as "yellow".
Button Control: A Unity "Button" component. Clicking the button calls a function to control the visibility of the "ChangeParameter window".
Button Upload: A Unity "Button" component. Clicking this button calls a function to make the "UploadUIWindow" pop up (set the "UploadUIWindow" active).
Load Model: A Unity "Button" component. Clicking the button calls a function that puts the whole building information model in the right location so that the model is merged with the real environment.
Invisible: A Unity "Button" component that looks like a closed eye. It calls a function to turn off the visibility of the currently selected GameObject.
Visible: A Unity "Button" component that looks like an open eye. Unlike the "Invisible" button, the "Visible" button turns on the visibility of the currently selected GameObject.
AllVisible: A button that calls a function to turn all invisible GameObjects visible.
Panel: An interface, a "welcome panel", shown when users open the application. It gives step-by-step instructions on how to use the application.

4.2.4.1 Unity "ScrollView" Component
The "InventoryMenu" (Property Window), "CategoryWindow" (Filter Window), and "ChangeParameterWindow" were created using the Unity "ScrollView" component. The "ScrollView" is a Unity built-in component that provides scrollable features and a grid layout pattern. It can be created in the Hierarchy window → UI → ScrollView (Figure 4.48).

Figure 4. 48 Create the Unity built-in UI component ScrollView

The "ScrollView" contains child components under its hierarchy (Figure 4.48). Among them, the GameObject "ParaContent" has an attached built-in script called <GridLayoutGroup>, which arranges all the child GameObjects under the "ParaContent" hierarchy into a layout pattern. For example, in the "InventoryMenu", the "ParaContent" has a <GridLayoutGroup> component with a defined cell size and spacing (Figure 4.49); UI elements placed as children of "ParaContent" are then laid out based on that cell size and spacing.

Figure 4. 49 <GridLayoutGroup> component in the "ParaContent" of the "InventoryMenu"

The "InventoryMenu" (Property Window), "CategoryWindow", and "ChangeParameterWindow" were also each attached with a custom script component, "InventoryControl.cs" (Figure 4.48). The main purpose of "InventoryControl.cs" is to store some "public GameObjects": "TextRow", "OptionRow", "CheckRow", etc. (Figure 4.50).
These "public GameObjects" reference UI prefabs that will be duplicated many times to show a GameObject's parameter information.

Figure 4. 50 InventoryControl.cs

The UI prefabs referenced by the <InventoryControl> scripts are rows, each combining several cells into one row. The cell components are the Unity built-in components <Text>, <InputField>, <Toggle>, and <Dropdown>. Detailed explanations of these row-like prefabs are given in Chapter 3.3.3.
• TextRow: only shows information, without any editable features (<Text>);
• InputRow: enables users to type text in the input field (<InputField>);
• CheckRow: enables users to change the true or false value of a parameter using a <Toggle>;
• OptionRow: enables users to choose between preset options using a <Dropdown>;

Figure 4. 51 TextRow prefab (Left) and CheckRow prefab (Right)

The "InventoryMenu", "CategoryWindow", and "ChangeParameterWindow" are all "ScrollView" windows with the same structure. However, the UI contents under their "ParaContent" hierarchies are not the same. The "InventoryMenu" only has a "text" GameObject; the "CategoryWindow" has no GameObject under "ParaContent", because its GameObjects are generated by script functions when TransBIM runs; and the "ChangeParameter window" was given nine UI elements (e.g. "X offset", "Y offset", "Height", etc.), each representing a type of parameter that can be edited with the application (Figure 4.52).

Figure 4. 52 The UI contents in the "ParaContent" of each "ScrollView" window

In fact, the UI elements in the "InventoryMenu" and "CategoryWindow" were created by scripts. The first reason is that both windows need many UI elements ("rows") to show their information, and manually adding the UI components takes time. Another reason is the uncertainty of how many parameters a GameObject has, and therefore how many rows are needed to show the object's parameter information. For the "InventoryMenu", the UI elements are therefore added at run time: when users click a GameObject, the application duplicates a corresponding number of UI elements based on the GameObject's asset file (ScriptableObject) and puts the duplicated UI elements under the "ParaContent" to show all available parameter information of the currently selected GameObject. For example, when users click a conduit, the "PropertyWindow" shows many rows with the conduit's parameter information (Figure 4.53). Each row is a clone of a UI element (e.g. TextRow) placed under the "ParaContent" hierarchy. The duplication method is explained with "ClickObject.cs" (Chapter 4.2.5).

Figure 4. 53 Automatically generating a corresponding number of UI elements in the "Property window"

For the "category window" (filter window), the UI elements that show the project categories were generated by the FilterBox() function in "InventoryControl.cs" (Figure 4.54). The FilterBox() function has three parts:
1. Create an array containing all available categories in the project;
2. Based on the array length, duplicate the same number of "CheckRow" UI elements and assign each "CheckRow" a category name (Figure 4.55);
3. Get the Toggle component of each duplicated "CheckRow" and define an event callback for when the toggle component changes its value (m_Toggle.onValueChanged.AddListener(delegate { ToggleValueChanged(m_Toggle); });). This means that whenever the value (true or false) of the Toggle changes, the scripts call the ToggleValueChanged() function.

The ToggleValueChanged() function resets the visibility of all GameObjects of the same category. For example, when users click off the checkbox of the "Wall" category in the "Category Window", the value of that UI element's toggle changes to "false"; as a result, the ToggleValueChanged() function is triggered to reset the visibility of the GameObjects of that category (Figure 4.56).

Figure 4. 54 FilterBox() function in the InventoryControl.cs

Figure 4. 55 Automatically generate UI elements based on categories

Figure 4. 56 Turn off the visibility of "Floor" GameObjects
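A sketch of this clone-and-listen pattern is shown below (simplified; "CheckRow" and "ParaContent" follow the thesis's names, and the renderer-based show/hide is one reasonable way to implement the visibility reset; how TransBIM actually stores and restores visibility may differ):

using UnityEngine;
using UnityEngine.UI;

public class FilterBoxSketch : MonoBehaviour
{
    public GameObject CheckRow;     // row prefab containing a Toggle and a label
    public Transform ParaContent;   // grid-layout parent inside the ScrollView

    void BuildRows(string[] categories)
    {
        foreach (string category in categories)
        {
            GameObject row = Instantiate(CheckRow, ParaContent);
            row.GetComponentInChildren<Text>().text = category;

            Toggle m_Toggle = row.GetComponentInChildren<Toggle>();
            string cat = category;   // captured by the listener below
            m_Toggle.onValueChanged.AddListener(delegate { ToggleValueChanged(m_Toggle, cat); });
        }
    }

    void ToggleValueChanged(Toggle m_Toggle, string category)
    {
        // Show or hide every GameObject whose tag matches the row's category.
        foreach (GameObject go in GameObject.FindGameObjectsWithTag(category))
            foreach (Renderer r in go.GetComponentsInChildren<Renderer>())
                r.enabled = m_Toggle.isOn;
    }
}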
4.2.4.2 Buttons and "ButtonFunctions"
The buttons use the Unity built-in "Button" component, which has a button script on it. The button script gives a "Button" component clickable features and an OnClick() function (Figure 4.57 Left). The OnClick() function is a UnityEvent function that is triggered when the button is pressed. The developer can define which functions from another script are called in the OnClick() function; when users click the button, the button's OnClick() function calls the defined function. Each button component was given a function in its individual OnClick() event function. The example shows the switchProperty() function in SwitchOver.cs assigned to the OnClick() event of the "Button Property" button (Figure 4.57 Right, Figure 4.58). Clicking the "Property Button" triggers the switchProperty() function to open or close the "Property window" (Figure 4.59).

Figure 4. 57 Button component and OnClick() function for "Button Property" (SwitchOver.switchProperty)

Figure 4. 58 Button component and OnClick() function for "Button Category" (SwitchOver.switchCategory)

Figure 4. 59 Click the "Property Button" to control the visibility of the "Property window"

The functions assigned to the button components are listed in Table 4.3; for the detailed scripts see Appendix C.1: SwitchOver.cs.

Table 4. 3 All button functions (button name: called function, where the function is defined)
Button Property: switchProperty() (SwitchOver.cs)
Button Category: switchCategory() (SwitchOver.cs)
Button Color: switchColor() (SwitchOver.cs)
Button Control: swichChangeParameter() (SwitchOver.cs)
Button Upload: CreateText() (SwitchOver.cs)
Load Model: LoadModle() (TrackedImageInfoMultipleManager.cs)
Invisible: SetInvisible() (SwitchOver.cs)
Visible: SetVisible() (SwitchOver.cs)
AllVisible: SetAllVisible() (SwitchOver.cs)

In summary, the UI components (scrollable windows and clickable buttons) give users interfaces to view and input information. Because these UI components were put in a Canvas saved as a prefab, users can directly use the prefab to quickly build the "TransBIM" UI for their own BIM project.

Figure 4. 60 Use the Canvas prefab to build the TransBIM UI

4.2.5 ClickObject.cs
Chapters 4.2.3 and 4.2.4 discussed "parsing Revit metadata" and the "Unity UI system". This section introduces one of the interactive features of TransBIM: "select GameObject" (Figure 4.61). The "select GameObject" feature is essential because it enables a user to select a GameObject in the AR environment. It was realized by the "ClickObject.cs" script.

Figure 4. 61 One of the interactions: "select GameObject"

Overall, there are five features enabled by "ClickObject.cs" (Table 4.4).
Table 4. 4 Features of "ClickObject.cs" (feature: chapter)
1. The user can select a GameObject (4.2.5.3).
After clicking a GameObject:
2. Change the selected object's material color to dark blue: when users click a new object, the previously selected object's material color changes back to the original color, and the newly selected object changes to blue (4.2.5.4).
3. Generate the property information (the object's parameter information) shown in the "Property Window", where users can view all of the object's parameter information (4.2.5.5).
4. Refresh the "Change Parameter Window", which shows the editable information of the selected object. For example, if users select a duct, the "Change Parameter Window" shows the duct size, duct location, and duct offset that users can edit (4.2.5.6).
5. Store the object's original transform information and the changed information in two dictionaries in this class instance. The changed information is later saved in a text file that is uploaded to Dropbox (4.2.5.7).

4.2.5.1 Set Variables
Like ReadCSV.cs, the script ClickObject.cs is attached to all the GameObjects under the FBX hierarchy (Revit-geometry objects). The script also starts by setting variable fields (Figure 4.62) (Table 4.5).

Figure 4. 62 Set class variables for "ClickObject.cs"

Table 4. 5 Variables (parameters) in "ClickObject.cs"
Public Color "ChangeColor": A Unity Color that defines the color used for the selected object. In this case it is (0.1F, 0.3F, 0.5F, 1.0F), where the four values represent the color's RGB components and its transparency (alpha).
Public Hashtable "ht_color": A hash table that stores all the original material colors of a GameObject. Explanation: when a user clicks an object, the object's material colors are changed to a default color; when the user clicks another object, the last selected object's material colors need to be changed back. Therefore, the script uses ht_color to store each GameObject's original material color information.
Public string "thisObjectName": Stores the name of the GameObject (e.g. "Basic Wall WAT NE-2 [385971]").
Private GameObject "Parent": Stores the reference to the parent of each object. Because every object with "ClickObject.cs" is in the same hierarchy, they all have the same parent object.
Public GameObject "GizmosOb" and "GizmosInstance": "GizmosOb" is a GameObject variable referencing the "Gizmos" prefab. When an object is clicked, the scripts use "GizmosOb" to duplicate a new instance of the prefab in the scene; this instance is stored in "GizmosInstance".
Static InventoryControl "propertyUI": References an "InventoryControl.cs" instance attached to the UI. Through that instance, UI components such as "textRow", "optionRow", "checkRow", and "inputRow" attached to it can be accessed.
Static GameObject "ControlUI", "ControlUIcontent", "InvisibleButton", "VisibleButton": Variables referencing some UI components. By accessing these UI elements, the scripts can call functions from them.
Private GameObject "textRow", "optionRow", "checkRow", "inputRow": Variables referencing the UI component prefabs, which are also called "textRow", "optionRow", "checkRow", and "inputRow". These UI components are the major elements of the property window.
When users click an object, the scripts can duplicate many "textRow", "optionRow", "checkRow", and "inputRow" elements to show the object's parameter information. These UI components are stored in the "UIlist".
Public GameObject[] "UIlist": An array of GameObjects that stores the UI components (textRow, optionRow, checkRow, inputRow) in the property window. When users click a new object, the "UIlist" of the previously selected object is deleted.
public Dictionary<string, string> "changeParaDict": A dictionary that stores the changed BIM information, for example "Offset", "Diameter_Trade_Size", "Longitudinal Move", "Cross Sectional Move", and "Vertical Move".
public Dictionary<string, float> originTsDict: A dictionary that stores the object's transform location (x, y, z) in Unity and the original Revit startpoint and endpoint (x, y, z) information (9 pairs).

4.2.5.2 Assign references (values) to the predefined variables
After setting those variables, the script assigns values to them in the Awake() function (Figure 4.63). The Awake() function is called when the script instance is being loaded, so it is used here to first assign values to the predefined variables. There are four parts in the Awake() function:
1. Find the UI components "textRow", "optionRow", etc., so the scripts can later duplicate many UI components in the "Property Window" to show the object's information.
2. Similar to part 1, find the UI components of the "ChangeParameterWindow". After finding them, the scripts can later update these UI components when users click a new GameObject.
3. Find the parent of each GameObject and attach a script called "ReferenceObject.cs" to it. While adding "ReferenceObject.cs" to the parent object, the script also passes some values to the public variables defined in ReferenceObject.cs.
4. Get all the material color information and store it in the Hashtable variable "ht_color".

Figure 4. 63 Awake() function to assign values to the predefined variables

In part 1, the script uses "FindObjectsOfType<InventoryControl>();" to find the instances of the "InventoryControl" class in the scene (Figure 4.64). Two GameObjects have "InventoryControl.cs" components: the "InventoryMenu" and the "FilterWindow" (Figure 4.65). The "InventoryMenu" is the scroll window that shows an object's parameter information, and the "FilterWindow" is a similar window that shows and filters objects' categories. The purpose of finding the "InventoryControl" instance is to pass the public variables (row prefabs) from the found "InventoryControl" instance to the variables of each "ClickObject" class instance. As a result, the four GameObjects (TextRow, OptionRow, CheckRow, InputRow) from the <InventoryControl> were assigned to the "textRow", "optionRow", "checkRow", and "inputRow" variables in each "ClickObject" class instance (Figure 4.66). The scripts can later duplicate the found UI components (e.g. "textRow") to show an object's parameter information (Figure 4.67).

Figure 4. 64 Get the "InventoryControl" component and the UI components (e.g. "textRow")

Figure 4. 65 "InventoryControl" components in the InventoryMenu (Property Window) and FilterWindow (Category Window)

Figure 4. 66 Variables in <ClickObject> with no values (Left); the result of passing the variable values from the InventoryControl to <ClickObject> (Right)

Figure 4. 67 The results when duplicating (cloning) the UI component prefabs
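The duplication step itself reduces to Instantiate() plus filling in the row's cells. A sketch of a helper in that spirit (the helper name "NewTextRow" is hypothetical; the two-cell order, "paratype" then "paraValue", follows the thesis's row prefabs):

using UnityEngine;
using UnityEngine.UI;

static class RowFactory
{
    // Clone a "textRow" prefab under "ParaContent" and fill its two Text cells.
    public static GameObject NewTextRow(GameObject textRowPrefab, Transform paraContent,
                                        string paraType, string paraValue)
    {
        GameObject row = Object.Instantiate(textRowPrefab, paraContent);
        Text[] cells = row.GetComponentsInChildren<Text>(true);
        cells[0].text = paraType;    // e.g. "Length"
        cells[1].text = paraValue;   // e.g. "24' 6\""
        row.SetActive(true);
        return row;
    }
}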
Similar to part 1, part 2 finds the GameObject called "ParaContent" in the "ChangeParameterWindow" (Figure 4.68, Figure 4.69). The "ChangeParameterWindow" is the interface where users can see the editable parameters of the selected object. It has nine UI elements, for example "CrossSectional Movement", "Longitudinal Movement", "Offset", etc. Once the "ParaContent" GameObject is found and stored in the variable "ControlUIcontent", the scripts can later use "ControlUIcontent" to refer to the "ParaContent" GameObject and find all its contents (UI components with editable parameter information). With the found UI elements ("rows") in the "ChangeParameterWindow", a function called "UpdateChangeParaUI" can update the UI elements (rows) to show a selected GameObject's editable parameter information.

Figure 4. 68 Find and assign the GameObjects (the "ChangeParameter" window and "ParaContent") to variables

Figure 4. 69 The GameObjects (contents) under the "ParaContent" of the "ChangeParameter" window

Part 3 finds the parent of each GameObject and attaches a script called "ReferenceObject.cs" to it (Figure 4.70). A parent is the upper-hierarchy GameObject of a GameObject. For example, all the Revit objects are under the FBX hierarchy, in this case "Watt_Hall_Third_Floor_Half_3dsMax_FBX_1201_CenterPivot" (Figure 4.71). Therefore, all Revit objects have the same parent and are children of the "Watt_Hall_Third_Floor_Half_3dsMax_FBX_1201_CenterPivot" GameObject. After finding the parent, the script checks whether the parent GameObject already has a "ReferenceObject.cs" attached (Figure 4.70). If the parent does not have that component, it is added; if the parent already has the <ReferenceObject> component, nothing is done. In the end, only one "ReferenceObject" class instance is added to the parent GameObject (Figure 4.72).

Figure 4. 70 Find the parent of each GameObject and attach a script ("ReferenceObject.cs") to it

Figure 4. 71 Revit objects have the same parent GameObject

Figure 4. 72 The parent GameObject was given only one "ReferenceObject.cs" component

The functionality of "ReferenceObject.cs" is to record which object is currently selected and which object was last selected. Because each Revit GameObject has the same parent GameObject, attaching "ReferenceObject.cs" to the parent makes it easy to record which child GameObject is or was selected. Moreover, "ReferenceObject.cs" defines functions that enable users to move the selected object, calculate the new coordinates of the moved GameObject in the Revit coordinate system, and record the changes that will be sent back to Revit to update the Revit model. For detailed explanations of "ReferenceObject.cs", refer to 4.2.5.

4.2.5.3 Select (Touch) a GameObject
The main functions and features that let users click a GameObject and see the object's parameter information are achieved in the OnMouseDrag() function (Figure 4.73). The OnMouseDrag() function is called when the user has clicked on a "Collider" component and is holding down the mouse or touching the screen. Unity does not detect mouse events unless an object has a collider; this is why, when importing the FBX file into Unity, each entity under the FBX file was given a "Mesh Collider" (Figure 4.73).

Figure 4. 73 Generate Colliders when importing the FBX file in Unity

When the screen is touched, the script first checks whether the touch point is on a UI component by using "if (!IsPointerOverUIObject())" (Figure 4.74). "IsPointerOverUIObject()" is a Boolean function that returns true or false. Without this function, the tool could not distinguish whether or not a user is operating the UI: for example, when a user scrolls the UI, objects behind the UI could be selected, and TransBIM would immediately update the UI with the information of the newly selected object, which is not what the user wants. With the "IsPointerOverUIObject()" function, by contrast, users touching the UI panel can never accidentally select virtual objects blocked by it (Figure 4.75).

Figure 4. 74 IsPointerOverUIObject() checks whether users click the UI components

Figure 4. 75 Touching the UI won't select the virtual object blocked by the UI
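The thesis does not reproduce IsPointerOverUIObject() in this section, but the check follows a standard Unity idiom: cast a ray from the touch position through the EventSystem and see whether it hits any UI element. A sketch of that idiom (assumed to approximate the thesis's implementation):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public static class UIHitTest
{
    // Returns true when the current pointer/touch position is over a UI element.
    public static bool IsPointerOverUIObject()
    {
        var eventData = new PointerEventData(EventSystem.current)
        {
            position = new Vector2(Input.mousePosition.x, Input.mousePosition.y)
        };
        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventData, results);
        return results.Count > 0;   // any hit means the touch landed on the UI
    }
}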
If the user is not clicking the UI, the script records the selected GameObject as the current object, stored in the "currentObject" variable of the "ReferenceObject" instance: "parent.GetComponent<ReferenceObject>().currentObject = gameObject;". Then the script uses an "if" statement (line 140) to check whether the two variables "PreviousObject" and "CurrentObject" recorded in the <ReferenceObject> instance refer to the same GameObject (Figure 4.76). If the two variables have different values, the user has just clicked a new object, and the scripts inside the "if" condition are executed.

Figure 4. 76 OnMouseDrag() function

Generally, there are three parts under the "if" condition (Figure 4.76). Part 1 sets all the UI components under the "ParaContent" of the InventoryMenu (Property Window) to inactive, which makes them invisible. It is necessary to deactivate these UI components because the property window contains a list of UI components showing the parameter information of the currently selected GameObject (Figure 4.77): when users click a new GameObject, and before new UI components have been duplicated to show the newly selected object's parameter information, the UI components showing the previously selected GameObject's parameters are still under the "ParaContent" of the Property Window. Therefore, the scripts first turn off the visibility of the previously generated UI components before generating the new ones.

Figure 4. 77 The components under the "ParaContent" hierarchy of the InventoryMenu show the object's parameter information

Part 2 gets the GameObject's material components and changes all the materials' colors to a default color using the Color variable "ChangeColor" (0.1F, 0.3F, 0.5F, 1.0F). By doing this, the touched GameObject is immediately changed to yellow to show the user which GameObject has been touched (Figure 4.78).

Figure 4. 78 When clicking a GameObject, the object's color changes to yellow

Part 3's scripts are under an "if" statement that checks whether the currently selected GameObject has a tag (category) belonging to "Walls", "Ducts", "Pipes", or "Conduits". If the condition is met, the Part 3 scripts are executed, which include five actions (Figure 4.79, Table 4.6). The action numbers are noted and explained in the following sections (Figure 4.79, Table 4.6).
Table 4. 6 Actions when selecting an object (action; used function; section)
1. Add the GameObject's parameter information to two dictionaries (originTsDict and ChangeParaDict). The originTsDict stores the Revit and Unity coordinate information of the GameObject, and the ChangeParaDict stores the editable information of the GameObject. Function: AddDataToParaDict() (4.2.5.4)
2. Show the editable parameter information in the "Change parameter window". The editable parameter types are the GameObject's cross-sectional, longitudinal, and vertical movements (Figure 4.79). Function: UpdateChangeParaUI() (4.2.5.5)
3. Generate a new "Gizmos" at the clicked object's center. The Gizmos is a GameObject with arrows pointing out the cross-sectional, longitudinal, and vertical directions of the selected object (Figure 4.79). Function: CreateGizmos() (4.2.5.7)
4. Get the GameObject's asset file (ScriptableObject), which stores the object's Revit information. According to 4.2.2, each GameObject loads the asset file and stores it in the "object_SO" variable of the "ReadCSV" component, so this script can get the asset file from the ReadCSV component. Code: line 154 (ScriptableObject tempSO = GetComponent<ReadCSV>().object_SO;)
5. Use the asset file to show the object's parameter information in the "PropertyWindow" (Figure 4.79). Function: UpdatePropertyWindow() (4.2.5.6)

Figure 4. 79 OnMouseDrag() function Part 3 enables five actions

Figure 4. 80 Results of some of the actions enabled by the Part 3 scripts in the OnMouseDrag() function

All three parts of the scripts above are under the "if" condition (line 140). However, if the GameObject does not belong to any of the "Walls", "Ducts", "Pipes", and "Conduits" categories, the condition is not met. Then the Part 4 scripts are executed, which activate a GameObject with a text ("No information for this category") in the property window instead of creating a set of "rows". The example shows a roof selected, with the "Property window" showing the text instead of the object's parameter information (Figure 4.81).

Figure 4. 81 Result of selecting a GameObject not belonging to the "Walls", "Ducts", "Pipes", and "Conduits" categories

All the above scripts (Part 1 to Part 4) in the OnMouseDrag() function address how a newly selected GameObject is manipulated. The actions include
• changing the color of the newly selected object to yellow;
• generating UI components in the Property window to show the object's information;
• and creating a "Gizmos" GameObject at the object's center.

At the end of the OnMouseDrag() function, there are also two actions that manipulate the previously selected (last selected) GameObject (Table 4.7, Figure 4.82). Examples show that after clicking a conduit component, the color of the previously selected duct component was changed back to the original color, and there was no longer a "Gizmos" GameObject at the center of the previously selected duct (Figure 4.83).

Table 4. 7 Two actions to manipulate the previously selected object
1. Destroy the "Gizmos" GameObject at the last selected GameObject's center. Because the user has clicked a new GameObject, the "Gizmos" on the previously selected GameObject is no longer useful.
2. Change the last selected object's material colors back to the original colors. This is where the hash table "ht_color" is useful: before any GameObject is selected and its materials change color, the Awake() function stores all of a GameObject's original material color information in the "ht_color" variable of its "ClickObject.cs".
Action 2 accesses the "ht_color" and materials of the previous GameObject and changes the materials' colors back to the original.

Figure 4. 82 Scripts to manipulate the previously selected object

Figure 4. 83 A duct selected with the highlight color (Left); the duct, as a previously selected object, changed back to its original color (Right)

Finally, after executing the actions for the currently selected GameObject and the previously selected GameObject, line 193 passes the currently selected GameObject to the "PreviousObject" variable in <ReferenceObject.cs> (Figure 4.82). This may seem confusing, because when no object is being clicked, the two variables "PreviousObject" and "CurrentObject" in the <ReferenceObject.cs> instance refer to the same GameObject (Figure 4.84). However, it is logically correct: when the user clicks a new object (triggering OnMouseDrag()), the "currentObject" variable in <ReferenceObject.cs> is immediately overwritten with the new GameObject, and at that moment, if "PreviousObject" and "CurrentObject" refer to different GameObjects, all the above actions repeat.

Figure 4. 84 "PreviousObject" and "CurrentObject" in the ReferenceObject.cs

4.2.5.4 AddDatatoParaDict() function
As described in 4.2.5.3 (Figure 4.76), AddDatatoParaDict() is called when the user clicks a new GameObject and the GameObject belongs to "Ducts", "Conduits", "Pipes", or "Walls". The AddDatatoParaDict() function stores some of the selected GameObject's parameter information in the two dictionaries (originTsDict and ChangeParaDict) (Figure 4.85). The originTsDict stores the Revit and Unity coordinate information of the GameObject, while the ChangeParaDict stores some editable parameter information of the GameObject.

Figure 4. 85 AddDatatoParaDict() function

Before passing the relevant information to the two dictionaries, the system first needs to get the object's parameter information. As realized by <ReadCSV.cs>, the parameter information of each GameObject, including the Revit coordinate information, was stored in a single asset file (ScriptableObject). The script uses "GetComponent<ReadCSV>().object_SO" to find the asset file (Figure 4.81 (line 285)). With access to the asset file of the currently selected GameObject, the scripts can then retrieve the needed information from it and store that information in the two dictionaries. An example shows the needed information in a duct asset (Figure 4.86).

Figure 4. 86 The needed information in the asset file

The next step is to define what parameter information is added to the two dictionaries. For the "originTsDict": because AddDatatoParaDict() only runs when the GameObject belongs to "Ducts", "Conduits", "Pipes", or "Walls", and all of these categories of objects have a startpoint and endpoint in Revit, the originTsDict consistently stores nine pairs of coordinate information (Table 4.8).

Table 4. 8 Relevant information added to originTsDict
Revit coordinate information (the Revit element's startpoint and endpoint): startPoint_x; startPoint_y; startPoint_z; endPoint_x; endPoint_y; endPoint_z
Unity coordinate information (the object's pivot location in Unity world space, at the geometry center): x_local; y_local; z_local

For the "ChangeParaDict", objects of different categories have different Revit parameter types. Even for objects of the same category, different family types can affect the object's Revit parameter types (Table 4.9). For example, duct objects have either the "Rectangular Duct" family type or the "Round Duct" family type. If the GameObject is a "Rectangular Duct", the "ChangeParaDict" has "Width" and "Height" parameters for the duct size; a "Round Duct", on the other hand, has a "Diameter" parameter type. Therefore, depending on the category of the selected GameObject, a different group of parameter information is added to the "ChangeParaDict" variable in the <ClickObject.cs> component of the selected GameObject. The "ChangeParaDict" information is also shown in the "ChangePara Window" (Figure 4.87): examples show that a round duct has "Diameter" information, while a wall only has location information (Figure 4.87).

Figure 4. 87 "ChangeParaDict" information shown in the "Change Parameter window"

Table 4. 9 Relevant information added to ChangeParaDict based on the object's category and family
All four categories (Ducts, Conduits, Pipes, Walls): startPoint_x; startPoint_y; startPoint_z; endPoint_x; endPoint_y; endPoint_z; Longitudinal Move; Cross Sectional Move; Vertical Move
Ducts, Conduits, Pipes (not Walls): Offset
Ducts with the "Rectangular Duct" family: Width; Height
Ducts with the "Round Duct" family: Diameter_Trade_Size
Conduits and Pipes: Diameter_Trade_Size

The originTsDict and the ChangeParaDict are called later in other functions. For example, the originTsDict is called in the CreateGizmos() function, because it contains the Revit coordinate information used to calculate the created Gizmos' rotation angle. Likewise, the ChangeParaDict is called in UpdateChangeParaUI() to update the editable parameter information in the "Change Parameter window".
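A sketch of the family-dependent branch described above (illustrative only; the key names follow Table 4.9, but how AddDatatoParaDict() actually reads the source values from the asset file may differ):

using System.Collections.Generic;

static class ChangeParaSketch
{
    // Populate a duct's changeParaDict: every duct gets the movement keys and
    // Offset; the size keys depend on the family ("Rectangular Duct" vs "Round Duct").
    public static Dictionary<string, string> ForDuct(Dictionary<string, string> paraDict)
    {
        var changeParaDict = new Dictionary<string, string>
        {
            ["Longitudinal Move"]    = "0",
            ["Cross Sectional Move"] = "0",
            ["Vertical Move"]        = "0",
            ["Offset"]               = paraDict["Offset"]
        };
        if (paraDict["Family"] == "Rectangular Duct")
        {
            changeParaDict["Width"]  = paraDict["Width"];
            changeParaDict["Height"] = paraDict["Height"];
        }
        else   // "Round Duct"
        {
            changeParaDict["Diameter_Trade_Size"] = paraDict["Diameter"];
        }
        return changeParaDict;
    }
}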
If the GameObject is a "Rectangular Duct", the ChangeParaDict holds "Width" and "Height" parameters for the duct size; a "Round Duct", on the other hand, has a "Diameter" parameter type. Therefore, depending on the category and family of the selected GameObject, a different group of parameters is added to the "ChangeParaDict" variable in the <ClickObject.cs> component of the selected GameObject. The ChangeParaDict information is also shown in the "ChangePara Window" (Figure 4.87). For example, a Round Duct shows "Diameter" information, while a wall shows only location information (Figure 4.87).

Figure 4.87 "ChangeParaDict" information shown in the "Change Parameter window"

Table 4.9 Information added to ChangeParaDict based on the object's category and family

| Parameters | Ducts | Conduits | Pipes | Walls |
|---|---|---|---|---|
| startPoint_x; startPoint_y; startPoint_z; endPoint_x; endPoint_y; endPoint_z | yes | yes | yes | yes |
| Longitudinal Move; Cross Sectional Move; Vertical Move | yes | yes | yes | yes |
| Offset | yes | yes | yes | - |
| Width; Height | yes (Rectangular Duct) | - | - | - |
| Diameter_Trade_Size | yes (Round Duct) | yes | yes | - |

The originTsDict and the ChangeParaDict are used later by other functions. For example, CreateGizmos() reads the Revit coordinate information in the originTsDict to calculate the created Gizmos' rotation angle, and UpdateChangeParaUI() reads the ChangeParaDict to update the editable parameter information in the "Change Parameter window".

4.2.5.5 UpdateChangeParaUI()

UpdateChangeParaUI() is called when a user clicks a new GameObject in the application. It makes the "Change parameter window" update the UI components that show the editable parameters of the newly selected GameObject (Figure 4.89) by passing the information stored in the ChangeParaDict to the relevant UI components. There are two steps (Figure 4.89):
1. Get each UI element from the Change Parameter Window, for example the "Cross Sectional Move" UI element.
2. Check whether the ChangeParaDict contains a key matching the UI element's name. If so, the script uses that key (the UI element's name) to pass the corresponding parameter value from the ChangeParaDict to the UI component.

As explained in 4.2.4, the "ChangeParameter window" is preset with nine UI elements (Figure 4.88, Table 4.10), and, depending on the GameObject's category and family, the keys in a ChangeParaDict are identical to the UI element names. For example, the ChangeParaDict of a "Rectangular Duct" GameObject has keys such as "Longitudinal Move", "Cross Sectional Move", "Width", and "Height", which are exactly the names of some of the UI components in the "Change Parameter window" (Figure 4.88, Table 4.10). When a key in the ChangeParaDict matches the name of a UI component in the "ChangeParameter window", the script passes that key's value from the dictionary to the corresponding UI element.
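A minimal sketch of this two-step matching is shown below, assuming ChangeParaDict is a string-to-string dictionary and that each preset UI row under "ParaContent" is named after its parameter type; the actual ClickObject.cs implementation may differ.

```csharp
// Sketch of the key-matching step in UpdateChangeParaUI().
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class ChangeParaUISketch : MonoBehaviour
{
    public Transform paraContent;                      // parent of the nine preset UI rows
    public Dictionary<string, string> changeParaDict;  // filled by AddDatatoParaDict()

    public void UpdateChangeParaUI()
    {
        foreach (Transform uiRow in paraContent)
        {
            // Step 1: each preset row is named after a parameter type,
            // e.g. "Cross Sectional Move" or "Width".
            string key = uiRow.name;

            // Step 2: show the row only if the selected object has that key,
            // and copy the stored value into the row's InputField.
            bool hasKey = changeParaDict.ContainsKey(key);
            uiRow.gameObject.SetActive(hasKey);
            if (hasKey)
            {
                InputField input = uiRow.GetComponentInChildren<InputField>();
                if (input != null) input.text = changeParaDict[key];
            }
        }
    }
}
```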
Table 4.10 The nine UI components in the "Change Parameter window": Longitudinal Move; Cross Sectional Move; Vertical Move; Text; Offset; Slope; Height; Width; Diameter_Trade_Size

Figure 4.88 The GameObjects (contents) under the "ParaContent" of the "ChangeParameter" window

Figure 4.89 UpdateChangeParaUI() function

Results show that clicking a duct object displays several parameters, while selecting a wall displays only three (Figure 4.90). This is because the duct's ChangeParaDict has more editable parameter types matching the UI elements of the "Change Parameter window", whereas a wall's ChangeParaDict has only three parameters (Cross Sectional Move, Longitudinal Move, Vertical Move).

Figure 4.90 Update the "ChangeParameter window"

4.2.5.6 UpdatePropertyWindow()

The UpdatePropertyWindow() function is similar to UpdateChangeParaUI() in that it gets the currently selected GameObject's parameter information to update a scroll window. However, instead of using the editable parameters in the ChangeParaDict, UpdatePropertyWindow() uses the GameObject's asset file (ScriptableObject) directly to show all the parameter information in the application's "Property Window" (Figure 4.91, Figure 4.93).

Figure 4.91 UpdatePropertyWindow()

There are four steps (Figure 4.91); a reflection-based sketch is given at the end of this subsection.
1. Get the number of fields from the asset file of the selected GameObject and create a UIlist with the same length. The UIlist stores each duplicated UI element.
2. For each field in the asset file, check the field type, which depends on how the field was defined in the ScriptableObject: string, int, float, or bool. For example, some of the fields defined in "ductParameter.cs" are strings while others are floats (Figure 4.92).

Figure 4.92 Variable field types defined in the ScriptableObject (ductParameter.cs)

3. Based on the field type (string, bool, etc.), the corresponding "RowPrefab" UI element is duplicated and assigned new information to show a pair of parameter type and parameter value (Table 4.11). For example, when a field is "Length", whose type defined in the ScriptableObject is a string, a "textRow" UI component is duplicated; the field name "Length" replaces the text in the first cell ("paratype"), while the field value stored in the GameObject's asset file replaces the text in the second cell ("paraValue"). If a field type is Boolean (e.g. the "size_lock" field defined in ductParameter.cs), a "checkRow" UI element is duplicated, and the Boolean value decides whether the checkbox "toggle" is on or off. Moreover, if a field type is string and the field name is "Marks" or "Comments", an "InputRow" is duplicated and assigned the information; using the InputRow UI element lets users type their own comments and marks.

Table 4.11 Duplicate UI elements and assign information based on the field types in an asset file

| Case | UI element | Cell 1 (parameter name) | Cell 2 (parameter value) |
|---|---|---|---|
| Field type is <int>, <string>, or <float> | TextRow prefab | Replace the text with the field name (e.g. "Length") | Replace the text of <Text> with the field value stored in the asset file |
| Field type is <string> and the field name is "Comments" or "Marks" | InputRow prefab | Replace the text with the field name (e.g. "Marks") | Replace the text of <InputField> with the field value stored in the asset file |
| Field type is <bool> | CheckRow prefab | Replace the text with the field name (e.g. "size_lock") | Set "isOn" to true or false based on the field value (bool) |

4. Finally, after a "RowPrefab" has been duplicated and assigned its information, each duplicated UI element is set as a child of the "ParaContent". This puts all the UI elements under the "ParaContent" hierarchy, so the window shows an orderly layout (Figure 4.93).

In summary, when a user clicks a new GameObject, a series of UI elements is generated in the "Property Window" with parameter information assigned from the GameObject's asset file (ScriptableObject) (Figure 4.93).

Figure 4.93 Generate UI elements with assigned parameter information based on the object's asset file
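The field-by-field duplication in steps 2-4 can be expressed with C# reflection. The sketch below is an outline under stated assumptions, not the thesis code: the prefab variables are illustrative, the "paraType"/"paraValue" child names follow the text, and the ScriptableObject's public fields are enumerated with GetFields().

```csharp
// Reflection-based sketch of UpdatePropertyWindow().
using System.Reflection;
using UnityEngine;
using UnityEngine.UI;

public class PropertyWindowSketch : MonoBehaviour
{
    public GameObject textRowPrefab, inputRowPrefab, checkRowPrefab;
    public Transform paraContent;   // scroll-view content of the Property window

    public void UpdatePropertyWindow(ScriptableObject assetFile)
    {
        // Step 2: iterate the public fields defined in the ScriptableObject
        foreach (FieldInfo field in assetFile.GetType().GetFields())
        {
            object value = field.GetValue(assetFile);
            GameObject row;

            if (field.FieldType == typeof(bool))
            {
                // Step 3: bool fields (e.g. size_lock) become a checkbox row
                row = Instantiate(checkRowPrefab, paraContent);
                row.GetComponentInChildren<Toggle>().isOn = (bool)value;
            }
            else if (field.Name == "Marks" || field.Name == "Comments")
            {
                // Editable string fields become an input row
                row = Instantiate(inputRowPrefab, paraContent);
                row.GetComponentInChildren<InputField>().text =
                    value != null ? value.ToString() : "";
            }
            else
            {
                // int/float/string fields become a plain text row
                row = Instantiate(textRowPrefab, paraContent);
                row.transform.Find("paraValue").GetComponent<Text>().text =
                    value != null ? value.ToString() : "";
            }

            // Step 4: the first cell always shows the field (parameter) name;
            // Instantiate(prefab, parent) already parents the row to ParaContent
            row.transform.Find("paraType").GetComponent<Text>().text = field.Name;
        }
    }
}
```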
4.2.5.7 CreateGizmos()

Ducts, conduits, pipes, and walls have a long direction (longitudinal direction) and a cross-sectional direction. This section explains how a Gizmos GameObject is created at the center of a selected GameObject to show the object's longitudinal direction. This is realized by CreateGizmos(), which generates a new "Gizmos" GameObject at the selected object's geometry center to show the cross-sectional, longitudinal, and vertical directions of the GameObject (Figure 4.94). Because the intent of the Gizmos is to show these directions for the selected GameObject, every time a Gizmos instance is created for a specific object, it is rotated to align its longitudinal arrow with the long direction of the GameObject (ducts, conduits, pipes, walls). Examples show the created Gizmos of a selected duct and a selected conduit (Figure 4.94).

Figure 4.94 Create a "Gizmos" at a selected object's geometry center to show the GameObject's vertical, longitudinal, and cross-sectional directions

First, the Gizmos GameObject was created and defined. The Gizmos is a GameObject with six arrows representing the "cross-sectional", "longitudinal", and "vertical" directions (Figure 4.95). By default, the "Vertical" arrows align with the Unity z-axis, the "Longitudinal" arrows align with the Unity y-axis, and the "Cross-sectional" arrows align with the Unity x-axis.

Figure 4.95 The default settings of the Gizmos GameObject

The reason for aligning the "Vertical" arrow with the Unity z-axis, rather than the Unity y-axis that points up, is the difference between the Revit and Unity coordinate systems: in Revit the z-axis is the up axis, while in Unity the y-axis is the up axis (Figure 4.96). Aligning the Gizmos' "Vertical" arrow with the Unity z-axis lets the Gizmos later show the vertical direction of the Revit model (FBX).
When the building model (FBX) is imported into Unity, Unity reads the FBX file by aligning the Revit coordinate system with the Unity coordinate system: the Revit x-axis aligns with the Unity x-axis, the Revit y-axis with the Unity y-axis, and the Revit z-axis with the Unity z-axis. However, the z-axis is not the up axis in Unity. As a result, with the Revit axes simply aligned to the Unity axes, the model in Unity did not lie in a horizontal plane but looked suspended vertically (Figure 4.96).

Figure 4.96 Unity y-axis points up

To position the model correctly, the entire FBX file was given a rotation to compensate for the difference between the Revit and Unity coordinate systems; in other words, the local z-axis of the model was made to point up, and the local y-axis of the model was brought into the horizontal plane (Figure 4.97). The whole model was therefore rotated -90 degrees (Figure 4.97). This was done when the FBX model was imported into Unity. Moreover, because each object (entity) is a child under the model (FBX) hierarchy, each object was rotated along with the model. As a result, each object under the model hierarchy also has its local z-axis pointing up and its local x-axis and y-axis in a horizontal plane (Figure 4.98).

Figure 4.97 The FBX model was rotated to make the Revit z-axis point up

Figure 4.98 Each object has its local z-axis pointing up, and its x-axis and y-axis in a horizontal plane

In summary, the model was rotated so that the model and the child objects under its hierarchy have their local z-axes pointing up. Then, when a Gizmos GameObject is put under an object's hierarchy, the default settings make the Gizmos' "Vertical" arrow align with the object's local z-axis, which means the "Vertical" arrow points up. The example shows that, with the Gizmos placed under a duct and a pipe component, the vertical arrows point up (Figure 4.99).

Figure 4.99 Putting the Gizmos under a duct's and a pipe's hierarchy, the Gizmos' vertical arrows point up

The CreateGizmos() function works in three steps (Figure 4.100); a combined code sketch is given at the end of this subsection.
1. Create the Gizmos at the GameObject's pivot point (geometry center) and set it as a child of this GameObject;
2. Calculate the relative angle between the object's long direction and the object's local x-axis;
3. Rotate the Gizmos by the calculated angle to show the object's directions.

Figure 4.100 The CreateGizmos() function

The first part creates a new Gizmos instance and makes it a child of a GameObject. Since the model and the child objects under the model hierarchy have their local z-axes pointing up (Figure 4.101), the Gizmos' "Vertical" axis, along the z-axis, points up. The example shows two types of Gizmos in the project scene (Figure 4.101). The "A Gizmos" has no parent GameObject and has the default rotation (0,0,0); its "Vertical" arrow points along the z-axis of the Unity world system. In contrast, the "B Gizmos" are children of a duct and a pipe object (Figure 4.102); even though their local rotation is also (0,0,0), their "Vertical" arrows point up (Figure 4.101). However, the "Longitudinal" arrows of the duct's and the pipe's Gizmos point in the same direction (Figure 4.101). This is because the "Longitudinal" arrow always aligns with the object's local y-axis, and all GameObjects under the model hierarchy share the same local y-axis.
To address this issue, the script calculates the relative angle between the object's long direction and the object's local x-axis.

Figure 4.101 Creating a new Gizmos instance and making it a child of a GameObject (1)

Figure 4.102 Creating a new Gizmos instance and making it a child of a GameObject (2)

The second part calculates the angle between the object's long direction and its local x-axis. The object's local x-axis is the Revit x-axis, and its local y-axis is the Revit y-axis, so the angle can be calculated in the Revit coordinate system (Figure 4.103, Figure 4.104):
• First, get the object's Revit startPoint and endPoint coordinates from the originTsDict dictionary, where they were stored by the AddDatatoParaDict() function.
• Then calculate the differences of the two points' X and Y coordinates (X_Difference and Y_Difference).
• The angle follows from the formula Tan(angle) = X_Difference / Y_Difference, so angle = arctan(X_Difference / Y_Difference).
• Finally, convert the radians to degrees: degrees = radians * (180 / Math.PI).

Figure 4.103 The angle between the object's long direction and the Revit x-axis (the object's local x-axis in Unity)

Figure 4.104 Calculate the angle using trigonometric operations

The third part rotates the Gizmos by the calculated angle to show the object's directions (Figure 4.105). Results show the Gizmos rotated horizontally to align the "Longitudinal" arrow with the object's long direction (Figure 4.106).

Figure 4.105 Reset the Gizmos' location and rotation

Figure 4.106 Final results of the Gizmos showing the object's longitudinal, cross-sectional, and vertical directions
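A condensed sketch of the three steps follows, assuming the start and end points are read from originTsDict as described above. Math.Atan2 is used here to evaluate arctan(X_Difference / Y_Difference) without dividing by zero; the rotation axis is an assumption, and the real CreateGizmos() may differ in such details.

```csharp
// Hedged sketch of CreateGizmos().
using System;
using System.Collections.Generic;
using UnityEngine;

public class GizmosSketch : MonoBehaviour
{
    public GameObject gizmosPrefab;
    public Dictionary<string, float> originTsDict;   // filled by AddDatatoParaDict()

    public GameObject CreateGizmos()
    {
        // Part 1: create the gizmo at the pivot (geometry center), as a child
        GameObject gizmos = Instantiate(gizmosPrefab, transform.position,
                                        Quaternion.identity, transform);

        // Part 2: angle between the long direction and the local x-axis,
        // following the text's formula arctan(X_Difference / Y_Difference)
        float dX = originTsDict["endPoint_x"] - originTsDict["startPoint_x"];
        float dY = originTsDict["endPoint_y"] - originTsDict["startPoint_y"];
        float angleDeg = (float)(Math.Atan2(dX, dY) * 180.0 / Math.PI);

        // Part 3: rotate the gizmo about the object's local z-axis (the Revit
        // up axis after the -90 degree FBX import rotation); axis assumed
        gizmos.transform.localRotation = Quaternion.Euler(0f, 0f, angleDeg);
        return gizmos;
    }
}
```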
4.2.6 ReferenceObject.cs (Change object's parameters and record the changes)

This section introduces how the parameter information (such as movement) of a selected GameObject is changed and saved through the application's "Change Parameter window" interface. The major script is ReferenceObject.cs. Only the scripts related to the movement-changing function are explained here; the details of changing an object's sizes, for example a duct diameter, are not covered. See Appendix B.3 for the sizeChange() function in ReferenceObject.cs.

ReferenceObject.cs is attached to the FBX GameObject, the parent of each building component (Figure 4.107). Attaching "ReferenceObject.cs" to the parent makes it easy to record which child GameObject is currently selected, which was previously selected, and which is being edited (Figure 4.107). The "ReferenceObject.cs" script also gives the parent a "Project" tag (Figure 4.107).

Figure 4.107 Attach the ReferenceObject.cs to the FBX model parent

Besides being a place to record and reference GameObjects, the most important functions in ReferenceObject.cs manipulate the movement of the currently selected GameObject and store the changes in a dictionary. The dictionary stores the changed object's Revit object ID as well as all the changed parameter information. First, the script defines several variables (Figure 4.107, Table 4.12).

Table 4.12 ReferenceObject.cs variables
- previousObject, currentObject, recordingObject: When a user clicks a GameObject, it is assigned to "currentObject" and "recordingObject"; when the user clicks a new GameObject, "previousObject" is assigned the last selected object, and "currentObject" and "recordingObject" are updated.
- uploadDict: A dictionary storing all changed objects' changed parameter information. The keys are the changed objects' Revit ElementIDs, and the value of each key is an array of strings with the changed parameter information.
- AllObDict: A dictionary storing all changed objects' original transform information, for example, an object's original Unity transform and Revit coordinates before moving. With this dictionary, users can reset the movement changes of the changed objects.
- ControlUI, ControlUIcontent, Cross_Sectional_Move, Longitudinal_Move, Vertical_Move, etc.: Public GameObject variables used to reference the UI components of the "Change Parameter Window" (Figure 4.107).
- CrInput, LgInput, VtInput, HeightInput: Public <InputField> variables that are assigned their input components in the Awake() function.

In the Awake() function, the UI components are found and stored in the variables above. The script also gets the input component from each UI component and defines the events that fire when the user edits an input field; functions such as MovementChangeEdited() and HeightChangeEdit() respond to these input events when the user types a value (Figure 4.108, Figure 4.109). A minimal sketch of this wiring follows the figures.

Figure 4.108 Variables and Awake() in ReferenceObject.cs

Figure 4.109 The UI components in the "Change Parameter Window"
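The sketch below shows how this Awake() wiring might look, assuming the movement inputs are InputField components found under the assigned UI GameObjects; the callback signature is simplified and not the thesis code.

```csharp
// Minimal sketch of wiring input events to MovementChangeEdited().
using UnityEngine;
using UnityEngine.UI;

public class ReferenceObjectSketch : MonoBehaviour
{
    public GameObject Cross_Sectional_Move;   // UI rows assigned in the Inspector
    public GameObject Longitudinal_Move;
    public GameObject Vertical_Move;

    private InputField CrInput, LgInput, VtInput;

    void Awake()
    {
        CrInput = Cross_Sectional_Move.GetComponentInChildren<InputField>();
        LgInput = Longitudinal_Move.GetComponentInChildren<InputField>();
        VtInput = Vertical_Move.GetComponentInChildren<InputField>();

        // onEndEdit fires when the user presses return or leaves the field
        CrInput.onEndEdit.AddListener(_ => MovementChangeEdited());
        LgInput.onEndEdit.AddListener(_ => MovementChangeEdited());
        VtInput.onEndEdit.AddListener(_ => MovementChangeEdited());
    }

    void MovementChangeEdited() { /* described below */ }
}
```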
The HeightChangeEdit() function is similar to MovementChangeEdited(). It runs when the user changes the value in the "Offset" input. The "Offset" parameter is the distance of a GameObject from the building level: an offset of 0 means the object is connected with the floor, while an offset of 10 means it is 10 feet above the floor level. Users can directly change the object's offset, and the "Vertical movement" value changes accordingly. For details of HeightChangeEdit(), see Appendix B.3: referenceObject.cs.

The MovementChangeEdited() function runs when the user changes the object's movement value via the "Cross Sectional Move", "Longitudinal Move", or "Vertical Move" input (Figure 4.109). "Cross Sectional Move" means the relative movement in the object's cross-sectional direction from its original position; "Longitudinal Move" and "Vertical Move" work likewise. The input units are feet and decimal feet. There are five parts in MovementChangeEdited() (Figure 4.110); part 3 is illustrated with a sketch after this walkthrough.
1. Add the object's original transform information to the dictionary "AllObDict".
2. Convert the input text to a float number.
3. Calculate the object's movement along its local x-, y-, and z-axes (the Revit x-, y-, and z-axes).
4. Get the object's new local x, y, z position by adding the movements to the original transform location; at the same time, update the object's Revit coordinate information.
5. Associate the vertical movement change with the offset change. For example, if a duct's offset is 10 feet and the vertical movement is set to 1 foot, the duct moves up and the value in the offset input changes to 11 feet.

Figure 4.110 MovementChangeEdited() function

1. The first part checks whether the object's original local transform information has already been stored in the "AllObDict" dictionary. If it has not, the addOriginalDataToDict() function transfers the needed information from the object's originTsDict and changeParaDict to the AllObDict (Figure 4.111). As a result, the AllObDict contains many objects' original Unity transforms (x, y, z) and their Revit startPoint and endPoint coordinates. The original transform information is stored because of how movement works: moving an object is always relative to its original location, so every time a user types or changes a movement value in the input, the object's original transform in the AllObDict is used.

Figure 4.111 addOriginalDataToDict() function

2. The second part tries to convert the text from the user input into a float number. For example, to move an object 3 feet in the longitudinal direction, a user types "3" in the "Longitudinal Move" input; the system gets the input text "3" and converts it to the float 3. If the user types characters other than digits or a decimal point, the conversion fails and the user input is reset to none.

3. The third part is the core of the movement function. It calculates the object's movements along its local x-, y-, and z-axes from the movement values the user gives in the cross-sectional, longitudinal, and vertical directions. Every object has the same local x-, y-, and z-axes (the Revit x-, y-, and z-axes); however, each object has a different long direction (Figure 4.112). To convert a movement along the object's own directions into movements along its local axes, the angle between the object's longitudinal direction and its local x-axis is used again; it was already calculated when the object's Gizmos was created by CreateGizmos() (4.2.5.7). The movements along the x-axis and y-axis are then calculated with circular functions (Figure 4.113). For example, if the angle between the longitudinal direction and the object's local x-axis is θ, then moving 1 unit along the longitudinal direction equals moving cos θ units along the x-axis plus sin θ units along the y-axis (Figure 4.113).

Figure 4.112 Each object has the same local x-axis, y-axis, and z-axis

Figure 4.113 Using the circular function to get the components of a movement

The angle θ between the object's longitudinal direction and the x-axis, in radians, is angle_LG_X:
angle_LG_X = angle * Math.PI / 180
The angle θ between the object's cross-sectional direction and the x-axis, in radians, is angle_CR_X:
angle_CR_X = (angle + 90) * Math.PI / 180

The contribution of the cross-sectional movement:
float cr_x_movement = (float)(CrMove * 0.3048 * Math.Cos(angle_CR_X));
float cr_y_movement = (float)(CrMove * 0.3048 * Math.Sin(angle_CR_X));

The contribution of the longitudinal movement:
float lg_x_movement = (float)(LgMove * 0.3048 * Math.Cos(angle_LG_X));
float lg_y_movement = (float)(LgMove * 0.3048 * Math.Sin(angle_LG_X));

The contribution of the vertical movement:
float vt_z_movement = (float)(VtMove * 0.3048);

(Note: the factor 0.3048 converts feet to meters.)

Finally, the object's new local x, y, z position is calculated by adding the x, y, z movements to the original local x, y, z (Figure 4.113), where x_movement is the sum of cr_x_movement and lg_x_movement (y_movement likewise), and z_movement equals vt_z_movement:
float new_x = OriginalPosition["x_local"] + x_movement;
float new_y = OriginalPosition["y_local"] + y_movement;
float new_z = OriginalPosition["z_local"] + z_movement;

Figure 4.114 Calculate the radians and the movements along the x-axis and y-axis
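The decomposition in part 3 can be collected into one small helper. The sketch below restates the formulas above; the method name and parameter list are illustrative, not the thesis code.

```csharp
// Sketch of part 3 of MovementChangeEdited(): convert the cross-sectional
// (crMove), longitudinal (lgMove), and vertical (vtMove) inputs, given in
// feet, into movements along the object's local axes. "angle" is the
// degree value computed in CreateGizmos().
using System;
using UnityEngine;

public static class MovementSketch
{
    const float FeetToMeters = 0.3048f;

    public static Vector3 LocalMovement(float crMove, float lgMove,
                                        float vtMove, float angle)
    {
        double angleLgX = angle * Math.PI / 180.0;          // longitudinal vs x-axis
        double angleCrX = (angle + 90.0) * Math.PI / 180.0; // cross-sectional vs x-axis

        float x = (float)(lgMove * FeetToMeters * Math.Cos(angleLgX)
                        + crMove * FeetToMeters * Math.Cos(angleCrX));
        float y = (float)(lgMove * FeetToMeters * Math.Sin(angleLgX)
                        + crMove * FeetToMeters * Math.Sin(angleCrX));
        float z = vtMove * FeetToMeters;                    // vertical movement

        return new Vector3(x, y, z);  // added to the original local position
    }
}
```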
4. The fourth part sets the object's local position to the results above (new_x, new_y, new_z) (Figure 4.115). The new movement values are also passed to the object's changeParaDict to update the dictionary's information. A function called updateRevitCoord() is called to recalculate the object's Revit coordinates from its new position (Figure 4.116); for example, the new startPoint x = original startPoint x + x_movement (x_movement as explained above). Therefore, every time the object is moved, its Revit coordinates (startPoint and endPoint) are overwritten, and the new coordinate information is later transferred back as the changed parameter information to update the Revit model.

Figure 4.115 Move the object to the new location; save the movement information in "changeParaDict" and update the object's Revit coordinate information

Figure 4.116 The updateRevitCoord() function

In summary, MovementChangeEdited() lets users type movement values to change an object's location. Examples are shown below (Figure 4.117).

Figure 4.117 Change movement results

Besides recording GameObjects (such as the current GameObject) and defining the movement function, the ReferenceObject.cs script also defines an updateDict() function (Figure 4.118). ReferenceObject.cs holds a dictionary uploadDict <string, string[]>. The keys of uploadDict are the Revit ElementIDs of the changed objects, and the value of each key is an array of strings representing that object's new Revit coordinate information; the uploadDict therefore stores all changed objects' parameter information. Every time the user selects a new GameObject, updateDict() updates the uploadDict with the previously manipulated GameObject's new Revit coordinate information (Figure 4.118). There are three steps (Figure 4.118):
1. Create a new dictionary to store the previous (recording) object's new Revit coordinates.
2. Convert the dictionary into an array of strings, where each element is a key-value pair (e.g. "new_startP_x : 14.3").
3. Update the uploadDict dictionary with the new array.

Figure 4.118 The updateDict() function

4.2.7 Create the Text File

After a user modifies objects' parameter information (object locations) in the AR environment, the "Upload data" button creates a text file with all the changes made in TransBIM. The function that creates the text file is CreateText(), triggered when the user clicks the "Upload data" button (Figure 4.119).

Figure 4.119 Use the "Upload data" button to create the text file

There are four parts in the CreateText() function (Figure 4.120); a sketch follows the figures.
1. Call the updateDict() function again to make sure the last selected GameObject's Revit coordinate information is stored in the uploadDict dictionary; then show the "upload UI window" (Figure 4.123).
2. Define where the text file will be created. The text file is called "objectMeta.txt".
3. Create or overwrite the file at the defined path with no text content (Figure 4.121).
4. Get the uploadDict information and write it line by line into the text file. Each line starts with the Element ID, followed by the object's new Revit coordinate information (Figure 4.122).

Figure 4.120 CreateText() function

Figure 4.121 Generate the text file at the defined path

Figure 4.122 The text file result
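A hedged sketch of parts 2-4 is shown below, assuming uploadDict maps Revit ElementIDs to arrays of "key : value" strings as described above. The output folder (Application.persistentDataPath) is an assumption for portability; the actual path used by TransBIM is defined in the script.

```csharp
// Sketch of CreateText(): write every entry of uploadDict to
// "objectMeta.txt", one line per changed element.
using System.Collections.Generic;
using System.IO;
using UnityEngine;

public static class CreateTextSketch
{
    public static void CreateText(Dictionary<string, string[]> uploadDict)
    {
        // persistentDataPath is writable on iOS devices as well as in the Editor
        string path = Path.Combine(Application.persistentDataPath, "objectMeta.txt");

        // append: false creates the file or overwrites any previous content
        using (StreamWriter writer = new StreamWriter(path, append: false))
        {
            foreach (KeyValuePair<string, string[]> entry in uploadDict)
            {
                // e.g. "354721, new_startP_x : 14.3, new_startP_y : 2.1, ..."
                writer.WriteLine(entry.Key + ", " + string.Join(", ", entry.Value));
            }
        }
    }
}
```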
4.2.8 Upload Text File to Dropbox

To transfer the changed BIM metadata from TransBIM to Revit, TransBIM uses Dropbox as a transfer station. This section explains how to upload the text file with the changed BIM metadata to Dropbox (Figure 4.123).

Figure 4.123 Upload the file to Dropbox

When the user clicks the "Upload data" button, the CreateText() function is triggered. In CreateText(), the "UploadUIWindow" is set to active, so it pops up on the screen (Figure 4.124). The "UploadUIWindow" is the interface where users upload the local file (the created text file) from Unity or their mobile devices to Dropbox. The "UploadUIWindow" and its connected functions were adapted from a package called "DropboxSync", downloaded from the Unity Asset Store (Figure 4.125).

Figure 4.124 The "UploadUIWindow" pops up when clicking the "Upload data" button

Figure 4.125 DropboxSync - upload and download files from Dropbox (https://assetstore.unity.com/packages/tools/integration/dropboxsync-upload-and-download-files-from-dropbox-120529)

To use the package, the user has to create an application on Dropbox to use the Dropbox API:
1. Navigate to the Dropbox app creation page (https://www.dropbox.com/developers/apps/) and click the "Create app" button (Figure 4.126).

Figure 4.126 Go to the Dropbox app creation page

2. Create the app's folder. In this case, the folder name is TransBIM (Figure 4.127).

Figure 4.127 Create an app folder on Dropbox

3. Generate an access token for the app that will be used by DropboxSync (Figure 4.128).

Figure 4.128 Generate an access token for the app

4. Copy the generated access token and paste it into the "DropboxSync Script" inspector field (Figure 4.129).

Figure 4.129 Copy the access token and paste it into the "DropboxSync Script"

5. Change the upload local file path in UploadFile.cs (Figure 4.130). When the user clicks the "UploadFile" button in the "UploadUIWindow", the file at the defined path is uploaded to Dropbox (Figure 4.131).

Figure 4.130 Change the upload local file path in the UploadFile.cs

Figure 4.131 Results after the text file was uploaded to Dropbox
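The DropboxSync package hides the HTTP details. For orientation only, a bare-bones upload against Dropbox's documented /2/files/upload endpoint with UnityWebRequest could look like the sketch below; the access token and the target path "/objectMeta.txt" are placeholders, and this is not the package's own code.

```csharp
// Direct upload to Dropbox's HTTP API v2, as an alternative illustration.
using System.Collections;
using System.IO;
using UnityEngine;
using UnityEngine.Networking;

public class DropboxUploadSketch : MonoBehaviour
{
    public string accessToken = "PASTE_GENERATED_ACCESS_TOKEN";

    public IEnumerator Upload(string localPath)
    {
        byte[] data = File.ReadAllBytes(localPath);

        var request = new UnityWebRequest(
            "https://content.dropboxapi.com/2/files/upload", "POST");
        request.uploadHandler = new UploadHandlerRaw(data);
        request.downloadHandler = new DownloadHandlerBuffer();
        request.SetRequestHeader("Authorization", "Bearer " + accessToken);
        // Target path inside the app folder; overwrite any previous upload
        request.SetRequestHeader("Dropbox-API-Arg",
            "{\"path\": \"/objectMeta.txt\", \"mode\": \"overwrite\"}");
        request.SetRequestHeader("Content-Type", "application/octet-stream");

        yield return request.SendWebRequest();
        Debug.Log(request.responseCode == 200 ? "Uploaded" : request.error);
    }
}
```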
4.2.9 ARFoundation and ARKit (locate the model on site)

The chapter so far has explained how to parse Revit data into asset files (ScriptableObject), the UI system, the interaction functions (selecting a GameObject to show its Revit parameter information, changing object locations), generating a text file with the changed Revit data, and uploading the text file to Dropbox. The last feature of TransBIM uses on-site images to merge the virtual model with the real building (Figure 4.132).

Figure 4.132 AR Image Tracking and Load Model

This section explains how to position the virtual model using the AR image tracking features provided by ARKit 3 and ARFoundation 3, which can be downloaded from the Unity Package Manager (Figure 4.133). As described in Chapter 3, two images are used to locate the model. After scanning the real-world images, the system gives the tracked images a Unity world system location (a position relative to where the user opened the application). The images' positions are the inputs used to locate the virtual model.

Figure 4.133 Download ARFoundation, ARKit XR Plugin and other XR packages in the Unity Package Manager

First, the "AR Session" and "AR Session Origin" components were added to the scene (Figure 4.134). The "AR Session" GameObject is the fundamental component that enables an AR experience; it has two scripts attached to it (Figure 4.134 Left), of which "ARSession.cs" controls the lifecycle of the AR experience, managing camera capture, motion processing, and image analysis. The "AR Session Origin" GameObject has three scripts attached to it:
1. "ARSessionOrigin.cs" is used to get the position, orientation, and scale of trackable features (in this case, tracked images) in Unity world space.
2. "ARTrackedImageManager.cs" enables the system to track preset images, which are defined in the "ReferenceImageLibrary" (Figure 4.135). The two images used on site as references to position the virtual model were set in the "ReferenceImageLibrary": the first image is called "FirstPoint(Blue)" and the second "SecondPoint(Green)".
3. "TrackedImageAndLoadModel.cs" contains the functions that position the Revit model based on the tracked images' locations so that the virtual model and the real building merge.

Figure 4.134 ARSession (Left); ARSessionOrigin (Right)

Figure 4.135 Set trackable images in the "ReferenceImageLibrary"

"TrackedImageAndLoadModel.cs" assigns prefabs to public GameObject variables (Figure 4.136). ARObjectsToPlace is an array containing two elements, "FirstPoint(Blue)" and "SecondPoint(Green)" (Figure 4.136); these prefabs are two spheres that are positioned at the centers of the detected images (Figure 4.137). "InstantiateInstance" is assigned the model prefab, in this case called "Watt Hall" (Figure 4.138).

Figure 4.136 Assign prefabs to the "TrackedImageAndLoadModel.cs"

Figure 4.137 "FirstPoint(Blue)" and "SecondPoint(Green)" prefabs

Figure 4.138 The model prefab "Watt Hall"

The virtual model is positioned in the real-world environment based on the tracked images' locations. Two images locate the model: the first gives the overall position of the model, and the second is used to calculate the rotation angle that aligns the virtual model with the real building. Hard copies of the two images were posted on site in the real building (Figure 4.139).

Figure 4.139 Set tracking images' positions in both the virtual model and the real building

Because the pivot point controls the overall position of the Revit model, the first image's tracked location is where the model's pivot point is placed. Because it was easy to stick the physical image at a door's center, the pivot point of the model prefab was set at a door's center (Figure 4.139), and the direction of the door is aligned with the Unity world z-axis (Figure 4.140).

Figure 4.140 Set the pivot point at one door's center

When the defined images are detected, the OnTrackedImagesChanged() function is called, which in turn triggers the UpdateARImage() function (Figure 4.132). UpdateARImage() then calls the AssignGameObject() function to update the tracked images' locations in Unity world coordinates (Figure 4.141).

Figure 4.141 UpdateARImage() to update the detected images' locations

The AssignGameObject() function updates the tracked images' locations and assigns the latest locations to the prefabs' (spheres') positions. Then, once the system has found both on-site images, it calculates a vector from the first image to the second image (Figure 4.142, Figure 4.143, Figure 4.144).
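A simplified sketch of how this image-tracking callback can be hooked up with the ARFoundation 3 ARTrackedImageManager is shown below. The reference image names follow the "ReferenceImageLibrary" described above, while the marker-placement logic is reduced for illustration.

```csharp
// Sketch of the tracked-image hookup in the spirit of TrackedImageAndLoadModel.cs.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TrackedImageSketch : MonoBehaviour
{
    public ARTrackedImageManager trackedImageManager;
    public GameObject firstPointPrefab;    // blue sphere
    public GameObject secondPointPrefab;   // green sphere

    void OnEnable()  { trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged; }
    void OnDisable() { trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged; }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
            PlaceMarker(image);
        foreach (ARTrackedImage image in args.updated)
            PlaceMarker(image);   // keep markers on the latest tracked position
    }

    void PlaceMarker(ARTrackedImage image)
    {
        // Choose the marker by the reference image name set in the library
        GameObject prefab = image.referenceImage.name == "FirstPoint(Blue)"
            ? firstPointPrefab : secondPointPrefab;
        // A real implementation would move an existing instance instead of
        // instantiating a new one on every update
        Instantiate(prefab, image.transform.position, image.transform.rotation);
    }
}
```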
The vector from the first image to the second is used to derive the rotation angle of the model (Figure 4.143, Figure 4.144).

Figure 4.142 AssignGameObject() to get the tracked images' locations and vector

Figure 4.143 AssignGameObject() to get the tracked images' locations and vector

Figure 4.144 Tracked images' X-Z plane-projection locations in Unity world coordinates

The vector from the first image to the second in the X-Z plane is (x2-x1, 0, z2-z1) (Figure 4.144). This vector represents the direction of the door in the real building. Because the default direction of the virtual door is aligned with the Unity world z-axis, the rotation angle of the model is the angle between the images' horizontal vector and the Unity world z-axis. After scanning the two images, the user can click the "Load Model" button to locate the model. The button calls the LoadModel() function (Figure 4.145), which gets the two tracked images' locations and calculates the model's rotation angle. As a result, the model is placed at the first image's location with the calculated rotation (Figure 4.146). A sketch of this placement step follows the figures.

Figure 4.145 LoadModel() to locate the model based on the first and second images' locations

Figure 4.146 Result of loading the model based on the tracked images' locations
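The sketch below is a minimal version of the placement step, assuming the two sphere positions are available. Vector3.SignedAngle is used here to measure the horizontal angle between the world z-axis and the image-to-image vector; the exact math inside LoadModel() may differ.

```csharp
// Sketch of LoadModel(): place the model prefab at the first image's
// position, rotated by the horizontal angle of the image-to-image vector.
using UnityEngine;

public class LoadModelSketch : MonoBehaviour
{
    public GameObject modelPrefab;        // e.g. the "Watt Hall" prefab
    public Transform firstPoint;          // blue sphere at image 1
    public Transform secondPoint;         // green sphere at image 2

    public void LoadModel()
    {
        // Project the image-to-image vector onto the X-Z plane: (x2-x1, 0, z2-z1)
        Vector3 v = secondPoint.position - firstPoint.position;
        v.y = 0f;

        // Signed angle about the world up axis between world +Z and the vector
        float angle = Vector3.SignedAngle(Vector3.forward, v, Vector3.up);

        Instantiate(modelPrefab, firstPoint.position, Quaternion.Euler(0f, angle, 0f));
    }
}
```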
4.2.10 Summary

This section explained the major Unity scripts and features used to develop the TransBIM application. These features include:
• Defining ScriptableObjects for different objects.
• Parsing the CSV file and storing each object's Revit metadata in an asset file.
• Designing the TransBIM UI system and user input.
• Selecting a GameObject and showing its parameter information in the "Property window" and "Change Parameter window".
• Changing an object's parameter information (movement, sizes, etc.).
• Creating a text file with all changed objects' modified Revit metadata and uploading it to Dropbox.
• Using the ARFoundation image tracking method to merge the virtual model with the real building.

4.3 Update BIM with Modified Metadata

This section introduces the workflow for using the modified BIM metadata to update the building information model in Revit (Figure 4.147). Dynamo reads the modified BIM metadata from the text file and updates the Revit model accordingly.

Figure 4.147 Update BIM model using Dynamo

First, download the text file from Dropbox (Figure 4.148). Each line of the text file stores one changed object's Revit ElementID and its new coordinate information after the object's location was modified in the AR environment.

Figure 4.148 Download text file from Dropbox

Then, use "Change location multiple.dyn" to update the model (Figure 4.149). The changed objects are found by their ElementIDs, and their locations are updated with the new coordinate information parsed from the downloaded text file.

Figure 4.149 Use "Change location multiple.dyn" to update the Revit model

The Dynamo script first reads the text file into a string. Each line stores one changed object's information, and within each line the information (element ID and the object's new coordinates) is separated by ",". The string is therefore split by line and then by "," to get a list of each object's coordinate information (Figure 4.150).

Figure 4.150 Read and parse the text file

Then, the numeric information stored in the new Revit coordinates is extracted and grouped into six categories: startPoint-x, startPoint-y, startPoint-z, endPoint-x, endPoint-y, endPoint-z (Figure 4.151).

Figure 4.151 Get the coordinate numeric information

Finally, the coordinate values are converted from strings to float numbers (Figure 4.152). During the conversion, the factor 0.3048 is used to convert the Unity meters back to feet. The startPoint and endPoint of each changed object are then constructed and used to generate the object's control line, which updates the object's location. Results show one duct and one conduit moved to their new locations by the Dynamo script (Figure 4.153).

Figure 4.152 Generate control lines based on the new coordinates to update the object's location

Figure 4.153 Result of updating the changed objects' locations

In summary, updating an object's parameter information in Dynamo is realized by assigning a new value to the corresponding parameter type of the object. The script "Change location multiple.dyn" shows how to update an object's location from the new coordinate information stored in the text file generated on site. As a result, the changes the user made remotely on site are synchronized into the local Revit model.

4.4 Summary

This chapter explained the core Dynamo and Unity scripts used to develop the tool. The scripts fall into three parts: preparing the BIM data (Dynamo), developing the mobile AR application (Unity), and updating the BIM model with the modified metadata (Dynamo) (Figure 4.154). The first and third parts use Dynamo scripts to export and update the Revit metadata; the second part uses Unity C# scripts and AR SDKs (ARFoundation and ARKit) to develop the TransBIM application. The complete scripts can be found in Appendices A, B, and C. In the next chapter, a case study of using TransBIM in a real building is demonstrated.

Figure 4.154 Core scripts used in Dynamo and application development

Chapter 5

5. CASE STUDY AT WATT HALL

This chapter discusses how the application, TransBIM, is used. It starts with an introduction of the case study building (USC Watt Hall) selected to test the developed application and the preparation of the building's Revit model. It then explains how to build the TransBIM application for iOS devices with Unity and Xcode. The most important part explains how the application was used in the case study building (Watt Hall). Finally, the application's functionality and limitations are discussed. Overall, this case study intends to validate the tool's usability and accuracy by examining whether the main features of the program work properly and give the expected outcomes.

5.1 Watt Hall Revit Model and Asset Preparation

This section gives an overview of the case study building and its Revit model. The building's Revit architectural model was provided by USC Facility Management Services, and some MEP (mechanical, electrical, plumbing) components were then modeled and added to the test-case Revit model. The as-built MEP components were modeled from the building's point cloud data, generated with laser scanning technology. Finally, when enough MEP components for testing had been added to the model, an FBX file of the Revit model was exported and modified in 3ds Max, and a CSV file with all the Revit model metadata was generated. The FBX file and CSV file were the assets used to build the TransBIM application.

5.1.1 Case study scale (Watt Hall third floor)

USC's Watt Hall was selected as the case study building.
It is the headquarters of the USC School of Architecture and Fine Arts, with a net area of 67,855 square feet (Figure 5.1). The building has three stories above ground and one below ground with a mezzanine (Figure 5.1).

Figure 5.1 Watt Hall

To narrow the scope, the case study focused on the third floor of Watt Hall (Figure 5.2, Figure 5.3). Another reason for using the third floor is its high ceiling (the roof) without a suspended ceiling: all the MEP systems are exposed, and anyone in the building can see where the MEP components are. The visible MEP components make it possible to validate the application's augmented reality features; by checking whether the virtual building model properly superimposes on the physical building, the accuracy of the AR system can be validated.

Figure 5.2 A panorama image of the Watt Hall third floor

Figure 5.3 Using TransBIM at the Watt Hall third floor MBS corner

5.1.2 Revit model and MEP as-built based on point cloud

The Revit model was provided by USC Facility Management Services. However, the model only has the architecture discipline elements (walls, doors, windows, columns, roofs, etc.) and a few building asset elements (lighting fixtures, computers) (Figure 5.4). Since the purpose of the TransBIM application prototype was to visualize and edit the MEP components' Revit data in the AR environment and to demonstrate the potential use of AR during FM or renovation, some of the existing MEP components in Watt Hall were added to the overall Revit model.

Figure 5.4 Watt Hall Revit model (Left); Watt Hall third floor (Right) without MEP objects

The as-built MEP elements were modeled in Revit using laser scanning technology, which captures reality in its exact size and shape as a digital three-dimensional representation. The scan results are called point clouds because the captured results are 3D points. The Leica BLK360, a laser scanning device, and Autodesk Recap Pro, a 3D scanning software, were used to capture the building and process its point cloud data (Figure 5.5, Figure 5.6). When scanning at Watt Hall, the BLK360 was connected to an iPad with Recap installed. During the scanning, the scanner was positioned at several locations in Watt Hall; the more scans were made, the more accurate the results. When a scan at one location was finished, the scan result (point cloud data) was transferred from the BLK360 to the iPad's Recap project, and in Recap Pro the point cloud data from the different scan locations was automatically merged. Finally, all the scanned data stored on the iPad was transferred to a PC.

Figure 5.5 Using the BLK360 to scan one spot in the building

Figure 5.6 Recap Pro and Leica BLK360

To use the point cloud information on a local machine, the iPad was connected to a PC to transfer the point cloud file. On the PC, Recap Pro Desktop was used to read the point cloud information (Figure 5.7), which was then saved in a Recap-compatible file format (.RCP, .RES). The .RCP file was imported into the Watt Hall Revit project and merged with the model (Figure 5.8). The point cloud is a volume containing the captured reality points in color; once it was merged with the Revit model, the scanned components' point cloud data was visible in the model.
The point cloud showed the shape of each MEP component, what type of component it looks like, and its accurate location.

Figure 5.7 Point cloud in Recap Pro (Desktop)

Figure 5.8 Link the RCP file in Revit and merge it with the model

The trick to drawing the model in Revit from the point cloud is to set a proper "View Range" and to create sections with a proper "Far Clip Offset". The "View Range" and "Far Clip Offset" both define the range of visibility in a view, so each view displays only the objects and point cloud within its range. This helps the modeler partition the building space and more easily understand the objects' spatial relationships in each space. The example compares a plan view with a wide "View Range" (from a high level to a low level) and a plan view with a short "View Range" (Figure 5.9, Figure 5.10). When the "View Range" was set at the height around the ducts and conduits, the point cloud outside the view range was removed from the view (Figure 5.10), and the point cloud within the range displayed the MEP components.

Figure 5.9 Floor plan with a wide view range (from high to low)

Figure 5.10 Floor plan with a short view range around a high level

The point cloud in the floor plan views provided the layout of the MEP components' directions and sizes (diameter or width), while the point cloud in the section views provided the objects' elevations and size information. For example, a section view cutting a conduit's cross-sections shows the conduit's cross-section size and elevation (Figure 5.11 Left), while the conduit's horizontal layout is obtained from the plan view (Figure 5.11 Right). A section view along a duct's longitudinal direction shows the duct's elevation and shape (Figure 5.12).

Figure 5.11 Section view providing conduits' elevations and sizes (Left); plan view providing conduits' layout and sizes (Right)

Figure 5.12 Section view showing the duct's elevations and sizes

Finally, based on the plan views, the section views, and the point cloud, some of the MEP components were modeled in Revit at close to their actual sizes and locations (Figure 5.13, Figure 5.14). Not all the scanned data was used, just enough for the case study.

Figure 5.13 The 3D view of the final model (MEP disciplines)

Figure 5.14 The 3D view of the final model (Coordination disciplines)

5.1.3 Prepare the FBX file and CSV file (BIM geometry and metadata)

Once the Revit model is prepared, an FBX file of the model and a CSV file of the model's metadata should be generated. In the case study, an FBX file, "Watt_Hall_Third_Floor_Revit.fbx", was exported from Revit (Figure 5.15). The FBX file was opened in 3ds Max to convert the objects' materials to standard materials and to reset each object's pivot point to its geometry center (Figure 5.16); for the detailed steps, refer to Chapter 3. Then, a new FBX file called "Watt_Hall_Third_Floor_3ds.fbx" was created by 3ds Max (Figure 5.17), and this final FBX file was imported into the Unity project folder as an asset.

Figure 5.15 Export the FBX file from Revit

Figure 5.16 Convert objects' materials to standard and reset the pivot points

Figure 5.17 The FBX files exported from Revit and 3ds Max

The metadata of the objects in the Revit model was exported into a CSV file with the Dynamo script "Revit to Excel", which performs three separate actions.
For a detailed description of the Dynamo script, refer to Chapters 3.2.2, 3.2.3, and 4.1.
1. Assign new parameter types to each element. Elements whose locations are controlled by lines were assigned six parameters (startPoint x, y, z and endPoint x, y, z); elements whose locations are controlled by one point were assigned three parameters (hostPoint_x, hostPoint_y, hostPoint_z).
2. Assign each object's spatial information to the newly created parameters (Figure 5.18).
3. Export the Revit metadata (element ID and parameters) to Excel (Figure 5.19).

Figure 5.18 Result of assigning objects' spatial information to the newly created parameters

Figure 5.19 Results of the exported metadata in Excel

Both the CSV file and the FBX file (exported from 3ds Max) were imported into the Unity project. Detailed instructions on how to export and modify the FBX file and how to use the Dynamo script to create the CSV file are discussed in Chapters 3 and 4.

5.2 Build the TransBIM on iOS

This section introduces the five steps for using the developed Unity scripts to build a TransBIM application for a project. The total time to create the Unity scene and build the application is around 30 minutes.

5.2.1 Add the FBX and CSV file into the TransBIM project's folder
5.2.2 Set model location and custom scan images
5.2.3 Run the application in Unity Editor to generate asset files
5.2.4 Create a Dropbox connection
5.2.5 Build the application to iOS with Xcode

5.2.1 Import FBX and CSV file into the TransBIM project

Users can build the TransBIM application by directly using the "TransBIM" scene (Figure 5.20), the default scene that contains the basic components such as the Unity XR components (AR Session and AR Session Origin) and the TransBIM user interface (Canvas). Users should also make sure Unity has the required XR packages installed: AR Foundation 3.0, ARKit XR Plugin 3.0, and XR Management 3.0 (Figure 5.21).

Figure 5.20 TransBIM scene in Unity

Figure 5.21 Download Unity XR packages

After the scene is opened, the CSV file with the BIM data and the FBX file with the BIM geometry can be imported into the Unity project. The CSV file (BIM metadata) should be imported into the Assets/Resources folder (Figure 5.22). The CSV file name must be "Object Parameters.csv" because the "ReadCSV.cs" script reads the CSV file from exactly this default path (Assets/Resources/Object Parameters); a sketch of this loading step follows the figure.

Figure 5.22 Importing the "Object Parameters.csv" into the Assets/Resources folder
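The sketch below shows why the file name matters, assuming ReadCSV.cs loads the file with Resources.Load, which takes the path relative to a Resources folder and without the extension; the parsing shown is simplified and not the thesis code.

```csharp
// Sketch of loading "Object Parameters.csv" from the Resources folder.
using UnityEngine;

public class ReadCsvSketch : MonoBehaviour
{
    void Start()
    {
        // Path is relative to Assets/Resources and omits the ".csv" extension
        TextAsset csv = Resources.Load<TextAsset>("Object Parameters");
        if (csv == null)
        {
            Debug.LogError("Object Parameters.csv not found in Assets/Resources");
            return;
        }

        // One line per Revit element; fields separated by commas
        string[] lines = csv.text.Split('\n');
        foreach (string line in lines)
        {
            string[] fields = line.Split(',');
            // ... match the element ID (fields[0]) and fill the ScriptableObject
        }
    }
}
```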
24 Set the overall pivot point of the Watt Hall prefab at the center of a door to scan the first image 161 Figure 5. 25 Set the tracking images’ positions in both a virtual model and a real building Based on the “TransBIM” scripts, the defined pivot point is required to be located at the center or face of a building component that has a direction along with the Unity Z-axis (Figure 5.24). The reason is that the TransBIM can detect the on-site two images’ locations and calculate the angle between the two-image direction and the Unity World Z-axis. The rotation makes the virtual model correctly rotated to merge the virtual model with the real environment. For a detailed explanation, see Chapter 4.2.9. Users only need to set the first image location (Prefab pivot point) in Unity, while the second image location does not need to be set in Unity. If it meets the above pivot point rule, users can put the second image on-site anywhere as long as the second image is to the right of the first image and in the same plane with the first image. In the case study, the first image was put at where the pivot location (a door center), and the second image was put at two different physical locations to do to tests (Figure 5.26). One test was putting the second image on the right of the first image in the same door (Figure 5.26 Left). The other test was putting the second image on the different door that shares the same door’s centerline with the door put with the first image (Figure 5.26 Right). Both tests correctly loaded the model. Figure 5. 26 First Image and Second diagram User can set their custom scan images in the “ReferenceImageLibrary”. The “ReferenceImageLibrary” can be found from the “AR Session Origin” component (Figure 5.27). The images the users use in the “ReferenceImageLibrary” determine what images will be printed and put on site. 162 Figure 5. 27 Define Custom scan Images in “ReferenceImageliabry” In this specific example, the “Watt Hall” FBX model was given a new pivot point at the center of a door. Users can follow the same steps to create their custom TransBIM model prefabs with custom scan locations. The method to set the FBX model’s pivot point is by giving a parent GameObject to control the FBX model. First, the FBX file prefab was dragged into the scene (Figure 5.28). This created an instance of the users’ FBX file model. The Rotation-x was automatically set as -90 degrees to make the FBX model’s z-axis pointing up. Figure 5. 28 Drag the FBX file prefab into the scene Second, the FBX prefab instance was unpacked so that any child GameObject under the FBX hierarchy can be freely changed (Figure 5.29). 163 Figure 5. 29 Unpack the Prefab Third, all objects under the FBX hierarchy were selected and added two script components (“ReadCSV.cs” and “ClickObject.cs”) (Figure 5.30). Adding these two script components enables TransBIM functionalities. For more detail about these two scripts, see Chapter 4.2.3 for “ReadCSV.cs” and 4.2.5 for “ClickObject.cs”. Figure 5. 30 Add script components to all FBX entities Fourth, an empty GameObject was created (Figure 5.31). Users can move the empty GameObject to a location that is proper to be the pivot point of the overall FBX model. The place can be at the center of a door, a place on the face of a wall, etc. In the case study example, this empty GameObject was moved to the center of a door (Figure 5.32). Figure 5. 31 Create an empty GameObject 164 Figure 5. 
Instead of moving the empty GameObject manually, one trick to place it accurately at the center of a door is to get the door's pivot position in the Unity world system and copy that position to the GameObject's world position. The door was duplicated and dragged out of the FBX file hierarchy to reveal the door's position in the Unity world system (Figure 5.33).

Figure 5.33 Find the door's pivot location in the Unity world system

Finally, the GameObject was named "TransBIM Model", and the unpacked Watt Hall FBX instance was placed under it as a child (Figure 5.34). As a result, the "Watt Hall" FBX model is controlled by its parent ("TransBIM Model"), whose pivot point is at the center of one door (Figure 5.34).

Figure 5.34 Make the "TransBIM Model" the parent of the Watt Hall FBX instance

The "TransBIM Model" was saved as the final prefab and deleted from the scene (Figure 5.35). Then, in the AR Session Origin, the "Instantiate Instance" GameObject was replaced with the "TransBIM Model" prefab, so that the application loads the "TransBIM Model" in the AR environment when TransBIM is used.

Figure 5.35 Use the "TransBIM Model" in the AR Session Origin

Users have to follow the same process to create their own "TransBIM Model" prefab from their imported FBX file, set which images can be detected, and put the images on site.

5.2.3 Run the application in Unity Editor to generate asset files

After the model prefab is set, the user can click "Play" in Unity, and the scene is executed by the Unity Editor. The model prefab is duplicated into the scene (Figure 5.36). Because the model prefab carries the "ReadCSV.cs" script component, playing the scene runs the script and generates an asset file for each building component, storing its BIM metadata from the CSV file (Figure 5.37). All the asset files are saved in the Assets/Resources/metadataScriptableObject folder (Figure 5.37). Because the asset files are created in the project folder, they are built into the TransBIM application; when a user runs TransBIM, the asset files are called to provide each object's BIM metadata.

Figure 5.36 The "TransBIM" model prefab duplicated in the scene

Figure 5.37 The generated asset files that store each object's metadata

5.2.4 Create a Dropbox connection

One of the features of TransBIM is uploading a text file containing the changed BIM metadata to the user's Dropbox. For TransBIM to work with a user's Dropbox, an access token must be created on Dropbox and used in TransBIM so that the user's Dropbox permits TransBIM to access its services. Users first create a Dropbox app on the Dropbox developer web page (Figure 5.38); an access token can then be generated in the Dropbox app settings (Figure 5.39). Finally, users copy the access token into the "DropboxSync" script in the TransBIM scene (Figure 5.40). For detailed instructions, see Chapter 4.2.8.

Figure 5.38 Create a Dropbox app folder

Figure 5.39 Create an app in Dropbox and generate an access token

Figure 5.40 Copy the access token to the "DropboxSync" script in the TransBIM scene

5.2.5 Build the application to iOS with Xcode

Sections 5.2.1 through 5.2.4 addressed how to use and modify the default TransBIM scene with the user's custom FBX and CSV files. This part overviews the process of building an iOS application with Unity and Xcode.
5.2.4 Create a Dropbox connection

One of the features of TransBIM is uploading a text file that contains the changed BIM metadata to the user’s Dropbox. For TransBIM to work with a user’s Dropbox, an “access token” must be created in Dropbox and used in TransBIM so that the user’s Dropbox permits TransBIM to access Dropbox services. Users first create a Dropbox app on the Dropbox developer web page (Figure 5.38). An access token can then be generated in the Dropbox app settings (Figure 5.39). Last, users copy the access token into the “Dropbox Sync” script in the TransBIM scene (Figure 5.40). For detailed instructions, see Chapter 4.2.8.

Figure 5.38 Create a Dropbox App folder

Figure 5.39 Create an App in Dropbox and generate an access token

Figure 5.40 Copy the access token to the “DropboxSync” in the TransBIM scene

5.2.5 Build the application to iOS with Xcode

Sections 5.2.1 through 5.2.4 address how to use and modify the default TransBIM scene with the user’s custom FBX and CSV files. This part overviews the process of building an iOS application with Unity and Xcode. The steps below describe how the application was installed on the iOS devices used in the case study.

First, some changes were made in the Player settings and the iOS settings (Figure 5.41).

Figure 5.41 TransBIM iOS settings

In iOS Settings > Other Settings, a new Bundle Identifier was created for the “TransBIM” development. A Bundle ID is the identifier Apple uses to distinguish individual apps. Also, the “Target Device” was set to “iPhone + iPad”, and the “Target minimum version” was set to 11.0, the minimum iOS version that supports the ARKit features. After all the settings match the examples (Figure 5.42, Figure 5.43), users can close the “Project Settings” window.

Figure 5.42 Project Settings > Player Settings > iOS Settings > Other Settings (1)

Figure 5.43 Project Settings > Player Settings > iOS Settings > Other Settings (2)

Next, the TransBIM scene was selected to build an Xcode project (Figure 5.44, Figure 5.45).

Figure 5.44 Build the TransBIM Unity project to an Xcode project

Figure 5.45 The Xcode project built from Unity

For the case study, the built Xcode project was opened in Xcode (Figure 5.46), and the iOS device was connected to the Mac running Xcode (Figure 5.46). The Xcode version should support the iOS version needed for ARKit; for example, the case study used Xcode version 11.3.1 to make sure the application could be built for both an iPad and an iPhone running iOS 13.3 (Figure 5.47).

Figure 5.46 Open the Xcode project

Figure 5.47 Xcode version (left) and iOS version (right)

The general information of the Xcode project had already been set in Unity (Figure 5.48). In the Signing & Capabilities tab, users need to log in to their Apple Developer account to create a developer profile (Figure 5.48). Turning on “Automatically manage signing” generates a “Signing Certificate” automatically (Figure 5.48). Finally, users can build the application and install it on the connected iOS device; building and installation take around 5 minutes.

Figure 5.48 Log in to the Apple Developer account

In summary, this part introduced the settings and the process for building the project from Unity to an iOS device with Xcode. In the case study, the TransBIM application was installed on both an iPad and an iPhone (Figure 5.49).

Figure 5.49 TransBIM installed on the iPad
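For users who prefer scripting their builds, the same Player settings can also be applied from an Editor script instead of the Project Settings window. The sketch below mirrors the settings described above but is not part of the thesis workflow, and the Bundle ID shown is a placeholder.

#if UNITY_EDITOR
using UnityEditor;

// Editor-only sketch that applies the iOS settings described above from code.
// "com.example.transbim" is a placeholder Bundle ID, not the one used in the thesis.
public static class TransBimBuildSettings
{
    [MenuItem("TransBIM/Apply iOS Settings")]
    public static void Apply()
    {
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.iOS, "com.example.transbim");
        PlayerSettings.iOS.targetDevice = iOSTargetDevice.iPhoneAndiPad;
        PlayerSettings.iOS.targetOSVersionString = "11.0"; // minimum version for ARKit
    }
}
#endif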
5.3 Case Study: Use TransBIM at Watt Hall Third Floor

The previous two sections introduced how to use the Revit model and the Unity “TransBIM scene” to create and install TransBIM on an iPad and an iPhone. This section introduces how to use TransBIM and covers the functionality and instructions of the application. TransBIM is a BIM-based mobile AR application for visualizing and editing the BIM model. Users can use TransBIM to visualize the building information model on site, particularly its MEP components (Figure 5.50). The “Longitudinal” label shown in the example appears reversed because the user holding the device stood behind the virtual text (Figure 5.50). The virtual components overlay the real building. Users can also view the virtual components’ parameter information and edit some element parameters, such as an element’s location. All changed parameter information is saved into a text file and uploaded to Dropbox.

Figure 5.50 Using TransBIM to visualize a duct on the Watt Hall third floor

There are nine features of the application (Figure 5.51):
5.3.1 Welcome Panel
5.3.2 Scan Images
5.3.3 Load Model
5.3.4 Filter object
5.3.5 Select object
5.3.6 Change the object’s color, hide and unhide objects
5.3.7 View object’s parameter
5.3.8 Change the object’s parameter (movement)
5.3.9 Upload the changed data to Dropbox

Figure 5.51 TransBIM features

5.3.1 Welcome panel

When opening the TransBIM application, the screen shows a “Welcome Panel” listing instructions on how to use the application (Figure 5.52):
1. Scan the two on-site reference images: two images predefined in Unity that the application can detect and locate in the Unity world space.
2. Once the images are scanned, click the “Load Model” button to load the model into the scene.
3. One can always rescan the images and click the “Load Model” button again to update the model’s location.
4. Once the model is loaded, one can click each object to view its information.
5. Use the “Property” button to show or hide the object’s parameter information.
6. Use the “Filter Objects” button to filter objects.
7. Use the “Highlight Color” button to switch the highlight color on or off.
8. Use the “Change Parameter” button to change an object’s editable parameters.
9. Use “Upload Data” to upload changed information to Dropbox.

Figure 5.52 The Welcome Panel shown when opening the application

5.3.2 Scan the images

Two printed images were put at defined locations on the Watt Hall third floor. These two images can be detected by TransBIM, and the application gives each image a Unity world system location. Based on the two image locations and the application’s algorithm, the virtual Revit model can be placed at the scanned location with the correct rotation; the rotation aligns the virtual geometry with the physical components. For a detailed explanation, see Chapter 4.2.9. The first image was put at the center of the defined door (the Watt Hall pivot point). There are many valid placements for the second image, as long as it is on the right side of the first image and in the same plane as the first image. For example, the second image can be put to the right of the first image on the same door (Figure 5.53). The second image can also be put on a different door that shares the same centerline as the door carrying the first image.

Figure 5.53 First and second image locations at the Watt Hall building (the second image was put on the same door as the first image)

When the user walks close to an image, the device’s camera detects the predefined image, and a corresponding sphere is loaded at the center of the detected image. The blue ball represents the first image location (the Watt Hall pivot location), and the green ball represents the second image location (Figure 5.54).

Figure 5.54 Detecting the first and second images and putting balls at the detected images’ locations

The example shows the two images put on the same door. The first image controls the overall location of the model, and the second image is used to find the direction from the first image to the second image (Figure 5.54). The second image could also be placed on the next door, as long as that door’s direction lies on the same line as the door with the first image (Figure 5.55). Putting the second image on the next door was tested, where the next door’s centerline is the same as the centerline of the door with the first image (Figure 5.56).
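The detection step rests on AR Foundation’s image tracking (the thesis builds on ARFoundation 3.0). A minimal sketch of spawning the two marker spheres is below; the reference image name and the prefab fields are assumptions for illustration, not the thesis’s exact script.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch of image detection with AR Foundation's ARTrackedImageManager.
// The reference image name "FirstImage" and the prefab fields are assumptions.
public class ImageMarkerSpawner : MonoBehaviour
{
    public ARTrackedImageManager imageManager; // on the AR Session Origin
    public GameObject blueSpherePrefab;        // marks the first image (pivot)
    public GameObject greenSpherePrefab;       // marks the second image

    void OnEnable()  { imageManager.trackedImagesChanged += OnChanged; }
    void OnDisable() { imageManager.trackedImagesChanged -= OnChanged; }

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // The tracked image's transform sits at the detected image center.
            GameObject prefab = image.referenceImage.name == "FirstImage"
                ? blueSpherePrefab
                : greenSpherePrefab;
            Instantiate(prefab, image.transform.position, Quaternion.identity);
        }
    }
}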
No matter which way the second image was placed, test results showed that the virtual model was put at the correct location and aligned with the physical building. Tests also showed that the second arrangement, which puts the two images a longer distance apart, can increase the accuracy of the model placement.

Figure 5.55 Place the images on different doors that share the same centerline (1)

Figure 5.56 Place the images on different doors that share the same centerline (2)

5.3.3 Load Model

After the two images’ locations were scanned and the two balls were visible in the AR environment, the “Load Model” button was clicked to load the building model (Figure 5.57). The virtual model is instantly overlaid on the physical environment captured by the device’s camera. The result shows that the virtual model merged correctly with the real environment: the virtual door was overlaid on the physical door (Figure 5.57), and the virtual ducts and conduits almost exactly overlaid the real MEP components (Figure 5.58). Users can always rescan the first and second images to refresh the images’ locations and then click the “Load Model” button to relocate the virtual model based on the newly scanned image locations.

Note that loading the model from two images scanned a longer distance apart tends to overlay the model on reality more accurately. This appears to be an issue with the iOS devices and the ARKit features: when users move from the first image to the second image over a short distance, the devices cannot accurately track the user’s movement or the two images’ locations. When the user walks a longer distance between scans, the devices track the movement more accurately, so the scanned locations become more accurate.

Figure 5.57 Load the virtual model at the right location by clicking the “Load Model” button

Figure 5.58 Ducts and conduits were overlaid on the real building

5.3.4 Filter object

After loading the model, all available building components are loaded together. Some of the virtual components blocked the real environment, so the “Filter Objects” button was used to filter objects, giving users the option to work on specific category objects. Clicking the button pops up the filter window. The walls, floors, roofs, and similar objects were filtered out. The examples compare the result before and after filtering (Figure 5.59); the filtered AR environment made the MEP components much clearer to see in the case study (Figure 5.60).

Figure 5.59 Use “Filter Objects” to select which objects are visible in the AR environment

Figure 5.60 MEP components were clear to see after using the filter function

5.3.5 Select object

Objects in the AR environment can be selected by touching the device (smartphone or iPad) screen. For example, a duct was touched, and the object’s color changed to the default highlight color (yellow) (Figure 5.61). Besides the color change, a “Gizmos” object, which contains arrows showing the object’s longitudinal and cross-sectional directions, was displayed at the touched object’s center. For more detailed explanations, see Chapter 4.2.5.

Figure 5.61 Select a duct in the AR environment

The application’s selection logic shows the highlight color and a Gizmos on the object the user just clicked; the previously clicked object’s color is automatically restored, and its gizmos disappear.
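A minimal sketch of this touch selection is shown below. It is illustrative rather than the actual “ClickObject.cs”, and it assumes each selectable building component carries a Collider.

using UnityEngine;

// Illustrative sketch of touch selection, not the actual "ClickObject.cs".
// Assumes every selectable building component has a Collider.
public class TouchSelector : MonoBehaviour
{
    public Color highlightColor = Color.yellow;

    Renderer previous;    // renderer of the last selected object
    Color previousColor;  // its original color

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
        if (!Physics.Raycast(ray, out RaycastHit hit))
            return;

        // Restore the previously selected object's original color.
        if (previous != null)
            previous.material.color = previousColor;

        previous = hit.transform.GetComponent<Renderer>();
        if (previous != null)
        {
            previousColor = previous.material.color;
            previous.material.color = highlightColor;
        }
    }
}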
The examples show that when a new duct was clicked, the newly selected duct’s color changed and the previous duct’s color changed back (Figure 5.62).

Figure 5.62 When selecting a new object, the previously selected object’s color was changed back to the original

5.3.6 Change the object’s color, hide and unhide objects

The “Highlight Color” button switches the selected object’s color between the highlight color and the object’s original color (Figure 5.63). The “Closed Eye” and “Eye” icons control the visibility of an individually selected object. Sometimes a virtual object blocks other virtual objects behind it. To work with a blocked object, users can first select the front object by touching the screen and press the “Closed Eye” icon to hide it; the objects behind the hidden object then become visible (Figure 5.64). Users can use the “Eye” icon to unhide all hidden objects (Figure 5.64).

Figure 5.63 Use the “Highlight Color” button to switch the object’s color

Figure 5.64 Use the “Eye” and “Closed Eye” icons to unhide and hide objects

5.3.7 View object’s parameter

The “Property Window” shows the parameter information of a selected object (Figure 5.65). Pressing the “Property” button opens or closes the “Property Window”. For example, a duct component was selected on the Watt Hall third floor, and the “Property Window” presented all of the duct’s parameter information (Figure 5.65).

Figure 5.65 View the selected object’s parameter information

Besides viewing the object’s information, users can type notes into the “Mark” and “Comment” rows (Figure 5.66, Figure 5.67). The typed information is uploaded along with the other changed parameter information and ultimately updates the Revit model.

Figure 5.66 Type a comment and mark in the Property Window (1)

Figure 5.67 Type a comment and mark in the Property Window (2)

5.3.8 Change object’s parameter (movement & sizes)

Changing an object’s parameters (BIM metadata) in AR is the most important feature of TransBIM. The tested TransBIM version is a prototype that provides only limited types of editable parameters (objects’ locations, sizes, marks, and comments). Changing an object’s “comments” and “mark” was shown in Section 5.3.7 (Figure 5.66, Figure 5.67). Movement and sizes are the other editable parameters that users can change in the “Change Parameter Window”. For example, a duct and a conduit were selected to test the movement. The duct was moved up 2 feet, moved longitudinally 1 foot, and moved cross-sectionally 1 foot (Figure 5.68, left); the conduit was moved likewise (Figure 5.68, right). The examples also show a duct diameter being changed from 16 inches to 12 inches in the “Change Parameter Window” (Figure 5.70).

Figure 5.68 Move a duct (left); move a conduit (right)

Movement values are entered as decimal characters (Figure 5.69); typing non-decimal characters automatically sets the input to zero.

Figure 5.69 Change the object’s movement by inputting decimal characters

Figure 5.70 Change the duct diameter size

5.3.9 Upload the changed data to Dropbox

After changing some MEP components’ parameter information in the AR environment on site, the final step was uploading the changes to Dropbox. The upload is done by touching the “Upload Data” and “Upload File” buttons. A file called “objectMeta.txt” with all new parameter information was uploaded to Dropbox (Figure 5.71, Figure 5.72). In the case study, one duct and one conduit were changed from their original locations.
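The thesis performs this upload through the “DropboxSync” asset configured in Section 5.2.4. For readers curious about what that step amounts to, the same upload can be expressed directly against the Dropbox HTTP API; the sketch below is illustrative only and assumes nothing beyond the access token and the “objectMeta.txt” file name described above.

using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative sketch of the upload step against the Dropbox HTTP API v2
// ("files/upload"); the thesis itself uses the "DropboxSync" asset instead.
public class DropboxUploader : MonoBehaviour
{
    public string accessToken; // pasted from the Dropbox app settings (Section 5.2.4)

    public IEnumerator Upload(string fileContents)
    {
        byte[] payload = Encoding.UTF8.GetBytes(fileContents);
        var request = new UnityWebRequest("https://content.dropboxapi.com/2/files/upload", "POST");
        request.uploadHandler = new UploadHandlerRaw(payload);
        request.downloadHandler = new DownloadHandlerBuffer();
        request.SetRequestHeader("Authorization", "Bearer " + accessToken);
        request.SetRequestHeader("Content-Type", "application/octet-stream");
        request.SetRequestHeader("Dropbox-API-Arg",
            "{\"path\": \"/objectMeta.txt\", \"mode\": \"overwrite\"}");

        yield return request.SendWebRequest();
        Debug.Log(request.responseCode == 200 ? "Upload finished" : request.error);
    }
}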
Figure 5.71 Click the “Upload Data” button to upload the BIM metadata text file to Dropbox

Figure 5.72 The uploaded text file in Dropbox

5.4 Update Revit model

The uploaded text file “ObjectMeta.txt” was downloaded from Dropbox, and the Watt Hall Revit model was opened in Revit. Then the Dynamo script “Change location multiple.dyn” was opened to read the text file and update the Revit model (Figure 5.73, Figure 5.74). Because the duct was moved in the AR environment, the duct’s location in the Revit model was updated to where the duct had been set in the AR environment (Figure 5.75, Figure 5.76). For details about the Dynamo scripts, see Chapter 4.3.

Figure 5.73 Use “Change location multiple.dyn” (partial) to update the Revit model

Figure 5.74 Change location multiple.dyn

According to the above examples of using TransBIM at Watt Hall, the duct was moved 0.5 feet vertically and 1 foot in both the longitudinal and cross-sectional directions. Also, the duct’s diameter was changed from 16 inches to 12 inches, and comments and a mark were assigned to the duct in TransBIM. The Revit model was therefore changed based on the changes made in TransBIM. The examples compare the Revit model before and after updating (Figure 5.75, Figure 5.76). The duct in Revit was moved from a height of 14’6’’ to 15’0’’ and moved 1 foot in both the longitudinal and cross-sectional directions. The mark and comments were also updated in the Revit model (Figure 5.75, Figure 5.76).

Figure 5.75 The Revit model before updating

Figure 5.76 The Revit model after updating

In summary, updating the Revit model based on the changed BIM metadata was the last step in the case study. The overall workflow of using TransBIM to edit the building information model in the AR environment was realized.
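The update itself is done with the Dynamo script above. For readers more at home in C#, the equivalent Revit API operation, moving an element by the offsets read from the text file, might look like the following sketch; this is illustrative and not part of the thesis workflow, and the element ID and offsets are placeholder values.

using Autodesk.Revit.DB;

// Sketch of the equivalent update via the Revit API instead of Dynamo;
// the element ID and offsets are illustrative values read from "ObjectMeta.txt".
public static class ElementMover
{
    // Offsets are in feet, Revit's internal length unit.
    public static void Move(Document doc, int elementIdValue, double dx, double dy, double dz)
    {
        var id = new ElementId(elementIdValue);
        using (var t = new Transaction(doc, "TransBIM: move element"))
        {
            t.Start();
            ElementTransformUtils.MoveElement(doc, id, new XYZ(dx, dy, dz));
            t.Commit();
        }
    }
}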
5.5 Summary

This chapter discussed how the TransBIM application was used in a case study building (USC Watt Hall) (Figure 5.77). The TransBIM features were discussed (Figure 5.78, Table 5.1), and the basic functionality of TransBIM was shown to work in the case study test. The case study showed how TransBIM provides a way to change the building information model in the AR environment: users can view and edit the building information model in AR. Using TransBIM, the parameter information (locations, sizes, comments, etc.) of the MEP components in Watt Hall was modified in AR and transferred back to the Watt Hall Revit model, and the Revit model was updated with the metadata changed in AR. TransBIM can therefore serve as a prototype for visualizing and editing the building information model in the AR environment.

Figure 5.77 Using TransBIM in Watt Hall

The chapter also covered how users can use the TransBIM scene in Unity to build a TransBIM application for their own buildings, as well as the process of changing the BIM metadata in the AR environment and using the new BIM metadata to update Revit. In summary, the changes made in AR were successfully transferred back to Revit and used to update the Revit model. Chapter 6 discusses the limitations of the tool, research limitations, the conclusion, and future work.

Figure 5.78 TransBIM features

Table 5.1 The summary of TransBIM features

1. Welcome Panel: When opening TransBIM, the screen shows a “Welcome Panel” listing instructions on how to use the application.
2. Scan Images: TransBIM can detect the defined images, get the images’ locations, and generate a virtual sphere at each detected image center.
3. Load Model: The virtual model is displayed on the device screen and merged with the real environment.
4. Filter object: TransBIM provides a “Filter Window” to set the visibility of objects by category, such as walls, floors, and roofs.
5. Select object: Enables users to select an object by touching the screen. The selected object’s color changes to a highlight color (yellow), and a “Gizmos” object, containing arrows that show the object’s longitudinal and cross-sectional directions, is displayed at the touched object’s center.
6. Change the object’s color: Switches the selected object’s color between the highlight color (yellow) and the object’s original color.
7. Hide and unhide objects: Hides or unhides a selected object (sets the object’s visibility to true or false).
8. View object’s parameter: Shows the parameter information of a selected object in the “Property Window”.
9. Change the object’s parameter (movement): Changes the location of a selected object in the AR environment.
10. Upload the changed data to Dropbox: Generates a file “objectMeta.txt” with all new parameter information and uploads the file to Dropbox.

Chapter 6

6. DISCUSSION AND FUTURE WORK

This chapter first reviews the research background and methodology. It then discusses how the workflow performed during the development of TransBIM, together with some limitations of the workflow and the software. The chapter also proposes measures for dealing with the existing limitations and shows the future potential of TransBIM as a prototype to be developed into a more advanced product assisting facility operation and management.

6.1 Background and research methodology

Chapter 1 introduced basic concepts of BIM, FM, and VR/AR/MR. BIM is widely implemented across the AEC/FM industry; its interoperability and data-rich model enhance FM efficiency over the building lifecycle. FM is essential for keeping a building operating normally, and it is time-consuming because FM is responsible for all physical components. Because AR can visualize virtual objects superimposed on the real world, and because AR technology has become increasingly accessible, many industries (AEC/FM and non-AEC/FM) are applying AR to specific tasks. It is therefore worth integrating BIM with AR visualization to improve working efficiency in FM.

Chapter 2 discussed specific research on BIM and FM and on how BIM can contribute to FM tasks. The integration of BIM and AR has powerful potential to help AEC/FM tasks; research on using BIM and AR in construction, architecture, and facility management was reviewed, and some current commercial BIM-based AR products were explained. Overall, one of the most important limitations of current BIM-VR/AR tools is that BIM metadata transfers only one way, from BIM to the AR/VR product. The building information model cannot be edited in the AR environment; changes must always be made in BIM software to update the visualization in VR/AR. In practice, updates to the Revit model in the operating phase are made in the office based on onsite pictures, FM personnel’s comments, and FM system records. There is a delay between when asset information changes onsite and when the changes are applied to the building information model.
Also, there is a spatial gap: the office modeler cannot see the relationship between the model and the building. To build a potentially more efficient tool for facility managers, a BIM-based mobile AR application, TransBIM, was created in which BIM data can be edited and exchanged in two-way communication between the BIM environment and the AR environment. The overall methodology, the designed tool functionality, and the detailed research process were discussed in Chapter 3. The methodology contains three main parts: “BIM Data to AR”, “BIM Data in AR”, and “BIM Data to BIM” (Figure 6.1).

Figure 6.1 Methodology

The software development was explained in Chapter 4. Chapter 5 then showed a case study example of how to develop TransBIM for a user’s building and how to use TransBIM.

6.2 Evaluation of current workflow

The current workflow enables BIM data to be transferred from Revit to TransBIM, modified in TransBIM, and transferred from TransBIM back to Revit. Based on the methodology, the BIM data cycles among Revit, Unity, and TransBIM, realizing two-way communication between the BIM environment and the AR environment (Figure 6.2).

Figure 6.2 Interaction between the user, Revit, and Unity 3D (BIM data circle)

The overall workflow of creating a TransBIM application for a Revit model, using TransBIM, and updating the Revit model is composed of three main parts: “BIM data to AR”, “BIM data in AR”, and “BIM data to BIM” (Figure 6.3).

Figure 6.3 Overall methodology

The first part, “BIM data to AR”, addresses how the needed BIM geometry and BIM metadata are prepared (Figure 6.3). The BIM geometry is the FBX file exported from Revit and then modified in 3ds Max. The BIM metadata is a CSV file generated with a developed Dynamo script. Both the BIM geometry and the BIM metadata are imported into Unity to build the TransBIM project.

The second part, “BIM Data in AR”, addresses how the user can develop a TransBIM application with the BIM data and use the application to change that data (Figure 6.3). The development of TransBIM requires users to use the created “TransBIM” scene, which contains all the assets necessary for building a TransBIM application, such as Unity scripts, the UI canvas, and prefabs. The BIM geometry (FBX file) and BIM metadata (CSV file) are also imported into the scene as assets. Although the scene provides this content, users must still manually set the pivot of their FBX file in Unity to define custom scan image locations for their project; the instructions for setting the model pivot for the scan image locations are in Chapter 5.2. After building TransBIM on mobile devices, TransBIM can be used to visualize and edit the building information model (such as moving a pipe). Finally, the changed BIM data can be uploaded to Dropbox for later updates in Revit.

The third part, “BIM data to BIM”, addresses how the changed BIM data downloaded from Dropbox is used to update the Revit model (Figure 6.3). Dynamo and the developed Dynamo scripts are used again.

Overall, the BIM data is cycled between Revit and TransBIM, and this workflow enables users to edit the building information model on site. The concept of using AR to edit BIM was implemented, and the Watt Hall case study validated the development workflow: the duct and conduit were moved to new locations in the AR environment, and finally the duct and conduit in the Revit model were updated.
6.3 Discussion of results: examination of tool features

This section summarizes the performance of the tool features. TransBIM, as a BIM-based mobile AR application, enables users to visualize and edit their building information model, especially its MEP components, in the AR environment. With TransBIM, users can merge the 1:1 scale building information model with the real building (Figure 6.4), select building components in the AR environment and see their BIM information (Figure 6.4), and filter objects to work only with specific categories. The most important feature is changing the MEP components’ parameter information; the current TransBIM primarily supports changing the objects’ location information (Figure 6.4).

Figure 6.4 Some features of TransBIM

TransBIM’s main features, grouped into three categories, are examined with the case study of Watt Hall (Table 6.1). Most of the features performed satisfactorily, consistently producing the desired outputs, except for some minor issues with the ARKit image tracking.

Table 6.1 The summary of TransBIM features

1. Welcome Panel: When opening TransBIM, the screen shows a “Welcome Panel” listing instructions on how to use the application.
2. Scan Images: TransBIM can detect the defined images, get the images’ locations, and generate a virtual sphere at each detected image center.
3. Load Model: The virtual model is displayed on the device screen and merged with the real environment.
4. Filter object: TransBIM provides a “Filter Window” to set the visibility of objects by category, such as walls, floors, and roofs.
5. Select object: Enables users to select an object by touching the screen. The selected object’s color changes to a highlight color (yellow), and a “Gizmos” object, containing arrows that show the object’s longitudinal and cross-sectional directions, is displayed at the touched object’s center.
6. Change the object’s color: Switches the selected object’s color between the highlight color (yellow) and the object’s original color.
7. Hide and unhide objects: Hides or unhides a selected object (sets the object’s visibility to true or false).
8. View object’s parameter: Shows the parameter information of a selected object in the “Property Window”.
9. Change the object’s parameter (movement): Changes the location of a selected object in the AR environment.
10. Upload the changed data to Dropbox: Generates a file “objectMeta.txt” with all new parameter information and uploads the file to Dropbox.

In addition to using TransBIM to visualize and edit the building information model, Dynamo and the developed Dynamo scripts must be used to complete the process of exchanging BIM metadata between Revit and TransBIM. These Dynamo scripts fall into two parts: preparing the BIM data and updating the BIM model (Table 6.2, Figure 6.5).

Table 6.2 The summary of Dynamo scripts and features

Revit to excel.dyn: Adds each object’s coordinate information into the exported Excel spreadsheet.
Change location multiple.dyn: Updates a BIM object’s location based on the modified metadata.

Figure 6.5 Scripts overview

6.4 Limitations

Limitations of the development methodology and the software are also examined to guide improvements for future versions.
There are three categories of current limitations:
6.4.1 Problems with the workflow
6.4.2 Usability of the current prototype
6.4.3 Building components isolated in TransBIM

6.4.1 Problems with the workflow

Although the current workflow makes it possible to edit Revit information in AR and ultimately update the Revit model, there are many opportunities for improvement in the overall workflow and the developed TransBIM features. The main limitation of the current workflow is the number of manual steps required when users build TransBIM for their own projects. For example, preparing the BIM metadata and updating the Revit model both require Dynamo, which is not helpful for someone who only knows Revit. Users must also know basic Unity operations to complete the required steps. The manual steps are explained below.

First, users must use Dynamo to export the Revit metadata and to update the Revit model. Manually running the Dynamo scripts is not user-friendly: users need to open the scripts and connect a few nodes to realize the necessary functions. For users’ convenience, developing a Revit add-in with the Revit API may be easier, because users would only need to click buttons or select files to achieve the same functions.

Second, in the first part of the workflow, users need to use 3ds Max to convert the FBX materials and reset the pivot point of each entity in the FBX model. These tasks take approximately 10 to 20 minutes, depending on the user’s familiarity with the process. Both the Dynamo and the 3ds Max steps are manual tasks that take the user’s time, and this method of exporting Revit geometry and metadata is inconvenient. Also, the FBX and CSV files exported from Revit are not real-time data: the geometry and metadata used in TransBIM cannot synchronize with the Revit model in real time. Whenever the Revit model changes, the FBX and CSV files must be regenerated and all the processes repeated to rebuild TransBIM.

Third, in the “BIM Data in AR” part, creating the TransBIM application requires some manual operations on the TransBIM scene before users can build TransBIM to visualize the model information. For example, users have to create a prefab for their model and add script components to the building GameObjects (Figure 6.6). Users also need to generate the asset files in Unity: they must run the application in the Unity Editor before building it on their devices, so that the series of asset files (ScriptableObjects) storing each object’s Revit metadata is generated in the TransBIM Resources folder (Figure 6.7). Another manual process is setting the pivot point of the whole model to define where the scanned images will be placed on site. This process cannot be avoided, because individual models differ and there is currently no way to uniformly identify the pivot of each model and the scan image locations of each building.

Figure 6.6 Add script components to all FBX entities

Figure 6.7 Generated asset files that store the object’s metadata

Some of these limitations could be addressed by using Unity Reflect. Unity Reflect provides real-time synchronization from Revit to Unity (Figure 6.8). Users can upload the Revit project to Reflect via the Reflect plug-in in Revit, and the model can be synchronized so that users can visualize the BIM model and metadata in Reflect in real time (Figure 6.9).
However, Reflect only provides one-way synchronization from Revit to Reflect; changes in Reflect cannot be pushed back to Revit. Moreover, the built-in AR feature of Reflect is a tabletop AR application, in which a small-scale model is placed on a surface and cannot merge with the building. Unity Reflect provides a package for Unity Pro licensed users to develop custom real-time BIM applications (Figure 6.10). Using Reflect together with some features of TransBIM could turn TransBIM into a real-time BIM-based AR application: TransBIM could obtain the real-time building information model, edit it on site, and upload the changed BIM data to Dropbox, after which users could download the changed BIM data to update the Revit model.

Figure 6.8 Unity Reflect (retrieved from https://unity.com/products/reflect)

Figure 6.9 Upload the Revit project to Reflect

Figure 6.10 Develop a custom real-time BIM application by using the preview packages in Unity Pro

6.4.2 Usability of the current prototype

Besides the limitations in the current workflow, TransBIM also has some application limitations that remain to be improved. First, although the intent of TransBIM was to modify BIM metadata in the AR environment as a way to help facility management and renovation, the current TransBIM provides only a limited set of editable object parameters; only the location information was modified in the TransBIM prototype.

Second, moving an object in the AR environment may cause a visual illusion. In the AR environment, the virtual objects are always overlaid on the environment captured by the camera, so it is hard to tell whether a component has moved behind a physical object. For example, in another test, a duct was moved upward in AR (vertical movement). After the Revit model was updated, the duct sat above the roof in Revit (Figure 6.11); in the AR environment, however, the virtual roof had been hidden earlier, so moving the duct above the roof went unnoticed.

Figure 6.11 The updated duct in Revit was above the roof

6.4.3 Building components isolated in TransBIM

In Revit, MEP components are usually connected by fittings (Figure 6.12). For these connected MEP components, moving one element can affect the other elements connected to it (Figure 6.13). If an element is moved improperly, for example so that Revit cannot automatically create the fitting for the new location of a moved duct, a connection warning appears (Figure 6.14). There are many other types of connections among different elements in Revit. In TransBIM, however, the building components are isolated.

Figure 6.12 A type of duct fitting

Figure 6.13 When moving a duct, the connected duct moves or stretches automatically

Figure 6.14 Connection warning when moving improperly

The building information model elements in TransBIM are not connected to one another, which is a significant limitation. Users can move an MEP element via TransBIM, but its connections are not moved along with it in the AR environment (Figure 6.15). Moreover, no warning appears if a user moves an MEP element improperly in TransBIM. When the Revit model is updated with the data returned from TransBIM, the locations of the elements edited in TransBIM are updated, and the connected MEP elements move correspondingly (Figure 6.16). A remaining concern is that some elements updated to new locations may trigger warnings, because TransBIM does not check whether a movement is valid.
Figure 6.15 Moving a duct does not move the connections properly

Figure 6.16 The connected ducts and fittings can be moved with the changed duct in Revit

6.5 Future Work

This section introduces possible directions and measures for addressing the existing limitations and expanding the functionality of TransBIM. Future work can be categorized into four parts:
6.5.1 Code improvement
6.5.2 TransBIM user experience improvement
6.5.3 Real-time synchronization
6.5.4 TransBIM visionary development for FM

6.5.1 Code improvement

The TransBIM Unity scripts can be simplified. For example, the developed scripts contain many string-to-float conversions. In some cases, the string holds a numeric value together with characters that carry meaning; the height information is a good example (Figure 6.17). When the height information is “8’-0 9/32””, the script needs to extract the 8 as the feet value and the 9/32 as the inches value and then convert feet and inches into decimal feet. The current extraction uses the split method, splitting the height string by an array of characters ({ '\'', ' ', '-', ' ', '\"' }). However, using a regular expression on the string would be a cleaner way to extract the needed numbers.

Figure 6.17 Extract height information from a string and convert the extracted string to float
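A minimal sketch of the regular-expression approach suggested here is shown below; the class name is an assumption, and the pattern covers the feet-inches-fraction format of the example above.

using System.Globalization;
using System.Text.RegularExpressions;

// Sketch of the cleaner regular-expression extraction suggested above:
// parse a Revit-style height string such as 8'-0 9/32" into decimal feet.
public static class HeightParser
{
    // feet ' - inches [optional numerator/denominator] [optional "]
    static readonly Regex Pattern = new Regex(
        @"(?<feet>\d+)'\s*-\s*(?<inch>\d+)(?:\s+(?<num>\d+)/(?<den>\d+))?""?");

    public static float ToDecimalFeet(string height)
    {
        Match m = Pattern.Match(height);
        if (!m.Success) return 0f;

        float feet = float.Parse(m.Groups["feet"].Value, CultureInfo.InvariantCulture);
        float inches = float.Parse(m.Groups["inch"].Value, CultureInfo.InvariantCulture);
        if (m.Groups["num"].Success)
            inches += float.Parse(m.Groups["num"].Value, CultureInfo.InvariantCulture)
                    / float.Parse(m.Groups["den"].Value, CultureInfo.InvariantCulture);

        return feet + inches / 12f; // e.g., the example string yields roughly 8.023 feet
    }
}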
6.5.2 TransBIM user experience improvement

The user experience improvements focus on making the TransBIM workflow easier and adding more features to TransBIM. Two directions are considered: developing a Revit add-in to replace the Dynamo scripts, and supporting more editable parameters for more object types.

First, a Revit add-in should be developed to replace the Dynamo scripts. Since using the Dynamo scripts requires some familiarity with Dynamo, all of them could be replaced by a Revit add-in built on the Revit API that realizes the same functions. Instead of opening Dynamo and understanding how the scripts work, users would only need to click an add-in button in Revit to export the BIM metadata or to update the Revit model from the returned text file.

Second, TransBIM should be built with more editable parameters for more object types. The developed TransBIM focused on visualizing and editing BIM data for MEP components and walls. The scripts define different ScriptableObject types to save each object’s BIM metadata, and only ducts, conduits, pipes, and walls were defined; for example, the duct parameters are defined in “ductParameter.cs” (Figure 6.18). Therefore, BIM metadata for other categories, such as roofs, floors, lighting fixtures, and equipment, cannot be provided. Moreover, the editable-parameter feature is limited: only a few parameter types (locations, sizes, comments, and mark) can be edited in TransBIM. The sizes of the MEP components and other functional information could be made editable in future TransBIM versions.

Figure 6.18 Define the ScriptableObject for the duct category (partial scripts)

6.5.3 Real-time BIM synchronization

When the Revit model is changed, the current TransBIM must be rebuilt in Unity, which takes time. Future work can extend TransBIM by using Unity Reflect to realize real-time building information model synchronization, saving the manual effort needed to bring model updates into TransBIM. Using Reflect would change the “BIM to AR” workflow: the FBX and CSV files would no longer need to be edited manually, and users would not need 3ds Max, because Unity Reflect can automatically synchronize the BIM geometry and BIM data with the developed BIM application in real time. However, it would still be important to add object coordinate parameters and populate them, because the coordinate information is used in TransBIM to calculate each object’s directions and new locations. One possible technical problem is that Unity Reflect does not reset the pivot of each Revit component; therefore, the script “ReferenceObject.cs”, which controls the movement of objects, would have to be modified so that changing an object’s location in TransBIM still works. The combination of Unity Reflect and TransBIM could enable real-time BIM data synchronization between Revit and a new TransBIM (Figure 6.19). The ideal result is that a user uploads the Revit model in the office, on-site users download the latest model and edit parameters on site, all the changed data is uploaded to Dropbox, and Dynamo or a future Revit add-in updates the Revit model.

Figure 6.19 A proposed methodology

6.5.4 TransBIM visionary development for FM

The case study described in Chapter 5 showed that TransBIM can move MEP components in AR and update the Watt Hall Revit model. However, because only limited parameters can be edited in the current TransBIM, it cannot yet be used for core FM tasks; its one FM-relevant feature is a way to move the BIM model on site. The basic idea of using TransBIM is to quickly update the building information model during facility management. This can be a great opportunity for FM managers when a building asset has been changed or upgraded: instead of recording the changes in the FM management system and updating the building information model later, TransBIM enables FM managers to update the asset on site in the AR environment, and the changes made in AR are then applied to the Revit model.

Some visionary features for future development follow. TransBIM could be made a cloud-based app so that users can download and upload the latest BIM model. FM managers could move MEP elements, change the MEP elements’ sizes, and view asset information. TransBIM could show not only building components’ information but also equipment operating information, which would require TransBIM to integrate with cloud-based FM systems to extract real-time operating data. Moreover, an ideal feature of TransBIM would be letting users draw building information model components on site rather than in Revit. Instead of only changing the existing building information model, TransBIM would provide assets that users can place as virtual elements in the AR environment; for example, a user could delete a lighting fixture and place a new one. These assets would be compatible with Revit (Revit families), so that a later Revit update could import the new components into the model. After TransBIM is developed with more features, a future study can compare TransBIM with commercial FM mobile applications to determine the limitations and benefits of using TransBIM for FM personnel.
6.6 Conclusion

A mobile BIM-based augmented reality application, TransBIM, was created and tested. It is a BIM-based tool because building TransBIM for users’ devices needs two assets: a BIM geometry file and a BIM metadata file. The BIM geometry is an FBX file exported from Revit and then modified in 3ds Max; the BIM metadata file is a CSV file generated with Dynamo and the developed Dynamo scripts. With the TransBIM application, the BIM geometry and element information, specifically for the building’s MEP systems, can be visualized and edited in an AR environment. The changes to the building information model can be saved as a text file and uploaded to Dropbox, and another developed Dynamo script is used to update the Revit model based on those changes.

The functionality and usage of TransBIM were evaluated in the Watt Hall case study. The case study showed that the virtual model of Watt Hall merged with the physical Watt Hall building and that all application features worked (Figure 6.20). The MEP systems’ Revit parameter information was provided, and some parameter information, such as the locations of the ducts and conduits, was edited with TransBIM. Finally, the Watt Hall Revit model was updated using the developed Dynamo script and the text file generated by TransBIM.

Figure 6.20 Using TransBIM on site (Watt Hall)

Overall, the idea of transferring BIM metadata between BIM (Revit) and AR (TransBIM) was implemented. TransBIM allows an FM manager to visualize the existing HVAC system’s BIM asset information and provides an opportunity to directly modify MEP assets on site. By scanning only two images, the whole building information model (the building assets) can be superimposed on the real environment. This AR feature could potentially save FM task time: the facility manager can walk around the building and quickly check a building asset’s information by touching the virtual asset model. Moreover, TransBIM offers a new way to manage and update the Revit model during a building’s operating phase. The idea of using TransBIM to update the Revit model centers on modifying the BIM metadata on site using AR: TransBIM allows on-site FM personnel to modify the BIM asset in AR and quickly update the BIM using the developed Dynamo scripts. TransBIM could be a prototype for developing more sophisticated products that help with model management and model renovation in FM. It could also be integrated with other commercially available Revit-to-AR tools that offer real-time one-way Revit-to-AR synchronization; real-time two-way synchronization of BIM metadata between AR and Revit could better streamline the workflows of FM staff managing the building information model over the project’s lifecycle.

REFERENCES

Arayici, Y., Onyenobi, T., & Egbu, C. (2012). Building Information Modelling (BIM) for Facilities Management (FM): The Mediacity Case Study Approach. International Journal of 3-D Information Modeling. https://doi.org/10.4018/ij3dim.2012010104

ARKit. (2020). Apple ARKit Website. Accessed March 29, 2020. https://developer.apple.com/augmented-reality/arkit/

ARFoundation. (2020). Unity ARFoundation 3.0 Manual. Accessed March 29, 2020. https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@3.0/manual/index.html

BIM Wiki. (2019). "BIM maturity levels". Accessed March 29, 2020. https://www.designingbuildings.co.uk/wiki/BIM_maturity_levels
BIM 360. (2020). "Autodesk BIM 360." Autodesk. Accessed March 29, 2020. https://www.autodesk.com/bim-360/

Becerik-Gerber, B., Jazizadeh, F., Li, N., & Calis, G. (2012). Application areas and data requirements for BIM-enabled facilities management. Journal of Construction Engineering and Management. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000433

BIM FORUM. (2018). “LOD Specification 2018”. Accessed March 6, 2020. https://bimforum.org/wp-content/uploads/2018/09/BIMForum-LOD-2018_Spec-Part-1_and_Guide_2018-09.pdf

Barack, L. (2016). The future of play: virtual and augmented reality hold promise--especially if playtime gets its due. (DESIGNING THE FUTURE). Library Journal, 141(15).

Chi, H. L., Kang, S. C., & Wang, X. (2013). Research trends and opportunities of augmented reality applications in architecture, engineering, and construction. Automation in Construction. https://doi.org/10.1016/j.autcon.2012.12.017

Costin, A., Pradhananga, N., Teizer, J., & Marks, E. (2012). Real-time resource location tracking in Building Information Models (BIM). Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). https://doi.org/10.1007/978-3-642-32609-7_5

Ding, L., Zhou, Y., & Akinci, B. (2014). Building Information Modeling (BIM) application framework: The process of expanding from 3D to computable nD. Automation in Construction. https://doi.org/10.1016/j.autcon.2014.04.009

Dom, B. (2018). History of VR - Timeline of Events and Tech Development.

Dossick, C. S., & Neff, G. (2010). Organizational divisions in BIM-enabled commercial construction. Journal of Construction Engineering and Management. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000109

Du, J., Shi, Y., Mei, C., Quarles, J., & Yan, W. (2016). Communication by Interaction: A Multiplayer VR Environment for Building Walkthroughs. Construction Research Congress 2016: Old and New Construction Technologies Converge in Historic San Juan - Proceedings of the 2016 Construction Research Congress, CRC 2016. https://doi.org/10.1061/9780784479827.227

Du, J., Zou, Z., Shi, Y., & Zhao, D. (2018). Zero latency: Real-time synchronization of BIM data in virtual reality for collaborative decision-making. Automation in Construction. https://doi.org/10.1016/j.autcon.2017.10.009

Engineering Document Management with Meridian. (2018). Accruent. Accessed March 29, 2020. https://www.accruent.com/solutions/engineering-document-management/engineering-document-management-meridian

Eastman, C. (2011). A Guide to Building Information Modeling for Owners, Managers, Designers, Engineers, and Contractors. In John Wiley & Sons, Inc. https://doi.org/10.1093/nq/s7-II.32.110-e

Eberly, D. (2007). 3D game engine design: a practical approach to real-time computer graphics (2nd ed.). Amsterdam, Netherlands: Morgan Kaufmann.

EcoDomus. (2019). EcoDomus. Accessed March 29, 2020. http://ecodomus.com/

Ford, S., Aouad, G., Kirkham, J., Brandon, P., Brown, F., Child, T., … Young, B. (1995). An information engineering approach to modelling building design. Automation in Construction. https://doi.org/10.1016/0926-5805(94)00029-M

Fritsch, D., & Kada, M. (2004). Visualisation using game engines. Archiwum ISPRS, 35, B5.

Ghaffarianhoseini, A., Tookey, J., Ghaffarianhoseini, A., Naismith, N., Azhar, S., Efimova, O., & Raahemifar, K. (2017). Building Information Modelling (BIM) uptake: Clear benefits, understanding its implementation, risks and challenges. Renewable and Sustainable Energy Reviews. https://doi.org/10.1016/j.rser.2016.11.083
Gutiérrez A., M. A., Vexo, F., & Thalmann, D. (2008). Stepping into virtual reality. In Stepping into Virtual Reality. https://doi.org/10.1007/978-1-84800-117-6

HM Government. (2012). Building Information Modelling. Industrial Strategy: Government and Industry in Partnership. https://doi.org/10.1016/j.aei.2007.03.001

Hololens. (2019). Hololens Website. Accessed March 6, 2019. https://www.microsoft.com/en-us/hololens

Holz, T., Campbell, A. G., Ohare, G. M. P., Stafford, J. W., Martin, A., & Dragone, M. (2011). MiRA-mixed reality agents. International Journal of Human Computer Studies. https://doi.org/10.1016/j.ijhcs.2010.10.001

Hosseini, M. R., Roelvink, R., Papadonikolaki, E., Edwards, D. J., & Pärn, E. (2018). Integrating BIM into facility management: Typology matrix of information handover requirements. International Journal of Building Pathology and Adaptation. https://doi.org/10.1108/IJBPA-08-2017-0034

Hou, L., Wang, X., Bernold, L., & Love, P. E. D. (2013). Using animated augmented reality to cognitively guide assembly. Journal of Computing in Civil Engineering. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000184

Hughes, N., Wells, M., Nutter, C., & Zack, J. (2013). Impact and Control of RFIs on Construction Projects. http://www.navigant.com/insights/library/construction/construction-forum/rfis-on-construction-projects.

ISO 41011: Facility Management. (2017). International Organization for Standardization. Accessed March 29, 2020. https://www.iso.org/standard/68167.html

International Facility Management Association. (2019). "What is Facility Management?". Accessed March 29, 2020. https://www.ifma.org/about/what-is-facility-management

Irizarry, J., Gheisari, M., Williams, G., & Walker, B. N. (2013). InfoSPOT: A mobile Augmented Reality method for accessing building information through a situation awareness approach. Automation in Construction. https://doi.org/10.1016/j.autcon.2012.09.002

Kassem, M., Kelly, G., Dawood, N., Serginson, M., & Lockley, S. (2015). BIM in facilities management applications: A case study of a large university complex. Built Environment Project and Asset Management. https://doi.org/10.1108/BEPAM-02-2014-0011

Kensek, K. (2015). BIM guidelines inform facilities management databases: A Case Study over Time. Buildings. https://doi.org/10.3390/buildings5030899

Kensek, K. M. (2014). Building information modeling. In Building Information Modeling. https://doi.org/10.4324/9781315797076

Kim, M. J., Wang, X., Love, P. E. D., Li, H., & Kang, S. C. (2013). Virtual reality for the built environment: A critical review of recent advances. Journal of Information Technology in Construction.

Lee, A., Aouad, G., Cooper, R., Fu, C., Marshall-Ponting, A. J., Tah, J. H. M., & Wu, S. (2005). nD modelling - a driver or enabler for construction improvement? In RICS Research Paper Series, RICS, London.

Li, X., Yi, W., Chi, H., Wang, X., & Chan, A. (2018). A critical review of virtual and augmented reality (VR/AR) applications in construction safety. Automation in Construction, 86, 150–162. https://doi.org/10.1016/j.autcon.2017.11.003

Lin, Y. C., Su, Y. C., & Chen, Y. P. (2014). Developing mobile BIM/2D barcode-based automated facility management system. Scientific World Journal. https://doi.org/10.1155/2014/374735

Liu, R., & Issa, R. R. A. (2013). Issues in BIM for facility management from industry practitioners’ perspectives. Computing in Civil Engineering - Proceedings of the 2013 ASCE International Workshop on Computing in Civil Engineering.

Microsoft Documents. (2020). ARKit document. Accessed March 6, 2020. https://docs.microsoft.com/en-us/dotnet/api/arkit.arsession?view=xamarin-ios-sdk-12
Milgram, P., & Kishino, F. (1994). A Taxonomy of Mixed Reality Visual Displays. In IEICE Transactions on Information Systems.

Mihelj, M., Novak, D., & Beguš, S. (2014). Virtual Reality Technology and Applications. Intelligent Systems, Control and Automation: Science and Engineering. https://doi.org/10.1007/978-94-007-6910-6

Nicał, A. K., & Wodyński, W. (2016). Enhancing Facility Management through BIM 6D. Procedia Engineering. https://doi.org/10.1016/j.proeng.2016.11.623

Pärn, E. A., Edwards, D. J., & Sing, M. C. P. (2017). The building information modelling trajectory in facilities management: A review. Automation in Construction. https://doi.org/10.1016/j.autcon.2016.12.003

Penttilä, H. (2006). Describing the changes in architectural information technology to understand design complexity and free-form architectural expression. Electronic Journal of Information Technology in Construction.

Peters, E., Heijligers, B., De Kievith, J., Razafindrakoto, X., Van Oosterhout, R., Santos, C., … Louwerse, M. (2016). Design for collaboration in mixed reality: Technical challenges and solutions. 2016 8th International Conference on Games and Virtual Worlds for Serious Applications, VS-Games 2016. https://doi.org/10.1109/VS-GAMES.2016.7590343

Porwal, A., & Hewage, K. N. (2013). Building Information Modeling (BIM) partnering framework for public construction projects. Automation in Construction. https://doi.org/10.1016/j.autcon.2012.12.004

Parsanezhad, P., & Dimyadi, J. (2013). Effective Facility Management and Operations via a BIM-based Integrated Information System. https://doi.org/10.13140/2.1.2298.9764

Ratajczak, J., Riedl, M., & Matt, D. T. (2019). BIM-based and AR Application Combined with Location-Based Management System for the Improvement of the Construction Performance. Buildings. https://doi.org/10.3390/buildings9050118

Bille, R., Smith, S. P., Maund, K., & Brewer, G. (2014). Extending building information models into game engines. Proceedings of the 2014 Conference on Interactive Entertainment, ACM, pp. 1–8. http://dx.doi.org/10.1145/2677758.2677764

Ramachandra, C. G., Srinivas, T. R., & Shruthi, T. S. (2012). “A study on development and implementation of a computerized maintenance management information system for a process industry”, International Journal of Engineering and Innovative Technology, Vol. 2 No. 1, pp. 93-98.

Revit IFC Manual. (2018). “About Revit and IFC”. Accessed March 6, 2020. http://help.autodesk.com/view/RVT/2018/DEU/?guid=GUID-6708CFD6-0AD7-461F-ADE8-6527423EC895

Steuer, J. (1992). Defining Virtual Reality: Dimensions Determining Telepresence. Journal of Communication, 42(4), 73–93. https://doi.org/10.1111/j.1460-2466.1992.tb00812.x

Shahin, A., & Ghazifard, A. M. (2013). “Radio Frequency Identification (RFID): a technology for enhancing Computerized Maintenance System (CMMS)”, New Marketing Research Journal, Vol. 3 No. 5, pp. 13-20.

Shin, D. H., & Dunston, P. S. (2008). Identification of application areas for Augmented Reality in industrial construction based on technology suitability. Automation in Construction. https://doi.org/10.1016/j.autcon.2008.02.012

Sherman, W., & Craig, A. (2019). Understanding virtual reality: interface, application, and design (Second edition). Cambridge, Massachusetts: Morgan Kaufmann.
Teicholz, P. (2013). BIM for facility managers. Hoboken, N.J: John Wiley & Sons, Inc.

Tekla. (2019). "BIM maturity levels". Accessed March 6, 2020. https://campus.tekla.com/bim-maturity-levels

Taylor, J. E., & Bernstein, P. G. (2009). Paradigm Trajectories of Building Information Modeling Practice in Project Networks. Journal of Management in Engineering. https://doi.org/10.1061/(asce)0742-597x(2009)25:2(69)

Unity Engine. (2019). “What is a game engine?” Accessed March 29, 2020. https://unity3d.com/what-is-a-game-engine/

Unity Reflect. (2020). Accessed March 29, 2020. https://unity.com/products/reflect

Wang, J., Wang, X., Shou, W., & Xu, B. (2014). Integrating BIM and augmented reality for interactive architectural visualisation. Construction Innovation. https://doi.org/10.1108/CI-03-2014-0019

Wang, X., Kim, M., Love, P., & Kang, S. (2013). Augmented Reality in built environment: Classification and implications for future research. Automation in Construction, 32, 1–13. https://doi.org/10.1016/j.autcon.2012.11.021

Wang, X. (2009). Augmented Reality in Architecture and Design: Potentials and Challenges for Application. International Journal of Architectural Computing. https://doi.org/10.1260/147807709788921985

Wang, X., Love, P. E. D., Kim, M. J., Park, C. S., Sing, C. P., & Hou, L. (2013). A conceptual framework for integrating building information modeling with augmented reality. Automation in Construction. https://doi.org/10.1016/j.autcon.2012.10.012

Wang, Y., Wang, X., Wang, J., Yung, P., & Jun, G. (2013). Engagement of facilities management in design stage through BIM: Framework and a case study. Advances in Civil Engineering. https://doi.org/10.1155/2013/189105

Xiao, F., & Fan, C. (2014). Data mining in building automation system for improving building operational performance. Energy & Buildings, 75, 109–118. https://doi.org/10.1016/j.enbuild.2014.02.005

Yang, Z. (2017). Building information modeling based design review and facility management: Virtual reality workflows and augmented reality experiment for healthcare project [Data set]. https://doi.org/10.25549/USCTHESES-C89-40429
APPENDIX A: ScriptableObject

A.1 ductParameter.cs

/*
 * Created by Zeyu Lu, 01/01/2020
 * Defines the duct parameters in TransBIM.
 */
using UnityEngine;

[CreateAssetMenu(menuName = "ScriptableObjects/ductParameter")]
public class ductParameter : ScriptableObject {
    // Constraints
    public string Horizontal_Justification;   // Center, Left, Right
    public string Vertical_Justification;     // Middle, Bottom, Top
    public string Reference_Level;            // Level 1, 2, 3

    // Revit 2019 elevation (offset)
    public string Offset;                     // offset from the floor level (float)
    public string Start_Offset;               // shown in the UI (float)
    public string End_Offset;                 // shown in the UI (float)

    /* Revit 2020 elevation (offset)
    public string Middle_Elevation;
    public string End_Middle_Elevation;
    public string Start_Middle_Elevation;
    */

    public string Slop;                       // slope

    // Dimensions
    public string Width;                      // changeable (float)
    public string Height;                     // changeable (float)
    public string Size;                       // width x height (string)
    public string Length;                     // cannot be changed (float)

    // Identity data
    public string Mark;
    public string Comments;

    // Mechanical
    public string System_Type;
    public string System_Classification;      // equal to System_Type
    public string System_Name;
    public string System_Abbreviation;        // non-changeable
    public string Bottom_Elevation;           // non-changeable (float)
    public string Top_Elevation;              // non-changeable (float)
    public string Equivalent_Diameter;        // non-changeable (float)
    public bool Size_Lock;
    public string Loss_Coefficient;
    public string Hydraulic_Diameter;
    public string Section;
    public string Area;

    // Identity data
    public string Service_Type;               // null
    public string Image;                      // not used
    public string Design_Option;              // (-1)

    // Type
    public string Element_ID;
    public string Type_Id;
    public string Type;
    public string Type_Name;
    public string Category;
    public string Family;
    public string Family_Name;
    public string Family_and_Type;

    // Phasing
    public string Phase_Demolished;           // a phase option must exist in Revit before a phase can be chosen
    public string Phase_Created;

    // Movement
    public float Longitudinal_Move;
    public float Vertical_Move;
    public float Cross_Sectional_Move;

    // Data
    public float startPoint_x;
    public float startPoint_y;
    public float startPoint_z;
    public float endPoint_x;
    public float endPoint_y;
    public float endPoint_z;

    // More parameter types could be added: Mechanical Flow, Insulation, Lining.
}
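The classes in this appendix are plain data containers handled through the Unity asset pipeline. As a minimal sketch of the round trip (mirroring what ReadCSV.cs in Appendix B.1 does; the element ID "316584" is hypothetical):

// Editor side: create and save an asset for one Revit element (ID is hypothetical).
ductParameter duct = ScriptableObject.CreateInstance<ductParameter>();
duct.Element_ID = "316584";
duct.Category = "Ducts";
AssetDatabase.CreateAsset(duct, "Assets/Resources/metadataScriptableObject/316584.asset");
AssetDatabase.SaveAssets();

// Runtime side: load the same asset back from Resources.
ductParameter loaded = Resources.Load<ductParameter>("metadataScriptableObject/316584");
Debug.Log(loaded.Category);   // "Ducts"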
A.2 conduitParameter.cs

/*
 * Created by Zeyu Lu, 01/01/2020
 * Defines the conduit parameters in TransBIM.
 */
using UnityEngine;

[CreateAssetMenu(menuName = "ScriptableObjects/conduitParameter")]
public class conduitParameter : ScriptableObject {
    // Constraints
    public string Horizontal_Justification;   // Center, Left, Right
    public string Vertical_Justification;     // Middle, Bottom, Top
    public string Reference_Level;            // Level 1, 2, 3
    public string Offset;                     // offset from the floor level (float)
    public string Start_Offset;               // shown in the UI (float)
    public string End_Offset;                 // shown in the UI (float)

    // Electrical
    public string Bottom_Elevation;           // (float)
    public string Top_Elevation;              // (float)

    // Dimensions
    public string Diameter_Trade_Size;        // changeable; Diameter (Trade Size) (float)
    public string Outside_Diameter;           // cannot be changed (float)
    public string Inside_Diameter;            // cannot be changed (float)
    public string Size;                       // cannot be changed (float)
    public string Length;                     // (float)

    // Identity data
    public string Mark;
    public string Comments;
    public string Service_Type;
    public string Image;                      // not used
    public string Design_Option;              // (-1), not used

    // Type
    public string Element_ID;
    public string Type_Id;
    public string Type;
    public string Type_Name;
    public string Category;
    public string Family;
    public string Family_Name;
    public string Family_and_Type;

    // Phasing
    public string Phase_Demolished;           // a phase option must exist in Revit before a phase can be chosen
    public string Phase_Created;

    // Data
    public float startPoint_x;
    public float startPoint_y;
    public float startPoint_z;
    public float endPoint_x;
    public float endPoint_y;
    public float endPoint_z;

    // Movement
    public float Longitudinal_Move;
    public float Vertical_Move;
    public float Cross_Sectional_Move;
}

A.3 wallParameter.cs

/*
 * Created by Zeyu Lu, 01/01/2020
 * Defines the wall parameters in TransBIM.
 */
using UnityEngine;

[CreateAssetMenu(menuName = "ScriptableObjects/wallParameter")]
public class wallParameter : ScriptableObject {
    public string Element_ID;
    public string Type_Id;
    public string Base_Constraint;            // Level 1, Level 2
    public bool Structural;
    public string Location_Line;              // Wall Centerline, Core Centerline, Finish Line
    public string Area;
    public bool Top_is_Attached;
    public string Type_Name;
    public string Base_Offset;                // length
    public string Category;
    public bool Room_Bounding;
    public string Mark;
    public string Comments;
    public string Volume;
    public string Unconnected_Height;         // wall height
    public string Design_Option;              // (-1), not used
    public bool Enable_Analytical_Model;
    public bool Related_to_Mass;
    public string Family;
    public string Family_and_Type;
    public string Structural_Usage;
    public string Top_Extension_Distance;     // length
    public string Length;                     // length
    public string Top_Offset;                 // length
    public string Top_Constraint;             // Unconnected, Up to Level 1, Level 2
    public string Family_Name;
    public string Type;
    public string Image;                      // not used
    public string Base_Extension_Distance;    // length
    public bool Base_is_Attached;
    public string Phase_Demolished;
    public string Phase_Created;

    // Movement
    public float Longitudinal_Move;
    public float Vertical_Move;
    public float Cross_Sectional_Move;

    // Data
    public float startPoint_x;
    public float startPoint_y;
    public float startPoint_z;
    public float endPoint_x;
    public float endPoint_y;
    public float endPoint_z;
}

A.4 otherParameter.cs

/*
 * Created by Zeyu Lu, 01/01/2020
 * Defines the parameters for objects other than ducts, conduits, and walls in TransBIM.
 */
using UnityEngine;

[CreateAssetMenu(menuName = "ScriptableObjects/otherObjectParameter")]
public class otherObjectParameter : ScriptableObject {
    // Data
    public float startPoint_x;
    public float startPoint_y;
    public float startPoint_z;
    public float endPoint_x;
    public float endPoint_y;
    public float endPoint_z;

    // Movement
    public float Longitudinal_Move;
    public float Vertical_Move;
    public float Cross_Sectional_Move;

    public string Element_ID;
    public string Category;
    public string Mark;
    public string Comments;
}
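The AssignParameter helper called throughout Appendix B (for example, AssignParameter.DictToDuctAsset) is not reproduced in this appendix. Because every CSV key matches a field name on the classes above once spaces are replaced with underscores, a minimal reflection-based sketch of that mapping could look like the following (the field-matching rule is an assumption; the actual helper may assign each field explicitly):

using System.Collections.Generic;
using UnityEngine;

public static class AssignParameterSketch {
    // Copies matching "key : value" pairs from the parsed CSV dictionary
    // onto the public string fields of a parameter asset.
    public static void DictToAsset(Dictionary<string, string> dict, ScriptableObject asset) {
        foreach (KeyValuePair<string, string> pair in dict) {
            string fieldName = pair.Key.Replace(" ", "_");   // e.g. "Reference Level" -> "Reference_Level"
            var field = asset.GetType().GetField(fieldName);
            if (field != null && field.FieldType == typeof(string)) {
                field.SetValue(asset, pair.Value);
            }
        }
    }
}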
APPENDIX B: Main Unity Scripts (Unity MonoBehaviour Classes)

B.1 ReadCSV.cs

/*
 * Created by Zeyu Lu, 01/01/2020
 * Reads the CSV file and creates asset files in the Unity Resources/metadataScriptableObject folder.
 */
using UnityEngine;
using System.Collections.Generic;
#if UNITY_EDITOR
using UnityEditor;   // editor-only; guarded so the class still compiles in device builds
#endif

public class ReadCSV : MonoBehaviour {
    public string assetPathAndName;
    public string loadassetPath;
    public string ObjectID;
    public string[] parameterArray;   // stores this object's parameter strings
    public Dictionary<string, string> paraDict = new Dictionary<string, string>();
    public ScriptableObject object_SO;   // reference to the object's ScriptableObject (asset)
    public string Category;

    static string[] data;
    static List<string> tagList;

    void ReadCSVFile() {
        Debug.Log("processing data file");
        var textFile = Resources.Load<TextAsset>("Object Parameters");   // the CSV file name
        // Collapse the doubled quote characters exported from Dynamo
        string csvData = textFile.text.Replace("\"\"", "\"").Replace("\"\"", "\"").Replace("\" ", "\' ");
        data = csvData.Split(new char[] { '\n' });
        Debug.Log("All data:" + data);
    }

    void Awake() {
        // Extract the ObjectID between '[' and ']' in the object's name
        string ObjectName = gameObject.name;
        try {
            ObjectID = ObjectName.Substring(ObjectName.IndexOf('[') + 1, ObjectName.IndexOf(']') - ObjectName.IndexOf('[') - 1);
            Debug.Log("ObjectID " + ObjectID);
        } catch {
            Debug.Log("Can't find ID");
        }

#if UNITY_EDITOR
        // Read the CSV file (only once for all objects)
        if (data == null) {
            ReadCSVFile();
        }

        // Find this object's row in the CSV data
        for (int i = 0; i < data.Length - 1; i++) {
            string[] row = data[i].Split(new char[] { ',' });
            string ElementID = row[0].Substring(row[0].IndexOf(':') + 2);
            Debug.Log("ElementID" + ElementID);
            if (ElementID == ObjectID) {
                parameterArray = row;
                Debug.Log("ElementID:" + ElementID);
                break;
            }
        }

        // Build the parameter dictionary
        foreach (string element2 in parameterArray) {
            if (element2.Length > 1) {
                // If the separator does not exist, Split returns a one-element array
                string[] paraPair = element2.Split(new char[] { ':' }, 2);
                // Change parameter names such as "Diameter (Trade Size)" to "Diameter Trade Size";
                // some parameters start with an extra '"' character
                string paraKey = paraPair[0].Replace("(", " ").Replace(")", " ").TrimEnd(' ').TrimStart('"');
                Debug.Log("DictKey" + paraKey);
                string paraValue;
                try {
                    paraValue = paraPair[1].Substring(1);
                    Debug.Log("DictValue" + paraValue);
                } catch {
                    paraValue = "";
                }
                if (paraDict.ContainsKey(paraKey) == false) {   // skip keys that already exist
                    paraDict.Add(paraKey, paraValue);
                }
            }
        }

        if (paraDict.ContainsKey("Category")) {
            // Create a ScriptableObject based on the category type and assign the dictionary values to the asset file
            CreateAssetandAssignDictValue();
            // Add the tag to the Unity project and to this object
            AddTag.CreateTag(paraDict["Category"]);
            gameObject.tag = paraDict["Category"];
        }
#endif

        // Load the asset file in the mobile application
        loadassetPath = "metadataScriptableObject/" + ObjectID;
        object_SO = Resources.Load<ScriptableObject>(loadassetPath);
        if (object_SO != null) {
            FindCategory(object_SO);
        }
        if (!string.IsNullOrEmpty(Category)) {
            gameObject.tag = Category;
        }
    }

    // Creates a ScriptableObject based on the category type and assigns the dictionary values to the asset file
    void CreateAssetandAssignDictValue() {
#if UNITY_EDITOR
        assetPathAndName = "Assets/Resources/metadataScriptableObject/" + ObjectID + ".asset";
        switch (paraDict["Category"]) {
            case "Ducts":
                ductParameter ductAsset = ScriptableObject.CreateInstance<ductParameter>();
                AssignParameter.DictToDuctAsset(paraDict, ductAsset);
                if (ObjectID != null) {
                    Debug.Log("Create new duct asset with this element ID");
                    AssetDatabase.CreateAsset(ductAsset, assetPathAndName);
                    AssetDatabase.SaveAssets();
                }
                break;
            case "Walls":
                wallParameter wallAsset = ScriptableObject.CreateInstance<wallParameter>();
                AssignParameter.DictToWallAsset(paraDict, wallAsset);
                if (ObjectID != null) {
                    AssetDatabase.CreateAsset(wallAsset, assetPathAndName);
                    AssetDatabase.SaveAssets();
                }
                break;
            case "Conduits":
                conduitParameter conduitAsset = ScriptableObject.CreateInstance<conduitParameter>();
                AssignParameter.DictToConduitAsset(paraDict, conduitAsset);
                if (ObjectID != null) {
                    AssetDatabase.CreateAsset(conduitAsset, assetPathAndName);
                    AssetDatabase.SaveAssets();
                }
                break;
            default:   // when the category is none of the cases above
                otherObjectParameter otherObjectAsset = ScriptableObject.CreateInstance<otherObjectParameter>();
                AssignParameter.DictToOtherObjectAsset(paraDict, otherObjectAsset);
                if (ObjectID != null) {
                    AssetDatabase.CreateAsset(otherObjectAsset, assetPathAndName);
                    AssetDatabase.SaveAssets();
                }
                break;
        }
#endif
    }

    void FindCategory(ScriptableObject ObjectSO) {
        wallParameter theWallSO = ObjectSO as wallParameter;
        if (theWallSO != null) {
            print("wall found for obj " + gameObject.name);
            Category = theWallSO.Category;
            return;
        }
        ductParameter theductSO = ObjectSO as ductParameter;
        if (theductSO != null) {
            print("duct found for obj " + gameObject.name);
            Category = theductSO.Category;
            return;
        }
        conduitParameter theconduitSO = ObjectSO as conduitParameter;
        if (theconduitSO != null) {
            print("conduit found for obj " + gameObject.name);
            Category = theconduitSO.Category;
            return;
        }
        otherObjectParameter theotherObjectSO = ObjectSO as otherObjectParameter;
        if (theotherObjectSO != null) {
            print("other object found for obj " + gameObject.name);
            Category = theotherObjectSO.Category;
            return;
        }
    }

    void OnEnable() {
        Debug.Log("PrintOnEnable: script was enabled");
    }
}
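ReadCSV.cs expects one comma-separated row per element in "Object Parameters.csv", where every cell is a "key : value" pair and the first cell carries the element ID. A duct row might look like this (illustrative only; the exact columns depend on the Dynamo export):

Element ID : 316584,Category : Ducts,Family : Rectangular Duct,Reference Level : Level 1,Offset : 8' - 0",Width : 12,Height : 10

Awake() splits the row on commas, splits each cell on the first colon, and stores the pairs in paraDict; CreateAssetandAssignDictValue() then writes them into the matching asset type.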
B.2 ClickObject.cs

using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

/// <summary>
/// ClickObject makes each object selectable. After an object is selected it
/// triggers the property window and the "change parameter" window to show the
/// object's relevant information, and it stores the object's original and
/// changed transform information in two dictionaries on this instance.
/// </summary>
public class ClickObject : MonoBehaviour {
    public Color ChangeColor = new Color(255f, 255f, 0f, 255f);   // highlight color; clamps to (1, 1, 0), i.e. yellow
    public Hashtable ht_colors = new Hashtable();
    public string thisObjectName;
    private GameObject parent;
    public GameObject GizmosOb;
    public GameObject GizmosInstance;
    public double angle;
    public double radians;

    static InventoryControl propertyUI;
    static GameObject ControlUI;
    static GameObject ControlUIcontent;
    static GameObject InvisibleButton;
    static GameObject VisibleButton;

    [SerializeField] private GameObject textRow;
    [SerializeField] private GameObject optionRow;
    [SerializeField] private GameObject checkRow;
    [SerializeField] private GameObject inputRow;

    // UI components (textRow, optionRow, checkRow, inputRow) shown in the property window
    public GameObject[] UIlist;
    public Dictionary<string, string> changeParaDict;
    // The object's original transform (x, y, z) plus its start and end points (9 floats)
    public Dictionary<string, float> originTsDict;
    //public GameObject[] ChangeParalist;
    void Awake() {   // Start() would run too late for this initialization
        thisObjectName = gameObject.name;
        // Find the InventoryControl instance and take the row prefabs from it (runs only once)
        if (propertyUI is null) {
            InventoryControl[] invControlInstances = FindObjectsOfType<InventoryControl>();
            foreach (InventoryControl ic in invControlInstances) {
                if (ic.gameObject.name == "InventoryMenu") {
                    propertyUI = ic;
                    Debug.Log("FindInventoryControl for property UI: " + propertyUI.name);
                }
            }
        }
        Debug.Log("FindInventoryControl " + propertyUI.name);
        textRow = propertyUI.textRow;
        optionRow = propertyUI.optionRow;
        checkRow = propertyUI.checkRow;
        inputRow = propertyUI.inputRow;

        // Assign the "ChangeParameter" UI component to a GameObject variable (runs only once)
        if (ControlUI is null) {
            ControlUI = GameObject.Find("ChangeParameter");
            ControlUIcontent = ControlUI.transform.GetChild(0).GetChild(0).gameObject;
        }

        // Find the parent of the object and give it a ReferenceObject component (runs only once)
        parent = gameObject.transform.parent.gameObject;
        if (parent.GetComponent<ReferenceObject>() == null) {
            Debug.Log("create parent");
            parent.AddComponent<ReferenceObject>();
            parent.GetComponent<ReferenceObject>().ControlUI = ControlUI;
            parent.GetComponent<ReferenceObject>().ControlUIcontent = ControlUIcontent;
            parent.GetComponent<ReferenceObject>().propertyUI = propertyUI;
        }

        // Find the references to the Invisible and Visible buttons
        InvisibleButton = GameObject.Find("Invisible");
        VisibleButton = GameObject.Find("Visible");

        // Store the material color information in a Hashtable
        for (int i = 0; i < GetComponent<Renderer>().materials.Length; i++) {
            ht_colors.Add(i, GetComponent<Renderer>().materials[i].color);
        }
    }

    // True when the pointer is over a UI element
    private bool IsPointerOverUIObject() {
        PointerEventData eventDataCurrentPosition = new PointerEventData(EventSystem.current);
        eventDataCurrentPosition.position = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventDataCurrentPosition, results);
        return results.Count > 0;
    }
    // OnMouseDrag is called when the user has clicked on a Collider and is still holding down the mouse
    void OnMouseDrag() {
        if (!IsPointerOverUIObject()) {   // ignore clicks that land on the UI
            parent.GetComponent<ReferenceObject>().currentObject = gameObject;
            // Runs when a new object is clicked for the first time
            if (parent.GetComponent<ReferenceObject>().previousObject != parent.GetComponent<ReferenceObject>().currentObject) {
                propertyUI.DisactiveUI();   // deactivate the property UI right away
                if (ControlUIcontent != null) {
                    Transform[] ts = new Transform[ControlUIcontent.transform.childCount];
                    ControlUIcontent.transform.GetChild(0).GetComponent<Text>().text = gameObject.name;
                    for (int i = 1; i < ts.Length; i++) {   // deactivate the children from the second component on
                        ControlUIcontent.transform.GetChild(i).gameObject.SetActive(false);
                    }
                }
                // Apply the highlight color to the selected object
                Debug.Log("Change color");
                Debug.Log(ht_colors);
                for (int i = 0; i < parent.GetComponent<ReferenceObject>().currentObject.GetComponent<Renderer>().materials.Length; i++) {
                    parent.GetComponent<ReferenceObject>().currentObject.GetComponent<Renderer>().materials[i].color = ChangeColor;
                }
                string[] tagsArray = { "Walls", "Ducts", "Conduits", "Pipes" };
                List<string> tagsList = new List<string>(tagsArray);
                // When the GameObject has an asset file (walls, ducts, pipes, conduits)
                if (GetComponent<ReadCSV>().object_SO && tagsList.Contains(gameObject.tag)) {
                    // Store the data in changeParaDict and originTsDict
                    if (changeParaDict is null) {
                        AddDataToParaDict(gameObject);
                    }
                    // Generate a new gizmo for the clicked object
                    Debug.Log("CreateGizmos");
                    CreateGizmos(gameObject, originTsDict);
                    // Show the editable parameters in the "change parameter" window
                    Debug.Log("Show UI");
                    UpdateChangeParaUI(changeParaDict, gameObject);
                    ScriptableObject tempSO = GetComponent<ReadCSV>().object_SO;
                    // Show all of the object's information in the property window
                    Debug.Log("Inventory Control");
                    UpdatePropertyWindow(tempSO);
                    // Activate the Invisible button and deactivate the Visible button
                    VisibleButton.SetActive(false);
                    InvisibleButton.SetActive(true);
                }
                // When the GameObject has no asset file or is not a wall, duct, pipe, or conduit
                else {
                    GameObject theText = propertyUI.transform.GetChild(0).GetChild(0).GetChild(0).gameObject;
                    theText.SetActive(true);
                    theText.GetComponent<Text>().text = "No information for this category";
                    theText.GetComponent<Text>().fontSize = 45;
                }
                // If there was a previously clicked object, restore its color and destroy its gizmo
                if (parent.GetComponent<ReferenceObject>().previousObject != null) {
                    Debug.Log("Change previous color back");
                    ClickObject previousClick = parent.GetComponent<ReferenceObject>().previousObject.GetComponent<ClickObject>();
                    Hashtable previousColors = previousClick.ht_colors;
                    // Destroy the gizmo of the previously selected object
                    DestroyImmediate(previousClick.GizmosInstance);
                    // Change the previous object's material colors back to the originals
                    for (int i = 0; i < parent.GetComponent<ReferenceObject>().previousObject.GetComponent<Renderer>().materials.Length; i++) {
                        Color oColor = (Color)previousColors[i];
                        Debug.Log("Material " + (i + 1) + " color is: " + oColor);
                        parent.GetComponent<ReferenceObject>().previousObject.GetComponent<Renderer>().materials[i].color = oColor;
                    }
                }
                // No previously selected object: the user has just chosen the first object in the scene,
                // so pass it to the "recordingObject" variable in ReferenceObject (executes only once)
                else {
                    parent.GetComponent<ReferenceObject>().recordingObject = parent.GetComponent<ReferenceObject>().currentObject;
                }
                parent.GetComponent<ReferenceObject>().previousObject = parent.GetComponent<ReferenceObject>().currentObject;
                parent.GetComponent<ReferenceObject>().swichOn = true;
            }
        }
    }

    public void UpdatePropertyWindow(ScriptableObject tempSO) {
        int i = 0;
        // Number of fields (parameters) on this ScriptableObject type
        int rowNumber = tempSO.GetType().GetFields().Length;
        UIlist = new GameObject[rowNumber];
        Debug.Log("Total parameters number:" + UIlist.Length);
        Debug.Log("tempSO type: " + tempSO.GetType().ToString());
        foreach (var property in tempSO.GetType().GetFields()) {
            Debug.Log("var name" + property.ToString());
            var parameterObject = property.GetValue(tempSO);
            // Note: some fields have no assigned value, so UIlist may contain null entries
            if (parameterObject != null) {
                var varType = parameterObject.GetType();
                string varValue = parameterObject.ToString();
                Debug.Log("Parameter Name: " + property.Name + ", Value: " + parameterObject);   // e.g. Name: Element_ID, Value: 316584
                Debug.Log("Parameter Type: " + varType);   // e.g. Type: System.String
                string varTypestr = varType.ToString();
                // Depending on the field type, create the UI component that shows the field name and value
                switch (varTypestr) {
                    case "System.String":
                        if (property.Name == "Mark" || property.Name == "Comments") {
                            UIlist[i] = Instantiate(inputRow);
                            UIlist[i].transform.GetChild(0).GetChild(1).GetComponent<Text>().text = property.Name.Replace("_", " ");
                            UIlist[i].transform.GetChild(1).GetChild(1).GetComponent<InputField>().text = varValue;
                            Debug.Log("UIlist input text " + UIlist[i].transform.GetChild(0).GetChild(1).GetComponent<Text>().gameObject);
                        } else {
                            UIlist[i] = Instantiate(textRow);
                            UIlist[i].transform.GetChild(0).GetChild(1).GetComponent<Text>().text = property.Name.Replace("_", " ");
                            UIlist[i].transform.GetChild(1).GetChild(1).GetComponent<Text>().text = varValue;
                        }
                        break;
                    case "System.Int32":
                        UIlist[i] = Instantiate(textRow);
                        UIlist[i].transform.GetChild(0).GetChild(1).GetComponent<Text>().text = property.Name.Replace("_", " ");
                        UIlist[i].transform.GetChild(1).GetChild(1).GetComponent<Text>().text = varValue;
                        break;
                    case "System.Single":
                        UIlist[i] = Instantiate(textRow);
                        UIlist[i].transform.GetChild(0).GetChild(1).GetComponent<Text>().text = property.Name.Replace("_", " ");
                        UIlist[i].transform.GetChild(1).GetChild(1).GetComponent<Text>().text = varValue;
                        break;
                    case "System.Boolean":
                        UIlist[i] = Instantiate(checkRow);
                        UIlist[i].transform.GetChild(0).GetChild(1).GetComponent<Text>().text = property.Name.Replace("_", " ");
                        if (varValue != "False")
                            UIlist[i].transform.GetChild(1).GetChild(1).GetComponent<Toggle>().isOn = true;
                        else
                            UIlist[i].transform.GetChild(1).GetChild(1).GetComponent<Toggle>().isOn = false;
                        break;
                }
                UIlist[i].transform.SetParent(propertyUI.gridGroup.transform, false);
                i++;
            }
        }
    }

    // Collects the parameters that can be changed
    void AddDataToParaDict(GameObject controlledObject) {
        string controlledElementID = controlledObject.GetComponent<ReadCSV>().ObjectID;
        ScriptableObject object_SO = GetComponent<ReadCSV>().object_SO;
        changeParaDict = new Dictionary<string, string>();
        originTsDict = new Dictionary<string, float>();
originTsDict.Add("x_local", transform.localPosition.x); originTsDict.Add("y_local", transform.localPosition.y); originTsDict.Add("z_local", transform.localPosition.z); changeParaDict.Add("new_startP_x", ""); changeParaDict.Add("new_startP_y", ""); changeParaDict.Add("new_startP_z", ""); changeParaDict.Add("new_endP_x", ""); changeParaDict.Add("new_endP_y", ""); changeParaDict.Add("new_endP_z", ""); switch (controlledObject.tag) { case "Ducts": ductParameter controlledDuctAsset = object_SO as ductParameter; // add original cooridnate to originTsDict originTsDict.Add("startPoint_x", controlledDuctAsset.startPoint_x); originTsDict.Add("startPoint_y", controlledDuctAsset.startPoint_y); originTsDict.Add("startPoint_z", controlledDuctAsset.startPoint_z); originTsDict.Add("endPoint_x", controlledDuctAsset.endPoint_x); originTsDict.Add("endPoint_y", controlledDuctAsset.endPoint_y); originTsDict.Add("endPoint_z", controlledDuctAsset.endPoint_z); // add original cooridnate to originTsDict changeParaDict.Add("Offset", controlledDuctAsset.Offset); changeParaDict.Add("Longitudinal Move", controlledDuctAsset.Longitudinal_Move.ToString()); changeParaDict.Add("Cross Sectional Move", controlledDuctAsset.Cross_Sectional_Move.ToString()); changeParaDict.Add("Vertical Move", controlledDuctAsset.Vertical_Move.ToString()); if (controlledDuctAsset.Family == "Rectangular Duct") { Debug.Log("Rectangular Duct OBJECT"); changeParaDict.Add("Width", controlledDuctAsset.Width); changeParaDict.Add("Height", controlledDuctAsset.Height); } if (controlledDuctAsset.Family == "Round Duct") // define what category and what shape { Debug.Log("Round Duct OBJECT"); changeParaDict.Add("Size", controlledDuctAsset.Size); //SIZE OR DIAMETER_TRADE_SIZE } break; case "Conduits": conduitParameter controlledConduitAsset = object_SO as conduitParameter; // add original cooridnate to originTsDict originTsDict.Add("startPoint_x", controlledConduitAsset.startPoint_x); originTsDict.Add("startPoint_y", controlledConduitAsset.startPoint_y); originTsDict.Add("startPoint_z", controlledConduitAsset.startPoint_z); originTsDict.Add("endPoint_x", controlledConduitAsset.endPoint_x); originTsDict.Add("endPoint_y", controlledConduitAsset.endPoint_y); originTsDict.Add("endPoint_z", controlledConduitAsset.endPoint_z); //add changed parameter data to dictioanry changeParaDict.Add("Offset", controlledConduitAsset.Offset); changeParaDict.Add("Diameter_Trade_Size", controlledConduitAsset.Diameter_Trade_Size); 213 changeParaDict.Add("Longitudinal Move", controlledConduitAsset.Longitudinal_Move.ToString()); changeParaDict.Add("Cross Sectional Move", controlledConduitAsset.Cross_Sectional_Move.ToString()); changeParaDict.Add("Vertical Move", controlledConduitAsset.Vertical_Move.ToString()); break; case "Walls": wallParameter controlledWallAsset = object_SO as wallParameter; // add original cooridnate to originTsDict originTsDict.Add("startPoint_x", controlledWallAsset.startPoint_x); originTsDict.Add("startPoint_y", controlledWallAsset.startPoint_y); originTsDict.Add("startPoint_z", controlledWallAsset.startPoint_z); originTsDict.Add("endPoint_x", controlledWallAsset.endPoint_x); originTsDict.Add("endPoint_y", controlledWallAsset.endPoint_y); originTsDict.Add("endPoint_z", controlledWallAsset.endPoint_z); //add changed parameter data to dictioanry changeParaDict.Add("Longitudinal Move", controlledWallAsset.Longitudinal_Move.ToString()); changeParaDict.Add("Cross Sectional Move", controlledWallAsset.Cross_Sectional_Move.ToString()); changeParaDict.Add("Vertical 
Move", controlledWallAsset.Vertical_Move.ToString()); break; case "Pipes": pipeParameter controlledPipeAsset = object_SO as pipeParameter; break; default: otherObjectParameter otherObjectAsset = object_SO as otherObjectParameter; break; } } void UpdateChangeParaUI(Dictionary<string,string> paraDict, GameObject controlledObject) { if (ControlUIcontent != null) { Transform[] ts = new Transform[ControlUIcontent.transform.childCount]; for (int i = 0; i < ts.Length; i++) { GameObject paraUI = ControlUIcontent.transform.GetChild(i).gameObject; if (paraUI.name == "Text") { paraUI.GetComponent<Text>().text = controlledObject.name; paraUI.GetComponent<Text>().fontSize = 40; } //the text field show the clicked object name if (paraDict.ContainsKey(paraUI.name)) { Debug.Log("paradict value" + paraUI.name + ": "+paraDict[paraUI.name]); if (paraUI.name == "Cross Sectional Move" || paraUI.name == "Longitudinal Move" || paraUI.name == "Vertical Move" || paraUI.name == "Offset") // for input component { paraUI.gameObject.SetActive(true); //set component active paraUI.transform.GetChild(1).GetChild(1).GetComponent<InputField>().text = ""; // As to show new gameobject, clear the text of previous gameobject' parameter Debug.Log("again" + paraUI.transform.GetChild(1).GetChild(1).GetComponent<InputField>().text); Debug.Log("sd33" + paraDict[paraUI.name]); if (paraDict[paraUI.name] == "0") { //paraUI.transform.GetChild(1).GetChild(1).transform.Find("Placeholder").GetComponent<Text>().text = "Please enter a (+/-) number"; paraUI.transform.GetChild(1).GetChild(1).GetComponent<InputField>().text = ""; } else // e.g. (offset is 8' - 0'') { paraUI.transform.GetChild(1).GetChild(1).GetComponent<InputField>().text = paraDict[paraUI.name]; } } if (paraUI.name == "Width" || paraUI.name == "Height") // for optionRow component { paraUI.gameObject.SetActive(true); //set component active Dropdown m_Dropdown = paraUI.transform.GetChild(1).GetChild(1).gameObject.GetComponent<Dropdown>(); m_Dropdown.ClearOptions(); List<string> m_Messages = new List<string>(); for (int inches=1; inches < 31; inches++) { 214 string m_NewData = inches.ToString() + "\'\'"; m_Messages.Add(m_NewData); } m_Messages.Add(paraDict[paraUI.name]); m_Dropdown.AddOptions(m_Messages); m_Dropdown.value = m_Dropdown.options.Count - 1; } if (paraUI.name == "Size") // for optionRow component and ducts { paraUI.gameObject.SetActive(true); //set component active Debug.Log("sisisis"); Dropdown m_Dropdown = paraUI.transform.GetChild(1).GetChild(1).gameObject.GetComponent<Dropdown>(); m_Dropdown.ClearOptions(); List<string> m_Messages = new List<string>(); for (int inches = 10; inches < 41; inches++) { string m_NewData = inches.ToString() + '\"'; m_Messages.Add(m_NewData); } m_Messages.Add(paraDict[paraUI.name]); m_Dropdown.AddOptions(m_Messages); m_Dropdown.value = m_Dropdown.options.Count - 1; } if (paraUI.name == "Diameter_Trade_Size") // for optionRow component and conduit { paraUI.gameObject.SetActive(true); //set component active Debug.Log("sisisis2"); Dropdown m_Dropdown = paraUI.transform.GetChild(1).GetChild(1).gameObject.GetComponent<Dropdown>(); m_Dropdown.ClearOptions(); List<string> m_Messages = new List<string>(); for (float inches = 0.25f; inches < 20.25f; inches+=0.25f) { string m_NewData = inches.ToString() + '\"'; m_Messages.Add(m_NewData); } m_Messages.Add(paraDict[paraUI.name]); m_Dropdown.AddOptions(m_Messages); m_Dropdown.value = m_Dropdown.options.Count - 1; } } } } } // Creates an instance of the prefab at the current spawn point. 
    // Creates an instance of the gizmo prefab at the selected object's position
    void CreateGizmos(GameObject controlledObject, Dictionary<string, float> CoordinateDict) {
        GizmosOb = propertyUI.GiszomsOb;
        GizmosInstance = Instantiate(GizmosOb, new Vector3(0, 0, 0), Quaternion.identity);
        GizmosInstance.transform.SetParent(parent.transform, false);
        GizmosInstance.transform.localPosition = controlledObject.transform.localPosition;
        // Rotate the gizmo to the element's plan angle (arctangent of the slope
        // between the start point and the end point)
        float X_Difference = CoordinateDict["endPoint_x"] - CoordinateDict["startPoint_x"];
        float Y_Difference = CoordinateDict["endPoint_y"] - CoordinateDict["startPoint_y"];
        float tangentResult = Y_Difference / X_Difference;
        radians = Math.Atan(tangentResult);
        angle = radians * (180 / Math.PI);
        float z_rotation = (float)angle;
        Debug.Log("angle: " + angle);
        GizmosInstance.transform.localRotation = Quaternion.Euler(0, 0, z_rotation);
        GizmosInstance.transform.localScale = new Vector3(1f, 1f, 1f);
        GizmosInstance.transform.SetParent(gameObject.transform);
    }
}
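CreateGizmos() orients the movement gizmo along the element's axis by taking the arctangent of the plan slope between the Revit start and end points. For example, for a duct running from (0, 0) to (10, 10) in plan:

ΔX = 10 − 0 = 10,  ΔY = 10 − 0 = 10
tan θ = ΔY / ΔX = 1  →  θ = atan(1) = 0.7854 rad = 45°

so the gizmo is rotated 45° about its local Z axis, and the longitudinal and cross-sectional move directions follow the duct instead of the world axes.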
B.3 ReferenceObject.cs

using System;
using System.Collections;
using System.Collections.Generic;
using System.Text.RegularExpressions;
using UnityEngine;
using UnityEngine.UI;

public class ReferenceObject : MonoBehaviour {
    public GameObject previousObject;
    public GameObject currentObject;
    public GameObject recordingObject;
    public bool swichOn = true;
    public Dictionary<string, string[]> uploadDict = new Dictionary<string, string[]>();
    //Dictionary<string, float> ChangeParaDict = new Dictionary<string, float>();
    // Keyed by element name, holding each object's original transform data
    private Dictionary<string, Dictionary<string, float>> AllObDict = new Dictionary<string, Dictionary<string, float>>();
    public InventoryControl propertyUI;
    public GameObject ControlUI;
    public GameObject ControlUIcontent;
    public GameObject Cross_Sectional_Move;
    public GameObject Longitudinal_Move;
    public GameObject Vertical_Move;
    public GameObject DuctSize;
    public GameObject ConduitSize;
    public GameObject Height;
    public InputField CrInput;
    public InputField LgInput;
    public InputField VtInput;
    public InputField HeightInput;
    public Dropdown ductSizeInput;
    public Dropdown conduitSizeInput;
    public Button CrossSectionalButton;
    public Button LongitudinalButton;
    public Button VerticalButton;
    private bool CSbuttonSwitch = false;
    private bool LGbuttonSwitch = false;
    private bool VTbuttonSwitch = false;
    private string previousHeightValue = "";

    void Awake() {
        AddTag.CreateTag("Project");
        gameObject.tag = "Project";
        ControlUI = GameObject.Find("ChangeParameter");
        ControlUIcontent = ControlUI.transform.GetChild(1).gameObject;
        // Collect the UI references by name
        Transform[] ts = ControlUI.GetComponentsInChildren<Transform>(true);
        foreach (Transform elementTs in ts) {
            switch (elementTs.gameObject.name) {
                case "ParaContent":
                    ControlUIcontent = elementTs.gameObject;
                    break;
                case "Cross Sectional Move":
                    Cross_Sectional_Move = elementTs.gameObject;
                    Debug.Log("found " + Cross_Sectional_Move.name);
                    break;
                case "Longitudinal Move":
                    Longitudinal_Move = elementTs.gameObject;
                    Debug.Log("found " + Longitudinal_Move.name);
                    break;
                case "Vertical Move":
                    Vertical_Move = elementTs.gameObject;
                    Debug.Log("found " + Vertical_Move.name);
                    break;
                case "Offset":
                    Height = elementTs.gameObject;
                    Debug.Log("found " + Height.name);
                    break;
                case "Size":
                    DuctSize = elementTs.gameObject;
                    break;
                case "Diameter_Trade_Size":
                    ConduitSize = elementTs.gameObject;
                    break;
            }
        }
        Debug.Log("propertyUI" + propertyUI);
        Debug.Log("ControlUIcontent " + ControlUIcontent.name);

        CrInput = Cross_Sectional_Move.transform.GetChild(1).GetChild(1).gameObject.GetComponent<InputField>();
        CrInput.characterValidation = InputField.CharacterValidation.Decimal;
        LgInput = Longitudinal_Move.transform.GetChild(1).GetChild(1).gameObject.GetComponent<InputField>();
        LgInput.characterValidation = InputField.CharacterValidation.Decimal;
        VtInput = Vertical_Move.transform.GetChild(1).GetChild(1).gameObject.GetComponent<InputField>();
        VtInput.characterValidation = InputField.CharacterValidation.Decimal;
        HeightInput = Height.transform.GetChild(1).GetChild(1).gameObject.GetComponent<InputField>();
        ductSizeInput = DuctSize.transform.GetChild(1).GetChild(1).gameObject.GetComponent<Dropdown>();
        conduitSizeInput = ConduitSize.transform.GetChild(1).GetChild(1).gameObject.GetComponent<Dropdown>();

        // Experimental: use the decimal pad on touch devices
        CrInput.keyboardType = TouchScreenKeyboardType.DecimalPad;
        LgInput.keyboardType = TouchScreenKeyboardType.DecimalPad;
        VtInput.keyboardType = TouchScreenKeyboardType.DecimalPad;
        HeightInput.keyboardType = TouchScreenKeyboardType.DecimalPad;

        // Add listeners that are invoked when an input field finishes editing
        CrInput.onEndEdit.AddListener(delegate { MovementChangeEditted(CrInput, LgInput, VtInput, HeightInput); });
        LgInput.onEndEdit.AddListener(delegate { MovementChangeEditted(CrInput, LgInput, VtInput, HeightInput); });
        VtInput.onEndEdit.AddListener(delegate { MovementChangeEditted(CrInput, LgInput, VtInput, HeightInput); });
        HeightInput.onEndEdit.AddListener(delegate { HeightChangeEdit(CrInput, LgInput, VtInput, HeightInput); });
        // Dropdown listener for the duct size
        ductSizeInput.onValueChanged.AddListener(delegate { sizeChanged(ductSizeInput); });

        Transform[] ts2 = new Transform[ControlUIcontent.transform.childCount];
        for (int i = 1; i < ts2.Length; i++) {   // deactivate the children from the second component on
            ControlUIcontent.transform.GetChild(i).gameObject.SetActive(false);
        }
    }

    void sizeChanged(Dropdown sizeDropdown) {
        int selection = sizeDropdown.value;
        string newSizeString = sizeDropdown.options[selection].text;
        Debug.Log("new option: " + sizeDropdown.options[selection].text);
        var tempClickObject = currentObject.GetComponent<ClickObject>();
        GameObject currentGizmos = tempClickObject.GizmosInstance;
        Debug.Log(currentGizmos.name);
        currentGizmos.transform.SetParent(transform);
        currentObject.transform.SetParent(currentGizmos.transform);
        GameObject ChangedDropDown = sizeDropdown.transform.parent.transform.parent.gameObject;
        string originalSize = "";
        float previousSize = 1f;
        float newSize = 1f;
        if (ChangedDropDown.name == "Size") {
            Debug.Log("duct size");
            originalSize = tempClickObject.changeParaDict["Size"];
            tempClickObject.changeParaDict["Size"] = newSizeString;
            previousSize = float.Parse(originalSize.Split('\"')[0]);
            newSize = float.Parse(newSizeString.Split('\"')[0]);
        }
        float scaleFactor = newSize / previousSize;            // cross-sectional scale factor
        float y_Scale = currentGizmos.transform.localScale.y;
        float x_Scale = currentGizmos.transform.localScale.x;  // x is the longitudinal axis, y the cross-sectional
        currentGizmos.transform.localScale = new Vector3(x_Scale, y_Scale * scaleFactor, y_Scale * scaleFactor);
        currentObject.GetComponent<ClickObject>().changeParaDict["Size"] = newSizeString;
        //currentObject.transform.SetParent(transform);
        currentGizmos.transform.SetParent(currentObject.transform);
    }
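    // Worked example for ConvertHeightStringToFloat() below: the Revit offset
    // string "8' - 0 9/32\"" splits into feet = 8 and inches = 9/32 = 0.28125,
    // so the method returns 8 + 0.28125 / 12 = 8.0234 decimal feet. When the
    // inch part is exactly 0, (12 / inches) evaluates to Infinity and the feet
    // value is returned unchanged.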
//(8' - 0 9/32") char[] delimiterChars = { '\'', ' ', '-', ' ', '\"' }; 217 string[] orhts = origHeight.Split(delimiterChars); string[] inchString = orhts[orhts.Length - 2].Split('/'); Debug.Log("inches parse:" + orhts[orhts.Length - 2]); float feet = float.Parse(orhts[0]); float inches; if (inchString.Length > 1) { inches = float.Parse(inchString[0]) / float.Parse(inchString[1]);// parse (9 / 32) to 9 and 32 ; Debug.Log("inches" + inches); } else { inches = float.Parse(orhts[orhts.Length - 2]); // e.g. 0 } float OriginalHt; if ((12 / inches).ToString() == "Infinity") // inches =0 { OriginalHt = feet; } else { OriginalHt = feet + (float)(inches/12); //feet and decimal feet } Debug.Log("origianl height:" + OriginalHt); return OriginalHt; } void addOriginalDataToDict() { Dictionary<string, float> OrigTsDict = currentObject.GetComponent<ClickObject>().originTsDict; if (currentObject.tag != "Walls") { // ADD Height (8' - 0'') to dicitioanry (8' - 0 9/32") string origHeight = currentObject.GetComponent<ClickObject>().changeParaDict["Offset"]; float ConvertHeight = ConvertHeightStringToFloat(origHeight); OrigTsDict.Add("Offset", ConvertHeight); } AllObDict.Add(currentObject.name, OrigTsDict); } // Checks if there is anything entered into the input field. void updateRevitCoord(float xDifference, float yDifference, float zDifference, Dictionary<string, float> RevitCrDict, Dictionary<string, string> uploadDi) { Dictionary<string, float> new_Reivt_Cor_Dict = new Dictionary<string, float>(); new_Reivt_Cor_Dict.Add("new_startP_x", RevitCrDict["startPoint_x"] - xDifference); // when Unity and Revit has the reverse X axis!!! new_Reivt_Cor_Dict.Add("new_endP_x", RevitCrDict["endPoint_x"] - xDifference); new_Reivt_Cor_Dict.Add("new_startP_y", RevitCrDict["startPoint_y"] + yDifference); new_Reivt_Cor_Dict.Add("new_endP_y", RevitCrDict["endPoint_y"] + yDifference); new_Reivt_Cor_Dict.Add("new_startP_z", RevitCrDict["startPoint_z"] + zDifference); new_Reivt_Cor_Dict.Add("new_endP_z", RevitCrDict["endPoint_z"] + zDifference); foreach (KeyValuePair<string, float> para in new_Reivt_Cor_Dict) { Debug.Log("new coordinates: " + para.Key + ":" + para.Value); if (uploadDi.ContainsKey(para.Key)) { uploadDi[para.Key] = para.Value.ToString(); } else { uploadDi.Add(para.Key, para.Value.ToString()); } } } void MovementChangeEditted(InputField inputCr, InputField inputLg, InputField inputVt, InputField inputHeight) { if (!AllObDict.ContainsKey(currentObject.name)) // check whether the object's original localTransform has stored in the AllObDict { addOriginalDataToDict(); } Dictionary<string, float> OriginalPosition = AllObDict[currentObject.name]; float CrMove = 0; float LgMove = 0; 218 float VtMove = 0; try //Cr { CrMove = float.Parse(inputCr.text); } catch { inputCr.text = ""; }// if the input cannot convert to int, rewrite the input text to null try //Lg { LgMove = float.Parse(inputLg.text); } catch { inputLg.text = ""; } try //Vt { VtMove = float.Parse(inputVt.text); } catch { inputVt.text = ""; } //calcualte the movement double angle = currentObject.GetComponent<ClickObject>().angle; Debug.Log("Gizmos " + currentObject.GetComponent<ClickObject>().angle); double angle_LG_X = angle * Math.PI / 180; double angle_CR_X = (angle + 90) * Math.PI / 180; float cr_x_movement = (float)(CrMove * 0.3048 * Math.Cos(angle_CR_X)); float cr_y_movement = (float)(CrMove * 0.3048 * Math.Sin(angle_CR_X)); float lg_x_movement = (float)(LgMove * 0.3048 * Math.Cos(angle_LG_X)); // the angle of the new each direction to the x and y direction 
    public void HeightChangeStoreValue(string currentHeightValue) {
        previousHeightValue = currentHeightValue;
        Debug.Log("The previous height value:" + previousHeightValue);
    }

    void HeightChangeEdit(InputField inputCr, InputField inputLg, InputField inputVt, InputField inputHeight) {
        if (previousHeightValue == inputHeight.text) {
            Debug.Log("height did not change");
            return;
        }
        Debug.Log("height changed");
        // Store the object's original local transform in AllObDict if it is not there yet
        if (!AllObDict.ContainsKey(currentObject.name)) {
            addOriginalDataToDict();
        }
        Dictionary<string, float> OriginalPosition = AllObDict[currentObject.name];
        float CrMove = 0;
        float LgMove = 0;
        float NewHeight = 0;
        // If an input cannot be parsed, reset that input field to empty
        try { CrMove = float.Parse(inputCr.text); } catch { inputCr.text = ""; }
        try { LgMove = float.Parse(inputLg.text); } catch { inputLg.text = ""; }
        try { NewHeight = float.Parse(inputHeight.text); } catch { inputHeight.text = ""; }

        // Calculate the movement
        double angle = currentObject.GetComponent<ClickObject>().angle;
        double radians = currentObject.GetComponent<ClickObject>().radians;
        Debug.Log("Gizmos " + currentObject.GetComponent<ClickObject>().angle);
        double angle_LG_X = angle * Math.PI / 180;
        double angle_CR_X = (angle + 90) * Math.PI / 180;
        float cr_x_movement = (float)(CrMove * 0.3048 * Math.Cos(angle_CR_X));
        float cr_y_movement = (float)(CrMove * 0.3048 * Math.Sin(angle_CR_X));
        float lg_x_movement = (float)(LgMove * 0.3048 * Math.Cos(angle_LG_X));
        float lg_y_movement = (float)(LgMove * 0.3048 * Math.Sin(angle_LG_X));
        float original_height = OriginalPosition["Offset"];
        float x_movement = cr_x_movement + lg_x_movement;
        float y_movement = cr_y_movement + lg_y_movement;
        // The difference from MovementChangeEditted: the vertical movement is derived from the new height
        float z_movement = (NewHeight - original_height) * 0.3048f;
        float new_x = OriginalPosition["x_local"] + x_movement;
        float new_y = OriginalPosition["y_local"] + y_movement;
        float new_z = OriginalPosition["z_local"] + z_movement;
        float x_Revit_movement = x_movement / 0.3048f;
        float y_Revit_movement = y_movement / 0.3048f;
        float z_Revit_movement = z_movement / 0.3048f;
        Debug.Log("x original: " + OriginalPosition["x_local"] + " y original: " + OriginalPosition["y_local"] + " z original: " + OriginalPosition["z_local"]);
        Debug.Log("x movement: " + x_movement + " y movement: " + y_movement + " z position: " + new_z);
        Debug.Log("x position: " + new_x + " y position: " + new_y);
        currentObject.transform.localPosition = new Vector3(new_x, new_y, new_z);
        // Set the vertical movement field to the corresponding value
        //VtInput.text = (z_movement).ToString();
        VtInput.text = (z_Revit_movement).ToString();
        // Store the new values in the parameter dictionary
        currentObject.GetComponent<ClickObject>().changeParaDict["Cross Sectional Move"] = inputCr.text;
        currentObject.GetComponent<ClickObject>().changeParaDict["Longitudinal Move"] = inputLg.text;
        currentObject.GetComponent<ClickObject>().changeParaDict["Vertical Move"] = inputVt.text;
        // Convert the decimal feet back to a feet-and-inches string, e.g. 10.25 to 10' - 3"
        string final_height;
        if ((NewHeight % 1).ToString() == "NaN") {
            final_height = Math.Floor(NewHeight).ToString() + "\'" + " - " + (0).ToString() + "\"";
        } else {
            final_height = Math.Floor(NewHeight).ToString() + "\'" + " - " + ((NewHeight - Math.Floor(NewHeight)) * 12).ToString() + "\"";
            Debug.Log("height2:" + ((NewHeight - Math.Floor(NewHeight)) * 12).ToString());
        }
        HeightInput.text = final_height;
        currentObject.GetComponent<ClickObject>().changeParaDict["Offset"] = final_height;
        updateRevitCoord(x_Revit_movement, y_Revit_movement, z_Revit_movement, OriginalPosition, currentObject.GetComponent<ClickObject>().changeParaDict);
    }
    void Update() {
        // When the user clicks a new object, the previous (recording) object has been
        // fully modified, so record its changed information
        if (recordingObject != null && recordingObject != currentObject && recordingObject.GetComponent<ReadCSV>().object_SO != null) {
            updateDict();
        }
    }
    public void updateDict() {
        // A new dictionary to store the recording object's changes
        Dictionary<string, string> recordparaDict = new Dictionary<string, string>();
        string recordElementID = recordingObject.GetComponent<ReadCSV>().ObjectID;
        // Add the new coordinate information to recordparaDict
        Dictionary<string, string> RecordOB_changeParaDict = recordingObject.GetComponent<ClickObject>().changeParaDict;
        if (RecordOB_changeParaDict != null) {
            recordparaDict.Add("new_startP_x", RecordOB_changeParaDict["new_startP_x"]);
            recordparaDict.Add("new_startP_y", RecordOB_changeParaDict["new_startP_y"]);
            recordparaDict.Add("new_startP_z", RecordOB_changeParaDict["new_startP_z"]);
            recordparaDict.Add("new_endP_x", RecordOB_changeParaDict["new_endP_x"]);
            recordparaDict.Add("new_endP_y", RecordOB_changeParaDict["new_endP_y"]);
            recordparaDict.Add("new_endP_z", RecordOB_changeParaDict["new_endP_z"]);
            if (RecordOB_changeParaDict.ContainsKey("Size")) {
                recordparaDict.Add("duct_Diameter_size", RecordOB_changeParaDict["Size"]);
            }
            if (RecordOB_changeParaDict.ContainsKey("Diameter_Trade_Size")) {
                recordparaDict.Add("conduit_Diameter_size", RecordOB_changeParaDict["Diameter_Trade_Size"]);
            }
        }

        // Collect the changes made through the inventory control UI (UIlist may contain null entries)
        GameObject[] recordUIList = recordingObject.GetComponent<ClickObject>().UIlist;
        foreach (GameObject UIcomponent in recordUIList) {
            if (UIcomponent != null) {
                if (UIcomponent.name == "InputRow(Clone)") {
                    string paraKey = UIcomponent.transform.GetChild(0).GetChild(1).GetComponent<Text>().text;
                    string paraValue = UIcomponent.transform.GetChild(1).GetChild(1).transform.Find("Text").GetComponent<Text>().text;
                    if (paraValue != "") {   // only record inputs the user actually changed
                        Debug.Log("key and value:" + paraKey + ": " + paraValue);
                        recordparaDict.Add(paraKey, paraValue);
                    }
                }
                if (UIcomponent.name == "CheckRow(Clone)") {
                    string paraKey = UIcomponent.transform.GetChild(0).GetChild(1).GetComponent<Text>().text;
                    string paraValue = UIcomponent.transform.GetChild(1).GetChild(1).GetComponent<Toggle>().isOn.ToString();
                    Debug.Log("key and value:" + paraKey + ": " + paraValue);
                    recordparaDict.Add(paraKey, paraValue);
                }
            }
        }

        // Load the recording object's asset file and overwrite its fields with the dictionary values
        string loadassetPath = "metadataScriptableObject/" + recordElementID;
        switch (recordingObject.tag) {
            case "Ducts":
                ductParameter recordDuctAsset = Resources.Load<ductParameter>(loadassetPath);
                Debug.Log("reload asset");
                AssignParameter.DictToDuctAsset(recordparaDict, recordDuctAsset);
                break;
            case "Pipes":
                pipeParameter recordPipeAsset = Resources.Load<pipeParameter>(loadassetPath);
                AssignParameter.DictToPipeAsset(recordparaDict, recordPipeAsset);
                Debug.Log("reload asset");
                break;
            case "Conduits":
                conduitParameter recordConduitAsset = Resources.Load<conduitParameter>(loadassetPath);
                AssignParameter.DictToConduitAsset(recordparaDict, recordConduitAsset);
                Debug.Log("reload asset");
                break;
            case "Walls":
                wallParameter recordWallAsset = Resources.Load<wallParameter>(loadassetPath);
                AssignParameter.DictToWallAsset(recordparaDict, recordWallAsset);
                Debug.Log("reload asset");
                break;
        }

        // Flatten recordparaDict into recordparaArray
        string[] recordparaArray = new string[recordparaDict.Count];
        int i = 0;
        foreach (KeyValuePair<string, string> para in recordparaDict) {
            recordparaArray[i] = para.Key + " : " + para.Value;
            i++;
            string paraField = para.Key.Replace(" ", "_");
            Debug.Log("paraField" + paraField);
        }

        // Update the information in uploadDict
        if (uploadDict.ContainsKey(recordElementID) == true) {
            uploadDict[recordElementID] = recordparaArray;
        } else {
            uploadDict.Add(recordElementID, recordparaArray);
        }

        // Clear all UI components of the recording object
        ClearUI(recordingObject);
        recordingObject = currentObject;
        foreach (KeyValuePair<string, string[]> para in uploadDict) {
            Debug.Log("uploadkey: " + para.Key + " value:" + para.Value);
            foreach (string value in para.Value) {
                Debug.Log("uploadvalue: " + value);
            }
        }
    }
}
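updateDict() flattens every edit into "key : value" strings keyed by element ID. For a duct whose position and size were changed, an uploadDict entry might look like this (all values hypothetical):

316584 -> { "new_startP_x : -12.5", "new_startP_y : 30.2", "new_startP_z : 8.0",
            "new_endP_x : -12.5", "new_endP_y : 45.2", "new_endP_z : 8.0",
            "duct_Diameter_size : 14\"", "Comments : checked on site" }

This is the dictionary that SwichOver.CreateText() in Appendix C.1 writes out to objectMeta.txt.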
B.4 TrackedImageInfoMultipleManager.cs

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARTrackedImageManager))]
public class TrackedImageInfoMultipleManager : MonoBehaviour {
    [SerializeField] private GameObject welcomePanel;
    [SerializeField] private Button dismissButton;
    [SerializeField] private Button scanImageButton;
    [SerializeField] private Text imageTrackedText;
    [SerializeField] private GameObject[] arObjectsToPlace;
    [SerializeField] private Vector3 scaleFactor = new Vector3(0.01f, 0.01f, 0.01f);

    private ARTrackedImageManager m_TrackedImageManager;
    private Dictionary<string, GameObject> arObjects = new Dictionary<string, GameObject>();
    private GameObject firstItem;
    private GameObject secondItem;
    public GameObject instantiateInstance;
    private GameObject puttingModel;
    public bool switchARscanning = true;

    void Awake() {
        dismissButton.onClick.AddListener(Dismiss);
        m_TrackedImageManager = GetComponent<ARTrackedImageManager>();
        // Set up all locator game objects in the dictionary
        foreach (GameObject arObject in arObjectsToPlace) {
            GameObject newARObject = Instantiate(arObject, Vector3.zero, Quaternion.identity);
            Debug.Log("location ball instantiated");
            newARObject.name = arObject.name;
            arObjects.Add(arObject.name, newARObject);
            newARObject.SetActive(false);
        }
        puttingModel = Instantiate(instantiateInstance, Vector3.zero, Quaternion.identity);
        puttingModel.SetActive(false);
        Debug.Log("revit model instantiated");
    }

    public void swichScanSwitch() {
        if (switchARscanning) {
            ColorBlock colors = scanImageButton.colors;
            colors.selectedColor = Color.red;
            scanImageButton.colors = colors;   // ColorBlock is a struct, so the modified copy must be assigned back
            scanImageButton.transform.GetChild(0).GetComponent<Text>().text = "Start\nScan";
            switchARscanning = false;
            Debug.Log("disabled");
        } else {
            ColorBlock colors = scanImageButton.colors;
            colors.selectedColor = new Color32(225, 225, 225, 200);
            scanImageButton.colors = colors;
            scanImageButton.transform.GetChild(0).GetComponent<Text>().text = "Stop\nScan";
            switchARscanning = true;
            Debug.Log("enabled");
        }
        m_TrackedImageManager.enabled = switchARscanning;
    }

    void OnEnable() {
        m_TrackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
        Debug.Log("AR images changed: " + m_TrackedImageManager);
        Debug.Log(m_TrackedImageManager);
    }

    void OnDisable() {
        Debug.Log("disables ar image manager");
        m_TrackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }

    private void Dismiss() => welcomePanel.SetActive(false);

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs) {
        foreach (ARTrackedImage trackedImage in eventArgs.added) {
            UpdateARImage(trackedImage);
            Debug.Log("added");
        }
        foreach (ARTrackedImage trackedImage in eventArgs.updated) {
            UpdateARImage(trackedImage);
            Debug.Log("updated");
        }
        imageTrackedText.text = "";
    }
    private void UpdateARImage(ARTrackedImage trackedImage) {
        // Display the name of the tracked image in the canvas
        imageTrackedText.text = trackedImage.referenceImage.name;
        // Assign and place the game object
        AssignGameObject(trackedImage.referenceImage.name, trackedImage.transform.position);
        Debug.Log($"trackedImage.referenceImage.name: {trackedImage.referenceImage.name}");
    }

    void AssignGameObject(string name, Vector3 newPosition) {
        if (arObjectsToPlace != null) {
            GameObject goARObject = arObjects[name];
            goARObject.SetActive(true);
            goARObject.transform.position = newPosition;
            goARObject.transform.localScale = scaleFactor;
            Debug.Log(goARObject.name + " position " + goARObject.transform.position);
            if (name == "FirstPoint(Blue)" && firstItem is null) {
                firstItem = goARObject;
            }
            if (name == "SecondPoint(Green)" && secondItem is null) {
                secondItem = goARObject;
            }
            if (firstItem != null && secondItem != null) {
                Vector3 FirstImageLocation = firstItem.transform.position;
                Vector3 SecondImageLocation = secondItem.transform.position;
                Debug.Log("first image location:" + FirstImageLocation + " second image location:" + SecondImageLocation);
                float X_difference = SecondImageLocation.x - FirstImageLocation.x;
                float Z_difference = SecondImageLocation.z - FirstImageLocation.z;
                Vector3 LocatingDirection = new Vector3(X_difference, 0f, Z_difference);
                Vector3 normalZ = transform.forward;
                float rotationAngle = Vector3.SignedAngle(normalZ, LocatingDirection, Vector3.up);
                Debug.Log("rotationAngle:" + rotationAngle);
                Debug.Log("x, z difference: " + X_difference + Z_difference);
            }
        }
    }

    public void LoadModel() {
        Debug.Log("Try to load model");
        if (firstItem != null && secondItem != null) {
            Debug.Log("Found first and second");
            Vector3 FirstImageLocation = firstItem.transform.position;
            Vector3 SecondImageLocation = secondItem.transform.position;
            Debug.Log("first image location:" + FirstImageLocation + " second image location:" + SecondImageLocation);
            float X_difference = SecondImageLocation.x - FirstImageLocation.x;
            float Z_difference = SecondImageLocation.z - FirstImageLocation.z;
            Vector3 LocatingDirection = new Vector3(X_difference, 0f, Z_difference);
            Vector3 normalZ = transform.forward;
            float rotationAngle = Vector3.SignedAngle(normalZ, LocatingDirection, Vector3.up);
            Debug.Log("rotationAngle:" + rotationAngle);
            Debug.Log("x, z difference: " + X_difference + Z_difference);
            puttingModel.transform.position = FirstImageLocation;
            puttingModel.transform.rotation = Quaternion.Euler(0, rotationAngle, 0);
            puttingModel.SetActive(true);
        } else {
            imageTrackedText.text = "Not enough images were tracked. Try again";
        }
    }
}
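LoadModel() anchors the Revit model at the first tracked image and rotates it so that its forward axis follows the direction from the first marker to the second. Assuming the two markers are detected at (0, 0, 0) and (1, 0, 1):

LocatingDirection = (1, 0, 1) − (0, 0, 0) = (1, 0, 1)
SignedAngle((0, 0, 1), (1, 0, 1), up) = +45°

so the model is placed at the first marker and rotated by Quaternion.Euler(0, 45, 0).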
Try agin"; } } } 224 APPENDIX C C.1 SwichOver.cs using System; using System.Collections; using System.Collections.Generic; using System.IO; using UnityEngine; public class SwichOver : MonoBehaviour { Color originalColor = new Color(255f, 255f, 0f, 255f); //new Color(1f, 1f, .4f, 1f); dark blue public Hashtable ht_colors; public GameObject Canvas; GameObject PropertyWindow; GameObject CategoryWindow; GameObject ChangeParameterWindow; GameObject UploadUIWindow; GameObject VisibleButton; GameObject InvisibleButton; [SerializeField] private GameObject project; GameObject MovementButtons; List<GameObject> InvisibleGameObList = new List<GameObject>(); public GameObject currentObject; public GameObject GiszomsOb; [SerializeField] private bool PropertyControll; [SerializeField] private bool CategoryControll; [SerializeField] private bool ParameterControll; private void Start() { //Canvas = GameObject.FindGameObjectWithTag("Properties"); //This method returns the first GameObject it finds with the specified tag. PropertyWindow = Canvas.transform.Find("InventoryMenu").gameObject; CategoryWindow = Canvas.transform.Find("CategoryWindow").gameObject; ChangeParameterWindow= Canvas.transform.Find("ChangeParameter").gameObject; UploadUIWindow = Canvas.transform.Find("UploadUIWindow").gameObject; VisibleButton = Canvas.transform.Find("Visible").gameObject; MovementButtons = Canvas.transform.Find("MovementButtons").gameObject; VisibleButton.SetActive(false); InvisibleButton = Canvas.transform.Find("Invisible").gameObject; FilterObject(); CategoryWindow.SetActive(false); PropertyWindow.SetActive(false); ChangeParameterWindow.SetActive(false); PropertyControll = false; CategoryControll = false; ParameterControll = false; project = GameObject.FindWithTag("Project"); } void FilterObject() { Transform[] ts = CategoryWindow.GetComponentsInChildren<Transform>(true); foreach (Transform ts_catagroy in ts) { Debug.Log("category window2s" + ts_catagroy.gameObject.name); } } public void CloseUIWindow() { UploadUIWindow.SetActive(false); } public void SetVisible() { if (project is null) { project = GameObject.FindWithTag("Project"); } GameObject AcitvieObject = project.GetComponent<ReferenceObejct>().currentObject; if(AcitvieObject != null) { 225 AcitvieObject.SetActive(true); if (InvisibleGameObList.Contains(AcitvieObject)) { InvisibleGameObList.Remove(AcitvieObject); } VisibleButton.SetActive(false); InvisibleButton.SetActive(true); } } public void SetInvisible() { if (project is null) { project = GameObject.FindWithTag("Project"); } GameObject AcitvieObject = project.GetComponent<ReferenceObejct>().currentObject; if (AcitvieObject != null) { AcitvieObject.SetActive(false); if (InvisibleGameObList.Contains(AcitvieObject) == false) { InvisibleGameObList.Add(AcitvieObject); } InvisibleButton.SetActive(false); VisibleButton.SetActive(true); } } public void SetAllVisible() { if (project is null) { project = GameObject.FindWithTag("Project"); } if (InvisibleGameObList.Count>0) { foreach (GameObject AcitvieObject in InvisibleGameObList) { AcitvieObject.SetActive(true); } } InvisibleGameObList = new List<GameObject>(); } public void swichColor() { if (project is null) { project = GameObject.FindWithTag("Project"); } else { currentObject = project.GetComponent<ReferenceObejct>().currentObject; } if (currentObject != null && project != null) { bool swichOn = project.GetComponent<ReferenceObejct>().swichOn; ht_colors = currentObject.GetComponent<ClickObject>().ht_colors; if (swichOn) { for (int i = 0; i < 
currentObject.GetComponent<Renderer>().materials.Length; i++) { Color oColor = (Color)ht_colors[i]; currentObject.GetComponent<Renderer>().materials[i].color = oColor; } project.GetComponent<ReferenceObejct>().swichOn = false; } else { for (int i = 0; i < currentObject.GetComponent<Renderer>().materials.Length; i++) { project.GetComponent<ReferenceObejct>().previousObject.GetComponent<Renderer>().materials[i].color = new Color(255F, 255F, 0F, 255F); ; } project.GetComponent<ReferenceObejct>().swichOn = true; } 226 } } public void swichProperty() { if (PropertyControll == false) { PropertyWindow.SetActive(true); PropertyControll = true; CategoryWindow.SetActive(false); CategoryControll = false; } else { PropertyWindow.SetActive(false); PropertyControll = false; } } public void swichCategory() { if (CategoryControll == false) { CategoryWindow.SetActive(true); CategoryControll = true; PropertyWindow.SetActive(false); PropertyControll = false; } else { CategoryWindow.SetActive(false); CategoryControll = false; } } public void swichChangeParameter() { if (ParameterControll == false) { ChangeParameterWindow.SetActive(true); ParameterControll = true; } else { ChangeParameterWindow.SetActive(false); ParameterControll = false; } } public void SaveJson() { project = GameObject.FindWithTag("Project"); if (project.GetComponent<ReferenceObejct>().uploadDict is null) { return; } ElementData elementData = new ElementData { ToJsonDict = project.GetComponent<ReferenceObejct>().uploadDict }; string object_matadata = (project.GetComponent<ReferenceObejct>().uploadDict).ToString(); string file_name = "Object_metadata.txt"; //string json = JsonUtility.ToJson(elementData); //Debug.Log(json); WriteToFile(file_name, object_matadata); } public void CreateText() { project = GameObject.FindWithTag("Project"); project = GameObject.FindWithTag("Project"); project.GetComponent<ReferenceObejct>().updateDict(); 227 UploadUIWindow.SetActive(true); string path; //#if UNITY_EDITOR path = Application.dataPath + "/objectMeta.txt"; //#endif #if UNITY_IOS path = Application.persistentDataPath + "/objectMeta.txt"; #endif File.WriteAllText(path, String.Empty); string oneObjectMetadata; Dictionary<string, string[]> uploadDict= project.GetComponent<ReferenceObejct>().uploadDict; //Content of the file if (uploadDict.Count > 0) { foreach (KeyValuePair<string, string[]> para in uploadDict) { oneObjectMetadata = para.Key + ","; foreach (string parameter in para.Value) { oneObjectMetadata = oneObjectMetadata + parameter + ","; } oneObjectMetadata = oneObjectMetadata + "\n"; File.AppendAllText(path, oneObjectMetadata); Debug.Log("one obejct paramter:" + oneObjectMetadata); } // string content = "Login date: " + System.DateTime.Now + "\n"; //Add some to text to it } } private void WriteToFile(string fileName, string json) { string path = Application.persistentDataPath + "/" + fileName; Debug.Log("file path:" + path); FileStream filestream = new FileStream(path, FileMode.Create); using (StreamWriter writer = new StreamWriter(filestream)) { writer.Write(json); Debug.Log("json file write:" + json); } } } C.2 AddTag.cs using UnityEngine; using UnityEditor; public static class AddTag // check if a tag exist and add tag based on a givin string { public static void CreateTag(string tag) { #if UNITY_EDITOR Object[] asset = AssetDatabase.LoadAllAssetsAtPath("ProjectSettings/TagManager.asset"); //Object[] asset = Resources.LoadAll("ProjectSettings/TagManager"); if ((asset != null) && (asset.Length > 0)) { SerializedObject so = new 
SerializedObject(asset[0]); SerializedProperty tags = so.FindProperty("tags"); for (int i = 0; i < tags.arraySize; ++i) { if (tags.GetArrayElementAtIndex(i).stringValue == tag) { Debug.Log("tag "+tag+" has been already in TagManger, no need to add"); return; // Tag already present, nothing to do. } } //Add the new Category into TagMnager tags.InsertArrayElementAtIndex(tags.arraySize); tags.GetArrayElementAtIndex(tags.arraySize - 1).stringValue = tag; so.ApplyModifiedProperties(); 228 so.Update(); } #if UNITY_EDITOR } } C.3 AssignParameter.cs using System.Collections; using System.Collections.Generic; using UnityEngine; using UnityEditor; public static class AssignParameter { public static void DictToConduitAsset(Dictionary<string, string> paraDict, conduitParameter conduitAsset) { // Data { if (paraDict.ContainsKey("startPoint_x")) conduitAsset.startPoint_x = (float)((float.Parse(paraDict["startPoint_x"])) ); if (paraDict.ContainsKey("startPoint_y")) conduitAsset.startPoint_y = (float)((float.Parse(paraDict["startPoint_y"])) ); if (paraDict.ContainsKey("startPoint_z")) conduitAsset.startPoint_z = (float)((float.Parse(paraDict["startPoint_z"])) ); if (paraDict.ContainsKey("endPoint_x")) conduitAsset.endPoint_x = (float)((float.Parse(paraDict["endPoint_x"])) ); if (paraDict.ContainsKey("endPoint_y")) conduitAsset.endPoint_y = (float)((float.Parse(paraDict["endPoint_y"])) ); if (paraDict.ContainsKey("endPoint_z")) conduitAsset.endPoint_z = (float)((float.Parse(paraDict["endPoint_z"])) ); } // Constraint { if (paraDict.ContainsKey("Horizontal Justification")) conduitAsset.Horizontal_Justification = paraDict["Horizontal Justification"]; if (paraDict.ContainsKey("Vertical Justification")) conduitAsset.Vertical_Justification = paraDict["Vertical Justification"]; if (paraDict.ContainsKey("Reference Level")) conduitAsset.Reference_Level = paraDict["Reference Level"]; if (paraDict.ContainsKey("Offset")) conduitAsset.Offset = paraDict["Offset"]; if (paraDict.ContainsKey("Start Offset")) conduitAsset.Start_Offset = paraDict["Start Offset"]; if (paraDict.ContainsKey("End Offset")) conduitAsset.End_Offset = paraDict["End Offset"]; //Revit 2020 if (paraDict.ContainsKey("Middle Elevation")) conduitAsset.Offset = paraDict["Middle Elevation"]; if (paraDict.ContainsKey("Start Middle Elevation")) conduitAsset.Start_Offset = paraDict["Start Middle Elevation"]; if (paraDict.ContainsKey("End Middle Elevation")) conduitAsset.End_Offset = paraDict["End Middle Elevation"]; } // Electrical { if (paraDict.ContainsKey("Bottom Elevation")) conduitAsset.Bottom_Elevation = paraDict["Bottom Elevation"]; if (paraDict.ContainsKey("Top Elevation")) conduitAsset.Top_Elevation = paraDict["Top Elevation"]; } //Dimensions { if (paraDict.ContainsKey("Diameter Trade Size")) conduitAsset.Diameter_Trade_Size = paraDict["Diameter Trade Size"]; if (paraDict.ContainsKey("Outside Diameter")) conduitAsset.Outside_Diameter = paraDict["Outside Diameter"]; if (paraDict.ContainsKey("Inside Diameter")) conduitAsset.Inside_Diameter = paraDict["Inside Diameter"]; if (paraDict.ContainsKey("Size")) conduitAsset.Size = paraDict["Size"]; if (paraDict.ContainsKey("Length")) conduitAsset.Length = paraDict["Length"]; } // Identity Data 229 { if (paraDict.ContainsKey("Mark")) conduitAsset.Mark = paraDict["Mark"]; if (paraDict.ContainsKey("Comments")) conduitAsset.Comments = paraDict["Comments"]; if (paraDict.ContainsKey("Service_Type")) conduitAsset.Service_Type = paraDict["Service_Type"]; if (paraDict.ContainsKey("Image")) conduitAsset.Image = 
paraDict["Image"]; if (paraDict.ContainsKey("Design Option")) conduitAsset.Design_Option = paraDict["Design Option"]; } //Phasing { if (paraDict.ContainsKey("Phase Demolished")) conduitAsset.Phase_Demolished = paraDict["Phase Demolished"]; if (paraDict.ContainsKey("Phase Created")) conduitAsset.Phase_Created = paraDict["Phase Created"]; } // Type { if (paraDict.ContainsKey("Element ID")) conduitAsset.Element_ID = paraDict["Element ID"]; if (paraDict.ContainsKey("Type Id")) conduitAsset.Type_Id = paraDict["Type Id"]; if (paraDict.ContainsKey("Type")) conduitAsset.Type = paraDict["Type"]; if (paraDict.ContainsKey("Type Name")) conduitAsset.Type_Name = paraDict["Type Name"]; if (paraDict.ContainsKey("Category")) conduitAsset.Category = paraDict["Category"]; if (paraDict.ContainsKey("Family")) conduitAsset.Family = paraDict["Family"]; if (paraDict.ContainsKey("Family Name")) conduitAsset.Family_Name = paraDict["Family Name"]; if (paraDict.ContainsKey("Family and Type")) conduitAsset.Family_and_Type = paraDict["Family and Type"]; } } public static void DictToWallAsset (Dictionary<string, string> paraDict, wallParameter wallAsset) { if (paraDict.ContainsKey("Area")) wallAsset.Area = paraDict["Area"]; if (paraDict.ContainsKey("Base Constraint")) wallAsset.Base_Constraint = paraDict["Base Constraint"]; if (paraDict.ContainsKey("Base Extension Distance")) wallAsset.Base_Extension_Distance = paraDict["Base Extension Distance"]; if (paraDict.ContainsKey("Category")) wallAsset.Category = paraDict["Category"]; if (paraDict.ContainsKey("Comments")) wallAsset.Comments = paraDict["Comments"]; if (paraDict.ContainsKey("Design Option")) wallAsset.Design_Option = paraDict["Design Option"]; if (paraDict.ContainsKey("Element ID")) wallAsset.Element_ID = paraDict["Element ID"]; if (paraDict.ContainsKey("Enable Analytical Model")) stringtoBool(paraDict["Enable Analytical Model"], wallAsset.Enable_Analytical_Model); if (paraDict.ContainsKey("Family")) wallAsset.Family = paraDict["Family"]; if (paraDict.ContainsKey("Family and Type")) wallAsset.Family_and_Type = paraDict["Family and Type"]; if (paraDict.ContainsKey("startPoint_x")) wallAsset.startPoint_x = ((float)(float.Parse(paraDict["startPoint_x"]) * 0.3048)); if (paraDict.ContainsKey("startPoint_y")) wallAsset.startPoint_y = ((float)(float.Parse(paraDict["startPoint_y"]) * 0.3048)); if (paraDict.ContainsKey("startPoint_z")) wallAsset.startPoint_z = ((float)(float.Parse(paraDict["startPoint_z"]) * 0.3048)); if (paraDict.ContainsKey("endPoint_x")) wallAsset.endPoint_x = ((float)(float.Parse(paraDict["endPoint_x"]) * 0.3048)); if (paraDict.ContainsKey("endPoint_y")) wallAsset.endPoint_y = ((float)(float.Parse(paraDict["endPoint_y"]) * 0.3048)); if (paraDict.ContainsKey("endPoint_z")) wallAsset.endPoint_z = (float)((float.Parse(paraDict["endPoint_z"])) * 0.3048); if (paraDict.ContainsKey("Family Name")) wallAsset.Family_Name = paraDict["Family Name"]; if (paraDict.ContainsKey("Length")) wallAsset.Length = paraDict["Length"]; 230 if (paraDict.ContainsKey("Mark")) wallAsset.Mark = paraDict["Mark"]; if (paraDict.ContainsKey("Phase Created")) wallAsset.Phase_Created = paraDict["Phase Created"]; if (paraDict.ContainsKey("Phase Demolished")) wallAsset.Phase_Demolished = paraDict["Phase Demolished"]; if (paraDict.ContainsKey("Related to Mass")) stringtoBool(paraDict["Related to Mass"], wallAsset.Related_to_Mass); if (paraDict.ContainsKey("Room Bounding")) stringtoBool(paraDict["Room Bounding"], wallAsset.Room_Bounding); if (paraDict.ContainsKey("Structural")) 
stringtoBool(paraDict["Structural"], wallAsset.Structural); if (paraDict.ContainsKey("Structural Usage")) wallAsset.Structural_Usage = paraDict["Structural Usage"]; if (paraDict.ContainsKey("Top Constraint")) wallAsset.Top_Constraint = paraDict["Top Constraint"]; if (paraDict.ContainsKey("Top Extension Distance")) wallAsset.Top_Extension_Distance = paraDict["Top Extension Distance"]; if (paraDict.ContainsKey("Top is Attached")) stringtoBool(paraDict["Top is Attached"], wallAsset.Top_is_Attached); if (paraDict.ContainsKey("Top Offset")) wallAsset.Top_Offset = paraDict["Top Offset"]; if (paraDict.ContainsKey("Type")) wallAsset.Type = paraDict["Type"]; if (paraDict.ContainsKey("Type Id")) wallAsset.Type_Id = paraDict["Type Id"]; if (paraDict.ContainsKey("Type Name")) wallAsset.Type_Name = paraDict["Type Name"]; if (paraDict.ContainsKey("Unconnected Height")) wallAsset.Unconnected_Height = paraDict["Unconnected Height"]; if (paraDict.ContainsKey("Volume")) wallAsset.Volume = paraDict["Volume"]; } public static void DictToDuctAsset(Dictionary<string, string> paraDict, ductParameter ductAsset) { // Constraint { if (paraDict.ContainsKey("Horizontal Justification")) ductAsset.Horizontal_Justification = paraDict["Horizontal Justification"]; if (paraDict.ContainsKey("Vertical Justification")) ductAsset.Vertical_Justification = paraDict["Vertical Justification"]; if (paraDict.ContainsKey("Reference Level")) ductAsset.Reference_Level = paraDict["Reference Level"]; if (paraDict.ContainsKey("Offset")) ductAsset.Offset = paraDict["Offset"]; if (paraDict.ContainsKey("Start Offset")) ductAsset.Start_Offset = paraDict["Start Offset"]; if (paraDict.ContainsKey("End Offset")) ductAsset.End_Offset = paraDict["End Offset"]; //Revit 2020 if (paraDict.ContainsKey("Middle Elevation")) ductAsset.Offset = paraDict["Middle Elevation"]; if (paraDict.ContainsKey("Start Middle Elevation")) ductAsset.Start_Offset = paraDict["Start Middle Elevation"]; if (paraDict.ContainsKey("End Middle Elevation")) ductAsset.End_Offset = paraDict["End Middle Elevation"]; //Revit 2020 if (paraDict.ContainsKey("Slop")) ductAsset.Slop = paraDict["Slop"]; } //Dimensions { if (paraDict.ContainsKey("Width")) ductAsset.Width = paraDict["Width"]; if (paraDict.ContainsKey("Height")) ductAsset.Height = paraDict["Height"]; if (paraDict.ContainsKey("Size")) { //ductAsset.Size = paraDict["Size"]; string[] sizeArray = paraDict["Size"].Split('\"'); if (sizeArray.Length > 0) { ductAsset.Size = sizeArray[0] + '\"'; ; //ductAsset.Diameter_Trade_Size = sizeArray[0] + '\"'; } } 231 if (paraDict.ContainsKey("Length")) ductAsset.Length = paraDict["Length"]; } // Electrical { if (paraDict.ContainsKey("System Type")) ductAsset.System_Type = paraDict["System Type"]; if (paraDict.ContainsKey("System Classification")) ductAsset.System_Classification = paraDict["System Classification"]; if (paraDict.ContainsKey("System Name")) ductAsset.System_Name = paraDict["System Name"]; if (paraDict.ContainsKey("System Abbreviation")) ductAsset.System_Abbreviation = paraDict["System Abbreviation"]; if (paraDict.ContainsKey("Bottom Elevation")) ductAsset.Bottom_Elevation = paraDict["Bottom Elevation"]; if (paraDict.ContainsKey("Top Elevation")) ductAsset.Top_Elevation = paraDict["Top Elevation"]; if (paraDict.ContainsKey("Equivalent Diameter")) ductAsset.Equivalent_Diameter = paraDict["Equivalent Diameter"]; if (paraDict.ContainsKey("Size Lock")) stringtoBool(paraDict["Size Lock"], ductAsset.Size_Lock); if (paraDict.ContainsKey("Loss Coefficient")) ductAsset.Loss_Coefficient = 
paraDict["Loss Coefficient"]; if (paraDict.ContainsKey("Hydraulic Diameter")) ductAsset.Hydraulic_Diameter = paraDict["Hydraulic Diameter"]; if (paraDict.ContainsKey("Section")) ductAsset.Section = paraDict["Section"]; if (paraDict.ContainsKey("Area")) ductAsset.Area = paraDict["Area"]; } // Identity Data { if (paraDict.ContainsKey("Mark")) ductAsset.Mark = paraDict["Mark"]; if (paraDict.ContainsKey("Comments")) ductAsset.Comments = paraDict["Comments"]; if (paraDict.ContainsKey("Service_Type")) ductAsset.Service_Type = paraDict["Service_Type"]; if (paraDict.ContainsKey("Image")) ductAsset.Image = paraDict["Image"]; if (paraDict.ContainsKey("Design Option")) ductAsset.Design_Option = paraDict["Design Option"]; } //Phasing { if (paraDict.ContainsKey("Phase Demolished")) ductAsset.Phase_Demolished = paraDict["Phase Demolished"]; if (paraDict.ContainsKey("Phase Created")) ductAsset.Phase_Created = paraDict["Phase Created"]; } // Type { if (paraDict.ContainsKey("Element ID")) ductAsset.Element_ID = paraDict["Element ID"]; if (paraDict.ContainsKey("Type Id")) ductAsset.Type_Id = paraDict["Type Id"]; if (paraDict.ContainsKey("Type")) ductAsset.Type = paraDict["Type"]; if (paraDict.ContainsKey("Type Name")) ductAsset.Type_Name = paraDict["Type Name"]; if (paraDict.ContainsKey("Category")) ductAsset.Category = paraDict["Category"]; if (paraDict.ContainsKey("Family")) ductAsset.Family = paraDict["Family"]; if (paraDict.ContainsKey("Family Name")) ductAsset.Family_Name = paraDict["Family Name"]; if (paraDict.ContainsKey("Family and Type")) ductAsset.Family_and_Type = paraDict["Family and Type"]; } // Data { if (paraDict.ContainsKey("startPoint_x")) ductAsset.startPoint_x = ((float)(float.Parse(paraDict["startPoint_x"]) )); if (paraDict.ContainsKey("startPoint_y")) ductAsset.startPoint_y = ((float)(float.Parse(paraDict["startPoint_y"]))); if (paraDict.ContainsKey("startPoint_z")) ductAsset.startPoint_z = ((float)(float.Parse(paraDict["startPoint_z"]) )); if (paraDict.ContainsKey("endPoint_x")) 232 ductAsset.endPoint_x = ((float)(float.Parse(paraDict["endPoint_x"]) )); if (paraDict.ContainsKey("endPoint_y")) ductAsset.endPoint_y = ((float)(float.Parse(paraDict["endPoint_y"]) )); if (paraDict.ContainsKey("endPoint_z")) ductAsset.endPoint_z = ((float)(float.Parse(paraDict["endPoint_z"]) )); } } public static void DictToOtherObjectAsset (Dictionary<string, string> paraDict, otherObjectParameter otherOBAsset) { if (paraDict.ContainsKey("startPoint_x")) otherOBAsset.startPoint_x = ((float)(float.Parse(paraDict["startPoint_x"]) * 0.3048)); if (paraDict.ContainsKey("startPoint_y")) otherOBAsset.startPoint_y = ((float)(float.Parse(paraDict["startPoint_y"]) * 0.3048)); if (paraDict.ContainsKey("startPoint_z")) otherOBAsset.startPoint_z = ((float)(float.Parse(paraDict["startPoint_z"]) * 0.3048)); if (paraDict.ContainsKey("endPoint_x")) otherOBAsset.endPoint_x = ((float)(float.Parse(paraDict["endPoint_x"]) * 0.3048)); if (paraDict.ContainsKey("endPoint_y")) otherOBAsset.endPoint_y = ((float)(float.Parse(paraDict["endPoint_y"]) * 0.3048)); if (paraDict.ContainsKey("endPoint_z")) otherOBAsset.endPoint_z = ((float)(float.Parse(paraDict["endPoint_z"]) * 0.3048)); if (paraDict.ContainsKey("Element ID")) otherOBAsset.Element_ID = paraDict["Element ID"]; if (paraDict.ContainsKey("Category")) otherOBAsset.Category = paraDict["Category"]; if (paraDict.ContainsKey("Mark")) otherOBAsset.Mark = paraDict["Mark"]; if (paraDict.ContainsKey("Comments")) otherOBAsset.Comments = paraDict["Comments"]; } public static void 
DictToPipeAsset(Dictionary<string, string> paraDict, pipeParameter pipeAsset) { // } public static bool stringtoBool(string paraValue, bool result) { if (paraValue == "No" || paraValue == "-1" || false) result = false; else result = true; return result; } } C.4 InventoryControl.cs using System.Collections; using System.Collections.Generic; using UnityEngine; using UnityEngine.UI; public class InventoryControl : MonoBehaviour { [SerializeField] public GameObject textRow; [SerializeField] public GameObject optionRow; [SerializeField] public GameObject checkRow; [SerializeField] public GameObject inputRow; [SerializeField] public GridLayoutGroup gridGroup; public GameObject GiszomsOb; public List<Object> FilterOffObject = new List<Object>(); public void Start() { AddTag.CreateTag("Properties"); //Only working the first time in Unity transform.parent.gameObject.tag = "Properties"; // the InventoryMenu's paraant is the Cavas //textInventory("Property Window", "Information"); // //Create the filter UI compoennt based on the tags 233 if (this.gameObject.name == "CategoryWindow") { FilterBox(); } } public void DisactiveUI() { foreach (Transform child in gridGroup.transform) { child.gameObject.SetActive(false); } } public void FilterBox() { string[] TagArray ={ "Walls","Floors", "Roofs","Duct Accessories", "Ducts", "Duct Fittings", "Conduits","Conduit Fittings","Pipes","Generic Models","Speciality Equipment", "Air Terminals", "Doors","Lighting Fixtures","Electrical Equipment","Columns", "Structural Columns","Stairs","Railings", "Ceilings", "Curtain Panels","Curtain Wall Mullions","Project"}; foreach (string element in TagArray) { GameObject filter = Instantiate(checkRow); filter.transform.GetChild(0).GetChild(1).GetComponent<Text>().text = element; filter.name = element; Toggle m_Toggle = filter.transform.GetChild(1).GetChild(1).GetComponent<Toggle>(); m_Toggle.isOn = true; m_Toggle.onValueChanged.AddListener(delegate { ToggleValueChanged(m_Toggle); }); filter.transform.SetParent(this.gridGroup.transform, false); } } void ToggleValueChanged(Toggle changeTg) { string categoryTag = changeTg.transform.parent.parent.gameObject.name; Debug.Log("categoryTag: " + categoryTag); GameObject[] ObjectsWithTags; if (changeTg.isOn== false) { ObjectsWithTags = GameObject.FindGameObjectsWithTag(categoryTag); for (int i = 0; i < ObjectsWithTags.Length; i++) { FilterOffObject.Add(ObjectsWithTags[i]); //throws NullReferenceException } foreach (GameObject eachGameOb in ObjectsWithTags) { eachGameOb.SetActive(false); } } else { foreach (GameObject eachGameOb in FilterOffObject) { if(eachGameOb.tag == categoryTag) eachGameOb.SetActive(true); } } } } C.5 StoreHeight.cs using System.Collections; using System.Collections.Generic; using UnityEngine; using UnityEngine.Events; using UnityEngine.EventSystems; using UnityEngine.UI; public class StoreHeight : MonoBehaviour { public void OnSelect(BaseEventData eventData) 234 { Debug.Log(this.gameObject.name + "was selected, and is changing the height"); Text theTextComponent = gameObject.GetComponent<Text>(); FindObjectOfType<ReferenceObejct>().HeightChangeStoreValue(theTextComponent.text); } } C.6 UploadFile.cs // DropboxSync v2.1.1 // Created by George Fedoseev 2018-2019; Modified by Zeyu Lu using System; using System.Collections; using System.Collections.Generic; using UnityEngine; using DBXSync; using UnityEngine.UI; using System.Linq; using System.Text; using System.IO; using System.Threading.Tasks; using System.Threading; public class UploadFile : MonoBehaviour { public 
InputField localFileInput; public Button uploadButton; public Button cancelButton; public Text statusText; public string localFilePath; private CancellationTokenSource _cancellationTokenSource = new CancellationTokenSource(); private string _uploadDropboxPath; void Start() { string path; //#if UNITY_EDITOR path = Application.dataPath +"/objectMeta.txt"; //#endif #if UNITY_IOS path = Application.persistentDataPath + "/objectMeta.txt"; #endif localFilePath = path; localFileInput.text = localFilePath; ValidateLocalFilePath(); uploadButton.onClick.AddListener(Upload); cancelButton.onClick.AddListener(() => { _cancellationTokenSource.Cancel(); }); } void ValidateLocalFilePath() { if (File.Exists(localFilePath)) { //_uploadDropboxPath = Path.Combine("/DropboxSyncExampleFolder/", Path.GetFileName(localFileInput.text)); _uploadDropboxPath = Path.Combine("/", Path.GetFileName(localFilePath)); Debug.Log("localFilePath: "+ localFilePath); statusText.text = $"Ready to upload to {_uploadDropboxPath}"; uploadButton.interactable = true; } else { statusText.text = "<color=red>Specified file does not exist.</color>"; uploadButton.interactable = false; } } void Upload() { _cancellationTokenSource = new CancellationTokenSource(); uploadButton.interactable = false; //var localFilePath = localFileInput.text; DropboxSync.Main.UploadFile(localFilePath, _uploadDropboxPath, new Progress<TransferProgressReport>((report) => { 235 if(Application.isPlaying){ statusText.text = $"Uploading file {report.progress}% {report.bytesPerSecondFormatted}"; } }), (metadata) => { // success print($"Upload completed:\n{metadata}"); statusText.text = $"<color=green>Uploaded. {metadata.id}</color>"; uploadButton.interactable = true; }, (ex) => { // exception if(ex is OperationCanceledException){ Debug.Log("Upload cancelled"); statusText.text = $"<color=orange>Upload canceled.</color>"; }else{ Debug.LogException(ex); statusText.text = $"<color=red>Upload failed.</color>"; } uploadButton.interactable = true; }, _cancellationTokenSource.Token); } }
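The objectMeta.txt file written by CreateText() and uploaded by UploadFile.cs is read back on the desktop by the Dynamo script that updates the Revit model. For reference, the following is a minimal, hypothetical C# sketch of parsing that file format (one line per element: the element ID followed by comma-separated "Name : Value" pairs); the class and method names are illustrative only, not part of the thesis code, and the sketch assumes parameter values contain no commas.

// Hypothetical reference sketch (not the thesis's Dynamo script): parses the
// objectMeta.txt format written by CreateText(), where each line is
// "<ElementID>,<Name : Value>,<Name : Value>,...,"
using System;
using System.Collections.Generic;
using System.IO;

public static class ObjectMetaParser {
    public static Dictionary<string, Dictionary<string, string>> Parse(string path) {
        var elements = new Dictionary<string, Dictionary<string, string>>();
        foreach (string line in File.ReadAllLines(path)) {
            if (string.IsNullOrWhiteSpace(line)) continue;
            // The trailing comma written before each newline produces an empty
            // last token, which RemoveEmptyEntries discards
            string[] tokens = line.Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
            if (tokens.Length < 2) continue;
            string elementId = tokens[0].Trim();
            var parameters = new Dictionary<string, string>();
            for (int i = 1; i < tokens.Length; i++) {
                // Parameters were recorded as "Name : Value" pairs
                string[] pair = tokens[i].Split(new[] { " : " }, StringSplitOptions.None);
                if (pair.Length == 2) {
                    parameters[pair[0].Trim()] = pair[1].Trim();
                }
            }
            elements[elementId] = parameters;
        }
        return elements;
    }
}

In the thesis workflow, the equivalent parsing is performed in Dynamo, which matches each record to its Revit element by element ID before writing the changed parameter values back to the model.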
Abstract
Building information modeling (BIM) software is widely used in the Architecture, Engineering, Construction, and Operation (AECO) industry. Although the benefits and features of BIM continue to expand, BIM data are still primarily stored on office desktop systems or output as 2D drawings, and a gap remains between the digital model and the real-world objects it describes. In facility management (FM), traditional maintenance is a time-consuming process in which managers search an index sheet with detailed instructions to check building asset information. There have been cases of integrating BIM with FM systems to help FM personnel manage assets and building operations, and some cloud-based BIM-FM tools even enable FM personnel to work on site by scanning asset references such as barcodes to retrieve the needed information. However, these BIM-FM integration tools still require repetitive manual work. One trend for improving the efficiency of facility management is to use augmented reality (AR) to overlay the whole building information model on site, combining virtual information with the real environment.

This study built a tool for situations in which a building information model must be modified on site. Unlike most BIM-AR research and tools, which transfer BIM data unidirectionally from BIM software to a VR/AR platform, a workflow for exchanging BIM data bidirectionally was established. A BIM-based mobile AR application, TransBIM, was developed to visualize, edit, and update the building information model in the AR environment, specifically for MEP systems. TransBIM also enables users to send the changed BIM data back to Revit to update the Revit model. USC Watt Hall and its Revit model were used as a case study to test TransBIM and the proposed workflow.

Autodesk Revit, Dynamo, Autodesk 3ds Max, Unity, AR SDKs (ARFoundation and ARKit), and Dropbox were used to develop the mobile AR application (TransBIM) and to realize the entire workflow. The case study showed that TransBIM can overlay the 1:1-scale BIM model on the real building by scanning on-site images. Users can filter objects by category and click virtual objects to retrieve the relevant BIM information. Most importantly, users can change an object's location and edit parameter information such as MEP systems' (duct/conduit/pipe) sizes, comments, and marks in TransBIM. Changes made in TransBIM are saved to a text file (.txt) containing the modified BIM information, and the text file can be uploaded to Dropbox from the mobile device. Finally, the BIM-metadata text file was downloaded on a PC, where a separate Dynamo script read it to update the Revit model. The Revit model (MEP system locations and parameters) was correctly and instantly updated.

TransBIM realized the transfer of BIM metadata between BIM (Revit) and AR (TransBIM). It offers FM managers an opportunity to visualize the existing MEP systems' BIM asset information and to modify the MEP model and its information on site; the Revit model can then be updated automatically by reading the changed data. Moreover, TransBIM demonstrates a new approach to managing and updating a Revit model during a building's operating phase: using AR to modify the building information model on site. TransBIM can serve as a prototype for more sophisticated products that support asset management and renovation design in FM.
Asset Metadata
Creator
Lu, Zeyu (author)
Core Title
Using building information modeling with augmented reality: visualizing and editing MEP systems with a mobile augmented reality application
School
School of Architecture
Degree
Master of Building Science
Degree Program
Building Science
Publication Date
04/30/2020
Defense Date
03/23/2020
Publisher
University of Southern California (original); University of Southern California. Libraries (digital)
Tag
building information modeling (BIM), facility management (FM), mobile augmented reality (mobile AR), OAI-PMH Harvest, Unity
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Kensek, Karen (committee chair), Dwyer, Charles Steve (committee member), Lympouridis, Vangelis (committee member), Moser, Margaret (committee member)
Creator Email
zeyulu@usc.edu,zeyulu1994@gmail.com
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-c89-292304
Unique identifier
UC11665694
Identifier
etd-LuZeyu-8376.pdf (filename), usctheses-c89-292304 (legacy record id)
Legacy Identifier
etd-LuZeyu-8376.pdf
Dmrecord
292304
Document Type
Thesis
Rights
Lu, Zeyu
Type
texts
Source
University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Access Conditions
The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name
University of Southern California Digital Library
Repository Location
USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA