Augmented

This paper discusses Augmented Reality (AR) as a means to interact with information regarding infrastructure projects before, during and after construction. For that purpose, two different prototypes were developed using Apple's ARKit and Unity's game design platform and tested on two use cases. However, the main focus of this paper is interacting with infrastructure information through AR rather than researching core AR technology. We learned that using AR during the construction phase with subsurface utilities still faces several difficulties, especially when it comes to accessing and interacting with information in a changing construction environment. These difficulties will be discussed, along with the challenges regarding information flow between civil engineering and AR software.


INTRODUCTION
Construction projects in the infrastructure and building domain frequently experience cost overruns and delayed time schedules. Consequently, governments face budget overruns and spend more money than expected, since infrastructure projects in particular are publicly funded. The problem is, among other things, related to the construction sector being one of the least digitized sectors. Therefore, an EU task group emphasizes in its 2017 report that Building Information Modeling (BIM) must be the digital driving force to streamline workflows [1]. This is a must if the infrastructure and building domain wants to be more efficient, minimize errors and avoid poor communication.
Meanwhile, in the age of the fourth industrial revolution, digitization is appointed as the foremost driving force of continuous economic growth, and the Boston Consulting Group has further appointed Augmented Reality (AR) as one of the nine technological building blocks (Brunelli et al. 2017). They believe that workers will use AR to access and interact with graphical information on the factory floor that is connected to a digital copy of the entire factory. These days, some might want to call this connected digital copy of the physical environment a Digital Twin. Similarly, on a construction site AR can be used to ease access to project information and thereby support faster and better decision-making. A recent evaluation of AR visualization techniques compared to traditional PC-based visualizations has shown that using AR on site increases the use and understanding of technical construction information (Meza et al. 2015). Compared to a user perceiving a 3D model on a normal PC monitor, the study showed AR was up to 20% better in understandability and usability. Additionally, it was concluded that a well-formed digital model, similar to BIM models, was needed before architects and engineers could take full advantage of AR.
AR has recently enjoyed a great deal of popularity in the media. Pokémon GO, a location-based AR game for smartphones, was one of the big headlines in 2016. Also, Microsoft released HoloLens, and Apple and Google released AR tracking frameworks native to their mobile operating systems. Regardless of the popularity, there still exists a need to advance the field of AR applications within the building and infrastructure domain, as well as other use case domains. Surely the phenomenon of Pokémon GO made AR evident to everyone, and in the slipstream of that, even executives started asking for AR in a professional context, expecting the technology to be mature enough for broader utilization.

Scope / Aim
The presented paper is partly an outcome of such demands, where BIM models designed at engineering consultancy companies are suggested to be "used for more". The companies point to Virtual and Augmented Reality technologies to be used with their clients and project partners to ease communication of technical information, but also to improve model design before construction. Therefore, a research project, which is still ongoing, was established together with a large Danish-based consultancy company (COWI), a small software developer company (Epiito) and Aalborg University to explore these possibilities. The overall aim was to develop an AR prototype to access infrastructure information in the field by using the 3D design models already made by the engineering consultancies. Another goal was to use a device type that was accessible and familiar to use for partners and clients, to ease a possible future implementation of a finalized AR solution.
We present our initial findings based on our prototype developments and tests from two use cases. The prototypes were developed over a period of one year in a two-step process, as prototype 2 is a further development of prototype 1. Both prototypes were designed with a specific infrastructure project and a specific AR use case in mind. The infrastructure projects consist of two major highway constructions located in Denmark and in Norway respectively, as shown in Figure 1. Prototype 1 is related to the planning and design phase, showcasing the new infrastructure project as a possible new development on site to, for instance, politicians and neighbours as well as engineers inside the project. Prototype 2 aims at the construction phase, helping the contractor to visualize the progress of the construction work by visualizing the project on site and enabling the alteration of status information on infrastructure sub-utility elements on the construction site. This is done by implementing the open IFC data model together with the BIM Collaboration Format (BCF).

Figure 1. The two infrastructure AR use cases located in Denmark and Norway.

Paper structure
In the following section, previous AR work done in the academic field is presented together with the current commercial AR development. It focuses on similar AR application use cases for inspiration and to address current challenges and limitations prior to the development. Next, in the third section, the prototype development and functioning are presented, as well as the reasoning behind the selected AR hardware and software. It continues by presenting the information flow between civil engineering software and AR prototype 1. The fourth section continues with results from the prototype development and findings from the AR prototype 1 test, but mostly findings from the more mature prototype 2 tested on a construction site. Finally, in the conclusion, we revisit our findings and discuss recommendations and further work for the ongoing research project.

BACKGROUND
AR is a technology that combines the virtual and physical world by placing virtual content directly in the user's surroundings. A recent example of AR in everyday life is the popular mobile game Pokémon GO released in the summer of 2016, where virtual monsters appear in parks and cities when viewed through the smartphone's video feed. Snapchat's face filters are another example and perhaps the most widely used AR feature today.

Previous work
Using AR for games and social media is not the only use case. Neither is the idea of using AR for construction projects to optimize workflows and ease communication new. In fact, the building and infrastructure domain has been a popular area for showcasing the potential of AR. One could even say that it dates back to the first outdoor Augmented Reality system, which visualized a 3D model of a historical building on its former site (Hollerer et al. 1999). Hereafter, Roberts et al. (2002) made an AR system to visualize the subsurface utilities beneath the ground. The system also managed to achieve centimeter-level positioning and orientation by integrating Real-Time Kinematic GNSS and an Inertial Navigation System (INS) unit, a quite impressive achievement at the time. This was before the mobile revolution, which meant that all the mentioned AR systems needed to be carried in backpacks containing all the necessary computing power and sensors. This made them nearly impractical to use in everyday tasks. Computer graphics were still very poor, and calibration was a big issue compared to what today's AR systems are capable of. It was, though, enough to show the potential of AR.
AR applications that enable users to look into the ground and see subsurface utilities continued to be a popular theme. As technology progressed and AR hardware shrank in size and weight, Schall et al. (2009) made a handheld AR system that visualized underground infrastructure to aid field workers of utility companies. An AR application to perform virtual redlining in outdoor environments (previously done manually by annotating on printed maps or in 2D GIS systems) was also made on the same AR system. The AR system was substantially more compact than previous systems and featured different visualization techniques to achieve better depth perception, like making a virtual cut-out in the surface. The reason was to aid the user in perceiving a comprehensible AR visualization, since the human brain is not used to seeing through an opaque surface. Even though the AR system was able to operate in a handheld manner, an evaluation showed that the ergonomics of the device still had potential for improvement before broader implementation among field workers. This was also done on a rather clumsy device compared to modern smartphones and tablets. Still, the AR system was comprehensive, combining Real-Time Kinematic GPS and computer vision techniques to obtain a tracking solution with centimeter-level accuracy (Schall et al. 2013).
After the first smartphones became available (pre-iPhone era), they were soon used for Augmented Reality purposes, also in the research field, as Woodward et al. (2010) presented their mobile handheld Augmented Reality system for construction site visualization and communication to support BIM tasks. They were able to use an IFC file of a proposed building design and thereby achieve an object-based level of interaction with the augmented design model. An example of this is their work on time-scheduled (4D) simulation presented on the same AR system (Hakkarainen et al. 2009). AR used in the construction phase is also demonstrated by the AR system of Zollmann et al. (2014) for construction site monitoring and documentation. The system could perform tasks such as annotating and surveying that stay situated and attached to the underlying 3D reconstruction model of the construction project.
AR applications for the construction phase have proven to be a popular area in recent times. In a literature review by Brioso and Calderon-Hernandez (2018), a total of 46 articles from a five-year period containing AR and BIM were reviewed. It showed that approximately three quarters of all AR applications were from the construction phase. Their review also showed that the three biggest limitations/challenges (out of ten) were (a) visualization techniques, with occlusion as the main issue, (b) alignment of the virtual AR content with the real surroundings, and (c) limited device/hardware capabilities. It is worth mentioning that ergonomics was the least frequently occurring limitation/challenge found in the review, a shift compared to the challenges addressed in the AR application evaluation of Schall et al. (2009).
Looking at the broader AR research field, the challenges found in AR/BIM seem to be consistent. A recent article reviewing the past 10 years of AR research presented at the International Symposium on Mixed and Augmented Reality (ISMAR) shows that these five topics are the most cited in 2018: (1) mobile AR, made possible by the evolution of powerful and sensor-rich mobile devices; (2) spatial reconstruction of real surroundings, which can be used for occlusion in visualization techniques; (3) tracking and positioning techniques, which can help align the virtual with the real; (4) AR applications; and (5) evaluation. The latter two topics are most likely related to each other, which might suggest that AR is maturing, because more evaluations are conducted on real users (Kim et al. 2018).

Commercial AR development
Another sign that AR is maturing is the recent and frequent announcements of AR-related products and software support from big tech companies like Apple, Google, Microsoft, Magic Leap etc. It already seems like a while since HoloLens was released in March 2016. Just to show how tremendously the development has unfolded, consider this: in 2015 a comparative study of AR SDKs was conducted, comparing AR tracking SDKs from Metaio, Vuforia, Wikitude, D´Fusion and ARToolKit (Amin and Govilkar 2015). Today, this comparison study is almost useless because of the release of native mobile tracking SDKs for Apple's and Google's handheld devices, respectively ARKit (iOS) and ARCore (Android). Today these SDKs are the default choice. Developers can now use these well-functioning AR frameworks without calibration, because Apple and (to some extent) Google make their own devices, which are already pre-calibrated, also making the tracking more robust and reliable. This newfound, easily accessible tracking framework encourages developers to develop AR applications even more. As an additional consequence, popular game design platforms like Unity and Unreal Engine have incorporated the frameworks in their platform editors. Even though these platforms mainly focus on game development, their editors are widely used in other professional domains for AR- and VR-related applications.
Product companies within the building and infrastructure domain have also been showing involvement with AR. Trimble now lets SketchUp support AR by using Apple's ARKit framework on iOS devices and also supports Microsoft HoloLens. Trimble also has a working mobile outdoor AR prototype system called SiteVision, running on an Android device with Google's ARCore framework and connected to an external GNSS receiver. As mentioned, combining AR with a GNSS receiver has been done before in academia, but the interesting part is that it is soon becoming accessible in a ready-to-use commercial package. If so, high-accuracy location-based AR technology will be available for professionals as well as academics. The AR market is moving fast, so probably by the release of this paper the mentioned and new AR products will already be available.

AR PROTOTYPES
From studying previous work, it became evident that an outdoor AR system needed to integrate some kind of high-accuracy external GNSS receiver combined with advanced vision-based methods to instantly obtain a high-precision global position and orientation of the visualized AR content (Schall et al. 2013). However, such a comprehensive system was beyond our capabilities to develop, and no commercial solution could offer this. Perhaps the Trimble SiteVision would be suitable, but this solution was not at our disposal at that time and was a working prototype still connected to the Google Tango device.
Besides that, several existing solutions based on mobile or tablet devices were tested. They were based on either Vuforia or similar AR frameworks, but none of the tested solutions gave a convincing performance and impression of quality that would fulfil the demands of a useful professional application. This led to further encouragement in pursuing our own development of a prototype.
It was decided to go for a handheld device that was familiar to use and easily accessible on the market, i.e. a modern smartphone or tablet. The preferable option was a tablet because of the larger screen size. A head-mounted display (HMD) with optical see-through was quickly declined based on experiences with the Microsoft HoloLens (the HMD available at our disposal at the time) in outdoor daylight conditions and by studying previous work (Schall et al. 2009; 2013) and others. Although a hands-free AR experience would be preferable, the trade-off of using video see-through display technology made it the better option for our use cases.
As mentioned in section 2, the state-of-the-art AR tracking framework for handheld smart devices is now considered by most to be either ARKit or ARCore. Since a tablet was the preferred device, the obvious choice was to use ARKit, and an iPad Pro 12.9" (2nd gen.) was selected. This was also due to the limited number of Android devices supporting ARCore, which at that time did not include tablets. For the app development Unity was chosen, which had released a plugin for Apple's ARKit. Figure 2 shows the handheld iPad in action running prototype 1 in AR mode.

Developments and practical implementations
There were primarily two main challenges in building the prototypes: (a) developing an application for the handheld device enabling the user to geo pose a virtual model into the real world at a defined geographical position and orientation, and (b) establishing an information flow that enabled acceptable data exchange between the civil engineering CAD software and the Augmented Reality prototype application.
The following subsections explain the fundamentals of the developments, which to a large extent were similar for both prototypes. Although prototype 2 looks very similar to prototype 1, they differ in several important parts which are essential to cope with for a future development of a professional AR application.

Geo Pose
There exist several methods to geo pose a virtual model in the real-world view, and most AR developments have focused on sensor fusion to deal with this challenge, as described in section 2. The following workflow, on the other hand, is built around the functions available in the ARKit SDK provided by Apple. By design, ARKit version 1.0 is a model-free tracking framework, and its main feature is to place virtual models onto a planar surface. ARKit uses the RGB camera on an iOS device to detect patterns in the texture of a given planar surface. Once detected, it generates a surface that can be used to place a virtual model on top by tapping on the surface displayed on the iPad screen. The virtual model then appears on top of the planar surface, though with a random orientation. This method works really well in indoor environments on a texture-rich wooden floor or table to show furniture, but it is not ideal for a large-scale outdoor AR situation where the model should appear in a specific position and orientation. However, by programming some additional functions it was possible to come up with a method to manually adjust the virtual model. The following steps describe the process.
Step 1. The AR prototype is developed in such a way that it allows the user to select a predefined spatial position (x, y, z) in the virtual model, which must correspond to an identifiable position in the real world. Therefore, the user must first place himself at the physical predefined position, as shown in figure 3 (left).
Step 2. The AR system detects the surrounding surfaces while the user pans the device around in a circular motion with the camera pointing down at the ground. This is the process where the ARKit functionality detects natural features in the surroundings from the video input.
Step 3. The user taps lightly with a finger on the iPad display where the defined virtual position is supposed to appear in the real world. This places the virtual model on the registered surface, connecting the two points. To help the user tap in the right place, the defined positions should be marked physically or be easily recognizable, like for instance a well on the pavement surface.
Step 4. The user selects the virtual position and uses the scene buttons to adjust for the user's physical position, for instance the eye height. Thus, the virtual model is moved to its correct position and matches the user's correct point of view.
Step 5. As mentioned initially, the orientation of the virtual model has been chosen arbitrarily by the system and has to be adjusted so it aligns with the real world. The rotation is done manually using the control buttons. An example is shown in figure 3 (right), where a virtual blue cylinder has been inserted into the model representing the existing wind turbine. The wind turbine thereby acts as a landmark and is used to adjust the virtual model, primarily by rotation. The control buttons are placed on the lower right side of the view.
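Geometrically, the manual adjustment in steps 4 and 5 amounts to a translation followed by a yaw rotation about the user's position. The following Python sketch illustrates that geometry only; the actual prototypes were built in Unity, and all function and parameter names here are our own, hypothetical ones:

```python
import math

def align_model(vertices, user_pos, offset, yaw_deg):
    """Translate model vertices by `offset`, then rotate them about the
    user's position around the vertical axis (yaw only).

    vertices : list of (x, y, z) tuples in the AR session frame
    user_pos : (x, z) ground-plane position of the user
    offset   : (dx, dy, dz) manual translation from the control buttons
    yaw_deg  : manual rotation from the control buttons, in degrees
    """
    a = math.radians(yaw_deg)
    ux, uz = user_pos
    dx, dy, dz = offset
    aligned = []
    for x, y, z in vertices:
        # apply the manual translation first
        x, y, z = x + dx, y + dy, z + dz
        # then rotate the point about the user's position (yaw)
        rx, rz = x - ux, z - uz
        aligned.append((ux + rx * math.cos(a) + rz * math.sin(a),
                        y,
                        uz - rx * math.sin(a) + rz * math.cos(a)))
    return aligned
```

Rotating about the user's position rather than the model origin matches the procedure above: the user stands at the predefined point, so the model must pivot around them while the landmark is brought into alignment.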

Information flow
In prototypes 1 and 2 the initial design model was created using professional CAD systems such as MicroStation/InRoads and Novapoint/AutoCAD. During prototype 1, the data exchange was carried out by manually converting and recreating model files into either the FBX or OBJ format, which are readable object file formats within Unity. Figure 4 shows a flow chart of the process, which consists of the following steps:
1. Move the model from a global to a local cartesian coordinate system.
2. Import the model into 3DS MAX (or some other modelling software) using the DWG format.
3. Make sure all normal vectors of the surfaces are pointing towards the user.
4. Create a corresponding material library with textures.
5. Attach the materials to the model.
6. Export the model in FBX or OBJ format together with the material file.
7. Load the model and materials into Unity.
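Step 1 above is essentially a per-vertex translation so that coordinate values stay small. A minimal sketch of that step in Python (function and argument names are our own illustration, not part of the prototype):

```python
def global_to_local(vertices, origin=None):
    """Shift vertices from a global (e.g. UTM) coordinate system into a
    local one centred near the model, so that downstream tools do not
    have to represent metre values in the millions.

    If no origin is given, the first vertex is used as the local origin;
    the origin is returned so the transformation can be inverted later.
    """
    if origin is None:
        origin = vertices[0]
    ox, oy, oz = origin
    local = [(x - ox, y - oy, z - oz) for x, y, z in vertices]
    return local, origin
```

Keeping the chosen origin alongside the shifted model is what would allow a round-trip back to global coordinates, which the manual workflow otherwise loses.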

TESTING OF THE PROTOTYPES

Prototype 1: Herning-Holstebro Highway (Denmark)
The Herning-Holstebro highway project was selected as the first case to explore AR applicability in the field. Therefore, one of the goals was to actually build and test a working prototype which used the ARKit tracking framework in order to test the tracking ability in an outdoor environment. As can be observed in the video (www.vimeo.com/276430462), the movement of the combined view, consisting of a real-world video capture and an augmented virtual 3D model, is very smooth and stable. There was hardly any drifting in the model, and the interface developed turned out to work as intended. Even though the model contained around 1.4 million polygons, the iPad had no trouble handling the data.

Figure 4. Workflow of how to transfer design models from CAD software to a game design platform. This method was used in prototype 1.
Results from the manual geo pose method. As soon as the model is fixed into the real-world view, it remains very steady, and the iPad can easily be passed to another person without any major drift in the situated virtual model. It was also possible to turn around 360 degrees without any loss of orientation.
Compared to other systems we tested this was a very uplifting experience.
Results from information flow. The 3D model of the Danish highway chosen for prototype 1 had a high level of geometric detail but had no attribute data attached to its design objects, which in principle meant that it was not really a BIM model but merely a "dumb" CAD model missing any semantic definitions. Put another way, the model was well-formed, but not well-informed. In this case, though, it was sufficient because only the visual experience was in focus. However, transferring the design models from CAD tools to the Unity Editor turned out to be one of the most tedious challenges. As illustrated in figure 4, a manual workflow for transferring the design model from CAD software to the AR system was found, but only to obtain a one-way information flow, not an information round-trip. The model had been modified so much that any information attached to the design model would have been lost. It was evident that the data exchange process between CAD software and AR systems has to be improved considerably before AR can become practical in a day-to-day work environment.

Prototype 2: E18 Rugtvedt-Dørdal Highway (Norway)
The second prototype development differs mainly in data quality and enhanced user experience rather than in enhancements to the rather poor manual handling of the virtual model, which was not developed further. A video can be seen here: www.vimeo.com/276431890.
The 3D model of this project, situated in Norway, had an equivalently high level of geometric detail, but in comparison to the Danish 3D highway model it also had a lot of attribute data attached to well-defined design objects of infrastructure elements. The highway project used the design software Trimble Novapoint and Quadri, a cloud-based collaborative platform with its own object classification library for infrastructure elements. This represents in many ways an information model known from the building domain and suited for the open IFC data model format.
It was obvious to try to use this new possibility and use the identification and information retrieval of single objects in the model and in the AR view. This prototype development was aimed at the contractor and the construction phase, and therefore the development had to determine functionality the contractor could benefit from. It was decided to develop a kind of progress indicator for certain objects. This way it should be made possible to tap on specific objects in the view and indicate the state of progress during construction. The idea was to use the Quadri platform and a direct link to the model database, but this turned out to be impossible due to the lack of a suitable API, but also the fact that Unity and ARKit are not able to handle huge numbers used as coordinates, i.e. a placement far away from the origin.
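The coordinate limitation stems from game engines storing positions as 32-bit floats: at UTM-scale magnitudes the spacing between representable values grows to half a metre, so centimetre-level geometry is lost unless the model is shifted to a local origin. A small self-contained demonstration (pure Python, our own illustration of the effect):

```python
import struct

def as_float32(x):
    """Round-trip a value through 32-bit float storage, the precision a
    game engine typically uses for vertex coordinates."""
    return struct.unpack('f', struct.pack('f', x))[0]

# At a northing of ~6.5 million metres, the spacing between adjacent
# float32 values is 0.5 m, so the 0.25 m detail below is lost...
northing = 6_500_000.25
lossy = as_float32(northing)              # no longer equals the original

# ...while after shifting to a local origin the same offset survives.
local = as_float32(northing - 6_500_000.0)
```

This is why step 1 of the data exchange workflow (moving the model to a local cartesian coordinate system) is not optional for game-engine-based AR.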
The development during prototype 1 clearly showed a huge gap between the CAD design world and the game design platform environment. Therefore, efforts in prototype 2 focused on improving the data exchange workflow, trying to avoid the huge loss of information. A collaboration with the Danish software house Epiito, which at that point had already developed a software solution for importing IFC files and also developed VR/AR apps in Unity, led to prototype 2. Epiito had also developed their own cloud service solution (Epiito Cloud) for mobile VR applications. The original prototype 1 was then redesigned using Epiito's Cloud solution and IFC import.
It was investigated whether it was possible to make an IFC export directly from the Quadri model. It was, but the problem was that the IFC Road data model (IFC 5) is yet to be developed by buildingSMART [2]. The objects showed up as proxy elements with no name. Wells and pipes, however, are two objects that are also found in the IFC 4 data model, hence these two were defined as IfcFlowTerminal and IfcFlowSegment in the current IFC model. That meant the remaining road model and other elements had to be converted similarly to the workflow described for prototype 1, with a huge loss of data, since the data model was much richer in information than the one in Denmark.

Figure 5. Illustration of data exchange in prototype 2 using a combination of the IFC and BCF data models.
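In practice this comes down to a small, fixed mapping from utility object types onto the two usable IFC 4 entities, with everything else falling back to a nameless proxy. A hypothetical sketch of such a lookup (the dictionary keys and function name are our own; only the IFC entity names come from the text above):

```python
# Illustrative mapping from infrastructure object types to the IFC 4
# entities able to represent them; road-specific classes only arrive
# with IFC 5, so anything unmapped falls back to a proxy element.
IFC4_MAPPING = {
    "well": "IfcFlowTerminal",
    "pipe": "IfcFlowSegment",
}

def ifc_entity_for(object_type):
    """Return the IFC 4 entity for a given object type, or a proxy."""
    return IFC4_MAPPING.get(object_type.lower(), "IfcBuildingElementProxy")
```

Everything routed through the proxy branch loses its semantics, which is exactly the information loss described for the remaining road model.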
The user interface was optimized so the user could interact with the sub-utility elements by tapping on them, thereby reading the actual status and assigning a new status if desired. The prototype uses the BIM Collaboration Format (BCF) to transfer status information from the Epiito Cloud to the AR prototype. This new workflow is illustrated in figure 5.
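A BCF topic is essentially a small XML document. The following sketch generates one status message with the standard library; the element and attribute names follow the BCF markup schema only loosely, and the element name in the usage line is hypothetical, so treat the exact structure as illustrative:

```python
import uuid
import xml.etree.ElementTree as ET

def make_status_topic(element_name, status):
    """Build a minimal BCF-style markup document carrying a status
    update for a single sub-utility element."""
    markup = ET.Element("Markup")
    topic = ET.SubElement(markup, "Topic",
                          Guid=str(uuid.uuid4()),
                          TopicStatus=status)
    ET.SubElement(topic, "Title").text = "Progress: " + element_name
    return ET.tostring(markup, encoding="unicode")

# e.g. mark a (hypothetical) well as completed on site
xml_doc = make_status_topic("Well W-117", "Completed")
```

Because the payload is a small status document rather than geometry, it can travel between the cloud service and the AR client without touching the heavy model data.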
The prototype worked as intended, though the primitive manual positioning turned out to be less useful since this was on a construction site and therefore subject to temporal changes; for instance, the earthworks reshape the terrain continuously. This makes it evident to use preselected marked points whose spatial location does not change over time, like nearby buildings or bridges. This is certainly a minor thing, since a similar application in a professional edition would need GNSS-enabled positioning anyway. A minor flaw experienced was the lighting conditions: even though we chose video see-through technology, the sun reflecting on the screen could be annoying. A piece of cardboard helped to take care of the problem. The tracking of the model was not as smooth as in prototype 1. An obvious explanation is probably that the model in prototype 2 is six times bigger than in prototype 1, 7.8 million triangles in all for the iPad to handle. The lack of a horizon as visual tracking guidance could also have played a role, together with the many movements occurring on a construction site. The viewing experience, though, was still fully acceptable.

CONCLUSIONS
The paper asked: how can infrastructure information be accessed and handled using AR, and how can we interact with these virtual models and retrieve or add information using AR in the field? For that purpose, two prototypes were developed with focus on content, not technology. They were built with relative ease and should encourage others to take advantage of the current state of development and the free accessibility of AR SDKs. The presented AR prototypes combined an off-the-shelf tablet and the latest available mobile AR tracking framework from Apple. The iPad form factor has become familiar to most people and thereby provided a straightforward experience to access a model of a concrete construction site; therefore, the majority of users had no problem adapting to the "new tool". However, the user interface design and interactions with the device need to be improved before a finalized AR solution can be implemented by the consultancy companies. Both engineers and workers expressed that the AR prototypes had great potential but needed to be more user-friendly. Therefore, this is an area to improve in further development, along with conducting more structured experiments with professional partners in the research project.
From the test results it could be concluded that the tracking part, which is one of the most important visual perceptions, is almost solved. In close relation to tracking is global positioning and orientation, which still needs to be a more automated process before AR becomes useful in outdoor environments.
Looking ahead, these technological challenges are on the agenda of both small and major tech companies, and the recent announcements of AR-related products and software support from big tech companies like Apple, Google and Microsoft prove that AR is beginning to mature. Therefore, if we want to embrace the age of the fourth industrial revolution, we need to think of AR as a new medium in which we can interact with embedded digital objects placed in the physical world.