Pop-Up Factory: Collaborative Design in Mixed Reality

This paper examines a novel, integrated and collaborative approach to design and fabrication, enabled through Mixed Reality. In a bespoke fabrication process, the design is controlled and altered by users in holographic space through a custom, multi-modal interface. User input is live-streamed and channelled to a 3D modelling environment, on-demand robotic fabrication and AR-guided assembly. The holographic interface aims to promote human-machine collaboration: a bespoke pipeline translates hand gestures and audio into CAD geometry and numeric fabrication code, enabling non-professional participants to engage with a range of novel technologies. The feasibility of Mixed Reality for architectural workflows was tested through an interactive installation for the MakeCity Berlin 2018 festival, where participants experimented with on-demand design, fabrication and AR-guided assembly. This article discusses the technical measures taken as well as the potential of holographic interfaces for collaborative design and on-site fabrication.


INTRODUCTION
In this paper we argue that AR can improve human-machine collaboration in architecture-making. Holographic, multimodal interfaces simplify access to cutting-edge technology for the everyday user. Human operators can edit digital information for CAD/CAM using intuitive interfaces. Leveraging human capabilities to communicate via speech and hand gestures allows non-professionals easy access to 3D modelling and robotic fabrication. We examined the potential of AR representation and multimodal interfaces for the inclusion of non-professionals in architectural design and making, human-machine collaboration and multi-participant design. This approach was tested through a fully integrated cycle of holographic 3D modelling, on-demand robotic machining and AR-guided assembly that resulted in a collaborative installation for the MakeCity Festival 2018 in Berlin.

BACKGROUND
The 4th industrial revolution marks the fusion of novel automation with advanced means of communication and increased connectivity (Schwab, 2016).
Cyber-physical production, the joint work of human-human, human-machine and machine-machine agents, is the foundation for the increased productivity on which Industry 4.0 rests (Schuh et al., 2014). In the architecture field this is seen in the adaptation of tools originating from various industries: fabrication with automotive-industry robotic arms; visualization techniques originating in the gaming industry, such as AR and VR goggles. Synthesizing the two brings about precise, on-demand CAD/CAM fabrication on-site. Studies in robotic fabrication mark its performance as a prominent tool of the future worksite. Automation brings about increased process control, speed and precision of execution, and increased load-bearing capacities (Bonwetsch, Gramazio and Kohler, 2012). It enables continuous work in-situ as well as an increase in workers' safety (Keating, 2017). On-site automated fabrication using robotic arms has been studied for its potential in combined fabrication with several agents (Parascho et al., 2017). The latest design research and application of robotic-human fabrication at building scale is seen in the collaborative project DFAB House, of NCCR Digital Fabrication and ETH Zurich (ETH Zurich, NCCR Digital Fabrication, 2018).
Augmented Reality is advantageous in showing contextual information regarding design and fabrication. It allows hands-free access to information, serving as a guide for fabrication (Zhou et al., 2011), increasing product quality and reducing product costs (Brettel et al., 2014). In architecture, on-site visualization during the 3D modelling design phase promotes context-aware design, adaptation to site conditions (Zlatanova, 2001) and increased design performance (Popovski, 2014). Visualization is a key factor in collaborative design and public participation (Allen, 2011). Previous experiments in using AR interfaces for fabrication purposes were tested for professional and non-professional use (Weichel, 2014), executed at small scale (Do, 2010) and taught a skill by repeating a predefined design (Billinghurst, 2015). AR and VR have long been identified for their simulative advantages, training human agents in working with digital fabrication tools (Lin, 2002). The results of improper operation can be simulated without incurring the associated costs in terms of human injury and equipment repair.
Multimodal interfaces make use of humans' natural communication capabilities: speech, gestures and touch. They are advantageous in translating long-learnt motoric skills into machine-related skills in a short time (Arshad et al., 2017). Multimodal interfaces have been in use in the gaming industry, tracking motion and sound. Inputs are tracked using sensors (Nintendo Wii; Microsoft Kinect) or translated through remote controls (HTC Vive; Samsung Gear VR; Oculus Rift). The tracking of human motoric skills into machine code is seen in digital art-making (Hacmun, 2018) and in medical practice, allowing remote support from experts (Bodner, 2005). Human-body "outputs" translated into machine code have been used as artistic generators in art installations: heart-beats, neuronal activity and sound. The latter, an exploration in sound-driven pattern-making, serves as the basis for the installation discussed below (Betti, 2018).
A fully operational AR-based, multimodal system such as the one prototyped here has yet to be developed and tested at scale, but it seems to belong to a rapidly approaching future.
The use of a holographic, multimodal interface is tested for its possible impacts on in-situ design, fabrication and assembly. The workflow demonstrated here aims at the inclusion of non-professionals in a collaborative design and fabrication process. For this purpose, machine inputs are customized to fit the everyday user. In sum, the Pop-Up Factory system is composed of hardware, software and human agents.

METHODS
The Pop-Up Factory installation explores digital design, fabrication and assembly on-site using multiple technologies, controlled via AR. Hardware components used are a Microsoft HoloLens AR headset, an ABB 1200 robotic arm augmented with a custom-made hot-wire cutting end-effector, and a microphone. Software components used are Rhinoceros for 3D modelling, Grasshopper and Python for programming, and various Grasshopper plug-ins, among them Fologram and HAL Robotics. We discuss these in the following sections.
[Figure: The physical and holographic components of the Pop-Up Factory installation]
1) Holographic Interface: a bespoke AR user interface was designed for the design and fabrication enterprise. The installation consists of a workflow in three stages: design of the brick tower (CAD), fabrication of bricks (CAM) and manual assembly. The AR interface is used to control software and automated hardware, allows navigation between the different CAD/CAM stages and serves as a guide in the assembly sequence. The interface exhibits six applications, 'Action Buttons', each displaying an identifying icon:
1. Eye / Hand icon: shifts between two visualization modes: the built portion of the tower (hand) and the design of the overall tower (eye).
2. Control Points: controls the paths that determine the tower's growth patterns and overall form.
3. Audio Recording icon: starts and pauses an audio recording, used to texture the brick's exterior face.
4. Ruler icon: defines the portion of the design still available for modification ('above the line') or already decided ('below the line').
5. Send Fabrication Data: initiates the wire-cutting tool-path on the robot.
6. Show Assembly Instructions: displays the assembly guidance in AR.
Hand gestures, Air Tap and Drag, are registered by the headset's sensors, activating the interface. The Augmented Reality goggles display the proposed design on top of the structure.
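The routing from a registered gesture to an interface action can be illustrated with a minimal sketch. All names below are hypothetical illustrations; the installation itself routed events through Fologram and Grasshopper rather than standalone code like this.

```python
# Minimal sketch of an 'Action Button' dispatcher (hypothetical names).
# An Air Tap on a holographic button looks up and runs the bound action.

from typing import Callable, Dict

class ActionButtonDispatcher:
    """Maps interface buttons to CAD/CAM actions triggered by Air Tap."""

    def __init__(self) -> None:
        self._actions: Dict[str, Callable[[], str]] = {}

    def register(self, button: str, action: Callable[[], str]) -> None:
        self._actions[button] = action

    def air_tap(self, button: str) -> str:
        """Called when the headset registers an Air Tap on a button."""
        if button not in self._actions:
            return f"unknown button: {button}"
        return self._actions[button]()

dispatcher = ActionButtonDispatcher()
dispatcher.register("eye_hand", lambda: "toggled visualization mode")
dispatcher.register("record_audio", lambda: "recording audio for facade texture")
dispatcher.register("send_fabrication", lambda: "streaming tool-path to robot")

print(dispatcher.air_tap("record_audio"))  # recording audio for facade texture
```

The point of the sketch is the decoupling: the headset only reports which button was tapped, while the CAD/CAM side decides what that tap means at the current workflow stage.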
2) Architecture: the AR interface is used in the making of an architectural-scale brick installation. The design emerges from the assembly of individual bricks. Its gradual growth is subject to user input, with the following limitations: 1. The tower's dimensions respond to site conditions. 2. The base-to-canopy ratio is limited to ensure structural stability.
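The second limitation can be sketched as a simple validation step applied to each user edit. The ratio limit and function names below are illustrative assumptions, not values from the installation.

```python
# Hypothetical sketch of the stability constraint: the ratio between
# canopy width and base width is capped so that every user edit keeps
# the tower stable. The limit of 1.5 is an assumed value for illustration.

MAX_CANOPY_TO_BASE = 1.5  # assumed limit, not taken from the paper

def edit_is_valid(base_width: float, canopy_width: float) -> bool:
    """Reject user edits that would make the canopy overhang too far."""
    return canopy_width / base_width <= MAX_CANOPY_TO_BASE

print(edit_is_valid(1.0, 1.2))  # True  (mild overhang, accepted)
print(edit_is_valid(1.0, 2.0))  # False (canopy too wide, rejected)
```

Gating edits this way lets non-professionals move control points freely while the system silently enforces structural limits.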
3) 3D modelling: digital design has typically required previous experience with 3D modelling environments.
To include non-professional users in a collaborative design effort, a new, intuitive AR-based interface was developed. The new design pipeline uses sound inputs and hand gestures to generate and modify the tower's design. Holographic-interface 3D modelling takes place at two scales and is performed as follows: the design of the tower is attained with hand gestures, shifting control points; the design of the tower's façade pattern is generated by processing audio signals. Sound samples are divided into channels: tone, pitch and volume. Each façade element receives a unique pattern, specific to each user (Betti et al., 2018). A bespoke 3D-modelling pipeline discretizes the global form into bricks. The bricks differ in surface texturing and in the geometry of their contact surfaces (responsive to the tower's overall geometry), which generates the tower's curvature. User inputs are registered in the CAD/CAM environment and updated in real time with any new data. 4) Robotic Fabrication: on-demand fabrication, using a robotic arm placed in-situ, is controlled through the interface. A bespoke pipeline translates 3D modelling information into robotic machine code. The tool-path was designed to execute the bricks' joinery and façade details with minimal leftover material. Each tool-path generates five bricks, cut in black foam. 5) Assembly: users assemble the bricks manually. Assembly instructions are shown in AR and update as the structure grows. Once a series of five bricks is cut, animated arrows indicate its orientation, location and assembly sequence. Shifting between visualization modes allows users to evaluate the ongoing production process, inspecting the individual bricks and the overall structure.
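The mapping from a user's sound sample to façade-pattern parameters can be sketched as follows. The specific feature extraction (volume as RMS amplitude, pitch estimated from zero crossings) and the parameter names are illustrative assumptions, not the installation's actual pipeline.

```python
# Illustrative sketch: map a mono audio sample to facade-pattern
# parameters. Volume (RMS) drives texture depth; a crude pitch estimate
# (zero-crossing rate) drives ripple density. Both mappings are assumed.

import math
from typing import Dict, List

def audio_to_pattern(samples: List[float], sample_rate: int) -> Dict[str, float]:
    """Derive per-user pattern parameters from an audio recording."""
    n = len(samples)
    # Volume channel: root-mean-square amplitude -> depth of surface cuts.
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Pitch channel: zero-crossing rate as a rough frequency estimate
    # -> density of ripples across the brick face.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    pitch_hz = crossings * sample_rate / (2.0 * n)
    return {
        "texture_depth": rms,        # louder input -> deeper texture
        "ripple_density": pitch_hz,  # higher pitch -> denser ripples
    }

# A 440 Hz sine wave at half amplitude, one second at 8 kHz.
sr = 8000
wave = [0.5 * math.sin(2 * math.pi * 440 * t / sr) for t in range(sr)]
params = audio_to_pattern(wave, sr)
print(round(params["ripple_density"]))  # zero-crossing estimate, close to 440
```

Because every recording yields different channel values, each participant's bricks receive a distinct façade pattern, in the spirit of the sound-driven pattern-making the pipeline builds on.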

RESULTS
The installation lasted two days and was visited by the general public and colleagues, ranging in age and design experience. The following workflow was exercised: the installation space accommodated three production-related areas, designated for 3D design, numeric fabrication and assembly. A graphic tutorial exhibited on-site introduced visitors to the different holographic design modes and their related 'Action Buttons'. Participants were first introduced to AR space and operating gestures: wearing the AR goggles and practicing Air Tap and Drag commands and interface navigation. Taking on collaborative design, participants explored digital design and fabrication using the 'Action Buttons'. Each user in turn added to the global design of the structure, fabricating and assembling a series of five bricks.
Crowd participation was somewhat limited: while people were keen to try on the AR goggles and experience the interface and installation in AR, fewer took part in the design itself. We attribute this mainly to the limited time available for each visitor to interact with the installation.
Human-machine collaboration. Most interface 'Action Buttons' were reported as easy to use. Hand gestures shifting the 'Control Points' were sometimes not registered. This could be due to their location at the very top of the holographic space, missed by the sensors, and to reading errors caused by sun glare coming through the windows of the installation space.
The AR interface was extremely successful in enabling a productive interaction with the robotic arm, allowing continuous fabrication with no physical interaction between participants and robot.
[Figure: Overview of the installation through the holographic interface]
The hands-free, mobile interface simplified the free-form assembly of multiple discrete elements, a task that is otherwise extremely arduous. Visual feedback enabled design in context and accurate fabrication. The sequenced design and fabrication process realized for non-professionals was clear and easy to follow.
Human-human collaboration. The designed work-sequence had a large effect on the nature of the collaborative design practiced. Though the installation relied on multiple design inputs, its outcome was bound by design definitions made beforehand. These prerequisites were aimed at faster fabrication, minimal waste and keeping to the installation timeline. The bricks' dimensions, for example, were pre-defined, allowing individual input only on their 'façade' and joinery detail. Another example in which user input was bound to the installation's predefined logic is that the tower could only be designed collaboratively through aggregation.
Another interesting aspect of the participatory discourse that took place was the sequential nature of participation. Most participants wanted to contribute to the design only once. This outcome differs from what is usually seen in participatory design, where participants are encouraged to debate and alter the design until the final result is agreed upon by the majority of the group.
Our initial hypothesis was thus only partially verified. We still believe that AR can play a role in promoting collaborative design.

DISCUSSION
1. This paper outlines a novel, integrated and collaborative approach to design and fabrication, enabled through Augmented Reality, robotic fabrication and a multimodal interface. In this installation, the custom notational input usually required for 3D modelling was replaced with intuitive, accessible input. We contribute to the emerging field of production research by providing simplified access to innovative design and fabrication tools. Hand gestures and speech were used for multimodal interaction with CAD/CAM in a large-scale, multi-participant effort. Non-professionals easily followed visual cues in AR and utilized their existing communication skills to control 3D modelling and robotic fabrication with little training. In doing so, the focus of architecture-making shifts from forming the design itself to forming a platform for others to explore design. The public, now at the focus of architecture-making, participates in the decision-making creative effort and in manufacturing using cutting-edge technology.
For the architectural field this marks a potential shift towards public participation in practice and discourse. Through active participation, the public gains a better understanding of the impact of a design and has the freedom to alter it according to their wishes. 2. Recent innovations in the optics industry accelerate simulative abilities, enabling 3D representation in-situ. Augmented Reality gives access to plentiful, varied information. It is advantageous for experiencing a proposed design in the context of its site or end-user, for teaching skills to non-professionals and for quality control. In this installation, the AR interface was used for 3D design on-site, providing context to the structure; for remote, simple control over automated fabrication tools; for instructions enabling precise execution in manual fabrication; and as a means of assessing the assembled structure. We believe AR, as a generative and critical tool, has the potential to bridge gaps between plan and execution, bringing about adapted design. 3. The Digital Turn (Carpo, 2013) and parametric practice embodied a change in workflow. The use of AR leads a change in design representation, moving from 2D drawings to 3D elements. New ways of communicating architecture might have great consequences for the built environment: in the form of new geometries used and new practitioners in the field interacting with advanced technology.