A Framework for a Five-Axis Stylus for Design Fabrication

This paper proposes a new workflow between the design and fabrication phases through a novel framework centred on a stylus whose five-axis pose is tracked in real time by a single RGB-D camera. The urgent need to reinterpret design and fabrication tools, which are often misconceived as stages of a linear process, is discussed briefly. Just as industrial robots have become enablers of fabrication processes in the fields of architecture and construction, the necessity of providing a similar tool to reform the "design" process is underlined. A generic stylus with interchangeable tips is proposed; it affords an intuitive, unobstructed grasp and serves as the physical avatar that transforms into the virtual representation of a fabrication tool mounted on a six-axis industrial robot arm. The user's interaction with the apparatus is simulated on screen, and the user is notified of any errors as the interaction is translated into motion planning for a KUKA KR20-3 industrial robot.


Evolution of Tools for the Architect
As architecture evolved from its roots in craft into its modern status, the tools employed by the architect, and the architect's interaction with these tools, have shown great diversification. This evolution marks the shift from an autographic practice, in which the architect was exposed to the tools of fabrication, to an allographic system, in which the architect was expected to transform the data for fabrication into a notational structure through tools of abstraction. Mario Carpo pinpoints the exact moment of this new definition of architecture's allographic and notational status to Leon Battista Alberti's theory and his treatise De re Aedificatoria (Carpo, 2011).
Alberti strived to develop tools for exact reproduction through notational systems, as evidenced in both Descriptio Urbis Romae and De Statua. No matter how detailed they are, there is an inevitable drawback inherent in all notational systems: allographic tools can transfer only measurable data that can be encoded through a notational system. A certain degree of abstraction enters the making of the physical piece, as the data informed through the tools of fabrication is diminished in comparison to pre-allographic tools. A dichotomy is introduced into the process of making as the previously holistic process is remodelled into a linear two-stage approach in which, generally speaking, "design" precedes "fabrication". The status-quo adoption of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) systems by the majority of architects demonstrates that architects utilize these tools to further the Albertian Paradigm, deepening the chasm between these two stages and encouraging the ill-conceived linearity of design and fabrication.

A Hybrid Proposal
Appropriately used, digital tools can serve as a two-way link between design and fabrication. They allow the division between design and realisation, as well as the distinction between intellectual work and manual production, to collapse (Carpo, 2009). They allow the designer to tinker closely with the method of production, becoming almost a prosthetic extension, much like the pre-industrial craftsman and his pre-allographic tools. This fosters thinking through making, a method in which sentient practitioners and active materials can continuously correspond in the generation of a form (Ingold, 2013). Established properly, this reinterpreted interaction with digital tools can bolster their conviviality, an affinity towards the familiarization and control of production (Illich, 1973), which restores the bond between the designer, the tool and the fabrication. Designers can take comprehensive design decisions only when the shortcomings of the tools of abstraction are recognized and awareness is raised of informing the process through the limits of the tools of fabrication.
The industrial robot is considered an enabler, as it has industrially proven its ability to perform an unlimited variety of non-repetitive tasks reliably, providing the architect the flexibility to inform robotic fabrication through architectural experience (Gramazio, Kohler, & Willmann, 2014). Albeit not strictly anthropomorphic, industrial robots provide six degrees of freedom (DoF) of movement that can replicate the acts of a craftsman or a maker. The robot is merely a generic manipulator; the emphasis is placed upon the manipulation of the material and the tool (the end-effector).
In a similar approach, a generic apparatus is proposed for the designer to mediate between design and fabrication. Through its genericity, it enables an intuitive grasp that varies depending on the task at hand. Instead of serving a specific function, it molds into the virtual representation of any tool in a mixed-reality environment. As the designer takes design decisions using this physical avatar, which transforms into a fabrication tool of his choice in mixed reality, he intuitively programs a six-axis robot in the process, since his motions with the apparatus can be translated into motion planning for the industrial robot. Fabrication feedback on the design is instant, and the designer can quickly reflect on his process and iterate the design.

RELATED WORK
Previous attempts at mending the design and fabrication stages, with an industrial robot as a fabrication enabler in the field of architecture, have primarily focused on understanding various crafts and translating the gestures of craftsmen, such as those of a stone mason (Steinhagen et al., 2016) or a carpenter (Brugnaro & Hanna, 2017), to gather data to analyse and optimize the fabrication enabler for future processes.
A previous study focused on customized robotic tool-path creation through a tangible controller that mimics an industrial robot (Payne, 2011), similar to puppeteering a scaled model. Even though they provide intuitive motion planning for the manipulator, puppeteering and anthropomorphic teleoperation studies are disregarded, as they do not resemble how a craftsman interacts with his tools and are counterintuitive from this point of view.
Tracking user gestures and translating them into user interaction is a ubiquitous backbone of the field of Human Computer Interaction (HCI), and motion tracking is a mature field of research. Thus, the primary aim is not the development of a novel technique in the field, but its implementation in the aforementioned context. Developed as a novel cost-effective active stylus tracking method (Bubnik & Havran, 2015), Light Chisel restricts the user to a chisel grip. Tracking a stylus through a motion capture system, such as that of Vicon, is possible through the placement of custom markers on a target and their tracking. Although it is an established and precise method of tracking an object, it requires an expensive setup and is best suited for tracking larger targets. DodecaPen tracks an object with a monocular camera and a 3D-printed dodecahedron with binary markers placed on the object, providing precise tracking of a passive stylus with sub-millimetre accuracy (Wu et al., 2017), and offers a reliable methodology to build on in future experimentation. Another method that can be integrated is the fusion of a Magnetic, Angular Rate and Gravity (MARG) sensor with vision sensors for tracking orientation and placement (Chintalapalli et al., 2014). Even though most commercial HCI devices for mixed-reality environments are easy to get hold of and tinker with, they are omitted from consideration as they come in a pistol-grip form factor, a counterintuitive scenario for a craftsman. zSpace, a commercial product with a stylus interface working only in tandem with built-in tracking systems on dedicated displays viewed through custom glasses, is likewise neglected as a basis for developing the framework.

Figure 1 Proposed workspace for user interaction and its correspondent workspace for the industrial robot.
There is no prior study in developing an intuitive tool that links the gestures and actions of the designer with a generic apparatus that serves as an avatar for a virtual fabrication tool while translating data to form the motion planning of a six-axis robot.

SETUP

Workspace
An Intel RealSense D435, which utilizes active IR stereo cameras as depth sensors, is placed on a tripod on a table directly facing the user. Even though the depth camera can track objects up to a range of 10 metres, tracking starts to drift noticeably after a certain distance, and tracking objects within 30 centimetres also yields unreliable results. As such, the tracking space is limited to a depth of 30-110 centimetres. For convenience, the tracked space is also limited to a horizontal span of 80 centimetres and a vertical span of 80 centimetres. The projected tracking space is indicated on the table. A laptop is placed next to the depth camera to provide visual feedback to the user. A six-axis industrial robot arm, in this case a KUKA KR20-3, with a custom hot-wire cutter is used for the simulations and the fabrication. The tracked space correlates to the maximum reach of the specific robotic arm (r = 1600 mm), corresponding to a 1:4 scale conversion. The industrial robot is simulated as located at the centre of the base plane of the confined space. (Figure 1)
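The confinement of the tracking volume and the 1:4 conversion can be sketched as follows. This is a minimal illustration, not the implemented program: the function names are hypothetical, the volume is assumed to be centred on the camera's optical axis, and the frame alignment between camera and robot is assumed rather than shown.

```python
def in_tracking_volume(x, y, z):
    """True if a camera-frame point (metres) lies in the confined space:
    depth 0.30-1.10 m, with 0.80 m horizontal and vertical spans
    (assumed centred on the optical axis)."""
    return 0.30 <= z <= 1.10 and abs(x) <= 0.40 and abs(y) <= 0.40


def to_robot_frame(x, y, z, scale=4.0):
    """Scale a tracked point 1:4 into the robot workspace.
    The rigid alignment between camera frame and robot base frame
    is a separate, assumed calibration step not shown here."""
    return (x * scale, y * scale, z * scale)
```

Points failing the volume check would simply be discarded before being passed on for motion planning.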

Stylus Design
The design aim for the apparatus is to keep it as simple and as generic as possible, with the least amount of user guidance and material obstruction, to accommodate a variety of holding and grasping gestures with an intuitive hold. Considering that the stylus is expected to serve as a generic intermediary, and as no active stylus will be tracked, a Caran d'Ache Fixpencil 3 has been repurposed as the prototypical stylus body rather than designing and making a new stylus from scratch. This deliberate design decision also allows ease of mounting for interchangeable stylus tips. The end cap and the stylus tips are custom color-coded for tracking. (Figure 2) Utilizing the principles of a mechanical pencil, the stylus is intended to have 3D-printed tips that are interchangeable depending on the desired application.

APPLICATION
The initial test scenario for the designer to experiment with is (robotic) hot-wire cutting of foam. The designer's interaction with the stylus is tracked and simultaneously simulated for robotic motion control on a screen. Location tracking of the stylus within the aforementioned workspace is achieved through color-based object tracking using OpenCV. Center points of the tracked color blobs are marked as pixel locations in the RGB camera. Depth values matching these pixel locations in the stereo map are retrieved and appended as z-values for the marked locations. Real-time data is transmitted from a custom program to Rhinoceros 6 and Grasshopper.
The longitudinal axis of the stylus is defined by a vector through these points. In the case of robotic hot-wire cutting, the longitudinal axis of the stylus matches the wire direction, which becomes the Y-axis of the end-effector. The midpoint between the tracked points becomes the tool center point (TCP) for the manipulator. The collection of these TCPs, intrinsically informed through designer actions, encapsulates the robotic motion path, as the apparatus helps designate the tool XYZ coordinates and the rotations around these axes. The robotic motion path and its immediate simulation are displayed on a screen utilizing the KUKA|prc plug-in for Grasshopper. The user receives an error message in the case of out-of-reach, axis-singularity or self-colliding solutions and is prompted to revise the input. (Figure 3)
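A minimal sketch of this derivation, assuming the two tracked endpoints arrive as 3D points already expressed in the robot frame (the function and variable names are hypothetical):

```python
import numpy as np


def stylus_to_tcp(tip_pt, cap_pt):
    """Derive the TCP (midpoint) and the tool Y-axis (unit vector along
    the stylus) from the two tracked endpoints. Roll about this axis is
    not observable from two points alone, which is irrelevant for a
    rotationally symmetric tool such as the hot-wire."""
    tip = np.asarray(tip_pt, dtype=float)
    cap = np.asarray(cap_pt, dtype=float)
    y_axis = cap - tip
    y_axis /= np.linalg.norm(y_axis)  # normalize to a direction
    tcp = (tip + cap) / 2.0
    return tcp, y_axis
```

A sequence of such TCP/axis pairs, sampled over time, would constitute the motion path handed to the planner.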

RESULTS
As of this writing, there has been a limited number of trials, utilizing only the hot-wire-cutting scenario. However, the initial perception of the stylus is promising as an intermediary design input that is highly intuitive in translating and informing designer actions for fabrication. The workflow establishes a practical link between the design and fabrication phases as intended, and the visual feedback instantly informs the user about the actions taken.
Nonetheless, some imminent improvements are deemed necessary. The raw data needs to be filtered for jitter caused by the working principle of the depth camera sensor. Since this was a preliminary study to establish the groundwork for follow-up work, accuracy has not been a pressing issue. The workflow needs to be compared and tested against commercial motion tracking systems to prove its viability. As improvements in accuracy are implemented, correction for camera lens distortion should also be reviewed and accounted for.
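One lightweight option for the jitter filtering mentioned above is an exponential moving average per coordinate. This is a sketch of that general technique, not the filter used in the study, and the smoothing factor shown is arbitrary.

```python
class ExponentialSmoother:
    """One-pole low-pass filter for noisy position samples.
    alpha in (0, 1]: higher values track faster but smooth less."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.state = None

    def update(self, sample):
        s = [float(v) for v in sample]
        if self.state is None:
            self.state = s  # initialize on the first sample
        else:
            self.state = [self.alpha * new + (1.0 - self.alpha) * old
                          for new, old in zip(s, self.state)]
        return list(self.state)
```

The trade-off is latency: heavier smoothing lags behind fast stylus motion, so the factor would need tuning against the sensor's actual noise profile.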
It is noted that unexpected interaction with the stylus, grasping it solely from the ends, leads to obstruction of tracking, as a single depth camera is used as the data source. In addition, rotations along the axis of the stylus (roll) cannot be tracked consistently. While this was not a problem in the hot-wire cutting application, in which the affecting tool has perfect symmetry along its affecting axis, it should be resolved for other applications.

CONCLUSION AND FUTURE WORK
The research is an attempt at restoring the designer's association with the tools of fabrication as well as reinterpreting the tools of abstraction. The new stylus is neither a tool of abstraction nor a tool for fabrication; it is a new hybrid tool with both allographic and pre-allographic qualities, in which the dichotomy of design and fabrication is abolished. The physical articulation of embodied input and simulated output can enable a wider adoption of revamped technologies. Thus, future work will involve a comparative assessment of how the stylus and the workflow are adopted by different user groups, ranging from novice design students to professionals.
The integration of an inertial measurement unit, utilizing magnetic, angular rate and gravity data, is essential to increase precision, to filter jitter in the sensor data, and to provide additional data on the rotations of the stylus while tracked positions are obstructed. Furthermore, it will also permit consistent tracking of stylus roll. Other stylus tips will be implemented and user interaction will be analysed.
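As a sketch of how such a fusion could work for roll, a complementary filter blends the integrated gyro rate (smooth but drifting) with an absolute, drift-free reference such as a vision or magnetometer estimate. The gain value below is an assumption, and this is one common fusion scheme rather than the one to be implemented.

```python
def fuse_roll(prev_roll, gyro_rate, reference_roll, dt, gain=0.98):
    """Complementary filter: trust the integrated gyro over short
    intervals, and pull slowly toward the absolute reference to
    cancel drift. Angles in radians, gyro_rate in rad/s."""
    predicted = prev_roll + gyro_rate * dt  # dead-reckon from the gyro
    return gain * predicted + (1.0 - gain) * reference_roll
```

Because the gyro term dominates frame to frame, short occlusions of the tracked points would not interrupt the roll estimate.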
Presently, only visual feedback is provided to the user based on the interactions. The inclusion of even the most basic haptic feedback for error notices is expected to enhance user engagement in the process. Currently, all gestures are translated at a fixed scale. An intriguing aspect of a digital tool that blends design and fabrication would be its ability to surpass a confined scale and allow its user to experiment between different scales instantly.
Figure 3 User interactions yield TCP location and orientation over time, which is used for motion planning for the robot.