Cyber-Physical Emotive Spaces: Human Cyborg, Data, and Biofeedback Emotive Interaction with Compassionate Spaces

This paper aims to link human emotions and cognition to the built environment to improve users' mental health and well-being. It focuses on cyber-physical adaptive spaces that can respond to users' physiological and psychological needs based on their biological and neurological data. Through artificial intelligence and affective computing, this paper seeks to create user-oriented spaces that can learn from occupants' behavioral patterns in real-time, reduce users' anxiety and depression, enhance environmental quality, and promote more flexible human-centered designs for people with mental or physical disabilities. To achieve its objectives, this research integrates tangible computing devices and interfaces, robotic self-adjusting structures, interactive systems of control, programmable materials, human behavior, and a sensory network. Through embedded responsiveness and material intelligence, the goal is to blur the lines between the physical, digital, and biological spheres and create cyber-physical spaces that can "feel" and be controlled by the user's mind and feelings.


INTRODUCTION
We spend more than 87 percent of our lives inside buildings. Studies show that the built environment affects our behavior, thoughts, emotions, and wellbeing and has both direct and indirect effects on mental health (Cooper et al. 2011, Evans 2003). Sarah Williams Goldhagen argues in her book, Welcome to Your World: How the Built Environment Shapes Our Lives, that the built environment has a profound impact on people's lives. She believes that "There's no such thing as a 'neutral' environment and your built environment is either helping you, or it's hurting you" (Pedersen 2017). Her studies highlight the role of the hippocampus in consolidating long-term memories, controlling our spatial recognition, and housing place-recognition and even building-recognition neurons. She then argues that developing a long-term memory devoid of elements and experiences related to a familiar place is unlikely, since our spatial navigation depends on the same neural pathways through which we develop autobiographical memories. This reveals the importance of architecture and the built environment and how central they are in the formation of our identities. In her book, Goldhagen rethinks architectural design and education and attempts to link cognitive science to a new human-centered approach to the built world while exploring the science of cognition in relation to architecture (Goldhagen 2017). However, we do not yet know how this shaping occurs in both directions, what impact the built environment has on our neural activity, or how we cognitively organize the information necessary to design buildings (Fisher 2016).
Recently, there have been many efforts to explore what architecture and neuroscience can learn from each other. They encourage architects to take an inside-out approach and improve the design of the built environment by increasing their knowledge and understanding of human behavior and neuroscience. The Academy of Neuroscience for Architecture explores new connections between the two disciplines. In Mind in Architecture: Neuroscience, Embodiment, and the Future of Design, it is argued that, through cross-disciplinary efforts, architecture is now viewed and practiced through the lenses of architects, neuroscientists, cognitive scientists, psychologists, and philosophers, to name a few. This provides a historical context for examining the implications of contemporary architecture and imagining the future of architecture as a neuroscientifically informed practice (Robinson and Pallasmaa 2015). In Cognitive Architecture, Sussman and Hollander suggest new ways to analyze designs before they are built, allowing the designer to anticipate a user's future experience, such as face-processing in the human brain (Sussman and Hollander 2014).
Nevertheless, all these studies focus on rigid, conventional architecture and on criteria for designing a built environment that accords with the cognitive principles people need and seek in buildings. They focus on how the built environment can shape users' identities in a one-way interaction. In my research, by converging adaptive architecture, the human cyborg, and interactive systems of control, I aim to address how cybernetics can bind space and the human into a two-way weave of interactions, shaping the new "human_space_cyborg" and transcending normal limitations. I explore how users can impact and shape the built environment through their brains and thoughts. This research explores the potential of a built environment that can change, respond, and adapt to users' emotional and cognitive needs in real-time.

THESIS
Will the kinetic and adaptive architecture of the future be able to convey information about the feelings, thoughts, and activities of users within a space and address problems of cognition or mood? By collecting biological data, how can smart environments measurably improve the well-being of occupants by autonomously responding to their anxiety or depression? What is the impact of such spaces on reducing stress and addressing comfort needs and PTSD symptoms, as measured by affective computing? Through this interconnected world of humans, mind-expanding technologies, and sensory environments, how could the built environment become an extension of our mind and body? In this research, I explore the answers to these questions.

LITERATURE REVIEW AND GAP
Nowadays, technology is fully integrated into our daily lives and has become a necessary and unavoidable part of our existence. The increasing practicality of the Internet of Things (IoT), artificial intelligence (AI), innovations in materials science, algorithmic design capabilities, advanced analysis of human factors, and the integration of new communication tools push our physical environment to the verge of becoming an extension of the Internet (Atzori 2010). The Internet of Things and related computational technology merge seamlessly with the goals of adaptive architectural systems, providing tools to enhance the environmental quality of buildings and promote more flexible, human-centered designs. Despite the seamless integration of IoT and the high level of efficiency and sustainability introduced through adaptive thinking, the relationship between our psychology, our thoughts, and the built environment remains far less explored. Absent is the psychological factor. Future architectural design requires solutions that integrate people, structures, and sensing technologies to arrive at successful human-computer interaction (Beilharz 2005). In his review of Yiannoudes's book Architecture and Adaptation: From Cybernetics to Tangible Computing, Alhadidi remarks on how Yiannoudes overlooks the broader discussion of the impact of advances in emotional computing (Alhadidi 2017).
Despite the frequent presence of AI in our daily lives, architectural spaces remain largely unchanged because of a separation between architecture as an object, on the one hand, and the needed AI "brainpower" with the proper interface, on the other (Alhadidi 2017). We need a different category by which to think of spatial significance and to animate spaces with emotions and thoughts. The active interaction of users, as active subjects, through situated and embodied exchange can give meaning to space (Yiannoudes 2016). Annual reports from the Alzheimer's Association show that cognitive impairment is on the rise (Alzheimer's Association 2019). Laura Malinin, an architect and cognitive scientist, and her colleagues have researched how the role of enriched environments, offering novelty, challenge, and engagement, in improving cognitive health is undervalued compared to the role of physical activity. For example, according to architect Michael Chapman, most of a stroke victim's recovery happens at home. This shows the importance of designing residential environments to the same standard applied to successful institutional environments, with effective use of nature, daylight, and color, as well as navigable interiors. Along with these engaging factors, such spaces should also provide a degree of challenge to serve their purpose of rehabilitating people with cognitive damage, such as stroke victims (Fisher 2016). These findings highlight the necessity and importance of smart, compassionate, adaptive spaces in improving the mental health of inhabitants.

OBJECTIVES AND APPLICATIONS
The objective of this paper is to fill this gap by developing tangible reciprocities between feelings and thoughts, on the one hand, and space, on the other. It emphasizes the potential role of human thoughts, feelings, and emotions as a means of modifying atmospheric conditions. By using transformable structures and affective computing, it fosters a process in which the brain triggers responses in buildings and, therefore, produces "compassionate spaces". In the words of Alhadidi, these compassionate spaces are "adaptive machine[s] that can sense, respond to, and learn from stimulus (emotions and thoughts) and provide the end user with the ability to control and modify elements through tangible computing devices and interfaces that provide a 'friendly' relationship between the system and users" (Alhadidi 2017). This research makes use of adaptive systems, interactive systems of control, and a sensory network, altogether set into motion to help people without physical human help. Through cyber-physical environments, it could mean the mobilization of a population now dormant, empowering them by understanding and accommodating their cultural, gender-specific, and ethnic needs along with their personal preferences. It contributes to the design of future spaces that can be considered living organisms, as they can be controlled by users' thoughts and feelings, learn users' behavior, and respond to their deeper needs and desires in real-time. It examines design and technology and their impact on an under-represented group of people, offering them more equality, independence, and a chance to change their environment based on their needs. It also focuses on providing places and spaces that can respond to emotional deficiencies and create balanced, supportive, human-centered environments.
Among its many attributes, it has significant implications in the medical field, providing augmented assisted living for people with physical disabilities, motor system disorders, and neuromuscular diseases, as well as the elderly, ultimately empowering them to regain control over their environments and live more equal and independent lifestyles. It can autonomously respond to needs as they ebb and flow, instead of designating human help at unnecessary hours. It can also make caregiving institutions and individuals aware of the feelings of people with PTSD and autism. Thus, the built environment not only addresses symptoms for treatment but also plays a preventative role, informing caregivers about afflictions before they manifest. For example, it can help children with Autism Spectrum Disorder (ASD), whose nervous systems cannot filter sensory input to determine an appropriate response, regain that ability by amending their sensory regulatory environment and integrating physical and visual feedback. These children will be able to communicate better with the outside world, overcoming an over-intensified sensory experience and a dysregulated state.

EMOTIONS, COMPASSION, AND CYBER-PHYSICAL SPACES
This research is built upon the premise that compassion might find expression in the built environment and that we are moving closer to cyber-physical spaces. A growing body of evidence suggests that compassion is a natural and automatic response central to survival and vital to good health (Goetz et al. 2010). As Dacher Keltner proposed, "compassion is an evolved part of human nature, rooted in our brain and biology" (Seppala 2013). Compassion engages specific patterns of neural activation. As the state of compassion is associated with expressive behavior and physiological response (Goetz et al. 2010), this research explores the ways in which emotional data can communicate with the built environment. The aim is to create a cyber-physical, responsive environment that can help users heal and feel better by correlating their emotional and neurological data with changes in the environment such as light, natural air, color, and new function. Andy Clark, philosopher and author of Natural-Born Cyborgs, believes that humans are natural-born cyborgs: "not in the merely superficial sense of combining flesh and wires but in the more profound sense of being human-technology symbionts" (Clark 2004). He believes that through this symbiosis of the biological brain and nonbiological circuitry, there is a mutual interaction and transition between our mind and space. As cyborgs, we already possess the circuitry to do so; we only need to activate it. This research relies on the human brain's openness to information-processing, mind-expanding technologies, and sensory environments to blur the line between the physical, digital, and biological spheres, and rethinks the built environment as part of the mental apparatus of the user. It aims to create a reciprocal relationship between the human mind and the built environment, each shaping the other.

Figure 1
Predefined configurations assigned to the adaptive wall prototype, based on psychological considerations, to decrease the level of stress.

Emotions "are adaptations to particular survival related situations" (Ekman 1992). According to the Affective Computing group at the MIT Media Lab, emotions are fundamental to everyday human tasks and experiences. They influence cognition, perception, and the way we learn, communicate, and make decisions. Our senses filter external information into useful thoughts, feelings, or actions. Emotions are the result of activity in the brain, which, when fired up, signals calls for decisions regarding what is going on around us (Schwarz 1990). But if emotions are necessarily part of who we are, how might we measure them? Feelings are indeed difficult to communicate. Many times we communicate with one another in the absence of words, making it clear that nonverbal, synthetic, holistic, and gestalt means, such as body language, facial expression, and voice intonation, are the best for processing emotion. Across radically different cultures, emotional expression involves similar behaviors in body, face, and voice that signal specific intentions and motivations (Matsumoto et al. 2008). Paul Ekman, the American psychologist, conducted seminal research on the correlations between emotions and biological signals and discovered psychophysiological differences across emotions, such as the relation between emotions and facial expression (Ekman 1993). Research on emotion-specific physiology and on emotion-specific autonomic and central nervous system activation (affective neuroscience) enables us to correlate emotional biological signals with their neurological counterparts, extracting emotional data from biological signals. The key is to translate all those signals into an emotional status.
The objective of this experimental model is to implement visualization, simulation, and fabrication technologies to arrive at a design method that influences our sense of space by intertwining the physical, artificial, and virtual dimensions of space. To achieve the goal of this project, I plan to pursue the following objectives: (1) measure and analyze biosignals; (2) translate the signals into changes in an adaptive structure.

METHODOLOGY
My method for creating responsive cyber-physical environments involves tangible computing devices and interfaces, robotic self-adjusting structures, interactive systems of control, programmable materials, human behavior, and a sensory network. In my experimental model, biological and neurological signals are essential to identify the precise emotions and feelings of users, and they address the challenges of depression, autism, and PTSD. My approach is to use affective computing to create reconfigurable spaces that use occupants' biosignals (e.g., heartbeat, perspiration, skin conductance, sweat secretion, muscle tone, and body temperature) to autonomously respond to their feelings and needs. Examples of these responses include changing the size, location, and shape of a window, a wall, a display, or the overall shape of the space. For example, if the occupant feels hot or depressed, the space increases natural light, provides fresh air, and offers a view to ameliorate the occupant's condition. (Figure 1) Research has established relationships between mental health and factors such as fresh air, natural light, housing quality, privacy, furniture configuration, and personal control (Wells and Harris 2007, Evans et al. 2003). To achieve the goal of making our thoughts materially tactile, this research requires a multi-faceted approach. Part of it is engaged in a sensory network for collecting data and understanding the human condition, while other parts involve structures, actuation systems, and materials that can respond in kind. This project is produced at the convergence of architecture, computer science, psychology, neuroscience, and material engineering.

Figure 2
The figure shows the low-to-high frequency power of the RR-interval spectra (top) and its variation (bottom), two important heart rate features in the classification of valence (right) and dominance. We used these data to correlate biological signals with emotions and to change the built environment according to the user's emotions.

Measure and Analyze Biosignals
Building upon the premise that the body is a continuum between the environment and the mind, in this project we used technology to measure biometric data in order to identify emotions and actively reconfigure the physical space. Human emotions are complex and varied. Two orthogonal dimensions, known as valence and arousal, are commonly used to capture the broad range of human emotion. Valence denotes the pleasure-displeasure, or positive-negative, aspect of an emotion, while arousal captures its corresponding activation or excitement. (Figure 2) While regions within the brain's limbic system are primarily responsible for emotions, different emotions give rise to measurable changes in physiological signals such as skin conductance and electroencephalography (EEG), which can then be analyzed for emotion recognition. Our autonomic nervous system plays a role in enabling emotion-related reactions (Janig 2003). When correlated with sympathetic autonomic nervous system activation, emotions can be extracted from neurological signals. In our model, using affective computing, biological and neurological signals such as heartbeat, perspiration, brain voltage, skin conductance, sweat secretion, and body temperature are measured to identify precise emotions.
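The valence-arousal plane can be discretized into coarse emotion labels, which is the simplest way to turn the two continuous dimensions into actionable states. The sketch below is a minimal illustration of that quadrant mapping; the threshold at zero and the four labels are illustrative assumptions, not the study's actual classifier.

```python
def quadrant_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1]^2 to a coarse emotion label.

    The four quadrant labels are conventional examples from the
    circumplex model of affect, not this project's exact taxonomy.
    """
    if valence >= 0 and arousal >= 0:
        return "excited"     # positive valence, high arousal
    if valence >= 0:
        return "relaxed"     # positive valence, low arousal
    if arousal >= 0:
        return "stressed"    # negative valence, high arousal
    return "depressed"       # negative valence, low arousal


print(quadrant_emotion(0.6, 0.7))   # positive and activated -> "excited"
```

A downstream controller can then key predefined wall configurations (as in Figure 1) off these discrete labels rather than raw signal values.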
The project equips users and the environment with smart embedded devices (such as the Empatica E4 smartwatch and the Emotiv EPOC+ 14-channel EEG headset) that can collect users' real-time behavioral data, mental and physical, within the realm of IoT.

Biological Emotion Detection.
Empatica's E4 wristband was used to collect heart pulses and skin conductance, revealing the inner perception of outer reality. Data collected with the wristband can be streamed through Bluetooth and viewed live on the E4 real-time app. By downloading the data in CSV format (raw numbers), we analyze it and extract emotions. In particular, data for galvanic skin response (GSR), blood volume pulse, and skin temperature appear to contain patterns correlated with emotions. Changes in these biological measurements are indicative of factors such as excitement, arousal, or stress. Real-time data from the wristbands is run through the trained model, and the predicted emotion is relayed to the aforementioned server (which in turn causes the physical model to respond to the predicted emotional state). The data for training machine learning models are collected both in the lab and in longitudinal fashion through normal daily living activities, and have been used for running local and cloud (AWS) training experiments. (Figure 3) To translate biosignals into an emotional status, we use fuzzy logic and machine learning techniques, e.g., deep neural networks, Random Forest, and Decision Tree, to create emotion recognition algorithms; we view the data using the PyCharm IDE and Python. Interactive ground-truth data collection solutions are used for training these algorithms. Comparing the accuracy of different machine learning algorithms, the results show that applying a Decision Tree to sensor data from wearable wristbands yielded promising preliminary results (86% accuracy) in recognizing six basic human emotions and mental states. We tracked emotion by combining skin conductance and heart rate measurements, and we developed a multi-timescale state-space model incorporating both of these point processes to track both valence and arousal.
Skin conductance and heart rate provide complementary information regarding emotion and can be combined to provide a better estimate of mental state. (Figure 4)

Figure 3
(b) The highlighted row shows how biological metrics reacted during intense negative emotion: GSR and HR means both spiked, and IBI was at its lowest, which indicates stress. (c) Mapping emotions onto biological data. (d) Wall reconfiguration and emotional interaction based on changes in biological data.
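A Decision Tree over wristband features of this kind can be sketched in a few lines with scikit-learn. The feature layout (GSR mean, HR mean, IBI minimum, skin temperature mean), the two-class labels, and the synthetic labeling rule below are all illustrative assumptions standing in for the study's real dataset; only the modeling step (fitting `DecisionTreeClassifier` and scoring held-out accuracy) reflects the method described above.

```python
# Sketch: Decision Tree emotion classification on synthetic wristband features.
# Feature names and the stress-labeling rule are hypothetical stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 300
# Columns: [gsr_mean, hr_mean, ibi_min, temp_mean], standardized units.
X = rng.normal(size=(n, 4))
# Synthetic ground truth: elevated GSR and HR with depressed IBI -> "stress".
y = np.where((X[:, 0] > 0) & (X[:, 1] > 0) & (X[:, 2] < 0), "stress", "calm")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

With real wristband data the same pipeline would be fed windowed feature vectors and the six emotion labels collected through the interactive ground-truth tools.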

Neurological Emotion Detection.
Catherine Malabou, in her book What Should We Do with Our Brain?, explains adaptability as plasticity, a quality by which our brain changes throughout our lives. She believes that our built environment can be impacted by transformative new findings in neuroscience, such as how the expressions and gestures we display on the surface reflect the neurocircuitry deep inside our brains (Malabou 2008). Accordingly, we used the Emotiv EPOC 14-channel electroencephalography (EEG) headset to collect emotional data from the brain through voltage measurements at the scalp. There are multiple works on using EEG signals to drive motors, such as servo motors controlled with the arm (Somer et al. 2016), a wheelchair controlled with hand movement (Huang et al. 2012), and actual movement produced by imagining the action (Guger et al. 1999); these works target physical activity rather than emotion. The Emotiv Community SDK provides an API for extracting certain pre-processed information (from the raw data) about the wearer's brain, in the form of performance metrics related to the user's emotional state, such as Engagement/Boredom, Long-term Excitement (arousal), Instantaneous Excitement, Focus (attention), Interest (valence), Relaxation (meditation), and Stress (frustration). Ground-truth data provided by the user are used as training samples to infer the correlation between the collected data and each performance metric. In addition, frequency bands measure the speed of information processing and interaction with other regions of the brain. Four band measurements, namely Theta, Alpha, Beta, and Gamma, are available through the SDK for every sensor, and each band is related to a specific emotional status. We focused on phase synchronization, coherence, and the correlation between signals (as opposed to data from a single sensor) to distinguish emotional state.
By using performance metrics, frequency band measurements, ground-truth data provided by the user, and fuzzy logic, we are trying to decode emotions from the brain and correlate them with changes in the built environment. At this point, we can change the wall configuration according to relaxed, excited, and engaged states. Figure 5 illustrates the correlations between the band-wave measurements and the ground-truth data: theta waves (the largest spikes in the first graph) are well correlated with excitement, while alpha waves (green) are indicative of a relaxed state. For engagement, the 'stress' performance metric proved indicative. We used these three measurements to indicate the state and, therefore, the wall configuration. (Figure 5)
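Estimating per-band EEG power is the step underlying these band-wave correlations. A minimal sketch, assuming Welch power spectral density estimation and the conventional band edges (the SDK's internal computation may differ), with a synthetic alpha-dominated signal standing in for a relaxed recording:

```python
# Sketch: per-band EEG power via Welch's method; band edges are standard
# conventions, the sampling rate matches the Emotiv EPOC (128 Hz).
import numpy as np
from scipy.signal import welch

FS = 128  # Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}


def band_powers(signal: np.ndarray, fs: int = FS) -> dict:
    """Sum the PSD within each conventional EEG frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}


# Synthetic 10 s trace dominated by a 10 Hz rhythm, as in a relaxed state.
t = np.arange(0, 10, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.default_rng(0).normal(size=t.size)

powers = band_powers(eeg)
print(max(powers, key=powers.get))  # alpha band dominates this signal
```

The same per-sensor band powers can then be compared across channels (phase synchronization, coherence) rather than read off a single sensor, as described above.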

Translate Biosignals into Actionable Changes in an Adaptive Structure
The proposed architecture is a flexible and reconfigurable structure that can take on different configurations through active shapes and kinetic components. To accomplish this objective and create experimental spaces that correlate user emotion with the motion of the building, we started with a wall as a case study, using embedded responsiveness and material intelligence. This prototype can change its shape, size, and location and open up its surface to a view, to control light and natural ventilation, or to ameliorate the occupant's condition. Once all the elements come to full integration, the same method could be applied to larger, inhabitable structures. Here the built environment is treated as a learning machine capable of feedback cycles, receiving data and implementing change as needed. To achieve an adaptive wall that can respond to the user's emotions, we followed the phases below:

Identify Maximum Flexibility in the Adaptive Wall.
In this phase, we designed adaptive structures that can change according to emotional data. An intelligent adaptive architecture can vary its mobility, form, and geometry. It has three main components: structural engineering, embedded computation, and flexible mechanisms. Maintaining stability while allowing flexibility is a major challenge in adaptive structures: the stability of a building requires rigidity and determinate forces, as opposed to the variable forces in flexible structures. To identify the balance between flexibility and stability, we used simulation tools such as Grasshopper, Kangaroo Physics 2, the Ivy plugin, and Python to design, simulate, and test a series of kinetic components, deployable structures, and transformable modules for different configurations. We computationally analyzed the wall's structure to detect weaknesses and optimize its form by applying different combinations of stress distributions, torques, bending patterns, natural forces, hinge forces, and plasticity. The algorithmic logic is used to perform predefined operations, calculate the results, run simulations, evaluate the design strategies, and subsequently generate an optimized parametric design output. (Figure 6)
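The kind of hinge-torque check run inside those simulations can be illustrated with a simple static case. This is a deliberately reduced stand-in for the Grasshopper/Kangaroo analyses above, assuming a uniform flat panel rotating about a bottom hinge; the mass, length, and angle values are purely illustrative.

```python
# Sketch: static holding torque at a bottom hinge for a uniform panel
# tilted away from vertical. Weight acts at the panel's midpoint.
import math


def hinge_torque(mass_kg: float, length_m: float, angle_deg: float,
                 g: float = 9.81) -> float:
    """Torque (N*m) an actuator must supply to hold the panel at angle_deg.

    Moment arm = (length/2) * sin(angle), since the panel's center of
    mass sits at half its length from the hinge.
    """
    return mass_kg * g * (length_m / 2) * math.sin(math.radians(angle_deg))


# A 4 kg, 1 m panel tilted 30 degrees from vertical:
print(round(hinge_torque(4, 1.0, 30), 2))  # 9.81 N*m
```

Sweeping the angle with such a function gives a first-pass torque envelope for sizing the servos or linear actuators discussed in the next phase.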

Identify Proper Actuating Systems and Materials.
We also identified proper actuating systems and explored the potential of using new programmable materials and soft robotics as actuators. We explored three main approaches to the actuating system. First, mechanical actuators, e.g., servos, stepper motors, and linear actuators. Second, programmable materials and soft robotics, such as shape memory alloys (SMA), conductive materials, and pneumatic systems; here we explored the potential of smart materials that can be changed by external stimuli such as temperature, moisture, or electricity, in a controlled fashion, to activate kinetic components. Third, activating inherent material flexibility to adapt to a new configuration by (i) geometrical changes, such as adding porosity to wood (e.g., cutting holes) to activate its flexural potential; and (ii) integrating rigid and flexible materials to create composites that simultaneously have both properties, e.g., rigid, stiff resin can be used at the base of a wall module, which requires stability, while soft materials can be used on its outer parts for flexibility. (Figure 7)

Figure 6
Computational analysis of the wall's structure to detect weaknesses, and tables representing the stability of the adaptive triangular wall for various torques (left) and varying angular drag (right).

Figure 7
Study of SMA (Shape Memory Alloy), conductive materials, pneumatic systems, mechanical actuators, and soft robotics for actuation and structural transformations.

Correlate Emotions with Structural Changes.
In this phase, we mapped the cognitive and emotional state of occupants to structural changes. The wall should automatically offer additional openings, increasing natural light and ventilation to moderate the subject's emotional status. We examined valence and arousal responses corresponding to different structures. A state-space approach, in combination with machine learning techniques, can detect changes in biomarkers. We then mapped a relationship between a person's feelings and the optimal configuration of the structure around them. Once a mapping from architectural configurations to emotional changes was established, we implemented the reverse process for real-world deployment, i.e., making configuration changes to the adaptive wall via actuators to drive an occupant's emotions toward a pleasant or appealing memory of a feeling. We collected data in my lab, where subjects were shown media to provoke feelings; the walls then communicated via a REST API with a Django server storing the emotional data. A change in the latest stored state triggered a change in the walls or in the color of the lights. We used Raspberry Pi devices to poll the emotional data and control the actuators. (Figure 8)
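The Raspberry Pi side of this loop can be sketched as a change-detection step that maps a new emotional state to an actuator position. The endpoint URL, JSON shape, GPIO pin, and state-to-position table below are illustrative assumptions, not the project's actual API; only the poll-compare-actuate pattern reflects the deployment described above.

```python
# Sketch: map a polled emotional state to a wall-actuator position.
# The states and positions are hypothetical examples.
STATE_TO_POSITION = {"stressed": 1.0, "neutral": 0.0, "relaxed": -1.0}


def decide_position(state, last_state, mapping=STATE_TO_POSITION):
    """Return a new servo position when the stored state changed to a
    known value, or None when no actuation is needed."""
    if state != last_state and state in mapping:
        return mapping[state]
    return None


# On the device, a loop would call this roughly once per second, e.g.:
#   import time, requests
#   from gpiozero import Servo            # Pi-only dependency
#   servo, last = Servo(17), None         # pin 17 is an assumption
#   while True:
#       state = requests.get("http://<django-server>/api/emotion/latest/",
#                            timeout=5).json().get("state")
#       pos = decide_position(state, last)
#       if pos is not None:
#           servo.value, last = pos, state
#       time.sleep(1)
```

Keeping the decision logic in a pure function lets the mapping be tested off-device, while the hardware loop stays a thin wrapper around the Django server's latest stored state.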

Figure 8
Illumination and operation of the wall prototype using user's emotional data, Emotional interaction via face gesture, vision commands, mobile app, and wall reconfiguration based on facial, tactile, and visual emotional expressions.

FUTURE DEVELOPMENT
In addition to reevaluating and improving what has been done, further development will focus on the cognitive and sensory network infrastructure to autopilot tasks for the space. Smart spaces will learn from users and change the environment based on users' behavioral patterns. This machine learning analysis and evaluation of successful interactions would let the system gain experience in order to optimize its prioritized tasks, such as user healing, user comfort, and adaptive structural stability. Future development can take advantage of machine learning techniques to endow spaces with the personality and character of the user.

CONCLUSION
This paper addresses the ways in which data-driven design strategies can autonomously respond to human needs within human-computer interaction in adaptive spaces, to improve the mental health and well-being of occupants. By focusing on affective computing, this research presents an alternative method by which to transcend the limitations of our spaces and empower humans to accomplish tasks they are currently unable to perform. This research pushes the boundaries of what can be achieved to improve the human affective experience within space. This approach is a framework for transforming the built environment into a living organism that is networked, intelligent, sympathetic, sensitive, and adaptive, yet under the comprehensive control of the user. Such life-like behavioral spaces influence personal health as autonomous, heterogeneous agents. The outcome of this ongoing research is a platform for designers to study and create new possibilities for architecture in the current technological moment, in collaboration with other disciplines.