.touch the heart.

your heart is more than one and less than two

We often dissolve ourselves into stereotypes, belonging to a social class, a gender, a particular culture. Each of us is able to give up part of herself by imitating others, only to achieve perfection. Yet each of us builds her own account of her existence, facing configurations and behaviours that are difficult to predict. A permanent oscillation of actions and reactions influences our personality and shapes the world in which we all live. What we feel is the central question in understanding human relations. This implies an open dialogue, and examining the dark corners of our soul. How can we understand other beings, or hope that they can understand what we ourselves cannot?

INTERACTIVE INSTALLATION

2020

The Touch the Heart interactive installation is built around two "hearts" hanging from the ceiling, with which people interact: pulling or pushing them, letting them oscillate, and possibly collide. A Kinect sensor detects both the hearts and the participants. Visuals are projected from the ceiling onto the floor, created in real time by a computer program. Unlike many projects that perform projection mapping onto objects, in Touch the Heart the visuals are synthesised to surround the shadows that the moving hearts cast on the floor, shadows which themselves result from the projection. Based on the position of each shadow, circular waves are created, which become linked when the shadows approach each other. The program also responds to heart collisions and to the relative positions of participants and hearts, creating zones and bonding effects between them. The whole rendering varies over time, in a day-and-night cycle. Sound effects are also synthesised in real time, following the hearts' movements and collisions and the bonds between participants.

Background

Touch the Heart originated in the summer of 2017, intentionally as a web-based project. Curated by Doreen A. Ríos for The Wrong - New Digital Biennale, the online version had its premiere at Empty vessels || Vulnerable bodies. It was also exhibited at FILE ONLINE 2018, and is now at: arianing.eu/touch-the-heart

The decision to start this project online was a deliberate protest against the situation endured by many artists in our city, where it is so hard to find a place to present any interactive or performative artwork. Even in non-conventional places, we spend our time and money for only a few people to participate. The online space is the only one still left. But we need more. Artists need to push the boundaries of audience experience.

The current incarnation of Touch the Heart is a physical installation and a completely new artwork. It is very different from the online version, not only because it is an actual installation, where the shadows of the hearts play an important role, but also because it required writing the software from scratch.

I want to thank Eurotux for providing us with a place to deploy, test and exhibit the installation. Even though it was not fully suited to the artwork, having it available was very important. It remains difficult to find adequate places for exhibiting such interactive installations in our city.

Hardware

computer, projector, Kinect sensor, 2 speakers, 2 "hearts" and ropes

Software

Rust, JavaScript, OpenGL Shading Language (GLSL), Gibberish

Software Architecture

The software for the project follows a client-server architecture, written in Rust, JavaScript and GLSL. The server, written in Rust, receives sensor information from the Kinect, processes the depth map and video information, and generates information for the client. The client, written in JavaScript and GLSL, runs in a browser, communicates with the server, produces the visuals and synthesises the sound. Visuals are produced by a custom GLSL shader, and sound synthesis uses the Gibberish library.
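As a rough illustration of this protocol, the per-frame message the server pushes to the client could look something like the sketch below. This is a hypothetical schema written with serde derive macros; the field names and structure are illustrative only, not the project's actual message format.

use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct Point { x: f32, y: f32 }

// Hypothetical per-frame update from server to client.
#[derive(Serialize, Deserialize)]
struct FrameUpdate {
    shadows: Vec<Point>,             // estimated floor positions of the heart shadows
    participants: Vec<Point>,        // detected participant positions
    collisions: Vec<(usize, usize)>, // heart pairs that collided this frame
}

fn main() {
    let update = FrameUpdate {
        shadows: vec![Point { x: 0.3, y: 0.7 }, Point { x: 0.6, y: 0.4 }],
        participants: vec![],
        collisions: vec![],
    };
    // serde_json turns this into the JSON text sent over the websocket.
    println!("{}", serde_json::to_string(&update).unwrap());
}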

The server, written in Rust, has a multi-threaded architecture, with several threads playing different roles and communicating through crossbeam channels. Communication with the client is via websockets, using the tungstenite library, with data serialized as JSON using serde. A wrapper for the freenect library, which drives the Kinect, was also written. To facilitate deploying the installation in a new space, a training mode detects the projection zone in relation to the Kinect, as well as the relationship between the hearts' positions and their shadows on the floor. During training, projections are made while the deployer moves the hearts around, and both video and depth-map information from the Kinect sensor is collected; a least-squares curve fit on these data determines the model parameters. This makes use of the cgmath and nalgebra linear algebra libraries. When the installation is running, only the depth map is used as input.
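A minimal sketch of this thread layout, assuming recent versions of the crossbeam, tungstenite, serde and serde_json crates; this is illustrative scaffolding, not the project's actual server code:

use std::net::TcpListener;
use std::thread;
use std::time::Duration;

use crossbeam::channel::unbounded;
use serde::Serialize;

// Stand-in for a depth map coming from the freenect wrapper.
struct DepthFrame(Vec<u16>);

// Hypothetical per-frame update sent to the browser client.
#[derive(Serialize, Clone)]
struct Update {
    shadows: Vec<(f32, f32)>,
}

fn main() {
    let (frame_tx, frame_rx) = unbounded::<DepthFrame>();
    let (update_tx, update_rx) = unbounded::<Update>();

    // Sensor thread: the real server reads the Kinect through the freenect
    // wrapper; here we just emit dummy frames at roughly 30 Hz.
    thread::spawn(move || loop {
        frame_tx.send(DepthFrame(vec![0; 640 * 480])).unwrap();
        thread::sleep(Duration::from_millis(33));
    });

    // Processing thread: turns depth maps into heart/shadow positions.
    thread::spawn(move || {
        for _frame in frame_rx {
            // ... detect the hearts in the depth map ...
            update_tx
                .send(Update { shadows: vec![(0.3, 0.7), (0.6, 0.4)] })
                .unwrap();
        }
    });

    // Websocket loop: serializes each update with serde_json and pushes it
    // to the connected browser client through tungstenite.
    let listener = TcpListener::bind("127.0.0.1:9001").unwrap();
    for stream in listener.incoming() {
        let mut socket = tungstenite::accept(stream.unwrap()).unwrap();
        for update in update_rx.iter() {
            let json = serde_json::to_string(&update).unwrap();
            if socket.send(tungstenite::Message::Text(json)).is_err() {
                break; // client disconnected; wait for the next connection
            }
        }
    }
}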

The main role of the client is to produce the visuals and synthesise the sound, according to the information received from the server. Outside the training mode, rendering is performed by a custom-written GLSL shader. Rendering is based on the calculated position of each heart's shadow: circular waves are created around each shadow, and they become linked when the shadows are close to each other. When hearts collide, explosion-like waves expand along the floor, starting with an intense glare that diminishes over time. The colours of the whole rendering vary over time, in a day-and-night cycle. A spatial simplex-noise component introduces some pseudo-randomness into the circular waves, the collision effect, the bonding effects and the zones associated with groups of people.
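To make the wave-linking idea concrete, here is a rough sketch of the kind of per-pixel field such a shader can evaluate, written in Rust for consistency with the server examples rather than in the actual GLSL; the formula and constants are illustrative only. An exponential smooth-minimum blends the distance fields of nearby shadows, which is what visually links their waves.

// Evaluate a wave field at floor point p, given shadow positions and time.
fn wave_field(p: (f32, f32), shadows: &[(f32, f32)], time: f32) -> f32 {
    let k = 0.25; // blending radius: larger values link shadows from farther away
    // Exponential smooth-minimum of the distances to all shadows.
    let mut acc = 0.0_f32;
    for &(sx, sy) in shadows {
        let d = ((p.0 - sx).powi(2) + (p.1 - sy).powi(2)).sqrt();
        acc += (-d / k).exp();
    }
    let d = -k * acc.ln();
    // Concentric waves radiating outward from each shadow, fading with distance.
    (10.0 * d - 4.0 * time).sin() * (-2.0 * d).exp()
}

fn main() {
    let shadows = [(0.3, 0.7), (0.6, 0.4)];
    println!("{}", wave_field((0.5, 0.5), &shadows, 1.0));
}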

Sound synthesis is performed using the Gibberish library. Gibberish does per-sample processing, allowing effects that the block-rate processing of audio graphs in the Web Audio API cannot express, such as feedback loops with only a single sample of delay. Sounds are generated for the heartbeats, for the collisions and for the bonds between participants, as well as a background sound.
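As an illustration of what per-sample processing enables, the sketch below (in Rust rather than the project's JavaScript, and purely illustrative) modulates an oscillator's frequency with its own previous output sample. This one-sample feedback loop is exactly what a block-rate graph can only approximate, since there the feedback signal would arrive a full block late.

// Render n_samples of a sine oscillator whose frequency is modulated by
// its own previous output sample: a single-sample feedback loop.
fn render(n_samples: usize, sample_rate: f32) -> Vec<f32> {
    let mut out = Vec::with_capacity(n_samples);
    let mut phase = 0.0_f32;
    let mut feedback = 0.0_f32; // previous output sample
    for _ in 0..n_samples {
        let freq = 220.0 + 50.0 * feedback; // feedback shifts the pitch
        phase = (phase + freq / sample_rate) % 1.0;
        let s = (phase * std::f32::consts::TAU).sin();
        feedback = s;
        out.push(s);
    }
    out
}

fn main() {
    let samples = render(44_100, 44_100.0); // one second of audio
    println!("first sample: {}", samples[1]);
}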