Dancing with Sheldon! BMC meets Wearable_Dynamics

Video: Dancing With Sheldon 2016 by Paola Tognazzi (Vimeo).

Code the algorithm of the Jedi in you! Design methodologies from programming applied to the study of physical coordination and choreographic composition.
It is a workshop that examines how technology can translate the vocabulary of a performing body into code, disrupt it and create a new one through an interactive audio-visual tool. It is designed for artificial intelligence academics, interactive installation developers, dancers and musicians.
The workshop integrates methodologies of body expressivity with the interactive tool WearMe_SuperNow, bracelets with motion capture sensors, and the software tools "Dancing with Sheldon" and "Choreographing Spaces" to orchestrate group audio-visual compositions and explore the dynamic qualities of movement. The participants, wearing the motion capture sensors, will explore how to change the physical quality of their movements, transforming one music sequence into a generative system of musical variations. At the end we will compose together a 10-minute group performance.

I transcribed the semantics of three dance techniques (BMC, Cunningham and Flying Low by David Zambrano) into an interactive digital tool.
The Cunningham technique works on the ability to change direction at will and to play with a dynamic range of speeds, from fast to slow and vice versa.
Flying Low explores the primary laws of physics, cohesion and expansion, utilizing simple movement patterns, speed, spiraling and the release of energy to activate the relationship between the center and the joints.
The interactive tools Dancing with Sheldon and Choreographing Spaces translate these principles into coding parameters and apply them in specific tools for manipulating tempo and direction.
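As a purely illustrative sketch of what "translating a principle into coding parameters" can look like (this is not the actual code of Dancing with Sheldon or Choreographing Spaces; the parameter names and value ranges are assumptions), the Cunningham principles of changing direction at will and playing with a dynamic range of speeds might be expressed like this:

```python
# Illustrative sketch only: hypothetical parameters for a movement quality,
# inspired by the principles described above. Names and ranges are assumptions,
# not the actual parameters of Dancing with Sheldon or Choreographing Spaces.
import random

# Cunningham: change direction at will, play with a dynamic range of speeds.
cunningham = {
    "speed_range": (0.2, 2.0),       # slow ... fast multiplier of a base tempo
    "direction_change_prob": 0.5,    # chance of reversing direction each beat
}

def next_step(state, quality):
    """Return a new (direction, speed) pair for the next beat."""
    direction, speed = state
    if random.random() < quality["direction_change_prob"]:
        direction = -direction                    # reverse direction at will
    low, high = quality["speed_range"]
    speed = random.uniform(low, high)             # pick a speed in the dynamic range
    return direction, speed

state = (1, 1.0)
for _ in range(4):
    state = next_step(state, cunningham)
    print(state)
```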

Introduction to the theory behind it

One could say that, in the human body, the brain is like a real-time coder working at an exponential scale, applying loops and conditionals to the various body parts, muscles, tendons, ligaments… to coordinate the physics of the movements.
Software programming uses specific communication syntax and tools, such as iteration, conditionals and random functions… These are not only design tools; their relevance extends to the decision-making process.
When I use a conditional, the action is isolated to the parameters within the condition, leaving the possibility of differentiating the actions between objects.
We could also say that Body Mind Centering is a system and a technique to visualize the effects of the different coordinations. It engages conditionals in the syntax of movement: for each movement quality, for each fluid type, the person has to engage some body parts while releasing others, implying different coordinations between the body systems.
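To make the conditional analogy concrete, here is a minimal hypothetical sketch in Python (not workshop material; the body parts and qualities are illustrative assumptions rather than BMC terminology mapped one-to-one). The same instruction produces different actions for different body parts because the condition isolates the action to its parameters:

```python
# Minimal analogy sketch: a conditional differentiates the action per body part.
# Body parts and quality names are illustrative assumptions.

body_parts = ["arms", "back", "legs", "head"]

def apply_quality(part, quality):
    # The conditional isolates the action to the parameters within the condition:
    # for a given movement quality, some parts engage while others release.
    if quality == "fluid" and part in ("arms", "head"):
        return f"{part}: release"
    elif quality == "fluid":
        return f"{part}: engage"
    else:
        return f"{part}: neutral"

for part in body_parts:
    print(apply_quality(part, "fluid"))
```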

Body Mind Centering is an experiential study, developed by Bonnie Bainbridge Cohen, based on the embodiment and application of anatomical, physiological, psychophysical and developmental principles, utilizing movement, touch, voice and mind. Its uniqueness lies in the specificity with which each of the body systems can be personally embodied and integrated, in the fundamental groundwork of developmental repatterning, and in the utilization of a body-based language to describe movement and body-mind relationships. It takes place in a context of self-discovery and openness. Each person is both the student and the subject matter, and the underlying goal is to discover the ease that underlies transformation.

This workshop offers an analogue reinterpretation and application of these programming tools to the systems of physical coordination of the body. We will analyse the semantic variations that occur, which define the identity of the movements.
It is the result of a process of research into new forms of composition and expressive communication that I am conducting at Wearable_Dynamics, using the interactive tool WearMe_SuperNow.

3 focuses:

1. Physical exploration of the connection between arms and back and of the different qualities of body movement patterns, using fundamentals of techniques such as Body Mind Centering, Cunningham, Flying Low (David Zambrano) and Body Weather (Min Tanaka).
2. Experiencing how wearing the interactive bracelets with motion capture sensors changes the dynamics of movement and choreographic sequences, thanks to their feedback.
3. Working on how varying the musicality and rhythm of the sequences changes the meanings and messages in communication.
The workshop is directed to focus on the different patterns of movement, as an active mode of approaching the use of space inside and outside the body, and as a tool for communication, transformation and intermediation.

VIDEO: WORKSHOP WITH SOFTWARE DEVELOPERS AND AI PROFESSORS

1 – Through physical exercises, the fundamentals of the BMC (Body Mind Centering), Flying Low (David Zambrano), Cunningham and Body Weather (Min Tanaka) techniques will be explained and applied, in order to develop choreographic sequences, work with different movement qualities and analyse how each of them changes the meaning of the same movement.
2 – Physical awareness: connecting the extremities to the torso in order to give weight to the movements and, in this way, control the sensors with more specificity, and study how various arm movements affect the back and other parts of the body.
3 – The group will be shown how the interactive system works.
4 – The sensors will be made available to the participants to try the experience while exploring the space.
5 – We will repeat the choreographic sequences while wearing the sensors and analyse how hearing oneself affects our imagination and our way of putting the body in motion.
6 – Recording of texts produced by the participants, or of music from the sound library of the interactive system, to use as aural feedback.
7 – To conclude, we will share reflections on how each person interprets the movement qualities differently and how each is influenced by what they have learned from the other participants.

WearMe_SuperNow description

WearMe_SuperNow is an interactive tool consisting of bracelets with motion capture sensors. The installation allows people to control the tempo, volume and delay of the audio tracks through the intensity of their movements, interacting through wireless sensors with a moving sound space.
When the wearer moves, the sensors capture the data from each accelerometer, which express the degree of acceleration it is being subjected to. The data are filtered in Pure Data, and every movement produces different sounds whose tempo varies according to the movement's energy. With the system you can:
• Choose which sound to assign to each sensor, and its tempo, delay and volume effects, between three different degrees of sensitivity calibration.
• Save settings and create storyboards.
• Interchange audio tracks and their effects from one sensor to another through the permutation system.
• Record text or audio live, going directly into the sensors.
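As a rough sketch of the kind of mapping described above (the real processing happens in Pure Data; the thresholds, scaling constants and sensitivity levels here are invented purely for illustration), the acceleration magnitude from a bracelet could drive tempo, volume and delay like this:

```python
# Illustrative sketch of an accelerometer-to-sound mapping, assuming a 3-axis
# accelerometer per bracelet. The real system filters the data in Pure Data;
# the scaling constants and sensitivity levels below are assumptions.
import math

SENSITIVITY = {"low": 0.5, "medium": 1.0, "high": 2.0}   # calibration degrees

def movement_energy(ax, ay, az, gravity=9.81):
    """Acceleration magnitude minus gravity, as a crude 'energy' estimate."""
    return max(0.0, math.sqrt(ax * ax + ay * ay + az * az) - gravity)

def map_to_sound(energy, sensitivity="medium", base_tempo=90.0):
    """Map movement energy to tempo (BPM), volume (0-1) and delay time (s)."""
    e = energy * SENSITIVITY[sensitivity]
    tempo = base_tempo + 10.0 * e          # faster movement -> faster tempo
    volume = min(1.0, e / 10.0)            # stronger movement -> louder
    delay = max(0.05, 0.5 - 0.04 * e)      # stronger movement -> shorter delay
    return tempo, volume, delay

# Example: a moderately energetic gesture read from one sensor.
print(map_to_sound(movement_energy(4.0, 12.0, 3.0), sensitivity="high"))
```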

Paola Tognazzi studied industrial design at the IED in Milan, and Cunningham, ballet, Flying Low (David Zambrano) and Body Mind Centering at the SNDO school of the arts in Amsterdam. In 2001 she graduated majoring in dance theatre direction with interactive audio-visual installations. She worked as assistant to Sasha Waltz on Nobody, with Min Tanaka in Japan, with Nir de Volff – Total Brutal, Berlin, and as executive producer of interactive operas at Studio Azzurro, Milan. In 2008 she founded Wearable_Dynamics. Her work explores the sensuality of interactive systems and creates artistic experiences that involve the audience physically and emotionally, encouraging the development of sensory awareness.