The Translator: Research video work

The Translator (speaking in chitin tongues)

Video-based experiments (work in progress): how would a plant visualise itself? How would it understand its body, its metabolism, its root-to-leaftip presence?

For a description of the full project, see: HERE

Below: Mockup installation: The Translator

A stage for the narrative play between a robot and the greenhouse plants it is coded to care for.

Above: testing robot choreography with plants, sound and text (Feb)

Below: testing robot choreography on perspex (April)

Test visualisations for the screen that represents the plants’ ‘voice’ (March, with Felipe)

From seed to roots

Test visualisations for the screen that represents the plants’ ‘voice’ (April, with Felipe)

From seed to stem and flower

From root to stem: experiments in TouchDesigner. Base model/instance to build in the electrophysiological datastream from the plants and soil metrics.
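The mapping described above, from sensor readings to visual parameters, might be sketched as follows. This is an illustrative outline only, not the project's actual TouchDesigner patch; the value ranges and parameter names are assumptions.

```python
# Illustrative sketch (not the project's actual patch): clamp and scale a
# plant's electrophysiological reading and a soil metric into 0..1 values
# that could drive instance parameters in a visual system such as TouchDesigner.

def normalise(value: float, lo: float, hi: float) -> float:
    """Clamp value into [lo, hi], then map it linearly onto [0, 1]."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

# Assumed ranges: plant surface potential in millivolts, soil moisture in percent.
growth_drive = normalise(12.5, -50.0, 50.0)    # drives e.g. stem growth speed
moisture_drive = normalise(38.0, 0.0, 100.0)   # drives e.g. colour saturation
```

Each incoming stream gets its own range, so out-of-range sensor spikes are clamped rather than breaking the visualisation.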

With Felipe

March:

Stem visualisations: metabolising and growing

With Benem

March:

Leaf visualisations: what would a flowering plant feel about itself?

With Benem

Excerpt from script in development:

OPENING. A: ROBOT MOVES OUT FROM BEHIND PLANT. WALKS TO C2. Starts speaking as it walks.

Undertaking the local environmental and plant health analysis timecheck for this quarter-day reporting period.

Spot energy price report is due in the next 7 minutes. Daily vegetable forward pricing due within 8 minutes. The next round of economic decisions will be activated at that point.

My plants, greenhouse temperature is within range for your maximum growth. The predicted need for additional heating is low. Soil temperature is at 18 degrees Celsius and within range. All plant metrics from the previous analysis timecheck are aligned.

The soil substrate water level is currently at 5% below optimum for this growth phase. Shall we initiate water at 40 ml/minute from your daily allotment, to start now? The costings have been allocated to produce returns.

Plant: video of water-drop-like crystalline growth

Excerpt from script in development:

SHIFTS SLIGHTLY AND LOOKS UP AND DOWN SLOWLY AT SCREEN FOR A FEW SECONDS

And here are your thoughts, translated from your signals. From body microcurrents and volatile expressions. Converted to potentiated data visualisation. From chemicals and membrane potential differences to moving images in tri-colour.

Translation is conversion and conversation/ I am your translator.

I see from your collective thoughts you are remembering back to when you emerged from seeds. You recall the colour-shift moment from orange seed to chlorophyll-green. Of your first light-filled stretch to the sky.

You are listening and feeling. You are hearing my words spoken to you in insect sounds. The frequencies you are evolved to hear.

Of pollinators and eaters/ of scuttlers and singers.

Beeswing buzz/ cricket’s chirp/ cicada’s drone/ beetle’s tap.

TURNS TO PLANT BASE TO LOOK AT PLANTS. STAMPS FOOT ON FIRST “I”. BOWS UP AND DOWN AS IT SPEAKS:

My princesses. I speak with you in chitin sounds

In tongues of insect wings and diaphragms

Of buzz and scrape/ of click and vibrate.

While you remain still, listening for the 200 Hz of beeswing beat. Ground-tethered by root and stem, slow-dreaming in vegetal time.

Concatenator trial: for the robot’s voice:

Sounds of 50 insects found on vegetable plants and/or in greenhouses, AI-matched to the phoneme pitch and frequency of spoken words. Screen video of the program’s decision-making.
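The matching step described above could be sketched as a nearest-frequency lookup: each phoneme's pitch is paired with the insect recording whose dominant frequency is closest. This is a minimal, hypothetical sketch; the insect list, frequency values, and selection rule are illustrative assumptions, not the project's actual program.

```python
# Hypothetical sketch: pair a spoken phoneme with the insect sound whose
# dominant frequency lies closest to the phoneme's pitch.
# The recordings and frequencies below are illustrative, not project data.

insect_sounds = {
    "beetle_tap": 90.0,       # approximate dominant frequency, Hz
    "beeswing_buzz": 200.0,
    "cricket_chirp": 4800.0,
    "cicada_drone": 6000.0,
}

def match_phoneme(pitch_hz: float) -> str:
    """Return the insect sound with the dominant frequency nearest to pitch_hz."""
    return min(insect_sounds, key=lambda name: abs(insect_sounds[name] - pitch_hz))

# Example: a low vowel pitch near 180 Hz would map to the beeswing buzz.
```

A fuller version might weight frequency distance against timbral similarity, but the nearest-frequency rule is enough to show the shape of the decision the screen video would record.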

Working with:

Nathan Marcus: digital renderings, worldbuilding

Felipe Rebolledo: digital design, TouchDesigner

Charl Linsen, computational neuroscientist: robotics programming, choreographic translations