I am an experienced location sound recordist who has become enchanted by the world of sound design and music for video, podcasts, and video games. I am fascinated by the ways in which sound and music can support, emphasize, and embolden stories.
For Chris Letcher's Composing for Screen course in Spring 2022, I had the opportunity to write a new score for Lokoza (2017, dir. Isabelle Mayor).
The instrumentation for this particular scene is a combination of sound design (e.g. turning wind noise into something tonal to mimic the movement of flames), backing instruments written on a computer, and then a real human (Pete Harvey) playing the lead cello melody.
The scene involves a child (Themba) prying into his father's trauma—that of having been badly burnt at an oil refinery. Some notes on my intentions:
(i) I used pizzicato strings to capture the way in which the child is tip-toeing around his father's cold exterior, trying hesitantly to pry deeper.
(ii) I wrote the sweeping legato cello melody to represent the agony the father hides behind his cold exterior. This is a pain that he cannot and does not put into words in this scene, but it still exists, so I felt it needed to find its expression through music.
(iii) Various additional string plucks pop out of the tension at certain points, as a way of representing the emotions simmering in the background that are eager to surface but which remain constrained.
Harleen Singh and I created an interactive sound installation as part of a larger exhibition put together by Asteria Creative, an Edinburgh-based interdisciplinary collective that uses art to promote the just governance of outer space. The exhibition took place at Whitespace Gallery, Edinburgh, on 18th-21st April 2022.
We programmed Arduino boards to read pressure-sensitive and ultrasonic distance sensors and send their readings to Max/MSP. This, in turn, fed MIDI data into Ableton Live, which allowed for live FX manipulation.
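To give a flavour of the Arduino end of that chain, here is a minimal sketch of the kind of firmware involved: it reads one force-sensitive resistor and one ultrasonic sensor and prints the values over serial, where a Max/MSP patch can pick them up. The specific pin numbers, sensor models (FSR, HC-SR04), and baud rate are illustrative assumptions rather than the exact hardware used in the installation.

```cpp
// Illustrative Arduino sketch (C++): read a force-sensitive resistor (pressure)
// and an HC-SR04 ultrasonic sensor (distance), then send both readings over
// serial so a [serial] object in Max/MSP can parse them.
// Pin assignments and sensor models are assumptions for this example.

const int FSR_PIN = A0;   // force-sensitive resistor on analog pin 0
const int TRIG_PIN = 9;   // HC-SR04 trigger pin
const int ECHO_PIN = 10;  // HC-SR04 echo pin

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);     // baud rate must match the Max/MSP serial object
}

void loop() {
  // Pressure as a 0-1023 analog value
  int pressure = analogRead(FSR_PIN);

  // Fire a 10-microsecond pulse and time the echo to estimate distance (cm)
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout so the loop never stalls
  long distanceCm = duration / 58;                 // rough conversion for the HC-SR04

  // One space-separated line per reading, e.g. "412 35"
  Serial.print(pressure);
  Serial.print(' ');
  Serial.println(distanceCm);

  delay(20);  // roughly 50 readings per second
}
```

In Max/MSP these serial values would then be scaled and converted to MIDI CC messages for Ableton Live; that mapping stage lives in the patch itself rather than on the Arduino.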
The basic concept was to have an atmospheric drone playing constantly through the speakers—this represented outer space. Visitors were then prompted to engage with 'outer space' by means of the pressure and distance sensors. Doing so applied a direct sonic effect to the constant drone, creating an atmosphere in which visitors felt open and encouraged to think about space!
See the video for a demonstration of the pressure sensors.