Wednesday, 28 September 2016

COMM2591 - HMsEx Tertiary Project Research & Development :: Week 9

Meeting with Mohammad Fard on Tuesday regarding vibrotactile drowsiness study at Bundoora campus:


  • Went well. I'd revised and studied my BB literature again prior to going in, and spoke with confidence and authority about the methodology for implementing this within the new test design.
  • Key point to remember: according to the successful literature (Lane, Kasian, Owens & Marsh 1998), BBs are required to be played amongst pink noise, with the carrier tone 15 dB above the amplitude of the noise.
    • Even the control group were played pink noise and a carrier tone.
      • This could be adapted using road noise 

  • Isochronic and monaural methods can be presented at greater amplitudes and via a single-speaker presentation model (which also suits our study conditions best)
    • Will & Berg (2007) observed brainwave synchronisation through high-grade EEG equipment when playing monaural drum and click acoustic stimuli to participants at 79 dB.

  • Following this meeting my objective is to build a new MaxMSP patch that tests these monaural methods matching the literature with precision.
    • Need to thoroughly investigate Isochronics and Monaural methods before commencing this, after current obligations with GYRE.
    • Exciting! I look forward to getting into it.
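As a quick reference for setting levels when building the new patch, the 15 dB carrier-above-noise requirement can be converted to a linear amplitude ratio. A minimal sketch in plain JavaScript (purely illustrative, not part of any patch):

```javascript
// Convert a level difference in decibels to a linear amplitude ratio,
// as a sanity check for the carrier-above-noise level from the
// literature noted above (15 dB).
function dbToAmplitudeRatio(db) {
  return Math.pow(10, db / 20);
}

// A carrier 15 dB above the noise bed is roughly 5.62x its amplitude:
console.log(dbToAmplitudeRatio(15).toFixed(2)); // 5.62
```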

Binaural Auditory Beats Affect Vigilance Performance and Mood
Lane, Kasian, Owens & Marsh 1998

Brain wave synchronization and entrainment to periodic acoustic stimuli
Will U, Berg E 2007






  • This week I also created a Psychomotor Vigilance Task in MaxMSP.


The PVT is a simple task in which the subject presses a button as soon as a light appears. The light turns on at random intervals of a few seconds over a 5–10 minute session. The primary measurement is not reaction time, but how many times the button is not pressed while the light is on. The purpose of the PVT is to measure sustained attention and give a numerical measure of sleepiness by counting the tested subject's lapses in attention.


Apologies for the quality - this site maxes out at 320x240 resolution
The patch takes the spacebar as input and displays the LED randomly every few seconds 
for 5 minutes (adjustable to 10 min). The patch records both how long it takes to respond and how many times the LED was missed (a lapse in attention)

This information is recorded in a text file that can be imported into Excel.
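The scoring logic can be summarised in code. A hypothetical JavaScript equivalent of what the Max patch records (the 500 ms lapse threshold is a common convention in the PVT literature, not necessarily the value used in the patch):

```javascript
// Hypothetical JS equivalent of the PVT scoring in the Max patch.
// null = the LED was missed entirely; a response slower than the
// lapse threshold also counts as a lapse in attention.
function scorePvt(reactionTimesMs, lapseThresholdMs = 500) {
  let lapses = 0;
  let sum = 0;
  let responded = 0;
  for (const rt of reactionTimesMs) {
    if (rt === null || rt > lapseThresholdMs) {
      lapses++; // miss or too-slow response
    }
    if (rt !== null) {
      sum += rt;
      responded++;
    }
  }
  return { lapses, meanReactionMs: responded ? sum / responded : null };
}

const result = scorePvt([250, 310, null, 720, 280]);
console.log(result.lapses, result.meanReactionMs); // 2 390
```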



PVT literature sources I used to develop the patch:
https://faculty.washington.edu/wobbrock/pubs/ph-13.pdf
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.611.458&rep=rep1&type=pdf
http://www.unisanet.unisa.edu.au/staff/matthewthomas/GREG/roach(shortpvt)06.pdf




  • Also this week I discovered "Brain.fm" through a podcast I was listening to regarding brainwave entrainment methods.
      • The creators/owners of the site talked a lot of sales bullshit and didn't get into the hard science behind their algorithms - primarily because brain.fm is a paid service and they aren't disclosing their proprietary algos
      • However, they do back up their claims with studies conducted by Neuroscientists
        • Dr. Ben Morillon. Cognitive Neuroscientist, Inserm Researcher Aix-Marseille Université
        • Dr. Giovanni Santostasi. Neuroscientist at Northwestern University. 
        • A team of 'advisors' (consisting of 4 PhDs + 1 Professor)




  • I am very keen to dig deep into their methodology and discover how exactly they are extending "auditory neuroscience" and yielding these dramatic results. Snake oil???
  • I'm going to start an account and try it for myself.







COMM2591 - HMsEx Secondary Project Research & Development :: Week 8

Tom suggested breaking down the GYRE composition into 'movements' based on conversation with Darrin. Potential to allow the visuals and sound to alternate intensities, fits with concept.

Explored this and generated some sketches for Tom using the Hauptwerk Grand Organ emulation.


GYRE Intro sketch:

  • Purely using Pedal voice & Carillon to set tone
  • Using filtered noise to create swirling texture building intensity
  • Noise source: visual clips sent to me from Tom - opened in audacity as Raw Audio 
    • aligned with secondary GYRE concept of circular feedback in process





GYRE Oscillation sketch:
  • Experimented with Native Instruments Reaktor ensemble Spiral to generate swirling MIDI patterns
  • Edited good takes into MIDI clip in Ableton and added draft accompaniment
  • Aligned with GYRE concept




GYRE Grain sketch:
  • Exploring possibilities of dynamic timbral modulation to create disjunct and flux with acoustic organ voicing
    • Using same MIDI pattern sent to Hauptwerk sent to the MTH Grand Organ
    • Using modulated granular synthesis on recording of Hauptwerk played against acoustic organ voicing
    • Recycling within process aligned with GYRE concept
    • Ability to slowly rupture and "de-rupture" purely through timbral modulation attractive as a means to articulate GYRE concept





Mini lit review assessment task:

Enhancing Kinesthetic Awareness and Proprioception Through the Application of Strategic Sonic Modalities.

Kinesthetic awareness is the sensory process of locating the body within external space, utilising sensory input to situate the body in space from moment to moment. This is distinct from proprioception, which pertains explicitly to the internal awareness and control of the muscular system that dictates the negotiation of the body within space. The two are often incorrectly conflated, though the distinction is critical, especially within the context of clinical, athletic and professional dance practice.

Sonic modalities for enhancing proprioception and kinesthetic awareness have been explored in the work of Barrass et al. (2009). Using GPS coordinates and accelerometer data to derive velocity and acceleration, parameters of live synthesis were modulated and played to an elite rowing team in real time. The athlete participants all reported that the sonic events generated from their movements gave them greater insight into their technique than watching recorded video of their training sessions, allowing them to improve it.

The natural relationship between sound, kinesthetic awareness and proprioception is clearly evident within the context of instrumental musicianship and voice training. The feedback between the perception of gesture-driven sound and the motor dexterity needed to achieve finer, more harmonious control of sound through gesture has been articulated by Ohrenstein (2003), who states that motor learning relies on the performer's ability to identify and respond to discrete perceptual cues. Ohrenstein draws a parallel between the kinesthetic awareness required of elite athletes, who must identify perceptual motion cues of velocity and acceleration to excel in their field, and the sonic perceptual cues of pitch and rhythm, which are equally vital in the context of voice training for achieving excellence through enhanced proprioceptive control. Ohrenstein further identifies sonic perceptual cues within sports by illustrating the desirable tonal beauty of a perfect shot in basketball: the "swish" as a clear kinesthetic indicator of proprioceptive control.


In their systematic review of the practical application of sonification techniques used to express physical characteristics, Dubus and Bresin (2013) reviewed a randomised selection of 179 publications from a database of 739, providing a comprehensive view of practical sonic modalities dating back to 1945. Though the aim of the enquiry was to build a unified topography of the design methodologies explored within sonification practice, Dubus and Bresin made clear note of the strong relationship between auditory perception and motor control, and of the ability of sound to articulate movement through space. Indeed, their review plots the implementation of sonic characteristics to describe movement and spatial awareness. Their findings showed not only that basic pitch relationships are the prevailing design choice to articulate movement, but also that spatialisation of sound is almost exclusively used to articulate kinematics. However, Dubus and Bresin note that only a marginal proportion of the mapping strategies employed within the publications they reviewed have actually been evaluated, which is alarming. Clearly, greater rigour from a design perspective needs to be applied to advance the practical application of movement sonification further into the 21st century.


Reference list:

Barrass S, Mattes K, Schaffert N, Effenberg A 2009, 'Exploring Function and Aesthetics in Sonifications for Elite Sports', in Proc. 2nd Int. Conference on Music Communication Science (ICoMCS2), 3rd–4th December 2009, Sydney, Australia.

Ohrenstein, D 2003, Journal of Singing – The Official Journal of the National Association of Teachers of Singing, 60.1 (Sep 2003): 29–35.

Dubus G, Bresin R 2013, 'A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities', PLoS ONE 8(12): e82491. doi:10.1371/journal.pone.0082491



Wednesday, 21 September 2016

COMM2591 - HMsEx Secondary Project Research :: Week 7

After preliminary discussions with Thomas Pentland, our project is titled GYRE.

GYRE (noun)
1. a ring or circle.
2. a circular course or motion.


  • Describes the circular motion Tom is intending to use within his visual motif.
  • Also used to describe the feedback process involved with the collaboration as material circulates back and forth between us.

Using this GYRE concept as a guide for sound design, I'm interested in exploring a rhythmic motif that implies circular motion.

I'm drawn to the Risset Rhythm illusion. I would like to explore this.


Jean-Claude Risset described an “eternal accelerando” illusion, related to Shepard tones, in which a rhythm can be constructed to give the perception of continuous acceleration. The effect can in principle be derived from any rhythmic template, producing patterns with aspects of fractal self-similarity.  - (Stowell D. 2011)

  • Explore using the Risset accelerando and inverting it to create an "eternal" ritardando for the composition - sense of flux, circular motion
  • Can it be applied to MIDI? - Can the Organ play Risset Rhythms in melodic scale?



Started using Codecademy to pick up JavaScript, as it integrates well with MaxMSP (and scripting handles recursion much better than straight patching in Max)

MaxMSP + JavaScript solution to create Risset Rhythm MIDI generator would be a great outcome
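A first sketch of how the generator's core could work (an assumption about the eventual Max/JS implementation, not the final design): two rhythmic voices an octave apart in tempo, with the faster voice fading in as the slower fades out, so that looping the cycle reads as endless acceleration.

```javascript
// Sketch of a two-voice Risset accelerando cycle (an assumption about
// the eventual Max/JS patch, not the final implementation).
// Voice A's tempo doubles over the cycle; voice B always runs a tempo
// octave (2x) above A and fades in as A fades out, so the loop point
// is seamless.
function rissetCycle(baseBpm, steps) {
  const events = [];
  for (let i = 0; i < steps; i++) {
    const phase = i / steps;                      // 0..1 through the cycle
    const tempoA = baseBpm * Math.pow(2, phase);  // baseBpm -> 2x baseBpm
    const tempoB = tempoA * 2;                    // always one octave up
    events.push({
      phase,
      tempoA,
      tempoB,
      gainA: Math.cos((phase * Math.PI) / 2),     // A fades out
      gainB: Math.sin((phase * Math.PI) / 2),     // B fades in
    });
  }
  return events;
}

const cycle = rissetCycle(120, 4);
console.log(cycle[0].tempoA, cycle[0].gainA, cycle[0].gainB); // 120 1 0
```

Running the phase in reverse (1 → 0) would give the inverted "eternal" ritardando.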

PPQN issues with Ableton Live:
Ableton has a kind of "dynamic" way of handling Pulses Per Quarter Note (MIDI resolution), which will make working to a static PPQN difficult.


Wednesday, 14 September 2016

COMM2591 - HMsEx Major Project Addendum + Secondary Project Research :: Week 6


Major Project Addendum:

  • Processed Spectral data of the Sound Condition captures for FF
    • This will assist with further processing in the write up
    • Also required to develop a data visualisation tool for FF to load all spectral and pressure data tables and generate a "pressure sonograph" - more detail from FF pending
  • Processed the video files captured in the session last week



Move Like You Don't Want To Disturb The Air

Spectral content of sound generated from pressure data in 'Move Like You Don't Want To Disturb The Air'
Sound Condition: Move Like You Don't Want To Disturb The Air




Gulliver's Travel's

Spectral content of sound generated from pressure data in 'Gulliver's Travels'

Sound Condition: Gulliver's Travels





Omnipresent: All Seeing Being

Spectral content of sound generated from pressure data in 'Omnipresent: All Seeing Being'

Sound Condition: Omnipresent: All Seeing Being








Puppet

Spectral content of sound generated from pressure data in 'Puppet'

Sound Condition: Puppet











Secondary project: Melbourne Town Hall Grand Organ Composition Piece

  • Collaborating with Thomas Pentland (visuals)


Initial ideas:

  • Use this opportunity to take sound designer role - articulate Tom's concept through sound

Initial trial sketches for first tour of the MTH Grand Organ:

Experimenting with dynamics (velocity ramps) using all note durations between 4n - 64n (In C Minor)


Zoomed-in view of the black (higher density) sections in the above clip (64n)




Notes about the tour of the MTH Grand Organ:

  • Explored the innards of the organ, incredible construction.. speechless.
  • Discovered that, being a wind instrument, the MIDI velocity information is redundant (of course!) - therefore my trial clip was useless
  • Will have to reconsider my approach
  • Set up my patch for the organ - selected voicing
    • Pedals: all subtone, no "trumpet" voicing
    • Swell: Organ tone, mostly highs to avoid clash with pedals
    • Carillon: can do some interesting retriggers with the bells, useful for phrase 'punctuation' and darker, somber tones (also gives odd sense of immense spatiality when used as the only high frequency voice)
    • Glockenspiel: also good for retriggering - good options for harmonisation with the Carillon voicing



Wednesday, 7 September 2016

COMM2591 - HMsEx Major Project Research / Study Execution :: Week 5

Saturday 27.08.16 - Met with Frank Feltham: SIAL 9:30am

Discussed where FF was at in terms of preparation for data capture for the study dates 29-31st Aug and identified problematic areas of FF's current data collection patch.

Developed data capture and audio/video synchronisation with ableton live across two video streams within FF's MaxMSP patch.
- Refined trigger system for synchronisation (removed auto trigger from pressure data, switched to manual trigger)
- Replaced [jit.record] with [jit.vhs] and created audio [adc] feed in subpatch from ableton live

Outcome: FF's patch ready for data collection and live monitoring


Sunday 28.08.16 - Solo studio session at SIAL 10am

Adapted my MaxMSP interaction design patch to:
- Accommodate for 3x data streams from Wii Balance Boards
- Utilise discrete (X & Y) centre of mass values for discrete MIDI CC values from each discrete board
- Utilise summed (X & Y) centre of mass values for a discrete MIDI CC value for the sum of all boards
- Utilise the discrete total pressure sum value for discrete MIDI CC & Note On values from each discrete board
- Utilise the discrete total pressure sum value for a discrete MIDI CC value from the sum of all boards
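The discrete-vs-summed mapping can be sketched as follows (hypothetical data shapes; the real values arrive from the Wii Balance Boards via OSCulator, and the combined stream is averaged here to stay within the 0–1 range, which is an assumption about the patch's summing):

```javascript
// Sketch of the discrete vs summed COM-to-CC mapping (hypothetical data
// shapes; real values arrive from the Wii Balance Boards via OSCulator).
// Each board reports a centre-of-mass X/Y pair normalised to 0-1.
function comToCc(boards) {
  const toCc = (v) => Math.round(Math.min(1, Math.max(0, v)) * 127);
  // One discrete CC pair per board:
  const discrete = boards.map((b) => ({ x: toCc(b.x), y: toCc(b.y) }));
  // One further CC pair for all boards combined (averaged here to stay
  // within the 0-1 range - an assumption about the patch's summing):
  const sum = {
    x: toCc(boards.reduce((acc, b) => acc + b.x, 0) / boards.length),
    y: toCc(boards.reduce((acc, b) => acc + b.y, 0) / boards.length),
  };
  return { discrete, sum };
}

const ccs = comToCc([{ x: 0.5, y: 0.5 }, { x: 1, y: 0 }, { x: 0, y: 1 }]);
console.log(ccs.discrete[1].x, ccs.sum.x); // 127 64
```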

Refined and completed sound design for Synthesis Treatment A: "Move Like You Don't Want To Disturb The Air"

Two tones:
Bass Tone (Operator, Sine D#0 :: 40 Hz) looping independent of interaction
Piano Tone (Sampler, 'Lovely Keys' patch, discrete MIDI Note values from each board) gesture driven; no gesture = no sound

Using Discrete Sum values on each board to trigger:
- MIDI Note On messages, pressure sum float value scaled to Velocity, no exponential scaling
- MIDI CC values, pressure sum float value scaled from 0-127, no exponential scaling.

Ableton Mapping:
Pressure Sum of any board attenuates volume of Bass tone by 10 dB at full pressure
Pressure Sum of any board attenuates volume of Piano tone by 3 dB at full pressure
Pressure Sum of any board modulates dry/wet knob of reverb on Piano tone
Pressure Sum of any board modulates amount knob of erosion plugin on Piano tone
Pressure Sum of any board modulates frequency knob of erosion plugin on Piano tone
Pressure Sum of each board modulates frequency cutoff of lowpass filter on Piano tone



Interaction & Sound Design choices:

Bass tone represents weight of the air.
- Full pressure depresses the volume of the air with high sensitivity, encourages light controlled feet.
- Felt deep within body and non localised (deep sub freq)

Piano Tone pitch mapping:
- Board 1: A2 / Board 2: G#2 / Board 3: A2 :: I pitched the second board down one semitone to give a sonic cue of progression and resolution, as well as for clarity. As the second board is triggered there is a clear pitch change to distinguish the gestures, which resolves itself with the third board returning to the initial pitch.

Piano Tone FX mapping:
- Erosion: 'Amount' value full range mapped to pressure sum. Full pressure introduces maximum noise into the instrument. Noise representing both 'Air' and the 'Disturbance' of air.
- Erosion: 'Frequency' value (bandpass filter in series), full range (300 Hz - 18 kHz), inversely mapped to the pressure sum. The pressure-to-sound relationship equates to the bandpass filtering out high frequencies as pressure increases - the sensation of the air being disturbed as high frequencies are removed.

- Reverb: full pressure inversely mapped to dry/wet. More dry tone passing through the reverb with full pressure brings the sound 'closer' to the sound stage, bringing the noise from the erosion into focus. Making "the air" feel disturbed. Encourages light, controlled feet.


Testing sound design patch for 'Move Like You Don't Want To Disturb The Air'



Functionality tested: parallel operation of both FF and JC Max patches with Ableton Live and OSCulator. Worked seamlessly on my machine, achieving max 50 FPS / min 15 FPS for 2 video streams with all software in use.

Outcome: Use my machine for the study.



Monday 29.08.16 - Met with Kirsten Packham & Frank Feltham: SIAL 9:30am

Initial calibration and introduction day. Met KP, FF briefed her on the study. KP ran through her 4 routines while I observed the data flow within the collection patch. Made notes to reflect how the data was shaped for each routine. Made SD notes developing existing ideas/sketches to refine SD implementation for each routine.


Confirmed with FF completion sound design for Synthesis Treatment B: "Gulliver's Travels"

Custom Ableton Operator patch, mirrored across 3 instrument channels in Ableton Live
Each instrument channel responding to discrete MIDI channel (1, 2, 3) per Wii Balance Board 1, 2, 3
All three instruments playing bass tone (Operator D#0 :: 40 Hz) looping independent of interaction
4x Squarewave (D) Voices per instrument
Parallel Output routing (no serial FM)
Low Pass Filter 12 dB Per Octave
Spread: 87%
Transposed down -12ST





Using Discrete Sum values on each board to trigger:
- MIDI CC values, pressure sum float value scaled from 0-127, no exponential scaling.

Ableton Mapping:
Pressure Sum of each board raises the discrete instrument channel gain level to 0 dB at full pressure
Pressure Sum of each board raises the MIDI Macro knob to 127 at full pressure

Instrument Macro Mapping (mirrored across each board/instrument channel):
Pressure Sum from each board raises Operator volume level to 0 dB at full pressure
Pressure Sum from each board raises OSC A Fixed Frequency from 10 Hz to 2000 Hz at full pressure
Pressure Sum from each board raises OSC B Fine Frequency from 0 to 1000 at full pressure
Pressure Sum from each board raises OSC C Fixed Frequency from 10 Hz to 2000 Hz at full pressure
Pressure Sum from each board raises OSC D Fine Frequency from 0 to 1000 at full pressure
Pressure Sum from each board raises LPF cutoff frequency from 30 Hz to 1200 Hz at full pressure





Interaction & Sound Design choices:

A reductive approach and uniformity were paramount in this synthesis treatment. FF & JC agreed that the Gulliver's Travels SD should primarily utilise pitch to reflect immediate changes in pressure, as a contrast to the other ST methods in the pilot study. This was augmented with volume and gain mappings to engender a more pronounced sense of gestural interaction with sound, and developed further by opening the cutoff frequency of the Low Pass Filter.

4x Squarewave OSCs per instrument with an extremely wide range of pitch modulation, designed to reflect extremely large sonic changes from subtle pressure changes. The transition from the very low sonic state at no pressure to the high-frequency state at full pressure incrementally brings the sound from being environmental and felt to cerebral and heard - with the intention of transferring pressure changes from the environment to the mind.

Side note: in realising the SD for this treatment I wanted to explore the notion of stepped frequency changes with units of pressure, to reflect KP's image notes in Gulliver's Travels: "Little hands helping me up and down." FF and I agreed that at least one ST should not be image driven and should be more clinical, so I did not pursue this approach.




Testing sound design for 'Gulliver's Travels'





Tuesday 30.08.16 - Met with Kirsten Packham & Frank Feltham: SIAL 11:00am

Performed and recorded KP images, "Move Like You Don't Want To Disturb The Air" and "Gulliver's Travels"

Randomised ordering of the Sound Condition with the No Sound Condition

Made the decision to not remain present for the qualitative data collection of the "video recall" session with FF & KP for each image to ensure FF & KP could converse deeply within these sessions. As a result, I remain unaware as to the success of my synthesis treatments.

Overall feeling: good day of data collection. Good rapport with KP. KP appears excited by possibilities of pressure sonification. KP appears to have fun with process!

Solo session:

Refined and completed sound design for Synthesis Treatment C: "Omnipresent All Seeing Being"

Modified version of 'Minor to Major Lead' Operator patch, mirrored across 3 instrument channels in Ableton Live
Each instrument channel responding to discrete MIDI channel (1, 2, 3) per Wii Balance Board 1, 2, 3
All three instruments playing bass tone (Operator F2 :: 175 Hz fundamental) looping independent of interaction
4x Sawtooth wave (Sw32) Voices per instrument
Parallel Output routing (no serial FM)
Low Pass Filter 12 dB Per Octave
Spread: 100%


Using Discrete Sum values on each board to trigger:
- MIDI CC values, pressure sum float value scaled from 0-127, 0.4 exponential scaling.

Using Discrete X Axis COM values to trigger:
- MIDI CC values, X axis COM float value scaled from 0-127, no exponential scaling.
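The two scaling cases above can be sketched as below (assuming "0.4 exponential scaling" means raising the normalised value to the power 0.4, which boosts sensitivity at light pressure relative to the linear case):

```javascript
// Sketch of the CC scaling used above, assuming "0.4 exponential
// scaling" means value^0.4 (boosting response at light pressure)
// and an exponent of 1.0 is the plain linear case.
function scaleCc(value, exponent) {
  const clamped = Math.min(1, Math.max(0, value));
  return Math.round(Math.pow(clamped, exponent) * 127);
}

// Light pressure yields a much larger CC value than with linear scaling:
console.log(scaleCc(0.25, 1.0)); // 32
console.log(scaleCc(0.25, 0.4)); // 73
```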



Ableton Mapping:
X Axis COM value of each board controls discrete track panning (full range) for each discrete instrument channel

Pressure Sum of each board raises the discrete instrument FX: Erosion "Amount" Macro from 0 to 200 at full pressure
Pressure Sum of each board raises the discrete instrument FX: Erosion "Frequency" Macro from 3000 Hz to 18000 Hz at full pressure
Pressure Sum of each board raises the discrete instrument Volume level to 0 dB at full pressure
Pressure Sum of each board raises the discrete instrument "Minor to Major" Macro to 127 at full pressure
Pressure Sum of each board raises the discrete instrument "Filter Freq" Macro from 1000 Hz to 18500 Hz at full pressure


Instrument Macro Mapping (Mirrored across each board/instrument channel):
Pressure Sum from each board raises Operator volume level from -12 dB to 0 dB at full pressure
Pressure Sum from each board modulates OSC B Fine Frequency from 201 to 285 at full pressure
Pressure Sum from each board modulates OSC D Fine Frequency from 790 to 883 at full pressure
Pressure Sum of each board raises low pass filter frequency cutoff from 1000 Hz to 18500 Hz at full pressure






Interaction & Sound Design choices:

A balance of practical/clinical application of reductive sound design parameters and sound design choices informed by the image notes. I desired a sound bed that reflected an Omnipresent awareness - shimmering high frequency legato chords to be modulated by the interaction. 

The 'Minor to Major' Operator patch successfully maps a sonically rich harmonic shift that can be applied while notes are held (legato) over a 0-127 MIDI CC range - so this became a compelling direction to explore. I augmented this sonically with dynamically modulated parameters of the Erosion effect plugin, adding increasing amounts of high frequency content with higher levels of interaction. 

This along with discrete volume and discrete panning assignments also assisted in allowing each instrument voice to be distinct from each other while maintaining uniform assignment across each board.


Refined and completed sound design for Synthesis Treatment D: "Puppet"

Custom Operator patch, mirrored across 3 instrument channels in Ableton Live
Each instrument channel responding to discrete MIDI channel (1, 2, 3) per Wii Balance Board 1, 2, 3
Each instrument channel responding to discrete MIDI Note_On and Note_Off messages per Wii Balance Board 1, 2, 3
All three instruments inactive without interaction
4x Sine wave Voices per instrument
Parallel Output routing (no serial FM)
Low Pass Filter 12 dB Per Octave
Spread: 100%


Using Discrete Sum values on each board to trigger:
- MIDI CC values, pressure sum float value scaled from 0-127, no exponential scaling.
- MIDI Note_On values, pressure sum float value positive threshold crossing of 0.51 sends a discrete MIDI Note_On message, no exponential scaling.
- MIDI Note_Off values, pressure sum float value negative threshold crossing of 0.51 sends a discrete MIDI Note_Off message, no exponential scaling.

Using Discrete X Axis COM values to trigger:
- MIDI Note_On values, X axis COM float value positive threshold crossing of 0.51 sends a discrete MIDI Note_On message, no exponential scaling.
- MIDI Note_Off values, X axis COM float value negative threshold crossing of 0.51 sends a discrete MIDI Note_Off message, no exponential scaling.

Using Discrete Y Axis COM values to trigger:
- MIDI Note_On values, Y axis COM float value positive threshold crossing of 0.51 sends a discrete MIDI Note_On message, no exponential scaling.
- MIDI Note_Off values, Y axis COM float value negative threshold crossing of 0.51 sends a discrete MIDI Note_Off message, no exponential scaling.
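The threshold-crossing logic, as a hypothetical JavaScript equivalent of the Max patch behaviour:

```javascript
// Hypothetical JS equivalent of the gesture-driven Note_On/Note_Off
// logic: a rising crossing of the threshold emits Note_On, a falling
// crossing emits Note_Off. 0.51 sits just above the 0.5 rest value of
// each COM axis, and remains calibratable.
function makeThresholdTrigger(threshold = 0.51) {
  let above = false;
  return function next(value) {
    if (!above && value >= threshold) {
      above = true;
      return "note_on";
    }
    if (above && value < threshold) {
      above = false;
      return "note_off";
    }
    return null; // no crossing -> no MIDI message
  };
}

const trigger = makeThresholdTrigger();
console.log([0.5, 0.7, 0.8, 0.4].map(trigger)); // [null, "note_on", null, "note_off"]
```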

Ableton Mapping:
X Axis COM value of board 1 sends MIDI Note_On/Off: A5 on MIDI Ch1
X Axis COM value of board 2 sends MIDI Note_On/Off: G5 on MIDI Ch2
X Axis COM value of board 3 sends MIDI Note_On/Off: A5 on MIDI Ch3

Y Axis COM value of board 1 sends MIDI Note_On/Off: F5 on MIDI Ch1
Y Axis COM value of board 2 sends MIDI Note_On/Off: E5 on MIDI Ch2
Y Axis COM value of board 3 sends MIDI Note_On/Off: F5 on MIDI Ch3

Pressure Sum value of board 1 sends MIDI Note_On/Off: A2 on MIDI Ch1
Pressure Sum value of board 2 sends MIDI Note_On/Off: G2 on MIDI Ch2
Pressure Sum value of board 3 sends MIDI Note_On/Off: A2 on MIDI Ch3

Pressure Sum of each board raises the discrete instrument Lowpass Filter Cutoff from 1000 Hz to 18500 Hz at full pressure
Pressure Sum of each board raises the discrete instrument OSC B Fine Frequency from 0 to 50 at full pressure
Pressure Sum of each board raises the discrete instrument OSC C Fine Frequency from 0 to 50 at full pressure
Pressure Sum of each board raises the discrete instrument OSC D Fine Frequency from 0 to 50 at full pressure




Interaction & Sound Design choices:

Sound and interaction design choices for Puppet were as informed by the image notes as by providing an interesting contrast to previous ID & SD approaches. FF & JC agreed that the interaction design should reflect the staccato movements implied by the Puppet image notes. 

This necessitated a fully gesture-driven system for generating MIDI Note_On/Note_Off messages. As each axis of COM values rests at a float value of 0.5, this made sense as a threshold point - allowing left-to-right and posterior-to-anterior foot transitions to trigger and hold notes within an Ableton instrument, and have those notes closed by a gesture-driven Note_Off message when the movement completes and the axis returns to its rest value of 0.5. I allowed this threshold to be calibrated to any value, for customisation.

I applied a similar system to the pressure data so that MIDI Note_On/Off pairs could be triggered with pressure interaction on each board. 

With this new system functioning I was able to articulate the movements with single transient non-legato tones which matched the jerky movements I observed when KP performed Puppet on day 1 and also provided a good contrast to the previous interaction and synthesis treatments.

For the sound design, I made a custom FM synthesis patch using all four voices of Operator with very short envelopes - loosely modelling the Xylophone/Glockenspiel tones I envisaged being suitable for the Puppet image, detuning and layering each voice to produce one clear thick tone. I spent some time auditioning this patch without the WiiBB input, experimenting with unsteady rhythms - which matched my desired outcome for Puppet sonically.

I employed the same pitch change motif as "Move Like You Don't Want To Disturb The Air", reducing the pitch for the second board by one step, which resolves to the original pitch at board 3. This was applied to all three MIDI note assignments per board. I selected the Pressure Sum MIDI Note pairs as A2, G2 & A2 for boards 1, 2 & 3, allowing a lower tone to reflect the first sonic change heard at the first point of contact in the interaction. This lower tone provides a low-frequency grounding for the other two note pairs, triggered by centre of mass transitions, to play against.

The COM transition MIDI Note Pairs were raised 3 octaves above the Pressure Sum value MIDI Note Pairs to allow for greater auditory localisation for the performer, as I observed the Puppet routine involving many instances of posterior/anterior and left/right transitions while feet are planted before transitioning boards.

To augment and develop this, I added a very small amount of pitch modulation uniformly across 3 of the 4 voices in all instruments, so that these "wobble" gestures would be articulated within the sound and could be played with.

Finally, the Pressure Sum value of each board was designed to open up a filter with greater pressure giving the Z-Axis of movement sonic articulation.

All in all, Puppet was the most interesting and satisfying synthesis treatment and interaction design process of the study for me. It also feels like it possesses the largest scope for further exploration into interaction and sound design - I feel like I just scratched the surface, despite it already being a very dynamic and engaging instrument, which is encouraging.



Sound designing for 'Puppet'




Wednesday 31.08.16 - Met with Kirsten Packham & Frank Feltham: SIAL 11:30am

Performed and recorded KP images, "Omnipresent All Seeing Being" and "Puppet"

Added a new condition, 'Walk Condition', to capture the walk control again - randomised these with the ordering of the Sound Condition and the No Sound Condition

Again, made the decision to not remain present for the qualitative data collection of the "video recall" session with FF & KP for each image to ensure FF & KP could converse deeply within these sessions. As a result, I remain unaware as to the success of my synthesis treatments.

Overall feeling: another good day of data collection. Good rapport with KP. KP appears to remain excited by possibilities of pressure sonification. Had lunch with KP and FF, had great interaction with KP, interested in potential collaborations further down the road for SD!