Tuesday, 18 October 2016

COMM2591 - HMsEx Secondary Project Execution + Semester Final Thoughts :: Week 12


GYRE Execution:

The DisOrganEyesd concert at Melbourne Town Hall was a great deal more emotional to be a part of than I expected. Perhaps in part because this is the final year of my program, but also due to the history of the venue and the high turnout. There was something of a hallowed feeling in the air in the anticipation of the performances and at the peaks of each; the respect for the history of the venue was palpable.

I was pretty impressed with the work everyone presented. Shane & Pat's performance really lifted the room visually and sonically. Denby and Lisa brought a nice, heavy sense of atmosphere, and Job was entertaining with his performance as well as providing some stunningly detailed visuals. Keith's piece was pretty mindblowing. Unfortunately I think it went over many people's heads (probably everyone's but his, to an extent!) and was a case of overcooking it a little - the version at his rehearsal the previous week sounded perfect, and the changes he made pushed it over the edge in my opinion.

After weeks of agonising over my composition and the visual performance, I was pretty happy with the performance of GYRE! I'm usually quite critical of my own compositional work, in the past sometimes allowing that to overshadow the development process - but with this piece I feel that the restraint I exercised paid off in the execution. The reductionist direction I began to take with this piece is an ethic I've needed to pursue in composition for some time; it felt good to satiate it, and the results were equally gratifying. This piece represents greater control over deploying melodic structures in the service of heightened emotional content than previous work in my folio. I attribute this to the limitations imposed by the organ and to working to a theme: these necessitated a sharper focus on arrangement and helped establish a musical logic that provided the framework within which I created the piece.


In the performance of GYRE I gained a sense of the audience reaction at certain moments, and the piece seemed to hit those emotional peaks quite hard - which was gratifying. Critically, though, the piece could have been improved by shortening it: I feel it stretched the audience's attention slightly beyond the threshold of sustaining those emotional moments. This became a key consideration during the development of GYRE. With the primary objective being to create a "hypnotising" composition, it was necessary to avoid major deviation from a central musical motif, and to avoid rupture or strongly contrasting harmonic juxtaposition. Peaks require troughs. I used the synthesised tracks and the organ's idiophone accompaniment to disperse and embellish these during the swells of the main organ score - however I feel the troughs could have been pushed further into harmonic minimalism to prolong attention through to the conclusion of the piece, while staying within the scope of the "hypnotising" concept.

I received great feedback about the piece; people felt that the sound and vision worked in harmony, which was gratifying. People also noted that the masking in the visual elements of the projection made the surface feel like a part of the piece, and that there was a level of kineticism present that made the performance feel synchronous and engaging.




CoDesign: Study 3 - Creative Agency Walking:

Creatively, a very fulfilling project. It explored two of my key passions, Sound Design and Interaction Design, in a postgraduate context, which was fantastic. It was also a great opportunity to work with dance/body work and sound, an area I'd been keen to explore for years - hopefully this leads to more work in that direction. Great relationship with Frank Feltham and Kirsten Packham. This project has in turn led to an application for a second degree in Industrial Design and to work on a large project that will comprise my honours work. I could not be happier with the outcome! I will be assisting FF with the interpretation and writing up of the data collected, which may be presented at NIME. Very excited! I came to academic study to pursue the use of sound in a practical context (how I expected that to be present in an arts program I'm not sure - arts and pragmatism are almost polarised!). I really feel that I am on track to developing skills that will take me to the next level of postgraduate enquiry and to start raising the questions that will inform my PhD.

HMsEx Final Thoughts:

Where was this course in the Arts program! I imagine some of the Art Music and The Brain material (which I missed out on) informed the HMsEx content. I was searching for this type of content throughout my Arts program: the mechanics of how, not the historical or political reasons of why. A neurological framework for media art facilitates artistic pursuits with scientific relevance. I gained a greater understanding of how arcane psychophysiology still is in the 21st century, and of how media art comprises almost all of the key modalities needed to explore it. This opens up a wealth of potential academic inquiries; I find myself considering the psychophysiological implications of the work I'll be undertaking next year and into the future.



Final HMsEx Slide Presentation 

Tuesday, 11 October 2016

COMM2591 - HMsEx Secondary Project Research & Development :: Week 11

GYRE Project update:


  • Shifted the key from E Minor (a little light) to G# Aeolian mode (feels much deeper and more engaging)
  • Generated many automated iterations from the NI Reaktor ensemble Spiral, as this was the tool I used to generate "GYRE Oscillation" - the sketch Tom selected as matching his vision.
  • Compiled a single MIDI track from these takes and arranged it into the master MIDI track for GYRE's main organ voice.
    • Found three nice moments of rising intensity moving up the scale and lowering intensity moving down the scale in varying temporal patterns
    • Mirrored these moments across the Y axis with transposition giving a rich and deep chordal movement
    • Painstakingly moved MIDI notes away from the grid across the whole composition, accentuating temporal displacement and complexity; offsetting notes within chords also added a 'played' feel
    • Mirrored these again across the X axis giving the composition moments of rising and falling ebb and flow within a bookended compositional macrostructure
    • Removed all velocity information (a rough sketch of these edits follows this list)
Sketching and composing GYRE in Ableton Live
  • Used multiple variations of FM synthesis techniques to create accompaniment, some quite percussive, others softer attack.
    • Experimented with timing on the accompaniment utilising my exponential decay generator
    • Intensified the 'temporal flux' by using modulated audio delay on these sections (giving a stronger sense of speeding up and slowing down)
    • Modulated other various FX and synthesiser parameters over multiple passes of the composition giving it more life and expression
      • I macro mapped expressive parameters (i.e. filter, envelope, LFO amount to pitch, etc.) and used this to give greater expressive effects to the sustain phase of the envelope, which worked quite well at creating tonal and timbral juxtaposition against the organ voicing.

  • Less really is more with this project. I'm finding with the scale of the organ and the size of the venue this is an exercise in stripping away elements rather than building and layering them - which is my usual practice. 
  • Revised the primary organ MIDI arrangement even further, removing about 30% of the note information across the whole piece to give more moments of dynamics
  • Removed and augmented the generated material with manual editing of MIDI note information. I found myself editing this visually (by tethering the piano roll to the scale I was using); working in this way really assisted in further facilitating a sense of temporal flux.

  • Used the Glockenspiel and Carillon voicing to add further accenting to the FM synthesised accompaniment; using these sparingly really worked best with the density of the rest of the composition.
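As a rough illustration of the kinds of edits described in the list above, here is a JavaScript sketch (the language I've been pairing with MaxMSP) operating on note events shaped as {start, pitch, velocity}. The jitter range, mirror axis and flattened velocity value are placeholders - the real editing was done by hand in Ableton's piano roll.

```javascript
// Nudge notes off the grid, flatten velocity, and mirror the result in both
// pitch (inversion) and time (retrograde) - placeholder values throughout.
function humaniseAndMirror(notes, maxJitterMs, axisPitch, endTimeMs) {
    // temporal displacement: move every note off the grid, drop velocity
    var displaced = notes.map(function (n) {
        return {
            start: n.start + (Math.random() * 2 - 1) * maxJitterMs,
            pitch: n.pitch,
            velocity: 64 // velocity information removed (the organ ignores it anyway)
        };
    });
    // Y-axis mirror: invert pitches around an axis note for deep chordal movement
    var inverted = displaced.map(function (n) {
        return { start: n.start, pitch: 2 * axisPitch - n.pitch, velocity: 64 };
    });
    // X-axis mirror: retrograde in time for the bookended ebb and flow
    var retrograde = displaced.map(function (n) {
        return { start: endTimeMs - n.start, pitch: n.pitch, velocity: 64 };
    });
    return displaced.concat(inverted, retrograde);
}
```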




    Tuesday, 4 October 2016

    COMM2591 - HMsEx Secondary Project Research & Development :: Week 10

    GYRE Project update:


    • Tom liked "GYRE Oscillation" & "GYRE Drone Intro"
      • GYRE Oscillation in particular fit perfectly with a sequence he is planning for the middle of the performance.
      • Moving forward using GYRE Oscillation as a basis for the centrepiece of the composition and build outward
      • Will either scrap or build upon GYRE Drone Intro as a lead in to GYRE Oscillation
    • Composition/performance length has been shortened to 7-8 minutes max to fit everyone in




    • Revisions needed to the GYRE Oscillation sketch after the in-situ session
      • Key and phrasing did not sound powerful enough
      • Percussion sounded very odd; considering abandoning it altogether
      • The grain voicing vs organ acoustic 'timbral flux' concept was not as compelling in situ as hoped
        • Will most likely abandon this and move forward developing the 'temporal flux' concept





    Built a MaxMSP tool to generate exponential decay patterns in MIDI




    I achieved this using recursion. The [detonate] object in Max takes a list with a MIDI note value and timing data in milliseconds, and draws the MIDI notes into a piano roll when executed. By sending a starting interval through a recursive patch that multiplies it by a float less than 1.0, each pass through the loop shortens the interval by a fixed proportion of the previous value - an exponential decay in the note timings.

    These values are collected into a [coll]; when the interval falls below a threshold of 5ms, the loop stops and dumps the [coll] into [detonate], which generates the MIDI pattern - ready to import into Ableton.
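    The same logic, sketched in JavaScript as it could live in a [js] object instead of a recursive patch. The decay factor and outlet format are illustrative assumptions, not the values from my patch:

```javascript
outlets = 1;

var decayFactor = 0.85; // each interval is 85% of the previous one (placeholder)
var thresholdMs = 5.0;  // stop once intervals fall below 5ms, as in the patch

function generate(startMs, pitch) {
    var t = 0.0;            // accumulated onset time in ms
    var interval = startMs; // first inter-onset interval
    while (interval >= thresholdMs) {
        outlet(0, t, pitch);     // emit [onset, pitch] pairs from the left outlet
        t += interval;
        interval *= decayFactor; // remove a fixed proportion on every pass
    }
}
```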

    This effect is usually achieved using audio delays, but by having the MIDI pattern on hand I can send these patterns to the Glockenspiel and the Carillon or my synthesisers.


    Inspired by compositions using this technique:

    Autechre - Drane2


    I think using harmonic content similar to the organ's in this manner will generate the sense of temporal flux I am seeking for this composition. I am confident that the percussive attacks (using FM synthesis and envelopes) will contrast quite nicely with the organ when played in the same key.

    I am excited to hear how this develops!




    I experimented with the IanniX software this week to generate MIDI patterns. IanniX is an incredible piece of free software for generating scores and control messages from moving graphical objects, inspired by Iannis Xenakis.

    IanniX is particularly useful for projects with a strong visual theme: you can program graphically and hear the result via OSC messages assigned to synthesisers.

    After spending some time with IanniX I decided not to proceed. Though I've wanted to use this tool to create scores for a while, I feel that I need to use software I am familiar with to achieve a stronger result, with more control over the composition.







    I experimented with Steve Reich's famous phasing technique, building MIDI patterns in Ableton with Carillon and Glockenspiel emulations on loops of different lengths that drift apart and realign. I really enjoy employing this technique - I find it generates complexity and engagement while maintaining a minimalist aesthetic really well.

    But I'm starting to feel this may take the composition in another direction. I think it may be best to stick to the ideas contained within GYRE Oscillation and work on using the accompaniment to create dynamism within the composition (as the original sketch felt a bit flat in situ).
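    To make the mechanics concrete, here is a small JavaScript sketch of the phasing idea: the same pattern looped at two different lengths drifts out of alignment and realigns. The pattern, step size and loop lengths are placeholders, not the values from the Ableton session.

```javascript
var pattern = [0, 1, 3, 4]; // step indices carrying notes within one loop
var stepMs = 250;           // duration of one step (placeholder)

// onset times (ms) of one voice looping a pattern of loopSteps length
function onsets(loopSteps, totalSteps) {
    var out = [];
    for (var t = 0; t < totalSteps; t++) {
        if (pattern.indexOf(t % loopSteps) !== -1) out.push(t * stepMs);
    }
    return out;
}

// an 8-step loop against a 7-step loop realigns every lcm(8, 7) = 56 steps
var voiceA = onsets(8, 56);
var voiceB = onsets(7, 56);
```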


    Steve Reich - Piano Phase
    Performed by Tine Allegaert & Lukas Huisman (www.lukashuisman.be)

    Wednesday, 28 September 2016

    COMM2591 - HMsEx Tertiary Project Research & Development :: Week 9

    Meeting with Mohammad Fard on Tuesday regarding vibrotactile drowsiness study at Bundoora campus:


    • Went well. I'd revised and studied my BB literature again prior to going in, and spoke with confidence and authority about the methodology for implementing this within the new test design.
    • Key point to remember: according to the successful literature (Lane, Kasian, Owens & Marsh 1998), BBs are required to be played amongst pink noise, with a carrier tone 15db above the amplitude of the noise.
      • Even the control group were played pink noise and a carrier tone.
        • This could be adapted using road noise 

    • Isochronic and monaural methodologies can be used at greater amplitudes and with a single-speaker presentation model (which also suits our study conditions best) - see the toy sketch after this list
      • Will & Berg (2007) observed brainwave synchronisation through high-grade EEG equipment when playing monaural drum and click acoustic stimuli to participants at 79db.

    • Following this meeting my objective is to build a new MaxMSP patch that tests these monaural methods matching the literature with precision.
      • Need to thoroughly investigate Isochronics and Monaural methods before commencing this, after current obligations with GYRE.
      • Exciting! I look forward to getting into it.
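    A toy JavaScript sketch of the distinction that matters here: a monaural beat sums the two tones into a single channel before playback, so the beating exists acoustically rather than being constructed between the ears as with binaural beats. The frequencies are placeholders, not values from the study design.

```javascript
var sampleRate = 44100;
var f1 = 200; // first tone in Hz (placeholder)
var f2 = 210; // second tone; beat rate = |f2 - f1| = 10Hz

function monauralBeat(seconds) {
    var n = Math.floor(seconds * sampleRate);
    var buf = new Float32Array(n);
    for (var i = 0; i < n; i++) {
        var t = i / sampleRate;
        // summing before presentation = monaural; routing one sine to each
        // ear instead would make this a binaural beat
        buf[i] = 0.5 * (Math.sin(2 * Math.PI * f1 * t) +
                        Math.sin(2 * Math.PI * f2 * t));
    }
    return buf; // mono buffer, suited to the single-speaker presentation model
}
```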

    Binaural Auditory Beats Affect Vigilance Performance and Mood
    Lane, Kasian, Owens, Marsh 1998

    Brain wave synchronization and entrainment to periodic acoustic stimuli
    Will U, Berg E 2007






    • This week I also created a Psychomotor Vigilance Task in MaxMSP.


    The PVT is a simple task in which the subject presses a button as soon as a light appears. The light turns on at random every few seconds over a 5-10 minute session. The main measurement is not reaction time as such, but how many times the button is not pressed while the light is on. The purpose of the PVT is to measure sustained attention and give a numerical measure of sleepiness by counting the tested subject's lapses in attention.


    Apologies for the resolution, this site maxes at 320x240 resolution
    The patch takes spacebar as an input and displays the LED randomly every few seconds 
    for 5 minutes (adjustable to 10min) - the patch records both how long it takes to respond and how many times the LED was missed (lapse in attention)

    This information is recorded in a text file that can be imported into Excel.
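    The core logic reduces to something like the following JavaScript sketch. The timings and lapse threshold are illustrative; the real patch takes the spacebar as input, draws the LED on screen, and writes its results to a text file.

```javascript
var results = [];           // one entry per stimulus
var lapseThresholdMs = 500; // responses slower than this count as lapses (assumption)

function runTrial(showLed, done) {
    var delay = 2000 + Math.random() * 8000; // LED appears 2-10s later
    setTimeout(function () {
        var shownAt = Date.now();
        showLed(function onResponse() {      // called when the button is pressed
            var rt = Date.now() - shownAt;
            results.push({ reactionMs: rt, lapse: rt > lapseThresholdMs });
            done();
        });
    }, delay);
}

// summary: the lapse count matters more than the mean reaction time
function summarise() {
    var lapses = results.filter(function (r) { return r.lapse; }).length;
    return { trials: results.length, lapses: lapses };
}
```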



    PVT literature sources I used to develop the patch:
    https://faculty.washington.edu/wobbrock/pubs/ph-13.pdf
    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.611.458&rep=rep1&type=pdf
    http://www.unisanet.unisa.edu.au/staff/matthewthomas/GREG/roach(shortpvt)06.pdf




    • Also this week I discovered "Brain.fm" through a podcast I was listening to regarding brainwave entrainment methods.
        • The creators/owners of the site talked a lot of sales bullshit and didn't get into the hard science behind their algorithms - primarily because Brain.fm is a paid service and they aren't disclosing their proprietary algos
        • However, they do back up their claims with studies conducted by Neuroscientists
          • Dr. Ben Morillon. Cognitive Neuroscientist, Inserm Researcher Aix-Marseille Université
          • Dr. Giovanni Santostasi. Neuroscientist at Northwestern University. 
          • A team of 'advisors' (consisting of 4 PhDs + 1 Professor)




    • I am very keen to dig deep into their methodology and discover exactly how they are extending "auditory neuroscience" and yielding these dramatic results. Snake oil???
    • I'm going to start an account and try it for myself.







    COMM2591 - HMsEx Secondary Project Research & Development :: Week 8

    Tom suggested breaking the GYRE composition down into 'movements', based on a conversation with Darrin. This has the potential to let the visuals and sound alternate intensities, which fits with the concept.

    Explored this and generated some sketches for Tom using the Hauptwerk Grand Organ emulation.


    GYRE Intro sketch:

    • Purely using Pedal voice & Carillon to set tone
    • Using filtered noise to create swirling texture building intensity
    • Noise source: visual clips sent to me by Tom, opened in Audacity as raw audio
      • aligned with secondary GYRE concept of circular feedback in process
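    The databending step above - reading a video file's bytes as if they were audio samples - looks roughly like this in JavaScript (Node.js assumed purely for file access; the filename is hypothetical):

```javascript
var fs = require('fs');

// treat a video file's raw bytes as PCM-style samples, which is effectively
// what opening a clip in Audacity as raw audio does
function bytesToSamples(path) {
    var bytes = fs.readFileSync(path);            // raw video bytes
    var samples = new Float32Array(bytes.length);
    for (var i = 0; i < bytes.length; i++) {
        samples[i] = (bytes[i] - 128) / 128;      // map 0..255 to -1..1
    }
    return samples; // a noise bed ready to be filtered into the swirling texture
}

var noise = bytesToSamples('gyre_clip.mov');      // hypothetical filename
```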





    GYRE Oscillation sketch:
    • Experimented with Native Instruments Reaktor ensemble Spiral to generate swirling MIDI patterns
    • Edited good takes into MIDI clip in Ableton and added draft accompaniment
    • Aligned with GYRE concept




    GYRE Grain sketch:
    • Exploring the possibilities of dynamic timbral modulation to create disjunction and flux against the acoustic organ voicing
      • Using the same MIDI pattern for both Hauptwerk and the MTH Grand Organ
      • Using modulated granular synthesis on a recording of Hauptwerk, played against the acoustic organ voicing
      • Recycling within the process aligns with the GYRE concept
      • The ability to slowly rupture and "de-rupture" purely through timbral modulation is attractive as a means of articulating the GYRE concept





    Mini lit review assessment task:

    Enhancing Kinesthetic Awareness and Proprioception Through the Application of Strategic Sonic Modalities.

    Kinesthetic Awareness is the sensory process of locating the body within external space, utilising sensory input to track the body through space over time. This is distinct from Proprioception, which pertains explicitly to the internal awareness and control of the muscular system that dictates the negotiation of the body within space. The two are often incorrectly conflated, though the distinction is critical, especially within the context of clinical, athletic and professional dance practice.

    Sonic modalities have been explored within the context of enhancing Proprioception and Kinesthetic Awareness in the work of Barrass et al. (2009). Through the use of GPS coordinates and accelerometer data to extrapolate velocity and acceleration, parameters of live synthesis were modulated and played back to an elite rowing team in real time. The athlete participants in the study all reported that the sonic events generated from their movements gave them a perception of their technique offering greater insight than watching recorded video of their training sessions, allowing them to improve their technique.

    The natural relationship between sound, kinesthetic awareness and proprioception is clearly evident within the context of instrumental musicianship and voice training. The feedback between the perception of gesture-driven sound and the motor dexterity needed to achieve finer, more harmonious control of sound through gesture has been articulated by Ohrenstein (2003), who states that motor learning relies on the performer's ability to identify and respond to discrete perceptual cues. Ohrenstein draws parallels between the kinesthetic awareness required of elite athletes - identifying perceptual motion cues of velocity and acceleration to excel in their field - and the sonic perceptual motion cues of pitch and rhythm, which are equally vital in the context of voice training for achieving excellence through enhanced proprioceptive control. Ohrenstein further identifies sonic perceptual cues within sports, illustrating the desirable tonal beauty of a perfect shot in basketball, the "swish", as a clear kinesthetic indicator of proprioceptive control.


    In their systematic review of the practical application of sonification techniques used to express physical characteristics, Dubus and Bresin (2013) reviewed a randomised selection of 179 publications from a database of 739, providing a comprehensive view of practical sonic modalities dating back to 1945. Though the aim of the enquiry was to build a unified topography of the design methodologies explored within sonification practice, Dubus and Bresin made clear note of the strong relationship between auditory perception and motor control, and of the ability of sound to articulate movement through space. Indeed, their review plots the implementation of sonic characteristics used to describe movement and spatial awareness. Their findings showed not only that basic pitch relationships are the prevailing design choice for articulating movement, but that spatialisation of sound is almost exclusively used to articulate kinematics. However, Dubus and Bresin note that only a marginal proportion of the mapping strategies employed within the publications they reviewed have actually been evaluated, which is alarming. Clearly, greater rigour from a design perspective is needed to advance the practical application of movement sonification further into the 21st century.


    Reference list:

    Barrass S, Mattes K, Schaffert N, Effenberg A 2009, 'Exploring Function And Aesthetics In Sonifications For Elite Sports', in Proc. 2nd Int. Conference on Music Communication Science (ICoMCS2), 3rd-4th December 2009, Sydney, Australia.

    Ohrenstein, D 2003, Journal of Singing - The Official Journal of the National Association of Teachers of Singing, 60(1) (Sep 2003): 29-35.

    Dubus G, Bresin R 2013, 'A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities', PLoS ONE 8(12): e82491. doi:10.1371/journal.pone.0082491



    Wednesday, 21 September 2016

    COMM2591 - HMsEx Secondary Project Research :: Week 7

    After preliminary discussions with Thomas Pentland, our project is titled GYRE.

    GYRE (noun)
    1. a ring or circle.
    2. a circular course or motion.


    • Describes the circular motion Tom is intending to use within his visual motif.
    • Also used to describe the feedback process involved with the collaboration as material circulates back and forth between us.

    Using this GYRE concept as a guide for sound design, I'm interested in exploring a rhythmic motif that implies circular motion.

    I'm drawn to the Risset Rhythm illusion. I would like to explore this.


    Jean-Claude Risset described an “eternal accelerando” illusion, related to Shepard tones, in which a rhythm can be constructed to give the perception of continuous acceleration. The effect can in principle be derived from any rhythmic template, producing patterns with aspects of fractal self-similarity.  - (Stowell D. 2011)

    • Explore using the Risset accelerando, and inverting it to create an "eternal" ritardando, to build the composition - a sense of flux and circular motion
    • Can it be applied to MIDI? - Can the Organ play Risset Rhythms in melodic scale?



    Started using Codecademy to pick up JavaScript, as it integrates well with MaxMSP (and scripting handles recursion much better than straight patching in Max).

    MaxMSP + JavaScript solution to create Risset Rhythm MIDI generator would be a great outcome
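    As a first pass at the maths, here is a JavaScript sketch of one continuously accelerating layer as MIDI onset times - the kind of logic I'd wrap in a [js] object. A full Risset rhythm stacks several such layers at tempo octaves and crossfades their velocities; the starting rate and doubling period below are placeholders.

```javascript
var f0 = 2.0; // starting pulse rate in Hz (placeholder)
var T = 8.0;  // seconds over which the tempo doubles (placeholder)

// An exponentially accelerating pulse has integrated beat phase
// phase(t) = f0 * T / ln(2) * (2^(t/T) - 1); inverting it gives the
// onset time of beat number n:
function onsetOfBeat(n) {
    return T * Math.log(1 + n * Math.LN2 / (f0 * T)) / Math.LN2;
}

function layerOnsets(seconds) {
    var out = [];
    for (var n = 0; onsetOfBeat(n) < seconds; n++) {
        out.push(onsetOfBeat(n) * 1000); // onsets in ms, intervals shrinking smoothly
    }
    return out;
}
```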

    PPQN issues with Ableton Live:
    Ableton has a kind of "dynamic" way of handling Pulses Per Quarter Note (MIDI resolution), which will make working to a static PPQN difficult.


    Wednesday, 14 September 2016

    COMM2591 - HMsEx Major Project Addendum + Secondary Project Research :: Week 6


    Major Project Addendum:

    • Processed Spectral data of the Sound Condition captures for FF
      • This will assist with further processing in the write up
      • Also required to develop a data visualisation tool for FF to load all spectral and pressure data tables and generate a "pressure sonograph" - more detail from FF pending
    • Processed the video files captured in the session last week



    Move Like You Don't Want To Disturb The Air

    Spectral content of sound generated from pressure data in 'Move Like You Don't Want To Disturb The Air'
    Sound Condition: Move Like You Don't Want To Disturb The Air




    Gulliver's Travels

    Spectral content of sound generated from pressure data in 'Gulliver's Travels'

    Sound Condition: Gulliver's Travels





    Omnipresent: All Seeing Being

    Spectral content of sound generated from pressure data in 'Omnipresent: All Seeing Being'

    Sound Condition: Omnipresent: All Seeing Being








    Puppet

    Spectral content of sound generated from pressure data in 'Puppet'

    Sound Condition: Puppet











    Secondary project: Melbourne Town Hall Grand Organ Composition Piece

    • Collaborating with Thomas Pentland (visuals)


    Initial ideas:

    • Use this opportunity to take on a sound designer role - articulate Tom's concept through sound

    Initial trial sketches for first tour of the MTH Grand Organ:

    Experimenting with dynamics (velocity ramps) using all note durations between 4n - 64n (In C Minor)


    Zoomed-in view of the black (higher density) sections in the above clip (64n)




    Notes about the tour of the MTH Grand Organ:

    • Explored the innards of the organ, incredible construction.. speechless.
    • Discovered that, the organ being a wind instrument, the MIDI velocity information is redundant (of course!) - therefore my trial clip was useless
    • Will have to reconsider my approach
    • Set up my patch for the organ - selected voicing
      • Pedals: all subtone, no "trumpet" voicing
      • Swell: Organ tone, mostly highs to avoid clash with pedals
      • Carillon: can do some interesting retriggers with the bells, useful for phrase 'punctuation' and darker, somber tones (also gives odd sense of immense spatiality when used as the only high frequency voice)
      • Glockenspiel: also good for retriggering - good options for harmonisation with the Carillon voicing



    Wednesday, 7 September 2016

    COMM2591 - HMsEx Major Project Research / Study Execution :: Week 5

    Saturday 27.08.16 - Met with Frank Feltham: SIAL 9:30am

    Discussed where FF was at in terms of preparation for data capture for the study dates 29-31st Aug and identified problematic areas of FF's current data collection patch.

    Developed data capture and audio/video synchronisation with Ableton Live across two video streams within FF's MaxMSP patch.
    - Refined trigger system for synchronisation (removed auto trigger from pressure data, switched to manual trigger)
    - Replaced [jit.record] with [jit.vhs] and created audio [adc] feed in subpatch from ableton live

    Outcome: FF's patch ready for data collection and live monitoring


    Sunday 28.08.16 - Solo studio session at SIAL 10am

    Adapted my MaxMSP interaction design patch to:
    - Accommodate 3x data streams from Wii Balance Boards
    - Utilise discrete (X & Y) centre of mass values for discrete MIDI CC values from each discrete board
    - Utilise summed (X & Y) centre of mass values for a discrete MIDI CC value for the sum of all boards
    - Utilise the discrete sum total pressure value for discrete MIDI CC & Note On values from each discrete board
    - Utilise the discrete sum total pressure value for a discrete MIDI CC value for the sum of all boards

    Refined and completed sound design for Synthesis Treatment A: "Move Like You Don't Want To Disturb The Air"

    Two tones:
    Bass Tone (Operator, Sine D#0 :: 40hz) looping independent of interaction
    Piano Tone (Sampler, 'Lovely Keys' patch, Discrete MIDI Note values from each board) gesture driven, no sound = no gesture

    Using Discrete Sum values on each board to trigger:
    - MIDI Note On messages, pressure sum float value scaled to Velocity, no exponential scaling
    - MIDI CC values, pressure sum float value scaled from 0-127, no exponential scaling.
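    The scaling behind these CC mappings reduces to one small function; a JavaScript sketch, where the exponent parameter covers both the linear ("no exponential scaling") treatments and the 0.4 exponent used later for Omnipresent. maxPressure stands in for an assumed calibration value.

```javascript
function toMidiCC(pressure, maxPressure, exponent) {
    var norm = Math.min(Math.max(pressure / maxPressure, 0), 1); // clamp to 0..1
    var curved = Math.pow(norm, exponent); // exponent 1.0 leaves the response linear
    return Math.round(curved * 127);       // standard 0-127 MIDI CC range
}

// e.g. toMidiCC(sum, boardMax, 1.0) for this treatment,
//      toMidiCC(sum, boardMax, 0.4) for Omnipresent's pressure mapping
```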

    Ableton Mapping:
    Pressure Sum of any board attenuates volume of Bass tone by 10db at full pressure
    Pressure Sum of any board attenuates volume of Piano tone by 3db at full pressure
    Pressure Sum of any board modulates dry/wet knob of reverb on Piano tone
    Pressure Sum of any board modulates amount knob of erosion plugin on Piano tone
    Pressure Sum of any board modulates frequency knob of erosion plugin on Piano tone
    Pressure Sum of each board modulates frequency cutoff of lowpass filter on Piano tone



    Interaction & Sound Design choices:

    Bass tone represents weight of the air.
    - Full pressure depresses the volume of the air with high sensitivity, encourages light controlled feet.
    - Felt deep within body and non localised (deep sub freq)

    Piano Tone pitch mapping:
    - Board 1: A2 / Board 2: G#2 / Board 3: A2 :: I pitched the second board down one semitone to provide a sonic cue of progression and resolution, as well as for clarity. As the second board is triggered there is a clear pitch change to distinguish the gestures, which resolves itself at the initial pitch once again on the third board.

    Piano Tone FX mapping:
    - Erosion: 'Amount' value full range mapped to pressure sum. Full pressure introduces maximum noise into the instrument. Noise representing both 'Air' and the 'Disturbance' of air.
    - Erosion: 'Frequency' value (bandpass filter in series) full range (300hz - 18khz) inversely mapped to the pressure sum. The pressure-to-sound relationship equates to bandpass-filtering out high frequencies as pressure increases - the sensation of the air being disturbed as the high frequencies are removed.

    - Reverb: full pressure inversely mapped to dry/wet. More dry tone passing through the reverb with full pressure brings the sound 'closer' to the sound stage, bringing the noise from the erosion into focus. Making "the air" feel disturbed. Encourages light, controlled feet.


    Testing sound design patch for 'Move Like You Don't Want To Disturb The Air'



    Functionality-tested parallel operation of both FF's and JC's Max patches with Ableton Live and OSCulator. Worked seamlessly on my machine, achieving 50FPS max / 15FPS min across 2 video streams while all software was in use.

    Outcome: Use my machine for the study.



    Monday 29.08.16 - Met with Kirsten Packham & Frank Feltham: SIAL 9:30am

    Initial calibration and introduction day. Met KP, FF briefed her on the study. KP ran through her 4 routines while I observed the data flow within the collection patch. Made notes to reflect how the data was shaped for each routine. Made SD notes developing existing ideas/sketches to refine SD implementation for each routine.


    Confirmed with FF the completed sound design for Synthesis Treatment B: "Gulliver's Travels"

    Custom Ableton Operator patch, mirrored across 3 instrument channels in Ableton Live
    Each instrument channel responding to discrete MIDI channel (1, 2, 3) per Wii Balance Board 1, 2, 3
    All three instruments playing bass tone (Operator D#0 :: 40hz) looping independent of interaction
    4x Squarewave (D) Voices per instrument
    Parallel Output routing (no serial FM)
    Low Pass Filter 12db Per Octave
    Spread: 87%
    Transposed down -12ST





    Using Discrete Sum values on each board to trigger:
    - MIDI CC values, pressure sum float value scaled from 0-127, no exponential scaling.

    Ableton Mapping:
    Pressure Sum of each board raises discrete instrument channel gain level to 0db at full pressure
    Pressure Sum of each board raises MIDI Macro knob to 127 at full pressure

    Instrument Macro Mapping (Mirrored across each board/instrument channel):
    Pressure Sum from each board raises Operator volume level to 0db at full pressure
    Pressure Sum from each board raises OSC A Fixed Frequency from 10hz to 2000hz at full pressure
    Pressure Sum from each board raises OSC B Fine Frequency from 0 to 1000 at full pressure
    Pressure Sum from each board raises OSC C Fixed Frequency from 10hz to 2000hz at full pressure
    Pressure Sum from each board raises OSC D Fine Frequency from 0 to 1000 at full pressure
    Pressure Sum from each board raises LPF cutoff frequency from 30hz to 1200hz at full pressure





    Interaction & Sound Design choices:

    A reductive approach and uniformity were paramount in this synthesis treatment. FF & JC agreed that the Gulliver's Travels SD should primarily utilise pitch to reflect immediate changes in pressure, as a contrast to the other ST methods in the pilot study. This was augmented with volume and gain mappings to engender a more pronounced sense of gestural interaction with the sound, and developed further by opening the cutoff frequency of the Low Pass Filter.

    The 4x squarewave OSCs per instrument, with an extremely wide range of pitch modulation, were designed to produce extremely large sonic changes from subtle pressure changes. The transition from the very low sonic state at no pressure to the high-frequency state at full pressure incrementally brings the sound from being environmental and felt to cerebral and heard - with the intention of transferring pressure changes from the environment to the mind.

    Side note: In the realisation of the SD for this treatment I wanted to explore the notion of stepped frequency changes per unit of pressure, to reflect KP's image notes for Gulliver's Travels: "Little hands helping me up and down." FF and I agreed that at least one ST should be less image-driven and more clinical, so I did not pursue this approach.




    Testing sound design for 'Gulliver's Travels'





    Tuesday 30.08.16 - Met with Kirsten Packham & Frank Feltham: SIAL 11:00am

    Performed and recorded KP images, "Move Like You Don't Want To Disturb The Air" and "Gulliver's Travels"

    Randomised ordering of the Sound Condition with the No Sound Condition

    Made the decision to not remain present for the qualitative data collection of the "video recall" session with FF & KP for each image to ensure FF & KP could converse deeply within these sessions. As a result, I remain unaware as to the success of my synthesis treatments.

    Overall feeling: good day of data collection. Good rapport with KP. KP appears excited by possibilities of pressure sonification. KP appears to have fun with process!

    Solo session:

    Refined and completed sound design for Synthesis Treatment C: "Omnipresent All Seeing Being"

    Modified version of 'Minor to Major Lead' Operator patch, mirrored across 3 instrument channels in Ableton Live
    Each instrument channel responding to discrete MIDI channel (1, 2, 3) per Wii Balance Board 1, 2, 3
    All three instruments playing bass tone (Operator F2 :: 175hz Fundamental) looping independent of interaction
    4x Sawtooth wave (Sw32) Voices per instrument
    Parallel Output routing (no serial FM)
    Low Pass Filter 12db Per Octave
    Spread: 100%


    Using Discrete Sum values on each board to trigger:
    - MIDI CC values, pressure sum float value scaled from 0-127, 0.4 exponential scaling.

    Using Discrete X Axis COM values to trigger:
    - MIDI CC values, X axis COM float value scaled from 0-127, no exponential scaling.



    Ableton Mapping:
    X Axis COM value of each board controls discrete track panning (full range) for each discrete instrument channel

    Pressure Sum of each board raises discrete instrument FX: Erosion "Amount" Macro from 0 to 200 at full pressure
    Pressure Sum of each board raises discrete instrument FX: Erosion "Frequency" Macro from 3000hz to 18000hz at full pressure
    Pressure Sum of each board raises discrete instrument Volume level to 0db at full pressure
    Pressure Sum of each board raises discrete instrument "Minor to Major" Macro to 127 at full pressure
    Pressure Sum of each board raises discrete instrument "Filter Freq" Macro from 1000hz to 18500hz at full pressure


    Instrument Macro Mapping (Mirrored across each board/instrument channel):
    Pressure Sum from each board raises Operator volume level from -12db to 0db at full pressure
    Pressure Sum from each board modulates OSC B Fine Frequency from 201 to 285 at full pressure
    Pressure Sum from each board modulates OSC D Fine Frequency from 790 to 883 at full pressure
    Pressure Sum of each board raises low pass filter frequency cutoff from 1000hz to 18500hz at full pressure






    Interaction & Sound Design choices:

    A balance of practical/clinical application of reductive sound design parameters and sound design choices informed by the image notes. I desired a sound bed that reflected an Omnipresent awareness - shimmering high frequency legato chords to be modulated by the interaction. 

    The 'Minor to Major' Operator patch successfully maps a sonically rich harmonic shift that can be applied while notes are held (legato) over a 0-127 MIDI CC range - so this became a compelling direction to explore. I augmented this sonically with dynamically modulated parameters of the Erosion effect plugin, adding increasing amounts of high frequency content with higher levels of interaction. 

    This along with discrete volume and discrete panning assignments also assisted in allowing each instrument voice to be distinct from each other while maintaining uniform assignment across each board.


    Refined and completed sound design for Synthesis Treatment D: "Puppet"

    Custom Operator patch, mirrored across 3 instrument channels in Ableton Live
    Each instrument channel responding to discrete MIDI channel (1, 2, 3) per Wii Balance Board 1, 2, 3
    Each instrument channel responding to discrete MIDI Note_On and Note_Off messages per Wii Balance Board 1, 2, 3
    All three instruments inactive without interaction
    4x Sine wave Voices per instrument
    Parallel Output routing (no serial FM)
    Low Pass Filter 12db Per Octave
    Spread: 100%


    Using Discrete Sum values on each board to trigger:
    - MIDI CC values, pressure sum float value scaled from 0-127, no exponential scaling.
    - MIDI Note_On values, pressure sum float value positive threshold crossing of 0.51 sends discrete MIDI Note_On message, no exponential scaling.
    - MIDI Note_On values, pressure sum float value negative threshold crossing of 0.51 sends discrete MIDI Note_Off message, no exponential scaling.

    Using Discrete X Axis COM values to trigger:
    - MIDI Note_On values, X axis COM float value positive threshold crossing of 0.51 sends discrete MIDI Note_On message, no exponential scaling.
    - MIDI Note_On values, X axis COM float value negative threshold crossing of 0.51 sends discrete MIDI Note_Off message, no exponential scaling.

    Using Discrete Y Axis COM values to trigger:
    - MIDI Note_On values, Y axis COM float value positive threshold crossing of 0.51 sends discrete MIDI Note_On message, no exponential scaling.
    - MIDI Note_On values, Y axis COM float value negative threshold crossing of 0.51 sends discrete MIDI Note_Off message, no exponential scaling.

    Ableton Mapping:
    X Axis COM value of board 1 sends MIDI Note_On/Off: A5 on MIDI Ch1
    X Axis COM value of board 2 sends MIDI Note_On/Off: G5 on MIDI Ch2
    X Axis COM value of board 3 sends MIDI Note_On/Off: A5 on MIDI Ch3

    Y Axis COM value of board 1 sends MIDI Note_On/Off: F5 on MIDI Ch1
    Y Axis COM value of board 2 sends MIDI Note_On/Off: E5 on MIDI Ch2
    Y Axis COM value of board 3 sends MIDI Note_On/Off: F5 on MIDI Ch3

    Pressure Sum value of board 1 sends MIDI Note_On/Off: A2 on MIDI Ch1
    Pressure Sum value of board 2 sends MIDI Note_On/Off: G2 on MIDI Ch2
    Pressure Sum value of board 3 sends MIDI Note_On/Off: A2 on MIDI Ch3

    Pressure Sum of each board raises discrete instrument Lowpass Filter Cutoff from 1000hz to 18500hz at full pressure
    Pressure Sum of each board raises discrete instrument OSC B Fine Frequency from 0 to 50 at full pressure
    Pressure Sum of each board raises discrete instrument OSC C Fine Frequency from 0 to 50 at full pressure
    Pressure Sum of each board raises discrete instrument OSC D Fine Frequency from 0 to 50 at full pressure




    Interaction & Sound Design choices:

    Sound and interaction design choices for Puppet were informed as much by the image notes as by the desire to provide an interesting contrast to the previous ID & SD approaches. FF & JC agreed that the interaction design should reflect the staccato movements implied by the Puppet image notes.

    This necessitated a fully gesture-driven system for generating MIDI Note_On/Note_Off messages. As each axis of COM values rests at a float value of 0.5, this made sense as a threshold point - allowing left-to-right and posterior-to-anterior foot transitions to trigger and hold notes within an Ableton instrument, and to have those notes closed by a gesture-driven Note_Off message when the movement completes and the axis returns to its rest value of 0.5. I made this threshold calibratable to any value, for customisation.

    I applied a similar system to the pressure data so that MIDI Note_On/Off pairs could be triggered with pressure interaction on each board. 
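    The trigger logic itself is simple; a JavaScript sketch, where the pitch, channel and sendMidi callback stand in for the per-board assignments:

```javascript
// a rising crossing of the threshold opens a note, a falling crossing closes it
function makeThresholdTrigger(threshold, pitch, channel, sendMidi) {
    var noteOpen = false; // are we currently above the threshold?
    return function update(value) { // value: a COM or pressure float, 0..1
        if (!noteOpen && value > threshold) {
            noteOpen = true;
            sendMidi(channel, pitch, 100); // Note_On (velocity illustrative)
        } else if (noteOpen && value <= threshold) {
            noteOpen = false;
            sendMidi(channel, pitch, 0);   // velocity 0 acts as Note_Off
        }
    };
}

// e.g. one trigger per data stream, calibrated at the 0.51 threshold:
// var xBoard1 = makeThresholdTrigger(0.51, 81 /* A5 */, 1, sendToAbleton);
```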

    With this new system functioning I was able to articulate the movements with single transient non-legato tones which matched the jerky movements I observed when KP performed Puppet on day 1 and also provided a good contrast to the previous interaction and synthesis treatments.

    For the sound design, I made a custom FM synthesis patch using all four voices of Operator with very short envelopes - loosely modelling the Xylophone/Glockenspiel tones I envisaged being suitable for the Puppet image, detuning and layering each voice to produce one clear thick tone. I spent some time auditioning this patch without the WiiBB input, experimenting with unsteady rhythms - which matched my desired outcome for Puppet sonically.

    I employed the same pitch change motif as "Move Like You Don't Want To Disturb The Air", reducing the pitch for the second board by one step and resolving at the original pitch on board 3. This was applied to all three MIDI note assignments per board. I selected the Pressure Sum value MIDI Note pairs as A2, G2 & A2 for boards 1, 2 & 3 - allowing a lower tone to reflect the first sonic change heard at the first point of contact in the interaction. This lower tone provides a lower-frequency grounding for the other two note pairs, triggered by centre of mass transitions, to play against.

    The COM transition MIDI Note Pairs were raised 3 octaves above the Pressure Sum value MIDI Note Pairs to allow for greater auditory localisation for the performer, as I observed the Puppet routine involving many instances of posterior/anterior and left/right transitions while feet are planted before transitioning boards.

    To augment and develop this, I added a very small amount of pitch modulation uniformly across 3 of the 4 voices in all instruments, so that these "wobble" gestures would be articulated within the sound and could be played with.

    Finally, the Pressure Sum value of each board was designed to open up a filter with greater pressure giving the Z-Axis of movement sonic articulation.

    All in all, Puppet was the most interesting and satisfying synthesis treatment and interaction design process of the study for me. It also feels like it possesses the largest scope for further exploration into interaction and sound design; I feel like I just scratched the surface - despite it already being a very dynamic and engaging instrument, which is encouraging.



    Sound designing for 'Puppet'




    Wednesday 31.08.16 - Met with Kirsten Packham & Frank Feltham: SIAL 11:30am

    Performed and recorded KP images, "Omnipresent All Seeing Being" and "Puppet"

    Added a new condition, the 'Walk Condition', to capture the walk control again - randomised its ordering with the Sound Condition and the No Sound Condition

    Again, made the decision to not remain present for the qualitative data collection of the "video recall" session with FF & KP for each image to ensure FF & KP could converse deeply within these sessions. As a result, I remain unaware as to the success of my synthesis treatments.

    Overall feeling: another good day of data collection. Good rapport with KP. KP appears to remain excited by possibilities of pressure sonification. Had lunch with KP and FF, had great interaction with KP, interested in potential collaborations further down the road for SD!

    Wednesday, 17 August 2016

    COMM2591 - HMsEx Major Project Research / Development :: Week 4

    CoDesign Study3 - Creative Agency Walking 

    Aims:

    • Compare and contrast sonic biofeedback against the absence of sonic stimuli, trialling it as an effective entrainment method for precision in body and movement kinematics.
    • Collect quantitative pressure and centre of mass (COM) data during movement routines for analysis.
    • Trial extended gesture driven sonification techniques to interrogate and build upon existing literature.

    Early interaction design ideas:

    • Needs to be expressive and immediate (sound felt, not heard)
    • Exponential/logarithmic curvature in MIDI scaling (greater expression)


    Literature to read:

    Sonification - Stephen Barrass (University of Canberra)

    Barrass papers:
    http://www.canberra.edu.au/about-uc/faculties/arts-design/courses/undergraduate/media-arts-and-production/tabs/staff/media-arts-and-production-staff/barass-stephen

    Exponential/Log tools:
    https://www.desmos.com/calculator
    http://www.timotheegroleau.com/Flash/experiments/easing_function_generator.htm
    http://www.mathsisfun.com/algebra/exponents-logarithms.html



    Kirsten Packham (De Quincey Co) is the sole study participant. A trained professional in body and movement work, KP has provided the following information to guide the sound design.

    These 'Images' are imaginative qualitative descriptors that assist focus and interoceptive kinesthetic awareness while conducting the respective routine they describe.

    I have used these to make early sound design notes which will be put into practice later.


    1. MOVE LIKE YOU DON'T WANT TO DISTURB THE AIR 

    Very slow. All the molecules in the room are affected by your movement. 
    • (bass rumble + high end - high and lowpass filters :: mids notched out )
    • chimes

    Colluding with the air. 
    • reverb on top end? ie: room size, decay 

    Feet lift slightly, glide across floor and negotiate with the air. 
    • Filter (opening? / closing? - small range)

    Minimal articulation and gait. 
    • articulation spikes to be sonified? possible distraction here

    Weight held in core. 
    • central bass tone at centre of gravity 

    Legato. 
    • pitch bend
    • max 1 semitone on bass *if at all*
    • maybe 2+ semitones on high tone



    2. PUPPET 

    Slow. 
    • Fine control
    • Rhythmic
    Articulate loose feet. 
    • Wide control mapping??
    • Rhythmic articulation, tight envelopes
    • Each note discrete playing in a “circus” scale
    Awkward. 
    • Wide range
    Sense of up/down, lifting and lowering. 
    • Filter
    • Pitch
    • Wide control range
    • Bells, FM
      Backing track?
    Variation in each iteration/step.

    • cycling MIDI notes through scale?
    • pitch modulation?


    3. GLACIER CALVING 

    Slow-very slow. 
    • Drone

    Feet - ball/heel/toes sink as if crumbling and breaking off into floor - being absorbed by floor. 
    • Filter closing with each step
    • Delay/Reverb
    • Noise/Pitched down Ice crunchy sound on top through filter

    Scale is massive. 
    • Bass

    Energy down.

    • Bass


    4. OMNIPRESENT - ALL SEEING BEING 

    Slow. 


    Purposeful, forward direction but sensing/360 Omni central awareness. 
    • Panning 
    • Constant tone, amp + possibly filtering independent stereo channels drops and raises to indicate unbalanced gait
    • Mid High frequencies
    • Chords (synth tbd - basic fm)

    Weight held in core. 
    • Pan triggered from Centre of Mass
    • Perhaps all triggers from COM

    Whole body glides. Gait even.
    • Sound cues to entrain central movement



    5. GULLIVER'S TRAVELS 

    Slow. 

    Incrementally lifted and lowered. 

    Weight lifted up by many hands, carried and carefully lowered. 
    • Up fast, lowered slow
    • Pitch going up matched to pressure
    • Pitch alone?
    • Distinct Clear tonal change between up and down env

    Articulate feet.



    7. DEER PREY 

    Slow with variation and stops.
    • Rhythmic? ala ‘Puppet'
    • Use sound cues to reinforce stops

    Light feet listening, sensing vibrations. 
    • Pressure too heavy or uneven gait - wobble filter?

    Articulate. 
    • Precise mapping

    Weight held in core.
    • Centre of Mass

    Also - Music references

    Machinefabriek - Singel
    Oren Ambarchi - Song of Separation part 2
    Alva Noto - J
    Christina Kubisch - In Transition
    Triosk - Lazyboat

    (^ Great music selection! However I don't think I will be using these as timbral inspiration after all - familiarity may bias the data)


    Finally from Frank:

    "Finally - i've been finding this paper by Varni et al 2012 attached quite cool as it uses a MOOG cutoff filter to respond to ideas of energy. Have a look and we can discuss tomorrow.  btw i have a MOOG minitaur !!"

    (Think I'll keep hardware out of this in the aim of simplicity - dynamic parameter mapping across some digital and some outboard gear would be a mess)





    Eyetracking/Gazepoint 4D study resources:

    GP3 Model
    http://www.gazept.com/developer/

    (Designed for Windows 7 & 8)

    API doc: http://www.gazept.com/Publications/Gazepoint_API_v2.0.pdf

    TCIP Comms in Max:
    http://www.maxobjects.com/?v=objects&id_objet=4684
    https://cycling74.com/2006/10/23/networking-max-talking-to-max/#.V7UqHj5962w




    Wednesday, 10 August 2016

    COMM2591 - HMsEx Week 3 Lecture Research :: Week 3

    Major project, journal this week:


    What is it? 
    Research study at Bundoora campus investigating vigilance entrainment through auditory cues within the broader context of awareness/drowsiness in drivers.

     How does it work?
    Exploring the potential for Binaural Beats, Isochronic tones and other modalities for brainwave entrainment through auditory cues.

     Why does it work? 
    [Summarise current research, psychology, biology, neuroscience]


    Adapted from Randy J. Larsen and Edward Diener, "Promises and Problems with the Circumflex Model of Emotion," Review of Personality and Social Psychology 13 (1992): 31.




    Valence, as used in psychology, especially in discussing emotions, means the intrinsic attractiveness (positive valence) or aversiveness (negative valence) of an event, object, or situation. However, the term is also used to characterise and categorise specific emotions.







    Mood: A state that is experienced at a particular time
    Objectless affective states

    Emotion: A natural instinctive state of mind deriving from one’s circumstances or relationship with others

    Feeling: Emotions with cognitive overlay

    Entrainment: "Appetitive vs Aversive"



    Senses to exploit:
    Hearing
    Vision
    Smell
    Touch
    Proprioception (muscle contraction) <- this may be useful in our research.
    Balance



    Brian Eno > scent :: look up

    Scents and Sensibility


    From Details Magazine, July 1992, by Brian Eno

    http://music.hyperreal.org/artists/brian_eno/interviews/detail92.html

    Draws parallels between the sensorial power and immediacy of scent and the immeasurable elements of sound, through a musicological framework:


    "Rock music, I kept saying, was a music of timbre and texture, of the physical experience of sound, in a way that no other music had ever been or could have ever been. It dealt with a potentially infinite sonic pallette, a palette whose gradations and combinations would never adequately be described, and where the attempt at description must always lag behind the infinites of permutation."




    Richard Davidson - The emotional life of your brain








    Affective Neuroscience: the study of the brain mechanisms that underlie emotion





    Antonio Damasio - reason vs emotion
    All in the mind website / ABC - Radio National
    The master and his emissary podcast - balance of emotion vs reason






    Jason Satterfield “Brain Mind & Behaviour: Emotions & Health”
    “From mode to emotion in musical communication”
    Appraisal theories of emotion - situational





    David Eagleman -  “The Brain” SBS
    http://www.eagleman.com/research/113-the-brain-pbs






    Daniel Kahneman - "Thinking, Fast and Slow"
    System 1 & System 2





    Daniel Gilbert - “Stumbling on Happiness”






    Sound design: Michel Chion, Gayard & Torque?



    Investigate old woman/young woman illusion rates - hemispheric switching? backed up by science?? Skeptical! Omega 3 ups rate of hemispheric switching? Citation!


    Writer's block overcome by omega 3? CITATION!



    Galvanic Skin Response to measure and record chord relationships and goosebump response?


    Lobe stimulation with Magnets? TMS


    Plint experience: Write up