9th December 2015 - Moderators: Douglas McCutcheon & Michaela Socher
There has been a significant expansion in the number of studies using automatic methods for parent-child interaction analysis during the last five years. The quality of such relationships has been shown to strongly influence the child’s later development, emotion regulation, psychopathology, and secure attachment. Using automatic methods, one can extract features and analyze the communication while accounting for the multimodal nature and dynamics of the behaviors involved. In this webinar I will present “Social signal processing for studying parent-infant interaction” by Avril et al. (2014), an example of recent work that proposes a method to acquire and extract relevant social signals from a naturalistic parent-child interaction. The authors extracted several cues from body posture and speech using Kinect sensors, and selected interaction metrics to characterize synchrony during the interaction. The findings from the computational method were assessed against the Coding Interactive Behavior (CIB) measure, a widely used, validated global interaction scale for parent-child interaction.
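The paper's own metrics are not reproduced here, but a common way to quantify interaction synchrony from extracted movement cues is a windowed correlation between the two partners' movement-energy signals. The sketch below is an illustrative example only, assuming parent and infant movement energy have already been extracted as equal-length time series (the function name and window parameters are hypothetical, not from Avril et al.):

```python
import numpy as np

def windowed_synchrony(parent, infant, win=50, step=25):
    """Pearson correlation between two movement-energy signals,
    computed in sliding windows of `win` samples, hopped by `step`."""
    scores = []
    for start in range(0, len(parent) - win + 1, step):
        p = parent[start:start + win]
        i = infant[start:start + win]
        # A flat window carries no movement information; score it 0.
        if p.std() == 0 or i.std() == 0:
            scores.append(0.0)
            continue
        scores.append(float(np.corrcoef(p, i)[0, 1]))
    return np.array(scores)

# Toy signals sharing a common component, mimicking coordinated movement.
rng = np.random.default_rng(0)
shared = rng.standard_normal(500)
parent = shared + 0.3 * rng.standard_normal(500)
infant = shared + 0.3 * rng.standard_normal(500)
mean_sync = windowed_synchrony(parent, infant).mean()
```

A high mean windowed correlation would indicate strongly coordinated movement; real analyses typically also consider lagged correlations to capture who leads the interaction.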
Virtual acoustic environments can be realized using binaural technology in combination with loudspeakers in acoustically optimized rooms. This presentation is about a recently developed listening environment and its suitability for different kinds of psychoacoustic experiments. This includes not only a room acoustic analysis but also a focus on the technical components of the auralization system used. For listening experiments that have to be carried out on-site, a mobile hearing booth has been designed and constructed, which will also be presented. A tracking system is used to immediately adjust and update the auralized scene when the user moves in the real world. I will also share prior experience with an electromagnetic and an optical tracking system, including the advantages and disadvantages of each.
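The core of a tracking-driven scene update is simple: when the listener's head rotates, each virtual source's azimuth relative to the head must be recomputed before the corresponding binaural filters are selected. As a minimal sketch of that geometric step (the function name is hypothetical and this ignores elevation and translation, which a full auralization system would also handle):

```python
def relative_azimuth(source_az_deg, head_yaw_deg):
    """Azimuth of a source relative to the listener's head, in degrees,
    given world-frame source azimuth and tracked head yaw.
    Result is wrapped to (-180, 180]."""
    rel = (source_az_deg - head_yaw_deg) % 360.0
    return rel - 360.0 if rel > 180.0 else rel

# Source straight ahead in the world; listener turns 90 degrees left:
# the source now appears 90 degrees to the listener's right.
angle = relative_azimuth(0.0, -90.0)
```

In practice this update must run at the tracker's frame rate with low latency, since delayed updates are audible as the scene "dragging" behind head movement.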
Hearing with two ears is superior to hearing with one ear, especially in complex auditory environments, since it provides access to binaural cues. However, listeners who suffer from a hearing loss display difficulties with auditory spatial hearing. To this day it remains a matter of debate to what degree hearing devices give access to binaural cues and whether listeners can integrate them into a coherent percept. The present study utilized the mismatch negativity (MMN) as an objective electrophysiological measure to assess automatic spatial discrimination that requires bilateral integration in normal-hearing adults. Bilateral listening conditions were compared with unilateral listening conditions in which a one-sided conductive hearing loss was mimicked. Listeners were seated in a free field. Within a classic passive oddball paradigm, acoustic stimuli were presented from two different fixed locations. Auditory evoked potentials were analyzed for the different conditions. All participants showed strong individual mismatch negativity responses in the bilateral condition. During unilateral listening, reduced or even absent MMN amplitudes were seen. These findings suggest that the mismatch negativity elicited by acoustic spatial deviations can be used as an objective measure to investigate spatial hearing, complementing or replacing behavioral methods, for example in noncompliant groups such as children.