Publications

Publications, papers, and talks informing our tools and methods.

Selected publications spanning generative systems, audiovisual cognition, machine learning for sound and image, and applied technical research.

2025

Orchestrating Emergent Storytelling with Embodied Multi-Agent Systems

NeurIPS 2025 Creative AI Track

Embodied LLM agents with memory, coordination, and spatial awareness, designed to produce stable emergent narratives in interactive worlds.

2024

Fast, interactive, AI-assisted 3D lung tumour segmentation

Poster Presentation

An interactive 3D segmentation workflow for lung tumors that keeps clinicians in the loop while improving speed and boundary quality.

2023

Improving non-small cell lung cancer segmentation on a challenging dataset

Poster Presentation

A study in robustness that combines normalization, lung isolation, self-supervision, loss tuning, and test-time augmentation for harder segmentation transfers.

2021

The debate on screen time: An empirical case study in infant-directed video

Book Chapter

A contribution to the screen-time debate grounded in eye-tracking evidence, focused on how infants actually engage with educational video.

2018

Automatic Programming of VST Sound Synthesizers Using Deep Networks and Other Techniques

Journal Article

Research on using deep networks and search techniques to automatically set synthesizer parameters that match a target sound.

2017

Time Domain Neural Audio Style Transfer

Conference Proceedings

Audio style transfer work that operates directly on the waveform, reducing reliance on phase reconstruction and moving closer to real-time stylization.


Write once run anywhere revisited: machine learning and audio tools in the browser with C++ and emscripten

Conference Proceedings

A methodology for deploying interactive audio and machine learning tools to the browser without rewriting C++ systems from scratch.

2014

Auracle: how are salient cues situated in audiovisual content?

Research Report

A collaborative project on how sound and image cues interact in audiovisual attention, including datasets and analysis tools for gaze behavior.


How Humans Hear and Imagine Musical Scales: Decoding Absolute and Relative Pitch with fMRI

Conference Abstract

An fMRI study of how the brain encodes musical scales that are heard versus imagined, comparing absolute and relative pitch strategies.


Optimising signal-to-noise ratios in Tots TV can create adult-like viewing behaviour in infants

Conference Presentation

A study of how media editing and audiovisual clarity can shape infant attention and move viewing behavior toward adult-like patterns.


Audiovisual Scene Synthesis

Ph.D. Thesis

A thesis connecting perception research and computational art through sound-image synthesis and models of audiovisual representation.


Audiovisual Resynthesis in an Augmented Reality

Conference Proceedings

An augmented reality artwork that recomposes what participants hear and see in real time using salient audiovisual fragments.

2013

Attentional synchrony and the influence of viewing task on gaze behavior in static and dynamic scenes

Journal Article

A paper showing how video synchronizes viewer attention while task still reshapes where people look in both static and moving scenes.


Do low-level visual features have a causal influence on gaze during dynamic scene viewing?

Abstract

An investigation into how motion, contrast, and edge information contribute to gaze allocation in dynamic scenes.


Corpus-based visual synthesis: an approach for artistic stylization

Symposium Paper

A real-time stylization method that reconstructs photos and video from a learned image corpus to create painterly and memory-like visual substitutions.


Mining Unlabeled Electronic Music Databases through 3D Interactive Visualization of Latent Component Relationships

Conference Proceedings

A 3D interface for exploring large music archives with weak metadata, using latent audio features to reveal semantic relationships.

2012

Do the eyes really have it? Dynamic allocation of attention when viewing moving faces

Journal Article

A study showing that viewers shift attention across the eyes, mouth, and nose depending on which facial region is most informative.

2011

Clustering of Gaze During Dynamic Scene Viewing is Predicted by Motion

Journal Article

A paper showing that motion and flicker are strong predictors of where viewers look in dynamic scenes.


Going with the flow? The endogenous/exogenous influences on gaze control in dynamic scenes

Conference Contribution

A bridge paper between low-level attention models and higher-order scene understanding, focused on how task and meaning shape gaze in moving images.


Watching the world go by: Attentional prioritization of social motion during dynamic scene viewing

Journal Article

A study of whether socially meaningful motion gets priority in visual attention during naturalistic moving scenes.
