
Visual attention driven by auditory cues

Human visual attention can be modulated not only by visual stimuli but also by stimuli from other modalities such as audition. Incorporating auditory information into a model of human visual attention is therefore a key step toward building more sophisticated models.

We propose a novel computational model of human visual attention driven by auditory cues. Built on the Bayesian surprise model, which is regarded as promising in the literature, our model uses surprising auditory events as a cue for selecting the visual features synchronized with them, and then emphasizes the selected features to form the final surprise map.

Our approach to audio-visual integration uses auditory information to select only the effective visual features, rather than all available features, for simulating visual attention.
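The pipeline described above can be sketched in code. This is a minimal illustrative sketch, not the authors' implementation: the Gaussian per-location belief model, the correlation-based synchronization measure, and all function names are assumptions made for illustration.

```python
import numpy as np

def gaussian_surprise(prior_mean, prior_var, x, obs_var=1.0):
    """Bayesian surprise of one observation under a Gaussian belief:
    KL divergence from the updated posterior to the prior."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + x / obs_var)
    kl = 0.5 * (np.log(prior_var / post_var)
                + (post_var + (post_mean - prior_mean) ** 2) / prior_var
                - 1.0)
    return kl, post_mean, post_var

def audio_weighted_surprise_map(feature_surprise, audio_surprise):
    """Weight each visual feature channel by how well its surprise
    profile over time synchronizes with the auditory surprise signal,
    then combine the weighted channels into a final surprise map.

    feature_surprise: array (T, C, H, W) of per-frame, per-channel
                      visual surprise values.
    audio_surprise:   array (T,) of per-frame auditory surprise.
    """
    T, C, H, W = feature_surprise.shape
    # One surprise profile per channel: mean surprise per frame.
    profiles = feature_surprise.reshape(T, C, -1).mean(axis=2)  # (T, C)
    a = audio_surprise - audio_surprise.mean()
    weights = np.zeros(C)
    for c in range(C):
        p = profiles[:, c] - profiles[:, c].mean()
        denom = np.linalg.norm(a) * np.linalg.norm(p)
        # Non-negative correlation as a crude synchronization score.
        weights[c] = max(0.0, float(a @ p) / denom) if denom > 0 else 0.0
    total = weights.sum()
    weights = weights / total if total > 0 else np.full(C, 1.0 / C)
    # Emphasize synchronized channels in the current frame's surprise map.
    return np.tensordot(weights, feature_surprise[-1], axes=(0, 0))  # (H, W)
```

A channel whose surprise fluctuates in step with the auditory surprise receives a larger weight, so its contribution dominates the final map; uncorrelated channels are suppressed.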


Jiro Nakajima, Akisato Kimura, Akihiro Sugimoto, Kunio Kashino
"Visual attention driven by auditory cues: Selecting visual features in synchronization with attracting auditory cues,"
Proc. International Conference on Multimedia Modeling (MMM2015),
Sydney, Australia, January 2015.