---
component-id: deep-listening
name: Deep Listening
description: Software, methods and user studies exploring the cross-modal interpretation of music and visual art
type: UserInterface
work-package:
- WP1
- WP5
project: polifonia-project
licence: CC-BY_v4
related-components:
- story:
  - Paul#1_OrganComparison
- persona:
  - Paul_Organ_Advisor
bibliography:
- publication:
  - "Paul Mulholland, Adam Stoneman, Naomi Barker, Mark Maguire, Jason Carvalho, Enrico Daga, and Paul Warren. 2023. The Sound of Paintings: Using Citizen Curation to Explore the Cross-Modal Personalization of Museum Experiences. In UMAP '23 Adjunct: Adjunct Proceedings of the 31st ACM Conference on User Modeling, Adaptation and Personalization (UMAP '23 Adjunct), June 26–29, 2023, Limassol, Cyprus. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3563359.3596662"
---

# Deep Listening

Deep Listening is being carried out as part of the Polifonia project to investigate how the cross-modal interpretation of music and visual art can enhance what you hear and what you see.

The work extends the Deep Viewpoints software, developed as part of the EU H2020 SPICE project to support the process of Slow Looking at visual art. Within Deep Viewpoints, the processes of observing and responding to art are guided by scripts. Each script is a sequence of stages containing artworks, statements, and prompts or questions to which the reader of the script can respond (see the sketch below). During the SPICE project, the Irish Museum of Modern Art (IMMA) used Deep Viewpoints in an initiative to reach communities traditionally underserved by the museum sector and to bring new perspectives to the museum's collection and exhibitions. Participating communities were involved not only in interpreting artworks with the guidance of the scripts but also in creating new scripts, mediating how others observe and think about art.
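The actual Deep Viewpoints schema is not shown in this repository; as a rough illustration of the script structure described above, here is a minimal sketch in TypeScript. All type and field names are invented for illustration.

```typescript
// Hypothetical sketch of a Deep Viewpoints script, inferred from the
// description above; the real data model may differ.

interface Artwork {
  title: string;
  imageUrl: string;   // image of the work in the museum collection
  artist?: string;
}

// A stage element shows an artwork, makes a statement, or poses a
// prompt to which the reader responds in free text.
type StageElement =
  | { kind: "artwork"; artwork: Artwork }
  | { kind: "statement"; text: string }
  | { kind: "prompt"; question: string };

interface Stage {
  elements: StageElement[];
}

// A script is a titled sequence of stages, followed in order.
interface Script {
  title: string;
  author: string;     // scripts can be authored by community members
  stages: Stage[];
}
```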

Recent work in collaboration between the Polifonia and SPICE projects has investigated how Deep Viewpoints could be extended to support the cross-modal interpretation of music and visual art. First, support was added for embedding YouTube videos within scripts, so that the reader can listen to music while viewing artworks, reading associated text, and answering provided prompts on the same page of the app. Second, multiple-choice as well as free-text responses were added, supporting, for example, the rating of music on a scale or the selection of an emotion that matches the music. Third, a responsive web design (RWD) approach was taken to both the following and the authoring of scripts, so that scripts can be followed and authored on personal smartphones (potentially with headphones) while in the museum, as well as on larger-screen devices.
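These three additions could be modelled as extensions of the hypothetical sketch above. Again, the names below are illustrative assumptions, not the project's actual API; `StageElement` is the type from the previous sketch.

```typescript
// (1) An embedded YouTube video, so music can play on the same page as
// the artworks, text, and prompts being viewed.
interface MusicEmbed {
  kind: "music";
  youtubeVideoId: string; // e.g. the ID parsed from a YouTube URL
}

// (2) Multiple-choice prompts alongside free text, e.g. rating the music
// on a scale or choosing an emotion that matches it.
interface MultipleChoicePrompt {
  kind: "multipleChoice";
  question: string;
  options: string[]; // e.g. points on a scale, or a list of emotions
}

// A stage element in the revised software is either an original element
// or one of the new cross-modal elements.
type ExtendedStageElement = StageElement | MusicEmbed | MultipleChoicePrompt;

// (3) Following and authoring use responsive layouts, so the same script
// renders on a smartphone in the gallery or on a larger screen elsewhere.
```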

The revised software has been used in two ways: (i) by a musicologist curating experiences that link music to visual art in a museum collection, and (ii) by visitors to a museum exhibition experiencing and creating cross-modal experiences.

A second Deep Listening experiment has explored how cross-modal interpretation works online, using a range of music and visual art.

The browsing and navigation paradigm developed for Deep Listening is also being applied to support the public in exploring the ORGANS dataset.
