Using Interactive Machine Learning to Sonify Visually Impaired Dancers’ Movement
Katan, Simon. 2016. "Using Interactive Machine Learning to Sonify Visually Impaired Dancers' Movement." In Proceedings of the 3rd International Symposium on Movement and Computing (MOCO '16), 5–7 July 2016, Thessaloniki, Greece.
Text: MOCO16_SKATAN_noCP.docx (Accepted Version, 613kB). Available under License: Creative Commons Attribution Non-Commercial No Derivatives.
Abstract:
This preliminary research investigates the application of Interactive Machine Learning (IML) to sonify the movements of visually impaired dancers. Using custom wearable devices with localized sound, our observations demonstrate how sonification enables the communication of time-based information about movements, such as phrase length and periodicity, as well as more nuanced information such as magnitudes and accelerations. The work raises a number of challenges regarding the application of IML to this domain. In particular, we identify a need to ensure even rates of change in regression models when performing sonification, and a need to consider how machine learning approaches are conveyed to end users.
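The identified need for even rates of change suggests a simple run-time safeguard. Below is a minimal sketch, assuming a Wekinator-style IML workflow rather than the paper's actual system: demonstrated sensor frames paired with target sound-parameter values fit a linear regression, and a slew limiter clamps the per-frame change of the predicted parameter so the sonification evolves at an even rate. All class names, feature dimensions, and values here are hypothetical.

    # Minimal sketch (not the paper's implementation) of regression-based
    # sonification with slew limiting. Names and values are hypothetical.
    import numpy as np

    class SonificationRegressor:
        """Linear least-squares map from sensor features to one sound parameter."""

        def __init__(self, max_step=0.02):
            self.max_step = max_step      # largest allowed change per frame
            self.weights = None
            self.last_output = 0.0

        def fit(self, frames, targets):
            # frames: (n, d) sensor feature vectors; targets: (n,) parameter values
            X = np.hstack([frames, np.ones((len(frames), 1))])  # add bias column
            self.weights, *_ = np.linalg.lstsq(X, np.asarray(targets), rcond=None)

        def predict(self, frame):
            raw = float(np.append(frame, 1.0) @ self.weights)
            # Slew limiting: clamp the per-frame change so the mapped
            # parameter evolves at an even, bounded rate.
            step = np.clip(raw - self.last_output, -self.max_step, self.max_step)
            self.last_output += step
            return self.last_output

    # Hypothetical usage: two demonstrated poses mapped to parameter extremes.
    rng = np.random.default_rng(0)
    low_pose = rng.normal(0.0, 0.1, (20, 3))    # e.g. arm lowered
    high_pose = rng.normal(1.0, 0.1, (20, 3))   # e.g. arm raised
    model = SonificationRegressor()
    model.fit(np.vstack([low_pose, high_pose]), [0.0] * 20 + [1.0] * 20)
    print(model.predict(np.array([0.9, 0.9, 0.9])))  # ramps toward 1.0 over frames

Clamping the output rather than smoothing the input keeps the mapping responsive to pose while bounding how quickly the sound can change, one plausible reading of the "even rates of change" requirement.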
Item Type: Article
Keywords: Interactive Machine Learning; Accessible Interfaces; Dance; Sonification
Item ID: 18854
Date Deposited: 02 Sep 2016 13:20
Last Modified: 20 Jun 2017 10:35
Peer Reviewed: Yes, this version has been peer-reviewed.