Write once run anywhere revisited: machine learning and audio tools in the browser with C++ and emscripten

Zbyszynski, Michael; Grierson, Mick; Yee-King, Matthew and Fedden, Leon. 2017. 'Write once run anywhere revisited: machine learning and audio tools in the browser with C++ and emscripten'. In: Web Audio Conference 2017. Centre for Digital Music, Queen Mary University of London, United Kingdom 21-23 August 2017. [Conference or Workshop Item]

Web audio conference rapidmix codecircle paper(2).pdf (Published Version), available under a Creative Commons Attribution license.


Abstract or Description

A methodology is described for deploying interactive machine learning and audio tools written in C++ across a wide variety of platforms, including web browsers. The workflow involves developing the code base in C++, making use of all the facilities available to C++ programmers, and then transpiling it to asm.js with Emscripten so that the libraries can be used in web browsers. Audio capabilities are provided by the C++ Maximilian library, which is transpiled and connected to the Web Audio API via a ScriptProcessorNode. Machine learning is provided by the RapidLib library, which implements neural networks, k-NN and Dynamic Time Warping for regression and classification tasks. An online, browser-based IDE is the final part of the system, making the toolkit available for education and rapid prototyping without requiring any software other than a web browser. Two example use cases are described: rapid prototyping of novel electronic instruments, and education. Finally, an evaluation of the performance of the libraries is presented, showing that they perform acceptably well in the web browser compared to their native counterparts, although there is room for improvement. The system is being used by thousands of students in our on-campus and online courses.
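To illustrate the transpilation approach summarised above, the following is a minimal sketch (not taken from the paper's code base) of how a small C++ synthesis class might be exposed to the browser with Emscripten's embind, so that a ScriptProcessorNode callback in JavaScript could pull samples from the compiled module. The SineOsc class and its method names are hypothetical stand-ins for a Maximilian-style object; only the emscripten/bind.h header and the EMSCRIPTEN_BINDINGS macro are real Emscripten APIs.

// Hypothetical sketch: a tiny oscillator class exposed to JavaScript via embind.
// SineOsc is an illustrative stand-in, not part of Maximilian or RapidLib.
#include <emscripten/bind.h>
#include <cmath>

class SineOsc {
public:
    explicit SineOsc(double sampleRate) : sampleRate_(sampleRate), phase_(0.0) {}

    // Return one sample of a sine wave at the given frequency in Hz.
    double tick(double frequency) {
        const double twoPi = 6.283185307179586;
        double out = std::sin(phase_ * twoPi);
        phase_ += frequency / sampleRate_;
        if (phase_ >= 1.0) phase_ -= 1.0;
        return out;
    }

private:
    double sampleRate_;
    double phase_;
};

// Bind the class so JavaScript can construct it and call tick(), for example
// from inside a ScriptProcessorNode's onaudioprocess handler:
//   const osc = new Module.SineOsc(context.sampleRate);
//   for (let i = 0; i < buffer.length; i++) buffer[i] = osc.tick(440);
EMSCRIPTEN_BINDINGS(sine_module) {
    emscripten::class_<SineOsc>("SineOsc")
        .constructor<double>()
        .function("tick", &SineOsc::tick);
}

Compiled with emcc and embind enabled, a module like this would yield an asm.js (or JavaScript) build whose objects can be driven sample-by-sample from the Web Audio API, which is the general pattern the paper describes for Maximilian and RapidLib; the exact build flags and bindings used by those libraries are not shown here.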

Item Type:

Conference or Workshop Item (Paper)

Additional Information:

This work was partially funded under the HEFCE Catalyst Programme, project code PK31.
The research leading to these results has also received funding from the European Research Council under the European Union's Horizon 2020 programme, H2020-ICT-2014-1, Project ID 644862.

Departments, Centres and Research Units:

Computing > Embodied AudioVisual Interaction Group (EAVI)

Dates:

16 May 2017: Accepted
21 August 2017: Published

Event Location:

Centre for Digital Music, Queen Mary University of London, United Kingdom

Date range:

21-23 August 2017

Item ID:

20968

Date Deposited:

11 Sep 2017 11:00

Last Modified:

29 Apr 2020 16:32

URI:

https://research.gold.ac.uk/id/eprint/20968
