Human-in-the-Loop Adaptation for Interactive Beat Tracking

Kazuhiko Yamamoto (YAMAHA)
International Society for Music Information Retrieval, Proc. of ISMIR, 2021


[Teaser figure]



Abstract

In music information retrieval (MIR), beat tracking is one of the most fundamental and important tasks. However, a perfect algorithm is difficult to achieve. In addition, there may be no unique correct answer, because what one interprets as a beat differs from individual to individual. To address this, we propose a novel human-in-the-loop user interface that allows the system to interactively adapt to a specific user and target piece of music. With our system, the user does not need to correct all errors manually, but only a small portion of them. The system then adapts its internal neural network model to the target and automatically corrects the remaining errors. This is achieved by a novel adaptive runtime self-attention, in which the adaptable parameters are intimately integrated as a part of the user interface. It enables both low-cost training that uses only the local context of a music piece and, in contrast, highly effective runtime adaptation that uses the global context. Our experiments show that the framework dramatically reduces the user's effort in correcting beat-tracking errors.
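
The interactive workflow can be summarized, in a simplified form, as follows. This is a minimal sketch assuming a generic PyTorch beat-tracking model that outputs frame-wise beat activations and exposes a hypothetical adaptable_parameters() accessor; the names are illustrative and this is not the paper's actual implementation.

    import torch
    import torch.nn.functional as F

    def adapt_to_user(model, features, corrected_frames, corrected_labels,
                      steps=50, lr=1e-4):
        """Fine-tune the model on a handful of user-corrected frames, then re-infer.

        features         : (1, T, D) input features for the whole piece
        corrected_frames : indices of the few frames the user has explicitly marked
        corrected_labels : 1.0 for "beat", 0.0 for "no beat" at those frames
        (All names are illustrative; the paper's actual interface differs.)
        """
        optimizer = torch.optim.Adam(model.adaptable_parameters(), lr=lr)
        model.train()
        for _ in range(steps):
            logits = model(features)                   # (1, T) beat activation logits
            loss = F.binary_cross_entropy_with_logits(
                logits[0, corrected_frames], corrected_labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        model.eval()
        with torch.no_grad():
            # Updated beat activations for the entire piece, not just the corrected part
            return torch.sigmoid(model(features))
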




Video



Description

We present a novel adaptive runtime self-attention model (ARSA), in which the adaptable parameters are intimately integrated as a part of the interactive user interface. The ARSA is trained using only local context from the training dataset and adapts to the user using the global context at runtime. The locally-aware training keeps the computational cost of training low, while the globally-aware runtime adaptation allows the effects of the user's local corrections to propagate throughout the entire piece of music. This strategy is based on our assumption that the amount of local feedback a user can practically provide could be insufficient for adaptation on its own.
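
The sketch below illustrates the core idea in simplified form: a self-attention block whose attention span is restricted to a local window during training but covers the whole piece at runtime, and whose small per-piece parameters are the only ones updated during adaptation. The class and member names (RuntimeAdaptiveAttention, adaptable_parameters, adapt_gate) are assumptions for illustration, not the actual ARSA formulation.

    import torch
    import torch.nn as nn

    class RuntimeAdaptiveAttention(nn.Module):
        """Sketch of a self-attention block whose small, per-piece parameters
        can be fine-tuned at runtime while the bulk of the weights stays frozen.
        (Illustrative only; not the paper's exact ARSA formulation.)"""

        def __init__(self, dim, heads=4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)
            # Small adaptable component: a per-channel gate updated from user feedback.
            self.adapt_gate = nn.Parameter(torch.zeros(dim))

        def adaptable_parameters(self):
            # Only this tiny set is optimized during runtime adaptation.
            return [self.adapt_gate]

        def forward(self, x, window=None):
            # Training: restrict attention to a local window to keep cost low.
            # Runtime: window=None lets every frame attend over the whole piece,
            # so a few local corrections can influence the global analysis.
            mask = None
            if window is not None:
                t = x.size(1)
                idx = torch.arange(t, device=x.device)
                mask = (idx[None, :] - idx[:, None]).abs() > window  # True = blocked
            out, _ = self.attn(x, x, x, attn_mask=mask)
            return self.norm(x + out * torch.sigmoid(self.adapt_gate))

Keeping the runtime-adaptable set small in this way is what makes adaptation from only a handful of corrections feasible, while full-sequence attention at inference time spreads those corrections across the whole piece.
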

[Figure: adaptive runtime self-attention (ARSA) overview]



Presentation



Publication


・Kazuhiko Yamamoto. 2021. Human-in-the-Loop Adaptation for Interactive Beat Tracking.
International Society for Music Information Retrieval, Proc. of ISMIR, 2021  [PDF]

・Kazuhiko Yamamoto. 2021. Interactive Musical Beat Analysis via Human-in-the-Loop Adaptation (in Japanese).
WISS 2021, the 29th Workshop on Interactive Systems and Software, 2021   [PDF]





Contact

Kazuhiko Yamamoto – www.yamo-n.org
mail:   yamo_o(at)acm.org
twitter:   @yamo_o