“Bowing-Net: Motion Generation for String Instruments Based on Bowing Information” by Hirata, Tanaka, Shimamura and Morishima

  • ©Asuka Hirata, Keitaro Tanaka, Ryo Shimamura, and Shigeo Morishima

Entry Number: 37

Title:

    Bowing-Net: Motion Generation for String Instruments Based on Bowing Information

Presenter(s)/Author(s):

    Asuka Hirata, Keitaro Tanaka, Ryo Shimamura, and Shigeo Morishima

Abstract:


    This paper presents a deep-learning-based method that generates body motion for string instrument performance from raw audio. In contrast to prior methods, which aim to predict joint positions directly from audio, we first estimate the information that dictates the bowing dynamics, such as the bow direction and the played string. The final body motion is then determined from this information by a conversion rule. By adopting bowing information as the target domain, not only does the mapping become more feasible to learn, but the produced results also exhibit bowing dynamics consistent with the given audio. Extensive experiments confirm that our results are superior to those of existing methods.
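
    The two-stage pipeline described above (raw audio -> bowing information -> body motion via a conversion rule) can be illustrated with a short sketch. The code below is not the authors' implementation; the mel-spectrogram input, the network layout, the toy conversion rule, and every module and function name (BowingInfoEstimator, bowing_info_to_motion) are assumptions made purely for illustration.

    # Minimal sketch of the two-stage idea in the abstract: estimate bowing
    # information from audio, then convert it to motion with a fixed rule.
    # NOT the authors' code; all names and details are illustrative assumptions.
    import torch
    import torch.nn as nn


    class BowingInfoEstimator(nn.Module):
        """Predicts frame-wise bow direction and played string from audio features."""

        def __init__(self, n_mels=80, hidden=256, n_strings=4):
            super().__init__()
            self.encoder = nn.GRU(n_mels, hidden, batch_first=True, bidirectional=True)
            self.direction_head = nn.Linear(2 * hidden, 1)       # up-bow vs. down-bow
            self.string_head = nn.Linear(2 * hidden, n_strings)  # which string is bowed

        def forward(self, mel):                                   # mel: (batch, frames, n_mels)
            feats, _ = self.encoder(mel)
            direction = torch.sigmoid(self.direction_head(feats))  # (batch, frames, 1)
            string_logits = self.string_head(feats)                # (batch, frames, n_strings)
            return direction, string_logits


    def bowing_info_to_motion(direction, string_logits):
        """Stand-in for the paper's conversion rule: bowing info -> bow-arm motion.

        Integrates the predicted bow direction into a 1-D bow displacement and
        takes the argmax string; the actual rule would output joint positions.
        """
        step = (direction > 0.5).float() * 2.0 - 1.0   # +1 for up-bow, -1 for down-bow
        bow_position = torch.cumsum(step, dim=1)       # (batch, frames, 1)
        played_string = string_logits.argmax(dim=-1)   # (batch, frames)
        return bow_position, played_string


    if __name__ == "__main__":
        mel = torch.randn(1, 200, 80)                  # 200 frames of a mel-spectrogram
        direction, string_logits = BowingInfoEstimator()(mel)
        bow_position, played_string = bowing_info_to_motion(direction, string_logits)
        print(bow_position.shape, played_string.shape)

    The design point this sketch mirrors is that the intermediate bowing representation is low-dimensional and tightly tied to the audio, which is why, as the abstract states, learning the mapping becomes more feasible and the converted motion keeps its bowing dynamics consistent with the sound.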

Acknowledgements:


    This work was supported by JST Mirai Program No. JPMJMI19B2, and JSPS KAKENHI Nos. 19H01129 and 19H04137.

