Keitaro Tanaka, Takayuki Nakatsuka, Ryo Nishikimi, Kazuyoshi Yoshii and Shigeo Morishima
Multi-Instrument Music Transcription Based on Deep Spherical Clustering of Spectrograms and Pitchgrams
21st International Society for Music Information Retrieval Conference (ISMIR 2020)
This paper describes a clustering-based music transcription method that estimates the piano rolls of arbitrary musical instrument parts from multi-instrument polyphonic music signals. If target musical pieces are always played by particular kinds of musical instruments, a straightforward way to obtain piano rolls is to compute the pitchgram (pitch saliency spectrogram) of each musical instrument with a deep neural network (DNN). However, this approach has a critical limitation: it cannot deal with musical pieces that include undefined musical instruments. To overcome this limitation, we estimate a condensed pitchgram with an instrument-independent neural multi-pitch estimator and then separate the pitchgram into a specified number of musical instrument parts with a deep spherical clustering technique. To improve transcription performance, we propose a joint spectrogram and pitchgram clustering method based on the timbral and positional characteristics of musical instruments. The experimental results show that the proposed method can transcribe musical pieces including unknown musical instruments as well as those containing only predefined instruments, at state-of-the-art transcription accuracy.
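To make the clustering step concrete, the following is a minimal sketch (not the authors' implementation) of how a condensed pitchgram might be separated into a specified number of instrument parts by spherical clustering: per-bin DNN embeddings are projected onto the unit hypersphere and then grouped. The embedding array, the function name `separate_pitchgram`, and the use of k-means as a stand-in for the clustering routine are all illustrative assumptions; k-means on unit-norm vectors is used only because Euclidean distance on the sphere is monotonically related to cosine distance.

```python
# Minimal sketch of spherical clustering of pitchgram-bin embeddings.
# NOT the paper's implementation; embeddings are random stand-ins for the
# DNN outputs described in the abstract.
import numpy as np
from sklearn.cluster import KMeans


def separate_pitchgram(embeddings: np.ndarray, n_instruments: int) -> np.ndarray:
    """Assign each active time-pitch bin to one of `n_instruments` parts.

    embeddings: (n_bins, embedding_dim) array, one embedding per active
                time-pitch bin of the condensed pitchgram (assumed shape).
    Returns an array of cluster labels with shape (n_bins,).
    """
    # Project embeddings onto the unit hypersphere (the "spherical" part).
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.maximum(norms, 1e-12)
    # k-means on unit vectors approximates clustering by cosine similarity.
    return KMeans(n_clusters=n_instruments, n_init=10, random_state=0).fit_predict(unit)


# Toy usage: 500 active bins, 20-dimensional embeddings, 3 instrument parts.
labels = separate_pitchgram(np.random.randn(500, 20), n_instruments=3)
print(labels.shape, np.unique(labels))
```

In this sketch, the cluster labels would index the instrument part to which each time-pitch bin is assigned, yielding one piano roll per cluster.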