ARME Virtuoso Strings Dataset

Birmingham City University, Sound and Music Analysis (SoMA) Group, Birmingham, UK
University of Birmingham, Birmingham, UK
University of Warwick, Coventry, UK


Why? To perform a detailed analysis of note onset types in a range of different musical conditions.

What?

Ensemble   Composer    Excerpt                  Bars
Quartet    Haydn       Op. 74 No. 1, Finale     1-48
Quartet    Beethoven   Op. 59 No. 3, Finale     210-271
Trio       Dohnanyi    Op. 10, Marcia           1-20
Duet       Mozart      K424, Mvt. 3, Var. 2     32-48
Solo       Haydn       Op. 74 No. 1, Finale     1-285
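
For illustration only, the excerpt list above can be written down as a small metadata table in code. This sketch is not part of the dataset's own tooling; the record fields are simply the table columns.

```python
from dataclasses import dataclass

@dataclass
class Excerpt:
    """One recorded excerpt from the Virtuoso Strings dataset."""
    ensemble: str  # Quartet, Trio, Duet, or Solo
    composer: str
    excerpt: str
    bars: str      # bar range within the score

# Excerpt list as given in the table above.
EXCERPTS = [
    Excerpt("Quartet", "Haydn",     "Op. 74 No. 1, Finale", "1-48"),
    Excerpt("Quartet", "Beethoven", "Op. 59 No. 3, Finale", "210-271"),
    Excerpt("Trio",    "Dohnanyi",  "Op. 10, Marcia",       "1-20"),
    Excerpt("Duet",    "Mozart",    "K424, Mvt. 3, Var. 2", "32-48"),
    Excerpt("Solo",    "Haydn",     "Op. 74 No. 1, Finale", "1-285"),
]

if __name__ == "__main__":
    for e in EXCERPTS:
        print(f"{e.ensemble}: {e.composer}, {e.excerpt} (bars {e.bars})")
```
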
How?
Using DPA 4099 series Lavalier microphones.

[Figure: onset detection results from all systems per instrument.]
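
The original page shows these results as a figure. As a hedged illustration of how such per-instrument onset detection results are typically computed, the sketch below scores a detector's estimated onsets against reference annotations with a 50 ms tolerance window using the mir_eval library. The file names, the file layout (one onset time in seconds per line), and the window size are assumptions for this example, not specifications of the dataset or of the paper's evaluation.

```python
import numpy as np
import mir_eval

# Hypothetical file names and layout: one onset time (in seconds) per line.
# The dataset's actual annotation format may differ.
reference_onsets = np.sort(np.loadtxt("annotations/violin1_onsets.txt"))
estimated_onsets = np.sort(np.loadtxt("detections/violin1_onsets.txt"))

# Standard onset evaluation: an estimated onset counts as a hit if it falls
# within +/- 50 ms of an unmatched reference onset.
f_measure, precision, recall = mir_eval.onset.f_measure(
    reference_onsets, estimated_onsets, window=0.05
)
print(f"F-measure: {f_measure:.3f}  precision: {precision:.3f}  recall: {recall:.3f}")
```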

Who? The musicians are the Coull Quartet.

Musical scores? Yes.

Note onset annotations? Yes.

Note type annotations? Yes, but only for one take.

Was each performance recorded more than once? Yes, each performance was recorded 12 times.

Were all recordings performed in the same style of playing?

No, each piece was recorded 12 times (i.e., 12 separate takes) for each of 3 different musical conditions. The conditions represent different playing styles and were chosen to span a wide range of realistic performance types. The three recorded conditions are defined below, followed by a sketch of the resulting take structure:

  • Normal condition (NR) represents a concert-style performance.
  • Speed condition (SP) includes accelerando and decelerando spontaneously initiated by a single musician (i.e., the leader) throughout each performance.
  • Deadpan condition (DP) represents performances where musicians were asked to play with minimal expression in tempo and accentuation.

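To make the take/condition structure concrete, here is a minimal sketch that enumerates the recordings one would expect per piece under the description above. The condition codes follow the definitions in the list, while the identifier pattern is purely hypothetical and not the dataset's actual file naming.

```python
from itertools import product

# Condition codes as defined above.
CONDITIONS = {
    "NR": "Normal (concert-style performance)",
    "SP": "Speed (leader-initiated accelerando/decelerando)",
    "DP": "Deadpan (minimal expression in tempo and accentuation)",
}
TAKES_PER_CONDITION = 12  # per the description above

def recording_ids(piece: str):
    """Yield one identifier per expected take; the naming pattern is hypothetical."""
    for condition, take in product(CONDITIONS, range(1, TAKES_PER_CONDITION + 1)):
        yield f"{piece}_{condition}_take{take:02d}"

if __name__ == "__main__":
    ids = list(recording_ids("haydn_op74_no1_finale"))
    print(f"{len(ids)} expected recordings for this piece, e.g. {ids[0]}")
```
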
For additional information, please refer to the paper.

BibTeX

If you use the Virtuoso Strings dataset in your research, please cite the following paper:

@inproceedings{Tomczak2023VirtuosoStrings,
  author    = {Tomczak, M. and Li, M. S. and Di Luca, M.},
  title     = {Virtuoso Strings: A Dataset of String Ensemble Recordings and Onset Annotations for Timing Analysis},
  booktitle = {Extended Abstracts for the Late-Breaking Demo Sessions of the 24th International Society for Music Information Retrieval ({ISMIR}) Conference},
  year      = {2023}
}