Xin Xu committed
Commit 713e177 · 1 Parent(s): 172276e

Update README.md

Files changed (1): README.md +16 -16
README.md CHANGED
@@ -13,24 +13,24 @@ tags:
 
 # Text-to-Attribute Understanding
 This is the text-to-attribute model to extract musical attributes from text, introduced in the paper [*MuseCoco: Generating Symbolic Music from Text*](https://arxiv.org/abs/2306.00110) and [first released in this repository](https://github.com/microsoft/muzic/tree/main/musecoco).
- It is based on BERT-large and has multiple classification heads for diverse musical attributes.
+ It is based on BERT-large and has multiple classification heads for diverse musical attributes:
 
- There is the mapping between keywords used in the model and musical attributes:
 ```json
- {
- "I1s2": "Instrument",
- "R1": "Rhythm Danceability",
- "R3": "Rhythm Intensity",
- "S2s1": "Artist",
- "S4": "Genre",
- "B1s1": "Bar",
- "TS1s1": "Time Signature",
- "K1": "Key",
- "T1s1": "Tempo",
- "P4": "Pitch Range",
- "EM1": "Emotion",
- "TM1": "Time"
- }
+ [
+ "Instrument",
+ "Rhythm Danceability",
+ "Rhythm Intensity",
+ "Artist",
+ "Genre",
+ "Bar",
+ "Time Signature",
+ "Key",
+ "Tempo",
+ "Pitch Range",
+ "Emotion",
+ "Time"
+ ]
 ```
 
 # BibTeX entry and citation info
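For reference, the keyword-to-attribute mapping that this commit drops from the README can be kept at hand as a small Python dict. This is only a convenience sketch built from the README text; `KEYWORD_TO_ATTRIBUTE` and `attribute_name` are illustrative names, not part of the released MuseCoco code:

```python
# Mapping from the model's internal attribute keywords to human-readable
# attribute names, as listed in the README before this commit.
# (Illustrative helper, not part of the released MuseCoco codebase.)
KEYWORD_TO_ATTRIBUTE = {
    "I1s2": "Instrument",
    "R1": "Rhythm Danceability",
    "R3": "Rhythm Intensity",
    "S2s1": "Artist",
    "S4": "Genre",
    "B1s1": "Bar",
    "TS1s1": "Time Signature",
    "K1": "Key",
    "T1s1": "Tempo",
    "P4": "Pitch Range",
    "EM1": "Emotion",
    "TM1": "Time",
}

def attribute_name(keyword: str) -> str:
    """Return the human-readable attribute name for a model keyword."""
    return KEYWORD_TO_ATTRIBUTE[keyword]

print(attribute_name("TS1s1"))  # Time Signature
```

This can be handy when inspecting the model's classification-head outputs, which are keyed by the short codes rather than the attribute names.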