system HF staff committed on
Commit
f4dc8d1
1 Parent(s): 638d0c3

Update log.txt

Files changed (1): log.txt (+48 -0)
log.txt ADDED
@@ -0,0 +1,48 @@
+ Writing logs to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/log.txt.
+ Loading nlp dataset glue, subset stsb, split train.
+ Loading nlp dataset glue, subset stsb, split validation.
+ Loaded dataset. Found: 140 labels: ([0.0, 0.06700000166893005, 0.11800000071525574, 0.14300000667572021, 0.17000000178813934, 0.20000000298023224, 0.23100000619888306, 0.25, 0.3330000042915344, 0.4000000059604645, 0.4169999957084656, 0.5, 0.6000000238418579, 0.6430000066757202, 0.6669999957084656, 0.7269999980926514, 0.75, 0.800000011920929, 0.8330000042915344, 0.8500000238418579, 0.8889999985694885, 0.8999999761581421, 0.9440000057220459, 1.0, 1.100000023841858, 1.2000000476837158, 1.25, 1.2730000019073486, 1.2860000133514404, 1.3329999446868896, 1.399999976158142, 1.5, 1.5333333015441895, 1.555999994277954, 1.5829999446868896, 1.600000023841858, 1.6430000066757202, 1.6670000553131104, 1.7000000476837158, 1.7330000400543213, 1.75, 1.777999997138977, 1.7999999523162842, 1.8459999561309814, 2.0, 2.1111111640930176, 2.200000047683716, 2.25, 2.3299999237060547, 2.3329999446868896, 2.375, 2.4000000953674316, 2.4666666984558105, 2.5, 2.5329999923706055, 2.5829999446868896, 2.5880000591278076, 2.5999999046325684, 2.625, 2.6470000743865967, 2.6670000553131104, 2.700000047683716, 2.75, 2.7690000534057617, 2.799999952316284, 2.818000078201294, 2.8299999237060547, 2.875, 2.9089999198913574, 2.9170000553131104, 3.0, 3.055999994277954, 3.066999912261963, 3.0999999046325684, 3.1110000610351562, 3.1670000553131104, 3.200000047683716, 3.2309999465942383, 3.25, 3.2730000019073486, 3.3329999446868896, 3.3333332538604736, 3.4000000953674316, 3.437999963760376, 3.444000005722046, 3.4549999237060547, 3.5, 3.5329999923706055, 3.5999999046325684, 3.615000009536743, 3.625, 3.6429998874664307, 3.6670000553131104, 3.6700000762939453, 3.691999912261963, 3.75, 3.765000104904175, 3.7690000534057617, 3.777777671813965, 3.7860000133514404, 3.799999952316284, 3.8329999446868896, 3.8459999561309814, 3.8570001125335693, 3.867000102996826, 3.875, 3.9089999198913574, 3.9230000972747803, 3.928999900817871, 3.933000087738037, 3.937999963760376, 3.940999984741211, 4.0, 4.056000232696533, 
4.091000080108643, 4.099999904632568, 4.132999897003174, 4.176000118255615, 4.199999809265137, 4.25, 4.308000087738037, 4.329999923706055, 4.333000183105469, 4.363999843597412, 4.400000095367432, 4.5, 4.571000099182129, 4.5714287757873535, 4.599999904632568, 4.666999816894531, 4.7270002365112305, 4.75, 4.7779998779296875, 4.800000190734863, 4.817999839782715, 4.85699987411499, 4.875, 4.908999919891357, 4.922999858856201, 5.0])
+ Detected float labels. Doing regression.
+ Loading transformers AutoModelForSequenceClassification: roberta-base
+ Tokenizing training data. (len: 5749)
+ Tokenizing eval data (len: 1500)
+ Loaded data and tokenized in 15.870370388031006s
+ Training model across 2 GPUs
+ ***** Running training *****
+ Num examples = 5749
+ Batch size = 128
+ Max sequence length = 128
+ Num steps = 220
+ Num epochs = 5
+ Learning rate = 3e-05
+ Failed to predict with model <class 'torch.nn.parallel.data_parallel.DataParallel'>. Check tokenizer configuration.
+ Writing logs to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/log.txt.
+ Loading nlp dataset glue, subset stsb, split train.
+ Loading nlp dataset glue, subset stsb, split validation.
+ Loaded dataset. Found: 140 labels: ([0.0, 0.06700000166893005, 0.11800000071525574, 0.14300000667572021, 0.17000000178813934, 0.20000000298023224, 0.23100000619888306, 0.25, 0.3330000042915344, 0.4000000059604645, 0.4169999957084656, 0.5, 0.6000000238418579, 0.6430000066757202, 0.6669999957084656, 0.7269999980926514, 0.75, 0.800000011920929, 0.8330000042915344, 0.8500000238418579, 0.8889999985694885, 0.8999999761581421, 0.9440000057220459, 1.0, 1.100000023841858, 1.2000000476837158, 1.25, 1.2730000019073486, 1.2860000133514404, 1.3329999446868896, 1.399999976158142, 1.5, 1.5333333015441895, 1.555999994277954, 1.5829999446868896, 1.600000023841858, 1.6430000066757202, 1.6670000553131104, 1.7000000476837158, 1.7330000400543213, 1.75, 1.777999997138977, 1.7999999523162842, 1.8459999561309814, 2.0, 2.1111111640930176, 2.200000047683716, 2.25, 2.3299999237060547, 2.3329999446868896, 2.375, 2.4000000953674316, 2.4666666984558105, 2.5, 2.5329999923706055, 2.5829999446868896, 2.5880000591278076, 2.5999999046325684, 2.625, 2.6470000743865967, 2.6670000553131104, 2.700000047683716, 2.75, 2.7690000534057617, 2.799999952316284, 2.818000078201294, 2.8299999237060547, 2.875, 2.9089999198913574, 2.9170000553131104, 3.0, 3.055999994277954, 3.066999912261963, 3.0999999046325684, 3.1110000610351562, 3.1670000553131104, 3.200000047683716, 3.2309999465942383, 3.25, 3.2730000019073486, 3.3329999446868896, 3.3333332538604736, 3.4000000953674316, 3.437999963760376, 3.444000005722046, 3.4549999237060547, 3.5, 3.5329999923706055, 3.5999999046325684, 3.615000009536743, 3.625, 3.6429998874664307, 3.6670000553131104, 3.6700000762939453, 3.691999912261963, 3.75, 3.765000104904175, 3.7690000534057617, 3.777777671813965, 3.7860000133514404, 3.799999952316284, 3.8329999446868896, 3.8459999561309814, 3.8570001125335693, 3.867000102996826, 3.875, 3.9089999198913574, 3.9230000972747803, 3.928999900817871, 3.933000087738037, 3.937999963760376, 3.940999984741211, 4.0, 4.056000232696533, 
4.091000080108643, 4.099999904632568, 4.132999897003174, 4.176000118255615, 4.199999809265137, 4.25, 4.308000087738037, 4.329999923706055, 4.333000183105469, 4.363999843597412, 4.400000095367432, 4.5, 4.571000099182129, 4.5714287757873535, 4.599999904632568, 4.666999816894531, 4.7270002365112305, 4.75, 4.7779998779296875, 4.800000190734863, 4.817999839782715, 4.85699987411499, 4.875, 4.908999919891357, 4.922999858856201, 5.0])
+ Detected float labels. Doing regression.
+ Loading transformers AutoModelForSequenceClassification: roberta-base
+ Tokenizing training data. (len: 5749)
+ Tokenizing eval data (len: 1500)
+ Loaded data and tokenized in 16.126989126205444s
+ Training model across 2 GPUs
+ ***** Running training *****
+ Num examples = 5749
+ Batch size = 8
+ Max sequence length = 128
+ Num steps = 3590
+ Num epochs = 5
+ Learning rate = 2e-05
+ Eval pearson correlation: 89.20755032155974%
+ Best acc found. Saved model to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/.
+ Eval pearson correlation: 90.38407106335993%
+ Best acc found. Saved model to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/.
+ Eval pearson correlation: 90.50123920521006%
+ Best acc found. Saved model to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/.
+ Eval pearson correlation: 90.99130607031618%
+ Best acc found. Saved model to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/.
+ Eval pearson correlation: 91.08696741479217%
+ Best acc found. Saved model to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/.
+ Saved tokenizer <textattack.models.tokenizers.auto_tokenizer.AutoTokenizer object at 0x7fe416ce2700> to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/.
+ Wrote README to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/README.md.
+ Wrote training args to /p/qdata/jm8wx/research/text_attacks/textattack/outputs/training/roberta-base-glue:stsb-2020-06-29-12:29/train_args.json.