YusuphaJuwara committed on
Commit 7c9d4dd
1 parent: f405f52

Push model using huggingface_hub.

Files changed (2)
  1. README.md +5 -1028
  2. model.safetensors +1 -1
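Per the commit message, the files were pushed with `huggingface_hub`. A minimal sketch of such an upload might look like the following; the repo id is a guess assembled from the username and the model-index name, so treat it as hypothetical, and the call itself is commented out because it needs authentication and network access.

```python
# Sketch of a push like this commit's, using huggingface_hub's HfApi.
# The repo id below is hypothetical (username + model-index name).
from huggingface_hub import HfApi

api = HfApi()

# Uncomment to actually upload (requires `huggingface-cli login` and network):
# api.upload_file(
#     path_or_fileobj="model.safetensors",
#     path_in_repo="model.safetensors",
#     repo_id="YusuphaJuwara/diffusion-practice-v1",
#     commit_message="Push model using huggingface_hub.",
# )
```

Alternatively, `PyTorchModelHubMixin.push_to_hub()` can produce this kind of commit (weights plus an auto-generated README with YAML metadata) in a single call.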
README.md CHANGED
@@ -1,1032 +1,9 @@
  ---
  tags:
- - Diffusion
- - Data Generation
- language: en
- task: Data generation for computer vision tasks
- datasets: MNIST
- metrics:
- epoch: 30
- train_loss:
- - 0.012823267839848995
- - 0.01400498952716589
- - 0.013216765597462654
- [934 further per-step loss values, all in the 0.0070–0.0299 range, elided]
- - 0.012512920424342155
- license: unknown
- model-index:
- - name: diffusion-practice-v1
-   results:
-   - task:
-       type: nlp
-       name: Data Generation with Diffusion Model
-     dataset:
-       name: MNIST
-       type: mnist
-     metrics:
-     - type: loss
-       value: '0.01'
-       name: Loss
-       verified: false
  ---
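A per-step loss list this long is easier to read as summary statistics. A quick sketch, using only the first three entries of the removed `train_loss` block as sample data:

```python
# Summarize a long per-step train_loss list such as the one removed above.
# Only the first three entries of that list are reproduced here.
train_loss = [0.012823267839848995, 0.01400498952716589, 0.013216765597462654]

mean_loss = sum(train_loss) / len(train_loss)
print(f"mean={mean_loss:.4f} min={min(train_loss):.4f} max={max(train_loss):.4f}")
# prints: mean=0.0133 min=0.0128 max=0.0140
```

Storing only such aggregates (mean, min, max, final loss) in the card metadata keeps the README readable while preserving the training signal.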
 
- # NLI-FEVER Model
-
- This model is fine-tuned for Natural Language Inference (NLI) tasks using the FEVER dataset.
-
- ## Model description
-
- ## Intended uses & limitations
-
- This model is intended for use in NLI tasks, particularly those related to fact-checking and verifying information.
- It should not be used for tasks it wasn't explicitly trained for.
-
- ## Training and evaluation data
-
- The model was trained on the FEVER (Fact Extraction and VERification) dataset.
-
- ## Training procedure
-
- The model was trained for 30 epochs
- Train Losses of [0.012823267839848995, 0.01400498952716589, 0.013216765597462654, 0.018119243904948235, 0.009935321286320686, 0.016109945252537727, 0.017806213349103928, 0.019163455814123154, 0.029926661401987076, 0.01744312420487404, 0.014265676029026508, 0.013890810310840607, 0.013515425845980644, 0.015186883509159088, 0.014874613843858242, 0.014196827076375484, 0.016532881185412407, 0.012824700213968754, 0.021470380946993828, 0.013137794099748135, 0.01830434426665306, 0.013973318971693516, 0.017635220661759377, 0.018215475603938103, 0.014680536463856697, 0.013698616065084934, 0.01598021388053894, 0.020114580169320107, 0.01238455343991518, 0.015865936875343323, 0.01747105084359646, 0.009441102854907513, 0.01144119631499052, 0.016735738143324852, 0.012419845908880234, 0.011100444942712784, 0.01321379840373993, 0.015999609604477882, 0.01474534347653389, 0.014897378161549568, 0.017970088869333267, 0.015076511539518833, 0.012804070487618446, 0.01309606246650219, 0.02567661926150322, 0.016334712505340576, 0.014284917153418064, 0.012897239066660404, 0.014166726730763912, 0.013747263699769974, 0.012792781926691532, 0.01412202324718237, 0.016400407999753952, 0.01397175993770361, 0.017354147508740425, 0.019110742956399918, 0.010663657449185848, 0.008344266563653946, 0.012649071402847767, 0.017087502405047417, 0.019236430525779724, 0.015192188322544098, 0.013947468250989914, 0.013416292145848274, 0.015221521258354187, 0.017262130975723267, 0.011644840240478516, 0.013088601641356945, 0.012088057585060596, 0.017562881112098694, 0.019925445318222046, 0.01784220151603222, 0.011969857849180698, 0.015469592064619064, 0.019924024119973183, 0.014720432460308075, 0.01510943379253149, 0.015400998294353485, 0.014177696779370308, 0.012275025248527527, 0.010276070795953274, 0.010272706858813763, 0.01713935099542141, 0.015382635407149792, 0.012620631605386734, 0.009233148768544197, 0.012297826819121838, 0.015988431870937347, 0.012332918122410774, 0.015352596528828144, 
0.0167554821819067, 0.013832821510732174, 0.013237307779490948, 0.017024654895067215, 0.012983175925910473, 0.012097544968128204, 0.01560136117041111, 0.014102261513471603, 0.018285010010004044, 0.013786952011287212, 0.010772432200610638, 0.01394947711378336, 0.010544677264988422, 0.018424034118652344, 0.018021907657384872, 0.009211936965584755, 0.014426711946725845, 0.013266295194625854, 0.017766905948519707, 0.015775548294186592, 0.01708303391933441, 0.013853863812983036, 0.012525591067969799, 0.018341541290283203, 0.011820320971310139, 0.0205695703625679, 0.01782200299203396, 0.01899688132107258, 0.017581019550561905, 0.010265433229506016, 0.010465427301824093, 0.014634435996413231, 0.00984617043286562, 0.008189251646399498, 0.01200923603028059, 0.0207128394395113, 0.019611632451415062, 0.01824965327978134, 0.01852397620677948, 0.016900964081287384, 0.010768377222120762, 0.019043229520320892, 0.015030838549137115, 0.013169635087251663, 0.013553139753639698, 0.017849035561084747, 0.015256358310580254, 0.019099527969956398, 0.011816896498203278, 0.01237241830676794, 0.015891196206212044, 0.01877140998840332, 0.012287910096347332, 0.01550232619047165, 0.011428726837038994, 0.012310859747231007, 0.01966482773423195, 0.013357913121581078, 0.017263049259781837, 0.013650835491716862, 0.008801552467048168, 0.014836424961686134, 0.013532405719161034, 0.01859424076974392, 0.017885936424136162, 0.018176468089222908, 0.013489016331732273, 0.01467969175428152, 0.016618497669696808, 0.014185432344675064, 0.017818456515669823, 0.009879892691969872, 0.013116213493049145, 0.013495173305273056, 0.012831974774599075, 0.01381061039865017, 0.01348500233143568, 0.01241365261375904, 0.02341916784644127, 0.011800087988376617, 0.01634468138217926, 0.015589670278131962, 0.017334531992673874, 0.009898250922560692, 0.015152733772993088, 0.012557579204440117, 0.012003399431705475, 0.013643868267536163, 0.010354544036090374, 0.017722439020872116, 0.021626291796565056, 0.008682645857334137, 
0.01713278703391552, 0.016171297058463097, 0.008838783018290997, 0.016474051401019096, 0.01868913136422634, 0.014924957416951656, 0.014347616583108902, 0.011755186133086681, 0.018426943570375443, 0.01104152575135231, 0.019549930468201637, 0.011546071618795395, 0.015665920451283455, 0.02282123826444149, 0.01269321795552969, 0.011152430437505245, 0.00930074043571949, 0.010433575138449669, 0.011550144292414188, 0.013477005995810032, 0.014451905153691769, 0.010782433673739433, 0.013498359359800816, 0.015078330412507057, 0.016399716958403587, 0.01136429887264967, 0.010671067051589489, 0.018186135217547417, 0.01386172417551279, 0.015849830582737923, 0.01540683675557375, 0.01613001897931099, 0.01217152364552021, 0.00974529329687357, 0.01546423975378275, 0.009228259325027466, 0.012445847503840923, 0.018619777634739876, 0.01749054715037346, 0.01590076833963394, 0.01728537119925022, 0.01219201646745205, 0.014165084809064865, 0.015441886149346828, 0.015286616049706936, 0.013067981228232384, 0.01795738935470581, 0.011690407060086727, 0.0114912623539567, 0.01852347142994404, 0.014564932323992252, 0.01221140194684267, 0.009303784929215908, 0.013683191500604153, 0.014592486433684826, 0.011168139986693859, 0.016509901732206345, 0.015250502154231071, 0.012980044819414616, 0.017193503677845, 0.014241276308894157, 0.01352495327591896, 0.014312700368463993, 0.01205169502645731, 0.019303178414702415, 0.015187379904091358, 0.013486715964972973, 0.014180604368448257, 0.017823662608861923, 0.007880790159106255, 0.0142303965985775, 0.015002582222223282, 0.015248741023242474, 0.012377183884382248, 0.018062474206089973, 0.01577080599963665, 0.015157760120928288, 0.01293419674038887, 0.013524855487048626, 0.00948166474699974, 0.013994897715747356, 0.01415166910737753, 0.01376145239919424, 0.010151868686079979, 0.012259445153176785, 0.010235017165541649, 0.0148311173543334, 0.01451211515814066, 0.01301118265837431, 0.0144817428663373, 0.014342723414301872, 0.017440808936953545, 
0.01444151159375906, 0.015937108546495438, 0.009662298485636711, 0.013905493542551994, 0.020401744171977043, 0.017314929515123367, 0.014985499903559685, 0.0170380137860775, 0.01272120513021946, 0.019480247050523758, 0.016183510422706604, 0.018123166635632515, 0.013028769753873348, 0.01676720753312111, 0.010111472569406033, 0.017393961548805237, 0.01071438379585743, 0.01852787472307682, 0.01947920210659504, 0.010889174416661263, 0.013105122372508049, 0.01446854043751955, 0.01717580482363701, 0.011589503847062588, 0.014562658034265041, 0.015842849388718605, 0.017068952322006226, 0.018368331715464592, 0.01331920176744461, 0.01984517090022564, 0.015060808509588242, 0.021684622392058372, 0.011352002620697021, 0.014470428228378296, 0.017445730045437813, 0.015076295472681522, 0.014610438607633114, 0.01602226123213768, 0.015543230809271336, 0.01677011139690876, 0.016283180564641953, 0.015959979966282845, 0.01595875434577465, 0.012761864811182022, 0.012963272631168365, 0.013297231867909431, 0.009222302585840225, 0.013840700499713421, 0.013001928105950356, 0.014025570824742317, 0.0161904264241457, 0.0154469208791852, 0.013219536282122135, 0.011061489582061768, 0.015389002859592438, 0.01836708001792431, 0.0155992666259408, 0.015926094725728035, 0.013008791022002697, 0.01637180894613266, 0.01538177765905857, 0.013004442676901817, 0.011528365314006805, 0.013921217992901802, 0.018385309725999832, 0.01701006107032299, 0.017620017752051353, 0.01246157567948103, 0.013555098325014114, 0.01473406795412302, 0.012706800363957882, 0.01504921168088913, 0.013201153837144375, 0.01328678522258997, 0.017405414953827858, 0.011417783796787262, 0.011685892008244991, 0.019112056121230125, 0.01587916910648346, 0.01727406308054924, 0.0112075824290514, 0.012984800152480602, 0.014845979399979115, 0.018044691532850266, 0.008773308247327805, 0.013302433304488659, 0.014669881202280521, 0.010797330178320408, 0.016289670020341873, 0.010762711055576801, 0.015740416944026947, 0.008375279605388641, 
0.014242181554436684, 0.009737513028085232, 0.014595738612115383, 0.007744857110083103, 0.015657784417271614, 0.01875326782464981, 0.01675497554242611, 0.016140559688210487, 0.013950037769973278, 0.009707340970635414, 0.014766794629395008, 0.015048055909574032, 0.020532306283712387, 0.011998273432254791, 0.012096351012587547, 0.009772085584700108, 0.016024203971028328, 0.015159565024077892, 0.01893521286547184, 0.012398621998727322, 0.015035883523523808, 0.013200602494180202, 0.01297819335013628, 0.014850227162241936, 0.012569400481879711, 0.011463789269328117, 0.01914750039577484, 0.016546089202165604, 0.013123472221195698, 0.01785363256931305, 0.01556733250617981, 0.015059936791658401, 0.01624828204512596, 0.01555894035845995, 0.01877269335091114, 0.012586002238094807, 0.015588192269206047, 0.019493980333209038, 0.010896150022745132, 0.012365048751235008, 0.018048135563731194, 0.01322846207767725, 0.010157682001590729, 0.015510390512645245, 0.014838223345577717, 0.014606245793402195, 0.013889896683394909, 0.01562536135315895, 0.016269979998469353, 0.011120443232357502, 0.012311367318034172, 0.016331760212779045, 0.013054169714450836, 0.012259026989340782, 0.010218982584774494, 0.012240665033459663, 0.01499510370194912, 0.008712150156497955, 0.014474675990641117, 0.020589366555213928, 0.01163767371326685, 0.016640610992908478, 0.012417085468769073, 0.01637069508433342, 0.010092266835272312, 0.007979915477335453, 0.013861279003322124, 0.020293615758419037, 0.014164695516228676, 0.01727980375289917, 0.022441163659095764, 0.009661526419222355, 0.016905898228287697, 0.01778845675289631, 0.014682134613394737, 0.01623203232884407, 0.016101740300655365, 0.01499561034142971, 0.010601217858493328, 0.014314215630292892, 0.015224532224237919, 0.011455584317445755, 0.012796347960829735, 0.013873777352273464, 0.008255258202552795, 0.01189986988902092, 0.011325284838676453, 0.012866211123764515, 0.014096754603087902, 0.01247140672057867, 0.015570642426609993, 
0.01712246611714363, 0.016929982230067253, 0.014927409589290619, 0.02124587818980217, 0.015775633975863457, 0.013950047083199024, 0.011876738630235195, 0.01574871502816677, 0.00985360611230135, 0.022024810314178467, 0.013082724995911121, 0.019405748695135117, 0.01664225198328495, 0.014247894287109375, 0.02082131989300251, 0.012476740404963493, 0.013244024477899075, 0.012926043942570686, 0.01253123115748167, 0.010814059525728226, 0.015011614188551903, 0.010579361580312252, 0.013690914027392864, 0.019288914278149605, 0.020176099613308907, 0.012059354223310947, 0.01425460260361433, 0.011957324109971523, 0.02035270445048809, 0.012012625113129616, 0.01069235522300005, 0.01427639089524746, 0.018094366416335106, 0.014400746673345566, 0.01803451217710972, 0.01307073887437582, 0.01562817394733429, 0.009896297939121723, 0.01562829688191414, 0.014600986614823341, 0.01303951721638441, 0.01127172727137804, 0.01354335155338049, 0.012155872769653797, 0.009703471325337887, 0.016426043584942818, 0.009624569676816463, 0.015315492637455463, 0.018331613391637802, 0.015069100074470043, 0.012325316667556763, 0.011092607863247395, 0.0249860230833292, 0.015099314972758293, 0.012199461460113525, 0.015286272391676903, 0.010672382079064846, 0.012871230952441692, 0.015252256765961647, 0.012810402549803257, 0.01720958575606346, 0.014615659601986408, 0.013605503365397453, 0.015750763937830925, 0.012200957164168358, 0.015435395762324333, 0.01310784462839365, 0.010705225169658661, 0.013346503488719463, 0.0158690195530653, 0.017373913899064064, 0.01032863650470972, 0.01754222996532917, 0.012087023817002773, 0.014727254398167133, 0.015761785209178925, 0.01621333323419094, 0.018171971663832664, 0.01704476587474346, 0.01423647627234459, 0.019179701805114746, 0.015728333964943886, 0.013848857022821903, 0.01672997511923313, 0.013004470616579056, 0.014746264554560184, 0.012624476104974747, 0.015064740553498268, 0.01916174404323101, 0.01315955352038145, 0.010424467734992504, 0.02070016786456108, 
0.013983497396111488, 0.011878845281898975, 0.017836587503552437, 0.01865456812083721, 0.008779608644545078, 0.01493915542960167, 0.013999911956489086, 0.01927504874765873, 0.016437072306871414, 0.017044104635715485, 0.013921873643994331, 0.015478767454624176, 0.016095401719212532, 0.014682628214359283, 0.011523298919200897, 0.012401215732097626, 0.01654362678527832, 0.014758080244064331, 0.011245347559452057, 0.010827702470123768, 0.01433934923261404, 0.020738644525408745, 0.012697630561888218, 0.012890391983091831, 0.01499839685857296, 0.013889237307012081, 0.016720646992325783, 0.011753135360777378, 0.012277161702513695, 0.014482881873846054, 0.011510663665831089, 0.013431458733975887, 0.015981094911694527, 0.01333930529654026, 0.01777767762541771, 0.01792670413851738, 0.008067646995186806, 0.012709519825875759, 0.01515245158225298, 0.013160709291696548, 0.013352620415389538, 0.013449607416987419, 0.01919690892100334, 0.01764628477394581, 0.015411906875669956, 0.01842479594051838, 0.017747223377227783, 0.017139282077550888, 0.015141540206968784, 0.010441670194268227, 0.015034055337309837, 0.012910894118249416, 0.008726266212761402, 0.01265596505254507, 0.01235686894506216, 0.015260112471878529, 0.011199109256267548, 0.011697500944137573, 0.013121848925948143, 0.013714001514017582, 0.01488557644188404, 0.010089879855513573, 0.015145250596106052, 0.012644529342651367, 0.017067959532141685, 0.016669129952788353, 0.012216424569487572, 0.013608650304377079, 0.01566939987242222, 0.021635346114635468, 0.019400086253881454, 0.012982329353690147, 0.014019284397363663, 0.010342517867684364, 0.013440292328596115, 0.017487898468971252, 0.011608651839196682, 0.013461456634104252, 0.01215245109051466, 0.010870520025491714, 0.01481886487454176, 0.011247104033827782, 0.012012018822133541, 0.020108671858906746, 0.009791767224669456, 0.013818460516631603, 0.012373198755085468, 0.015562979504466057, 0.009700983762741089, 0.01830868236720562, 0.014884519390761852, 
0.017453141510486603, 0.012960207648575306, 0.013329400680959225, 0.020377518609166145, 0.01584053225815296, 0.022579431533813477, 0.012453061528503895, 0.01224344503134489, 0.013955259695649147, 0.013638895004987717, 0.010344361886382103, 0.01172588299959898, 0.014188824221491814, 0.011871449649333954, 0.01977592334151268, 0.009035217575728893, 0.018265288323163986, 0.016679933294653893, 0.017023902386426926, 0.011005993001163006, 0.014177837409079075, 0.00821842160075903, 0.013294666074216366, 0.010644595138728619, 0.012984835542738438, 0.01668861322104931, 0.014755960553884506, 0.016254611313343048, 0.01814805157482624, 0.017976421862840652, 0.01782398298382759, 0.013756806030869484, 0.012347078882157803, 0.01985752210021019, 0.016416190192103386, 0.012861793860793114, 0.013136442750692368, 0.01782294362783432, 0.014176762662827969, 0.02169553004205227, 0.018910352140665054, 0.022274469956755638, 0.012336819432675838, 0.015652846544981003, 0.013342234306037426, 0.01287817396223545, 0.013322776183485985, 0.012300722301006317, 0.018305854871869087, 0.01772584393620491, 0.01648421585559845, 0.010563869960606098, 0.014292917214334011, 0.014135020785033703, 0.014334271661937237, 0.012492325156927109, 0.011820167303085327, 0.013000532053411007, 0.014183474704623222, 0.014309058897197247, 0.011591054499149323, 0.01653086394071579, 0.013208683580160141, 0.015129126608371735, 0.012074793688952923, 0.015013131313025951, 0.014697468839585781, 0.015994155779480934, 0.016092775389552116, 0.015920303761959076, 0.009286300279200077, 0.019796520471572876, 0.014934549108147621, 0.011215019039809704, 0.012207811698317528, 0.01521605346351862, 0.015231722965836525, 0.013893797062337399, 0.013705774210393429, 0.011231712996959686, 0.014398312196135521, 0.018288392573595047, 0.012559470720589161, 0.019288979470729828, 0.016910990700125694, 0.0188098456710577, 0.010434349067509174, 0.01039385050535202, 0.011827045120298862, 0.010298928245902061, 0.011737331748008728, 
0.0172980148345232, 0.01669054850935936, 0.01325226854532957, 0.01639167219400406, 0.01835121400654316, 0.013120968826115131, 0.016998067498207092, 0.0166577510535717, 0.014302127063274384, 0.010408739559352398, 0.015543085522949696, 0.017319561913609505, 0.012878327630460262, 0.014857746660709381, 0.013139505870640278, 0.0168275348842144, 0.016646094620227814, 0.017396247014403343, 0.014108224771916866, 0.013417663052678108, 0.01063772477209568, 0.01797383278608322, 0.013910521753132343, 0.012188446708023548, 0.015480458736419678, 0.014167087152600288, 0.01717229187488556, 0.013753081671893597, 0.01638750545680523, 0.015632394701242447, 0.011088617146015167, 0.01437865849584341, 0.018101511523127556, 0.015528233721852303, 0.017554456368088722, 0.01689785346388817, 0.01743229292333126, 0.013922883197665215, 0.02019209787249565, 0.01679144613444805, 0.012340784072875977, 0.018699143081903458, 0.012576859444379807, 0.012424788437783718, 0.016325782984495163, 0.019729821011424065, 0.0161475520581007, 0.016162084415555, 0.018768850713968277, 0.015459022484719753, 0.011390656232833862, 0.012461150996387005, 0.014575926586985588, 0.009682287462055683, 0.01698310300707817, 0.012632041238248348, 0.021149717271327972, 0.019964180886745453, 0.016660109162330627, 0.01210025604814291, 0.014287162572145462, 0.0203152596950531, 0.013169988058507442, 0.012745195999741554, 0.014474919997155666, 0.01163947582244873, 0.013910344801843166, 0.01585008203983307, 0.01226817723363638, 0.009646845050156116, 0.01794772781431675, 0.015741243958473206, 0.012830757535994053, 0.011555778793990612, 0.011784943751990795, 0.016921674832701683, 0.013738307170569897, 0.014343390241265297, 0.024495357647538185, 0.013112387619912624, 0.016558578237891197, 0.01302757766097784, 0.012513913214206696, 0.0069653731770813465, 0.011409434489905834, 0.011552640236914158, 0.016775038093328476, 0.014066943898797035, 0.017050785943865776, 0.011610496789216995, 0.013864459469914436, 0.016255836933851242, 
0.01660570688545704, 0.013101796619594097, 0.014169583097100258, 0.016673296689987183, 0.018819589167833328, 0.010896094143390656, 0.015093866735696793, 0.008889035321772099, 0.012060987763106823, 0.013384399935603142, 0.010977615602314472, 0.015598194673657417, 0.012413672171533108, 0.013533071614801884, 0.014770056121051311, 0.010948504321277142, 0.016012858599424362, 0.015198669396340847, 0.013875390402972698, 0.010970446281135082, 0.013627617619931698, 0.015181371942162514, 0.015710419043898582, 0.014879398047924042, 0.013725529424846172, 0.015766313299536705, 0.013644215650856495, 0.01482765655964613, 0.009816099889576435, 0.01274476945400238, 0.015070877969264984, 0.012759066186845303, 0.016125066205859184, 0.015448709018528461, 0.011427978985011578, 0.012982896529138088, 0.012414917349815369, 0.01636517234146595, 0.010730203241109848, 0.011487369425594807, 0.013683789409697056, 0.013662184588611126, 0.017001986503601074, 0.01382617000490427, 0.009383502416312695, 0.016134606674313545, 0.017617885023355484, 0.015364051796495914, 0.014077490195631981, 0.014432237483561039, 0.01435584295541048, 0.015633055940270424, 0.015428056009113789, 0.015231630764901638, 0.013195951469242573, 0.013370838016271591, 0.013780822046101093, 0.01329419668763876, 0.0189259871840477, 0.015402832068502903, 0.01572522521018982, 0.01575031876564026, 0.018626395612955093, 0.013295643962919712, 0.014998710714280605, 0.008765538223087788, 0.013446087017655373, 0.015360396355390549, 0.016361825168132782, 0.015448288060724735, 0.013242118060588837, 0.018956122919917107, 0.013785823248326778, 0.014343373477458954, 0.016193579882383347, 0.016720961779356003, 0.019078420475125313, 0.016126666218042374, 0.015943700447678566, 0.016192467883229256, 0.019414741545915604, 0.018353885039687157, 0.011089794337749481, 0.01368016842752695, 0.019154470413923264, 0.015048072673380375, 0.00982813909649849, 0.013871165923774242, 0.01382596604526043, 0.01271446980535984, 0.01459297351539135, 
0.01996687985956669, 0.017132440581917763, 0.01672506146132946, 0.01334786880761385, 0.013684909790754318, 0.024313203990459442, 0.013124577701091766, 0.017736446112394333, 0.015900198370218277, 0.009727501310408115, 0.015330366790294647, 0.013912336900830269, 0.014411951415240765, 0.010139884427189827, 0.016144011169672012, 0.01038780715316534, 0.01749086193740368, 0.015228607691824436, 0.014620549976825714, 0.013376684859395027, 0.014660811983048916, 0.017577560618519783, 0.013142507523298264, 0.012512920424342155].
- 
- ## How to use
- 
- You can use this model directly with a pipeline for text classification:
- 
- ```python
- from transformers import pipeline
- 
- classifier = pipeline("text-classification", model="YusuphaJuwara/nli-fever")
- 
- # NLI-style inputs are passed as a premise/hypothesis pair.
- result = classifier({"text": "premise", "text_pair": "hypothesis"})
- print(result)
- ```
- 
- ## Saved Metrics
- 
- This model repository includes a `metrics.json` file containing detailed training metrics.
- You can load these metrics using the following code:
- 
- ```python
- from huggingface_hub import hf_hub_download
- import json
- 
- metrics_file = hf_hub_download(repo_id="YusuphaJuwara/nli-fever", filename="metrics.json")
- with open(metrics_file, 'r') as f:
-     metrics = json.load(f)
- 
- # Now you can access metrics like:
- print("Last epoch: ", metrics['last_epoch'])
- print("Final validation loss: ", metrics['val_losses'][-1])
- print("Final validation accuracy: ", metrics['val_accuracies'][-1])
- ```
- 
- These metrics can be useful for continuing training from the last epoch or for detailed analysis of the training process.
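Resuming from the recorded epoch could look like the sketch below. The `last_epoch` and `val_losses` keys follow the snippet above; the numeric values here are illustrative, not this model's real metrics.

```python
import json

# Illustrative metrics payload mirroring the metrics.json keys used above;
# the actual file holds this model's real values.
raw = '{"last_epoch": 30, "val_losses": [0.42, 0.31, 0.27], "val_accuracies": [0.81, 0.86, 0.89]}'
metrics = json.loads(raw)

# Resume training from the epoch after the last one recorded.
start_epoch = metrics["last_epoch"] + 1
best_val_loss = min(metrics["val_losses"])
print(start_epoch, best_val_loss)  # 31 0.27
```

In a real training script, `metrics` would come from the `hf_hub_download` call shown above instead of an inline string.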
- 
- ## Training results
- ![Training metrics plot](training_plot.png)
- 
- ## Limitations and bias
- This model may exhibit biases present in the training data. Always validate results and use the model responsibly.
- 
- ## Plots
- ![Label distribution](label_distribution.png)
- ![Loss](loss_plot.png)
- ![Accuracy](accuracy_plot.png)
- ![F1 score](f1_score_plot.png)
- ![Confusion matrix](confusion_matrix.png)
- ![Precision-recall curve](precision_recall_curve.png)
- ![ROC curve](roc_curve.png)
 
  ---
  tags:
+ - model_hub_mixin
+ - pytorch_model_hub_mixin
  ---
 
+ This model has been pushed to the Hub using the [PyTorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
+ - Library: [More Information Needed]
+ - Docs: [More Information Needed]
 
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0d003564ed4f431623ec740b772846f6a79d2c66d77e0e2c5ce537a229f79ca7
+ oid sha256:0f6d3b8e71dd3dcf1a4a5607923422b3f0b1950c5759d0a897b04957118da77d
  size 40785636