{
  "metrics": {
    "analysis": {
      "Token Count": 110000,
      "Line Count": 10000,
      "Tokens per Line": 11.0,
      "Tokens per Line SD": 0.0,
      "Unique Tokens": 10,
      "Unique Lines": 2499,
      "1-gram Entropy": 2.750844933129976,
      "1-gram Normalized Entropy": 0.8280868382924013,
      "2-gram Entropy": 4.070906394838263,
      "2-gram Conditional Entropy": 1.3200614617082875,
      "Entropy per Line": 30.259294264429737,
      "EoS Token Present": true,
      "EoS Padding": true
    }
  }
}
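
The values above are internally consistent with standard information-theoretic definitions: the normalized 1-gram entropy equals the 1-gram entropy divided by log2 of the unique-token count, the 2-gram conditional entropy equals the 2-gram entropy minus the 1-gram entropy, and the entropy per line equals the 1-gram entropy times the mean tokens per line. The sketch below shows how such a metrics block could be computed from a tokenized corpus under those assumed definitions; it is an illustration, not the script that produced this file, and the `analyze` helper and its lines-of-token-IDs input format are hypothetical.

```python
# Hypothetical sketch of computing analysis metrics like the ones in this file.
# The entropy definitions are inferred from the relationships between the JSON
# values; the EoS flags are omitted since they depend on the tokenizer setup.
import math
from collections import Counter
from statistics import mean, pstdev


def analyze(lines: list[list[int]]) -> dict:
    # Flatten the corpus into a single token stream and record line lengths.
    tokens = [t for line in lines for t in line]
    tokens_per_line = [len(line) for line in lines]

    # 1-gram and 2-gram frequency distributions over the token stream.
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))

    def entropy(counter: Counter) -> float:
        # Shannon entropy (bits) of the empirical distribution in `counter`.
        total = sum(counter.values())
        return -sum((c / total) * math.log2(c / total) for c in counter.values())

    h1 = entropy(unigrams)
    h2 = entropy(bigrams)

    return {
        "Token Count": len(tokens),
        "Line Count": len(lines),
        "Tokens per Line": mean(tokens_per_line),
        "Tokens per Line SD": pstdev(tokens_per_line),
        "Unique Tokens": len(unigrams),
        "Unique Lines": len({tuple(line) for line in lines}),
        "1-gram Entropy": h1,
        "1-gram Normalized Entropy": h1 / math.log2(len(unigrams)),
        "2-gram Entropy": h2,
        "2-gram Conditional Entropy": h2 - h1,
        "Entropy per Line": h1 * mean(tokens_per_line),
    }
```

Note that a "Tokens per Line SD" of 0.0 together with 110000 tokens over 10000 lines implies every line holds exactly 11 tokens, which is what the sketch's `pstdev` term would report for a uniformly padded corpus.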