{
  "metrics": {
    "analysis": {
      "Token Count": 110000,
      "Line Count": 10000,
      "Tokens per Line": 11.0,
      "Tokens per Line SD": 0.0,
      "Unique Tokens": 8,
      "Unique Lines": 1373,
      "1-gram Entropy": 2.5732431163576788,
      "1-gram Normalized Entropy": 0.8577477054525596,
      "2-gram Entropy": 3.7502942276789883,
      "2-gram Conditional Entropy": 1.1770511113213096,
      "Entropy per Line": 28.305674279934504,
      "EoS Token Present": true,
      "EoS Padding": true
    }
  }
}