Phips committed · verified
Commit 73f57b9 · 1 Parent(s): 245831d

Update README.md

Files changed (1):
  1. README.md +1 -4
README.md CHANGED
@@ -107,7 +107,4 @@ See [file list](https://huggingface.co/datasets/Phips/BHI/resolve/main/files.txt
  ## Upload

  I uploaded the dataset as multi-part zip archive files with a max of 25GB per file, resulting in 6 archive files.
-
- This should work with lfs file size limit, and i chose zip because its such a common format. I could have of course used another format like 7z or zpaq or something.
-
- I actually once in the past worked on an archiver called [ShareArchiver](https://github.com/Phhofm/ShareArchiver) where my main idea was, that online shared data (like this dataset) generally gets archived once (by the uploader) but downloaded and extracted maybe a thousand times. So resulting file size (faster download time for those thousand downloads) and extraction speed (those thousand extraction) would be waay more important than compression speed. So the main idea is we are trading archiving time (very long time to archive) of that one person for faster downloads and extraction for all. The design of this archiver was that I chose only highly assymetrical compression algos, where compression times can very slow as long as decompression speed is high, and then it would brute force during compression, meaning of those available highly assymetric compression algos, it would compress each single file with all of them, check the resulting file sizes, and add only the smallest one to the .share archive. Just something from the past I wanted to mention. (one could also use the max flag to just use all of them, meaning also the symmetrical ones, just to brute force the smallest archive file possible (using paq8o etc), but of corse compression time would also be very long, but this flag was more for archiving purposes than online sharing purposes, in a case where store space would be waay more important than either compression or decompression speed.)
+ This should work with lfs file size limit, and i chose zip because its such a common format.
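For anyone putting the pieces back together locally, here is a minimal Python sketch of merging the archive parts and extracting them. The part names are hypothetical (the real names are in files.txt), and it assumes each part is a plain sequential byte slice of one big zip (7-Zip volume style); if the upload is instead a spanned Info-ZIP archive, merge it with `zip -s 0` or extract with 7-Zip directly.

```python
from pathlib import Path
import shutil
import zipfile

# Hypothetical part names -- check files.txt in the repo for the real ones.
PARTS = sorted(Path(".").glob("BHI.zip.*"))   # e.g. BHI.zip.001 ... BHI.zip.006
MERGED = Path("BHI_full.zip")

# Assumption: each part is a raw byte slice of a single zip archive,
# so concatenating the parts in order restores one valid file.
with MERGED.open("wb") as merged:
    for part in PARTS:
        with part.open("rb") as chunk:
            shutil.copyfileobj(chunk, merged)

# Extract the merged archive.
with zipfile.ZipFile(MERGED) as archive:
    archive.extractall("BHI")
```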
 
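The removed ShareArchiver note describes a simple brute-force selection: compress every file with each available (highly asymmetric) codec, compare the resulting sizes, and store only the smallest output. Below is a minimal Python sketch of that idea, using standard-library codecs as stand-ins for the algorithms ShareArchiver actually ships with.

```python
import bz2
import lzma
import zlib

# Candidate codecs standing in for ShareArchiver's asymmetric compressors
# (lzma is the closest stdlib analogue: slow to compress, fast to decompress).
CODECS = {
    "zlib": lambda data: zlib.compress(data, 9),
    "bz2": lambda data: bz2.compress(data, compresslevel=9),
    "lzma": lambda data: lzma.compress(data, preset=9 | lzma.PRESET_EXTREME),
}

def smallest_encoding(data: bytes) -> tuple[str, bytes]:
    """Compress `data` with every candidate codec and keep only the smallest result."""
    results = {name: compress(data) for name, compress in CODECS.items()}
    best = min(results, key=lambda name: len(results[name]))
    return best, results[best]

if __name__ == "__main__":
    payload = b"some file contents " * 4096
    codec, blob = smallest_encoding(payload)
    print(f"picked {codec}: {len(payload)} -> {len(blob)} bytes")
```

Decompression only needs the stored per-file codec tag, which is why the scheme trades a long one-time archiving step for faster downloads and extraction for everyone else.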