hysts-bot committed on
Commit 58e8603 · verified · 1 Parent(s): 0173092

Upload folder using huggingface_hub

0.codes.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:894f896cd320c3c481a80e5d942453e3455ef91ab1205a73a57ba5cd3e5b3297
- size 2731164
+ oid sha256:ba888d34c7bae99e7efafe82919fa57c79b0c4cc090610dd3ff8fb6ad6abe771
+ size 2732444
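
The .pt tensor files in this repo are tracked with Git LFS, so only the small pointer files shown in these hunks change in the diff. A minimal sketch of parsing such a pointer, assuming only the three-field format visible above:

def parse_lfs_pointer(text: str) -> dict:
    # Each pointer is a tiny text file: "version <url>", "oid sha256:<hex>", "size <bytes>".
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "version": fields["version"],
        "oid": fields["oid"],          # e.g. "sha256:ba888d34..." after this commit
        "size": int(fields["size"]),   # payload size in bytes (2732444 after this commit)
    }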
0.metadata.json CHANGED
@@ -1,6 +1,6 @@
  {
  "passage_offset": 0,
- "num_passages": 3973,
- "num_embeddings": 682500,
+ "num_passages": 3976,
+ "num_embeddings": 682831,
  "embedding_offset": 0
  }
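
A quick arithmetic check on the values in this commit: num_embeddings divided by num_passages from 0.metadata.json should reproduce the avg_doclen reported in metadata.json further down.

num_passages = 3976
num_embeddings = 682831
print(num_embeddings / num_passages)  # ~171.73818, matching "avg_doclen": 171.7381790744 below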
0.residuals.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:eee937b5299ee1b5c58bd54b99b8590dc3a2bd8a6343254a080711380781878d
- size 87361200
+ oid sha256:3ec9ffdfc1932b8b557497a45525ca8e46a5b51e96e19168ad9eef254f5a53db
+ size 87403568
avg_residual.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ab1b294fb90b30fdc6dd74e7a3e4453361706d94b10d1024bcd574b3ccbf0614
+ oid sha256:5427315cdee8433d755a8fe320678cd01b1ea6fda093cb5e949a7cc66e71e7ce
  size 1205
buckets.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:69e1faa4b9ff3ed048e66b3aa724c85b48e098069213a1a6bac47acef37b3f56
+ oid sha256:77519d44b948d91d5f6694e01c45b21cae12ad419812e49d8360693b2e0d4917
  size 2904
centroids.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7773324b5d9c0f336bc4fba44e43766ff68cbd6cf6133c7e32f9bb9ae865d3a2
+ oid sha256:f82d321edc56431c140c0ed6a97a695033fbe5c5b75495362beb4ddd7ca674c8
  size 2098342
collection.json CHANGED
@@ -3971,5 +3971,8 @@
  "Challenges in the automated evaluation of Retrieval-Augmented Generation (RAG) Question-Answering (QA) systems include hallucination problems in domain-specific knowledge and the lack of gold standard benchmarks for company internal tasks. This results in difficulties in evaluating RAG variations, like RAG-Fusion (RAGF), in the context of a product QA task at Infineon Technologies. To solve these problems, we propose a comprehensive evaluation framework, which leverages Large Language Models (LLMs) to generate large datasets of synthetic queries based on real user queries and in-domain documents, uses LLM-as-a-judge to rate retrieved documents and answers, evaluates the quality of answers, and ranks different variants of Retrieval-Augmented Generation (RAG) agents with RAGElo's automated Elo-based competition. LLM-as-a-judge rating of a random sample of synthetic queries shows a moderate, positive correlation with domain expert scoring in relevance, accuracy, completeness, and precision. While RAGF outperformed RAG in Elo score, a significance analysis against expert annotations also shows that RAGF significantly outperforms RAG in completeness, but underperforms in precision.",
  "While RAGF outperformed RAG in Elo score, a significance analysis against expert annotations also shows that RAGF significantly outperforms RAG in completeness, but underperforms in precision. In addition, Infineon's RAGF assistant demonstrated slightly higher performance in document relevance based on MRR@5 scores. We find that RAGElo positively aligns with the preferences of human annotators, though due caution is still required. Finally, RAGF's approach leads to more complete answers based on expert annotations and better answers overall based on RAGElo's evaluation criteria.",
  "The blooming of virtual reality and augmented reality (VR/AR) technologies has driven an increasing demand for the creation of high-quality, immersive, and dynamic environments. However, existing generative techniques either focus solely on dynamic objects or perform outpainting from a single perspective image, failing to meet the needs of VR/AR applications. In this work, we tackle the challenging task of elevating a single panorama to an immersive 4D experience. For the first time, we demonstrate the capability to generate omnidirectional dynamic scenes with 360-degree views at 4K resolution, thereby providing an immersive user experience. Our method introduces a pipeline that facilitates natural scene animations and optimizes a set of 4D Gaussians using efficient splatting techniques for real-time exploration. To overcome the lack of scene-scale annotated 4D data and models, especially in panoramic formats, we propose a novel Panoramic Denoiser that adapts generic 2D diffusion priors to animate consistently in 360-degree images, transforming them into panoramic videos with dynamic scenes at targeted regions. Subsequently, we elevate the panoramic video into a 4D immersive environment while preserving spatial and temporal consistency.",
- "Subsequently, we elevate the panoramic video into a 4D immersive environment while preserving spatial and temporal consistency. By transferring prior knowledge from 2D models in the perspective domain to the panoramic domain and the 4D lifting with spatial appearance and geometry regularization, we achieve high-quality Panorama-to-4D generation at a resolution of (4096 times 2048) for the first time. See the project website at https://4k4dgen.github.io."
+ "Subsequently, we elevate the panoramic video into a 4D immersive environment while preserving spatial and temporal consistency. By transferring prior knowledge from 2D models in the perspective domain to the panoramic domain and the 4D lifting with spatial appearance and geometry regularization, we achieve high-quality Panorama-to-4D generation at a resolution of (4096 times 2048) for the first time. See the project website at https://4k4dgen.github.io.",
+ "With the proliferation of domain-specific models, model merging has emerged as a set of techniques that combine the capabilities of multiple models into one that can multitask without the cost of additional training. In this paper, we propose a new model merging technique, Drop and rEscaLe via sampLing with mAgnitude (DELLA-Merging), that employs a novel pruning technique, MAGPRUNE, which shows significant advantages over DARE and TIES. MAGPRUNE first ranks the parameters in order of their magnitude and assigns higher dropout probabilities (p) to parameters with lower ranks corresponding to lower magnitudes. To approximate the original embeddings, MAGPRUNE employs a rescaling operation on the parameters that survive the random dropping by 1/(1 - p). On three different expert models considered for merging (LM, Math, Code) and corresponding benchmark datasets (AlpacaEval, GSM8K, MBPP), DELLA shows an average improvement of 2.4 points over baseline methods employing delta parameter pruning (an improvement of 3.6 points over TIES, 1.2 points over DARE), and 11.1 points over the no-pruning baseline (TA).",
+ "We release the source code at: https://github.com/declare-lab/della.",
+ "We propose Ruby Teaming, a method that improves on Rainbow Teaming by including a memory cache as its third dimension. The memory dimension provides cues to the mutator to yield better-quality prompts, both in terms of attack success rate (ASR) and quality diversity. The prompt archive generated by Ruby Teaming has an ASR of 74%, which is 20% higher than the baseline. In terms of quality diversity, Ruby Teaming outperforms Rainbow Teaming by 6% and 3% on Shannon's Evenness Index (SEI) and Simpson's Diversity Index (SDI), respectively."
  ]
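
The newly appended DELLA-Merging abstract describes MAGPRUNE as magnitude-ranked random dropping followed by 1/(1 - p) rescaling. A hypothetical sketch of that step based only on the abstract; the linear rank-to-probability mapping and the p_min/p_max bounds are assumptions, not something the abstract specifies:

import torch

def magprune_sketch(delta: torch.Tensor, p_min: float = 0.1, p_max: float = 0.9) -> torch.Tensor:
    flat = delta.flatten()
    ranks = flat.abs().argsort().argsort().float()             # rank 0 = smallest magnitude
    p = p_max - (p_max - p_min) * ranks / (flat.numel() - 1)   # lower magnitude -> higher drop prob (assumed linear)
    keep = torch.rand_like(flat) >= p                          # drop each parameter with probability p
    # Rescale survivors by 1/(1 - p) so each parameter is preserved in expectation.
    pruned = torch.where(keep, flat / (1.0 - p), torch.zeros_like(flat))
    return pruned.reshape(delta.shape)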
doclens.0.json CHANGED
@@ -1 +1 @@
- [178,205,218,148,184,163,221,185,200,228,172,155,210,222,88,206,226,67,132,212,91,206,104,212,174,205,132,159,230,175,216,198,227,190,212,198,122,213,169,204,92,197,118,191,191,224,69,219,197,72,218,77,175,111,155,217,220,170,231,91,221,217,95,146,177,123,195,205,151,209,207,36,202,200,226,176,232,53,167,199,89,184,213,104,154,153,216,214,215,174,205,72,211,78,221,212,232,223,73,158,220,158,202,222,189,165,205,175,222,132,126,179,219,110,209,158,208,98,176,192,226,34,158,205,126,178,224,182,227,100,152,191,169,195,163,172,208,117,199,217,167,217,157,163,194,217,200,217,23,221,209,146,150,204,200,125,215,232,68,147,212,41,223,178,152,173,210,139,198,182,196,207,95,176,205,223,83,216,85,207,210,52,177,178,230,197,119,226,99,182,210,212,77,138,199,123,179,111,219,69,223,65,204,215,83,197,87,211,132,216,135,178,157,166,216,85,170,195,208,190,175,134,220,200,67,221,211,66,227,222,226,190,209,205,67,207,139,208,127,186,205,168,221,179,223,117,148,221,216,80,189,125,199,202,64,218,77,195,190,221,181,98,143,214,220,97,187,127,219,122,216,87,138,212,194,112,219,227,101,220,100,164,234,109,221,102,223,89,184,205,219,77,188,223,172,171,175,152,175,137,213,197,114,205,221,138,181,174,227,73,147,144,178,147,215,152,182,204,80,210,123,211,121,209,224,224,219,211,163,133,187,148,151,163,221,94,133,213,72,187,224,216,162,154,224,184,118,204,220,154,117,220,162,202,223,195,110,197,151,224,88,182,217,221,214,118,218,118,164,205,97,221,183,154,206,197,74,170,219,103,230,215,192,224,78,184,72,201,227,221,191,181,104,190,224,221,99,123,206,102,228,202,74,195,96,225,176,232,231,96,225,206,61,173,101,190,211,90,213,199,203,184,209,173,160,207,203,99,201,195,132,195,214,148,211,227,192,215,212,127,162,213,114,178,111,207,71,129,182,212,75,176,209,137,213,224,220,70,232,116,228,179,200,184,222,93,202,71,219,123,213,119,204,173,135,118,207,96,216,107,210,202,189,172,211,76,146,195,169,83,227,62,219,223,125,158,202,226,98,214,225,76,156,211,204,110,190,224,200,64,228,223,72,213,102,214,178,219,128,211,127,187,213,160,216,134,202,150,186,201,195,230,199,206,193,205,219,216,84,213,223,192,222,109,187,209,79,153,204,224,72,167,190,183,183,228,133,227,103,197,225,203,205,201,207,213,210,78,138,195,133,195,180,190,185,207,189,100,191,229,198,129,226,135,205,164,226,190,73,200,231,209,216,147,221,228,165,213,209,181,157,222,117,216,58,148,219,92,222,219,63,191,218,186,197,158,172,204,109,204,102,223,229,91,192,217,186,204,144,173,209,95,186,222,225,178,183,222,71,193,182,232,63,227,228,82,209,162,219,183,222,122,212,195,60,219,130,223,69,216,111,198,224,80,223,74,164,148,227,193,216,139,207,198,148,227,199,169,230,225,86,138,165,206,91,213,164,215,229,173,216,210,92,166,205,102,196,78,202,74,187,202,229,191,222,57,206,115,183,94,225,173,198,109,212,224,126,215,44,221,80,196,95,177,218,217,198,109,218,191,221,214,227,71,207,123,206,90,219,218,108,181,214,123,150,204,214,208,225,199,191,192,213,93,217,149,178,199,106,204,80,208,60,131,132,232,218,213,115,202,94,175,227,217,114,200,222,178,180,185,192,193,205,173,226,217,222,197,184,205,211,136,214,221,62,213,89,169,184,135,198,98,216,225,71,230,133,213,154,216,189,122,224,108,204,88,207,227,93,191,204,214,66,153,219,209,203,218,237,229,222,148,174,218,107,214,169,223,216,173,220,148,199,188,175,203,142,188,210,98,167,174,224,103,156,97,220,111,216,70,130,221,216,115,166,164,162,210,216,69,173,203,77,216,175,172,210,98,199,213,88,221,82,184,131,201,220,133,211,116,156,219,213,227,128,221,237,172,217,64,180,221,70,169,220,216,134,215,177,190,197,229,221,226,74,189,190,219,86
,222,74,213,106,184,182,205,221,229,143,201,198,162,209,128,221,209,60,216,146,228,106,215,154,214,202,82,194,141,168,217,213,80,140,114,220,149,229,215,150,211,170,199,109,218,108,206,125,204,164,183,148,193,153,188,143,160,199,210,124,225,105,178,75,222,112,205,75,199,83,222,215,158,104,205,70,215,223,158,220,99,124,194,91,219,65,209,149,234,188,222,218,200,66,208,101,208,188,201,207,198,160,205,219,118,170,195,141,128,182,189,211,51,179,216,112,152,196,224,80,223,177,218,127,186,113,151,117,159,184,200,231,212,134,222,68,216,204,90,151,202,158,173,146,144,209,222,89,171,188,206,215,117,191,223,80,215,68,151,159,210,74,208,105,204,105,174,228,67,147,202,214,41,222,163,183,142,227,188,200,208,132,157,191,195,94,166,201,155,214,132,185,195,165,214,219,156,217,100,147,220,105,220,223,225,117,176,166,207,134,218,202,172,183,217,76,229,139,217,87,208,181,214,169,154,199,231,208,200,157,213,222,49,207,191,139,219,218,82,122,154,137,190,176,192,180,231,80,208,136,185,112,189,187,192,208,219,110,208,102,186,225,220,218,235,214,96,133,198,199,95,152,83,219,190,124,148,222,213,207,145,217,168,200,32,208,198,171,205,218,104,218,136,230,77,139,178,212,214,129,211,233,93,221,95,155,153,221,151,201,108,193,216,172,208,72,177,217,208,225,60,81,172,234,180,146,184,180,199,222,224,182,106,183,105,221,208,81,221,137,211,44,198,224,205,82,187,164,147,213,163,229,192,219,218,190,228,221,218,116,231,97,182,126,213,193,173,207,103,207,122,161,156,193,153,192,92,198,155,223,224,106,223,170,177,225,40,206,203,221,152,208,116,203,105,197,221,65,226,111,224,215,215,142,154,211,84,184,135,169,212,213,76,229,225,130,207,85,209,177,116,140,150,229,209,105,164,186,214,219,62,148,203,230,118,207,216,217,211,114,201,134,217,87,197,124,209,186,184,164,189,163,212,85,219,106,209,220,168,221,212,112,189,183,229,200,192,106,224,114,220,95,141,223,222,219,226,79,187,202,212,222,193,128,223,209,160,178,197,222,111,152,114,164,206,179,189,183,122,212,149,150,130,214,115,195,118,204,208,120,195,163,159,220,125,222,174,211,144,140,220,80,173,204,87,215,149,227,97,229,221,119,213,139,212,82,215,68,213,136,186,168,202,85,213,93,168,218,63,222,121,213,90,218,94,197,221,182,220,112,224,221,83,215,194,138,181,197,221,91,186,194,99,179,213,210,76,188,214,216,79,213,92,220,94,201,108,213,131,221,153,219,74,159,172,218,217,219,108,209,35,217,112,157,206,148,211,126,214,107,191,105,164,224,220,126,204,201,211,119,203,129,165,205,130,218,137,211,178,204,215,206,119,209,75,193,182,115,178,207,217,220,184,199,167,189,230,140,208,167,173,195,136,220,205,64,186,159,194,75,213,189,110,189,208,120,176,182,162,196,183,137,134,198,172,193,184,172,212,189,214,200,95,167,199,110,195,116,180,204,66,217,126,222,226,193,166,143,121,218,224,196,140,226,142,206,66,214,140,218,144,201,221,152,204,154,221,101,225,69,216,212,128,181,200,180,206,136,193,226,226,219,141,168,202,209,105,178,128,208,197,145,163,218,124,220,147,220,156,211,69,212,226,132,223,165,211,110,229,194,193,204,208,90,218,106,152,215,134,180,182,204,211,80,211,108,204,102,208,102,192,94,191,206,203,140,165,213,197,105,186,186,210,62,208,165,213,174,186,115,211,164,193,128,201,169,217,167,217,174,222,167,129,170,217,218,182,159,113,208,85,219,126,219,89,172,213,212,69,205,135,124,205,67,208,52,186,160,225,125,190,65,227,168,180,166,151,194,205,88,138,204,110,124,223,80,193,106,179,183,163,220,148,205,218,200,221,198,202,190,213,87,227,239,81,213,115,211,219,201,204,105,216,83,210,95,170,219,197,86,228,86,209,125,231,83,198,194,171,144,183,222,216,216,226,204,109,216,115,180,227,116,207,2
23,84,190,215,117,212,79,214,230,218,134,208,92,209,99,181,220,61,212,85,199,185,174,171,220,221,57,146,224,214,81,183,218,152,163,193,72,203,111,130,222,204,217,229,209,201,189,220,213,72,192,214,221,157,197,124,209,100,223,162,223,73,85,221,68,210,46,205,76,118,160,180,200,85,218,34,196,170,204,88,168,224,71,188,207,166,181,210,101,148,215,76,219,217,104,199,180,210,208,69,201,202,216,224,89,206,88,128,125,196,203,106,183,182,84,228,200,120,171,166,153,215,210,82,210,65,163,131,213,83,180,205,92,152,228,99,212,212,209,94,215,218,95,206,179,236,219,204,102,196,110,160,218,140,228,102,218,177,172,193,95,209,215,203,220,78,213,191,189,168,204,218,192,89,213,223,208,230,168,202,169,222,56,155,203,73,197,156,208,222,127,212,223,160,217,201,223,205,217,100,221,208,143,218,22,208,224,86,209,90,220,144,182,139,219,79,194,183,225,216,209,190,102,221,220,203,152,215,113,230,106,212,221,106,197,111,185,211,83,162,200,208,219,124,210,92,161,216,221,79,208,166,211,142,179,222,197,79,185,226,182,193,195,170,202,113,192,231,228,197,180,152,223,155,215,97,213,85,205,145,200,89,209,201,139,217,163,133,206,174,112,205,213,217,123,201,136,132,221,206,105,200,139,183,160,215,134,221,208,181,208,101,210,225,116,183,217,80,186,120,162,194,201,97,188,209,174,122,229,214,205,120,218,228,228,83,134,201,119,171,210,55,227,107,225,64,174,203,130,213,226,95,206,157,139,154,171,139,224,219,154,190,206,218,217,221,86,135,217,213,183,118,187,186,128,212,79,184,186,156,188,134,216,172,155,195,210,176,223,125,128,217,213,214,217,94,189,192,133,167,85,187,217,60,158,221,122,217,117,224,95,206,89,199,74,205,210,70,209,83,196,77,169,188,228,114,188,105,229,182,174,223,59,181,205,127,225,222,128,190,219,112,197,115,222,205,115,184,214,210,238,196,212,85,207,86,183,215,178,187,112,194,176,212,90,217,80,164,179,187,138,221,214,203,96,227,198,82,209,207,96,208,76,228,224,205,207,186,178,221,224,189,143,229,182,229,132,213,201,186,218,67,222,221,83,216,44,220,68,214,103,137,210,48,210,211,88,222,169,225,155,181,200,207,62,219,152,179,130,215,86,183,90,202,164,179,205,128,219,224,148,190,117,213,221,184,175,146,223,216,82,230,92,201,231,218,172,207,209,189,150,216,142,226,46,197,219,92,228,168,216,111,221,204,216,190,180,189,206,108,204,101,224,74,223,65,181,142,211,203,122,220,214,66,179,185,222,116,120,209,222,101,172,209,232,56,223,177,212,107,191,168,193,193,105,212,105,163,209,191,117,194,170,211,190,121,221,90,161,207,67,226,90,216,60,171,219,89,220,184,214,194,144,182,214,208,196,133,226,214,77,190,215,63,115,208,92,227,204,68,197,77,178,188,204,216,202,194,152,175,181,208,131,196,223,128,177,116,207,102,220,121,216,199,202,207,74,211,196,217,177,214,218,221,93,219,192,215,143,139,183,226,83,102,219,211,100,194,47,119,204,160,143,180,149,183,226,96,172,212,186,198,207,182,207,221,164,146,224,61,200,140,147,196,214,60,226,215,111,216,186,180,161,234,196,124,208,214,221,189,178,112,168,208,136,169,181,202,124,201,74,211,143,192,175,119,163,202,67,191,172,148,179,223,122,217,117,180,190,186,209,72,193,182,218,80,156,138,201,202,218,208,149,190,109,224,64,192,114,219,199,210,208,97,232,205,197,188,189,87,213,152,196,164,131,225,94,219,205,163,191,172,196,189,206,110,201,73,191,122,210,173,208,73,221,136,222,185,224,74,213,86,185,221,170,210,69,165,156,210,102,211,210,221,197,209,154,127,212,180,208,139,231,207,50,166,96,150,158,206,228,214,202,95,162,220,191,136,217,54,155,201,140,179,191,91,213,220,74,145,216,232,45,208,217,209,182,160,182,100,221,155,219,227,160,180,209,147,174,212,67,209,82,213,148,208,51,176,88,210,71,117,
174,218,82,211,188,170,210,186,136,176,220,157,189,167,190,212,160,212,135,201,219,49,162,165,209,117,175,213,152,176,220,221,124,150,204,220,73,228,194,218,57,195,173,159,173,175,206,176,229,79,164,201,203,152,214,116,137,219,66,222,214,80,175,202,90,225,113,219,206,78,190,89,214,50,209,154,188,227,194,157,195,74,186,206,130,198,73,212,60,204,122,222,99,205,196,229,213,83,230,108,171,126,192,104,216,207,117,217,197,214,79,207,110,221,79,217,144,206,160,206,172,197,183,207,217,207,113,210,221,71,161,221,164,227,214,142,177,185,180,103,130,198,123,205,74,216,102,219,160,217,75,204,114,192,213,166,188,118,222,227,92,195,219,161,200,221,69,203,143,198,198,217,198,66,212,50,208,116,199,125,210,207,167,225,116,207,97,184,99,220,184,203,184,219,177,167,202,214,55,207,161,197,122,212,226,187,96,216,201,188,135,224,207,139,225,230,220,121,221,107,212,66,170,169,210,199,102,220,94,159,184,207,92,207,231,214,125,227,220,205,58,193,203,215,223,229,78,196,170,185,196,162,234,56,201,123,171,231,196,86,162,199,213,220,68,200,68,205,88,225,135,220,82,182,215,222,79,152,230,62,162,218,184,224,67,206,99,189,124,214,197,73,204,105,221,179,102,218,232,80,214,181,170,204,165,216,207,217,212,195,176,215,106,192,160,221,182,217,57,211,88,198,233,113,171,204,138,193,209,225,59,176,184,134,223,151,193,200,217,100,225,79,180,142,190,123,222,80,232,216,133,216,148,211,110,198,96,187,224,95,208,112,178,227,94,171,96,181,209,170,225,196,206,94,216,87,217,171,191,82,218,127,227,176,219,207,230,79,214,203,105,213,143,174,188,125,193,220,60,215,172,214,101,211,110,161,117,187,180,125,218,220,62,208,203,217,87,198,156,216,226,161,161,223,224,72,178,198,213,195,219,208,140,175,217,74,201,201,66,186,154,229,89,226,169,204,87,184,85,161,133,201,80,176,188,114,224,77,207,126,202,83,219,200,125,172,169,190,216,80,88,221,68,218,133,216,117,217,157,217,170,190,124,214,210,156,231,84,207,204,113,200,70,222,162,208,227,92,223,136,167,195,221,221,77,173,213,109,214,117,211,217,89,217,91,210,152,194,206,202,110,216,177,190,207,227,185,172,230,172,207,171,199,234,207,149,194,192,179,212,209,210,101,198,225,85,164,211,110,194,182,211,224,65,228,218,79,224,81,122,208,154,129,206,92,193,171,148,188,221,80,220,161,165,166,161,214,99,210,64,174,224,221,105,200,122,230,216,94,223,128,225,161,219,126,187,137,191,222,214,148,151,198,218,210,110,208,228,184,211,35,202,218,195,216,115,212,95,177,199,101,184,208,202,212,134,193,129,192,81,182,223,70,226,230,134,167,183,198,222,227,227,226,63,213,109,187,177,219,223,203,144,179,209,103,177,181,158,221,90,222,166,207,175,230,207,99,205,234,210,210,168,223,143,210,187,209,204,150,209,213,208,193,221,214,77,215,199,81,197,82,177,190,210,231,79,179,221,64,182,199,82,204,204,95,172,187,178,209,86,222,220,118,192,223,88,220,77,174,104,224,137,182,186,96,207,198,74,152,196,217,206,79,214,208,204,180,94,215,81,177,160,201,164,173,205,76,199,220,228,91,215,155,226,79,133,181,136,182,226,96,221,109,209,223,71,202,95,217,87,202,204,183,210,187,212,81,226,184,224,88,170,214,198,226,142,212,81,209,189,172,192,221,216,123,221,126,204,218,222,76,205,73,225,221,73,204,108,201,88,174,197,136,223,90,189,56,207,147,206,212,73,201,83,204,112,137,227,67,208,137,219,225,65,200,186,99,214,97,215,74,203,65,199,216,108,216,80,206,219,104,226,180,225,199,186,197,226,157,102,177,107,231,156,141,226,70,220,216,223,64,214,66,201,174,170,207,46,202,131,173,218,125,217,157,234,192,159,174,209,95,196,224,59,220,69,211,130,203,222,88,208,86,198,127,219,228,75,218,170,168,198,128,215,54,211,167,186,117,211,162,221,219,105,223,99
,223,127,202,218,213,143,194,181,200,180,230,224,97,181,132,173,202,221,57,151,220,77,220,160,206,188,101,197,72,213,95,193,212,189,105,226,100,205,201,56,211,93,178,212,88,208,83,213,165,219,183,236,121,220,210,94,212,171,186,218,137,212,129,175,203,223,134,194,95,193,191,105,229,208,102,196,120,191,221,217,65,206,200,74,168,180,199,217,119,223,68,211,125,204,105,180,164,215,227,128,211,166,218,86,185,74,214,57,200,171,111,185,73,199,220,213,192,216,107,211,115,219,227,192,221,101,203,65,211,51,216,84,193,121,214,86,195,115,179,229,90,215,92,207,63,179,212,38,202,104,182,125,179,99,147,184,210,166,227,232,164,120,218,169,203,154,192,224,217,122,160,205,206,221,80,191,217,166,202,78,206,147,202,155,195,76,204,136,191,112,195,160,147,226,91,224,216,212,177,188,165,174,130,203,221,220,133,209,147,216,69,159,155,143,213,94,227,139,209,163,183,199,112,217,213,98,217,96,185,158,173,229,51,209,195,227,214,161,213,83,168,229,209,118,221,224,59,179,161,220,209,193,199,199,212,107,226,219,204,117,166,223,122,166,181,163,176,223,176,130,223,221,202,89,188,147,160,143,218,223,206,151,201,161,130,176,175,138,126,209,112,230,94]
+ [178,205,218,148,184,163,221,185,200,228,172,155,210,222,88,206,226,67,132,212,91,206,104,212,174,205,132,159,230,175,216,198,227,190,212,198,122,213,169,204,92,197,118,191,191,224,69,219,197,72,218,77,175,111,155,217,220,170,231,91,221,217,95,146,177,123,195,205,151,209,207,36,202,200,226,176,232,53,167,199,89,184,213,104,154,153,216,214,215,174,205,72,211,78,221,212,232,223,73,158,220,158,202,222,189,165,205,175,222,132,126,179,219,110,209,158,208,98,176,192,226,34,158,205,126,178,224,182,227,100,152,191,169,195,163,172,208,117,199,217,167,217,157,163,194,217,200,217,23,221,209,146,150,204,200,125,215,232,68,147,212,41,223,178,152,173,210,139,198,182,196,207,95,176,205,223,83,216,85,207,210,52,177,178,230,197,119,226,99,182,210,212,77,138,199,123,179,111,219,69,223,65,204,215,83,197,87,211,132,216,135,178,157,166,216,85,170,195,208,190,175,134,220,200,67,221,211,66,227,222,226,190,209,205,67,207,139,208,127,186,205,168,221,179,223,117,148,221,216,80,189,125,199,202,64,218,77,195,190,221,181,98,143,214,220,97,187,127,219,122,216,87,138,212,194,112,219,227,101,220,100,164,234,109,221,102,223,89,184,205,219,77,188,223,172,171,175,152,175,137,213,197,114,205,221,138,181,174,227,73,147,144,178,147,215,152,182,204,80,210,123,211,121,209,224,224,219,211,163,133,187,148,151,163,221,94,133,213,72,187,224,216,162,154,224,184,118,204,220,154,117,220,162,202,223,195,110,197,151,224,88,182,217,221,214,118,218,118,164,205,97,221,183,154,206,197,74,170,219,103,230,215,192,224,78,184,72,201,227,221,191,181,104,190,224,221,99,123,206,102,228,202,74,195,96,225,176,232,231,96,225,206,61,173,101,190,211,90,213,199,203,184,209,173,160,207,203,99,201,195,132,195,214,148,211,227,192,215,212,127,162,213,114,178,111,207,71,129,182,212,75,176,209,137,213,224,220,70,232,116,228,179,200,184,222,93,202,71,219,123,213,119,204,173,135,118,207,96,216,107,210,202,189,172,211,76,146,195,169,83,227,62,219,223,125,158,202,226,98,214,225,76,156,211,204,110,190,224,200,64,228,223,72,213,102,214,178,219,128,211,127,187,213,160,216,134,202,150,186,201,195,230,199,206,193,205,219,216,84,213,223,192,222,109,187,209,79,153,204,224,72,167,190,183,183,228,133,227,103,197,225,203,205,201,207,213,210,78,138,195,133,195,180,190,185,207,189,100,191,229,198,129,226,135,205,164,226,190,73,200,231,209,216,147,221,228,165,213,209,181,157,222,117,216,58,148,219,92,222,219,63,191,218,186,197,158,172,204,109,204,102,223,229,91,192,217,186,204,144,173,209,95,186,222,225,178,183,222,71,193,182,232,63,227,228,82,209,162,219,183,222,122,212,195,60,219,130,223,69,216,111,198,224,80,223,74,164,148,227,193,216,139,207,198,148,227,199,169,230,225,86,138,165,206,91,213,164,215,229,173,216,210,92,166,205,102,196,78,202,74,187,202,229,191,222,57,206,115,183,94,225,173,198,109,212,224,126,215,44,221,80,196,95,177,218,217,198,109,218,191,221,214,227,71,207,123,206,90,219,218,108,181,214,123,150,204,214,208,225,199,191,192,213,93,217,149,178,199,106,204,80,208,60,131,132,232,218,213,115,202,94,175,227,217,114,200,222,178,180,185,192,193,205,173,226,217,222,197,184,205,211,136,214,221,62,213,89,169,184,135,198,98,216,225,71,230,133,213,154,216,189,122,224,108,204,88,207,227,93,191,204,214,66,153,219,209,203,218,237,229,222,148,174,218,107,214,169,223,216,173,220,148,199,188,175,203,142,188,210,98,167,174,224,103,156,97,220,111,216,70,130,221,216,115,166,164,162,210,216,69,173,203,77,216,175,172,210,98,199,213,88,221,82,184,131,201,220,133,211,116,156,219,213,227,128,221,237,172,217,64,180,221,70,169,220,216,134,215,177,190,197,229,221,226,74,189,190,219,86
,222,74,213,106,184,182,205,221,229,143,201,198,162,209,128,221,209,60,216,146,228,106,215,154,214,202,82,194,141,168,217,213,80,140,114,220,149,229,215,150,211,170,199,109,218,108,206,125,204,164,183,148,193,153,188,143,160,199,210,124,225,105,178,75,222,112,205,75,199,83,222,215,158,104,205,70,215,223,158,220,99,124,194,91,219,65,209,149,234,188,222,218,200,66,208,101,208,188,201,207,198,160,205,219,118,170,195,141,128,182,189,211,51,179,216,112,152,196,224,80,223,177,218,127,186,113,151,117,159,184,200,231,212,134,222,68,216,204,90,151,202,158,173,146,144,209,222,89,171,188,206,215,117,191,223,80,215,68,151,159,210,74,208,105,204,105,174,228,67,147,202,214,41,222,163,183,142,227,188,200,208,132,157,191,195,94,166,201,155,214,132,185,195,165,214,219,156,217,100,147,220,105,220,223,225,117,176,166,207,134,218,202,172,183,217,76,229,139,217,87,208,181,214,169,154,199,231,208,200,157,213,222,49,207,191,139,219,218,82,122,154,137,190,176,192,180,231,80,208,136,185,112,189,187,192,208,219,110,208,102,186,225,220,218,235,214,96,133,198,199,95,152,83,219,190,124,148,222,213,207,145,217,168,200,32,208,198,171,205,218,104,218,136,230,77,139,178,212,214,129,211,233,93,221,95,155,153,221,151,201,108,193,216,172,208,72,177,217,208,225,60,81,172,234,180,146,184,180,199,222,224,182,106,183,105,221,208,81,221,137,211,44,198,224,205,82,187,164,147,213,163,229,192,219,218,190,228,221,218,116,231,97,182,126,213,193,173,207,103,207,122,161,156,193,153,192,92,198,155,223,224,106,223,170,177,225,40,206,203,221,152,208,116,203,105,197,221,65,226,111,224,215,215,142,154,211,84,184,135,169,212,213,76,229,225,130,207,85,209,177,116,140,150,229,209,105,164,186,214,219,62,148,203,230,118,207,216,217,211,114,201,134,217,87,197,124,209,186,184,164,189,163,212,85,219,106,209,220,168,221,212,112,189,183,229,200,192,106,224,114,220,95,141,223,222,219,226,79,187,202,212,222,193,128,223,209,160,178,197,222,111,152,114,164,206,179,189,183,122,212,149,150,130,214,115,195,118,204,208,120,195,163,159,220,125,222,174,211,144,140,220,80,173,204,87,215,149,227,97,229,221,119,213,139,212,82,215,68,213,136,186,168,202,85,213,93,168,218,63,222,121,213,90,218,94,197,221,182,220,112,224,221,83,215,194,138,181,197,221,91,186,194,99,179,213,210,76,188,214,216,79,213,92,220,94,201,108,213,131,221,153,219,74,159,172,218,217,219,108,209,35,217,112,157,206,148,211,126,214,107,191,105,164,224,220,126,204,201,211,119,203,129,165,205,130,218,137,211,178,204,215,206,119,209,75,193,182,115,178,207,217,220,184,199,167,189,230,140,208,167,173,195,136,220,205,64,186,159,194,75,213,189,110,189,208,120,176,182,162,196,183,137,134,198,172,193,184,172,212,189,214,200,95,167,199,110,195,116,180,204,66,217,126,222,226,193,166,143,121,218,224,196,140,226,142,206,66,214,140,218,144,201,221,152,204,154,221,101,225,69,216,212,128,181,200,180,206,136,193,226,226,219,141,168,202,209,105,178,128,208,197,145,163,218,124,220,147,220,156,211,69,212,226,132,223,165,211,110,229,194,193,204,208,90,218,106,152,215,134,180,182,204,211,80,211,108,204,102,208,102,192,94,191,206,203,140,165,213,197,105,186,186,210,62,208,165,213,174,186,115,211,164,193,128,201,169,217,167,217,174,222,167,129,170,217,218,182,159,113,208,85,219,126,219,89,172,213,212,69,205,135,124,205,67,208,52,186,160,225,125,190,65,227,168,180,166,151,194,205,88,138,204,110,124,223,80,193,106,179,183,163,220,148,205,218,200,221,198,202,190,213,87,227,239,81,213,115,211,219,201,204,105,216,83,210,95,170,219,197,86,228,86,209,125,231,83,198,194,171,144,183,222,216,216,226,204,109,216,115,180,227,116,207,2
23,84,190,215,117,212,79,214,230,218,134,208,92,209,99,181,220,61,212,85,199,185,174,171,220,221,57,146,224,214,81,183,218,152,163,193,72,203,111,130,222,204,217,229,209,201,189,220,213,72,192,214,221,157,197,124,209,100,223,162,223,73,85,221,68,210,46,205,76,118,160,180,200,85,218,34,196,170,204,88,168,224,71,188,207,166,181,210,101,148,215,76,219,217,104,199,180,210,208,69,201,202,216,224,89,206,88,128,125,196,203,106,183,182,84,228,200,120,171,166,153,215,210,82,210,65,163,131,213,83,180,205,92,152,228,99,212,212,209,94,215,218,95,206,179,236,219,204,102,196,110,160,218,140,228,102,218,177,172,193,95,209,215,203,220,78,213,191,189,168,204,218,192,89,213,223,208,230,168,202,169,222,56,155,203,73,197,156,208,222,127,212,223,160,217,201,223,205,217,100,221,208,143,218,22,208,224,86,209,90,220,144,182,139,219,79,194,183,225,216,209,190,102,221,220,203,152,215,113,230,106,212,221,106,197,111,185,211,83,162,200,208,219,124,210,92,161,216,221,79,208,166,211,142,179,222,197,79,185,226,182,193,195,170,202,113,192,231,228,197,180,152,223,155,215,97,213,85,205,145,200,89,209,201,139,217,163,133,206,174,112,205,213,217,123,201,136,132,221,206,105,200,139,183,160,215,134,221,208,181,208,101,210,225,116,183,217,80,186,120,162,194,201,97,188,209,174,122,229,214,205,120,218,228,228,83,134,201,119,171,210,55,227,107,225,64,174,203,130,213,226,95,206,157,139,154,171,139,224,219,154,190,206,218,217,221,86,135,217,213,183,118,187,186,128,212,79,184,186,156,188,134,216,172,155,195,210,176,223,125,128,217,213,214,217,94,189,192,133,167,85,187,217,60,158,221,122,217,117,224,95,206,89,199,74,205,210,70,209,83,196,77,169,188,228,114,188,105,229,182,174,223,59,181,205,127,225,222,128,190,219,112,197,115,222,205,115,184,214,210,238,196,212,85,207,86,183,215,178,187,112,194,176,212,90,217,80,164,179,187,138,221,214,203,96,227,198,82,209,207,96,208,76,228,224,205,207,186,178,221,224,189,143,229,182,229,132,213,201,186,218,67,222,221,83,216,44,220,68,214,103,137,210,48,210,211,88,222,169,225,155,181,200,207,62,219,152,179,130,215,86,183,90,202,164,179,205,128,219,224,148,190,117,213,221,184,175,146,223,216,82,230,92,201,231,218,172,207,209,189,150,216,142,226,46,197,219,92,228,168,216,111,221,204,216,190,180,189,206,108,204,101,224,74,223,65,181,142,211,203,122,220,214,66,179,185,222,116,120,209,222,101,172,209,232,56,223,177,212,107,191,168,193,193,105,212,105,163,209,191,117,194,170,211,190,121,221,90,161,207,67,226,90,216,60,171,219,89,220,184,214,194,144,182,214,208,196,133,226,214,77,190,215,63,115,208,92,227,204,68,197,77,178,188,204,216,202,194,152,175,181,208,131,196,223,128,177,116,207,102,220,121,216,199,202,207,74,211,196,217,177,214,218,221,93,219,192,215,143,139,183,226,83,102,219,211,100,194,47,119,204,160,143,180,149,183,226,96,172,212,186,198,207,182,207,221,164,146,224,61,200,140,147,196,214,60,226,215,111,216,186,180,161,234,196,124,208,214,221,189,178,112,168,208,136,169,181,202,124,201,74,211,143,192,175,119,163,202,67,191,172,148,179,223,122,217,117,180,190,186,209,72,193,182,218,80,156,138,201,202,218,208,149,190,109,224,64,192,114,219,199,210,208,97,232,205,197,188,189,87,213,152,196,164,131,225,94,219,205,163,191,172,196,189,206,110,201,73,191,122,210,173,208,73,221,136,222,185,224,74,213,86,185,221,170,210,69,165,156,210,102,211,210,221,197,209,154,127,212,180,208,139,231,207,50,166,96,150,158,206,228,214,202,95,162,220,191,136,217,54,155,201,140,179,191,91,213,220,74,145,216,232,45,208,217,209,182,160,182,100,221,155,219,227,160,180,209,147,174,212,67,209,82,213,148,208,51,176,88,210,71,117,
174,218,82,211,188,170,210,186,136,176,220,157,189,167,190,212,160,212,135,201,219,49,162,165,209,117,175,213,152,176,220,221,124,150,204,220,73,228,194,218,57,195,173,159,173,175,206,176,229,79,164,201,203,152,214,116,137,219,66,222,214,80,175,202,90,225,113,219,206,78,190,89,214,50,209,154,188,227,194,157,195,74,186,206,130,198,73,212,60,204,122,222,99,205,196,229,213,83,230,108,171,126,192,104,216,207,117,217,197,214,79,207,110,221,79,217,144,206,160,206,172,197,183,207,217,207,113,210,221,71,161,221,164,227,214,142,177,185,180,103,130,198,123,205,74,216,102,219,160,217,75,204,114,192,213,166,188,118,222,227,92,195,219,161,200,221,69,203,143,198,198,217,198,66,212,50,208,116,199,125,210,207,167,225,116,207,97,184,99,220,184,203,184,219,177,167,202,214,55,207,161,197,122,212,226,187,96,216,201,188,135,224,207,139,225,230,220,121,221,107,212,66,170,169,210,199,102,220,94,159,184,207,92,207,231,214,125,227,220,205,58,193,203,215,223,229,78,196,170,185,196,162,234,56,201,123,171,231,196,86,162,199,213,220,68,200,68,205,88,225,135,220,82,182,215,222,79,152,230,62,162,218,184,224,67,206,99,189,124,214,197,73,204,105,221,179,102,218,232,80,214,181,170,204,165,216,207,217,212,195,176,215,106,192,160,221,182,217,57,211,88,198,233,113,171,204,138,193,209,225,59,176,184,134,223,151,193,200,217,100,225,79,180,142,190,123,222,80,232,216,133,216,148,211,110,198,96,187,224,95,208,112,178,227,94,171,96,181,209,170,225,196,206,94,216,87,217,171,191,82,218,127,227,176,219,207,230,79,214,203,105,213,143,174,188,125,193,220,60,215,172,214,101,211,110,161,117,187,180,125,218,220,62,208,203,217,87,198,156,216,226,161,161,223,224,72,178,198,213,195,219,208,140,175,217,74,201,201,66,186,154,229,89,226,169,204,87,184,85,161,133,201,80,176,188,114,224,77,207,126,202,83,219,200,125,172,169,190,216,80,88,221,68,218,133,216,117,217,157,217,170,190,124,214,210,156,231,84,207,204,113,200,70,222,162,208,227,92,223,136,167,195,221,221,77,173,213,109,214,117,211,217,89,217,91,210,152,194,206,202,110,216,177,190,207,227,185,172,230,172,207,171,199,234,207,149,194,192,179,212,209,210,101,198,225,85,164,211,110,194,182,211,224,65,228,218,79,224,81,122,208,154,129,206,92,193,171,148,188,221,80,220,161,165,166,161,214,99,210,64,174,224,221,105,200,122,230,216,94,223,128,225,161,219,126,187,137,191,222,214,148,151,198,218,210,110,208,228,184,211,35,202,218,195,216,115,212,95,177,199,101,184,208,202,212,134,193,129,192,81,182,223,70,226,230,134,167,183,198,222,227,227,226,63,213,109,187,177,219,223,203,144,179,209,103,177,181,158,221,90,222,166,207,175,230,207,99,205,234,210,210,168,223,143,210,187,209,204,150,209,213,208,193,221,214,77,215,199,81,197,82,177,190,210,231,79,179,221,64,182,199,82,204,204,95,172,187,178,209,86,222,220,118,192,223,88,220,77,174,104,224,137,182,186,96,207,198,74,152,196,217,206,79,214,208,204,180,94,215,81,177,160,201,164,173,205,76,199,220,228,91,215,155,226,79,133,181,136,182,226,96,221,109,209,223,71,202,95,217,87,202,204,183,210,187,212,81,226,184,224,88,170,214,198,226,142,212,81,209,189,172,192,221,216,123,221,126,204,218,222,76,205,73,225,221,73,204,108,201,88,174,197,136,223,90,189,56,207,147,206,212,73,201,83,204,112,137,227,67,208,137,219,225,65,200,186,99,214,97,215,74,203,65,199,216,108,216,80,206,219,104,226,180,225,199,186,197,226,157,102,177,107,231,156,141,226,70,220,216,223,64,214,66,201,174,170,207,46,202,131,173,218,125,217,157,234,192,159,174,209,95,196,224,59,220,69,211,130,203,222,88,208,86,198,127,219,228,75,218,170,168,198,128,215,54,211,167,186,117,211,162,221,219,105,223,99
,223,127,202,218,213,143,194,181,200,180,230,224,97,181,132,173,202,221,57,151,220,77,220,160,206,188,101,197,72,213,95,193,212,189,105,226,100,205,201,56,211,93,178,212,88,208,83,213,165,219,183,236,121,220,210,94,212,171,186,218,137,212,129,175,203,223,134,194,95,193,191,105,229,208,102,196,120,191,221,217,65,206,200,74,168,180,199,217,119,223,68,211,125,204,105,180,164,215,227,128,211,166,218,86,185,74,214,57,200,171,111,185,73,199,220,213,192,216,107,211,115,219,227,192,221,101,203,65,211,51,216,84,193,121,214,86,195,115,179,229,90,215,92,207,63,179,212,38,202,104,182,125,179,99,147,184,210,166,227,232,164,120,218,169,203,154,192,224,217,122,160,205,206,221,80,191,217,166,202,78,206,147,202,155,195,76,204,136,191,112,195,160,147,226,91,224,216,212,177,188,165,174,130,203,221,220,133,209,147,216,69,159,155,143,213,94,227,139,209,163,183,199,112,217,213,98,217,96,185,158,173,229,51,209,195,227,214,161,213,83,168,229,209,118,221,224,59,179,161,220,209,193,199,199,212,107,226,219,204,117,166,223,122,166,181,163,176,223,176,130,223,221,202,89,188,147,160,143,218,223,206,151,201,161,130,176,175,138,126,209,112,230,94,211,17,103]
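
doclens.0.json stores one per-passage token count, and the three values appended here (211, 17, 103) correspond to the three new passages. A minimal consistency check, assuming the index files sit in the current directory and that, as in ColBERT indexes, the doclens of a chunk sum to its num_embeddings:

import json

with open("doclens.0.json") as f:
    doclens = json.load(f)   # one embedding count per passage in chunk 0
with open("0.metadata.json") as f:
    meta = json.load(f)

assert len(doclens) == meta["num_passages"]    # 3976 after this commit
assert sum(doclens) == meta["num_embeddings"]  # 682831 after this commit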
ivf.pid.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:043c4a81ed36ce580b981db8042c1572348b0bbb1cfe9c1df655a7c762a8e9d5
- size 1797016
+ oid sha256:f7dda714f27c9bb93c977459da5ba8b6549af07ecd112bb7751fe8c94ab42030
+ size 1797464
metadata.json CHANGED
@@ -37,7 +37,7 @@
  "checkpoint":"colbert-ir/colbertv2.0",
  "triples":"/future/u/okhattab/root/unit/experiments/2021.10/downstream.distillation.round2.2_score/round2.nway6.cosine.ib/examples.64.json",
  "collection":[
- "list with 3973 elements starting with...",
+ "list with 3976 elements starting with...",
  [
  "Deep neural networks have demonstrated remarkable performance in supervised learning tasks but require large amounts of labeled data. Self-supervised learning offers an alternative paradigm, enabling the model to learn from data without explicit labels. Information theory has been instrumental in understanding and optimizing deep neural networks. Specifically, the information bottleneck principle has been applied to optimize the trade-off between compression and relevant information preservation in supervised settings. However, the optimal information objective in self-supervised learning remains unclear. In this paper, we review various approaches to self-supervised learning from an information-theoretic standpoint and present a unified framework that formalizes the self-supervised information-theoretic learning problem. We integrate existing research into a coherent framework, examine recent self-supervised methods, and identify research opportunities and challenges. Moreover, we discuss empirical measurement of information-theoretic quantities and their estimators. This paper offers a comprehensive review of the intersection between information theory, self-supervised learning, and deep neural networks.",
  "Pre-trained large language models (LLMs) capture procedural knowledge about the world. Recent work has leveraged LLM's ability to generate abstract plans to simplify challenging control tasks, either by action scoring, or action modeling (fine-tuning). However, the transformer architecture inherits several constraints that make it difficult for the LLM to directly serve as the agent: e.g. limited input lengths, fine-tuning inefficiency, bias from pre-training, and incompatibility with non-text environments. To maintain compatibility with a low-level trainable actor, we propose to instead use the knowledge in LLMs to simplify the control problem, rather than solving it. We propose the Plan, Eliminate, and Track (PET) framework. The Plan module translates a task description into a list of high-level sub-tasks. The Eliminate module masks out irrelevant objects and receptacles from the observation for the current sub-task. Finally, the Track module determines whether the agent has accomplished each sub-task. On the AlfWorld instruction following benchmark, the PET framework leads to a significant 15% improvement over SOTA for generalization to human goal specifications.",
@@ -50,7 +50,7 @@
  "root":".ragatouille/",
  "experiment":"colbert",
  "index_root":null,
- "name":"2024-06/24/14.54.05",
+ "name":"2024-06/24/15.54.03",
  "rank":0,
  "nranks":1,
  "amp":true,
@@ -59,8 +59,8 @@
  },
  "num_chunks":1,
  "num_partitions":8192,
- "num_embeddings":682500,
- "avg_doclen":171.7845456834,
+ "num_embeddings":682831,
+ "avg_doclen":171.7381790744,
  "RAGatouille":{
  "index_config":{
  "index_type":"PLAID",
pid_docid_map.json CHANGED
@@ -3971,5 +3971,8 @@
  "3969":"2406.14783",
  "3970":"2406.14783",
  "3971":"2406.13527",
- "3972":"2406.13527"
+ "3972":"2406.13527",
+ "3973":"2406.11617",
+ "3974":"2406.11617",
+ "3975":"2406.11654"
  }
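
pid_docid_map.json maps each passage id to an arXiv id, and a paper split into several passages appears several times, as the new 2406.11617 entries show. A short sketch that inverts the map to group passage ids by paper:

import json
from collections import defaultdict

with open("pid_docid_map.json") as f:
    pid_to_doc = json.load(f)

doc_to_pids = defaultdict(list)
for pid, docid in pid_to_doc.items():
    doc_to_pids[docid].append(int(pid))

print(doc_to_pids["2406.11617"])  # -> [3973, 3974] after this commit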
plan.json CHANGED
@@ -37,7 +37,7 @@
  "checkpoint": "colbert-ir\/colbertv2.0",
  "triples": "\/future\/u\/okhattab\/root\/unit\/experiments\/2021.10\/downstream.distillation.round2.2_score\/round2.nway6.cosine.ib\/examples.64.json",
  "collection": [
- "list with 3973 elements starting with...",
+ "list with 3976 elements starting with...",
  [
  "Deep neural networks have demonstrated remarkable performance in supervised learning tasks but require large amounts of labeled data. Self-supervised learning offers an alternative paradigm, enabling the model to learn from data without explicit labels. Information theory has been instrumental in understanding and optimizing deep neural networks. Specifically, the information bottleneck principle has been applied to optimize the trade-off between compression and relevant information preservation in supervised settings. However, the optimal information objective in self-supervised learning remains unclear. In this paper, we review various approaches to self-supervised learning from an information-theoretic standpoint and present a unified framework that formalizes the self-supervised information-theoretic learning problem. We integrate existing research into a coherent framework, examine recent self-supervised methods, and identify research opportunities and challenges. Moreover, we discuss empirical measurement of information-theoretic quantities and their estimators. This paper offers a comprehensive review of the intersection between information theory, self-supervised learning, and deep neural networks.",
  "Pre-trained large language models (LLMs) capture procedural knowledge about the world. Recent work has leveraged LLM's ability to generate abstract plans to simplify challenging control tasks, either by action scoring, or action modeling (fine-tuning). However, the transformer architecture inherits several constraints that make it difficult for the LLM to directly serve as the agent: e.g. limited input lengths, fine-tuning inefficiency, bias from pre-training, and incompatibility with non-text environments. To maintain compatibility with a low-level trainable actor, we propose to instead use the knowledge in LLMs to simplify the control problem, rather than solving it. We propose the Plan, Eliminate, and Track (PET) framework. The Plan module translates a task description into a list of high-level sub-tasks. The Eliminate module masks out irrelevant objects and receptacles from the observation for the current sub-task. Finally, the Track module determines whether the agent has accomplished each sub-task. On the AlfWorld instruction following benchmark, the PET framework leads to a significant 15% improvement over SOTA for generalization to human goal specifications.",
@@ -50,7 +50,7 @@
  "root": ".ragatouille\/",
  "experiment": "colbert",
  "index_root": null,
- "name": "2024-06\/24\/14.54.05",
+ "name": "2024-06\/24\/15.54.03",
  "rank": 0,
  "nranks": 1,
  "amp": true,
@@ -59,6 +59,6 @@
  },
  "num_chunks": 1,
  "num_partitions": 8192,
- "num_embeddings_est": 682500.0008544922,
- "avg_doclen_est": 171.7845458984375
+ "num_embeddings_est": 682830.9815673828,
+ "avg_doclen_est": 171.73817443847656
  }
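
Given the metadata above, this is a RAGatouille-managed PLAID index built from the colbert-ir/colbertv2.0 checkpoint. A hedged usage sketch; the index directory name below is hypothetical, since the actual name is not part of this diff:

from ragatouille import RAGPretrainedModel

# Load an existing index from the ".ragatouille/" root seen in plan.json.
RAG = RAGPretrainedModel.from_index(".ragatouille/colbert/indexes/arxiv-abstracts")  # hypothetical index name
results = RAG.search("model merging without additional training", k=5)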