Jack Monas committed on
Commit
40ddbbb
·
1 Parent(s): 907f97d
Files changed (1)
  1. app.py +0 -5
app.py CHANGED
@@ -328,11 +328,6 @@ def main():
             "The `world_model_tokenized_data` dataset uses NVIDIA’s Discrete Video 8x8x8 Cosmos Tokenizer to convert raw 256x256 video into tokens. For the Compression Challenge, this tokenizer is mandatory for a consistent benchmark. Alternative tokenizers are permitted for the Sampling and Evaluation Challenges."
         )
 
-    with st.expander("What happens if my Compression Challenge model achieves a loss below 8.0 with MAGVIT after March 1st?"):
-        st.write(
-            "If you submit a Compression Challenge solution using the MAGVIT tokenizer and achieve a loss below 8.0 on our held-out test set, you remain eligible for the original $10K award until August 31, 2025 (six months from March 1st). However, this submission will not qualify for the CVPR/ICCV leaderboard, which uses the Cosmos tokenizer as the new standard."
-        )
-
     with st.expander("What metrics are used to evaluate the Sampling Challenge submissions?"):
         st.write(
             "Submissions are evaluated by comparing the predicted frame (2 seconds ahead) to the ground-truth frame using Peak Signal-to-Noise Ratio (PSNR)."
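The diff's FAQ text says Sampling Challenge submissions are scored by PSNR between the predicted frame and the ground-truth frame. As background, here is a minimal sketch of how that metric is typically computed; the function name `psnr` and the assumption of 8-bit pixel values (`max_val = 255`) are illustrative, not taken from the app:

```python
import numpy as np

def psnr(pred: np.ndarray, target: np.ndarray, max_val: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio between a predicted and a ground-truth frame.

    Both arrays must have the same shape (e.g. H x W x 3 for an RGB frame).
    Higher is better; identical frames yield infinity.
    """
    mse = np.mean((pred.astype(np.float64) - target.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are identical
    return 10.0 * np.log10((max_val ** 2) / mse)
```

Note that PSNR is a pixel-level fidelity measure: it rewards predictions that are numerically close to the ground truth, which is why a fixed held-out frame 2 seconds ahead makes for a simple, reproducible benchmark.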