---
title: InternViT Development Test
emoji: 🔧
colorFrom: indigo
colorTo: purple
sdk: docker
pinned: false
---

# InternViT-6B with CUDA Development Tools

This Space uses the PyTorch CUDA development image so that flash-attn can be compiled with NVCC, which the runtime image does not include.
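
As a quick sanity check, a diagnostic like the one below can confirm that NVCC and the CUDA toolchain are actually visible inside the container before flash-attn is built. This is a minimal sketch, not the Space's own code.

```python
# Minimal sketch (not this Space's actual code): confirm that the CUDA
# compiler flash-attn needs is visible inside the container.
import shutil
import subprocess

import torch

nvcc_path = shutil.which("nvcc")
if nvcc_path is None:
    raise RuntimeError("nvcc not found; flash-attn cannot be compiled in this image")

print("nvcc found at:", nvcc_path)
print(subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout)
print("torch built against CUDA:", torch.version.cuda)
print("CUDA device available:", torch.cuda.is_available())
```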

## Changes in this Version

- Uses the PyTorch CUDA development image instead of the runtime image
- Includes NVCC (the NVIDIA CUDA compiler), which is needed to build flash-attn
- Pins flash-attn to version 1.0.9, which is compatible with CUDA 11.7
- Adds enhanced diagnostics to verify the flash-attn installation (see the sketch after this list)
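
A minimal sketch of the kind of flash-attn check meant above; the expected version string (1.0.9) comes from this README, and the rest is illustrative rather than the Space's actual diagnostic.

```python
# Minimal sketch of a flash-attn installation check; the expected version
# string (1.0.9) is taken from this README, everything else is illustrative.
try:
    import flash_attn

    installed = getattr(flash_attn, "__version__", "unknown")
    print(f"flash-attn imported successfully (version {installed})")
    if installed != "1.0.9":
        print("Warning: expected flash-attn 1.0.9 for the CUDA 11.7 build")
except ImportError as exc:
    print(f"flash-attn is not importable: {exc}")
```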

## Dependencies Added

- `einops`: required for the vision transformer's tensor rearrangement operations (see the example after this list)
- `flash-attn`: required for efficient attention computation
- CUDA build tools for compiling flash-attn
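
For context, the snippet below shows the kind of tensor rearrangement einops handles in a vision transformer: turning an image batch into a sequence of flattened patches. The image and patch sizes are examples only, not InternViT's actual configuration.

```python
# Illustrative use of einops in a vision transformer: split an image batch
# into 16x16 patches and flatten each patch into a token vector.
import torch
from einops import rearrange

images = torch.randn(2, 3, 224, 224)           # (batch, channels, height, width)
patches = rearrange(
    images,
    "b c (h p1) (w p2) -> b (h w) (p1 p2 c)",  # grid of patches -> token sequence
    p1=16,
    p2=16,
)
print(patches.shape)  # torch.Size([2, 196, 768])
```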

## Instructions

1. Click the "Test Model Loading" button (a sketch of the underlying test appears after this list).
2. Wait for the model to load and the test to run.
3. Check the results for success or errors.
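
For reference, the test behind the button might look roughly like the sketch below. The checkpoint name, dtype, and output handling are assumptions and may differ from this Space's actual code.

```python
# Rough sketch of what a "Test Model Loading" handler could do. The checkpoint
# name, dtype, and output handling are assumptions, not this Space's code.
import torch
from transformers import AutoModel


def test_model_loading() -> str:
    try:
        model = AutoModel.from_pretrained(
            "OpenGVLab/InternViT-6B-224px",  # assumed InternViT-6B checkpoint
            torch_dtype=torch.bfloat16,
            trust_remote_code=True,          # InternViT ships custom modeling code
        ).cuda().eval()
        pixel_values = torch.randn(1, 3, 224, 224, dtype=torch.bfloat16).cuda()
        with torch.no_grad():
            outputs = model(pixel_values)
        return f"Success: output shape {tuple(outputs.last_hidden_state.shape)}"
    except Exception as exc:  # surface any load or runtime error to the UI
        return f"Error: {exc}"


print(test_model_loading())
```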