
Grok 2

This repository contains the weights of Grok 2, a model trained and used at xAI in 2024.

Usage: Serving with SGLang

  • Download the weights. You can replace /local/grok-2 with any other folder name you prefer.

    hf download xai-org/grok-2 --local-dir /local/grok-2
    

    Downloads of this size can fail intermittently; if that happens, rerun the command until it completes (a scripted download with retries is also sketched after these steps). A successful download contains 42 files and totals approximately 500 GB.

  • Launch a server.

    Install the latest SGLang inference engine (>= v0.5.1) from https://github.com/sgl-project/sglang/

    Use the command below to launch an inference server. The checkpoint is sharded for tensor parallelism of 8 (TP=8), so you will need 8 GPUs, each with more than 40 GB of memory (a quick GPU check is sketched after these steps).

    python3 -m sglang.launch_server --model /local/grok-2 --tokenizer-path /local/grok-2/tokenizer.tok.json --tp 8 --quantization fp8 --attention-backend triton
    
  • Send a request.

    This is a post-trained model, so prompts must follow its chat template, as in the example below.

    python3 -m sglang.test.send_one --prompt "Human: What is your name?<|separator|>\n\nAssistant:"
    

    The model should respond with its name, Grok.

    See the SGLang documentation for other ways to send requests; a minimal HTTP request is also sketched below.
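
As noted in the download step, transfers of this size can fail part-way through. Below is a minimal Python sketch of a scripted alternative using huggingface_hub.snapshot_download (re-running it skips files that are already complete); the retry budget, sleep interval, and sanity check are illustrative assumptions, not part of the official instructions.

    import os
    import time
    from huggingface_hub import snapshot_download

    target = "/local/grok-2"  # same folder as the CLI example

    for attempt in range(10):  # arbitrary retry budget
        try:
            snapshot_download(repo_id="xai-org/grok-2", local_dir=target)
            break
        except Exception as err:  # large downloads can hit transient network errors
            print(f"Attempt {attempt + 1} failed: {err}; retrying in 30 s...")
            time.sleep(30)

    # Sanity check against the numbers above: 42 files, roughly 500 GB
    # (cache metadata under the target folder may shift the exact count).
    files = [os.path.join(root, name)
             for root, _, names in os.walk(target) for name in names]
    total_gb = sum(os.path.getsize(f) for f in files) / 1e9
    print(f"{len(files)} files, {total_gb:.0f} GB")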
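
Before launching the server, it can help to confirm that the machine actually exposes 8 GPUs with more than 40 GB of memory each, since the checkpoint is sharded for TP=8. A small sketch using PyTorch (assumed to be installed alongside SGLang):

    import torch

    # The checkpoint is TP=8: expect 8 GPUs, each with more than 40 GB of memory.
    n = torch.cuda.device_count()
    print(f"Visible GPUs: {n}")
    for i in range(n):
        props = torch.cuda.get_device_properties(i)
        print(f"  GPU {i}: {props.name}, {props.total_memory / 1e9:.0f} GB")
    assert n >= 8, "sglang.launch_server --tp 8 needs at least 8 GPUs"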
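
The send_one helper above is the quickest test, but the running server can also be queried over HTTP. The sketch below reuses the chat template from the example; it assumes SGLang's native /generate endpoint on the default port 30000, and the sampling parameters are placeholder values to adjust for your deployment.

    import requests

    # Prompt formatted with the Grok 2 chat template shown above.
    prompt = "Human: What is your name?<|separator|>\n\nAssistant:"

    resp = requests.post(
        "http://localhost:30000/generate",  # assumed default SGLang port
        json={
            "text": prompt,
            "sampling_params": {"temperature": 0.7, "max_new_tokens": 128},
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["text"])  # the completion should include the name Grok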

License

The weights are licensed under the Grok 2 Community License Agreement.
