New Friendli container not working

#2
by Papersnake - opened

I'm running the container with the following command:
docker run \
  -p 8000:8000 \
  --gpus all \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  -e FRIENDLI_CONTAINER_SECRET=$FRIENDLI_CONTAINER_SECRET \
  $FRIENDLI_CONTAINER_IMAGE \
    --hf-model-name FriendliAI/Meta-Llama-3-8B-Instruct-fp8

However, an error occurs:

[2024-06-02 09:09:48.002] [info] [::] gRPC Server listening on 0.0.0.0:0, selected_port=40565
[2024-06-02 09:09:48.003] [info] [::] Worker-0 is launched.
[2024-06-02 09:09:48.003] [info] [::] gRPC Server listening on 0.0.0.0:0, selected_port=43699
[2024-06-02 09:09:48.003] [info] [::] Master is launched.
[2024-06-02 09:09:48.009] [info] [::] Worker-0 at 0.0.0.0:43699 is joined.
[2024-06-02 09:09:48.009] [info] [::] All workers joined.
[2024-06-02 09:09:48.010] [info] [::] Received a worker start message.
[2024-06-02 09:09:48.087] [info] [::] Use a default algorithm policy as no algo policy file provided.
[2024-06-02 09:09:48.087] [info] [::] Worker-0 starts model checkpoint loading.
[2024-06-02 09:09:48.338] [info] [::] Enabled /v1/chat/completions API with chat template: "{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}"
[2024-06-02 09:09:48.772] [info] [::] Try to find sessions from session-uris.
[2024-06-02 09:09:48.773] [warning] [::] Tokenizer's vocab size is 128000, while we got a vocab size of 128256 for the model; make sure that you really want this.
[2024-06-02 09:09:48.773] [info] [::] Found a session at 0.0.0.0:40565, now wait for this session to be pre-ready.
[2024-06-02 09:10:15.015] [info] [::] Worker-0 finished model checkpoint loading.
[2024-06-02 09:10:15.015] [info] [::] Wait for workers pre-ready.
terminate called after throwing an instance of 'pfdnn::NotSupportedException'
  what():  No backend supports this parameter

I'm using Friendli Engine v1.5.27-54b7800d+, and I'm sure a previous version (from about 15 days ago) worked.

Papersnake changed discussion status to closed

The GPU I'm using doesn't support FP8, which is why loading the FP8 checkpoint fails.
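For anyone hitting the same `pfdnn::NotSupportedException` with an `-fp8` checkpoint: FP8 (E4M3/E5M2) matrix kernels require an NVIDIA GPU with compute capability 8.9 or higher, i.e. Ada Lovelace (e.g. L4, L40, RTX 4090) or Hopper (e.g. H100). A minimal sketch for checking this before pulling an FP8 model — the helper name is mine, and it assumes the capability string reported by `nvidia-smi --query-gpu=compute_cap --format=csv,noheader`:

```python
# Check whether a GPU's compute capability allows FP8 inference.
# FP8 kernels need compute capability >= 8.9 (Ada Lovelace or Hopper);
# e.g. an A100 (8.0) cannot run FP8 checkpoints, while an H100 (9.0) can.

def fp8_supported(compute_cap: str) -> bool:
    """compute_cap is a string like "8.0", as printed by
    `nvidia-smi --query-gpu=compute_cap --format=csv,noheader`."""
    major, minor = (int(x) for x in compute_cap.split("."))
    return (major, minor) >= (8, 9)

if __name__ == "__main__":
    # Hypothetical examples; run nvidia-smi to get your own GPU's value.
    for gpu, cap in [("A100", "8.0"), ("RTX 4090", "8.9"), ("H100", "9.0")]:
        verdict = "supported" if fp8_supported(cap) else "NOT supported"
        print(f"{gpu} (compute capability {cap}): FP8 {verdict}")
```

On an unsupported GPU, using the non-quantized variant of the model should avoid the crash.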
