Spaces: FallnAI/LLM-Inference (Sleeping)
Duplicated from huggingface/inference-playground

LLM-Inference / src / lib / components / InferencePlayground
5 contributors · History: 211 commits
Latest commit: mishig (HF Staff) · Fix streaming extra new lines · 0ece011 · 5 months ago
  • InferencePlayground.svelte (19.6 kB): Fix streaming extra new lines, 5 months ago
  • InferencePlaygroundCodeSnippets.svelte (15.3 kB): view docs + close from top, 7 months ago
  • InferencePlaygroundConversation.svelte (3.13 kB): view docs + close from top, 7 months ago
  • InferencePlaygroundConversationHeader.svelte (2.87 kB): border dark, 7 months ago
  • InferencePlaygroundGenerationConfig.svelte (4.49 kB): [system prompts] Support default system prompts, 5 months ago
  • InferencePlaygroundHFTokenModal.svelte (4.57 kB): format, 8 months ago
  • InferencePlaygroundMessage.svelte (1.77 kB): lint fix, 7 months ago
  • InferencePlaygroundModelSelector.svelte (2.25 kB): fix reactivity, 5 months ago
  • InferencePlaygroundModelSelectorModal.svelte (6.36 kB): text align, 7 months ago
  • generationConfigSettings.ts (1.01 kB): [Settings] max_tokens: { default: 2048 }, 6 months ago
  • inferencePlaygroundUtils.ts (2.13 kB): order, 5 months ago
  • types.ts (698 Bytes): Compare models, 7 months ago