Space: FallnAI / LLM-Inference (duplicated from huggingface/inference-playground)
Path: LLM-Inference / src / lib / components / InferencePlayground
5 contributors · History: 144 commits
Latest commit ac63724 by mishig (HF Staff): "typo maxTokens vs max_tokens" (8 months ago)
File | Size | Last commit | Updated
InferencePlayground.svelte | 12.3 kB | Messages must alternate between user/assistant roles | 8 months ago
InferencePlaygroundCodeSnippets.svelte | 9.52 kB | format | 8 months ago
InferencePlaygroundConversation.svelte | 1.65 kB | format | 8 months ago
InferencePlaygroundGenerationConfig.svelte | 2.08 kB | handle when /api/model err | 9 months ago
InferencePlaygroundHFTokenModal.svelte | 4.57 kB | format | 8 months ago
InferencePlaygroundMessage.svelte | 1.55 kB | "stop" btn for streaming messages | 8 months ago
InferencePlaygroundModelSelector.svelte | 2.39 kB | format | 8 months ago
InferencePlaygroundModelSelectorModal.svelte | 6.17 kB | misc | 8 months ago
generationConfigSettings.ts | 934 Bytes | default steps | 8 months ago
inferencePlaygroundUtils.ts | 2.29 kB | typo maxTokens vs max_tokens | 8 months ago
types.ts | 607 Bytes | System message as part of Conversation | 9 months ago