Space: FallnAI / LLM-Inference
Duplicated from huggingface/inference-playground
LLM-Inference / src / lib / components / InferencePlayground
5 contributors
History: 113 commits
Latest commit fcefed2 by mishig (HF Staff), 8 months ago: better link for "Create new token"
File                                           Size       Last commit message                                    Last updated
InferencePlayground.svelte                     11.4 kB    Hide "API Quota" div for now                           8 months ago
InferencePlaygroundCodeSnippets.svelte         8.65 kB    padding                                                9 months ago
InferencePlaygroundConversation.svelte         1.61 kB    wip                                                    9 months ago
InferencePlaygroundGenerationConfig.svelte     2.08 kB    handle when /api/model err                             9 months ago
InferencePlaygroundHFTokenModal.svelte         4.31 kB    better link for "Create new token"                     8 months ago
InferencePlaygroundMessage.svelte              1.54 kB    order imports                                          9 months ago
InferencePlaygroundModelSelector.svelte        2.07 kB    quick fixes                                            9 months ago
InferencePlaygroundModelSelectorModal.svelte   3.62 kB    Model selector w-full                                  8 months ago
generationConfigSettings.ts                    933 Bytes  Rm advanced options for config                         9 months ago
inferencePlaygroundUtils.ts                    2.16 kB    make tokens count working for non-streaming as well   9 months ago
types.ts                                       607 Bytes  System message as part of Conversation                 9 months ago