Space: FallnAI/LLM-Inference (duplicated from huggingface/inference-playground)
Likes: 0 · Status: Sleeping
Revision ba51e63 · LLM-Inference / src/lib/components/InferencePlayground
5 contributors · History: 123 commits
Latest commit: "set/get model from query params" by mishig (HF Staff), 80b13a0, 10 months ago
File                                          Size      Last commit message                                  Age
InferencePlayground.svelte                    12.1 kB   set/get model from query params                      10 months ago
InferencePlaygroundCodeSnippets.svelte        9.44 kB   format                                               10 months ago
InferencePlaygroundConversation.svelte        1.64 kB   snippets "showToken" feature                         10 months ago
InferencePlaygroundGenerationConfig.svelte    2.08 kB   handle when /api/model err                           11 months ago
InferencePlaygroundHFTokenModal.svelte        4.84 kB   format                                               10 months ago
InferencePlaygroundMessage.svelte             1.54 kB   order imports                                        11 months ago
InferencePlaygroundModelSelector.svelte       2.3 kB    set/get model from query params                      10 months ago
InferencePlaygroundModelSelectorModal.svelte  5.8 kB    more darkmode                                        10 months ago
generationConfigSettings.ts                   933 Bytes Rm advanced options for config                       11 months ago
inferencePlaygroundUtils.ts                   2.16 kB   make tokens count working for non-streaming as well  11 months ago
types.ts                                      607 Bytes System message as part of Conversation               11 months ago