Space: FallnAI/LLM-Inference (duplicated from huggingface/inference-playground)
LLM-Inference / src / lib / components / InferencePlayground
5 contributors · History: 103 commits
Latest commit 7705415 by mishig (HF Staff): improve widget token placeholder (10 months ago)
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| InferencePlayground.svelte | 9.84 kB | make tokens count working for non-streaming as well | 10 months ago |
| InferencePlaygroundCodeSnippets.svelte | 8.59 kB | improve widget token placeholder | 10 months ago |
| InferencePlaygroundConversation.svelte | 1.61 kB | wip | 10 months ago |
| InferencePlaygroundGenerationConfig.svelte | 3.37 kB | order imports | 10 months ago |
| InferencePlaygroundHFTokenModal.svelte | 4.26 kB | wip | 10 months ago |
| InferencePlaygroundMessage.svelte | 1.54 kB | order imports | 10 months ago |
| InferencePlaygroundModelSelector.svelte | 2.05 kB | types file | 10 months ago |
| InferencePlaygroundModelSelectorModal.svelte | 3.57 kB | make search working | 10 months ago |
| generationConfigSettings.ts | 1.01 kB | Remove top_k & repetiton_penalty as they are not supported | 10 months ago |
| inferencePlaygroundUtils.ts | 2.16 kB | make tokens count working for non-streaming as well | 10 months ago |
| types.ts | 607 Bytes | System message as part of Conversation | 10 months ago |
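The commit messages hint at the shape of the shared definitions: types.ts folds the system message into the conversation type, and generationConfigSettings.ts drops top_k and repetition_penalty because they are not supported. The TypeScript below is a minimal sketch of what such definitions could look like; every interface and field name here is an assumption for illustration, not the actual contents of the repository files.

```ts
// Hypothetical sketch only: names and fields are assumptions, not copied
// from types.ts or generationConfigSettings.ts in this Space.

// Generation parameters exposed by the playground. top_k and
// repetition_penalty are intentionally absent, mirroring the commit
// "Remove top_k & repetiton_penalty as they are not supported".
interface GenerationConfig {
	temperature: number;
	max_tokens: number;
	top_p: number;
}

interface ConversationMessage {
	role: "user" | "assistant";
	content: string;
}

// Conversation state with the system message carried alongside the
// message list, as suggested by "System message as part of Conversation".
interface Conversation {
	model: string;
	systemMessage: string;
	messages: ConversationMessage[];
	config: GenerationConfig;
	streaming: boolean;
}
```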