# Fluently v4 LCM - Onnx Olive DirectML Optimized

## Original Model

https://huggingface.co/fluently/Fluently-v4-LCM

## C# Inference Demo

https://github.com/TensorStack-AI/OnnxStack

```csharp
// Create Pipeline
var pipeline = LatentConsistencyPipeline.CreatePipeline("D:\\Models\\Fluently-v4-LCM-onnx");

// Prompt
var promptOptions = new PromptOptions
{
    Prompt = "Illustrate a cheerful barista preparing a cup of coffee behind the counter of a cozy cafe."
};

// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions);

// Save Image Result
await result.SaveAsync("Result.png");
```
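
Latent Consistency Models are distilled to converge in very few denoising steps with little or no classifier-free guidance, so it can be worth passing explicit scheduler settings instead of relying on the pipeline defaults. The continuation below is a hedged sketch, not a verified sample: it assumes OnnxStack exposes a `SchedulerOptions` type (with `InferenceSteps`, `GuidanceScale`, and `Seed` properties) and a `GenerateImageAsync` overload that accepts it, and it reuses the `pipeline` and `promptOptions` objects from the snippet above. Check the OnnxStack repository for the exact type and member names.

```csharp
// Hedged sketch: SchedulerOptions and this GenerateImageAsync overload are assumed
// from OnnxStack's API; verify the exact names against the OnnxStack repository.
var schedulerOptions = new SchedulerOptions
{
    InferenceSteps = 6,   // LCM checkpoints typically converge in roughly 4-8 steps
    GuidanceScale = 1.0f, // LCM needs little or no classifier-free guidance
    Seed = 42             // fixed seed for a reproducible result
};

// Run pipeline with explicit LCM-friendly scheduler settings
var lcmResult = await pipeline.GenerateImageAsync(promptOptions, schedulerOptions);

// Save Image Result
await lcmResult.SaveAsync("Result_LCM.png");
```
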
## Inference Result

![Intro Image](https://huggingface.co/TensorStack/Fluently-v4-LCM-onnx/resolve/main/Sample.webp)