This model was exported from ibm-granite/granite-3b-code-instruct-128k using Optimum, with ONNX Runtime's basic graph optimizations applied.

The repository owner maintains this model for use with Transformers.js and DirectML. Support for the external data format, which this model requires, was added in Transformers.js 3.4.0.

Trial

This model is currently being tested as follows:

```js
import { pipeline } from "@huggingface/transformers";

const generator = await pipeline(
  "text-generation",
  "kazssym/granite-3b-code-instruct-128k-onnx",
  {
    device: "dml",
    use_external_data_format: true,
    session_options: {
    },
  },
);
```