This is a Llama3 model uploaded with the KerasHub library. It can be run on the JAX, TensorFlow, and PyTorch backends and is intended for the CausalLM task.
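A minimal loading-and-generation sketch, assuming `keras-hub` is installed and using a placeholder Hub handle (`hf://<user>/<model>` stands in for this model's actual repo id, which is not given in the card):

```python
# Sketch: load this Llama3 causal-LM with KerasHub.
# The handle "hf://<user>/<model>" is a placeholder for this repo's id.
import os

# Select any of the three supported backends before Keras is imported.
os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow", "torch"

import keras_hub

# Load the task model (backbone + tokenizer + generation head) from the Hub.
causal_lm = keras_hub.models.Llama3CausalLM.from_preset("hf://<user>/<model>")

# Generate text from a prompt.
print(causal_lm.generate("What is Keras?", max_length=64))
```

The backend only needs to be set once, via the `KERAS_BACKEND` environment variable, before the first Keras import; the same checkpoint then runs unchanged on JAX, TensorFlow, or PyTorch.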
Model config:
- name: llama3_backbone
- trainable: True
- vocabulary_size: 128256
- num_layers: 16
- num_query_heads: 32
- hidden_dim: 2048
- intermediate_dim: 8192
- rope_max_wavelength: 10000
- rope_scaling_factor: 1.0
- num_key_value_heads: 8
- layer_norm_epsilon: 1e-06
- dropout: 0
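The config above pins down the model's shape. A back-of-envelope sketch of the derived dimensions and parameter count, assuming the standard Llama 3 layout (grouped-query attention without projection biases, a gated three-matrix MLP, two RMSNorms per layer, and tied input/output embeddings as in Llama 3.2 1B) — an estimate, not a count read from the weights:

```python
# Derived dimensions and a rough parameter estimate from the config above.
# Assumes standard Llama 3 structure; not read from the actual checkpoint.
VOCAB = 128_256
LAYERS = 16
Q_HEADS = 32
KV_HEADS = 8
HIDDEN = 2_048
INTERMEDIATE = 8_192

head_dim = HIDDEN // Q_HEADS            # per-head width
queries_per_kv = Q_HEADS // KV_HEADS    # query heads sharing each KV head
kv_dim = KV_HEADS * head_dim            # width of the K and V projections

attn = 2 * HIDDEN * HIDDEN + 2 * HIDDEN * kv_dim  # q, o + k, v projections
mlp = 3 * HIDDEN * INTERMEDIATE                   # gate, up, down matrices
norms = 2 * HIDDEN                                # two RMSNorm scale vectors
per_layer = attn + mlp + norms

# Tied embeddings: one vocab-sized matrix, plus the final RMSNorm.
total = LAYERS * per_layer + VOCAB * HIDDEN + HIDDEN
print(f"head_dim={head_dim}, queries_per_kv={queries_per_kv}, "
      f"params~{total / 1e9:.2f}B")
# head_dim=64, queries_per_kv=4, params~1.24B
```

With 32 query heads but only 8 key/value heads, each KV head serves a group of 4 query heads, which shrinks the KV cache by 4x relative to full multi-head attention.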
This model card has been generated automatically and should be completed by the model author. See Model Cards documentation for more information.
Downloads last month: 33
HF Inference deployability: the HF Inference API does not support text-generation models for the keras-hub library.