Models
Models are combinations of tf.keras
layers and models that can be trained.
Several pre-built canned models are provided to train encoder networks. These models are intended as both convenience functions and canonical examples.
BertClassifier
implements a simple classification model containing a single classification head using the Classification network. It can be used as a regression model as well.
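As a rough illustration, here is a minimal sketch of building a BertClassifier on top of a small BertEncoder, assuming the tensorflow-models-official pip package (imported as tensorflow_models); the vocabulary size, layer count, and dummy inputs are placeholders, not a released configuration.

```python
import numpy as np
import tensorflow_models as tfm

# Small BERT-style encoder network (sizes are illustrative only).
encoder = tfm.nlp.networks.BertEncoder(vocab_size=100, num_layers=2)

# Attach a single classification head on top of the encoder.
classifier = tfm.nlp.models.BertClassifier(network=encoder, num_classes=2)

# Dummy batch of 3 sequences of length 16: token ids, padding mask, segment ids.
inputs = dict(
    input_word_ids=np.random.randint(100, size=(3, 16), dtype=np.int32),
    input_mask=np.ones((3, 16), dtype=np.int32),
    input_type_ids=np.zeros((3, 16), dtype=np.int32))

logits = classifier(inputs)  # shape: (3, 2), one logit per class
```

Because the result is a regular tf.keras.Model, it can be compiled and trained with model.compile() and model.fit() like any other Keras model.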
BertTokenClassifier
implements a simple token classification model containing a single classification head over the sequence output embeddings.
BertSpanLabeler
implements a simple single-span start-end predictor (that is, a model that predicts two values: a start token index and an end token index), suitable for SQuAD-style tasks.
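As a sketch (same assumptions as above, and note that the exact output packaging may differ between library versions), BertSpanLabeler wraps an encoder and returns per-token start and end logits:

```python
import numpy as np
import tensorflow_models as tfm

# Reuse a small encoder; BertSpanLabeler adds a span-labeling head on top of it.
encoder = tfm.nlp.networks.BertEncoder(vocab_size=100, num_layers=2)
span_labeler = tfm.nlp.models.BertSpanLabeler(network=encoder)

inputs = dict(
    input_word_ids=np.random.randint(100, size=(3, 16), dtype=np.int32),
    input_mask=np.ones((3, 16), dtype=np.int32),
    input_type_ids=np.zeros((3, 16), dtype=np.int32))

# Per-token start and end logits, each of shape (3, 16); for SQuAD-style
# training these are compared against the gold start/end token indices.
start_logits, end_logits = span_labeler(inputs)
```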
BertPretrainer
implements a masked LM and a classification head using the Masked LM and Classification networks, respectively.
DualEncoder
implements a dual encoder model, suitable for retrieval tasks.
Seq2SeqTransformer
implements the original Transformer model for seq-to-seq tasks.
T5Transformer
implements a standalone T5 model for seq-to-seq tasks. The models are compatible with the released T5 architecture and converted checkpoints. The modules are implemented as tf.Module. To use them with Keras, users can wrap them within custom Keras layers, i.e. define the modules inside the __init__ of a Keras layer and call them in call.
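For instance, the wrapping pattern could look like the sketch below, where ScaleModule and ModuleWrapperLayer are purely illustrative names (a toy tf.Module stands in for the T5 modules):

```python
import tensorflow as tf


class ScaleModule(tf.Module):
  """A toy tf.Module standing in for a T5-style module (illustrative only)."""

  def __init__(self, units, name=None):
    super().__init__(name=name)
    self.scale = tf.Variable(tf.ones([units]), name='scale')

  def __call__(self, x):
    return x * self.scale


class ModuleWrapperLayer(tf.keras.layers.Layer):
  """Keras layer that defines a tf.Module in __init__ and invokes it in call."""

  def __init__(self, units, **kwargs):
    super().__init__(**kwargs)
    # Define the module inside __init__ ...
    self.module = ScaleModule(units)

  def call(self, inputs):
    # ... and call it inside call().
    return self.module(inputs)


layer = ModuleWrapperLayer(units=8)
outputs = layer(tf.random.uniform([2, 8]))  # shape: (2, 8)
```

The same pattern applies to the T5 modules: construct them in the layer's __init__ and route the layer's call arguments through to the wrapped module.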