DistilBERT
- Original link: https://keras.io/api/keras_nlp/models/distil_bert/
- Last checked: 2024-11-26
Models, tokenizers, and preprocessing layers for DistilBERT, as described in “DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter”.
For a full list of available presets, see the models page.
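As a minimal sketch of how these components are typically used together (the preset name `distil_bert_base_en_uncased` is assumed here for illustration; consult the models page for the presets actually available in your KerasNLP version):

```python
import keras_nlp

# Load a DistilBERT classifier together with its matching preprocessor
# from a preset checkpoint (preset name assumed for this sketch).
classifier = keras_nlp.models.DistilBertClassifier.from_preset(
    "distil_bert_base_en_uncased",
    num_classes=2,
)

# Because the preprocessor is bundled, the model consumes raw strings directly.
features = ["The movie was great!", "The movie was terrible."]
labels = [1, 0]
classifier.fit(x=features, y=labels, batch_size=2)

# Predict class logits for new text.
classifier.predict(["A surprisingly enjoyable film."])
```

The same `from_preset` pattern applies to the lower-level pieces (e.g. `DistilBertBackbone`, `DistilBertTokenizer`, `DistilBertPreprocessor`) when you want to assemble a custom pipeline instead of using the end-to-end classifier.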