Bert
- Original link: https://keras.io/api/keras_nlp/models/bert/
- Last checked: 2024-11-26
Models, tokenizers, and preprocessing layers for BERT, as described in “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”.
For a full list of available presets, see the models page.