RoBERTa
- Original link: https://keras.io/api/keras_nlp/models/roberta/
- Last checked: 2024-11-26
Models, tokenizers, and preprocessing layers for RoBERTa, as described in “RoBERTa: A Robustly Optimized BERT Pretraining Approach”.
For a full list of available presets, see the models page.
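As a quick orientation, the models, tokenizers, and preprocessing layers above are typically loaded together through a task-level `from_preset()` call. The sketch below assumes the `keras_nlp` package is installed and uses the `roberta_base_en` preset name as an illustrative example; other preset names are listed on the models page. Note that `from_preset()` downloads pretrained weights on first use.

```python
import keras_nlp

# Load a RoBERTa classifier from a preset; the matching tokenizer and
# preprocessing layer are attached automatically. "roberta_base_en" is
# used here as an example preset name.
classifier = keras_nlp.models.RobertaClassifier.from_preset(
    "roberta_base_en",
    num_classes=2,
)

# The classifier accepts raw strings; preprocessing runs inside the model.
predictions = classifier.predict(["What an amazing movie!"])
```

For lower-level control, the backbone, tokenizer, and preprocessor can also be instantiated separately via `keras_nlp.models.RobertaBackbone` and `keras_nlp.models.RobertaTokenizer`, then composed by hand.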