XLM-RoBERTa
- Original link: https://keras.io/api/keras_nlp/models/xlm_roberta/
- Last checked: 2024-11-26
Models, tokenizers, and preprocessing layers for XLM-RoBERTa, as described in “Unsupervised Cross-lingual Representation Learning at Scale”.
For a full list of available presets, see the models page.