Rami Al-Rfou
Noah Constant
Latest
SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models
The Power of Scale for Parameter-Efficient Prompt Tuning
nmT5: Is Parallel Data Still Relevant for Pre-training Massively Multilingual Language Models?
mT5: A massively multilingual pre-trained text-to-text transformer
LAReQA: Language-agnostic answer retrieval from a multilingual pool
Bridging the Gap for Tokenizer-Free Language Models
Character-Level Language Modeling with Deeper Self-Attention