Colin Raffel: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful t...