Meet the Transformers: GPT-2, Megatron, Turing
Author: Przemek Chojecki
Uploaded: 2020-05-11
Views: 587
Description:
Transformers are powerful deep learning models well suited to text generation. The breakthrough came with OpenAI's GPT-2:
/ gpt-2-megatron-turing-natural-language-gen...
GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. It is trained with a simple objective: predict the next word, given all of the previous words within some text. Thanks to the diversity of the dataset, this simple goal contains naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.
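The "predict the next word, given all of the previous words" objective can be illustrated with a toy next-word predictor. The sketch below is a hypothetical bigram counter for intuition only; the real GPT-2 is a transformer trained over subword tokens, and the corpus, function names, and prediction rule here are illustrative assumptions, not GPT-2's actual method.

```python
from collections import defaultdict, Counter

def train_bigram(corpus):
    # Count, for each word, which words follow it in the training text.
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev):
    # Return the most frequent continuation seen after `prev`,
    # or None if the word was never seen in training.
    if prev not in counts:
        return None
    return counts[prev].most_common(1)[0][0]

# Hypothetical miniature "dataset" standing in for the 8M web pages.
corpus = [
    "the model predicts the next word",
    "the next word follows the context",
]
model = train_bigram(corpus)
print(predict_next(model, "next"))  # prints "word"
```

Scaling this idea up — replacing counts with a 1.5B-parameter transformer and the toy corpus with 8 million web pages — is what lets the same simple objective absorb demonstrations of many tasks.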
Visit us at: www.datasciencerush.com
#gpt2 #megatron #turing