Paper Digest - GPT-1, GPT-2, and GPT-3 Behind ChatGPT
Posted by SpikeKing
Welcome to follow my CSDN: https://spike.blog.csdn.net/
Original post: https://blog.csdn.net/caroline_wendy/article/details/128909400
GPT, GPT-2, GPT-3: Generative Pre-trained Transformer
- Wiki: https://en.wikipedia.org/wiki/GPT-3
- GPT-3 Demo: https://gpt3demo.com/
Timeline (a minimal generation sketch follows the list):
- Transformer, 2017.6, Attention Is All You Need
- GPT, 2018.6, Improving Language Understanding by Generative Pre-Training: uses the Transformer decoder to pre-train a language model on unlabeled text
- BERT, 2018.10, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding: Bidirectional Encoder Representations from Transformers, built on the Transformer encoder
- GPT-2, 2019.2, Language Models are Unsupervised Multitask Learners: a larger decoder-only model trained on more data, evaluated zero-shot on downstream tasks without fine-tuning
- GPT-3, 2020.5, Language Models are Few-Shot Learners: scales the same architecture to 175B parameters and performs tasks from a few in-context examples, without gradient updates
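
As a concrete illustration of what "generative pre-trained Transformer" means in practice, here is a minimal sketch, assuming the Hugging Face transformers library and the public gpt2 checkpoint (neither is mentioned in the original post). The decoder-only model, pre-trained on unlabeled text, continues a prompt one token at a time:

```python
# Minimal sketch; assumes `pip install transformers torch` and the public "gpt2" checkpoint.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# The decoder-only model generates autoregressively: each new token is
# predicted from the tokens to its left (causal attention), the
# pre-training objective shared by GPT, GPT-2, and GPT-3.
inputs = tokenizer("Improving language understanding by", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This left-to-right, causal setup is the key contrast with BERT in the timeline above: BERT's encoder attends bidirectionally and is fine-tuned per task, while the GPT decoder's autoregressive objective lets the same model generate text directly.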