
ELMo, GPT, BERT

[CS224N-2019W] 13. Contextual Word Representations and Pretraining

https://gnoej671.tistory.com/51

 

 

Comparison between BERT, GPT-2 and ELMo

https://medium.com/@gauravghati/comparison-between-bert-gpt-2-and-elmo-9ad140cd1cda

 

 

The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
http://jalammar.github.io/illustrated-bert/
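
The posts above all hinge on the same idea: unlike static word vectors (word2vec, GloVe), ELMo, GPT, and BERT give the same word a different vector in every sentence. A minimal sketch of that effect, assuming the Hugging Face transformers and torch packages; the bert-base-uncased checkpoint and the word "bank" are illustrative choices, not taken from the linked posts:

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def bank_vector(sentence):
    # Hidden state of the token "bank" from the last encoder layer.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index("bank")]

v1 = bank_vector("he sat on the river bank")
v2 = bank_vector("she deposited cash at the bank")
# Well below 1.0: the same surface word gets context-dependent representations.
print(torch.cosine_similarity(v1, v2, dim=0).item())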


08-3: ELMo
https://youtu.be/zV8kIUwH32M


08-4: GPT
https://youtu.be/o_Wl29aW5XM


08-5: BERT
https://youtu.be/IwtexRHoWG0


pilsung-kang/Text-Analytics/08 Seq2Seq Learning and Pre-trained Models/
https://github.com/pilsung-kang/Text-Analytics/tree/master/08%20Seq2Seq%20Learning%20and%20Pre-trained%20Models



A Close Look at BERT
http://docs.likejazz.com/bert/

 


[NLP | Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Part 1
https://velog.io/@xuio/NLP-%EB%85%BC%EB%AC%B8%EB%A6%AC%EB%B7%B0-BERT-Pre-training-of-Deep-Bidirectional-Transformers-forLanguage-Understanding-%EC%83%81%ED%8E%B8

 

 

[NLP | Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Part 2
https://velog.io/@xuio/NLP-%EB%85%BC%EB%AC%B8%EB%A6%AC%EB%B7%B0-BERT-Pre-training-of-Deep-Bidirectional-Transformers-for-Language-Understanding-%ED%95%98%ED%8E%B8

 

 

 

[ELMo] Deep Contextualized Word Representations
https://hugrypiggykim.com/2018/06/08/elmo-deep-contextualized-word-representations/

 

 

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
https://hugrypiggykim.com/2018/12/09/bert-pre-training-of-deep-bidirectional-transformers-for-language-understanding/


The Illustrated Transformer
http://jalammar.github.io/illustrated-transformer/


BERT Explained: State of the art language model for NLP
https://towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp-f8b21a9b6270


[PyTorch][BERT] Understanding the BERT Source Code
https://hyen4110.tistory.com/87


[PyTorch][BERT] Understanding the BERT Source Code ② BertConfig
https://hyen4110.tistory.com/88


[PyTorch][BERT] Understanding the BERT Source Code ③ BertTokenizer
https://hyen4110.tistory.com/89


[PyTorch][BERT] Understanding the BERT Source Code ④ BertModel
https://hyen4110.tistory.com/90


[PyTorch][BERT] Understanding the BERT Source Code ⑤ BertEmbedding
https://hyen4110.tistory.com/91


[PyTorch][BERT] Understanding the BERT Source Code ⑥ BertEncoder
https://hyen4110.tistory.com/99


[PyTorch][BERT] Understanding the BERT Source Code ⑦ BertPooler
https://hyen4110.tistory.com/102


[PyTorch][BERT] Understanding the BERT Source Code ⑧ BERT Model Inputs
https://hyen4110.tistory.com/103


[PyTorch][BERT] Understanding the BERT Source Code ⑨ BERT Model Outputs
https://hyen4110.tistory.com/104


[PyTorch][BERT] Understanding the BERT Source Code ⑩ BertLayer
https://hyen4110.tistory.com/105


[PyTorch][BERT] Understanding the BERT Source Code ⑪ BertSelfAttention
https://hyen4110.tistory.com/106


[PyTorch][BERT] Understanding the BERT Source Code ⑫ BertSelfOutput
https://hyen4110.tistory.com/107
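
For readers following the source-code series above, a short sketch that loads the corresponding classes from the Hugging Face transformers package and inspects the components the posts walk through (embeddings, encoder, pooler); it assumes transformers and torch are installed and uses bert-base-uncased only as an example checkpoint:

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

print(model.config)              # BertConfig: hidden_size, num_hidden_layers, ...
print(type(model.embeddings))    # BertEmbeddings: token + position + segment embeddings
print(type(model.encoder))       # BertEncoder: a stack of BertLayer modules
print(type(model.pooler))        # BertPooler: transforms the [CLS] hidden state

inputs = tokenizer("Understanding the BERT source code", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768): per-token vectors
print(outputs.pooler_output.shape)      # (1, 768): pooled [CLS] representation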

 

 

BERT Research - Ep. 1 - Key Concepts & Sources
https://youtu.be/FKlPCK1uFrc

 

Applying BERT to Question Answering (SQuAD v1.1)
https://www.youtube.com/watch?v=l8ZYCvgGu0o
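
As a companion to the video above, a minimal extractive-QA sketch using the transformers pipeline API; the distilbert-base-cased-distilled-squad checkpoint is an assumed stand-in for a SQuAD-finetuned BERT, and the question/context pair is made up for illustration:

from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")  # assumed SQuAD checkpoint
result = qa(
    question="What does BERT stand for?",
    context="BERT stands for Bidirectional Encoder Representations from "
            "Transformers and was introduced by Google in 2018.",
)
# The model predicts a start/end span inside the context, as in SQuAD v1.1.
print(result["answer"], result["score"])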

 

 

What is GPT-1 (Improving Language Understanding by Generative Pre-Training)? With a step-by-step explanation of the vector flow
https://velog.io/@gypsi12/GPT-1Improving-languague-understanding-by-Generative-Pre-Training%EB%9E%80%EB%B2%A1%ED%84%B0-%ED%9D%90%EB%A6%84-%ED%95%98%EB%82%98%ED%95%98%EB%82%98-%EC%9E%90%EC%84%B8%ED%95%98%EA%B2%8C-%EC%84%A4%EB%AA%85

 

[GPT-2] Language Models are Unsupervised Multitask Learners

https://velog.io/@stapers/GPT-2-Language-Models-are-Unsupervised-Multitask-Learners

 

 

OpenAI GPT-2 - Language Models are Unsupervised Multitask Learners (GPT-2 paper explained)

https://greeksharifa.github.io/nlp(natural%20language%20processing)%20/%20rnns/2019/08/28/OpenAI-GPT-2-Language-Models-are-Unsupervised-Multitask-Learners/

 

 

[NLP][Paper Review] GPT-2: Language Models are Unsupervised Multitask Learners

https://supkoon.tistory.com/25

 

 

[Paper Review] Differences and Limitations of GPT-1, GPT-2, and GPT-3

https://velog.io/@jus6886/%EB%85%BC%EB%AC%B8%EB%A6%AC%EB%B7%B0-GPT1-GPT2-GPT3-%EC%B0%A8%EC%9D%B4%EC%99%80-%ED%95%9C%EA%B3%84

 

 

[NLP] GPT-2, GPT-3, ALBERT, ELECTRA

https://amber-chaeeunk.tistory.com/98
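
The GPT posts above describe decoder-only models trained purely on next-token prediction, so inference is just left-to-right generation. A minimal sketch, assuming a reasonably recent transformers version; the small gpt2 checkpoint, the prompt, and greedy decoding are illustrative choices:

from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("Language models are unsupervised multitask learners because",
                   return_tensors="pt")
# Greedy left-to-right decoding; there is no encoder and no bidirectional attention.
output_ids = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))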

 


Tokenizer Summary (BPE, WordPiece, SentencePiece)
https://velog.io/@gypsi12/%ED%86%A0%ED%81%AC%EB%82%98%EC%9D%B4%EC%A0%80-%EC%A0%95%EB%A6%ACBPEWordPieceSentencePiece

 

 

5. Byte-Pair Encoding (BPE) Tokenization

https://wikidocs.net/166825
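
To make the BPE write-ups above concrete, here is a sketch of the classic merge-learning loop: count adjacent symbol pairs, merge the most frequent pair into a new symbol, and repeat. The toy word-frequency table is the standard illustrative example, not data taken from the linked pages:

import re
import collections

def get_stats(vocab):
    # Count frequencies of adjacent symbol pairs across the corpus vocabulary.
    pairs = collections.defaultdict(int)
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_vocab(pair, vocab):
    # Replace every occurrence of the chosen pair with its merged symbol.
    bigram = re.escape(" ".join(pair))
    pattern = re.compile(r"(?<!\S)" + bigram + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Words are pre-split into characters; </w> marks the end of a word.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

for step in range(10):
    pairs = get_stats(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)   # most frequent pair becomes a new symbol
    vocab = merge_vocab(best, vocab)
    print(f"merge {step + 1}: {best}")
# First merges: ('e', 's'), ('es', 't'), ('est', '</w>'), ('l', 'o'), ('lo', 'w'), ...

WordPiece and SentencePiece differ mainly in how the next merge is scored and in how raw text is pre-segmented, which is what the linked summary compares.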

 

 

 
