What Is Attention in Language Models?
Luis Serrano
A major roadblock for language models is that a single word can carry different meanings in different contexts. To resolve this ambiguity, the model must use the surrounding words in the sentence to decide which meaning applies. This is precisely what self-attention mechanisms do: each word's representation is updated using the context provided by the other words.
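This context-mixing step can be sketched as a minimal scaled dot-product self-attention pass. The code below is a simplified illustration, not a production implementation: the toy embedding values are made up for this example, and the queries, keys, and values are the raw embeddings themselves, with no learned projection matrices.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Minimal scaled dot-product self-attention.

    X: (seq_len, d) matrix of token embeddings. For simplicity,
    queries, keys, and values are all X (no learned projections).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)       # similarity of each token to every other token
    weights = softmax(scores, axis=-1)  # each row is a distribution over the sequence
    return weights @ X                  # each output row is a context-weighted mix

# Hypothetical 2-D embeddings: an ambiguous word, a related context word,
# and an unrelated word.
X = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.0, 1.0]])
out = self_attention(X)
print(out.shape)  # (3, 2)
```

Because the attention weights are highest for similar tokens, the ambiguous word's output vector is pulled toward the context word it resembles, which is how context disambiguates meaning.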