What Is Attention in Language Models?

Luis Serrano

Feb 13, 2023


A major challenge for language models is that the same word can carry different meanings in different contexts. When this happens, the model must use the surrounding words in the sentence to decide which meaning applies. This is precisely what self-attention does.
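As a minimal sketch of this idea (not the exact mechanism any particular model uses), self-attention can be seen as rebuilding each word's embedding as a similarity-weighted average of all the embeddings in the sentence, so each word's representation absorbs its context. The toy embeddings below are purely illustrative:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X):
    """Simplest form of self-attention: each word's new embedding is a
    weighted average of every embedding in the sentence, where the
    weights come from dot-product similarity between word pairs."""
    scores = X @ X.T / np.sqrt(X.shape[-1])  # similarity of every word pair
    weights = softmax(scores)                # each row sums to 1
    return weights @ X, weights              # context-aware embeddings

# Hypothetical 2-D embeddings for a 3-word sentence: words 0 and 1 are
# similar in meaning, word 2 is unrelated.
X = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.0, 1.0]])
new_X, weights = self_attention(X)
```

After this step, word 0's weights favor the similar word 1 over the unrelated word 2, so its new embedding is pulled toward the context that matches its meaning.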
