TensorLearn
LLM Engineering: Transformers & RAG
Module 1 of 12

1. Architecture: Attention is All You Need

1. The Bottleneck

RNNs, the previous generation of sequence models, read text one token at a time, carrying context forward through a hidden state. Over a long sentence, information from the earliest tokens degrades before the end is reached. Transformers avoid this bottleneck by processing the entire sentence at once.
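The sequential dependency is easy to see in code. Below is a minimal sketch of a vanilla RNN step loop (toy dimensions, random weights, not a trained model): each hidden state depends on the previous one, so the loop cannot be parallelized across tokens, and early-token information must survive every intermediate `tanh` update to reach the end.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                  # hidden size (toy value)
seq = rng.normal(size=(10, d))        # 10 token embeddings

W_h = rng.normal(size=(d, d)) * 0.1   # hidden-to-hidden weights
W_x = rng.normal(size=(d, d)) * 0.1   # input-to-hidden weights

h = np.zeros(d)
for x in seq:
    # Each step consumes the previous hidden state: strictly sequential.
    # Information from early tokens is repeatedly squashed and mixed,
    # which is why it fades over long sequences.
    h = np.tanh(W_h @ h + W_x @ x)
```

A Transformer layer, by contrast, computes its outputs for all 10 positions in a single batched matrix operation.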

2. Self-Attention

Every token computes an attention score against every other token in the sequence to resolve its meaning from context. Consider "river bank" vs. "bank account": the word "bank" attends to "river" or "account" to disambiguate itself.
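The "every token looks at every other token" idea can be sketched as single-head scaled dot-product attention. This is a toy NumPy version with random (untrained) projection weights and made-up sizes, just to show the shape of the computation: each row of the attention matrix holds one token's weights over all tokens.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                           # embedding size (toy value)
X = rng.normal(size=(5, d))     # embeddings for a 5-token sentence

# Query/key/value projections (random here; learned in a real model)
W_q, W_k, W_v = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# (5, 5) score matrix: entry [i, j] is how much token i attends to token j
scores = Q @ K.T / np.sqrt(d)

# Softmax over each row so every token's weights sum to 1
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Each output vector is a context-weighted mix of all value vectors
out = weights @ V
```

In the "river bank" example, the row of `weights` for "bank" would place high mass on "river", so `out` for "bank" blends in the "river" representation.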

