Dot products and duality | Chapter 9, Essence of linear algebra

Why the formula for dot products matches their geometric intuition.

Dot products are a nice geometric tool for understanding projection. But now that we know about linear transformations, we can get a deeper feel for what’s going on with the dot product, and the connection between its numerical computation and its geometric interpretation.

47 thoughts on “Dot products and duality | Chapter 9, Essence of linear algebra”

  1. I think I have watched the part between 6:30 and 10:00 at least 8 times, but it finally clicked and made geometric sense.
    Thank you for the cool new way to look at linear algebra. I love how it gives us a more intuitive way to think about vectors.

  2. Is this summary correct? A linear transformation defined as projecting any vector from a higher dimension onto the number line and then scaling (by a scalar b) is equivalent to taking the dot product of that vector with the unit vector of the number line scaled by b. This equivalence follows from the computational similarity between the two procedures.
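
    That equivalence is easy to sanity-check numerically. A minimal Python sketch (the vectors are made-up examples; u is chosen along the x-axis so the projection length can be read off directly rather than computed with the dot formula itself):

```python
import math

# u lies along the x-axis, so the signed length of v's projection
# onto u's line is simply v's x-coordinate. All numbers are made up.
v = (3.0, 4.0)
u = (2.0, 0.0)

dot = v[0] * u[0] + v[1] * u[1]   # numerical dot product: 3*2 + 4*0 = 6
proj_len = v[0]                   # geometric projection length of v onto u's line
u_len = math.hypot(u[0], u[1])    # |u| = 2

# geometric reading: (projection length) * (length of u) equals the dot product
assert math.isclose(dot, proj_len * u_len)
```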

  3. I watched this video 4 years ago in high school and didn't really understand why you insisted that 1×2 matrices and vectors in 2-dimensional space had such a profound correspondence.

    My only takeaway at the time was that this video proved that the numerical and geometric views of the dot product were the same. That was cool.

    I'm a math major now, and I realize you were planting the seed for dual vectors in dual vector spaces! There is more here than meets the eye, and this video is beautiful.

  4. I had the same question about the relation between the dot product and projection before starting this video, and I had a smile on my face when Grant raised exactly that question in the middle and explained it. That's the inspiring thing about this channel: diving deep into the why of these relations 🙂

  5. Grant, I think it would be a better idea if you called 1D vectors "1D vectors" and not "numbers". It just throws me off a bit. Maybe it's just me.

  6. Hello,
    Can someone explain why, at 03:48, 2(v · w) = (2v) · w from a geometric point of view? From his explanation I understand that scaling one vector does not change the projection of a vector onto it. But I can't understand why the formula holds.
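
    On the formula side, scaling v by 2 doubles each coordinate, so every term in the sum a1*b1 + a2*b2 doubles. A throwaway Python check (the example vectors are arbitrary):

```python
# Check the scaling property numerically: (2v) . w == 2 (v . w).
# The vectors below are made-up examples.
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

v = (1.0, 3.0)
w = (4.0, -2.0)
v2 = (2 * v[0], 2 * v[1])

assert dot(v2, w) == 2 * dot(v, w)   # each term a_i * b_i simply doubles
```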

  7. 🤯 Just awesome to see it play out like that. Maths could have been so much fun at school and university with a teacher like you. Or even access to your great visualizations. But hey – life long learning 😉

  8. Basically, the projection process is the dot product. It is also like doing an LT (linear transformation) process, because the projection gives the proportions of the LT and the subsequent multiplication gives the scale/magnitude of the LT. Now the algebra of this LT process, for vectors (a, b) and (c, d), is ac + bd. So the dot product must be ac + bd.

  9. @Inaugurated

    This is an answer to the clean explanation given by "Inaugurated" a few messages below. Your summary was really good, and I suggest that anyone who is having difficulty understanding this topic read it. However, I was still struggling with one specific point, which I could eventually clarify. I am writing this in the hope it might help in case anyone has the same difficulty.

    Here is the part of his message I'd like to detail: "Note that ANY vector, v, in the original 2D space is a linear combination of the basis vectors i-hat and j-hat. After the transformation, this still holds true: v = c1 i-hat + c2 j-hat and L(v) = c1 L(i-hat) + c2 L(j-hat). So, because L(i-hat) and L(j-hat) are projections of i-hat and j-hat, L(v) is a projection of v onto the number line!"

    At first, I had difficulty seeing that very last step as a directly obvious fact, but here is why we can indeed assume that it is true:

    What is important to emphasize here is the property by which a linear transformation (and orthogonal projection is a linear transformation) of a linear combination of components is equal to the linear combination of the transformations of those same components. That property can be visualised as: "scaling and adding some components and then applying a transformation to the result is the same as transforming each component, then scaling and adding them".

    Here, we call our linear transformation L, this transformation being the orthogonal projection onto the line. On the other hand, v is any vector expressed (as it always can be) as a linear combination of the basis vectors i-hat and j-hat; let's write: v = c1*i-hat + c2*j-hat

    So with that in mind, it is correct to say that L(v) = L(c1*i-hat + c2*j-hat) (the projection of v onto the line) is the same as c1*L(i-hat) + c2*L(j-hat) (the sum of the projections of i-hat and j-hat onto the line, scaled by c1 and c2 respectively).

    So indeed, the whole reasoning clicks into place!
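
    That linearity property can also be checked numerically. A small Python sketch (the unit vector u and the coefficients c1, c2 are made-up examples; the projection onto u's line is written as a single signed length):

```python
import math

# Orthogonal projection onto the line spanned by the unit vector u,
# expressed as a signed length (one number on that line).
u = (0.6, 0.8)                       # a unit vector: 0.36 + 0.64 = 1

def L(v):
    return v[0] * u[0] + v[1] * u[1]

c1, c2 = 3.0, -2.0
v = (c1 * 1.0 + c2 * 0.0, c1 * 0.0 + c2 * 1.0)   # c1*i-hat + c2*j-hat

# L(c1*i-hat + c2*j-hat) equals c1*L(i-hat) + c2*L(j-hat)
assert math.isclose(L(v), c1 * L((1.0, 0.0)) + c2 * L((0.0, 1.0)))
```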

  10. Ok, this is the second “dot product” YouTube video that I have watched, and neither one relates the use of a dot product to reality. I can easily add products of two vectors to get a scalar… so what… what does it have to do with the price of eggs in China? What's the application… praxis???

  11. I have been able to grasp all other videos preceding this but I don't get this one at all.
    Can someone please explain the following:

    1. What is the effect of a dot product when considering Linear Transformations?
    2. What does the scalar output (1-D, on the number line) from the dot product actually tell or suggest about the two vectors that we are performing the dot product on?
    3. In previous videos, an intuitive understanding was given for different vector/matrix operations. For example, matrix-vector multiplication means transforming vector to a different space while Linear combination means using the basis vectors of a space to output a vector in that space. I am looking for a similar intuitive understanding of dot product's impact on the two vectors.
    4. Also, I was just completely lost at the projection part.

    Please point me to a good resource to understand all of it, if possible.

  12. It's funny, I never really understood (the computation of) matrix multiplication until I learned about dot products (I did matrix multiplication several times in middle and high school, learned what I needed, and subsequently forgot). Yet here you intentionally put off dot products because matrix multiplication is the best way to understand (the intuition of) dot products.

    When I've taught matrix multiplication as part of ACT prep the way I was taught, students don't retain it. When I've taught dot products first then show them that matrix multiplication is a bunch of dot products, they do retain it, though they don't get it. So I stand by those who teach dot products first.

    I actually came to this series trying to find a way to answer why matrix multiplication is a bunch of dot products, because while an easy concept to recall computation, it really comes from left field conceptually. Ironically, my distillation of this video is not that the intuition of matrix multiplication comes from dot products, but rather that dot products come from matrix multiplication!
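
    The observation that matrix multiplication is "a bunch of dot products" can be made concrete with a tiny Python sketch (the matrices are arbitrary examples):

```python
# Entry (i, j) of the product A*B is the dot product of
# row i of A with column j of B. A and B are made-up examples.
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

product = [[dot(A[i], [B[0][j], B[1][j]]) for j in range(2)]
           for i in range(2)]

assert product == [[19, 22], [43, 50]]
```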

  13. Okay, I am really confused. How can you multiply the two column vectors in the example at the beginning of the video? The first vector contains 1 column and the second vector contains 3 rows. I thought the rule stated that the number of columns of the first matrix should equal the number of rows of the second for multiplication to be possible.

  14. What I did not get is why I need this number in the first place: what does projecting a vector onto another one and multiplying the lengths represent?

  15. This is the only one in the series that I really don't get. It feels like we are just saying things that are obvious and I'm missing the significance.

  16. SO the main phrase is: any time you have a 1×2 linear transformation, there is a vector associated with it, such that applying the 1×2 transformation to a vector is the same thing as taking the dot product of that vector with the vector associated to the transformation, whose coordinates are where i-hat and j-hat land, which he explains more briefly at the end. That is what you need to understand.
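
    A minimal Python check of that association (the matrix entries and the input vector are made-up examples):

```python
# A 1x2 matrix [a b] applied to (x, y) gives a*x + b*y, which is
# exactly the dot product of (x, y) with the vector (a, b), i.e.
# the vector whose coordinates are where i-hat and j-hat land.
a, b = 2.0, -1.0        # the 1x2 transformation [a b]
x, y = 3.0, 5.0         # an arbitrary input vector

matrix_output = a * x + b * y    # 1x2 matrix times 2x1 vector
dot_output = x * a + y * b       # (x, y) . (a, b)

assert matrix_output == dot_output
```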

  17. I wonder how the second interpretation of vectors he described here would be used when representing physical applications of linear algebra, such as forces in 3D space. Is it even sensible to use a linear transformation to describe something like a force?

  18. I don't remember projections being mentioned previously in this series. Is that exactly what it looks like: drawing a line from the tip of the vector perpendicular to the line you're projecting onto to find the tip of the projection?
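
    The perpendicular-drop picture described above matches the standard projection formula: the projection of v onto the line through w is the point on that line for which the leftover part of v is perpendicular to w. A Python sketch with made-up vectors:

```python
import math

# Projection of v onto the line through w: scale w by (v.w)/(w.w).
# The residual v - proj is then perpendicular to w, which is the
# "drop a perpendicular" picture. v and w are made-up examples.
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

v = (3.0, 4.0)
w = (2.0, 1.0)

t = dot(v, w) / dot(w, w)              # position along w
proj = (t * w[0], t * w[1])            # tip of the projected vector
residual = (v[0] - proj[0], v[1] - proj[1])

assert math.isclose(dot(residual, w), 0.0, abs_tol=1e-12)
```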

  19. I'm totally in love with this series, but this is the first video I felt a little lost on – I hadn't encountered the notion of 'projection' before, and while I can Google it and get a formula, that hasn't helped me with the whole 'get a visual intuition' thing that this series is all about. If you ever revisit this, I'd love it if you could add a footnote video about that.

  20. To be completely honest, my professor's expectations are so high, and go so far beyond just understanding the essence, that spending hours watching this is practically useless to me, but I actually… enjoy it. I don't know how you manage to make math sound romantic, but this series reminds me of movies like The Theory of Everything, and it makes me feel better about procrastinating, because I'm still studying, even if unproductively in terms of my exams.

  21. The animation at 5:12 is just… I can't find a word for it!

    Thank you for taking the effort to make this series! I'm currently watching it as a review of what I learned (almost only in an abstract manner) in my first semester, and it's just perfect for filling in some gaps in my knowledge and getting a deeper understanding of what's actually going on. Especially the concept of matrices that convert, for example, a 2D vector into a 1D, 2D or 3D vector was totally new to me.

