Blog series on Tensor Algebra

Chapter 1

Welcome to this blog series on Tensor Algebra. This series is my attempt to explain Tensor Algebra as I learnt it from various sources across the Internet. This chapter mostly motivates why you might want to study Tensor Algebra.

Prerequisites

This blog series assumes familiarity with Linear Algebra (concepts like matrix multiplication, dot products, linear combinations, etc.). It should be particularly helpful for those who never had a formal college degree in Physics and as such find it hard to delve into certain subjects where Tensors are a prerequisite. No knowledge of calculus is needed to study Tensor Algebra, though I may do a follow-up series on Tensor Calculus.

Why would you want to study Tensor Algebra?

Most people first hear about Tensor Algebra when they start their journey to delve deeper into the General Theory of Relativity (GTR). In GTR, Einstein shows how space and time form a single object that is curved by massive bodies (along with other crazy ideas like black holes and gravitational waves).

_Space time curves near massive objects as predicted by General Theory of Relativity, but what does that even mean?_

You might also have heard of the Big Bang and how the universe has been expanding ever since. Tensors are an indispensable tool for understanding what these ideas mean mathematically.

_What does it really mean mathematically when people say Universe is expanding?_

Another popular area where tensors come up often is Quantum Mechanics (QM) and Quantum Computing (QC). In QM, the mystical concept of Quantum Superposition, wherein a particle can remain in multiple states until it is observed, is essentially a linear combination of tensors, which we will cover later.

_Quantum Superposition: Is the cat dead or alive?_
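As a small preview (this is the standard textbook formulation, not something specific to this series), a qubit in superposition is written as a linear combination of the basis states $|0\rangle$ and $|1\rangle$:

$$
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
$$

where measuring the qubit yields $|0\rangle$ with probability $|\alpha|^2$ and $|1\rangle$ with probability $|\beta|^2$. Until that measurement, the state really is the combination, not secretly one or the other.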

You may have heard that nothing travels faster than the speed of light, yet if you separate two particles whose states are entangled, the simple act of observing one instantaneously affects the other. Again, this is mathematically represented by the tensor product, which we will also cover later in this series.

_"Spooky action at a distance"- Einstein. Huh?_
"Spooky action at a distance"- Einstein. Huh?

There is probably nothing more intriguing and fascinating than the unintuitive world of the small scale (QM), or the mind-warping ideas that time isn't the same for everyone and that our universe is expanding. Tensor algebra is a beautiful theory that allows one to study the underlying (complicated) geometry of space-time, something one cannot get just by reading popular science articles or watching documentaries on these topics.

Alfred North Whitehead said:

The idea that physicists would in future have to study the theory of tensors created real panic amongst them following the first announcement that Einstein's predictions had been verified.

However, GTR and QM are not the only subjects in which tensors are used. Tensors are prevalent in many domains of engineering and science, and a good understanding of them builds a solid foundation for learning numerous other STEM subjects.

What are Tensors?

It is hard to explain what a tensor is. Different people tend to give different definitions of a tensor, and what's worse is that most of them are partially correct, but none completely. There are generally three ways to think about tensors.

  1. Tensors are multi-dimensional arrays. An array is a list of numbers. If you replicate that list multiple times, like a table or an MS Excel sheet, then it is called a 2-D array, and so on and so forth. For example, you may have heard of the popular machine learning library called TensorFlow, which is all about the manipulation of multi-dimensional arrays. Tensors are characterised by rank, as follows (see the short code sketch after this list):

    • Scalar (rank 0 tensor): $[5]$, $[1]$, $[2.5]$, $[\pi]$
    • Vector (rank 1 tensor): $[1, 2, 3]$, $[0.1, 3, \sqrt{2}]$
    • Matrix (rank 2 tensor): $\begin{bmatrix}a & b\\c & d\end{bmatrix}$, $\begin{bmatrix}1 & 2 & 3\\4 & 5 & 6\end{bmatrix}$
    • And so on for rank 3, 4, ...

    This definition is incorrect

    Tensors can indeed be represented as multi-dimensional arrays; however, a tensor as a concept is much more than just a bunch of numbers. While calling multi-dimensional arrays tensors is generally accepted in the Machine Learning community, it is not strictly correct: tensors have a geometric meaning which is not apparent in this definition.
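To make the multi-dimensional array picture concrete, here is a minimal sketch using NumPy (chosen purely for illustration; TensorFlow tensors express the same idea), showing how rank simply counts the number of axes:

```python
import numpy as np

# Rank 0: a scalar -- a single number, zero axes.
scalar = np.array(5)
print(scalar.ndim)  # 0

# Rank 1: a vector -- a list of numbers, one axis.
vector = np.array([0.1, 3, np.sqrt(2)])
print(vector.ndim)  # 1

# Rank 2: a matrix -- rows and columns, two axes.
matrix = np.array([[1, 2, 3],
                   [4, 5, 6]])
print(matrix.ndim)  # 2

# Rank 3: a "stack" of matrices, and so on for higher ranks.
cube = np.array([[[1, 2], [3, 4]],
                 [[5, 6], [7, 8]]])
print(cube.ndim)    # 3
```

In this picture, rank is just the number of indices needed to pick out a single entry; the geometric meaning that this picture misses is what we will build up later in the series.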

To be continued...
