The Tensor Product, Demystified


Previously on the blog, we've discussed a recurring theme throughout mathematics: making new things from old things. Mathematicians do this all the time:

  • When you have two integers, you can find their greatest common divisor or their least common multiple.
  • When you have some sets, you can form their Cartesian product or their union.
  • When you have two groups, you can construct their direct sum or their free product.
  • When you have a topological space, you can look for a subspace or a quotient space.
  • When you have some vector spaces, you can ask for their direct sum or their intersection.
  • The list goes on!

Today, I'd like to focus on a particular way to build a new vector space from old vector spaces: the tensor product. This construction often comes across as scary and mysterious, but I hope to help shine a little light and dispel some of the fear. In particular, we won't talk about axioms, universal properties, or commuting diagrams. Instead, we'll take an elementary, concrete look:

Given two vectors $\mathbf{v}$ and $\mathbf{w}$, we can build a new vector, called the tensor product $\mathbf{v}\otimes\mathbf{w}$. But what is that vector, really? Likewise, given two vector spaces $V$ and $W$, we can build a new vector space, also called their tensor product $V\otimes W$. But what is that vector space, really?

Making new vectors from old

In this discussion, we'll assume $V$ and $W$ are finite dimensional vector spaces. That means we can think of $V$ as $\mathbb{R}^n$ and $W$ as $\mathbb{R}^m$ for some positive integers $n$ and $m$. So a vector $\mathbf{v}$ in $\mathbb{R}^n$ is really just a list of $n$ numbers, while a vector $\mathbf{w}$ in $\mathbb{R}^m$ is just a list of $m$ numbers.

Let's try to make a new, third vector out of $\mathbf{v}$ and $\mathbf{w}$. But how? Here are two ideas: we can stack them on top of each other, or we can first multiply the numbers together in all possible combinations and then stack the results on top of each other.

The first option gives a new list of $n+m$ numbers, while the second option gives a new list of $nm$ numbers. The first gives a way to build a new space where the dimensions add; the second gives a way to build a new space where the dimensions multiply. The first is a vector $(\mathbf{v},\mathbf{w})$ in the direct sum $V\oplus W$ (this is the same as their direct product $V\times W$); the second is a vector $\mathbf{v}\otimes\mathbf{w}$ in the tensor product $V\otimes W$.

And that's it!
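If it helps to see both constructions concretely, here's a minimal sketch in Python with NumPy; the code and the sample numbers are mine, not the post's:

    import numpy as np

    v = np.array([1.0, 2.0, 3.0])  # a vector in R^3, so n = 3
    w = np.array([4.0, 5.0])       # a vector in R^2, so m = 2

    # Direct sum: stack the lists, so the dimensions add (n + m = 5).
    direct_sum = np.concatenate([v, w])   # [1. 2. 3. 4. 5.]

    # Tensor product: multiply entries in all combinations,
    # so the dimensions multiply (n * m = 6).
    tensor = np.kron(v, w)                # [ 4.  5.  8. 10. 12. 15.]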

Forming the tensor product $\mathbf{v}\otimes\mathbf{w}$ of two vectors is a lot like forming the Cartesian product of two sets $X\times Y$. In fact, that's exactly what we're doing if we think of $X$ as the set whose elements are the entries of $\mathbf{v}$, and similarly for $Y$.
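The analogy is literal: the entries of $\mathbf{v}\otimes\mathbf{w}$ are indexed by the Cartesian product of the two index sets. A small check, again a sketch assuming NumPy:

    import itertools
    import numpy as np

    v = np.array([1.0, 2.0, 3.0])
    w = np.array([4.0, 5.0])

    # One product v[i] * w[j] for each pair (i, j) in the Cartesian
    # product of the index sets: the same list, in the same order,
    # that np.kron produces.
    pairs = [v[i] * w[j] for i, j in itertools.product(range(3), range(2))]
    assert np.allclose(pairs, np.kron(v, w))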

So a tensor product is like a grown-up version of multiplication. It's what happens when you systematically multiply a bunch of numbers together, then organize the results into a list. It's multi-multiplication, if you will.

There's a little more to the story.

Does every vector in $V\otimes W$ look like $\mathbf{v}\otimes\mathbf{w}$ for some $\mathbf{v}\in V$ and $\mathbf{w}\in W$? Not quite. Remember, a vector in a vector space can be written as a weighted sum of basis vectors, which are like the space's building blocks. This is another instance of making new things from existing ones: we get a new vector by taking a weighted sum of some special vectors!

So a typical vector in $V\otimes W$ is a weighted sum of basis vectors. What are those basis vectors? Well, there must be exactly $nm$ of them, since the dimension of $V\otimes W$ is $nm$. Moreover, we'd expect them to be built up from the basis of $V$ and the basis of $W$. This brings us back to the "How do we construct new things from old things?" question. Asked explicitly: if we have a basis $\mathbf{v}_1,\ldots,\mathbf{v}_n$ for $V$ and a basis $\mathbf{w}_1,\ldots,\mathbf{w}_m$ for $W$, then how do we combine them to get a new set of $nm$ vectors?

This is totally analogous to the construction we saw above: given a list of $n$ things and a list of $m$ things, we can obtain a list of $nm$ things by multiplying them all together. So we'll do the same thing here! We'll simply multiply each $\mathbf{v}_i$ with each $\mathbf{w}_j$ in all possible combinations, except "multiply $\mathbf{v}_i$ and $\mathbf{w}_j$" now means "take the tensor product of $\mathbf{v}_i$ and $\mathbf{w}_j$."

Concretely, a basis for $V\otimes W$ is the set of all vectors of the form $\mathbf{v}_i\otimes\mathbf{w}_j$, where $i$ ranges from $1$ to $n$ and $j$ ranges from $1$ to $m$. For example, suppose $n=3$ and $m=2$. Then we can find the six basis vectors for $V\otimes W$ by forming a "multiplication chart":

$$\begin{array}{c|cc} \otimes & \mathbf{w}_1 & \mathbf{w}_2 \\ \hline \mathbf{v}_1 & \mathbf{v}_1\otimes\mathbf{w}_1 & \mathbf{v}_1\otimes\mathbf{w}_2 \\ \mathbf{v}_2 & \mathbf{v}_2\otimes\mathbf{w}_1 & \mathbf{v}_2\otimes\mathbf{w}_2 \\ \mathbf{v}_3 & \mathbf{v}_3\otimes\mathbf{w}_1 & \mathbf{v}_3\otimes\mathbf{w}_2 \end{array}$$

(The sophisticated way to say this is: "$V\otimes W$ is the free vector space on $A\times B$, where $A$ is a set of generators for $V$ and $B$ is a set of generators for $W$.")

So $V\otimes W$ is the six-dimensional space with basis

$$\{\mathbf{v}_1\otimes\mathbf{w}_1,\;\mathbf{v}_1\otimes\mathbf{w}_2,\;\mathbf{v}_2\otimes\mathbf{w}_1,\;\mathbf{v}_2\otimes\mathbf{w}_2,\;\mathbf{v}_3\otimes\mathbf{w}_1,\;\mathbf{v}_3\otimes\mathbf{w}_2\}$$

This might feel a little abstract with all the $\otimes$ symbols littered everywhere. But don't forget: we know exactly what each $\mathbf{v}_i\otimes\mathbf{w}_j$ looks like. It's just a list of numbers! Which list of numbers? Well, multiply the entries of $\mathbf{v}_i$ by the entries of $\mathbf{w}_j$ in all possible combinations. When the $\mathbf{v}_i$ and $\mathbf{w}_j$ are the standard basis vectors, each $\mathbf{v}_i\otimes\mathbf{w}_j$ is a list of six numbers with a single $1$ and five $0$s, i.e. a standard basis vector of $\mathbb{R}^6$.
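Here's a quick way to see those lists of numbers, again as a hedged NumPy sketch: tensoring the standard bases of $\mathbb{R}^3$ and $\mathbb{R}^2$ produces the six standard basis vectors of $\mathbb{R}^6$.

    import numpy as np

    basis_V = np.eye(3)  # rows are v_1, v_2, v_3
    basis_W = np.eye(2)  # rows are w_1, w_2

    # Each v_i (x) w_j is a list of six numbers: a single 1 and five 0s.
    for i, v_i in enumerate(basis_V, start=1):
        for j, w_j in enumerate(basis_W, start=1):
            print(f"v_{i} (x) w_{j} =", np.kron(v_i, w_j))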

So what is $V\otimes W$? It's the vector space whose vectors are linear combinations of the $\mathbf{v}_i\otimes\mathbf{w}_j$. For example, $2\,\mathbf{v}_1\otimes\mathbf{w}_1 + \mathbf{v}_3\otimes\mathbf{w}_2$ and $\mathbf{v}_2\otimes\mathbf{w}_1 - 5\,\mathbf{v}_2\otimes\mathbf{w}_2$ are both vectors in this space.

Well, technically…

Technically, $\mathbf{v}\otimes\mathbf{w}$ is called the outer product of $\mathbf{v}$ and $\mathbf{w}$ and is defined by $$\mathbf{v}\otimes\mathbf{w} := \mathbf{v}\mathbf{w}^\top$$ where $\mathbf{w}^\top$ is the same as $\mathbf{w}$ but written as a row vector. (And if the entries of $\mathbf{w}$ are complex numbers, then we also replace each entry by its complex conjugate.) So technically the tensor product of two vectors is a matrix. With $n=3$ and $m=2$:

$$\mathbf{v}\mathbf{w}^\top = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix} \begin{pmatrix} w_1 & w_2 \end{pmatrix} = \begin{pmatrix} v_1 w_1 & v_1 w_2 \\ v_2 w_1 & v_2 w_2 \\ v_3 w_1 & v_3 w_2 \end{pmatrix}$$

This may seem to be in conflict with what we did above, but it's not! The two go hand in hand. Any $m\times n$ matrix can be reshaped into an $nm\times 1$ column vector, and vice versa. (So far, we've been exploiting the fact that $\mathbb{R}^3\otimes\mathbb{R}^2$ is isomorphic to $\mathbb{R}^6$.) You might refer to this as matrix-vector duality.
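NumPy makes this duality easy to check: with its row-major layout, flattening the outer product gives exactly the Kronecker product (a sketch with made-up numbers):

    import numpy as np

    v = np.array([1.0, 2.0, 3.0])
    w = np.array([4.0, 5.0])

    as_matrix = np.outer(v, w)  # the 3 x 2 matrix v w^T
    as_vector = np.kron(v, w)   # the length-6 list from before

    # Same numbers, two shapes: reshaping converts one into the other.
    assert np.allclose(as_matrix.reshape(-1), as_vector)
    assert np.allclose(as_vector.reshape(3, 2), as_matrix)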


It's a little like a process-state duality. On one hand, a matrix $\mathbf{v}\otimes\mathbf{w}$ is a process: it's a concrete representation of a (linear) transformation. On the other hand, $\mathbf{v}\otimes\mathbf{w}$ is, abstractly speaking, a vector. And a vector is the mathematical gadget that physicists use to describe the state of a quantum system. So matrices encode processes; vectors encode states. The upshot is that a vector in a tensor product $V\otimes W$ can be viewed either way, simply by reshaping the numbers as a list or as a rectangle.

By the way, this idea of viewing a matrix as a process can easily be generalized to higher dimensional arrays, too. Those arrays are called tensors, and whenever you do a bunch of these processes together, the resulting mega-process gives rise to a tensor network. But manipulating high-dimensional arrays of numbers can get very messy very quickly: there are lots of numbers that all have to be multiplied together. That's like multi-multi-multi-multi…plication. Fortunately, tensor networks come with lovely pictures that make these computations very simple. (This goes back to Roger Penrose's graphical calculus.) It's a conversation I'd like to have here, but it'll have to wait for another day!

In quantum physics

One application of tensor products relates to the brief statement I made above: "A vector is the mathematical gadget that physicists use to describe the state of a quantum system." To elaborate: if you have a little quantum particle, perhaps you'd like to know what it's doing. Or what it's capable of doing. Or the probability that it'll be doing something. In essence, you're asking: What's its status? What's its state? The answer to this question, provided by a postulate of quantum mechanics, is given by a unit vector in a vector space. (Really, a Hilbert space, say $\mathbb{C}^n$.) That unit vector encodes information about the particle.

The dimension $n$ is, loosely speaking, the number of different things you could observe after making a measurement on the particle. But what if we have two little quantum particles? The state of that two-particle system can be described by something called a density matrix $\rho$ on the tensor product of their respective spaces, $\mathbb{C}^n\otimes\mathbb{C}^n$. A density matrix is a generalization of a unit vector; it accounts for interactions between the two particles.

The same story holds for $N$ particles: the state of an $N$-particle system can be described by a density matrix on an $N$-fold tensor product.
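As a toy illustration (my own sketch, covering only the simplest, non-interacting case): the joint state of two independent qubits is the Kronecker product of their state vectors, and the density matrix of that pure state is its outer product with its own conjugate.

    import numpy as np

    psi = np.array([1.0, 0.0])               # first qubit in state |0>
    phi = np.array([1.0, 1.0]) / np.sqrt(2)  # second qubit in (|0> + |1>)/sqrt(2)

    joint = np.kron(psi, phi)            # a unit vector in C^2 (x) C^2 = C^4
    rho = np.outer(joint, joint.conj())  # 4 x 4 density matrix of this pure state

    assert np.isclose(np.trace(rho), 1.0)  # density matrices have trace 1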

But why the tensor product? Why is it that this construction, out of all things, describes the interactions within a quantum system so well, so naturally? I don't know the answer, but perhaps the appropriateness of tensor products shouldn't be too surprising. The tensor product itself captures all the ways that basic things can "interact" with each other!

Of course, there's lots more to be said about tensor products. I've only shared a snippet of the basic mathematics. For a deeper look, I recommend reading through Jeremy Kun's wonderfully lucid How to Conquer Tensorphobia and Tensorphobia and the Outer Product. Enjoy!


