Basic Mathematics to Understand Caltech Quantum Computing Lecture Notes (2)

What's this page for?

This is a page where I collect mathematical terms that often appear in quantum computing, especially so that I can read these lecture notes smoothly without getting stuck on terminology. If you are interested in these fields, I hope this page helps you too.

This page covers the tensor product \(\bigotimes\) and Hilbert spaces \(\mathcal{H}\).

Tensor Product \(\bigotimes\) with concrete examples

The tensor product is an operation that builds a new vector out of two vectors, and the dimension of the new vector is the product of the dimensions of the originals. Say we have vectors \(\bm{a}\) and \(\bm{b}\) $$ \bm{a} \in \Reals^2, ~~~~ \bm{b}\in \Reals^3 \\[2em] \bm{a} = \begin{bmatrix} 1 \\ 2 \\ \end{bmatrix}, ~~~~~~ \bm{b} = \begin{bmatrix} 3 \\ 4 \\ 5 \\ \end{bmatrix} \\ $$ then the tensor product \(\bm{a} \otimes \bm{b} \) is $$ \bm{a} \otimes \bm{b} \in \Reals^{6}\\ \\[2em] \bm{a} \otimes \bm{b} = \begin{bmatrix} 1 \times \begin{bmatrix} 3 \\ 4 \\ 5 \\ \end{bmatrix} \\\\ 2 \times \begin{bmatrix} 3 \\ 4 \\ 5 \\ \end{bmatrix} \\ \end{bmatrix} = \begin{bmatrix} 1 \times 3 \\ 1\times 4 \\ 1\times 5 \\ 2\times 3\\ 2\times 4\\ 2\times 5\\ \end{bmatrix} = \begin{bmatrix} 3 \\ 4 \\ 5 \\ 6\\ 8\\ 10\\ \end{bmatrix} $$
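As a sanity check, the same product can be computed with NumPy's `kron` function, which implements exactly this Kronecker/tensor product (assuming NumPy is available in your environment):

```python
import numpy as np

a = np.array([1, 2])     # a in R^2
b = np.array([3, 4, 5])  # b in R^3

# np.kron stacks scalar multiples of b: one block per entry of a,
# producing a vector of dimension 2 * 3 = 6
result = np.kron(a, b)
print(result)  # [ 3  4  5  6  8 10]
```

The output matches the hand computation above, entry for entry.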

e.g. 2

We have two matrices \(A\) and \(B\), $$ A \in \Reals^{2 \times 2},~~~~ B \in \Reals^{3 \times 2}\\\\\\ \text{} \\A = \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix}, ~~~~~~ B = \begin{bmatrix} 5&8\\ 6&9\\ 7&10\\ \end{bmatrix} $$ then, \(A \otimes B\) is $$ A \otimes B \in \Reals^{(2\cdot 3) \times (2 \cdot 2)} \\[2em] \begin{aligned} A \otimes B &= \begin{bmatrix} 1 \times \begin{bmatrix} 5&8\\ 6&9\\ 7&10\\ \end{bmatrix} & 3 \times \begin{bmatrix} 5&8\\ 6&9\\ 7&10\\ \end{bmatrix} \\\\ 2 \times \begin{bmatrix} 5&8\\ 6&9\\ 7&10\\ \end{bmatrix} & 4 \times \begin{bmatrix} 5&8\\ 6&9\\ 7&10\\ \end{bmatrix} \\ \end{bmatrix} \\[2em] \\ &= \begin{bmatrix} 5&8&15&24 \\ 6&9&18&27 \\ 7&10&21&30 \\ 10&16&20&32 \\ 12&18&24&36 \\ 14&20&28&40 \\ \end{bmatrix} \end{aligned} $$ and \(B \otimes A\) is $$ B \otimes A \in \Reals^{(3\cdot 2) \times (2 \cdot 2)} \\[2em] \begin{aligned} B \otimes A &= \begin{bmatrix} 5\times \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix}&8 \times \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix}\\\\ 6\times \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix}&9\times \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix}\\\\ 7\times \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix}&10\times \begin{bmatrix} 1 & 3\\ 2 & 4\\ \end{bmatrix} \end{bmatrix} \\[2em] \\ &= \begin{bmatrix} 5&15&8&24 \\ 10&20&16&32 \\ 6&18&9&27 \\ 12&24&18&36 \\ 7&21&10&30 \\ 14&28&20&40 \\ \end{bmatrix} \\ \end{aligned} $$ As these two results show, the tensor product is non-commutative: \(A \otimes B \neq B \otimes A\).
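The non-commutativity is easy to confirm numerically with NumPy's `kron` (a quick sketch, assuming NumPy is installed):

```python
import numpy as np

A = np.array([[1, 3],
              [2, 4]])    # A in R^{2x2}
B = np.array([[5, 8],
              [6, 9],
              [7, 10]])   # B in R^{3x2}

# Each entry A[i, j] scales a full copy of B (and vice versa for BA)
AB = np.kron(A, B)
BA = np.kron(B, A)

print(AB.shape, BA.shape)      # (6, 4) (6, 4) -- same shape...
print(np.array_equal(AB, BA))  # False       -- ...but different entries
```

Both products live in \(\Reals^{6 \times 4}\), yet the entries are arranged differently, matching the two matrices worked out above.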

e.g. 3

(under construction)

Hilbert space \(\mathcal{H} \)

A Hilbert space is a vector space that is equipped with an inner product and is complete. Its elements can be, for example, vectors of real numbers, vectors of complex numbers, or square-integrable functions. Completeness means that every Cauchy sequence in the space converges to a limit that is itself in the space. (Having said that, I don't fully understand this property yet.)

For example, \(\Reals^3 \) and \(\Complex^3\) are finite-dimensional Hilbert spaces.

e.g. 1

For \(\bm{x_1}, \bm{x_2} \in \Reals^3\), $$ \begin{aligned} \bm{x_1} &= \begin{bmatrix} a_1\\ a_2\\ a_3\\ \end{bmatrix} ~~~ \bm{x_2} = \begin{bmatrix} b_1 \\ b_2 \\ b_3\\ \end{bmatrix} \\[2em] \end{aligned} $$

the inner product is computed as $$ \begin{aligned} \lang \bm{x_1}, \bm{x_2} \rang &= \bm{x_1}^T \bm{x_2}\\ &= \begin{bmatrix} a_1&a_2&a_3\\ \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ b_3\\ \end{bmatrix}\\ &= \sum_{n=1}^3 {a_n b_n} \end{aligned} $$
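In code, this is just the familiar dot product. A minimal sketch with arbitrary example values (the vectors here are my own illustration, not from the notes):

```python
# Inner product on R^3: sum of elementwise products (a plain dot product)
x1 = [1.0, 2.0, 3.0]
x2 = [4.0, 5.0, 6.0]

inner = sum(a * b for a, b in zip(x1, x2))
print(inner)  # 32.0  (= 1*4 + 2*5 + 3*6)
```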

e.g. 2

For \(\bm{z_1}, \bm{z_2} \in \Complex^3\), $$ \begin{aligned} \bm{z_1} &= \begin{bmatrix} a_1 + b_1 i\\ a_2 + b_2 i\\ a_3 + b_3 i\\ \end{bmatrix} ~~~ \bm{z_2} = \begin{bmatrix} c_1 + d_1 i\\ c_2 + d_2 i\\ c_3 + d_3 i\\ \end{bmatrix} \\[2em] \end{aligned} $$

the inner product is computed as $$ \begin{aligned} \lang \bm{z_1}, \bm{z_2} \rang &= \bm{z_1}^\dagger \bm{z_2} \\ &= \begin{bmatrix} a_1 - b_1 i&a_2 - b_2 i&a_3 - b_3 i \end{bmatrix} \begin{bmatrix} c_1 + d_1 i\\ c_2 + d_2 i\\ c_3 + d_3 i\\ \end{bmatrix}\\ &= \sum_{n=1}^3 {(a_n - b_n i) (c_n + d_n i)} \\[4em] \end{aligned} $$ where $$ \bm{z}^\dagger = (\bm{z}^*)^T $$ denotes the conjugate transpose.
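The complex case can be sketched the same way; the only change is conjugating the first vector. The example values below are my own illustration:

```python
# Complex inner product on C^3: conjugate the first vector, then sum products
z1 = [1 + 2j, 3 - 1j, 0 + 1j]
z2 = [2 - 1j, 1 + 1j, 4 + 0j]

inner = sum(a.conjugate() * b for a, b in zip(z1, z2))

# Conjugation makes <z, z> real and non-negative, as a norm requires:
norm_sq = sum(a.conjugate() * a for a in z1)
print(inner)         # (2-5j)
print(norm_sq.real)  # 16.0  (= |1+2i|^2 + |3-i|^2 + |i|^2 = 5 + 10 + 1)
```

Without the conjugate, \(\lang \bm{z}, \bm{z} \rang\) could come out complex, so \(\sqrt{\lang \bm{z}, \bm{z} \rang}\) would not define a length.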

e.g. 3

For an infinite-dimensional Hilbert space (a space of functions), the inner product is defined as follows. Let \(f_1\) and \(f_2\) be square-integrable functions: $$ \begin{aligned} \int_{-\infin}^{\infin}{|f_1|^2}~dx &< \infin, ~~~ \int_{-\infin}^{\infin}{|f_2|^2}~dx < \infin \end{aligned} $$

the inner product is $$ \lang f_1, f_2 \rang = \int_{-\infin}^{\infin}{f_1^* ~f_2}dx $$

For example, let \(f_1\) and \(f_2\) be $$ \begin{aligned} f_1 &= e^{-x^2}, ~~ f_2 = \frac{1}{\sqrt{2 \pi}} e^{-\frac{x^2}{2}}\\[2em] \int_{-\infin}^{\infin}{|f_1|^2}~dx &= \sqrt{\frac{\pi}{2}}, ~~~ \int_{-\infin}^{\infin}{|f_2|^2}~dx = \frac{1}{2\sqrt{\pi}} \\ \\ \end{aligned} $$ then the inner product is $$ \begin{aligned} \lang f_1, f_2 \rang &= \int_{-\infin}^{\infin}{f_1^* ~f_2}~dx \\ &= \int_{-\infin}^{\infin}{\frac{1}{\sqrt{2 \pi}} e^{-\frac{3 x^2}{2}}}~dx \\ &= \frac{1}{\sqrt{3}} \end{aligned} $$ In quantum mechanics, such square-integrable functions typically appear as wavefunctions: \(|f|^2\) plays the role of a probability density, and the normalization \(\int_{-\infin}^{\infin}{|f|^2}~dx = 1\) reflects the fact that probabilities sum to 1. In fact, \(f_2= \frac{1}{\sqrt{2 \pi}} e^{-\frac{x^2}{2}}\) is the density of the standard Gaussian distribution (mean 0 and variance 1).
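These Gaussian integrals can be sanity-checked numerically. A minimal sketch using a midpoint Riemann sum (truncating the infinite domain to \([-10, 10]\), which costs essentially nothing since the integrands decay so fast):

```python
import math

def integrate(f, lo=-10.0, hi=10.0, n=200_000):
    """Midpoint-rule approximation of the integral of f over [lo, hi]."""
    dx = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * dx) for i in range(n)) * dx

f1 = lambda x: math.exp(-x * x)
f2 = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

norm1 = integrate(lambda x: f1(x) ** 2)     # integral of |f1|^2 -> sqrt(pi/2)
norm2 = integrate(lambda x: f2(x) ** 2)     # integral of |f2|^2 -> 1/(2 sqrt(pi))
inner = integrate(lambda x: f1(x) * f2(x))  # <f1, f2>          -> 1/sqrt(3)
print(norm1, norm2, inner)
```

The three results agree with the closed forms \(\sqrt{\pi/2} \approx 1.2533\), \(\frac{1}{2\sqrt{\pi}} \approx 0.2821\), and \(\frac{1}{\sqrt{3}} \approx 0.5774\).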

Tensor Product \(\bigotimes\): a more abstract view

The examples above show how to apply the tensor product to two vectors (or matrices) and what the result looks like. Although such concrete examples are important for understanding the basic notion of tensors, they are not enough to tackle the Caltech lecture notes. under construction




References

http://hitoshi.berkeley.edu/221A/tensorproduct.pdf
https://www.math3ma.com/blog/the-tensor-product-demystified
https://math.stackexchange.com/questions/2158892/working-out-a-concrete-example-of-tensor-product
https://cds.cern.ch/record/1522001/files/978-1-4614-6336-8_BookBackMatter.pdf
http://www.theory.caltech.edu/~preskill/ph219/ph219_2018-19
https://en.wikipedia.org/wiki/Unitary_matrix
https://en.wikipedia.org/wiki/Orthogonal_matrix