Tensor network explained

Tensor networks or tensor network states are a class of variational wave functions used in the study of many-body quantum systems[1] and fluids.[2] [3] Tensor networks extend one-dimensional matrix product states to higher dimensions while preserving some of their useful mathematical properties.[4] The wave function is encoded as a tensor contraction of a network of individual tensors.[5] The structure of the individual tensors can impose global symmetries on the wave function (such as antisymmetry under exchange of fermions) or restrict the wave function to specific quantum numbers, like total charge, angular momentum, or spin. It is also possible to derive strict bounds on quantities like entanglement and correlation length using the mathematical structure of the tensor network.[6] This has made tensor networks useful in theoretical studies of quantum information in many-body systems. They have also proved useful in variational studies of ground states, excited states, and dynamics of strongly correlated many-body systems.[7]

Diagrammatic notation

In general, a tensor network diagram (Penrose diagram) can be viewed as a graph whose nodes (or vertices) represent individual tensors, while an edge connecting two nodes represents summation over a shared index (a contraction). Free indices are depicted as edges (or legs) attached to a single vertex only.[8] A node's shape can also carry additional meaning: for instance, trapezoids may denote unitary matrices or tensors with similar behaviour, with a flipped trapezoid representing the complex conjugate of the corresponding tensor.
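As a concrete illustration of this correspondence, the following NumPy sketch contracts a small three-tensor network; the tensor names, shapes and index labels are chosen for the example only and are not taken from the article. Each repeated einsum index plays the role of an internal edge of the diagram, while i and k are the free legs:

    import numpy as np

    # Three illustrative tensors: A_ij, B_jkl and C_l.
    # In diagram form, A and B share the edge j, B and C share the edge l,
    # and i, k remain as free legs attached to a single node each.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(2, 3))     # matrix with indices (i, j)
    B = rng.normal(size=(3, 4, 5))  # order-3 tensor with indices (j, k, l)
    C = rng.normal(size=(5,))       # vector with index l

    # Contracting the network sums over the shared indices j and l,
    # leaving an order-2 tensor that carries the free indices i and k.
    T = np.einsum('ij,jkl,l->ik', A, B, C)
    print(T.shape)  # (2, 4)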

History

Foundational research on tensor networks began in 1971 with a paper by Roger Penrose.[9] In “Applications of negative dimensional tensors”, Penrose developed tensor diagram notation, describing how the diagrammatic language of tensor networks could be applied in physics.[10]

In 1992, Steven R. White developed the Density Matrix Renormalization Group (DMRG) for quantum lattice systems.[11] DMRG was the first successful tensor network ansatz and associated algorithm.[12]

In 2002, Guifre Vidal and Reinhard Werner proposed a computable measure of entanglement, laying the groundwork for quantum resource theories.[13][14] This was also an early description of the use of tensor networks as mathematical tools for describing quantum systems.

In 2004, Frank Verstraete and Ignacio Cirac developed the theory of matrix product states, projected entangled pair states, and variational renormalization group methods for quantum spin systems.[15]

In 2006, Vidal developed the multi-scale entanglement renormalization ansatz (MERA).[16] In 2007 he developed entanglement renormalization for quantum lattice systems.[17]

In 2010, Ulrich Schollwöck reformulated the density-matrix renormalization group in terms of matrix product states for the simulation of one-dimensional strongly correlated quantum lattice systems.[18]

In 2014, Román Orús reviewed advances in tensor network theory, covering symmetries, fermions, entanglement and holography,[19] and later surveyed the use of tensor networks for complex quantum systems and in machine learning.[1]

Connection to machine learning

Tensor networks have been adapted for supervised learning,[20] taking advantage of the similar mathematical structure shared by variational studies in quantum mechanics and large-scale machine learning. This crossover has spurred collaboration between researchers in artificial intelligence and quantum information science. In June 2019, Google, the Perimeter Institute for Theoretical Physics, and X (formerly Google X) released TensorNetwork, an open-source library for efficient tensor calculations.[21]

From the perspective of machine learning, the main interest in tensor networks is to reduce the number of trainable parameters in a layer by approximating a high-order tensor with a network of lower-order ones. Using the so-called tensor train (TT) technique,[22] an order-N tensor (whose number of entries, and hence of trainable parameters, grows exponentially with N) can be replaced by a chain of N tensors of order 2 or 3, which yields a number of parameters that is only polynomial in N.
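The TT format can be obtained from a full tensor by a sequence of reshapes and truncated singular value decompositions, essentially the TT-SVD procedure described by Oseledets.[22] The sketch below is a minimal NumPy illustration; the function names, the random order-5 test tensor, and the rank cap are assumptions made for the example rather than details from the article:

    import numpy as np

    def tt_decompose(tensor, max_rank):
        """Split an order-N tensor into a chain (tensor train) of N order-3 cores.

        Core k has shape (r_{k-1}, d_k, r_k) with r_0 = r_N = 1; each internal
        rank is truncated to at most max_rank via an SVD.
        """
        dims = tensor.shape
        cores, rank = [], 1
        remainder = tensor.reshape(rank * dims[0], -1)
        for k in range(len(dims) - 1):
            u, s, vt = np.linalg.svd(remainder, full_matrices=False)
            new_rank = min(max_rank, len(s))
            cores.append(u[:, :new_rank].reshape(rank, dims[k], new_rank))
            remainder = (s[:new_rank, None] * vt[:new_rank]).reshape(
                new_rank * dims[k + 1], -1)
            rank = new_rank
        cores.append(remainder.reshape(rank, dims[-1], 1))
        return cores

    def tt_to_full(cores):
        """Contract the chain of cores back into the full tensor."""
        full = cores[0]
        for core in cores[1:]:
            full = np.tensordot(full, core, axes=([-1], [0]))
        return full.reshape([c.shape[1] for c in cores])

    rng = np.random.default_rng(0)
    T = rng.normal(size=(4, 4, 4, 4, 4))      # order-5 tensor, 4**5 = 1024 entries
    cores = tt_decompose(T, max_rank=16)
    print([c.shape for c in cores])           # chain of order-3 cores
    error = np.linalg.norm(tt_to_full(cores) - T) / np.linalg.norm(T)
    print(error)                              # ~0 here, since max_rank is not binding

In a trainable layer, the entries of the individual cores become the parameters, so the storage cost grows linearly with N (times a factor set by the TT ranks) rather than exponentially.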

Notes and References

  1. Orús, Román (2019). "Tensor networks for complex quantum systems". Nature Reviews Physics. 1 (9): 538–550. arXiv:1812.04011. doi:10.1038/s42254-019-0086-7.
  2. Gourianov, Nikita; Lubasch, Michael; Dolgov, Sergey; van den Berg, Quincy Y.; Babaee, Hessam; Givi, Peyman; Kiffner, Martin; Jaksch, Dieter (2022). "A quantum-inspired approach to exploit turbulence structures". Nature Computational Science. 2 (1): 30–37. doi:10.1038/s43588-021-00181-1. PMID 38177703.
  3. Gourianov, Nikita; Givi, Peyman; Jaksch, Dieter; Pope, Stephen B. (2024). "Tensor networks enable the calculation of turbulence probability distributions". arXiv:2407.09169 [physics.flu-dyn].
  4. Orús, Román (2014). "A practical introduction to tensor networks: Matrix product states and projected entangled pair states". Annals of Physics. 349: 117–158. arXiv:1306.2164. doi:10.1016/j.aop.2014.06.013.
  5. Biamonte, Jacob; Bergholm, Ville (2017). "Tensor Networks in a Nutshell". arXiv:1708.00006 [quant-ph].
  6. Verstraete, F.; Wolf, M. M.; Perez-Garcia, D.; Cirac, J. I. (2006). "Criticality, the Area Law, and the Computational Power of Projected Entangled Pair States". Physical Review Letters. 96 (22): 220601. arXiv:quant-ph/0601075. doi:10.1103/PhysRevLett.96.220601. PMID 16803296.
  7. Montangero, Simone (2018). Introduction to Tensor Network Methods: Numerical Simulations of Low-Dimensional Many-Body Quantum Systems. Cham, Switzerland. ISBN 978-3-030-01409-4. OCLC 1076573498.
  8. "The Tensor Network" (website). Retrieved 30 July 2022.
  9. Roger Penrose, "Applications of negative dimensional tensors", in Combinatorial Mathematics and its Applications, Academic Press (1971). See Vladimir Turaev, Quantum Invariants of Knots and 3-Manifolds (1994), De Gruyter, p. 71, for a brief commentary.
  10. Biamonte, Jacob (2020). "Lectures on Quantum Tensor Networks". arXiv:1912.10049 [quant-ph].
  11. White, Steven R. (1992). "Density matrix formulation for quantum renormalization groups". Physical Review Letters. 69 (19): 2863–2866. doi:10.1103/PhysRevLett.69.2863. PMID 10046608.
  12. "Tensor Networks Group" (website). Retrieved 24 October 2024.
  13. Thomas, Jessica (2 March 2020). "50 Years of Physical Review A: The Legacy of Three Classics". Physics. 13: 24.
  14. Vidal, Guifre; Werner, Reinhard F. (2002). "Computable measure of entanglement". Physical Review A. 65 (3): 032314. arXiv:quant-ph/0102117. doi:10.1103/PhysRevA.65.032314.
  15. Verstraete, Frank; Cirac, Ignacio (2008). "Matrix product states, projected entangled pair states, and variational renormalization group methods for quantum spin systems". Advances in Physics. 57 (2): 143–224. arXiv:0907.2796. doi:10.1080/14789940801912366.
  16. Vidal, Guifre (2008). "Class of Quantum Many-Body States That Can Be Efficiently Simulated". Physical Review Letters. 101 (11): 110501. arXiv:quant-ph/0610099. doi:10.1103/PhysRevLett.101.110501.
  17. Vidal, Guifre (2009). "Entanglement Renormalization: An Introduction". arXiv:0912.1651 [quant-ph].
  18. Schollwöck, Ulrich (2011). "The density-matrix renormalization group in the age of matrix product states". Annals of Physics. 326 (1): 96–192. arXiv:1008.3477. doi:10.1016/j.aop.2010.09.012.
  19. Orús, Román (2014). "Advances on tensor network theory: symmetries, fermions, entanglement, and holography". The European Physical Journal B. 87: 280. arXiv:1407.6552. doi:10.1140/epjb/e2014-50502-9.
  20. Stoudenmire, E. Miles; Schwab, David J. (2016). "Supervised Learning with Quantum-Inspired Tensor Networks". Advances in Neural Information Processing Systems. 29: 4799. arXiv:1605.05775.
  21. "Introducing TensorNetwork, an Open Source Library for Efficient Tensor Calculations". Google AI Blog. 4 June 2019. Retrieved 2 February 2021.
  22. Oseledets, I. V. (2011). "Tensor-Train Decomposition". SIAM Journal on Scientific Computing. 33 (5): 2295–2317. doi:10.1137/090752286.