2023 journal article
Analogy between Boltzmann Machines and Feynman Path Integrals
Journal of Chemical Theory and Computation.
Machine learning has had a significant impact on multiple areas of science, technology, health, and the computer and information sciences. With the advent of quantum computing, quantum machine learning has developed into a new and important avenue for the study of complex learning problems. Yet there is substantial debate and uncertainty regarding the foundations of machine learning. Here, we provide a detailed exposition of the mathematical connections between a general machine learning approach called Boltzmann machines and Feynman's description of quantum and statistical mechanics. In Feynman's description, quantum phenomena arise from an elegant, weighted sum over (or superposition of) paths. Our analysis shows that Boltzmann machines and neural networks have a similar mathematical structure, which permits interpreting their hidden layers as discrete versions of path elements and supports a path-integral interpretation of machine learning analogous to that in quantum and statistical mechanics. Since Feynman paths are a natural and elegant depiction of the interference phenomena and superposition principle germane to quantum mechanics, this analysis lets us interpret the goal of machine learning as finding an appropriate combination of paths, and accumulated path weights, through a network that cumulatively captures the correct properties of an x-to-y map for a given mathematical problem. We conclude that neural networks are naturally related to Feynman path integrals and hence offer one avenue by which learning problems may be treated as quantum problems. Consequently, we provide general quantum circuit models applicable to both Boltzmann machines and Feynman path integrals.
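To make the structural parallel described in the abstract concrete, the following is a minimal LaTeX sketch of the two weighted sums it refers to. The notation (energy E(v,h), partition function Z, discretized action S, inverse temperature beta, Hamiltonian H, intermediate points x_k) is standard but chosen here for illustration and is not taken from the paper itself.

% Minimal sketch of the parallel: a Boltzmann machine marginalizes over hidden
% configurations much as a discretized path integral sums over intermediate points.
% Notation is assumed (standard conventions), not quoted from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Boltzmann machine: the probability of the visible units $v$ is a weighted sum
% over all hidden configurations $h$, with energy $E(v,h)$.
\[
  P(v) \;=\; \frac{1}{Z}\sum_{h} e^{-E(v,h)},
  \qquad
  Z \;=\; \sum_{v,h} e^{-E(v,h)}.
\]

% Discretized (imaginary-time) path integral: the propagator between endpoints
% $x_0$ and $x_N$ is a weighted sum over the intermediate points
% $x_1,\dots,x_{N-1}$, i.e., over discrete paths, with the action $S$ playing
% a role analogous to the energy above.
\[
  \langle x_N \,|\, e^{-\beta H} \,|\, x_0 \rangle
  \;\approx\; \int \! dx_1 \cdots dx_{N-1}\;
  \prod_{k=0}^{N-1} e^{-S(x_{k+1},\,x_k)}.
\]

% In both expressions the summed-over intermediate variables ($h$ or the $x_k$)
% act as the "path elements"; learning adjusts the weights so the accumulated
% path contributions reproduce the desired x-to-y map.

\end{document}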