Our “machine learning with nonlinear waves” paper featured in Physics!

In his Physics Viewpoint “Riding Waves in Neuromorphic Computing,” Marios Mattheakis thoughtfully highlights our recent PRL paper on the artificial intelligence of nonlinear waves.

The Artificial Intelligence of Waves

In a paper published in Physical Review Letters, titled

Theory of Neuromorphic Computing by Waves: Machine Learning by Rogue Waves, Dispersive Shocks and Solitons

we study artificial neural networks with nonlinear waves as a computing reservoir. We discuss universality and the conditions to learn a dataset in terms of output channels and nonlinearity. A feed-forward three-layered model, with an encoding input layer, a wave layer, and a decoding readout, behaves as a conventional neural network in approximating mathematical functions, real-world datasets, and universal Boolean gates.
The rank of the transmission matrix has a fundamental role in assessing the learning abilities of the wave.
For a given set of training points, a threshold nonlinearity for universal interpolation exists. When considering the nonlinear Schrödinger equation, the use of highly nonlinear regimes implies that solitons, rogue waves, and shock waves play a leading role in training and computing. Our results may enable the realization of novel machine-learning devices using diverse physical systems, such as nonlinear optics, hydrodynamics, polaritonics, and Bose-Einstein condensates. The application of these concepts to photonics opens the way to a large class of accelerators and new computational paradigms. In complex wave systems, such as multimodal fibers, integrated optical circuits, random and topological devices, and metasurfaces, nonlinear waves can be employed to perform computation and solve complex combinatorial optimization problems.
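The three-layer scheme can be sketched numerically. The toy below is not the paper's actual implementation: the grid size, nonlinearity strength `g`, target function, and choice of readout channels are illustrative assumptions. It encodes scalar inputs into a Gaussian beam, evolves it with a split-step nonlinear Schrödinger solver (the wave layer), samples output intensities at fixed channels, and trains only a linear readout by least squares.

```python
import numpy as np

N = 256
X = np.linspace(-10, 10, N)                       # transverse grid
K = 2 * np.pi * np.fft.fftfreq(N, d=X[1] - X[0])  # spectral grid

def nlse_propagate(psi, z_steps=200, dz=0.01, g=10.0):
    """Split-step Fourier evolution of i psi_z + (1/2) psi_xx + g |psi|^2 psi = 0."""
    psi = psi.astype(complex)
    for _ in range(z_steps):
        psi = np.fft.ifft(np.exp(-0.5j * K**2 * dz) * np.fft.fft(psi))  # linear dispersion
        psi *= np.exp(1j * g * np.abs(psi) ** 2 * dz)                   # Kerr-like nonlinearity
    return psi

def encode(x):
    """Input layer: a Gaussian beam whose amplitude carries the scalar input x."""
    return x * np.exp(-X**2)

# Wave layer: propagate every encoded input and sample output intensities
# at a fixed set of readout channels.
xs = np.linspace(0.1, 1.0, 40)
channels = np.arange(0, N, 16)
features = np.array([np.abs(nlse_propagate(encode(x))[channels]) ** 2 for x in xs])

# Readout layer: a linear decoder trained by least squares on a target function.
target = np.sin(3 * np.pi * xs)
w, *_ = np.linalg.lstsq(features, target, rcond=None)
print("training mse:", float(np.mean((features @ w - target) ** 2)))
```

Note that all the trainable parameters sit in the readout vector `w`; the wave layer is fixed, which is exactly the reservoir-computing viewpoint, and the rank of the feature (transmission) matrix bounds what the linear readout can fit.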

The paper was selected as an Editors’ Suggestion and Featured in Physics.

See also

https://arxiv.org/abs/1912.07044

Minimizing large-scale Ising models with disorder and light: the “classical-optics advantage”

Since the 1980s, we have known how to build optical neural networks that simulate the Hopfield model, spin glasses, and related systems. New developments in optical technology and light control in random media clearly demonstrate the “optical advantage,” even when restricted to good old classical physics.

Scalable spin-glass optical simulator

Many developments in science and engineering depend on tackling complex optimizations on large scales. The challenge motivates an intense search for specific computing hardware that takes advantage of quantum features, stochastic elements, nonlinear dissipative dynamics, in-memory operations, or photonics. A paradigmatic optimization problem is finding low-energy states in classical spin systems with fully random interactions. To date, no alternative computing platform can address such spin-glass problems on a large scale. Here we propose and realize a scalable optical spin-glass simulator based on spatial light modulation and multiple light scattering. By tailoring optical transmission through a disordered medium, we optically accelerate the computation of the ground state of large spin networks with all-to-all random couplings. The scaling of the operation time with the problem size demonstrates an optical advantage over conventional computing. Our results provide a general route towards large-scale computing that exploits the speed, parallelism, and coherence of light.
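To make the target problem concrete, here is a conventional-computing baseline for the same task: single-spin-flip simulated annealing on a Sherrington-Kirkpatrick spin glass with all-to-all Gaussian random couplings. This is a generic reference algorithm, not the optical simulator itself; the system size, cooling schedule, and step count are illustrative assumptions.

```python
import numpy as np

def ising_energy(s, J):
    """Energy of spin configuration s in {-1,+1}^N with symmetric, zero-diagonal couplings J."""
    return -0.5 * s @ J @ s

def anneal(J, steps=20000, t_start=3.0, t_end=0.01, rng=None):
    """Single-spin-flip simulated annealing with a geometric cooling schedule."""
    rng = np.random.default_rng(rng)
    n = J.shape[0]
    s = rng.choice([-1, 1], size=n)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # current temperature
        i = rng.integers(n)
        dE = 2 * s[i] * (J[i] @ s)          # energy change from flipping spin i
        if dE < 0 or rng.random() < np.exp(-dE / t):
            s[i] = -s[i]                    # Metropolis acceptance
    return s, ising_energy(s, J)

# All-to-all Gaussian random couplings (Sherrington-Kirkpatrick model).
rng = np.random.default_rng(0)
n = 64
J = rng.normal(size=(n, n)) / np.sqrt(n)
J = (J + J.T) / 2
np.fill_diagonal(J, 0)

s, e = anneal(J, rng=1)
print("energy per spin:", e / n)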

arXiv:2006.00828

The Game of Light

In memoriam: John Horton Conway

In 1970, an article by Martin Gardner appeared in Scientific American disclosing for the first time a “game” invented by John H. Conway: a matrix of ones and zeros changes with time according to simple rules inspired by biology. Cells (ones) survive or die because of overpopulation or starvation. The simple rules surprisingly generate a variety of binary animals, named gliders, blocks, and spaceships, among others. With pen and paper, Conway demonstrated that complex dynamics spontaneously emerge in the game. Ultimately, Conway’s Game of Life turned out to be a universal Turing machine, and it is the most famous example of a cellular automaton.
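The rules above fit in a few lines of code. The sketch below is a standard minimal implementation on a toroidal grid (the grid size and the use of periodic boundaries are choices of this example, not part of Conway's original pen-and-paper formulation), and it checks the most famous binary animal: the glider, which reproduces itself shifted one cell diagonally every four generations.

```python
import numpy as np

def life_step(grid):
    """One generation of Conway's Game of Life on a toroidal grid of 0s and 1s."""
    # Count the eight neighbours of every cell by summing shifted copies of the grid.
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3; death otherwise.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

# A glider on an 8x8 torus.
grid = np.zeros((8, 8), dtype=int)
glider = [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]
for y, x in glider:
    grid[y, x] = 1

for _ in range(4):
    grid = life_step(grid)
print(int(grid.sum()))  # the glider keeps its five live cells, shifted by (1, 1)
```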

I was deeply inspired by the possibility of generating complexity with simple rules, like many others before me. For more than 50 years, Conway’s Game of Life has inspired generations of scientists. “Life” is at the inner core of ideas that nowadays pervade machine learning, evolutionary biology, quantum computing, and many other fields. It also connects to the work of Wolfram and the development of Mathematica.

I was intrigued by the interaction between light and complexity, and I wanted to combine the Game of Life with electromagnetic fields. I report below my original post on the topic (dating back to 2008). The article was rejected by many journals and finally published in a book dedicated to 50 years of the GOL (Game of Life Cellular Automata, Springer, 2010).

The Enlightened Game of Life (EGOL)

The link between light and the development of complex behavior is as subtle as it is evident. Examples include the moonlight-triggered mass spawning of hard corals in the Great Barrier Reef, or the light-switch hypothesis in evolutionary biology, which ascribes the Cambrian explosion of biodiversity to the development of vision. Electromagnetic (EM) radiation drastically alters complex systems, from physics (e.g., climate change) to biology (e.g., structural colors or bioluminescence). So far, the emphasis has been placed on biophysical, or digital, models of the evolution of the eye, with the aim of understanding the environmental influence on highly specialized organs. In this manuscript, we consider the way the appearance of photosensitivity affects the dynamics, the emergent properties, and the self-organization of a community of interacting agents, specifically, of cellular automata (CA).

A quick-and-dirty implementation of the EGOL in a Python Notebook:

https://github.com/nonlinearxwaves/gameoflife.git

Optimal noise in Ising machines

Ising machines are novel computing devices for the energy minimization of Ising models. These combinatorial optimization problems are of paramount importance for science and technology, but remain difficult to tackle on a large scale by conventional electronics. Recently, various photonics-based Ising machines demonstrated ultra-fast computing of the Ising ground state by data processing through multiple temporal or spatial optical channels. Experimental noise acts as a detrimental effect in many of these devices. On the contrary, here we demonstrate that an optimal noise level enhances the performance of spatial-photonic Ising machines on frustrated spin problems. By controlling the error rate at the detection, we introduce a noisy-feedback mechanism in an Ising machine based on spatial light modulation. We investigate the device performance on systems with hundreds of individually addressable spins with all-to-all couplings, and we find an increased success probability at a specific noise level. The optimal noise amplitude depends on graph properties and size, thus indicating an additional tunable parameter helpful in exploring complex energy landscapes and in avoiding trapping in local minima. The result points to noise as a resource for optical computing. This concept, which also holds in different nanophotonic neural networks, may be crucial in developing novel hardware with optics-enabled parallel architecture for large-scale optimizations.
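The noisy-feedback idea can be illustrated with a purely numerical caricature, not the photonic device itself: a greedy single-spin descent on a frustrated all-to-all ±1 coupling graph, where each flip decision is inverted with probability `noise`, mimicking errors at the detection stage. All sizes, sweep counts, and noise levels below are illustrative assumptions.

```python
import numpy as np

def noisy_minimize(J, noise, sweeps=300, rng=None):
    """Greedy single-spin descent whose flip decisions are corrupted
    with probability `noise`, mimicking detection errors in the feedback loop."""
    rng = np.random.default_rng(rng)
    n = J.shape[0]
    s = rng.choice([-1, 1], size=n)
    for _ in range(sweeps):
        for i in rng.permutation(n):
            flip = 2 * s[i] * (J[i] @ s) < 0      # True if flipping spin i lowers the energy
            if flip != (rng.random() < noise):    # a noise event inverts the decision
                s[i] = -s[i]
    return -0.5 * s @ J @ s

# Frustrated all-to-all couplings with random signs.
rng = np.random.default_rng(0)
n = 32
J = rng.choice([-1.0, 1.0], size=(n, n))
J = np.triu(J, 1)
J = J + J.T

# Scan the error rate: zero noise is purely greedy and can get stuck in local
# minima, while a moderate noise level lets the dynamics escape them.
for noise in (0.0, 0.02, 0.3):
    energies = [noisy_minimize(J, noise, rng=seed) for seed in range(10)]
    print(noise, min(energies))
```

The noise level plays the role of an effective temperature: too little and the system freezes in a local minimum, too much and it never settles, which is the intuition behind the existence of an optimal, graph-dependent noise amplitude reported in the paper.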

arXiv:2004.02208

Published in Nanophotonics

Noise-enhanced spatial-photonic Ising machine

See also

Large scale Ising machine by a spatial light modulator