Boson sampling solitons by quantum machine learning

https://arxiv.org/abs/2110.12379

We use a neural network variational ansatz to compute Gaussian quantum discrete solitons in an array of waveguides described by the quantum discrete nonlinear Schrödinger equation. By training the quantum machine learning model in phase space, we find different quantum soliton solutions as the number of particles and the interaction strength vary. The use of Gaussian states enables us to measure the degree of entanglement and the boson sampling patterns. We compute the probability of generating different particle pairs as the soliton features vary and reveal that bound states of discrete solitons emit correlated pairs of photons. These results may have a role in boson sampling experiments with nonlinear systems and in developing quantum processors to generate entangled many-photon nonlinear states.
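For illustration only, here is a minimal sketch (not the paper's code) of a variational calculation in phase space. It uses the simplest Gaussian ansatz, a product of coherent states, for the quantum discrete nonlinear Schrödinger (Bose-Hubbard) Hamiltonian, whereas the paper trains a neural-network representation of a general Gaussian state. All parameter values (lattice size, particle number, couplings) are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): the simplest Gaussian ansatz, a product of
# coherent states |alpha_n>, for the quantum discrete nonlinear Schroedinger (Bose-Hubbard)
# Hamiltonian  H = -kappa * sum_n (a_n^dag a_{n+1} + h.c.) + (U/2) * sum_n a_n^dag a_n^dag a_n a_n.
# For coherent states the energy is an explicit function of the amplitudes alpha_n,
# so gradient descent in phase space yields a mean-field soliton profile.
# All numbers below (lattice size, particle number, couplings) are illustrative assumptions.
import tensorflow as tf

M = 41                    # number of waveguides (assumed)
N = 10.0                  # mean total particle number (assumed)
kappa, U = 1.0, -0.5      # hopping and (attractive) interaction strength (assumed)

sites = tf.range(M, dtype=tf.float32) - M // 2
x = tf.Variable(tf.exp(-0.1 * sites ** 2))   # Re(alpha_n), localized initial guess
y = tf.Variable(tf.zeros(M))                 # Im(alpha_n)

def energy():
    """Coherent-state expectation value of H with the constraint sum_n |alpha_n|^2 = N."""
    n2 = x ** 2 + y ** 2
    scale = N / tf.reduce_sum(n2)            # rescale to fix the mean particle number
    xs, ys, n2 = x * tf.sqrt(scale), y * tf.sqrt(scale), n2 * scale
    hopping = -2.0 * kappa * tf.reduce_sum(xs[:-1] * xs[1:] + ys[:-1] * ys[1:])
    onsite = 0.5 * U * tf.reduce_sum(n2 ** 2)
    return hopping + onsite

opt = tf.keras.optimizers.Adam(learning_rate=0.05)
for step in range(2000):
    with tf.GradientTape() as tape:
        e = energy()
    opt.apply_gradients(zip(tape.gradient(e, [x, y]), [x, y]))

print("variational energy:", float(e))
```

With attractive interaction (U < 0) the optimized amplitudes localize into a discrete soliton profile; the full Gaussian ansatz of the paper additionally captures squeezing and entanglement, which this mean-field sketch cannot.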

Quantum machine learning and boson sampling

Training Gaussian boson sampling by quantum machine learning

Published in Quantum Machine Intelligence 3, 26 (2021)

Pseudocode

We use neural networks to represent the characteristic function of many-body Gaussian states in the quantum phase space. By a pullback mechanism, we model transformations due to unitary operators as linear layers that can be cascaded to simulate complex multi-particle processes. We use the layered neural networks to simulate non-classical light propagation in random interferometers and compute boson pattern probabilities by automatic differentiation. This is a viable strategy for training Gaussian boson sampling. We demonstrate that multi-particle events in Gaussian boson sampling can be optimized by a proper design and training of the neural network weights. The results are potentially useful for creating new sources and complex circuits for quantum technologies.
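As a rough companion to the pseudocode above, the following sketch illustrates the phase-space mechanics under stated assumptions: two single-mode squeezed vacua as input (hbar = 2 convention, so the vacuum covariance is the identity), a single beam-splitter layer, and the Gaussian characteristic function chi(k) = exp(-1/2 k^T sigma k + i d^T k) written as an ordinary function of the phase-space argument k. The unitary enters by pullback as a linear layer acting on k, and mode occupations follow from automatic differentiation of chi at k = 0. This is not the authors' implementation; the names and parameter values are assumptions.

```python
# Sketch (illustrative, not the paper's code): pullback of a Gaussian characteristic
# function through a linear-optics layer, with moments obtained by automatic
# differentiation. Quadrature ordering r = (x1, p1, x2, p2); hbar = 2 convention,
# so the vacuum covariance is the identity. All parameters are assumptions.
import math
import tensorflow as tf

s = 0.8                                            # input squeezing parameter (assumed)
sigma_in = tf.linalg.diag(tf.constant(
    [math.exp(-2 * s), math.exp(2 * s)] * 2, dtype=tf.float32))   # two squeezed vacua

c, sn = math.cos(math.pi / 4), math.sin(math.pi / 4)              # 50:50 beam splitter
S = tf.constant([[c, 0., -sn, 0.],
                 [0., c, 0., -sn],
                 [sn, 0., c, 0.],
                 [0., sn, 0., c]], dtype=tf.float32)

def chi_in(k):
    # Gaussian characteristic function, zero mean: chi(k) = exp(-1/2 k^T sigma k)
    return tf.exp(-0.5 * tf.reduce_sum(k * tf.linalg.matvec(sigma_in, k)))

def chi_out(k):
    # pullback: the interferometer acts as a linear layer on the phase-space argument
    return chi_in(tf.linalg.matvec(S, k, transpose_a=True))

# Second moments from the Hessian of chi at k = 0:
#   <r_a r_b> = -d^2 chi / dk_a dk_b |_{k=0},   <n_j> = (<x_j^2> + <p_j^2> - 2) / 4
k0 = tf.Variable(tf.zeros(4))
with tf.GradientTape() as t2:
    with tf.GradientTape() as t1:
        chi = chi_out(k0)
    grad = t1.gradient(chi, k0)
moments = -t2.jacobian(grad, k0)

n1 = (moments[0, 0] + moments[1, 1] - 2.0) / 4.0
n2 = (moments[2, 2] + moments[3, 3] - 2.0) / 4.0
print("mean photons per output mode:", float(n1), float(n2))   # each equals sinh(s)^2
```

In the paper, boson pattern probabilities are likewise obtained by automatic differentiation, and the linear layers are made trainable; the snippet above only extracts the lowest-order moments.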

https://doi.org/10.1007/s42484-021-00052-y

Code for multilevel quantum gates now available on GitHub

We have made available our Python and TensorFlow code for the machine learning design of multilevel quantum gates with reservoir computing.

GitHub Repository

See also

Phase space machine learning for multi-particle event optimization in Gaussian boson sampling

We use neural networks to represent the characteristic function of many-body Gaussian states in the quantum phase space. By a pullback mechanism, we model transformations due to unitary operators as linear layers that can be cascaded to simulate complex multi-particle processes. We use the layered neural networks to simulate non-classical light propagation in random interferometers and compute boson pattern probabilities by automatic differentiation. We also demonstrate that multi-particle events in Gaussian boson sampling can be optimized by a proper design and training of the neural network weights. The results are potentially useful for creating new sources and complex circuits for quantum technologies.
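As a toy illustration of the optimization step, the following sketch (again not the paper's code) propagates the covariance matrix directly through a beam-splitter layer with a trainable angle and maximizes, by gradient descent, the mean photon number in one output mode, a simple stand-in for the multi-particle event probabilities optimized in the paper; inputs and parameter values are assumptions.

```python
# Toy sketch (illustrative, not the paper's code): training a linear-optics layer by
# gradient descent. The Gaussian state is given by its covariance matrix (hbar = 2,
# vacuum = identity); the layer is a beam splitter with a trainable angle; the objective
# is the mean photon number in output mode 1, a stand-in for the pattern probabilities
# optimized in the paper. Input: two squeezed vacua with different squeezing (assumed).
import math
import tensorflow as tf

s1, s2 = 0.2, 1.0                      # input squeezing parameters (assumed)
sigma_in = tf.linalg.diag(tf.constant(
    [math.exp(-2 * s1), math.exp(2 * s1), math.exp(-2 * s2), math.exp(2 * s2)],
    dtype=tf.float32))                 # quadrature ordering (x1, p1, x2, p2)

theta = tf.Variable(0.1)               # trainable beam-splitter angle

def mean_n1():
    c, sn = tf.cos(theta), tf.sin(theta)
    z = tf.zeros_like(c)
    S = tf.stack([tf.stack([c, z, -sn, z]),
                  tf.stack([z, c, z, -sn]),
                  tf.stack([sn, z, c, z]),
                  tf.stack([z, sn, z, c])])
    sigma_out = S @ sigma_in @ tf.transpose(S)
    return (sigma_out[0, 0] + sigma_out[1, 1] - 2.0) / 4.0   # <n_1>

opt = tf.keras.optimizers.Adam(learning_rate=0.05)
for step in range(200):
    with tf.GradientTape() as tape:
        loss = -mean_n1()              # maximize the photon number in mode 1
    opt.apply_gradients(zip(tape.gradient(loss, [theta]), [theta]))

print("theta:", float(theta), "max <n_1>:", float(mean_n1()))
```

The optimizer rotates the layer so that the more strongly squeezed input is routed to mode 1; in the paper, the analogous gradient-based training acts on the weights of the neural-network layers representing the characteristic function.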

https://arxiv.org/abs/2102.12142

Official code