Exponential improvement in combinatorial optimization by hyperspins

Classical or quantum physical systems can simulate the Ising Hamiltonian for large-scale optimization and machine learning. However, devices such as quantum annealers and coherent Ising machines suffer an exponential drop in the probability of success in finite-size scaling. We show that by exploiting a high-dimensional embedding of the Ising Hamiltonian and subsequent dimensional annealing, this drop is counteracted by an exponential improvement in performance. Our analysis relies on extensive statistics of the convergence dynamics gathered by high-performance computing. We propose a realistic experimental implementation of the new annealing device using off-the-shelf coherent Ising machine technology. The hyperscaling heuristics can also be applied to other quantum or classical Ising machines by engineering nonlinear gain, loss, and non-local couplings.

Hyperscaling in the coherent hyperspin machine

https://arxiv.org/abs/2308.02329

Biosensing with free space whispering gallery mode microlasers

Highly accurate biosensors for few- or single-molecule detection play a central role in numerous key fields, such as healthcare and environmental monitoring. In the last decade, laser biosensors have been investigated as proofs of concept, and several technologies have been proposed. Here we demonstrate polymeric whispering gallery mode microlasers as biosensors for detecting small amounts of proteins, down to 400 pg. They have the advantage of working in free space, without any need for waveguiding for input excitation or output signal detection. The photonic microsensors can be easily patterned on microscope slides and operate in air and in solution. We estimate a sensitivity of up to 148 nm/RIU for three different protein dispersions. In addition, the sensing ability of passive spherical resonators in the presence of dielectric nanoparticles that mimic proteins is described by massive ab initio numerical simulations.

https://doi.org/10.1364/PRJ.477139

3D+1 Quantum Nonlocal Solitons with Gravitational Interaction

https://arxiv.org/abs/2202.10741

Nonlocal quantum fluids emerge as dark-matter models and as tools for quantum simulations and technologies. However, strongly nonlinear regimes, like those involving multi-dimensional self-localized solitary waves (nonlocal solitons), are only marginally explored as far as quantum features are concerned. We study the dynamics of 3D+1 solitons in the second-quantized nonlocal nonlinear Schroedinger equation. We theoretically investigate the quantum diffusion of the soliton center of mass and of the other soliton parameters, varying the interaction length. 3D+1 simulations of the Ito partial differential equations arising from the positive P-representation of the density matrix validate the theoretical analysis. The numerical results unveil the onset of non-Gaussian statistics of the soliton, which may signal quantum-gravitational effects and be a resource for quantum computing. The non-Gaussianity arises from the interplay of the quantum diffusion of the soliton parameters and the stable invariant propagation. The fluctuations and the non-Gaussianity are universal effects expected for any nonlocality and dimensionality.

To Python or not to Python, to C++ or not to C++ (with MATLAB, MPI, CUDA, and FORTRAN)

Programming in C is the best for scientific computing.

You certainly disagree; tools like MATLAB increase productivity.

I remember when I started working as a researcher, with a lot of computing. I was dealing with professors used to dinosaurs like FORTRAN or C. MATLAB was not used much; it was more than 20 years ago!

But MATLAB is a very professional tool, it works like a charm, and many scientific papers are done with MATLAB.

Nowadays, however, most students love and want to use Python, or more fashionable things like Julia or R. Python is a beauty, and it is a must for machine learning. Python is free (but many universities give their students access to MATLAB). Still, I do not find it very professional. You continuously need to tweak the code, install a missing package, or -worst of all- fight with filesystem permissions or access, because it is an interpreted language. MATLAB is also interpreted, but its ecosystem is stable and well integrated into operating systems like Windows, OSX, or Linux. Of the more than 200 papers I have written, only one so far used Python code.

At the end of the day, for professional figures, I use MATLAB (with some help from Illustrator or PowerPoint or Gimp, etc.). Many of the codes in my papers are written in MATLAB, as in the recent work on neuromorphic computing with waves. Also, the MATLAB deep learning toolbox is valuable.

I have written some papers on parallel computing, mainly using the MPI protocol. In the beginning, I used FORTRAN for MPI, but later (nearly 15 years ago) I switched to C++. I am still writing codes with MPI and C++, and I am practicing CUDA. You can use CUDA from Python, but you only really understand CUDA through C++.

But as I get into the details of a code (and as I age), improving or optimizing it, I realize that I am progressively switching back to C. Just pure C (sic!). The reason is that, at a lower programming level, I have better control of what the code does, and I can better understand the side effects of a given routine. In C, dealing with complex variables or arrays is clearer to me, despite being much more complicated (and using pointers makes you feel a lot smarter!).
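To give a flavor of what I mean, here is a tiny made-up example (not from any paper of mine) of a complex array handled through a plain pointer in C:

#include <complex.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const int n = 8;
    double complex *psi = malloc(n * sizeof *psi);           /* a small complex field */
    for (int j = 0; j < n; j++)
        psi[j] = cexp(2.0 * 3.141592653589793 * I * j / n);  /* unit-modulus phases */
    double complex sum = 0.0;
    for (double complex *p = psi; p < psi + n; p++)          /* walk the array with a pointer */
        sum += *p;
    printf("|sum| = %g\n", cabs(sum));                       /* sums the n-th roots of unity, so ~0 */
    free(psi);
    return 0;
}

Compile with something like gcc file.c -lm: everything that happens to psi is in plain sight, which is exactly the kind of control I am talking about.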

As a side effect, the code is simpler to read and understand, but much less cool and modern. Even so, I have to admit that maintaining a code in C++ is still more efficient for me than in FORTRAN or in C. Notably enough, my last FORTRAN paper is dated 2017!

I am not a boomer, so you cannot say “ok boomer”, but I think that this Python-mania is not the best for scientific computing, and not the best for students. It is not only an issue of speed (and obviously C is the fastest, with FORTRAN a good competitor). It is also a matter of how to learn to write programs for scientific computing from scratch. For me, the best way to learn and practice scientific computing is still a wise combination of C and C++, with MATLAB for visualization, while Python (and TensorFlow and all of that) gives its best in machine learning.

Docker, mpi, fftw, fftw-mpi

Docker enables you to create containers for your program with all the libraries installed.

This avoids reinstalling all the libraries (say mpich, fftw…) for every user and on every new system.

The user just needs to pull the container from a repository, for example nonlinearxwaves/base.

I write C++ scientific computing programs with mpich, fftw-mpi, and random number libraries (such as sprng5), which I need to run on both Windows and Linux systems. Docker simplifies a lot not only the deployment but also the development of the code.
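To give an idea of the kind of code involved, here is a minimal sketch of a distributed 2D FFT with the fftw-mpi interface (a toy written in plain C; the grid size and everything else are placeholders, not taken from my actual codes):

#include <stddef.h>
#include <mpi.h>
#include <fftw3-mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    fftw_mpi_init();
    const ptrdiff_t N0 = 64, N1 = 64;                 /* global 2D grid */
    ptrdiff_t local_n0, local_0_start;
    /* how many rows of the grid this rank owns */
    ptrdiff_t alloc_local = fftw_mpi_local_size_2d(N0, N1, MPI_COMM_WORLD,
                                                   &local_n0, &local_0_start);
    fftw_complex *data = fftw_alloc_complex(alloc_local);
    fftw_plan plan = fftw_mpi_plan_dft_2d(N0, N1, data, data, MPI_COMM_WORLD,
                                          FFTW_FORWARD, FFTW_ESTIMATE);
    /* fill the local slab with a simple ramp */
    for (ptrdiff_t i = 0; i < local_n0; i++)
        for (ptrdiff_t j = 0; j < N1; j++) {
            data[i * N1 + j][0] = (double)(local_0_start + i);  /* real part */
            data[i * N1 + j][1] = 0.0;                          /* imaginary part */
        }
    fftw_execute(plan);                               /* the distributed transform */
    fftw_destroy_plan(plan);
    fftw_free(data);
    MPI_Finalize();
    return 0;
}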

nonlinearxwaves/base is a container image with all of that.

After installing Docker you run

docker login

Then you pull the docker image

docker pull nonlinearxwaves/base:0.1

You list the available images with

docker images -a

You identify the image id (in this example it is ec56f7250d5a)

REPOSITORY             TAG                 IMAGE ID            CREATED             SIZE
nonlinearxwaves/base   0.1                 ec56f7250d5a        42 hours ago        1.13GB

You run the image with (you must replace the image id with your image id)

docker run -i -t ec56f7250d5a 
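Equivalently, you can refer to the image by repository name and tag instead of its id:

docker run -i -t nonlinearxwaves/base:0.1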

And you are in a shell with all the libraries installed, and you can compile and run your MPI application in the usual way (see the example below). In this image you will be the user “user”

user@2ff281ad4621:~$

The number 2ff281ad4621 is the id of the container that is now running (similar to a virtual machine)
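Inside the container, the usual compile-and-run cycle could look like this (assuming the sketch above is saved as, say, testfftw.c; the exact linker flags depend on how the libraries are installed in the image):

mpicc testfftw.c -o testfftw -lfftw3_mpi -lfftw3 -lm

mpirun -np 4 ./testfftw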

This works on Windows and Linux (and also on Mac, but I did not test it)

You may also create your own images with a Dockerfile
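A minimal Dockerfile for this kind of image could look like the sketch below (these are Ubuntu package names, which may differ on other distributions; sprng5 usually needs a manual build and is omitted here):

FROM ubuntu:22.04
# compilers, MPI, and FFTW with MPI support
RUN apt-get update && apt-get install -y build-essential mpich libmpich-dev libfftw3-dev libfftw3-mpi-dev
# work as a non-root user, as in nonlinearxwaves/base
RUN useradd -m user
USER user
WORKDIR /home/user

You then build it with docker build -t <yourname>/base:0.1 . and push it to your repository.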

Is Docker fast? Or is it better not to use a container? We will test …