UK team indicates over 50 photons needed for Boson Sampling Quantum Supremacy

by Brian Wang, NextBigFuture.com

Scott Aaronson wrote about Chinese quantum computing researchers achieving 5-photon boson sampling and about the theoretical work of a Bristol team.

The Bristol team gives numerical evidence that BosonSampling, with n photons and m modes, can be approximately simulated by a classical computer in "merely" about n·2^n time (that is, the time needed to calculate a single n×n permanent), as opposed to the roughly m^n time that one would need if one had to calculate permanents corresponding to all the possible outcomes of the experiment. As a consequence, they argue that achieving quantum supremacy via BosonSampling would probably require at least ~50 photons, which would in turn require a "step change" in technology, as they put it.
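The "one n×n permanent" cost comes from Ryser's formula. Below is a minimal Python sketch, not the Bristol group's code, showing why a single outcome's probability amplitude is exponentially cheap in n (roughly 2^n work) compared with enumerating all ~m^n possible outcomes.

```python
from itertools import combinations

def permanent_ryser(A):
    """Permanent of an n x n matrix A (list of lists) via Ryser's formula.

    This straightforward version does O(n^2 * 2^n) arithmetic; with
    Gray-code ordering of subsets it drops to O(n * 2^n). Either way,
    it is the cost of a *single* outcome, not of the whole distribution.
    """
    n = len(A)
    total = 0
    # Sum over all non-empty subsets S of the column indices.
    for size in range(1, n + 1):
        for cols in combinations(range(n), size):
            row_prod = 1
            for i in range(n):
                row_prod *= sum(A[i][j] for j in cols)
            total += (-1) ** size * row_prod
    return (-1) ** n * total

# Example: permanent of the 2x2 all-ones matrix is 2.
print(permanent_ryser([[1, 1], [1, 1]]))
```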


Scott Aaronson completely agrees with the Bristol group's view of the asymptotics. Alex Arkhipov and Aaronson repeatedly told experimentalists, in their papers and talks about BosonSampling, that the classical complexity of the problem should only be taken to scale like 2^n, rather than like m^n. Despite not having a general proof that the problem could actually be solved in ~2^n time in the worst case, they said that for two main reasons:

* Even under the most optimistic assumptions, their hardness reductions, from Gaussian permanent estimation and so forth, only yielded ~2^n hardness, not ~m^n hardness.
* If the BosonSampling matrix is Haar-random, or otherwise not too skewed toward producing outcomes with huge probabilities, then it is not hard to see that approximate BosonSampling can be done classically in O(n·2^n) time by using rejection sampling (a minimal sketch of the idea follows below).
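To make the rejection-sampling point concrete, here is a generic, hedged Python sketch. The callables target_prob, proposal_sample, and proposal_prob and the constant bound are placeholders introduced here for illustration; the argument above only needs the target probability of each candidate outcome to be computable with one ~2^n permanent evaluation and the outcome probabilities not to be too peaked.

```python
import random

def rejection_sample(target_prob, proposal_sample, proposal_prob, bound,
                     max_tries=100_000):
    """Generic rejection sampling.

    Draws candidates from `proposal_sample` and accepts each with
    probability target_prob(x) / (bound * proposal_prob(x)), where
    `bound` is an assumed upper limit on that ratio. For boson sampling,
    target_prob(x) would be the permanent-weighted outcome probability
    (one ~2^n permanent per candidate) and the proposal a cheap classical
    distribution; both are placeholders here.
    """
    for _ in range(max_tries):
        x = proposal_sample()
        if random.random() * bound * proposal_prob(x) <= target_prob(x):
            return x
    raise RuntimeError("no sample accepted; the bound may be too loose")
```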

The Bristol group has posted an arXiv preprint, "No imminent quantum supremacy by boson sampling."

It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate this quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently awesome challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far are with 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology.
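For context on the classical algorithm named in the abstract, below is a hedged Python sketch of Metropolised independence sampling, the general MCMC scheme the Bristol paper builds on. The callables target_prob, proposal_sample, and proposal_prob are placeholders: in the paper the target is the permanent-weighted boson sampling distribution and the proposal is a cheaper classical distribution, but the specifics here are illustrative assumptions, not the authors' implementation.

```python
import random

def metropolised_independence_sampling(target_prob, proposal_sample,
                                       proposal_prob, n_steps, x0):
    """Metropolised independence sampling (MIS).

    A Metropolis-Hastings chain whose proposal is drawn independently of
    the current state. The initial state x0 must have nonzero target
    probability.
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        y = proposal_sample()
        # Acceptance ratio for an independence sampler.
        accept = min(1.0, (target_prob(y) * proposal_prob(x)) /
                          (target_prob(x) * proposal_prob(y)))
        if random.random() < accept:
            x = y
        samples.append(x)
    return samples
```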
