
Anyone know if this is for real?

An Optical Solution For The Traveling Salesman Problem

They're claiming an O(N^N) solution to a problem that is NP-complete. According to the abstract:

We introduce an optical method based on white light interferometry in order to solve the well-known NP–complete traveling salesman problem. To our knowledge it is the first time that a method for the reduction of non–polynomial time to quadratic time has been proposed. We will show that this achievement is limited by the number of available photons for solving the problem. It will turn out that this number of photons is proportional to N^N for a traveling salesman problem with N cities and that for large numbers of cities the method in practice therefore is limited by the signal–to–noise ratio. The proposed method is meant purely as a gedankenexperiment.

This is either wrong, or (as Ken MacLeod put it) "we're all doomed". Anyone want to give it a read-through and tell me if it's plausible but impractical, implausible, or plausible and probably capable of being implemented?



60 Comments

1:

Heh, I'm not really qualified to say if it's legit or not, but I was wondering what exactly the issue is if it works. Is there some type of encryption that this would obsolete? Does it obsolete humans or something?

2:

Nor am I qualified, but I have to wonder - have you saved and printed a copy yet, just in case?

3:


It's very clever, and one could actually build such a thing to solve TSPs for small N, but it would fail well before N = 50 or so due to thermal noise. No cryptography broken. No squamous or rugose beasties summoned.

4:

I'm not an expert in quantum computing (or "wave-optical computing" as the authors call it), but it looks like a plausible algorithm that's far too difficult to implement. They seem to be taking advantage of some quantum-mechanical features to get it to work. Not sure how quantum computing fits into the P=NP debate though :).

5:

N^N photons is rather a lot[1], even assuming you can work at the level of individual photons with perfect lossless elements (real detectors aren't, real fibres aren't). Further, you suddenly have to cope with this fundamentally being an approximate analogue solution[2] *and* the fact that the frankly crazy setup time of such a system is handwaved.

Cute, but a gedankenexperiment, as they say.

[1] Particularly when you consider the heating effects of pumping that much energy into such a system; the errors from thermal expansion alone would be truly painful.

[2] Regrettably, even interference effects have a fundamental width. 'Interference or no interference' as a clean binary is neatly in the category of 'lies to children'.

6:

Quantum? No. This is about as purely classical as it gets :)

7:

While I haven't checked this out too thoroughly yet, the typical problem with using interferometry as a quantum computer (which is what they're doing) is that, while it scales well in time, it scales very poorly in space - you typically need some sort of beam splitter for each qubit, for each step in the algorithm. In this case, the algorithm is short, but for any number of cities where the calculation would be meaningful, the number of optical paths required becomes very large.

8:

Optical paths scale as O(N^2) for the fully connected network. This isn't about qubits, there's nothing bitty or digital here.

9:

Encryption relies, to a certain extent, on the fact that problems such as the traveling salesman are hard to solve (NP class of problems). If you are able to 'bend' them into an easier form (hey, Maths was never my strong point) then it's open season in the loony bin.

M

10:

I've been in the pub most of the evening so this may be nonsense, but can't any NP-hard problem be solved in polynomial time if you can get O(2^N) computers to each try a different solution in parallel? Isn't this a variant on the same idea, pretty much like DNA computing with a vat full of bugs? Ultimately 2^N scales so rapidly that you run out of silicon/bugs/energy to make photons, and then you're back to trying to work with limited resources again and wishing P = NP.
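
For a feel for what's being farmed out, here's a toy Python sketch (my own, nothing to do with the paper) of the brute-force enumeration: each of the (N-1)! candidate tours is an independent check you could hand to a separate machine, a separate vat of bugs, or a separate bundle of photons.

    # Brute-force TSP: enumerate every tour and keep the shortest.
    # Fine for N ~ 10 cities; hopeless long before N ~ 50.
    from itertools import permutations

    def tsp_brute_force(dist):
        """dist[i][j] = distance from city i to city j."""
        n = len(dist)
        best = (float("inf"), None)
        # Fix city 0 as the start so we don't count rotations of the same tour.
        for perm in permutations(range(1, n)):      # (n-1)! candidates
            tour = (0,) + perm + (0,)
            length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
            best = min(best, (length, tour))
        return best

    # Four cities, symmetric distances:
    d = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    print(tsp_brute_force(d))    # (18, (0, 1, 3, 2, 0))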

11:

I didn't read the article, but the abstract alone doesn't seem airtight to me. First, NP refers to "non-deterministic polynomial problems", not "non-polynomial problems". The difference is important, since NP describes a set of problems which could be solved in polynomial time, provided you made the right choice at each choice point (hence the common fantasy that a quantum computer, working as an oracle, could just walk all over NP problems). Second, N^N does not qualify as polynomial (so forget about quadratic), and the whole thing goes down in flames even before you finish the abstract. I'll read it and check back if there's anything to it though.

12:

Apparently the method is polynomial in time, but exponential in energy - so not really an improvement over existing methods for large enough N

13:

heh, I'm not really qualified to say if it's legit or not, but, I was wondering what exactly the issue is if it works.

Our host is noted for some minor scribblings. You might find the short story "Antibodies" interesting.

14:

Apparently the method is polynomial in time, but exponential in energy ...

Ah, so that's what the short duration GRBs are!

15:

*slaps forehead*

And if you look very closely at the way they define the delays for each city (chosen so you can easily tell when a path goes through each city exactly once, ignoring loops, of course), it's x 2^i + d for each city i. In effect, you've got another exponential term right there, so it's again exponential in fibre size, scanning length and real honest-to-god time-type delay.

Almost missed that.
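
For anyone who wants to see why that coding works at all, here's a toy Python illustration (my reading of the trick, not the authors' notation): give each city i an extra delay of 2^i, and a length-N path picks up the 'signature' total of 2^N - 1 exactly when it visits every city once.

    # Toy version of the powers-of-two delay coding.  Each city i adds an
    # extra delay of 2**i on top of the real travel time, so a path of N
    # hops that visits every city exactly once carries a signature delay
    # of 2**N - 1; any length-N path that skips or repeats a city doesn't.
    N = 5
    signature = 2**N - 1

    def coding_delay(path):
        return sum(2**i for i in path)

    print(coding_delay([0, 1, 2, 3, 4]) == signature)   # True: a valid tour
    print(coding_delay([0, 1, 1, 3, 4]) == signature)   # False: city 1 twice, city 2 never
    # The catch, as noted above: the delays themselves grow exponentially,
    # so the coding buys detection at the price of exponential path length.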

16:

If you allow QC solutions of P = NP, then it has been solved for a while.

Shor's algorithm factors integers in O((log(N))^3) time and O(log(N)) space. Factoring integers is an NP-hard problem. (And of course is directly related to lots of encryption algorithms now, including AFAIK all public/private key algorithms).

Grover's algorithm (which solves a generic NP search problem) apparently has been proved to be the best possible solution for that problem. Grover's algorithm does not run in polynomial time.

I haven't seen any serious discussion on what it means that in quantum computing one NP problem (dictionary lookup) cannot be solved in polynomial time, while another (factoring) can. It seems that quantum computers fall somewhere between deterministic and non-deterministic Turing machines.

17:

Extending on the comment by Arthur Chance, the real issue here is a confusion between 'clock time' and 'computational time'.

NP-complete problems require (barring a P = NP proof) a certain number of algorithmic operations, which we tend to shorthand as 'time'. Conveniently for us, "stuff" tends to evolve, so we can map our computational needs to physical systems, which are evolving at some rate dictated by physics, and generating "operations" for us. So we get 'clock time', or 'wall time'. If we are very clever, we can map our algorithms to physical systems which evolve very quickly, so we get more operations for the same amount of time; and if we are even more clever, we can map them to many physical systems at the same time.

This solution maps an NP-complete problem onto a very large physical system. It is inaccurate to view the photons in the system as storage; rather, you should view them as computational elements, and hence as a very, very large computer.

This is interesting, not from a math perspective, because it doesn't yield any movement on the P = NP front, but from a computational mapping perspective, because it hints at ways of building systems which generate many fast interacting operations.

It also raises an interesting question about limited oracles. It is not true that this system would only permit you to solve problems up to size N (where N was the size you built the system for). An oracle which can solve a sub-problem can be used to greatly speed up the solution to a larger problem, and at the very least the resulting approximations would be much better than those available without the oracle.

18:

Even though my computer science studies are now part of the distant past, until I read comment #16 I felt urged to write something about P, NP and non-deterministic machines. Bobby said it all.

19:

The proposed technique is an analog computer. The idea that analog computers might solve NP problems in polynomial time has been around at least since Dewdney (1984), which lists many analog proposals and their flaws. See the entry on "Quantum Computing" in the Stanford Encyclopedia of Philosophy (plato.stanford.edu), especially the section "Physical 'Short-cuts' of Computation": "It thus appears that all these models cannot serve as counter-examples to the physical Church-Turing thesis (as far as complexity is concerned) since they all require some exponential physical resource."

Dewdney, A. K. (1984), 'On the spaghetti computer and other analog gadgets for problem solving', Scientific American 250(6), 19-26.

20:

Quantum Computing expert Scott Aaronson, who recently accepted an MIT faculty position, comments:

http://scottaaronson.com/blog/?p=260#comments

# Scott Says:
Comment #78, posted on August 7th, 2007 at 18:54

Yes. The authors themselves kindly explain in the abstract why their proposal doesn’t actually work:

It will turn out that this number of photons is proportional to N^N for a traveling salesman problem with N cities and that for large numbers of cities the method in practice therefore is limited by the signal–to–noise ratio. The proposed method is meant purely as a gedankenexperiment.

So unless other people start misinterpreting them, I don’t see any need to man the trenches on this one.

21:

Bobby@16: you're wrong. Factoring is not known to be NP-hard. Quantum computers are not known to solve NP-hard problems efficiently. See Shtetl-Optimized.

22:

So it's just another hairy smoking golfball?

23:

It looks like they have replaced exponential time with exponential photons - which are much easier to get hold of, but of course come with severe engineering problems.

Charlie@14: so yes they might be root attacks on the structure of the universe ;-)

24:

Charlie@14: so yes they might be root attacks on the structure of the universe ;-)

Wonderful. I know just enough to know I don't know whether to be nervous or not.

25:

We are, after all, only three weeks from CASE NIGHTMARE GREEN... so I'd be very afraid of anyone who's spotted buying loads of optical splitters...

26:

Charlie's comment at #14 needs to be seen as an early contender for Best Blog Comment Hugo.

27:

I know just enough to know I don't know whether to be nervous or not.

If it's quantum, why not be both?

28:

Charlie @ 14

The short Gamma Bursters are the light-like surfaces of the light cones of those alien civilizations foolish enough to try to perform this experiment. There are some things sentient beings were not meant to know.

29:

Peter Turney @ 19

As I understand it (I'm an amateur in this area, so please tell me if I've got the broom by the straws), QC computation gets its power over some NP problems in the same way these analog computers do (at least in one way of looking at it). The operators which represent the transforms used to perform logical operations in a quantum computer must all be unitary and linear. So a quantum computer is basically an analog computer, operating linearly over the phase of a wavefunction up until measurement. Measurement plays the same role in the uncertainty of the solution as the signal-to-noise ratio does in the paper we're discussing, because determining the result to higher certainty requires the entire computation to be performed more times.

30:

Bruce @ 27: I often wonder about the visible effects of space wars. You would think we might see a few more explosions from time to time. You know, stars that are of the wrong type to go supernova, planets blown up, warp drives gone bad... I think we're alone. Very, very alone.


Jeff

31:

Bruce Cohen @ 27:

Sentient beings? We are talking Great Old Gods here!

32:

Special-purpose analog computers have long been known to be able to solve problems in less time than classical (Turing-equivalent) computers. The definitions of P and NP are based on Turing-equivalent machines.

Once you drop that restriction, all kinds of interesting special cases show up. For instance, I remember reading ages ago that you can sort in less than O(n log n) time:

1. Machine a bunch of rigid rods to the length of the numbers you wish to sort (with labels) - that's O(n) time.
2. Line them up together, then whack them all on one end with something suitably large (constant time).

3. Presto, the longest one is sticking out the most, and you can pull them all off in order, which is O(n) time again!

For large data sets, this is best done in the asteroid belt using something like Ceres as your hammer, but in principle it works - much like this solution.
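
Of course, simulated on an ordinary serial machine the trick collapses back into plain selection-by-maximum, which is rather the point: the physics does the comparisons for free. A toy sketch (mine, just to illustrate):

    # "Rod sort" simulated serially: cut one rod per value (O(n)), then
    # repeatedly pull off the one sticking out the most.  Without the
    # physics doing the "whack" in parallel, finding the longest rod costs
    # O(n) per pull, so the serial simulation is O(n^2) -- the
    # sub-O(n log n) claim belongs entirely to the hardware.
    def rod_sort(values):
        rods = list(values)          # machine the rods
        out = []
        while rods:
            longest = max(rods)      # the rod sticking out furthest
            rods.remove(longest)     # pull it off
            out.append(longest)
        return out                   # longest first

    print(rod_sort([3, 1, 4, 1, 5, 9, 2, 6]))   # [9, 6, 5, 4, 3, 2, 1, 1]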

33:

Ah, Martin Gardner's spaghetti sorter! I think he did that one in '76 or so. But it's a nice perspective on the P!=NP problem; much like squaring the circle or trisecting an angle, the meat isn't in the construction itself (which is actually rather easily done), it's in the limitation of the tools you are allowed to use. In one sense, the all-but-certain proof of the conjecture is not much more earth-shaking than the fact that the space of L^1 is not complete.

Speaking of Earth-, er, galaxy-, no Universe-shaking problems; along the lines of Charlie's solution to the puzzle of GRB's, I'm guessing that the accelerating universe is caused by pollution.

Yes, you guessed it, it's all those darn negative-energy constructs. Holding wormholes open to provide advanced connectivities for computing structures - ptui, I spit on SOL limitations - is just too tempting. So what if that changes the geometry of the universe from flat to hyperbolic? It's all about the competition - like those cut-your-heart-out-and-sell-it-back-to-you, those sons-of-bitches in the Bootes Consortium. And besides the Magellanic Trust isn't doing anything that other advanced civilizations haven't done first, a long time ago, and a lot more of; let them other guys cut down on their negative pollution and then maybe we'll think about some universe-friendly alternatives as well . . .

34:

Jeremy@21:
Yes, I was wrong. However, factoring *is* known to be NP-hard according to any reference I can find on it. For example, http://www.math.niu.edu/~rusin/known-math/98/complexity_factor and the Wikipedia entry on Integer Factorization.

The reason I was wrong is that Shor's algorithm isn't considered to solve factorization in polynomial time. The reason Shor's algorithm isn't considered to solve it in P time, from what I can tell, is just that Shor's is probabilistic. Peter Shor has some comments that allude to that on the link you gave (thanks).

I'm not sure what that really means in a practical sense. If I can find the solution to the problem (factorization) to any desired degree of confidence in polynomial time, while that doesn't demonstrate that all other problems in NP can be solved that way, it certainly doesn't inspire confidence. And of course it means that factorization as a means of encryption is broken, once QC becomes available.

35:

Some quick googling tells me that a photon of visible light averages 1.8eV. 50^50 of 'em therefore represents 2.5*10^66 Joules of energy. That's, umm, quite a lot.

36:

Could you be green with these photons and recycle them?

37:

ScentOfViolets, see David Brin's short story _An Ever-Reddening Glow_, which has pretty much exactly this premise.

38:

I just have to say, threads like this one make the world a better place.

39:

I thought I'd share the topic with some more brain power courtesy of sff.net:

Looks reasonable to me. When you read the actual paper, the time it takes this analog computer to solve the problem goes as N^2, but the number of photons it takes goes as N^N. So since the computational element is actually the photons, the resources needed to solve the problem is *still* more than polynomial in N.

In any case, there's no theorem that an analog computer can't solve NP problems faster than a Turing machine.

I presume the "we're all dead" comment means that encryption based on factoring won't be unbreakable. This is because if you can solve one NP problem with a Turing machine, you can solve all of them. Still, it seems a bit of hyperbole.

--
Geoffrey A. Landis
http://www.sff.net/people/geoffrey.landis

40:

I'm sure that Brin is a very good writer, but he's not to my taste; could you describe the plot? Is it economics in the sense of the 'free energy' in Asimov's "The Gods Themselves"? I don't really consider that competitive economics. To bring this around full circle, in Charlie's 'Merchants' series, in one alternate earth, the people are suffering the effects of global warming. But unlike us, they really don't have any alternatives, and they really can't stop either. Makes you wonder just how close we really came to extinction back then; suppose we had no Maxwell, Planck, et al, or that they did not have the organizational support that they did?

41:

They haven't traded exponential time for exponential anything. They've kept exponential time, and added exponential energy and exponential physical space (as opposed to memory space).

Read carefully, and you'll see that their physical encoding puts an exponentially increasing delay proportional to 2^i on each city, where i is the index of the city, ranging up to N. So ANY round trip between N cities will require a delay of 2^0 + 2^1 + 2^2 + 2^3 + ... + 2^N = 2^(N+1) - 1, in addition to whatever actual delay you incur according to the path.

The straightforward way to introduce that delay is to add physical size, although I suppose there might be some exotic pulse-slowing physics that would let you get around that. You're still stuck with the delay, though, which scales... say it with me... exponentially with the number of cities.

Other delay-codings, as they note, are possible, but it is not obvious to me that non-exponential codings are always available, nor that they are trivial to construct when they are. Prove that there is always a non-exponential delay-coding available, which can be found in non-exponential time, and you've got something. Otherwise, you can't trade an exponential number of operations for fewer operations that are exponentially farther apart in time and claim you've reduced the time-complexity of the problem.

Bruce Murphy at 15 noticed this as well; I'm just saying it in more words.

42:

...And let's put that in perspective. Let's assume that there is no exotic physics, and the delays are realized through length. Let's further assume that the delay for city index 0 is 1 angstrom.

The total sum of city delays will be 2^N - 1 angstroms (I erred slightly above.) Thus, for a hundred cities, we're talking about 2^100 angstroms, or ~1.27 x 10^17 kilometers. That is more than 800 million trips between the sun and the earth, which at lightspeed works out to over 13,000 years of delay.

Double that for 101 cities.

Etc.
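
A quick back-of-the-envelope check in Python (my constants, rounded):

    # Total coding delay of 2**N - 1 angstroms, realised as physical path length.
    ANGSTROM = 1e-10        # metres
    AU       = 1.496e11     # metres, mean sun-earth distance
    C        = 2.998e8      # metres per second
    YEAR     = 3.156e7      # seconds

    N = 100
    length = (2**N - 1) * ANGSTROM           # metres of fibre needed
    print(length / 1e3)                      # ~1.27e17 km
    print(length / AU)                       # ~8.5e8 sun-earth trips
    print(length / C / YEAR)                 # ~13,400 years of delay at lightspeed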

43:

Nix - Unfortunately I'm not willing to pay £100 for Tomorrow Happens.

44:

15 Responses to “Intellectual whack-a-mole”
http://scottaaronson.com/blog/?p=261#comments

by Scott Aaronson et al. on An Optical Solution For The Traveling Salesman Problem

I'm not trying to draw readers away from Mr. Stross; I'd prefer that readers cross-fertilize. After dinner and drinks and a movie, of course.

45:

Thanks, ScentofViolets! I thought it was Martin Gardner, but wasn't sure.

46:

I now have this weird mental image of a computer being hurled by a trebuchet, as a weapon of mass destruction.

47:

I read a related paper in 2006; the author describes a similar mechanism for the Hamiltonian path problem. The paper was called:

Solving the Hamiltonian path problem with a light-based computer

http://www.cs.ubbcluj.ro/~moltean/nc_oltean.pdf

48:

#46- I know some people with a trebuchet, and if all goes according to plan should have one here in Edinburgh next year or the year after.

49:

All we need to solve this kind of problem is to be able to send information back in time. I'll get right on building such a machine.

50:

10: you're basically correct. In fact, you can see just from the bound in the abstract that this is basically what they're doing.

There are going to be N! paths a travelling salesman can take, and the authors say they need O(N^N) photons to do it, which is the tell -- N^N is the leading term in Stirling's approximation to N!. So they're (conceptually) building an optical system in which they can explore paths in parallel, and if you shoot enough photons through it you can get a fast answer.
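
For the sceptical, a quick numerical check (Stirling gives N! ~ sqrt(2*pi*N) * (N/e)^N, so N^N is the dominant factor, give or take an exponentially smaller e^-N; my toy code, not from the paper):

    # Compare log10 of N!, of N**N, and of the Stirling estimate.
    import math

    for N in (10, 20, 50):
        log10_fact     = math.lgamma(N + 1) / math.log(10)    # log10(N!)
        log10_NN       = N * math.log10(N)                     # log10(N**N)
        log10_stirling = (0.5 * math.log(2 * math.pi * N)
                          + N * (math.log(N) - 1)) / math.log(10)
        print(N, round(log10_fact, 2), round(log10_NN, 2), round(log10_stirling, 2))
    # N = 50: log10(50!) ~ 64.48, log10(50**50) ~ 84.95, Stirling ~ 64.48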

51:

This idea is not so new. It has been around since 2006.

See here:

http://www.cs.ubbcluj.ro/~moltean/optical


I quote from their 2006 paper:

"In this paper we suggest the use of light for performing useful computations. Namely, we propose a special device which uses light rays for solving the Hamiltonian path problem on a directed graph. The device has a graph-like representation and the light is traversing it following the routes given by the connections between nodes. In each node the rays are uniquely marked so that they can be easily identified. At the destination node we will search only for particular rays that have passed only once through each node. We show that the proposed device can solve small and medium instances of the problem in reasonable time."

52:

35 said: "Some quick googling tells me that a photon of visible light averages 1.8eV. 50^50 of 'em therefore represents 2.5*10^66 Joules of energy. That's, umm, quite a lot."

To put this in perspective, you can use E=mc^2 to get a mass-equivalent for this amount of energy. Dividing by the speed of light squared, you get a mass of roughly 3 x 10^49 kilograms. This amount of mass will form a black hole if it is confined to a region smaller than its Schwarzschild radius, which you can work out using the relation

R = 2Gm/c^2

which turns out to be about 4 x 10^22 meters, or over four million light years, which is considerably bigger than the galaxy. Because nothing can go faster than light, the computation will thus take at least four million years or so.

Worse still, this radius scales linearly with the number of photons, so it scales as N^N. Let's work out how bad it gets for N = 100. (100^100)/(50^50) is about 10^115, so for N = 100 the computation would take a minimum of something like 5 x 10^121 years.

This illustrates why this method takes exponential time. This analysis is detailed on Scott Aaronson's blog.
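
For anyone who wants to redo the arithmetic, a quick Python check (rounded constants, using the 1.8 eV per photon figure from #35):

    # Back-of-the-envelope: energy, mass-equivalent and Schwarzschild radius
    # for 50**50 visible-light photons at 1.8 eV each.
    EV   = 1.602e-19    # joules per electron-volt
    C    = 2.998e8      # metres per second
    G    = 6.674e-11    # m^3 kg^-1 s^-2
    LY   = 9.461e15     # metres per light year
    YEAR = 3.156e7      # seconds

    N = 50
    energy = (N**N) * 1.8 * EV       # ~2.5e66 J
    mass   = energy / C**2           # ~2.8e49 kg
    r_s    = 2 * G * mass / C**2     # Schwarzschild radius, ~4e22 m
    print(energy, mass, r_s, r_s / LY)   # radius ~4.5e6 light years
    print(r_s / C / YEAR)                # minimum runtime ~4.5e6 years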

53:

@ 40: Scent of Violets sounds like a daemon's name.

Jeff

54:

Actually, it's a filtering device. If you know a little physics or math, you should get it.

55:

NP-Complete joke. Ouch.

56:

http://xkcd.com/287/
NP-Complete joke. Ouch.

57:

Bobby@34:

No, again, factoring is _not_ NP-hard.

There is a difference between being in NP, and being NP-hard.

Factoring is in NP, which (effectively) means that any factorization can be checked in polynomial time, but thus far, no deterministic polynomial-time factorization algorithm is known.

A problem is NP-hard, on the other hand, if and only if all NP problems can be reduced to an instance of this problem via a polynomial-time reduction. This is a fairly technical definition, but it basically means that an NP-hard problem is as hard as, or harder than, any NP problem. Notice that an NP-hard problem does not, itself, need to be in NP.

NP-hard problems that _are_ in NP are called NP-complete. Factorization is in NP. It is believed _not_ to be NP-complete, and hence not NP-hard, either. The reason for this belief is that the best known factorization algorithms are, while not polynomial-time, still subexponential. The best known (deterministic) algorithms for NP-complete problems are all exponential time.

Shor's algorithm is an inherently quantum algorithm, and is hence based on a different model of computation, which has its own hierarchy of complexity classes. The specific (quantum) complexity class that Shor's algorithm and factorization fall into is called BQP. I suggest you look that up in Wikipedia.
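
To make the "checkable in polynomial time" point concrete, a trivial sketch (nothing to do with Shor's algorithm itself):

    # "Factoring is in NP": *verifying* a claimed factorisation is cheap,
    # even though *finding* it appears to be hard.
    def verify_factorisation(n, factors):
        product = 1
        for f in factors:
            if f < 2:
                return False
            product *= f
        return product == n

    print(verify_factorisation(15, [3, 5]))   # True, and checked in no time
    # Cole's famous 1903 factorisation of the Mersenne number 2**67 - 1:
    print(verify_factorisation(2**67 - 1, [193707721, 761838257287]))   # True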

58:

49 said: "All we need to solve this kind of problems is to be able to send information back in time. I'll get right on building such a machine."

Not long ago I read that some scientists were about to do just that - see San Francisco Chronicle: http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2007/01/21/ING5LNJSBF1.DTL

"Researchers are on the verge of experiments that will finally hold retrocausality’s feet to the fire by attempting to send a signal to the past. What’s more, they need not invoke black holes, wormholes, extra dimensions or other exotic implements of time travel. It should all be doable with the help of a state-of-the-art optics workbench and the bizarre yet familiar tricks of quantum particles."

For more on retrocausality, see http://en.wikipedia.org/wiki/Retrocausality

59:

Craig@57:
Thanks for the overview of NP, NP hard, NP complete, etc. I did have some fuzziness in my understanding that your comment, combined with some side reading, cleared up.

Does anyone know if there has been significant work on whether BQP includes the NP-complete problems?

60:

#59: On NP in BQP

In the wake of a misleading Economist article on the D-Wave quantum computer, Scott [Aaronson] argues why he believes quantum computers cannot efficiently solve NP-complete problems, i.e., that the complexity class BQP does not contain NP.
[truncated]
http://weblog.fortnow.com/2007/02/on-np-in-bqp.html