Cutting edge

April 14, 2000

Computational complexity can be a friend or an enemy. Either way, we need radical ideas to gain a better understanding of it.

Computer science plays an increasingly important role in everyday life and in all areas of science, technology and business. And yet computers are not all-powerful. Alan Turing, a British mathematician, was one of the first to become aware of this fact, in the 1930s, a time when computers did not yet exist in the modern sense of the word.

Turing showed that some problems are "undecidable" - that is to say, they cannot be solved in a "mechanical" manner (by an algorithm designed for an electronic computer or any other conceivable system: Turing's result remains valid whatever the technology used). For example, no general method can determine whether a given program will stop one day or continue calculating for ever.
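
To see why, it helps to sketch Turing's diagonal argument in modern notation. The Python below is a conceptual sketch rather than working detection code: the function halts is purely hypothetical, and the point is that assuming it exists leads to a contradiction.

    # Hypothetical checker, assumed (for contradiction) to decide whether
    # running the program given as source text on the given input halts.
    def halts(program_source, program_input):
        ...  # no such general-purpose function can actually be written

    # Diagonal construction: do the opposite of whatever the checker predicts.
    def contrary(program_source):
        if halts(program_source, program_source):
            while True:   # the checker said "halts", so loop for ever
                pass
        else:
            return        # the checker said "loops for ever", so stop at once

    # Applying contrary to its own source text traps halts in a contradiction:
    # whichever answer it gives about that run is wrong, so halts cannot exist.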

The ideas of Turing and other mathematicians concerned with the basic principles of computation had a significant influence on the design of the first computers. With the arrival of computers, it was realised that being solvable in Turing's sense is not enough for a problem to be solvable in practice: the calculation time needed can grow too quickly with the quantity of data. For example, it is very easy to factorise a whole number of 10 digits and quite easy to factorise a whole number of 100 digits, but in the current state of technology we cannot factorise a whole number of 1,000 digits. On the other hand, deciding whether a whole number of 1,000 digits is a multiple of 17 presents no problem.
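
The contrast is easy to demonstrate. The Python sketch below shows that the divisibility test is a single remainder operation, while trial division, the naive factoring method, is hopeless at this size; the cut-off in the factoring routine is purely illustrative, since a genuine 1,000-digit factorisation is out of reach.

    import random

    # A random 1,000-digit whole number.
    n = random.randrange(10**999, 10**1000)

    # Divisibility by 17: one remainder operation, answered instantly.
    print(n % 17 == 0)

    # Naive factorisation by trial division. The cut-off keeps this
    # illustration finite: without it, the loop could need on the order
    # of 10**500 divisions, which is why 1,000-digit numbers resist it.
    def smallest_factor(n, cutoff=10**5):
        d = 2
        while d <= cutoff and d * d <= n:
            if n % d == 0:
                return d
            d += 1
        return None  # no factor found below the cut-off

    print(smallest_factor(n))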

The concern of researchers has now spread from solvability to computational complexity: classifying problems according to their degree of difficulty.

Computational complexity can be seen as both a friend and an enemy. It is a friend because public-key cryptography, an essential tool for the development of electronic commerce, rests on certain problems that are reputed to be difficult. If an efficient algorithm for factorising whole numbers were found, most of the encryption systems now in use would collapse.
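
A toy example, with deliberately tiny primes, illustrates the dependence: in RSA-style encryption the public modulus is the product of two secret primes, and anyone who can factorise it can reconstruct the private key. The numbers below are for illustration only; real keys use primes hundreds of digits long.

    # Toy RSA-style key, far too small to be secure.
    p, q = 61, 53              # secret primes
    n, e = p * q, 17           # public key: modulus n and exponent e
    phi = (p - 1) * (q - 1)    # requires knowing p and q, i.e. factorising n
    d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

    message = 42
    cipher = pow(message, e, n)    # anyone can encrypt with the public key
    print(pow(cipher, d, n))       # only the holder of d recovers 42

    # An efficient factoring algorithm would expose p and q, and hence d,
    # for any public modulus: that is the collapse described above.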

But computational complexity is also an enemy, because in practice it is an obstacle to the resolution of numerous important problems. This is especially the case for what we call NP-complete problems, among which is the famous travelling salesman problem. Specialists have identified several hundred such problems, occurring in practically all fields of science and technology. For these problems we must be satisfied with approximate solutions rather than exact ones, or restrict ourselves to particular cases.
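
The nearest-neighbour rule is one simple illustration of settling for an approximate answer: always travel to the closest city not yet visited. The Python sketch below, with made-up coordinates, runs in a fraction of a second but offers no guarantee of finding the shortest tour.

    from math import dist

    # Nearest-neighbour heuristic for the travelling salesman problem.
    def nearest_neighbour_tour(cities):
        unvisited = set(range(1, len(cities)))
        tour = [0]                       # start from the first city
        while unvisited:
            here = cities[tour[-1]]
            nxt = min(unvisited, key=lambda i: dist(here, cities[i]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    # Purely illustrative coordinates.
    cities = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]
    print(nearest_neighbour_tour(cities))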

With my team, I proposed a project called "Logic and algebraic complexity", which has research ministry funding of €30,500 (£18,500) over three years. Our aim is to understand better the complexity of algorithms that handle their data through only a limited set of elementary operations. For example, many algorithms manipulate numeric data solely by means of arithmetic operations and comparisons. We hope that complexity will be easier to understand for this particular category than for algorithms that use their data in a completely arbitrary way. However, one day we will have to tackle the general case.
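
A small Python example of the kind of cost measure involved (counting arithmetic operations and comparisons as unit steps, regardless of how large the numbers become) is sketched below; it illustrates the general idea, not the project's actual formal model. Horner's rule evaluates a polynomial of degree n in n multiplications and n additions.

    # Illustration of an algebraic cost measure: each arithmetic operation
    # counts as one step, whatever the size of the numbers involved.
    def horner(coeffs, x):
        # Coefficients listed from highest degree down, e.g. 2x^3 - x^2 + 5.
        value = coeffs[0]
        ops = 0
        for c in coeffs[1:]:
            value = value * x + c   # one multiplication and one addition
            ops += 2
        return value, ops

    value, ops = horner([2, -1, 0, 5], 3)
    print(value, ops)   # 50, computed in 6 arithmetic operations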

The present consensus is that the general problem cannot be solved with the techniques currently known, and that radical new ideas are needed.

Pascal Koiran is a researcher with the CNRS, France's National Centre for Scientific Research, and leader of the team researching logic and algebraic complexity at the laboratory of computer science of parallelism, Ecole Normale Superieure de Lyon.
