One claim which appears in this thread is that the full subset technique seems to be exponential. Does anybody know a proof? Sometimes divide-and-conquer algorithms solve problems which seem exponential at first glance.

Michael

53 posts
• Page **4** of **4** • 1, 2, 3, **4**


- algernon
**Posts:**25**Joined:**26 June 2006

For sudoku analysis it is irrelevant whether some technique is NP-complete or polynomial. Sudoku has 81 cells, and the question of NP-completeness only arises if we attempt to make bigger and bigger sudokus. Not many keep this in mind. The number of subsets in sudoku is a constant, so the technique is neither exponential nor NP-complete.

Pyrrhon

- Pyrrhon
**Posts:**240**Joined:**26 April 2006

Pyrrhon wrote:For sudoku analysis it is irrelevant whether some technique is NP-complete or polynomial. Sudoku has 81 cells, and the question of NP-completeness only arises if we attempt to make bigger and bigger sudokus. Not many keep this in mind. The number of subsets in sudoku is a constant, so the technique is neither exponential nor NP-complete.

I share the point of view of castalia, who claims that "polynomial/exponential" is a legitimate distinction between techniques that seem appropriate for humans and those that might not be (T&E?).

If I followed your point of view, wouldn't backtracking (which is fast enough on 81 cells) be the best solution anyway, making further discussion of smarter techniques/algorithms unnecessary?

Larger boards (e.g. 25x25; ok, strictly speaking _not_ sudoku) make full subset search (let alone full fish search) even more painful, and perhaps a smarter algorithm would help with these not-so-pathological cases. I think this forum is not the best place to discuss large boards, but the question about "full subset search" and its exponential nature was raised in this thread, so I did not hesitate to follow up on it.

Michael

- algernon
**Posts:**25**Joined:**26 June 2006

Yes, we don't speak about larger boards. And even this is the reason why it is not of interest whether the complexity of a technique is exponential, NP-complete or whatever. A linear function can have a greater value than a polynomial or an exponential function if you evaluate it at a particular x, here 81 cells. The complexity of a 9-link double forcing chain is a constant if you take it for a constant board size. Neither exponential nor NP-complete: it's a constant. It is the same as in the case of hidden singles, but the constant is much bigger for the 9-link chains. And so we look first for hidden singles.

Pyrrhon

- Pyrrhon
**Posts:**240**Joined:**26 April 2006

Pyrrhon wrote:A linear function can have a greater value than a polynomial or an exponential function if you evaluate it at a particular x, here 81 cells.

I agree. Consider simply the formulas with N=9: 5*2^N=2560 versus 300*N^2=24300. It's a matter of the actual size of the constants in a given problem. The asymptotic complexity only matters as N grows large.
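Not from the thread, but the arithmetic above can be checked with a small sketch; the cost functions 5*2^N and 300*N^2 are just the illustrative formulas from the post, not real costs of any sudoku technique:

```python
# Illustrative cost functions from the post: an "exponential" 5*2^N
# versus a "polynomial" 300*N^2, evaluated at the fixed sudoku size N=9.
def exp_cost(n):
    return 5 * 2**n

def poly_cost(n):
    return 300 * n**2

print(exp_cost(9))   # 2560
print(poly_cost(9))  # 24300 -- the polynomial is larger at this N

# The asymptotic ordering only kicks in for larger N:
n = 9
while exp_cost(n) <= poly_cost(n):
    n += 1
print(n)  # 14 -- first N where this exponential overtakes this polynomial
```

So at N=9 the "worse" exponential formula is actually almost ten times cheaper; the crossover only happens a few board sizes later.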

The complexity of a 9-link double forcing chain is a constant if you take it for a constant board size.

I would even go further and say it's a constant for any board size. The implication reached with a forcing chain is actually an implication for a given cell, whether N=9 or N=100. Of course, sometimes you can base more eliminations on the implication on larger boards, but that is another issue.

Moschopulus wrote:

I thought that backtracking is a *systematic* way of running through all possible candidate values in all cells. There are different ways to do this systematically, but the key fact is that there is a systematic procedure for trying each possibility, and going back a step when an impossible situation is reached. A counter is then increased by 1, etc.

No. Backtracking has nothing to do with complexity; it is just one possible form of an algorithm. As far as I know, it has been proved that every algorithm expressible in a backtracking way can also be expressed in a non-backtracking way.

Backtracking can be done exponentially or polynomially, depending on the smartness of your algorithm. Although this is no example of an exponential algorithm, consider the game "guess a number between 1 and 100", where I tell you only whether your guess is greater or less than the solution. You could guess "is it 1?", "2?", "3?" etc. (trying systematically!) until you reach the correct number, taking n/2 guesses on average, or you could repeatedly take the middle: "50?", if greater "75?", if less "25?", etc., taking O(log2(n)) guesses. For n=100 this is no real feat, but consider sorting and searching with some billions of entries, as in search engines like Google.
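The two guessing strategies above can be sketched directly (a minimal toy, function names are mine, not from the thread):

```python
# The "guess a number between 1 and 100" game: linear scanning
# takes O(n) guesses, repeated halving takes O(log2 n) guesses.
def linear_guesses(secret, n=100):
    # Guess 1, 2, 3, ... until correct.
    guesses = 0
    for g in range(1, n + 1):
        guesses += 1
        if g == secret:
            return guesses

def binary_guesses(secret, n=100):
    # Always guess the middle of the remaining interval.
    lo, hi, guesses = 1, n, 0
    while True:
        mid = (lo + hi) // 2
        guesses += 1
        if mid == secret:
            return guesses
        if mid < secret:
            lo = mid + 1
        else:
            hi = mid - 1

print(linear_guesses(73))  # 73 guesses
print(binary_guesses(73))  # at most ceil(log2(100)) = 7 guesses
```

Same game, same answers, but the halving strategy never needs more than seven questions for any secret between 1 and 100.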

And systematicity has nothing to do with it either. After all, you could "systematically" choose a random cell, check it, mark it as checked, then choose another random cell from the remaining unmarked cells, and continue.

Ok, a truly random and sometimes very human fashion is to choose a cell, follow a path until a dead end seems to be reached, choose other cells and do the same, and then start over again at the first cell, looking for something you have missed...

But I'd still like to stress the point John has made about looking for polynomial-order algorithms to solve every standard sudoku, because that would have quite some implications, given that sudokus are NP-complete. If such algorithms could solve every standard sudoku in polynomial time, and could do it for every bigger sudoku too, that would prove that every NP-complete problem can be solved in polynomial time, because it has already been proved that every NP-complete problem can be transformed into every other NP-complete problem...

Maria

- maria45
**Posts:**54**Joined:**23 October 2005

maria45 wrote:Moschopulus wrote:

I thought that backtracking is a *systematic* way of running through all possible candidate values in all cells. There are different ways to do this systematically, but the key fact is that there is a systematic procedure for trying each possibility, and going back a step when an impossible situation is reached. A counter is then increased by 1, etc.

No. Backtracking has nothing to do with complexity; it is just one possible form of an algorithm. As far as I know, it has been proved that every algorithm expressible in a backtracking way can also be expressed in a non-backtracking way.

Backtracking can be done exponentially or polynomially, depending on the smartness of your algorithm.

I think you may be grouping 9x9 sudoku with all known problems of order N.

Complexity is characteristic of backtracking search solutions in the literature. Google "backtracking complexity" to see examples like O(N) nodes on average and O(2**N) worst case. Some problems are exponential by nature, and, modulo P==NP, there will never be a polynomial equivalent algorithm.

Many problem classes admit heuristics that prune tree search effectively and, with some given probability, may flirt with polynomial complexity, but if the problem class is exponential there will be some class instances that require exponential time to solve.

The heuristics for 9x9 sudoku would be the techniques discussed on this and other sudoku forums. These techniques work well and efficiently for the most part, but even for 9x9 there are examples that foil the heuristics and cause bad backtrack behavior.
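The pruning effect described above can be illustrated on a toy problem (6-queens here, purely as a stand-in; none of this code is from the thread), counting search-tree nodes visited with and without abandoning a branch at the first conflict:

```python
def queens_nodes(n, prune):
    """Count search-tree nodes for placing n queens, one per row."""
    nodes = 0

    def ok(cols, c):
        # Placing column c in row len(cols): reject shared columns/diagonals.
        r = len(cols)
        return all(c != cc and abs(c - cc) != r - rr
                   for rr, cc in enumerate(cols))

    def search(cols):
        nonlocal nodes
        nodes += 1
        if len(cols) == n:
            return
        for c in range(n):
            if prune and not ok(cols, c):
                continue  # prune: drop the branch at the first conflict
            search(cols + [c])

    search([])
    return nodes

print(queens_nodes(6, prune=False))  # 55987 nodes: the full 6-ary tree
print(queens_nodes(6, prune=True))   # far fewer nodes visited
```

The unpruned search walks the entire tree of 1 + 6 + 6^2 + ... + 6^6 = 55987 nodes, while the pruned search visits a small fraction of them; yet, as the post says, pruning changes the constants and typical cases, not the worst-case nature of the problem class.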

Let N=9 for 9x9 sudoku. As N increases, the effectiveness of the heuristics diminishes and the cost of computing each heuristic increases, eventually reaching values of N where some heuristics provide diminishing returns.

At some value of N you will arrive at a pedestrian {forward, backmark, etc.} backtracking tree search with a complexity expression in the range O(some-function-of-N) through O(some-other-function-of-N), so that your backtracking solution can be compared with others in the literature.

Or you may be able to transform an NxN sudoku problem instance into another problem class, e.g. constraint satisfaction, where solution complexity is well documented.

maria45 wrote:And systematicity has nothing to do with it, too.

Here systematic does not mean in some specific order; it means exhaustive, in the sense that in some problem instances one will not be able to draw a conclusion until all possibilities have been exhausted.

maria45 wrote:But, I'd still like to stress the point John has made about looking for polynomial order algorithms to solve every standard sudoku, because that would have quite some implications, given that sudokus are NP-complete. If such algorithms can solve every standard sudoku in polynomial time, they can do it for every bigger sudoku, too, and that would prove that every NP-complete problem can be solved in polynomial time, because there is already proof that every NP-complete problem can be transformed in every other NP-complete problem...

Don't get out your tux/gown upon arriving at a polynomial solution for all 9x9 sudoku (although a proof of that would be a neat achievement).

NxN sudoku solving is NP-complete. Once you fix N, all P=NP awards are voided.

- gsf
- 2014 Supporter
**Posts:**7306**Joined:**21 September 2005**Location:**NJ USA

I wrote a Sudoku Solver too. It is written in Java and uses a backtracking algorithm with optimized guessing (it is very fast). It also tells you whether the sudoku has more than one solution. It is online and free; you'll need the Java plugin installed in your browser. Enjoy: Jon's Sudoku Solver

- Jon
**Posts:**6**Joined:**22 August 2006
