creint wrote:The reducing part for large grids is very slow: about 0.8 s for each given. That can probably be improved to an average of 1000 givens/sec or faster when you only want the known easy strategies.
For fast detection of the easy strategies I use several arrays of bitvectors. The array rc_n, for example, returns for a given row r and column c a bitvector rc_n[r,c] which holds the candidates for that cell. The array rn_c returns for a given row r and candidate n a bitvector rn_c[r,n] which holds the columns where candidate n occurs in row r, etc.
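A minimal sketch of these two arrays in Python, assuming a standard 9x9 grid (the post deals with much larger grids). The names rc_n and rn_c come from the post; everything else (empty_grid, add_clue, derive_rn_c) is hypothetical illustration, not the author's code:

```python
ALL = (1 << 9) - 1  # all nine candidate bits set

def empty_grid():
    # rc_n[r][c]: bitmask of candidates for cell (r,c); bit n-1 set => digit n possible
    return [[ALL] * 9 for _ in range(9)]

def add_clue(rc_n, r, c, n):
    """Place digit n (1..9) at (r,c): which bits to clear is unambiguous."""
    bit = 1 << (n - 1)
    rc_n[r][c] = bit                      # the cell itself keeps only n
    for cc in range(9):                   # clear n from the rest of row r
        if cc != c:
            rc_n[r][cc] &= ~bit
    for rr in range(9):                   # ... and the rest of column c
        if rr != r:
            rc_n[rr][c] &= ~bit
    br, bc = 3 * (r // 3), 3 * (c // 3)   # ... and the rest of the 3x3 box
    for rr in range(br, br + 3):
        for cc in range(bc, bc + 3):
            if (rr, cc) != (r, c):
                rc_n[rr][cc] &= ~bit

def derive_rn_c(rc_n):
    # rn_c[r][n-1]: bitmask of columns in row r where digit n is still a candidate
    return [[sum(1 << c for c in range(9) if rc_n[r][c] & (1 << n))
             for n in range(9)]
            for r in range(9)]
```

In a real solver rn_c (and the analogous cn_r, bn_p views) would be maintained incrementally alongside rc_n rather than rederived; the point here is only what each array holds.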
Adding a clue is easy, since there is no ambiguity about which candidate bits have to be cleared in the bit arrays. But the reducing process has to remove clues, and if you remove a clue you cannot just reverse the process and set the corresponding bits again: a candidate bit may only be restored if no other remaining clue in the cell's row, column or box still eliminates that candidate.
So what I do at the moment is to completely recompute the arrays after a single clue is removed (I start with an empty grid and add all remaining clues again), which of course is very time-consuming in the case of large grids with tens of thousands of clues. Almost all of the 0.8 s goes here - I did not notice this before. Faster approaches are possible, and I already have something in mind which should not be too complicated to implement. Thanks for the suggestion.
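For comparison, one common incremental alternative to the full rebuild - not necessarily what the author has in mind - is to restore a candidate only after checking that no remaining clue among the cell's peers still forbids it. A hedged 9x9 Python sketch (peers, add_clue, remove_clue and the clues array are all hypothetical names for illustration):

```python
ALL = (1 << 9) - 1  # nine candidate bits

def peers(r, c):
    """The 20 cells sharing a row, column, or 3x3 box with (r,c)."""
    br, bc = 3 * (r // 3), 3 * (c // 3)
    ps = {(r, cc) for cc in range(9)} | {(rr, c) for rr in range(9)}
    ps |= {(rr, cc) for rr in range(br, br + 3) for cc in range(bc, bc + 3)}
    ps.discard((r, c))
    return ps

def add_clue(rc_n, clues, r, c, n):
    """Place digit n at (r,c) and clear it from all peers."""
    clues[r][c] = n
    rc_n[r][c] = 1 << (n - 1)
    for (pr, pc) in peers(r, c):
        rc_n[pr][pc] &= ~(1 << (n - 1))

def remove_clue(rc_n, clues, r, c):
    """Undo add_clue without rebuilding everything: a candidate may be
    restored only if no remaining clue among the cell's peers eliminates it."""
    n = clues[r][c]
    clues[r][c] = 0
    # recompute the freed cell's candidates from its remaining clue peers
    mask = ALL
    for (pr, pc) in peers(r, c):
        v = clues[pr][pc]
        if v:
            mask &= ~(1 << (v - 1))
    rc_n[r][c] = mask
    # restore n in each unsolved peer unless another clue still forbids it
    for (pr, pc) in peers(r, c):
        if clues[pr][pc] == 0:
            if all(clues[qr][qc] != n for (qr, qc) in peers(pr, pc)):
                rc_n[pr][pc] |= 1 << (n - 1)
```

This touches only the ~20 peers of the removed cell plus their peers, so the cost per removal stays bounded instead of growing with the total number of clues on the grid.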