A. Kadyrov (University of Surrey, U.K.), IAPR Newsletter, Vol. 22, No.2, 2000

The author proposes that geometric fitting should be considered more extensively than it currently is. The main thesis of the book is that the conventional least-squares method should be used more carefully. Suppose we know that a point must lie on a curve, but instead we observe another point p' which is near the curve. How can we find the proper point p on the curve? Obviously, we can take the nearest point p on the curve as our decision. This observation is the key element of the book. Another idea is that the curve can locally be replaced by a line segment to simplify the projection procedure. However, the latter is used less in the book, as most of the tasks considered involve linear constraints only.
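
The projection idea is easy to sketch. Below is a minimal illustration, not taken from the book: it assumes a curve given implicitly by F(x, y) = 0 and uses the tangent-line substitution mentioned above; the circle example and the function name are hypothetical choices of mine.

```python
import numpy as np

def tangent_line_correction(p, F, gradF):
    """One correction step: replace the curve F = 0 by its tangent line at p
    and take the orthogonal projection of p onto that line."""
    g = gradF(p)
    return p - F(p) * g / np.dot(g, g)

# Hypothetical example: correct a noisy point onto the unit circle x^2 + y^2 = 1.
F = lambda q: q[0]**2 + q[1]**2 - 1.0
gradF = lambda q: np.array([2.0 * q[0], 2.0 * q[1]])

p = np.array([1.1, 0.2])
for _ in range(3):                       # a few steps converge to the nearest point
    p = tangent_line_correction(p, F, gradF)
print(p, np.linalg.norm(p))              # norm is close to 1
```

Iterating the step is my addition; for the nearly linear constraints treated in the book, a single linearization already gives the corrected point.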

One can say that a correction has been performed, since the point p' is changed to a correct point p on the curve. There are many tasks that call for geometric correction: two points must coincide but do not; a point must lie on a line but does not; two orientation vectors must be orthogonal but are not; a point must lie on a conic surface but does not. In all these cases a correction has to be applied. The book proposes first writing a system of equations which defines a surface S in a multi-dimensional space, and then projecting the data vector onto this surface S. The multi-dimensional data vector can be thought of as a fuzzy ball, since its co-ordinates are not assumed to be precise. Replacing the fuzzy ball with a fuzzy ellipsoid brings in the Mahalanobis distance and the Mahalanobis projection, which leads to extensive use of probabilistic terminology.
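
For a linear constraint surface the Mahalanobis projection has a closed form. Here is a minimal sketch under that assumption, with S given by A z = b and a hypothetical data covariance Sigma; the function name is mine, not the book's.

```python
import numpy as np

def mahalanobis_projection(x, A, b, Sigma):
    """Project the data vector x onto the surface S = {z : A z = b},
    minimizing the Mahalanobis distance (x - z)^T Sigma^{-1} (x - z)."""
    S = A @ Sigma @ A.T                          # small system in constraint space
    return x - Sigma @ A.T @ np.linalg.solve(S, A @ x - b)
```

With Sigma equal to the identity (the ``fuzzy ball'' case) this reduces to the ordinary orthogonal projection, which is exactly the situation in the coincidence example below.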

Consider the simplest case. For example, we expect two points (x',y') and (x'',y'') to represent one point. Let us write the equations x_1 = x_2 and y_1 = y_2, which define a 2D surface S in the 4D space with coordinates (x_1, y_1, x_2, y_2), and then project the data vector (x',y',x'',y'') onto this surface. The result we get in this way is not surprising: it is the midpoint (1/2)(x'+x'', y'+y'') (page 145).
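
This can be checked in a few lines. The sketch below assumes equal, independent errors on all four co-ordinates, so the Mahalanobis projection reduces to the orthogonal projection; the sample numbers are hypothetical.

```python
import numpy as np

# Coincidence constraint x1 = x2, y1 = y2, written as A v = 0 for the
# data vector v = (x', y', x'', y'').
A = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])
v = np.array([1.0, 2.0, 3.0, 6.0])                 # (x', y', x'', y'')

# Orthogonal projection of v onto the null space of A.
v_corrected = v - A.T @ np.linalg.solve(A @ A.T, A @ v)
print(v_corrected)   # [2. 4. 2. 4.]: both points move to the midpoint (2, 4)
```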

A more complicated task arises when the equations contain unknown parameters. For example, a line needs to be drawn through N points. A parameter, the normal vector to the line, enters the equations. The results for all possible normal vectors can be found analytically (the ``correction stage''), and then the best-fitting line can be chosen (the ``estimation stage'', page 211). This approach also yields the conventional result, namely the mechanical principal axis of the given set of N points (page 224).
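
The estimation stage in this example amounts to the classical principal-axis computation. A minimal sketch of that computation, not taken from the book (the function name and the sample points are hypothetical):

```python
import numpy as np

def principal_axis_line(points):
    """Fit a line through the centroid whose normal is the eigenvector of the
    scatter matrix with the smallest eigenvalue (total least squares)."""
    c = points.mean(axis=0)
    d = points - c
    eigvals, eigvecs = np.linalg.eigh(d.T @ d)   # eigenvalues in ascending order
    n = eigvecs[:, 0]                            # normal to the fitted line
    return n, c                                  # line: n . (p - c) = 0

pts = np.array([[0.0, 0.1], [1.0, 0.9], [2.0, 2.1], [3.0, 2.9]])
n, c = principal_axis_line(pts)
print(n, c)
```

Choosing the eigenvector of the smallest eigenvalue minimizes the sum of squared orthogonal distances from the points to the line, which is what the correction stage measures.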

The main tool of the book is linear algebra. I cannot agree with the statement that the ``treatment is very different from traditional statistics'' (page 451). The author does not miss any chance to introduce new variables and write down every possible relationship, which makes the book difficult to read and not very useful for finding answers to even simple questions. Terms like manifold, tangent space, transversality, realistic decision and optimal decision are used without need. The author does not go into the details of image-processing techniques such as edge detection, stereo matching, feature-point tracking and shape from shading. ``This book does not deal with outlier detection at all'' (page 25). The scope of the book is therefore rather narrow. But it can be recommended as a guide to correct and careful usage of the conventional least-squares method in image processing.