
Real-time corner detection algorithm for motion estimation

Han Wang* and Michael Brady†

This paper presents a fast algorithm for corner detection based on the observation that the total curvature of the grey-level image is proportional to the second order directional derivative in the direction tangential to the edge normal, and inversely proportional to the edge strength (the norm of the edge normal). The algorithm simply takes the difference of the second tangential derivative and the edge strength, where the first term is the cornerness measurement and the second is called a false corner suppression. A subpixel addressing mechanism (linear interpolation) is utilized for intermediate pixel addressing in the differentiation step, which results in improved accuracy of corner localization and reduced computational complexity. The analysis of corner dislocation leads to a subpixel implementation. The corner finder is implemented on a hybrid parallel processor, PARADOX, with a performance of 14 frames/s for the vision algorithm Droid.

Keywords: feature detection, motion estimation, parallel algorithm, corner finding

INTRODUCTION

A corner detector detects and localizes isolated events in a grey-level image and, together with edge detection, forms an important part of feature extraction1-5. Many commonly-used edge operators (for example, Canny6 and the LOG (Laplacian of Gaussian)7) fail to detect junctions, but these errors can be compensated by a separate corner detector. In some structure-from-motion vision algorithms, corner detection provides the fundamental data for 3D information reconstruction8-10. These algorithms are sometimes used for 3D object tracking or for

*School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 2263 (email: [email protected]) †Robotics Research Group, Oxford University, 19 Parks Road, Oxford OX1 3PJ, UK. Paper received: 19 July 1994; revised paper received: 9 January 1995

real-time tasks, such as robot navigation11, where both accuracy and speed are key requirements. Three criteria govern the design of such a detector.

Consistency: most importantly, if corners are to be used as features upon which subsequent processing is to be based, they must be detected consistently.

Accuracy: the corner should be located precisely; errors in location are magnified in the structure-from-motion projection into 3D space.

Complexity: speed is a prime requirement for real-time jobs such as robot navigation. Not only can reduced algorithmic complexity contribute to a faster implementation, but parallel technology can also provide an order of magnitude (or more) speedup.

The first criterion provides a measure of detection, while the second measures localization. These two measurements often conflict, as good detection (e.g. by coarse Gaussian smoothing) often leads to poor localization6. We shall discuss the new corner detector around these two issues, and suggest an approach to combining them that achieves an optimal trade-off.

BACKGROUND

Although there has been intense study of edge detection, there are far fewer reports in the literature about corner detection (e.g. Rosenfeld and Kitchen12, Zuniga and Haralick5). Nagel13 proposed a corner model as a pixel where |∇I|² is maximum, hence ∇(|∇I|²) = 0, or ∇((∂I/∂x)² + (∂I/∂y)²) = 0. Expanding, using the chain rule, he finds:

( I_xx  I_xy ) ( I_x )
( I_xy  I_yy ) ( I_y )  =  0

0262-6856/95/$09.50 © 1995 Elsevier Science B.V. All rights reserved

Image and Vision Computing Volume 13 Number 9 November 1995 695

Real-time corner detection algorithm: H Wang and M Brady

Noticing that the second order derivatives are directly proportional to the principal curvatures, and further assuming I_xy = I_yx = 0, he proposes the corner finder:

κ₁ = c₁ (∂²I/∂x²)(∂I/∂x) = 0
κ₂ = c₂ (∂²I/∂y²)(∂I/∂y) = 0

when:

∂I/∂x = max,  ∂I/∂y = max,  i.e.  ∂²I/∂x² = 0,  ∂²I/∂y² = 0

As Noble14 points out, the restrictive assumptions of this corner model are not necessarily met in real images, though Dreschler and Nagel15 have used it for motion estimation.

Corners are intrinsically second order properties of a surface, but second order derivatives are noise sensitive. Harris4,16 introduced a corner operator by modifying Moravec's interest operator17, using only first order derivatives to approximate the second derivatives.

At each pixel location, a 2 × 2 matrix (called the A matrix) is formed, where A = w * [(∇I)(∇I)ᵀ] and w is a Gaussian smoothing mask. A corner is placed where both eigenvalues of A are large, that is, where det(A) − k(trace(A))² is large; k is given as a constant of 0.04. A close study by Noble discovered that this matrix is exactly the C matrix used by Nagel. Steve Smith at Oxford has found that the algorithm suffers from dislocation at T-junctions.
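As an illustration of the operator just described, here is a minimal sketch; the 3 × 3 box average stands in for the Gaussian mask w, and the function name is ours rather than anything from the cited papers:

```python
import numpy as np

def harris_response(I, k=0.04):
    """Sketch of the Harris response det(A) - k*(trace(A))^2.

    A = w * [(grad I)(grad I)^T]; here the smoothing mask w is a
    3x3 box average standing in for the Gaussian.
    """
    Iy, Ix = np.gradient(I.astype(float))

    def smooth(J):
        out = np.copy(J)
        out[1:-1, 1:-1] = sum(J[1 + dy:J.shape[0] - 1 + dy,
                                1 + dx:J.shape[1] - 1 + dx]
                              for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
        return out

    Axx, Axy, Ayy = smooth(Ix * Ix), smooth(Ix * Iy), smooth(Iy * Iy)
    # Both eigenvalues of A large  <=>  det(A) - k*(trace(A))^2 large:
    return Axx * Ayy - Axy ** 2 - k * (Axx + Ayy) ** 2

R = harris_response(np.outer(np.arange(8.0), np.arange(8.0)))
```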

The Harris corner detector has been used in the 3D vision algorithm Droid18, and produces consistent corner responses, though they are not well localized (shown in the experimental section). In addition, smoothing the products of first order derivatives is computationally intensive; in our parallel implementation of the Harris algorithm, we achieve only 1.1 Hz with 32 transputers (T800).

Medioni and Yasumoto19 proposed an algorithm to compute the curvature of a planar curve by fitting a cubic B-spline to edge points. Following edge detection, a parametric B-spline is used to compute the curvature along the edge, and corners are identified with local maxima of curvature. One problem with this approach is that it cannot locate junctions properly. Secondly, an edge segmentation algorithm is required prior to curve fitting, which imposes additional computational overhead in locating corners.

Our initial implementation of the Medioni-Yasumoto algorithm used Charkravarty's edge segmentation algorithm20 on a parallel computer21, and it worked well on closed edge contours or in the middle of edge segments. However, it could not handle corners that occur near the end of an edge segment, as the curve fitting requires a minimum number of points to compute local curvature. Also, corners do not necessarily imply an edge: points with sharp autocorrelation are robust features in textured regions, but are not situated on extended edges. We prefer to develop a single corner detection function.

Han and Poston22 recently proposed a new algorithm with a curve-based approach, in which they adopt the 'cornerness' measurement of distance accumulation, which has the advantage of scale invariance. The disadvantage of a curve-based approach is that the algorithm depends on a pre-processing stage of curve or edge extraction, which often produces fragmented curve segments. Research has been undertaken to extend this work to three-dimensional curves23. Other nonlinear approaches, such as morphological operators and the early jump-out method, can be found elsewhere24,25.

Other approaches are reported in the literature, such as the 'cornerness' measure of a quadratic polynomial by Kitchen and Rosenfeld12, or computationally optimizing the similarity convolution by Cooper et al.25. Noble14 recently investigated edge and corner finding using morphological operations, which have the significant advantage of being both non-linear and idempotent.

In this paper, a corner detection algorithm based on a measure of total image surface curvature is presented. An advantage of the algorithm is that it utilizes linear interpolation to compute second directional derivatives, and achieves accurate corner localization. Noise is reduced by local non-maximum suppression and false corner response suppression.

MEASUREMENT OF SURFACE CURVATURE

Figure 1a shows a grey-level corner. The aim of corner detection is to define a function whose local response attains a maximum, so that the corner can be identified. Figure 1b shows the response of a corner detection function. In this section, we show that image surface curvature provides a good measure of 'cornerness', and that it can be utilized for such a purpose.

Let the image be I(x, y), and let n = ∇I/|∇I| be the edge normal. We denote by t the unit tangent vector perpendicular to n (t is also sometimes referred to as the edge tangential).

The Laplacian is the sum of the second differentials of I(x, y) in orthogonal directions:

∇²I = I_xx + I_yy   (1)

Differentiating I(x, y) in the directions of n and t using the chain rule, we have26:

∂²I/∂n² = (1/|∇I|²) (I_x² I_xx + 2 I_x I_y I_xy + I_y² I_yy)   (2)

∂²I/∂t² = (1/|∇I|²) (I_y² I_xx − 2 I_x I_y I_xy + I_x² I_yy)   (3)

Adding equations (2) and (3):

∂²I/∂n² + ∂²I/∂t² = I_xx + I_yy = ∇²I   (4)

which states the rotational invariance of the Laplacian.
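Equations (2)-(4) translate directly into code: both directional terms are quadratic forms in the gradient. A small sketch (the function name is ours):

```python
def directional_second_derivatives(Ix, Iy, Ixx, Ixy, Iyy):
    """Second derivative along the edge normal n (equation (2)) and
    along the edge tangent t (equation (3))."""
    g2 = Ix ** 2 + Iy ** 2                                    # |grad I|^2
    d2_dn2 = (Ix ** 2 * Ixx + 2 * Ix * Iy * Ixy + Iy ** 2 * Iyy) / g2
    d2_dt2 = (Iy ** 2 * Ixx - 2 * Ix * Iy * Ixy + Ix ** 2 * Iyy) / g2
    return d2_dn2, d2_dt2

# Equation (4): the two terms sum to the Laplacian Ixx + Iyy.
dn2, dt2 = directional_second_derivatives(1.0, 2.0, 0.5, -0.25, 0.75)
assert abs((dn2 + dt2) - (0.5 + 0.75)) < 1e-12
```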


Figure 1 Grey-level corner (a) and corner response (b)

The total curvature of the image surface is defined to be:

K = κ_n + κ_t
  = ((1 + I_x²) I_yy + (1 + I_y²) I_xx − 2 I_x I_y I_xy) / g³   (5)

where g² = 1 + I_x² + I_y². It follows from equations (5) and (2) that:

K = (1/g³) (g² ∇²I − |∇I|² ∂²I/∂n²)   (6)

When |∇I|² ≫ 1, g² ≈ |∇I|². Substituting equation (4) into equation (6), we find that:

K ≈ (∂²I/∂t²) / |∇I|   (7)

Equation (7) means that the total curvature K is proportional to the second derivative along the edge tangential t, and inversely proportional to the edge strength. This relationship has been derived previously27,28 in studies of edge operators. Kitchen and Rosenfeld12 have also derived this property, but they approximate cornerness by ∂²I/∂t² in equation (3), which is equivalent to the Dreschler-Nagel algorithm29.
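Evaluated with simple central differences, equation (7) yields a per-pixel cornerness map of the kind plotted in Figure 2. The helper below is our own sketch (the small epsilon merely guards the division where the gradient vanishes):

```python
import numpy as np

def cornerness_map(I, eps=1e-9):
    """K ~ (d2I/dt2) / |grad I|  (equation (7)), evaluated per pixel."""
    Iy, Ix = np.gradient(I.astype(float))     # first derivatives
    Ixy, Ixx = np.gradient(Ix)                # second derivatives
    Iyy, _ = np.gradient(Iy)
    g2 = Ix ** 2 + Iy ** 2
    d2_dt2 = (Iy ** 2 * Ixx - 2 * Ix * Iy * Ixy + Ix ** 2 * Iyy) / (g2 + eps)
    return d2_dt2 / np.sqrt(g2 + eps)

# A right-angle grey-level corner, as in Figure 1a:
img = np.zeros((32, 32))
img[16:, 16:] = 255.0
K = cornerness_map(img)
```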

Figure 2 shows the total curvature K, which provides an intuitive corner response. Note that errors are introduced along the edge due to image quantization and noise. This suggests that using K alone is inappropriate, and might be thought to call for a smoothing operation such as Gaussian convolution to reduce the effect of noise and quantization. In the next section, however, we show that smoothing by Gaussian convolution causes a linear signal displacement along the edge normal.

THE ALGORITHM

The linear displacement of a corner resulting from Gaussian convolution degrades the localization of corners. Moreover, noise is present in the analogue-to-digital conversion and, together with the problem of quantization, will affect measurements of the corner response. In particular, the deviation in the edge normal n may result in a large signal fluctuation over non-corner segments along the edge. Hence, additional constraints are required. We now present a new corner detection algorithm with these constraints in mind.

Non-maximum suppression

Firstly, equation (7) assumes that |∇I|² ≫ 0. This means we are only interested in corners that lie near pixels where the image gradient magnitude is large. This constraint reduces the effect of false corner marking, and has the advantage of requiring less computation.

False corner response suppression

Let F denote the grey-level image I after Gaussian convolution. Suppose that the square of the total curvature K in equation (7) exceeds a certain threshold S, that is:

Figure 2 Surface curvature of Figure 1a; errors caused by image quantization and noise are evident


K² = (∂²F/∂t² / |∇F|)² > S

Multiplying both sides by |∇F|², we find:

(∂²F/∂t²)² > S |∇F|²

hence:

(∂²F/∂t²)² − S |∇F|² > 0   (8)

Figure 2 shows exactly the first term of the above inequality: the squared second order tangential derivative. It is evident that corners as well as edges can respond in the discrete case (we call this a false corner response). The second term of inequality (8) is the edge strength, which responds well at edge pixels (including corners). The difference of these two terms cancels the false corner response and leaves a clean corner response; this is how false corner response suppression works. Hence, the aim of corner detection reduces to looking for maxima wherever the above inequality holds. Our corner operator can be defined as:

Γ = (∂²F/∂t²)² − S |∇F|² = maximum   (9)

∂²F/∂n² = 0

|∇F|² > T₁,  Γ > T₂

where S is a constant measure of image surface curvature varying with different differentiation masks, F is the intensity image after Gaussian smoothing, and T₁ and T₂ are user-defined thresholds.
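Putting the pieces together, a minimal sketch of the operator defined by equations (8) and (9) might look as follows. It uses axis-aligned central differences rather than the paper's directional interpolating mask, omits the ∂²F/∂n² = 0 condition, and its defaults simply echo the empirical values S = 0.1, T₁ = 500, T₂ = 2000 and m = 5 reported in the experiments; treat it as illustrative, not as the authors' implementation:

```python
import numpy as np

def detect_corners(F, S=0.1, T1=500.0, T2=2000.0, m=5):
    """Corners as local maxima of Gamma = (d2F/dt2)^2 - S*|grad F|^2
    (equation (9)), subject to |grad F|^2 > T1 and Gamma > T2.
    F is assumed to be the (optionally Gaussian-smoothed) image."""
    Fy, Fx = np.gradient(F.astype(float))
    Fxy, Fxx = np.gradient(Fx)
    Fyy, _ = np.gradient(Fy)
    g2 = Fx ** 2 + Fy ** 2
    d2_dt2 = (Fy ** 2 * Fxx - 2 * Fx * Fy * Fxy + Fx ** 2 * Fyy) / (g2 + 1e-9)
    gamma = d2_dt2 ** 2 - S * g2
    h = m // 2
    corners = []
    for y in range(h, F.shape[0] - h):           # non-maximum suppression
        for x in range(h, F.shape[1] - h):
            if (g2[y, x] > T1 and gamma[y, x] > T2 and
                    gamma[y, x] == gamma[y - h:y + h + 1, x - h:x + h + 1].max()):
                corners.append((x, y))
    return corners

# A right-angle step corner of height 255 whose apex lies at pixel (16, 16):
img = np.zeros((32, 32))
img[16:, 16:] = 255.0
corners = detect_corners(img)
```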

Linear interpolation for intermediate pixel addressing

The second order differentiation mask used to implement equation (9) is (−2, −1, 6, −1, −2). Bear in mind that this mask is directional, with direction perpendicular to the edge normal. Let B, b, C, m, M denote pixel positions in the mask from left to right. Due to image quantization, B, b, m, M can be obtained by direct pixel addressing only if n and t align with the x- and y-axes (in most cases, this is not possible). Our algorithm uses linear interpolation based on Newton's divided difference method, as depicted in Figure 3.
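The sampling step can be sketched with bilinear interpolation, a two-dimensional generalization of the single-axis interpolation used in the paper; the function names are ours:

```python
import numpy as np

MASK = (-2.0, -1.0, 6.0, -1.0, -2.0)     # second order differentiation mask

def bilinear(F, x, y):
    """Linearly interpolate image F at a non-integer position (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    ax, ay = x - x0, y - y0
    return ((1 - ax) * (1 - ay) * F[y0, x0] + ax * (1 - ay) * F[y0, x0 + 1] +
            (1 - ax) * ay * F[y0 + 1, x0] + ax * ay * F[y0 + 1, x0 + 1])

def second_tangential_difference(F, x, y, tx, ty):
    """Apply the directional mask along the unit tangent (tx, ty),
    interpolating the intermediate sample positions B, b, C, m, M."""
    return sum(w * bilinear(F, x + d * tx, y + d * ty)
               for w, d in zip(MASK, (-2, -1, 0, 1, 2)))

# On a planar ramp the five samples vary linearly, so the mask
# (zero-sum and symmetric) responds with zero:
F = np.add.outer(3.0 * np.arange(16.0), 2.0 * np.arange(16.0))  # F[y,x] = 3y + 2x
val = second_tangential_difference(F, 8.0, 8.0, 0.6, 0.8)
```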

In Figure 3, pixel F_m is interpolated from F_p and F_q, and pixel F_M is interpolated from F_i and F_j; hence:

F_m = F_p + k (F_q − F_p)

F_M = (2F_i − F_j) + 2k (F_j − F_i)

where k = tan θ = |∂F/∂y| / |∂F/∂x| when tan⁻¹(1/2) < θ < π/4. Pixels F_b and F_B are calculated in the direction opposite to t. When 0 < θ < tan⁻¹(1/2), F_M becomes:

F_M = F_k + 2k (F_i − F_k)

Figure 3 Linear interpolation for intermediate pixel addressing

Marr and Hildreth7 assumed linear variation where the LOG crosses zero, that is, that the intensity variation near and parallel to the edge contour is locally linear. However, linearity does not hold at corners: as we showed above, the total curvature of the image surface is large there, with a local maximum (or minimum). The directional operator of Canny in some sense also suffers a substantial displacement, although he proved that his operator is optimal in the case of one-dimensional processing. Ponce and Brady30 proved that in the case of a step edge formed by two slanted lines, the displacement is proportional to σ and to the difference of the gradients of the adjacent lines. This result shows that where two lines have the same gradient, the displacement is zero. Again, their analysis is essentially one-dimensional; they did not analyse the displacement of a corner in 2D.

LOCALIZATION MEASUREMENT

We prove that in 2D, a grey-level corner undergoes a linear displacement proportional to the standard deviation of the Gaussian convolution.

Corner displacement of the second normal derivative

Figure 4 shows a right-angle grey-level corner formed by the product of two step functions, I(x, y) = u(x)·u(y). Let G denote the Gaussian function (1/(2πσ²)) e^(−(x² + y²)/(2σ²)), and let Φ(x) denote the (unnormalized) error function:

Φ(x) = ∫_{−∞}^{x} e^(−t²/(2σ²)) dt

Hence, the step function convolved with the Gaussian can be represented by:

F = G * I


Figure 4 Grey-level corner modelled by a 2D step edge

More precisely:

F(x, y) = (1/(2πσ²)) ∫∫ u(t₁) u(t₂) e^(−((x − t₁)² + (y − t₂)²)/(2σ²)) dt₁ dt₂

        = (1/(2πσ²)) ∫_{−∞}^{x} e^(−t₁²/(2σ²)) dt₁ · ∫_{−∞}^{y} e^(−t₂²/(2σ²)) dt₂   (10)

        = (1/(2πσ²)) Φ(x) Φ(y)

Since the two step factors are orthogonal, the x and y components in F(x, y) are separated, and F(x, y) is twice differentiable. We obtain:

F_x = (1/(2πσ²)) e^(−x²/(2σ²)) Φ(y)

F_xx = −(x/(2πσ⁴)) e^(−x²/(2σ²)) Φ(y)

F_xy = (1/(2πσ²)) e^(−(x² + y²)/(2σ²))

Analogous expressions hold for F_y and F_yy. An edge is located where the second order directional derivative crosses zero6; in other words, we seek places where ∂²F/∂n² = 0. Substituting F_x, F_y, F_xy, F_xx and F_yy into equation (2), we observe that at the origin:

∂²F/∂n² |_(0,0) = 1/(2πσ²) ≠ 0   (11)

This condition holds whenever σ ≠ 0, which shows that the zero crossing does not occur at (0, 0). Taking into account the symmetry of F(x, y) about the line y = x, and that the edge normal n points along the same line, we assume the displacement of the zero crossing occurs at x = y = λ for some value λ. Substituting into equation (2), we find:


∂²F/∂n² |_(x=y=λ) = (1/(2πσ²)) e^(−λ²/(2σ²)) (e^(−λ²/(2σ²)) − (λ/σ²) Φ(λ))   (12)

and ∂²F/∂n² will be zero when:

−(λ/σ²) ∫_{−∞}^{λ} e^(−t²/(2σ²)) dt + e^(−λ²/(2σ²)) = 0   (13)

We call this the edge displacement equation.

Assume that λ = kσ, where k is some constant; we can substitute kσ into the edge displacement equation. Rearranging (with the substitution t = σs):

k ∫_{−∞}^{k} e^(−s²/2) ds − e^(−k²/2) = 0   (14)

from which k has a unique solution, k ≈ 0.5, so that the corner displacement along the diagonal is √2 kσ ≈ 0.707σ.
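The root quoted above is easy to confirm numerically. Substituting λ = kσ and t = σs into the edge displacement equation (13) gives k·∫_{−∞}^{k} e^(−s²/2) ds = e^(−k²/2); the check below (ours) evaluates the integral with the standard error function and brackets the root by bisection:

```python
import math

def f(k):
    """k * integral_{-inf}^{k} e^{-s^2/2} ds  -  e^{-k^2/2}."""
    integral = math.sqrt(math.pi / 2) * (1.0 + math.erf(k / math.sqrt(2)))
    return k * integral - math.exp(-k * k / 2)

# f(0) = -1 < 0 and f(1) > 0, so the root lies in (0, 1).
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
k = 0.5 * (lo + hi)    # k is close to 0.5, as stated in the text
```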

To illustrate this, Figure 5 shows a synthetic image and the displacement of the corner zero crossing of ∂²F/∂n² with σ equal to 1 and 4. In each case, the displacement follows the linear relationship precisely. Figure 6 shows a pixel map of the bottom left section of Figure 5 when σ = 4.

Corner displacement of the second tangential derivative

Adapting the above analysis to the tangential derivative, we obtain the second directional derivative when x = y = λ:

∂²F/∂t² = −(1/(2πσ²)) e^(−λ²/(2σ²)) ((λ/σ²) ∫_{−∞}^{λ} e^(−t²/(2σ²)) dt + e^(−λ²/(2σ²)))   (15)

In this case, we seek the local minimum of ∂²F/∂t² rather than a zero crossing. Figure 7 plots the function ∂²F/∂t² (ignoring the scaling factor 1/(2πσ²)). It can be seen that the position of the minimum with respect to λ increases as σ increases.

Let F′_t denote d(∂²F/∂t²)/dλ, and expand F′_t as a Taylor series up to second order. We find:

F′_t = −1/(2√(2π) σ³) + (3/(4√(2π) σ⁵)) λ² + O(λ³)   (16)

The minimum of ∂²F/∂t² occurs when F′_t = 0. From equation (16) we have:

λ = √(2/3) σ   (17)

We draw three conclusions from these analyses:

1. The corner displacement is linear in σ; increasing σ causes further displacement of the corner.

2. In the extreme case where σ approaches zero, the displacement would be zero; corners coincide with the minimum of the second tangential derivative when there is no Gaussian smoothing.

3. As σ approaches zero, the zero crossing of the second normal derivative approaches the corner and coincides with the minimum of the second tangential derivative.

oooooooo*oooooooooo
oooooooo*oooooooooo
oooooooo*oooooooooo
oooooooo*oooooooooo
ooooooooo*ooooooooo
oooooooooo*oooooooo
ooooooooooo********
ooooooooooooooooooo
ooooooooooooooooooo

Figure 6 Pixel map of the bottom left of Figure 5 (right)

Figure 7 Plots of the function ∂²F/∂t² for several values of σ; the local minimum displaces in proportion to σ

The conclusion is that smoothing with a convolution such as a Gaussian causes a displacement that is a linear function of the standard deviation. However, when σ is sufficiently small (e.g. 0.5), the displacement is less than half a pixel and can be ignored.

It is possible to extend the above analysis of corner displacement to arbitrary angles by adding a parameter θ for the angle subtended:

I(x, y) = u(x) u(y) u((tan θ)x − y)

Then the analysis can be generalized. In fact, Deriche and Giraudon31 have done just this in their recent study. They claim that the dislocations of the second normal derivatives for angles including π/4 and π/8 are also proportional to the standard deviation. Our analysis conforms with their results. More interestingly, they find that the dislocation is inversely proportional to the angle of the step corner, i.e. sharper corners result in larger displacements.

However, their analysis applies only to the dislocation of the second order normal derivatives. In this section, we have shown that the dislocation of the tangential derivative is also proportional to the smoothing factor. We refer readers with further interest to Bergholm32 and Berzins33.

Our algorithm does not implement smoothing. However, if the scene is very noisy, some smoothing is to be expected. The corner detection algorithm described above can be extended to multiple scales to achieve zero displacement, by taking advantage of the linear-displacement property. For example, having computed corners at two scales σ = 2.0 and σ = 1.0, the position of a corner can be projected back to where it would be for σ = 0, hence achieving zero displacement while reducing the effect of noise.
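The projection back to σ = 0 is a straight-line extrapolation from two detections of the same corner; a sketch with names of our choosing:

```python
def extrapolate_to_zero_sigma(p1, p2, sigma1, sigma2):
    """Project a corner position back to sigma = 0.

    p1 and p2 are (x, y) positions of the same corner detected at
    smoothing scales sigma1 < sigma2.  Displacement is linear in sigma,
    so p0 = p1 - sigma1 * (p2 - p1) / (sigma2 - sigma1).
    """
    t = sigma1 / (sigma2 - sigma1)
    return (p1[0] - t * (p2[0] - p1[0]),
            p1[1] - t * (p2[1] - p1[1]))

# A corner that drifted half a pixel between sigma = 1.0 and sigma = 2.0:
p0 = extrapolate_to_zero_sigma((10.5, 10.5), (11.0, 11.0), 1.0, 2.0)
```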

EXPERIMENTS

Figure 8 shows an image of a cup and its corner map. Each square pattern on the cup occupies 3 × 4 pixels. Corners are detected in almost all the expected places, with one exception at the second row, fourth column, where the lower right corner was excluded by the non-maximum suppression operation, which operated with a 5 × 5 window. On the third row, third column, the lower right corner was detected with a one pixel offset. This is due to the specularity of the surface.

Figure 9 shows a synthetic image and its detected corners. On the left of the image, a series of T-junctions is formed from a grey-level ladder. All the junctions are detected and located correctly. On the right-hand side of the image, a few T- and Y-junctions are formed with different orientations, and the detection shows positive results. It should be pointed out, however, that the algorithm failed at the obtuse corner of the triangle, and it also spotted spurious corners where the angles are very sharp. This is because the first order derivative operators perform badly on this synthetic image; a small amount of blurring (e.g. Gaussian) would help. However, this new corner detector does not rely on smoothing.

In comparison, we show the results of the Harris corner detector in Figure 10. It can be seen on the synthetic image that the Harris algorithm has failed to

Figure 8 Image of a cup and its corner map. The image is 128 x 128 with 8 bits


Figure 9 Synthetic image (left, 128 × 128) and corners retrieved by the new algorithm (right)

Figure 10 Corners detected by the Harris algorithm

spot two junctions on the grey-level ladder, and the detected locations are shifted as the grey-level changes (moving gradually away from the true position). This dislocation is evident in other parts of the image, and is caused by the smoothing of the first order derivatives. On the cup image, many corners are missing and their locations are displaced.

In the implementation, a Sobel mask is used for computing the first difference ∇I, and S is typically in the range 0.0-0.5 (default 0.1). Empirical values for T₁ and T₂ are 500 and 2000. The local maximum of Γ is taken over an m × m window, where m is typically 3-7, depending on the application. In the examples shown in Figure 8, m equals 5.

Figure 12 shows corners detected in the left and right images with subpixel accuracy. Subpixel accuracy is implemented by fitting a parabola to the edge strength |∇F| using three points. In Figure 13a, the function f is interpolated from adjacent values of |∇F|; Figure 13b is obtained by projecting |∇F| along n onto the x-axis. Assuming the maximum of f occurs at (x₀, y₀), fitting a parabola f(x) = ax² + bx + c to f(−1), f(0) and f(1), we get:

x₀ = (f(1) − f(−1)) / (2 (2f(0) − f(−1) − f(1)))

and similarly for y₀.
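Per axis, the fit reduces to the vertex formula of the interpolating parabola; a small sketch (ours) with an exact check:

```python
def parabolic_peak_offset(f_m1, f_0, f_p1):
    """Sub-pixel offset of a peak from three samples f(-1), f(0), f(1).

    Fitting f(x) = a*x**2 + b*x + c gives the vertex
    x0 = (f(1) - f(-1)) / (2 * (2*f(0) - f(-1) - f(1))),
    which lies in (-0.5, 0.5) when f(0) is a strict local maximum.
    """
    return (f_p1 - f_m1) / (2.0 * (2.0 * f_0 - f_m1 - f_p1))

# Samples of 1 - (x - 0.25)**2: the true peak is a quarter pixel right.
x0 = parabolic_peak_offset(-0.5625, 0.9375, 0.4375)
```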

Figure 14 shows corners matched from the stereo pair, and Figure 15 shows a surface reconstructed in 3D using the disparity information. Matching is conducted using normalized correlation with a grey level template21 and the epipolar constraint.

On a Sparc-2, the new corner detector computes the corners of a 256 × 256 image in 0.3-0.5 s. On a 300 MIPS transputer machine, it performs at 14 Hz in real time34, where the Gaussian is computed on a Datacube image processor with a modest data precision of 8 bits. Figure 11 shows a stereo image pair of the robot vehicle laboratory at Oxford, taken from calibrated cameras mounted on a robot vehicle.

DISCUSSION

The corner operator is based on the cornerness measurement of total curvature (the second order tangential derivative). Traditionally, directional derivatives are obtained from linear combinations of first and second derivatives with respect to the x and y components; such methods are both computationally expensive and inaccurate. The proposed linear interpolation scheme solves this problem, and improves the accuracy of corner localization. Another major advantage of this operator is its simplicity: a parallel implementation of


Figure 11 Stereo image of a lab scene

Figure 12 Corners detected from the stereo pair

Figure 13 Subpixel accuracy implementation. The local maximum algorithm picks the maximum f(0), which is larger than its adjacent neighbours f(−1) and f(1) in the direction of n. The parabola fitting is a first order approximation to the local 'true' maximum at f(x₀)

Figure 15 Reconstructed surface in 3D using the disparity information

Figure 14 Matched corners from the stereo pair. Disparity is displayed by vectors

this algorithm delivers a performance of 14 frames/s. We have also reported a scheme for achieving subpixel accuracy and zero localization displacement using multiple Gaussian smoothing scales and linear prediction.

In this new algorithm, the determination of S, T₁ and T₂ is empirical, depending on the context of the image. For example, if the images are relatively 'clean' (from a high quality camera), S can be set small and the non-maximum suppression constraint can be relaxed. In our real-time implementation, one field is taken from a camera with interlaced frames; at certain orientations aliasing has a very strong effect, hence the parameter S was set relatively high. An extension to this algorithm could be an adaptive approach to adjusting these parameters.

ACKNOWLEDGEMENTS

This work was supported by the ESPRIT project VOILA. The authors would like to thank colleagues in the Oxford Robotics Group for valuable discussion.


REFERENCES

1. Rohr, K 'Modelling and identification of characteristic intensity variations', Image & Vision Comput., Vol 10 No 2 (March 1992) pp 66-76
2. Rangarajan, K, Shah, M and Brackle, D V 'Optimal corner detector', Comput. Vision, Graph. Image Process., Vol 48 (1989)
3. Singh, A and Shneier, M 'Grey level corner detection: a generalization and a robust real time implementation', Comput. Vision, Graph. Image Process., Vol 51 (November 1990)
4. Harris, C and Stephens, M 'A combined corner and edge detector', Proc. 4th Alvey Vision Conf., Manchester, UK (September 1988)
5. Zuniga, O A and Haralick, R M 'Corner detection using the facet model', Proc. IEEE Conf. Comput. Vision & Patt. Recogn., Washington, DC (June 1983) pp 30-37
6. Canny, J 'A computational approach to edge detection', IEEE Trans. PAMI, Vol 8 No 6 (1986) pp 679-698
7. Hildreth, E C and Marr, D C 'Theory of edge detection', Proc. Roy. Soc. Lond. B (1980) pp 187-217
8. Blissett, R J 'Retrieving 3D information from video for robot control and surveillance', Electr. & Comm. Eng. J. (August 1990) pp 155-163
9. Lee, S and Kay, Y 'A Kalman filter approach for accurate 3-D motion estimation from a sequence of stereo images', Comput. Vision, Graph. Image Process.: Image Understanding, Vol 54 No 2 (September 1991) pp 244-258
10. Burger, W and Bhanu, B 'Estimating 3-D egomotion from perspective image sequences', IEEE Trans. PAMI, Vol 12 No 11 (November 1990) pp 1040-1058
11. Brady, J M and Wang, H 'Vision for mobile robots', Phil. Trans. R. Soc. Lond. B, Vol 337 (September 1992) pp 341-350
12. Kitchen, L and Rosenfeld, A 'Gray-level corner detection', Patt. Recogn. Lett., Vol 1 (1982) pp 95-102
13. Nagel, H H 'On the estimation of optical flow: relations between different approaches and some new results', Artif. Intell., Vol 33 (1987) pp 299-324
14. Noble, J A Descriptions of image surfaces, PhD thesis, Department of Engineering Science, Oxford University (1989)
15. Dreschler, L and Nagel, H 'Volumetric model and 3-D trajectory of a moving car derived from monocular TV-frame sequences of a street scene', Comput. Vision, Graph. Image Process., Vol 20 No 3 (November 1982)
16. Harris, C 'Determination of ego-motion from matched points', Proc. 3rd Alvey Vision Conf., Cambridge, UK (September 1987) pp 189-192
17. Moravec, H P 'Towards automatic visual obstacle avoidance', Proc. Int. Joint Conf. Artif. Intell., Cambridge, MA (August 1977) p 584
18. Harris, C G and Pike, J M '3D positional integration from image sequences', Proc. 3rd Alvey Vision Conf., Cambridge, UK (September 1987)
19. Medioni, G and Yasumoto, Y 'Corner detection and curve representation using cubic B-splines', Proc. IEEE Int. Conf. on Robotics and Automation, San Francisco, CA (April 1986) pp 764-769
20. Charkravarty, I 'A single-pass, chain generating algorithm for region boundaries', Comput. Graph. Image Process., Vol 15 (1981) pp 182-193
21. Wang, H, Brady, J M and Page, I 'A fast algorithm for computing optic flow and its implementation on a transputer array', Proc. British Machine Vision Conf., Oxford, UK (September 1990) pp 175-180
22. Han, J H and Poston, T 'Distance accumulation and planar curvature', Proc. IEEE ICCV, Berlin, Germany (1993) pp 487-491
23. Li, S Z, Wang, H, Ang, T H and Bey, K M 'Detection of corners on 3D space curves', Proc. SPIE Symposium: Intelligent Robots and Computer Vision, Boston, MA (1994)
24. Noble, J A 'Finding corners', Image & Vision Comput., Vol 6 (May 1988) pp 121-128
25. Cooper, J, Venkatesh, S and Kitchen, L Early jump-out corner detectors, Technical Report 90/14, University of Western Australia (1990)
26. Lipschutz, M M Differential Geometry, McGraw-Hill, New York (1969)
27. Clark, J J 'Authenticating edges produced by zero-crossing algorithms', IEEE Trans. PAMI, Vol 11 No 1 (January 1989) pp 43-57
28. Torre, V and Poggio, T A 'On edge detection', IEEE Trans. PAMI, Vol 8 No 2 (March 1986) pp 147-163
29. Nagel, H H 'Constraints for the estimation of displacement vector fields from image sequences', Proc. Int. Joint Conf. on AI, Germany (August 1983) pp 945-951
30. Ponce, J and Brady, M 'Toward a surface primal sketch', in T Kanade (ed), Three Dimensional Machine Vision, Kluwer, Dordrecht (1987)
31. Deriche, R and Giraudon, G 'A computational approach for corner and vertex detection', Int. J. Comput. Vision, Vol 10 No 2 (1993) pp 101-124
32. Bergholm, F 'Edge focusing', IEEE Trans. PAMI, Vol 9 No 6 (1987) pp 726-741
33. Berzins, V 'Accuracy of Laplacian edge detectors', Comput. Vision, Graph. Image Process., Vol 27 No 2 (1987) pp 726-741
34. Wang, H, Bowman, C, Brady, J M and Harris, C 'A parallel implementation of a structure-from-motion algorithm', Proc. ECCV 92, Genova, Italy (May 1992)
