
neighbours. A significant challenge is to smooth out areas without blurring details of

interest. Edge enhancement filters attempt to magnify the differences in brightness


between neighbouring pixels. Figure 4 shows two examples of spatial filters. In Figure 4

a), the filter defined will have the highlighted pixel's luminance set to an average of its

luminance and all of its neighbouring pixels. When applied to all pixels in an image, the

process smoothes out edges contained within it. Using the same concept, Figure 4 b)

implements an edge enhancement filter. It does so by setting the luminance of each pixel

to five times its own luminance minus the luminance of its vertical and horizontal

neighbours.

Figure 4: Image Filtering Models

a) Edge smoothing, square-shaped window:

    1/9  1/9  1/9
    1/9  1/9  1/9
    1/9  1/9  1/9

b) Edge enhancement, star-shaped window:

          -1
    -1     5    -1
          -1
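The two filters in Figure 4 can be sketched as a direct convolution in Python. This is a minimal sketch, not the author's implementation: the `apply_filter` helper and the sample image are hypothetical, and border pixels are simply left unchanged.

```python
import numpy as np

def apply_filter(image, kernel):
    """Apply a 3x3 spatial filter to the interior pixels of an image.
    Border pixels are left unchanged for simplicity (hypothetical helper)."""
    out = image.astype(float).copy()
    h, w = image.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = image[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = np.sum(window * kernel)
    return out

# Hypothetical grayscale image with a sharp vertical edge (0 -> 255).
image = np.zeros((6, 6))
image[:, 3:] = 255.0

# a) Edge smoothing: each pixel becomes the average of itself and
#    its eight neighbours (square-shaped window, all weights 1/9).
smooth_kernel = np.full((3, 3), 1.0 / 9.0)

# b) Edge enhancement: five times the pixel's own luminance minus
#    its vertical and horizontal neighbours (star-shaped window).
enhance_kernel = np.array([[ 0.0, -1.0,  0.0],
                           [-1.0,  5.0, -1.0],
                           [ 0.0, -1.0,  0.0]])

smoothed = apply_filter(image, smooth_kernel)   # edge is blurred
enhanced = apply_filter(image, enhance_kernel)  # edge differences magnified
```

Note that the enhancement kernel can produce values outside the displayable 0-255 range; a real implementation would typically clip or rescale the result.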

So far, the image manipulation methods presented have been done in the spatial domain.

Some image processing and analysis, however, is simpler to carry out in an alternate domain. By

transforming an image into a different space, different attributes come to the forefront. In

this section, examples of such methods are discussed. The Fourier and wavelet based


transforms move an image into a space that is appropriate for spatial frequency

operations. The Hough transform moves images into a representation that exposes

straight lines. Each will now be discussed in turn.

The Fourier transform provides a useful representation for isolating the various levels of

spatial frequencies in an image. Fourier's theorem states that any image can be expressed

by the summation of sine and cosine waves. The waves are of varying frequencies and

amplitudes. Figure 5, derived from Russ's work (1992), shows how a simple step function

in one dimension is approximated by adding sinusoidal waves together. The low frequency

waves provide the 'body' of the function while the higher frequency terms add detail to

the approximation. This is consistent with our discussion about spatial frequency in the

previous section. The transform moves the image from a domain where it is expressed as

brightness as a function of spatial position to one where it is a set of

amplitudes corresponding to the frequencies of sine and cosine waves. The Fourier

transform is stated in Figure 6 below.


Figure 5: from The Image Processing Handbook (J. Russ). Left: individual frequency terms; right: summed terms and the step function.
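The effect illustrated in Figure 5 can be reproduced with a few lines of NumPy. This sketch uses the standard Fourier series of a square wave as the step function; the sample spacing and term counts are hypothetical choices.

```python
import numpy as np

# Hypothetical 1-D step (square) wave sampled over one period.
x = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
target = np.where(x < np.pi, 1.0, -1.0)

def square_wave_approx(x, n_terms):
    """Sum the first n_terms odd harmonics of the square wave's
    Fourier series: (4/pi) * sum of sin((2k+1)x) / (2k+1)."""
    total = np.zeros_like(x)
    for k in range(n_terms):
        n = 2 * k + 1
        total += (4 / np.pi) * np.sin(n * x) / n
    return total

# Low-frequency terms give the 'body' of the function; higher
# frequency terms add detail, so the mean squared error shrinks.
err_1 = np.mean((square_wave_approx(x, 1) - target) ** 2)
err_20 = np.mean((square_wave_approx(x, 20) - target) ** 2)
```

Adding more terms tightens the fit everywhere except right at the jumps, where a small overshoot (the Gibbs phenomenon) persists.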


Figure 6: The 2-D Fourier Transform

    F(ω, ν) = ∫∫ f(x, y) e^(−j2π(ωx + νy)) dx dy

where j = √(−1) and, by Euler's formula,

    e^(−j2π(ωx + νy)) = cos(2π(ωx + νy)) − j sin(2π(ωx + νy))

There is a one-to-one relationship between F(ω, ν) and f(x, y). There is an inverse

function that transforms an image from the frequency domain into the spatial domain.

This function is listed in Figure 7. Transforming between the spatial and frequency

domain (and vice-versa) is done without any loss of image data.

Figure 7: The 2-D Inverse Fourier Transform

    f(x, y) = ∫∫ F(ω, ν) e^(j2π(ωx + νy)) dω dν

In practice, this equation is not used because of the discrete nature of digital images and

the fact that spatial frequencies do not get higher than the Nyquist frequency. The

discrete version of the transform is shown in Figure 8.


Figure 8: Discrete Version of the Fourier Transform

    F(h, i) = (1/n) Σ(k=0..n−1) Σ(l=0..n−1) f(k, l) e^(−j2π(kh + li)/n)
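Assuming NumPy is available, the lossless round trip between the spatial and frequency domains can be checked directly with the FFT, an efficient implementation of the discrete transform in Figure 8. The sample image below is hypothetical; note that NumPy's convention places the 1/n normalisation factor in the inverse transform rather than the forward one.

```python
import numpy as np

# Hypothetical 8x8 grayscale image with reproducible random values.
rng = np.random.default_rng(0)
f = rng.random((8, 8))

# Forward 2-D discrete Fourier transform, computed via the FFT
# (np.fft puts the normalisation factor in the inverse by default).
F = np.fft.fft2(f)

# The inverse transform recovers the spatial-domain image exactly,
# up to floating-point round-off: no image data is lost.
f_back = np.fft.ifft2(F).real

assert np.allclose(f, f_back)
```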
