2. Encoding the Iris
After the iris is detected, encoding algorithms are applied to the iris data. This process extracts features from the normalized iris images and encodes them to generate iris codes.
The methods available in the literature for feature extraction and code generation are:
· Wavelet encoding
Wavelet filters are applied to the 2D iris region to decompose the data in the Iris region into components that appear at different resolutions. The output is then encoded in order to provide a compact and discriminating representation of the iris pattern.
· Gabor Filters (used by Daugman)
Gabor filters are able to provide optimum conjoint representation of a signal in space and spatial frequency. Decomposition of a signal is accomplished using a quadrature pair of Gabor filters.
Daugman demodulates the output of the Gabor filters in order to compress the data. This is done by quantizing the phase information into four levels (represented by two bits), for each possible quadrant in the complex plane to obtain a compact 256-byte template, which allows for efficient storage and comparison of irises.
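A minimal sketch of this quadrature filtering and two-bit phase quantization, in Python with NumPy. The filter parameters (`f0`, `sigma`) and the single synthetic signal row are illustrative choices, not Daugman's actual values:

```python
import numpy as np

def gabor_quadrature(signal, f0=0.1, sigma=10.0):
    """Filter a 1D signal with a quadrature pair of Gabor filters.

    The real/imaginary parts of the complex response come from the
    even (cosine) and odd (sine) Gabor filters. f0 and sigma are
    illustrative parameters, not Daugman's exact values.
    """
    n = len(signal)
    t = np.arange(n) - n // 2
    envelope = np.exp(-t**2 / (2 * sigma**2))
    even = envelope * np.cos(2 * np.pi * f0 * t)   # cosine Gabor
    odd = envelope * np.sin(2 * np.pi * f0 * t)    # sine Gabor
    real = np.convolve(signal, even, mode="same")
    imag = np.convolve(signal, odd, mode="same")
    return real + 1j * imag

def quantize_phase(response):
    """Encode each complex response as 2 bits, one per quadrant axis."""
    bits = np.empty((len(response), 2), dtype=np.uint8)
    bits[:, 0] = (response.real >= 0).astype(np.uint8)  # sign of real part
    bits[:, 1] = (response.imag >= 0).astype(np.uint8)  # sign of imaginary part
    return bits.ravel()

# Example: encode a synthetic 128-sample "iris row"
rng = np.random.default_rng(0)
row = rng.standard_normal(128)
code = quantize_phase(gabor_quadrature(row))
print(len(code))  # 256 bits from this single row
```

Daugman's full 256-byte (2048-bit) template comes from applying such filters at many positions and scales across the normalized iris; this single-row example produces only 256 bits.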
· Log-Gabor Filters
To negate the disadvantage of Gabor filters of having a DC component, a Gabor filter that is Gaussian on a log scale, known as the Log-Gabor filter, has been proposed.
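The zero-DC property is easy to see from the filter's frequency response. The sketch below builds a 1D Log-Gabor response; `f0` (center frequency) and `sigma_ratio` (bandwidth parameter) are illustrative values:

```python
import numpy as np

def log_gabor(n, f0=0.1, sigma_ratio=0.55):
    """Frequency response of a 1D Log-Gabor filter.

    The response is a Gaussian on a log-frequency axis, so at DC
    (f = 0) it is exactly zero -- the property plain Gabor filters
    lack. f0 and sigma_ratio are illustrative values.
    """
    f = np.fft.rfftfreq(n)           # frequencies in [0, 0.5]
    response = np.zeros_like(f)
    nonzero = f > 0                  # log is undefined at f = 0
    response[nonzero] = np.exp(
        -(np.log(f[nonzero] / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2)
    )
    return response

G = log_gabor(256)
print(G[0])  # 0.0 -> no DC component
```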
· Zero Crossings of 1D Wavelet (proposed by Boles and Boashash)
It makes use of 1D wavelets for encoding iris pattern data; the reason is that zero-crossings correspond to significant features within the iris region. The mother wavelet is defined as the second derivative of a smoothing function. The zero crossings of dyadic scales of these filters are then used to encode features.
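A sketch of this idea: take the second derivative of a Gaussian as the mother wavelet, filter the signal at dyadic scales, and record where each response crosses zero. The scale choices and kernel widths here are illustrative assumptions, not the original paper's settings:

```python
import numpy as np

def mexican_hat(width, scale):
    """Kernel proportional to the second derivative of a Gaussian."""
    t = np.arange(width) - width // 2
    s2 = scale ** 2
    return (t**2 / s2 - 1) * np.exp(-t**2 / (2 * s2))

def zero_crossing_code(signal, scales=(2, 4, 8, 16)):
    """Encode a 1D signal by the zero crossings of its wavelet
    responses at dyadic scales -- a sketch of the Boles-Boashash
    idea, with illustrative scale choices."""
    codes = []
    for s in scales:
        resp = np.convolve(signal, mexican_hat(6 * s + 1, s), mode="same")
        # A sign change between neighbors marks a zero crossing
        crossings = (np.sign(resp[:-1]) * np.sign(resp[1:])) < 0
        codes.append(crossings.astype(np.uint8))
    return np.concatenate(codes)

rng = np.random.default_rng(1)
sig = rng.standard_normal(128)
code = zero_crossing_code(sig)
```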
· Haar Wavelet (used by Lim et al.)
It also uses wavelet transform to extract features from the iris region; Haar wavelet being the mother wavelet.
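The Haar transform itself is just pairwise averaging and differencing. The sketch below decomposes a row and keeps the signs of the detail coefficients as a binary feature vector; the sign quantization and the number of levels are assumptions for illustration, not Lim et al.'s exact scheme:

```python
import numpy as np

def haar_1d(x):
    """One level of the 1D Haar wavelet transform:
    pairwise averages (approximation) and differences (detail)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_feature_bits(row, levels=3):
    """Decompose a row repeatedly and keep the signs of the detail
    coefficients as a compact binary feature vector (the sign
    quantization is an illustrative assumption)."""
    bits = []
    approx = np.asarray(row, dtype=float)
    for _ in range(levels):
        approx, detail = haar_1d(approx)
        bits.append((detail >= 0).astype(np.uint8))
    return np.concatenate(bits)

row = np.array([4.0, 2.0, 6.0, 8.0, 1.0, 3.0, 5.0, 7.0])
bits = haar_feature_bits(row, levels=2)
print(bits)  # [1 0 0 0 0 0]
```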
· Laplacian of Gaussian Filters (proposed by Wildes et al.)
It decomposes the iris region by applying Laplacian of Gaussian filters to the iris region image. The filtered image is represented as a Laplacian pyramid (with four different resolution levels), which compresses the data and yields a compact iris template.
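A minimal sketch of a four-level Laplacian pyramid: each level is the difference between an image and its Gaussian-blurred version, and the blurred image is downsampled by two for the next octave. The simple separable blur and the blur width are illustrative stand-ins, not the exact filters of Wildes et al.:

```python
import numpy as np

def gaussian_blur(img, sigma=1.0):
    """Separable Gaussian blur (illustrative smoothing stage)."""
    radius = int(3 * sigma)
    t = np.arange(-radius, radius + 1)
    k = np.exp(-t**2 / (2 * sigma**2))
    k /= k.sum()
    # Filter rows, then columns
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def laplacian_pyramid(img, levels=4):
    """Four-level Laplacian pyramid: each level is a band-pass
    difference image; the coarsest residual is kept at the end."""
    pyramid = []
    current = img.astype(float)
    for _ in range(levels - 1):
        low = gaussian_blur(current)
        pyramid.append(current - low)   # band-pass (Laplacian) level
        current = low[::2, ::2]         # downsample for next octave
    pyramid.append(current)             # coarsest residual
    return pyramid

rng = np.random.default_rng(2)
iris_img = rng.random((64, 64))
pyr = laplacian_pyramid(iris_img)
print([p.shape for p in pyr])  # [(64, 64), (32, 32), (16, 16), (8, 8)]
```

The compression comes from the downsampling: the four levels together hold far fewer coarse-scale samples than four full-resolution filtered copies would.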
3. Image Verification from the database
After feature extraction, the next step is to compare the code of the input iris image with the codes in the database.
The following algorithms are used for this purpose.
· Hamming distance
The Hamming distance gives a measure of how many bits are the same between two bit patterns. Using the Hamming distance of two bit patterns, a decision can be made as to whether the two patterns were generated from different irises or from the same one.
The Hamming distance is the matching metric employed by Daugman.
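A minimal sketch of the fractional Hamming distance between two binary iris codes (the short 8-bit codes are purely illustrative; real templates are far longer):

```python
import numpy as np

def hamming_distance(code_a, code_b):
    """Fraction of bits that differ between two binary iris codes.
    Two codes from unrelated irises are expected to disagree on
    about half their bits (distance near 0.5); codes from the same
    iris disagree on far fewer."""
    code_a = np.asarray(code_a, dtype=np.uint8)
    code_b = np.asarray(code_b, dtype=np.uint8)
    return np.count_nonzero(code_a ^ code_b) / code_a.size

a = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
b = np.array([1, 0, 0, 1, 0, 1, 1, 0], dtype=np.uint8)
print(hamming_distance(a, b))  # 0.25 -> 2 of 8 bits differ
```

The same-versus-different decision is then a threshold on this fraction: distances well below 0.5 (the expectation for independent random codes) indicate the same iris.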
· Weighted Euclidean Distance
The weighted Euclidean distance gives a measure of how similar a collection of values is between two templates. This metric is employed by Zhu et al.
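A sketch of matching with a weighted Euclidean distance: each squared feature difference is divided by a per-feature weight, and the stored template with the smallest distance wins. The feature values, the uniform weights, and the helper names here are illustrative assumptions (in Zhu et al. the weights come from per-feature standard deviations):

```python
import numpy as np

def weighted_euclidean_distance(template, stored, weights):
    """Sum of squared feature differences, each scaled by a
    per-feature weight (illustrative stand-in for the per-feature
    standard deviations used by Zhu et al.)."""
    template = np.asarray(template, dtype=float)
    stored = np.asarray(stored, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return float(np.sum(((template - stored) ** 2) / weights ** 2))

def best_match(template, database, weights):
    """Index of the stored template with the smallest distance."""
    dists = [weighted_euclidean_distance(template, s, weights) for s in database]
    return int(np.argmin(dists))

db = [np.array([1.0, 2.0, 3.0]), np.array([1.1, 2.1, 2.9])]
probe = np.array([1.1, 2.1, 2.9])
print(best_match(probe, db, weights=np.ones(3)))  # 1 -> exact match
```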
· Normalized Correlation
It makes use of normalized correlation between the acquired and database representation for goodness of match. This metric is employed by Wildes et al.
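A sketch of normalized correlation between two image regions: subtract each region's mean, divide by its standard deviation, and average the pointwise product. The tiny 2x2 patches are purely illustrative:

```python
import numpy as np

def normalized_correlation(a, b):
    """Normalized correlation between two image regions.
    Values near 1 indicate a good match; the mean/std normalization
    makes the score invariant to brightness and contrast changes."""
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

patch = np.array([[1.0, 2.0], [3.0, 4.0]])
print(normalized_correlation(patch, patch))          # ~1.0, identical regions
print(normalized_correlation(patch, 2 * patch + 5))  # ~1.0, gain/offset ignored
```

The invariance to gain and offset is the reason for the normalization: two images of the same iris captured under different illumination still score near 1.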
The iris pattern is considered the most accurate and stable biometric modality. However, iris recognition systems now face a counterfeiting challenge, as colored contact lenses have become popular in recent years. Attackers wearing contact lenses with artificial textures printed onto them may try to spoof the system. Other spoofing mechanisms include the 'eye movie', an iris pattern printed on a plastic/rubber eye, etc.
Various researchers have worked in this direction to make iris recognition systems more robust against spoofing. Daugman proposed detecting printed iris patterns using spurious energy in the 2D Fourier spectra. Lee et al. suggested detecting fake irises based on the Purkinje image. He et al. used four features (image mean, variance, contrast, and angular second moment) to detect fake irises. Detecting iris edge sharpness and classifying based on iris texture are a few other methods.