
A No Reference Image Quality Metric for Blur and Ringing Effect based on a Neural Weighting Scheme

Aladine Chetouania, Azeddine Beghdadib

aISIR, UPMC – Paris VI bL2TI, Institut Galilée, Université Paris XIII

ABSTRACT

No-reference image quality metrics proposed in the literature are generally designed for a specific degradation, which limits their applicability. To overcome this limitation, we propose in this study an NR-IQM for ringing and blur distortions based on a neural weighting scheme. For a given image, we first estimate the levels of blur and ringing degradation using an Artificial Neural Network (ANN) model. The final quality index is then obtained by combining blur and ringing measures, weighted by the values estimated through the learning process. The obtained results are promising.

Keywords: Image Quality, Artificial Neural Networks, Subjective Scores, Artifacts

1. INTRODUCTION

Nowadays, image quality evaluation plays an important role. Indeed, images generally undergo different processes (acquisition, compression, transmission and so on), which can strongly affect their perceptual quality. The visual impact of the various distortions differs and depends on many characteristics of the image. The most common distortions are the blocking effect, the ringing artifact and blur. The blocking effect appears as artificial boundaries in the image and is due to the fact that the blocks are processed independently. Another annoying artifact is the ringing effect, or Gibbs phenomenon, which affects the sharpness around edges. This degradation appears as oscillations and is caused by the quantization step in compression schemes such as JPEG2000. Blur is also an annoying distortion, due to defocus, motion or low-pass filtering, which attenuates the high frequencies (edges and details) of the image.

To estimate the visual impact of these distortions, different measures have been proposed in the literature; they fall into three categories. Full-reference image quality measures, such as SSIM [1], need both the original image and its degraded version. Reduced-reference measures are based on some features extracted from the degraded image and its original version [2]. Finally, no-reference image quality measures (NR-IQMs), such as the method proposed in [3], aim at estimating the image quality level without referring to the original image. They are generally degradation-oriented and consequently use some a priori information on the degradation.

In this work, we focus on NR-IQMs for estimating blur and ringing artifacts. We first compute a weight for each considered distortion (blur or ringing) and then use it to estimate the quality of a given image. The degradation modeling is realized here using an Artificial Neural Network (ANN). The proposed method makes it possible to estimate the quality with respect to both distortions through a single metric. Furthermore, it allows a better quality estimate for JPEG2000-compressed images. Indeed, NR-IQMs for this kind of image generally consider the ringing effect as the unique and dominant degradation. Nevertheless, blur also appears in such images and can even be the most prominent distortion.

The paper is organized as follows: Section 2 describes the proposed approach. In Section 3, we discuss the experimental results. A conclusion of the work is then given in Section 4.


2. THE PROPOSED METHOD

Generally, NR-IQMs proposed in the literature consider only one distortion. This explicit assumption highly limits the usefulness of this kind of metric. In this study, we propose to overcome this limitation by first quantifying the type of degradation contained in a given image. Then, a neural scheme is used to estimate the distortion weights. The obtained weights are used to compute the final image quality index. The flowchart of the proposed approach is shown in Fig. 1.

Figure 1. Flowchart of the proposed method.

This work has been motivated by the fact that in JPEG2000-compressed images, ringing and blur artifacts appear at certain bit rates, as can be seen in Fig. 2. To better evaluate the quality of this type of image, we propose to estimate the amount of blur and ringing contained in a given image and to use these quantities as weights in the computation of the global image quality metric.

a) b)

Figure 2. Illustration of a) ringing and b) blur artifacts.

To better see the impact of the considered artifacts, we also show the 1-D profile of an original signal and its degraded version (see Figure 3).



Figure 3. Illustration of ringing and blur artifacts for a step signal.

2.1 Image Database

Figure 4. Some original images of the LIVE Image Database.

In these experiments, the LIVE image database [4] is used. This database comprises five types of degradation (noise, blur, blocking, ringing and fast fading). The Mean Opinion Score (MOS) is provided for each degraded image. Here, we restrict the database to the blur and ringing distortions, which results in sets of 227 and 174 images with ringing and blur artifacts, respectively. This database is used in the learning and test steps following the cross-validation principle (60% for learning and 40% for testing).
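The 60/40 protocol above can be sketched as a simple random split. The helper below is hypothetical (the paper does not specify the exact splitting procedure, the seed, or whether the split is stratified by distortion type):

```python
import random

def split_indices(n_images, train_frac=0.6, seed=0):
    """Randomly split image indices into learning and test subsets.

    Hypothetical helper illustrating the 60/40 protocol described in
    the text; the actual procedure used by the authors is not detailed.
    """
    rng = random.Random(seed)
    indices = list(range(n_images))
    rng.shuffle(indices)
    cut = int(round(train_frac * n_images))
    return indices[:cut], indices[cut:]

# 227 ringing + 174 blur images from the LIVE subset used in the paper
train_ids, test_ids = split_indices(227 + 174)
```

Under cross-validation, this split would be redrawn several times and the reported correlations averaged over the repetitions.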


2.2 Weight estimation

This step is the most decisive of the whole process. To quantify the blur and ringing degradations contained in a given image, different features are extracted and used as inputs to an Artificial Neural Network (ANN) model. The features, the classifier used and the quality index computation process are described in the following.

2.2.1 Considered Features

Some descriptors are used here to identify the type of degradation contained in a given image (i.e. ringing or blur distortion). The best result has been obtained with six NR-IQMs: one ringing measure and five blur metrics. The ringing measure used is based on the wavelet transform and Natural Scene Statistics [5]. The image quality index is derived from statistical models of the distribution of the coefficients at different decomposition levels. Five blur measures are considered in this study.

• The first one operates in the wavelet domain [6]. An edge map is first derived from the high-frequency coefficients at each decomposition level. The blur measure is then obtained by analyzing, through some rules, the types of edges contained in the image.

• The second NR blur metric used here is based on the idea expressed in [7]. It is argued that the visual impact of blur depends on the original sharpness of the image. In other words, blurring a contrasted image has more visual impact than blurring an already blurred image. The blur level of the observed image is therefore computed by estimating the impact of adding blur to it: the original (possibly blurred) image is considered as the reference and the further-blurred image as the degraded version. A full-reference metric is then used to estimate the amount of blur.

• Based on the above metric, the third measure is computed in the frequency domain [8]. The radial spectra of the degraded image and of its blurred version are first computed. The quality metric is then obtained by analyzing the spectrum differences.

• The fourth method is Marziliano's method, which is based on edge-point analysis [9]. Once the edge points are detected using Canny's detector, the quality index is obtained by estimating the width of each edge point.

• The last one is based on subjective tests in which the Just Noticeable Blur (JNB) is measured for different contrast levels [10]. A blur model is then derived from these tests and used to compute the blurriness of a given image.
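The edge-width idea behind Marziliano's metric [9] can be illustrated on a 1-D scan line: at each strong gradient point, walk outwards to the surrounding extrema and take the distance as the edge width, so blurrier edges yield larger widths. This is a simplified sketch (pure NumPy, hypothetical gradient threshold), not the authors' 2-D Canny-based implementation:

```python
import numpy as np

def edge_width_blur(row, grad_thresh=0.1):
    """Mean edge width along a 1-D signal (larger = blurrier).

    Simplified sketch of the edge-width principle of [9]; the threshold
    and the 1-D setting are illustrative assumptions.
    """
    row = np.asarray(row, dtype=float)
    grad = np.abs(np.diff(row))
    widths = []
    for i in np.where(grad > grad_thresh)[0]:
        d = row[i + 1] - row[i]  # edge direction at this point
        left = i
        # walk left while the signal keeps rising/falling in direction d
        while left > 0 and (row[left] - row[left - 1]) * d > 0:
            left -= 1
        right = i + 1
        # walk right while the signal keeps rising/falling in direction d
        while right < len(row) - 1 and (row[right + 1] - row[right]) * d > 0:
            right += 1
        widths.append(right - left)  # edge width in samples
    return float(np.mean(widths)) if widths else 0.0
```

A sharp step such as [0, 0, 0, 1, 1, 1] gives width 1, while a smoothed ramp over the same transition gives a larger value, matching the step-signal illustration of Figure 3.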

2.2.2 Artificial Neural Networks Model

Once the descriptor and class sets are defined, the modeling step can be done. Note that this ANN model is used here to estimate the weight of each considered degradation. A Multi-Layer Perceptron (MLP) ANN model is used (see Fig. 5).


Figure 5. The Artificial Neural Networks used in the proposed method.

The selected descriptors are used as inputs of this ANN. The outputs correspond to the weight values for blur and ringing degradations. The ANN characteristics are summarized in Table 1.

Table 1. MLP-ANN characteristics.

Inputs: 6 (the 6 NR-IQM index values, scaled to the range [-1, +1])
Hidden layers: 1 (9 neurons)
Outputs: 2 (the considered distortions: blur and ringing)
Learning method: back-propagation
Activation function: sigmoid
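The topology of Table 1 can be sketched as a forward pass. The sketch below uses random initialisation only (the paper trains the weights by back-propagation, omitted here), and the normalisation of the two outputs to sum to one is our assumption, consistent with the example weights 0.84/0.16 reported later in Fig. 6:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class WeightMLP:
    """Forward pass of the MLP in Table 1: 6 inputs, one hidden layer
    of 9 sigmoid units, 2 outputs (blur and ringing weights)."""

    def __init__(self, seed=0):
        # Random initialisation only -- back-propagation training omitted.
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.5, size=(6, 9))
        self.b1 = np.zeros(9)
        self.W2 = rng.normal(scale=0.5, size=(9, 2))
        self.b2 = np.zeros(2)

    def predict(self, features):
        """features: the 6 NR-IQM scores, scaled to [-1, +1]."""
        h = sigmoid(np.asarray(features, dtype=float) @ self.W1 + self.b1)
        out = sigmoid(h @ self.W2 + self.b2)
        # Normalising the two weights to sum to 1 is our assumption,
        # matching the example weights in Fig. 6 (0.84 + 0.16 = 1).
        return out / out.sum()

wb, wr = WeightMLP().predict([0.2, -0.5, 0.1, 0.8, -0.3, 0.0])
```

With trained weights, `wb` and `wr` would be the per-image distortion weights fed into the quality index of Section 2.2.3.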

2.2.3 Quality Index Computation

After computing the weight of each considered degradation, the image quality index is given by the following expression:

Index = wb · BlurMetric + wr · RingingMetric    (1)

where wb and wr are the estimated weights of the blur and ringing distortions, respectively. The blur metric used here is based on subjective tests [11], and the ringing measure is the one proposed in [5] and described above.
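Eq. (1) is a direct weighted combination. In the snippet below, the two metric values are hypothetical placeholders; the weights are the blur/ringing pair 0.84/0.16 reported for one of the test images in Fig. 6:

```python
def quality_index(blur_metric, ringing_metric, w_b, w_r):
    """Eq. (1): ANN-weighted combination of the two distortion metrics."""
    return w_b * blur_metric + w_r * ringing_metric

# Hypothetical metric values, combined with the Fig. 6 example weights
score = quality_index(blur_metric=0.35, ringing_metric=0.70,
                      w_b=0.84, w_r=0.16)
```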

3. EXPERIMENTAL RESULTS

To test the efficiency of the proposed method, the LIVE database is used during the test step. The proposed method is evaluated in terms of correlation with subjective judgments. We first show the weights obtained for different images (see Fig. 6) and the corresponding images (see Fig. 7).



Figure 6. The estimated weights for blur (blue bar) and ringing (red bar) artifacts.

We can see that the obtained weights are well representative of the distortion contained in the image. For example, for image 3 in Fig. 6 (corresponding image: Fig. 7c), the obtained weights are 0.84 for blur and 0.16 for ringing. In other words, blur is more dominant than ringing in this image.

a) b) c)

d) e)

Figure 7. a-e) Degraded images corresponding to images 1-5 of Figure 6.

The Pearson and Spearman correlation coefficients obtained with the proposed method for both blur and ringing distortions are presented in Table 2. We can see that the metric correlates well with the subjective scores for both distortions. Thus, the proposed metric is well suited to estimating the quality of images with blur and ringing artifacts, as encountered in JPEG2000-compressed images.
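The two evaluation criteria can be computed directly from predicted indices and MOS values. The sketch below uses hypothetical score vectors for illustration; the Spearman implementation omits tie correction for brevity:

```python
import numpy as np

def pearson(x, y):
    """Pearson linear correlation coefficient."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    # Spearman = Pearson correlation of the ranks (no tie handling here)
    rank = lambda v: np.argsort(np.argsort(np.asarray(v))).astype(float)
    return pearson(rank(x), rank(y))

# Hypothetical predicted quality indices and MOS values, for illustration
pred = [0.2, 0.4, 0.5, 0.7, 0.9]
mos = [1.1, 2.0, 2.4, 3.5, 4.2]
r_p, r_s = pearson(pred, mos), spearman(pred, mos)
```

Pearson measures prediction accuracy under a linear relation, while Spearman measures monotonicity only, which is why both are reported in Table 2.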


Table 2. Pearson and Spearman Correlation Coefficients.

Degradation type   Pearson correlation   Spearman correlation
Ringing            0.90                  0.91
Blur               0.91                  0.93

Figure 8 illustrates the overall image quality scheme. For a given degraded image, the weight of each considered degradation is estimated through an ANN scheme. These weights are then used to combine the blur and ringing metrics.

Figure 8. The global NR-IQM for JPEG2000 compressed images.

4. CONCLUSION

In this study, a new no-reference image quality measure based on a neural network approach for estimating the visual impact of blur and ringing artifacts is proposed. It is shown that by combining some distortion-oriented measures based on features of the observed image, an efficient global image quality index dedicated to blur and ringing distortions can be derived. This study also demonstrates that neural networks offer a flexible and efficient solution for estimating the visual impact of different distortions. The obtained results show the efficiency of the proposed method. As future work, other degradations such as blocking and noise will be considered. Another issue to be explored is the use of more elaborate neural networks, in particular the general regression neural network.

REFERENCES

[1] Wang, Z., Simoncelli, E.P., and Bovik, A.C., “Multi-scale structural similarity for image quality assessment,” IEEE Asilomar Conference on Signals, Systems and Computers, Vol. 2, 1398-1402 (2003).

[2] Wang, Z., and Simoncelli, E.P., “Reduced-reference image quality assessment using a wavelet-domain natural image statistic model,” Human Vision and Electronic Imaging X, Proc. SPIE, Vol. 5666, 149-159 (2005).

[3] Chetouani, A., Mostafaoui G., and Beghdadi, A., “A New Free Reference Image Quality Index Based on Perceptual Blur Estimation,” IEEE Pacific Rim Conference on Multimedia, 1185-1196 (2009).



[4] Sheikh, H.R., Wang, Z., Cormack, L., and Bovik, A.C., LIVE Image Quality Assessment Database. http://live.ece.utexas.edu/research/quality

[5] Sheikh, H.R., Bovik, A. C., and Cormack, L. K., “No-Reference Quality Assessment Using Natural Scene Statistics: JPEG2000,” IEEE Transactions on Image Processing, Vol. 14, No. 12, 1918-1927 (2005).

[6] Tong, H., Li, M., Zhang, H., and Zhang, C., “Blur detection for digital images using wavelet transform,” IEEE International Conference on Multimedia and Expo, Vol. 1, 17-20 (2004).

[7] Crête, F., “Estimer, mesurer et corriger les artefacts de compression pour la télévision,” Thesis report, Université Joseph Fourier (2007).

[8] Chetouani, A., Beghdadi, A. and Deriche, M. “A new free reference image quality index for blur estimation in the frequency domain,” IEEE International Symposium on Signal Processing and Information Technology, 155-159 (2009).

[9] Marziliano, P., Dufaux, F., Winkler S., and Ebrahimi, T., “A no-reference perceptual blur metric,” IEEE International Conference Image Processing, Vol. 3, 57-60 (2002).

[10] Ferzli, R., and Karam, L. J., “A No-Reference Objective Image Sharpness Metric Based on the Notion of Just Noticeable Blur,” IEEE Transactions on Image Processing, Vol. 18, No. 4, 717-728 (2009).

[11] Narvekar, N.D., and Karam, L. J., “A No-Reference Image Blur Metric Based on the Cumulative Probability of Blur Detection (CPBD)”, IEEE Transactions on Image Processing, Vol. 20, 2678-2683 (2011).