Technical Program
Paper Detail
Paper: WA-L2.3
Session: Lossless Image Coding
Time: Wednesday, October 11, 10:20 - 10:40
Presentation: Lecture
Title: ERROR ENTROPY AND MEAN SQUARE ERROR MINIMIZATION FOR LOSSLESS IMAGE COMPRESSION
Authors: Peter William, University of Nebraska-Lincoln; Michael Hoffman, University of Nebraska-Lincoln
Abstract: In this paper, the Minimum Error Entropy (MEE) criterion is considered as an alternative to the Mean Square Error (MSE) criterion for obtaining predictor coefficients. The error entropy is estimated using Renyi's formula, and the PDF of the error between image pixels and the predicted values is estimated by Parzen windowing with a Gaussian kernel. The performance of error entropy minimization and mean square error minimization is compared using the first-order Shannon entropy of the residual error. The comparison between MEE and MSE is extended to treating the image as a number of independent blocks, where each block uses its own optimized predictor. The behavior of MEE is similar to that of MSE, with a small improvement when the maximum allowable window size is used.
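To make the abstract's criterion concrete, below is a minimal sketch of the standard Parzen-window estimator of Renyi's quadratic entropy (the "information potential" form), evaluated on the residuals of a linear predictor. This is an illustration of the general technique, not the paper's implementation: the α = 2 choice, the 1-D random-walk test signal, the predictor order, and the kernel width `sigma` are all assumptions made here for demonstration.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    # Gaussian kernel with standard deviation sigma.
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def renyi_quadratic_entropy(errors, sigma):
    """Parzen estimate of Renyi's quadratic entropy H2 = -log V, where
    V (the information potential) is the mean pairwise kernel evaluation.
    Convolving two Gaussian kernels of width sigma yields width sigma*sqrt(2)."""
    diffs = errors[:, None] - errors[None, :]
    v = gaussian_kernel(diffs, np.sqrt(2) * sigma).mean()
    return -np.log(v)

def prediction_errors(signal, coeffs):
    """Residuals of a linear predictor x_hat[n] = sum_k coeffs[k] * x[n-1-k]."""
    p = len(coeffs)
    errs = [signal[n] - np.dot(coeffs, signal[n - p:n][::-1])
            for n in range(p, len(signal))]
    return np.array(errs)

# Correlated 1-D test signal standing in for an image scan line (assumed data).
rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(500))

# Compare MSE and the entropy estimate for a good vs. a poor order-1 predictor.
for a in (0.0, 1.0):
    e = prediction_errors(signal, np.array([a]))
    print(f"a={a}: MSE={np.mean(e**2):.3f}  H2={renyi_quadratic_entropy(e, 0.5):.3f}")
```

On this signal the predictor a = 1.0 leaves small, concentrated residuals, so both the MSE and the Renyi entropy estimate drop relative to a = 0.0; minimizing H2 over the coefficients (rather than the MSE) is the MEE criterion the abstract compares.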