Data Denoising in Analog and Digital Domains
Thesis posted on 22.05.2021, 16:34; authored by Amroabadi S. Hashemi
In this thesis, we develop several methods for data denoising. We first propose a method for Mean Square Error (MSE) estimation in soft thresholding, based on Minimum Noiseless Data Length (MNDL). Our simulation results show that this MSE estimate is a valuable comparison measure for different soft thresholding methods.

Two denoising methods are proposed for the analog domain: Mean Square Error EstiMation (MSEEM), which minimizes the worst-case MSE estimate, and the Noise Invalidation Denoising (NIDe) method, which is based on the newly proposed idea of a noise signature. While MSEEM is shown to be the optimum denoising method for non-sparse signals, the NIDe approach outperforms other well-known denoising methods in the presence of colored noise.

In the digital domain we address two interesting problems: 1) simultaneous denoising and quantization, and 2) denoising a digital signal in the digital domain. For the first problem, we propose a new method that generalizes the idea of dead-zone estimation to multi-level noise removal; an example of this method is shown for hyperspectral image denoising and compression. A digital-domain denoising approach pioneers an answer to the second problem using only one piece of prior knowledge about the desired signal: that it is digital. The method provides the optimum reconstruction levels in the MSE sense.

One of the critical steps of the denoising process is noise variance estimation. As part of this thesis, we propose a novel noise variance estimation method for BayesShrink that outperforms the conventional MAD-based noise variance estimate. Although BayesShrink is one of the most efficient denoising methods, no analytical analysis has been available for it. Here, we study Bayes estimators for Generalized Gaussian Distributed (GGD) data and provide a theoretical justification for BayesShrink. This study enables us to generalize the BayesShrink threshold to Generalized BayesShrink, which outperforms BayesShrink itself.
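For context, the classical building blocks this abstract refers to can be sketched in a few lines of NumPy. This is a minimal sketch of the standard soft thresholding rule, the conventional MAD-based noise standard deviation estimate, and the standard BayesShrink threshold as they appear in the wavelet denoising literature; it does not implement the thesis's MNDL-based MSE estimator or the Generalized BayesShrink variant, and the function names are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft thresholding: shrink each coefficient toward zero by t,
    setting coefficients with magnitude below t to exactly zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def mad_sigma(detail_coeffs):
    """Conventional MAD-based noise std estimate, usually computed from
    the finest-scale wavelet detail coefficients: median(|d|) / 0.6745."""
    return np.median(np.abs(detail_coeffs)) / 0.6745

def bayes_shrink_threshold(coeffs, sigma_noise):
    """Standard BayesShrink threshold T = sigma^2 / sigma_x for
    GGD-modeled subband coefficients observed in Gaussian noise."""
    sigma_y2 = np.mean(coeffs ** 2)  # observed coefficient power
    # signal std estimate, floored to avoid division by zero
    sigma_x = np.sqrt(max(sigma_y2 - sigma_noise ** 2, 1e-12))
    return sigma_noise ** 2 / sigma_x
```

In a typical pipeline, `mad_sigma` supplies the noise level, `bayes_shrink_threshold` is computed per subband, and `soft_threshold` is applied to that subband's coefficients before the inverse transform.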