Contents
| Foreword | vii |
| Introduction | ix |
| Outline | xi |
| Notation and conventions | xiii |
I | Theoretical background and basic methods | 1 |
1 | Local approximations, the moving least-squares method, and the local polynomial approximation (LPA) | 3 |
1.1 | Analysis, reconstruction and approximation in Hilbert spaces | 3 |
1.1.1 | Projection onto an orthonormal basis | 4 |
1.1.2 | Non-orthogonal case: frames | 4 |
1.2 | The space L^2 with a windowing measure | 6 |
1.2.1 | Definition of the space | 6 |
1.2.2 | Best approximation in L^2(R^d,mu) | 7 |
1.3 | Pointwise evaluation of the approximation and its kernel representation | 8 |
1.4 | Moving least-squares (MLS) method | 10 |
1.4.1 | Translation-invariance | 10 |
1.4.2 | Moving least-squares denoising | 11 |
1.5 | Local polynomial approximation (LPA) | 12 |
1.5.1 | Characterization of an LPA | 12 |
1.5.2 | Function and derivative estimation kernels | 12 |
1.5.3 | Vanishing moments | 14 |
1.5.4 | Zero-order LPA | 14 |
1.6 | Finite case and matrix notation | 14 |
1.6.1 | Best approximation (weighted least-squares solution) | 15 |
1.6.2 | Pointwise evaluation of the best approximation and corresponding kernel | 15 |
1.6.3 | Vector form for LPA function and derivative estimation kernels | 16 |
1.7 | Some examples of LPA kernels | 16 |
2 | Adaptive nonparametric estimation | 19 |
2.1 | Parametric vs. nonparametric estimation | 19 |
2.2 | Scale | 20 |
2.2.1 | LPA kernels as smoothers | 21 |
2.3 | Accuracy analysis of the LPA kernels | 21 |
2.3.1 | Bias and variance | 22 |
2.3.2 | Asymptotic error analysis | 23 |
2.3.3 | Ideal scale | 24 |
2.4 | Intersection of Confidence Intervals (ICI) rule | 25 |
2.4.1 | The idea | 25 |
2.4.2 | ICI adaptive-scale selection rule | 26 |
2.4.3 | ICI algorithm pseudo-code | 27 |
2.4.4 | Choice of Gamma | 28 |
2.4.5 | Examples with symmetric windows | 29 |
3 | Directional LPA | 33 |
3.1 | Motivation | 33 |
3.2 | Directional LPA: a general definition | 34 |
3.3 | Discrete directional-LPA kernel construction | 35 |
3.4 | Peculiarities of the directional-LPA kernel design | 35 |
3.5 | Some examples of directional-LPA kernels | 36 |
4 | Anisotropic LPA-ICI | 39 |
4.1 | Motivation and idea | 39 |
4.1.1 | Estimates with support optimization | 40 |
4.1.2 | Estimates with kernel-scale optimization | 42 |
4.2 | Anisotropic estimator based on directional adaptive-scale | 43 |
4.2.1 | Anisotropic LPA-ICI estimator | 43 |
4.2.2 | Adaptive anisotropic kernel and adaptive anisotropic neighborhood | 44 |
4.2.3 | Anisotropic estimation: formal modelling | 45 |
4.3 | Anisotropic LPA-ICI pseudo-code | 49 |
4.4 | Illustrations | 51 |
4.5 | Fusing: why sigma^-2? | 51 |
4.5.1 | Fusing unbiased estimates | 54 |
4.5.2 | Uniform kernels for a uniform anisotropic kernel | 55 |
4.6 | Ideal scale h and the use of ICI for fused estimates | 56 |
4.6.1 | Practical impact of the Gamma parameter | 56 |
4.6.2 | MSE of the anisotropic fused estimate | 57 |
4.6.3 | A simplified analysis | 58 |
4.6.4 | Speculations and results on the value of Gamma | 59 |
4.7 | Uniform fusing for overlapping discrete kernels | 60 |
4.8 | Variance of the fused estimate | 62 |
4.8.1 | Non-overlapping kernels | 63 |
4.8.2 | Origin-overlapping kernels | 63 |
4.8.3 | Uniform fusing | 64 |
4.9 | Robust ICI for anisotropic estimation | 64 |
4.9.1 | WOS filters | 65 |
4.9.2 | Anisotropic LPA-ICI WOS filters | 66 |
4.9.3 | Binary WOS as thresholding of a linear filter | 67 |
4.10 | Algorithm complexity and implementation issues | 68 |
4.10.1 | Complexity | 68 |
4.10.2 | Implementation aspects | 68 |
5 | LPA-ICI for signal-dependent or space-variant noise | 71 |
5.1 | Signal-dependent noise | 71 |
5.1.1 | Poisson observations | 72 |
5.2 | Space-variant noise | 73 |
5.2.1 | Variance for heteroskedastic observations | 74 |
5.2.2 | Variance's asymptotics | 74 |
5.2.3 | Confidence intervals for non-Gaussian distributed estimates | 74 |
5.2.4 | Conclusions | 75 |
6 | Recursive anisotropic LPA-ICI | 77 |
6.1 | An iterative system | 77 |
6.2 | Estimation neighborhoods' enlargement | 78 |
6.3 | Variance of the l-th iteration's directional estimates | 79 |
7 | Directional derivative estimation and anisotropic gradient | 81 |
7.1 | Derivative estimation | 81 |
7.1.1 | Derivative estimation LPA kernels | 82 |
7.2 | Anisotropic gradient estimation | 86 |
7.2.1 | Motivation | 88 |
7.2.2 | An illustrative example in the continuous domain | 89 |
7.2.3 | The same example in the discrete domain | 91 |
7.2.4 | Continuous domain anisotropic gradient | 94 |
7.2.5 | Discrete domain anisotropic gradient | 97 |
7.2.6 | More examples | 99 |
II | Algorithms, applications and further examples | 107 |
8 | Denoising | 109 |
8.1 | Additive white Gaussian noise | 109 |
8.2 | Recursive LPA-ICI implementation | 109 |
8.2.1 | Simulations | 112 |
8.3 | Signal-dependent noise | 113 |
8.3.1 | Recursive variance update | 114 |
8.3.2 | Poisson denoising experiments | 116 |
8.3.3 | Simulation results | 118 |
8.3.4 | Other types of noise | 120 |
9 | Deconvolution | 125 |
9.1 | Additive white Gaussian noise | 125 |
9.1.1 | Introduction | 125 |
9.1.2 | Adaptive RI-RWI deblurring algorithm | 126 |
9.1.3 | Derivative estimation and edge detection from noisy blurred observations | 128 |
9.2 | Poisson deconvolution | 128 |
9.2.1 | Introduction | 129 |
9.2.2 | Poissonian RI-RWI algorithm | 130 |
9.2.3 | Linear inverse with directional adaptive LPA-ICI filtering | 130 |
9.2.4 | Poissonian RI inverse | 131 |
9.2.5 | Poissonian RWI inverse | 132 |
9.2.6 | Comments | 132 |
9.2.7 | Numerical experiments | 133 |
9.3 | Inverse halftoning | 134 |
9.3.1 | Halftoning and inverse halftoning | 136 |
9.3.2 | Error diffusion | 137 |
9.3.3 | Linear model of error diffusion | 138 |
9.3.4 | Anisotropic LPA-ICI inverse halftoning | 140 |
9.3.5 | Simulation results | 143 |
10 | Other applications | 147 |
10.1 | Video denoising | 147 |
10.1.1 | Introduction | 148 |
10.1.2 | Coordinate system | 148 |
10.1.3 | Video-denoising simulation | 148 |
10.2 | Shading from depth map: Z-buffer shading | 150 |
11 | Hybrid methods: LPA-ICI SA-DCT | 155 |
| Bibliography | 157 |