Office of Academic Resources
Chulalongkorn University


Author: Ó Ruanaidh, Joseph J. K., author
Title: Numerical Bayesian Methods Applied to Signal Processing [electronic resource] / by Joseph J. K. Ó Ruanaidh, William J. Fitzgerald
Imprint: New York, NY : Springer New York : Imprint: Springer, 1996
Connect to: http://dx.doi.org/10.1007/978-1-4612-0717-7
Description: XIV, 244 p. online resource

SUMMARY

This book is concerned with the processing of signals that have been sampled and digitized. The fundamental theory behind Digital Signal Processing has been in existence for decades and has extensive applications to the fields of speech and data communications, biomedical engineering, acoustics, sonar, radar, seismology, oil exploration, instrumentation and audio signal processing, to name but a few [87]. The term "Digital Signal Processing", in its broadest sense, could apply to any operation carried out on a finite set of measurements for whatever purpose. A book on signal processing would usually contain detailed descriptions of the standard mathematical machinery often used to describe signals. It would also motivate an approach to real-world problems based on concepts and results developed in linear systems theory, which make use of some rather interesting properties of the time and frequency domain representations of signals. While this book assumes some familiarity with traditional methods, the emphasis is altogether quite different. The aim is to describe general methods for carrying out optimal signal processing.


CONTENT

1 Introduction
2 Probabilistic Inference in Signal Processing -- 2.1 Introduction -- 2.2 The likelihood function -- 2.3 Bayesian data analysis -- 2.4 Prior probabilities -- 2.5 The removal of nuisance parameters -- 2.6 Model selection using Bayesian evidence -- 2.7 The general linear model -- 2.8 Interpretations of the general linear model -- 2.9 Example of marginalization -- 2.10 Example of model selection -- 2.11 Concluding remarks
3 Numerical Bayesian Inference -- 3.1 The normal approximation -- 3.2 Optimization -- 3.3 Integration -- 3.4 Numerical quadrature -- 3.5 Asymptotic approximations -- 3.6 The Monte Carlo method -- 3.7 The generation of random variates -- 3.8 Evidence using importance sampling -- 3.9 Marginal densities -- 3.10 Opportunities for variance reduction -- 3.11 Summary
4 Markov Chain Monte Carlo Methods -- 4.1 Introduction -- 4.2 Background on Markov chains -- 4.3 The canonical distribution -- 4.4 The Gibbs sampler -- 4.5 The Metropolis-Hastings algorithm -- 4.6 Dynamical sampling methods -- 4.7 Implementation of simulated annealing -- 4.8 Other issues -- 4.9 Free energy estimation -- 4.10 Summary
5 Retrospective Changepoint Detection -- 5.1 Introduction -- 5.2 The simple Bayesian step detector -- 5.3 The detection of changepoints using the general linear model -- 5.4 Recursive Bayesian estimation -- 5.5 Detection of multiple changepoints -- 5.6 Implementation details -- 5.7 Multiple changepoint results -- 5.8 Concluding remarks
6 Restoration of Missing Samples in Digital Audio Signals -- 6.1 Introduction -- 6.2 Model formulation -- 6.3 The EM algorithm -- 6.4 Gibbs sampling -- 6.5 Implementation issues -- 6.6 Relationship between the three restoration methods -- 6.7 Simulations -- 6.8 Discussion -- 6.9 Concluding remarks
7 Integration in Bayesian Data Analysis -- 7.1 Polynomial data -- 7.2 Decay problem -- 7.3 General model selection -- 7.4 Summary
8 Conclusion -- 8.1 A review of the work -- 8.2 Further work
A The General Linear Model -- A.1 Integrating out model amplitudes -- A.1.1 Least squares -- A.1.2 Orthogonalization -- A.2 Integrating out the standard deviation -- A.3 Marginal density for a linear coefficient -- A.4 Marginal density for standard deviation -- A.5 Conditional density for a linear coefficient -- A.6 Conditional density for standard deviation
B Sampling from a Multivariate Gaussian Density
C Hybrid Monte Carlo Derivations -- C.1 Full Gaussian likelihood -- C.2 Student-t distribution -- C.3 Remark
D EM Algorithm Derivations -- D.1 Expectation -- D.2 Maximization
E Issues in Sampling Based Approaches to Integration -- E.1 Marginalizing using the conditional density -- E.2 Approximating the conditional density -- E.3 Gibbs sampling from the joint density -- E.4 Reverse importance sampling
F Detailed Balance -- F.1 Detailed balance in the Gibbs sampler -- F.2 Detailed balance in the Metropolis-Hastings algorithm -- F.3 Detailed balance in the Hybrid Monte Carlo algorithm -- F.4 Remarks
References
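Chapter 4 of the contents above covers Markov chain Monte Carlo, including the Metropolis-Hastings algorithm. As a rough orientation only, here is a minimal random-walk Metropolis-Hastings sketch; the function names, the Gaussian proposal, and the standard-normal target are illustrative assumptions, not taken from the book:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings sketch (illustrative, not the
    book's implementation): sample from a density known only up to a
    normalizing constant via its log-density `log_target`."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Propose a symmetric Gaussian perturbation of the current state.
        x_new = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(x_new) / target(x)),
        # evaluated in log space for numerical stability.
        if math.log(rng.random()) < log_target(x_new) - log_target(x):
            x = x_new
        samples.append(x)
    return samples

# Assumed toy target: standard normal, log-density up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

Because the Gaussian proposal is symmetric, the Hastings correction term cancels and only the target-density ratio appears in the acceptance test; with a long enough chain, the sample mean and variance approach those of the target.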


Subject: Computer Science -- Computers -- Theory of Computation -- Statistics -- Statistics, general



Location



Office of Academic Resources, Chulalongkorn University, Phayathai Rd. Pathumwan Bangkok 10330 Thailand

Contact Us

Tel. 0-2218-2929,
0-2218-2927 (Library Service)
0-2218-2903 (Administrative Division)
Fax. 0-2215-3617, 0-2218-2907
