Although numerical approximation and statistical inference are traditionally treated as entirely separate subjects, they are intimately connected through the common purpose of making estimates from partial information. This book explores these connections from a game- and decision-theoretic perspective, showing how they open a pathway to simple and general methods for solving fundamental problems in both areas. It illustrates this interplay through problems in numerical homogenization, operator-adapted wavelets, fast solvers, and Gaussian processes. This perspective reveals much of their essential anatomy and greatly facilitates advances in these areas, suggesting a general principle for guiding the process of scientific discovery. The book is designed for graduate students, researchers, and engineers in mathematics, applied mathematics, and computer science, and particularly for those interested in drawing on and developing the interface between approximation, inference, and learning.
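As a minimal illustration of this connection (a generic sketch, not taken from the book; the kernel, points, and function below are chosen purely for exposition): the conditional mean of a Gaussian process given noiseless observations coincides with the kernel interpolant of the same data, so a statistical estimate and a numerical approximation are produced by one and the same formula.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.3):
    # Squared-exponential kernel k(x, y) evaluated on all pairs of points.
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / lengthscale ** 2)

# Noiseless observations of an unknown function u at a few points.
x_obs = np.linspace(0.0, 1.0, 8)
u_obs = np.sin(2 * np.pi * x_obs)

# GP posterior mean at query points: m(x) = k(x, X) K(X, X)^{-1} u(X).
# Read as inference, this is the conditional expectation of the GP;
# read as approximation, it is the kernel interpolant of the data.
x_query = np.linspace(0.0, 1.0, 200)
K = rbf_kernel(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))  # jitter for conditioning
weights = np.linalg.solve(K, u_obs)
posterior_mean = rbf_kernel(x_query, x_obs) @ weights
```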
This monograph demonstrates a new approach to the classical mode decomposition problem through nonlinear regression models that achieve near-machine precision in recovering the modes. The presentation includes a review of generalized additive models, additive kernels/Gaussian processes, generalized Tikhonov regularization, empirical mode decomposition, and Synchrosqueezing, all of which are related to, and generalizable within, the proposed framework. Although kernel methods have strong theoretical foundations, they require the prior selection of a good kernel. While the usual approach to this kernel selection problem is hyperparameter tuning, the objective of this monograph is to present an alternative (programming) approach, using mode decomposition as a prototypical pattern recognition problem. In this approach, kernels are programmed for the task at hand by assembling interpretable regression networks in the context of additive Gaussian processes. The monograph is suitable for engineers, computer scientists, mathematicians, and students in these fields working on kernel methods, pattern recognition, and mode decomposition problems.
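The following is a minimal, generic sketch of the underlying idea, not the book's algorithm and far from the near-machine-precision results it reports: model a signal as a sum of independent Gaussian process components with different kernels, and recover each mode as that component's posterior mean given the observed sum. The signal, kernels, and lengthscales below are illustrative assumptions.

```python
import numpy as np

def rbf(t, s, lengthscale):
    # Squared-exponential kernel evaluated on all pairs of times.
    return np.exp(-0.5 * (t[:, None] - s[None, :]) ** 2 / lengthscale ** 2)

t = np.linspace(0.0, 1.0, 400)
trend = np.sin(2 * np.pi * t)               # slow mode
oscillation = 0.3 * np.sin(40 * np.pi * t)  # fast mode
v = trend + oscillation                     # observed signal (sum of the modes)

# Additive kernel: one smooth component, one rough component, plus jitter.
K_slow = rbf(t, t, lengthscale=0.2)
K_fast = rbf(t, t, lengthscale=0.01)
K_sum = K_slow + K_fast + 1e-8 * np.eye(len(t))

# Posterior mean of each component given the observed sum v.
alpha = np.linalg.solve(K_sum, v)
recovered_slow = K_slow @ alpha   # estimate of the slow mode
recovered_fast = K_fast @ alpha   # estimate of the fast mode
```

In this simple split the separation is governed by how the two kernels divide the frequency content of the signal; choosing, or programming, those kernels well is exactly the kernel selection problem the monograph addresses.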