This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added, and basic theory has been expanded. General Markov-dependent sequences and their convergence to equilibrium are the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and concentration inequalities, ranging from those of Chebyshev, Cramér–Chernoff, and Bahadur–Rao to Hoeffding, has been added, with illustrative comparisons of their use in practice. This also includes a treatment of the Berry–Esseen error estimate in the central limit theorem. The authors assume mathematical maturity at a graduate level; otherwise the book is suitable for students with varying levels of background in analysis and measure theory. For the reader who needs refreshers, theorems from analysis and measure theory used in the main text are provided in comprehensive appendices, along with their proofs, for ease of reference. Rabi Bhattacharya is Professor of Mathematics at the University of Arizona. Edward Waymire is Professor of Mathematics at Oregon State University. Together the authors have written numerous books, including a series of four upcoming graduate textbooks in stochastic processes with applications.
This book develops systematically and rigorously, yet in an expository and lively manner, the evolution of general random processes and their large time properties such as transience, recurrence, and convergence to steady states. The emphasis is on the most important classes of these processes from the viewpoint of theory as well as applications, namely, Markov processes. The book features very broad coverage of the most applicable aspects of stochastic processes, including sufficient material for self-contained courses on random walks in one and multiple dimensions; Markov chains in discrete and continuous time, including birth–death processes; Brownian motion and diffusions; stochastic optimization; and stochastic differential equations. This book is for graduate students in mathematics, statistics, science and engineering, and it may also be used as a reference by professionals in diverse fields whose work involves the application of probability.
This textbook explores two distinct stochastic processes that evolve at random: weakly stationary processes and discrete parameter Markov processes. Building from simple examples, the authors focus on developing context and intuition before formalizing the theory of each topic. This inviting approach illuminates the key ideas and computations in the proofs, forming an ideal basis for further study. After recapping the essentials from Fourier analysis, the book begins with an introduction to the spectral representation of a stationary process. Topics in ergodic theory follow, including Birkhoff’s Ergodic Theorem and an introduction to dynamical systems. From here, the Markov property is assumed and the theory of discrete parameter Markov processes is explored on a general state space. Chapters cover a variety of topics, including birth–death chains, hitting probabilities and absorption, the representation of Markov processes as iterates of random maps, and large deviation theory for Markov processes. A chapter on geometric rates of convergence to equilibrium includes a splitting condition that captures the recurrence structure of certain iterated maps in a novel way. A selection of special topics concludes the book, including applications of large deviation theory, the FKG inequalities, coupling methods, and the Kalman filter. Featuring many short chapters and a modular design, this textbook offers an in-depth study of stationary and discrete-time Markov processes. Students and instructors alike will appreciate the accessible, example-driven approach and engaging exercises throughout. A single, graduate-level course in probability is assumed.
This graduate text presents the elegant and profound theory of continuous parameter Markov processes and many of its applications. The authors focus on developing context and intuition before formalizing the theory of each topic, illustrated with examples. After a review of some background material, the reader is introduced to semigroup theory, including the Hille–Yosida Theorem, used to construct continuous parameter Markov processes; it is a cornerstone of Feller's seminal theory of the most general one-dimensional diffusions, studied in a later chapter. This is followed by two chapters with probabilistic constructions of jump Markov processes, and of processes with independent increments, or Lévy processes. The greater part of the book is devoted to Itô's fascinating theory of stochastic differential equations, and to the study of asymptotic properties of diffusions in all dimensions, such as explosion, transience, recurrence, existence of steady states, and the speed of convergence to equilibrium. A broadly applicable functional central limit theorem for ergodic Markov processes is presented with important examples. Intimate connections between diffusions and linear second order elliptic and parabolic partial differential equations are laid out in two chapters, and are used for computational purposes. Among the Special Topics chapters, two study anomalous diffusions: one on skew Brownian motion, and the other on an intriguing multi-phase homogenization of solute transport in porous media.
This textbook offers an approachable introduction to stochastic processes that explores the four pillars of random walk, branching processes, Brownian motion, and martingales. Building from simple examples, the authors focus on developing context and intuition before formalizing the theory of each topic. This inviting approach illuminates the key ideas and computations in the proofs, forming an ideal basis for further study. Consisting of many short chapters, the book begins with a comprehensive account of the simple random walk in one dimension. From here, different paths may be chosen according to interest. Themes span Poisson processes, branching processes, the Kolmogorov–Chentsov theorem, martingales, renewal theory, and Brownian motion. Special topics follow, showcasing a selection of important contemporary applications, including mathematical finance, optimal stopping, ruin theory, branching random walk, and equations of fluids. Engaging exercises accompany the theory throughout. Random Walk, Brownian Motion, and Martingales is an ideal introduction to the rigorous study of stochastic processes. Students and instructors alike will appreciate the accessible, example-driven approach. A single, graduate-level course in probability is assumed.