After a brief historical perspective, A First Course on Parametric Inference discusses the basic concept of a sufficient statistic and the classical approach based on minimum variance unbiased estimators. There is a separate chapter on the simultaneous estimation of several parameters. Large-sample theory of estimation, based on consistent asymptotically normal estimators obtained by the method of moments, the percentile method, and the method of maximum likelihood, is also introduced. Tests of hypotheses for finite samples are developed using classical Neyman-Pearson theory, and its connection with the Bayesian approach is pointed out. Hypothesis testing and confidence interval techniques are then developed, leading to likelihood ratio tests, score tests, and tests based on maximum likelihood estimators."--BOOK JACKET.