Differential Item Functioning, Second Edition is a revision of the 1983 title Test Item Bias. In the past 23 years, differential item performance has assumed a level of attention unimagined in the early 1980s. Then, only a few tests and assessment programs attended to "item bias"; today, doing so is a mandatory step in any responsible assessment program. Technical advances, such as the widespread use of item response theory, have also pushed the field of differential performance to levels of sophistication far beyond what was practiced years ago. This new edition presents an up-to-date description of DIF; describes varying procedures for addressing DIF in practical testing contexts; presents useful examples and studies of DIF that readers may employ as a guide in their own DIF work; and briefly describes relevant features of major statistical packages that can be employed in DIF analysis (e.g., SPSS, SAS, M+, Minitab, and Systat). This text is ideal for the measurement professional or advanced student who deals with educational or psychological assessment. Readers need only a preliminary background in tests and measurement, including some beginning statistics and elementary algebra, to find this volume useful.
Quantitative thinking is our inclination to view natural and everyday phenomena through a lens of measurable events, with forecasts, odds, predictions, and likelihood playing a dominant part. The Error of Truth recounts the astonishing and unexpected tale of how quantitative thinking came to be, and of its rise to primacy in the nineteenth and early twentieth centuries. It also considers how seeing the world through a quantitative lens has shaped our perception of the world we live in, and explores the lives of the individuals behind its early establishment. This worldview was unlike anything humankind had known before, and it came about because of a momentous human achievement: we had learned how to measure uncertainty. Probability as a science was conceptualised. As a result of probability theory, we now had correlations, reliable predictions, regressions, the bell-shaped curve for studying social phenomena, and the psychometrics of educational testing. Significantly, these developments happened during a relatively short period in world history: roughly the 130-year span from 1790 to 1920, from about the close of the Napoleonic era, through the Enlightenment and the Industrial Revolutions, to the end of World War I. By that time, transportation had advanced rapidly, thanks to the invention of the steam engine, and literacy rates had increased dramatically. This brief period was ripe for fresh intellectual activity, and it provided a kind of impetus for the inventions of probability. Quantification is now everywhere in our daily lives: in the ubiquitous microchip in smartphones, cars, and appliances; in the Bayesian logic of artificial intelligence; and in applications in business, engineering, medicine, economics, and elsewhere. Probability is the foundation of quantitative thinking. The Error of Truth tells its story: when, why, and how it happened.
A unique, practical manual for identifying and analyzing item bias in standardized tests. Osterlind discusses five strategies for detecting bias: analysis of variance, transformed item difficulties, chi-square, item characteristic curve, and distractor response. He covers the specific hypotheses tested by each technique, as well as the capabilities and limitations of each strategy.