Publicly available statistics from government agencies that are credible, relevant, accurate, and timely are essential for policy makers, individuals, households, businesses, academic institutions, and other organizations to make informed decisions. Moreover, the effective operation of a democratic system of government depends on the unhindered flow of statistical information to its citizens. In the United States, federal statistical agencies in cabinet departments and independent agencies are the governmental units whose principal function is to compile, analyze, and disseminate information for such statistical purposes as describing population characteristics and trends, planning and monitoring programs, and conducting research and evaluation. The work of these agencies is coordinated by the U.S. Office of Management and Budget. Statistical agencies may acquire information not only from surveys or censuses of people and organizations, but also from such sources as government administrative records, private-sector datasets, and Internet sources that are judged of suitable quality and relevance for statistical use. They may conduct analyses, but they do not advocate policies or take partisan positions. Statistical purposes for which they provide information relate to descriptions of groups and exclude any interest in or identification of an individual person, institution, or economic unit. Four principles are fundamental for a federal statistical agency: relevance to policy issues, credibility among data users, trust among data providers, and independence from political and other undue external influence. Principles and Practices for a Federal Statistical Agency: Sixth Edition presents and comments on these principles as they have been affected by changes in laws, regulations, and other aspects of the environment of federal statistical agencies over the past 4 years.
The Bureau of Justice Statistics (BJS) of the U.S. Department of Justice is one of the smallest of the U.S. principal statistical agencies but shoulders one of the most expansive and detailed legal mandates among them. Ensuring the Quality, Credibility, and Relevance of U.S. Justice Statistics examines the full range of BJS programs and suggests priorities for data collection. BJS's data collection portfolio is a solid body of work, well justified by public information needs or legal requirements, and represents a commendable effort to meet the bureau's broad mandate with less-than-commensurate fiscal resources. The book identifies some major gaps in the substantive coverage of BJS data, but notes that filling those gaps would require increased and sustained support in terms of staff and fiscal resources. In suggesting strategic goals for BJS, the book argues that the bureau's foremost goal should be to establish and maintain a strong position of independence. To avoid structural or political interference in BJS work, the report suggests changing the administrative placement of BJS within the Justice Department and making the BJS directorship a fixed-term appointment. In its thirtieth year, BJS can look back on a solid body of accomplishment; this book suggests further directions for improvement to give the nation the justice statistics, and the BJS, that it deserves.
The environment for obtaining information and providing statistical data for policy makers and the public has changed significantly in the past decade, raising questions about the fundamental survey paradigm that underlies federal statistics. New data sources provide opportunities to develop a new paradigm that can improve timeliness, geographic or subpopulation detail, and statistical efficiency; such a paradigm also has the potential to reduce the costs of producing federal statistics. The panel's first report described federal statistical agencies' current paradigm, which relies heavily on sample surveys for producing national statistics, and the challenges agencies are facing; the legal frameworks and mechanisms for protecting the privacy and confidentiality of statistical data and for providing researchers access to data, and challenges to those frameworks and mechanisms; and statistical agencies' access to alternative sources of data. The panel recommended a new approach for federal statistical programs that would combine diverse data from government and private-sector sources, and the creation of a new entity that would provide the foundational elements needed for this new approach, including legal authority to access data and protect privacy. This second of the panel's two reports builds on the analysis, conclusions, and recommendations of the first. It assesses alternative methods for implementing such an approach, including describing statistical models for combining data from multiple sources; examining statistical and computer science approaches that foster privacy protections; evaluating frameworks for assessing the quality and utility of alternative data sources; and considering various models for implementing the recommended new entity. Together, the two reports offer ideas and recommendations to help federal statistical agencies examine and evaluate data from alternative sources and then combine them as appropriate to provide the country with more timely, actionable, and useful information for policy makers, businesses, and individuals.
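The report's own statistical and computer science approaches to privacy protection are not reproduced here, but one textbook technique frequently cited in this literature is the Laplace mechanism for differential privacy. Below is a minimal sketch, not drawn from the report itself; the function name and the epsilon value are illustrative assumptions.

```python
import numpy as np


def dp_count(true_count: int, epsilon: float,
             rng: np.random.Generator | None = None) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    respondent changes the count by at most 1), so Laplace noise
    with scale 1/epsilon suffices for epsilon-DP.
    """
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)


# Example: release a tabulated count of 1,234 records at epsilon = 0.5.
print(dp_count(1234, epsilon=0.5))
```

Smaller epsilon values add more noise and give stronger privacy; the cost is reduced accuracy of the released statistic.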
Since 1992, the Committee on National Statistics (CNSTAT) has produced a book on principles and practices for a federal statistical agency, updating the document every 4 years to provide a current edition to newly appointed cabinet secretaries at the beginning of each presidential administration. This third edition presents and comments on three basic principles that statistical agencies must embody in order to carry out their mission fully: (1) They must produce objective data that are relevant to policy issues, (2) they must achieve and maintain credibility among data users, and (3) they must achieve and maintain trust among data providers. The book also discusses 11 important practices that are means for statistical agencies to live up to the three principles. These practices include a commitment to quality and professional practice and an active program of methodological and substantive research.
The decennial census is the federal government's largest and most complex peacetime operation. This report of a panel of the National Research Council's Committee on National Statistics comprehensively reviews the conduct of the 2000 census and the quality of the resulting data. The panel's findings cover the planning process for 2000, which was marked by an atmosphere of intense controversy about the proposed role of statistical techniques in the census enumeration and possible adjustment for errors in counting the population. The report addresses the success and problems of major innovations in census operations, the completeness of population coverage in 2000, and the quality of both the basic demographic data collected from all census respondents and the detailed socioeconomic data collected from the census long-form sample (about one-sixth of the population). The panel draws comparisons with the 1990 experience and recommends improvements in the planning process and design for 2010. The 2000 Census: Counting Under Adversity will be an invaluable resource for users of the 2000 data and for policymakers and census planners. It provides a trove of information about the issues that have fueled debate about the census process and about the operations and quality of the nation's twenty-second decennial enumeration.
Planning for the 2020 census is already beginning. This book from the National Research Council examines several aspects of census planning, including questionnaire design, address updating, non-response follow-up, coverage follow-up, de-duplication of housing units and residents, editing and imputation procedures, and several other census operations. This book recommends that the Census Bureau overhaul its approach to research and development. The report urges the Bureau to set cost and quality goals for the 2020 and future censuses, improving efficiency by taking advantage of new technologies.
To derive statistics about crime (to estimate its levels and trends, assess its costs to and impacts on society, and inform law enforcement approaches to prevent it), a conceptual framework for defining and thinking about crime is virtually a prerequisite. Developing and maintaining such a framework is no easy task, because the mechanics of crime are ever evolving and shifting, tied to developments in technology, society, and legislation. Interest in understanding crime surged in the 1920s, which proved to be a pivotal decade for the collection of nationwide crime statistics. Now established as a permanent agency, the Census Bureau commissioned the drafting of a manual for preparing crime statistics, intended for use by the police, corrections departments, and courts alike. The new manual sought to solve a perennial problem by suggesting a standard taxonomy of crime. Shortly after the Census Bureau issued its manual, the International Association of Chiefs of Police in convention adopted a resolution to create a Committee on Uniform Crime Records to begin the process of describing what a national system of data on crimes known to the police might look like. The key distinction between the rigorous classification proposed in this report and the "classifications" that have come before in U.S. crime statistics is that it is intended to partition the entirety of behaviors that could be considered criminal offenses into mutually exclusive categories. Modernizing Crime Statistics: Report 1: Defining and Classifying Crime assesses and makes recommendations for the development of a modern set of crime measures in the United States and the best means for obtaining them. This first report develops a new classification of crime by weighing various perspectives on how crime should be defined and organized against the needs and demands of the full array of crime data users and stakeholders.
U.S. business data are used broadly, providing the building blocks for key national-as well as regional and local-statistics measuring aggregate income and output, employment, investment, prices, and productivity. Beyond aggregate statistics, individual- and firm-level data are used for a wide range of microanalyses by academic researchers and by policy makers. In the United States, data collection and production efforts are conducted by a decentralized system of statistical agencies. This apparatus yields an extensive array of data that, particularly when made available in the form of microdata, provides an unparalleled resource for policy analysis and research on social issues and for the production of economic statistics. However, the decentralized nature of the statistical system also creates challenges to efficient data collection, to containment of respondent burden, and to maintaining consistency of terms and units of measurement. It is these challenges that make the practice of effective data sharing among the statistical agencies paramount. With this as the backdrop, the Bureau of Economic Analysis (BEA) asked the Committee on National Statistics of the National Academies to convene a workshop to discuss interagency business data sharing. The workshop was held October 21, 2005. This report is a summary of the discussions of that workshop. The workshop focused on the benefits of data sharing to two groups of stakeholders: the statistical agencies themselves and downstream data users. Presenters were asked to highlight untapped opportunities for productive data sharing that cannot yet be exploited because of regulatory or legislative constraints. The most prominently discussed example was that of tax data needed to reconcile the two primary business lists used by the statistical agencies.
On May 8, 2009, the symposium The Federal Statistical System: Recognizing Its Contributions, Moving It Forward was held in Washington, DC. One of the topics considered at that symposium was the health of innovation in the federal statistical system. A consequence of the symposium was an agreement by the Committee on National Statistics to hold a workshop on the future of innovation in the federal statistical system. This workshop was held on June 29, 2010. The original statement of task for the workshop focused on three challenges to the statistical system: (1) the obstacles to innovative, focused research and development initiatives that could make statistical programs more cost effective; (2) a gap between emerging data visualization and communications technologies and the ability of statistical agencies to understand and capitalize on these developments for their data dissemination programs; and (3) the maturation of the information technology (IT) discipline and the difficulties confronting individual agencies in keeping current with best practice in IT regarding data confidentiality. This report, Facilitating Innovation in the Federal Statistical System, is a descriptive summary of what transpired at the workshop. It is therefore limited to the views and opinions of the workshop participants. However, it does not strictly follow the agenda of the workshop, which had four sessions. Instead, it is organized around the themes of the discussions, which migrated across the four sessions.
The Committee on National Statistics and the Committee on Population, at the request of the National Institute on Aging (NIA), convened a workshop in March 1996 to discuss data on the aging population that address the emerging and important social, economic, and health conditions of the older population. The purposes of the workshop were to identify how the population at older ages in the next few decades will differ from the older population today, to understand the underlying causes of those changes, to anticipate future problems and policy issues, and to suggest future needs for data for research in these areas. The scope of the workshop was broader than that of the 1988 CNSTAT report, including not only data on health and long-term care, but also actuarial, economic, demographic, housing, and epidemiological data needs for informing public policy.
The Committee on National Statistics (CNSTAT) convened a workshop on November 4-5, 1999, to identify new directions for health statistics and the implications for health data of changes in the health arena faced by the U.S. Department of Health and Human Services (DHHS); state and local health departments; the consumers, developers, and providers of health care products and services; and other health policy makers. Changes in our understanding of health, in health care (managed care, Medicaid, Medicare), in welfare reform, in federal-state relations, in the availability of administrative data, in advances in genetic data, in information technology, in confidentiality issues, and in data integration are examples of recent developments that may play a significant role for DHHS in making future policy decisions.
Federal government statistics provide critical information to the country and serve a key role in a democracy. For decades, sample surveys with instruments carefully designed for particular data needs have been one of the primary methods for collecting data for federal statistics. However, the costs of conducting such surveys have been increasing while response rates have been declining, and many surveys are not able to fulfill growing demands for more timely information and for more detailed information at state and local levels. Innovations in Federal Statistics examines the opportunities and risks of using government administrative and private sector data sources to foster a paradigm shift in federal statistical programs that would combine diverse data sources in a secure manner to enhance federal statistics. This first publication of a two-part series discusses the challenges faced by the federal statistical system and the foundational elements needed for a new paradigm.
In 2014 the National Science Foundation (NSF) provided support to the National Academies of Sciences, Engineering, and Medicine for a series of Forums on Open Science in response to a government-wide directive to support increased public access to the results of research funded by the federal government. However, the breadth of the work resulting from the series precluded a focus on any specific topic or discussion about how to improve public access. Thus, the main goal of the Workshop on Transparency and Reproducibility in Federal Statistics was to develop some understanding of what principles and practices are, or would be, supportive of making federal statistics more understandable and reviewable, both by agency staff and the public. This publication summarizes the presentations and discussions from the workshop.
The usefulness of the U.S. decennial census depends critically on the accuracy with which individual people are counted in specific housing units, at precise geographic locations. The 2000 and other recent censuses have relied on a set of residence rules to craft instructions on the census questionnaire in order to guide respondents to identify their correct "usual residence." Determining the proper place to count such groups as college students, prisoners, and military personnel has always been complicated and controversial; major societal trends such as placement of children in shared custody arrangements and the prevalence of "snowbird" and "sunbird" populations who regularly move to favorable climates further make it difficult to specify ties to one household and one place. Once, Only Once, and in the Right Place reviews the evolution of current residence rules and the way residence concepts are presented to respondents. It proposes major changes to the basic approach of collecting residence information and suggests a program of research to improve the 2010 and future censuses.
Reform of welfare is one of the nation's most contentious issues, with debate often driven more by politics than by facts and careful analysis. Evaluating Welfare Reform in an Era of Transition identifies the key policy questions for measuring whether our changing social welfare programs are working, reviews the available studies and research, and recommends the most effective ways to answer those questions. This book discusses the development of welfare policy, including the landmark 1996 federal law that devolved most of the responsibility for welfare policies and their implementation to the states. A thorough analysis of the available research leads to the identification of gaps in what is currently known about the effects of welfare reform. Evaluating Welfare Reform in an Era of Transition specifies what, and why, we need to know about the response of individual states to the federal overhaul of welfare and the effects of the many changes in the nation's welfare laws, policies, and practices. With a clear approach to a variety of issues, Evaluating Welfare Reform in an Era of Transition will be important to policy makers, welfare administrators, researchers, journalists, and advocates on all sides of the issue.
Policy makers need information about the nation—ranging from trends in the overall economy down to the use by individuals of Medicare—in order to evaluate existing programs and to develop new ones. This information often comes from research based on data about individual people, households, and businesses and other organizations, collected by statistical agencies. The benefit of increasing data accessibility to researchers and analysts is better informed public policy. To realize this benefit, a variety of modes for data access— including restricted access to confidential data and unrestricted access to appropriately altered public-use data—must be used. The risk of expanded access to potentially sensitive data is the increased probability of breaching the confidentiality of the data and, in turn, eroding public confidence in the data collection enterprise. Indeed, the statistical system of the United States ultimately depends on the willingness of the public to provide the information on which research data are based. Expanding Access to Research Data issues guidance on how to more fully exploit these tradeoffs. The panel’s recommendations focus on needs highlighted by legal, social, and technological changes that have occurred during the last decade.
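As an illustration of the "appropriately altered public-use data" mode of access mentioned above, the sketch below applies two standard disclosure-limitation alterations, top-coding and coarse rounding, to an income column. It is a generic example, not a procedure from the report; the function name and the thresholds are hypothetical.

```python
import numpy as np


def mask_income(income: np.ndarray,
                top_code: float = 200_000.0,
                round_to: float = 100.0) -> np.ndarray:
    """Alter an income column for a public-use file.

    Top-coding caps extreme, potentially re-identifiable values;
    coarse rounding removes exact amounts that could act as
    indirect identifiers when linked to outside data.
    """
    capped = np.minimum(income, top_code)
    return np.round(capped / round_to) * round_to


# Example with hypothetical microdata.
incomes = np.array([23_540.0, 87_112.0, 1_250_000.0])
print(mask_income(incomes))  # -> [23500. 87100. 200000.]
```

Alterations like these trade a small loss in analytic detail for a lower probability that any individual record can be re-identified, which is exactly the tradeoff the report weighs.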
Modernizing Crime Statistics: Report 1: Defining and Classifying Crime performed a comprehensive reassessment of what is meant by crime in U.S. crime statistics and recommended a new classification of crime to organize measurement efforts. This second report examines methodological and implementation issues and presents a conceptual blueprint for modernizing crime statistics.
Vital statistics, the records of birth and death, are a critical national information resource for understanding public health. Over the past few decades, the specific program that gathers the data has evolved into a complex cooperative program between the federal and state governments for social measurement. The Vital Statistics Cooperative Program (VSCP) is currently maintained by the National Center for Health Statistics (NCHS). The U.S. vital statistics system relies on the original information reported by myriad individuals, channeled through varying state and local information systems, and coordinated and processed by a federal statistical agency that has experienced relatively flat funding for many years. The challenges facing the vital statistics system and the continuing importance of the resulting data make it an important topic for examination. A workshop, held by the National Academies and summarized in this volume, considered the importance of adequate vital statistics. In particular, the workshop assessed both current and emerging uses of the data, considered the methodological and organizational features of compiling vital data, and identified possible visions for the vital statistics program.
This final report of the Panel on Alternative Census Methodologies provides an assessment of the Census Bureau's plans for the 2000 census as of the time of the 1998 census dress rehearsal. It examines changes in census plans following, and to a modest extent in reaction to, the panel's second interim report, regarding the use of sampling for nonresponse follow-up, construction of the master address file, use of multiple response modes and respondent-friendly questionnaires, and the use of administrative records. It also describes evaluation plans for the census dress rehearsal and plans for data collection and experimentation during the 2000 census. Most of the results from the dress rehearsal were not yet available to the panel, so this report does not offer any suggested changes to 2000 census plans in response to the dress rehearsal.
In the early 1990s, the Census Bureau proposed a program of continuous measurement as a possible alternative to the gathering of detailed social, economic, and housing data from a sample of the U.S. population as part of the decennial census. The American Community Survey (ACS) became a reality in 2005, and has included group quarters (GQ), such places as correctional facilities for adults, student housing, nursing facilities, inpatient hospice facilities, and military barracks, since 2006, primarily to more closely replicate the design and data products of the census long-form sample. The decision to include group quarters in the ACS enables the Census Bureau to provide a comprehensive benchmark of the total U.S. population (not just those living in households). However, the fact that the ACS must rely on a sample of what is a small and very diverse population, combined with limited funding available for survey operations, makes the ACS GQ sampling, data collection, weighting, and estimation procedures more complex and the estimates more susceptible to problems stemming from these limitations. The concerns are magnified in small areas, particularly in terms of detrimental effects on the total population estimates produced for small areas. Small Populations, Large Effects provides an in-depth review of the statistical methodology for measuring the GQ population in the ACS. This report addresses difficulties associated with measuring the GQ population and the rationale for including GQs in the ACS. Considering user needs for ACS data as well as operational feasibility and compatibility with the treatment of the household population in the ACS, the report recommends alternatives to the survey design and other methodological features that can make the ACS more useful for users of small-area data.
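To make concrete how design weights enter GQ estimation, here is a minimal design-based (Horvitz-Thompson) sketch for estimating a group-quarters population total from a facility sample. The data and the Poisson-sampling variance formula are illustrative assumptions, not the ACS's actual weighting procedure.

```python
import numpy as np

# Hypothetical sample: y[i] is the resident count observed at
# sampled facility i; pi[i] is facility i's inclusion probability.
y = np.array([120.0, 35.0, 210.0, 58.0])
pi = np.array([0.10, 0.02, 0.15, 0.05])

# Horvitz-Thompson estimator of the population total: weight each
# observation by the inverse of its inclusion probability.
total_hat = np.sum(y / pi)

# Design-based variance, valid under Poisson sampling (an assumption).
var_hat = np.sum((1.0 - pi) * (y / pi) ** 2)

print(f"estimated total: {total_hat:.0f}, SE: {np.sqrt(var_hat):.0f}")
```

The sketch shows why small, diverse GQ samples yield volatile small-area estimates: a few facilities with small inclusion probabilities carry very large weights, inflating the variance.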
America's farms and farmers are integral to the U.S. economy and, more broadly, to the nation's social and cultural fabric. A healthy agricultural sector helps ensure a safe and reliable food supply, improves energy security, and contributes to employment and economic development, traditionally in small towns and rural areas where farming serves as a nexus for related sectors from farm machinery manufacturing to food processing. The agricultural sector also plays a role in the nation's overall economic growth by providing crucial raw inputs for the production of a wide range of goods and services, including many that generate substantial export value. If the agricultural sector is to be accurately understood and the policies that affect its functioning are to remain well informed, the statistical system's data collection programs must be periodically revisited to ensure they are keeping up with current realities. This report reviews current information and makes recommendations to the U.S. Department of Agriculture's (USDA's) National Agricultural Statistics Service (NASS) and Economic Research Service (ERS) to help identify effective methods for collecting data and reporting information about American agriculture, given increased complexity and other changes in farm business structure in recent decades.
The National Agricultural Statistics Service (NASS) is the primary statistical data collection agency within the U.S. Department of Agriculture (USDA). NASS conducts hundreds of surveys each year and prepares reports covering virtually every aspect of U.S. agriculture. Among the small-area estimates produced by NASS are county-level estimates for crops (planted acres, harvested acres, production, and yield by commodity) and for cash rental rates for irrigated cropland, nonirrigated cropland, and permanent pastureland. Key users of these county-level estimates include USDA's Farm Services Agency (FSA) and Risk Management Agency (RMA), which use the estimates as part of their processes for distributing farm subsidies and providing farm insurance, respectively. Improving Crop Estimates by Integrating Multiple Data Sources assesses county-level crop and cash rents estimates, and offers recommendations on methods for integrating data sources to provide more precise county-level estimates of acreage and yield for major crops and of cash rents by land use. This report considers technical issues involved in using the available data sources, such as methods for integrating the data, the assumptions underpinning the use of each source, the robustness of the resulting estimates, and the properties of desirable estimates of uncertainty.
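The report's methods for integrating data sources are not reproduced here, but a minimal sketch of one generic approach, inverse-variance weighting of a survey estimate and an administrative-data estimate of the same county quantity, conveys the basic idea. All numbers and names are hypothetical.

```python
def composite(est_a: float, var_a: float,
              est_b: float, var_b: float) -> tuple[float, float]:
    """Inverse-variance-weighted combination of two unbiased estimates.

    Each source is weighted by its precision (1/variance); the
    combined variance is never larger than either input variance.
    """
    prec_a, prec_b = 1.0 / var_a, 1.0 / var_b
    est = (prec_a * est_a + prec_b * est_b) / (prec_a + prec_b)
    return est, 1.0 / (prec_a + prec_b)


# Hypothetical county yield: survey says 48.2 bu/acre (variance 4.0),
# an administrative source says 51.0 bu/acre (variance 9.0).
print(composite(48.2, 4.0, 51.0, 9.0))  # -> (~49.06, ~2.77)
```

Note the assumption doing the work: both sources must be (approximately) unbiased for the same quantity, which is why the report devotes attention to the assumptions underpinning each data source.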
Patterns of food consumption and nutritional intake strongly affect the population's health and well-being. The Food Economics Division of USDA's Economic Research Service (ERS) engages in research and data collection to inform policy making related to the leading federal nutrition assistance programs managed by USDA's Food and Nutrition Service. The ERS uses the Consumer Food Data System to understand why people choose foods, how food assistance programs affect these choices, and the health impacts of those choices. At the request of ERS, A Consumer Food Data System for 2030 and Beyond provides a blueprint for ERS's Food Economics Division for its data strategy over the next decade. This report explores the quality of data collected, the data collection process, and the kinds of data that may be most valuable to researchers, policy makers, and program administrators going forward. The recommendations of A Consumer Food Data System for 2030 and Beyond will guide ERS to provide and sustain a multisource, interconnected, reliable data system.
The National Center for Science and Engineering Statistics (NCSES) of the National Science Foundation (NSF) communicates its science and engineering (S&E) information to data users in a fluid, rapidly modernizing environment: data producers' dissemination practices, protocols, and technologies, on the one hand, and user demands and capabilities, on the other, are changing faster than the agency has been able to accommodate. NCSES asked the Committee on National Statistics and the Computer Science and Telecommunications Board of the National Research Council to form a panel to review the NCSES communication and dissemination program, which is concerned with the collection and distribution of information on science and engineering, and to recommend future directions for the program. Communicating Science and Engineering Data in the Information Age includes recommendations to improve NCSES's dissemination program and data user engagement, such as transitioning to a dissemination framework that emphasizes database management rather than data presentation, and analyzing the results of NCSES's initial online consumer survey and refining it over time. The implementation of the report's recommendations should be undertaken within an overall framework that accords priority first to the basic quality of the data and the fundamentals of dissemination, then to significant enhancements achievable in the short term, while laying the groundwork for longer-term improvements.
The Panel on Estimates of Poverty for Small Geographic Areas was established by the Committee on National Statistics at the National Research Council in response to the Improving America's Schools Act of 1994. That act charged the U.S. Census Bureau with producing updated estimates of poor school-age children every two years for the nation's more than 3,000 counties and 14,000 school districts. It also charged the panel with determining the appropriateness and reliability of the Bureau's estimates for use in allocating more than $7 billion of Title I funds each year for educationally disadvantaged children. The panel's charge was both major and bound by immovable deadlines: it had to evaluate the Census Bureau's work on a very tight schedule in order to meet legal requirements for the allocation of Title I funds. As it turned out, the panel produced three interim reports: the first evaluated county-level estimates of poor school-age children in 1993; the second assessed a revised set of 1993 county estimates; and the third covered both county- and school district-level estimates of poor school-age children in 1995. This volume combines and updates these three reports into a single reference volume.
The National Science Foundation (NSF) has long collected information on the number and characteristics of individuals with education or employment in science and engineering and related fields in the United States. An important motivation for this effort is to fulfill a congressional mandate to monitor the status of women and minorities in the science and engineering workforce. Consequently, many statistics are calculated by race or ethnicity, gender, and disability status. For more than 25 years, NSF obtained a sample frame for identifying the target population for information it gathered from the list of respondents to the decennial census long form who indicated that they had earned a bachelor's or higher degree. The probability that an individual was sampled from this list was dependent on both demographic and employment characteristics. But the source for the sample frame will no longer be available because the census long form is being replaced as of the 2010 census with the continuous collection of detailed demographic and other information in the new American Community Survey (ACS). At the request of NSF's Science Resources Statistics Division, the Committee on National Statistics of the National Research Council formed a panel to conduct a workshop and study the issues involved in replacing the decennial census long-form sample with a sample from the ACS to serve as the frame for the information the NSF gathers. The workshop had the specific objective of identifying issues for the collection of field of degree information on the ACS with regard to goals, content, statistical methodology, data quality, and data products.
The Committee on National Statistics of the National Academies of Sciences, Engineering, and Medicine convened a 2-day public workshop on December 11-12, 2019, to discuss the suite of data products the Census Bureau will generate from the 2020 Census. The workshop featured presentations by users of decennial census data products to help the Census Bureau better understand the uses of the data products and the importance of these uses, and to help inform the Census Bureau's decisions on the final specification of 2020 data products. This publication summarizes the presentations and discussions of the workshop.
Recent trends in federal policies for social and economic programs have increased the demand for timely, accurate estimates of income and poverty for states, counties, and even smaller areas. Every year more than $130 billion in federal funds is allocated to states and localities through formulas that use such estimates. These funds support a wide range of programs that include child care, community development, education, job training, nutrition, and public health. A new program of the U.S. Census Bureau is now providing more timely estimates for these programs than those from the decennial census, which have been used for many years. These new estimates are being used to allocate more than $7 billion annually to school districts, through the Title I program that supports educationally disadvantaged children. But are these estimates as accurate as possible given the available data? How can the statistical models and data that are used to develop the estimates be improved? What should policy makers consider in selecting particular estimates? This new book from the National Research Council provides guidance for improving the Census Bureau's program and for policy makers who use such estimates for allocating funds.
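One widely used family of statistical models for such small-area income and poverty estimates is area-level shrinkage of the Fay-Herriot type, which blends a noisy direct survey estimate with a regression prediction built from auxiliary data such as tax records. The sketch below illustrates only the blending step; the inputs and variance values are hypothetical, not figures from the Census Bureau's program.

```python
# Hypothetical inputs for one county.
direct, v_direct = 0.21, 0.0016  # direct survey poverty rate, sampling variance
synthetic = 0.17                 # regression prediction from auxiliary data
sigma2_model = 0.0009            # assumed model-error variance

# Fay-Herriot-style shrinkage: the direct estimate gets more weight
# when its sampling variance is small relative to the model variance.
gamma = sigma2_model / (sigma2_model + v_direct)
blended = gamma * direct + (1.0 - gamma) * synthetic
print(f"shrinkage weight {gamma:.2f}, blended estimate {blended:.4f}")
```

For large counties with precise direct estimates, gamma approaches 1 and the survey dominates; for small counties, the model prediction dominates, which is what makes reliable small-area estimates possible at all.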
Disparities in health and health care across racial, ethnic, and socioeconomic backgrounds in the United States are well documented. The reasons for these disparities are, however, not well understood. Currently available data on race, ethnicity, socioeconomic position (SEP), acculturation, and language use are severely limited. The report examines data collection and reporting systems relating to the collection of data on race, ethnicity, and socioeconomic position and offers recommendations.
Recent years have seen a growing tendency for social scientists to collect biological specimens such as blood, urine, and saliva as part of large-scale household surveys. By combining biological and social data, scientists are opening up new fields of inquiry and are able for the first time to address many new questions and connections. But including biospecimens in social surveys also adds a great deal of complexity and cost to the investigator's task. Along with the usual concerns about informed consent, privacy issues, and the best ways to collect, store, and share data, researchers now face a variety of issues that are much less familiar or that appear in a new light. In particular, collecting and storing human biological materials for use in social science research raises additional legal, ethical, and social issues, as well as practical issues related to the storage, retrieval, and sharing of data. For example, acquiring biological data and linking them to social science databases requires a more complex informed consent process, the development of a biorepository, the establishment of data sharing policies, and the creation of a process for deciding how the data are going to be shared and used for secondary analysis, all of which add cost to a survey and require additional time and attention from the investigators. These issues also are likely to be unfamiliar to social scientists who have not worked with biological specimens in the past. Adding to the attraction of collecting biospecimens but also to the complexity of sharing and protecting the data is the fact that this is an era of incredibly rapid gains in our understanding of complex biological and physiological phenomena. Thus the tradeoffs between the risks and opportunities of expanding access to research data are constantly changing. Conducting Biosocial Surveys offers findings and recommendations concerning the best approaches to the collection, storage, use, and sharing of biospecimens gathered in social science surveys and the digital representations of biological data derived therefrom. It is aimed at researchers interested in carrying out such surveys, their institutions, and their funding agencies.
Since 1992, the Committee on National Statistics (CNSTAT) has produced a book on principles and practices for a federal statistical agency, updating the document every 4 years to provide a current edition to newly appointed cabinet secretaries at the beginning of each presidential administration. This fourth edition presents and comments on four basic principles that statistical agencies must embody in order to carry out their mission fully: (1) They must produce objective data that are relevant to policy issues, (2) they must achieve and maintain credibility among data users, (3) they must achieve and maintain trust among data providers, and (4) they must achieve and maintain a strong position of independence from the appearance and reality of political control. The book also discusses 11 important practices that are means for statistical agencies to live up to the four principles. These practices include a commitment to quality and professional practice and an active program of methodological and substantive research. This fourth edition adds the principle that statistical agencies must operate from a strong position of independence and the practice that agencies must have ongoing internal and external evaluations of their programs.