Planning for the 2020 census is already beginning. This book from the National Research Council examines several aspects of census planning, including questionnaire design, address updating, non-response follow-up, coverage follow-up, de-duplication of housing units and residents, editing and imputation procedures, and several other census operations. This book recommends that the Census Bureau overhaul its approach to research and development. The report urges the Bureau to set cost and quality goals for the 2020 and future censuses, improving efficiency by taking advantage of new technologies.
For the past 50 years, the Census Bureau has conducted experiments and evaluations with every decennial census involving field data collection, during which alternatives to current census processes are assessed for a subset of the population. An "evaluation" is usually a post hoc analysis of data collected as part of decennial census processing to determine whether individual steps in the census operated as expected. The 2010 Program for Evaluations and Experiments, known as CPEX, has enormous potential to reduce the costs and increase the effectiveness of the 2020 census; the panel narrowed the initial list of potential research topics from 52 to 6. The panel identified three priority experiments for inclusion in the 2010 census to assist 2020 census planning: (1) an experiment on the use of the Internet for data collection; (2) an experiment on the use of administrative records for various census purposes; and (3) an experiment (or set of experiments) on features of the census questionnaire. The panel also offers 11 recommendations to improve the efficiency and quality of data collection, including allowing use of the Internet for data submission and conducting one or more alternative questionnaire experiments to examine issues such as the representation of race and ethnicity.
The Panel on Research on Future Census Methods has a broad charge to review the early planning process for the 2010 census. Its work includes observing the operation of the 2000 census, deriving lessons for 2010, and advising on effective evaluations and tests. This is the panel's third report; it has previously issued an interim report offering suggestions on the Census Bureau's evaluation plan for 2000 and a letter report commenting on the bureau's proposed general structure for the 2010 census.
At the request of the U.S. Census Bureau, the National Research Council's Committee on National Statistics established the Panel on Research on Future Census Methods to review the early planning process for the 2010 census. This new report documents the panel's strong support for the major aims of the Census Bureau's emerging plan for 2010. At the same time, it notes the considerable challenges that must be overcome if the bureau's innovations are to be successful. The panel agrees with the Census Bureau that implementation of the American Community Survey and, with it, the separation of the long form from the census process are excellent concepts. Moreover, it concurs that the critically important Master Address File and TIGER geographic systems are in dire need of comprehensive updating and that new technologies have the potential to improve the accuracy of the count. The report identifies the risks and rewards of these and other components of the Census Bureau's plan. The report emphasizes the need for the bureau to link its research and evaluation efforts much more closely to operational planning and the importance of funding for a comprehensive and rigorous testing program before 2010.
The decennial census is the federal government's largest and most complex peacetime operation. This report of a panel of the National Research Council's Committee on National Statistics comprehensively reviews the conduct of the 2000 census and the quality of the resulting data. The panel's findings cover the planning process for 2000, which was marked by an atmosphere of intense controversy about the proposed role of statistical techniques in the census enumeration and possible adjustment for errors in counting the population. The report addresses the successes and problems of major innovations in census operations, the completeness of population coverage in 2000, and the quality of both the basic demographic data collected from all census respondents and the detailed socioeconomic data collected from the census long-form sample (about one-sixth of the population). The panel draws comparisons with the 1990 experience and recommends improvements in the planning process and design for 2010. The 2000 Census: Counting Under Adversity will be an invaluable resource for users of the 2000 data and for policymakers and census planners. It provides a trove of information about the issues that have fueled debate about the census process and about the operations and quality of the nation's twenty-second decennial enumeration.
The census coverage measurement programs have historically addressed three primary objectives: (1) to inform users about the quality of the census counts; (2) to help identify sources of error to improve census taking; and (3) to provide alternative counts based on information from the coverage measurement program. In planning the 1990 and 2000 censuses, the main objective was to produce alternative counts based on the measurement of net coverage error. For the 2010 census coverage measurement program, the Census Bureau will deemphasize that goal and is instead planning to focus on the second goal of improving census processes. This book, which details the findings of the National Research Council's Panel on Coverage Evaluation and Correlation Bias, strongly supports the Census Bureau's change in goal. However, the panel finds that the current plans for data collection, data analysis, and data products are still too oriented toward measurement of net coverage error to fully exploit this new focus. Although the Census Bureau has taken several important steps to revise data collection and analysis procedures and data products, this book recommends further steps to enhance the value of coverage measurement for the improvement of future census processes.
To derive statistics about crime - to estimate its levels and trends, assess its costs to and impacts on society, and inform law enforcement approaches to prevent it - a conceptual framework for defining and thinking about crime is virtually a prerequisite. Developing and maintaining such a framework is no easy task, because the mechanics of crime are ever evolving and shifting: tied to shifts and developments in technology, society, and legislation. Interest in understanding crime surged in the 1920s, which proved to be a pivotal decade for the collection of nationwide crime statistics. Now established as a permanent agency, the Census Bureau commissioned the drafting of a manual for preparing crime statistics, intended for use by the police, corrections departments, and courts alike. The new manual sought to solve a perennial problem by suggesting a standard taxonomy of crime. Shortly after the Census Bureau issued its manual, the International Association of Chiefs of Police in convention adopted a resolution to create a Committee on Uniform Crime Records to begin the process of describing what a national system of data on crimes known to the police might look like. Report 1 provided a comprehensive reassessment of what is meant by crime in U.S. crime statistics and recommended a new classification of crime to organize measurement efforts. This second report examines methodological and implementation issues and presents a conceptual blueprint for modernizing crime statistics.
To derive statistics about crime - to estimate its levels and trends, assess its costs to and impacts on society, and inform law enforcement approaches to prevent it - a conceptual framework for defining and thinking about crime is virtually a prerequisite. Developing and maintaining such a framework is no easy task, because the mechanics of crime are ever evolving and shifting: tied to shifts and developments in technology, society, and legislation. Interest in understanding crime surged in the 1920s, which proved to be a pivotal decade for the collection of nationwide crime statistics. Now established as a permanent agency, the Census Bureau commissioned the drafting of a manual for preparing crime statistics, intended for use by the police, corrections departments, and courts alike. The new manual sought to solve a perennial problem by suggesting a standard taxonomy of crime. Shortly after the Census Bureau issued its manual, the International Association of Chiefs of Police in convention adopted a resolution to create a Committee on Uniform Crime Records to begin the process of describing what a national system of data on crimes known to the police might look like. The key distinction between the rigorous classification proposed in this report and the "classifications" that have come before in U.S. crime statistics is that it is intended to partition the entirety of behaviors that could be considered criminal offenses into mutually exclusive categories. Modernizing Crime Statistics: Report 1: Defining and Classifying Crime assesses and makes recommendations for the development of a modern set of crime measures in the United States and the best means for obtaining them. This first report develops a new classification of crime by weighing various perspectives on how crime should be defined and organized with the needs and demands of the full array of crime data users and stakeholders.
The United States prides itself on being a nation of immigrants, and the country has a long history of successfully absorbing people from across the globe. The integration of immigrants and their children contributes to our economic vitality and our vibrant and ever-changing culture. We have offered opportunities to immigrants and their children to better themselves and to be fully incorporated into our society, and in exchange immigrants have become Americans - embracing an American identity and citizenship, protecting our country through service in our military, fostering technological innovation, harvesting its crops, and enriching everything from the nation's cuisine to its universities, music, and art. Today, the 41 million immigrants in the United States represent 13.1 percent of the U.S. population. The U.S.-born children of immigrants, the second generation, represent another 37.1 million people, or 12 percent of the population. Thus, together the first and second generations account for one out of four members of the U.S. population. Whether they are successfully integrating is therefore a pressing and important question. Are new immigrants and their children being well integrated into American society, within and across generations? Do current policies and practices facilitate their integration? How is American society being transformed by the millions of immigrants who have arrived in recent decades? To answer these questions, this new report from the National Academies of Sciences, Engineering, and Medicine summarizes what we know about how immigrants and their descendants are integrating into American society in a range of areas such as education, occupations, health, and language.
The Bureau of Justice Statistics (BJS) of the U.S. Department of Justice is one of the smallest of the U.S. principal statistical agencies but shoulders one of the most expansive and detailed legal mandates among those agencies. Ensuring the Quality, Credibility, and Relevance of U.S. Justice Statistics examines the full range of BJS programs and suggests priorities for data collection. BJS's data collection portfolio is a solid body of work, well justified by public information needs or legal requirements and a commendable effort to meet its broad mandate given less-than-commensurate fiscal resources. The book identifies some major gaps in the substantive coverage of BJS data, but notes that filling those gaps would require increased and sustained support in terms of staff and fiscal resources. In suggesting strategic goals for BJS, the book argues that the bureau's foremost goal should be to establish and maintain a strong position of independence. To avoid structural or political interference in BJS work, the report suggests changing the administrative placement of BJS within the Justice Department and making the BJS directorship a fixed-term appointment. In its thirtieth year, BJS can look back on a solid body of accomplishment; this book suggests further directions for improvement to give the nation the justice statistics, and the BJS, that it deserves.
The usefulness of the U.S. decennial census depends critically on the accuracy with which individual people are counted in specific housing units, at precise geographic locations. The 2000 and other recent censuses have relied on a set of residence rules to craft instructions on the census questionnaire in order to guide respondents to identify their correct "usual residence." Determining the proper place to count such groups as college students, prisoners, and military personnel has always been complicated and controversial; major societal trends such as placement of children in shared custody arrangements and the prevalence of "snowbird" and "sunbird" populations who regularly move to favorable climates further make it difficult to specify ties to one household and one place. Once, Only Once, and in the Right Place reviews the evolution of current residence rules and the way residence concepts are presented to respondents. It proposes major changes to the basic approach of collecting residence information and suggests a program of research to improve the 2010 and future censuses.
International trade plays a substantial role in the economy of the United States. More than 1.6 billion tons of international merchandise was conveyed using the U.S. transportation system in 2001. The need to transport this merchandise raises concerns about the quality of the transportation system and its ability to support this component of freight movement. Measuring International Trade on U.S. Highways evaluates the accuracy and reliability of measuring the ton-miles and value-miles of international trade traffic carried by highway for each state. This report also assesses the accuracy and reliability of the use of diesel fuel data as a measure of international trade traffic by state and identifies needed improvements in long-term data collection programs.
It is easy to underestimate how little was known about crimes and victims before the findings of the National Crime Victimization Survey (NCVS) became common wisdom. In the late 1960s, knowledge of crimes and their victims came largely from reports filed by local police agencies as part of the Federal Bureau of Investigation's (FBI) Uniform Crime Reporting (UCR) system, as well as from studies of the files held by individual police departments. Criminologists understood that there existed a "dark figure" of crime consisting of events not reported to the police. However, over the course of the last decade, the effectiveness of the NCVS has been undermined by the demands of conducting an increasingly expensive survey in an effectively flat-line budgetary environment. Surveying Victims: Options for Conducting the National Crime Victimization Survey reviews the programs of the Bureau of Justice Statistics (BJS). Specifically, it explores alternative options for conducting the NCVS, which is the largest BJS program. This book describes various design possibilities and their implications relative to three basic goals: flexibility, in terms of both content and analysis; utility for gathering information on crimes that are not well reported to police; and small-domain estimation, including providing information on states or localities. This book finds that, as currently configured and funded, the NCVS is not achieving and cannot achieve BJS's mandated goal to "collect and analyze data that will serve as a continuous indication of the incidence and attributes of crime." Accordingly, Surveying Victims recommends that BJS be afforded the budgetary resources necessary to generate accurate measures of victimization.
In the early 1990s, the Census Bureau proposed a program of continuous measurement as a possible alternative to the gathering of detailed social, economic, and housing data from a sample of the U.S. population as part of the decennial census. The American Community Survey (ACS) became a reality in 2005, and has included group quarters (GQ) - such places as correctional facilities for adults, student housing, nursing facilities, inpatient hospice facilities, and military barracks - since 2006, primarily to more closely replicate the design and data products of the census long-form sample. The decision to include group quarters in the ACS enables the Census Bureau to provide a comprehensive benchmark of the total U.S. population (not just those living in households). However, the fact that the ACS must rely on a sample of what is a small and very diverse population, combined with limited funding available for survey operations, makes the ACS GQ sampling, data collection, weighting, and estimation procedures more complex and the estimates more susceptible to problems stemming from these limitations. The concerns are magnified in small areas, particularly in terms of detrimental effects on the total population estimates produced for small areas. Small Populations, Large Effects provides an in-depth review of the statistical methodology for measuring the GQ population in the ACS. This report addresses difficulties associated with measuring the GQ population and the rationale for including GQs in the ACS. Considering user needs for ACS data, operational feasibility, and compatibility with the treatment of the household population in the ACS, the report recommends alternatives to the survey design and other methodological features that can make the ACS more useful for users of small-area data.
Sponsored by the Census Bureau and charged to evaluate the 2010 U.S. census with an eye toward suggesting research and development for the 2020 census, the Panel to Review the 2010 Census uses this first interim report to suggest general priorities for 2020 research. Although the Census Bureau has taken some useful organizational and administrative steps to prepare for 2020, the panel offers three core recommendations and suggests that the Census Bureau take an assertive, aggressive approach to 2020 planning rather than casting possibilities purely as hypothetical. The first recommendation, on research and development, identifies four broad topic areas for research early in the decade. The second urges the Bureau to pursue research in these priority areas aggressively. The third identifies the setting of bold goals as essential to underscoring the need for serious reengineering and to building commitment to change.
The Panel on Research on Future Census Methods was formed to examine alternative designs for the 2010 census and to assist the Census Bureau in planning tests and analyses to help assess and compare their advantages and disadvantages. Designing the 2010 Census: First Interim Report examines whether the auxiliary information planned to be collected (and retained) during the 2000 census could be augmented to help guide the Census Bureau in its assessment of alternative designs for the 2010 census.
The U.S. census, conducted every 10 years since 1790, faces dramatic new challenges as the country begins its third century. Critics of the 1990 census cited problems of increasingly high costs, continued racial differences in counting the population, and declining public confidence. This volume provides a major review of the traditional U.S. census. Starting from the most basic questions of how data are used and whether they are needed, the volume examines the data that future censuses should provide. It evaluates several radical proposals that have been made for changing the census, as well as other proposals for redesigning the year 2000 census. The book also considers in detail the much-criticized long form, the role of race and ethnic data, and the need for and ways to obtain small-area data between censuses.
This volume contains the full text of two reports: one is an interim review of major census operations, which also assesses the U.S. Census Bureau's recommendation in March 2001 regarding statistical adjustment of census data for redistricting. It does not address the decision on adjustment for non-redistricting purposes. The second report consists of a letter sent to William Barron, acting director of the Census Bureau. It reviews the new set of evaluations prepared by the Census Bureau in support of its October decision. The two reports are packaged together to provide a unified discussion of statistical adjustment and other aspects of the 2000 census that the authoring panel has considered to date.
Following several years of testing and evaluation, the American Community Survey (ACS) was launched in 2005 as a replacement for the census "long form," used to collect detailed social, economic, and housing data from a sample of the U.S. population as part of the decennial census. During the first year of the ACS implementation, the Census Bureau collected data only from households. In 2006 a sample of group quarters (GQs)-such as correctional facilities, nursing homes, and college dorms-was added to more closely mirror the design of the census long-form sample. The design of the ACS relies on monthly samples that are cumulated to produce multiyear estimates based on 1, 3, and 5 years of data. The data published by the Census Bureau for a geographic area depend on the area's size. The multiyear averaging approach enables the Census Bureau to produce estimates that are intended to be robust enough to release for small areas, such as the smallest governmental units and census block groups. However, the sparseness of the GQ representation in the monthly samples affects the quality of the estimates in many small areas that have large GQ populations relative to the total population. The Census Bureau asked the National Research Council to review and evaluate the statistical methods used for measuring the GQ population. This book presents recommendations addressing improvements in the sample design, sample allocation, weighting, and estimation procedures to assist the Census Bureau's work in the very near term, while further research is conducted to address the underlying question of the relative importance and costs of the GQ data collection in the context of the overall ACS design.