Who Goes There?: Authentication Through the Lens of Privacy explores authentication technologies (passwords, PKI, biometrics, etc.) and their implications for the privacy of the individuals being authenticated. As authentication becomes ever more ubiquitous, understanding its interplay with privacy is vital. The report examines numerous concepts, including authentication, authorization, identification, privacy, and security. It provides a framework to guide thinking about these issues when deciding whether and how to use authentication in a particular context. The book explains how privacy is affected by system design decisions. It also describes government's unique role in authentication and what this means for how government can use authentication with minimal invasions of privacy. In addition, Who Goes There? outlines usability and security considerations and provides a primer on privacy law and policy.
IDsâ€"Not That Easy highlights some of the challenging policy, procedural, and technological issues presented by nationwide identity systems. In the wake of the events of September 11, 2001, nationwide identity systems have been proposed to better track the movement of suspected terrorists. However, questions arise as to who would use the system and how, if participation would be mandatory, the type of data that would be collected, and the legal structures needed to protect privacy. The committee's goal is to foster a broad and deliberate discussion among policy-makers and the public about the form of nationwide identity system that might be created, and whether such a system is desirable or feasible.
Privacy is a growing concern in the United States and around the world. The spread of the Internet and the seemingly boundaryless options for collecting, saving, sharing, and comparing information trigger consumer worries. Online practices of business and government agencies may present new ways to compromise privacy, and e-commerce and technologies that make a wide range of personal information available to anyone with a Web browser only begin to hint at the possibilities for inappropriate or unwarranted intrusion into our personal lives. Engaging Privacy and Information Technology in a Digital Age presents a comprehensive and multidisciplinary examination of privacy in the information age. It explores such important questions as how threats to privacy are evolving, how privacy can be protected, and how society can balance the interests of individuals, businesses, and government in ways that promote privacy reasonably and effectively. This book seeks to raise awareness of the web of connectedness among the actions one takes and the privacy policies that are enacted, and provides a variety of tools and concepts with which debates over privacy can be more fruitfully engaged. Engaging Privacy and Information Technology in a Digital Age focuses on three major components affecting notions, perceptions, and expectations of privacy: technological change, societal shifts, and circumstantial discontinuities. This book will be of special interest to anyone interested in understanding why privacy issues are often so intractable.
When you visit the doctor, information about you may be recorded in an office computer. Your tests may be sent to a laboratory or consulting physician. Relevant information may be transmitted to your health insurer or pharmacy. Your data may be collected by the state government or by an organization that accredits health care or studies medical costs. By making information more readily available to those who need it, greater use of computerized health information can help improve the quality of health care and reduce its costs. Yet health care organizations must find ways to ensure that electronic health information is not improperly divulged. Patient privacy has been an issue since the oath of Hippocrates first called on physicians to "keep silence" on patient matters, and with highly sensitive data—genetic information, HIV test results, psychiatric records—entering patient records, concerns over privacy and security are growing. For the Record responds to the health care industry's need for greater guidance in protecting health information that increasingly flows through the national information infrastructure—from patient to provider, payer, analyst, employer, government agency, medical product manufacturer, and beyond. This book makes practical detailed recommendations for technical and organizational solutions and national-level initiatives. For the Record describes two major types of privacy and security concerns that stem from the availability of health information in electronic form: the increased potential for inappropriate release of information held by individual organizations (whether by those with access to computerized records or those who break into them) and systemic concerns derived from open and widespread sharing of data among various parties.
The committee reports on the technological and organizational aspects of security management, including basic principles of security; the effectiveness of technologies for user authentication, access control, and encryption; obstacles and incentives in the adoption of new technologies; and mechanisms for training, monitoring, and enforcement. For the Record reviews the growing interest in electronic medical records; the increasing value of health information to providers, payers, researchers, and administrators; and the current legal and regulatory environment for protecting health data. This information is of immediate interest to policymakers, health policy researchers, patient advocates, professionals in health data management, and other stakeholders.
Recent disclosures about the bulk collection of domestic phone call records and other signals intelligence programs have stimulated widespread debate about the implications of such practices for the civil liberties and privacy of Americans. In the wake of these disclosures, many have identified a need for the intelligence community to engage more deeply with outside privacy experts and stakeholders. At the request of the Office of the Director of National Intelligence, the National Academies of Sciences, Engineering, and Medicine convened a workshop to address the privacy implications of emerging technologies, public and individual preferences and attitudes toward privacy, and ethical approaches to data collection and use. This report summarizes discussions between experts from academia and the private sector and from the intelligence community on private sector best practices and privacy research results.
Despite many advances, security and privacy often remain too complex for individuals or enterprises to manage effectively or to use conveniently. Security is hard for users, administrators, and developers to understand, making it all too easy to use, configure, or operate systems in ways that are inadvertently insecure. Moreover, security and privacy technologies originally were developed in a context in which system administrators had primary responsibility for security and privacy protections and in which the users tended to be sophisticated. Today, the user base is much wider, including the vast majority of employees in many organizations and a large fraction of households, but the basic models for security and privacy are essentially unchanged. Security features can be clumsy and awkward to use and can present significant obstacles to getting work done. As a result, cybersecurity measures are all too often disabled or bypassed by the users they are intended to protect. Similarly, when security gets in the way of functionality, designers and administrators deemphasize it. The result is that end users often engage in actions, knowingly or unknowingly, that compromise the security of computer systems or contribute to the unwanted release of personal or other confidential information. Toward Better Usability, Security, and Privacy of Information Technology discusses computer system security and privacy, their relationship to usability, and research at their intersection.
The core mission of the Centers for Medicare and Medicaid Services (CMS), an agency of the Department of Health and Human Services, is expanding from a focus on prompt claims payment to broader involvement in improving health care quality and efficiency. The requirements for the information technology (IT) systems of CMS are changing as its mission changes, and the efforts to evolve its systems from those designed to support the agency's historical mission come in the midst of a push to modernize the nation's health care IT more broadly. These new challenges arise even as CMS must meet challenging day-to-day operational requirements and make frequent adjustments to its business processes, code, databases, and systems in response to changing statutory, regulatory, and policy requirements. In light of these and other emerging challenges, CMS asked the National Research Council to conduct a study that would lay out a forward-looking vision for the Centers for Medicare and Medicaid Services, taking account of CMS's mission, business processes, and information technology requirements. The study is being conducted in two phases. The first, resulting in the present volume, draws on a series of teleconferences, briefings, and an information-gathering workshop held in Washington, D.C., on September 27-28, 2010. The second phase, drawing on that workshop and on additional briefings, site visits, and committee deliberations, will result in a final report with recommendations, to be issued at the end of the project in 2011.
Federal laws, regulations, and executive orders have imposed requirements for federal agencies to move toward the sustainable acquisition of goods and services, including the incorporation of sustainable purchasing into federal agency decision making. Since the federal government is such a significant player in the market, its move to incorporate sustainable procurement practices could have a profound impact on the types of products being developed for the market as a whole. The General Services Administration (GSA) has played a key role in furthering sustainable procurement practices throughout the federal government. GSA is responsible for formulating and maintaining government-wide policies covering a variety of administrative actions, including those related to procurement and management. GSA has several ongoing activities related to sustainable procurement to assess the feasibility of working with the federal supplier community (vendors and contractors that serve federal agencies) to measure and reduce greenhouse gas emissions in the supply chain while encouraging sustainable operations among suppliers. GSA has also been actively developing programs to assist federal agencies in making sustainable procurement decisions. As federal agencies cannot directly fund the development of sustainable procurement tools, they are particularly interested in understanding how to foster innovation and provide incentives for collaboration between developers and users of tools for sustainable purchasing throughout the supply chain. The training of procurement professionals is also a priority for these agencies. To assist efforts to build sustainability considerations into the procurement process, the National Research Council appointed a committee to organize a two-day workshop that explored ways to better incorporate sustainability considerations into procurement tools and capabilities across the public and private sectors.
The workshop was designed to help participants assess the current landscape of green purchasing tools, identify emerging needs for enhanced or new tools and opportunities to develop them, identify potential barriers to progress, and explore potential solutions. The workshop provided an opportunity for participants to discuss challenges related to sustainable purchasing and to developing new procurement tools. Sustainability Considerations for Procurement Tools and Capabilities reviews the presenters' recommendations and tools currently used in sustainable procurement, such as databases for ecolabels and standards, codes, or regulations and other nontechnological tools such as policies, frameworks, rating systems, and product indexes.
The Centers for Medicare and Medicaid Services (CMS) is the agency in the Department of Health and Human Services responsible for providing health coverage for seniors and people with disabilities, for limited-income individuals and families, and for children, totaling almost 100 million beneficiaries. The agency's core mission was established more than four decades ago with a mandate to focus on the prompt payment of claims, which now total more than 1.2 billion annually. With CMS's mission expanding from its original focus on prompt claims payment come new requirements for the agency's information technology (IT) systems. Strategies and Priorities for Information Technology at the Centers for Medicare and Medicaid Services reviews CMS plans for its IT capabilities in light of these challenges and makes recommendations to CMS on how its business processes, practices, and information systems can best be developed to meet today's and tomorrow's demands. The report's recommendations and conclusions cluster around the following themes: (1) the need for a comprehensive strategic technology plan; (2) the application of an appropriate metamethodology to guide an iterative, incremental, and phased transition of business and information systems; (3) the criticality of IT to high-level strategic planning and its implications for CMS's internal organization and culture; and (4) the increasing importance of data and analytical efforts to stakeholders inside and outside CMS. Given the complexity of CMS's IT systems, there will be no simple solution. Although external contractors and advisory organizations will play important roles, CMS needs to assert well-informed technical and strategic leadership.
The report argues that the only way for CMS to succeed in these efforts is for the agency, with its stakeholders and Congress, to recognize resolutely that action must be taken, to begin the needed cultural and organizational transformations, and to develop the appropriate internal expertise to lead the initiative with a comprehensive, incremental, iterative, and integrated approach that effectively and strategically integrates business requirements and IT capabilities.
Information technology (IT) is essential to virtually all of the nation's critical infrastructures, making them vulnerable to terrorist attacks on their IT systems. An attack could target an IT system itself or use the IT system to launch or exacerbate another type of attack. IT can also be used as a counterterrorism tool. The report concludes that the most devastating consequences would occur if a terrorist attack on or through IT were part of a broader attack. The report presents two recommendations on what can be done in the short term to protect the nation's communications and information systems and several recommendations about what can be done over the longer term. The report also notes the importance of considering how an IT system will be deployed to maximize protection against attacks and usefulness in responding to them.
Biometric recognition, the automated recognition of individuals based on their behavioral and biological characteristics, is promoted as a way to help identify terrorists, provide better control of access to physical facilities and financial accounts, and increase the efficiency of access to services and their utilization. Biometric recognition has been applied to identification of criminals, patient tracking in medical informatics, and the personalization of social services, among other things. In spite of substantial effort, however, there remain unresolved questions about the effectiveness and management of systems for biometric recognition, as well as the appropriateness and societal impact of their use. Moreover, the general public has been exposed to biometrics largely as high-technology gadgets in spy thrillers or as fear-instilling instruments of state or corporate surveillance in speculative fiction. Now, as biometric technologies appear poised for broader use, increased concerns about national security and the tracking of individuals as they cross borders have caused passports, visas, and border-crossing records to be linked to biometric data. A focus on fighting insurgencies and terrorism has led to the military deployment of biometric tools to enable recognition of individuals as friend or foe. Commercially, finger-imaging sensors, whose cost and physical size have been reduced, now appear on many laptop personal computers, handheld devices, mobile phones, and other consumer devices. Biometric Recognition: Challenges and Opportunities addresses the issues surrounding broader implementation of this technology, making two main points: first, biometric recognition systems are incredibly complex and need to be addressed as such. Second, biometric recognition is an inherently probabilistic endeavor. Consequently, even when the technology and the system in which it is embedded are behaving as designed, there is inevitable uncertainty and risk of error.
This book elaborates on these themes in detail to provide policy makers, developers, and researchers a comprehensive assessment of biometric recognition that examines current capabilities, future possibilities, and the role of government in technology and system development.
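The report's point that biometric recognition is inherently probabilistic can be illustrated with a toy simulation (the score distributions below are invented for illustration and are not drawn from the report): similarity scores from genuine comparisons and from impostor comparisons overlap, so any decision threshold trades false matches against false non-matches, and no setting eliminates both.

```python
import random

random.seed(42)

# Hypothetical match scores: genuine comparisons tend to score high and
# impostor comparisons low, but the two distributions overlap. That overlap
# is why even a correctly functioning system carries a risk of error.
genuine = [random.gauss(0.75, 0.10) for _ in range(10_000)]
impostor = [random.gauss(0.45, 0.10) for _ in range(10_000)]

def error_rates(threshold):
    """Return (false_non_match_rate, false_match_rate) at a threshold."""
    fnmr = sum(s < threshold for s in genuine) / len(genuine)
    fmr = sum(s >= threshold for s in impostor) / len(impostor)
    return fnmr, fmr

# Raising the threshold rejects more impostors but also more genuine users;
# the operator can only choose where on the tradeoff curve to sit.
for t in (0.50, 0.60, 0.70):
    fnmr, fmr = error_rates(t)
    print(f"threshold={t:.2f}  FNMR={fnmr:.3f}  FMR={fmr:.3f}")
```

Choosing the operating threshold is a policy decision as much as a technical one, which is one reason the report argues these systems must be evaluated as whole systems rather than as isolated sensors.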
IDsâ€"Not That Easy highlights some of the challenging policy, procedural, and technological issues presented by nationwide identity systems. In the wake of the events of September 11, 2001, nationwide identity systems have been proposed to better track the movement of suspected terrorists. However, questions arise as to who would use the system and how, if participation would be mandatory, the type of data that would be collected, and the legal structures needed to protect privacy. The committee's goal is to foster a broad and deliberate discussion among policy-makers and the public about the form of nationwide identity system that might be created, and whether such a system is desirable or feasible.
A broad and growing literature describes the deep and multidisciplinary nature of the sustainability challenges faced by the United States and the world. Despite the profound technical challenges involved, sustainability is not, at its root, a technical problem, nor will merely technical solutions be sufficient. Instead, deep economic, political, and cultural adjustments will ultimately be required, along with a major, long-term commitment in each sphere to deploy the requisite technical solutions at scale. Nevertheless, technological advances and enablers have a clear role in supporting such change, and information technology (IT) is a natural bridge between technical and social solutions because it can offer improved communication and transparency for fostering the necessary economic, political, and cultural adjustments. Moreover, IT is at the heart of nearly every large-scale socioeconomic system, including systems for finance, manufacturing, and the generation and distribution of energy, and so sustainability-focused changes in those systems are inextricably linked with advances in IT. The focus of Computing Research for Sustainability is "greening through IT," the application of computing to promote sustainability broadly. The aim of this report is twofold: to shine a spotlight on areas where IT innovation and computer science (CS) research can help, and to urge the computing research community to bring its approaches and methodologies to bear on these pressing global challenges. Computing Research for Sustainability focuses on addressing medium- and long-term challenges in a way that would have significant, measurable impact.
The findings and recommended principles of the Committee on Computing Research for Environmental and Societal Sustainability concern four areas: (1) the relevance of IT and CS to sustainability; (2) the value of the CS approach to problem solving, particularly as it pertains to sustainability challenges; (3) key CS research areas; and (4) strategy and pragmatic approaches for CS research on sustainability.
Advances in the miniaturization and networking of microprocessors promise a day when networked computers are embedded throughout the everyday world. However, our current understanding of what such systems would be like is insufficient to bring the promise to reality. Embedded, Everywhere explores the potential of networked systems of embedded computers and the research challenges arising from embedding computation and communications technology into a wide variety of applications—from precision agriculture to automotive telematics to defense systems. It describes how these emerging networks operate under unique constraints not present in more traditional distributed systems, such as the Internet. It articulates how these networks will have to be dynamically adaptive and self-configuring, and how new models for approaching programming and computation are necessary. Issues relating to trustworthiness, security, safety, reliability, usability, and privacy are examined in light of the ubiquitous nature of these systems. A comprehensive, systems-oriented research agenda is presented, along with recommendations to major federal funding agencies.
The end of dramatic exponential growth in single-processor performance marks the end of the dominance of the single microprocessor in computing. The era of sequential computing must give way to a new era in which parallelism is at the forefront. Although important scientific and engineering challenges lie ahead, this is an opportune time for innovation in programming systems and computing architectures. We have already begun to see diversity in computer designs to optimize for such considerations as power and throughput. The next generation of discoveries is likely to require advances at both the hardware and software levels of computing systems. There is no guarantee that we can make parallel computing as common and easy to use as yesterday's sequential single-processor computer systems, but unless we aggressively pursue efforts suggested by the recommendations in this book, it will be "game over" for growth in computing performance. If parallel programming and related software efforts fail to become widespread, the development of exciting new applications that drive the computer industry will stall; if such innovation stalls, many other parts of the economy will follow suit. The Future of Computing Performance describes the factors that have led to the future limitations on growth for single processors that are based on complementary metal oxide semiconductor (CMOS) technology. It explores challenges inherent in parallel computing and architecture, including ever-increasing power consumption and the escalated requirements for heat dissipation. The book delineates a research, practice, and education agenda to help overcome these challenges. The Future of Computing Performance will guide researchers, manufacturers, and information technology professionals in the right direction for sustainable growth in computer performance, so that we may all enjoy the next level of benefits to society.
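The shift the report describes, from sequential programs to explicitly parallel ones, can be sketched with a minimal map-reduce decomposition (a generic illustration, not drawn from the report; note that CPython threads do not speed up CPU-bound work because of the global interpreter lock, so the point here is the decomposition pattern rather than measured speedup):

```python
from concurrent.futures import ThreadPoolExecutor

# A reduction over a large array, split into independent chunk-sums that a
# parallel runtime can execute concurrently, then combined at the end.
data = list(range(1_000_000))

def chunk_sum(chunk):
    # Each chunk is independent of the others: no shared mutable state,
    # which is what makes the work safe to run in parallel.
    return sum(chunk)

n_workers = 4
size = len(data) // n_workers
chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    partial = list(pool.map(chunk_sum, chunks))

total = sum(partial)  # combine the partial results
assert total == sum(data)  # the decomposition preserves the answer
```

Writing programs so that their work divides into independent pieces like this, and building languages and architectures that make such division natural, is the research agenda the book lays out.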
Quantum mechanics, the subfield of physics that describes the behavior of very small (quantum) particles, provides the basis for a new paradigm of computing. First proposed in the 1980s as a way to improve computational modeling of quantum systems, the field of quantum computing has recently garnered significant attention due to progress in building small-scale devices. However, significant technical advances will be required before a large-scale, practical quantum computer can be achieved. Quantum Computing: Progress and Prospects provides an introduction to the field, including the unique characteristics and constraints of the technology, and assesses the feasibility and implications of creating a functional quantum computer capable of addressing real-world problems. This report considers hardware and software requirements, quantum algorithms, drivers of advances in quantum computing and quantum devices, benchmarks associated with relevant use cases, the time and resources required, and how to assess the probability of success.
A grand challenge for science is to understand the human implications of global environmental change and to help society cope with those changes. Virtually all the scientific questions associated with this challenge depend on geospatial information (geoinformation) and on the ability of scientists, working individually and in groups, to interact with that information in flexible and increasingly complex ways. Another grand challenge is how to respond to calamities: terrorist activities, other human-induced crises, and natural disasters. Much of the information that underpins emergency preparedness, response, recovery, and mitigation is geospatial in nature. In terrorist situations, for example, origins and destinations of phone calls and e-mail messages, travel patterns of individuals, dispersal patterns of airborne chemicals, assessment of places at risk, and the allocation of resources all involve geospatial information. Much of the work addressing environment- and emergency-related concerns will depend on how productively humans are able to integrate, distill, and correlate a wide range of seemingly unrelated information. In addition to critical advances in location-aware computing, databases, and data mining methods, advances in the human-computer interface will couple new computational capabilities with human cognitive capabilities. This report outlines an interdisciplinary research roadmap at the intersection of computer science and geospatial information science. The report was developed by a committee convened by the Computer Science and Telecommunications Board of the National Research Council.
Certification of critical software systems (e.g., for safety and security) is important to help ensure their dependability. Today, certification relies as much on evaluation of the software development process as it does on the system's properties. While the latter are preferable, the complexity of these systems usually makes them extremely difficult to evaluate. To explore these and related issues, the National Coordination Office for Information Technology Research and Development asked the NRC to undertake a study to assess the current state of certification in dependable systems. The study is in two phases: the first to frame the problem and the second to assess it. This report presents a summary of a workshop held as part of the first phase. The report presents a summary of workshop participants' presentations and subsequent discussion. It covers, among other things, the strengths and limitations of process; new challenges and opportunities; experience to date; organizational context; and cost-effectiveness of software engineering techniques. A consensus report will be issued upon completion of the second phase.
The focus of Software for Dependable Systems is a set of fundamental principles that underlie software system dependability and that suggest a different approach to the development and assessment of dependable software. Unfortunately, it is difficult to assess the dependability of software. The field of software engineering suffers from a pervasive lack of evidence about the incidence and severity of software failures; about the dependability of existing software systems; about the efficacy of existing and proposed development methods; about the benefits of certification schemes; and so on. There are many anecdotal reports, which, although often useful for indicating areas of concern or highlighting promising avenues of research, do little to establish a sound and complete basis for making policy decisions regarding dependability. The committee regards claims of extraordinary dependability that are sometimes made on this basis for the most critical of systems as unsubstantiated, and perhaps irresponsible. This difficulty regarding the lack of evidence for system dependability leads to two conclusions: (1) that better evidence is needed, so that approaches aimed at improving the dependability of software can be objectively assessed, and (2) that, for now, the pursuit of dependability in software systems should focus on the construction and evaluation of evidence. The committee also recognized the importance of adopting the practices that are already known and used by the best developers; this report gives a sample of such practices. Some of these (such as systematic configuration management and automated regression testing) are relatively easy to adopt; others (such as constructing hazard analyses and threat models, exploiting formal notations when appropriate, and applying static analysis to code) will require new training for many developers.
However valuable, though, these practices are in themselves no silver bullet, and new techniques and methods will be required in order to build future software systems to the level of dependability that will be required.
In a world of increasing dependence on information technology, the prevention of cyberattacks on a nation's important computer and communications systems and networks is a problem that looms large. Given the demonstrated limitations of passive cybersecurity defense measures, it is natural to consider the possibility that deterrence might play a useful role in preventing cyberattacks against the United States and its vital interests. At the request of the Office of the Director of National Intelligence, the National Research Council undertook a two-phase project aimed to foster a broad, multidisciplinary examination of strategies for deterring cyberattacks on the United States and of the possible utility of these strategies for the U.S. government. The first phase produced a letter report providing basic information needed to understand the nature of the problem and to articulate important questions that can drive research regarding ways of more effectively preventing, discouraging, and inhibiting hostile activity against important U.S. information systems and networks. The second phase of the project entailed selecting appropriate experts to write papers on questions raised in the letter report. A number of experts, identified by the committee, were commissioned to write these papers under contract with the National Academy of Sciences. Commissioned papers were discussed at a public workshop held June 10-11, 2010, in Washington, D.C., and authors revised their papers after the workshop. Although the authors were selected and the papers reviewed and discussed by the committee, the individually authored papers do not reflect consensus views of the committee, and the reader should view these papers as offering points of departure that can stimulate further work on the topics discussed. The papers presented in this volume are published essentially as received from the authors, with some proofreading corrections made as limited time allowed.
Vulnerabilities abound in U.S. society. The openness and efficiency of our key infrastructures—transportation, information and telecommunications systems, health systems, the electric power grid, emergency response units, food and water supplies, and others—make them susceptible to terrorist attacks. Making the Nation Safer discusses technical approaches to mitigating these vulnerabilities. A broad range of topics are covered in this book, including: Nuclear and radiological threats, such as improvised nuclear devices and "dirty bombs;" Bioterrorism, medical research, agricultural systems and public health; Toxic chemicals and explosive materials; Information technology, such as communications systems, data management, cyber attacks, and identification and authentication systems; Energy systems, such as the electrical power grid and oil and natural gas systems; Transportation systems; Cities and fixed infrastructures, such as buildings, emergency operations centers, and tunnels; The response of people to terrorism, such as how quality of life and morale of the population can be a target of terrorists and how people respond to terrorist attacks; and Linked infrastructures, i.e., the vulnerabilities that result from the interdependencies of key systems. In each of these areas, there are recommendations on how to immediately apply existing knowledge and technology to make the nation safer and on starting research and development programs that could produce innovations that will strengthen key systems and protect us against future threats. The book also discusses issues affecting the government's ability to carry out the necessary science and engineering programs and the important role of industry, universities, and states, counties, and cities in homeland security efforts. A long-term commitment to homeland security is necessary to make the nation safer, and this book lays out a roadmap of how science and engineering can assist in countering terrorism.
In May 2016, the National Academies of Sciences, Engineering, and Medicine hosted a workshop on Cryptographic Agility and Interoperability. Speakers at the workshop discussed the history and practice of cryptography, its current challenges, and its future possibilities. This publication summarizes the presentations and discussions from the workshop.