The end of dramatic exponential growth in single-processor performance marks the end of the dominance of the single microprocessor in computing. The era of sequential computing must give way to a new era in which parallelism is at the forefront. Although important scientific and engineering challenges lie ahead, this is an opportune time for innovation in programming systems and computing architectures. We have already begun to see diversity in computer designs to optimize for such considerations as power and throughput. The next generation of discoveries is likely to require advances at both the hardware and software levels of computing systems. There is no guarantee that we can make parallel computing as common and easy to use as yesterday's sequential single-processor computer systems, but unless we aggressively pursue efforts suggested by the recommendations in this book, it will be "game over" for growth in computing performance. If parallel programming and related software efforts fail to become widespread, the development of exciting new applications that drive the computer industry will stall; if such innovation stalls, many other parts of the economy will follow suit. The Future of Computing Performance describes the factors that have led to limits on growth for single processors based on complementary metal oxide semiconductor (CMOS) technology. It explores challenges inherent in parallel computing and architecture, including ever-increasing power consumption and escalating requirements for heat dissipation. The book delineates a research, practice, and education agenda to help overcome these challenges. The Future of Computing Performance will guide researchers, manufacturers, and information technology professionals in the right direction for sustainable growth in computer performance, so that we may all enjoy the next level of benefits to society.
Starting in the mid-1990s, the United States economy experienced an unprecedented upsurge in economic productivity. Rapid technological change in communications, computing, and information management continues to promise further gains in productivity, a phenomenon often referred to as the New Economy. To better understand this phenomenon, the National Academies Board on Science, Technology, and Economic Policy (STEP) has convened a series of workshops and commissioned papers on Measuring and Sustaining the New Economy. This major workshop, entitled Deconstructing the Computer, brought together leading industrialists and academic researchers to explore the contribution of the different components of computers to improved price-performance and quality of information systems. The objective was to help understand the sources of the remarkable growth of American productivity in the 1990s, the relative contributions of computers and their underlying components, and the evolution and future contributions of the technologies supporting this positive economic performance.
Advanced computing capabilities are used to tackle a rapidly growing range of challenging science and engineering problems, many of which are both compute- and data-intensive. Demand for advanced computing has been growing for all types and capabilities of systems: from large numbers of single commodity nodes to jobs requiring thousands of cores; for systems with fast interconnects; for systems with excellent data handling and management; and for an increasingly diverse set of applications that includes data analytics as well as modeling and simulation. Since the advent of its supercomputing centers, the National Science Foundation (NSF) has provided its researchers with state-of-the-art computing systems. The growth of new models of computing, including cloud computing and publicly available but privately held data repositories, opens up new possibilities for NSF. In order to better understand the expanding and diverse requirements of the science and engineering community and the importance of a broader range of advanced computing infrastructure, the NSF requested that the National Research Council carry out a study examining anticipated priorities and associated tradeoffs for advanced computing. Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020 provides a framework for future decision-making about NSF's advanced computing strategy and programs. It offers recommendations aimed at achieving four broad goals: (1) position the U.S. for continued leadership in science and engineering, (2) ensure that resources meet community needs, (3) aid the scientific community in keeping up with the revolution in computing, and (4) sustain the infrastructure for advanced computing.
Maintaining the United States' strong lead in information technology will require continued federal support of research in this area, most of which is currently funded under the High Performance Computing and Communications Initiative (HPCCI). The Initiative has already accomplished a great deal and should be continued. This book provides 13 major recommendations for refining both HPCCI and support of information technology research in general. It also provides a good overview of the development of HPCC technologies.
Computing and information and communications technology (ICT) has dramatically changed how we work and live, has had profound effects on nearly every sector of society, has transformed whole industries, and is a key component of U.S. global leadership. A fundamental driver of advances in computing and ICT has been the fact that single-processor performance, until recently, increased steadily and dramatically year over year, based on a combination of architectural techniques, semiconductor advances, and software improvements. Users, developers, and innovators were able to depend on those increases, translating that performance into numerous technological innovations and creating successive generations of ever richer and more diverse products, software services, and applications that had profound effects across all sectors of society. However, we can no longer depend on those extraordinary advances in single-processor performance continuing. This slowdown in the growth of single-processor computing performance has its roots in fundamental physics and engineering constraints: multiple technological barriers have converged to pose deep research challenges, and the consequences of this shift are deep and profound for computing and for the sectors of the economy that depend on and assume, implicitly or explicitly, ever-increasing performance. From a technology standpoint, these challenges have led to heterogeneous multicore chips and a shift to alternate innovation axes that include, but are not limited to, improving chip performance, mobile devices, and cloud services. As these technical shifts reshape the computing industry, with global consequences, the United States must be prepared to exploit new opportunities and to deal with technical challenges. The New Global Ecosystem in Advanced Computing: Implications for U.S. Competitiveness and National Security outlines the technical challenges, describes the global research landscape, and explores implications for competition and national security.
The Committee on the Future of Supercomputing was tasked to assess prospects for supercomputing technology research and development in support of U.S. needs, to examine key elements of context (the history of supercomputing, the erosion of research investment, the changing nature of problems demanding supercomputing, and the needs of government agencies for supercomputing capabilities), and to assess options for progress. This interim report establishes context, including the history and current state of supercomputing, application requirements, technology evolution, and the socioeconomic context, to identify some of the issues that may be explored in more depth in the second phase of the study.
Supercomputers play a significant and growing role in a variety of areas important to the nation. They are used to address challenging science and technology problems. In recent years, however, progress in supercomputing in the United States has slowed. The development of the Earth Simulator supercomputer by Japan raised concerns that the United States could lose its competitive advantage and, more importantly, the national competence needed to achieve national goals. In the wake of this development, the Department of Energy asked the NRC to assess the state of U.S. supercomputing capabilities and relevant R&D. Subsequently, the Senate directed DOE in S. Rpt. 107-220 to ask the NRC to evaluate the Advanced Simulation and Computing program of the National Nuclear Security Administration at DOE in light of the development of the Earth Simulator. This report provides an assessment of the current status of supercomputing in the United States, including a review of current demand and technology, infrastructure and institutions, and international activities. The report also presents a number of recommendations to enable the United States to meet current and future needs for capability supercomputers.
The U.S. information technology (IT) research and development (R&D) ecosystem was the envy of the world in 1995. However, this position of leadership is not a birthright, and it is now under pressure. In recent years, the rapid globalization of markets, labor pools, and capital flows has encouraged many strong national competitors. During the same period, national policies have not sufficiently buttressed the ecosystem, or have generated side effects that have reduced its effectiveness. As a result, the U.S. position in IT leadership today has materially eroded compared with that of prior decades, and the nation risks ceding IT leadership to other nations within a generation. Assessing the Impacts of Changes in the Information Technology R&D Ecosystem calls for a recommitment to providing the resources needed to fuel U.S. IT innovation, to removing important roadblocks that reduce the ecosystem's effectiveness in generating innovation and the fruits of innovation, and to becoming a lead innovator and user of IT. The book examines these issues and makes recommendations to strengthen the U.S. IT R&D ecosystem.
This report summarizes a workshop, Strengthening Science-Based Decision-Making: Implementing the Stockholm Convention on Persistent Organic Pollutants, held June 7-10, 2004, in Beijing, China. The presentations and discussions summarized here describe the types of scientific information necessary to make informed decisions to eliminate the production and use of persistent organic pollutants (POPs) banned under the Stockholm Convention, and sources of that information; scientifically informed strategies for eliminating POPs; elements of good scientific advice, such as transparency, peer review, and disclosure of conflicts of interest; and information dealing with POPs that decision makers need from the scientific community, including next steps to make such science available and ensure its use on a continuing basis.
Computer Science: Reflections on the Field, Reflections from the Field provides a concise characterization of key ideas that lie at the core of computer science (CS) research. The book offers a description of CS research recognizing the richness and diversity of the field. It brings together two dozen essays on diverse aspects of CS research, their motivation and results. By describing in accessible form computer science's intellectual character, and by conveying a sense of its vibrancy through a set of examples, the book aims to prepare readers for what the future might hold and help to inspire CS researchers in its creation.
Information technology (IT) is widely understood to be the enabling technology of the 21st century. IT has transformed, and continues to transform, all aspects of our lives: commerce and finance, education, employment, energy, health care, manufacturing, government, national security, transportation, communications, entertainment, science, and engineering. IT and its impact on the U.S. economy, both directly (the IT sector itself) and indirectly (other sectors that are powered by advances in IT), continue to grow in size and importance. In 1995, the National Research Council's Computer Science and Telecommunications Board (CSTB) produced the report Evolving the High Performance Computing and Communications Initiative to Support the Nation's Information Infrastructure. A graphic in that report, often called the "tire tracks" diagram because of its appearance, produced an extraordinary response by clearly linking government investments in academic and industry research to the ultimate creation of new information technology industries with more than $1 billion in annual revenue. Used in presentations to Congress and executive branch decision makers and discussed broadly in the research and innovation policy communities, the tire tracks figure dispelled the assumption that the commercially successful IT industry is self-sufficient, underscoring the long incubation periods, of years and even decades, between research and commercial success. The figure was updated in 2002, 2003, and 2009 reports produced by the CSTB. With the support of the National Science Foundation, CSTB updated the tire tracks figure. Continuing Innovation in Information Technology includes the updated figure and a brief text based in large part on prior CSTB reports.
Computers are increasingly the enabling devices of the information revolution, and computing is becoming ubiquitous in every corner of society, from manufacturing to telecommunications to pharmaceuticals to entertainment. Even more importantly, the face of computing is changing rapidly, as even traditional rivals such as IBM and Apple Computer begin to cooperate and new modes of computing are developed. Computing the Future presents a timely assessment of academic computer science and engineering (CS&E), examining what should be done to ensure continuing progress in making discoveries that will carry computing into the twenty-first century. Most importantly, it advocates a broader research and educational agenda that builds on the field's impressive accomplishments. The volume outlines a framework of priorities for CS&E, along with detailed recommendations for education, funding, and leadership. A core research agenda is outlined for these areas: processors and multiple-processor systems, data communications and networking, software engineering, information storage and retrieval, reliability, and user interfaces. This highly readable volume examines computer science and engineering as a discipline, and how computer scientists and engineers are pushing back the frontiers of their field; how CS&E must change to meet the challenges of the future; the influence of strategic investment by federal agencies in CS&E research; recent structural changes that affect the interaction of academic CS&E and the business environment; and specific examples of interdisciplinary and applications research in four areas: earth sciences and the environment, computational biology, commercial computing, and the long-term goal of a national electronic library.
The volume provides a detailed look at undergraduate CS&E education, highlighting the limitations of four-year programs, and discusses the emerging importance of a master's degree in CS&E and the prospects for broadening the scope of the Ph.D. It also includes a brief look at continuing education.
This book synthesizes the findings of three workshops on research issues in high-performance computing and communications (HPCC). It focuses on the role that computing and communications can play in supporting federal, state, and local emergency management officials who deal with natural and man-made hazards (e.g., toxic spills, terrorist bombings). The volume also identifies specific research challenges for HPCC in meeting unmet technology needs in crisis management and other nationally important application areas, such as manufacturing, health care, digital libraries, and electronic commerce and banking.