Revised to include current components considered for today's unconventional and multi-fracture grids, Mechanics of Hydraulic Fracturing, Second Edition explains one of the most important features of fracture design: the ability to predict the geometry and characteristics of the hydraulically induced fracture. With two-thirds of the world's oil and natural gas reserves committed to unconventional resources, hydraulic fracturing is the best-proven well stimulation method for extracting these resources from their more remote and complex reservoirs. However, few hydraulic fracture models can properly simulate these more complex fractures. Engineers and well designers must understand the underlying mechanics of how fractures are modeled in order to correctly predict and forecast a more advanced fracture network. Updated to accommodate today's fracturing jobs, Mechanics of Hydraulic Fracturing, Second Edition enables the engineer to:
- Understand complex fracture networks to maximize completion strategies
- Recognize and compute stress shadow, which can drastically affect fracture network patterns
- Optimize completions by properly modeling and more accurately predicting today's hydraulic fracturing completions
- Explore the underlying mechanics of creating a fracture from the wellbore
- Apply newer modeling components, such as stress shadow and the interaction of hydraulic fractures with natural fractures, to simulate more complex fracture networks
- Draw on updated experimental studies that apply to today's unconventional fracturing cases
“A fascinating story . . . worth the attention of every student of modern China.” —The Journal of Asian Studies China’s 1911 Revolution was a momentous political transformation. Its leaders, however, were not rebellious troublemakers on the periphery of imperial order. On the contrary, they were a powerful political and economic elite deeply entrenched in local society and well-respected both for their imperially sanctioned cultural credentials and for their mastery of new ideas. The revolution they spearheaded produced a new, democratic political culture that enshrined national sovereignty, constitutionalism, and the rights of the people as indisputable principles. Based upon previously untapped Qing and Republican sources, The Politics of Rights and the 1911 Revolution in China is a nuanced and colorful chronicle of the revolution as it occurred in local and regional areas. Xiaowei Zheng explores the ideas that motivated the revolution, the popularization of those ideas, and their animating impact on the Chinese people at large. The focus of the book is not on the success or failure of the revolution, but rather on the transformative effect the revolution had on people and what they learned from it.
Multiscale modeling is becoming essential for accurate, rapid simulation in science and engineering. This book presents the results of three decades of research on multiscale modeling in process engineering, from principles to application, and its generalization for different fields. It considers the universality of meso-scale phenomena for the first time, providing insight into the emerging discipline that unifies them, meso-science, as well as new perspectives for virtual process engineering. Multiscale modeling is applied in areas including:
- multiphase flow and fluid dynamics
- chemical, biochemical and process engineering
- mineral processing and metallurgical engineering
- energy and resources
- materials science and engineering

Jinghai Li is Vice-President of the Chinese Academy of Sciences (CAS), a professor at the Institute of Process Engineering, CAS, and leader of the EMMS (Energy-minimizing multiscale) Group. Wei Ge, Wei Wang, Ning Yang and Junwu Wang are professors at the EMMS Group, part of the Institute of Process Engineering, CAS. Xinhua Liu, Limin Wang, Xianfeng He and Xiaowei Wang are associate professors at the EMMS Group, part of the Institute of Process Engineering, CAS. Mooson Kwauk is an emeritus director of the Institute of Process Engineering, CAS, and an advisor to the EMMS Group.
This much-needed work on ethnicity in Asia offers a major sociological analysis of Hui Muslims in contemporary China. Using both qualitative and quantitative data derived from fieldwork in Lanzhou between March 2001 and July 2004, it looks at the contrast between the urban life of the Han people, the ethnic majority in the city of Lanzhou, and the Hui people, the largest ethnic minority in the city, and assesses the link between minority ethnicity and traditional behaviour in urban sociology and research on ethnic groups of China. In-depth interviews and survey data provide a fresh perspective on the study of ethnic behaviour in China, and offer a rich account of Hui behaviour in seven aspects of urban life: neighbouring interaction, friendship formation, network behaviour, mate selection methods, spouse choice, marital homogamy, and household structure. Contributing to the global discourse on Islam, religious fundamentalism and modernity, this book will be invaluable to anyone interested in Chinese society, Islam, religion, development, urban studies, anthropology and ethnicity.
Challenges the conventional view that the party-state structure creates a monolithic political elite in the PRC, allowing readers to think about Chinese politics from a different perspective using an institutional approach. Unlike existing research on Chinese elites, this book relies upon advanced statistical analysis. Its statistics are based on 1,588 top Chinese leaders, making this book the most extensive and up-to-date biographical data set in elite studies.
How the planet's two largest greenhouse gas emitters navigate climate policy. The United States and China together account for a disproportionate 45 percent of global carbon dioxide emissions. In 2014, then-President Obama and Chinese President Xi Jinping announced complementary efforts to limit emissions, paving the way for the Paris Agreement. And yet, with President Trump's planned withdrawal from the Paris accords and Xi's consolidation of power—as well as mutual mistrust fueled by misunderstanding—the climate future is uncertain. In Titans of the Climate, Kelly Sims Gallagher and Xiaowei Xuan examine how the planet's two largest greenhouse gas emitters develop and implement climate policy. Through dispassionate analysis, the authors aim to help readers understand the challenges, constraints, and opportunities in each country. Gallagher—a former U.S. climate policymaker—and Xuan—a member of a Chinese policy think tank—describe the specific drivers—political, economic, and social—of climate policies in both countries and map the differences between policy outcomes. They characterize the U.S. approach as “deliberative incrementalism”; the Chinese, meanwhile, engage in “strategic pragmatism.” Comparing the policy processes of the two countries, Gallagher and Xuan make the case that if each country understands more about the other's goals and constraints, climate policy cooperation is more likely to succeed.
This book provides a structured introduction to the key concepts and techniques that enable in-/near-memory computing. For decades, processing-in-memory or near-memory computing has been attracting growing interest due to its potential to break the memory wall. Near-memory computing moves compute logic near the memory, and thereby reduces data movement. Recent work has also shown that certain memories can morph themselves into compute units by exploiting the physical properties of the memory cells, enabling in-situ computing in the memory array. While in- and near-memory computing can circumvent overheads related to data movement, they come at the cost of restricted flexibility of data representation and computation, design challenges of compute-capable memories, and difficulty in system and software integration. Therefore, wide deployment of in-/near-memory computing cannot be accomplished without techniques that enable efficient mapping of data-intensive applications to such devices, without sacrificing accuracy or increasing hardware costs excessively. This book describes various memory substrates amenable to in- and near-memory computing, architectural approaches for designing efficient and reliable computing devices, and opportunities for in-/near-memory acceleration of different classes of applications.
Written by internationally recognized experts in the field with academic as well as industrial experience, this book concisely yet systematically covers all aspects of the topic. The monograph focuses on the optoelectronic behavior of organic solids and their application in new optoelectronic devices. It covers organic field-effect and organic electroluminescent materials and devices, organic photonic materials and devices, as well as organic solids in photo absorption and energy conversion. Much emphasis is placed on the preparation of functional materials and the fabrication of devices, from materials synthesis and purification, through physicochemical properties, to the basic processes and working principles of the devices. The only book to cover fundamentals, applications, and the latest research results, this is a handy reference for both researchers and those new to the field.

From the contents:
* Electronic processes in organic solids
* Organic/polymeric semiconductors for field-effect transistors
* Organic/polymeric field-effect transistors
* Organic circuits and organic single molecular transistors
* Polymer light-emitting diodes (PLEDs): devices and materials
* Organic solids for photonics
* Organic photonic devices
* Organic solar cells based on small molecules
* Polymer solar cells
* Dye-sensitized solar cells (DSSCs)
* Organic thermoelectric power devices
With the end of Dennard scaling and Moore’s law, IC chips, especially large-scale ones, now face more reliability challenges, and reliability has become one of the key merits of VLSI design. In this context, this book presents a built-in on-chip fault-tolerant computing paradigm that seeks to combine fault detection, fault diagnosis, and error recovery in large-scale VLSI design in a unified manner so as to minimize resource overhead and performance penalties. Following this computing paradigm, we propose a holistic solution based on three key components: self-test, self-diagnosis and self-repair, or “3S” for short. We then explore the use of 3S for general IC designs, general-purpose processors, network-on-chip (NoC) and deep learning accelerators, and present prototypes to demonstrate how 3S responds to in-field silicon degradation and recovery under various runtime faults caused by aging, process variations, or particle radiation. Moreover, we demonstrate that 3S not only offers a powerful backbone for various on-chip fault-tolerant designs and implementations, but also has far-reaching implications, such as maintaining graceful performance degradation, mitigating the impact of verification blind spots, and improving chip yield. This book is the outcome of extensive fault-tolerant computing research pursued at the State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences over the past decade. The proposed built-in on-chip fault-tolerant computing paradigm has been verified in a broad range of scenarios, from small processors in satellite computers to large processors in HPCs. Hopefully, it will provide an alternative yet effective solution to the growing reliability challenges for large-scale VLSI designs.
This book provides a ‘one-stop source’ for all readers who are interested in a new, empirical approach to machine learning that, unlike traditional methods, successfully addresses the demands of today’s data-driven world. After an introduction to the fundamentals, the book discusses in depth anomaly detection, data partitioning and clustering, as well as classification and predictors. It describes classifiers of zero and first order, and the new, highly efficient and transparent deep rule-based classifiers, particularly highlighting their applications to image processing. Local optimality and stability conditions for the methods presented are formally derived and stated, while the software is also provided as supplemental, open-source material. The book will greatly benefit postgraduate students, researchers and practitioners dealing with advanced data processing, applied mathematicians, software developers of agent-oriented systems, and developers of embedded and real-time systems. It can also be used as a textbook for postgraduate coursework; for this purpose, a standalone set of lecture notes and corresponding lab session notes are available on the same website as the code. Dimitar Filev, Henry Ford Technical Fellow, Ford Motor Company, USA, and Member of the National Academy of Engineering, USA: “The book Empirical Approach to Machine Learning opens new horizons to automated and efficient data processing.” Paul J. Werbos, Inventor of the back-propagation method, USA: “I owe great thanks to Professor Plamen Angelov for making this important material available to the community just as I see great practical needs for it, in the new area of making real sense of high-speed data from the brain.” Chin-Teng Lin, Distinguished Professor at University of Technology Sydney, Australia: “This new book will set up a milestone for the modern intelligent systems.” Edward Tunstel, President of IEEE Systems, Man, Cybernetics Society, USA: “Empirical Approach to Machine Learning provides an insightful and visionary boost of progress in the evolution of computational learning capabilities yielding interpretable and transparent implementations.”