Image Segmentation summarizes and improves upon new theory, methods, and applications of current image segmentation approaches, written by leaders in the field. The process of image segmentation divides an image into different regions based on the characteristics of pixels, resulting in a simplified image that can be analyzed more efficiently. Image segmentation has wide applications in numerous fields ranging from industrial inspection and biomedicine to intelligent transportation and architecture. Image Segmentation: Principles, Techniques, and Applications is an up-to-date collection of recent techniques and methods devoted to the field of computer vision, covering fundamental concepts, new theories and approaches, and a variety of practical applications including medical imaging, remote sensing, fuzzy clustering, and the watershed transform. In-depth chapters present innovative methods developed by the authors, such as convolutional neural networks, graph convolutional networks, deformable convolution, and model compression, to help graduate students and researchers apply and improve image segmentation in their work. The book describes the basic principles of image segmentation and related mathematical methods such as clustering, neural networks, and mathematical morphology; introduces new methods for achieving rapid and accurate image segmentation based on classic image processing and machine learning theory; presents techniques for improved convolutional neural networks for scene segmentation, object recognition, and change detection; and highlights the effect of image segmentation in application scenarios such as traffic image analysis, medical image analysis, remote sensing, and material analysis. Image Segmentation: Principles, Techniques, and Applications is an essential resource for undergraduate and graduate courses such as image and video processing, computer vision, and digital signal processing, as well as for researchers working in computer vision and image analysis who are looking to improve their techniques and methods.
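The book's own methods are not reproduced here, but as a minimal sketch of the clustering-based segmentation idea the description mentions (grouping pixels by their characteristics into regions), the following Python snippet partitions an image's pixels with k-means. The file name and number of regions are placeholder assumptions, not taken from the book.

```python
# Minimal sketch (not from the book): clustering-based segmentation,
# grouping pixels by colour with k-means. Requires numpy, scikit-learn, Pillow.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

def segment_by_kmeans(image_path, n_regions=4, seed=0):
    """Assign each pixel to one of n_regions clusters based on its RGB value."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float64)
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3)                      # one row per pixel
    km = KMeans(n_clusters=n_regions, n_init=10, random_state=seed).fit(pixels)
    labels = km.labels_.reshape(h, w)                # region label per pixel
    # Replace each pixel by its cluster centre to visualise the simplified image.
    simplified = km.cluster_centers_[labels].astype(np.uint8)
    return labels, simplified

# labels, simplified = segment_by_kmeans("scene.png", n_regions=4)  # hypothetical file
```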
Neither Donkey Nor Horse "tells the story of how Chinese medicine was transformed from the antithesis of modernity in the early twentieth century into a potent symbol and vehicle for China s struggle with it half a century later. Instead of viewing this transition as derivative of the political history of modern China, Sean Hsiang-lin Lei argues that China s medical history had a life of its own and at times directly influenced the ideological struggle over the meaning of China s modernity and the Chinese state. Far from being a remnant of China s pre-modern past, Chinese medicine in the twentieth century co-evolved with Western medicine and the Nationalist state, undergoing a profound transformationinstitutionally, epistemologically, and materiallythat justifies our recognizing it as modern Chinese medicine. This new medicine was derided as neither donkey nor horse, because it attempted to integrate modern Western medicine into what its opponents considered the pre-modern and un-scientific practices of Chinese medicine. Its historic rise is of crucial importance for the general history of modernity in China, fundamentally challenging the conception of modernity that rejected the possibility of productive crossbreeding between the modern and the traditional. By exploring the co-production of modern Chinese medicine and China s modernity, Lei offers both a political history of medicine and a medical history of the Chinese state. "Neither Donkey Nor Horse "synthesizes into a single historical narrative what was previously separated into three independent histories: the history of Western medicine in China, the history of Chinese medicine, and the political history of the state.
More work is being done on the statistical aspects of medical imaging, and this book fills a gap by providing a unified framework of study, presenting a complete look at medical imaging and statistics, from the statistical aspects of imaging technology to the statistical analysis of images. It gives technicians and students the statistical principles that underlie medical imaging and serves as reference material for researchers involved in the design of new technology. Illustrations, many real examples, and algorithms are included throughout. The text also includes exercises developed out of the author's many years of experience studying the statistics of medical imaging.
The Ricci flow uses methods from analysis to study the geometry and topology of manifolds. With the third part of their volume on techniques and applications of the theory, the authors give a presentation of Hamilton's Ricci flow for graduate students and mathematicians interested in working in the subject, with an emphasis on the geometric and analytic aspects. The topics include Perelman's entropy functional, point picking methods, aspects of Perelman's theory of $\kappa$-solutions including the $\kappa$-gap theorem, compactness theorem and derivative estimates, Perelman's pseudolocality theorem, and aspects of the heat equation with respect to static and evolving metrics related to Ricci flow. In the appendices, the authors review metric and Riemannian geometry, including the space of points at infinity and the Sharafutdinov retraction for complete noncompact manifolds with nonnegative sectional curvature. As in the previous volumes, the authors have endeavored, as much as possible, to make the chapters independent of each other. The book makes advanced material accessible to graduate students and nonexperts. It includes a rigorous introduction to some of Perelman's work and explains some technical aspects of Ricci flow useful for singularity analysis. The authors give the appropriate references so that the reader may further pursue the statements and proofs of the various results.
Identifying the input-output relationship of a system or discovering the evolutionary law of a signal on the basis of observation data, and applying the constructed mathematical model to prediction, control, or the extraction of other useful information, constitutes a problem that has been drawing a lot of attention in engineering and gaining more and more importance in econometrics, biology, environmental science, and other related areas. Over the last 30-odd years, research on this problem has rapidly developed in various areas under different terms, such as time series analysis, signal processing, and system identification. Since randomness almost always exists in real systems and in observation data, and since random processes are sometimes used to model the uncertainty in systems, it is reasonable to consider the object as a stochastic system. In some applications identification can be carried out offline, but in other cases this is impossible, for example, when the structure or the parameters of the system depend on the sample, or when the system is time-varying. In these cases we have to identify the system online and adjust the control in accordance with the model, which is supposed to approach the true system during the process of identification. This is why there has been increasing interest in identification and adaptive control for stochastic systems from both theorists and practitioners.
Ricci flow is a powerful analytic method for studying the geometry and topology of manifolds. This book is an introduction to Ricci flow for graduate students and mathematicians interested in working in the subject. To this end, the first chapter is a review of the relevant basics of Riemannian geometry. For the benefit of the student, the text includes a number of exercises of varying difficulty. The book also provides brief introductions to some general methods of geometric analysis and other geometric flows. Comparisons are made between the Ricci flow and the linear heat equation, mean curvature flow, and other geometric evolution equations whenever possible. Several topics of Hamilton's program are covered, such as short time existence, Harnack inequalities, Ricci solitons, Perelman's no local collapsing theorem, singularity analysis, and ancient solutions. A major direction in Ricci flow, via Hamilton's and Perelman's works, is the use of Ricci flow as an approach to solving the Poincaré conjecture and Thurston's geometrization conjecture.
This book presents systematic overviews and fresh insights into big data-driven intelligent fault diagnosis and prognosis for mechanical systems. It focuses on recent research results on deep transfer learning-based fault diagnosis, data-model fusion remaining useful life (RUL) prediction, and related topics. The contents will be of value and interest to academic researchers, practitioners, and students in the field of prognostics and health management (PHM). Essential guidelines are provided for readers to understand, explore, and implement the presented methodologies, which promote further development of PHM in the big data era. Features: Addresses the critical challenges in the field of PHM at present; Presents both fundamental and cutting-edge research theories on intelligent fault diagnosis and prognosis; Provides abundant experimental validations and engineering cases of the presented methodologies
Using interviews, newspaper articles, online texts, official documents, and national surveys, Lei shows that the development of the public sphere in China has provided an unprecedented forum for citizens to organize, influence the public agenda, and demand accountability from the government.
Neither Donkey nor Horse tells the story of how Chinese medicine was transformed from the antithesis of modernity in the early twentieth century into a potent symbol of and vehicle for China’s exploration of its own modernity half a century later. Instead of viewing this transition as derivative of the political history of modern China, Sean Hsiang-lin Lei argues that China’s medical history had a life of its own, one that at times directly influenced the ideological struggle over the meaning of China’s modernity and the Chinese state. Far from being a remnant of China’s premodern past, Chinese medicine in the twentieth century coevolved with Western medicine and the Nationalist state, undergoing a profound transformation—institutionally, epistemologically, and materially—that resulted in the creation of a modern Chinese medicine. This new medicine was derided as “neither donkey nor horse” because it necessarily betrayed both of the parental traditions and therefore was doomed to fail. Yet this hybrid medicine survived, through self-innovation and negotiation, thus challenging the conception of modernity that rejected the possibility of productive crossbreeding between the modern and the traditional. By exploring the production of modern Chinese medicine and China’s modernity in tandem, Lei offers both a political history of medicine and a medical history of the Chinese state.
Subpixel mapping is a technology that generates a fine resolution land cover map from coarse resolution fractional images by predicting the spatial locations of different land cover classes at the subpixel scale. This book provides readers with a complete overview of subpixel image processing methods, basic principles, and different subpixel mapping techniques based on single or multi-shift remote sensing images. Step-by-step procedures, experimental contents, and result analyses are explained clearly at the end of each chapter. Real-life applications are a great resource for understanding how and where to use subpixel mapping when dealing with different remote sensing imaging data. This book will be of interest to undergraduate and graduate students, majoring in remote sensing, surveying, mapping, and signal and information processing in universities and colleges, and it can also be used by professionals and researchers at different levels in related fields.
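The book's subpixel mapping algorithms are not reproduced here, but as a rough illustration of the idea described above, the sketch below upsamples a single coarse pixel's class fractions into a finer subpixel grid by allocating subpixels in proportion to those fractions. The class names and zoom factor are illustrative assumptions, and the spatial dependence that real methods exploit is deliberately omitted.

```python
# Rough sketch of the subpixel-mapping idea (not the book's methods):
# allocate S*S subpixels of a coarse pixel to land-cover classes in
# proportion to the coarse fractional abundances. Real techniques also
# use spatial dependence between neighbouring pixels, omitted here.
import numpy as np

def allocate_subpixels(fractions, scale=4):
    """fractions: dict class_name -> fraction summing to ~1; scale: zoom factor S."""
    n_sub = scale * scale
    # Number of subpixels per class, with largest-remainder rounding so counts sum to n_sub.
    raw = {c: f * n_sub for c, f in fractions.items()}
    counts = {c: int(v) for c, v in raw.items()}
    leftover = n_sub - sum(counts.values())
    for c in sorted(raw, key=lambda c: raw[c] - counts[c], reverse=True)[:leftover]:
        counts[c] += 1
    # Fill the fine grid class by class (a real method would optimise the layout).
    grid = np.empty(n_sub, dtype=object)
    i = 0
    for c, k in counts.items():
        grid[i:i + k] = c
        i += k
    return grid.reshape(scale, scale)

print(allocate_subpixels({"water": 0.25, "forest": 0.5, "urban": 0.25}, scale=4))
```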
This book presents the key technology of electronic noses and systematically describes how e-noses can be used to automatically analyse odours. Appealing to readers from the fields of artificial intelligence, computer science, electrical engineering, electronics, and instrumentation science, it addresses three main areas. First, readers will learn how to apply machine learning, pattern recognition, and signal processing algorithms to real perception tasks. Second, they will be shown how to adapt their algorithms to their systems when the algorithms fail to work because of limited hardware resources. Third, readers will learn how to devise schemes and solutions when the data acquired from their systems is unstable due to fundamental issues affecting perception devices (e.g. sensors). In brief, the book presents and discusses the key technologies and new algorithmic challenges in electronic noses and artificial olfaction. The goal is to promote the industrial application of electronic nose technology in environmental detection, medical diagnosis, food quality control, explosive detection, etc., and to highlight the scientific advances in artificial olfaction and artificial intelligence. The book offers a good reference guide for newcomers to the topic of electronic noses, because it covers the basic principles and algorithms. At the same time, it clearly presents the key challenges, such as long-term drift, signal uniqueness, and disturbance, along with effective and efficient solutions, making it equally valuable for researchers engaged in the science and engineering of sensors, instruments, chemometrics, etc.
Dynamic service-oriented environments (SOEs) are characterised by a large number of heterogeneous service components that are expected to support the business as a whole. The present work provides a negotiation-based approach to facilitate automated and multi-level service-level management in an SOE, where each component autonomously arranges its contribution to the overall operational goals. Evaluation experiments have shown increased responsiveness and stability of an SOE when changes occur.
This book introduces readers to the application of orbital data on space objects in the contexts of conjunction assessment and space situation analysis, including theories and methodologies. It addresses the main topics involved in space object conjunction assessment, such as: orbital error analysis of space objects; close approach analysis; the calculation, analysis and application of collision probability; and the comprehensive assessment of collision risk. In addition, selected topics on space situation analysis are also presented, including orbital anomaly and space event analysis, and so on. The book offers a valuable guide for researchers and engineers in the fields of astrodynamics, space telemetry, tracking and command (TT&C), space surveillance, space situational awareness, and space debris, as well as for graduates majoring in flight vehicle design and related fields.
A Choice Outstanding Academic Title for 2022 China's Cultural Revolution (1966-1976) produced propaganda music that still stirs unease and, at times, evokes nostalgia. Lei X. Ouyang uses selections from revolutionary songbooks to untangle the complex interactions between memory, trauma, and generational imprinting among those who survived the period of extremes. Interviews combine with ethnographic fieldwork and surveys to explore both the Cultural Revolution's effect on those who lived through it as children and contemporary remembrance of the music created to serve the Maoist regime. As Ouyang shows, the weaponization of music served an ideological revolution but also revolutionized the senses. She examines essential questions raised by this phenomenon, including: What did the revolutionization look, sound, and feel like? What does it take for individuals and groups to engage with such music? And what is the impact of such an experience over time? Perceptive and provocative, Music as Mao's Weapon is an insightful look at the exploitation and manipulation of the arts under authoritarianism.
This book presents data privacy protection techniques that have been extensively applied in the current era of big data. However, research into big data privacy is still in its infancy. Given the fact that existing protection methods can result in low data utility and unbalanced trade-offs, personalized privacy protection has become a rapidly expanding research topic. In this book, the authors explore emerging threats and existing privacy protection methods, and discuss in detail both the advantages and disadvantages of personalized privacy protection. Traditional methods, such as differential privacy and cryptography, are discussed using a comparative and intersectional approach, and are contrasted with emerging methods like federated learning and generative adversarial nets. The advances discussed cover various applications, e.g. cyber-physical systems, social networks, and location-based services. Given its scope, the book is of interest to scientists, policy-makers, researchers, and postgraduates alike.
ISHM is an innovative combination of technologies and methods that offers solutions to the reliability problems caused by increased complexities in design, manufacture, use conditions, and maintenance. Its key strength is in the successful integration of reliability (quantitative estimation of successful operation or failure), "diagnosibility" (ability to determine the fault source), and maintainability (how to maintain the performance of a system in operation). It draws on engineering issues such as advanced sensor monitoring, redundancy management, probabilistic reliability theory, artificial intelligence for diagnostics and prognostics, and formal validation methods, but also "quasi-technical" techniques and disciplines such as quality assurance, systems architecture and engineering, knowledge capture, information fusion, testability and maintainability, and human factors. This groundbreaking book defines and explains this new discipline, providing frameworks and methodologies for implementation and further research. Each chapter includes experiments, numerical examples, simulations and case studies. It is the ideal guide to this crucial topic for professionals or researchers in aerospace systems, systems engineering, production engineering, and reliability engineering. - Solves prognostic information selection and decision-level information fusion issues - Presents integrated evaluation methodologies for complex aerospace system health conditions and software system reliability assessment - Proposes a framework to perform fault diagnostics with a distributed intelligent agent system and a data mining approach for multistate systems - Explains prognostic methods that combine both the qualitative system running state prognostics and the quantitative remaining useful life prediction
Professor Xihua Cao (1920-2005) was a leading scholar at East China Normal University (ECNU) and a famous algebraist in China. His contribution to the Chinese academic community lies particularly in the formation of a world-renowned 'ECNU School' in algebra, whose research areas include algebraic groups, quantum groups, algebraic geometry, Lie algebra, algebraic number theory, representation theory, and other hot fields. In January 2020, to commemorate the centenary of Professor Xihua Cao's birth, East China Normal University held a three-day academic conference at which scholars from home and abroad gave dedications or delivered lectures. This volume originates from that memorial conference, collecting the dedications of scholars, reminiscences of family members, and 16 academic articles based on the conference lectures, covering a wide range of hot research topics in algebra. The book shows not only scholars' respect and memory for Professor Xihua Cao, but also the research achievements of Chinese scholars at home and abroad.
The Silk Road was once the most important economic and cultural tie connecting the Eurasian countries before the rise of the West. In September 2013, Chinese President Xi Jinping put forward the initiative to jointly build the Silk Road Economic Belt and the 21st-Century Maritime Silk Road, abbreviated as the Belt and Road Initiative (BRI). This book analyzes the BRI through the approach of political economy and establishes an analytic framework for the BRI from historical and comparative perspectives. It clearly displays the strategic considerations, future vision, constructing framework, governmental actions, latest achievements, multiple opportunities, and potential risks of the BRI. As China's grand national development strategy and international cooperation initiative, the BRI will largely shape China's domestic and foreign policies in the Xi Jinping era. The book is the first academic monograph on the BRI and it enables readers to comprehensively understand this initiative and its implications for China, Eurasia, and the world.
Social media data contains our communication and online sharing, mirroring our daily life. This book looks at how we can use such big data and what we can discover from it: basic knowledge (data and challenges) of social media analytics; clustering as a fundamental technique for unsupervised knowledge discovery and data mining; a class of neural-inspired algorithms, based on adaptive resonance theory (ART), that tackle challenges in big social media data clustering; and step-by-step practices for developing unsupervised machine learning algorithms for real-world applications in the social media domain. Adaptive Resonance Theory in Social Media Data Clustering stands on a fundamental breakthrough in cognitive and neural theory, namely adaptive resonance theory, which simulates how a brain processes information to perform memory, learning, recognition, and prediction. It presents initiatives on the mathematical demonstration of ART's learning mechanisms in clustering, and illustrates how to extend the base ART model to handle the complexity and characteristics of social media data and perform associative analytical tasks. Both cutting-edge research and real-world practices on machine learning and social media analytics are included in the book, and if you wish to learn the answers to the following questions, this book is for you: How to process big streams of multimedia data? How to analyze social networks with heterogeneous data? How to understand a user's interests by learning from online posts and behaviors? How to create a personalized search engine by automatically indexing and searching multimodal information resources?
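The extensions developed in the book are not shown here, but a compact sketch of a base Fuzzy ART clustering step (category choice, vigilance test, resonance learning) may help indicate what the underlying ART mechanism looks like. The parameter values below are illustrative assumptions.

```python
# Minimal Fuzzy ART sketch (base model only, not the book's extensions).
# Inputs are assumed to be vectors scaled to [0, 1]; complement coding doubles them.
import numpy as np

def fuzzy_art(samples, rho=0.5, alpha=0.001, beta=1.0):
    """Cluster samples with Fuzzy ART. rho: vigilance, beta: learning rate."""
    weights, labels = [], []
    for x in samples:
        I = np.concatenate([x, 1.0 - x])                  # complement coding
        # Rank existing clusters by the category choice function.
        order = sorted(range(len(weights)), key=lambda j: -(
            np.minimum(I, weights[j]).sum() / (alpha + weights[j].sum())))
        for j in order:
            match = np.minimum(I, weights[j]).sum() / I.sum()
            if match >= rho:                              # vigilance test passed: resonance
                weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
                labels.append(j)
                break
        else:                                             # no cluster resonates: create one
            weights.append(I.copy())
            labels.append(len(weights) - 1)
    return labels, weights

labels, _ = fuzzy_art(np.random.default_rng(0).random((20, 4)), rho=0.6)
```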
Communication-Protocol-Based Filtering and Control of Networked Systems is a self-contained treatment of the state of the art in communication-protocol-based filtering and control; recent advances in networked systems; and the potential for application in sensor networks. This book provides new concepts, new models and new methodologies with practical significance in control engineering and signal processing. The book first establishes signal-transmission models subject to different communication protocols and then develops new filter design techniques based on those models and preset requirements for filtering performance. The authors then extend this work to finite-horizon H-infinity control, ultimately bounded control and finite-horizon consensus control. The focus throughout is on three typical communications protocols: the round-robin, random-access and try-once-and-discard protocols, and the systems studied are drawn from a variety of classes, among them nonlinear systems, time-delayed and time-varying systems, multi-agent systems and complex networks. Readers are shown the latest techniques—recursive linear matrix inequalities, backward recursive difference equations, stochastic analysis and mapping methods. The unified framework for communication-protocol-based filtering and control for different networked systems established in the book will be of interest to academic researchers and practicing engineers working with communications and other signal-processing systems. Senior undergraduate and graduate students looking to increase their knowledge of current methods in control and signal processing of networked systems will also find this book valuable.
This book covers C programming, focusing on its practical side. Volume 2 deals mainly with composite data structures and their composition. Extensive use of figures and examples helps to give a clear description of concepts and helps the reader gain a systematic understanding of the programming language.
The main focus of this monograph is on enhanced anti-disturbance control and filtering theory and their applications. Classical anti-disturbance control theory considered only one "equivalent" disturbance, merged from different unknown sources. However, along with the development of information acquisition and processing technologies, one can obtain more information or knowledge about the various types of disturbances.
This book systematically presents key concepts of multi-modal hashing technology, recent advances in large-scale efficient multimedia search and recommendation, and recent achievements in multimedia indexing technology. With the explosive growth of multimedia content, multimedia retrieval is currently facing unprecedented challenges in both storage cost and retrieval speed. The multi-modal hashing technique projects high-dimensional data into compact binary hash codes. With it, the most time-consuming semantic similarity computation during the multimedia retrieval process can be significantly accelerated by fast Hamming distance computation, while the storage cost can be greatly reduced by the binary embedding. The authors introduce a categorization of existing multi-modal hashing methods according to various metrics and datasets. The authors also collect recent multi-modal hashing techniques and describe the motivation, objective formulations, and optimization steps of context-aware hashing methods based on tag-semantics transfer.
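The concrete hashing methods catalogued in the book are not reproduced here; the sketch below only illustrates why binary codes make retrieval cheap, packing random toy codes into integers and ranking database items by Hamming distance with bitwise operations. The code length and database size are arbitrary assumptions.

```python
# Illustration of hash-based retrieval (not a method from the book):
# items are represented by short binary codes, and similarity search becomes
# a Hamming-distance ranking computed with cheap bitwise operations.
import numpy as np

def pack_code(bits):
    """Pack a 0/1 array (<= 64 bits) into a single Python int."""
    return int("".join(str(int(b)) for b in bits), 2)

def hamming(a, b):
    """Number of differing bits between two packed codes."""
    return bin(a ^ b).count("1")

rng = np.random.default_rng(0)
database = [pack_code(rng.integers(0, 2, 32)) for _ in range(10_000)]  # toy 32-bit codes
query = pack_code(rng.integers(0, 2, 32))

# Rank the whole database by Hamming distance to the query and keep the top 5.
nearest = sorted(range(len(database)), key=lambda i: hamming(query, database[i]))[:5]
print(nearest, [hamming(query, database[i]) for i in nearest])
```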
China has been holding its annual China International Import Expo (CIIE) in Shanghai since 2018. This is a significant move for China to actively open the Chinese market to the rest of the world, as it supports trade liberalization and economic globalization. This book systematically expounds the background and content of the CIIE, and studies the opportunities that China's expansion of imports brings to its economy, enterprises, and consumers and to those of other countries. It elaborates on how the CIIE facilitates countries and regions from different parts of the world in strengthening their economic cooperation and trade, and promotes global trade and world economic growth. The book helps readers understand China's reform and opening-up, as well as the latest trends and policies of the country's expansion of imports.
This book focuses on the mathematical and physical foundations of remote sensing digital image processing and introduces key algorithms used in this area. It fully introduces the basic mathematical and physical processes of digital imaging, the basic theories and algorithms of pixel-level image processing, and higher-order image processing algorithms and their applications. The book closely integrates theory, algorithms, and applications, making it straightforward for readers to understand and use. Researchers and students working in the fields of remote sensing, computer vision, geographic information science, electronic information, and related areas can benefit from this book, mastering the fundamentals of imaging and image processing techniques for their work and research in digital image processing.
With the growing popularity of "big data", the potential value of personal data has attracted more and more attention. Applications built on personal data can create tremendous social and economic benefits. Meanwhile, they bring serious threats to individual privacy. The extensive collection, analysis, and transaction of personal data make it difficult for individuals to keep their privacy safe. People now show more concern about privacy than ever before. How to strike a balance between the exploitation of personal information and the protection of individual privacy has become an urgent issue. In this book, the authors use methodologies from economics, especially game theory, to investigate solutions to this balance issue. They investigate the strategies of stakeholders involved in the use of personal data, and try to find the equilibrium. The book proposes a user-role based methodology to investigate the privacy issues in data mining, identifying four different types of users, i.e. four user roles, involved in data mining applications. For each user role, the authors discuss its privacy concerns and the strategies that it can adopt to solve the privacy problems. The book also proposes a simple game model to analyze the interactions among data provider, data collector, and data miner. By solving the equilibria of the proposed game, readers can get useful guidance on how to deal with the trade-off between privacy and data utility. Moreover, to elaborate the analysis of the data collector's strategies, the authors propose a contract model and a multi-armed bandit model, respectively. The authors discuss how the owners of data (e.g. an individual or a data miner) deal with the trade-off between privacy and utility in data mining. Specifically, they study users' strategies in a collaborative filtering based recommendation system and a distributed classification system. They build game models to formulate the interactions among data owners, and propose learning algorithms to find the equilibria.
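The book's contract and bandit models are not reproduced here; purely as a generic illustration of the multi-armed bandit framing mentioned above, the sketch below runs an epsilon-greedy learner choosing repeatedly among options with unknown expected payoffs. The payoff means, reward noise, and parameter values are all illustrative assumptions.

```python
# Generic epsilon-greedy multi-armed bandit sketch (not the authors' model).
# Each "arm" stands for one option with an unknown expected payoff; the learner
# balances exploration of uncertain arms against exploitation of the best estimate.
import random

def epsilon_greedy(true_means, rounds=5000, eps=0.1, seed=1):
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms
    estimates = [0.0] * n_arms
    total = 0.0
    for _ in range(rounds):
        if rng.random() < eps:                      # explore a random arm
            arm = rng.randrange(n_arms)
        else:                                       # exploit the current best estimate
            arm = max(range(n_arms), key=lambda a: estimates[a])
        reward = rng.gauss(true_means[arm], 1.0)    # noisy payoff
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean update
        total += reward
    return estimates, total

estimates, total = epsilon_greedy([0.2, 0.5, 0.8])
print([round(e, 2) for e in estimates])
```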
Etale cohomology is an important branch of arithmetic geometry. This book covers the main material in SGA 1, SGA 4, SGA 4 1/2, and SGA 5 on etale cohomology theory, which includes descent theory, etale fundamental groups, Galois cohomology, etale cohomology, derived categories, base change theorems, duality, and ℓ-adic cohomology. The prerequisites for reading this book are basic algebraic geometry and advanced commutative algebra.
Special Distillation Processes, Second Edition focuses on the latest developments in the field, such as separation methods that may prove useful for solving problems encountered during research. Topics include extractive, membrane, and adsorption distillation, covering the separation principles, process design, and experimental techniques. The relationship between processes and techniques is also presented. Comprehensive and easy to read, this book provides the key information needed to understand these processes. It will be a valuable reference source for chemical engineers and students wishing to branch out in chemical engineering. - Provides the only comprehensive book available on special distillation processes - Contains a thorough introduction to recent developments in the field - Presents a valuable reference for students, academics and engineers in chemical engineering
Intelligent Fault Diagnosis and Remaining Useful Life Prediction of Rotating Machinery provides a comprehensive introduction to intelligent fault diagnosis and RUL prediction based on the current achievements of the author's research group. The main contents include multi-domain signal processing and feature extraction, intelligent diagnosis models, clustering algorithms, hybrid intelligent diagnosis strategies, and RUL prediction approaches. This book presents fundamental theories and advanced methods for identifying the occurrence, locations, and degrees of faults, and also includes information on how to predict the RUL of rotating machinery. Besides experimental demonstrations, many application cases are presented and illustrated to test the methods described in the book. This valuable reference provides an essential guide to machinery fault diagnosis that helps readers understand basic concepts and fundamental theories. Academic researchers with mechanical engineering or computer science backgrounds, and engineers or practitioners who are in charge of machine safety, operation, and maintenance will find this book very useful. - Provides a detailed background and roadmap of intelligent diagnosis and RUL prediction of rotating machinery, involving fault mechanisms, vibration characteristics, health indicators, and diagnosis and prognostics - Presents basic theories, advanced methods, and the latest contributions in the field of intelligent fault diagnosis and RUL prediction - Includes numerous application cases, and the methods, algorithms, and models introduced in the book are demonstrated by industrial experiences
This book examines China and Australia's economic and security relations against the background of China's increasing economic and political role. Utilizing the theory of complex interdependence, the authors consider whether greater interdependence between Beijing and Canberra augments closer economic cooperation and trade or prompts political leverage and a security challenge. Exploring China-Australia relations from the mainstream Chinese perspective, this book will be of interest to scholars and students of international relations, Chinese studies, global political economy, and governmental and intergovernmental organizations.
This book offers a comprehensive and systematic review of the latest research findings in the area of intuitionistic fuzzy calculus. After introducing the intuitionistic fuzzy numbers’ operational laws and their geometrical and algebraic properties, the book defines the concept of intuitionistic fuzzy functions and presents the research on the derivative, differential, indefinite integral and definite integral of intuitionistic fuzzy functions. It also discusses some of the methods that have been successfully used to deal with continuous intuitionistic fuzzy information or data, which are different from the previous aggregation operators focusing on discrete information or data. Mainly intended for engineers and researchers in the fields of fuzzy mathematics, operations research, information science and management science, this book is also a valuable textbook for postgraduate and advanced undergraduate students alike.
How China’s economic development combines a veneer of unprecedented progress with the increasingly despotic rule of surveillance over all aspects of life Since the mid-2000s, the Chinese state has increasingly shifted away from labor-intensive, export-oriented manufacturing to a process of socioeconomic development centered on science and technology. Ya-Wen Lei traces the contours of this techno-developmental regime and its resulting form of techno-state capitalism, telling the stories of those whose lives have been transformed—for better and worse—by China’s rapid rise to economic and technological dominance. Drawing on groundbreaking fieldwork and a wealth of in-depth interviews with managers, business owners, workers, software engineers, and local government officials, Lei describes the vastly unequal values assigned to economic sectors deemed “high-end” versus “low-end,” and the massive expansion of technical and legal instruments used to measure and control workers and capital. She shows how China’s rise has been uniquely shaped by its time-compressed development, the complex relationship between the nation’s authoritarian state and its increasingly powerful but unruly tech companies, and an ideology that fuses nationalism with high modernism, technological fetishism, and meritocracy. Some have compared China’s extraordinary transformation to America’s Gilded Age. This provocative book reveals how it is more like a gilded cage, one in which the Chinese state and tech capital are producing rising inequality and new forms of social exclusion.
Computational nanoelectronics is an emerging multi-disciplinary field covering condensed matter physics, applied mathematics, computer science, and electronic engineering. In recent decades, a few state-of-the-art software packages have been developed to carry out first-principles atomistic device simulations. Nevertheless, those packages are either black boxes (commercial codes) or accessible only to very limited users (private research codes). The purpose of this book is to open one of the commercial black boxes and to demonstrate the complete procedure from theoretical derivation, to numerical implementation, all the way to device simulation. Meanwhile, the affiliated source code constitutes an open platform for new researchers. This is the first book of its kind. We hope the book will make a modest contribution to the field of computational nanoelectronics.
Provides a modern mathematical approach to the design of communication networks for graduate students, blending control, optimization, and stochastic network theories. A broad range of performance analysis tools are discussed, including important advanced topics that have been made accessible to students for the first time. Taking a top-down approach to network protocol design, the authors begin with the deterministic model and progress to more sophisticated models. Network algorithms and protocols are tied closely to the theory, illustrating the practical engineering applications of each topic. The background behind the mathematical analyses is given before the formal proofs and is supported by worked examples, enabling students to understand the big picture before going into the detailed theory. End-of-chapter problems cover a range of difficulties, with complex problems broken into several parts, and hints to many problems are provided to guide students. Full solutions are available online for instructors.
With the increasing penetration of renewable energy and distributed energy resources, the smart grid is facing great challenges, which can be divided into two categories. On the one hand, the endogenous uncertainties of renewable energy and electricity load lead to great difficulties in smart grid forecasting. On the other hand, massive numbers of electric devices and their complex constraint relationships bring about significant difficulties in smart grid dispatch. Owing to the rapid development of artificial intelligence in recent years, several artificial intelligence enabled computational methods have been successfully applied in the smart grid and achieved good performance. This book is therefore concerned with research on the key issues of artificial intelligence enabled computational methods for smart grid forecasting and dispatch, and consists of three main parts: (1) an introduction to smart grid forecasting and dispatch, including a review of previous research methods and their drawbacks, to analyze the characteristics of smart grid forecasting and dispatch; (2) artificial intelligence enabled computational methods for smart grid forecasting problems, presenting recent deep learning and machine learning approaches and their successful applications in smart grid forecasting; and (3) artificial intelligence enabled computational methods for smart grid dispatch problems, consisting of cutting-edge intelligent decision-making approaches that help determine the optimal solution of smart grid dispatch. The book is useful for university researchers, engineers, and graduate students in electrical engineering and computer science who wish to learn the core principles, methods, algorithms, and applications of artificial intelligence enabled computational methods.