Modern AI techniques, especially deep learning, provide, in many cases, very good recommendations: where a self-driving car should go, whether to give a company a loan, etc. The problem is that not all these recommendations are good, and since deep learning provides no explanations, we cannot tell which recommendations are good. It is therefore desirable to provide natural-language explanations of the numerical AI recommendations. The need to connect natural-language rules and numerical decisions has been known since the 1960s, when the need emerged to incorporate expert knowledge, described by imprecise words like "small", into control and decision making. For this incorporation, a special "fuzzy" technique was invented, which led to many successful applications. This book describes how this technique can help make AI more explainable. The book can be recommended for students, researchers, and practitioners interested in explainable AI.
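As a rough illustration of the kind of "fuzzy" translation of an imprecise word into numbers that this technique relies on, here is a minimal sketch; the triangular shape and the 0-to-10 range are assumptions made for illustration, not taken from the book:

```python
# A minimal illustration (not from the book): a triangular membership function
# for the imprecise word "small"; the 0-to-10 range is an assumed example.
def small_membership(x, full=0.0, cutoff=10.0):
    """Return the degree (0..1) to which x counts as 'small'."""
    if x <= full:
        return 1.0
    if x >= cutoff:
        return 0.0
    return (cutoff - x) / (cutoff - full)

for x in [0, 2, 5, 9, 12]:
    print(f"x = {x:>2}: 'small' to degree {small_membership(x):.2f}")
```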
This book describes new techniques for making decisions in situations with uncertainty and new applications of decision-making techniques. The main emphasis is on situations when it is difficult to decrease uncertainty. For example, it is very difficult to accurately predict human economic behavior, so in economics, it is very important to take this uncertainty into account when making decisions. Other areas where it is difficult to decrease uncertainty are geosciences and teaching. The book analyzes the general problem of decision making and shows how its results can be applied to economics, geosciences, and teaching. Since all these applications involve computing, the book also shows how these results can be applied to computing, including deep learning and quantum computing. The book is recommended to researchers, practitioners, and students who want to learn more about decision making under uncertainty—and who want to work on remaining challenges.
Using various examples ranging from the geosciences to environmental science, this book explains how to generate an adequate description of uncertainty, how to justify semi-heuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that, in real life, the information about uncertainty is often only partially known and, using several practical examples, explains how to extract the missing information about uncertainty from the available data.
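A minimal Monte-Carlo-style sketch of how an error with random, systematic, and periodic components could be propagated through a simple data-processing function; all distributions, magnitudes, and the function itself are assumptions made for illustration, not the book's algorithms:

```python
import numpy as np

# Schematic sketch of a three-component error model: random + systematic + periodic.
# All parameters and the processing function y = x1 + 2*x2 are assumed examples.
rng = np.random.default_rng(0)
n = 100_000

def simulated_error(t):
    random_part = rng.normal(0.0, 0.05, size=n)        # zero-mean random noise
    systematic_part = rng.uniform(-0.1, 0.1, size=n)   # unknown constant offset
    phase = rng.uniform(0.0, 2 * np.pi, size=n)        # unknown phase of the periodic term
    periodic_part = 0.05 * np.sin(2 * np.pi * 0.1 * t + phase)
    return random_part + systematic_part + periodic_part

x1 = 1.0 + simulated_error(t=0.0)   # two measured inputs
x2 = 2.0 + simulated_error(t=3.0)
y = x1 + 2 * x2                     # propagate through the data-processing function
print(f"result spread: mean = {y.mean():.3f}, std = {y.std():.3f}")
```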
This book describes analytical techniques for optimizing knowledge acquisition, processing, and propagation, especially in the contexts of cyber-infrastructure and big data. Further, it presents easy-to-use analytical models of knowledge-related processes and their applications. The need for such methods stems from the fact that, when we have to decide where to place sensors or which algorithm to use for processing the data, we mostly rely on experts' opinions. As a result, the selected knowledge-related methods are often far from ideal. To make better selections, it is necessary to first create easy-to-use models of knowledge-related processes. This is especially important for big data, where traditional numerical methods are unsuitable. The book offers a valuable guide for everyone interested in big data applications: students looking for an overview of related analytical techniques, practitioners interested in applying optimization techniques, and researchers seeking to improve and expand on these techniques.
This book is about methodological aspects of uncertainty propagation in data processing. Uncertainty propagation is an important problem: while computer algorithms efficiently process data related to many aspects of our lives, most of these algorithms implicitly assume that the numbers they process are exact. In reality, these numbers come from measurements, and measurements are never 100% exact. Because of this, it makes no sense to translate 61 kg into pounds and report the result, as computers do, with 13-digit accuracy. In many cases, e.g., in celestial mechanics, the state of a system can be described by a few numbers: the values of the corresponding physical quantities. In such cases, for each of these quantities, we know (at least) an upper bound on the measurement error. This bound is either provided by the manufacturer of the measuring instrument or estimated by the user who calibrates this instrument. However, in many other cases, the description of the system is more complex than a few numbers: we need a function to describe a physical field (e.g., an electromagnetic field); we need a vector in a Hilbert space to describe a quantum state; we need a pseudo-Riemannian space to describe physical space-time; etc. To describe and process uncertainty in all such cases, this book proposes a general methodology, one that includes intervals as a particular case. The book is recommended to students and researchers interested in challenging aspects of uncertainty analysis and to practitioners who need to handle uncertainty in such unusual situations.
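To make the 61 kg example concrete, here is a small sketch: if the measurement error is bounded (the 0.5 kg bound below is an assumed value, not from the book), then only an interval of the converted weight is justified, not 13 digits:

```python
# Sketch of the 61 kg example: with an assumed 0.5 kg error bound, interval
# arithmetic shows which digits of the pound value are actually meaningful.
KG_TO_LB = 2.2046226218  # pounds per kilogram

measured_kg = 61.0
bound_kg = 0.5           # assumed upper bound on the measurement error

lo_lb = (measured_kg - bound_kg) * KG_TO_LB
hi_lb = (measured_kg + bound_kg) * KG_TO_LB
print(f"weight in pounds lies in [{lo_lb:.2f}, {hi_lb:.2f}]")       # about [133.38, 135.58]
print(f"naive 13-digit conversion: {measured_kg * KG_TO_LB:.13f}")  # mostly meaningless digits
```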
How can we solve engineering problems while taking into account data characterized by different types of measurement and estimation uncertainty: interval, probabilistic, fuzzy, etc.? This book provides a theoretical basis for arriving at such solutions, as well as case studies demonstrating how these theoretical ideas can be translated into practical applications in the geosciences, pavement engineering, etc. In all these developments, the authors' objectives were to provide accurate estimates of the resulting uncertainty; to offer solutions that require reasonably short computation times; to offer content that is accessible to engineers; and to be sufficiently general, so that readers can use the book for many different problems. The authors also describe how to make decisions under different types of uncertainty. The book offers a valuable resource for all practical engineers interested in better ways of gauging uncertainty, for students eager to learn and apply the new techniques, and for researchers interested in processing heterogeneous uncertainty.
This book addresses an intriguing question: are our decisions rational? It explains seemingly irrational human decision-making behavior by taking into account our limited ability to process information. It also shows, with several examples, that optimization under granularity restrictions leads to the observed human decision making. Drawing on the Nobel-prize-winning studies by Kahneman and Tversky, researchers have found many examples of seemingly irrational decisions: e.g., we overestimate the probability of rare events. Our explanation is that since human abilities to process information are limited, we operate not with the exact values of relevant quantities, but with "granules" that contain these values. We show that optimization under such granularity indeed leads to the observed human behavior. In particular, for the first time, we explain the mysterious empirical dependence of betting odds on actual probabilities. This book can be recommended to all students interested in human decision making, to researchers whose work involves human decisions, and to practitioners who design and employ systems involving human decision making, so that they can better utilize our ability to make decisions under uncertainty.
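For readers unfamiliar with the phenomenon being explained, the sketch below plots the standard empirical probability-weighting curve from prospect theory (the Tversky-Kahneman form, with their fitted exponent 0.61). It illustrates the overweighting of rare events that the book explains via granularity; it is an illustration of the empirical effect only, not the book's granularity-based model:

```python
# Illustration of the empirical effect (not the book's granularity model):
# the Tversky-Kahneman probability weighting curve, with their exponent 0.61.
def weight(p, gamma=0.61):
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

for p in [0.01, 0.05, 0.10, 0.50, 0.90, 0.99]:
    print(f"actual probability {p:.2f} -> decision weight {weight(p):.2f}")
# Rare events (p = 0.01) get a noticeably larger weight; near-certain events are underweighted.
```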
This book presents a clear, systematic treatment of convergence theorems for set-valued random variables (random sets) and fuzzy set-valued random variables (random fuzzy sets). Topics such as strong laws of large numbers and central limit theorems, including new results in connection with the theory of empirical processes, are covered. The author's own recent developments on martingale convergence theorems and their applications to data processing are also included. The mathematical foundations are presented with clear explanations, covering Hörmander's embedding theorem, various notions of convergence of sets and fuzzy sets, Aumann integrals, conditional expectations, selection theorems, measurability and integrability arguments for both set-valued and fuzzy set-valued random variables, and newly obtained optimization techniques based on invariant properties.
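As one representative statement of the kind of result covered (quoted here from the general literature on random sets, under the standard i.i.d. and integrability assumptions, rather than from the book itself), the strong law of large numbers for random compact sets says that Minkowski averages converge to the Aumann expectation in the Hausdorff metric:

$$ d_H\!\left(\frac{1}{n}\,(X_1 \oplus X_2 \oplus \cdots \oplus X_n),\; \mathbb{E}[X_1]\right) \longrightarrow 0 \quad \text{almost surely as } n \to \infty, $$

where $\oplus$ denotes Minkowski addition, $d_H$ the Hausdorff distance, and $\mathbb{E}[X_1]$ the Aumann expectation of the random set.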
This collection contains translations of papers on propositional satisfiability and related logical problems which appeared in Problemy Sokrashcheniya Perebora (Problems of Search Reduction), published in Russian in 1987 by the Scientific Council "Cybernetics" of the USSR Academy of Sciences. These problems form the nucleus of this intensively developing area. The translation is dedicated to the memory of two remarkable Russian mathematicians, Sergei Maslov and his wife Nina Maslova. Maslov is known as the originator of the inverse method in automated deduction, which was discovered at about the same time as the resolution method of J. A. Robinson and has approximately the same range of applications. In 1981, Maslov proposed an iterative algorithm for propositional satisfiability based on some general ideas of search described in detail in his posthumously published book, Theory of Deductive Systems and Its Applications (1986; English translation 1987). The papers related to Maslov's iterative method of search reduction play a significant role in this collection.
The book explores a new general approach to selecting, and designing, data processing techniques. The symmetry and invariance ideas behind this algebraic approach have been successful in physics, where many new theories are formulated in symmetry terms. The book explains this approach and expands it to new application areas ranging from engineering, medicine, and education to the social sciences. In many cases, this approach leads to optimal techniques and optimal solutions. That the same data processing techniques help us better analyze wooden structures, lung dysfunctions, and deep learning algorithms is a good indication that these techniques can be used in many other applications as well. The book is recommended to researchers and practitioners who need to select a data processing technique, or who want to design a new technique when the existing techniques do not work. It is also recommended to students who want to learn state-of-the-art data processing.
In many practical situations, we are interested in statistics characterizing a population of objects: e.g. in the mean height of people from a certain area. Most algorithms for estimating such statistics assume that the sample values are exact. In practice, sample values come from measurements, and measurements are never absolutely accurate. Sometimes, we know the exact probability distribution of the measurement inaccuracy, but often, we only know the upper bound on this inaccuracy. In this case, we have interval uncertainty: e.g. if the measured value is 1.0, and inaccuracy is bounded by 0.1, then the actual (unknown) value of the quantity can be anywhere between 1.0 - 0.1 = 0.9 and 1.0 + 0.1 = 1.1. In other cases, the values are expert estimates, and we only have fuzzy information about the estimation inaccuracy. This book shows how to compute statistics under such interval and fuzzy uncertainty. The resulting methods are applied to computer science (optimal scheduling of different processors), to information technology (maintaining privacy), to computer engineering (design of computer chips), and to data processing in geosciences, radar imaging, and structural mechanics.
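As a minimal sketch of what "computing statistics under interval uncertainty" means in the simplest case (the interval data below are made up for illustration): because the sample mean is monotone in each input, its range over interval data is obtained by averaging the lower and upper endpoints separately.

```python
# Hypothetical interval data [lo_i, hi_i]; the mean's range follows by averaging
# the endpoints separately, since the mean is monotone in each sample value.
samples = [(0.9, 1.1), (1.4, 1.6), (0.7, 0.9), (1.0, 1.2)]

n = len(samples)
mean_lo = sum(lo for lo, _ in samples) / n
mean_hi = sum(hi for _, hi in samples) / n
print(f"sample mean lies in [{mean_lo:.3f}, {mean_hi:.3f}]")  # [1.000, 1.200]
```

For more complex statistics such as the variance, computing the exact range over interval data is, in general, known to be computationally hard (NP-hard), which is one reason dedicated techniques like those in this book are needed.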
The vigorous development of the internet and other information technologies has significantly expanded the amount and variety of sources of information available for decision making. This book presents current trends in soft computing applications to measurement and information acquisition. Its main topics are the production and presentation of information, including multimedia, virtual environments, and computer animation, as well as the improvement of decisions made on the basis of this information in various applications ranging from engineering to business. In order to make high-quality decisions, one has to fuse information of different kinds, from a variety of sources, with differing degrees of reliability and uncertainty. The book demonstrates the need to use intelligent methodologies in the analysis of such systems, as well as the inspiring relation of computational intelligence to its natural counterpart. It includes several contributions demonstrating a further movement towards interdisciplinary collaboration between the biological and computer sciences, with examples from biology and robotics.
This book deals with the effect of public and semi-public companies on the economy. In traditional economic models, several private companies, each interested in maximizing its profit, interact (e.g., compete) with each other. Such models help to avoid wild oscillations in production and prices (typical of uncontrolled competition) and to come up with a stable equilibrium solution. The problem becomes much more complex if we take into account the presence of public and semi-public companies, which are interested in the public good as well as in profit. The book contains theoretical results and numerical techniques for computing the resulting equilibria. As a case study, it considers the problem of selecting optimal tolls for public roads: tolls that best balance the public good and the need to recover the cost of building the roads. The book is recommended to specialists in economics as well as to students interested in learning the corresponding economic models.
This book demonstrates how to describe and analyze a system's behavior and how to extract the desired prediction and control algorithms from this analysis. A typical prediction is based on observing similar situations in the past, knowing the outcomes of these past situations, and expecting that the future outcome of the current situation will be similar to these past observed outcomes. In mathematical terms, similarity corresponds to symmetry, and similarity of outcomes to invariance. The book shows how symmetries can be used in all classes of algorithmic problems in science and engineering, from analysis to prediction to control. Applications cover chemistry, geosciences, intelligent control, neural networks, quantum physics, and thermal physics. Specifically, it is shown how the approach based on symmetry and similarity can be used in the analysis of real-life systems, in algorithms for prediction, and in algorithms for control.
This book is intended for specialists in systems engineering interested in new, general techniques and for students and practitioners interested in using these techniques for solving specific practical problems. For many real-world, complex systems, it is possible to create easy-to-compute explicit analytical models instead of time-consuming computer simulations. Usually, however, analytical models are designed on a case-by-case basis, and there is a scarcity of general techniques for designing such easy-to-compute models. This book fills this gap by providing general recommendations for using analytical techniques in all stages of system design, implementation, testing, and monitoring. It also illustrates these recommendations using applications in various domains, such as more traditional engineering systems, biological systems (e.g., systems for cattle management), and medical and social-related systems (e.g., recommender systems).
Satisfactory pavement performance can only be assured with appropriate process controls to ensure compacted materials meet proper density and stiffness requirements. The TRB National Cooperative Highway Research Program's NCHRP Research Report 933: Evaluating Mechanical Properties of Earth Material During Intelligent Compaction details the development of procedures to estimate the mechanical properties of geomaterials using intelligent compaction (IC) technology in a robust manner so that departments of transportation can incorporate it in their specifications. Appendix A, containing the proposed specifications and test methods, is included in the report. Appendices B through H appear in a supplementary file.