Motivation for this Book Ontologies have received increasing attention over the last two decades. Their roots can be traced back to the ancient philosophers, who were interested in a conceptualization of the world. In the more recent past, ontologies and ontological engineering have evolved in computer science, building on various roots such as logics, knowledge representation, information modeling and management, and (knowledge-based) information systems. Most recently, largely driven by the next generation internet, the so-called Semantic Web, ontological software engineering has developed into a scientific field of its own, which puts particular emphasis on the theoretical foundations of representation and reasoning, and on the methods and tools required for building ontology-based software applications in diverse domains. Though this field is largely dominated by computer science, close relationships have been established with its diverse areas of application, where researchers are interested in exploiting the results of ontological software engineering, particularly to build large knowledge-intensive applications at high productivity and low maintenance effort. Consequently, a large number of scientific papers and monographs have been published in the very recent past dealing with the theory and practice of ontological software engineering. So far, the majority of those books are dedicated to the theoretical foundations of ontologies, including philosophical treatises and their relationships to established methods in information systems and ontological software engineering.
TRENDS IN LINGUISTICS is a series of books that open new perspectives in our understanding of language. The series publishes state-of-the-art work on core areas of linguistics across theoretical frameworks, as well as studies that provide new insights by approaching language from an interdisciplinary perspective. TRENDS IN LINGUISTICS considers itself a forum for cutting-edge research based on solid empirical data on language in its various manifestations, including sign languages. It regards linguistic variation in its synchronic and diachronic dimensions as well as in its social contexts as important sources of insight for a better understanding of the design of linguistic systems and the ecology and evolution of language. TRENDS IN LINGUISTICS publishes monographs and outstanding dissertations as well as edited volumes, which provide the opportunity to address controversial topics from different empirical and theoretical viewpoints. High quality standards are ensured through anonymous reviewing. To discuss your book idea or submit a proposal, please contact Birgit Sievert.
This book shows in a comprehensive presentation how Bond Graph methodology can support model-based control, model-based fault diagnosis, fault accommodation, and failure prognosis by reviewing the state of the art, presenting a hybrid integrated approach to Bond Graph model-based fault diagnosis and failure prognosis, and by providing a review of software that can be used for these tasks. The structured text illustrates on numerous small examples how the computational structure superimposed on an acausal bond graph can be exploited to check for control properties such as structural observability and controllability, perform parameter estimation and fault detection and isolation, provide discrete values of an unknown degradation trend at sample points, and develop an inverse model for fault accommodation. The comprehensive presentation also covers failure prognosis based on continuous state estimation by means of filters or time series forecasting. This book has been written for students specializing in the overlap of engineering and computer science, for researchers, and for engineers in industry working with modelling, simulation, control, fault diagnosis, and failure prognosis in various application fields who might be interested in seeing how bond graph modelling can support their work. Presents a hybrid model-based, data-driven approach to failure prognosis. Highlights synergies and relations between fault diagnosis and failure prognosis. Discusses the importance of fault diagnosis and failure prognosis in various fields.
This work by two New Testament scholars is the first comprehensive social history of the earliest churches. Integrating the historical and social data, they locate the ancient Galileans, Judeans, and the Jesus movement in their respective matrices. The Stegemanns deal with such issues as conflict between the messianic communities and the rest of Judaism, religious pluralism, social stratification, group composition, gender division, ancient economics, and urban/rural distinctions.
Wolfgang Gründinger explores how interest groups, veto opportunities, and electoral pressure shaped the German energy transition: nuclear exit, renewables, coal (CCS), and emissions trading. His findings provide evidence that the logics of political competition in German politics have fundamentally changed over the last two decades with respect to five distinct mechanisms: the end of 'fossil-nuclear' corporatism, the new importance of trust in lobbying, 'green' path dependence, the emergence of a 'Green Grand Coalition', and intra-party fights over energy politics.
Research on Pentecostal and Charismatic Christianity has increased dramatically in recent decades, and a diverse array of disciplines have begun to address a range of elements of these movements. Yet, there exists very little understanding of Pentecostal theology, and it is not uncommon to encounter stereotypes and misperceptions. Addressing this gap in current research, The Routledge Handbook of Pentecostal Theology is an exceptional reference source to the key topics, challenges, and debates in this growing field of study and is the first collection of its kind to offer a comprehensive presentation and critical discussion of this subject. Comprising over forty chapters written by a team of international contributors, the Handbook is divided into five parts: Contextualizing Pentecostal Theology; Sources; Theological Method; Doctrines and Practices; and Conversations and Challenges. These sections take the reader through a comprehensive introduction to what Pentecostals believe and how they practice their faith. Looking at issues such as the core teachings of Pentecostalism concerning Spirit baptism, divine healing, or eschatology; unique practices, such as spiritual warfare and worship; and less discussed issues, such as social justice and gender, each chapter builds towards a nuanced and global picture of the theology of the Pentecostal movement. The Routledge Handbook of Pentecostal Theology is essential reading for students and researchers in Pentecostal Studies, World Christianity, and Theology as well as scholars working in contemporary Religious Studies.
Young people about to leave high school argue that they are determining their own destinies. Scholarly debates also suggest that the influence of structural factors such as social class on an individual's life course is decreasing. Wolfgang Lehmann challenges this view and offers a detailed comparative analysis of the inter-relationships between social class, institutional structures, and individual educational and career choices. Through a qualitative study of academic-track high school students and participants in youth apprenticeships in Germany and Canada, Lehmann shows how the range of available school-work transition options is defined by both gender and social class. Highlighting the importance of the institutional context in understanding school-work transitions, particularly in relation to Germany's celebrated apprenticeship system, which rests on highly streamed secondary schooling and a stratified labour market, Lehmann argues that social inequalities are maintained in part by the choices made by young people, rather than simply by structural forces. Choosing to Labour? concludes with an exploration of how public policy can meet the dual challenge of providing young people with meaningful and equitable educational experiences, while simultaneously fulfilling the need for a skilled workforce.
Why do so many people become overweight and obese and why do they find it so difficult to lose weight? In this second edition of his influential book on Dieting, Overweight and Obesity, Wolfgang Stroebe – who developed the goal conflict model of eating – explores the physiological, environmental and psychological influences on weight gain and examines how these processes are affected by genetic factors. Like the first edition, the book takes a social-cognitive approach to weight regulation and discusses how exposure to environmental cues can set off overeating in chronic dieters. In addition to extensively revising and updating the chapters of the first edition, this second edition features three new chapters. The chapter on successful restrained eating reviews personality factors as well as recent experimental research on impulse control. The chapters on psychological treatment of obesity and on primary prevention describe and evaluate the various treatment and prevention approaches and the research conducted to assess their efficacy. This book is essential reading for students, researchers and clinicians interested in an up-to-date review of the field of eating research and a new theoretical approach to the study of overweight and obesity.
Adaptive Multimodal Interactive Systems introduces a general framework for adapting multimodal interactive systems and comprises a detailed discussion of each of the steps required for adaptation. This book also investigates how interactive systems may be improved in terms of usability and user friendliness while describing the exhaustive user tests employed to evaluate the presented approaches. After introducing general theory, a generic approach for user modeling in interactive systems is presented, ranging from an observation of basic events to a description of higher-level user behavior. Adaptations are presented as a set of patterns similar to those known from software or usability engineering. These patterns describe recurring problems and present proven solutions. The authors include a discussion on when and how to employ patterns and provide guidance to the system designer who wants to add adaptivity to interactive systems. In addition to these patterns, the book introduces an adaptation framework, which exhibits an abstraction layer using Semantic Web technology. Adaptations are implemented on top of this abstraction layer by creating a semantic representation of the adaptation patterns. The patterns cover graphical interfaces as well as speech-based and multimodal interactive systems.
Business environments are now frequently described as VUCA: volatile, uncertain, complex and ambiguous. The outbreak and global spread of the COVID-19 pandemic in 2020 serves as a case in point. Strategies, business models, tactics and plans set for the year were challenged. In this situation, executives around the world did not suffer from insufficient general knowledge about strategizing, business modelling or planning. This book posits that what practitioners and their organizations needed to survive and thrive is practical wisdom. Executive education institutions play a key role in supporting an executive's learning. Embarking on exploratory research and a journey of discovery, this study addresses the crucial questions of how practical wisdom is built in executive education and how executive education course participants perceive the process of developing practical wisdom in business schools. The research adopts a constructivist grounded theory design and relies on in-depth interviews as the foundation for an emerging substantive theory. It portrays a three-act process, with six concrete steps within it, to explain how study participants grew their practical wisdom. The book and the presented research contribute to the academic body of knowledge on both how to learn better and how to add more value in executive education. Regarding practice, business school leaders and faculty members benefit from this research by critically comparing their approaches to the proposed model in order to trigger improvements. Finally, the individual program participant can gain a better understanding of how to learn faster and in more directions, which contributes to a better return on investment (ROI) and return on education (ROE). It also prepares the learner more adequately for this VUCA world.
A widely used, classroom-tested text, Applied Medical Image Processing: A Basic Course delivers an ideal introduction to image processing in medicine, emphasizing the clinical relevance and special requirements of the field. Avoiding excessive mathematical formalisms, the book presents key principles by implementing algorithms from scratch and using simple MATLAB®/Octave scripts with image data and illustrations on an accompanying companion website. Organized as a complete textbook, it provides an overview of the physics of medical image processing and discusses imaging physics, clinical applications of image processing, image formats and data storage, intensity transforms, filtering of images and applications of the Fourier transform, three-dimensional spatial transforms, volume rendering, image registration, tomographic reconstruction and basic machine learning. This Third Edition of the bestseller contains a brand-new chapter on the basics of machine learning; devotes more attention to the subject of color space; includes additional examples from radiology, internal medicine, surgery, and radiation therapy; and incorporates freely available programs in the public domain (e.g., GIMP, 3DSlicer, and ImageJ) when applicable. Beneficial to students of medical physics, biomedical engineering, computer science, applied mathematics, and related fields, as well as medical physicists, radiographers, radiologists, and other professionals, Applied Medical Image Processing: A Basic Course, Third Edition is fully updated and expanded to ensure a perfect blend of theory and practice. Wolfgang Birkfellner studied theoretical physics at, and holds a Ph.D. in medical physics from, the University of Vienna, Austria. Currently, he is heading the Digital Image Processing Laboratory at the Center for Biomedical Engineering and Physics at the Medical University of Vienna. He is also a reviewer and editorial board member for major journals in the field, program committee member for international conferences, and principal investigator for several third-party funded research projects. Previously, he served as senior researcher at the University Hospital Basel/Switzerland and associate professor of medical physics at the Center for Biomedical Engineering and Physics of Vienna Medical School.
In 1984 Desmond O'Connor and David Phillips published their comprehensive book "Time-correlated Single Photon Counting". At that time, time-correlated single photon counting, or TCSPC, was used primarily to record fluorescence decay functions of dye solutions in cuvettes. From the beginning, TCSPC was an amazingly sensitive and accurate technique with excellent time-resolution. However, acquisition times were relatively slow due to the low repetition rate of the light sources and the limited speed of the electronics of the 70s and early 80s. Moreover, TCSPC was intrinsically one-dimensional, i.e. limited to the recording of the waveform of a periodic light signal. Even with these limitations, it was a wonderful technique. More than 20 years have elapsed, and electronics and laser techniques have made impressive progress. The number of transistors on a single chip has approximately doubled every 18 months, resulting in a more than 1,000-fold increase in complexity and speed. The repetition rate and power of pulsed light sources have increased by about the same factor.
This textbook explains how mountains are formed and why there are old and young mountains. It provides a reconstruction of the Earth's paleogeography and shows why the shapes of South America and Africa fit so well together. Furthermore, it explains why the Pacific is surrounded by a ring of volcanoes and earthquake-prone areas while the edges of the Atlantic are relatively peaceful. This thoroughly revised textbook edition addresses all these questions and more through the presentation and explanation of the geodynamic processes upon which the theory of continental drift is based and which have led to the concept of plate tectonics. It is a source of information for students of geology, geophysics, geography, and the geosciences in general, as well as for students of the natural sciences, professionals, and interested laymen.
This book presents bond graph model-based fault detection with a focus on hybrid system models. The book addresses model design, simulation, control and model-based fault diagnosis of multidisciplinary engineering systems. The text begins with a brief survey of the state-of-the-art, then focuses on hybrid systems. The author then uses different bond graph approaches throughout the text and provides case studies.
A widely used, classroom-tested text, Applied Medical Image Processing: A Basic Course delivers an ideal introduction to image processing in medicine, emphasizing the clinical relevance and special requirements of the field. Avoiding excessive mathematical formalisms, the book presents key principles by implementing algorithms from scratch and using simple MATLAB®/Octave scripts with image data and illustrations on an accompanying CD-ROM or companion website. Organized as a complete textbook, it provides an overview of the physics of medical image processing and discusses image formats and data storage, intensity transforms, filtering of images and applications of the Fourier transform, three-dimensional spatial transforms, volume rendering, image registration, and tomographic reconstruction. This Second Edition of the bestseller contains two brand-new chapters on clinical applications and image-guided therapy; devotes more attention to the subject of color space; includes additional examples from radiology, internal medicine, surgery, and radiation therapy; and incorporates freely available programs in the public domain (e.g., GIMP, 3DSlicer, and ImageJ) when applicable. Beneficial to students of medical physics, biomedical engineering, computer science, applied mathematics, and related fields, as well as medical physicists, radiographers, radiologists, and other professionals, Applied Medical Image Processing: A Basic Course, Second Edition is fully updated and expanded to ensure a perfect blend of theory and practice.
This voluminous work on Church History by Philip Schaff (1819-1893) was originally published between 1858 and 1893 in eight volumes in the USA and covers the period from the beginnings of Biblical Christianity in A.D. 1 to the History of the Reformation in Germany and Switzerland (1517-1648). Though still a popular text in North America, this work had been out of print for over a century and has now been carefully edited and reformatted for republication in three volumes, each of them containing the text of two volumes of the original edition. Schaff's work, unlike other works in the field, covers a multitude of church history-related aspects, from church doctrine, policy, events and processes to aspects of social, moral and family life, the arts, and more. It is a very comprehensive text, extremely well-written and readable, rich in material and sources used, and attests to the excellence of Protestant German theological scholarship under the influence of the emerging Historical-Critical Biblical Exegesis of his time. This first volume covers the period from the beginnings to the Ante-Nicene Fathers (A.D. 1-311).
The study tackles the subject in a new and unique way: because the borders between classical academic disciplines disappear at the nanoscale, a truly interdisciplinary approach is chosen. A functional definition of nanotechnology is developed by the authors as a basis for the further sections of the study. The most important results enable recommendations with respect to scientific progress, industrial relevance, economic potential, educational needs, potential adverse health effects, and philosophical aspects of nanotechnology. The book addresses the relevant decision-making levels, the media, and academia.
Gravity interpretation involves inversion of data into models, but it is more. Gravity interpretation is used in a “holistic” sense going beyond “inversion”. Inversion is like optimization within certain a priori assumptions, i.e., all anticipated models lie in a limited domain of the a priori errors. No source should exist outside the anticipated model volume, but that is never literally true. Interpretation goes beyond this by taking “outside” possibilities into account in the widest sense. Any neglected possibility carries the danger of seriously affecting the interpretation. Gravity interpretation pertains to wider questions such as the shape of the Earth, the nature of the continental and oceanic crust, isostasy, forces and stresses, geological structure, finding useful resources, climate change, etc. Interpretation is often used synonymously with modelling and inversion of observations toward models. Interpretation places the inversion results into the wider geological or economic context and into the framework of science and humanity. Models play a central role in science. They are images of phenomena of the physical world, for example, scale images or metaphors, enabling the human mind to describe observations and relationships by abstract mathematical means. Models have served orientation and survival in a complex, partly invisible physical and social environment.
Expanded to twice as many entries as the 1985 edition, and updated with new publications, new editions of previous entries, titles missed the first time around, more of the artists' own writings, and monographs that deal with significant aspects or portions of an artist's work though not all of it. The listing is alphabetical by artist, and the index by author. The works cited include analytical, critical, biographical, and enumerative studies; their formats range from books and catalogues raisonnés to exhibition and auction sale catalogues. A selection of biographical dictionaries containing information on artists is arranged by country. Annotation copyrighted by Book News, Inc., Portland, OR
A comprehensive, multidisciplinary perspective on endonasal endoscopic skull base surgery This book presents a complete step-by-step guide to endonasal endoscopic skull base surgery, written by prominent interdisciplinary specialists and reflecting important recent developments in the field. Combining the fundamentals of skull base anatomy and pathology with current diagnostic and interventional imaging techniques, Endonasal Endoscopic Surgery of Skull Base Tumors provides a solid clinical foundation for anyone working in this challenging and evolving specialty. Special features include: state-of-the-art contributions from international experts in endonasal endoscopic skull base surgery; a 360-degree panoramic assessment of skull base pathologies; description of basic and advanced endoscopic procedures based on the endonasal corridor system; current tumor-specific strategies, including indications and preoperative work-up, endoscopic surgical techniques, sequelae and potential complications, postoperative care, outcomes, and pearls and pitfalls; clear and consistent interdisciplinary guidelines for managing the internal carotid artery in skull base surgery, allowing the removal of previously inoperable tumors; and surgical outcomes from two of the leading international skull base centers, one in Fulda, Germany (formerly headed by Professor Draf), and one joint program at the University of Brescia and University of Varese, Italy. Complete with 500 full-color photographs, anatomic illustrations, flowcharts and tables, Endonasal Endoscopic Surgery of Skull Base Tumors offers a practical management approach and sets a new standard in the field. It is invaluable for all otolaryngologists, head and neck surgeons, neurosurgeons, neuroradiologists, and pathologists who routinely make diagnostic and therapeutic decisions with regard to skull base lesions. It is also an essential text and reference for those who are learning how to perform endonasal endoscopic skull base surgery in a multidisciplinary environment.
This book and CD-ROM offer a complete simulation system for modeling groundwater flow and transport processes. The companion full-version software (PMWIN) comes with a professional graphical user interface, supported models and programs, and several other useful modeling tools. Tools include a Presentation Tool, a Result Extractor, a Field Interpolator, a Field Generator, a Water Budget Calculator and a Graphic Viewer. Book and CD-ROM are targeted at novice and experienced groundwater modelers.
Only a few solvers are currently available for eigenvalue optimization problems, and in the case of nonlinear objectives and/or constraints the capabilities of existing methods are still limited. This contribution addresses two classes of eigenvalue optimization problems: the maximization (minimization) of the smallest (largest) eigenvalue of a real symmetric matrix, and optimization subject to inequalities constraining the real parts of all eigenvalues of a real square matrix. It considers the reformulation of such problems into optimization problems subject to the positive definiteness of a suitable matrix, so that efficient and robust off-the-shelf solvers can be used. It revisits the previously suggested use of Sylvester's criterion and proposes to alternatively employ Cholesky decomposition to enforce the positive definiteness constraints. The methodology is implemented in an integrated symbolic-numeric computational environment. A comparative computational study demonstrates that the latter approach performs better than the former, at least on the set of examples studied.
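The positive-definiteness reformulation lends itself to a compact numerical illustration. The sketch below is a minimal example, not the code from this contribution (the function names and the bisection wrapper are hypothetical): it uses the fact that numpy's Cholesky routine succeeds exactly when its argument is symmetric positive definite, and exploits that test to recover the smallest eigenvalue of a symmetric matrix as the largest t for which A - tI remains positive definite.

```python
import numpy as np

def is_positive_definite(M):
    # np.linalg.cholesky succeeds iff M is (numerically) symmetric
    # positive definite, so the factorization doubles as a feasibility test.
    try:
        np.linalg.cholesky(M)
        return True
    except np.linalg.LinAlgError:
        return False

def smallest_eigenvalue_by_bisection(A, lo=-1e6, hi=1e6, tol=1e-9):
    # The smallest eigenvalue of a symmetric A is the largest t such that
    # A - t*I is still positive definite; bisect on t using the test above.
    I = np.eye(A.shape[0])
    while hi - lo > tol:
        t = 0.5 * (lo + hi)
        if is_positive_definite(A - t * I):
            lo = t   # still feasible: the smallest eigenvalue exceeds t
        else:
            hi = t   # infeasible: the smallest eigenvalue is at most t
    return 0.5 * (lo + hi)

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
print(smallest_eigenvalue_by_bisection(A))  # ~2.3820, agrees with np.linalg.eigvalsh(A)[0]
```

Because the factorization either succeeds or fails, the same test can be posed as a constraint to an off-the-shelf nonlinear solver, which is the essence of the reformulation described above; Sylvester's criterion (checking all leading principal minors) would give the same yes/no answer at higher cost.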
The book is intended to help undergraduate and postgraduate students and young scientists in the correct application of NMR to the solution of physico-chemical problems concerning the study of equilibria in solution. The first part of the book (Chapters 1-3) covers the basics and should enable a student to design and conduct simple physico-chemical NMR experiments. The following chapters give illustrative material on the physico-chemical applications of NMR of increasing complexity. These chapters include the problem of determining equilibrium and rate constants in solution, the study of paramagnetism using NMR, the application of Dynamic NMR techniques, and relaxation measurements. A multipurpose nonlinear regression program is supplied (on disc for PC) and is referred to throughout the book.
"The writing is superb... each (Nelles) guide is delightfully comprehensive, a solid source of reliable information for the traveller... All travel guides claim to be comprehensive, but we found Nelles Guides superior". -- Arizona Senior World "(The Nelles Guides are) . . . beautifully photographed . . . the maps are better than Insight's, and practical information is integrated with the text, not relegated to the end". -- National Geographic Traveller -- Quality writing, often by native writers -- Detailed sections on the history, culture, special features and festivals -- Accommodations, restaurant guides, sights to see, places to shop, how to get around
Reliable models for rate-based phenomena are the backbone of model-based process design. These models are often unknown in the early design phase and need to be determined from laboratory experiments. Although model-based experimental analysis and process design are often executed sequentially, the resulting kinetic models might not be suitable for reliably designing a process. In this paper, we address this problem and present a first step toward the integration of model identification and process optimization. Rather than decoupling model identification and process optimization, we use information from process optimization to design optimal experiments for improving the quality of the kinetic model given the intended use of the model. Sensitivities, which describe the influence of parametric uncertainties on the economic objective used in process optimization, are used as weights for optimal experimental design. This way, the confidence in the parameter values is maximized to reduce their influence on the process optimization objective. This first step toward the integration of model identification and process optimization improves the predictive quality of a reaction kinetic model for process design without any further experimental effort.
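The weighting idea can be made concrete with a small sketch (hypothetical names and toy data, not the authors' formulation): for each candidate experiment, the parameter covariance implied by its measurement sensitivities is propagated onto the process-optimization objective through the objective's parameter gradient, and the candidate that most reduces the induced objective variance is selected.

```python
import numpy as np

def objective_weighted_design(candidates, g):
    # candidates: list of sensitivity matrices S (measurements x parameters)
    #             for each candidate experiment.
    # g:          gradient dJ/dtheta of the economic objective with respect
    #             to the kinetic parameters, used as the design weight.
    scores = []
    for S in candidates:
        F = S.T @ S                 # Fisher information (unit measurement noise assumed)
        cov = np.linalg.inv(F)      # approximate parameter covariance
        scores.append(g @ cov @ g)  # variance induced on the objective J
    return int(np.argmin(scores))   # experiment that best pins down J

# Toy usage: two hypothetical candidate experiments, two kinetic parameters.
rng = np.random.default_rng(0)
candidates = [rng.normal(size=(10, 2)), 2.0 * rng.normal(size=(10, 2))]
g = np.array([1.0, 0.1])  # J is far more sensitive to the first parameter
print(objective_weighted_design(candidates, g))
```

In this toy setting the selection rule is a standard gradient-weighted design criterion, which is one way to realize the sensitivity-weighted experimental design described above: precision is bought where it matters for the process objective, not uniformly across all parameters.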