A monumental new biography of a pivotal yet poorly understood pioneer in modern philosophy. When a painter once told Goethe that he wanted to paint the most celebrated man of the age, Goethe directed him to Georg Wilhelm Friedrich Hegel. Hegel worked from the credo: To philosophize is to learn to live freely. While he was slow and cautious in the development of his philosophy, his intellectual growth was like an odyssey of the mind, and, contrary to popular belief, his life was full of twists and turns, suspense and even danger. In this landmark biography, the philosopher Klaus Vieweg paints a new picture of the life and work of the most important representative of German idealism. His vivid portrait provides readers with an intimate account of Hegel's times and the milieu in which he developed his thought, along with detailed, clear-sighted analyses of Hegel's four major works. What results is a new interpretation of Hegel through the lens of reason and freedom. Vieweg draws on extensive archival research that has brought to light a wealth of hitherto undiscovered documents and handwritten notes relating to Hegel's work, touching on Hegel's engagement with the leading thinkers and writers of his age: Kant, Fichte, Schelling, Hölderlin, and others. Combating clichés and misunderstandings about Hegel, Vieweg also offers a sustained defense of the philosopher's more progressive impulses. Highly praised upon its release in Germany as having set a new biographical standard, this monumental work emphasizes Hegel's relevance for today, depicting him as a vital figure in the history of philosophy.
This volume brings together essays on Hegel from various decades of my involvement with the most important philosophical thinker of modernity. It is directed against some of the misinterpretations, malicious legends, and fairy tales about Hegel that are still prevalent today.
More than 80 personalities, in or from Germany, who over the centuries have shaped the development of analytical chemistry are introduced through brief biographies. These accounts go beyond summarising key biographical information and outline each individual's contributions to analytical chemistry. This richly illustrated Brief offers a unique resource of information that is not available elsewhere.
Role-based access control (RBAC) is a widely used technology to control information flows as well as control flows within and between applications in compliance with restrictions implied by security policies, in particular, to prevent disclosure of information or access to resources beyond the restrictions defined by those security policies. Since RBAC only provides the alternatives of either granting or denying access, more fine-grained control of information flows such as “granting access to information provided that it will not be disclosed to targets outside our organisation during further processing” is not possible. In business processes, in particular those spanning several organisations, which are commonly defined using the Business Process Execution Language (BPEL), useful information flows that do not violate security policy-implied limitations would be prevented if only the access control capabilities offered by RBAC were in use. The book shows a way of providing more refined methods of information flow control that allow access to information or resources to be granted by taking into consideration the prior or subsequent information flow in a business process requesting this access. The methods proposed are comparatively easy to apply and have been proven to be largely machine-executable by a prototypical realisation. In addition, the methods are extended to be applicable to BPEL-defined workflows that make use of Grid services or Cloud services. The book addresses IT security specialists; Chief Information Officers (CIOs); Chief Security Officers (CSOs); security policy and quality assurance officers and managers; business process and Web/Grid/Cloud service designers, developers, and operational managers; and students and other learners in the field of security management.
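The binary grant-or-deny nature of RBAC described above can be illustrated with a minimal sketch in Python. The role names, permissions, and the `is_allowed` helper are illustrative assumptions for this sketch, not part of any standard or of the book's own implementation:

```python
# Minimal RBAC sketch: each role maps to a set of permissions, and an
# access request is either granted or denied -- nothing in between.
ROLE_PERMISSIONS = {
    "clerk": {"read_order"},
    "manager": {"read_order", "read_customer_data"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Classic RBAC check: grant if and only if the role holds the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# RBAC can answer "may this role access this resource?" ...
print(is_allowed("clerk", "read_customer_data"))    # False
print(is_allowed("manager", "read_customer_data"))  # True
# ... but it cannot express a condition such as "grant access provided
# the data is not later forwarded outside the organisation" -- that
# requires the kind of information-flow tracking the book develops.
```

The sketch makes the limitation concrete: the decision function sees only the role and the requested permission, never the prior or subsequent flow of the information in the business process.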
Calculi of temporal logic are widely used in modern computer science. The temporal organization of information flows in the different architectures of laptops, the Internet, or supercomputers would not be possible without appropriate temporal calculi. In the age of digitalization and high-tech applications, people are often not aware that temporal logic is deeply rooted in the philosophy of modalities. A deep understanding of these roots opens avenues to the modern calculi of temporal logic, which have emerged by extending modal logic with temporal operators. Computationally, temporal operators can be introduced in different formalisms of increasing complexity, such as Basic Modal Logic (BML), Linear-Time Temporal Logic (LTL), Computation Tree Logic (CTL), and Full Computation Tree Logic (CTL*). Proof-theoretically, these formalisms of temporal logic can be interpreted by the sequent calculus of Gentzen, tableau-based calculi, automata-based calculi, game-based calculi, and dialogue-based calculi, each with different advantages for different purposes, especially in computer science. The book culminates in an outlook on trendsetting applications of temporal logics in future technologies such as artificial intelligence and quantum technology. However, it will not be sufficient, as in traditional temporal logic, to start from the everyday understanding of time. Since the 20th century, physics has fundamentally changed the modern understanding of time, which now also determines technology. In temporal logic, we are only just beginning to grasp these differences in proof theory, which requires interdisciplinary cooperation between proof theory, computer science, physics, technology, and philosophy.
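For a flavour of how temporal operators extend propositional reasoning, here is a small sketch (not taken from the book) that evaluates the LTL operators F ("eventually") and G ("globally") over a finite trace of states. Finite-trace semantics is a deliberate simplification of the usual infinite-trace definition, and the trace itself is invented for illustration:

```python
from typing import Callable, Sequence

# A state assigns truth values to atomic propositions.
State = dict

def eventually(p: Callable[[State], bool], trace: Sequence[State]) -> bool:
    """F p: p holds in at least one state of the trace (finite-trace semantics)."""
    return any(p(s) for s in trace)

def globally(p: Callable[[State], bool], trace: Sequence[State]) -> bool:
    """G p: p holds in every state of the trace."""
    return all(p(s) for s in trace)

# A toy trace: a request is raised, then acknowledged.
trace = [{"req": True, "ack": False}, {"req": False, "ack": True}]
print(eventually(lambda s: s["ack"], trace))  # True: ack eventually holds
print(globally(lambda s: s["req"], trace))    # False: req does not always hold
```

Model-checking tools evaluate such formulas not over a single trace but over all runs of a transition system, which is where the automata-, tableau- and game-based calculi mentioned above come into play.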
This second edition of a bestselling textbook offers an instructive and comprehensive overview of our current knowledge of biocatalysis and enzyme technology. The book now contains about 40% more printed content. Three chapters are completely new, while the others have been thoroughly updated, and a section with problems and solutions, as well as new case studies, has been added. Following an introduction to the history of enzyme applications, the text goes on to cover in depth enzyme mechanisms and kinetics, production, recovery, characterization and design by protein engineering. The authors treat a broad range of applications of soluble and immobilized biocatalysts, including whole-cell systems, the use of non-aqueous reaction systems, applications in organic synthesis, bioreactor design and reaction engineering. Methods to estimate sustainability, important internet resources and their evaluation, and legislation concerning the use of biocatalysts are also covered.
This new edition also treats smart materials and artificial life. A new chapter on information and computational dynamics takes up many recent discussions in the community.
Ruthenberg highlights the unique aspects of chemistry, specifically its metachemical fundamentals, which have been largely overlooked in current philosophies of science. Conventional metaphysics, derived from or focused on theoretical physics, is inadequate when applied to chemistry. The author examines and integrates historical and philosophical perspectives on important aspects of chemistry, including affinity, compositionism, emergence, synthesis/analysis, atomism/non-atomism, chemical species, chemical bond, chemical concepts, plurality, temporality/potentiality, reactivity, and underdetermination. To accomplish this, he draws on the works of notable chemists such as František Wald, Wilhelm Ostwald, Friedrich Paneth, and Hans Primas, who have contributed to the philosophical understanding of chemistry. The central conclusion of this study aligns with Immanuel Kant's viewpoint: Chemistry is a systematic art.
This much-needed book presents a clear and very practice-oriented overview of thermal separation processes. An extensive introduction elucidates the physical and physicochemical fundamentals of the different unit operations used to separate homogeneous mixtures. This is followed by a concise text with numerous explanatory figures and tables referring to process and design, flowsheets, basic engineering and examples of separation process applications. Very helpful guidance in the form of process descriptions, calculation models and operating data is presented in an easy-to-understand manner, assisting the practicing engineer in choosing and evaluating separation processes and facilitating the modeling and design of innovative equipment. A comprehensive reference list provides the opportunity to follow up on specific separation problems. Chemical and mechanical engineers, chemists, physicists and biotechnologists in research and development, plant design and environmental protection, as well as students in chemical engineering and the natural sciences, will find this all-embracing reference guide of tremendous value and practical use.
In this textbook, fundamental methods for model-based design of mechatronic systems are presented in a systematic, comprehensive form. The method framework presented here comprises domain-neutral methods for modeling and performance analysis: multi-domain modeling (energy/port/signal-based), simulation (ODE/DAE/hybrid systems), robust control methods, stochastic dynamic analysis, and quantitative evaluation of designs using system budgets. The model framework is composed of analytical dynamic models for important physical and technical domains of realization of mechatronic functions, such as multibody dynamics, digital information processing and electromechanical transducers. Building on the modeling concept of a technology-independent generic mechatronic transducer, concrete formulations for electrostatic, piezoelectric, electromagnetic, and electrodynamic transducers are presented. More than 50 fully worked out design examples clearly illustrate these methods and concepts and enable independent study of the material.
This book conveys the theoretical and experimental basics of well-founded measurement techniques in the areas of high DC, AC and surge voltages as well as the corresponding high currents. Additional chapters explain the acquisition of partial discharges and the electrical measured variables. Equipment exposed to very high voltages and currents is used for the transmission and distribution of electrical energy. It is therefore tested for reliability before commissioning, using standardized and future test and measurement procedures. The book therefore also covers procedures for calibrating measurement systems and determining measurement uncertainties, and discusses the current state of measurement technology with electro-optical and magneto-optical sensors.
Foundations of Dialectical Psychology is a compilation of the writings of Klaus F. Riegel on dialectical psychology. The book presents chapters discussing such topics as the dialectics of human development; history of dialectical psychology; temporal organization of dialogues; and the analysis of the concept of crisis and its underlying philosophical model and ideology. Psychologists and students will find the book invaluable.
This new edition of the near-legendary textbook by Schlichting and revised by Gersten presents a comprehensive overview of boundary-layer theory and its application to all areas of fluid mechanics, with particular emphasis on the flow past bodies (e.g. aircraft aerodynamics). The new edition features an updated reference list and over 100 additional changes throughout the book, reflecting the latest advances on the subject.
Translation by: Laura Grossmann. This book presents, for the first time in English, the concept of systemic organization development and its use in management and consultancy. It demonstrates in a succinct and compact way how the systemic approach, in its up-to-date version, is well suited to describing and handling complex challenges in diverse organizations across all sectors of society. First, the authors sketch out the crucial role organizations play today and the increasing importance of their ability to change. The central theme of the book is thus the design of organizational change processes with the help of different tools. These tools deal cautiously with employees, clients and cooperation partners in order to ensure the sustainable success of an organization. In the final chapters the authors delve into specific attitudes during the change process, such as building trust and allowing emotions. Several cases illustrate how the concept and the tools promote organizational development, and the book thus provides a practical guideline. It also addresses important aspects managers have to pay attention to, such as dealing with concerns and resistance. The values of the systemic concept, such as sustainability, selective participation and growth from inside, are convincingly exemplified. The book is theoretically sound, grounded in the authors' long management and consulting experience and their university-based research activities. It is addressed mainly to actors in corporations, not-for-profit and public organizations whose task it is to organize, design and effectuate change while daily business continues alongside. These actors may be leaders, managers, experts, consultants, project managers or employees.
Cosmic evolution leads from symmetry to complexity by symmetry breaking and phase transitions. The emergence of new order and structure in nature and society is explained by physical, chemical, biological, social and economic self-organization, according to the laws of nonlinear dynamics. All these dynamical systems are considered computational systems processing information and entropy. Are symmetry and complexity only useful models of science or are they universals of reality? Symmetry and Complexity discusses the fascinating insights gained from natural, social and computer sciences, philosophy and the arts. With many diagrams and pictures, this book illustrates the spirit and beauty of nonlinear science. In the complex world of globalization, it strongly argues for unity in diversity.
This book is devoted to non-destructive materials characterization (NDMC) using different non-destructive evaluation techniques. It presents the theoretical basis, physical understanding, and technological developments in the field of NDMC, with suitable examples for engineering and materials science applications. It is written for engineers and researchers in R&D, design, production, quality assurance, and non-destructive testing and evaluation. The relevance of NDMC lies in achieving higher reliability, safety, and productivity, both for monitoring production processes and for in-service inspections to detect degradations, which are often precursors of macro-defects and failure of components. Ultrasonic, magnetic, electromagnetic and X-ray-based NDMC techniques are discussed in detail, with brief discussions of electron- and positron-based techniques.
This book covers the material of an introductory course in linear algebra. Topics include sets and maps, vector spaces, bases, linear maps, matrices, determinants, systems of linear equations, Euclidean spaces, eigenvalues and eigenvectors, diagonalization of self-adjoint operators, and classification of matrices. It contains multiple choice tests with commented answers.
This book provides a solid foundation in the most important formalisms used for the specification and verification of reactive systems. In particular, the text presents all important results on the μ-calculus, ω-automata, and temporal logics, shows the relationships between these formalisms, and describes state-of-the-art verification procedures for them. It also discusses the advantages and disadvantages of these formalisms, highlighting their strengths and weaknesses. Most results are given with detailed proofs, so that the presentation is almost self-contained. The book includes all definitions without relying on other material, proves all theorems in detail, and presents detailed algorithms in pseudo-code for verification as well as translations to other formalisms.
Methyl methacrylate (MMA) is the basic component of bone cements. To use it, a dough is prepared from the liquid and powder by mixing right before application, which is normally done by the operating team. During its working phase the dough is then inserted into the tissue, where polymerization is completed. Thus, the final implant, polymethyl methacrylate (PMMA), is only created at the implantation site. Besides methyl methacrylate, bone cements sometimes contain other methacrylates, such as butyl methacrylate. To achieve X-ray opacity, radiopacifiers (zirconium dioxide or barium sulfate) are added to the powder. Both the liquid and powder components contain additives (initiator and activator) that launch polymerization and control the setting when mixed together. Moreover, softeners and emulsifiers are sometimes used. The addition of antibiotics to the powder component in order to prevent or treat infections has become especially important. Commercial bone cements differ in composition and the course of curing. Some are designed for high and others for low viscosity. The way the user handles and applies the cement always crucially influences the quality of the implant. This is why clear and comprehensive information about the cements should be available to show the user how all the relevant factors work together and how they depend on each other.
The main goal of this book is to present the methods used to calculate the most important parameters for ropes, and to explain how they are applied on the basis of numerous sample calculations. The book, based on the most important chapters of the German book DRAHTSEILE, has been updated to reflect the latest developments, with the new edition especially focusing on computational methods for wire ropes. Many new calculations and examples have also been added to facilitate the dimensioning and calculation of mechanical characteristics of wire ropes. This book offers a valuable resource for all those working with wire ropes, including construction engineers, operators and supervisors of machines and installations involving wire ropes.
This book reviews the current state of the theory of pattern formation by a liquid-solid interface during crystal growth. It gives a pedagogical introduction to the subject, including experimental results, mathematical modeling and linear stability analysis. After highlighting the success of the theory in resolving the selection problem of dendritic growth, various new directions of research are presented in which progress has been made recently. These are the formation of nondendritic seaweed-like structures, growth of lamellar eutectics and rapid solidification. The interplay between analytic methods on the one hand (scaling arguments, asymptotic analysis, similarity equation, Sivashinsky singular expansion) and numerical calculations on the other (Newton method, dynamical schemes) is emphasized.
In this study of Hegel's philosophy, Brinkmann undertakes to defend Hegel's claim to objective knowledge by bringing out the transcendental strategy underlying Hegel's argument in the Phenomenology of Spirit and the Logic. Hegel's metaphysical commitments are shown to become moot through this transcendental reading. Starting with a survey of current debates about the possibility of objective knowledge, the book next turns to the original formulation of the transcendental argument in favor of a priori knowledge in Kant's First Critique. Through a close reading of Kant's Transcendental Deduction and Hegel's critique of it, Brinkmann tries to show that Hegel develops an immanent critique of Kant's position that informs his reformulation of the transcendental project in the Introduction to the Phenomenology of Spirit and the formulation of the position of 'objective thought' in the Science of Logic and the Encyclopedia of the Philosophical Sciences. Brinkmann takes the reader through the strategic junctures of the argument of the Phenomenology that establishes the position of objective thinking with which the Logic begins. A critical examination of the Introduction to the Lectures on the History of Philosophy shows that Hegel's metaphysical doctrine of the self-externalization of spirit need not compromise the ontological project of the Logic and thus does not burden the position of objective thought with pre-critical metaphysical claims. "Brinkmann's book is a remarkable achievement. He has given us what may be the definitive version of the transcendental, categorial interpretation of Hegel. He does this in a clear, approachable style punctuated with a dry wit, and he fearlessly takes on the arguments and texts that are the most problematic for this interpretation. Throughout the book, he situates Hegel firmly in his own context and that of contemporary discussion." - Terry P. Pinkard, University Professor, Georgetown University, Washington, D.C., USA
"Klaus Brinkmann's important Hegel study reads the Phenomenology and the Logic as aspects of a single sustained effort, in turning from categories to concepts, to carry Kant's Copernican turn beyond the critical philosophy in what constitutes a major challenge to contemporary Cartesianism." - Tom Rockmore, McAnulty College Distinguished Professor, Duquesne University, Pittsburgh, Pennsylvania, USA
"In this compelling reconstruction of the theme of objective thought, Klaus Brinkmann takes the reader through Hegel's dialectic with exceptional philosophical acumen.... Many aspects of this book are striking: the complete mastery of the central tenets of Kant's and Hegel's philosophy, the admirable clarity in treating obscure texts and very difficult problems, and how Brinkmann uses his expertise for a discussion of the problems of truth, objectivity and normativity relevant to the contemporary philosophical debate. This will prove to be a very important book, one that every serious student of Kant and Hegel will have to read." - Alfredo Ferrarin, Professor, Department of Philosophy, University of Pisa, Pisa, Italy
This book offers a comprehensive introduction to the theory of structural dynamics, highlighting practical issues and illustrating applications with a large number of fully worked-out examples. In the spirit of “learning by doing” it encourages readers to immediately apply these methods by means of the software provided, allowing them to become familiar with the broad field of structural dynamics in the process. The book is primarily focused on practical applications. Earthquake-resistant design is presented in a holistic manner, discussing both the underlying geophysical concepts and the latest engineering design methods, and is illustrated by fully worked-out examples based on the newest structural codes. The spectral characteristics of turbulent wind processes and the main analysis methods in the field of structural oscillations due to wind gusts and vortex shedding are also discussed, with applications illustrated by realistic examples of slender chimney structures. The user-friendly software employed is downloadable and can be readily used by readers to tackle their own problems.
A new edition of the almost legendary textbook by Schlichting completely revised by Klaus Gersten is now available. This book presents a comprehensive overview of boundary-layer theory and its application to all areas of fluid mechanics, with emphasis on the flow past bodies (e.g. aircraft aerodynamics). It contains the latest knowledge of the subject based on a thorough review of the literature over the past 15 years. Yet again, it will be an indispensable source of inexhaustible information for students of fluid mechanics and engineers alike.
The fourth edition includes new developments, in particular a new section on double beta decay, including a discussion of the possibility of a neutrinoless decay and its implications for the standard model.
Everybody knows them. Smartphones that talk to us, wristwatches that record our health data, workflows that organize themselves automatically, cars, airplanes and drones that control themselves, traffic and energy systems with autonomous logistics, or robots that explore distant planets are technical examples of a networked world of intelligent systems. Machine learning is dramatically changing our civilization. We rely more and more on efficient algorithms, because otherwise we would not be able to cope with the complexity of our civilization's infrastructure. But how secure are AI algorithms? This challenge is taken up in the 2nd edition: complex neural networks are fed and trained with huge amounts of data (big data). The number of necessary parameters explodes exponentially. Nobody knows exactly what is going on in these "black boxes". In machine learning we need more explainability and accountability of causes and effects in order to be able to decide ethical and legal questions of responsibility (e.g. in autonomous driving or medicine). Besides causal learning, we also analyze procedures of testing and verification to obtain certified AI programs. Since its inception, AI research has been associated with great visions of the future of mankind. It is already a key technology that will decide the global competition between social systems. "Artificial Intelligence and Responsibility" is another central supplement to the 2nd edition: How should we secure our individual liberty rights in the AI world? This book is a plea for technology design: AI must prove itself as a service in society.
Real-life phenomena in the engineering, natural, or medical sciences are often described by a mathematical model with the goal of numerically analyzing the behaviour of the system. Advantages of mathematical models are their cheap availability, the possibility of studying extreme situations that cannot be handled by experiments, and of simulating real systems during the design phase before constructing a first prototype. Moreover, they serve to verify decisions, to avoid expensive and time-consuming experimental tests, to analyze, understand, and explain the behaviour of systems, and to optimize design and production. As soon as a mathematical model contains differential dependencies on an additional parameter, typically the time, we call it a dynamical model. Two key questions always arise in a practical environment: (1) Is the mathematical model correct? (2) How can we quantify model parameters that cannot be measured directly? In principle, both questions are easily answered as soon as some experimental data are available. The idea is to compare measured data with predicted model function values and to minimize the differences over the whole parameter space. We have to reject a model if we are unable to find a reasonably accurate fit. To summarize, parameter estimation or data fitting, respectively, is extremely important in all practical situations where a mathematical model and corresponding experimental data are available to describe the behaviour of a dynamical system.
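The idea of minimizing the differences between measured and predicted values has, for a simple linear model, a closed-form solution. A minimal least-squares sketch in pure Python; the model, the helper `fit_line`, and the data points are invented for illustration and are not taken from the book:

```python
# Ordinary least squares for the model y = a*x + b: choose a and b so
# that the sum of squared differences between measured values ys and
# predicted values a*x + b is minimal (closed-form solution).
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical measurements scattered around y = 2x + 1:
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.1, 2.9, 5.2, 6.8]
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))  # 1.94 1.09
```

For a dynamical model the predicted values come from solving differential equations rather than evaluating a line, and the minimization requires iterative methods, but the principle of comparing data against model predictions over the parameter space is the same.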
This book is a true introduction to the basic concepts and techniques of algebraic geometry. The language is purposefully kept on an elementary level, avoiding sheaf theory and cohomology theory. The introduction of new algebraic concepts is always motivated by a discussion of the corresponding geometric ideas. The main point of the book is to illustrate the interplay between abstract theory and specific examples. The book contains numerous problems that illustrate the general theory. The text is suitable for advanced undergraduates and beginning graduate students. It contains sufficient material for a one-semester course. The reader should be familiar with the basic concepts of modern algebra. A course in one complex variable would be helpful, but is not necessary.
This book focuses on the gradual formation of the concept of ‘light quanta’ or ‘photons’, as they have usually been called in English since 1926. The great number of synonyms that have been used by physicists to denote this concept indicates that there are many different mental models of what ‘light quanta’ are: simply finite, ‘quantized packages of energy’ or ‘bullets of light’? ‘Atoms of light’ or ‘molecules of light’? ‘Light corpuscles’ or ‘quantized waves’? Singularities of the field or spatially extended structures able to interfere? ‘Photons’ in G.N. Lewis’s sense, or as defined by QED, i.e. virtual exchange particles transmitting the electromagnetic force? The term ‘light quantum’ made its first appearance in Albert Einstein’s 1905 paper on a “heuristic point of view” to cope with the photoelectric effect and other forms of interaction of light and matter, but the mental model associated with it has a rich history both before and after 1905. Some of its semantic layers go as far back as Newton and Kepler, some are only fully expressed several decades later, while others initially increased in importance then diminished and finally vanished. In conjunction with these various terms, several mental models of light quanta were developed—six of them are explored more closely in this book. It discusses two historiographic approaches to the problem of concept formation: (a) the author’s own model of conceptual development as a series of semantic accretions and (b) Mark Turner’s model of ‘conceptual blending’. Both of these models are shown to be useful and should be explored further. This is the first historiographically sophisticated history of the fully fledged concept and all of its twelve semantic layers. It systematically combines the history of science with the history of terms and a philosophically inspired history of ideas in conjunction with insights from cognitive science.
This book presents an up-to-date formalism of non-equilibrium Green's functions covering different applications ranging from solid state physics, plasma physics, and cold atoms in optical lattices up to relativistic transport and heavy ion collisions. Within the Green's function formalism, the basic sets of equations for these diverse systems are similar, and approximations developed in one field can be adapted to another. The central object is the self-energy, which includes all non-trivial aspects of the system dynamics. The focus is therefore on microscopic processes, starting from elementary principles for classical gases and the complementary picture of a single quantum particle in a random potential. This provides an intuitive picture of the interaction of a particle with the medium formed by other particles, on which the Green's function is built.
This book shows its readers how to achieve the goal of genuine IT governance. The key here is the successful development of enterprise architecture as the necessary foundation. With its capacity to span and integrate business procedures, IT applications and IT infrastructure, enterprise architecture opens these areas up to analysis and makes them rich sources of critical data. Enterprise architecture thereby rises to the status of a crucial management information system for the CIO. The focused analysis of the architecture (its current and future states) illuminates the path to concrete IT development planning and the cost-effective and beneficial deployment of IT. Profit from the author's firsthand experience - proven approaches firmly based in enterprise reality.
The principle of local activity explains the emergence of complex patterns in a homogeneous medium. At first defined in the theory of nonlinear electronic circuits in a mathematically rigorous way, it can be generalized and proven at least for the class of nonlinear reaction-diffusion systems in physics, chemistry, biology, and brain research. Recently, it was realized by memristors for nanoelectronic device applications. In general, the emergence of complex patterns and structures is explained by symmetry breaking in homogeneous media, which is caused by local activity. This book argues that the principle of local activity is really fundamental in science, and can even be identified in quantum cosmology as symmetry breaking of local gauge symmetries generating the complexity of matter and forces in our universe. Applications are considered in economic, financial, and social systems with the emergence of equilibrium states, symmetry breaking at critical points of phase transitions and risky acting at the edge of chaos.