Theories of Democracy builds on Robert Dahl's observation that there is no single theory of democracy, only theories. Beyond the broad commitment to rule by the majority, democracy involves a set of contentious debates concerning the proper function and scope of power, equality, freedom, justice, and interests. In this anthology, Ronald J. Terchek and Thomas C. Conte have brilliantly assembled the works of classical, modern, and contemporary commentators to illustrate the deep and diverse roots of the democratic ideal, and to provide materials for thinking about the ways some contemporary theories build on different traditions of democratic theorizing. The arguments addressed in Theories of Democracy appear in the voices of authors who have championed influential theories concerning the opportunities and dangers of democratic politics. Terchek and Conte have selected excerpts not to promote a particular way of looking at democracy, but to enable students to carry on an informed discourse on the meaning and purposes of democratic principles and practices. Theories of Democracy is a must for every student of democracy's past, present, and future.
This book retraces the history of macroeconomics from Keynes's General Theory to the present. Central to it is the contrast between a Keynesian era and a Lucasian, or dynamic stochastic general equilibrium (DSGE), era, each ruled by distinct methodological standards. In the Keynesian era, the book studies the following theories: Keynesian macroeconomics, monetarism, disequilibrium macroeconomics (Patinkin, Leijonhufvud, and Clower), non-Walrasian equilibrium models, and first-generation new Keynesian models. Three stages are identified in the DSGE era: new classical macroeconomics (Lucas), RBC modeling, and second-generation new Keynesian modeling. The book also examines a few selected works aimed at presenting alternatives to Lucasian macroeconomics. While not eschewing analytical content, Michel De Vroey focuses on substantive assessments, and the models studied are presented in a pedagogical and vivid yet critical way.
This book offers a provocative analysis of the neuroscience of morality. Written by three leading scholars of science, medicine, and bioethics, it critiques contemporary neuroscientific claims about individual morality and notions of good and evil. Winner of a 2021 prize from the Expanded Reason Institute, it connects moral philosophy to neoliberal economics and successfully challenges the idea that we can locate morality in the brain. Instead of discovering the source of morality in the brain as they claim to do, the popularizers of contemporary neuroscience are shown to participate in an understanding of human behavior that serves the vested interests of contemporary political economy. Providing evidence that the history of claims about morality and brain function reaches back 400 years, the authors locate the genesis of these claims in the beginnings of modern philosophy, science, and economics. They further map this trajectory through the economic and moral theories of Francis Bacon, David Hume, Jeremy Bentham, John Stuart Mill, and the Chicago School of Economics to uncover a pervasive colonial anthropology at play in the neuroscience of morality today. The book concludes with a call for a humbler and more constrained neuroscience, informed by a more robust human anthropology that embraces the nobility, beauty, frailties, and flaws in being human.
Economic Thought Since Keynes provides a concise overview of changing economic thought in the latter part of the twentieth century. Part I gives an analysis of topics including Keynes and the General Theory, the triumph of interventionism, the neoclassical synthesis, and the resurgence of liberalism. Part II gives concise biographies of the 150 most influential economists since Keynes. This invaluable book will be a useful reference tool for anyone teaching or studying economics.
The concept of economic rationality is important for the historical evolution of Economics as a scientific discipline. The common idea about this concept, even among economists, is that it has a unique meaning which is universally accepted. This new volume argues that "economic rationality" is not a universal concept with one single meaning, and that it in fact has different, if not conflicting, interpretations in the evolution of discourse on economics. In order to achieve this, the book traces the historical evolution of the concept of economic rationality from Adam Smith to the present, taking in thinkers from Mill to Friedman, and encompassing approaches from neoclassical to behavioural economics. The book charts this history in order to reveal important instances of conceptual transformation of the meaning of economic rationality. In doing so, it presents a uniquely detailed study of the historical change of the many faces of homo oeconomicus.
This book summarizes the state of the art in tree-based methods for insurance: regression trees, random forests, and boosting methods. It also presents the tools that make it possible to assess the predictive performance of tree-based models. Actuaries need these advanced analytical tools to turn the massive data sets now at their disposal into opportunities. The exposition alternates between methodological aspects and numerical illustrations or case studies. All numerical illustrations are performed with the R statistical software. The technical prerequisites are kept at a reasonable level in order to reach a broad readership. In particular, master's students in actuarial sciences and actuaries wishing to update their skills in machine learning will find the book useful. This is the second of three volumes entitled Effective Statistical Learning Methods for Actuaries. Written by actuaries for actuaries, this series offers a comprehensive overview of insurance data analytics with applications to P&C, life and health insurance.
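To give a flavor of the core idea behind the regression trees covered in such a book, here is a minimal, self-contained sketch (not taken from the book, whose own illustrations use R): a depth-1 tree, or "stump", that splits a single rating factor at the threshold minimizing within-leaf squared error. The rating factor, claim costs, and data values below are invented for illustration.

```python
# A toy regression stump: split one rating factor (driver age) at the
# threshold that minimizes the sum of squared errors within the two
# resulting leaves. Deeper regression trees apply this split search
# recursively inside each leaf.

def best_split(xs, ys):
    """Return (threshold, left_mean, right_mean) minimizing total SSE."""
    best = None
    for t in sorted(set(xs))[1:]:  # candidate thresholds between points
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    return best[1:]

# Hypothetical portfolio: driver age vs. average claim cost.
ages = [18, 19, 21, 40, 45, 50]
costs = [900, 950, 880, 300, 320, 310]
t, young, old = best_split(ages, costs)
print(t, round(young, 1), round(old, 1))
```

The stump splits the invented data at age 40, predicting the left-leaf mean for younger drivers and the right-leaf mean for older ones; real actuarial applications would use a library such as rpart or scikit-learn on actual claims data.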
The rapid growth of behavior therapy over the past 20 years has been well documented. Yet the geometric expansion of the field has been so great that it deserves to be recounted. We all received our graduate training in the mid to late 1960s. Courses in behavior therapy were then a rarity. Behavioral training was based more on informal tutorials than on systematic programs of study. The behavioral literature was so circumscribed that it could be easily mastered in a few months of study. A mere half-dozen books (by Wolpe, Lazarus, Eysenck, Ullmann, and Krasner) more-or-less comprised the behavioral library in the mid-1960s. Seminal works by Ayllon and Azrin, Bandura, Franks, and Kanfer in 1968 and 1969 made it only slightly more difficult to survey the field. Keeping abreast of new developments was not very difficult, as Behaviour Research and Therapy and the Journal of Applied Behavior Analysis were the only regular outlets for behavioral articles until the end of the decade, when Behavior Therapy and Behavior Therapy and Experimental Psychiatry first appeared. We are too young to be maudlin, but "Oh for the good old days!" One of us did a quick survey of his bookshelves and stopped counting books with behavior or behavioral in the titles when he reached 100. There were at least half again as many behavioral books without those words in the title.
Understanding distributed computing is not an easy task. This is due to the many facets of uncertainty one has to cope with and master in order to produce correct distributed software. Considering the uncertainty created by asynchrony and process crash failures in the context of message-passing systems, the book focuses on the main abstractions that one has to understand and master in order to be able to produce software with guaranteed properties. These fundamental abstractions are communication abstractions that allow the processes to communicate consistently (namely the register abstraction and the reliable broadcast abstraction), and the consensus agreement abstraction that allows them to cooperate despite failures. As they give a precise meaning to the words "communicate" and "agree" despite asynchrony and failures, these abstractions allow distributed programs to be designed with properties that can be stated and proved. Impossibility results are associated with these abstractions. Hence, in order to circumvent these impossibilities, the book relies on the failure detector approach, and, consequently, that approach to fault-tolerance is central to the book. Table of Contents: List of Figures / The Atomic Register Abstraction / Implementing an Atomic Register in a Crash-Prone Asynchronous System / The Uniform Reliable Broadcast Abstraction / Uniform Reliable Broadcast Abstraction Despite Unreliable Channels / The Consensus Abstraction / Consensus Algorithms for Asynchronous Systems Enriched with Various Failure Detectors / Constructing Failure Detectors
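To make the reliable broadcast abstraction mentioned above concrete, here is a toy, single-threaded simulation (an illustrative sketch, not the book's own algorithms or notation; the class names and crash scenario are invented). The key rule of "eager" reliable broadcast: on first receipt of a message, a process relays it to everyone before doing anything else, so any message delivered by a correct process is delivered by all correct processes, even if the original sender crashes mid-broadcast.

```python
# Toy simulation of eager reliable broadcast in a crash-prone
# message-passing system. Message delivery is modeled as a direct
# (recursive) method call; a real system would use channels and
# asynchrony, which is precisely the uncertainty the abstraction hides.

class Process:
    def __init__(self, pid):
        self.pid = pid
        self.network = []        # all processes, set after construction
        self.delivered = set()
        self.crashed = False

    def broadcast(self, msg, crash_after_sends=None):
        # Send msg to every other process; optionally crash partway
        # through, modeling a sender that fails mid-broadcast.
        sends = 0
        for p in self.network:
            if p is self:
                continue
            if crash_after_sends is not None and sends >= crash_after_sends:
                self.crashed = True
                return
            p.receive(msg)
            sends += 1

    def receive(self, msg):
        if self.crashed or msg in self.delivered:
            return
        self.delivered.add(msg)          # deliver locally, then...
        for p in self.network:           # ...eagerly relay to everyone,
            if p is not self:            # so no correct process misses it
                p.receive(msg)

procs = [Process(i) for i in range(4)]
for p in procs:
    p.network = procs

# p0 crashes after reaching only p1; the relay step still propagates
# the message to every correct process (p1, p2, p3).
procs[0].broadcast("m", crash_after_sends=1)
print(sorted(p.pid for p in procs if "m" in p.delivered))
```

The guarantee concerns only correct processes: here p1, p2, and p3 all deliver "m" despite the sender's crash (the crashed sender may or may not have delivered it itself), which is the agreement property that distinguishes reliable broadcast from plain best-effort sending.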
The hegemony of finance compels a new orientation for everyone and everything: companies care more about the moods of their shareholders than about longstanding commercial success; governments subordinate citizen welfare to appeasing creditors; and individuals are concerned less with immediate income from labor than appreciation of their capital goods, skills, connections, and reputations. That firms, states, and people depend more on their ratings than on the product of their activities also changes how capitalism is resisted. For activists, the focus of grievances shifts from the extraction of profit to the conditions under which financial institutions allocate credit. While the exploitation of employees by their employers has hardly been curbed, the power of investors to select investees — to decide who and what is deemed creditworthy — has become a new site of social struggle. In clear and compelling prose, Michel Feher explains the extraordinary shift in conduct and orientation generated by financialization. Above all, he articulates the new political resistances and aspirations that investees draw from their rated agency.
This book presents the most important fault-tolerant distributed programming abstractions and their associated distributed algorithms, in particular in terms of reliable communication and agreement, which lie at the heart of nearly all distributed applications. These programming abstractions, distributed objects or services, allow software designers and programmers to cope with asynchrony and the most important types of failures such as process crashes, message losses, and malicious behaviors of computing entities, widely known under the term "Byzantine fault-tolerance". The author introduces these notions in an incremental manner, starting from a clear specification, followed by algorithms which are first described intuitively and then proved correct. The book also presents impossibility results in classic distributed computing models, along with strategies, mainly failure detectors and randomization, that allow us to enrich these models. In this sense, the book constitutes an introduction to the science of distributed computing, with applications in all domains of distributed systems, such as cloud computing and blockchains. Each chapter comes with exercises and bibliographic notes to help the reader approach, understand, and master the fascinating field of fault-tolerant distributed computing.
Headache syndromes rank amongst the most common presenting symptoms in general practice and neurology, affecting up to 15% of the adult population. Part of the Oxford Textbooks in Clinical Neurology series, the Oxford Textbook of Headache Syndromes provides clinicians with a definitive resource for diagnosing and managing patients with primary and secondary forms of headache, either as isolated complaints or as part of a more complex syndrome. Split into 7 key sections with 59 chapters, this comprehensive work discusses the scientific basis and practical management of headache syndromes in a logical format. Each chapter is written by international experts in neurology who share their research and extensive experience by providing a wealth of practical advice for use in clinical situations. In addition, all content is up to date, and chapters incorporate discussion of the latest International Classification of Headache Disorders, 3rd edition, where relevant.
Here is a chapter from The Essentials of Risk Management, a practical, non-ivory-tower guide to effectively implementing a superior risk management program. Written by three of the leading figures with extensive practical and theoretical experience in the global risk management and corporate governance arena, this straightforward guidebook covers such topics as governance, compliance, and risk management; how to implement integrated risk management; measuring, managing, and hedging market risk; and more.