The volume results from a seminar sponsored by the 'Foundation for Intellectual History' at the Herzog August Bibliothek, Wolfenbüttel, in 1992. Starting from the theory of regressus as presented in its most developed form by William Wallace, these papers enter the vast field of the Renaissance discussion of method in its historical and systematic context. That discussion is confined neither to the notion of method in the strict sense, nor to the Renaissance within its exact historical limits, nor to the Aristotelian tradition as a well-defined philosophical school, and therefore requires a new scholarly approach. Thus, besides Galileo, Zabarella and their circles, which are regarded as crucial for the 'emergence of modern science' at the end of the sixteenth century, the contributors deal with the ancient and medieval origins and the early modern continuation of Renaissance concepts of method, and with 'non-regressive' methodologies in the various approaches of Renaissance natural philosophy, including the Lutheran and Calvinist traditions.
This book describes the doctrine and impact of Alexander of Aphrodisias, the second-century commentator on Aristotle, through the centuries and up to his sixteenth-century role as the clandestine prompter of a new philosophy of nature. In the millennium after his death, Alexander first served the Neo-Platonic schools as their authority on Aristotle, and in the Arabic centuries subsequently served as Averroes' exemplary exponent of the doctrine of the mortality of the soul. For this reason, the Latin Scholastics deemed his work unworthy of being translated. This changed only in the late Middle Ages, when Alexander emerged as the only Aristotelian alternative to Averroes. When his account of Aristotle's psychology was translated and published in 1495, his principles of a natural philosophy independent of metaphysics and grounded in sense perception finally became accessible. The prompt reception and widespread endorsement of Alexander's teaching testify to his impact throughout the sixteenth century. Originally published as Volume XVI, No. 1 (2011) of Brill's journal Early Science and Medicine.
Basic research on the pathobiology of diseases, as well as on therapeutic strategies, is usually carried out in rodent animal models. Translational research, which transfers novel results from basic research to clinical application, often requires analyses in additional non-rodent and/or large animal models that share specific pathophysiological characteristics with the human diseases in question. Pigs meet the prerequisites for the generation of appropriate disease models by genetic engineering: they exhibit suitable reproductive performance traits, pig genome analyses have made several genomic data resources available, and efficient and precise techniques for the genetic modification of pigs have been established. In recent years, genetically engineered pigs have increasingly been generated as biomedical research tools for specific human genetic diseases. Here, we review the current state of the techniques used for the production of genetically engineered pigs, as well as the establishment of genetically engineered pigs as models for human diseases.
With the advent of animal cloning, the technique of nuclear transfer was used to produce alpha1,3-galactosyltransferase-knockout (Gal-KO) pigs at many institutes, including ones in Japan, at the beginning of the 21st century. In addition, controversy over the risks of porcine endogenous retrovirus (PERV) has gradually subsided, since no cases of PERV infection in humans have been reported. Furthermore, a large wave of clinical islet allotransplantation renewed interest in xenotransplantation, especially porcine islet transplantation. Clinical trials have so far been conducted in several countries, including Sweden, China, Mexico, and the USA (Inventory of Human Xenotransplantation Practices, compiled by the IXA and HUG in collaboration with the WHO). In addition, a new clinical trial was approved by the government two years ago, resuming porcine islet transplantation research in New Zealand.
In financial and actuarial modeling and other areas of application, stochastic differential equations with jumps have been employed to describe the dynamics of various state variables. The numerical solution of such equations is more complex than that of equations driven only by Wiener processes, described in Kloeden & Platen, Numerical Solution of Stochastic Differential Equations (1992). The present monograph builds on that work and provides an introduction to stochastic differential equations with jumps, in both theory and application, emphasizing the numerical methods needed to solve such equations. It presents many new results on higher-order methods for scenario and Monte Carlo simulation, including implicit, predictor-corrector, extrapolation, Markov chain and variance reduction methods, stressing the importance of their numerical stability. Furthermore, it includes chapters on exact simulation, estimation and filtering. Besides serving as a basic text on quantitative methods, it offers ready access to a large number of potential research problems in an area that is widely applicable and rapidly expanding. Finance is chosen as the area of application because much of the recent research on stochastic numerical methods has been driven by challenges in quantitative finance. Moreover, the volume introduces readers to the modern benchmark approach, which provides a general framework for modeling in finance and insurance beyond the standard risk-neutral approach. It requires only an undergraduate background in mathematical or quantitative methods, is accessible to a broad readership, including those who are only seeking numerical recipes, and includes exercises that help the reader develop a deeper understanding of the underlying mathematics.
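To give a rough sense of the scenario simulation the blurb above refers to, the simplest such scheme, an Euler-Maruyama step for a jump-diffusion with compound Poisson jumps, can be sketched as follows. This is a minimal illustration only, not one of the book's higher-order methods; all function names and parameters here are illustrative assumptions.

```python
import math
import random

def _poisson(mean, rng):
    """Sample a Poisson(mean) variate via Knuth's multiplication method."""
    L = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_jump_diffusion(x0, mu, sigma, lam, jump_mean, jump_std, T, n, rng):
    """Euler-Maruyama scenario simulation of the jump-diffusion
    dX_t = mu*X_t dt + sigma*X_t dW_t + X_{t-} dJ_t,
    where J is a compound Poisson process with intensity lam and
    normally distributed jump sizes."""
    dt = T / n
    x = x0
    path = [x]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment over dt
        k = _poisson(lam * dt, rng)          # number of jumps in this step
        jump = sum(rng.gauss(jump_mean, jump_std) for _ in range(k))
        x = x + mu * x * dt + sigma * x * dw + x * jump
        path.append(x)
    return path

# Example: one scenario of a Merton-style jump-diffusion over one year.
rng = random.Random(42)
path = simulate_jump_diffusion(100.0, 0.05, 0.2, 0.5, -0.1, 0.05, 1.0, 252, rng)
```

The explicit Euler scheme above converges only at low order and can be numerically unstable for stiff parameters, which is precisely the motivation the blurb gives for the implicit, predictor-corrector and higher-order methods the monograph develops.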
The authors of this volume set themselves a single task: to trace the extra-biblical primary texts relevant for understanding Jesus' trial and crucifixion. With that goal in mind, the book is built around three major themes: (1) Jesus' trial/interrogation before the Sanhedrin, (2) Jesus' trial before Pontius Pilatus, and (3) crucifixion as a method of execution in antiquity. In chronologically sequential order (where possible), the authors select and arrange an overwhelming number of extra-biblical primary texts -- 462 to be exact -- under these three categories (75, 46, and 341 texts respectively). --Brian J. Wright in Religious Studies Review
First published in 1998, this volume considers the Nuremberg Code in light of new ethical grey areas that have become evident with recent scientific advances, particularly questions surrounding DNA and cloning. In 26 articles, the contributors reflect on the impact of the Code, the events that prompted it, including those in Japan, and more recent ethical issues. The book contains the results of two European/American preparatory workshops for the First World Conference on Ethics Codes in Medicine and Biotechnology (October 1997, Freiburg, Germany), supported by the leading national institutions in the field. It aims to stimulate research on codes, the effects of codification, and other forms of implementing ethics. It breaks new ground with interdisciplinary and international discourse on the subject, emphasising the need for a complete collection of codes for systematic research and evaluation, and fills a gap in the literature on the subject to date.
A framework for financial market modeling, the benchmark approach extends beyond standard risk-neutral pricing theory. It permits a unified treatment of portfolio optimization, derivative pricing, integrated risk management and insurance risk modeling. This book presents the necessary mathematical tools, followed by a thorough introduction to financial modeling under the benchmark approach, explaining various quantitative methods for the fair pricing and hedging of derivatives.