Now, for the first time, publication of the landmark work in backpropagation! Scientists, engineers, statisticians, operations researchers, and other investigators involved in neural networks have long sought direct access to Paul Werbos's groundbreaking, much-cited 1974 Harvard doctoral thesis, The Roots of Backpropagation, which laid the foundation of backpropagation. Now, with the publication of its full text, these practitioners can go straight to the original material and gain a deeper, practical understanding of this unique mathematical approach to social studies and related fields. In addition, Werbos has provided three more recent research papers, which were inspired by his original work, and a new guide to the field. Originally written for readers who lacked any knowledge of neural nets, The Roots of Backpropagation firmly established both its historical and continuing significance as it:

* Demonstrates the ongoing value and new potential of backpropagation
* Creates a wealth of sound mathematical tools useful across disciplines
* Sets the stage for the emerging area of fast automatic differentiation
* Describes new designs for forecasting and control which exploit backpropagation
* Unifies concepts from Freud, Jung, biologists, and others into a new mathematical picture of the human mind and how it works
* Certifies the viability of Deutsch's model of nationalism as a predictive tool--as well as the utility of extensions of this central paradigm

"What a delight it was to see Paul Werbos rediscover Freud's version of 'back-propagation.' Freud was adamant (in The Project for a Scientific Psychology) that selective learning could only take place if the presynaptic neuron was as influenced as is the postsynaptic neuron during excitation. Such activation of both sides of the contact barrier (Freud's name for the synapse) was accomplished by reducing synaptic resistance by the absorption of 'energy' at the synaptic membranes. Not bad for 1895! But Werbos 1993 is even better." --Karl H. Pribram, Professor Emeritus, Stanford University
This book explores the intuitive appeal of neural networks and the genetic algorithm in finance. It demonstrates how neural networks, used in combination with evolutionary computation, outperform classical econometric methods in the accuracy of forecasting, classification, and dimensionality reduction. McNelis utilizes a variety of examples, from forecasting automobile production and corporate bond spreads, to inflation and deflation processes in Hong Kong and Japan, to credit card default in Germany, to bank failures in Texas, to cap-floor volatilities in New York and Hong Kong.

* Offers a balanced, critical review of the neural network methods and genetic algorithms used in finance
* Includes numerous examples and applications
* Numerical illustrations use MATLAB code and the book is accompanied by a website
This tutorial text provides the reader with an understanding of artificial neural networks (ANNs), and their application, beginning with the biological systems which inspired them, through the learning methods that have been developed, and the data collection processes, to the many ways ANNs are being used today. The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers), and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks.
Neural networks have had considerable success in a variety of disciplines, including engineering, control, and financial modelling. However, a major weakness is the lack of established procedures for testing mis-specified models and the statistical significance of the various parameters that have been estimated. This is particularly important in the majority of financial applications, where the data-generating processes are predominantly stochastic and only partially deterministic. Based on the latest, most significant developments in estimation theory, model selection, and the theory of mis-specified models, this volume develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the required theoretical framework and demonstrates the efficient use of neural networks for modelling complex financial phenomena. Unlike most other books in this area, this one treats neural networks as statistical devices for non-linear, non-parametric regression analysis.