Design of Experiments in Nonlinear Models: Asymptotic Normality, Optimality Criteria and Small-Sample Properties provides comprehensive coverage of the various aspects of experimental design for nonlinear models. The book contains original contributions to the theory of optimal experiments that will interest students and researchers in the field. Practitioners motivated by applications will find valuable tools to help them design their experiments. The first three chapters present the connections between the asymptotic properties of estimators in parametric models and experimental design, with more emphasis than usual on particular aspects such as the estimation of a nonlinear function of the model parameters and models with heteroscedastic errors. Classical optimality criteria based on those asymptotic properties are then presented thoroughly in a dedicated chapter. Three chapters are devoted to specific issues raised by nonlinear models. The construction of design criteria derived from non-asymptotic considerations (the small-sample situation) is detailed. The connection between design and identifiability/estimability issues is investigated. Several approaches are presented to address the problem caused by the dependence of an optimal design on the values of the parameters to be estimated. A survey of algorithmic methods for the construction of optimal designs is provided.
Certain algorithms that are known to converge can be renormalized or "blown up" at each iteration so that their local behavior can be seen. This creates dynamical systems that we can study with modern tools, such as ergodic theory, chaos, special attractors, and Lyapunov exponents. Furthermore, we can translate the rates of convergence into less studied exponents known as Rényi entropies. This all feeds back to suggest new algorithms with faster rates of convergence. For example, in line search, we can improve upon the Golden Section algorithm with new classes of algorithms that have their own special, and sometimes chaotic, dynamical systems. The ellipsoidal algorithms of linear and convex programming have fast, "deep cut" versions whose dynamical systems contain cyclic attractors. And ordinary steepest descent has, buried within, a beautiful fractal that controls the gateway to a special two-point attractor. Faster "relaxed" versions exhibit classical period doubling. Dynamical Search presents a stimulating introduction to a brand-new field: the union of dynamical systems and optimization. It will prove fascinating and open doors to new areas of investigation for researchers in both fields, as well as those in statistics and computer science.
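As a rough illustration of the line-search setting described above, here is a minimal golden-section search; the quadratic test function is an arbitrary choice, not an example from the book:

```python
# Golden-section search: the bracket [a, b] shrinks by the inverse
# golden ratio at every iteration, reusing one function evaluation.
PHI = (5 ** 0.5 - 1) / 2  # inverse golden ratio, about 0.618

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] to a bracket of width tol."""
    c = b - PHI * (b - a)
    d = a + PHI * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c               # minimum lies in [a, d]
            c = b - PHI * (b - a)
        else:
            a, c = c, d               # minimum lies in [c, b]
            d = a + PHI * (b - a)
    return (a + b) / 2

x_star = golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
print(round(x_star, 6))  # close to 2.0
```

The fixed shrinkage factor is what makes the renormalized iteration a stationary dynamical system, the starting point for the analysis the book pursues.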
At the core of many engineering problems is the solution of sets of equations and inequalities, and the optimization of cost functions. Unfortunately, except in special cases, such as when a set of equations is linear in its unknowns or when a convex cost function has to be minimized under convex constraints, the results obtained by conventional numerical methods are only local and cannot be guaranteed. This means, for example, that the actual global minimum of a cost function may not be reached, or that some global minimizers of this cost function may escape detection. By contrast, interval analysis makes it possible to obtain guaranteed approximations of the set of all the actual solutions of the problem being considered. This, together with the lack of books presenting interval techniques in such a way that they could become part of any engineering numerical tool kit, motivated the writing of this book. The adventure started in 1991 with the preparation by Luc Jaulin of his PhD thesis, under Eric Walter's supervision. It continued with their joint supervision of Olivier Didrit's and Michel Kieffer's PhD theses. More than two years ago, when we presented our book project to Springer, we naively thought that writing it up would be a simple matter, given what had already been achieved...
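The guaranteed enclosures mentioned above rest on interval arithmetic. The following toy sketch (which ignores outward rounding, essential in real implementations) shows how evaluating an expression on intervals yields a range guaranteed to contain every true value:

```python
# Minimal interval arithmetic: operations on [lo, hi] pairs produce an
# interval guaranteed to contain every possible pointwise result.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(-1.0, 2.0)
fx = x * x + x   # natural interval extension of f(x) = x^2 + x
print(fx)        # encloses the true range [-0.25, 6] of f on [-1, 2]
```

The enclosure [-3, 6] overestimates the true range [-0.25, 6] because the two occurrences of x are treated as independent (the dependency effect); the branch-and-bound schemes covered by such books shrink this pessimism by bisecting the domain.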
Localization for underwater robots remains a challenging issue. Typical sensors, such as Global Navigation Satellite System (GNSS) receivers, cannot be used below the surface, and inertial systems suffer from strong integration drift. On top of that, the seabed is generally uniform and unstructured, making it difficult to apply Simultaneous Localization and Mapping (SLAM) methods to perform localization. Reliable Robot Localization presents an innovative method that can be characterized as a raw-data SLAM approach. It differs from existing methods by treating time as a standard variable to be estimated, raising new opportunities for state estimation that have so far been underexploited. However, such a temporal formulation is not straightforward and requires a set of theoretical tools in order to achieve the main purpose of localization. This book not only presents original contributions to the field of mobile robotics, it also offers new perspectives on constraint programming and set-membership approaches. It provides a reliable contractor programming framework for building solvers for dynamical systems. This set of tools is illustrated throughout the book with realistic robotic applications.
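To give a flavor of the contractor idea mentioned above: a contractor narrows interval domains with respect to one constraint without discarding any feasible solution. This toy forward-backward contractor for the hypothetical constraint x + y = 10 is an assumption-laden sketch, not the book's framework:

```python
# Forward-backward contractor for the constraint x + y = total.
# Each domain is an interval (lo, hi); contraction removes only values
# that cannot satisfy the constraint, so no solution is ever lost.
def contract_sum(x, y, total):
    xlo, xhi = x
    ylo, yhi = y
    # Backward steps: x = total - y  and  y = total - x.
    xlo, xhi = max(xlo, total - yhi), min(xhi, total - ylo)
    ylo, yhi = max(ylo, total - xhi), min(yhi, total - xlo)
    return (xlo, xhi), (ylo, yhi)

x, y = contract_sum((0.0, 8.0), (0.0, 8.0), 10.0)
print(x, y)  # both domains narrow to (2.0, 8.0)
```

A solver composes many such contractors, applying them until a fixed point is reached; the contributions described above extend this machinery to trajectories of dynamical systems.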
This text presents a wide-ranging and rigorous overview of nearest neighbor methods, one of the most important paradigms in machine learning. Now in one self-contained volume, this book systematically covers key statistical, probabilistic, combinatorial and geometric ideas for understanding, analyzing and developing nearest neighbor methods. Gérard Biau is a professor at Université Pierre et Marie Curie (Paris). Luc Devroye is a professor at the School of Computer Science at McGill University (Montreal).
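The paradigm studied in the text above is simple to state: classify a query point by the label of its closest training point. A minimal 1-nearest-neighbor rule, with illustrative toy data not taken from the book:

```python
# 1-nearest-neighbor classification in the plane: the query inherits
# the label of the closest training point (squared Euclidean distance).
def nearest_neighbor(train, query):
    """train is a list of ((x, y), label) pairs."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    point, label = min(train, key=lambda pair: dist2(pair[0], query))
    return label

train = [((0.0, 0.0), "a"), ((1.0, 1.0), "a"), ((5.0, 5.0), "b")]
print(nearest_neighbor(train, (4.0, 4.5)))  # prints "b"
```

The statistical questions the book addresses begin here: how the error of this rule (and its k-neighbor generalizations) behaves as the number of training points grows.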
This volume contains the majority of the papers presented at the 5th International Workshop on Model-Oriented Data Analysis held in June 1998. This series started in March 1987 with a meeting on the Wartburg, Eisenach (Germany). The next three meetings were in 1990 (St Kyrik monastery, Bulgaria), 1992 (Petrodvorets, St Petersburg, Russia) and 1995 (Spetses, Greece). The main purpose of these workshops was to bring together leading scientists from 'Eastern' and 'Western' Europe for the exchange of ideas in theoretical and applied statistics, with special emphasis on experimental design. Now that the separation between East and West has become less rigid, this dialogue has, in principle, become much easier. However, providing opportunities for this dialogue is as vital as ever. MODA meetings are known for their friendly atmosphere, leading to fruitful discussions and collaboration, especially between young and senior scientists. Indeed, many long-term collaborations were initiated during these events. This intellectually stimulating atmosphere is achieved by limiting the number of participants to around eighty, by the choice of location so that participants can live as a community, and, of course, through the careful selection of scientific direction made by the Programme Committee.