This self-contained, comprehensive book tackles the principal problems and advanced questions of probability theory and random processes in 22 chapters, presented in a logical order but also suitable for dipping into. They include both classical and more recent results, such as large deviations theory, factorization identities, information theory, and stochastic recursive sequences. The book is further distinguished by the inclusion of clear and illustrative proofs of the fundamental results, incorporating many methodological improvements aimed at simplifying the arguments and making them more transparent. The importance of the Russian school in the development of probability theory has long been recognized. This book is the translation of the fifth edition of the highly successful Russian textbook. This edition includes a number of new sections, such as a new chapter on large deviation theory for random walks, which are of both theoretical and applied interest. The frequent references to Russian literature throughout this work lend a fresh dimension and make it an invaluable source of reference for Western researchers and advanced students in probability-related subjects. Probability Theory will be of interest to both advanced undergraduate and graduate students studying probability theory and its applications. It can serve as a basis for several one-semester courses on probability theory and random processes as well as for self-study.
The object of queueing theory (or the theory of mass service) is the investigation of stochastic processes of a special form which are called queueing (or service) processes in this book. Two approaches to the definition of these processes are possible depending on the direction of investigation. In accordance with this fact, the exposition of the subject can be broken up into two self-contained parts. The first of these forms the content of this monograph. The definition of the queueing processes (systems) to be used here is close to the traditional one and is connected with the introduction of so-called governing random sequences. We will introduce algorithms which describe the governing of a system with the aid of such sequences. Such a definition inevitably becomes rather qualitative, since under these conditions a completely formal construction of a stochastic process uniquely describing the evolution of the system would require the introduction of a complicated phase space, not to mention the difficulties of giving the distribution of such a process on this phase space.
Because of the rapid increase in commercially available Fourier transform infrared spectrometers and computers over the past ten years, it has now become feasible to use IR spectrometry to characterize very thin films at extended interfaces. At the same time, interest in thin films has grown tremendously because of applications in microelectronics, sensors, catalysis, and nanotechnology. The Handbook of Infrared Spectroscopy of Ultrathin Films provides a practical guide to experimental methods, up-to-date theory, and considerable reference data, critical for scientists who want to measure and interpret IR spectra of ultrathin films. This authoritative volume also: * Offers information needed to effectively apply IR spectroscopy to the analysis and evaluation of thin and ultrathin films on flat and rough surfaces and on powders at solid-gaseous, solid-liquid, liquid-gaseous, liquid-liquid, and solid-solid interfaces * Provides a full discussion of the theory underlying the techniques * Describes experimental methods in detail, including optimum conditions for recording spectra and the interpretation of spectra * Gives detailed information on equipment, accessories, and techniques * Provides IR spectroscopic data tables as appendixes, including the first compilation of published data on longitudinal frequencies of different substances * Covers new approaches, such as surface-enhanced IR spectroscopy (SEIRA), time-resolved FTIR spectroscopy, high-resolution microspectroscopy, and the use of synchrotron radiation
Praise for the First Edition " . . . the book is a valuable addition to the literature in the field, serving as a much-needed guide for both clinicians and advanced students."—Zentralblatt MATH A new edition of the cutting-edge guide to diagnostic tests in medical research In recent years, a considerable amount of research has focused on evolving methods for designing and analyzing diagnostic accuracy studies. Statistical Methods in Diagnostic Medicine, Second Edition continues to provide a comprehensive approach to the topic, guiding readers through the necessary practices for understanding these studies and generalizing the results to patient populations. Following a basic introduction to measuring test accuracy and study design, the authors successfully define various measures of diagnostic accuracy, describe strategies for designing diagnostic accuracy studies, and present key statistical methods for estimating and comparing test accuracy. Topics new to the Second Edition include: Methods for tests designed to detect and locate lesions Recommendations for covariate-adjustment Methods for estimating and comparing predictive values and sample size calculations Correcting techniques for verification and imperfect standard biases Sample size calculation for multiple reader studies when pilot data are available Updated meta-analysis methods, now incorporating random effects Three case studies thoroughly showcase some of the questions and statistical issues that arise in diagnostic medicine, with all associated data provided in detailed appendices. A related web site features Fortran, SAS®, and R software packages so that readers can conduct their own analyses. Statistical Methods in Diagnostic Medicine, Second Edition is an excellent supplement for biostatistics courses at the graduate level. It also serves as a valuable reference for clinicians and researchers working in the fields of medicine, epidemiology, and biostatistics.
This work is devoted to several problems of parametric (mainly) and nonparametric estimation through the observation of Poisson processes defined on general spaces. Poisson processes are quite popular in applied research and therefore they attract the attention of many statisticians. There are a lot of good books on point processes and many of them contain chapters devoted to statistical inference for general and particular models of processes. There are even chapters on statistical estimation problems for inhomogeneous Poisson processes in asymptotic statements. Nevertheless it seems that the asymptotic theory of estimation for nonlinear models of Poisson processes needs some development. Here nonlinear means the models of inhomogeneous Poisson processes with intensity function nonlinearly depending on unknown parameters. In such situations the estimators usually cannot be written in exact form and are given as solutions of some equations. However the models can be quite fruitful in engineering problems and the existing computing algorithms are sufficiently powerful to calculate these estimators. Therefore the properties of estimators can be interesting too.
Strong pulsed magnetic fields are important for several fields in physics and engineering, such as power generation and accelerator facilities. Basic aspects of the techniques for generating strong and superstrong pulsed magnetic fields are given, including the physics and hydrodynamics of the conductors interacting with the field, as well as an account of the significant progress in the generation of strong magnetic fields using the magnetic accumulation technique. Results of computer simulations and a survey of available field technology complete the volume.
An effective technique for data analysis in the social sciences The recent explosion in longitudinal data in the social sciences highlights the need for this timely publication. Latent Curve Models: A Structural Equation Perspective provides an effective technique to analyze latent curve models (LCMs). This type of data features random intercepts and slopes that permit each case in a sample to have a different trajectory over time. Furthermore, researchers can include variables to predict the parameters governing these trajectories. The authors synthesize a vast amount of research and findings and, at the same time, provide original results. The book analyzes LCMs from the perspective of structural equation models (SEMs) with latent variables. While the authors discuss simple regression-based procedures that are useful in the early stages of LCMs, most of the presentation uses SEMs as a driving tool. This cutting-edge work includes some of the authors' recent work on the autoregressive latent trajectory model, suggests new models for method factors in multiple indicators, discusses repeated latent variable models, and establishes the identification of a variety of LCMs. This text has been thoroughly class-tested and makes extensive use of pedagogical tools to aid readers in mastering and applying LCMs quickly and easily to their own data sets. 
Key features include: Chapter introductions and summaries that provide a quick overview of highlights Empirical examples provided throughout that allow readers to test their newly found knowledge and discover practical applications Conclusions at the end of each chapter that stress the essential points that readers need to understand for advancement to more sophisticated topics Extensive footnoting that points the way to the primary literature for more information on particular topics With its emphasis on modeling and the use of numerous examples, this is an excellent book for graduate courses in latent trajectory models as well as a supplemental text for courses in structural modeling. This book is an excellent aid and reference for researchers in quantitative social and behavioral sciences who need to analyze longitudinal data.
This is the first handbook on zeolites and other microporous materials. It is an up-to-date, highly sophisticated collection of information for those who deal with zeolites in industry or at academic institutions as well as being a guide for newcomers.
Discusses the sixteenth-century roots of the lack of a unified Russian identity, the division between the gentry and the peasantry, and the widening gap in the eighteenth and nineteenth centuries which led to revolution and continues to affect Russia today.
In this study of the modern Uzbeks, Professor Edward A. Allworth provides a comprehensive and authoritative survey of an important group of Muslim people who live within the boundaries of the Soviet Union. After the Russians and the Ukrainians, the Uzbeks are the largest ethnic group in the Soviet Union and the strongest of a number of Muslim communities that populate the vast region of Central Asia.
Praise for the Third Edition "This book provides in-depth coverage of modelling techniques used throughout many branches of actuarial science. . . . The exceptional high standard of this book has made it a pleasure to read." —Annals of Actuarial Science Newly organized to focus exclusively on material tested in the Society of Actuaries' Exam C and the Casualty Actuarial Society's Exam 4, Loss Models: From Data to Decisions, Fourth Edition continues to supply actuaries with a practical approach to the key concepts and techniques needed on the job. With updated material and extensive examples, the book successfully provides the essential methods for using available data to construct models for the frequency and severity of future adverse outcomes. The book continues to equip readers with the tools needed for the construction and analysis of mathematical models that describe the process by which funds flow into and out of an insurance system. Focusing on the loss process, the authors explore key quantitative techniques including random variables, basic distributional quantities, and the recursive method, and discuss techniques for classifying and creating distributions. Parametric, non-parametric, and Bayesian estimation methods are thoroughly covered along with advice for choosing an appropriate model. New features of this Fourth Edition include: Expanded discussion of working with large data sets, now including more practical elements of constructing decrement tables Added coverage of methods for simulating several special situations An updated presentation of Bayesian estimation, outlining conjugate prior distributions and the linear exponential family as well as related computational issues Throughout the book, numerous examples showcase the real-world applications of the presented concepts, with an emphasis on calculations and spreadsheet implementation. 
A wealth of new exercises taken from previous Exam C/4 exams allows readers to test their comprehension of the material, and a related FTP site features the book's data sets. Loss Models, Fourth Edition is an indispensable resource for students and aspiring actuaries who are preparing to take the SOA and CAS examinations. The book is also a valuable reference for professional actuaries, actuarial students, and anyone who works with loss and risk models. To explore our additional offerings in actuarial exam preparation visit www.wiley.com/go/c4actuarial .
This book discusses in depth many of the key problems in non-equilibrium physics. The origin of macroscopic irreversible behavior receives particular attention and is illustrated in the framework of solvable models. An updated discussion on the linear response focuses on the correct electrodynamic aspects, which are essential, for example, in the proof of the Nyquist theorem. The material covers the scaling relationship between different levels of description (kinetic to hydrodynamic) as well as spontaneous symmetry breaking in real time in terms of nonlinear dynamics (attractors), illustrated using the example of Bose-Einstein condensation. The presentation also includes the latest developments, quantum kinetics, related to modern ultrafast spectroscopy, where the transition from reversible to irreversible behavior occurs.
The subject of time series is of considerable interest, especially among researchers in econometrics, engineering, and the natural sciences. As part of the prestigious Wiley Series in Probability and Statistics, this book provides a lucid introduction to the field and, in this new Second Edition, covers the important advances of recent years, including nonstationary models, nonlinear estimation, multivariate models, state space representations, and empirical model identification. New sections have also been added on the Wold decomposition, partial autocorrelation, long memory processes, and the Kalman filter. Major topics include: * Moving average and autoregressive processes * Introduction to Fourier analysis * Spectral theory and filtering * Large sample theory * Estimation of the mean and autocorrelations * Estimation of the spectrum * Parameter estimation * Regression, trend, and seasonality * Unit root and explosive time series To accommodate a wide variety of readers, review material, especially on elementary results in Fourier analysis, large sample statistics, and difference equations, has been included.
The Handbook of Zeolite Science and Technology offers effective analyses of salient cases selected expressly for their relevance to current and prospective research. Presenting the principal theoretical and experimental underpinnings of zeolites, this international effort is at once complete and forward-looking, combining fundamental concepts with the most sophisticated data for each scientific subtopic and budding technology. Supplying over 750 figures and 350 display equations, this impressive achievement in zeolite science observes synthesis through the lens of MFI (ZSM-5 and silicalite). Chapters progress from conceptual building blocks to complex research presentations.
Salomo A. Birnbaum (1891-1989) is the undisputed pioneer of two major, closely interrelated fields of research: historical Yiddish linguistics and the palaeography of Hebrew and of all Jewish successor languages. In 1918 he published the first scholarly grammar of Yiddish (with four further editions from 1966 onward); in the 1920s, prompted by the need to date and localize medieval Yiddish manuscripts, he began his palaeographic studies, which culminated in the standard work The Hebrew Scripts (1954/57-1971) and at the same time opened up a further field of research for him, the comparison of Jewish languages. The present collection of essays (partly reprints, partly edited from previously unpublished manuscripts) forms, in accordance with Birnbaum's own conception, a cross-section of his life's work. Volume I contains contributions to Yiddish philology: the origin and age of the language, orthographic and phonological developments, problems of transliteration into the Latin alphabet, dialectology, etymology, and more, as well as an overview of the other Jewish languages and extensive individual studies, in particular on Dzhudezmo ('Judaeo-Spanish') and Bukharic ('Judaeo-Persian'). The contributions in Volume II document the development of the emerging discipline of Hebrew palaeography from the 1930s to the 1960s, during which Birnbaum's quantitative and comparative methods proved brilliantly successful, above all from 1947 onward on the newly discovered Dead Sea Scrolls, but also on new finds of medieval Yiddish manuscripts. Birnbaum taught Yiddish at the University of Hamburg from 1922 to 1933, and Yiddish and Hebrew palaeography at university institutes in London from 1936 to 1957.
This book provides an unparalleled contemporary assessment of hydrocarbon chemistry – presenting basic concepts, current research, and future applications. • Comprehensive and updated review and discussion of the field of hydrocarbon chemistry • Includes literature coverage since the publication of the previous edition • Expands or adds coverage of: carboxylation, sustainable hydrocarbons, extraterrestrial hydrocarbons • Addresses a topic of special relevance in contemporary science, since hydrocarbons play a role as a possible replacement for coal, petroleum oil, and natural gas as well as their environmentally safe use • Reviews of prior edition: “...literature coverage is comprehensive and ideal for quickly reviewing specific topics...of most value to industrial chemists...” (Angewandte Chemie) and “...useful for chemical engineers as well as engineers in the chemical and petrochemical industries.” (Petroleum Science and Technology)
Foreword by Charles H. Townes This volume includes highlights of the theories underlying the essential phenomena occurring in novel semiconductor lasers as well as the principles of operation of selected heterostructure lasers. To understand scattering processes in heterostructure lasers and related optoelectronic devices, it is essential to consider the role of dimensional confinement of charge carriers as well as acoustical and optical phonons in quantum structures. Indeed, it is important to consider the confinement of both phonons and carriers in the design and modeling of novel semiconductor lasers such as the tunnel injection laser, quantum well intersubband lasers, and quantum dot lasers. The full exploitation of dimensional confinement leads to the exciting new capability of scattering time engineering in novel semiconductor lasers. As a result of continuing advances in techniques for growing quantum heterostructures, recent developments are likely to be followed in coming years by many more advances in semiconductor lasers and optoelectronics. As our understanding of these devices and the ability to fabricate them grow, so does our need for more sophisticated theories and simulation methods bridging the gap between quantum and classical transport.
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "This is a nice book containing a wealth of information, much of it due to the authors. . . . If an instructor designing such a course wanted a textbook, this book would be the best choice available. . . . There are many stimulating exercises, and the book also contains an excellent index and an extensive list of references." —Technometrics "[This] book should be read carefully by anyone who is interested in dealing with statistical models in a realistic fashion." —American Scientist Introducing concepts, theory, and applications, Robust Statistics is accessible to a broad audience, avoiding allusions to high-powered mathematics while emphasizing ideas, heuristics, and background. The text covers the approach based on the influence function (the effect of an outlier on an estimator, for example) and related notions such as the breakdown point. It also treats the change-of-variance function, fundamental concepts and results in the framework of estimation of a single parameter, and applications to estimation of covariance matrices and regression parameters.
The Chemistry of Diamondoids Comprehensive resource on an important and fascinating compound class, covering synthesis, properties, functionalization, and applications in organic synthesis, materials science, and more The Chemistry of Diamondoids gives a state-of-the-art overview of all aspects of diamondoid chemistry, covering nomenclature, natural occurrence, chemical and physical properties, along with synthesis and functionalization of diamondoids as well as their applications as molecular building blocks in organic synthesis, polymer and materials science, nanotechnology, and medicinal chemistry. The book concludes with a perspective towards future developments in the field, thereby drawing attention to areas open for discovery. Written by experts in the field, The Chemistry of Diamondoids includes information on: Naturally occurring diamondoids, their formation, and the role they play in the petroleum industry and in geosciences, plus man-made approaches to prepare them on large scale Growing diamond from diamondoids via seeding, preparation and properties of diamondoid oligomers and doped diamondoids C–H-bond functionalization, a precondition for their use in many applications, and fine-tuning of diamondoid properties by precise cage substitution reactions With its all-encompassing approach, The Chemistry of Diamondoids is a valuable guide for newcomers and researchers in organic chemistry and materials science interested in modern synthetic methods and organic functional materials.
Easy-to-read and comprehensive, this book shows how the SAS System performs multivariate time series analysis and features the advanced SAS procedures STATESPACE, ARIMA, and SPECTRA. The interrelationship of SAS/ETS procedures is demonstrated with an accompanying discussion of how the choice of a procedure depends on the data to be analyzed and the results desired. Other topics covered include detecting sinusoidal components in time series models, performing bivariate cross-spectral analysis, and comparing the results with the standard transfer function methodology. The authors' approach makes the book accessible to students and practitioners in a variety of disciplines and industries, with an emphasis on correct interpretation of output to draw meaningful conclusions. The volume, co-published by SAS and JWS, features both theory and practicality, and accompanies a soon-to-be extensive library of SAS hands-on manuals in a multitude of statistical areas. The book can be used on a number of platforms, including CMS, Mac, MVS, OpenVMS Alpha, OpenVMS VAX, OS/390, OS/2, UNIX, and Windows.
The most comprehensive, single-volume guide to conducting experiments with mixtures "If one is involved, or heavily interested, in experiments on mixtures of ingredients, one must obtain this book. It is, as was the first edition, the definitive work." -Short Book Reviews (Publication of the International Statistical Institute) "The text contains many examples with worked solutions and with its extensive coverage of the subject matter will prove invaluable to those in the industrial and educational sectors whose work involves the design and analysis of mixture experiments." -Journal of the Royal Statistical Society "The author has done a great job in presenting the vital information on experiments with mixtures in a lucid and readable style. . . . A very informative, interesting, and useful book on an important statistical topic." -Zentralblatt fur Mathematik und Ihre Grenzgebiete Experiments with Mixtures shows researchers and students how to design and set up mixture experiments, then analyze the data and draw inferences from the results. Virtually every technique that has appeared in the literature of mixtures can be found here, and computing formulas for each method are provided with completely worked examples. Almost all of the numerical examples are taken from real experiments. Coverage begins with Scheffe lattice designs, introducing the use of independent variables, and ends with the most current methods. New material includes: * Multiple response cases * Residuals and least-squares estimates * Categories of components: Mixtures of mixtures * Fixed as well as variable values for the major component proportions * Leverage and the Hat Matrix * Fitting a slack-variable model * Estimating components of variances in a mixed model using ANOVA table entries * Clarification of blocking mates and choice of mates * Optimizing several responses simultaneously * Biplots for multiple responses
A comprehensive text and reference bringing together advances in the theory of probability and statistics and relating them to applications. The three major categories of statistical models that relate dependent variables to explanatory variables are covered: univariate regression models, multivariate regression models, and simultaneous equations models. Methods are illustrated with worked examples, complete with figures that display code and output.
The first book on inference for stochastic processes from a statistical, rather than a probabilistic, perspective. It provides a systematic exposition of theoretical results from over ten years of mathematical literature and presents, for the first time in book form, many new techniques and approaches.
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "The effort of Professor Fuller is commendable . . . [the book] provides a complete treatment of an important and frequently ignored topic. Those who work with measurement error models will find it valuable. It is the fundamental book on the subject, and statisticians will benefit from adding this book to their collection or to university or departmental libraries." -Biometrics "Given the large and diverse literature on measurement error/errors-in-variables problems, Fuller's book is most welcome. Anyone with an interest in the subject should certainly have this book." -Journal of the American Statistical Association "The author is to be commended for providing a complete presentation of a very important topic. Statisticians working with measurement error problems will benefit from adding this book to their collection." -Technometrics " . . . this book is a remarkable achievement and the product of impressive top-grade scholarly work." -Journal of Applied Econometrics Measurement Error Models offers coverage of estimation for situations where the model variables are observed subject to measurement error. Regression models are included with errors in the variables, latent variable models, and factor models. Results from several areas of application are discussed, including recent results for nonlinear models and for models with unequal variances. The estimation of true values for the fixed model, prediction of true values under the random model, model checks, and the analysis of residuals are addressed, and in addition, procedures are illustrated with data drawn from nearly twenty real data sets.
Today, the theory of random processes represents a large field of mathematics with many different branches, and the task of choosing topics for a brief introduction to this theory is far from simple. This introduction to the theory of random processes uses mathematical models that are simple, but have some importance for applications. We consider different processes whose development in time depends on some random factors. The fundamental problem can be briefly described in the following way: given some relatively simple characteristics of a process, compute the probability of another event which may be very complicated; or estimate a random variable which is related to the behaviour of the process. The models that we consider are chosen in such a way that it is possible to discuss the different methods of the theory of random processes by referring to these models. The book starts with a treatment of homogeneous Markov processes with a countable number of states. The main topics are the ergodic theorem, the method of Kolmogorov's differential equations (Secs. 1-4), and the Brownian motion process, the connecting link being the transition from Kolmogorov's differential-difference equations for random walk to a limit diffusion equation (Sec. 5).
Amstat News asked three review editors to rate their top five favorite books in the September 2003 issue. Statistical Methods for Reliability Data was among those chosen. Bringing statistical methods for reliability testing in line with the computer age This volume presents state-of-the-art, computer-based statistical methods for reliability data analysis and test planning for industrial products. Statistical Methods for Reliability Data updates and improves established techniques as it demonstrates how to apply the new graphical, numerical, or simulation-based methods to a broad range of models encountered in reliability data analysis. It includes methods for planning reliability studies and analyzing degradation data, simulation methods used to complement large-sample asymptotic theory, general likelihood-based methods of handling arbitrarily censored data and truncated data, and more. In this book, engineers and statisticians in industry and academia will find: A wealth of information and procedures developed to give products a competitive edge Simple examples of data analysis computed with the S-PLUS system, for which a suite of functions and commands is available over the Internet End-of-chapter, real-data exercise sets Hundreds of computer graphics illustrating data, results of analyses, and technical concepts An essential resource for practitioners involved in product reliability and design decisions, Statistical Methods for Reliability Data is also an excellent textbook for on-the-job training courses, and for university courses on applied reliability data analysis at the graduate level. An Instructor's Manual presenting detailed solutions to all the problems in the book is available upon request from the Wiley editorial department.
Presents the most innovative results in carbene chemistry, setting the foundation for new discoveries and applications The discovery of stable carbenes has reinvigorated carbene chemistry research, with investigators seeking to develop carbenes into new useful catalysts and ligands. Presenting the most innovative and promising areas of carbene research over the past decade, this book explores newly discovered structural, catalytic, and organometallic aspects of carbene chemistry, with an emphasis on new and emerging synthetic applications. Contemporary Carbene Chemistry features contributions from an international team of pioneering carbene chemistry researchers. Collectively, these authors have highlighted the most interesting and promising areas of investigation in the field. The book is divided into two parts: Part 1, Properties and Reactions of Carbenes, explores new findings on carbene stability, acid-base behavior, and catalysis. Carbenic structure and reactivity are examined in chapters dedicated to stable carbenes, carbodicarbenes, carbenes as guests in supramolecular hosts, tunneling in carbene and oxacarbene reactions, and ultrafast kinetics of carbenes and their excited state precursors. Theoretical concerns are addressed in chapters on computational methods and dynamics applied to carbene reactions. Part 2, Metal Carbenes, is dedicated to the synthetic dimensions of carbenes, particularly the reactions and catalytic properties of metal carbenes. The authors discuss lithium, rhodium, ruthenium, chromium, molybdenum, tungsten, cobalt, and gold. All the chapters conclude with a summary of the current situation, new challenges on the horizon, and promising new research directions. A list of key reviews and suggestions for further reading also accompanies every chapter. 
Each volume of the Wiley Series on Reactive Intermediates in Chemistry and Biology focuses on a specific reactive intermediate, offering a broad range of perspectives from leading experts that sets the stage for new applications and further discoveries.
By the second half of the twentieth century, a new branch of materials science had come into being — crystalline materials research. Its appearance is linked to the emergence of advanced technologies primarily based on single crystals (bulk crystals and films). At the turn of the last century, the impending onset of the “ceramic era” was forecast. It was believed that ceramics would play a role comparable to that of the Stone or Bronze Ages in the history of civilization. Naturally, such an assumption was hypothetical, but it showed that ceramic materials had evoked keen interest among researchers. Although sapphire traditionally has been considered a gem, it has developed into a material typical of the “ceramic era.” Widening the field of sapphire application necessitated essential improvement of its homogeneity and working characteristics and extension of the range of sapphire products, especially those with stipulated properties including a preset structural defect distribution. In the early 1980s, successful attainment of crystals with predetermined characteristics was attributed to proper choice of the growth method. At present, in view of the fact that the requirements for crystalline products have become more stringent, such an approach tends to be insufficient. It is clear that one must take into account the physical–chemical processes that take place during the formation of the real crystal structure, i.e., the growth mechanisms and the nature and causes of crystal imperfections.
This book is a reference for librarians, mathematicians, and statisticians involved in college and research level mathematics and statistics in the 21st century. We are in a time of transition in scholarly communications in mathematics; practices which have changed little for a hundred years are giving way to new modes of accessing information. Where journals, books, indexes, and catalogs were once the physical representation of a good mathematics library, shelves have given way to computers, and users are often accessing information from remote places. Part I is a historical survey of the past 15 years tracking this huge transition in scholarly communications in mathematics. Part II of the book is the bibliography of resources recommended to support the disciplines of mathematics and statistics. These are grouped by type of material. Publication dates range from the 1800s onwards. Hundreds of electronic resources — some online, both dynamic and static, some in fixed media — are listed among the paper resources. Amazingly, a majority of the listed electronic resources are free.
A quantum dot is a particle of matter so small that the addition or removal of an electron changes its properties in some useful way. All atoms are quantum dots, but multi-molecular combinations can also have this characteristic. In biochemistry, quantum dots are called redox groups. In nanotechnology, they are called quantum bits or qubits. Quantum dots typically have dimensions measured in nanometers, where one nanometer is 10⁻⁹ meter, or a millionth of a millimetre. The fields of biology, chemistry, computer science, and electronics are all of interest to researchers in nanotechnology. Other applications of quantum dots include nanomachines, neural networks, and high-density memory or storage media. Research is being carried out on nano-crystals, self-assembled dots, and gated structures. This book presents leading-edge research from around the world.