Complex-Valued Neural Networks have higher functionality, learn faster, and generalize better than their real-valued counterparts. This book is devoted to the Multi-Valued Neuron (MVN) and MVN-based neural networks. It contains a comprehensive overview of MVN theory, its learning, and its applications. The MVN is a complex-valued neuron whose inputs and output lie on the unit circle. Its activation function depends only on the argument (phase) of the weighted sum. MVN learning is derivative-free and based on the error-correction rule. A single MVN can learn input/output mappings that are non-linearly separable in the real domain; classical non-linearly separable problems such as XOR and Parity-n are the simplest examples. Another important advantage of the MVN is its proper treatment of phase information. These properties become even more remarkable when the MVN is used as the basic neuron in neural networks. The Multilayer Neural Network based on Multi-Valued Neurons (MLMVN) is an MVN-based feedforward neural network. Its backpropagation learning algorithm is derivative-free, based on the error-correction rule, and does not suffer from the local-minima phenomenon. MLMVN outperforms many other machine learning techniques in learning speed, network complexity, and generalization capability when solving both benchmark and real-world classification and prediction problems. Another interesting application of the MVN is its use as the basic neuron in multi-state associative memories. The book is addressed to readers who develop the theoretical foundations of neural networks or apply neural networks to real-world problems. It is also well suited to Ph.D. and graduate students pursuing degrees in computational intelligence.
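As a concrete illustration of the neuron described above, here is a minimal sketch, not code from the book, of a discrete MVN whose activation depends only on the phase of the weighted sum, together with the derivative-free error-correction rule in the form commonly given for the MVN. All function names, the k = 4 toy mapping, and the parameter values are illustrative assumptions.

```python
# Minimal sketch of a discrete multi-valued neuron (MVN): the activation maps
# the weighted sum onto a k-th root of unity according to its phase, and
# learning uses a derivative-free error-correction update.
import numpy as np

def mvn_activation(z: complex, k: int) -> complex:
    """Return the k-th root of unity whose phase sector contains arg(z)."""
    j = int(np.floor(k * (np.angle(z) % (2 * np.pi)) / (2 * np.pi))) % k
    return np.exp(2j * np.pi * j / k)

def mvn_output(weights: np.ndarray, x: np.ndarray, k: int) -> complex:
    """weights[0] is the complex bias; the inputs in x lie on the unit circle."""
    z = weights[0] + np.dot(weights[1:], x)
    return mvn_activation(z, k)

def train_mvn(samples, k: int, n_inputs: int, epochs: int = 200, seed: int = 0):
    """Error-correction learning: w <- w + ((desired - actual) / (n + 1)) * conj([1, x])."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=n_inputs + 1) + 1j * rng.normal(size=n_inputs + 1)
    for _ in range(epochs):
        all_correct = True
        for x, desired in samples:
            actual = mvn_output(w, x, k)
            if not np.isclose(actual, desired):
                all_correct = False
                delta = desired - actual
                w = w + (delta / (n_inputs + 1)) * np.conj(np.concatenate(([1.0 + 0j], x)))
        if all_correct:
            break
    return w

if __name__ == "__main__":
    # Toy example (ours, not the book's): learn the rotation f(x) = eps * x
    # on the 4th roots of unity, which a single MVN can represent exactly.
    k = 4
    eps = np.exp(2j * np.pi / k)
    samples = [(np.array([eps ** j]), eps ** (j + 1)) for j in range(k)]
    w = train_mvn(samples, k=k, n_inputs=1)
    for x, desired in samples:
        print(np.round(x, 3), "->", np.round(mvn_output(w, x, k), 3), "expected", np.round(desired, 3))
```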
Multi-Valued and Universal Binary Neurons deals with two new types of neurons: multi-valued neurons and universal binary neurons. These neurons are based on complex-number arithmetic and are hence much more powerful than the typical neurons used in artificial neural networks. Networks built from such neurons therefore exhibit broad functionality: they can realise not only threshold input/output maps but also any arbitrary Boolean function. Two learning methods are presented with which these networks can be trained easily. The broad applicability of these networks is demonstrated by case studies in several fields of application: image processing, edge detection, image enhancement, super-resolution, pattern recognition, face recognition, and prediction. The book is accordingly partitioned into three almost equally sized parts: a mathematical study of the unique features of these new neurons, learning in networks of such neurons, and applications of such neural networks. Most of this work was developed by the first two authors over a period of more than ten years and was previously available only in the Russian literature. This book presents the first comprehensive treatment of this important class of neural networks in the open Western literature. Multi-Valued and Universal Binary Neurons is intended for anyone with a scholarly interest in neural network theory, applications, and learning. It will also be of interest to researchers and practitioners in image processing, pattern recognition, control, and robotics.
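To make the claim about non-threshold Boolean functions concrete, the following is a minimal sketch, not taken from the book, of how a single neuron with complex weights and a sector-based two-valued activation can realise XOR, which no real-valued threshold neuron can. The hand-picked weights (0, 1, i) and the four-sector alternating activation follow the construction commonly cited for universal binary neurons and are used here purely for illustration.

```python
# Minimal sketch: XOR with a single complex-weighted neuron whose output
# alternates +1/-1 over 2*m equal phase sectors of the weighted sum.
import numpy as np

def ubn_activation(z: complex, m: int = 2) -> int:
    """Split the complex plane into 2*m equal sectors and alternate +1/-1."""
    sector = int(np.floor(2 * m * (np.angle(z) % (2 * np.pi)) / (2 * np.pi)))
    return 1 if sector % 2 == 0 else -1

# Bipolar encoding: Boolean 0 -> +1, Boolean 1 -> -1, so XOR(a, b) maps to x1 * x2.
weights = np.array([0.0, 1.0, 1.0j])   # (bias, w1, w2); illustrative hand-picked values
for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    z = weights[0] + weights[1] * x1 + weights[2] * x2
    print((x1, x2), "->", ubn_activation(z), "expected", x1 * x2)
```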
The book focuses on the thermal transformations of various types of metal chelates, e.g., low-molecular-weight and polymeric metal chelates, coordination polymers, and metal-organic frameworks. It analyzes the major advances and the problems in the preparation of metal oxide materials, mixed-oxide nanocomposites, carbon materials, and polymer-derived non-oxide nanocomposites by the thermolysis of different metal chelates. It also highlights the influence of the spatial and electronic structure of metal chelates on the mechanism and kinetics of their thermal transformations, and discusses important issues such as conjugate thermolysis and computer modelling of the thermolysis process. The book is useful for researchers experienced in thermolysis as well as for young scientists entering this area of science.
This book is a collection of works by leading experts worldwide in the rapidly developing fields of plasmonics and metamaterials. These developments promise to revolutionize the ways light is generated, controlled, and processed at the nanoscale. The technological applications range from nano-lasers to optical nano-waveguides to artificial media with unusual and exotic optical properties unattainable in natural materials. The volume cuts across all relevant disciplines and covers experiments, measurements, fabrication, physical and mathematical analysis, as well as computer simulation.
Antennas are a critical technology in any wireless system: not only do they directly affect the received power of the system, they are also typically its largest and most visible part. Recently, the need for low-cost, low-profile, and lightweight antennas in the microwave, millimeter-wave, and THz bands has regained momentum. "Basic Principles of Fresnel Antenna Arrays" provides the basics of the various Fresnel antenna approaches for achieving low-cost, low-profile, and lightweight antennas in the microwave and millimeter-wave bands. A potential solution to the antenna problem lies in using lens technology in an array, and the Fresnel zone plate lens (FZPL) antenna is a particularly interesting candidate for the array element. The limiting focusing properties of the FZPL, including subwavelength focusing, are described in detail. The book further presents a novel hexagonal FZPL antenna, which can be packed more efficiently in an array because of its shape. Before considering the hexagonal FZPL antenna in an array, the authors investigate two ideas for improving its radiation characteristics. The first is to change the reference phase of the Fresnel zone radii, a novel free parameter in the usual design of zone-plate lenses and antennas. The second, aimed at further improving the radiation characteristics of the hexagonal FZPL antenna, is a technique based on Fresnel zone rotation. The book is also of interest to designers of optical systems because, with scaling effects taken into account, the characteristics of diffractive quasi-optical elements also apply to diffractive focusing elements in integrated optics.
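For readers unfamiliar with zone-plate geometry, here is a minimal sketch of the classical half-wave Fresnel zone radii from which the designs discussed above start; the reference-phase and zone-rotation ideas introduced in the book generalize this construction and are not reproduced here. The 30 GHz frequency and 10 cm focal length are arbitrary illustrative values.

```python
# Classical half-wave Fresnel zone-plate radii: each zone boundary adds half a
# wavelength to the path, i.e. sqrt(F**2 + r_n**2) - F = n * lam / 2, giving
# r_n = sqrt(n * lam * F + (n * lam / 2)**2).
import math

def fresnel_zone_radii(focal_length_m: float, wavelength_m: float, n_zones: int):
    """Outer radii (in metres) of the first n_zones half-wave Fresnel zones."""
    return [
        math.sqrt(n * wavelength_m * focal_length_m + (n * wavelength_m / 2.0) ** 2)
        for n in range(1, n_zones + 1)
    ]

if __name__ == "__main__":
    c = 299_792_458.0          # speed of light, m/s
    lam = c / 30e9             # 30 GHz -> wavelength of about 1 cm
    for n, r in enumerate(fresnel_zone_radii(0.1, lam, 5), start=1):
        print(f"zone {n}: r = {r * 100:.2f} cm")
```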