Complex-Valued Neural Networks have higher functionality, learn faster and generalize better than their real-valued counterparts.
This book is devoted to the Multi-Valued Neuron (MVN) and MVN-based neural networks. It contains a comprehensive overview of MVN theory, its learning algorithms, and its applications. MVN is a complex-valued neuron whose inputs and output are located on the unit circle. Its activation function depends only on the argument (phase) of the weighted sum. MVN learning is derivative-free and based on the error-correction rule. A single MVN can learn input/output mappings that are non-linearly separable in the real domain. Classical non-linearly separable problems such as XOR and Parity-n are among the simplest that a single MVN can learn. Another important advantage of MVN is its proper treatment of phase information.
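To make this description concrete, the following is a minimal NumPy sketch of a single MVN: a phase-only activation (continuous, or discrete k-valued) applied to the complex weighted sum, followed by one error-correction update. The function names, the learning-rate parameter lr, and the exact normalization are illustrative assumptions, not the book's precise formulation.

```python
import numpy as np

def mvn_activation(z, k=None):
    """Map the weighted sum z onto the unit circle.

    Continuous MVN (k is None): return z / |z|.
    Discrete k-valued MVN: return the k-th root of unity whose
    sector contains arg(z).
    """
    if k is None:
        return z / abs(z)
    j = int(np.floor(k * (np.angle(z) % (2 * np.pi)) / (2 * np.pi)))
    return np.exp(2j * np.pi * j / k)

def mvn_learn_step(w, x, desired, lr=1.0, k=None):
    """One derivative-free error-correction update for a single MVN.

    w: complex weights (w[0] is the bias weight),
    x: complex inputs on the unit circle (x[0] == 1),
    desired: desired output on the unit circle.
    The output error delta = desired - actual is distributed over the
    inputs via their complex conjugates; no derivative is involved.
    """
    z = np.dot(w, x)
    actual = mvn_activation(z, k)
    delta = desired - actual
    return w + (lr / len(x)) * delta * np.conj(x)
```

Because the update depends only on the complex error on the unit circle, the same rule applies unchanged to both the continuous and the discrete (k-valued) activation.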
These properties of MVN become even more remarkable when the neuron is used as a basic building block in neural networks. The Multilayer Neural Network based on Multi-Valued Neurons (MLMVN) is an MVN-based feedforward neural network. Its backpropagation learning algorithm is derivative-free and based on the error-correction rule, and it does not suffer from the local minima phenomenon. MLMVN outperforms many other machine learning techniques in terms of learning speed, network complexity, and generalization capability when solving both benchmark and real-world classification and prediction problems. Another interesting application of MVN is its use as a basic neuron in multi-state associative memories.
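As a rough illustration of the feedforward architecture, here is a minimal sketch of a forward pass through an MLMVN-style network using the continuous phase-only activation z/|z|. The function name, layer shapes, and random complex weights are hypothetical, chosen only to show how complex weighted sums are projected back onto the unit circle layer by layer; it is not the book's implementation.

```python
import numpy as np

def mlmvn_forward(layers, x):
    """Forward pass of a feedforward network of multi-valued neurons.

    layers: list of complex weight matrices; each row holds one neuron's
    weights, with column 0 acting as the bias weight.
    x: complex input vector with components on the unit circle.
    Each layer forms complex weighted sums and projects them back onto
    the unit circle (continuous MVN activation z / |z|).
    """
    y = np.asarray(x, dtype=complex)
    for w in layers:
        z = w @ np.concatenate(([1.0 + 0j], y))  # prepend constant bias input
        y = z / np.abs(z)                        # phase-only activation
    return y

# Hypothetical 2-2-1 network with random complex weights
rng = np.random.default_rng(0)
layers = [rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3)),
          rng.standard_normal((1, 3)) + 1j * rng.standard_normal((1, 3))]
print(mlmvn_forward(layers, np.exp(1j * np.array([0.3, 1.2]))))
```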
The book is addressed to readers who develop the theoretical fundamentals of neural networks and to those who apply neural networks to various real-world problems. It is also well suited to Ph.D. and graduate students pursuing degrees in computational intelligence.