Complex-Valued Neural Networks: Advances and Applications (E-Book)

Description

Presents the latest advances in complex-valued neural networks by demonstrating the theory in a wide range of applications

Complex-valued neural networks constitute a rapidly developing neural network framework that utilizes complex arithmetic and exhibits distinctive characteristics in its learning, self-organizing, and processing dynamics. They are highly suitable for processing complex amplitude, composed of amplitude and phase, which is one of the core concepts in physical systems dealing with electromagnetic, light, and sonic/ultrasonic waves as well as quantum waves, namely electron and superconducting waves. This is a critical advantage in practical applications in diverse fields of engineering, where signals are routinely analyzed and processed in the time/space, frequency, and phase domains.

Complex-Valued Neural Networks: Advances and Applications covers cutting-edge topics and applications surrounding this timely subject. Demonstrating advanced theories with a wide range of applications, including communication systems, image processing systems, and brain-computer interfaces, this text offers comprehensive coverage of:

  • Conventional complex-valued neural networks
  • Quaternionic neural networks
  • Clifford-algebraic neural networks

Presented by international experts in the field, Complex-Valued Neural Networks: Advances and Applications is ideal for advanced-level computational intelligence theorists, electromagnetic theorists, and mathematicians interested in computational intelligence, artificial intelligence, machine learning theories, and algorithms.

Page count: 437

Year of publication: 2013




Contents

Cover

Half Title page

Title page

Copyright page

Contributors

Preface

Chapter 1: Application Fields and Fundamental Merits of Complex-Valued Neural Networks

1.1 Introduction

1.2 Applications of Complex-Valued Neural Networks

1.3 What Is a Complex Number?

1.4 Complex Numbers in Feedforward Neural Networks

1.5 Metric in Complex Domain

1.6 Experiments to Elucidate the Generalization Characteristics

1.7 Conclusions

References

Chapter 2: Neural System Learning on Complex-Valued Manifolds

2.1 Introduction

2.2 Learning Averages over the Lie Group of Unitary Matrices

2.3 Riemannian-Gradient-Based Learning on the Complex Matrix-Hypersphere

2.4 Complex ICA Applied to Telecommunications

2.5 Conclusion

References

Chapter 3: N-Dimensional Vector Neuron and Its Application to the N-Bit Parity Problem

3.1 Introduction

3.2 Neuron Models with High-Dimensional Parameters

3.3 N-Dimensional Vector Neuron

3.4 Discussion

3.5 Conclusion

References

Chapter 4: Learning Algorithms in Complex-Valued Neural Networks using Wirtinger Calculus

4.1 Introduction

4.2 Derivatives in Wirtinger Calculus

4.3 Complex Gradient

4.4 Learning Algorithms for Feedforward CVNNs

4.5 Learning Algorithms for Recurrent CVNNs

4.6 Conclusion

References

Chapter 5: Quaternionic Neural Networks for Associative Memories

5.1 Introduction

5.2 Quaternionic Algebra

5.3 Stability of Quaternionic Neural Networks

5.4 Learning Schemes for Embedding Patterns

5.5 Conclusion

References

Chapter 6: Models of Recurrent Clifford Neural Networks and Their Dynamics

6.1 Introduction

6.2 Clifford Algebra

6.3 Hopfield-Type Neural Networks and Their Energy Functions

6.4 Models of Hopfield-Type Clifford Neural Networks

6.5 Definition of Energy Functions

6.6 Existence Conditions of Energy Functions

6.7 Conclusion

References

Chapter 7: Meta-Cognitive Complex-Valued Relaxation Network and its Sequential Learning Algorithm

7.1 Meta-Cognition in Machine Learning

7.2 Meta-Cognition in Complex-Valued Neural Networks

7.3 Meta-Cognitive Fully Complex-Valued Relaxation Network

7.4 Performance Evaluation of McFCRN: Synthetic Complex-Valued Function Approximation Problem

7.5 Performance Evaluation of McFCRN: Real-Valued Classification Problems

7.6 Conclusion

Acknowledgment

References

Chapter 8: Multilayer Feedforward Neural Network with Multi-Valued Neurons for Brain–Computer Interfacing

8.1 Brain–Computer Interface (BCI)

8.2 BCI Based on Steady-State Visual Evoked Potentials

8.3 EEG Signal Preprocessing

8.4 Decoding Based on MLMVN for Phase-Coded SSVEP BCI

8.5 System Validation

8.6 Discussion

Appendix: Decoding Methods

A.1 Method of Jia and Co-Workers

A.2 Method of Lee and Co-Workers

References

Chapter 9: Complex-Valued B-Spline Neural Networks for Modeling and Inverse of Wiener Systems

9.1 Introduction

9.2 Identification and Inverse of Complex-Valued Wiener Systems

9.3 Application to Digital Predistorter Design

9.4 Conclusions

References

Chapter 10: Quaternionic Fuzzy Neural Network for View-Invariant Color Face Image Recognition

10.1 Introduction

10.2 Face Recognition System

10.3 Quaternion-Based View-Invariant Color Face Image Recognition

10.4 Enrollment Stage and Recognition Stage for Quaternion-Based Color Face Image Correlator

10.5 Max-Product Fuzzy Neural Network Classifier

10.6 Experimental Results

10.7 Conclusion and Future Research Directions

References

Index

Complex-Valued Neural Networks

IEEE Press
445 Hoes Lane
Piscataway, NJ 08854

IEEE Press Editorial Board 2013
John Anderson, Editor in Chief

Linda Shafer, George W. Arnold, Ekram Hossain, Om P. Malik,
Saeid Nahavandi, David Jacobson, Mary Lanzerotti,
George Zobrist, Tariq Samad, Dmitry Goldgof

Kenneth Moore, Director of IEEE Book and Information Services (BIS)

Technical Reviewers
George M. Georgiou
Gouhei Tanaka

Copyright © 2013 by The Institute of Electrical and Electronics Engineers. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey.
Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representation or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print, however, may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data is available.

ISBN 9781118344606

CONTRIBUTORS

CHAPTER 1
AKIRA HIROSE, The University of Tokyo, Tokyo, Japan

CHAPTER 2
SIMONE FIORI, Università Politecnica delle Marche, Ancona, Italy

CHAPTER 3
TOHRU NITTA, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan

CHAPTER 4
MD. FAIJUL AMIN, University of Fukui, Fukui, Japan; Khulna University of Engineering and Technology, Khulna, Bangladesh
KAZUYUKI MURASE, University of Fukui, Fukui, Japan

CHAPTER 5
TEIJIRO ISOKAWA, University of Hyogo, Hyogo, Japan
HARUHIKO NISHIMURA, University of Hyogo, Hyogo, Japan
NOBUYUKI MATSUI, University of Hyogo, Hyogo, Japan

CHAPTER 6
YASUAKI KUROE, Kyoto Institute of Technology, Kyoto, Japan

CHAPTER 7
RAMASAMY SAVITHA, Nanyang Technological University, Singapore
SUNDARAM SURESH, Nanyang Technological University, Singapore
NARASIMHAN SUNDARARAJAN, Sri Jaya Chamarajendra College of Engineering (SJCE), Mysore, India

CHAPTER 8
NIKOLAY MANYAKOV, KU Leuven, Leuven, Belgium
IGOR AIZENBERG, Texas A&M University–Texarkana, Texarkana, Texas, U.S.A.
NIKOLAY CHUMERIN, KU Leuven, Leuven, Belgium
MARK M. VAN HULLE, KU Leuven, Leuven, Belgium

CHAPTER 9
XIA HONG, University of Reading, Reading, U.K.
SHENG CHEN, University of Southampton, Southampton, U.K.; King Abdulaziz University, Jeddah, Saudi Arabia
CHRIS J. HARRIS, University of Southampton, Southampton, U.K.

CHAPTER 10
WAI KIT WONG, Multimedia University, Melaka, Malaysia
GIN CHONG LEE, Multimedia University, Melaka, Malaysia
CHU KIONG LOO, University of Malaya, Kuala Lumpur, Malaysia
WAY SOONG LIM, Multimedia University, Melaka, Malaysia
RAYMOND LOCK, Multimedia University, Melaka, Malaysia

PREFACE

Complex-valued neural networks (CVNNs) have continued to open doors to various new applications. CVNNs are neural networks that deal with complex amplitude, i.e., signals having phase and amplitude, which is one of the core concepts in science and technology, in particular in electrical and electronic engineering. A CVNN is not equivalent to a double-dimensional real-valued neural network. It has different dynamics and characteristics, such as generalization, that are particularly useful in the treatment of complex-amplitude information and wave-related phenomena. This is a critical point for applications in engineering fields. It is also crucial for developing new devices in the future. That is, the CVNN framework will play an important role in introducing learning and self-organization into future quantum devices dealing with electron waves and photonic waves.
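To make this distinction concrete, the short sketch below (our illustration in Python/NumPy, not material from the book) shows that a single complex weight acts on its input as a constrained operation, a scaled rotation, whereas a real-valued network of doubled dimension may apply an arbitrary 2 x 2 matrix to the same pair of real numbers.

```python
import numpy as np

# Illustrative sketch (not from the book): one complex weight w acting on a
# complex signal x is equivalent to a constrained 2x2 real matrix (a scaled
# rotation) acting on (Re x, Im x), whereas a real-valued network of doubled
# dimension may realize any 2x2 matrix.
w = 0.8 * np.exp(1j * np.pi / 6)     # complex weight: gain 0.8, rotation 30 degrees
x = 1.0 + 2.0j                       # complex input signal

y_complex = w * x                    # elementary CVNN operation

a, b = w.real, w.imag
W_constrained = np.array([[a, -b],   # the 2x2 real matrix induced by w
                          [b,  a]])
y_real = W_constrained @ np.array([x.real, x.imag])

print(np.allclose([y_complex.real, y_complex.imag], y_real))   # True
```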

We can further expect that broad-sense CVNNs, such as quaternion neural networks, will break new ground in their own directions. Quaternions have been essential in computer graphics for rendering three-dimensional moving objects. When we introduce learning and self-organization into virtual realities and computer-aided amenities, quaternion neural networks will surely provide an important foundation. CVNNs may be useful even in physiological analysis and modeling, where researchers suggest, for example, that the phase of neuron firing timing relative to the theta wave in electroencephalography is closely related to short-term position memory in the brain.

This book presents recent advances and applications of CVNNs in the following ten chapters. Chapter 1 first presents historical and recent advances in applications of CVNNs and then illustrates one of their most important merits, namely, the suitability for adaptive processing of coherent signals. Chapter 2 deals with complex-valued parameter manifolds and with applications of CVNNs in which the connection parameters live on complex-valued manifolds. Successful applications are also shown, such as blind source separation of complex-valued sources, multichannel blind deconvolution of signals in telecommunications, nondestructive evaluation of materials in industrial metallic slab production, and the purely algorithmic problem of averaging the parameters of a pool of cooperating CVNNs. Chapter 3 describes the N-dimensional vector neuron, which can deal with N signals as one cluster, obtained by extending the three-dimensional vector neuron to N dimensions. The N-bit parity problem is solved with a single N-dimensional vector neuron having an orthogonal decision boundary. It is shown that extending the dimensionality of neural networks to N dimensions enhances their computational power. Chapter 4 discusses Wirtinger calculus and derives several learning algorithms for feedforward and recurrent CVNNs. A functional dependence diagram is shown for a visual understanding of the respective derivatives. For feedforward networks, two algorithms are considered, namely, gradient descent (backpropagation) and the Levenberg–Marquardt (LM) algorithm. For recurrent networks, the authors discuss complex versions of the real-time recurrent learning (RTRL) and extended Kalman filter (EKF) algorithms.
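As a minimal sketch of the Wirtinger-calculus viewpoint taken in Chapter 4 (our simplified illustration, not the authors' code; the single linear neuron and the step size mu are assumptions made for this example), the update below treats the weight and its conjugate as independent variables and descends along the conjugate derivative of the squared error, which reduces to the well-known complex LMS rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of Wirtinger-gradient descent for a single complex linear neuron
# y = w^T x (illustration only).  For E = |d - y|^2, treating w and conj(w)
# as independent variables gives dE/d(conj(w)) = -(d - y) * conj(x), so the
# descent step is w += mu * e * conj(x), i.e. the complex LMS rule.
n = 3
w_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # teacher weights
w = np.zeros(n, dtype=complex)
mu = 0.05                                                       # assumed step size

for _ in range(2000):
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)    # complex input
    e = w_true @ x - w @ x                                      # complex error d - y
    w += mu * e * np.conj(x)                                    # Wirtinger descent step

print(np.allclose(w, w_true, atol=1e-3))                        # approaches the teacher weights
```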

Chapter 5 presents quaternionic associative memories. The quaternions form a four-dimensional hypercomplex number system that has been extensively employed in fields such as robotics, satellite attitude control, and computer graphics. One of their benefits lies in the fact that affine transforms in three-dimensional space can be represented compactly and consistently. Thus, neural networks based on quaternions are expected to process three-dimensional data with learning or self-organization more successfully. Several schemes for embedding patterns into a network are presented. In addition to the quaternionic version of the Hebbian learning scheme, the projection rule for embedding nonorthogonal patterns and local iterative learning are described. Chapter 6 extends neural networks into the Clifford-algebraic domain. Since the geometric product is non-commutative, several types of models are possible. In this chapter, three models of fully connected recurrent networks are investigated, in particular from the viewpoint of the existence conditions of an energy function, for two classes of Hopfield-type Clifford neural networks.
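The quaternion arithmetic underlying Chapter 5 follows directly from the basis relations i^2 = j^2 = k^2 = ijk = -1. The sketch below (an illustration of the algebra itself, not of any network in the book) implements the Hamilton product and demonstrates its non-commutativity, the same property that the geometric product of Chapter 6's Clifford algebras exhibits.

```python
import numpy as np

def hamilton_product(p, q):
    """Quaternion product of p = (w, x, y, z) and q, from i^2 = j^2 = k^2 = ijk = -1."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,   # scalar part
        pw*qx + px*qw + py*qz - pz*qy,   # i component
        pw*qy - px*qz + py*qw + pz*qx,   # j component
        pw*qz + px*qy - py*qx + pz*qw,   # k component
    ])

p = np.array([1.0, 2.0, 3.0, 4.0])
q = np.array([0.5, -1.0, 0.0, 2.0])
print(hamilton_product(p, q))
print(hamilton_product(q, p))   # different result: quaternion multiplication is non-commutative
```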

Chapter 7 presents a meta-cognitive learning algorithm for a single-hidden-layer CVNN called the Meta-cognitive Fully Complex-valued Relaxation Network (McFCRN). McFCRN has two components: a cognitive component and a meta-cognitive component. The meta-cognitive component possesses a self-regulatory learning mechanism that controls the learning stability of the FCRN by deciding what to learn, when to learn, and how to learn from a sequence of training data. The authors address the problem of explicitly minimizing magnitude and phase errors in a logarithmic error function. Chapter 8 describes a multilayer feedforward neural network with multi-valued neurons and its application to the domain of brain–computer interfaces (BCIs). A new methodology for electroencephalogram (EEG)-based BCI is developed with which subjects can issue commands by looking at corresponding targets that flicker at the same frequency but with different initial phases. Chapter 9 develops a complex-valued (CV) B-spline (basis-spline) neural network approach for efficient identification of CV Wiener systems as well as effective inversion of the estimated CV Wiener model. Specifically, the CV nonlinear static function in the Wiener system is represented using the tensor product of two univariate B-spline neural networks. The effectiveness is demonstrated in the application of digital predistorters for high-power amplifiers with memory. Chapter 10 presents an effective color image processing system for face image recognition. The system carries out recognition with a quaternion correlator and a max-product fuzzy neural network classifier. The performance is evaluated in terms of accuracy, computational cost, and tolerance to noise and/or scale changes.
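The explicit separation of magnitude and phase errors in a logarithmic error function, mentioned for Chapter 7, can be illustrated as follows (a hedged sketch with our own variable names, not the McFCRN error function as defined by the authors): the principal complex logarithm of the ratio between desired and predicted outputs splits into a log-magnitude part and a phase part.

```python
import numpy as np

def log_magnitude_and_phase_errors(y_pred, y_true):
    # Illustration only: the principal complex log of the output ratio separates
    # the error into a log-magnitude term (real part) and a phase term wrapped
    # to (-pi, pi] (imaginary part).
    r = np.log(y_true / y_pred)
    return r.real, r.imag

y_true = 2.0 * np.exp(1j * 0.8)
y_pred = 1.5 * np.exp(1j * 0.5)
mag_err, phase_err = log_magnitude_and_phase_errors(y_pred, y_true)
print(mag_err, phase_err)   # ~0.288 (= ln(2/1.5)) and ~0.3 rad
```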

This is the first book planned and published by the Complex-Valued Neural Networks Task Force (CVNN TF) of the IEEE Computational Intelligence Society (CIS) Neural Networks Technical Committee (NNTC). The CVNN TF was established to promote research in this developing field. The authors hope that readers will become more interested in this area, send feedback in any form, and join us. Please visit our website http://www.eis.t.u-tokyo.ac.jp/news/NNTC_CVNN/.

AKIRA HIROSE

Tokyo
January 2013

CHAPTER 1

APPLICATION FIELDS AND FUNDAMENTAL MERITS OF COMPLEX-VALUED NEURAL NETWORKS

AKIRA HIROSE

The University of Tokyo, Tokyo, Japan

This chapter first presents historical and recent advances in applications of complex-valued neural networks (CVNNs). It then shows one of the most important merits of CVNNs, namely, their suitability for adaptive processing of coherent signals.

1.1 INTRODUCTION

This chapter first presents historical and recent advances in applications of complex-valued neural networks (CVNNs). It then shows one of the most important merits of CVNNs, namely, their suitability for adaptive processing of coherent signals.

CVNNs are effective and powerful in particular for dealing with wave phenomena such as electromagnetic and sonic waves, as well as for processing wave-related information. The history of CVNNs can be traced back to the middle of the 20th century. The first introduction of phase information into computation was made by Eiichi Goto in 1954 with his invention of the "Parametron" [17, 18, 61]. He utilized the phase of a high-frequency carrier to represent binary or multivalued information. However, the computational principle employed there was "logic" of the Turing or von Neumann type, based on symbol processing, so he could not make further extensive use of the phase. In present-day CVNN research, by contrast, researchers extend the world of computation to pattern-processing fields based on a novel use of the structure of complex-amplitude (phase and amplitude) information.

We note that the above feature is significantly important when we consider that various modern technologies centered on electronics are oriented toward coherent systems and devices rather than incoherent ones. This feature will lead to general frameworks of probability and statistics, stochastic methods, and statistical learning and self-organization in coherent signal processing and information analysis. The fundamental idea is also applicable to hypercomplex processing based on quaternion, octonion, and Clifford-algebraic networks.

Some parts of the following contents of this chapter were published in detail in the Journal of the Society of Instrument and Control Engineers [29], Frontiers of Electrical and Electronic Engineering in China [28], and the IEEE Transactions on Neural Networks and Learning Systems [35].

1.2 APPLICATIONS OF COMPLEX-VALUED NEURAL NETWORKS

Complex-valued neural networks (CVNNs) have become widely used in various fields. The basic ideas and fundamental principles have been published in several books in recent years [27, 22, 26, 41, 53, 2]. The following subsections present major application fields.

1.2.1 Antenna Design

The most notable feature of CVNNs is their compatibility with wave phenomena and wave information related to, for example, electromagnetic waves, lightwaves, electron waves, and sonic waves [28]. Application fields include the adaptive design of antennas, such as patch antennas for microwave and millimeter-wave bands. Many studies have been reported on how to determine patch-antenna shape and sub-element arrangement, as well as the switching patterns of the sub-elements [46, 10, 47]. A designer assigns desired frequency-domain characteristics of the complex amplitude, or simply the amplitude, such as transmission characteristics, return loss, and radiation patterns. A CVNN usually yields a more suitable design than a real-valued network does, even when the designer presents only simple amplitude data. The reason lies in the elemental dynamics, consisting of phase rotation (or time delay multiplied by carrier frequency) and amplitude attenuation or amplification, on which CVNN learning and self-organization operate. As a result, the generalization characteristics (the error magnitude at non-learning points in supervised learning) and the manner of classification often become quite different from those of real-valued neural networks [28, 35]. This feature also plays an important role in the other applications referred to below.
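The elemental dynamics named above, phase rotation combined with amplitude attenuation or amplification, can be made concrete with a short sketch (our illustration, with an assumed 2.4 GHz carrier, not an example from the chapter): multiplying a complex amplitude by a complex weight is equivalent to applying a gain and a time delay to the underlying carrier.

```python
import numpy as np

# Illustration (assumed carrier frequency, not from the chapter): a complex
# weight applied to a complex amplitude acts on the carrier as gain plus delay.
f = 2.4e9                                # assumed carrier frequency: 2.4 GHz
w = 0.7 * np.exp(-1j * np.pi / 4)        # weight: gain 0.7, phase rotation of -45 degrees
c_in = 1.0 + 0.0j                        # input complex amplitude

c_out = w * c_in                         # elementary CVNN operation
gain = np.abs(c_out) / np.abs(c_in)      # amplitude change
delay = -np.angle(w) / (2 * np.pi * f)   # equivalent carrier time delay

print(gain)    # 0.7
print(delay)   # ~5.2e-11 s, i.e. about 52 ps at 2.4 GHz
```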

Continue reading in the full edition!
