A new approach to unsupervised learning

Evolving technologies have brought about an explosion of information in recent years, but the question of how such information might be effectively harvested, archived, and analyzed remains a monumental challenge, for the processing of such information is often fraught with the need for conceptual interpretation: a relatively simple task for humans, yet an arduous one for computers. Inspired by the relative success of existing research on self-organizing neural networks for data clustering and feature extraction, Unsupervised Learning: A Dynamic Approach presents new developments within the family of generative, self-organizing maps, such as the self-organizing tree map (SOTM) and the more advanced self-organizing hierarchical variance map (SOHVM). It covers a series of pertinent, real-world applications in the processing of multimedia data: from its role in generic image processing techniques, such as the automated modeling and removal of impulse noise in digital images, to problems in digital asset management and its various roles in feature extraction, visual enhancement, segmentation, and analysis of microbiological image data.

Self-organization concepts and applications discussed include:

* Distance metrics for unsupervised clustering
* Synaptic self-amplification and competition
* Image retrieval
* Impulse noise removal
* Microbiological image analysis

Unsupervised Learning: A Dynamic Approach introduces a new family of unsupervised algorithms grounded in self-organization, making it an invaluable resource for researchers, engineers, and scientists who want to create systems that effectively model massive volumes of data with little or no user intervention.
Page count: 439
Year of publication: 2014
IEEE Press
445 Hoes Lane
Piscataway, NJ 08854
IEEE Press Editorial Board
Tariq Samad, Editor in Chief
George W. Arnold, Dmitry Goldgof, Ekram Hossain, Mary Lanzerotti, Pui-In Mak, Ray Perez, Linda Shafer, MengChu Zhou, George Zobrist
Kenneth Moore, Director of IEEE Book and Information Services (BIS)
Matthew Kyan, Paisarn Muneesawang, Kambiz Jarrah, Ling Guan
Copyright © 2014 by The Institute of Electrical and Electronics Engineers, Inc.
Published by John Wiley & Sons, Inc., Hoboken, New Jersey. All rights reserved. Published simultaneously in Canada.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.
Library of Congress Cataloging-in-Publication Data:
Kyan, Matthew.
Unsupervised learning : a dynamic approach / Matthew Kyan, Paisarn Muneesawang, Kambiz Jarrah, Ling Guan.
pages cm
ISBN 978-0-470-27833-8 (cloth)
1. Database management. 2. Self-organizing systems. 3. Machine learning. 4. Big data. I. Muneesawang, Paisarn. II. Jarrah, Kambiz. III. Guan, Ling. IV. Title.
QA76.9.D3K93 2014
005.74–dc23
2013046024
Acknowledgments
1 Introduction
1.1 Part I: The Self-Organizing Method
1.2 Part II: Dynamic Self-Organization for Image Filtering and Multimedia Retrieval
1.3 Part III: Dynamic Self-Organization for Image Segmentation and Visualization
1.4 Future Directions
2 Unsupervised Learning
2.1 Introduction
2.2 Unsupervised Clustering
2.3 Distance Metrics for Unsupervised Clustering
2.4 Unsupervised Learning Approaches
2.5 Assessing Cluster Quality and Validity
3 Self-Organization
3.1 Introduction
3.2 Principles of Self-Organization
3.3 Fundamental Architectures
3.4 Other Fixed Architectures for Self-Organization
3.5 Emerging Architectures for Self-Organization
3.6 Conclusion
4 Self-Organizing Tree Map
4.1 Introduction
4.2 Architecture
4.3 Competitive Learning
4.4 Algorithm
4.5 Evolution
4.6 Practical Considerations, Extensions, and Refinements
4.7 Conclusions
Notes
5 Self-Organization in Impulse Noise Removal
5.1 Introduction
5.2 Review of Traditional Median-Type Filters
5.3 Noise-Exclusive Adaptive Filtering
5.4 Experimental Results
5.5 Detection-Guided Restoration and Real-Time Processing
5.6 Conclusions
6 Self-Organization in Image Retrieval
6.1 Retrieval of Visual Information
6.2 Visual Feature Descriptor
6.3 User-Assisted Retrieval
6.4 Self-Organization for Pseudo Relevance Feedback
6.5 Directed Self-Organization
6.6 Optimizing Self-Organization for Retrieval
6.7 Retrieval Performance
6.8 Summary
7 The Self-Organizing Hierarchical Variance Map
7.1 An Intuitive Basis
7.2 Model Formulation and Breakdown
7.3 Algorithm
7.4 Simulations and Evaluation
7.5 Tests on Self-Determination and the Optional Tuning Stage
7.6 Cluster Validity Analysis on Synthetic and UCI Data
7.7 Summary
8 Microbiological Image Analysis Using Self-Organization
8.1 Image Analysis in the Biosciences
8.2 Image Analysis Tasks Considered
8.3 Microbiological Image Segmentation
8.4 Image Segmentation Using Hierarchical Self-Organization
8.5 Harvesting Topologies to Facilitate Visualization
8.6 Summary
Notes
9 Closing Remarks and Future Directions
9.1 Summary of Main Findings
9.2 Future Directions
Appendix A
A.1 Global and Local Consistency Error
References
Index
IEEE Press Series
End User License Agreement
List of Tables

Chapter 2
Table 2.1
Table 2.2
Chapter 5
Table 5.1
Table 5.2
Table 5.3
Table 5.4
Table 5.5
Table 5.6
Table 5.7
Table 5.8
Chapter 6
Table 6.1
Table 6.2
Table 6.3
Table 6.4
Table 6.5
Table 6.6
Chapter 7
Table 7.1
List of Illustrations

Chapter 1
FIGURE 1.1
Unsupervised Learning–based framework for (a) automated modeling and removal of impulse noise in digital images and (b) image classification in multimedia indexing and retrieval.
FIGURE 1.2
Unsupervised Learning–based framework for mining segmentations for visualization and characterization of microbiological image data.
Chapter 2
FIGURE 2.1
The problem of defining a cluster—portrayal of the ill-posed nature of clustering. (a) Ambiguity in grouping illustrates the difficulty inherent in distinguishing between what may be deemed alternative options for the notion of similar groups. (b) Enhancement/sparsity promotion via the feature space should ideally work to reduce the ambiguity in distinguishing between possible groupings.
FIGURE 2.2
Unsupervised Learning approaches. All acronyms are explained in Table 2.2.
FIGURE 2.3
Hard vs. fuzzy decision boundaries in iterative mean-squared-error-based clustering: (left) hard clustering with K-means vs. (right) fuzzy clustering. Fuzzy membership blurs the line between clusters, allowing for a certain degree of overlap in boundary regions. While mixture models build uncertainty into a solution by incorporating specific probabilities, fuzzy partitions attempt to yield a similar effect by factoring a certain level of imprecision into the result.
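The distinction the caption draws can be made concrete in a few lines. Below is a minimal sketch, not taken from the book, contrasting a K-means-style hard assignment (membership is 0 or 1) with a fuzzy-c-means-style graded membership; the function names and the fuzzifier value m=2 are illustrative assumptions.

```python
import numpy as np

def hard_memberships(X, centers):
    """K-means-style assignment: each sample belongs entirely to its nearest center."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n, k) distances
    U = np.zeros_like(d)
    U[np.arange(len(X)), d.argmin(axis=1)] = 1.0                     # membership is 0 or 1
    return U

def fuzzy_memberships(X, centers, m=2.0):
    """Fuzzy-c-means-style membership: graded degrees that sum to 1 per sample."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)                      # rows sum to 1

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.0]])  # third point sits on the boundary
C = np.array([[0.0, 0.0], [1.0, 0.0]])
print(hard_memberships(X, C))   # the boundary point is forced into one cluster
print(fuzzy_memberships(X, C))  # the boundary point gets roughly 0.5/0.5 membership
```

The boundary point illustrates the caption's point: hard clustering draws a crisp line through it, while the fuzzy partition lets it belong partially to both clusters.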
FIGURE 2.4
Examples of agglomerative-based clustering approaches: in (a) Single Linkage and (b) Complete Linkage, linkage decisions are shown for the sixth agglomerative step; in (c) Shared Nearest Neighbors and (d) Shared Farthest Neighbors, the distances considered when two candidate samples are considered for grouping are indicated.
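The linkage rules in panels (a) and (b) differ only in which cross-cluster pair defines the inter-cluster distance; an agglomerative step then merges the two clusters whose linkage distance is smallest. A minimal sketch of the two rules follows; the function names and sample data are illustrative, not from the book.

```python
import numpy as np

def single_linkage(A, B):
    """Single linkage: distance between the closest pair across the two clusters."""
    return min(np.linalg.norm(a - b) for a in A for b in B)

def complete_linkage(A, B):
    """Complete linkage: distance between the farthest pair across the two clusters."""
    return max(np.linalg.norm(a - b) for a in A for b in B)

A = np.array([[0.0, 0.0], [0.0, 1.0]])
B = np.array([[2.0, 0.0], [3.0, 1.0]])
print(single_linkage(A, B))    # 2.0  -- nearest cross-cluster pair
print(complete_linkage(A, B))  # ~3.16 -- farthest cross-cluster pair
```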
Chapter 3
FIGURE 3.1
Self-amplification, competition, and cooperation in learning: w_k* represents the synaptic vector of the winning neuron (competition), which adapts (self-amplifies) toward the input, thereby strengthening its correlation with future inputs in this region. At the same time, associative memory is imparted to neighboring neurons (cooperation)—related to the winner through some defined or inferred topology.
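The three principles in this caption map directly onto one update step of a competitive-learning network. The sketch below is a hedged illustration under assumed names (W, grid, lr, sigma are not from the book): competition selects the winner, a Gaussian kernel over the fixed topology implements cooperation, and the move toward the input implements self-amplification.

```python
import numpy as np

def competitive_update(W, x, grid, lr=0.1, sigma=1.0):
    """One step of competition, self-amplification, and cooperation.

    W    : (k, d) array of synaptic weight vectors
    x    : (d,) input sample
    grid : (k, 2) fixed topological coordinates of the neurons
    """
    # Competition: the neuron whose weight vector best matches the input wins.
    winner = np.linalg.norm(W - x, axis=1).argmin()
    # Cooperation: neighbors of the winner (in the grid topology) share the update.
    h = np.exp(-np.linalg.norm(grid - grid[winner], axis=1) ** 2 / (2 * sigma ** 2))
    # Self-amplification: weights move toward the input, scaled by neighborhood strength.
    W += lr * h[:, None] * (x - W)
    return winner
```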
FIGURE 3.2
The ART architecture.
FIGURE 3.3
Prototype formation.
FIGURE 3.4
The SOM architecture.
FIGURE 3.5
Local neighborhood smoothing in the SOM.
FIGURE 3.6
A square 2D vector.
FIGURE 3.7
SOM mapping of the 2D vector in Figure 3.6 onto a square array.
FIGURE 3.8
SOM mapping of a triangular 2D vector onto a square array.
FIGURE 3.9
SOM mapping of the square 2D vector into a 1D representation.
FIGURE 3.10
SOM mapping of the triangular 2D vector into a 1D representation.
FIGURE 3.11
Classic stationary Competitive Learning architectures: (a) Kohonen’s SOM; (b) Neural Gas; (c) Hierarchical Feature Map.
FIGURE 3.12
Dynamic Competitive Learning architectures: (a) Growing Hierarchical SOM; (b) Evolving Tree; (c) Growing Neural Gas; (d) Dynamic Adaptive Self-Organizing Hybrid.
Chapter 4
FIGURE 4.1
(a) Uniformly distributed input vectors in five squares; (b) K-shape distribution of the input vectors; (c) final representation of the SOM for input vectors uniformly distributed in five squares; (d) final representation of the SOM for K-shape distribution of the input vectors.
FIGURE 4.2
Pictorial representation of the SOM vs. the SOTM. The SOM (left) has a predefined lattice of neurons that unfold to span the input space by progressively relating input samples to their closest prototypes in the lattice—information from each sample is imparted to the “winning” neuron and its immediate neighbors. The SOTM (right), by contrast, explores the input space by randomly parsing and cultivating the growth of characteristic prototypes, in a top-down vigilant (outside-in) manner.
FIGURE 4.3
Decision boundary used to trigger map growth in the SOTM—shown for 2D (left) and 3D (right) feature spaces. Xs represent initial stimuli falling within the radius/ellipsoid of significant similarity (assumed Euclidean metric). Decay of this boundary inspires hierarchical growth of neurons (prototypes).
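The decaying similarity boundary in this caption is the mechanism that turns a fixed competitive network into a growing one. The following is a minimal sketch of an SOTM-style decision step under assumed names (sotm_step, H, lr are illustrative, not the book's exact formulation): if the input falls inside the winner's radius of significant similarity it adapts the winner, otherwise it spawns a new prototype.

```python
import numpy as np

def sotm_step(prototypes, x, H, lr=0.05):
    """One SOTM-style step: adapt the winner or grow a new node.

    prototypes : list of (d,) weight vectors
    x          : (d,) input sample
    H          : current similarity radius (decays over time)
    """
    dists = [np.linalg.norm(x - w) for w in prototypes]
    winner = int(np.argmin(dists))
    if dists[winner] <= H:
        # Input falls inside the radius of significant similarity: adapt the winner.
        prototypes[winner] += lr * (x - prototypes[winner])
    else:
        # Input is too dissimilar to any prototype: spawn a new one at the input.
        prototypes.append(x.copy())
    return prototypes
```

As H decays over the course of training (for instance, multiplying it by a factor just under 1 each epoch), progressively finer-grained prototypes emerge, giving the top-down, coarse-to-fine growth the caption describes.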
Continue reading in the full edition!
