Create and unleash the power of neural networks by implementing professional Java code
This book is for Java developers with basic Java programming knowledge. No previous knowledge of neural networks is required as this book covers the concepts from scratch.
Vast quantities of data are produced every second. In this context, neural networks become a powerful technique for extracting useful knowledge from large amounts of raw, seemingly unrelated data. Java is one of the preferred languages for neural network programming because it is easy to write code in, and many of the most popular neural network packages already exist for Java. This makes it a versatile programming language for neural networks.
This book gives you a complete walkthrough of the process of developing basic to advanced practical examples based on neural networks with Java.
You will first learn the basics of neural networks and their learning process. We then focus on perceptrons and their features. Next, you will implement self-organizing maps using the concepts you've learned. Furthermore, you will learn about some of the applications presented in this book, such as weather forecasting, disease diagnosis, customer profiling, and character recognition (OCR). Finally, you will learn methods to optimize and adapt neural networks in real time.
All the examples generated in the book are provided in the form of illustrative source code, which merges object-oriented programming (OOP) concepts and neural network features to enhance your learning experience.
This book adopts a step-by-step approach to neural network development and provides many hands-on examples using Java programming. Each neural network concept is explored through real-world problems and is delivered in an easy-to-comprehend manner.
Copyright © 2016 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing, nor its dealers and distributors will be held liable for any damages caused or alleged to be caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
First published: January 2016
Production reference: 1060116
Published by Packt Publishing Ltd.
Livery Place
35 Livery Street
Birmingham B3 2PB, UK.
ISBN 978-1-78588-090-2
www.packtpub.com
Authors
Fábio M. Soares
Alan M.F. Souza
Reviewer
Saeed Afzal
Commissioning Editor
Amarabha Banerjee
Acquisition Editor
Rahul Nair
Content Development Editor
Riddhi Tuljapurkar
Technical Editor
Vivek Pala
Copy Editor
Tani Kothari
Project Coordinator
Kinjal Bari
Proofreader
Safis Editing
Indexer
Hemangini Bari
Graphics
Disha Haria
Production Coordinator
Nilesh Mohite
Cover Work
Nilesh Mohite
Fábio M. Soares holds a master's degree in applied computing from UFPA and is currently a PhD candidate at the same university. He has been designing neural network solutions since 2004 and has developed applications with this technique in several fields, ranging from telecommunications to chemistry process modeling, and his research topics cover supervised learning for data-driven modeling.
He is also self-employed, offering services such as IT infrastructure management and database administration to a number of small and medium-sized companies in northern Brazil. In the past, he has worked for big companies such as Albras, one of the most important aluminium smelters in the world, and Eletronorte, a major power supplier in Brazil. He also has experience as a lecturer, having worked at the Federal Rural University of Amazon and at the Faculty of Castanhal, both in the state of Pará, teaching subjects involving programming and artificial intelligence.
He has published a number of works, many of them available in English, all involving artificial intelligence applied to practical problems. His publications include conference proceedings, such as the TMS (The Minerals, Metals and Materials Society) Light Metals and the Intelligent Data Engineering and Automated Learning conference. He has also published two book chapters for Intech.
I would like to give a special acknowledgement to God for having given me the opportunity to get access to rich knowledge on this theme, which I simply love doing research on. Special thanks to my family, my father, Josafá, and mother, Maria Alice (in memoriam), who would be very proud of me for this book, and also my brother, Flávio, my aunt, Maria Irenice, as well as all my relatives who always supported me in some way during my studies. I would also like to thank the support of my advisor, Prof. Roberto Limão. I am very grateful to him for having invited me to work with him on many projects regarding artificial intelligence and neural networks. Also, special thanks to my partners and former partners from Exodus Sistemas, who have helped me in my challenges in programming and IT infrastructure. Finally, I'd like to thank my friend Alan Souza, who wrote this book with me, for having extended to me this authorship.
Alan M.F. Souza is a computer engineer from Instituto de Estudos Superiores da Amazônia (IESAM). He holds a postgraduate degree in software project management and a master's degree in industrial processes (applied computing) from Universidade Federal do Pará (UFPA). He has been working with neural networks since 2009 and has worked for Brazilian IT companies since 2006, developing in Java, PHP, SQL, and other programming languages. He is passionate about programming and computational intelligence. Currently, he is a professor at Universidade da Amazônia (UNAMA) and a PhD candidate at UFPA.
Since I was a kid, I thought about writing a book. So, this book is a dream come true and the result of hard work. I'd like to thank God for giving me this opportunity. I'd also like to thank my father, Célio, my mother, Socorro, my sister, Alyne, and my amazing wife, Tayná, for understanding my absences and worries at various moments. I am grateful to all the members of my family and friends for always supporting me in difficult times and wishing for my success. I'd like to thank all the professors who have passed through my life, especially Prof. Roberto Limão, for introducing me to my very first neural network concept. I must register my gratitude to Fábio Soares for this great partnership and friendship. Finally, I must thank the tireless team at Packt Publishing for the invitation and for helping us in the production process as a whole.
Saeed Afzal, also known as Smac Afzal, is a professional software engineer and technology enthusiast based in Pakistan. He specializes in solution architecture and the implementation of scalable high-performance applications.
He is passionate about providing automation solutions for different business needs on the Web. His current research and work includes the futuristic implementation of a next-generation web development framework, which reduces development time and cost and delivers productive websites with many necessary and killer features by default. He is hopeful of launching his upcoming technology in 2016.
He has also worked on the book Cloud Bees Development by Packt Publishing.
You can find out more about his skills and experience at http://sirsmac.com. He can be contacted at <[email protected]>.
I would like to thank the Allah Almighty, my parents, and my wife, Dr. H. Zara Saeed, for all their encouragement.
For support files and downloads related to your book, please visit www.PacktPub.com.
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.PacktPub.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at <[email protected]> for more details.
At www.PacktPub.com, you can also read a collection of free technical articles, sign up for a range of free newsletters and receive exclusive discounts and offers on Packt books and eBooks.
https://www2.packtpub.com/books/subscription/packtlib
Do you need instant solutions to your IT questions? PacktLib is Packt's online digital book library. Here, you can search, access, and read Packt's entire library of books.
If you have an account with Packt at www.PacktPub.com, you can use this to access PacktLib today and view 9 entirely free books. Simply use your login credentials for immediate access.
The life of a programmer can be described as a continual, never-ending learning path. A programmer always faces challenges regarding new technologies and new approaches. Generally, during our lives, although we become used to repetition, we are always exposed to something new to learn. The process of learning is one of the most interesting topics in science, and there have been many attempts to describe or reproduce the human learning process.
The writing of this book was guided by the challenge of facing new content and then mastering it. While the name neural networks may appear strange or even suggest that this book is about neurology, we strived to simplify these nuances by focusing on your reasons for picking up this book. We intended to build a framework that shows you that neural networks are actually simple and easy to understand, and absolutely no prior knowledge of this topic is required to fully understand the concepts we present here.
So, we encourage you to explore the content of this book to the fullest, observing the power of neural networks when confronting big problems, but always from the point of view of a beginner. Every concept addressed in this book is explained in easy language along with the necessary technical background. Our mission in this book is to give you an insight into intelligent applications that can be written using a simple language.
Finally, we would like to thank all those who directly or indirectly have contributed to this book and supported us from the very beginning, from the Federal University of Pará, the university we graduated from, to the data and component providers INMET (the Brazilian Institute of Meteorology), Proben1, and JFreeChart. We want to give special thanks to our advisor, Prof. Roberto Limão, who introduced us to the subject of neural networks and coauthored many papers with us in this field. We also acknowledge the work performed by the several authors cited in the references, which gave us a broader vision of neural networks and insights on how to adapt them to the Java language in a didactic way.
We wish you a very pleasurable reading experience, and you are encouraged to download the source code and follow the examples presented in this book.
Chapter 1, Getting Started with Neural Networks, is an introductory foundation on neural networks and what they are designed for. You will be presented with the basic concepts involved in this book. A brief review of the Java programming language is provided. As in all subsequent chapters, an implementation of a neural network in Java code is also provided.
Chapter 2, How Neural Networks Learn, covers the learning process of neural networks and shows how data is used to that end. The complete structure and design of a learning algorithm is presented here.
Chapter 3, Handling Perceptrons, covers the use of perceptrons, which are one of the most commonly used neural network architectures. We present a neural network structure containing layers of neurons and show how they can learn from data in basic problems.
Chapter 4, Self-Organizing Maps, shows an unsupervised neural network architecture, the self-organizing map, which is applied to finding patterns or clusters in records.
Chapter 5, Forecasting Weather, is the first practical chapter showing an interesting application of neural networks in forecasting values, namely weather data.
Chapter 6, Classifying Disease Diagnostics, covers another useful task neural networks are very good at—classification. In this chapter, you will be presented with a very didactic but interesting application for disease diagnosis.
Chapter 7, Clustering Customer Profiles, talks about how neural networks are able to find patterns in data; a common application is to group customers that share the same buying behavior.
Chapter 8, Pattern Recognition (OCR Case), talks about the very interesting capability of recognizing patterns, including optical character recognition, and explores how this can be done with neural networks in the Java language.
Chapter 9, Neural Network Optimization and Adaptation, shows advancements regarding how to optimize and add adaptability to neural networks, thereby strengthening their power.
You'll need Netbeans (www.netbeans.org) or Eclipse (www.eclipse.org). Both are free and available for download at the previously mentioned websites.
This book is targeted at both developers and enthusiasts who have basic or even no Java programming knowledge. No previous knowledge of neural networks is required, as this book teaches the concepts from scratch. Even if you are familiar with neural networks and/or other machine learning techniques but have little or no experience with Java, this book will take you to the level at which you will be able to develop useful applications. Of course, if you know basic programming concepts, you will benefit the most from this book, but no prior experience is required.
Feedback from our readers is always welcome. Let us know what you think about this book—what you liked or disliked. Reader feedback is important for us as it helps us develop titles that you will really get the most out of.
To send us general feedback, simply e-mail <[email protected]>, and mention the book's title in the subject of your message.
If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, see our author guide at www.packtpub.com/authors.
Now that you are the proud owner of a Packt book, we have a number of things to help you to get the most from your purchase.
You can download the example code files from your account at http://www.packtpub.com for all the Packt Publishing books you have purchased. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.
Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you find a mistake in one of our books—maybe a mistake in the text or the code—we would be grateful if you could report this to us. By doing so, you can save other readers from frustration and help us improve subsequent versions of this book. If you find any errata, please report them by visiting http://www.packtpub.com/submit-errata, selecting your book, clicking on the Errata Submission Form link, and entering the details of your errata. Once your errata are verified, your submission will be accepted and the errata will be uploaded to our website or added to any list of existing errata under the Errata section of that title.
To view the previously submitted errata, go to https://www.packtpub.com/books/content/support and enter the name of the book in the search field. The required information will appear under the Errata section.
Piracy of copyrighted material on the Internet is an ongoing problem across all media. At Packt, we take the protection of our copyright and licenses very seriously. If you come across any illegal copies of our works in any form on the Internet, please provide us with the location address or website name immediately so that we can pursue a remedy.
Please contact us at <[email protected]> with a link to the suspected pirated material.
We appreciate your help in protecting our authors and our ability to bring you valuable content.
If you have a problem with any aspect of this book, you can contact us at <[email protected]>, and we will do our best to address the problem.
In this chapter, we will introduce neural networks and what they are designed for. This chapter serves as a foundation for the subsequent chapters, as it presents the basic concepts of neural networks. In this chapter, we will cover the following:
First, the term "neural networks" may create a snapshot of a brain in our minds, particularly for those who have just been introduced to it. In fact, that's right: we can consider the brain to be a big, natural neural network. However, what about artificial neural networks (ANNs)? Here, artificial is the opposite of natural, and given that term, the first image that comes to mind is that of an artificial brain or a robot. In this case, we also deal with creating a structure that is similar to and inspired by the human brain; therefore, it can be called artificial intelligence. So, a reader with no previous experience of ANNs may now be thinking that this book teaches how to build intelligent systems, including an artificial brain capable of emulating the human mind, using Java code. Of course, we will not cover the creation of artificial thinking machines such as those in the Matrix trilogy of movies; however, this book discusses several incredible capabilities that these structures do have. We will provide the reader with Java code for defining and creating basic neural network structures, taking advantage of the entire Java programming language framework.
We cannot begin talking about neural networks without understanding their origins, including the term itself. We use the terms neural network (NN) and ANN interchangeably in this book, although NNs are more general, covering natural neural networks as well. So, what actually is an ANN? Let's explore a little of the history of this term.
In the 1940s, the neurophysiologist Warren McCulloch and the mathematician Walter Pitts designed the first mathematical implementation of an artificial neuron, combining the foundations of neuroscience with mathematical operations. At that time, many studies on understanding the human brain, and on whether and how it could be simulated, were being carried out, but within the field of neuroscience. The idea of McCulloch and Pitts was a real novelty because it added the mathematical component. Further, considering that the brain is composed of billions of neurons, each interconnected with thousands of others, resulting in trillions of connections, we are talking about a giant network structure. However, each neuron unit is very simple, acting as a mere processor capable of summing and propagating signals.
On the basis of this fact, McCulloch and Pitts designed a simple model for a single neuron, initially to simulate human vision. The calculators and computers available at that time were very rare, but quite capable of dealing with mathematical operations; on the other hand, even today, tasks such as vision and sound recognition are not easily programmed without the use of special frameworks, as opposed to mathematical operations and functions. Nevertheless, the human brain can perform tasks such as vision and sound recognition far more efficiently than it performs complex calculations, and this fact really intrigues scientists and researchers.
So, an ANN is a structure designed to perform tasks such as pattern recognition, learning from data, and forecasting trends, just like an expert can do on the basis of knowledge, as opposed to the conventional algorithmic approach, which requires a set of steps to be performed to achieve a defined goal. An ANN instead has the capability to learn how to solve some task by itself, thanks to its highly interconnected network structure. The following table contrasts tasks that are quickly solvable by humans with tasks that are quickly solvable by computers:
Tasks Quickly Solvable by Humans             | Tasks Quickly Solvable by Computers
Classification of images                     | Complex calculation
Voice recognition                            | Grammatical error correction
Face identification                          | Signal processing
Forecast events on the basis of experience   | Operating system management
It can be said that the ANN is a nature-inspired structure, so it does have similarities with the human brain. As shown in the following figure, a natural neuron is composed of a nucleus, dendrites, and axon. The axon extends itself into several branches to form synapses with other neurons' dendrites.
So, the artificial neuron has a similar structure. It contains a nucleus (processing unit), several dendrites (analogous to inputs), and one axon (analogous to output), as shown in the following figure:
The links between neurons form the so-called neural network, analogous to the synapses in the natural structure.
Natural neurons have proven to be signal processors since they receive micro signals in the dendrites that can trigger a signal in the axon depending on their strength or magnitude. We can then think of a neuron as having a signal collector in the inputs and an activation unit in the output that can trigger a signal that will be forwarded to other neurons. So, we can define the artificial neuron structure as shown in the following figure:
In natural neurons, there is a threshold potential that when reached, fires the axon and propagates the signal to the other neurons. This firing behavior is emulated with activation functions, which have proven to be useful in representing nonlinear behaviors in the neurons.
The neuron's output is given by an activation function. This component adds nonlinearity to neural network processing, which is needed because the natural neuron has nonlinear behaviors. An activation function is usually bounded between two values at the output, therefore being a nonlinear function, but in some special cases, it can be a linear function.
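To make this structure concrete in code, the following is a minimal, illustrative Java sketch of an artificial neuron with inputs, weights, a bias, and a pluggable activation function. The class and interface names (Neuron, ActivationFunction) are our own for illustration and are not the implementation developed later in this book.

// A minimal, illustrative artificial neuron: it collects input signals,
// computes their weighted sum (plus a bias), and passes the result through
// an activation function. Names here are illustrative only.
import java.util.List;

public class Neuron {

    private final List<Double> weights;          // one weight per input (the "dendrites")
    private final double bias;                   // shifts the firing threshold
    private final ActivationFunction activation; // nonlinearity applied to the summed signal

    public Neuron(List<Double> weights, double bias, ActivationFunction activation) {
        this.weights = weights;
        this.bias = bias;
        this.activation = activation;
    }

    // The "signal collector" plus the "activation unit": weighted sum, then activation.
    public double fire(List<Double> inputs) {
        double sum = bias;
        for (int i = 0; i < weights.size(); i++) {
            sum += weights.get(i) * inputs.get(i);
        }
        return activation.apply(sum);   // the output sent along the "axon"
    }
}

// Pluggable activation function, so different nonlinearities can be swapped in.
interface ActivationFunction {
    double apply(double x);
}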
The four most used activation functions are as follows:
The standard equations of these functions are shown in the following table:

Function                 | Equation
Sigmoid                  | f(x) = 1 / (1 + e^(-x))
Hyperbolic tangent       | f(x) = (e^x - e^(-x)) / (e^x + e^(-x))
Hard limiting threshold  | f(x) = 1 if x >= 0, otherwise 0
Linear                   | f(x) = x
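As a minimal sketch, assuming the illustrative ActivationFunction interface from the earlier neuron example (not the classes developed later in this book), these four functions can be written in Java as follows, using their standard unit-slope forms:

// Illustrative implementations of the four activation functions above,
// expressed as lambdas for the ActivationFunction interface sketched earlier.
public final class ActivationFunctions {

    private ActivationFunctions() { }   // utility holder, no instances

    // Sigmoid: smooth, bounded between 0 and 1.
    public static final ActivationFunction SIGMOID =
            x -> 1.0 / (1.0 + Math.exp(-x));

    // Hyperbolic tangent: smooth, bounded between -1 and 1.
    public static final ActivationFunction HYPERBOLIC_TANGENT =
            x -> Math.tanh(x);

    // Hard limiting threshold: fires (1) once the input reaches zero, otherwise 0.
    public static final ActivationFunction HARD_LIMIT =
            x -> x >= 0.0 ? 1.0 : 0.0;

    // Linear: passes the summed signal through unchanged.
    public static final ActivationFunction LINEAR =
            x -> x;
}

A neuron from the earlier sketch could then be constructed with, for example, ActivationFunctions.SIGMOID as its activation function.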
In neural networks, weights
