Cover
Title Page
Copyright
Dedication
Preface
Acknowledgments
1 Introduction to Cognitive Computing
1.1 Introduction: Definition of Cognition, Cognitive Computing
1.2 Defining and Understanding Cognitive Computing
1.3 Cognitive Computing Evolution and Importance
1.4 Difference Between Cognitive Computing and Artificial Intelligence
1.5 The Elements of a Cognitive System
1.6 Ingesting Data Into Cognitive System
1.7 Analytics Services
1.8 Machine Learning
1.9 Machine Learning Process
1.10 Machine Learning Techniques
1.11 Hypothesis Space
1.12 Developing a Cognitive Computing Application
1.13 Building a Health Care Application
1.14 Advantages of Cognitive Computing
1.15 Features of Cognitive Computing
1.16 Limitations of Cognitive Computing
1.17 Conclusion
References
2 Machine Learning and Big Data in Cyber-Physical System: Methods, Applications and Challenges
2.1 Introduction
2.2 Cyber-Physical System Architecture
2.3 Human-in-the-Loop Cyber-Physical Systems (HiLCPS)
2.4 Machine Learning Applications in CPS
2.5 Use of IoT in CPS
2.6 Use of Big Data in CPS
2.7 Critical Analysis
2.8 Conclusion
References
3 HemoSmart: A Non-Invasive Device and Mobile App for Anemia Detection
3.1 Introduction
3.2 Literature Review
3.3 Methodology
3.4 Results
3.5 Discussion
3.6 Originality and Innovativeness of the Research
3.7 Conclusion
References
4 Advanced Cognitive Models and Algorithms
4.1 Introduction
4.2 Microsoft Azure Cognitive Model
4.3 IBM Watson Cognitive Analytics
4.4 Natural Language Modeling
4.5 Representation of Knowledge Models
4.6 Conclusion
References
5 iParking—Smart Way to Automate the Management of the Parking System for a Smart City
5.1 Introduction
5.2 Background & Literature Review
5.3 Research Gap
5.4 Research Problem
5.5 Objectives
5.6 Methodology
5.7 Testing and Evaluation
5.8 Results
5.9 Discussion
5.10 Conclusion
References
6 Cognitive Cyber-Physical System Applications
6.1 Introduction
6.2 Properties of Cognitive Cyber-Physical System
6.3 Components of Cognitive Cyber-Physical System
6.4 Relationship Between Cyber-Physical System for Human–Robot
6.5 Applications of Cognitive Cyber-Physical System
6.6 Case Study: Road Management System Using CPS
6.7 Conclusion
References
7 Cognitive Computing
7.1 Introduction
7.2 Evolution of Cognitive System
7.3 Cognitive Computing Architecture
7.4 Enabling Technologies in Cognitive Computing
7.5 Applications of Cognitive Computing
7.6 Future of Cognitive Computing
7.7 Conclusion
References
8 Tools Used for Research in Cognitive Engineering and Cyber Physical Systems
8.1 Cyber Physical Systems
8.2 Introduction: The Four Phases of Industrial Revolution
8.3 System
8.4 Autonomous Automobile System
8.5 Robotic System
8.6 Mechatronics
References
9 Role of Recent Technologies in Cognitive Systems
9.1 Introduction
9.2 Natural Language Processing for Cognitive Systems
9.3 Taxonomies and Ontologies of Knowledge Representation for Cognitive Systems
9.4 Support of Cloud Computing for Cognitive Systems
9.5 Cognitive Analytics for Automatic Fraud Detection Using Machine Learning and Fuzzy Systems
9.6 Design of Cognitive System for Healthcare Monitoring in Detecting Diseases
9.7 Advanced High Standard Applications Using Cognitive Computing
9.8 Conclusion
References
10 Quantum Meta-Heuristics and Applications
10.1 Introduction
10.2 What is Quantum Computing?
10.3 Quantum Computing Challenges
10.4 Meta-Heuristics and Quantum Meta-Heuristics Solution Approaches
10.5 Quantum Meta-Heuristics Algorithms With Application Areas
References
11 Ensuring Security and Privacy in IoT for Healthcare Applications
11.1 Introduction
11.2 Need of IoT in Healthcare
11.3 Literature Survey on an IoT-Aware Architecture for Smart Healthcare Systems
11.4 IoT in Healthcare: Challenges and Issues
11.5 Proposed System: 6LoWPAN and COAP Protocol-Based IoT System for Medical Data Transfer by Preserving Privacy of Patient
11.6 Conclusion
References
12 Empowering Secured Outsourcing in Cloud Storage Through Data Integrity Verification
12.1 Introduction
12.2 Literature Survey
12.3 System Design
12.4 Implementation and Result Discussion
12.5 Performance
12.6 Conclusion
References
Index
End User License Agreement
Chapter 1
Figure 1.1 Human-centered cognitive cycle.
Figure 1.2 Intuitive thinking and analysis [11].
Figure 1.3 Showing the evolution of Cognitive Computing [13].
Figure 1.4 Global cognitive market [17].
Figure 1.5 Global cognitive market revenue, by geography [17].
Figure 1.6 The general design of a cognitive system [11].
Figure 1.7 Figure showing the convergence of technologies.
Figure 1.8 Machine learning.
Figure 1.9 Supervised model.
Figure 1.10 Reinforcement learning.
Figure 1.11 Hypotheses generation IBM Watson.
Figure 1.12 Healthcare ecosystem.
Figure 1.13 Welltok training architecture [11].
Figure 1.14 Content acquisition.
Chapter 2
Figure 2.1 Cyber-physical system architecture.
Figure 2.2 Integration of ML with HiLCPS.
Figure 2.3 Machine learning algorithms with CPS.
Figure 2.4 Number of publications for ML, IoT, and Big-Data with year wise.
Figure 2.5 Number of machine learning algorithms proposed with CPS.
Figure 2.6 Distinguish approaches with CPS.
Figure 2.7 Applications of ML with CPS.
Figure 2.8 Applications of ML, IoT and Big data with CPS.
Figure 2.9 K-NN evaluation factors.
Figure 2.10 SVM evaluation factors.
Figure 2.11 RF evaluation factors.
Figure 2.12 DT evaluation factors.
Figure 2.13 MLP evaluation factors.
Figure 2.14 NB evaluation factors.
Figure 2.15 LR evaluation factors.
Figure 2.16 Big Data evaluation factors.
Figure 2.17 IoT evaluation factors.
Chapter 3
Figure 3.1 Conceptual Design of lighting system and camera position.
Figure 3.2 Thumb tip inserting part.
Figure 3.3 Completed prototype.
Figure 3.4 Proposed methodology.
Figure 3.5 Procedure of supervised machine learning.
Figure 3.6 With a green LED.
Figure 3.7 With a yellow LED.
Figure 3.8 With an orange LED.
Figure 3.9 With a blue LED.
Figure 3.10 With a red LED.
Figure 3.11 With a white super bright LED.
Figure 3.12 Flow chart of convolutional neural network.
Chapter 4
Figure 4.1 AI services-based Microsoft Interface.
Figure 4.2 Incorporating AI services into Microsoft Azure.
Figure 4.3 IBM Watson technology.
Figure 4.4 Evolution of computing towards cognitive.
Figure 4.5 NLG/NLU subset of NLP.
Figure 4.6 Natural language processing.
Figure 4.7 Knowledge representation design.
Chapter 5
Figure 5.1 Analyzing efficiency in parking [16].
Figure 5.2 Different parking models presentation [2].
Figure 5.3 Preprocess the input frame and generate detecting patches [13].
Figure 5.4 System diagram.
Figure 5.5 Steps of usability testing.
Chapter 6
Figure 6.1 Components of Smart Factory.
Figure 6.2 Relationship between cyber and physical layer integration.
Figure 6.3 Transport system using CPS components.
Figure 6.4 CPS Industrial interface.
Figure 6.5 ECG Connection Interface using CPS.
Figure 6.6 Clinical infrastructure using CPS.
Figure 6.7 Agriculture interface using CPS.
Figure 6.8 Road management using CPS.
Chapter 7
Figure 7.1 Evolution of cognitive computing [6].
Figure 7.2 The system architecture of cognitive computing.
Figure 7.3 Perceptual and rational method to recognize a square. (a) Rational me...
Chapter 9
Figure 9.1 Layered Architecture of Cognitive Computing [2].
Figure 9.2 Taxonomy of Text Media.
Figure 9.3 Knowledge Representation of data for Tic-Tac-Toe game along with rule...
Figure 9.4(a) Washing Machine Diagnostics & Repair when it doesn’t Drain.
Figure 9.4(b) Washing Machine Diagnostics & Repair when it doesn’t Spin.
Figure 9.4(c) Washing Machine Diagnostics & Repair when Water Flow is Slow.
Chapter 10
Figure 10.1 Bloch Sphere which represents the Q-bit.
Figure 10.2 Basic meta-heuristics solution approach.
Figure 10.3 Quantum meta-heuristics solution approach.
Figure 10.4 Quantum inspired meta-heuristic methods.
Figure 10.5 Basic quantum meta-heuristics genetic evolution approach for image s...
Figure 10.6 Airside specific operations & implementations with quantum meta-heur...
Chapter 11
Figure 11.1 An IoT framework for healthcare monitoring systems [10].
Figure 11.2 Hype cycle for the IoT, 2019. Source: Gartner (July 2019).
Figure 11.3 Architecture of IoT-based secure system for medical data transfer.
Figure 11.4 IoT network architecture for above hospitals [28].
Chapter 12
Figure 12.1 System architecture.
Figure 12.2 Encoding in Reed–Solomon codes.
Figure 12.3 Encoding in regeneration codes.
Figure 12.4 Decoding in Reed–Solomon codes.
Figure 12.5 Decoding in regeneration codes.
Figure 12.6 Repair in Reed–Solomon codes.
Figure 12.7 Repair in regeneration codes.
Figure 12.8 Creating containers for data.
Figure 12.9 Chunking input file.
Figure 12.10 XOR-chunked files.
Figure 12.11 Regeneration of file from select nodes.
Figure 12.12 Installing NC-cloud.
Figure 12.13 Working with NC-cloud.
Figure 12.14 Output of distributed chunks.
Figure 12.15 An input file and temp folders.
Figure 12.16 Executing the file.
Figure 12.17 Time for the upload operation.
Figure 12.18 Time for the repair operation.
Figure 12.19 Time for check operation.
Figure 12.20 Time for download operation.
Chapter 1
Table 1.1 Different types of analytics and their examples [11].
Table 1.2 Examples of cognitive application domains [11].
Table 1.3 Question–response pairs for different types of users.
Table 1.4 Sample questions to train the application [11].
Table 1.5 Sample of Welltok question/answer pairs [11].
Chapter 2
Table 2.1 Various applications in CPS using K-NN.
Table 2.2 Various applications in CPS using SVM.
Table 2.3 Different metrics in RF.
Table 2.4 Various applications in CPS using decision trees.
Table 2.5 Different applications in CPS using linear regression.
Table 2.6 Various applications in CPS using multi-layer perceptron.
Table 2.7 Various applications in CPS using Naïve Bayes.
Table 2.8 Role of IoT in CPS.
Table 2.9 Big data in cyber-physical system.
Chapter 5
Table 5.1 Used tools.
Table 5.2 Test results for parking space detection.
Chapter 9
Table 9.1 Features of Cloud Deployment Models for building Cognitive Systems.
Table 9.2 Concept Mapping Representation of Cloud Service Models.
Table 9.3 Features of Text Analytics Covered by the Top Cloud Service Providers.
Chapter 10
Table 10.1 Quantum computing influenced domain areas with applications and devel...
Table 10.2 Quantum meta-heuristics methods and application area.
Chapter 11
Table 11.1 IoT in healthcare scr [6].
Chapter 12
Table 12.1 High availability and integrity layer.
Table 12.2 Functional minimum storage regenerating.
Scrivener Publishing
100 Cummings Center, Suite 541J
Beverly, MA 01915-6106
Next Generation Computing and Communication Engineering
Series Editors: Dr. G. R. Kanagachidambaresan and Dr. Kolla Bhanu Prakash
Developments in artificial intelligence are made more challenging by the involvement of multi-domain technologies, which create new problems for researchers. To help meet this challenge, this book series concentrates on next generation computing and communication methodologies involving smart and ambient environment design. It is an effective publishing platform for monographs, handbooks, and edited volumes on Industry 4.0, agriculture, smart city development, and new computing and communication paradigms. Although the series mainly focuses on design, it also addresses the analytics and investigation of industry-related real-time problems.
Publishers at Scrivener
Martin Scrivener ([email protected])
Phillip Carmical ([email protected])
Edited by
Kolla Bhanu Prakash,
G. R. Kanagachidambaresan,
V. Srikanth, E. Vamsidhar
This edition first published 2021 by John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA and Scrivener Publishing LLC, 100 Cummings Center, Suite 541J, Beverly, MA 01915, USA
© 2021 Scrivener Publishing LLC
For more information about Scrivener publications please visit www.scrivenerpublishing.com.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
Wiley Global Headquarters
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials, or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read.
Library of Congress Cataloging-in-Publication Data
ISBN 978-1-119-71108-7
Cover image: Pixabay.com
Cover design by Russell Richardson
Set in 11 pt Minion Pro by Manila Typesetting Company, Makati, Philippines
Printed in the USA
10 9 8 7 6 5 4 3 2 1
Dedicated to our parents, family members, students and the Almighty.
Cognitive computing is a hardware and software discipline that is presently used mainly in smart system development. Technologies such as artificial intelligence, machine learning, advanced analytics, natural language processing, big data analytics, and distributed computing come under the umbrella of cognitive computing. The impact of this technology can be seen in areas such as healthcare, business, decision-making, personal lives, and many more. Cognitive engineering is commonly used in analysis, design, decision-making, and sociotechnical systems; and cognitive physical systems are used in applications such as human–robot interaction, transport management, industrial automation, healthcare, agriculture, etc. Human individual interactions and group behavior are important to all these applications. Cognitive cyber-physical systems are applied in areas such as smart manufacturing, agriculture, education, energy management, security, environmental monitoring, transportation systems, process control, smart cities and homes, medical healthcare devices, etc.
The increasing complexity of cognitive computing also brings with it the security problems confronted by such networks. The rise in Internet of Things (IoT) network complexity is due to the very large number of devices interconnected through the internet, along with the enormous amount of data originating from these devices. Novel security issues arise as the IoT develops, while conventional security issues become more severe; the major reasons for this are the heterogeneity and the substantially large scale of the connected objects. Because threats to IoT devices are increasing, and security depends on how both the software and the network are developed, hackers can expand their control and carry out malicious activities and attacks on other devices close to a compromised one. Owing to their low-power and low-memory nature, these devices typically do not run malware or virus protection software.
The cognitive approach to the IoT provides connectivity to everyone and everything, since the number of IoT-connected devices is known to be increasing rapidly. When the IoT is integrated with cognitive technology, performance is improved and smart intelligence is obtained. Different types of datasets with structured content are discussed in this book based on cognitive systems. The IoT gathers information from real-time datasets through the internet, where the IoT network connects with multiple devices.
This book mainly concentrates on providing the best solutions to existing real-time issues in the cognitive domain. Healthcare-based, cloud-based and smart transportation-based applications in the cognitive domain are addressed. The data integrity and security aspects of the cognitive computing domain are also thoroughly discussed along with validated results.
Editors
Kolla Bhanu Prakash
G. R. Kanagachidambaresan
V. Srikanth
E. Vamsidhar
We would like to thank the Almighty and our parents for their endless support, guidance and love throughout all the stages of our lives. We are grateful to our beloved family members for standing beside us throughout our careers, which have been advanced by the editing of this book.
We would especially like to thank Sri. Koneru Satyanarayana, president of K.L. University, India, and Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology for their continuous support and encouragement throughout the preparation of this book. We dedicate this book to them.
Many thanks go to our students and family members who have put in their time and effort to support and contribute in some manner. We would like to express our gratitude to all who supported, shared, talked things over, read, wrote, offered comments, allowed us to quote their remarks and assisted in the editing, proofreading and design of this book throughout the journey to its completion. We also give our sincere thanks to the open dataset providers.
We believe that the team of authors provided the perfect blend of knowledge and skills that went into authoring this book. We thank each of the authors for devoting their time, patience, perseverance and effort towards this book; we think that it will be a great asset to all researchers in this field!
We are grateful to Martin Scrivener and all other members of the publishing team, who showed us the ropes of creating a book. Without that knowledge we would not have ventured into such a project. Their trust in us, their guidance, and the time and resources they afforded us gave us the freedom to manage this book.
Last, but definitely not least, we’d like to thank our readers, and we hope our work inspires and guides them.
Editors
Kolla Bhanu Prakash
G. R. Kanagachidambaresan
V. Srikanth
E. Vamsidhar
Vamsidhar Enireddy*, Sagar Imambi† and C. Karthikeyan‡
Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Guntur, India
Abstract
Cognitive computing is an interdisciplinary subject that brings under its umbrella several techniques, such as machine learning, big data analytics, artificial intelligence, analytics, natural language processing, and probability and statistics, to gather information through different senses, understand it, and learn from experience. Cognitive computing helps humans make the right decisions at the right time, helping people grow in their respective fields. In this chapter, we discuss cognitive computing and the elements involved in it. We then examine its components, along with hypothesis generation and scoring.
Keywords: Artificial intelligence, cognition, cognitive computing, corpus, intuitive thinking, hypothesis generation, machine learning
The term cognition is defined as “the procedure or the method of acquiring knowledge and understanding through experience, thought and the senses” [1]. It encompasses many mental processes and functions, for example, the formation of knowledge, thinking, reasoning, attention, decision making, evaluation of decisions, problem solving, computation, judgment and assessment, critical thinking, conception, and the creation of language. These processes produce new knowledge from existing knowledge. A large number of fields, notably psychology, neuroscience, biology, philosophy, psychiatry, linguistics, logic, education, anesthesia, and computer science, view and analyze cognitive processes from different perspectives and within different contexts [2].
The word cognition dates to the 15th century, derived from a Latin word meaning “thinking and awareness” [3]. The term comes from cognitio, which means “examination, learning or knowledge”, derived from the verb cognosco, a compound of con (‘with’) and gnōscō (‘know’). The latter half, gnōscō, is itself a cognate of the Greek verb gi(g)nόsko (γι(γ)νώσκω, ‘I know’ or ‘perceive’) [4, 5].
Aristotle was probably the first person to show interest in studying the working of the mind and its effect on his experience. Memory, mental imagery, observation, and awareness are major areas of cognition, so Aristotle also took a keen interest in their study. He placed great importance on ensuring that his investigations rested on empirical evidence, that is, scientific data gathered through observation and principled experimentation [6]. Roughly two millennia later, the basis for modern ideas of cognition was laid during the Enlightenment by scholars such as John Locke and Dugald Stewart, who tried to build a model of the mind in which ideas were acquired, remembered, and manipulated [7].
As stated in the Stanford Encyclopedia of Philosophy, “Cognitive science is the interdisciplinary study of mind and intelligence, embracing philosophy, psychology, artificial intelligence, neuroscience, linguistics, and anthropology.”
The approach to cognitive computing depends on understanding how the human brain processes information. The central idea of a cognitive system is that it must be able to serve as an associate for humans rather than simply imitate the capabilities of the human brain.
Cognitive computing can be defined as hardware and software that learn, so that they need not be reprogrammed, and that automate cognitive tasks [11]. This technology brings under its cover many different technologies, such as artificial intelligence, machine learning, advanced analytics, natural language processing, big data analytics, and distributed computing. The impact of this technology can be seen in healthcare, business, decision making, private life, and many more areas.
Cognitive computing brings together two disciplines:
Cognitive Science
Computer Science.
Cognitive science is the science of the mind; computer science provides the computational approach through which the theory is put into practice.
The ultimate objective of cognitive computing is to replicate the human thinking ability in a computer model. Technologies such as machine learning, natural language processing, advanced analytics, data mining, and statistics have made it possible to mimic the working of the human brain [8].
For a long time we have been able to build computers that perform calculations at high speed, and even supercomputers that complete calculations in a fraction of a second, but they have not been able to perform tasks that humans do naturally, such as reasoning, understanding, and recognizing objects and images.
Cognitive researchers investigate the mental capabilities of humans through examination of aspects such as memory, emotion, reasoning, perception, and language [12]. Figure 1.1 shows the human-centered cognitive cycle. On analysis, a human being’s cognitive process can be divided into two stages. In the first, humans use their sensory organs to perceive information about their surrounding environment and become aware of it; in this way they gather input from the outside world. In the second stage, this information is carried by the nerves to the brain for processing, where storing, analyzing, and learning take place [13].
Figure 1.1 Human-centered cognitive cycle.
For many years, researchers and scientists have tried to develop systems that can mimic human thought and processes, but it is relatively complex to translate the intricacy of human thinking and action into systems. Human beings are influenced by many factors, such as perception, culture, sentiment, lifestyle, and implicit beliefs about their surrounding environment. Cognition is the basic framework that shapes not only the way we think but also the way we behave and the way we make decisions. To understand this, consider some examples we see around us. Why do different doctors recommend different approaches and treatments for the same disease? Why do people with the same background, born and brought up in the same family, have different views and opinions about the world?
Dr. Daniel Kahneman, winner of the 2002 Nobel Prize in economic sciences, paved the way for the cognitive computing approach. He carried out extensive research in the psychology of judgment and decision making [11]. His approach is divided into two systems: 1. intuitive thinking and 2. controlled, rule-centric thinking.
System 1: Intuitive thinking. In this system, reasoning occurs in the human brain naturally; conclusions are drawn from instinct. System 1 thinking begins the moment humans are born: they learn to notice and recognize things and their relationships by themselves. To illustrate this, consider some examples. Children associate their parents’ voices with safety. People associate strident sounds with danger. At the same time, a child with a harsh mother will not have the same experience of the mother’s voice as a child with a kind mother. Humans learn more over time and continue to assimilate these experiences into the way they operate in the world. A chess grandmaster can anticipate an opponent’s moves and play an entire game in the mind without ever touching the chessboard. The surrounding environment also plays a major role in a person’s behavior, affecting emotions and attitudes: a person brought up in treacherous surroundings has a different attitude toward people than a person brought up in healthy surroundings. In System 1, using perception, we gather data about the world and connect events. From the cognitive computing point of view, System 1 shows how gathering information from our surroundings helps us reach conclusions. Figure 1.2 shows the collaboration between intuitive thinking and analysis.
System 2: Controlled and rule-centric thinking. In this process, reasoning follows a more deliberate procedure. Conclusions are drawn by taking into account both observations and tested assumptions, rather than simply what is already understood. To reach a conclusion, this type of thinking builds a simulation model and observes the results for a particular statement; a lot of data is required, and a model is built to test the perceptions made by System 1. Consider the treatment of cancer patients, for which a large number of approaches and drugs are available. Cancer drugs kill not only cancer cells but also healthy cells, causing side effects for the patient. When a drug company develops a novel drug, it first tests it on animals, records the results, and then tests it on humans. Only after a long verification of the data, checking the drug’s side effects on other parts of the body, does the government permit the drug’s release onto the market; the path from research to availability therefore takes a long time. Under System 1, the mere fact that a drug can destroy cancer cells would be enough to conclude that it should be put on the market; that judgment is biased. System 2 does not jump to a conclusion as System 1 does: it collects data from various sources, refines it, and only then concludes. Although this process is slow, it is important to study everything before reaching a conclusion. Predicting outcomes is one of the most complex problems, as many factors can affect them, so it is very important to merge spontaneous thinking with computational models.
Figure 1.2 Intuitive thinking and analysis [11].
The cognitive system is based on three important principles:
Learn
Model
Hypothesis generation.
Learn: The cognitive framework must be able to learn. It uses information to make inferences about a domain, a topic, a person, or an issue based on training and observations from all varieties, volumes, and velocities of data.
Model: To learn, the framework needs to create a model or representation of a domain, which incorporates internal and possibly external data and the assumptions that determine which learning algorithms are used. Understanding how the data fits into the model is critical to a cognitive framework.
Generate hypotheses: A cognitive framework assumes that there may be several solutions or answers to a question; the most fitting answer depends on the data itself. In this sense, a cognitive framework is probabilistic. A hypothesis is a candidate explanation for a portion of the data already understood. A cognitive framework uses the data to train, test, and score hypotheses.
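To make these three principles concrete, here is a minimal Python sketch (not from the chapter; the candidate diagnoses and evidence counts are invented for illustration) of how a system might generate several hypotheses and score each one probabilistically, returning them ranked by confidence.

# Minimal sketch of hypothesis generation and scoring in a cognitive system.
# The candidate hypotheses and their supporting/contradicting evidence counts
# are invented for illustration.

def score_hypothesis(supporting: int, contradicting: int) -> float:
    """Return a simple probabilistic confidence in [0, 1]."""
    total = supporting + contradicting
    return supporting / total if total else 0.0

def rank_hypotheses(evidence: dict[str, tuple[int, int]]) -> list[tuple[str, float]]:
    """Score every candidate answer and rank the answers by confidence."""
    scored = [(h, score_hypothesis(s, c)) for h, (s, c) in evidence.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    # Hypothetical evidence gathered from the corpus: (supporting, contradicting)
    evidence = {
        "Diagnosis A": (18, 4),
        "Diagnosis B": (9, 9),
        "Diagnosis C": (2, 15),
    }
    for hypothesis, confidence in rank_hypotheses(evidence):
        print(f"{hypothesis}: confidence {confidence:.2f}")

In a real system the scores would come from trained models rather than simple counts, but the ranked, confidence-weighted output is the same basic idea.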
The basis for cognitive computing is artificial intelligence. Artificial intelligence has roots going back at least 300 years, but the last 50 years have seen the research and improvements in this field that most directly shaped the development of cognitive computing. Mathematicians and scientists have worked together to convert the workings of the brain into models that mimic it, but making such models work and think like a human brain has taken a long time. During World War II, England achieved victory partly by decoding the opponent’s messages, thanks to the great work of Alan Turing on cryptography. Turing later worked on machine learning and published the paper “Computing Machinery and Intelligence”, in which he posed the question “Can machines think?”. He firmly believed that machines could think, and he rejected the argument that machines cannot think because they lack human emotions. In later years he proposed the famous Turing test to judge whether machines can think as human beings do. Since then, many scientists have contributed to the development of what can be termed modern artificial intelligence. Cognitive computing is still evolving; Figure 1.3 shows how the evolution of cognitive computing has taken place over the years.
The main focus of cognitive computing is on processing methods; the data to be processed need not be big. One of the most important clues to understanding the working of the brain is how it decodes images: it is well known that about 20% of the brain’s function is devoted to vision, and the brain is highly efficient at image processing. The brain can work with limited data, and even limited memory does not impair its cognition of image information. Cognitive science helps to develop the algorithms required for cognitive computing, making machines function like a human brain to some degree [14]. The only way to build computers that compute like a human brain is to understand and cognize things and surroundings from the perspective of how a human brain thinks. Cognitive computing is therefore critical to building machine cognition and thereby making machines understand human requirements [15]. Machines need to think like humans, make decisions, and possess some human-like intelligence; of course, much improvement is still needed in this field. With present techniques it is becoming possible to make machines approximate human thinking, since they involve reasoning and understanding complicated emotions. Cognitive computing has made tremendous progress and has also exceeded conventional machine learning. The Internet of Things is one technology that has made very good progress in helping people in many ways, and IoT is now being embedded with cognitive computing to develop smarter Internet of Things systems that assist humans, for example by providing vital suggestions and helping in decision making [16].
Figure 1.3 Showing the evolution of Cognitive Computing [13].
In today’s world, with sensors all around us, a large amount of data is being generated all the time in many forms. Cognitive computing evolved to make sense of this multifaceted world and this large volume of data. Older technologies were developed to make sense of structured data, and machines and software were built to deal with that type of data and to gather information from it. The growth of social sites and apps has driven growth in unstructured and semi-structured data, which these older technologies can no longer handle; cognitive computing helps to gather information from all types of data, whether unstructured, semi-structured, or structured. Without handling these different types of data, a lot of information is missed, and cognitive computing helps humans collaborate with machines so that maximum value can be extracted from them. Over past decades, technology has transformed industries and the human way of living. Transactional processing, which started in the 1950s, brought a great transformation in government operations and business transactions, providing far more efficient ways to handle operations. At that time the data were limited and mostly structured, so tools were developed to handle that type of data and many mining tools were developed to extract information from it. Today’s very large volumes of data cannot be handled by traditional tools and methods, so we need a mixture of traditional technical models and new innovations to solve these nagging problems.
Although the foundation for cognitive computing is artificial intelligence, there are significant differences between the two.
The basic aim of artificial intelligence is to solve a problem by implementing the best algorithm, whereas cognitive computing adds reasoning and intelligence to the machine and analyzes different factors to solve the problem.
Artificial intelligence mimics human intelligence in machines. This involves making machines learn constantly from changing data, make sense of the information, and take decisions, including self-corrections whenever needed.
Human beings use their senses to gather information about the surrounding environment and process that information with the brain in order to understand it. In this context, artificial intelligence also includes replicating the human senses of hearing, smelling, touching, seeing, and tasting. It further includes simulating the learning process, which is made possible in machines through machine learning and deep learning. Last but not least, human responses are achieved through robotics [18].
Cognitive computing is used to understand and simulate reasoning and human behavior. It assists humans in making better decisions in their respective fields. Applications include fraud detection, face and emotion detection, sentiment analysis, risk analysis, and speech recognition [17].
The main focus of cognitive computing includes:
To solve complex problems by mimicking human behavior and reasoning.
Trying to replicate the way humans solve problems.
Assisting humans in taking decisions, rather than replacing them.
The focus of artificial intelligence includes:
Augmenting human thinking to solve complex problems and trying to provide accurate results.
Finding new methods of solving problems that can potentially be superior to human approaches.
Solving the problem using the best algorithm, rather than simply mimicking the human brain.
Minimizing the human role in decision making, with artificial intelligence taking over the responsibility.
The main advantage to highlight is that cognitive computing does not pose any threat to humans. It helps human beings make better decisions in their tasks, endowing them with high precision in analyzing things while keeping everything under their control. In the healthcare system, for example, cognitive computing assists specialists in diagnosing disease using data and advanced analytics, thereby helping doctors take quality decisions [10]. Figure 1.4 shows the growth of cognitive computing across various continents, and Figure 1.5 shows the growth of revenue in various regions of the world.
Figure 1.4 Global cognitive market [17].
Figure 1.5 Global cognitive market revenue, by geography [17].
Several different elements constitute a cognitive system, ranging from hardware and operational models to modern machine learning algorithms and applications. Figure 1.6 gives a general design for building a cognitive system.
The system needs to meet the demands of industries as they continuously grow, and the infrastructure should be flexible enough to run the applications the industry requires. A large amount of data, consisting of both public and private data, must be processed and managed. Cloud infrastructure services are required, with constant support, to provide a highly parallel and distributed computing environment.
Figure 1.6 The general design of a cognitive system [11].
Data is the most important element around which cognitive computing revolves, so data collection, access, and maintenance play a very important role. A number of essential services are required both for adding data and for using it. When ingesting data, the utmost care should be taken to check the source from which the data originated. Data should therefore be classified based on its origin, since it is necessary to check whether the data source is trusted. Another important point is that the data is not static: it must be updated from the sources and uploaded into the system. The corpus is what holds the data, and it relies on various internal and external sources. Because a large amount of data is available, data sources should be checked and the data verified, cleaned, and checked for accuracy before it is added to the corpus. This is a mammoth task, requiring many management services to prepare the data.
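As a rough illustration of checking data origin before ingestion, the following Python sketch (the source names and trusted list are assumptions for the example, not the book's implementation) admits a record into the corpus only when it comes from an approved source and tags it with its origin and ingestion time.

# Sketch of a simple ingestion gate: classify each record by its source and
# admit it to the corpus only if the source is trusted. Source names and the
# trusted list are assumptions made for the example.
from datetime import datetime, timezone

TRUSTED_SOURCES = {"hospital_emr", "peer_reviewed_journal", "internal_reports"}

corpus: list[dict] = []

def ingest(record: dict, source: str) -> bool:
    """Add a record to the corpus if its source is trusted; return success."""
    if source not in TRUSTED_SOURCES:
        return False  # untrusted origin: reject or route to manual review
    corpus.append({
        "source": source,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "payload": record,
    })
    return True

ingest({"patient_id": 101, "note": "follow-up visit"}, "hospital_emr")  # accepted
ingest({"claim": "miracle cure"}, "anonymous_blog")                     # rejected
print(len(corpus), "record(s) in the corpus")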
Firmly connected with the data access and management layer are the corpus and data analytics services. A corpus is the knowledge base of ingested data and is used to manage codified information. The knowledge required to establish the domain for the system is incorporated into the corpus. Different types of data are ingested into the system. In many cognitive systems, this data will primarily be text based (documents, patient data, textbooks, customer reports, and the like). Other cognitive systems incorporate many types of unstructured and semi-structured data (for example, videos, images, sensor data, and sounds). In addition, the corpus may include ontologies that define specific entities and their relationships. Ontologies are often developed by industry groups to classify industry-specific elements such as standard chemical compounds, machine parts, or clinical diseases and treatments. In a cognitive system, it is often necessary to use a subset of an industry-based ontology so as to include only the knowledge that relates to the focus of the system. A taxonomy works hand in hand with an ontology, providing context within the ontology.
These are the methods used to increase comprehension of the information ingested and managed within the corpus. Typically, users can take advantage of the structured, unstructured, and semi-structured data that has been ingested and begin to use sophisticated algorithms to predict outcomes, discover patterns, or determine the next best actions. These services do not live in isolation: they continuously access new data from the data access layer and pull data from the corpus. Various advanced algorithms are applied to develop the model for the cognitive system.
Machine learning is a technique that gives systems the ability to learn from data without being explicitly programmed. Cognitive systems are dynamic: their models are continuously updated based on new data, analysis, and interactions. This process has two key components: hypothesis generation and hypothesis evaluation.
A typical cognitive system uses machine learning algorithms to build a framework for answering questions or delivering insight. The framework needs to support the following characteristics:
Access, manage, and evaluate data in context.
Generate and score multiple hypotheses based on the system’s accumulated knowledge. The system may produce several potential solutions to each problem it addresses and deliver answers and insights with associated confidence levels.
Continuously update the model based on user interactions and new data. A cognitive system gets smarter over time in an automated way.
The system has an internal store of data (the corpus) and also communicates with the external environment to capture additional data and, potentially, to refresh external systems. Cognitive systems may use NLP to understand text, but they additionally need other processing and deep learning capabilities and tools to comprehend images, voice, video, and location. These processing capabilities give the cognitive system a path to understanding data in context and making sense of a specific domain. The cognitive system generates hypotheses and furnishes alternative answers or insights with associated confidence levels. A cognitive system should also be capable of deep learning that is specific to particular fields of knowledge and industries. The operating pattern of a cognitive system is an iterative procedure, one that requires combining human best practices with training the system on the available data.
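The idea that the system "gets smarter over time" from user interactions can be illustrated with a small sketch; the learning rate and the feedback values below are assumptions chosen for the example, not part of the chapter.

# Sketch of refreshing a hypothesis's confidence from user feedback, so that
# the system improves over time. The learning rate and feedback values are
# illustrative assumptions.

def update_confidence(confidence: float, feedback: float, learning_rate: float = 0.2) -> float:
    """Move the stored confidence toward the observed feedback (1 = confirmed, 0 = rejected)."""
    return confidence + learning_rate * (feedback - confidence)

confidence = 0.60
for feedback in (1.0, 1.0, 0.0, 1.0):  # hypothetical user confirmations/rejections
    confidence = update_confidence(confidence, feedback)
    print(f"confidence -> {confidence:.2f}")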
A corpus can be defined as a machine-readable representation of the complete record of a particular domain or topic. Specialists in a variety of fields use a corpus, or corpora, for tasks such as linguistic analysis, to study writing styles, or even to determine the authenticity of a particular work.
The data to be added to the corpus comes in different types: structured, unstructured, and semi-structured. This is what distinguishes a corpus from an ordinary database. Structured data has a regular format, such as rows and columns. Semi-structured data is raw data with some structure, such as XML or JSON. Unstructured data includes images, videos, logs, and so on. All these types of data are included in the corpus. A further challenge is that the data needs to be updated from time to time, and all information added to the corpus must be verified carefully before it is ingested.
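A minimal sketch of how one corpus can hold all three kinds of data is shown below; the file contents and field names are invented, and a real system would attach far richer metadata.

# Sketch of storing the three kinds of data in one corpus: structured rows,
# semi-structured JSON/XML, and unstructured binary blobs. Contents are
# invented for illustration.
import json
import xml.etree.ElementTree as ET

corpus: list[dict] = []

def add_structured(rows: list[dict]) -> None:
    corpus.append({"kind": "structured", "content": rows})

def add_semi_structured(text: str, fmt: str) -> None:
    parsed = json.loads(text) if fmt == "json" else ET.fromstring(text)
    corpus.append({"kind": "semi-structured", "format": fmt, "content": parsed})

def add_unstructured(blob: bytes, media_type: str) -> None:
    corpus.append({"kind": "unstructured", "media_type": media_type, "content": blob})

add_structured([{"id": 1, "diagnosis": "anemia"}])
add_semi_structured('{"device": "sensor-7", "reading": 98.4}', "json")
add_semi_structured("<note><to>clinic</to></note>", "xml")
add_unstructured(b"\x89PNG...", "image/png")
print([entry["kind"] for entry in corpus])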
In an application, the corpus represents the body of knowledge the system can use to answer questions, find new patterns or relationships, and deliver new insights. Before the system is launched, however, a base corpus must be created and the data ingested. The contents of this base corpus constrain the kinds of problems that can be solved, and the organization of data within the corpus significantly affects the efficiency of the system. The domain for the cognitive system must therefore be chosen first, and then the necessary data sources can be collected to build the corpus. Many issues arise in building the corpus:
What kinds of problems would you like to solve? If the corpus is too narrowly defined, you may miss new and unexpected insights.
If data is trimmed from external resources before ingesting it into the corpus, it cannot be used in the scoring of hypotheses, which is the foundation of machine learning.
The corpus needs to incorporate the right blend of relevant data sources to enable the cognitive system to deliver accurate responses in reasonable time. When building a cognitive system, it is a smart idea to err on the side of gathering more data and knowledge, because you never know when the discovery of an unexpected association will lead to significant new insight.
Given the importance placed on obtaining the right blend of data sources, several questions must be addressed early in the planning stage for the system:
Which internal and external data sources are required for the particular domain areas and problems to be solved? Will external data sources be ingested in whole or in part?
How can you streamline the organization of data for efficient exploration and analysis?
How can you integrate data across multiple corpora?
How can you ensure that the corpus is expanded to fill knowledge gaps in your base corpus? How might you determine which data sources need to be refreshed, and at what frequency?
The most critical point is the decision of which sources to include in the initial corpus. Sources ranging from medical journals to Wikipedia may then be efficiently imported in preparation for the launch of the cognitive system. It is also important that unstructured data is ingested from recordings, pictures, voice, and sensors. These sources are ingested at the data access layer (refer to the general design in Figure 1.6). Other data sources may also include subject-specific structured databases, ontologies, taxonomies, and catalogs.
If the cognitive computing application expects access to highly structured data created by or stored in other systems, for example public or proprietary databases, another design consideration is how much of that data to import initially. It is also essential to decide whether to refresh the data periodically, continuously, or in response to a request from the system when it recognizes that more data can help it give better answers.
During the design phase of a cognitive system, a key consideration is whether to build a taxonomy or ontology if none already exists for the specific domain. These types of structures not only streamline the operation of the system but also make it more productive. However, if the designers are responsible for guaranteeing that an ontology and taxonomy are complete and fully updated, it may be more effective to have the system continuously evaluate relationships between domain elements rather than have the designers encode them in a hard-coded structure. The performance of hypothesis generation and scoring depends heavily on the data structures chosen for the system, so it is prudent to model or simulate typical workloads during the planning stage before committing to specific structures. A data catalog, which includes metadata such as semantic information or pointers, may be used to manage the underlying data more efficiently; the catalog, as an abstraction, is more compact and generally faster to manipulate than the much larger database it represents. When referring to corpora in the models and illustrations, note that they can be consolidated into a single corpus when doing so simplifies the logic of the system or improves performance. Much as a system can be described as a collection of smaller integrated systems, aggregating data from a collection of corpora results in a single new corpus. Maintaining separate corpora is usually done for performance reasons, much like normalizing tables in a database to facilitate queries rather than attempting to join tables into a single, more complex structure.
Data sources, and the movement of that data, are increasingly becoming heavily regulated, particularly for personally identifiable information. Some general issues of data policy for protection, security, and compliance are common to all applications; however, cognitive computing applications learn and derive new data or knowledge that may also be subject to a growing body of state, federal, and international legislation.
When the initial corpus is created, it is likely that a lot of data will be imported using extract–transform–load (ETL) tools. These tools may have risk management, security, and regulatory features to help the user guard against data misuse or give guidance when sources are known to contain sensitive information. The availability of such tools does not relieve developers of the responsibility to ensure that the data and metadata comply with applicable rules and regulations. Protected data may be ingested (for instance, personal identifiers) or generated (for instance, clinical findings) when the corpus is refreshed by the cognitive computing system. Planning for good corpus management should include a plan to monitor relevant policies that affect data in the corpus. The data access layer tools described in the following section must be accompanied by, or embed, compliance policies and procedures to guarantee that imported and derived data and metadata remain compliant. That includes consideration of different deployment modalities, such as cloud computing, which may distribute data across geopolitical boundaries.
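As a hedged illustration of the kind of transform step such ETL tooling performs, the sketch below masks obvious personal identifiers in text fields before a record reaches the corpus; the regular expressions and field names are assumptions and are in no way a complete compliance solution.

# Sketch of an extract-transform-load step that masks obvious personal
# identifiers before records reach the corpus. The patterns and field names
# are illustrative only.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def transform(record: dict) -> dict:
    """Return a copy of the record with identifiers masked in text fields."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = EMAIL.sub("[EMAIL]", value)
            value = PHONE.sub("[PHONE]", value)
        cleaned[key] = value
    return cleaned

raw = {"note": "Contact the patient at jdoe@example.org or 555-123-4567 about the scan."}
print(transform(raw))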
In contrast to many conventional systems, the data added to the corpus is always dynamic, meaning it must be kept up to date. You need to build a base of knowledge that sufficiently characterizes your domain and begin filling this knowledge base with the data you anticipate will be significant. As you develop the model in the cognitive system, you refine the corpus; along the way you will continually add to the data sources, change those data sources, and refine and cleanse them based on model development and continuous learning.
Most organizations already manage huge volumes of structured data from their transactional systems and business applications, as well as unstructured data such as the text contained in forms or notes and possibly images from documents or corporate video sources. Although some firms are writing applications to monitor external sources such as news and social media channels, many IT organizations are not yet well prepared to use these sources and integrate them with internal data. Most cognitive computing systems will be developed for domains that require continuous access to integrated data from outside the organization.
A person learns to identify the right sources to support a statement or decision, relying on social media, news channels, newspapers, and various web resources. Similarly, a cognitive application generally needs access to an assortment of reliable sources to stay up to date on the topic in which it operates. Likewise, just as experts must weigh news or data from external sources against their own understanding, a cognitive system must learn to weigh external evidence and develop trust in a source, and in its content, over time. For instance, an article related to medicine in a popular magazine can be a good source of information, but if it contradicts an article published in a peer-reviewed journal, the cognitive system must be able to weigh the contradicting positions. The data to be ingested into the corpus must be verified carefully. As the example shows, all data sources that might be helpful should be considered and possibly ingested; on the other hand, this does not mean that all sources will be of equal worth.
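One simple way to model this weighing of contradictory evidence is to assign each source a reliability weight, as in the sketch below; the sources and weights are assumptions made for the example, not values from the chapter.

# Sketch of weighing contradictory evidence by source reliability. The
# reliability weights (e.g., peer-reviewed journal vs. popular magazine) are
# assumptions chosen for the example.

SOURCE_RELIABILITY = {"peer_reviewed_journal": 0.9, "popular_magazine": 0.4}

def weighted_belief(claims: list[tuple[str, bool]]) -> float:
    """Combine (source, supports_claim) pairs into a belief score in [0, 1]."""
    weight_for = sum(SOURCE_RELIABILITY[s] for s, supports in claims if supports)
    weight_total = sum(SOURCE_RELIABILITY[s] for s, _ in claims)
    return weight_for / weight_total if weight_total else 0.5

claims = [("peer_reviewed_journal", False), ("popular_magazine", True)]
print(f"belief that the drug claim holds: {weighted_belief(claims):.2f}")

In practice the weights themselves would be learned and adjusted over time as the system observes which sources prove trustworthy.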
Consider the case of healthcare, where an average person sees several doctors or specialists for a health issue. A large number of records is generated each time the patient meets a doctor, so Electronic Medical Records (EMRs) help to place all the records in one place, making them easy to consult whenever required and allowing doctors to map the history easily. This helps specialists find associations between combinations of symptoms and disorders or infections that would be missed if a specialist or scientist had access only to the records from their own practice or institution. This cannot be done manually, as a patient may miss or forget to carry all the records when meeting the doctor.
A communications organization using the cognitive approach may want to improve its performance to capture or increase market share. The cognitive system can foresee failures in its equipment by evaluating internal variables, for example traffic and seasonal patterns, as well as external factors, for example extreme weather threats that are likely to cause overloads and substantial damage.
In the diagram, the data access level represents the principal interface connecting the cognitive system and the external world. Any information that needs to be imported from outside sources passes through the procedures in this layer. All the structured, semi-structured, and unstructured data required for the cognitive application is collected from different resources, and this information is prepared for processing by the machine learning algorithms. By analogy with human learning, this layer represents the senses. The feature extraction layer has two tasks to complete: identifying the significant information, and extracting it so that it can be processed by the machine learning algorithms. Consider, for instance, an image processing application, where the image is represented in pixels that do not by themselves represent an object in the image. Things need to be represented in a meaningful manner, as in the case of medical images, where a scan is not useful to the doctor until the essential structure is captured, identified, and represented. Using natural language processing, the meaning in unstructured text can be identified. Because the corpus is dynamic, data is constantly added to or removed from it based on the hypothesis scores.
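To illustrate the feature extraction step on unstructured text, here is a minimal bag-of-words sketch; the sample notes are invented, and a production system would use far more sophisticated natural language processing.

# Sketch of the feature-extraction step: turn free text into simple numeric
# features (word counts) that a machine learning algorithm can consume. The
# sample notes are invented.
from collections import Counter
import re

def extract_features(text: str) -> Counter:
    """Lower-case, tokenize, and count words as a crude bag-of-words vector."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(tokens)

notes = [
    "Patient reports fatigue and pale skin; possible anemia.",
    "Network traffic spike observed before the outage.",
]
for note in notes:
    print(extract_features(note).most_common(3))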
The term analytics refers to an assortment of techniques used to discover and report on fundamental characteristics or relationships within a dataset. These techniques are very helpful in providing knowledge about data so that good decisions can be made based on the insights. Algorithms such as regression analysis are among the most widely used for finding solutions. In cognitive systems, a wide scope of sophisticated analytics is available for descriptive, predictive, and prescriptive tasks in many commercial library packages of statistical software, and a large number of supporting tools are available for other cognitive system tasks. The role of analytics in the market has changed considerably in recent times: Table 1.1 gives a view of the analytics roles that many organizations are experiencing. These analytics help to learn and understand things from the past and thereby predict future outcomes. Most of the data collected from the past is used by business analysts and data scientists to arrive at good predictions. What matters most today is that technology is reaching people at all levels across the whole world, and the world itself has become a small global village thanks to information technology; organizations should therefore recognize that there are many dynamic changes in the behavior and tastes of people. Using advanced analytics, it is necessary to build better predictive models so that they can react to any small change in the trading environment.
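As a small example of a predictive analytic of the kind described here, the sketch below fits a least-squares regression line to made-up historical demand figures and projects the next value; the numbers are invented for illustration.

# Sketch of a simple predictive analytic: fit a least-squares line to past
# monthly demand and project the next month. The historical figures are
# made up for illustration.

def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [1, 2, 3, 4, 5, 6]
demand = [120, 135, 128, 150, 162, 170]  # hypothetical historical values
slope, intercept = fit_line(months, demand)
print(f"forecast for month 7: {slope * 7 + intercept:.0f}")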
Figure 1.7 Figure showing the convergence of technologies.
