COGNITIVE INTELLIGENCE AND BIG DATA IN HEALTHCARE
Applications of cognitive intelligence, advanced communication, and computational methods can drive healthcare research and enhance existing traditional methods of disease detection, management, and prevention.
As health is the foremost factor affecting the quality of human life, it is necessary to understand how the human body is functioning by processing health data obtained from various sources more quickly. Since an enormous amount of data is generated during data processing, a cognitive computing system could be applied to respond to queries, thereby assisting in customizing intelligent recommendations. This decision-making process could be improved by the deployment of cognitive computing techniques in healthcare, allowing for cutting-edge techniques to be integrated into healthcare to provide intelligent services in various healthcare applications.
This book tackles these issues, providing insight into diversified topics in the healthcare sector, showing the range of recent innovative research, and shedding light on future directions in this area.
Audience
The book will be very useful to a wide range of specialists including researchers, engineers, and postgraduate students in artificial intelligence, bioinformatics, information technology, as well as those in biomedicine.
Cover
Title Page
Copyright
Preface
1 Era of Computational Cognitive Techniques in Healthcare Systems
1.1 Introduction
1.2 Cognitive Science
1.3 Gap Between Classical Theory of Cognition
1.4 Cognitive Computing’s Evolution
1.5 The Coming Era of Cognitive Computing
1.6 Cognitive Computing Architecture
1.7 Enabling Technologies in Cognitive Computing
1.8 Intelligent Systems in Healthcare
1.9 The Cognitive Challenge
1.10 Conclusion
References
2 Proposal of a Metaheuristic Algorithm of Cognitive Computing for Classification of Erythrocytes and Leukocytes in Healthcare Informatics
2.1 Introduction
2.2 Literature Concept
2.3 Materials and Methods (Metaheuristic Algorithm Proposal)
2.4 Case Study and Discussion
2.5 Conclusions with Future Research Scopes
References
3 Convergence of Big Data and Cognitive Computing in Healthcare
3.1 Introduction
3.2 Literature Review
3.3 Using Cognitive Computing and Big Data, a Smart Healthcare Framework for EEG Pathology Detection and Classification
3.4 An Approach to Predict Heart Disease Using Integrated Big Data and Cognitive Computing in Cloud
3.5 Conclusion
References
4 IoT for Health, Safety, Well-Being, Inclusion, and Active Aging
4.1 Introduction
4.2 The Role of Technology in an Aging Society
4.3 Literature Survey
4.4 Health Monitoring
4.5 Nutrition Monitoring
4.6 Stress-Log: An IoT-Based Smart Monitoring System
4.7 Active Aging
4.8 Localization
4.9 Navigation Care
4.10 Fall Monitoring
4.11 Conclusion
References
5 Influence of Cognitive Computing in Healthcare Applications
5.1 Introduction
5.2 Bond Between Big Data and Cognitive Computing
5.3 Need for Cognitive Computing in Healthcare
5.4 Conceptual Model Linking Big Data and Cognitive Computing
5.5 IBM’s Watson and Cognitive Computing
5.6 Future Directions
5.7 Conclusion
References
6 An Overview of the Computational Cognitive from a Modern Perspective, Its Techniques and Application Potential in Healthcare Systems
6.1 Introduction
6.2 Literature Concept
6.3 Discussion
6.4 Trends
6.5 Conclusions
References
7 Protecting Patient Data with 2F-Authentication
7.1 Introduction
7.2 Literature Survey
7.3 Two-Factor Authentication
7.4 Proposed Methodology
7.5 Medical Treatment and the Preservation of Records
7.6 Conclusion
References
8 Data Analytics for Healthcare Monitoring and Inferencing
8.1 An Overview of Healthcare Systems
8.2 Need of Healthcare Systems
8.3 Basic Principle of Healthcare Systems
8.4 Design and Recommended Structure of Healthcare Systems
8.5 Various Challenges in Conventional Existing Healthcare System
8.6 Health Informatics
8.7 Information Technology Use in Healthcare Systems
8.8 Details of Various Information Technology Application Use in Healthcare Systems
8.9 Healthcare Information Technology Makes it Possible to Manage Patient Care and Exchange of Health Information Data, Details are Given Below
8.10 Barriers and Challenges to Implementation of Information Technology in Healthcare Systems
8.11 Healthcare Data Analytics
8.12 Healthcare as a Concept
8.13 Healthcare’s Key Technologies
8.14 The Present State of Smart Healthcare Application
8.15 Data Analytics with Machine Learning Use in Healthcare Systems
8.16 Benefit of Data Analytics in Healthcare System
8.17 Data Analysis and Visualization: COVID-19 Case Study in India
8.18 Bioinformatics Data Analytics
8.19 Conclusion
References
9 Features Optimistic Approach for the Detection of Parkinson’s Disease
9.1 Introduction
9.2 Literature Survey
9.3 Methods and Materials
9.4 Results and Discussion
9.5 Conclusion
References
10 Big Data Analytics in Healthcare
10.1 Introduction
10.2 Need for Big Data Analytics
10.3 Characteristics of Big Data
10.4 Big Data Analysis in Disease Treatment and Management
10.5 Big Data: Databases and Platforms in Healthcare
10.6 Importance of Big Data in Healthcare
10.7 Application of Big Data Analytics
10.8 Conclusion
References
11 Case Studies of Cognitive Computing in Healthcare Systems: Disease Prediction, Genomics Studies, Medical Image Analysis, Patient Care, Medical Diagnostics, Drug Discovery
11.1 Introduction
11.2 Literature Survey
11.3 Methodology
11.4 Results and Discussion
11.5 Conclusion and Future Work
References
12 State of Mental Health and Social Media: Analysis, Challenges, Advancements
12.1 Introduction
12.2 Introduction to Big Data and Data Mining
12.3 Role of Sentimental Analysis in the Healthcare Sector
12.4 Case Study: Analyzing Mental Health
12.5 Results and Discussion
12.6 Conclusion and Future
References
13 Applications of Artificial Intelligence, Blockchain, and Internet-of-Things in Management of Chronic Disease
13.1 Introduction
13.2 Artificial Intelligence and Management of Chronic Diseases
13.3 Blockchain and Healthcare
13.4 Internet-of-Things and Healthcare Management of Chronic Disease
13.5 Conclusions
References
14 Research Challenges and Future Directions in Applying Cognitive Computing in the Healthcare Domain
14.1 Introduction
14.2 Cognitive Computing Framework in Healthcare
14.3 Benefits of Using Cognitive Computing for Healthcare
14.4 Applications of Deploying Cognitive Assisted Technology in Healthcare Management
14.5 Challenges in Using the Cognitive Assistive Technology in Healthcare Management
14.6 Future Directions for Extending Healthcare Services Using CATs
14.7 Addressing CAT Challenges in Healthcare as a General Framework
14.8 Conclusion
References
Index
Wiley End User License Agreement
Chapter 1
Table 1.1 Domain of healthcare.
Table 1.2 For various clinical uses, machine learning algorithms [70].
Chapter 3
Table 3.1 System confusion matrix using the VDCN approach.
Table 3.2 Confusion matrix of the system using ImageNet classification approach.
Table 3.3 Input nodes.
Chapter 9
Table 9.1 Shape features for all the three classes of subjects and its asymmetri...
Table 9.2 SBR values of caudate, putamen values.
Table 9.3 Performance measures of various classifiers.
Table 9.4 Performance comparison with related works.
Chapter 10
Table 10.1 Big data analytics and their use in the treatment of diseases.
Table 10.2 Software and their use in the management of healthcare.
Chapter 11
Table 11.1 For both normal and high blood pressure IOP scenarios, a collection o...
Table 11.2 Pupil/iris ratio, iris and pupil area of high-pressure eyes.
Table 11.3 Pupil/iris ratio, iris and pupil area of high-pressure eyes.
Table 11.4 Training phase confusion matrix.
Table 11.5 Accuracy, sensitivity, and specificity of SVM training data.
Chapter 12
Table 12.1 Analysis for participant 1.
Table 12.2 Analysis for second participant.
Chapter 1
Figure 1.1 Eras of computing (2013 International Business Machine Corporation) [...
Figure 1.2 Cognitive computing system architecture [48].
Figure 1.3 Perceptual and rational method to recognize a square. (a) Rational me...
Figure 1.4 Component of an intelligent system [51].
Figure 1.5 Intelligent system [49].
Figure 1.6 Application of model intelligent system [49].
Figure 1.7 Computer-aided medical diagnosis flow.
Figure 1.8 Task involved in NLP [53].
Figure 1.9 Level of NLP in healthcare [69].
Figure 1.10 Training, clinical trial evaluation, and clinical implementation of ...
Figure 1.11 Machine learning algorithms used in clinical studies are the most co...
Chapter 2
Figure 2.1 Artificial learning technologies.
Figure 2.2 Cognitive computing illustration.
Figure 2.3 Neural networks.
Figure 2.4 Deep learning.
Figure 2.5 Deep learning.
Figure 2.6 Convolutional neural networks.
Figure 2.7 Machine learning x deep learning.
Figure 2.8 Blood cell types.
Figure 2.9 Proposal modeling logic.
Figure 2.10 Leukocyte cell classes.
Figure 2.11 CNN architecture.
Figure 2.12 CNN accuracy.
Figure 2.13 Monocyte test image.
Figure 2.14 CNN monocyte classification.
Chapter 3
Figure 3.1 Significant services offered by cognitive computing.
Figure 3.2 Functions of big data in smart healthcare.
Figure 3.3 Model to understand the relation between cognitive computing and big ...
Figure 3.4 Features of cognitive system.
Figure 3.5 Summary of convergence of big data and cognitive in healthcare.
Figure 3.6 Smart healthcare framework based on cognitive computing and big data.
Figure 3.7 Distribution of files in gender-wise among two classes.
Figure 3.8 Comparison of both the approaches.
Figure 3.9 Cloud computing and big data analytics in healthcare.
Figure 3.10 Advantages of cloud computing in healthcare.
Figure 3.11 Types of cardiomyopathy.
Figure 3.12 Fuzzy input patterns.
Figure 3.13 List of attributes.
Figure 3.14 Base model.
Chapter 4
Figure 4.1 Components of remote patient monitoring system i.e. based on IoT clou...
Figure 4.2 System architecture of the wearable sensor network for environmental ...
Figure 4.3 Interfaces and smart device supporting nutrition monitoring systems.
Figure 4.4 Proposed methodology.
Figure 4.5 Fall detection system.
Chapter 5
Figure 5.1 Different participants of healthcare industry.
Figure 5.2 The five V’s of big data.
Figure 5.3 The interoperability between cognitive computing and big data.
Figure 5.4 Analytics.
Figure 5.5 Watson’s cognitive platforms.
Figure 5.6 The cognitive journey of an organization.
Figure 5.7 A coarse taxonomy of cognitive computing research areas.
Chapter 6
Figure 6.1 Artificial intelligence.
Figure 6.2 Cognitive computing.
Figure 6.3 Predictive analysis.
Figure 6.4 Machine learning.
Figure 6.5 Deep learning.
Figure 6.6 Big data.
Figure 6.7 Natural language processing.
Chapter 7
Figure 7.1 A healthcare safe system.
Figure 7.2 Big data analytics workflows.
Figure 7.3 Illustration of interconnected fitness concern connected with smart s...
Figure 7.4 Suggested approach.
Chapter 8
Figure 8.1 Healthcare organizational structure.
Figure 8.2 Interaction between patients and healthcare provider.
Figure 8.3 Workflow of big data analytics.
Chapter 9
Figure 9.1 SPECT images obtained from PPMI database in dicom format.
Figure 9.2 Zoomed version SPECT images obtained from PPMI database in dicom form...
Figure 9.3 Block diagram of the proposed method.
Figure 9.4 (a) Normal (b) SWEDD patient (c) PD patient.
Figure 9.5 Hyper planes for SVM.
Figure 9.6 Histogram of thresholds for patients.
Figure 9.7 Predictors 1-18 with its importance.
Chapter 10
Figure 10.1 Schematic diagram showing the health management benefit of big data.
Figure 10.2 Schematic diagram of 10 V’s of big data analytics.
Figure 10.3 Schematic diagram to show the role of Big data analytics in disease ...
Chapter 11
Figure 11.1 Block diagram of proposed work.
Figure 11.2 Fully convolutional neural network (FCN) structure.
Figure 11.3 Sclera segmentation of frontal eye image. (a) Normal eye image; (b) ...
Figure 11.4 (a) Normal eye image, (b) histogram equalized image.
Figure 11.5 (a) Histogram equalization, (b) morphologically reconstructed image.
Figure 11.6 (a) Morphologically Reconstructed image, (b) canny edge detection of...
Figure 11.7 (a) Canny edge detection image, (b) adaptive thresholding image.
Figure 11.8 Circular hough transformed image.
Chapter 12
Figure 12.1 Big five model.
Figure 12.2 System architecture.
Figure 12.3 Import files.
Figure 12.4 Data cleaning.
Figure 12.5 Train test.
Figure 12.6 Cleaning training data.
Figure 12.7 Correction of words.
Figure 12.8 Remove stop words.
Figure 12.9 Lemmatization.
Figure 12.10 Removing numbers.
Figure 12.11 Removal of numbers.
Figure 12.12 Prediction model.
Figure 12.13 To convert dataframe into list.
Figure 12.14 Tokenization.
Figure 12.15 Personality traits.
Figure 12.16 Generated words.
Figure 12.17 Analysis.
Figure 12.18 Description of each trait.
Figure 12.19 Raw unprocessed data.
Figure 12.20 Clean processed data.
Figure 12.21 Comparison.
Figure 12.22 Individual tokenization.
Figure 12.23
Figure 12.24 Analysis of Participant 1.
Figure 12.25 Result of test.
Figure 12.26 Different behavior of participant 2.
Figure 12.27 Result of test for participant 2.
Chapter 13
Figure 13.1 Schematic presentation of applications of AI in healthcare.
Figure 13.2 Schematic illustration of applications of Blockchain in the manageme...
Figure 13.3 Schematic presentation of healthcare management using advanced techn...
Chapter 14
Figure 14.1 General lifecycle of a cognitive agent.
Figure 14.2 Various sources of information for CIE.
Figure 14.3 Various stakeholders in healthcare management.
Figure 14.4 Aspects for deploying cognitive services across various applications...
Figure 14.5 Aspect importance for effective cognitive healthcare management.
Figure 14.6 Aspect importance for effective cognitive healthcare management.
Figure 14.7 Challenges most cited for deploying cognitive assisted healthcare se...
Figure 14.8 Other challenges most cited for deploying cognitive assisted healthc...
Scrivener Publishing
100 Cummings Center, Suite 541J
Beverly, MA 01915-6106
Artificial Intelligence and Soft Computing for Industrial Transformation
Series Editor: Dr. S. Balamurugan ([email protected])
Scope: Artificial Intelligence and Soft Computing Techniques play an impeccable role in industrial transformation. The topics to be covered in this book series include Artificial Intelligence, Machine Learning, Deep Learning, Neural Networks, Fuzzy Logic, Genetic Algorithms, Particle Swarm Optimization, Evolutionary Algorithms, Nature Inspired Algorithms, Simulated Annealing, Metaheuristics, Cuckoo Search, Firefly Optimization, Bio-inspired Algorithms, Ant Colony Optimization, Heuristic Search Techniques, Reinforcement Learning, Inductive Learning, Statistical Learning, Supervised and Unsupervised Learning, Association Learning and Clustering, Reasoning, Support Vector Machine, Differential Evolution Algorithms, Expert Systems, Neuro Fuzzy Hybrid Systems, Genetic Neuro Hybrid Systems, Genetic Fuzzy Hybrid Systems and other Hybridized Soft Computing Techniques and their applications for Industrial Transformation. The book series aims to provide comprehensive handbooks and reference books for the benefit of scientists, research scholars, students and industry professionals working towards next generation industrial transformation.
Publishers at Scrivener
Martin Scrivener ([email protected])
Phillip Carmical ([email protected])
Edited by
D. Sumathi
T. Poongodi
B. Balamurugan
and
Lakshmana Kumar Ramasamy
This edition first published 2022 by John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA and Scrivener Publishing LLC, 100 Cummings Center, Suite 541J, Beverly, MA 01915, USA
© 2022 Scrivener Publishing LLC
For more information about Scrivener publications please visit www.scrivenerpublishing.com.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
Wiley Global Headquarters
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials, or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read.
Library of Congress Cataloging-in-Publication Data
ISBN 978-1-119-76888-3
Cover image: Pixabay.Com
Cover design by Russell Richardson
Set in size of 11pt and Minion Pro by Manila Typesetting Company, Makati, Philippines
Printed in the USA
10 9 8 7 6 5 4 3 2 1
The introduction of new technologies into various domains, such as manufacturing, pharmaceuticals, healthcare and education, has contributed to their evolution. As health is the foremost factor affecting the quality of human life, it is necessary to understand how the human body is functioning by processing health data obtained from various sources more quickly. Since an enormous amount of data is generated during data processing, a cognitive computing system could be applied to provide responses to queries, thereby providing assistance in customizing intelligent recommendations. This decision-making process could be improved by the deployment of cognitive computing techniques in healthcare, especially so that bodily functions and machines can be associated. Therefore, cutting-edge techniques that could be integrated into healthcare must be investigated in order to provide intelligent services in various healthcare applications.
This book can be viewed as part of an initiative to provide diversified topics in healthcare sectors to show the range of recent innovative research, in addition to shedding light on future directions in this area. It will be a useful source of information for those involved in different areas of research and both graduate and postgraduate students interested in advanced technologies for the augmentation of healthcare services. A brief chapter-by-chapter description of the information covered in the book follows.
Chapter 1 discusses the evolution of various computational cognitive techniques in healthcare systems; and Chapter 2 deals with the metaheuristic algorithm of cognitive computing for classification of erythrocytes and leukocytes in healthcare informatics. Chapter 3 presents information about the convergence of big data and cognitive computing in healthcare. Chapter 4 deliberates on the intervention of IoT techniques deployed to prevent health problems; and the significance of cognitive computing in healthcare applications is analyzed in Chapter 5. A detailed overview of computational cognition and its techniques and potential in various healthcare systems is given in Chapter 6; and Chapter 7 discusses how to provide data security through the deployment of two-factor authentication.
Next, Chapter 8 highlights the benefits of data analytics for monitoring healthcare applications and its inferences through deployed models. Various investigations on optimistic approaches for the detection of Parkinson’s disease are presented in Chapter 9; and Chapter 10 presents a holistic approach to big data analytics in healthcare. Chapter 11 provides a detailed view of the integration of big data and cognitive intelligence in healthcare; and a detailed case study is discussed in Chapter 12 in which the authors focus on the analysis, challenges and advancements of social media and how much of an impact it has on mental health. Chapter 13 highlights the management of chronic disease by considering the integration of artificial intelligence (AI), blockchain, and the internet of things (IoT). Finally, Chapter 14 sheds light on the research challenges and future prospects of deploying cognitive computing in the healthcare domain.
The editors thank the contributors for their splendid work and time.
The Editors: D. Sumathi, T. Poongodi, B. Balamurugan, Lakshmana Kumar Ramasamy
Deependra Rastogi1*, Varun Tiwari2, Shobhit Kumar3 and Prabhat Chandra Gupta4
1School of Computing Science and Engineering, Galgotias University, Greater Noida, Uttar Pradesh, India
2Manipal University Jaipur, Rajasthan, India
3Graphic Era Hill University, Bhimtal, Uttarakhand, India
4School of Computing Science and Engineering, Galgotias University, Greater Noida, India
Abstract
Biomedical informatics and behavioral medicine are developing in parallel, each with its own theories and methods. The conjunction of cognitive and information science research offers enormous challenges and opportunities in addressing community health problems and managing disease prevention. A healthcare cognitive informatics system accumulates medical, social, and individual data from diverse healthcare sources to enhance patient engagement. This chapter provides a comprehensive review of prior research on cognitive informatics and computing in the healthcare domain. The era of computational cognitive informatics and techniques is divided into three stages of computing and healthcare informatics: the tabulating era, the programmable era, and the cognitive computing era. The tabulating era relied entirely on machine-driven systems, with computation principally performed by tabulating machines, calculators, and vacuum-tube systems. The programmable era was defined by user interface design, such as mainframes, smart computing machines, and personal computers. The cognitive computing era began in 2011 and is directed toward the creation of automated IT systems that can resolve difficulties without the need for human assistance. The main emphasis and driver of this cognitive era has been the rapid exponential upsurge in the flow of unstructured data. This chapter deals with the significant role of cognitive informatics in emerging areas of healthcare and explores the associated methods and algorithms in the healthcare domain.
Keywords: Cognitive science, cognitive computing, cognitive intelligence, machine learning, natural language processing, deep learning, cognitive intelligence in healthcare
The future of healthcare depends on giving patients a full understanding of the multiple variables impacting their health. Today’s consumers want customized, open, integrated, and high-quality care, seeking the same conveniences they get in other industries. Healthcare providers need new ways to tap into and interpret health knowledge in real time in order to deliver the experience that customers demand. Real-time data helps physicians, analysts, insurers, case managers, and other partners make the most informed decisions, while simultaneously giving consumers more influence over their own care. This difficult coordination, however, takes substantial time and can tax even the most flexible organization’s resources [1, 3].
Over a lifetime, the average citizen is expected to produce over 1 million gigabytes of health-related data, equal to about 1.3 billion books, including health information from personal fitness trackers, connected medical devices, implants, and other sensors that gather real-time data. The amount of health information currently doubles every 3 years and was projected to double every 73 days by 2020 [1].
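As a rough back-of-the-envelope check on what these doubling intervals imply (the intervals are the chapter's figures; the short Python sketch below is only illustrative arithmetic, not material from the book), the annual growth factor is 2 raised to the number of doublings per year:

# Illustrative only: annual growth factor implied by the doubling intervals quoted above.
for label, interval_days in (("every 3 years", 3 * 365), ("every 73 days", 73)):
    doublings_per_year = 365 / interval_days
    growth_per_year = 2 ** doublings_per_year
    print(f"doubling {label}: about {growth_per_year:.1f}x more data per year")
# doubling every 3 years: about 1.3x more data per year
# doubling every 73 days: about 32.0x more data per year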
It is difficult to manage these growing stores of healthcare knowledge, namely electronic health records, clinical studies, autopsy reports, laboratory findings, radiology images, voice recordings, and exogenous evidence, since they are scattered [64]. Furthermore, these sources of knowledge do not readily incorporate important evidence about the non-clinical circumstances of a person, which can have a strong effect on health. As a result, patients and their healthcare providers must make choices based on a limited data set [1, 2].
That is one reason why health professionals are among the early adopters of cognitive computing technologies, which can comprehend, reason, and learn when communicating with individuals. Cognitive systems are meant to absorb large volumes of structured and unstructured information across channels, from numbers and text to audio, video, images, sensory data, and other content. By finding the proverbial needle in a haystack, they help doctors and experts locate similarities and associations, surfacing emerging trends and observations to accelerate findings, procedures, and insight. Simply put, cognitive systems help scale and enhance human intelligence, even as the magnitude and speed of data continue to explode [1, 2].
“The aim of cognitive computing is to build automatic IT processes that are capable of understanding without the need for human assistance” [3].
In this chapter, we first introduce the framework of cognitive science and how it differs from classical cognitive science theory. We then trace the growth of knowledge that reflects the developmental phases of cognitive computing. The next sections cover the architecture and enabling technologies of cognitive computing. We then explain how intelligent systems bring cognitive computation into healthcare and, finally, present a case study on cognitive computing in healthcare research.
Cognitive science is the interdisciplinary and empirical study of the mind and its mechanisms. It addresses the nature of thought, its functions, and its goals (in a broad sense). The cognitive sciences study intelligence and behavior, concentrating on how the central nervous system interprets, stores, and applies knowledge [4].
In “cognitive science,” the word “cognitive” is used for “any kind of mental operation or structure that can be studied in precise terms” (Lakoff and Johnson, 1999). This conceptualization is rather broad and should not be confused with how “cognitive” is used in certain traditions of analytic philosophy, where only formal rules and truth-conditional semantics count as “cognitive”.
Cognitive science is a multidisciplinary field based on linguistics, psychology, anthropology, computer science, and philosophy that seeks to understand human behavior, including decision-making, reasoning, and problem solving. Cognitive science concepts have been applied to study the functionality and limitations of medical equipment [6]; to inform advice and guidance [7]; to streamline and improve workflows and medical processes [8]; and to examine the complexities of clinical evaluation, reasoning, and decision-making [9].
The nature and assessment of HIT have been influenced by analytical and methodological approaches from cognitive psychology, as well as by efforts to recognize and enhance the efficacy of healthcare providers. Initial CI research drew extensively from areas of cognitive science connected to learning, decision-making, and problem solving. Cognitive science emerged from Newell and Simon’s conceptualizations of human “thinking” and “mental processes” [10] and of “human problem solving.” Initial problem-solving experiments introduced protocol-analysis techniques [11] and human information processing ideas that laid the foundations for the discipline of human–computer interaction (HCI). Such cognitive strategies have been commonly used in CI research and have been instrumental in improving our understanding of medical problem solving, judgment, and reasoning [5]. Similarly, Kintsch’s text comprehension theory [12] was instrumental in shaping cognitive informatics experiments relevant to inference and therapeutic judgment.
There is a gap between (a) traditional cognitive theory, which seeks to explain basic concepts of human action through general-purpose models of mental processing, and (b) facets of human behavior that reflect real-life patterns, circumstances, and complexities and that are context-specific, grounded, and applied. For organizational innovations and workflow strategies, which usually emerge from rather simple, individual-centered concepts of context-independent cognition, this difference has major implications [15].
This inconsistency reflects what the cognitive scientist Andy Clark describes as the conventional mind-body dualism that permeates cognitive science, contributing to its failure to account for situated behavior in an ordered world [16]. As Clark notes, the initial phase (the glory days of traditional cognitivism) placed attention on a dominant logic structure, conceptual repositories, and a number of peripheral ‘sensory’ units. The reductionist approach of traditional cognitivism centered on the ‘primary processing system’ of expert systems, viewed as a symbol-based architecture that determines the cognitive characteristics of a human agent [3]. Although cognitive science has recently undergone an extreme reassessment of the nature of the internal cognitive machine, moving away from reliance on rational, rule-following protocols for describing mental processes, the conventional disenfranchisement of the body and environment had already been tacitly adopted throughout this “revolution” [16].
In classical theories of mind, this marginalization of the surrounding environment ignores a profound aspect of human nature: our ability to use intelligence, resources, technology, shared knowledge, and associates to undertake achievements that no single agent could accomplish alone [17]. This ability is widely distributed across our communities, which anthropologists sometimes refer to as “culture,” and is fundamental to understanding human behaviors ranging from rural agricultural tasks to dynamic contemporary workplace, manufacturing, and healthcare jobs. Because cognitive practice essentially requires manipulating objects, devices, and other individuals, the complexities of operations that require coordinating disparate tools to execute tasks must be addressed through studies of human performance (and of the technology and procedures employed to improve that performance). Such practices tend to be regulated by broader information management processes and systems in which persons are embedded actors, and thus require a separate unit of research and separate techniques for studying cognition.
The evolution of the classical theory of cognition [18] has played a crucial role in medical informatics [13, 14, 18]. The differential diagnostic task of the clinician, often taken as an archetype of critical thinking, has been the subject of decades of investigations of rational thought, language comprehension, problem-solving, and expertise [19, 20]. Some of the better remembered “expert systems” in the field of artificial intelligence were conceived as isolated logical agents in the image of the diagnosing and treating clinician [21, 22]. More significantly, these systems have been repurposed as instruments to facilitate decision-making by clinicians [23]. Nevertheless, the basic theory underlying their creation as a mechanism for evidence representation and expert thought continues in methodology and studies on decision-making, problem solving, expertise, and human achievement in healthcare [18].
Guided by the individual-centered model of cognition, the shortcomings of healthcare technologies and procedures are now becoming apparent. State-of-the-art information technology in the healthcare industry has been slow to take root, has generated many unhappy users, and has frequently triggered significant workflow problems and detrimental safety and health consequences [23–25].
For contemporary cognitive science, Clark emphasizes a view that requires explicit analysis of cognitive processes that cross the body-environment and mind-body boundaries. Achieving this requires attention not only to cognition “in the laboratory” but to cognition “in the wild,” and it requires a representational vocabulary capable of treating the relationships between agents, across various bodily systems, as well as between agents and their instruments in organized environments [26, 29]. The influences of Clark, Hutchins, and others [27, 28] were principal in establishing that quality improvement studies of complex work activities include cognitive science that: (1) uses a unit of analysis called the “activity system,” which allows the study of work environments involving various media; (2) attends to the organizational characteristics of such studies that may emerge from human behavior; and (3) depends on the idea of meaningful action in social situations. The cognitive science of human behavior, which relies on technology and process improvement factors to enhance human performance in diverse working conditions, will have a significant effect on future advances in medical computer science. We first review the developments in applied cognitive science involved in this change of interpretation, and then turn to the implications of this discussion for healthcare practices.
The era of computation started in 1900 and is still in continuous improvement (Figure 1.1). The age of computing is mainly divided into three areas: (i) the era of tabulating, (ii) the era of programmable computing, and (iii) the era of cognitive computing [30]. The first period of computation was the era of tabulation. Computing was done largely by tabulating machines, calculators, and vacuum systems throughout this period, which lasted from 1900 to 1950 [31]. The second era, which ranged from the vacuum tube to the microprocessor and was entirely programmatically operated, appeared in 1950. User interface architecture, such as mainframes, smart computing machines, and personal computers, defined the programmable era. This was an archetypal transition from mechanical systems to electronic systems in which storage and output benchmarks were dramatically enhanced [31]. The cognitive computing era has evolved from 2011 to the present. Cognitive computing’s key goal is to build automated information management structures that are capable of solving challenges without the necessity of human support. Cognitive computing is a modern methodology that features the estimation and generation of hypotheses, dynamic learning, and analysis of natural language. The rapid exponential upsurge in the flow of unstructured data was the key target and catalyst of this cognitive era, together with learning from and collaborating with people in natural language rather than machine code [31].
Figure 1.1 Eras of computing (2013 International Business Machine Corporation) [1].
Cognitive computing offers a subset of analytics that is taught or learned, based on machine learning. It is distinct from programmed or rules-based analytics and computation. Cognitive processing systems continually gain knowledge from the information fed into them by mining these dynamic data for particular information sources. In order to predict new problems and model potential solutions, these systems themselves optimize the way they look for patterns and the way they process knowledge.
The long-standing open-domain question answering (QA) problem was approached by IBM through the TV game show Jeopardy! with a massively parallel, probabilistic, evidence-based architecture known as Watson [3]. The Watson project created a real-world sensation and posed a grand challenge in computer science: to show how NLP, deep learning, information processing, knowledge representation, and concurrent computing can converge and advance on open-domain natural language content [3]. On the basis of its Jeopardy! success, IBM Watson was customized and delivered by scientists and engineers as the first commercially available cognitive computing capability, a capability that marks a coming age of computing. Delivered through the cloud, the system analyzes high data volumes, understands complicated questions posed in natural language, and offers answers based on evidence [3].
The ability of Watson to assist in healthcare is only one of the opportunities opening up for innovations of the next decade. Scientists from IBM Watson and elsewhere are stretching the limits of the fields of science and technology ranging from nanotechnology to artificial intelligence in order to build computers that do much more than measure and coordinate and identify trends in data – they hear, understand, reason and communicate in a powerful way with humans naturally. The exploits of Watson on TV are one of the first steps towards a new period of information technology development, the coming age of cognitive computing [3, 31].
Humans and computers will become more intertwined during this period. For civilization, the emerging age of computing is not only an opportunity; it is also a necessity. With the help of smart computers, we will be able to cope effectively with the exploding complexity of today’s environment and solve intertwined challenges such as obesity, hunger, and the burden on natural systems.
In the coming era of cognitive computing, the following characteristics [32] play a vital role in cognitive information.
Dynamic: Inherent value signifies the natural property of knowledge at the moment it is produced, while extended value signifies an increasingly established social characteristic formed under the influence of external forces during the knowledge transmission process. To be exact, data carry an intrinsic meaning after creation, which is constantly transmitted to different users on the basis of their own expectations. During this transmission, the information is interpreted and additional knowledge is consequently generated. Each user, whose needs determine how its value is excavated, assessed, and then used, measures its capacity differently. Each individual may attach different associations and meanings to the same content at different times. Cognitive content thus changes dynamically across the communication path.
Polarity: In standard cognitive science, the measure of knowledge is non-negative, but the extended value of cognitive information has both positive and negative polarities. The related awareness can be generated by cognitive technologies during the process of contact and engagement with audiences to represent the increasing importance of information. It should be remembered that this information may have a beneficial influence on many individuals, but if it is misused, it may have a negative impact on the dissemination of information. For instance, discovering that empirical experience and understanding knowledge play a noteworthy role in a primary school student’s development and advancement demonstrates positive information polarity. Alternatively, if an elementary school student reflects on illogical and irrelevant information, it can have a negative effect on his or her intellect, indicating the polarity of negative information. Understanding data polarity is therefore of critical importance.
Evolution: Information is continuously recognized during the transmission process, from the moment it is produced. Knowledge can be converted into various aspects once the cognitive capacity exceeds a certain level. After transformation, it can be applied at the data level in other aspects to generate new insights and viewpoints. The cognitive machine imitates human thought during the training phase, constantly develops awareness through incremental learning, and may eventually exceed the cognitive ability of humans. During the knowledge delivery process, usable information is expanded and compressed, based on the basic concepts of the cognitive approach, to enable information to best represent the individual needs of users in a multi-dimensional world. As a consequence, the evolutionary character of knowledge persists.
Convergence: As data is repeatedly recognized, value density tends toward a constant level derived from the value of the data, suggesting that knowledge has a convergence property. For instance, the particulars of an object’s motion can be summarized by Newton’s three laws of motion. According to Shannon’s theorem, the quantity of information conveyed by a standard transmitting device in a unit of time is restricted by the capacity of the channel (see the standard capacity formula given after this list). However, the need for high-volume continuous data is always at odds with the capability of the communication medium. It is therefore necessary to continuously grow the capacity of the communication system and to increase the rate of data transmission. In addition, the maximum value density can be obtained by constantly recognizing information, which decreases the quantity of information to be transmitted. Therefore, in order to efficiently reduce the pressure on the communication mechanism, it is important to delete redundant data on the basis of its usefulness as information. In other words, the elimination of unnecessary information and the succinct use of useful information are not limitless and agree with the theory of convergence, conforming to a minimum-reduction law.
Multi-view: From the viewpoint of the consumer, as it converges, cognitive knowledge will achieve the highest density of meaning. Due to the various cognitive ability and demands of users, the importance of the same information will affect each user differently after receiving the information.
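The channel-capacity limit invoked under Convergence above is Shannon's classical result. For reference (this is the standard statement of the theorem and is not given in the chapter itself), the capacity C, in bits per second, of a channel of bandwidth B with signal-to-noise ratio S/N is

C = B \log_2\left(1 + \frac{S}{N}\right)

so increasing the reliable information rate requires either more bandwidth or a better signal-to-noise ratio, which is why reducing redundant data before transmission matters.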
Cognitive architecture refers both to a theory of the structure of the human mind and to the computational implementation of such a theory in the fields of artificial intelligence (AI) and computational cognitive science. One of the main goals of a cognitive architecture is to summarize the various results of cognitive science in a comprehensive computational model. The Institute for Creative Technologies defines cognitive architecture as: “A cognitive architecture is a theory of fixed constructs that include a consciousness, whether in natural or artificial settings, and how they work together to create intelligent actions in a number of diverse environments in combination with information and skills embedded within the architecture.”
Figure 1.2 Cognitive computing system architecture [48].
Figure 1.2 demonstrates the system architecture of cognitive computing. With the assistance of fundamental innovations such as robotics, 5G networks, and computer vision, along with Internet-of-Things (IoT) infrastructure and services, activities requiring communication between users and computers, machine vision, and voice recognition can be implemented on a large scale. Each layer of the cognitive computing environment is accompanied by related technological problems and system requirements. The value of cognitive processing is researched and examined as each layer is explored.
It is clear from the explanation above that a particular cognitive phenomenon is grounded in intelligence. The communication domain highlights the transmission of knowledge, while the computational domain enhances the ingestion of evidence. In concrete cognitive processing systems, the data is mainly represented by a mixture of structured and unstructured information. To connect the data environment with the physical world, the IoT [34] collects a variety of valuable real-time statistics about objects in the physical world, forms a massive network infrastructure, and realizes interconnection among large numbers of measuring instruments [35]. Some innovative distributed processing fusion approaches, such as [36], can also be used to boost the precision of perceived, massive data networks:
Using awareness technologies such as wireless sensors and RFID, satellite tracking and positioning via WiFi, and authentication, the IoT gathers data on tracked objects.
It then transmits this information over the network using multiple productive means of networking and aggregation, and conducts sharing.
Intelligent control and decision-making that fuse the physical and information worlds use intelligent computational techniques such as deep learning, cloud computing, and data mining to store and extract knowledge.
In the big data age, the rapid growth of information and the exponential growth of machine processing capacity are plainly visible [37]. The growth of vast volumes of data, such as information from social networks and digital connectivity, is increasing faster than the growth of regular structured data. Cognitive big data contains unorganized and organized data with features that can be defined as the 5V’s, i.e., veracity, velocity, value, variety, and volume. Meanwhile, during the analysis and collection of results, these requirements have created unique challenges. We discuss the relationship between, and the distinction between, extensive data mining and cognitive computing. Through semantic computation, human senses are mimicked. One link between the study of cognitive computing and big data is humans’ reasoning about big data. Information continually accumulates in the lives of human beings. Once the amount of knowledge from various experiences is enormous, humans possess their own kind of big data reasoning, which is hierarchical in the same way as deep learning.
The problem of improving material life and the environment is the first level. At the second level, moral culture is practiced, and at the third level, the meaning of life is involved. The number of people at the highest level is the lowest. With a view to living expectations and typical psychological environments, the thinking reproduced by machine learning focuses primarily on the first and second levels. Linked uses include medical examinations, smart healthcare, smart homes, intelligent communities, and emotional treatment [38, 39]. The third level is more profoundly concerned with life’s nature and gives customized advice for the user’s life growth path to help the user attain a better and more fulfilling life. It cannot yet be achieved by machines, which is a great obstacle for future artificial intelligence. Under the condition that the data collection satisfies big data characteristics, the direct approach to interpret and process the information is to follow new machine learning techniques [40].
Data size is one distinction between the study of big data and cognitive computation. The study of large facts in relation to such data sets is not inherently cognitive computation. Big data thinking stresses the mining of value and the development of knowledge from vast quantities of data; the precision and efficiency of prediction cannot be assured without a large volume of data as a basis. Cognitive computing aims to address the challenges of fuzziness and uncertainty in the natural environment, grounded in judgment and cognition like the human intellect. Thus, multiple levels of mechanisms such as vision, memory, comprehension, logic, and problem-solving are realized. To ordinary citizens and domain experts alike, the data may appear indistinguishable, but the information extracted by ordinary individuals may differ from that obtained by subject matter specialists. As the depth of perception differs, there may also be different perspectives for evaluating the findings. Using semantic processing, more hidden meanings can be extracted from limited data [41].
Cognitive computing is informed by the method of human learning. Living creatures need only a very short time to recognize a picture, and they can effortlessly distinguish a cat from a dog, and so on. This basic human capability can be accomplished by conventional big data systems only after a significant amount of training; for instance, “Google Photos” learns to distinguish cats from dogs by studying a lot of pictures [42]. The multiple cat breeds, however, may not be recognized. In addition, with data of multiple sizes there is significant redundancy, and these data consume huge storage space. Cognitive computation supports a pathway that is lighter and more accessible than brute-force big data processing. It goes beyond the sheer volume and generality of data: after gaining cognitive knowledge, it no longer uses only “brute computing force” for big data processing. Cognitive computation was not sufficiently researched until the age of big data. The emergence of AI and the support of ample cloud computing tools currently provide advantages for the advancement of cognitive computing [43] and enable the computer to view and mine the implications of information from the perspective of knowing the user’s interior needs.
Computing, bandwidth, and storage are virtualized by cloud computing. It thus lowers the cost of implementing information systems and provides a foundation for the industrialization and promotion of cognitive computing applications [33, 44]. In addition, cloud computing’s powerful storage and computing ability offers cognitive computing scalable, versatile, virtualized, pooled, and well-organized computing infrastructure facilities [45, 46]. For the huge amounts of data produced in real life, technologies such as deep learning are applied to data mining after big data analysis, and the results are implemented in different fields. The numerous classes of knowledge concern multiple computational technologies; for instance, textual information and pictorial information correspond to natural language processing and computer vision, respectively.
The purpose of cognitive computing is straightforward: to replicate human thought processes in a computerized environment. We may create systems that simulate the human brain through a range of existing technologies, such as data mining, pattern recognition and analysis, and natural language processing. To understand, learn, and respond automatically from experience without being specifically programmed, these systems rely on machine learning and deep learning algorithms. Data is the insight we offer to cognitive processing systems. This section presents the supporting technologies, which comprise deep learning and reinforcement learning. Reinforcement learning can learn from the behavior of the environment, while deep learning can learn high-level features.
Standard approaches to machine learning can be differentiated into unsupervised learning and supervised learning. In these techniques, systems train data models, some from predefined labels, and machines accomplish tasks such as regression, classification, and clustering. The knowledge that the machines can obtain, however, is limited. In non-linear cases, it is impossible for computers to learn the underlying information, since they can only make predictions based on the information they have been given. In addition, the label for the same information may differ in different cases, which implies that the usefulness of the information gathered by the machines differs for each individual. Closed training with data feedback is the foundation of conventional supervised and unsupervised learning. These standard models of learning are inadequate to fulfill the demands of sustained improvement of machine intelligence.
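To make the supervised/unsupervised distinction above concrete, the following minimal Python sketch (illustrative only; the Iris dataset and the particular scikit-learn models are assumptions, not examples taken from this chapter) trains a classifier against known labels and, on the same data, runs a clustering algorithm that receives no labels at all:

# Minimal sketch of supervised vs. unsupervised learning (assumes scikit-learn is installed).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: the model learns from labelled examples and is scored on held-out data.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
classifier = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("supervised test accuracy:", classifier.score(X_test, y_test))

# Unsupervised: no labels are provided; the algorithm only groups similar samples together.
cluster_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((cluster_labels == k).sum()) for k in range(3)])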
In the field of machine learning, reinforcement learning has become a popular branch of research. The human learning process is very similar to reinforcement learning. Take the situation where an infant learns to talk. Usually, when a child learns a language, an adult says a word frequently, refers to something identified by that word, or performs an action associated with that word. If the child’s understanding is wrong, the parent will make a correction; the adult will offer rewards if the child gets it right. During the human learning process, the surrounding environment is thus a very significant influence.
Reinforcement learning uses this process as a model: it learns from the environment and responds with actions. A set of incentive mechanisms is created, i.e., when certain behavior advances the objective, a reward is offered, and a penalty is applied in the opposite case. There are several decisions along the way to the target. A choice is not necessarily optimal at any given moment, but it must help the system obtain further reward. Take the example of AlphaGo [47]. Through reinforcement learning it plays against itself, after absorbing millions of games of Go through deep learning. Each move is not always optimal during the self-learning process, but thanks to global planning, the action is more likely to lead to a complete win. At this level, the machine does not rely only on previous information but may also pursue new paths to increase the target reward. Clumsy play occurs, much as in early human learning, until the fundamental skills are acquired. Data is created in the process of the computer’s attempts; the ultimate aim is not regression, classification, or clustering, but optimal reward. Toward this goal, both successful and failed attempts are important to the computer.
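The reward-driven loop described above can be sketched as a toy multi-armed bandit with epsilon-greedy exploration (an illustrative stand-in, not AlphaGo or any system discussed in this chapter): the agent's value estimates come only from the rewards the environment returns, and occasional random exploration keeps it from settling on an early, suboptimal choice.

import random

# Toy reinforcement-learning sketch: epsilon-greedy agent on a 3-armed bandit (illustrative only).
true_reward_prob = [0.2, 0.5, 0.8]   # hidden environment: probability of reward for each action
value_estimate = [0.0, 0.0, 0.0]     # agent's learned estimate of each action's value
pull_count = [0, 0, 0]
epsilon = 0.1                        # fraction of steps spent exploring at random

random.seed(0)
for _ in range(5000):
    if random.random() < epsilon:
        action = random.randrange(3)                              # explore
    else:
        action = max(range(3), key=lambda a: value_estimate[a])   # exploit current best estimate
    reward = 1.0 if random.random() < true_reward_prob[action] else 0.0
    pull_count[action] += 1
    # Incremental average: nudge the estimate toward the observed reward.
    value_estimate[action] += (reward - value_estimate[action]) / pull_count[action]

# The estimates approach the hidden reward probabilities, and most pulls go to the best arm.
print([round(v, 2) for v in value_estimate], pull_count)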
If a machine interacts only with itself, however, its cognition is not adequate, much as it would be hard for an infant to learn to communicate without connecting with others. Therefore, if a learning system applies its theory regardless of external conditions, it is not a strong cognitive system, and a cognitive device should therefore interact directly with humans. However, it would take a lot of manpower and time if a human were specially appointed to interact with a computer.
The human cerebral cortex has distinct functions that are divided between the two hemispheres of the brain. The left brain is responsible for vocabulary, thought, and reasoning, while visual thinking and emotions are chiefly functions of the right hemisphere. Typically, individuals with a dominant left brain reason better and are more logical, whereas individuals with a dominant right brain often have good creativity and exceptional comprehension of space and material forms. Human thought is thus divided into logical reasoning and visual thinking. According to the different levels of abstraction in the content of thought, human beings' methods of perceiving the natural world are separated into the rational approach and the perceptual approach.
The rational approach is grounded in strict concepts and logical relevance, whereas the perceptual approach is a particular mapping relationship established between various elements. It is still uncertain how the human brain's roughly 100 billion neurons encode, decode, and store information. However, the thinking functions of the human mind may be modeled through data analysis. The approach of manual feature engineering is built purely on established definitions. This technique can be seen as a kind of rational system, i.e., it imitates the human capacity for analytical thought. The approach of feature learning is to find the mapping relationship between different elements. It is a kind of perceptual system, i.e., it replicates the capacity of human visual thought [48].
As indicated in Figure 1.3, logical and perceptual methods can be adopted independently to assess whether a quadrilateral is a square. To identify the characteristics of a square, the logical, analytical approach evaluates whether there are four right angles and whether the lengths of the four sides are equal, as seen in Figure 1.3(a). This technique requires an understanding of the concepts of angle, right angle, side, and side length. If an illustration of a square is presented to a child and he or she is told that it is a square, the child can correctly identify a square after several learning cycles, as shown in Figure 1.3(b). The child does not understand the definition of side or angle, but can nevertheless recognize a square. The way in which the child recognizes a square is a visual, or perceptual, technique. After many examples, the child discovers the mapping relationship between the square figure and the abstract concept. To identify a square with the logical approach, it is essential to look for image features, and the manual feature strategy can be seen as an imitation of this process. When a child learns to identify a square, the perceptual approach creates the mapping relationship between figure and concept. Feature learning with a deep learning model can be seen as a simulation of this technique.
Figure 1.3 Perceptual and rational method to recognize a square. (a) Rational method. (b) Perceptual method [69].
If a program is attempting to solve problems in the physical world, the simplest way is to mimic the human brain's mode of thought. In the cognitive paradigm, the features of a classification task can either be derived from the existing data model using the manual feature engineering strategy, simulating the human brain's rational reasoning ability, or be learned through feature learning, replicating the human brain's perceptual thinking ability. As computing applications grow increasingly complex, researchers have realized that it is difficult to describe with a logical approach certain real problems that are simple for human beings to recognize, rendering the rational analytical method unreliable or completely impractical for machines. In other words, with the manual feature creation process, effective data features cannot always be designed, and it is very difficult for computers to understand the expression of the feature [48].
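To make the contrast concrete, here is a hedged Python sketch (not from the chapter) of the rational route for the square example in Figure 1.3: it checks the explicit rules of four equal sides and four right angles from a list of corner coordinates. The perceptual route would instead train a classifier, for example a small convolutional network, on labeled images of squares and non-squares, letting the figure-to-concept mapping be learned rather than specified; that alternative is only indicated in the closing comment.

# Rational method: explicit geometric rules over corner coordinates (illustrative sketch).
import math

def is_square(corners, tol=1e-6):
    """corners: four (x, y) points given in order around the quadrilateral."""
    sides = []
    for i in range(4):
        (x1, y1), (x2, y2) = corners[i], corners[(i + 1) % 4]
        sides.append((x2 - x1, y2 - y1))
    lengths = [math.hypot(dx, dy) for dx, dy in sides]
    equal_sides = max(lengths) - min(lengths) < tol            # four equal sides
    right_angles = all(                                        # four right angles
        abs(sides[i][0] * sides[(i + 1) % 4][0] +
            sides[i][1] * sides[(i + 1) % 4][1]) < tol
        for i in range(4))
    return equal_sides and right_angles

print(is_square([(0, 0), (1, 0), (1, 1), (0, 1)]))   # True
print(is_square([(0, 0), (2, 0), (2, 1), (0, 1)]))   # False
# Perceptual method (sketch only): train a small CNN on labeled images of
# squares vs. non-squares, so the mapping from figure to concept is learned.

The rule-based version works only as long as someone can write down the defining features; when that becomes impractical, the learned, perceptual route is the remaining option, which is the point of the paragraph above.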
Intelligent systems are technologically sophisticated devices that interpret and respond to the environment around them. From automated vacuums such as Roomba to facial recognition applications to Amazon's personalized shopping recommendations, intelligent systems can take several forms.
In the late 1990s, so-called intelligent information systems were first differentiated from the broader collection of information systems. In this context, the word "intelligent" refers explicitly to these systems' capacity to pursue their goal under conditions of ambiguity, where the correct action cannot be decided algorithmically, and to address the formulated query in an incompletely determined environment [49]. At the same time, these programs are the ones most likely to succeed. In this sense, intelligence is developed at several (intelligence) levels, determined by the system's computation and storage power, its automated data search, its automatic acquisition of routines for gathering information as the system is used to solve problems that were not well understood at the time of its creation, and the quality and quantity of information stored in the system.
An intelligent device is a machine that can capture and process data, interconnect with other devices, learn from experience, respond to current data, and so on. An intelligent system has been described as "the ability to perform activities generally associated with intelligent creatures by an automated machine or computer-controlled robot," having thus the capability of "developing structures that have the features of human intellectual functions, such as the capacity to think, explore meaning, generalize, or benefit from previous experience" [50].
Every (natural or artificial) intelligent system relies on mechanisms that help it produce beneficial behaviors. Each of these mechanisms, however, derives from the human capacities and practical capabilities that make up such a system. The use of essential tools, including learning and creative functions and "instinct," is of immense significance for an intelligent system's proper functioning. Such mechanisms (Figure 1.4 and Figure 1.5) take the same form as human intelligence functions [51].
Let us also point out that intelligent systems may be viewed in numerous ways beyond the definition presented here: from artificial intelligence models analyzing massive datasets to artificial intelligence systems controlling robots. The area of intelligent systems is an interdisciplinary field of study that draws together concepts from artificial intelligence, machine learning (ML), and a variety of fields linked by multiple interdisciplinary partnerships, such as linguistics, brain sciences, and psychology.
Figure 1.4 Component of an intelligent system [51].
Figure 1.5 Intelligent system [49].
A wide range of intelligent systems has been produced to date, such as (Figure 1.6):
Memetic algorithms
Hybrid models (neuro-fuzzy, neuro-genetic, fuzzy-genetic, etc.)
Expert systems
Particle swarm optimization
Support vector machines
Figure 1.6 Application of model intelligent system [49].
Artificial neural networks
Ant colony systems
Clustering
Deep learning
Bayesian model
Fuzzy systems
Evolutionary computation (genetic programming, evolutionary strategies, evolutionary/genetic algorithms)
Ant colony optimization
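As a hedged illustration of one entry in this list (a minimal sketch, not a production implementation), the following Python fragment runs particle swarm optimization on a simple quadratic objective; the swarm size, coefficients, and objective are arbitrary choices for demonstration.

# Minimal particle swarm optimization on f(x) = sum(x**2); values are illustrative.
import random

def f(x):                                  # objective to minimize
    return sum(v * v for v in x)

DIM, SWARM, STEPS = 2, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration coefficients

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]                # each particle's best position so far
gbest = min(pbest, key=f)                  # best position found by the swarm

for _ in range(STEPS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=f)

print("best solution found:", gbest, "objective:", f(gbest))

The other techniques in the list (fuzzy systems, evolutionary computation, and so on) follow the same spirit: comparatively simple update rules that, iterated over a population or a model, produce behavior useful for decision-making.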