The first book focusing on one of the hottest new topics in Internet of Things systems research and development.

Studies estimate that by 2020 we will have a vast Internet of Things (IoT) network comprising 26 billion connected devices, including everything from light bulbs and refrigerators to coffee makers and cars. From the beginning, the concept of cyber-physical systems (CPSs), or the sensing and control of physical phenomena through networks of devices that work together to achieve common goals, has been implicit in the IoT enterprise. This book focuses on the increasingly hot topic of Human-in-the-Loop Cyber-Physical Systems (HiTLCPSs), CPSs that incorporate human responses into the IoT equation. Why have we not yet integrated the human component into CPSs? What are the major challenges to achieving HiTLCPSs? How can we take advantage of ubiquitous sensing platforms, such as smartphones and personal devices, to achieve that goal? While mature HiTLCPS designs have yet to be achieved, and no general consensus has been reached on underlying HiTLCPS requirements, principles, and theory, researchers and developers worldwide are on the cusp of realizing them. With contributions from researchers at the cutting edge of HiTLCPS R&D, this book addresses many of these questions from both theoretical and practical points of view.

* An essential primer on a rapidly emerging Internet of Things concept, focusing on human-centric applications
* Discusses new topics which, until now, have only been available in research papers scattered throughout the world literature
* Addresses fundamental concepts in depth while providing practical insights into the development of complete HiTLCPS systems
* Includes a companion website containing the full source code for all of the applications described

This book is an indispensable resource for researchers and app developers eager to explore HiTL concepts and incorporate them into their designs. It is also an excellent primer for advanced undergraduates and graduate students studying IoT, CPSs, and HiTLCPSs.
Page count: 363
Year of publication: 2017
Cover
Title Page
Copyright
Dedication
List of Figures
List of Tables
Foreword
Preface
Acknowledgments
List of Abbreviations
About the Companion Website
Chapter 1: Introduction
1.1 The Rise of Cyber-Physical Systems
1.2 Humans as Elements of Cyber-Physical Systems
1.3 Objectives and Structure
Part I: Evolution and Theory
Chapter 2: Evolution of HiTL Technologies
2.1 “Things”, Sensors, and the Real World
2.2 Human Sensing and Virtual Communities
2.3 In Summary…
Chapter 3: Theory of HiTLCPSs
3.1 Taxonomies for HiTLCPSs
3.2 Data Acquisition
3.3 State Inference
3.4 Actuation
3.5 In Summary…
Chapter 4: HITL Technologies and Applications
4.1 Technologies for Supporting HiTLCPS
4.2 Experimental Projects
4.3 In Summary…
Part II: Human-in-the-Loop: Hands-On
Chapter 5: A Sample App
5.1 A Sample Behavior Change Intervention App
5.2 The Sample App's Base Architecture
5.3 Enhancing the Sample App with HiTL Emotion-awareness
5.4 In Summary…
Chapter 6: Setting up the Development Environment
6.1 Installing Android Studio
6.2 Cloning the Android Project
6.3 Deploying the Server
6.4 Testing the Sample App
6.5 In Summary…
Chapter 7: Data Acquisition
7.1 Creating the EmotionTasker
7.2 Processing Sensory Data
7.3 In Summary…
Chapter 8: State Inference
8.1 Implementing a Neural Network
8.2 Requesting User Feedback
8.3 Processing User Feedback
8.4 In Summary…
Chapter 9: Actuation
9.1 Handling Emotions on the Server
9.2 Finishing up EmotionTasker
9.3 Providing Positive Reinforcement
9.4 In Summary…
Part III: Future of Human-In-the-Loop Cyber-Physical Systems
Chapter 10: Requirements and Challenges for HiTL Applications
10.1 Resilience
10.2 Security and Privacy
10.3 Standard Communications
10.4 Localization
10.5 State Inference
10.6 Safety
10.7 In Summary…
Chapter 11: Human-in-the-Loop Constraints
11.1 Technical Limitations
11.2 Ethical Limitations
Appendix A: EmotionTasker's Full Code
References
Index
End User License Agreement
List of Figures
Chapter 2: Evolution of HiTL Technologies
Figure 2.1 In [1], books and other common objects were augmented with RFID tags and associated with virtual documents by PDAs.
Figure 2.2 Shaman [2] acted as a representative for the connected Lite Servers, offering Java and HTML interfaces.
Figure 2.3 Device web presence in Cooltown [3]. Source: Adapted from Kindberg et al. 2002.
Figure 2.4 JXTA [4] peers created virtual ad hoc networks which served to abstract the real ones.
Figure 2.5 Works such as [5] and [6] used proxies to offer embedded devices' capabilities through RESTful web services.
Figure 2.6 The SenseWeb [7] architecture.
Figure 2.7 WikiCity [8] interfaced between virtual data and the physical world through a semantically defined format for data exchange.
Figure 2.8 Nokia 6101 vs iPhone 6s/LG Nexus 5X.
Figure 2.9 HiTL technologies evolution timeline.
Chapter 3: Theory of HiTLCPSs
Figure 3.1 Basic processes of human-in-the-loop control.
Figure 3.2 Taxonomy of human control.
Figure 3.3 Taxonomy of human roles.
Chapter 4: HITL Technologies and Applications
Figure 4.1 SenQ's query system stack shown side-by-side with the topology and components of AlarmNet, a prototypical implementation for assisted-living [9]. Source: Adapted from Wood 2008.
Figure 4.2 The architecture of CenceME [10], one of MetroSense's implementations.
Figure 4.3 The three key components of BCI using smartphones [11]. Source: Adapted from Lathia et al. 2013.
Figure 4.4 SociableSense architecture [12]. Source: Adapted from Rachuri 2011.
Figure 4.5 Control architecture for energy saving with HiTL [13]. Source: Adapted from Liang 2013.
Figure 4.6 Architecture of an HiTL HVAC system [14]. Source: Adapted from Agarwal 2011.
Figure 4.7 Diagram showing the main components of CAALYX's roaming monitoring system [15]. Source: Adapted from Boulos et al. 2007.
Figure 4.8 A semi-autonomous wheelchair receives brain signals from the user and executes the associated tasks of path planning, obstacle avoidance, and localization [16]. Source: Adapted from Schirner 2013.
Figure 4.9 A mockup of a map interface similar to the Highlight application.
Figure 4.10 Overview of the system proposed in [17]. Source: Adapted from W.-H. Rho and S.-B. Cho 2014.
Chapter 5: A Sample App
Figure 5.1 HappyWalk HiTL control.
Figure 5.2 HappyWalk's architecture.
Figure 5.3 Android's activity lifecycle.
Figure 5.4 HappyWalk's Android class structure.
Figure 5.5 An overview of HappyWalk Android app's main classes.
Figure 5.6 An overview of HappyWalkServer's main classes.
Figure 5.7 A typical artificial neural network architecture.
Figure 5.8 Sound signal in the time domain (left side) analyzed through a Fourier transformation to show its frequency domain (right side).
Figure 5.9 HappyWalk's Emotional Feedback.
Figure 5.10 HappyWalk's neural network design.
Chapter 6: Setting up the Development Environment
Figure 6.1 Installing Java SE Development Kit 7u79.
Figure 6.2 Installing Android Studio and Android SDK.
Figure 6.3 Canceling the setup wizard.
Figure 6.4 Opening the Android SDK manager.
Figure 6.5 Installing Android API 21.
Figure 6.6 Opening the standalone SDK manager.
Figure 6.7 Installing Android SDK Build-tools 21.1.2.
Figure 6.8 Installing Git #1. (a) Adding Git to the PATH, on Windows. (b) Choose Checkout Windows-style.
Figure 6.9 Installing Git #2. (a) We recommend using MinTTY. (b) Uncheck Enable file system caching.
Figure 6.10 Importing HappyWalk from Git.
Figure 6.11 Cloning the HappyWalk project.
Figure 6.12 Opening the HappyWalk project.
Figure 6.13 Choosing HappyWalk's project folder.
Figure 6.14 Do not upgrade Android Gradle or its plugin.
Figure 6.15 Running HappyWalk.
Figure 6.16 HappyWalk's first launch.
Figure 6.17 Obtaining the Android debug key.
Figure 6.18 Creating a project to obtain a Google Maps Android API key.
Figure 6.19 Creating the Google Maps Android API key.
Figure 6.20 Obtaining the Google Maps Android API key.
Figure 6.21 Changing into the project's view.
Figure 6.22 Opening app/debug/res/values/google_maps_api.xml.
Figure 6.23 Choosing PostgreSQL superuser's password.
Figure 6.24 No need to launch Stack Builder.
Figure 6.25 Clone from a URI.
Figure 6.26 Introduce the URI corresponding to HappyWalk's server.
Figure 6.27 Select the master branch.
Figure 6.28 Selecting the local storage directory.
Figure 6.29 Select the option Import existing Eclipse projects.
Figure 6.30 Tick the checkbox of the HappyWalkServer project.
Figure 6.31 Creating a Foursquare® app.
Figure 6.32 Foursquare®'s Client ID and Client Secret.
Figure 6.33 Navigating into the server's GlobalVariables.
Figure 6.34 Log in to the PostgreSQL 9.3 server.
Figure 6.35 Create a new database.
Figure 6.36 Name the new database as happywalk.
Figure 6.37 Select the correct SQL script.
Figure 6.38 Populating the database.
Figure 6.39 Create a new server.
Figure 6.40 Define a new Tomcat 7 installation.
Figure 6.41 Installing Tomcat 7 from Eclipse.
Figure 6.42 Adding HappyWalk to Tomcat 7.
Figure 6.43 Running the HappyWalk server.
Figure 6.44 Select the newly created Tomcat 7.
Figure 6.45 The HappyWalk server is up and running.
Figure 6.46 The ipconfig command.
Figure 6.47 HappyWalk's map screen.
Chapter 7: Data Acquisition
Figure 7.1 Creating a new class.
Figure 7.2 AS cannot resolve symbol issue.
Figure 7.3 Importing the appropriate class.
Figure 7.4 Creating a new package.
Figure 7.5 Creating the sensor processors.
Figure 7.6 Signal processing overview.
Figure 7.7 Current state of our HiTLCPS at the end of Chapter 7.
Chapter 8: State Inference
Figure 8.1 An example of a sigmoid activation function.
Figure 8.2 Creating a new basic activity.
Figure 8.3 Name the activity as EmotionFeedback.
Figure 8.4 The files that compose the EmotionFeedback activity.
Figure 8.5 Our goal for the EmotionSpace view.
Figure 8.6 Creating the EmotionSpace class.
Figure 8.7 Create EmotionSpace constructor matching super.
Figure 8.8 Choose View(context:Context, attrs:AttributeSet).
Figure 8.9 Changing from the layout Design view to Text view.
Figure 8.10 Creating a new Values resource file.
Figure 8.11 Naming the Values resource file.
Figure 8.12 The coordinates of the EmotionSpace view.
Figure 8.13 The emotion feedback notification.
Figure 8.14 Creating TaskSendEmotion.
Figure 8.15 Current state of our HiTLCPS at the end of Chapter 8.
Chapter 9: Actuation
Figure 9.1 HappyWalk's database conceptual schema.
Figure 9.2 Creating a new class in Eclipse.
Figure 9.3 Naming RequestSetEmotion.
Figure 9.4 Generating the Constructors, toString(), and the Getters and Setters.
Figure 9.5 Generating a Constructor using fields.
Figure 9.6 Generating a Constructor from Superclass.
Figure 9.7 Generating the Getters and Setters.
Figure 9.8 Overriding the default toString() method.
Figure 9.9 The location of the HappyWalkServer's web.xml.
Figure 9.10 The emotion alert dialog.
Figure 9.11 The emotion heatmaps.
Figure 9.12 Final state of our HiTLCPS at the end of Chapter 9.
Chapter 10: Requirements and Challenges for HiTL Applications
Figure 10.1 The HiTL resilience paradigm.
Chapter 11: Human-in-the-Loop Constraints
Figure 11.1 Lessons learned towards human-in-the-loop control.
List of Tables
Chapter 4: HITL Technologies and Applications
Table 4.1 Summary of some of the technologies/solutions that support HiTLCPS
Table 4.2 Summary of experimental HiTLCPS projects
Chapter 5: A Sample App
Table 5.1 Machine learning approaches for sensing context in smartphones [18]. Source: Adapted from Guinness 2013.
Table 5.2 Testing training performance (150 emotions)
Table 5.3 Testing neural network accuracy (41 emotions)
Chapter 6: Setting up the Development Environment
Table 6.1 Summary of the steps necessary to install AS 2.1.3
Table 6.2 Summary of the steps necessary to set up HappyWalk's Android project
Table 6.3 Summary of the steps necessary to deploy HappyWalk's server
Table 6.4 Summary of the steps necessary to test the base HappyWalk system
Chapter 10: Requirements and Challenges for HiTL Applications
Table 10.1 Summary of the identified HiTL requirements and challenges
David Nunes
University of Coimbra
Jorge Sá Silva
University of Coimbra
Fernando Boavida
University of Coimbra
This edition first published 2018
© 2018 John Wiley & Sons Ltd
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of David Nunes, Jorge Sá Silva and Fernando Boavida to be identified as the authors of this work has been asserted in accordance with law.
Registered Office(s)
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
Editorial Office
The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print-on-demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging-in-Publication Data
Names: Nunes, David, 1987- author. | Silva, Jorge Sá, author. | Boavida, Fernando, 1959- author.
Title: A practical introduction to human-in-the-loop cyber-physical systems / David Nunes, Jorge Sá Silva, Fernando Boavida.
Description: First edition. | Hoboken, NJ : John Wiley & Sons, 2018. | Includes bibliographical references and index. |
Identifiers: LCCN 2017025006 (print) | LCCN 2017042126 (ebook) | ISBN 9781119377801 (pdf) | ISBN 9781119377788 (epub) | ISBN 9781119377771 (cloth)
Subjects: LCSH: Cooperating objects (Computer systems) | Human-computer interaction.
Classification: LCC TJ213 (ebook) | LCC TJ213 .N86 2017 (print) | DDC 621.39-dc23
LC record available at https://lccn.loc.gov/2017025006
Cover Design: Wiley
Cover Image: © ipopba/Gettyimages
To my parents, Jorge and Eulália, and to my brother, Telmo.
David Nunes
To Fátima, Catarina, Pedro, Jojó, and my parents
Jorge Sá Silva
To Maria João and our three daughters, Susana, Inês, and Catarina
Fernando Boavida
Our world is becoming ever more technological. As first put forward by the renowned computer scientist Mark Weiser, we continue to see that, as devices become smaller, more mobile, more powerful, and more efficient, they begin to "disappear". Technology is now so intrinsic to our everyday lives that it has become an inherent part of our existence. This is the premise behind concepts such as the Internet of Things and cyber-physical systems, in which distributed technology is used to monitor and control the environment. However, our current technological advancement still falls short of Weiser's ideas. Each time we have to struggle through unintuitive configuration menus, errors, and software incompatibilities, we become stressed by our computers and appliances. Weiser argued that the ultimate form of the computer was an extension of our subconscious. To him, the ideal computer would be capable of truly understanding people's unconscious actions and desires. Instead of humans adapting to technology and learning how to use it, technology would adapt to the disposition and uniqueness of each human being.
In fact, systems that consider the human context are becoming increasingly important, and there are strong indications that most future technologies will be much more human-aware. This book focuses on the realm of human-in-the-loop cyber-physical systems (HiTLCPSs), that is, cyber-physical systems that take human responses into consideration. HiTLCPSs infer the user's intents, psychological states, emotions, and actions through sensors, and use this information to determine the system's actions. This involves using a large variety of sensors and mobile devices to monitor and evaluate human nature. This technology therefore has strong ties with wireless sensor networks, robotics, machine learning, and the Internet of Things.
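To make this loop concrete before the hands-on chapters, here is a minimal, purely illustrative Java sketch of the three recurring stages of HiTL control: data acquisition, state inference, and actuation. The class, method, and state names are hypothetical and are not part of the HappyWalk code presented later in the book.

```java
// Minimal, hypothetical sketch of a human-in-the-loop control cycle.
// All names are illustrative only.
public class HiTLLoop {

    enum Emotion { CALM, STRESSED }

    public static void main(String[] args) {
        double[] sample = acquire();          // data acquisition
        Emotion state = infer(sample);        // state inference
        actuate(state);                       // actuation
    }

    // Data acquisition: read raw signals (e.g., accelerometer, sound level).
    static double[] acquire() {
        return new double[] { 0.12, 0.87 };   // placeholder values
    }

    // State inference: map raw signals to a human state.
    // A real system would use a trained model rather than a fixed threshold.
    static Emotion infer(double[] sample) {
        return sample[1] > 0.5 ? Emotion.STRESSED : Emotion.CALM;
    }

    // Actuation: adapt the system to the inferred state.
    static void actuate(Emotion state) {
        if (state == Emotion.STRESSED) {
            System.out.println("Suggest a calming walk to a nearby point of interest.");
        }
    }
}
```

In a real HiTLCPS, the placeholder threshold would be replaced by a trained inference model, such as the neural network developed in the hands-on part of the book, and the print statement by an actual actuation on the environment or on the user's device.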
This book is useful to BSc and MSc students, as well as to PhD students, researchers, and professors addressing the areas of ubiquitous computing, Internet of Things, cyber-physical systems, and human–computer interaction. It can also be useful to professional developers who intend to introduce HiTL concepts into their mobile apps and/or Internet of Things/cyber-physical system applications.
Throughout its pages, the book will guide the reader through a journey into this novel and exciting area of research and technological development. As such, it is intended to be used as a primer on HiTLCPSs, providing some insights into the research being done on this topic, current challenges, and requirements. One of the book's objectives is to introduce the reader to the practical usage of HiTL paradigms within software development. Therefore, we included a comprehensive hands-on tutorial where the major theoretical concepts behind HiTLCPSs are applied to a sample mobile application and explained from a practical perspective. This tutorial requires some knowledge of Android and the Java programming language, as well as some notions about databases and RESTful web services. It is accompanied by a base source code repository and several code snippets which the reader can extensively modify.1 It is not our intention to provide in-depth knowledge about the programming languages, and/or the machine learning techniques, necessary to create complex HiTL systems. Instead, the tutorial aims at illustrating and consolidating some of the book's theoretical ideas.
Finally, we would like to thank you, the reader, for your interest. We would also like to ask you to contact us and tell us about your experience with our book. Your feedback is a very valuable resource towards improving the book. Send your email to [email protected], [email protected] or [email protected].
1
The source code repositories are located at:
https://git.dei.uc.pt/dsnunes/happywalk.git
https://git.dei.uc.pt/dsnunes/happywalkserver.git
The Internet has changed our lives, and it will have a further impact on how we live and how we work. Most cyber-physical systems (CPSs) make use of the Internet and even define parts of it. Let me cite Wikipedia in this foreword, even though it is not very scientific to do so. Understanding a CPS as "a mechanism controlled or monitored by computer-based algorithms, tightly integrated with the internet and its users" means that users, humans, are essential for any CPS. The National Institute of Standards and Technology of the US Department of Commerce (NIST) goes even further, stating that "these systems will provide the foundation of our critical infrastructure, form the basis of emerging and future smart services, and improve our quality of life in many areas". Looking at the examples mentioned in Wikipedia, "smart grid, autonomous automobile systems, medical monitoring, process control systems, robotics systems, and automatic pilot avionics", humans are always involved.
Humans are not only involved; humans are the essential part of CPSs; CPSs have to serve us! With the basic idea of incorporating humans as beings within the system, we arrive at the notion of human-in-the-loop (HiTL). It comprises a model, an adequate representation of human behavior, so that the human can be treated as an integral part of the whole system. As just one example, let me cite Carsten Binnig et al. in their preface to the Proceedings of the First Workshop on Human-In-the-Loop Data Analytics (HILDA), held on June 26, 2016, in San Francisco, California: "A major bottleneck in data analytics today is to efficiently leverage the human capabilities to formulate questions and understand answers of data analytics systems … Recent technology trends (such as touchscreens, motion detection, and voice recognition) are widening the possibilities for users to interact with data, and data-driven industries are shifting to personalized processing to better target their services to users' needs".
Hence, it seems only natural to look at both topics together in a work that is part textbook and part survey. In my six years as editor-in-chief of the journal ACM Transactions on Multimedia Computing, Communications, and Applications (ACM TOMM), I have, unfortunately, not come across a comprehensive, high-quality survey of HiTL CPSs; what is more serious, nobody has even attempted to cover this essential area of multimedia computing, communications, and applications with a survey. No one has, so far!
At the time of writing this foreword, I have only been able to read parts of this book; I am looking forward to reading it in its entirety.
The authors of this book, David Nunes, Jorge Sá Silva, and Fernando Boavida from the University of Coimbra, provide an in-depth view of HiTLCPS evolution, theory, technologies, and applications. Moreover, they illustrate how to apply HiTLCPS concepts to a sample smartphone application, through a hands-on approach that guides the reader from the development environment to the final product, including data acquisition, state inference, and actuation. With (1) their profound technical knowledge of many areas in computing and communications and (2) their expertise and experience as authors of other textbooks, the authors are certainly well placed to make this a long-lasting, successful scientific book in this area. Congratulations!
Dr. Ralf Steinmetz
Fellow of the IEEE and Fellow of the ACM
Director, Multimedia Communications Laboratory, Technische Universität Darmstadt
Chairman of the Board, Hessian Telemedia Technology Competence Center, Germany
Darmstadt, March 2017
A book such as this would not have been possible without the help and support of many people and institutions.
First of all, we would like to thank our base institutions—the Department of Informatics Engineering, and the Center for Informatics and Systems, both from the University of Coimbra—in the scope of which we carry out our teaching and research activities, for the provided facilities and research environment. With their effort and contributions, enthusiasm, discussions, and suggestions during several years of joint research activities and human-in-the-loop social interaction, our students and our colleagues were instrumental in making this book a reality.
We also thank IMDEA Networks Institute, in Madrid, for the support provided during Fernando Boavida's sabbatical in 2015/2016, and especially its leading computer scientist, Arturo Azcorra, for his support; Antonio Fernández Anta, Miguel Péon, Jeanet Birkkjaer, and Rosa Gómez for their encouragement; and all its researchers and staff in general.
Some of the research that formed the basis for this book was carried out in the scope of financed research projects and initiatives and, thus, it is also right to thank the entities that made the referred research possible, namely the Portuguese Foundation for Science and Technology (FCT), FCT's POPH/FSE program, and the SOCIALITE Project (PTDC/EEI-SCR/2072/2014), supported by COMPETE 2020, Portugal 2020, Operational Program for Competitiveness and Internationalization (POCI), and the European Union's ERDF (European Regional Development Fund).
We would also like to thank David Hutchison, from Lancaster University, for believing in us and putting us in contact with the excellent editorial team at John Wiley & Sons.
Finally, we would like to thank our families, for their unconditional love and support.
AI – Artificial Intelligence
ANN – Artificial Neural Network
API – Application Programming Interface
AS – Android Studio
AV – Autonomous Vehicle
BCC – Body-Coupled Communication
BCI – Behavior Change Interventions
CHIL – Computers in the Human Interaction Loop
CoAP – Constrained Application Protocol
CoRE – Constrained RESTful Environments
CPS(s) – Cyber-Physical System(s)
CPU – Central Processing Unit
DAO – Data Access Object
ECG – Electrocardiography
EEG – Electroencephalography
ESM – Experience Sampling Method
FCT – Fast Cosine Transform
FFT – Fast Fourier Transformation
GPRS – General Packet Radio Service
GPS – Global Positioning System
GSM – Global System for Mobile Communications
HiTL – Human-in-the-Loop
HiTLCPS(s) – Human-in-the-Loop Cyber-Physical System(s)
HTML – HyperText Markup Language
HTTP – Hypertext Transfer Protocol
HVAC – Heating, Ventilation, and Cooling
ID – Identification
IFR – International Federation of Robotics
IoA – Internet of All
IoT – Internet of Things
IP – Internet Protocol
IDE – Integrated Development Environment
IEEE – Institute of Electrical and Electronics Engineers
IETF – Internet Engineering Task Force
ISM band – Industrial, Scientific, and Medical radio bands
Java EE – Java Enterprise Edition
Java SE – Java Standard Edition
JDK – Java Development Kit
JSON – JavaScript Object Notation
LTE – Long-Term Evolution
M2M – Machine-to-Machine
MPTCP – MultiPath Transmission Control Protocol
NAT – Network Address Translation
NSF – National Science Foundation
OSI – Open Systems Interconnection
OS – Operating System
P2P – Peer-to-Peer
POI(s) – Point(s) of Interest
RAM – Random-Access Memory
REST – Representational State Transfer
RF – Radio Frequency
RFID – Radio-Frequency Identification
RSSI – Received Signal Strength Indication
SCTP – Stream Control Transmission Protocol
SDK – Software Development Kit
sMAP – Simple Monitoring and Action Profile
SMS – Short Message Service
SOAP – Simple Object Access Protocol
SQL – Structured Query Language
TCP – Transmission Control Protocol
UDP – User Datagram Protocol
URI – Uniform Resource Identifier
URL – Uniform Resource Locator
UUID – Universally Unique Identifier
VoIP – Voice over Internet Protocol
WSDL – Web Service Description Language
WSN(s) – Wireless Sensor Network(s)
XML – Extensible Markup Language
Don't forget to visit the companion website for this book:
www.wiley.com/go/nunesloop
There you will find valuable material designed to enhance your learning, including:
Source codes
Humans are a remarkable species. For most of our history, we have used our intellectual ability to create and develop many different tools and processes to assist us and ease our lives. Since the days our ancestors discovered how to control fire, around 300,000 years ago, we have achieved exponential technological progress. From the invention of wheeled vehicles, around 6,000 years ago, to the transistor, invented just 70 years ago, countless technological advances have drastically changed the way we experience and perceive our reality.
The last few decades have seen an unprecedented surge of technological advancement, particularly in the area of computer science, resulting in some of the most revolutionary human inventions yet: we have developed personal desktop and portable computers, as well as a global network that interconnects all kinds of computerized devices, aptly called the Internet. Despite the fact that they have been in existence for an extremely short time, these technologies have transformed, and will continue to transform, the way our world and society work, at a very fundamental level and at an incredibly fast pace.
Interestingly, once the Internet was in place, we quickly achieved the power to extend it to our traditional tools and appliances, which then became “interconnected”. One of the first “tools” ever connected to the Internet was the Carnegie Mellon University Computer Science Department's Coke Machine, in the early 1980s [19], which was able to report its stock and label it as “cold” or not, depending on how much time it had been inside the machine. An idea began to spread: a vision of an interconnected world where information on most everyday objects was accessible.
Since then, scientists and engineers have developed this idea into a concept that is known as the "Internet of Things" (IoT). The idea started small, considering scenarios where radio-frequency identification (RFID) allowed the "tagging" and managing of objects by computers. Each object would carry an RFID tag, a small, traceable chip that could be wirelessly scanned by a nearby RFID reader. The RFID tag enabled the automatic identification of the object and allowed it to be traced and managed through the Internet.
The continued advances in miniaturization allowed us to go beyond the simple tagging and identification of everyday objects. As predicted by Gordon Moore, back in 1965, the amount of computing power in integrated circuits has been doubling every 18 months for the last 50 years [20]. The remarkable work of computer industry engineers and scientists has led to many new technologies. The continuous integration of computational resources into all kinds of objects made our tools “intelligent”. Everything from light bulbs to refrigerators, microwaves, and coffee machines will soon be connected to the Internet. In fact, some studies estimate that we will have an IoT with 26 billion connected devices by 2020 [21].
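As a rough back-of-the-envelope illustration of what that claim implies (our own arithmetic, not a figure taken from [20]), doubling every 18 months sustained over 50 years compounds to roughly a ten-billion-fold increase:

```latex
% Doubling every 1.5 years, sustained for 50 years:
2^{50/1.5} \approx 2^{33.3} \approx 1.1 \times 10^{10}
```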
We can see evidence of this trend all around us. The Internet now interconnects a large number of highly heterogeneous devices, from traditional desktop PCs to laptops, tablets, and smartphones.
For example, the area of sensing technologies and wireless sensor networks (WSNs) is becoming increasingly prominent. WSNs are composed of dozens or even hundreds of autonomous "sensor nodes", small computerized devices that are capable of collecting physical world data and forwarding it by means of wireless communication. They can be used to monitor environmental luminosity, temperature, pressure, sound, and many other parameters, and can be spatially distributed in an ad hoc fashion. These technologies have been receiving a great deal of attention from the research community due to their potential in almost every application area. In fact, WSN deployments can now be found in many industrial, medical, and domestic environments. Recent studies in WSNs have brought great advancements in this area, namely in terms of energy efficiency and integration capabilities, with sensors being provided as services [22, 23], accessible through the Internet [24]. Sensors are now indispensable devices, for they allow us to collect data from real-world phenomena, handle this data in digital form, and ultimately extend the Internet to the physical world.
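As a simple illustration of the "sensors as services" idea, the following Java sketch fetches a reading from a RESTful sensor endpoint. The endpoint URL and the JSON payload layout are assumptions made purely for the example; they are not the interface of any specific platform cited above.

```java
// Hypothetical client for a sensor exposed "as a service" over HTTP.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class SensorServiceClient {
    public static void main(String[] args) throws Exception {
        // Assumed endpoint returning the latest reading of a temperature sensor.
        URL url = new URL("http://example.org/sensors/temperature/latest");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            // Assumed payload shape: {"value": 21.3, "unit": "C"}
            System.out.println("Sensor reading: " + body);
        } finally {
            conn.disconnect();
        }
    }
}
```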
In fact, the number of sensors that nowadays can be deployed on humans can turn them into walking sensor networks. Humans can use smart-shirts; carry a smartphone with several sensors and networking capabilities (e.g. global system for mobile communications (GSM), Bluetooth, long-term evolution (LTE)); and use Google Glass, iPods, smart watches, and shoes with sensors. In terms of sensing applied to individual users, Bosch Sensory Swarms and the Qualcomm Swarm Lab at UC Berkeley estimate that the number of sensors in personal devices can add up to 1000 wireless sensors per person, to be deployed over the next 10 to 15 years [25], resulting in large amounts of data being available for processing, and allowing a wide range of sensing applications to be deployed. This reality depends, of course, on the drastic reduction of sensor production costs, which are expected to come down to negligible values over time, as with most silicon-based hardware [26].
As for automated actuation, the world has seen a gradual increase in the number of installed robots per year. The 2015 World Robot Statistics study, issued by the International Federation of Robotics (IFR) [27], indicates that the total number of professional service robots sold in 2014 rose by 11.5% compared to 2013, from 21,712 to 24,207 units. IFR expects that, for the 2015–2018 period, sales of service robots for professional use will increase to about 152,375 units, while sales of robots for personal use will reach about 35 million units, with a total estimated value of about $40 billion. Global sales of industrial robots, on the other hand, will experience a yearly growth of 15% until 2018, while the number of sold units will double to around 400,000.
Interwoven with the concept of the IoT is the concept of cyber-physical systems (CPSs), which consist of the sensing and control of physical phenomena through networks of devices that work together to achieve common goals. CPSs represent a confluence of robotics, wireless sensor networks, mobile computing, and the IoT, aimed at achieving highly monitored, easily controlled, and adaptable environments.
The IoT and CPS concepts have been pushed by two distinct communities. IoT was initially developed using a computer science perspective, mostly supported by the European Commission. The goal was to develop a network of smart objects with self-configuration capabilities on top of the current Internet. This development effort included hardware, software, standards, and interoperable communication protocols and languages that describe these intelligent devices [28]. IoT builds on several requirements, namely the development of intelligence in devices, interfaces and services; the assurance of security and privacy; systems integration; communication interoperability; and data “semantization” and management [29].
On the other hand, the concept of CPSs was initially supported by the US National Science Foundation (NSF). CPSs stem from an engineering perspective and concern the control and monitoring of physical environments and phenomena through sensing and actuation systems consisting of several distributed computing devices [30]. These systems are mostly interdisciplinary, requiring expertise and skills in mathematical abstractions (algorithms, processes) that model physical phenomena, smart devices and services, effective actuation, security and privacy, systems integration, communication, and data processing [31].
Thus, the IoT tended to focus more on openness and the networking of intelligent devices, while CPSs were more concerned with applicability, the modeling of physical processes, and problem solving, often through closed-loop systems. While their core philosophies and foci were initially different, their many similarities, such as intensive information processing, comprehensive intelligent services, and efficient interconnection and data exchange, have led to both terms being used interchangeably [32], without clearly identified borders [30].
