Digital Transformation of the Laboratory (E-Book)
Description

Take your lab into the 21st century with this insightful and exciting new resource

Digital Transformation of the Laboratory: A Practical Guide to the Connected Lab delivers essential and transformative new insights into current and future technologies and strategies for the digitization of laboratories. Thoroughly supported and backed up by contributions from thought and industry leaders, the book shows scientists in academia and industry how to move from paper to digital in their own labs.

The distinguished editors have included resources from industry-leading voices in their respective fields that offer concrete and practical strategies to embrace modern, digital technology. You’ll learn to modernize your laboratory, cut costs, improve productivity, and find efficiencies you never considered.

You’ll discover a stepwise approach to moving from paper to digital, including guidance on how to understand and define your lab’s requirements and evaluate potential solutions. Real-world case studies are included throughout to provide specific examples of how the ideas presented can be applied in real life. You’ll also benefit from the inclusion of:

  • A thorough introduction to the evolution of the modern laboratory, including newly available technologies and the new science being conducted with them
  • An exploration of crucial terms you’ll need to understand in order to chart your path into the future of the laboratory
  • Examinations of practical issues you’ll need to master in order to define your lab’s digitalization strategy
  • Numerous case studies and expert commentary on the subject of moving from paper to digital

Perfect for senior executives, lab managers, senior scientists, principal investigators, professors, and PhDs working in the fields of biotechnology, pharma, chemistry, healthcare, and the life sciences, Digital Transformation of the Laboratory: A Practical Guide to the Connected Lab will also earn a place in the libraries of laboratory heads and auditing departments seeking to find efficiencies, cut costs, and maximize productivity in their own labs.


Page count: 654

Publication year: 2021




Table of Contents

Cover

Title Page

Copyright

Preface

Inspiration

Knowledge Base

Practical

Case Studies

Continuous Improvement

Vision of the Future and Changing the Way We Do Science

Part I: Inspiration

1 The Next Big Developments – The Lab of the Future

1.1 Introduction

1.2 Discussion

1.3 Thoughts on LotF Implementation

1.4 Conclusion

References

Part II: Knowledge Base

2 Crucial Software‐related Terms to Understand

2.1 Digital Revolution

2.2 Computers

2.3 Internet

2.4 Cloud Computing

2.5 Computer Platforms

2.6 Applications

2.7 Values of Software

2.8 Software Development

2.9 Software Product Lifecycle

2.10 Software Design

2.11 Software Quality

2.12 Software Integration

2.13 Data‐flow Modeling for Laboratories

2.14 Software Licensing

References

3 Introduction to Laboratory Software Solutions and Differences Between Them

3.1 Introduction

3.2 Types of Software Used in Laboratories

References

4 Data Safety and Cybersecurity

4.1 Introduction

4.2 Data Safety

4.3 Cybersecurity

References

5 FAIR Principles and Why They Matter

5.1 Introduction

5.2 What Is the Value of Making Data FAIR?

5.3 Considerations in Creating Lab‐based Data to Prepare for It to Be FAIR

5.4 The FAIR Guiding Principles Overview

References

6 The Art of Writing and Sharing Methods in the Digital Environment

6.1 Introduction

6.2 Tools and Resources for Tracking, Developing, Sharing, and Disseminating Protocols

6.3 Making Your Protocols Public

6.4 The Art of Writing Methods

References

Part III: Practical

7 How to Approach the Digital Transformation

7.1 Introduction

7.2 Defining the Requirements for Your Lab

7.3 Evaluating the Current State in the Lab

References

8 Understanding Standards, Regulations, and Guidelines

8.1 Introduction

8.2 The Need for Standards and Guidelines

8.3 How Does Digitalization Relate to Standards and Guidelines

8.4 Challenges Related to Digitalization in Certified Laboratories

8.5 Can Digital Strategy be Implemented without Certification?

References

9 Interoperability Standards

9.1 SiLA

9.2 AnIML

9.3 Allotrope

9.4 Conclusion

10 Addressing the User Adoption Challenge

10.1 Introduction

10.2 Identify Key Stakeholders and Explain the Reasons for Change

10.3 Establish a Steering Committee

10.4 Define the Project Objectives, Expected Behaviour, and Timeline

10.5 Check for Understanding and Encourage Debate

10.6 Acknowledge Ideas and Communicate Progress

10.7 Provide a Feedback Mechanism

10.8 Set Up Key Experience Indicators and Monitor Progress

10.9 Gradually Expand to a Larger Scale

10.10 Conclusions

References

11 Testing the Electronic Lab Notebook and Setting Up a Product Trial

11.1 Introduction

11.2 The Product Trial

11.3 The Importance of a Product Trial

11.4 Setting Up a Product Trial

11.5 Good Practices of Testing a Product

11.6 Conclusions

References

Part IV: Case Studies

12 Understanding and Defining the Academic Chemical Laboratory's Requirements: Approach and Scope of Digitalization Needed

12.1 Types of Chemistry Laboratory

12.2 Different Stages of Digitalization

12.3 Preparatory Stage

12.4 Laboratory Stage

12.5 Transferal Stage

12.6 Write‐up Stage

12.7 Conclusions and Final Considerations

References

13 Guidelines for Chemistry Labs Looking to Go Digital

13.1 Understanding the Current Setup

13.2 Understanding Your Scientists and Their Needs

13.3 Understanding User‐based Technology Adoption

13.4 Breaking Down the Barriers Between Science and Technology

13.5 Making Your Laboratory Team Understand Why This Is Necessary

13.6 Working with Domain Experts

13.7 Choosing the Right Software

13.8 Changing Attitude and Organization

References

14 Electronic Lab Notebook Implementation in a Diagnostics Company

14.1 Making the Decision

14.2 Problems with Paper Notebooks

14.3 Determining Laboratory's Needs

14.4 Testing

14.5 A Decision

14.6 How to Structure the ELN

14.7 Conclusion

15 Identifying and Overcoming Digitalization Challenges in a Fast‐growing Research Laboratory

15.1 Why Going Digital?

15.2 Steps to Introduce ELNs in Lab Practice

15.3 Creating the Mindset of a Digital Scientist

15.4 The Dilemma of Digitalization in Academia

16 Turning Paper Habits into Digital Proficiency

16.1 Five Main Reasons for the Implementation of a Digital System to Manage the Research Data

16.2 The Six‐step Process of Going from Paper to Digital

16.3 Onboarding All Team Members and Enhancing the Adoption of the New Technology in the Lab

16.4 Benefits of Switching from Paper to Digital

17 Going from Paper to Digital: Stepwise Approach by the National Institute of Chemistry (Contract Research)

17.1 Presentation of our CVTA Laboratory

17.2 Data Management Requirements Explained in Detail

17.3 Going from Paper to Digital

17.4 Implementation of SciNote (ELN) to CVTA System

17.5 Suggestions for Improvements and Vision for the Future

References

18 Wet Lab Goes Virtual: In Silico Tools, ELNs, and Big Data Help Scientists Generate and Analyze Wet‐lab Data

18.1 CRISPR‐Cas9 Explained

18.2 Introduction of the Digital Solutions and ELN into the Laboratory

18.3 The Role of the ELN and In Silico Tools in the Genome‐editing Process

18.4 The Role of the ELN and In Silico Tools in the Protein Design Process

References

Note

19 Digital Lab Strategy: Enterprise Approach

19.1 Motivation

19.2 Designing a Flexible and Adaptable Architecture

19.3 There is Only One Rule: No Rules

19.4 The Lab Digitalization Program Compass

19.5 Conclusion

References

Part V: Continuous Improvement

20 Next Steps – Continuity After Going Digital

20.1 Are You Ready to Upgrade Further?

20.2 Understanding the Big Picture

20.3 What to Integrate First?

20.4 Budgeting

20.5 Continuous Improvement as a Value

References

Part VI: Vision of the Future and Changing the Way We Do Science

21 Artificial Intelligence (AI) Transforming Laboratories

21.1 Introduction to AI

21.2 Artificial Intelligence in Laboratories

21.3 Process Monitoring

21.4 Discussion – Human in the Loop

References

22 Academic's Perspective on the Vision About the Technology Trends in the Next 5–10 Years

22.1 Hybrid Solutions

22.2 Voice Technologies

22.3 Smart Assistants

22.4 Internet of Things

22.5 Robot Scientists

22.6 Making Science Smart – Incorporating Semantics and AI into Scientific Software

22.7 Conclusions

References

23 Looking to the Future: Academic Freedom Versus Innovation in Academic Research Institutions

23.1 Introduction

23.2 Corporate Culture Versus Academic Freedom

23.3 Spoiled for Choice, but Still Waiting for the Perfect Solution

23.4 Building a Single, Shared Infrastructure for Research Data Management

23.5 A Journey of a Thousand Miles Begins with a Single Step

Reference

24 Future of Scientific Findings: Communication and Collaboration in the Years to Come

24.1 Preprints: Reversing the Increased Time to Publish

24.2 Virtual Communities

24.3 Evolving Publishing Models

24.4 Funders Are Starting to Play a Role in Facilitating and Encouraging Rapid Sharing and Collaboration

24.5 Conclusion

References

25 Entrepreneur's Perspective on Laboratories in 10 Years

25.1 Data Recording

25.2 Recognition of Voice and Writing

25.3 Data Recording in the Future

25.4 Experimental Processes

25.5 Research Project Management

25.6 Experimental Planning

25.7 Virtual Reality

25.8 Smart Furniture

25.9 Experiment Execution

25.10 Laboratory Automation Trends

25.11 Cloud Laboratories

25.12 Data Analysis Trends

25.13 Artificial Intelligence

25.14 Data Visualizations and Interpretation

25.15 Databases

25.16 Conclusion

References

Index

End User License Agreement

List of Tables

Chapter 19

Table 19.1 Example of a lab solutions portfolio.

List of Illustrations

Chapter 1

Figure 1.1 Complex, multivariate concept of lab transformation.

Figure 1.2 Virtual and real design‐make‐test‐analyze (DMTA) concept.

Figure 1.3 Hypothesis‐experiment‐analyze‐share (HEAS) cycle.

Figure 1.4 Request‐experiment‐analyze‐feedback (REAF) process.

Figure 1.5 Digital data life cycle.

Chapter 2

Figure 2.1 The Von Neumann architecture.

Figure 2.2 Operating system's role in modern computers.

Figure 2.3 On‐site setup.

Figure 2.4 Cloud setup.

Figure 2.5 Explanation of the differences between Private, IaaS, PaaS, and S...

Figure 2.6 Transformation of data.

Figure 2.7 Two options to achieve software interoperability by employing API...

Figure 2.8 Diagram 1.

Figure 2.9 Diagram 2.

Figure 2.10 Diagram 3.

Figure 2.11 Most popular open‐source license types.

Chapter 3

Figure 3.1 Types of software found in the laboratory.

Chapter 4

Figure 4.1 A list of interfaces/pathways that malicious adversaries can pote...

Figure 4.2 A man‐in‐the‐middle attack.

Figure 4.3 Differences between the two types of encryption.

Chapter 7

Figure 7.1 Digital transformation encompasses at least three aspects: people...

Figure 7.2 An example of a simple data flow of a laboratory.

Figure 7.3 SWOT analysis: Strengths, Weaknesses, Opportunities, and Threats....

Chapter 10

Figure 10.1 An example of a graphical representation of stakeholder analysis...

Figure 10.2 Gradual rollout of a new software solution.

Chapter 12

Figure 12.1 The different stages of scientific research.

Chapter 16

Figure 16.1 Organization of projects and experiments in SciNote.

Figure 16.2 Example of a different approach to organization of projects and ...

Figure 16.3 Example of using Microsoft Office online files in SciNote.

Figure 16.4 Team protocols repository in SciNote.

Figure 16.5 Choosing to link (or not) the protocol with the version saved in...

Figure 16.6 Tracking the day‐to‐day tasks and their completion progress.

Figure 16.7 Records of lab technicians' work progress in SciNote.

Figure 16.8 Organizing results in SciNote.

Figure 16.9 Organizing results in SciNote, under the dedicated Results secti...

Chapter 17

Figure 17.1 Creating a team in SciNote electronic lab notebook.

Figure 17.2 Setting time and date records in SciNote electronic lab notebook...

Figure 17.3 Levels of user roles and permissions in SciNote.

Figure 17.4 CVTA form no. 49 for change control management (consists of Modu...

Figure 17.5 Workflow for change control management in SciNote.

Figure 17.6 Example from SciNote – change control, task M1.

Figure 17.7 Conclusion of the change – task M2.

Figure 17.8 Inventory for change control management.

Figure 17.9 Workflow for investigations of OOS results.

Figure 17.10 Detailed audit trail within a task, where change in text (old v...

Chapter 18

Figure 18.1 Examples of the saved results comprising hundreds of image files...

Figure 18.2 Folded protein structure in stick and rod representation. Image ...

Figure 18.3 Redesign of zinc finger backbone structure.

Figure 18.4 Antibody structure in (a) schematic view and (b) cartoon view. I...

Figure 18.5 PDB file format and its three‐dimensional structure.

Figure 18.6 An example of antibody redesign information saved in the results...

Chapter 19

Figure 19.1 Enterprise system architecture diagram.

Figure 19.2 Simplified physical network diagram.

Figure 19.3 Example of a radar chart to measure the laboratory digital matur...

Chapter 20

Figure 20.1 Schematic representation of connections if all software tools co...

Figure 20.2 An example of a typical data workflow map in a “paper‐based” lab...

Figure 20.3 An example of a much simpler data workflow in a digital lab afte...

Chapter 21

Figure 21.1 Artificial intelligence can support laboratories on different po...

Chapter 22

Figure 22.1 Smart laboratory example.

Chapter 24

Figure 24.1 Monthly submissions to bioRxiv. Numbers based on: http://api.bio...


Digital Transformation of the Laboratory

A Practical Guide to the Connected Lab

 

Edited by

Klemen Zupancic

Tea Pavlek

Jana Erjavec

 

 

The Editors

Klemen Zupancic

SciNote LLC

3000 Parmenter St.

53562 Middleton WI

United States

Tea Pavlek

SciNote LLC

3000 Parmenter St.

53562 Middleton WI

United States

Jana Erjavec

BioSistemika LLC

Koprska ulica 98

1000 Ljubljana

Slovenia

Cover Image: Tea Pavlek

All books published by WILEY-VCH are carefully produced. Nevertheless, authors, editors, and publisher do not warrant the information contained in these books, including this book, to be free of errors. Readers are advised to keep in mind that statements, data, illustrations, procedural details or other items may inadvertently be inaccurate.

Library of Congress Card No.:

applied for

British Library Cataloguing-in-Publication Data

A catalogue record for this book is available from the British Library.

Bibliographic information published by the Deutsche Nationalbibliothek

The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at <http://dnb.d-nb.de>.

© 2021 WILEY-VCH GmbH, Boschstr. 12, 69469 Weinheim, Germany

All rights reserved (including those of translation into other languages). No part of this book may be reproduced in any form – by photoprinting, microfilm, or any other means – nor transmitted or translated into a machine language without written permission from the publishers. Registered names, trademarks, etc. used in this book, even when not specifically marked as such, are not to be considered unprotected by law.

Print ISBN: 978-3-527-34719-3

ePDF ISBN: 978-3-527-82505-9

ePub ISBN: 978-3-527-82506-6

oBook ISBN: 978-3-527-82504-2

Preface

The subject of digital transformation is actually about you.

Your science, your everyday work environment, your partnerships and collaborations, and the impact of your work on the future of scientific progress.

Welcome to this book.

As the brilliant astronomer Maria Mitchell once said, “mingle the starlight with your lives and you won't be fretted by trifles.”

The greater meaning of digital transformation shifts the perspective toward the global scheme of things. The key question behind lab digitalization and digital transformation is this: are we improving the quality, efficiency, and pace of innovation?

Lab digitalization is a people‐driven initiative that aims to address the global challenges and provide solutions, backed by unquestionable integrity of traceable and reproducible scientific data.

At the moment, regardless of the laboratory type or size, people are struggling with the growing amount of generated data and leveraging its value. It is extremely challenging to organize data and keep everything traceable and reusable long term.

To address the challenge, modularity and flexibility are being incorporated on different levels of lab operations. Labs are becoming inviting spaces suitable for interdisciplinary partnerships in a digital, virtual, or personal environment. Data integrity initiatives and setup of new, digital systems prioritize integration of all tech solutions used in the lab for optimal performance. Through effective integration of tools, improved scientist‐to‐scientist interactions and intellectual contributions, and quality change management, lab digitalization places the human element at the very forefront of the overall progress toward the digital future.

This can be intimidating to some and exhilarating to others.

That is why this book is divided into modules: Inspiration, Knowledge Base, Practical, Case Studies, Continuous Improvement, and Vision of the Future. Each module covers different aspects of lab digitalization.

Inspiration

We start this book with an inspiring overview of lab evolution, new technologies, and new science being done. It will give you a complete overview of the subject of laboratories of the future and, hopefully, add to the vision of your own career in science and technology.

Knowledge Base

The Knowledge Base section focuses on the crucial terms you need to understand. It will give you a solid foundation of knowledge that you can apply as your lab grows and evolves.

Practical

The Practical chapters give you examples and guidance on defining your lab's digitalization strategy.

Case Studies

We present different case studies and expert comments on the subject of going from paper to digital. You will be able to read how different laboratories and professionals approached the subject and put it into practice, and what their conclusions, advice, and lessons learned are.

Continuous Improvement

We take a closer look at the steps that follow digitalization.

Vision of the Future and Changing the Way We Do Science

With continuous improvements in mind, we conclude the book with insightful expert comments on the subject of the future of science. Many of the described technologies are already becoming important, and here we identify those that might shape the next 5–10 years and change the way we do science.

As you read this book, you will gain holistic knowledge on the digital transformation of the laboratory. Tracking, analyzing, and leveraging the value of the data you collect, by implementing tools that empower the people in your lab, is the main point of this journey.

Using this knowledge, you will be able to start defining exactly what you want to achieve. Once you clarify your main goals, you can work back through the processes in your lab and see which of them need to be digitalized.

That is when you will get the real incentive to do it.

You will understand whether you are merely using technology as a convenience to support the system you already have, or whether you are ready to use better technology to change and improve that system.

You will realize what kind of decisions you need to make throughout the cycle.

Selecting the right digital solutions is quite a challenge. It is important to think about how potential solutions will fit into your existing architecture. An investment of time, energy, and budget is always involved, and it grows especially costly if the solutions are not integrated properly or your team is not in sync.

The knowledge you will gain will enable you to measure and evaluate the impact of digitalization. How will the use of particular tools improve specific parts of your processes to reach your goals within the given time frames?

Keeping a razor‐sharp focus and determination is the most potent driver of digitalization.

All solutions work, but the execution is crucial. You will learn how to take an agile approach, define the value for your team, start small, and scale up efficiently and successfully.

This book is a result of collaboration between different authors – researchers, business owners, consultants, managers, and professors who wrote about their vast experience and provided valuable perspective on the subject of digital transformation. Because every lab is different, and there are as many use cases as there are labs, our aim was to introduce you to a digital mindset that will enable you to find the best solution for your lab.

This book guides you through taking the time to understand the basics of the technology, adopting the digital mindset, including your team and addressing their concerns and challenges, reading how other labs began paving their digital way, and staying inspired along the way.

Let's dive in.

Tea Pavlek

SciNote LLC, USA

Part I: Inspiration

We start this book with an inspiring overview of lab evolution, new technologies, and new science being done. It will give you a complete overview of the subject of laboratories of the future and, hopefully, add to the vision and purpose of your own career in science and technology.

1 The Next Big Developments – The Lab of the Future

Richard Shute and Nick Lynch

Curlew Research, Woburn Sands, UK

1.1 Introduction

Steve Jobs once said that “the biggest innovations of the 21st century will be at the intersection of biology and technology”; in this (r)evolution, the lab will most definitely play a key role.

When speculating on the future digital transformation of life sciences R&D, one must consider how the whole lab environment, and the science that goes on in that lab, will inevitably evolve and change [1, 2]. It is unlikely that an R&D lab in 2030, and certainly in 2040, will look and feel like a comparable lab from 2020. So what are the likely big new technologies, processes, and ways of working that will make that lab of the future (LotF) so different? This section introduces some of the new developments in technology and in science that we think will change and influence the life science lab environment over the coming decade.

1.2 Discussion

Before going into the new technology and science in detail, it is important to recognize that this lab evolution will be driven not just by new technologies and new science. In our view, there are four additional broader, yet fundamental and complementary attributes that influence how a lab environment changes over time. They are:

People and culture considerations

Process developments and optimization

Data management improvements

Lab environment and design

When we add the fifth major driver of change – new technology (including new science) – it becomes clear that digital transformation is a complex, multivariate concept (Figure 1.1).

Figure 1.1 Complex, multivariate concept of lab transformation.

In this section, we discuss how each of these high‐level attributes will influence the changing lab and the expectations of the users. For all five areas, we include what we think are some of the most important aspects, which we believe will have the most impact on the “LotF.”

1.2.1 People/Culture

The LotF and the people who work in it will undoubtedly be operating in an R&D world with an even greater emphasis on global working and cross‐organization collaboration. Modern science is also becoming more social [3], and the most productive and successful researchers will be familiar with the substance and methods of each other's work, breaking down the barriers to collaboration even further. These collaborative approaches will foster and encourage individuals' capacity to adopt new research methods as they become available; we saw this with the fast uptake of clustered regularly interspaced short palindromic repeat (CRISPR) technology [4]. “Open science” [5] will grow ever more important in driving scientific discovery. This will be enabled by the increased use of new cryptographic Distributed Ledger Technology (DLT) [6], which will massively reduce the risk of IP being compromised [7]. The LotF will also enable more open, productive, collaborative working through vastly improved communication technology (5G moving to 6G) [8]. The people working in these labs will have a much more open attitude, culture, and mindset, given the influence of technology such as smartphones on their personal lives.

Robotics and automation will be ubiquitous, but with more automated assistance, the density of people in the lab will likely drop, allowing scientists to focus on key aspects and complex parts of the experiments. As a consequence, issues around safety and “lone working” will grow, and a focus on the interaction points which scientists have with automation will develop to ensure they are properly protected. For the few remaining lab technicians, not only will safe working become of increased importance, but the need for organizations to deliver a better “user experience” (UX) in their labs will become key to help them both attract the smaller numbers of more expert technicians and also retain them. The lab technician's UX will be massively boosted by many of the new technologies already starting to appear in the more future‐looking labs, e.g. voice recognition, augmented reality (AR), immersive lab experience, a more intelligent lab environment, and others (see later sections).

1.2.2 Process

The lab processes, or “how” science gets done in the LotF, will be dominated by robotics and automation. But there will be another strong driver forcing lab processes and mindsets to change over the next 5–10 years' time: sustainability. Experiments will have to be designed to minimize the use of “noxious” materials (e.g. chemical and biological) throughout the process and in the cleanup once the experiment is complete. Similarly, the use of “bad‐for‐the‐planet” plastics (e.g. 96/384/1536‐well plates) will diminish. New processes and techniques will have to be conceived to circumvent what are standard ways of working in the lab of 2020. In support of the sustainability driver, miniaturization of lab processes will grow hugely in importance, especially in research, diagnostic, and testing labs. The current so‐called lab‐on‐a‐chip movement already offers many examples of process miniaturization [9]. Laboratories and plants focused on manufacturing will continue to work at scale, but the search for more environmentally conscious methods will continue: climate‐friendly solvents, reagents, and catalysts will grow ever more important [10]. There will also be a greater focus on better plant design. For example, 3D printing [11] could allow manufacturing processes to be localized near the point of usage.

In the previous paragraph, we refer to “research, diagnostic, and testing labs” and to manufacturing “plant.” We believe there is a fundamental difference between what we are calling hypothesis‐ and protocol‐driven labs, and this is an important consideration when thinking about the LotF. The former are seen in pure research/discovery and academia. The experiments being undertaken in these labs may be the first of their kind and will evolve as the hypothesis evolves. Such labs will embrace high throughput and miniaturization. Protocol‐driven labs, where pure research is not the main focus, include facilities such as manufacturing, diagnostic, analytical, or gene‐testing labs. These tend to have a lower throughput, though their levels of productivity are growing as automation and higher quality processes enable ever higher throughput. In these labs, reproducibility combined with robust reliability is key. Examples in this latter area include the genomic screening and testing labs [12, 13], which have been growing massively in the past few years. For these labs the already high levels of automation will continue to grow.

Figure 1.2 Virtual and real design‐make‐test‐analyze (DMTA) concept.

In the hypothesis‐driven lab [14], with the strong driver of sustainability combined with the growth of ever‐higher‐quality artificial intelligence (AI) and informatics algorithms, there will be more in silico, virtual “design‐make‐test‐analyze” (vDMTA) and less tangible Make and Test (see Figure 1.2). Fewer “real” materials will actually be made and tested, and those that are will be produced on a much smaller scale.
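The vDMTA idea can be sketched as a simple optimization loop in which an in silico scoring model stands in for physical synthesis and assay work. The sketch below is purely illustrative: the function names (propose_designs, simulate_test, vdmta) and the random scoring are hypothetical placeholders, not part of any real cheminformatics toolkit.

```python
import random

def propose_designs(seed_designs, n=8):
    """Design: generate candidate variants from the current best designs (illustrative)."""
    per_seed = n // len(seed_designs)
    return [f"{d}-v{random.randint(1, 999)}" for d in seed_designs for _ in range(per_seed)]

def simulate_test(design):
    """Virtual Make + Test: a scoring model replaces synthesis and assay.
    Here a random score stands in for a real predictive model."""
    return random.uniform(0.0, 1.0)

def vdmta(seed_designs, cycles=3, keep=2):
    """Run several virtual DMTA cycles, feeding the best designs back into Design."""
    designs = list(seed_designs)
    for _ in range(cycles):
        candidates = propose_designs(designs)                # Design
        scored = [(simulate_test(c), c) for c in candidates] # virtual Make + Test
        scored.sort(reverse=True)                            # Analyze: rank by score
        designs = [c for _, c in scored[:keep]]              # only top designs survive
    return designs
```

In a real pipeline, only the handful of designs surviving such a loop would be made and tested physically, which is exactly the reduction in tangible Make and Test the text describes.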

Finally, as labs get more sophisticated – with their high levels of automation, robotics, miniaturization, and data production (but with fewer staff) – combined with the need for those facilities to be both safe and sustainable, the concept of “laboratory as a service” (LaaS) will grow [15]. The LotF will not be a static, self‐contained facility dedicated to a single scientific area. It will be a blank canvas, as it were, in a large warehouse‐like facility or cargo container [16], which can be loaded up on demand with the necessary equipment, automation, and robotics to do a contracted piece of lab work. That piece of work might be a chemical synthesis or a cell‐based pharmacological assay one day, and an ex vivo safety screen in the same area the next day. The key will be a modular design supported by fully connected devices.
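The LaaS concept above amounts to matching a declarative job request against the modules a blank-canvas facility can load. The sketch below illustrates that matching step only; the module names and the can_host helper are hypothetical, chosen for this example.

```python
def can_host(required_modules, loadable_modules):
    """Return (hostable, missing): a facility hosts a job only if every
    required module can be loaded into the blank-canvas space."""
    missing = set(required_modules) - set(loadable_modules)
    return (not missing, missing)

# Hypothetical catalog of modules one facility can load on demand.
facility = {"liquid-handler", "plate-reader", "synthesis-rig", "hplc", "incubator"}

# A chemical synthesis one day ...
ok_chem, _ = can_host({"synthesis-rig", "hplc"}, facility)
# ... and a cell-based assay in the same space the next.
ok_assay, _ = can_host({"liquid-handler", "plate-reader", "incubator"}, facility)
```

The design point is that the job, not the room, defines the lab: scheduling software would use a check like this to decide which contracted work a given space can be reconfigured for.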

1.2.3 Lab Environment and Design

The lab environment, its design, usability, and sustainability have been mentioned previously in this section and elsewhere in the book, but it is fair to say that all labs will face pressure [17, 18] to design sustainable spaces [19] that can keep up with the emerging technical trends, as well as the usability and design features needed to support a new generation of scientists. These drivers will combine to influence how the LotF evolves and how experiments are performed. Research institutions are already creating more “open” lab areas to support interdisciplinary teamwork, collaborative working, and joint problem solving, rather than the previous “siloed” departmental culture. This will continue in the LotF. The growth of innovation clusters [20] and lab coworking spaces will require more consideration of how shared automation and lab equipment can be used effectively and securely by groups who may be working for different organizations and who will want to ensure their data and methods are stored and protected in the correct locations. Effective scheduling will be critical in the LotF to enable high productivity and to ensure that the high value of the automation assets is realized.

1.2.4 Data Management and the “Real Asset”

It is true of 2020, just as it was 50 years ago and will be in 50 years’ time, that the primary output of R&D, in whatever industry, is data. The only physical items of any value are perhaps some small amounts of a few samples (and sometimes not even that) plus, historically, a lot of paper! It is therefore not surprising that the meme “data is the new oil” [21] has come to such prominence in recent times. While it may be viewed by many as hackneyed, and by many more as fundamentally flawed [22], the idea carries a lot of credence as we move toward a more data‐driven global economy. One of the main flaws in the oil analogy is that few organizations are able to suitably refine data into the appropriate next piece of the value chain, whereas oil has a very clear refining process and value chain. Furthermore, the “Keep it in the Ground” [23, 24] sustainability momentum makes the data‐oil analogy perhaps even less useful. However, within the LotF, and in a more open, collaborative global R&D world, experimental data, both raw and refined, will grow in criticality. Without doubt, data will remain a primary asset arising from the LotF.

At this point then it is worth considering how data and data management fit into the processes that drive the two fundamental lab types, which we have referred to earlier, namely (i) the hypothesis‐driven, more research/discovery‐driven lab and (ii) the protocol‐driven, more “manufacturing”‐like lab.

1.2.4.1 Data in the Hypothesis‐driven, Research Lab

In a pure research, hypothesis‐driven lab, whether it is in life science, chemical science, or physical science, there is a fundamental, cyclical process operating. This process underpins all of scientific discovery; we refer to it as the “hypothesis‐experiment‐analyze‐share” (“HEAS”) cycle (see Figure 1.3) or, alternatively, if one is in a discovery chemistry lab, for example a medicinal chemistry lab in biopharma, DMTA (see Figure 1.2).

The research scientists generate their idea/hypothesis and design an experiment to test it. They gather the materials they need to run that experiment, which they then perform in the lab. All the time they capture observations on what is happening. At the end they “work up” their experiment – continuing to capture observations and raw data. They analyze their “raw” data and generate results (“refined” data); these determine whether the experiment has supported their hypothesis or not. They then communicate those results, observations, and insights more widely. Ultimately, they move on to the next, follow‐on hypothesis; then it is off round the cycle they go again, until they reach some sort of end point or final conclusion. All the while they are generating data: raw data off instruments, captured visual observations, and refined data, which are more readily interpretable and can more easily lead to insights and conclusions.
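The HEAS cycle lends itself to a simple illustration in code. The sketch below is purely our own, hypothetical framing (the class, function, and field names are invented, not drawn from any lab informatics system): each pass around the cycle captures raw data, refines it into results, records it, and either stops or spawns a follow‐on hypothesis.

```python
from dataclasses import dataclass, field

@dataclass
class Experiment:
    """One pass around the HEAS cycle for a single hypothesis."""
    hypothesis: str
    raw_data: list = field(default_factory=list)      # off-instrument readings, observations
    refined_data: dict = field(default_factory=dict)  # interpretable results

def analyze(raw):
    """Toy 'refine' step: summarize raw readings into a result."""
    return {"mean": sum(raw) / len(raw), "n": len(raw)}

def heas_cycle(hypothesis, run_experiment, supports, max_rounds=5):
    """Iterate hypothesis -> experiment -> analyze -> share until a conclusion."""
    history = []
    for _ in range(max_rounds):
        exp = Experiment(hypothesis)
        exp.raw_data = run_experiment(hypothesis)   # Experiment/Observe
        exp.refined_data = analyze(exp.raw_data)    # Analyze
        history.append(exp)                         # Share (here: simply record)
        if supports(exp.refined_data):              # end point reached?
            break
        hypothesis = f"refined({hypothesis})"       # follow-on hypothesis
    return history
```

The `run_experiment` and `supports` callables stand in for the lab work and the scientist's judgment, respectively; in a real system those are exactly the steps the rest of this chapter seeks to automate and capture.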

Figure 1.3 Hypothesis‐experiment‐analyze‐share (HEAS) cycle.

1.2.4.2 Data in the Protocol‐driven Lab

In the protocol‐driven lab, whether it is in a manufacturing or sample testing domain, there is again a fundamental process which operates to drive the value chain. Unlike the “HEAS” cycle this is more of a linear process. It starts with a request and ends in a communicable result or a shippable product. This process, which we refer to as the “request‐experiment‐analyze‐feedback” (REAF) process, is outlined in Figure 1.4.

There are many close similarities between the linear REAF process and the HEAS cycle, especially in the Experiment/Observe and Analyze/Report steps, but the REAF process does not start with an idea or hypothesis. REAF represents a service: it starts with a formal request, for example to run a protocol to manufacture a good or to test a sample, and ends with a product or a set of results, which can be fed back to the original customer or requester. As we noted earlier, it is increasingly likely that the LotF will be set up with a laboratory as a service (LaaS) mentality; REAF may therefore be much more broadly representative of how labs of the future might operate.

Figure 1.4 Request‐experiment‐analyze‐feedback (REAF) process.

It is important to acknowledge that the data and information, which drive Request and Feedback, are quite different in REAF than in the corresponding sections in HEAS. With the focus of this book being on Experiment/Observe, and to a degree Analyze, we will not say anything more about Request and Feedback (from REAF) and Hypothesis and Share (from HEAS). Instead, the remainder of this section focuses on what the Experiment and Analyze data management aspects of the LotF will look like, whether that LotF is a hypothesis‐ or a protocol‐driven lab. This is made simpler by the fact that in the Experiment/Observe and Analyze/Report steps, the data challenges in the two different lab types are, to all intents and purposes, the same. In the remainder of this section we treat them as such.

1.2.4.3 New Data Management Developments

So what new developments in data management will be prevalent in both the hypothesis‐ and the protocol‐driven labs of 2030? In the previous two sections we asserted that these labs will be populated by fewer people; there will be more robotics and automation, and the experiment throughput will be much higher, often on more miniaturized equipment. Building on these assertions then, perhaps the most impactful developments in the data space will be:

The all‐pervasiveness of the internet of things (IoT) [25, 26]. This will lead, in the LotF, to the growth of internet of laboratory things (IoLT) environments; this will also be driven by ubiquitous 5G communications capability.

The widespread adoption of the findable, accessible, interoperable, and reusable (FAIR) data principles. These state that all data should be FAIR [27].

The growing use of improved experimental data and automation representation standards, e.g. SiLA [28] and Allotrope [29].

Data security and data privacy. These two areas will continue to be critical considerations for the LotF.

The ubiquity of “Cloud.” The LotF will not be able to operate effectively without access to cloud computing.

Digital twin approaches. These will complement both the drive toward labs operating more as a service and the demand from remote service customers wanting to see into, and to directly control from afar, what is happening in the lab. Technologies such as augmented reality (AR) will also help to enable this (see Sections 1.2.5 and 1.2.6).

Quantum computing [30–33]. This will move from research to production and so impact just about everything we do in life, not just in the LotF. Arguably, quantum computing might have a bigger impact in the more computationally intensive parts of the hypothesis‐ and protocol‐driven LotF, e.g. Idea/Hypothesis design and Analyze/Insight, but it will still disrupt the LotF massively. We say more on this in Sections 1.2.5 and 1.2.6.

The first three of these developments are all related to the drive to improve the speed and quality of the data/digital life cycle and the overall data supply chain. That digital life cycle aligns closely to the HEAS and REAF processes outlined in Figures 1.3 and 1.4 and can be summarized as follows (see Figure 1.5):

Figure 1.5 Digital data life cycle.

IoT technology [34] will allow much better connectivity between the equipment in the LotF. This will enable better, quicker, and more precise control of the lab kit, as well as more effective capturing of the raw data off the equipment. This in turn will allow the next stage in the life cycle – “Analyze Data” – to happen sooner and with more, better quality data. This improved interconnectedness in the lab will be made possible by the same 5G communication technology which will be making the devices and products in the home of 2025 more networked and more remotely controllable.

As improved instrument interconnectedness and IoLT enable more data to be captured by more instruments more effectively, the issue of how you manage the inevitable data flood to make the deluge useful comes to the fore. The biggest initiative in 2020 to maximize the benefits of the so‐called big data [35] revolves around the FAIR principles. These state that “for those wishing to enhance the reusability of their data holdings,” those data should be FAIR. In the LotF, the FAIR principles will need to be fully embedded in the lab culture and operating model. Implementing FAIR [36] is very much a change process rather than just introducing new technology. If fully implemented, though, FAIR will make it massively easier for the vast quantities of digital assets generated by organizations to be made much more useful. Data science as a discipline, and data scientists (a role which can be considered currently to equate to that of “informatician”), will grow enormously in importance and size/number. Organizations that are almost purely data driven will thrive, with any lab work they feel the need to do being outsourced via LaaS [37] to flexible, cost‐effective LotFs that operate per the REAF process.
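To make the “born FAIR” idea concrete, a minimal sketch of capture‐time metadata tagging is shown below. The field names and placeholder URI are our own invention, loosely mapped to the four FAIR letters; this is an illustration of the intent, not a formal FAIR implementation.

```python
import uuid
import datetime

def make_fair_record(results, instrument, protocol_id, license="CC-BY-4.0"):
    """Wrap raw results with minimal FAIR-style metadata at capture time.

    Findable:      a globally unique identifier
    Accessible:    a retrieval location (here just a placeholder URI)
    Interoperable: an explicit open serialization (JSON-ready dict) plus a schema tag
    Reusable:      provenance and a clear usage license
    """
    record_id = str(uuid.uuid4())
    return {
        "id": record_id,                                  # F: unique identifier
        "uri": f"https://data.example.org/{record_id}",   # A: placeholder location
        "schema": "example-lab-result/v1",                # I: declared schema (invented)
        "license": license,                               # R: usage terms
        "provenance": {
            "instrument": instrument,
            "protocol": protocol_id,
            "captured": datetime.datetime.utcnow().isoformat() + "Z",
        },
        "results": results,
    }

# Hypothetical usage: a plate-reader result tagged the moment it is produced
record = make_fair_record({"od600": 0.42}, "plate-reader-7", "PROT-0031")
```

The point of the sketch is that the metadata is attached by the instrument workflow itself, at generation time, rather than retrofitted later – which is precisely the cultural change that implementing FAIR demands.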

Supporting the growth of FAIR requires the data that is generated in these LaaS LotFs to be easily transferable back to the requester/customer in a format which the lab can generate easily, accurately, and reproducibly, and which the customer can import and interpret, again, easily, accurately, and reproducibly. This facile interchange of “interoperable” data will be enabled by the widespread adoption of data standards such as SiLA and Allotrope. We describe these new data standards in more detail in the following section.

Two additional, significant data considerations for the LotF are data security and data privacy, just as they are now. The more LotF services are operated outside the “firewall” of an organization, and the more future labs are driven by data, the more risks potentially arise from accidental or malicious activities. Keeping those risks low, through continued diligence and data security, will ensure that the LotF is able to develop and operate to its full capability. Similarly, in labs that work with human‐derived samples (blood, tissues, etc.), the advent of regulations such as the General Data Protection Regulation (GDPR) [38, 39], along with the historical stringency surrounding informed consent [40] over what can happen to human samples and the data arising from their processing, will put even more pressure on the organizations that generate, and are accountable for, human data to ensure these data are effectively secured. Improved adherence to the FAIR data principles, especially Findability and Accessibility, will ensure that LotFs working with human‐derived materials can be responsive to data privacy requests and are not compromised.

Going hand in hand with the data explosion of the past decade has been the evolution of the now ubiquitous, key operational technology of “Cloud Computing.” As explained by one of the originating organizations in this area, “cloud computing is the delivery of computing services – including servers, storage, databases, networking, software, analytics, and intelligence – over the Internet (the cloud) to offer faster innovation, flexible resources, and economies of scale.” [41] In the context of LotF, assuming that the equipment in the lab is fully networked, cloud computing means that all the data generated by the lab can be quickly, fully, and securely captured and stored on remote infrastructure (servers). This book is not the place to describe cloud computing in detail, but it should be sufficient to say that the LotF will not be reliant on IT hardware close to its location (i.e. on‐site) but will be highly reliant on speedy, reliable, available networks and efficient, cost‐effective cloud computing.

Finally, there is a data and modeling technology, which has been present in industries outside life science for many years, which could play a growing role in the LotF which is more automated and more remote. This is the technology termed “digital twin.” [42, 43] We say more on this exciting new technology in Section 1.2.5.1.

1.2.5 New Technology

In any future‐looking article we can only make some best guesses as to the new technologies and science that could be important during the next 5–10 years. In this section we make some suggestions as to what new technologies we feel will impact the LotF, and what new science will be happening in those future labs. In the first part of this section, we focus on new technologies. In the second part, we suggest some scientific areas which we feel will grow in importance and hence might drive the evolution of the LotF and the technology that is adopted in that new lab environment.

New technologies will undoubtedly play a major role in driving the development of the critical components within the LotF, but their introduction and usage need to be appropriate to the type of lab being used. The role of the new technologies must be aligned to the future challenges and needs of the lab environment. These needs include, more specifically:

Flexibility and agility of the experiment cycles, balancing between prediction (in silico) and physical (in vitro) experiments

Improved data collection and experiment capture (e.g. “data born FAIR”)

Reproducibility of the experiment processes

Enhancements to the scientists' user experience (UX) and capabilities in the lab.

To emphasize these aspects, we focus on three broad areas in this section:

Lab automation integration and interoperability

Quantum computing and the LotF

Impact of AI and machine learning (ML).

1.2.5.1 Lab Automation Integration and Interoperability

Lab instrument integration and interoperability to support higher levels of lab automation have been evolving quickly and will continue to do so, driven by pressure from scientists and lab managers to, above all, have better ways to manage and control their equipment [44–46]. Capabilities as diverse as chemical synthesis [47] and next‐generation sequencing (NGS) [48] are seeking to better automate their workflows to improve speed and quality and to align with the growing demands of AI in support of generative and experimental design as well as decision‐making [49]. An additional stimulus toward increased automation, integration, and interoperability is that of experiment reproducibility. The reproducibility crisis that exists in science today is desperately in need of resolution [50]. It is manifested not only in the inability to confidently replicate externally published experiments, but also in the inability to reproduce internal experiments – those performed within individual organizations. Poor reproducibility and uncertainty over experimental data will also reduce confidence in the outputs from AI; the mantra “rubbish in, rubbish out” will thus continue to hold true! Having appropriate automation and effective data management can support this vital need for repeatability, for example of biological protocols [51]. This will be especially important to support and justify the lab as a service business model, which we have mentioned previously. It is our belief that the increased reliability and enhanced data‐gathering capability offered by increased automation in the LotF will be one important way to help address the challenge of reproducibility.

New automation will continually become available: as an upgrade or replacement for existing equipment and workflows; to enhance and augment current automation; or to scale up more manual or emerging science workflows. When considering new automation, the choices for lab managers and scientists will depend on whether it is a completely new lab environment (a “green‐field site”) or an existing one (a “brown‐field site”).

As mentioned previously, the growth of integration protocols such as IoT [52] is expanding the options for equipment and automation to be connected [53]. The vision for how different workflows can be integrated – from single measurements (e.g. balance measurements), via medium‐throughput workflows (e.g. plate‐based screening), to high data volume processes such as high content screening (HCS) involving images and video – has the potential to be totally reimagined. IoT could enable the interconnectivity of a huge range of lab objects and devices, such as freezers, temperature control units, and fume hoods, which previously would have been more standalone, with minimal physical connectivity. All these devices could be actively connected into expanded data streams and workflows where the measurements they take, for example, temperature, humidity, and air pressure, now become a more integral part of the experiment record. This enhanced set of data collected during experiments in the LotF will be hugely valuable during later analysis to help spot more subtle trends and potential anomalies. Furthermore, these rich datasets could play an increasing role as AI is used more and more for data analysis; small fluctuations in the lab environment do have a significant impact on experimental results and hence reproducibility. As well as this passive sensor monitoring, there is also the potential for these devices to be actively controlled remotely; this opens up options for further automation and interaction between static devices and lab robots, which have been programmed to perform tasks involving these devices. As always, it will be necessary to select appropriate automation based on the lab's needs, the benefits the new automation and workflows can provide, and hence the overall return on investment (ROI).
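As a sketch of how such environmental readings could become part of the experiment record, the following (with invented device names, fields, and values) merges an environmental sensor stream into the experimental measurements by timestamp, so that each result carries the lab conditions under which it was taken:

```python
from datetime import datetime, timedelta

def annotate_with_environment(measurements, env_stream, window=timedelta(seconds=30)):
    """Attach to each experimental measurement the nearest-in-time
    environmental reading (temperature, humidity, ...) within `window`."""
    annotated = []
    for m in measurements:
        nearby = [e for e in env_stream if abs(e["ts"] - m["ts"]) <= window]
        env = min(nearby, key=lambda e: abs(e["ts"] - m["ts"])) if nearby else None
        annotated.append({**m, "environment": env})
    return annotated

# Hypothetical example: one plate-well reading plus one ambient sensor reading
t0 = datetime(2030, 1, 1, 9, 0, 0)
measurements = [{"ts": t0, "well": "A1", "signal": 1.7}]
env_stream = [{"ts": t0 + timedelta(seconds=5), "temp_c": 21.4, "humidity": 0.45}]
out = annotate_with_environment(measurements, env_stream)
```

In a real IoLT setup the environmental stream would arrive continuously from networked freezers, hoods, and room sensors; the enrichment step itself, however, is as simple as the sketch suggests, and it is what makes the later anomaly analysis possible.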

While the potential for these new systems with regard to improved process efficiency is clear, yet again, though, there is one vital aspect which needs to be considered carefully as part of the whole investment: the data. These LotF automation systems will be capable of generating vast volumes of data. It is critical to have a clear plan of how that data will be annotated and where it will be stored (to make it findable and accessible), in such a way to make it appropriate for use (interoperable), and aligned to the data life cycle that your research requires (reusable). A further vital consideration will also be whether there are any regulatory compliance or validation requirements.

As stated previously, a key consideration with IoT will be the security of the individual items of equipment and the overall interconnected automation [54, 55]. With such a likely explosion in the number of networked devices [56], each one could be vulnerable. Consequently, lab management will need to work closely with colleagues in IT Network and Security to mitigate any security risks. When bringing in new equipment it will be evermore important to validate the credentials of the new equipment and ensure it complies with relevant internal and external security protocols.

While the role of lab scientist and manager will clearly be majorly impacted by these new systems, also significantly affected will be the physical lab itself. Having selected which areas should have more, or more enhanced and integrated, lab automation, it is highly likely that significant physical changes to the lab itself will have to be made, either to accommodate the new systems themselves or to support enhanced networking needs.

In parallel to the lab environment undergoing significant change over the coming decades, new generations of scientists will be entering the workforce. Their expectations of what makes the LotF efficient and rewarding will differ from those of previous generations. The UX [57] for these new scientists should be a key consideration when implementing some of the changes mentioned in this book. For example, apps on mobile phones and tablets have transformed people's personal lives, but the development and adoption of apps for the lab have been slower. The enhanced usage of automation will very likely need to be managed through apps; they will therefore become a standard part of the LotF. One cultural caveat around the growth of lab apps should be flagged here. With apps enabling much more sophisticated control of automation operating 24/7, via mobile phones, outside “human working hours,” consideration will need to be given to the new scientists' work/life balance. If handled sensitively, though, developments such as lab apps could offer much‐increased efficiency and safety, as well as reducing experiment and equipment issues.

Voice‐activated lab workflows are also an emerging area, just as voice assistants have become popular in the home and in office digital workflows [58]. For the laboratory environment, the current challenges being addressed are how to enrich the vocabulary of the devices with the specific language of the lab, not only basic lab terms but also domain‐specific language, whether that is biology, chemistry, physics, or other scientific disciplines. As with IoT, specific pilots could not only help with the assessment of the voice‐controlled device or system but also highlight possible integration issues with the rest of the workflow. A lab workflow where the scientist has to use both hands, like a pianist, is a possible use case where voice activation and recording could have benefits. The ability to receive alerts or updates while working on unfamiliar equipment would also help to support better, safer experimentation.

As with voice control, the use of AR and virtual reality (VR) in the lab has shown itself to have value in early pilots and in some production systems [59]. AR is typically deployed via smart glasses, of which there is a wide range now in production. There are a number of use cases already where AR in the lab shows promise, including the ability to support a scientist in learning a new instrument or to guide them through an unfamiliar experiment. These examples will only grow in the LotF. To take another, rather mundane example, pipetting is one of the most familiar activities in the lab. In the LotF where low throughput manual pipetting is still performed, AR overlays could support the process and reduce errors. AR devices will likely supplement and enhance what a scientist can already do and allow them to focus even more productively.

Another area of lab UX being driven by equivalents in consumer devices is how the scientist physically interacts with devices beyond simple keyboards and buttons. Technologies such as gesture control and multitouch interfaces will very likely play an increasing role in controlling LotF automation. As with voice activation, these input and control devices will likely evolve to support the whole lab and not just a single instrument. Nevertheless, items such as projected keyboards could bring big benefits, making the lab even more digitally and technologically mature.

As mentioned before, there is another technology which could play a significant role in enhancing the UX in the LotF: the “digital twin” [60]. In brief, a digital twin is an in silico representation of a person, a process, or a thing. Its role has been evolving in recent years, such that digital twins can now be seen as virtual replicas of physical environments or objects which managers, data scientists, and business users can use to run simulations, prepare decisions, and manage operations [42, 61]. This technology has the potential to impact the LotF in two primary areas: (i) simulation and (ii) remote control.

Starting with simulation: unlike the physical world, which only ever shows you a picture of the present, a digital twin can review the past and simulate the future. The digital twin can therefore become an environment in which to pilot not only emerging technologies such as voice activation, AR, VR, and multigesture devices but also novel or redesigned workflows, without the need for full‐scale deployment. Indeed, with increased computational capability (provided by exascale computing and ultimately quantum computing – see Section 1.2.5.2), the processes that operate within the LotF will be simulatable to such a degree of sophistication that a person will be able to see, in silico, a high‐resolution representation of the technology, experiment, or process they are looking to perform, in a simulation of the lab in which it will run. This digital twin will allow the operator to check, for example, that a novel process is likely to run smoothly and deliver the output that is hoped for. While digital twin technology may be more applicable to the protocol‐driven lab, it may also have applicability in the research lab as a means of exploring “what‐if” scenarios prior to doing the actual physical experiment.
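The “simulate the future” capability can be hinted at with a toy model: a single‐state digital twin of a temperature‐controlled vessel, stepped forward in silico before committing to the physical run. All parameters and names here are invented for illustration; a production twin would rest on a validated physics model, but the review/simulate/decide pattern is the same.

```python
def simulate_vessel(temp_c, heater_w, minutes, ambient_c=20.0,
                    heat_gain=0.01, loss_rate=0.05):
    """Toy digital-twin step model: one temperature state, one control input.

    Each simulated minute the vessel gains heat from the heater and
    loses heat toward ambient (both coefficients are invented).
    Returns the full temperature trace for inspection.
    """
    trace = [temp_c]
    for _ in range(minutes):
        temp_c += heater_w * heat_gain              # heating from control input
        temp_c -= (temp_c - ambient_c) * loss_rate  # loss toward ambient
        trace.append(round(temp_c, 2))
    return trace

# "What-if" check before the physical run: does a 100 W setting overheat the vessel
# (safety limit assumed to be 40 C) within the first ten minutes?
trace = simulate_vessel(temp_c=20.0, heater_w=100.0, minutes=10)
safe = max(trace) < 40.0
```

Running such what‐if checks against the twin, rather than the real vessel, is exactly the low‐risk exploration described above for both protocol‐driven and research labs.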

Turning to digital twin technology and improved remote control, massively improved computational technology combined with advances in AR and VR will allow operators, who might be located nowhere near the lab in which their experiment is being run, to don appropriate AR/VR headsets and walk into an empty space that will “feel” to them like they are right inside the lab or even right inside the experiment itself. The potential for scientists to “walk” into the active site of an enzyme and “manually” dock the molecules they have designed, or for an automation operator to “step into” the reaction vessel running the large‐scale manufacturing of, say, a chemical intermediate to check that there are no clumps, or localized issues (e.g. overheating), will revolutionize how the LotF can operate, making it more likely to be more successful and, importantly, safer.

One final, obvious application of digital twin technology is where that LotF is not even on Earth. Running experiments in low or zero gravity can lead to interesting, sometimes unexpected findings [62]. This has led to numerous experiments having been performed on the NASA Space Station [63]. But having a trained astronaut who can effectively run any experiment or protocol, from organic synthesis to genetic manipulation, is asking a great deal. Digital twin technology could make the LotF in zero gravity a much more compelling proposition [64].

Returning to the area of instrument integration and interoperability, a more practical consideration is how different instruments communicate with each other, and how the data they generate is shared.

Within any lab there is, and always will be, a wide range of different instruments from different manufacturers, likely bought over several years to support the business workflows. This “kit diversity” creates a challenge when you want to define a protocol that links two or more instruments which do not use the same control language. SiLA‐2 [65] is a communication standard [66] for lab instruments, such as plate readers, liquid handling devices, and other analytical equipment, that enables interoperability. As indicated throughout this section, the ability to fully connect devices will enable a more flexible and agile lab environment, making it possible to track, monitor, and remotely control automation assets. This will further enable enhanced robotic process automation (RPA) as well as easier scale‐up and transfer to remote parties. Specific devices connected together for one workflow will be easily repurposable for other tasks without a monolithic communication design and coding.
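The interoperability idea behind standards such as SiLA‐2 can be illustrated, in a deliberately simplified and hypothetical form (this is not the SiLA API; all class and command names below are invented), as a common command interface that drivers from different vendors implement, so that one protocol script can drive any compliant device:

```python
from abc import ABC, abstractmethod

class LabDevice(ABC):
    """Hypothetical vendor-neutral command interface, in the spirit of
    (but far simpler than) a standard such as SiLA-2."""

    @abstractmethod
    def execute(self, command: str, **params) -> dict:
        """Run a named command and return a structured response."""

class VendorAPlateReader(LabDevice):
    def execute(self, command, **params):
        if command == "ReadAbsorbance":
            # A real driver would talk to the instrument here.
            return {"status": "ok", "values": [0.11, 0.42]}
        return {"status": "error", "message": f"unknown command {command}"}

class VendorBLiquidHandler(LabDevice):
    def execute(self, command, **params):
        if command == "Dispense":
            return {"status": "ok", "dispensed_ul": params.get("volume_ul", 0)}
        return {"status": "error", "message": f"unknown command {command}"}

def run_protocol(devices):
    """One protocol script drives any compliant devices, whatever the vendor."""
    handler, reader = devices
    handler.execute("Dispense", volume_ul=50)
    return reader.execute("ReadAbsorbance")
```

Because the protocol script depends only on the shared interface, swapping Vendor A's reader for another compliant model requires no change to the workflow code – which is precisely the repurposability benefit claimed above.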

Data in all its forms will remain the dominant high‐value output from lab experiments. As with protocols and communications, there need to be standards to support full data integration and interoperability within and between research communities. Over the years, data standards have evolved to support many aspects of the life science process, whether for the registration of new chemical entities [67], images [68], or macromolecular structures [69], or for describing the experiment data itself. Analytical instrument data (e.g. from nuclear magnetic resonance [NMR] machines, chromatographs, and mass spectrometers) are produced by a myriad of instruments, and the need to analyze and compare data from different machines, and to support data life cycle access in a retrievable format, has driven the creation of the Allotrope data format (ADF) [70]. This is a vendor‐neutral format, generated initially for liquid chromatography, with plans to expand to other analytical data. Wide community‐driven efforts such as those from Allotrope, SLAS, IMI [71], or the Pistoia Alliance [72] highlight the value of research communities coming together in life sciences, as happens elsewhere in industries such as finance and telecoms. Such collaborative efforts will be needed even more in future.