21 CFR PART 11 COMPLIANCE FOR DIGITAL HEALTH TECHNOLOGIES IN CLINICAL
TRIALS
by
Anjali Malhotra
A Dissertation Presented to the
FACULTY OF THE USC SCHOOL OF PHARMACY
UNIVERSITY OF SOUTHERN CALIFORNIA
In Fulfillment of the
Requirements for the Degree
DOCTOR OF REGULATORY SCIENCE
December 2023
Copyright 2023 Anjali Malhotra
Dedication
This doctoral thesis is the culmination of commitment and relentless effort for several
years. It is a tribute to countless hours of research, the guidance of my advisors, the support of
my family, and the inspiration drawn from peers. It is a testament to the enduring dedication
required to push the boundaries of knowledge and contribute to the scholarly community.
It is with immense pride and humility that I dedicate this work to my husband, Umesh
Malhotra, my son Varun Malhotra and my daughter Nikhita Malhotra, who encouraged and
supported me with patience and love throughout this beautiful and soul-satisfying journey. My
late father, Wg. Cdr. Chaman Lal Gupta, mother Prabha Gupta, sister Pareeta Keedia, brother
Deepankar Gupta and sister-in-law Lovleen Varma; along with my immediate family were my
greatest cheerleaders with the motto of our family being, “never give up and never stop
learning”.
Thank you for making me worthy of this amazing opportunity and giving me the strength
to fulfill my dream!
Acknowledgments
I wish to express my deep gratitude to the many individuals whose support and
contributions were instrumental in the completion of this doctoral thesis. First and foremost, I am
deeply indebted to my dissertation chair and advisor, Dr. Frances Richmond, whose mentorship
and encouragement made this work possible. Her commitment to academic excellence and her
patience in nurturing my intellectual growth have been instrumental in my success in completing
this research. Her wisdom and guidance helped me grow academically, in my work in industry,
and in my personal life. Thank you, Dr. Richmond!
I also thank my dissertation committee, comprising Dr. Nancy Pire-Smerkanich, Dr.
Terry Church and Dr. Lequina Myles. Their expertise, guidance and unwavering commitment to
excellence have been invaluable throughout this research endeavor. Their constructive feedback
and insightful suggestions have been instrumental in shaping this thesis and in propelling this work
forward. I would also like to express my immense gratitude to Dr. Eunjoo Pacifici for her
mentorship, encouragement and support throughout this journey. I want to thank the entire
teaching staff at USC. Every lecture taught me so much and allowed me to bring back to industry
the many current initiatives and insights in real-time. Thank you also to the kind and supportive
staff of the Regulatory Science Program who worked tirelessly behind the scenes to make this
journey so smooth.
I extend my heartfelt appreciation to my family for their unwavering encouragement,
support and understanding during this challenging process, which sustained my grit to reach the
finish line.
I would like to give a special thanks to those who participated in my focus group and to
all the survey participants who took time out of their busy schedules to share their valuable
insights by completing my very long survey.
I am also thankful to my numerous colleagues from industry and academia (cohort of
2020) and friends who provided a stimulating environment and a network of support. This
dissertation is a testament to the collective efforts and encouragement of all those who believed
in this endeavor. It is with sincere gratitude that I acknowledge their contributions.
Thank you…I am humbled and honored to present this work to you.
TABLE OF CONTENTS
Dedication....................................................................................................................................... ii
Acknowledgments.......................................................................................................................... iii
List of Tables ................................................................................................................................ vii
List of Figures................................................................................................................................ ix
Abstract.......................................................................................................................................... xi
Chapter 1. Overview ........................................................................................................................1
1.1 Introduction................................................................................................................. 1
1.2 Statement of the Problem............................................................................................ 7
1.3 Purpose of the Study ................................................................................................... 8
1.4 Importance of the Study.............................................................................................. 9
1.5 Limitation, Delimitations, Assumptions................................................................... 10
1.6 Organization of Thesis.............................................................................................. 12
Chapter 2. Literature Review.........................................................................................................13
2.1 Literature Search....................................................................................................... 13
2.2 Introduction............................................................................................................... 15
2.3 A Historical Perspective............................................................................................ 15
2.3.1 Evolution of Electronic Records and Signatures......................................... 15
2.3.2 Introduction of Part 11 Regulation to Control Electronic
Documentation ............................................................................................ 17
2.3.3 FDA’s Response to the initial Part 11 Confusion – the 2003 Guidance
Document .................................................................................................... 23
2.4 Evolution of Regulatory Oversight of Clinical Trials............................................... 25
2.4.1 Regulatory Expectations for Electronic Records and Signatures for
Clinical Trials.............................................................................................. 27
2.4.2 Industry Concerns........................................................................................ 31
2.4.2.1 Challenges Related to Risk Assessments ........................................ 32
2.4.2.2 Challenges in Synchronizing with the FDA Predicate Rules.......... 33
2.5 Decentralized Trials (DCT) ...................................................................................... 38
2.5.1 Digital Health Technologies in Clinical Trials............................................ 43
2.5.1.1 Prevalence of Digital Health Technologies.................................... 43
2.5.1.2 Regulatory Challenges.................................................................... 45
2.5.1.3 Technological and Logistical Challenges....................................... 46
2.5.1.4 Training Challenges....................................................................... 49
2.6 Part 11 Assessment: A Changing Landscape............................................................ 50
2.7 Research Approaches and Framework for Part 11.................................................... 54
Chapter 3. Methodology ................................................................................................................58
3.1 Introduction............................................................................................................... 58
3.2 Survey Development................................................................................................. 58
3.3 Survey Deployment................................................................................................... 59
3.4 Survey Result Analysis............................................................................................. 60
Chapter 4. Results..........................................................................................................................63
4.1 Survey Participation.................................................................................................. 63
4.2 Demographic Profiles of Participants....................................................................... 63
4.3 Consumer Wearables (CW) – Survey Responses..................................................... 75
4.4 Software Platforms (SP) – Survey Responses .......................................................... 95
4.5 Concluding Questions............................................................................................. 115
Chapter 5. Discussion ..................................................................................................................118
5.1 Introduction............................................................................................................. 118
5.2 Methodological Considerations .............................................................................. 118
5.2.1 Delimitations ............................................................................................. 118
5.2.2 Limitations................................................................................................. 121
5.3 Consideration of Results or Research Insights........................................................ 124
5.3.1 Exploration ................................................................................................ 125
5.3.1.1 Regulatory Documents and Guidance .......................................... 125
5.3.1.2 Initial Discussions/Meetings with Regulatory Authorities ........... 128
5.3.1.3 Input from Colleagues.................................................................. 129
5.3.1.4 Additional Key Inputs to Decision-Making .................................. 130
5.3.2 Installation ................................................................................................. 131
5.3.2.1 Regulatory Requirements.............................................................. 132
5.3.3 Initial Implementation ............................................................................... 134
5.3.3.1 Technical Expertise and Training ................................................ 135
5.3.3.2 Interactions with FDA .................................................................. 136
5.3.4 Full Implementation .................................................................................. 137
5.3.5 Conclusions and Future Directions............................................................ 140
References....................................................................................................................................143
Appendices...................................................................................................................................154
Industry Survey .............................................................................................. 155
Cross Tabulations........................................................................................... 193
List of Tables
Table 1: Part 11 Regulation - Enforcement Action by the Agency ..........................................24
Table 2: Some Important Differences Among GLP, GCP, GMP.............................................35
Table 3: Part 11 Enforced Predicate Rule References..............................................................36
Table 4: Data Conversion to Weighted Average (WA)............................................................61
Table 5: Other Challenges, Please Specify:..............................................................................71
Table 6: Importance of Factors When Implementing Consumer Wearables............................79
Table 7: Potential Concerns in Implementation........................................................................81
Table 8: Additional Comments on Implementing Consumer Wearables.................................82
Table 9: Challenges Affecting Implementation........................................................................84
Table 10: Factors Resulting in Delays........................................................................................85
Table 11: Challenges with Completing Validations...................................................................87
Table 12: Usefulness of Feedback from FDA Discussions........................................................89
Table 13: Risk Assessment Challenges ......................................................................................90
Table 14: Challenges for Consumer Wearables..........................................................................92
Table 15: Ongoing Compliance for Consumer Wearables.........................................................93
Table 16: Maintenance of Compliance for Consumer Wearables..............................................95
Table 17: Importance of Factors When Implementing Software Platforms...............................99
Table 18: Potential Concerns in Implementation......................................................................101
Table 19: Challenges Affecting Implementation......................................................................103
Table 20: Factors Resulting in Delays......................................................................................105
Table 21: Challenges with Completing Validations.................................................................106
Table 22: Usefulness of Feedback from FDA Discussions......................................................109
Table 23: Challenges for Software Platforms...........................................................................111
Table 24: Ongoing Compliance for Software Platforms ..........................................................113
Table 25: Maintenance of Compliance for Software Platforms ...............................................114
Table 26: Additional Compliance Information.........................................................................117
Table 27: Q2: How well do you think that your company understands Part 11
requirements for clinical trials?................................................................................193
Table 28: Q9: How familiar are you with the following regulations, standards and
guidances? ................................................................................................................194
Table 29: Q11: What challenges did you face when trying to understand the role of
Good Clinical Practices (GCPs) in implementing compliance with Part 11
for clinical trials?......................................................................................................195
Table 30: Q14: How challenging were the following in completing validations for
consumer wearables?................................................................................................196
Table 31: Q2: How well do you think that your company understands Part 11
requirements for clinical trials?................................................................................197
Table 32: Q9: How familiar are you with the following regulations, standards and
guidances? ................................................................................................................198
Table 33: Q11: What challenges did you face when trying to understand the role of
Good Clinical Practices (GCPs) in implementing compliance with Part 11
for clinical trials?......................................................................................................199
Table 34: Q14: How challenging were the following in completing validations for
consumer wearables?................................................................................................200
Table 35: Q2: How well do you think that your company understands Part 11
requirements for clinical trials?................................................................................201
Table 36: Q9: How familiar are you with the following regulations, standards and
guidances? ................................................................................................................202
Table 37: Q11: What challenges did you face when trying to understand the role of
Good Clinical Practices (GCPs) in implementing compliance with Part 11
for clinical trials?......................................................................................................203
Table 38: Q14: How challenging were the following in completing validations for
consumer wearables?................................................................................................204
Table 39: Q1: In one word, how would you characterize FDA guidances to support the
Part 11 regulation?....................................................................................................205
List of Figures
Figure 1: Timeline – 21 CFR Part 11 and Related Guidances.....................................................6
Figure 2: Key Term Used for Searches and Screening ..............................................................14
Figure 3: Regulatory Requirements for 21 CFR Part 11 Regulation .........................................19
Figure 4: Illustration of Decentralized Clinical Trials...............................................................40
Figure 5: Digital Health Tools in Biopharmaceutical R&D ......................................................44
Figure 6: NIRN - Fixsen’s Implementation Model....................................................................56
Figure 7: Organizations Where Participants Worked ................................................................64
Figure 8: Functional Group Affiliations of Participant..............................................................65
Figure 9: Current Role of Participant.........................................................................................66
Figure 10: Size of Participant Organization.................................................................................66
Figure 11: Clinical Trials Conducted in a Year ...........................................................................67
Figure 12: Participant Involvement in Clinical Trials..................................................................67
Figure 13: Participant Experience with Phases of Clinical Trials................................................68
Figure 14: Familiarity with Regulations, Standards and Guidances............................................69
Figure 15: Participant Experience with Part 11 Compliance .......................................................70
Figure 16: Challenges in Understanding the Role of GCPs.........................................................71
Figure 17: Explorations of Software Platforms Versus Consumer Wearables............................72
Figure 18: Experience with Specific Digital Health Technologies..............................................72
Figure 19: Participant Plans to use Digital Health Technologies.................................................73
Figure 20: Understanding of Part 11 Requirements.....................................................................74
Figure 21: Characterization of FDA Guidances...........................................................................74
Figure 22: Consumer Wearables - Provisioned or Bring Your Own Device...............................75
Figure 23: Use of Data to Support Studies...................................................................................75
Figure 24: Role Played in Assuring Part 11 Compliance.............................................................76
Figure 25: Sources Used to Educate About Consumer Wearables..............................................78
Figure 26: Importance of Factors When Adopting Consumer Wearables...................................79
Figure 27: Fit-for-Purpose Assessment........................................................................................80
Figure 28: Potential Concerns in Implementation........................................................................81
Figure 29: Final Incorporation of Consumer Wearables..............................................................82
Figure 30: Challenges Affecting Implementation........................................................................83
Figure 31: Delays in Start of Clinical Trials................................................................................84
Figure 32: Factors Resulting in Delays........................................................................................85
Figure 33: Challenges with Completing Validations...................................................................86
Figure 34: FDA Compliance Discussions for Consumer Wearables...........................................87
Figure 35: Time Needed for FDA Compliance Discussions........................................................88
Figure 36: Time to address FDA’s Feedback...............................................................................88
Figure 37: Feedback from FDA discussions................................................................................89
Figure 38: Performed Risk Assessments for Consumer Wearables.............................................90
Figure 39: Challenges for Consumer Wearables..........................................................................91
Figure 40: Ongoing Compliance for Consumer Wearables.........................................................93
Figure 41: Maintenance of Compliance for Consumer Wearables..............................................94
Figure 42: Use of Data to Support Patient Reported Outcomes ..................................................95
Figure 43: Role Played in Assuring Part 11 Compliance.............................................................96
Figure 44: Sources Used to Educate About Software Platforms .................................................98
Figure 45: Importance of Factors When Implementing Software Platforms...............................99
Figure 46: Fit-for-purpose Assessment......................................................................................100
Figure 47: Potential Concerns in Implementation......................................................................101
Figure 48: Final Incorporation of Software Platforms...............................................................102
Figure 49: Challenges Affecting Implementation......................................................................103
Figure 50: Delays in Start of Clinical Trials..............................................................................104
Figure 51: Factors Resulting in Delays......................................................................................105
Figure 52: Challenges with Completing Validations.................................................................106
Figure 53: FDA Compliance Discussions for Software Platforms............................................107
Figure 54: Time Needed for FDA Compliance Discussions......................................................107
Figure 55: Time to address FDA’s Feedback.............................................................................108
Figure 56: Feedback from FDA Discussions.............................................................................109
Figure 57: Performed Risk Assessments for Software Platforms..............................................110
Figure 58: Challenges for Software Platforms...........................................................................111
Figure 59: Ongoing Compliance for Software Platforms ..........................................................112
Figure 60: Maintenance of Compliance for Software Platforms ...............................................114
Figure 61: Characterization of FDA Guidances.........................................................................115
Figure 62: Assessment of Cybersecurity Risks..........................................................................116
Figure 63: Handling Data Breaches...........................................................................................116
Figure 64: Ongoing Compliance ................................................................................................117
Abstract
The rapid evolution of electronic technologies has challenged traditional methods of
documentation and signature management for all FDA-regulated industries. The use of such
tools led the Food and Drug Administration to introduce regulations regarding the use of
electronic records and signatures, 21 CFR Part 11, in 1997. However, the industry has struggled
to implement its multiple requirements, made more complex because those requirements must
align with other predicate rules governing Good Clinical Practices and Quality Systems. Thus,
for clinical trials, additional challenges are present when adopting novel digital health
technologies to collect patient data. This exploratory research used survey methods to examine
how companies implemented, managed and maintained compliance with Part 11 for consumer
wearables (CW) and software platforms (SP) in their clinical trials by using an implementation
framework to systematize the examination. The data reflect the views and experiences of 62
industry experts in companies of different sizes who were familiar with the use of digital health
technologies (DHTs) as part of clinical trial management. When deciding to use these
technologies, respondents reported that decision-making was complicated. As the DHTs were
introduced, they faced challenges with developing sufficient staff expertise and conducting
appropriate validations, which then continued to delay study progress. As DHTs were
implemented for data collection, training of site personnel and patients was more challenging
than expected, and risk management predictions from earlier stages required resource
allocation and time commitments. Throughout the implementation, resources were often
insufficient and regulatory requirements were difficult to understand without meetings with
regulators to clarify expectations.
Chapter 1. Overview
1.1 Introduction
Signatures have been central to transactional documentation for almost as long as
civilization has had written language. The methods for collecting signatures and verifications
have evolved over time, but the intent has remained the same: to prove a person’s identity and
validate the authenticity of the document that was signed. For much of the twentieth century, a
written signature was typical in Western culture and sometimes accompanied by a notarized
stamp. However, more recently, electronic methods have been developed to automate, collect,
process and analyze research information, which is then signed using electronic signatures. The
COVID-19 pandemic has accelerated the evolution and use of electronic records and signatures
to enhance the speed and convenience of information exchange. However, these tools have also
underlined the importance of robust documentation and quality systems to assure that a
document and its associated signatures can be authenticated and protected.
The ability to sign and maintain documents electronically has become important to
regulated industries making medical products. Pharmaceutical and medical device companies
execute hundreds of written transactions ranging from contracts and sales activities to design and
production documents, marketing approvals and product reviews. Specifically for clinical trials,
e-signatures and validation measures are needed, for example, when documenting electronically based informed consent (eConsent), Electronic Patient Reported Outcomes (ePRO), and
electronic data capture (EDC) using digital health applications (apps) and wearable devices. Both
the industry and regulatory agencies are committed to increasing the use of electronic tools to
encourage innovation and increase efficiencies. However, those records must comply with
regulatory requirements to assure that the signatures and records are authentic and resistant to
change, i.e., accurate and reliable, secure and protected, accessible, retrievable and legally
recognized. In the US, the Food and Drug Administration (FDA) has several regulatory
requirements, perhaps the most important of which are described in the controversial regulation,
21 Code of Federal Regulations, Part 11: Electronic Records and Electronic Signatures. The first
iteration of 21 CFR Part 11 (hereafter identified as “Part 11”), introduced in 1997, defined the
criteria by which electronic records and electronic signatures would be judged as equivalent to
records and handwritten signatures on paper (FDA, 2016 [1997]). In so doing, it cast a wide net
over activities throughout the design, testing and production of medical products in the
pharmaceutical, medical device, biotech, food, health care and other regulated industries. It:
…applies to records in electronic form that are created, modified, maintained,
archived, retrieved, or transmitted under any records requirements set forth in
Agency regulations. Part 11 also applies to electronic records submitted to the
Agency under the Federal Food, Drug, and Cosmetic Act (the Act) and the Public
Health Service Act (the PHS Act), even if such records are not specifically
identified in Agency regulations (FDA, 2016 [1997], p.170).
However, these regulations were not well-received by industry despite the FDA’s good
intentions. Many critics argued that they would dampen industry efficiency and increase costs by
the need to satisfy regulations across so many different aspects of activity (Marshall, 2006;
Speer, 2016; Richman, 2003). Industry complained that it was given no clear direction on
how to achieve the strict requirements of the regulation. For clinical trials, the implementation
of Part 11 during the transition from paper Case Report Forms (CRFs) to Remote Data Entry
(RDE) and Electronic Data Capture (EDC) presented particular challenges. These
challenges included the need to assure data integrity and security, system validation, system
integration, user acceptance, staff training and experience, data standardization, data migration,
cost and resource allocation, regulatory compliance, and vendor selection. In response,
the FDA attempted to provide additional guidance regarding its expectations for
electronic signatures and electronic records in a guidance document titled Guidance for
Industry Part 11, Electronic Records; Electronic Signatures — Scope and Application. It
acknowledged the disconnect between industry and the Agency by this statement:
Some interpretations of Part 11 requirements would (1) unnecessarily restrict the
use of electronic technology in a manner that is inconsistent with FDA's stated
intent in issuing the rule, (2) significantly increase the costs of compliance to an
extent that was not contemplated at the time the rule was drafted, and (3)
discourage innovation and technological advances without providing a significant
public health benefit (FDA, 2003, p. 3).
The concerns of industry were one reason why FDA took the unusual step of walking
back some of its requirements and narrowing the focus of its interpretation of the Part 11
regulations. FDA stated its intent to exercise enforcement discretion of this regulation by
defaulting to the predicate rules to determine the extent of control, as reflected by the statement,
“…records must still be maintained or submitted in accordance with the underlying predicate
rules, and the Agency can take regulatory action for noncompliance with such predicate rules”
(FDA, 2003, p.2).
By definition, a “predicate rule” is any requirement set forth in the Federal Food, Drug
and Cosmetic Act, the Public Health Service Act, or any FDA regulation other than Part 11
(FDA, 2003). Examples of FDA predicate rules include regulations such as Good Laboratory
Practices (GLP), Good Clinical Practices (GCP) and Current Good Manufacturing Practices
(cGMP or GMP), all of which already required that a company
maintain certain records and submit specific information to the Agency; meeting these rules
would be evidence of compliance with FDA’s expectations. Thus, FDA made clear that Part 11
requirements would not override the existing regulations pertaining to electronic records and
signatures.
Often the individuals working in these areas had difficulty interpreting the
requirements, so FDA has provided several guidance documents to assist them (Figure 1).
First was the Guidance for Industry: Computerized Systems Used in Clinical Investigations,
issued in May 2007 (FDA, 2007) to supersede the April 1999 document titled Guidance for
Industry: Computerized Systems Used in Clinical Trials (FDA, 1999).
Both documents attempted to show how FDA intended to harmonize its expectations with
the FDA Guidance for Industry Part 11, Electronic Records; Electronic Signatures — Scope and
Application (FDA, 2003). The 1999 guidance had already identified at a high level that data
accumulated by computerized systems must be of the highest quality and integrity. The 2007
guidance extended its considerations more specifically to include electronic source data and
source documentation. In both, emphasis was placed on meeting the fundamental elements of
data integrity, often summarized by the acronym ALCOA: attributable, legible,
contemporaneous, original, and accurate. Some organizations and experts use ALCOA++, where
the ++ emphasizes two additional data integrity principles, complete and consistent. The
Agency then published Use of Electronic Records and Electronic Signatures in
Clinical Investigations Under 21 CFR Part 11 – Questions and Answers (FDA, 2017) to clarify
certain contradictory requirements related to clinical trials. However, these guidances still
appeared insufficient to alleviate confusion about how to interpret the Part 11 regulation.
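The ALCOA elements can be illustrated concretely. The following Python sketch is purely illustrative: the field names and the use of a SHA-256 checksum are my own assumptions, not requirements drawn from any FDA regulation or guidance. It shows one way an electronically captured data point could carry the metadata needed to demonstrate that it is attributable, legible, contemporaneous, original and accurate:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib
import json

@dataclass(frozen=True)
class ALCOARecord:
    """One hypothetical data point carrying ALCOA-oriented metadata."""
    subject_id: str      # attributable: whom the data concerns
    recorded_by: str     # attributable: who (or what) captured it
    value: str           # legible: stored as plain, human-readable text
    unit: str
    recorded_at: str     # contemporaneous: capture timestamp (UTC, ISO 8601)
    source_device: str   # original: identifies the originating system
    checksum: str        # accurate: integrity check over the content

    @staticmethod
    def create(subject_id, recorded_by, value, unit, source_device):
        recorded_at = datetime.now(timezone.utc).isoformat()
        payload = json.dumps([subject_id, recorded_by, value, unit,
                              recorded_at, source_device])
        checksum = hashlib.sha256(payload.encode()).hexdigest()
        return ALCOARecord(subject_id, recorded_by, value, unit,
                           recorded_at, source_device, checksum)

rec = ALCOARecord.create("SUBJ-001", "coordinator.jsmith",
                         "72", "bpm", "wearable-A123")
```

Because the record is frozen and checksummed, any later edit would have to produce a new record with a new checksum, leaving the original intact, which is the spirit of the "original" and "accurate" elements.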
Very recently, in December 2021, the FDA disseminated a draft guidance titled, Digital Health
Technologies for Remote Data Acquisition in Clinical Investigations, Guidance for Industry,
Investigators, and other Stakeholders (FDA, 2021b), with comments due to the FDA by the end
of March 2022. On 23 March 2023 the Agency provided a draft guidance titled Electronic
Systems, Electronic Records, and Electronic Signatures in Clinical Investigations Questions and
Answers (FDA, 2023b) that revised the draft guidance for industry, Use of Electronic Records
and Electronic Signatures in Clinical Investigations Under 21 CFR Part 11 – Questions and
Answers (FDA, 2017) (Figure 1). This new draft guidance expands on the 2003 Part 11,
Electronic Records; Electronic Signatures — Scope and Application (FDA, 2003) and, when
finalized, will supersede the 2007 guidance, Computerized Systems Used in Clinical
Investigations (FDA, 2007). It makes updated recommendations for electronic records;
electronic systems owned or controlled by sponsors or other regulated entities; information
technology (IT) service providers and IT services; digital health technologies (DHTs); and
electronic signatures used in clinical investigations. Updates included but were not limited to:
acknowledgment of advances in
technology; use of IT service providers; revision of the risk-based approach to validation of
electronic systems; and an updated glossary of terms (Hanson, 2023). FDA requested public
comments on the draft guidance on or before May 15, 2023. The FDA has also shown its
commitment to this industry by setting up a Digital Health Center of Excellence (DHCoE) in
September 2020 (FDA, 2022a). DHCoE’s aims are:
To connect and build partnerships to accelerate digital health advancements;
share knowledge to increase awareness and understanding, advance best
practices; and innovate regulatory approaches to provide efficient and least
burdensome oversight while meeting the FDA standards for safe and effective
products (FDA, 2023a, par. 2).
Figure 1: Timeline – 21 CFR Part 11 and Related Guidances
Confusion associated with FDA requirements appears to come not only from concerns
about the interpretation of Part 11 but also the need to coordinate compliance with multiple
predicate rules. Areas of greatest concern included system validation, audit trail management,
record management and retention, record copying, system security management, system
document management and certification and legacy systems. Further complications are also
introduced because additional regulations such as those covering privacy protection and
cybersecurity may also be relevant to the trials. Confusion is not helped by the fact that the
master regulation represented in Part 11 is now 25 years old. FDA had announced its intent to
release a new iteration of the Part 11 regulations in late 2006 but has since pushed back the
release date. “The regulation won't go away,” Janet Woodcock, director of the FDA’s Center for
Drug Evaluation and Research (CDER), said in an interview with BIO-IT World. “However,
while we won't rescind it, we may revise it” (Jardine, 2020, par. 6). As of this writing, FDA has
issued no projected date and has stated publicly that the timetable for release remains “flexible”.
1.2 Statement of the Problem
Clinical trials rely increasingly on electronic methods to encourage remote participation;
to capture, store, analyze and archive critical data; to ensure its integrity; and to monitor the state
of clinical trial participants. The COVID-19 pandemic has accelerated this transition. Travel
restrictions forced the adoption of telehealth methods and decentralized clinical trials (DCTs).
DCTs are defined as “those executed through telemedicine or local/mobile healthcare
providers using procedures that vary from the traditional clinical trial model (for e.g., the
investigational medical product [IMP] is shipped directly to the trial participant)” (CTTI, 2018,
p. 2). Mobile applications and digital health technology (DHT) have a technology-centric focus
and provide end-to-end digitization and data management tools throughout the clinical trial
process.
process. Both DCTs and DHTs represent innovative approaches that can improve the
accessibility and efficiency of clinical trials in the digital age. Examples of this progression
include wearable devices used for remote monitoring, virtual visits with telemedicine platforms,
mobile apps to collect electronic patient-reported outcomes (ePROs), decentralized recruitment
and consent, patient engagement platforms and the use of DHT for tracking and shipping
in-home sample collections.
These advanced technologies help to streamline the trial process, increase patient
engagement, provide accurate and real-time data and reduce the burden of in-person site visits
for trial participants. However, they also raise important questions about how to assure the
integrity, authenticity and security of electronic records and signatures, which are essential for
regulatory acceptance.
We know that most companies are aware that they must comply with Part 11 but seem
unsure about whether the methods that they have in place are appropriate. Some anecdotal
information about these concerns exists in the literature. To my knowledge, however, these
challenges have never been examined systematically. The research conducted here attempts to
understand better how industry implements the rules of compliance that govern the use of digital
health technologies in clinical trials. I have specifically examined their implementation of two
particularly challenging types of DHTs, consumer wearables and software platforms, for patient
reported data, as a probe into their experiences and challenges. In the recently updated
draft guidance, the FDA has recast its 2017 discussion of “mobile technology” to center on
the term “digital health technology”. This revised term encompasses “a system that leverages
computing platforms, connectivity, software, and/or sensors for healthcare and associated
purposes" (FDA, 2023b, p. 17). Izmailova et al. in their article, Wearable Devices in Clinical
Trials: Hype and Hypothesis, define wearable technologies as “…sensors and/or software
applications (apps) on smart phones and tablets that collect health-related data remotely, i.e.,
outside of the healthcare provider’s office. The data can be collected passively or may require a
user’s input” (Izmailova, 2018, p. 42).
1.3 Purpose of the Study
This study was designed to understand how industry is interpreting and implementing the
requirements of the FDA regulations and guidance documents in clinical trials that use digital
health technologies to collect data from participants. A framework based on Fixsen’s
Implementation Stages was used to structure the survey. It was designed to explore current
practices in the industry for managing and sustaining compliance with the requirements of Part
11 regulations related to DHTs and to identify challenges and potential concerns that hinder their
implementation. The views of industry regarding the adequacy of regulatory guidance governing
that use were also evaluated.
A fit-for-purpose survey was developed and distributed to mid- and senior-level career
professionals with experience in different areas of clinical research, which included but was not
limited to data management, quality assurance and clinical operations. Also included were
individuals with experience using DHTs as part of information technology roles. The survey
focused on two exemplar types of DHT, consumer wearables and software platforms for patient
reported data. The survey was disseminated via Qualtrics (Qualtrics.com), a research platform
that specializes in survey management and analysis. The same platform assisted the collection,
compilation and analysis of the responses.
1.4 Importance of the Study
DHTs have become ubiquitous and are evolving rapidly. Because they permit frequent
data collection over extended, uninterrupted periods of time, they can simplify trial management,
increase knowledge of disease variability and permit earlier identification of safety issues to
inform dosing adjustments. They have begun to change fundamentally how clinical trials are
designed and conducted (Izmailova, 2018).
Nonetheless, industry-wide confusion appears to exist regarding the application of Part
11 to DHTs and mobile applications used in clinical trials. Results of this research should
provide stakeholders with a better understanding of challenges that the industry has experienced
during the implementation process and best practice solutions that they may have found. The
results may assist business process owners to anticipate and overcome barriers by navigating
ambiguities in the interpretation of the regulation that have been identified here. This may
influence them to adopt solutions that can offer more productive and efficient use of monetary
and human resources. The results of this study will also help regulators to understand the most
taxing issues and confusions arising from the management of digital and mobile applications in
clinical trials. That information could be helpful in formulating new guidance documents and
ultimately more appropriate regulations.
1.5 Limitations, Delimitations and Assumptions
There were several delimitations to this study. The study dealt specifically with the
challenges of Part 11 with respect to medical product development in clinical research. Thus, its
respondents were drawn primarily from the pharmaceutical and medical device sectors, and
results might differ from the experience of those in other sectors, such as food and cosmetic
testing. It was further delimited to the experience of one set of stakeholders, clinical trial
sponsors and their biopharmaceutical partners such as clinical research organizations (CROs)
and clinical trial vendors. Respondents mostly held mid- to senior-level positions with expertise
in clinical research along with experience using DHTs in the biopharmaceutical industry and
who direct or participate in the implementation of the Part 11 requirements. Views might differ
from those of other groups within the industry, such as commercial or marketing sectors, but the
chosen groups likely have much experience with the specialized issues faced during the
implementation of novel methods of data management in clinical trials. Data for this survey
were also delimited to a snapshot in time, from 2017 to 2023.
Further, the study was concerned with the application of US regulations, including Part
11 and the applicable predicate rules, so it did not solicit the participation of industry
professionals who only conduct clinical trials outside the US, nor did it examine the challenges
of meeting regulatory requirements in other countries. Discussions of the US Health Insurance
Portability and Accountability Act
(HIPAA) and Protected Health Information (PHI) regulations in the context of patient-related
data used for clinical trials were also outside the scope of this study. Privacy and security
controls, and how DHTs must be kept safe from external interference, whether accidental or
malicious, were not a primary focus of this research. Only two basic questions on cybersecurity
and data breaches were asked of all the survey participants.
The requirements of the Common Rule (a set of US federal regulations governing the
ethical conduct of human subjects research, also known as the Federal Policy for the Protection
of Human Subjects) may also impact the implementation of Part 11 but were excluded from the
research. GMP and GLP regulations cover broad aspects of medical product
development and manufacture but in this dissertation they are discussed only in the context of
their relevance to the clinical trials research process.
The research also faced certain limitations. Not all individuals who were asked to
participate agreed, presumably because of time constraints, limitations of knowledge regarding
the subject matter or confidentiality requirements of their job functions. Promising anonymity
may have increased the response rate by protecting the respondents’ identities, but limited my
ability to link specific data to respondent identifiers. Survey research was limited in its depth
because the number of questions had to be restricted to assure the participation of busy
professionals. The survey included many questions that some respondents could skip, but for
experts with wide knowledge, the larger number of questions could have resulted in survey
fatigue and restricted the number of questions that they were willing to complete.
Hidden bias of the researcher could also affect the validity of the survey and
interpretation of the results. I am employed in a pharmaceutical organization in which I
participate actively in the oversight of clinical trials. The use of a focus group of experienced
industry and academic professionals was important to reduce the effects of bias that may be
related to my views. I assumed that the participants answered the questions honestly
and that the field is sufficiently advanced that responses across different implementation stages
were possible.
1.6 Organization of Thesis
This thesis is organized into five chapters as follows:
Chapter 1 introduces the problem to be studied and briefly summarizes the rationale for
the approach that will be taken for this thesis.
Chapter 2 reviews relevant literature and provides an overview of the history of the Part
11 regulation in the US with respect to DHTs, specifically the use of consumer wearables and
software platforms for patient-reported data used in DCTs for clinical research.
Chapter 3 describes the methodology used in conducting the survey.
Chapter 4 describes the results of the study in the form of a narrative, tables and graphs.
Chapter 5 discusses the implications of the research and provides an interpretation of the
results in the context of other literature with conclusions and recommendations for
implementation and further research.
Appendices to this thesis include the ‘Industry Survey’ (Appendix A) and ‘Cross
Tabulations’ of results from the survey (Appendix B).
Chapter 2. Literature Review
2.1 Literature Search
The first step of the literature review was to review the current, publicly available
information related to Part 11 compliance in electronic records and signatures for Digital Health
Technologies (DHTs) in clinical trials. An online search was initiated using the web-based
portals of the University of Southern California Libraries, where scholarly works were evaluated
by examining journals, articles and books retrieved from the search engines, Wiley Online
Library (https://onlinelibrary.wiley.com/), PubMed (http://www.pubmed.gov) and Google
Scholar (http://scholar.google.com) (Figure 2). Keywords used during the search included
“electronic records and signatures”, “part 11 compliance in electronic records and signatures”,
“part 11 compliance in electronic records and signatures in clinical trials”, “part 11 compliance
in electronic records and signatures for DHTs in clinical trials”. The data were collected
between 7 and 18 August 2023.
Records identified by the initial keyword searches were screened further to identify
articles relevant to the research topic. The search was refined by narrowing the retrieved records
to publications between 2017 and 2023 and by eliminating duplicates and records that, after
review of titles and abstracts, did not meet the eligibility criteria. Articles discussing wearable
technologies as DHTs were included. Only English-language documents were considered; 89
articles relevant to the research topic were selected as most appropriate for the research and
were reviewed in detail. Relevant literature
references from these 89 articles were also used to provide more context to the research topic
(Figure 3).
Figure 2: Key Term Used for Searches and Screening
Other references included FDA regulations, webinars and guidance documents; trade
magazines; perspectives from industry blogs and consortiums such as CISCRP (The Center for
Information & Study on Clinical Research Participation) and DTRA (Decentralized Trials and
Research Alliance); workshops by NIH (US National Institutes of Health); and NSF (National
Science Foundation) and vendor websites. Three books were reviewed and found to be valuable
resources for this research: Electronic Record Keeping: Achieving and Maintaining Compliance
with 21 CFR Part 11 and 45 CFR Parts 160, 162 and 164 (Nettleton, 2004) and Managing the
Documentation Maze (Gough, 2010), both by David Nettleton and Janet Gough, and The
Fundamentals of Clinical Research (Dubinsky, 2022), specifically its chapter on data collection
and data management.
2.2 Introduction
Almost 25 years have passed since the US Food and Drug Administration published
21 CFR Part 11 in 1997. The original intent of the Agency was to encourage the promising
opportunities offered by electronic technologies to facilitate product development without
threatening the quality, safety and effectiveness of medical products. However, the rapid
evolution of electronic signatures and electronic records into the 21st century and a
COVID-recovering world has underlined the magnitude of that task.
2.3 A Historical Perspective
2.3.1 Evolution of Electronic Records and Signatures
“Give me your John Hancock” entered our vernacular when John Hancock was the first
to sign the United States Declaration of Independence in 1776. The expression has become
synonymous with the phrase, “give me your signature”. Of course, his signature did not represent
the first time that a document was used for business purposes and authenticated with proof of
identity. Different methods had been used for centuries to formalize events and agreements. Pay
slips to workers recorded around 2350 BCE on cuneiform clay tablets documented business
transactions in the city of Ur, located in present-day Iraq. The tablets did not contain
handwritten signatures but featured marks or seals to show authenticity and ownership. Papyri
discovered in the “Cave of Letters” documented land transactions and lease agreements in the
second century AD and were significant for the light they shed on the business and legal rights
of upper-middle-class Jewish women at that time. Also found with the letters were
governmental orders issued by Simon bar Kokhba and his administrators during his first year as
President of Israel (Yadin, 1971).
Documentation and signatures continued to be used to identify rights and responsibilities
in and beyond governments throughout western history. For example, the Constitution of Medina
from AD 622 documented two agreements between the Prophet Muhammad and the clans of
Medina after the Hegira (emigration) to Medina, to regulate the interactions between the
Muslims and the Jews (Young, 2016). Similarly, documents bearing signatures in Europe could
be identified as early as the sixth century. The use of an ‘X’ to signify agreement in a contract
may have its origins around the 9th or 10th centuries when scribes, like our current-day notaries,
validated documents with the sign of the cross. In the 16th and 17th centuries, the number of
written transactions with signatures on paper documents increased. In England, the 1677 Statute
of Frauds was instrumental in recognizing signatures as a standard method to validate
agreements, an approach that was later adopted in colonial America.
However, signatures have not been the only way that documents have been validated.
Wax seals with impressions or embossed figures made with signet rings or chops date back to
biblical times. China, Japan and Korea still use a mixture of seals, wet-ink signatures
and, increasingly, electronic signatures (Felsenthal, 2011). However, it has only been in the last
few decades that alternative methods to wet ink signatures have become common in the western
world.
Today, alternative methods are being developed to assure the authenticity of electronic
documents, which became increasingly common by the end of the twentieth century. The
movement of records and signatures from paper to electronic media has had its share of benefits
and drawbacks. For clinical trials it raised numerous questions about how signatures would be
applied and records would be maintained. Electronic records are often easier and more
convenient to distribute, access and search; they are compact, portable and easy to store.
However, drawbacks can include the expense of conversion to electronic systems, the difficulty
of keeping up with evolving technology and the risk that software and documentation could be
damaged, hacked or manipulated in ways not anticipated when documents were written on
paper. Thus,
confidence in the use of electronic records and signatures began to depend on methods to protect
them using audit trails and validations.
The need to assure authenticity spawned new business opportunities for vendors that
provided storage mechanisms to protect valuable data. Companies such as Iron Mountain began
to provide processes and infrastructure as well as scanning and encryption expertise to help
companies transition from paper to a paperless electronic/digital format (IronMountain.com,
2022). These services became particularly important to manage the documents for life science,
medical, legal and finance industries that needed ways to protect confidential records for a
defined period and then dispose of them appropriately when they reached the end of mandated
retention times. Similarly, specialized vendors such as DocuSign and Adobe designed protocols
and software to assure the validity of electronic records and signatures. Their systems helped to
ensure that electronic signatures would meet regulatory requirements and be seen as legally
binding. This was achieved by encrypting documents while in transit and at rest and providing a
fully traceable, tamper-proof audit trail (DocuSign, 2022). Such streamlined systems had the
added advantage that they could reduce the cycle time of multiuser approvals without
geographical constraints, and thus increase business efficiencies.
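Vendors' actual protocols are proprietary, so the following is not a description of any particular product. It is a minimal Python sketch of one generic technique, hash chaining, that can make an audit trail tamper-evident: each entry's hash covers the previous entry's hash, so altering any past entry invalidates every hash that follows it.

```python
import hashlib
import json

def add_entry(trail, user, action, timestamp):
    """Append an audit entry whose hash covers the previous entry's hash."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    body = {"user": user, "action": action, "timestamp": timestamp,
            "prev_hash": prev_hash}
    # The hash is computed over the entry body (including prev_hash),
    # chaining this entry to its predecessor.
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    trail.append(body)
    return trail

def verify(trail):
    """Recompute every hash; any edit to a past entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in trail:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

trail = []
add_entry(trail, "jdoe", "signed CRF page 4", "2023-03-01T10:15:00Z")
add_entry(trail, "asmith", "corrected field AE-2", "2023-03-02T09:05:00Z")
assert verify(trail)
trail[0]["action"] = "deleted CRF page 4"  # simulated tampering...
assert not verify(trail)                   # ...is detected on verification
```

The same idea, combined with restricted write access and time-stamped entries, underlies the "fully traceable" audit trails that Part 11 systems are expected to provide.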
2.3.2 Introduction of Part 11 Regulation to Control Electronic Documentation
The opportunities presented by electronic methods to facilitate workflow and
documentation were embraced enthusiastically by businesses of all types, including the medical
product industries. However, electronic methods had risks that were recognized by both the
regulatory authorities and the industry. In 1997, these concerns drove the FDA to publish
regulations designed to explain their expectations related to electronic records and signatures.
These regulations, finalized in 1997 and published as Part 11 of Title 21 of the CFR, were written to give
FDA the assurance that the electronic records were equivalent in validity to paper records and
that electronic signatures could be regarded as a legally binding alternative to a person’s
handwritten signature. They described the types of records and signatures that would be covered
by Part 11 regulation as follows:
Digital signature means an electronic signature based upon cryptographic
methods of originator authentication, computed by using a set of rules and a set
of parameters such that the identity of the signer and the integrity of the data can
be verified.
Electronic record means any combination of text, graphics, data, audio, pictorial,
or other information representation in digital form that is created, modified,
maintained, archived, retrieved, or distributed by a computer system.
Electronic signature means a computer data compilation of any symbol or series
of symbols executed, adopted, or authorized by an individual to be the legally
binding equivalent of the individual’s handwritten signature (FDA, 2016 [1997],
p.172).
FDA further specified in considerable detail the kinds of activities that would be covered by the
regulations. It extended the need for controls over many facets of data protection to safeguard the
authenticity, integrity, confidentiality, accuracy, and reliability of those systems. It also identified
the need to recognize when records had been changed by using, for example, audit trails and
access restrictions (Figure 3).
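The core property in these definitions, that both the identity of the signer and the integrity of the data can be verified cryptographically, can be illustrated with a short sketch. Production systems typically use asymmetric key pairs and certificates (e.g., RSA or ECDSA); the example below is my own simplification that substitutes an HMAC with a hypothetical per-signer secret key so that it remains self-contained, but it demonstrates the same two checks:

```python
import hmac
import hashlib

# Hypothetical per-signer secret keys; a real system would instead use
# certificates and private keys held only by each signer.
SIGNER_KEYS = {"dr.patel": b"key-known-only-to-dr-patel"}

def sign(signer: str, document: bytes) -> str:
    """Bind the signer's identity to the exact document content."""
    return hmac.new(SIGNER_KEYS[signer], document, hashlib.sha256).hexdigest()

def verify_signature(signer: str, document: bytes, signature: str) -> bool:
    """Fails if the document was altered or the wrong signer is claimed."""
    expected = hmac.new(SIGNER_KEYS[signer], document,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

doc = b"Case report form v2, subject SUBJ-001"
sig = sign("dr.patel", doc)
assert verify_signature("dr.patel", doc, sig)
assert not verify_signature("dr.patel", doc + b" (edited)", sig)
```

Changing a single byte of the document invalidates the signature, which is what allows an electronic signature to serve as the legally binding equivalent of a handwritten one.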
Figure 3: Regulatory Requirements for 21 CFR Part 11 Regulation
(Johner, 2018)
The FDA viewed their new regulations as a positive step to assure that the novel methods
introduced as part of the electronic movement would not compromise public safety or data
integrity. However, its efforts intersected with already existing efforts by industry to validate and
use electronic records and signatures in compliance with certain manufacturing-specific
regulations and guidances that predated Part 11. For example, in the 1980s, industry had already
become accustomed to controlling document validation and management in its production
processes, as required by the GMP requirements of 21 CFR 210/211. Those requirements were
important to assure data integrity because such records could otherwise be vulnerable to
manipulation (Richman, 2003).
Ironically, industry’s existing familiarity with GMP requirements may have been one
factor contributing to its confusion surrounding the rollout of the Part 11 regulation for electronic
records and signatures. Although rules had existed regarding quality assurance and quality
control systems to plan, structure, control and document manufacturing activities, these rules
were mostly put into place for a system based on paper (Richman, 2003). These rules were not
necessarily transferable to computerized systems. As described by Richman:
FDA did not realize that there was a substantial gap between what it perceived or
presumed to be the existing level of good software and systems development
practice being employed in the pharmaceutical industry and the reality of the
industry’s practices (Richman, 2003, par. 6).
What the industry did not appreciate in the early 1990s and still struggles to
accept is that the very same concepts apply to the quality and integrity of
computerized systems, as these attributes in the electronic context also result from
following a planned, rational, controlled and documented process. For a variety
of reasons, the industry has generally not successfully transferred its
understanding of these process-oriented concepts into the electronic environment.
This non-transference, combined with the poor level of existing software and
systems development practice, helped to create (and continues to feed) the
industry's share of the root causes of Part 11 issues (Richman, 2003, par. 9).
The “reality gap” described by Richman set the stage for controversy and confusion that
followed the issuance of the final rule in 1997.
A significant concern for industry was also the nature and costs of compliance with the
new regulations (Richman, 2003). Companies recognized that they would need to develop more
robust systems to control electronically managed documentation. They would have to purchase
expensive new equipment, hire and train competent staff and implement more extensive quality
practices. The time-intensive and costly interventions would have to be applied to what were
then immature software systems, which became a significant hurdle that challenged the effective
implementation of the Part 11 regulations (Richman, 2003; Kronk, 2018). Debates between the
FDA and industry coalitions reflected different interpretations of the regulations. Richman
suggested that much of the “controversy and confusion” occurred because “the agency has failed
to provide timely and sufficiently specific guidance to fill in all the details that lurk beneath the
high-level principles outlined in the regulations” (Richman, 2003, par.5). Nettleton also
identified that the rapid evolution of computer technology made compliance with Part 11 a
moving target (Nettleton, 2021).
The wide net cast by Part 11, suggested by the rather general language of the regulations
(terms such as electronic data, record keeping, or signatures in health and human services), also
confused the industry, because it was difficult to understand “whether and what parts of it
applied to which companies” (Kronk, 2018, par. 8). Further, in public docket No. 92S-0251,
titled Electronic Submissions; Establishment of Public Docket (FDA, 1997), Part 11 identifies a
confusing process to determine which documents or parts of a document the agency will accept
electronically. It states that submissions can only be made to specific units, such as a specific
center, office, division, branch, and that the unit must be consulted regarding whether to proceed
with the electronic submission and how that submission should be structured (method of
transmission, media, file formats, and technical protocols). This requirement suggests that
significant homework would need to be done to clarify how companies would have to adapt their
existing practices and whether Part 11 would apply to all or some of its systems.
Johner identified other areas of confusion. Part 11 is understood to apply only when an
electronic record replaces a paper record; systems that simply generate paper printouts are
exempt (Johner, 2018). What if a system can produce a paper printout but relies on electronic
recording to generate it? Is it covered by the regulation or not? Richman noted:
This is another loophole that may potentially be subject to abuse and one that, in
the absence of 100% data verification, is clearly inconsistent with the need to
ensure the integrity of what is printed on the paper records and thus used for
regulatory purposes (Richman, 2003, par. 23).
Nettleton observed that “most companies are so unsure about electronic signatures that they print
out copies of electronic records and sign the paper” (Nettleton, 2021, par. 4), thereby
circumventing the requirements of the regulation.
Johner also raises another point of contention: the IT systems discussed in the
regulation need to be validated. To this end, the regulation references the guidance document,
General Principles of Software Validation (FDA, 2002), but clarity is not provided on whether
the Part 11 regulations apply only to validation requirements during the design phase or whether
they extend to the complete software life cycle. These efforts would also require involvement of
the information systems (IS) and information technology (IT) teams that generally support core
business practices and are “driven more by system delivery deadlines and budgets than by
quality and good practice” (Richman, 2003, par. 19).
Fear of noncompliance can drive companies to develop onerous procedures or practices
that can stand in the way of effective logistics. Speer advances the opinion that the new
requirements are the reason why “paper is still king” (Speer, 2016, par. 2) for managing
documentation, “even though you know how challenging, inefficient, and risky managing piles
of paper can be” (Speer, 2016, par. 5). Jardine crystallized the nature of industry views by stating
that “it did not take long before the praises for the highly anticipated regulation turned into
criticisms” (Jardine, 2020, par. 4). Gough and Nettleton summarized the regulation, by saying
23
that “21CFR Part 11, Electronic Records; Electronic Signatures is, “a vaguely worded law”
(Gough, 2010, Chpt. 1, Q. 13).
2.3.3 FDA’s Response to the Initial Part 11 Confusion – the 2003 Guidance Document
The negative reactions of industry to Part 11 forced the FDA to reevaluate and address
the industry’s discontent. Representatives of FDA spoke at conferences. FDA published a
compliance policy guide (CPG 7153.17) as well as numerous other draft guidance documents on
topics as varied as Validation, Glossary of Terms, Time Stamps, and Maintenance of Electronic
Records and Electronic Copies of Electronic Records (FDA, 2003). Perhaps the most important
step that FDA took was to publish, in 2003, a guidance document titled Guidance for Industry
Part 11, Electronic Records; Electronic Signatures – Scope and Application (FDA, 2003).
Amongst other clarifications was an important change. The Agency identified that it would take
no enforcement action under Part 11 for deficiencies in validation, audit trails, record retention,
and record copying; it would instead defer to “predicate rules”. These rules included a
collection of what are sometimes referred to as GXP regulations, such as Good Laboratory
Practices (GLP), Good Clinical Practices (GCP), and Good Manufacturing Practices (GMP,
sometimes designated as current GMP, or cGMP). For example, electronic manufacturing records
would be assessed according to GMPs; clinical documentation would be assessed according to
the GCPs. To clarify further, clinical trials would remain governed by the predicate rules in the
GCP regulations laid out in 21 CFR 312, which require that companies maintain records to ensure
subject safety and data integrity. The Agency thereby entrusted the responsibility and burden of
compliance to the companies, which then had to interpret and apply the predicate rules
correctly. At the same time, the guidance document clarified that the Agency would enforce
other parts of the regulation not covered by predicate rules; examples are shown in Table 1
below:
Table 1: Part 11 Regulation - Enforcement Action by the Agency
(FDA, 2003)

Agency Will Enforce (included but not limited to):
1. Certain controls for closed systems in § 11.10:
   - Limiting system access to authorized individuals
   - Use of operational system, authority and device checks
   - Determination that persons who develop, maintain, or use electronic systems have the education, training, and experience to perform their assigned tasks
   - Establishment of and adherence to written policies that hold individuals accountable for actions initiated under their electronic signatures
   - Appropriate controls over systems documentation
2. Controls for open systems corresponding to the controls for closed systems listed above (§ 11.30)
3. Requirements related to electronic signatures (e.g., §§ 11.50, 11.70, 11.100, 11.200, and 11.300)

Agency Will Not Enforce:
1. Validation
2. Audit trails
3. Record retention
4. Record copying
5. Requirements on systems operational before 20 August 1997
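Among the signature requirements the Agency stated it would continue to enforce, § 11.50 requires that each signed electronic record display the printed name of the signer, the date and time the signature was executed, and the meaning associated with the signature (such as review or approval). As an illustration only, a minimal sketch in Python might capture those three elements; the class and function names here are hypothetical, not drawn from any FDA specification or vendor system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class SignatureManifestation:
    """Illustrative record of the three elements that Part 11 (sec. 11.50)
    requires a signed electronic record to display. Field names are
    hypothetical, chosen for readability."""
    printed_name: str  # full printed name of the signer
    signed_at: str     # date and time the signature was executed
    meaning: str       # e.g., "review", "approval", "responsibility"

def sign_record(printed_name: str, meaning: str) -> SignatureManifestation:
    # Timestamp in UTC so the time of signing is unambiguous across sites.
    return SignatureManifestation(
        printed_name=printed_name,
        signed_at=datetime.now(timezone.utc).isoformat(),
        meaning=meaning,
    )

sig = sign_record("A. Malhotra", "approval")
print(sig.printed_name, sig.meaning)
```

A production system would, of course, also need the identity-verification, uniqueness, and password controls of §§ 11.100–11.300; this sketch shows only the manifestation elements.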
The industry welcomed the efforts that FDA had taken to address their concerns but still
appeared confused about how to interpret the regulations, given their general language.
According to Kronk, industry complained that the administration was waffling and, in some
cases, contradicting its previous positions (Kronk, 2018). According to Richman, “The more it
appeared that FDA was making up the rules as it went along, the more the industry struggled and
became entangled in its own internal debates regarding the basic expectations and how to
effectively meet them” (Richman, 2003, par. 15). Johner states that the FDA saw its own
objective, to use the Part 11 regulation to encourage the effective use of electronic information,
as being “thwarted” (Johner, 2018, par. 4).
2.4 Evolution of Regulatory Oversight of Clinical Trials
The confusion over the implementation of Part 11 continued to have effects in several
areas, but it has been particularly problematic for the management of clinical trials.
Historically, clinical trials in the US had been governed by a set of
regulations and guidance documents that had their genesis early in the 20th century. Initially
these were not highly prescriptive, but they did put into place the framework that would
eventually be the basis for a highly evolved system.
The evolution is often considered to begin in the 1930s, after more than 100 people,
mostly children, were killed by ingesting a syrup in which the antibacterial drug sulfanilamide was
dissolved in diethylene glycol, an industrial chemical poisonous to humans. The Food, Drug and
Cosmetic Act of 1938 (FDA, 2018a), signed by President Roosevelt into law soon after the
tragedy, required that companies provide the FDA with evidence of safety for all drugs prior to
marketing, and that the evidence be submitted in a new drug application (NDA). Proving that
safety could depend on acquiring data from human subjects (FDA, 2018b, ICH, 2016).
However, the standards for testing new drugs were then relatively lax. Some clinical
trials were conducted but those had few guidelines for ethical management. The clinical trials
landscape changed, however, when unethical Nazi experiments during WWII caused the public
to scrutinize the ethical rules under which clinical trials were conducted. In 1947, the Nuremberg
Code identified the need for subjects to give voluntary consent before they could participate in
research (Heller, 2011, FDA, n.d., Linder, n.d.). Soon thereafter, the United Nations proclaimed
the Universal Declaration of Human Rights (UN, n.d.). It reiterated the need to respect the rights
of individuals involved in medical experiments. In 1964, the Declaration of Helsinki was
developed by the World Medical Association (WMA, 2023), to set out guidelines to protect the
rights of human subjects and is the basis for the ethical principles that underlie the ICH-GCP
guidelines (ICH, 1995, ICH, 2016). In 1979, the Belmont Report by the National Commission
for Protection of Human Subjects of Biomedical and Behavioral Research defined a triad of
essential principles for clinical trials, including respect for persons, beneficence and justice
(DHEW, 1979).
Attention was also needed to improve the science and logistics associated with clinical
trials. In the early 1960s, another tragedy related to the use of the drug, thalidomide, brought
attention to the still-primitive oversight of clinical trials. In this case, over 10,000 infants in over
20 countries were born with limb deformities, caused when their mothers took thalidomide
to treat morning sickness (Grunenthal, n.d.). The US was spared the brunt of the tragedy
because the drug was still under review by the FDA when evidence of its teratogenicity emerged.
However, the near miss caused Congress to strengthen its oversight of clinical trials by enacting
the Kefauver-Harris Amendment of 1962 (Goodrich, 1963). This amendment now required the
FDA to evaluate all new drugs for safety and efficacy using well-controlled clinical trials whose
data would then be scrutinized more closely (Otte, 2005).
However, concerns remained over the loose way in which clinical trials were being
designed and managed. Many committees and organizations around the globe began to publish
documents that would consolidate, standardize and harmonize clinical trial guidelines around the
world. An important player has been the International Conference on Harmonisation of
Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH, n.d.). In
1995, it issued a particularly important guidance: Topic E6 Guideline for GCP (ICH, 1995). The
recent addendum to the ICH GCP Guidance titled, Integrated Addendum to ICH E6(R1):
Guideline for Good Clinical Practice E6 (R2) (ICH, 2016) and the E6 R2 Good Clinical
Practice: Integrated Addendum to ICH E6 (R1) Guidance for Industry (FDA, 2018b) updated the
standards regarding electronic documents, to increase clinical trial quality and efficiency and to
keep up with the growing use of electronic records. E6 lays out the 13 core principles of
ICH-GCP, whose goals can be summarized as follows:
All clinical trials should be conducted in accordance with ethical principles,
sound scientific evidence and a clear detailed protocol with prior
approval/favorable opinion from the institutional review board (IRB)/independent
ethics committee (IEC). The benefits of conducting trials should outweigh the
risks. The rights, safety and well-being of trial participants are of paramount
importance, and these should be preserved by obtaining informed consent and
maintaining confidentiality. The care must be given by appropriately qualified
personnel with adequate experience. Records should be easily accessible and
retrievable for accurate reporting, verification and interpretation. Investigational
products should be manufactured according to Good Manufacturing Practice
(ICH, 2016).
The ICH represents:
…quality standards that improve data quality, minimise unwanted exposure of
humans to investigational products, enhance marketing prospects of new drugs
and makes trials cost-effective for sponsors (Pathak, 2018, par. 4).
Conducting clinical trials in accordance with ICH-GCP guidelines has reduced
the occurrence of frauds and accidents (Pathak, 2018, par. 5).
A brief overview of relevant parts of the guidance, presented below, is important because it is
central to the research proposed in this dissertation.
2.4.1 Regulatory Expectations for Electronic Records and Signatures for Clinical Trials
ICH E6 and its addendum make clear that the safety and efficacy of tested products and
the rights of the patients can be achieved only if the clinical trial can assure the integrity of its
data. Until recently, handwritten signatures were commonly used to authenticate documents, but
electronic signatures can provide an efficient and secure alternative when they are implemented
in compliance with regulatory requirements. Section 8 of the ICH E6 (R2) provides a list of
essential documents that are expected to be maintained throughout the conduct of a clinical trial
and have a high benchmark for data integrity, appropriate use and compliance with regulatory
requirements. The list is extensive. It includes, for example: the investigator’s brochure (IB); the
trial master file (TMF); signed ICFs and CRFs; investigators’ and/or sub-investigators’ curricula
vitae, progress reports from investigators; ethics committee (EC)/institutional review board
(IRB) documentation; regulatory authority documents and correspondence; documentation
relating to safety information; financial disclosure forms; and other trial related documents.
Clinical trial investigators and sponsors must maintain these essential documents throughout the
duration of the trial and for a specified period after the trial has been completed. Proper
documentation and recordkeeping are crucial for ensuring data integrity, transparency and
compliance with regulatory requirements and GCP guidelines.
Section 4.9.0 of the ICH E6 R2 further emphasizes the need for Good Documentation
Practices (GDP) to meet the requirements of sound “data quality” collected for clinical trial
research:
The investigator/institution should maintain adequate and accurate source
documents and trial records that include all pertinent observations on each of the
site's trial subjects. Source data should be attributable, legible, contemporaneous,
original, accurate, and complete. Changes to source data should be traceable,
should not obscure the original entry, and should be explained if necessary (e.g.,
via an audit trail) (ICH, 2016, p. 19).
“’Data integrity’ refers to the soundness of the body of data as a whole. In particular, the
data should be credible, internally consistent, and verifiable” (WHO, 2005, p. 94). Of particular
concern for assuring document integrity is the need to assure that records have not been (and
could not be) corrupted or forged. Thus, audit trails are needed to prove the authenticity of
collected data so that a regulatory agency can recreate the details of the trial. Typically, a
consistent approach to data management depends on having robust Standard Operating
Procedures (SOPs) to detail how data must be handled by site personnel and sponsors and how it
will be monitored to confirm its authenticity and integrity.
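The expectation in section 4.9.0 that changes to source data be traceable without obscuring the original entry is, in practice, often met with an append-only audit trail: corrections are recorded as new entries that reference, rather than overwrite, the original. The sketch below is illustrative only, assuming a simple in-memory log; the class and field names are invented for this example and do not represent any validated clinical system, though the fields deliberately mirror the "attributable, contemporaneous, original" expectations quoted above:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only change log: each entry records who changed what and when,
    and chains a hash of the previous entry so that tampering is detectable."""

    def __init__(self):
        self.entries = []

    def record(self, user, field, old, new, reason=""):
        prev_hash = self.entries[-1]["hash"] if self.entries else ""
        entry = {
            "user": user,                                          # attributable
            "timestamp": datetime.now(timezone.utc).isoformat(),   # contemporaneous
            "field": field,
            "old_value": old,                                      # original entry preserved
            "new_value": new,
            "reason": reason,                                      # explanation for change
            "prev_hash": prev_hash,
        }
        entry["hash"] = hashlib.sha256(
            (prev_hash + json.dumps(entry, sort_keys=True, default=str)).encode()
        ).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Recompute the hash chain; altering any stored entry breaks it."""
        prev = ""
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                (prev + json.dumps(body, sort_keys=True, default=str)).encode()
            ).hexdigest()
            if e["hash"] != expected or e["prev_hash"] != prev:
                return False
            prev = e["hash"]
        return True

trail = AuditTrail()
trail.record("jdoe", "systolic_bp", 128, 132, "transcription error")
print(trail.verify())  # editing any stored entry afterward would make this False
```

The hash chaining is one way to make an electronic audit trail tamper-evident; regulated systems would additionally need secure storage, access controls, and validation of the mechanism itself.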
One area in which compliance with regulations seems to be especially challenging is the
use of devices and software programs that collect, store and/or analyze data from study
participants. If such devices are to be introduced, managed and maintained in a compliant
manner, the staff involved in data collection and management must be trained, the devices must
be validated and calibrated, and quality control methods must be developed to ensure that the
collected data is accurate and complete. The data collected from such devices must also be
backed up so that they can be recovered and archived. Many types of documentation are
subject to such control, as stated in section 1.22:
All records, in any form (including, but not limited to, written, electronic,
magnetic, and optical records, and scans, x-rays, and electrocardiograms) that
describe or record the methods, conduct, and/or results of a trial, the factors
affecting a trial, and the actions taken (ICH, 2016, p. 4).
However, this listing obligates companies to a large amount of work. Specific types of
documentation in even a simple trial might include but are not limited to: source data, such as
original records and certified copies of original records; source documents, such as clinical
charts, hospital records, pharmacy dispensing records, subject diaries and laboratory notes; case
report forms; correspondence with oversight individuals or groups; study protocols and
amendments; investigator brochures; signed informed consent documents; and subject screening
logs (ICH, 2016). Such documents had in the past been captured and stored on paper and
authenticated by ink-based signatures. Today, it is much less certain that paper will persist as
the medium of preference. However, as stated by Olsen, “Regulatory concerns and technical
requirements are viewed as particularly onerous. Undoubtedly, this attributes to the reluctance of
sponsors and CROs to integrate electronic applications in the clinical development process”
(Olsen, 2022, par. 9).
The new electronic landscape, described above, has been revolutionary for the
management of clinical data. The electronic systems hold the promise of greater efficiency and
interoperability. They also open the door for new approaches to clinical trials because they
facilitate the increased decentralization of trials to a point where home-based rather than
clinic-based trials become more feasible. Clinical trial data can now be collected or extracted using a
range of novel methods and tools. For example, electronic health records (EHRs), electronic
medical records (EMRs) or mobile applications can now be used to collect and analyze
electronic Clinical Outcome Assessments (eCOA) or evaluate ePROs. Investigators can also
source the data from wearables and sensors that might feed EDC systems (Olsen, 2022).
The evolution has had its challenges. The transition has been burdensome, because many types of
paper records still had to be kept, and many questions remained that were not covered by the
relatively high-level predicate rules or by the relatively old guidance document from 1999, titled
Guidance for Industry- Computerized Systems Used in Clinical Trials (FDA, 1999). To remedy
this deficiency, FDA released another guidance document, titled Guidance for Industry
Computerized Systems Used in Clinical Investigations, to explain how clinical trials might
comply with expectations related to Part 11 (FDA, 2007). The intent of this 2007 guidance was
to identify how to use computerized systems in clinical investigations without compromising the
reliability, quality and integrity of electronic data and source documentation. It explained how
industry might use electronic records and computerized systems to create, modify, maintain,
archive, retrieve, or transmit source data. It identified requirements for audit trails that began
with source data so that the entire clinical trial could be reconstructed from cradle to grave. It
mentioned the use of study protocols, standard operating procedures (SOPs), retention records,
security safeguards and training records to assure the integrity of source data. It also discussed
how clinical data should be submitted in marketing applications. The guidance had a wide reach,
applicable to trial sponsors, contract research organizations (CROs), data management centers,
clinical investigators, and institutional review boards (IRBs) (FDA, 2007). Thus, it set the
expectations about how data from clinical trials should be managed going forward.
2.4.2 Industry Concerns
The concerns that industry has had with respect to Part 11 regulations became magnified
when those regulations were applied to clinical trials. One primary concern was that the
regulations “were often too general when it came to the need of specific clinical trial processes
and structures” (Ohmann, 2011, p. 4). This made it difficult to create a concise standard that
academic clinical trial centers and data centers could use to ensure that their electronic data
management was compliant. However, at least two other specific concerns also made it
complicated to plan an implementation strategy in which the companies could feel confident.
First was the need to develop risk assessments, a stated requirement of the regulations. Second
were the challenges of integrating Part 11 with predicate rules. Consequently, companies were
obligated to justify their decisions regarding the implementation of specific electronic controls
through documented risk assessments while at the same time taking into account the record
requirements outlined in the corresponding predicate rule (Winter, 2003). How to do this was
the question.
2.4.2.1 Challenges Related to Risk Assessments
The Q10 Pharmaceutical Quality System (FDA, 2009) and the ICH GCP E6 R2 guidance
(ICH, 2016) require manufacturers and trial sponsors to perform a risk assessment to determine
which requirements of Part 11 should additionally be met after the requirements of predicate
rules are satisfied. As stated by the E6 addendum, “The approach to validation should be based
on a risk assessment that takes into consideration the intended use of the system and the potential
of the system to affect human subject protection and reliability of trial results” (ICH, 2016, p. 8).
Those risk assessments have no standardized format or consistent criteria to prioritize the
risks. The outcomes of an assessment could then be interpreted in different ways by different
functional teams within a single company and could further vary from one company to another.
The GMP industry forums recognized this gap. ISPE published GAMP 5 – A Risk-Based
Approach to Compliant GXP Computerized Systems which proposes a formalized and
documented risk assessment process (ISPE, 2008). Those responsible for clinical trials are
typically unfamiliar with these guidelines and even with the FDA requirement to carry out such
risk assessments. In the most recent draft guidance Electronic Systems, Electronic Records, and
Electronic Signatures in Clinical Investigations Questions and Answers (FDA, 2023b), the
Agency provides no details on “how” to perform a risk assessment but directs the industry back
to the many available risk assessment methodologies and tools. It advises readers to consult the ICH
guidance for industry Q9(R1) Quality Risk Management (ICH, 2023b) and the standard ISO
31010:2019 Risk management – Risk assessment techniques (ISO, 2019). The FDA guidance
document specifically states that:
The 2003 part 11 guidance, which states that FDA intends to exercise
enforcement discretion regarding specific part 11 requirements for validation of
computerized systems (§§ 11.10(a) and corresponding requirements in 11.30),
recommends that industry base its approach to such validation on a justified and
documented risk assessment and a determination of the potential of the system to
affect product quality and safety as well as record integrity. Accordingly, we
recommend that sponsors and other regulated entities use a risk-based approach
for validating electronic systems owned or managed by sponsors and other
regulated entities (FDA, 2023b).
Confusingly, Richman recalled that the Agency has at times suggested “that some
systems may not need to be validated” (Richman, 2003). Those statements contradict the
position in its written documentation that all systems should be validated according to their
intended use. These sometimes-conflicting recommendations seem to have led some companies to
interpret the ‘justified and documented risk assessment’ phrase, stated in the draft guidance, Part
11, Electronic Records; Electronic Signatures — Scope and Application, as permission to avoid
validating systems that in fact are subject to validation requirements called out by the predicate
rules (Richman, 2003).
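While neither Part 11 nor the 2003 guidance prescribes a method, the quality-risk-management tools that ICH Q9(R1) points to, such as failure mode and effects analysis (FMEA), typically score each system function on severity, probability, and detectability and prioritize validation effort by the product of the three. The sketch below is a hypothetical illustration of that scoring logic; the 1-to-5 scale, the example system functions, and the cut-off threshold are illustrative assumptions, not values drawn from any regulation or guidance:

```python
def risk_priority(severity, probability, detectability):
    """FMEA-style risk priority number (RPN). Each factor is scored
    1 (low) to 5 (high); for detectability, 5 means a failure would be
    hard to detect. The scale is an illustrative assumption."""
    for score in (severity, probability, detectability):
        if not 1 <= score <= 5:
            raise ValueError("scores must be between 1 and 5")
    return severity * probability * detectability

# Hypothetical system functions for an electronic data capture platform,
# scored by a study team (example values only).
functions = {
    "subject diary entry capture": (5, 3, 4),  # directly affects trial results
    "report PDF pagination":       (1, 2, 1),  # cosmetic; low impact
}

THRESHOLD = 20  # illustrative cut-off separating validation rigor tiers
for name, scores in sorted(functions.items(),
                           key=lambda kv: risk_priority(*kv[1]), reverse=True):
    rpn = risk_priority(*scores)
    tier = "full validation" if rpn >= THRESHOLD else "reduced testing"
    print(f"{name}: RPN={rpn} -> {tier}")
```

The point of such a scheme is that the documented scores and threshold become the "justified and documented risk assessment" a sponsor can show an inspector, rather than an undocumented judgment call.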
2.4.2.2 Challenges in Synchronizing with the FDA Predicate Rules
When regulators identified that they would defer to predicate rules to set expectations for
some aspects of electronic records and signatures, it set up a system that was particularly difficult
to manage for clinical trials. Theoretically, the use of predicate rules seemed to be a good
approach, because companies could rely on the more detailed guidance documents related to each
predicate rule to clarify the expectations. However, “the devil is in the details”. It soon
became apparent that a single clinical trial would be governed by more than one predicate rule,
and the different rules had different expectations for the management of clinical trial data,
records and test products. On one hand, the trial itself would be covered by the GCP predicate
rules, including but not limited to 21 CFR Part 50, Protection of Human Subjects; 21 CFR Part
312, Investigational New Drug Application; and ICH E6 for GCPs. Additional predicate rules of
21 CFR Part 58, GLP for Nonclinical Laboratory Studies and 21 CFR Part 211, which governs
cGMPs for pharmaceutical test articles, also have requirements relevant to the clinical protocol
and quality systems. The Bioresearch Monitoring (BIMO) FDA Compliance Program Guidance
Manual, Program 7348.810 (FDA, 2021a), specifically addresses these diverse requirements,
found in section S, on Investigational Product (CGMPs), and section W, on Nonclinical Lab
Studies (GLPs). There is a dissonance between the requirements identified in these different
predicate rules (Table 2 and Table 3).
Table 2: Some Important Differences Among GLP, GCP, GMP
(Winter, 2003)
The approaches that FDA has taken assume that predicate rules and Part 11 will work
synergistically. Thus, companies must understand the predicate rules relevant to their studies so
that they can determine how they mesh with “Part 11” (Jardine, 2022). In the areas of records
and documentation, Table 3 gives examples of some documents of concern and the predicate rule
most relevant to them.
Table 3: Part 11 Enforced Predicate Rule References
(Jardine, 2022, Winter, 2003)

Type of Records | GxP Category | FDA Predicate Rule Reference
Records and reports (records retained for at least 1 yr. after the expiration of the batch) | GMP | 21 CFR 211.180
Recordkeeping and record retention (records retained for 2 yrs. after a marketing application) | GCP | 21 CFR 312.57
Changes shall be drafted, reviewed and approved by the appropriate organizational units and reviewed and approved by the quality control unit | GMP | 21 CFR 211.100, 21 CFR 211.160 (no stated GCP requirement)
Equipment must be qualified, calibrated, cleaned and maintained | GMP | 21 CFR 211.63, 21 CFR 211.67, 21 CFR 211.68 (no stated GCP requirement)
Equipment cleaning and use logs | GMP | 21 CFR 211.182
Master production and control records | GMP | 21 CFR 211.186
Batch production and control records | GMP | 21 CFR 211.188
Production record reviews | GMP | 21 CFR 211.192
Laboratory records | GMP | 21 CFR 211.194
Protocols for a nonclinical laboratory study | GLP | 21 CFR 58.120
Reporting of nonclinical laboratory results | GLP | 21 CFR 58.185
Raw data, documentation, protocols, final reports, quality assurance (QA) inspection records and samples, job descriptions, training records, and instrument maintenance, calibration, and inspection records | GLP | 21 CFR 58.195
Supporting records for investigational new drug (IND) applications and records described in ICH (International Conference on Harmonisation) GCP guidelines | GCP | 21 CFR 312.57, 21 CFR 312.62
Records that ensure that the systems are designed to permit data changes in such a way that they are documented and deletion of entered data is prevented | GCP, GMP | ICH GCP 5.5.3, EU GMP Guide Annex 11
List of individuals authorized to make data changes | GCP | ICH GCP 5.5.3
However, in cases where more than one predicate rule appears to be relevant, the
requirements called out by the different predicate rules are not consistent. This can cause
problems when a product or process moves from one functional area to another, for example, an
investigational product (IP) in manufacturing (cGMPs) is provided for testing to preclinical
research labs (GLPs), then is modestly changed and moved back to manufacturing (cGMPs),
only then to be provided for clinical research (GCPs). Specifically, the requirements for
equipment qualification and calibration (21 CFR 211.63, 21 CFR 211.67, 21 CFR 211.68),
change controls (21 CFR 211.100(a), 21 CFR 211.160), and record retention (21 CFR 211.180,
21 CFR 312.57) differ markedly under the GMP and GCP requirements, as shown in Table 3.
Just one example of a situation that lacks clarity can be used to illustrate the types of
issues in which confusion can arise. It relates to the rules governing the
management/maintenance of equipment that stores the IP at an investigator site for clinical trials.
No regulations explicitly address the control of site-based equipment, even though the IP itself
must be strictly controlled under GMP rules up until it arrives at the investigational site. In an
interview, an ex-FDA inspector attempted to clarify this problem (personal communication). She
stated that information provided by the sponsor to the site regarding the way to
manage/service/maintain equipment used for the trial (whether found, for example, in a protocol,
pharmacy brochure and/or investigator brochure) should be referenced to 21 CFR 312.60. This
regulation identifies that "an investigator is responsible for ensuring that an
investigation is conducted according to the investigational plan and for the control of drugs under
investigation”. Then the sufficiency of those instructions would be scrutinized to be sure that the
requirements and instructions to manage IP are all spelled out clearly for the investigators. If the
sponsor does not include or specify any such conditions, 21 CFR 312.60 would still be applied, but
other evidence would be considered to assess if the site was managing trial equipment properly.
The bar for this assessment would however appear to be lower. In this case, investigators
typically ask for and flip through service records. Most monitors do the same during their site
visits. Practitioners in the GCP world are often unaware of the stringent GMP requirements and so do not
apply the same standards for validation and qualification of equipment used to store or process IP
that would be required for compliance with GMP requirements. However, it is difficult for
companies to ferret out such differences in practice if they are not privy to the personal
discussions reported here with the ex-official. Further, this kind of informal advice is not
documented so is difficult to use as a reason for the types of interventions that are adopted.
2.5 Decentralized Trials (DCT)
The requirements to control the use of electronic records, signatures and computerized
systems were put into place when most clinical trials were “centralized”. The term,
“centralized”, is used to describe a trial in which subjects visit health care facilities to undergo
testing, and whose data are held within that clinical setting using some form of Electronic Data
Capture (EDC). However, increasing numbers of trials in the past decade have modified this
approach to add “decentralized” elements. The use of DCT methods, seen in only 7% of trials just
prior to the start of the COVID-19 pandemic, ballooned to an estimated 77% of trials in the period
between 2019 and 2020 (Gao, 2021). Thus, what started when Pfizer launched the first virtual trial in
2011 has today become a strategic priority for biopharmaceutical organizations. DCT approaches
proved to be a principal way in which to reduce trial disruptions during the COVID-19
pandemic, when about 85% of clinical trials were delayed and just 7% of enrolled patients were
able to complete a trial (Climedo, 2022).
Various terms, including decentralized, direct-to-participant, and virtual studies, have
been used to describe DCT studies (Dorsey, 2020). DCTs can also build on traditional designs
but add decentralized elements that facilitate subject convenience and engagement, as shown in
Figure 4 (Van Norman, 2021). For example, subjects can be recruited and monitored remotely
through internet interactions and data may be generated and collected electronically (e.g. via
wearable monitors). In many cases, telemedicine and virtual visits using internet technologies
such as FaceTime, Skype and Zoom have been able to assure convenient and complete patient
follow-ups with successful trial outcomes. Data can be collected using electronic patient diaries,
self-assessments or data collection from wearable medical devices outside of the traditional
clinic setting (Van Norman, 2021). Such interactions and data transfers are supported by cyber
security and data privacy protection laws that help to assure subject safety and data integrity
(Khozin, 2019). In situations where in-person visits are mandatory, healthcare provider offices
and laboratory testing sites may be situated close to the subject’s home; study medications can be
shipped directly to that home or to a local healthcare facility (Van Norman, 2021).
Figure 4: Illustration of Decentralized Clinical Trials
(Van Norman, 2021)
The decentralized approach has many advantages. For sponsors and sites, it can save
money and time by expediting recruitment, retaining subjects on trial, simplifying logistics and
increasing compliance with study procedures. It can also improve access to hard-to-reach patient
subpopulations whose members might be older, poor, or living in remote locations. In a global
survey conducted in 2017 of 12,427 respondents, the need to travel to the study location was a
‘least liked’ feature of participating in a clinical trial, second only to the possibility of receiving
placebo instead of active investigational product (CISCRP, 2017b). The ability to complete some
aspects of the trial at home is particularly valuable for situations in which the subjects are ill,
immunocompromised, have medical comorbidities or live at a distance from the central
investigational site (Dorsey, 2020). For these and other subjects, it can reduce travel costs and
the need for time off from work (Van Norman, 2021).
Surveys have been conducted since 2013 by CISCRP (the Center for Information and Study on Clinical Research Participation), a 501(c)(3) nonprofit organization whose core values are Collaboration, Integrity, Service, Creativity, Respect and Passion, and whose core mission is:
to provide accessible, relevant, useful, high-quality educational resources,
programs, and services that increase awareness and understanding of the clinical
research process; recognize and appreciate the unprecedented gift of
participation in clinical trials; enhance and enrich the participation experience
for patients and their families; and promote engagement and partnership between
clinical research professionals, patients, and the public (CISCRP, 2022a, par. 1).
The survey results are supplemented with feedback from pharmaceutical, biotechnology
and medical device companies, CROs, industry service providers, patient advocates/advocacy
groups and investigative sites (CISCRP, 2022b). The 2017 Perceptions and Insights Study
examined how study volunteers make decisions about whether to participate and how those
subjects could be assisted as they go forward in the trial (CISCRP, 2017a). The 2,194 responding
participants identified that the DCT approaches used most frequently were text messaging and
informed consent on a tablet. They also reported that smart phone applications were the most
used technologies during trials (CISCRP, 2017b). Preferences for smart phone apps, wearable
devices, and social media were especially common amongst those in younger age groups
(CISCRP, 2017b).
More recently, in 2021, CISCRP hosted a webinar that further supported the usefulness
of DCTs and their associated DHTs. It highlighted a marked increase in the use of decentralized
activities and in the willingness of study participants to continue such activities after the
pandemic (CISCRP, 2021). Results were drawn from a global questionnaire disseminated
between April 2021 and July 2021 to 11,793 participants who were either potential subjects or
subjects already enrolled in clinical research studies.
Respondents identified that disruptions in daily routines related to visits, travel and
diagnostic tests were the most burdensome aspect of a traditional clinical trial and these could be
reduced using DCT methods. Speakers also noted that potential subjects became more aware of
clinical trials and were more commonly engaged in virtual visits and telemedicine during the
pandemic period. The webinar concluded that convenience-enhancing initiatives such as smart phone apps, text messaging and video conferencing were particularly beneficial (CISCRP, 2021). The
findings of the report were amplified by largely supportive comments from study panelists in the
webinar. However, both Kim Harper, Site Director at Benchmark Research and Krystal Doucet,
Associate Site Director at Benchmark Research, pointed out some disadvantages. Patients using
telemedicine did not have face-to-face time with the principal investigator (PI) or healthcare
professionals; this could make it difficult to conduct a physical examination or treat an adverse
event. Nonetheless, the consensus of contributors appeared to support the conclusions of Dr. Gail Van Norman, who stated, “with growing acceptance of virtual medicine and technology, there
appears to be little doubt that DCTs have ‘arrived’ and stand to change the face of human clinical
trials in drug and therapeutic development both now and into the future” (Van Norman, 2021, p.
387). Recent research related to the logistics of DCTs has also begun to confirm the predicted
beneficial effects of DCT approaches. For example, significant time savings were reported in a
recent randomized trial evaluating a 5-day course of hydroxychloroquine for asymptomatic
volunteers who had been exposed to others with confirmed infections. The trial opened, enrolled
its participants using social and traditional media, obtained consent using electronic means,
collected its data electronically and published the results within a period of 90 days (Boulware,
2020, Schilling, 2020). These potential benefits came with a higher initial price tag to set up the technology infrastructure needed to use these capabilities. Such costs include, at a minimum, setting up remote monitoring, eConsent and patient engagement platforms, EDC systems, digital devices, hardware and software, licensing, device connectivity, IT support and training of staff and subjects. However, a recent phase III study by Novartis suggests that the long-term benefits
can justify the initial startup costs. In that study, remote monitoring reduced travel costs and
eased the strain on internal resources, such as the need for having space and access to hospital
computer terminals when monitoring in the hospital. Such considerations are important because
“the types of monitoring generally performed in pharmaceutical industry trials usually add 25%
to 35% to the overall costs of a typical phase III trial” (Uren, 2013, p. e15).
2.5.1 Digital Health Technologies in Clinical Trials
2.5.1.1 Prevalence of Digital Health Technologies
As decentralized trials gain popularity, more attention has had to be paid to the digital
tools that manage the distant relationships and their associated data exchanges. The terms Digital Health Technologies (DHTs) and Digital Technology refer to electronic methods that might be employed as part of the assessment and record retention process, and are defined as:
an electronic method, system, product, or process that generates, stores, displays,
processes, shares, purges, and/or uses digital data (signals in zeros and ones).
Examples of digital technologies include hardware (e.g., wearable sensors, VR
headsets, digitally enabled drug delivery devices), advanced analytics (e.g.,
artificial intelligence, machine learning, sophisticated computation) and cloud
services (e.g., storage, computing, and data processing), and software (e.g.,
mobile medical applications, SaMD). The technology could be a product that
includes new, unfamiliar, or unseen digital health technology never submitted,
cleared, or approved by the FDA (PhRMA, 2022, p. 3).
DHTs have already become commonplace in the healthcare field to collect data on participant
health markers such as physical activity, sleep, heart rate, medication adherence and respiration
patterns. DHTs also help to improve the quality of subjectively reported outcome data by
assuring the timely collection of data points, ensuring compliance, reducing administrative
burden and avoiding secondary data entry errors (Coons, 2015). They provide the research
community with new tools that can collect data outside of “brick and mortar” research centers
(Figure 5).
Figure 5: Digital Health Tools in Biopharmaceutical R&D
(PhRMA, 2022)
Key: AI-Artificial Intelligence - Artificial intelligence refers to systems designed by humans that, given a complex
goal, act in the physical or digital world by perceiving their environment, interpreting the collected structured or
unstructured data, reasoning on the knowledge derived from this data and deciding the best action(s) to take
(according to pre-defined parameters) to achieve the given goal.
ML-Machine Learning - One of AI’s sub-disciplines, denoting the ability of a piece of software to learn from its
environment or from a very large set of representative data, enabling systems to adapt their behavior to changing
circumstances or to perform tasks for which they have not been explicitly programmed.
AE-Adverse Event
RWE-Real World Evidence - Clinical evidence regarding the usage, and potential benefits or risks, of a medical product derived from analysis of RWD.
RWD-Real World Data - Data relating to patient health status and/or the delivery of health care routinely collected
from a variety of sources.
In 2000, only eight clinical trials used DHTs; however, by 2017, the number had increased to over 1,100 trials (Marra, 2020). Kaiser Associates and Intel further
estimated that by 2025, 70% of clinical trials will incorporate digital sensors (Jansen, 2020). In
2020, the COVID-19 pandemic also forced adoption of digital health technologies in the
healthcare and pharmaceutical industries to protect participant safety and enable clinical trial
continuity (TransCelerate, 2020). However, despite their usefulness, DHTs also pose challenges.
2.5.1.2 Regulatory Challenges
It can be challenging to develop and use DHTs in regulated activities such as clinical
trials, and even more difficult to manage compliance with regulations related to digital records
and signatures under Part 11 and the predicate rules. Concerns related to dissonance with the predicate rules, and confusion over validation, are relevant to the use of DHTs in decentralized trials. Thus, methods to address Part 11 requirements must be developed early,
when protocols are being planned and vetted through the regulators. Development of proactive
plans will be needed to demonstrate that systems are in place to address technology-related
compliance risks and to provide needed training and assistance to trial participants and staff.
Further, the collection of data provided by DHTs poses different technological problems than that of traditional data collected manually by drug development clinical teams. Some DHTs can
generate massive amounts of data that need to be organized and mined appropriately. DHTs must
be validated for the specific contexts in which they will be used. This will involve developing
methodological standards for data collection, managing the collected data and assuring that the
data are complete (Izmailova, 2023). These activities take time to complete. To implement the
plan, technical support teams and electronic infrastructure must be put into place and methods
for data management must be identified and validated.
Companies may also be reluctant to employ DHTs because they are confused about how
to deal with the regulatory requirements, which may lag or may even be contradictory given the
rapid evolution of this field. Confusion may exist about several regulatory concerns including 1)
the definition and management of source data; 2) the management of authorized signatures to
establish valid audit trails; 3) the choice of datasets required to maintain an audit trail; 4) the
appropriate use of devices and systems under the control of site staff or patients; 5) the identity of the data originator(s); and 6) what should be reported as the final result. Added to this can be
a regulatory disconnect created because drug and device marketing submissions are separate and
are thus reviewed by different divisions of the FDA.
2.5.1.3 Technological and Logistical Challenges
DHTs are often discussed as though they were all basically the same. In fact, the term
encompasses a variety of approaches and devices that will have their own requirements and
challenges. For most wearable biometric devices and other tools, the software and/or hardware must be evaluated and validated before the devices are regarded as acceptable for regulatory purposes. Validation activities such as risk assessment, design
verification, traceability, and configuration management are needed to be confident that the
system consistently provides accurate data related to the use for which the product is intended
(PhRMA, 2022). Validation and verification activities appropriate for the type of device and its digitally derived endpoints are complex and potentially lengthy undertakings. Examples include verification of the data provided by body-worn sensors and analytical validation of data-processing algorithms. Clinical validation is also needed to ascertain if the
‘new measure’ is capturing the right data of interest related to the disease state and may include
such testing as construct validation, test-retest reliability, usability, device safety and sensitivity
to change testing. A challenge that multiplies the complexities of validating the DHT is the
disconnect that can exist between device engineers and clinical R&D (Research and
Development) scientists. Device engineers are often not familiar with the drug development
process and the regulatory requirements for drug approvals. Clinical scientists are generally not
familiar with the specific concerns and failure modes of devices (Izmailova, 2018).
Technological advances are key to the evolution of DHTs but are also a main hurdle limiting their adoption into study designs. “Operation of the devices themselves depend also on
the availability of technical support and troubleshooting, batteries, transmission methods, and
internet infrastructure, such as cellular towers in remote locations or hard-wired internet
connectivity in homes currently without it” (Van Norman, 2021, p. 386).
Whether the DHT platforms are provided by the company or patient, trial sponsors must
ensure that both the platform and the software are suitable and are fit-for-purpose. A 2021 draft
guidance, titled Digital Health Technologies for Remote Data Acquisition in Clinical
Investigations (FDA, 2021b), identified that DHTs may take at least two forms. The most
familiar type is that in which the software is embedded into specifically designed hardware and
essential for the core operation of the hardware. Smart watches and wearable sensors are
examples of DHTs with embedded software. However, DHTs can also be configured as standalone software, such as that used to support electronic patient diaries or telemedicine visits on the
patient’s computer. This differentiation is important because it may suggest that users of the
different types of technological solutions will have different attitudes toward, and experiences with, their implementation.
DHTs can also be differentiated according to the degree of oversight that would be
anticipated for the device or application itself. Some, such as continuous glucose monitors or
blood pressure gauges, will meet the definition of a device under the Federal Food, Drug and
Cosmetic Act (FD&C) and therefore be subject to the requirements of the predicate rules or Part
11; the review and clearance or approval of such devices by the FDA may have assured that
compliance with Part 11 will have been examined, at least to some degree. Other types of DHTs,
such as Apple watches for routine measurements of heart rate or mobility, may not be approved
devices and may not have had to comply with the requirements of validations and Part 11 prior
to their proposed use in a clinical trial. In the draft guidance document, the FDA makes clear that
FDA is not responsible to decide if a DHT meets the definition of a device. Rather it is up to the
company to make and justify that determination. The guidance goes further to recommend how
to select DHTs that are suitable for the clinical trial. It also comments on expectations related to
verification and validation, to the ways that data for trial endpoints are collected, and to the identification and management of risks associated with DHTs (FDA, 2021b). However, none of these insights offer a
and manage risks associated with DHTs (FDA, 2021b). However, none of these insights offer a
clear path by which industry can ensure compliance and regulatory alignment for its specific
types of DHTs.
The ability to maximize the potential of such technologies is further complicated by the
fact that many US families are not as technically competent as the engineers who design the
devices; 20% of U.S. households still lack smartphone or broadband access (Dorsey, 2020). The
deployment of DHTs to trial participants who are not comfortable with electronics or cannot
remember to complete or document study related tasks may require that site personnel interact
with the patient, to be sure that he or she is completing daily diaries or outcomes documentation
(Van Norman, 2021). Successful compliance with trial activities can be supported by proactively training participants on the importance of responding to alerts and system upgrades on their devices, giving them convenient access to a helpdesk to troubleshoot issues, and having sites follow up with participants who miss submissions from their DHTs to ensure that their technology is working correctly (Mooney, 2022). Multinational trials are especially difficult to manage
because data monitoring and patient interactions can be plagued by poor internet connectivity
(Olsen, 2022), outdated infrastructure and even language and time zone differences. Delays in
providing alerts due to data loss, buffering or reduced network communication can all result
from poor connectivity. An executive at a digital health startup shared this limitation:
We have some countries that do not have the connectivity that is needed to run
this type of work, and for those locations that can’t power the work, we have to
provision models to them. So, we have to provision devices, or provision internet
connectivity, and doing that really increases the cost and it doesn’t scale well
(Olsen, 2022, par. 12).
Technical characteristics of wearable devices, such as their sizes, convenience to wear and
impact on activities of daily living, should also be considered to ensure effective adherence and
compliance by the study patients (Izmailova, 2018).
2.5.1.4 Training Challenges
Personnel and patients involved in trials incorporating DHTs can also face a steep
learning curve. Those trials often have approaches and elements unfamiliar to practitioners who
have gained considerable expertise in site-based trials but are less confident about the use of
electronic tools. Thus, sponsors and site staff must be trained first to use the specific mobile
technology or wearable device themselves and then to educate the patients in that use. Training
must be repeated when new personnel are added or software changes to the DHT are released.
The FDA also recommends that study participants be reassessed and retrained if the mobile tool
is complex or poses a significant risk to the conduct of the study (FDA, 2017). Thus, it can be
time consuming and expensive to keep up with all the training requirements related to rapidly
evolving technologies if those technologies are to satisfy regulatory expectations.
Another issue relevant to training of site personnel and users appears to be related to the
provenance of the devices. In the past, most DHTs were provided by the sponsor and given to the
study participants. The option to ‘bring your own device’ (BYOD), however, can allow patients
to download study apps onto their own phones or computers. Such an option can be attractive
because it reduces the need to train the user on a new interface. When participants are familiar
with their own devices, they may be more likely to complete assessments and make data entries
in a timely manner. Thus, industry appears to be shifting toward adopting BYOD technologies to
collect eCOAs (IQVIATechnologies, 2022). However, training needs may then have to be
personalized. This could raise questions about implementing BYOD strategies in ways that
would be acceptable to regulators. The FDA has not provided any specific guidance for the use
of BYOD methods but has identified an openness to support BYOD strategies. “Regulators will
want to see evidence that an assessment can be completed across multiple device types, that the
experience is consistent across devices, and that patient behavior or the use of personal
technology won’t compromise data quality and security” (Mooney, 2022, par. 7). Without more
direction, sponsors remain unfamiliar with implementation and regulatory requirements
(Mooney, 2022). It is therefore unclear if sponsors are using or intending to implement BYOD
approaches and whether they have developed specific practices to train on and validate their use.
2.6 Part 11 Assessment: A Changing Landscape
As clinical practice changes, interpreting the regulations governing electronic records and
signatures has become more challenging. Part 11, published in 1997, has yet to be revised. Two
guidance documents for computerized systems issued by the FDA, General Principles of
Software Validation; Final Guidance for Industry and FDA Staff, published in 2002 (FDA,
2002) and Computerized Systems Used in Clinical Investigations, published in 2007 (FDA,
2007), are limited in scope and depth. The challenge with the initial document lies in its focus on
medical device software development that remains unconfigurable or non-customizable by users.
Consequently, it lacks a project phase that typically encompasses software configuration to align
with business processes. In contrast, the subsequent guidance document solely addresses
operational and compliance necessities when utilizing computerized systems in clinical trials.
However, neither of these documents offers comprehensive guidance for validating a
computerized system and upholding validation status within a regulated laboratory environment
(McDowall, 2014).
Developing well-validated and compliant DHTs can be difficult, especially
when the DHTs have not yet been used in the field. Sponsors must identify whether their
electronic health tools include products that are approved medical devices, because that status
will determine the path that a company must take to ensure compliant use. The tools themselves
must be examined for risks related to measurement errors, cybersecurity threats, privacy-related
risks and competing licensing agreements (when participants BYOD). They may also have
unique issues related to data sharing and collection of consents (McGuireWoods, 2022).
Advanced and rapidly evolving DHTs can raise important yet unresolved questions about how to
assure the integrity and security of records and signatures. As new applications and technologies
are developed, sponsors and CROs must identify those that best fit the needs of the trial and can
integrate across different systems where relevant. They must then assure that personnel are
trained to use the tool, develop and retain necessary records and follow the numerous policies
and procedures that address technical, safety, privacy and clinical considerations. At present, it is
not clear how confident the clinical trial teams feel to use a specific DHT from a regulatory and
quality perspective.
Given that DHTs have been introduced quite recently, it is not surprising that the rules
governing DHTs face a regulatory lag, leaving some areas of use uncovered or ambiguous. For
example, one guidance identifies that:
Part 11 regulations do not address the performance of wearable biosensors,
mobile apps, or portable devices (i.e., the ability to measure what they are
designed to measure). For example, validation does not apply to the ability of an
activity tracker to accurately and reliably measure the number of steps walked.
Although performance of the mobile technology is critical to the clinical
investigation, recommendations for the performance of specific mobile technology
designed to measure specific biomarkers or physical activity are beyond the scope
of this guidance. For mobile technology that meets the definition of device as
defined in section 201(h) of the Federal Food, Drug, and Cosmetic Act (21 U.S.C.
321(h)), other regulations and policies may apply (FDA, 2017, p. 16).
However, a recent 2021 guidance document, Digital Health Technologies for Remote Data Acquisition in Clinical Investigations, states:
To help ensure the quality and integrity of data, adequate protection of
participants, and satisfaction of regulatory requirements applicable to clinical
investigations, sponsors and investigators should consider the following
recommendations with respect to clinical investigations that involve use of a
DHT to remotely acquire data⁵⁴ (FDA, 2021b, p. 19).
Here, footnote 54 advises readers to “see generally, e.g. 21 CFR Part 11, part 50, part 312 and part 812” (FDA, 2021b). However, no additional information is provided to industry about what the regulations require with respect to the evolving DHTs used in DCTs. Thus, as noted by
Olsen, sponsors and CROs are embracing electronic tools but remain uncertain about how these
technologies will be regulated. An executive at a Top 10 Pharma, for example, voiced the
concern that “regulators simply cannot keep up with the changing pace of technology in the
clinical development process” (Olsen, 2022, par. 10).
In the face of these concerns, FDA has undertaken several recent activities to increase
collaboration and communication with its stakeholders. On February 10, 2022, for example, the
FDA’s Small Business and Industry Assistance unit hosted a webinar titled Digital Health
Technologies for Remote Data Acquisition in Clinical Investigations, DHTs for Remote Data
Acquisition Draft Guidance Webinar to address industry’s concerns. In that webinar, FDA
summarized the draft guidance discussed above, clarified the agency’s position regarding the use
of DHTs during clinical trials and answered questions from stakeholders (FDA, 2022b). Some
topics addressed by the FDA were related directly to how regulatory frameworks were tailored to
digital health technologies for different purposes: the verification and validation of DHTs
regardless of whether they meet the definition of a device; steps to ensure a DHT is fit-for-purpose for remote data collection; and the use of qualified DHTs to support premarket
submissions. Emphasis was placed on interacting with the FDA early and often. In this webinar,
FDA also expressed its enthusiasm to engage with industry, to clarify regulatory expectations
and explore new technological advances in clinical investigations (FDA, 2022b). However, a
webinar is a poor substitute for written guidance if the goal is to engender confidence about
regulatory expectations. Notably, when audience members asked the speakers more specific
questions about the agency’s guidance on managing the Part 11 requirements, no further
guidance could be given (personal observation). Another webinar was held on 25 April 2023
titled Electronic Systems, Electronic Records, and Electronic Signatures in Clinical
Investigations: Questions and Answers, led by Dr. Leonard Sacks and his team to discuss the
draft guidance of the same title. The changing electronic environment and specific DHT topics (i.e., validations, risk assessments, audit trails and source documents) were discussed, along with an attempt to address the many public comments received in previous dockets, amongst other questions (FDA, 2023e).
In summary, confusion appears to exist related to the application of Part 11 nearly a
quarter century after its publication. We know something about the regulatory expectations of the
FDA from its guidances and public statements. However, we know much less about the
experiences of industry when trying to implement appropriate systems to assure compliance,
apart from anecdotal opinions sometimes expressed tangentially in trade journal articles,
conference presentations or the web pages for vendor services assuring “simple” solutions to Part
11 compliance. To the best of my knowledge, no systematic examination of industry views and
approaches has been performed that evaluates the current state of understanding and
implementation of DHTs in companies as related to compliance with Part 11 regulations.
2.7 Research Approaches and Framework for Part 11
The field of digital health technologies is so broad that many research projects could be
conducted to explore different aspects of its development, use and management. The goal of this
research was to explore one aspect of this environment, specifically the views and experiences of
sponsors and CROs when they attempt to implement the requirements of Part 11 for DHTs used
in clinical trials for data collection. A diversity of devices and applications can be used in ways
that could define them as DHTs. I have chosen two common types of DHTs, consumer wearables and software platforms, that I thought would give a beginning look at the experience of, and challenges with, DHTs from a company perspective.
Given this implementation focus, an implementation research framework was considered
important to structure and systematize the research. Although implementation as a science is
relatively new to the research community, it is a field that has grown rapidly. At least 32
implementation frameworks have now been published in the literature in areas of health care,
mental health, business, and education. Many of these alternatives are variations based on the
seminal implementation framework developed by Fixsen (Fixsen, 2016), the co-founder of the
National Implementation Research Network (NIRN). The goal of NIRN is to assist researchers to
assess gaps that exist between effective practices, based on theory and science, and current
practices, based on company knowledge and experience (Fixsen, 2005). The approach published
by NIRN in its practical guide, Improving Programs and Outcomes: Implementation
Frameworks 2013, (Bertram, 2013) was deemed to be well matched to the work being
undertaken in this research. It builds on the original work of Fixsen and his colleagues that
characterized implementation as having six successive phases: exploration and adoption, program
installation, initial implementation, full operation, innovation and sustainability (Fixsen, 2005).
NIRN has since revised and condensed that model into only four stages of implementation:
exploration, installation, initial implementation and full implementation (Figure 6). The
requirements to assure innovation and sustainability in the initial model were addressed by removing them as self-standing stages and instead weaving them throughout the other stages of the implementation model.
Key to the research framework is the tenet that implementation is not an event but “a
specified set of activities designed to put into practice an activity or program of known
dimensions” (Fixsen, 2005, p. 5). The types of activities needed along this progression are
described by NIRN in Figure 6 and as follows (Bertram, 2013, NIRN, 2022):
Exploration: In this phase, the organization must begin to understand its needs and
readiness to begin implementation and to put into place the high-level decisions needed to go
forward. It must identify the resources that will be needed and educate itself about the
requirements of the regulations and guidance documents. At this stage it begins to determine the
potential drivers and barriers to implementation.
Installation: The organization must then prepare for implementation by securing the
needed resources, including funding and staffing, by assessing organizational readiness at a more
granular level, and by assigning key leaders and developing needed partnerships.
Initial Implementation: The organization begins to pilot its approaches. New processes
and systems are deployed, and weaknesses are identified and corrected to drive more efficient
and sustainable implementation.
Full Implementation: The organization continues to monitor and manage its operations and has put
into place a capable system of policies and procedures that needs only modest, incremental
adjustments to improve its practices and assure sustainable operations.
Figure 6: NIRN - Fixsen’s Implementation Model
(NIRN, 2022)
Although the model appears to have boundaries between different stages, those who
created the model were quick to emphasize that implementation stages are not performed in a
lock-step sequence (NIRN, 2022). Phases may overlap and even circle back as the
implementation teams respond to multiple decisions, actions and corrections that affect the final
implementation. Further some implementation programs are never fully “finished”. As science
and regulations change, implementations must adapt.
In this study, I used the implementation framework illustrated in Figure 6 to survey the
views of industry professionals in mid to senior level positions in the biopharmaceutical and
medical device industry who have experience with the use of the specific types of DHTs of
interest for their clinical trials. I explored how those experts view the Part 11 regulation and its
associated regulatory instruments. Further, I investigated the experiences of individuals who
have implemented the requirements and their level of confidence in their company’s ability to
stay compliant with the relevant regulations.
Chapter 3. Methodology
3.1 Introduction
The purpose of this research study was to (1) explore current practices in the industry
when implementing compliance with the requirements of Part 11 regulations governing
electronic records and signatures for DHTs (specifically consumer wearables and software
platforms for data collection), (2) evaluate the views of industry regarding the adequacy of
regulations governing that use, and (3) explore some of the main barriers to the implementation
of DHTs.
3.2 Survey Development
A self-administered survey was used for this exploratory study. The findings from the
literature review along with discussions with industry professionals and my supervisory
committee provided a starting point for a novel survey instrument. The instrument was structured
with reference to the NIRN framework. It began with a short section to collect demographic
information and then was divided into five blocks, including exploration, installation, initial
implementation, and full implementation, in accordance with the framework. It had two primary
blocks of 25 questions related to the implementation of DHTs, one directed at those with
experience with software platforms and a second for those experienced with consumer
wearables. Participants had the option of completing one of the two blocks or both, according to
their experience and inclination. Six additional questions were directed at all participants at the
end of the survey. The survey questions were posed in a range of formats such as true/false,
yes/no, scaled, choose one and rank-order options. Additional free-text comment boxes allowed
the participants to expand on their views and experiences. The survey was designed to take no
longer than 15 to 25 minutes, depending on whether the participant answered one block of
questions or two.
The draft survey was critiqued by members of a focus group that included 8 professionals
from diverse backgrounds in academia, professional organizations and industry. All were
provided with a brief introduction to the research study and the preliminary survey questions
about 8 days before the scheduled date of the 90-minute meeting via Zoom videoconference. The
meeting was recorded with the permission of the attendees. After a brief introduction, the focus
group members were encouraged to comment on the relevance and clarity of each question in the
survey. I used this feedback to help finalize the survey questions.
The final survey questions were recorded on a web-based electronic survey platform at
Qualtrics.com, which provides an interactive tool to structure and distribute the survey. As a
quality control measure, a beta test of the survey was carried out by a sample group of reviewers
to ensure that the survey was delivered properly, that the branches between question blocks were
arranged appropriately, and that the questions were formatted correctly. Problems that were
identified were fixed before the survey was launched. The finalized survey was then sent to
survey participants.
3.3 Survey Deployment
A target population of approximately 300 survey participants with appropriate
professional backgrounds was identified based on personal connections, searches of membership
and conference lists and referrals. Potential participants included personnel holding mid to senior
level professional positions in industry or consultancies of different sizes. This included but was
not limited to those working at sponsor companies who have job functions in clinical operations,
data management, quality assurance, regulatory affairs or information technology, currently
working with or having experience with DHTs in clinical trials. Also included were consultants
and contract research organizations (CROs) with relevant experience with DHTs used in clinical trials.
Part 11 regulatory authorities, study participants in clinical trials and study site personnel were
excluded from this study. Participants were encouraged to nominate other qualified professionals
in their networks who were subject matter experts on the topic of Part 11. To assure the
confidentiality of all the survey participants, participants who received the initial survey link
were encouraged to send the nominated names to the researcher; an anonymous link was then
provided to the nominees. The Qualtrics Mailer function provided each potential participant with
a unique link to the survey via email. The survey was kept open for approximately 6 weeks, until
the survey had received at least 40 responses. I promised the participants that their participation
would be held in confidence and that they could “opt-out” of the study at any point if they so
chose. Reminders to complete the survey were generated automatically by Qualtrics every two
weeks until the survey window was closed. All participants received a “thank you” email for
their participation, but no compensation was provided to complete the survey. Grouped results
were stored in Qualtrics electronically and are included in Appendix A. Respondents were
provided with the results if requested.
3.4 Survey Result Analysis
Survey results were analyzed once the survey was closed. Various simple statistical
methods generated by Qualtrics were used to describe the raw data in questions amenable to
quantitative analysis. Questions that were scale-based according to degrees of knowledge,
preference, importance or agreement, were also analyzed using a weighted approach. In this
approach, the most positive of the offered choices was assigned the maximum score. For
example, in a 3-scale question, choices of ‘agree’, ‘somewhat agree’, and ‘disagree’ would be
assigned a score of 3, 2, and 1 respectively. The score for each choice was multiplied by the
number of individuals who selected that choice. The resulting scores were summed (total
weighted rating) and then divided by the total number of respondents who participated in the
question, giving the weighted average (WA). Thus, a higher score indicated that
more respondents agreed than disagreed, whereas a lower score indicated a stronger tendency to
disagree. This index assisted in comparing the ‘average’ knowledge, preference, agreement, or
importance (see the example below).
Table 4: Data Conversion to Weighted Average (WA)
0-No knowledge, 1-Some knowledge, 2-A lot of knowledge, 3-Expert knowledge
Rank-ordered questions in which respondents sorted options by priority were assigned a
rank score. The rank score was calculated by applying a weight to each response. For example, if
there were 5 options, the highest priority was given ‘rank 1’ and the lowest priority was given
‘rank 5’.
Title                                               Responses (0/1/2/3)  Total  Weighted Rating (0/1/2/3)  Total  WA
GCP                                                 1/11/32/16           60     0/11/64/48                 123    2.05
21 CFR Part 11 Regulation                           2/21/22/15           60     0/21/44/45                 110    1.83
2003 - Part 11, Electronic Records, Electronic
  Signatures - Scope & Application                  0/23/26/11           60     0/23/52/33                 108    1.80
2017 - Use of Electronic Records & Electronic
  Signatures in Clinical Investigations             1/27/21/11           60     0/27/42/33                 102    1.70
2007 - Computerized Systems Used in Clinical
  Investigations                                    7/23/22/8            60     0/23/44/24                 91     1.52
2022 - Digital Health Technologies for Remote Data
  Acquisition in Clinical Investigations            12/23/16/9           60     0/23/32/27                 82     1.37
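The weighted-average calculation described above can be sketched in a few lines of Python (a minimal illustration; the function name is my own, and the GCP row of Table 4 serves as the worked example):

```python
def weighted_average(counts):
    """Weighted average of scale responses.

    counts[i] is the number of respondents who chose the option
    scored i (0 = lowest choice, len(counts) - 1 = highest choice).
    """
    total_weighted_rating = sum(score * n for score, n in enumerate(counts))
    respondents = sum(counts)
    return total_weighted_rating / respondents

# GCP row of Table 4: 1 'no knowledge', 11 'some', 32 'a lot', 16 'expert'
print(round(weighted_average([1, 11, 32, 16]), 2))  # 2.05
```

Applying the same function to the 21 CFR Part 11 row, [2, 21, 22, 15], reproduces its WA of 1.83.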
Cross-tabulation analyses were conducted to explore whether differences in responses
were related to certain other characteristics, such as differences in industry segment with which
the respondent was associated, years of experience or company size (Appendix B). However,
cross-tabulations of small groups have limited statistical power and can detect only strongly
significant differences, so they were not treated as a key statistical
tool for this type of exploratory study. Free-text responses were reviewed and analyzed to
determine thematic areas related to the adequacy of the Part 11 regulations and their associated
implementation challenges.
Chapter 4. Results
4.1 Survey Participation
The survey was activated on 16 March 2023 and disseminated between 16 March 2023
and 12 April 2023 to 285 subject matter experts. Nine emails bounced back, but 5 of those could
be updated, bringing the total number of surveys sent out to 281. Sixty-two participants
completed at least one question for a response rate of 22% (62/281); forty-eight participants
completed the entire survey for a completion rate of 77% (48/62). The 14 participants (23%) who
did not complete the survey left after completing between 7% and 74% of the
questions. Of these, 11/14 (79%) stopped answering after the question that asked them to pick
one DHT, either consumer wearables or software platforms. 2/14 (14%) stopped after answering
the question, “Did the preparations … cause delays in the start of the clinical trial?”
4.2 Demographic Profiles of Participants
Two-thirds of survey participants (67%, 40/60) reported that they worked or had worked
with a ‘pharmaceutical/biotechnology company’ and more than a third (38%, 23/60) with a
‘medical device/IVD company’ in the last five years (Figure 7). A smaller number worked with a
‘consulting company’ (20%, 12/60) or ‘CRO (Contract Research Organization)’ (13%, 8/60).
About 10% (6/60) chose ‘Other’ and reported working at a life science research technology and
services firm, a clinical trial vendor, a hospital, a digital health technology startup, or a
healthcare conglomerate. A few respondents provided company names that have not been
included in the results. The number of total responses exceeded the number of respondents
because some individuals were engaged with more than one type of product or service.
Cross tabulations were performed to identify differences between the responses to
selected questions based on the organizations in which the individual worked over the last five
years. No obvious differences were seen in the distribution of answers, given the small ‘n’
(Appendix B; Table 27, Table 28, Table 29, and Table 30).
Figure 7: Organizations Where Participants Worked
Q - Please tell us about the organizations with which you have worked in the last five years.
(Select all that apply). n=60
Pharma/Biotech-Pharmaceutical/Biotechnology Company; Med Device- Medical Device/IVD
Company; CRO-Contract Research Organization.
Many participants reported having more than one functional role during their careers
(Figure 8). More than half selected roles in ‘Regulatory Affairs’ (60%, 36/60). Many were
experienced with ‘Quality Assurance’ (45%, 27/60) or worked in ‘Clinical’ teams (e.g., Data
Management, Clinical Operations, Biostatistics, and Medical Writing) (40%, 24/60). About one-quarter had experience in ‘Information Systems/Digital Technology’ (25%, 15/60). Affiliations
reported as ‘Other’ (10%, 6/60) included FDA Compliance Consulting, UX/UI Design, Business
development, Law, R&D, QC and Clinical Innovation (Ops).
Figure 8: Functional Group Affiliations of Participant
Q - With which functional groups have you been affiliated in the past and present? (Select all
that apply). n=60
RA-Regulatory Affairs; QA-Quality Assurance; IS/DT-Information Systems/Digital Technology.
Participants varied in their current levels of seniority (Figure 9). Reported most
commonly was the level of ‘Director/Senior Director/Executive Director’ (45%, 27/60). Other
respondents were at levels of ‘Manager/Senior Manager’ (17%, 10/60); ‘Vice
President/President’ (13%, 8/60); ‘Consultant’ (12%, 7/60); ‘Specialist/Associate’ (7%, 4/60)
and ‘Other’ (7%, 4/60). Only one clarified ‘other’ as Chief Commercial Officer.
Cross tabulations were performed to identify differences between the responses to
selected questions based on the role of the individual. No obvious differences were seen in the
distribution of answers, given the small ‘n’ (Appendix B; Table 31, Table 32, Table 33, and
Table 34).
Figure 9: Current Role of Participant
Q - What is your current role? n=60
Dir-Director; Sr.-Senior; Mgr-Manager; VP-Vice President.
Participants were employed by companies of all sizes (Figure 10), described in four
categories: ‘less than 200 employees’ (30%, 18/60); ‘201-2000 employees’ (25%, 15/60); ‘2001-
20,000 employees’ (10%, 6/60); and ‘more than 20,000 employees’ (35%, 21/60).
Cross tabulations were completed to confirm that results from the answers were not
dissimilar between the small and large companies (Appendix B; Table 36, Table 37, Table 38,
and Table 39).
Figure 10: Size of Participant Organization
Q - What is the size of your company? n=60
The numbers of trials in the respondents’ organizations ranged widely (Figure 11): ‘5 or
less’ (25%, 15/60); ‘6-20’ (20%, 12/60); ‘21-100’ (10%, 6/60); ‘more than 100’ (25%, 15/60).
Twelve percent were carrying out none (7/60) and 8% were not sure about the number of trials
(5/60).
Figure 11: Clinical Trials Conducted in a Year
Q - How many clinical trials does your company conduct in a year? n=60
The participants were also asked about the numbers of trials in which they had
participated over the last 5 years (Figure 12). About a third had been involved in ‘5-15’ (37%,
22/60), 22% in ‘5 or less’ (13/60) and 23% in ‘more than 15’ (14/60). Seventeen percent had no
experience in this timeframe (10/60) and one was ‘not sure’ (2%, 1/60).
Figure 12: Participant Involvement in Clinical Trials
Q - With how many clinical trials have you personally worked over the past 5 years? n=60
Most survey respondents had experience with one or more trial phases: ‘phase 1’ (78%,
47/60); ‘phase 2’ (83% 50/60); ‘phase 3’ (83%, 50/60); ‘phase 4’ (62%, 37/60). Five percent
(3/60) had no experience and 5% (3/60) were not sure (Figure 13).
Figure 13: Participant Experience with Phases of Clinical Trials
Q - With what phase(s) of clinical trials do you have experience? (Select all that apply). n=60
Participants varied in their levels of understanding when asked about six documents that
directly affected implementation of Part 11 (Figure 14). The document with which they were
most familiar was that relating to Good Clinical Practices (GCP). More than half stated that they
had ‘a lot of knowledge’ (53%, 32/60), 27% had ‘expert knowledge’ (16/60) and 18% had ‘some
knowledge’ (11/60). Only one participant (2%, 1/60) reported that he/she had no knowledge of
GCPs.
For 21CFR Part 11 Regulations, a quarter (25%, 15/60) stated they had ‘expert
knowledge’ and more than a third had ‘a lot of knowledge’ (37%, 22/60). A remaining third had
‘some knowledge’ (35%, 21/60). Only 2 had ‘no knowledge’ (3%, 2/60).
For the 2003 – Guidance for Industry Part 11, Electronic Records; Electronic Signatures
— Scope and Application, 18% (11/60) stated they had ‘expert knowledge’, and more than a
third stated that they had ‘a lot of knowledge’ (43%, 26/60) or ‘some knowledge’ (38%, 23/60).
For the 2007 - Guidance for Industry Computerized Systems Used in Clinical
Investigations, relatively few stated that they had ‘expert knowledge’ (13%, 8/60). More than a
third stated that they had ‘a lot of knowledge’ (37%, 22/60) or ‘some knowledge’ (38%, 23/60),
and a small number had ‘no knowledge’ (12%, 7/60).
For the 2017 - Use of Electronic Records and Electronic Signatures in Clinical
Investigations Under 21 CFR Part 11 – Questions and Answers, 18% (11/60) stated they had
‘expert knowledge’, about a third had ‘a lot of knowledge’ (35%, 21/60) and nearly half had
‘some knowledge’ (45%, 27/60); only one participant had ‘no knowledge’ (2%, 1/60).
For the 2022 - Digital Health Technologies for Remote Data Acquisition in Clinical
Investigations, 15% stated that they had ‘expert knowledge’ (9/60), 27% (16/60) had ‘a lot of
knowledge’, 38% had ‘some knowledge’ (23/60) and 20% had ‘no knowledge’ (12/60).
Figure 14: Familiarity with Regulations, Standards and Guidances
Q - How familiar are you with the following regulations, standards and guidances? n=60
GCP-Good Clinical Practice; CFR-Code of Federal Regulations; Q&A-Questions and Answers.
Unnumbered bars = 1 response. The X-axis is a cumulative total of the responses.
A quarter of the respondents (25%, 15/60) stated that they had ‘expert’ experience
with Part 11 compliance for clinical trials. Almost half had an ‘intermediate’ level of
experience (47%, 28/60) and another quarter had ‘beginner’ experience (23%, 14/60). Five
percent had ‘no experience’ (3/60) (Figure 15).
Figure 15: Participant Experience with Part 11 Compliance
Q - How would you characterize your level of experience with Part 11 compliance for clinical
trials? n=60
Participants varied in the challenges they faced when they tried to understand the role of
GCPs when implementing compliance with Part 11 for clinical trials (Figure 16). Incorporating
IT Standards such as NIST, ISO, CDISC was ‘most challenging’ for 2% (1/57) of the
respondents, ‘very challenging’ for 28% (16/57), ‘marginally challenging’ for 37% (21/57) and
‘not challenging’ for 12% (7/57). Twenty-one percent (12/57) selected ‘not sure’.
Having sufficient regulatory requirements/guidances was viewed by 21% of the
respondents as ‘very challenging’ (12/57), 45% (26/57) as ‘marginally challenging’ and 25%
(14/57) as ‘not challenging’. Nine percent selected ‘not sure’.
Understanding GCPs to implement 21 CFR Part 11 compliance was ‘most challenging’
for 2% (1/59) of the respondents, ‘very challenging’ for 12% (7/59), ‘marginally challenging’ for
58% (34/59) and ‘not challenging’ for 20% (12/59). Eight percent (5/59) selected ‘not sure’.
Twelve participants selected ‘other’ and five gave explanations (Table 5).
Figure 16: Challenges in Understanding the Role of GCPs
Q - What challenges did you face when trying to understand the role of Good Clinical Practices
(GCPs) in implementing compliance with Part 11 for clinical trials?
Response numbers varied between 57 and 59 for the three choices; other n=12
Table 5: Other Challenges, Please Specify:
n=5
Comments on Challenges with GCPs When Implementing Compliance with Part 11
I didnt really have to implement them
Stupidly aggressive QA departments
Not my role, but followed specific requirements from compliance dept
Ensuring the compliance of the underlying electronic systems
Interpretation, practical implementation
Participants were asked to share whether they or their companies had explored the use of
DHTs for their clinical trials by offering two choices specified as consumer wearables (CW) and
software platforms (SP) (Figure 17). More than half stated they had experience with SPs (65%,
39/60) and/or CWs (60%, 36/60). A fifth stated that they had no experience with either DHT
(20%, 12/60).
Figure 17: Explorations of Software Platforms Versus Consumer Wearables
Q - Have you or your company explored using any of the following digital health technologies
for your clinical trials: consumer wearables (CW) (e.g. smart watches, step trackers, vital sign
monitors) or software platforms (SP) to capture patient reported outcomes or experiences (e.g. to
monitor compliance with therapy, to record activities of daily living, to respond to surveys, to
report on clinical performance outcome measures)? (Select all that apply). n=60
Respondents with relevant experience were given the opportunity to answer questions on
each presented area. Over half preferred to start by answering questions about CWs (55%, 27/49)
and 37% about SPs (18/49) (Figure 18). Eight percent picked neither (4/49). For those
experienced with both, the option was given to respond to questions on the second type of DHT
after they had answered questions on the first.
Figure 18: Experience with Specific Digital Health Technologies
Q - This survey has 2 blocks of questions, one to explore your experience with consumer
wearables and the second to explore your experience with software platforms collecting patient
reported outcomes. Which digital health technology questions would you like to answer now?
n=49
Participants who responded that they would not like to provide answers for either of the
DHTs with which they had experience were asked three additional questions (Figure 19) before
they were advanced out of the survey.
First, those respondents were asked if their company had plans to use DHTs for their
CTs. One responded ‘probably yes’ (8%, 1/13), for drug diaries and PROs. Nearly half
responded that they were ‘not sure’ (46%, 6/13), a third that they ‘might or might not’ (31%,
4/13) and only a few that they would ‘probably not’ (15%, 2/13). No responses were picked for
the ‘definitely not’ or ‘definitely yes’ options.
Figure 19: Participant Plans to use Digital Health Technologies
Q - Does your company have plans to use any digital health technologies for their clinical trials?
n=13
Second, they were asked if they thought that their company understands the Part 11
requirements for clinical trials (Figure 20). About a third answered that they were ‘very
confident’ (31%, 4/13), 38% said they were ‘confident’ (5/13), 15% were ‘moderately confident’
(2/13) and 8% were ‘not very confident’ (1/13) or ‘not sure’ (1/13).
Figure 20: Understanding of Part 11 Requirements
Q - How well do you think that your company understands Part 11 requirements for clinical
trials? n=13
VC-Very Confident; C-Confident; MC-Moderately Confident; NVC-Not Very Confident.
Finally, they were asked to characterize the FDA guidances that support the Part 11
regulation in just one word (Figure 21). More than half responded with ‘reasonable’ (54%, 7/13),
and only a few with ‘insufficient’ (15%, 2/13). One viewed them as ‘confusing’ (8%, 1/13) and
one as ‘excessive’ (8%, 1/13). Two individuals chose ‘other’ (2/13) and provided a written
response of “NA” (Not applicable) and “Not sure”. These participants who declined to provide
answers for either of the DHTs with which they had experience were then advanced out of the
survey.
Figure 21: Characterization of FDA Guidances
Q - In one word, how would you characterize FDA guidances to support the Part 11 regulation?
n=13
4.3 Consumer Wearables (CW) – Survey Responses
Participants who agreed to provide their input on CWs were asked whether the wearable was
provisioned by the trial sponsor or whether the trial participants were allowed to ‘bring your own device’ (BYOD)
(Figure 22). Nearly two thirds responded that they were provisioned (63%, 15/24), a third that
both types were used (33%, 8/24) and one that it was BYOD (4%, 1/24).
Figure 22: Consumer Wearables - Provisioned or Bring Your Own Device
Q - Was the consumer wearable provided by the sponsor of the trial (i.e. provisioned) or did the
clinical trial participant bring their own device (i.e. BYOD)? n=24
The participants were then asked about the type of study for which the CW data would be
used (Figure 23). Most used or would use it in ‘Feasibility’ studies (75%, 18/24) and/or ‘Early
phase exploratory’ studies (71%, 17/24), fewer for ‘Late phase confirmatory’ (38%, 9/24) or
‘Phase IV’ studies (25%, 6/24) and one was ‘not sure’ (4%, 1/24).
Figure 23: Use of Data to Support Studies
Q - How was/will the data from the consumer wearable be used to support the study? (Select all
that apply). n=24
P/F-Pilot/Feasibility, EPE-Early Phase Exploratory, LPC-Late Phase Confirmatory.
Participants were asked about their role(s) in assuring Part 11 compliance (Figure 24).
About half were involved in ‘quality assurance’ (52%, 12/23), ‘communication to regulators’
(52%, 12/23) and/or ‘validation of the device’ (48%, 11/23); 43% were involved in ‘implementation of
the device’ (10/23). ‘Selection of the device’ and ‘training the trainer or other users’ were chosen
by 35% (8/23 for each choice) and ‘monitoring field performance’ by 22% (5/23). ‘Other’,
chosen by 9% (2/23), had responses of “regulatory” and “analysis of device data”.
Figure 24: Role Played in Assuring Part 11 Compliance
Q - What roles have you played in assuring Part 11 compliance for consumer wearables? (Select
all that apply). n=23
Participants were then questioned about several sources of information that might be used
as educational resources related to regulatory requirements for CWs (Figure 25).
For the choice of regulations and guidance documents, more than half responded as
‘used, very helpful’ (54%, 13/24), somewhat fewer, 42%, with ‘used, moderately helpful’
(10/24) and only one with ‘used, but not helpful’ (4%, 1/24).
For the choice of discussion/meeting with regulatory authorities, more than half stated
that it was ‘used, very helpful’ (54%, 13/24), a fourth responded with ‘used, moderately helpful’
(25%, 6/24) and five with ‘used, but not helpful’ (21%, 5/24).
For the choice of input from colleagues, nearly half responded either with ‘used, very
helpful’ (42%, 10/24) or ‘used, moderately helpful’ (46%, 11/24); 13% responded with ‘used,
but not helpful’ (3/24).
For the choice of manufacturer’s documentation, a quarter responded with ‘used, very
helpful’ (25%, 6/24), more than half with ‘used, moderately helpful’ (58%, 14/24) and 13% with
‘used, but not helpful’ (3/24). One chose ‘have not used’ (4%, 1/24).
For the choice of input from consultants, about a quarter found it ‘used, very helpful’
(25%, 6/24) and about a fifth, ‘used, moderately helpful’ (21%, 5/24). The rest, more than a half,
identified this as ‘used, but not helpful’ (29%, 7/24) or ‘have not used’ (25%, 6/24).
For the choice of landscape analysis, a few responded with ‘used, very helpful’ (17%,
4/23) and 43% with ‘used, moderately helpful’ (10/23); a few chose ‘used, but not helpful’ (17%,
4/23) and about a fifth, ‘have not used’ (22%, 5/23).
For the choice of trade journals, news articles and blogs, only one responded with ‘used,
very helpful’ (4%, 1/24), nearly three-quarters chose ‘used, moderately helpful’ (71%, 17/24), and a
few chose ‘used, but not helpful’ (13%, 3/24) or ‘have not used’ (13%, 3/24).
For seminars/conferences, one responded with ‘used, very helpful’ (4%, 1/24), about half
responded with ‘used, moderately helpful’ (46%, 11/24) and a fourth with ‘used, but not helpful’
(25%, 6/24); five chose ‘have not used’ (21%, 5/24).
For the choice of other, please specify, three stated ‘have not used’ (75%, 3/4) and one
participant stated ‘used, very helpful’ (25%, 1/4) with a text response, “summary basis of
approval”.
Figure 25: Sources Used to Educate About Consumer Wearables
Q - When you explored using consumer wearables in a clinical trial, what sources of information
did you use to educate yourself about regulatory requirements?
Response numbers varied between 23 and 24; other n=4
RA-Regulatory Authorities.
Participants responded to a request to rate six factors (Figure 26, Table 6) that contributed
to the adoption of the CW as ‘very important’ (VI), ‘somewhat important’ (SI), ‘not important’
(NI) or ‘not sure’ (NS). ‘Ease of use’ was rated highest according to its weighted average (WA)
of 2.78 (ratings VI: 18/24; SI: 5/24; NI: 0/24, NS: 1/24). ‘Corporate strategy’ had a slightly
lower WA of 2.65 (VI: 16/24; SI: 6/24; NI: 1/24; NS: 1/24). Ranked somewhat lower were
‘feedback from FDA’ and ‘time to validate’, with WAs of 2.59 (VI: 14/24; SI: 7/24, NI: 1/24;
NS: 2/24) and 2.57 (VI: 14/24, SI: 8/24, NI: 1/24, NS: 1/24) respectively; ‘flexibility in data
collection’ had a WA of 2.52 (VI: 13/24, SI: 9/24, NI: 1/24, NS: 1/24) and ‘capabilities to satisfy
part 11 and Good Clinical Practices (GCP)’ had a WA of 2.46 (VI: 13/24, SI: 9/24, NI: 2/24, NS:
0/24). ‘Other’, chosen by one participant (100%, 1/1), had a response of NS.
Figure 26: Importance of Factors When Adopting Consumer Wearables
Q - How important were the following factors when deciding to adopt consumer wearables in
your clinical trials? n= 24; other n=1
Table 6: Importance of Factors When Implementing Consumer Wearables
WA: Weighted Average, 3-Very important (VI), 2-Somewhat important (SI), 1-Not important
(NI), 0-Not sure (NS).
NS answers were removed from WA calculation.
Factor                                          Very important  Somewhat important  Not important  Not sure  WA
Ease of use                                     75%, 18         21%, 5              0%, 0          4%, 1     2.78
Corporate strategy                              67%, 16         25%, 6              4%, 1          4%, 1     2.65
Feedback from FDA                               58%, 14         29%, 7              4%, 1          8%, 2     2.59
Time to validate                                58%, 14         33%, 8              4%, 1          4%, 1     2.57
Flexibility in data collection                  54%, 13         38%, 9              4%, 1          4%, 1     2.52
Capabilities to satisfy Part 11 and
  Good Clinical Practices (GCP)                 54%, 13         38%, 9              8%, 2          0%, 0     2.46
Other, please specify (n=1)                     0%, 0           0%, 0               0%, 0          100%, 1   N/A
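The table legend's rule that 'not sure' answers were removed from the WA calculation can be made concrete with a short sketch (the function and variable names are my own; the 'Ease of use' row serves as the worked example):

```python
def wa_excluding_not_sure(vi, si, ni, ns):
    """WA with scores VI=3, SI=2, NI=1; NS responses are dropped
    from both the numerator and the denominator."""
    rated_respondents = vi + si + ni  # ns intentionally excluded
    return (3 * vi + 2 * si + 1 * ni) / rated_respondents

# 'Ease of use' row: 18 VI, 5 SI, 0 NI, 1 NS
print(round(wa_excluding_not_sure(18, 5, 0, 1), 2))  # 2.78
```

The 'Feedback from FDA' row (14, 7, 1, 2) likewise yields its tabulated WA of 2.59.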
Participants were asked if they had assessed whether the CW was “fit-for-purpose”
before the trial commenced (Figure 27). Most (92%) said ‘yes’ (22/24) and only two said ‘no’
(8%, 2/24).
Figure 27: Fit-for-Purpose Assessment
Q - Before beginning your trial, did your company ensure that the consumer wearable was “fit-for-purpose”, i.e. the level of validation was sufficient to support its use in the clinical
investigation? n=24
Participants were asked to rate the potential concerns of six factors that contributed to the
implementation of the CW (Figure 28, Table 7), on a scale of 1-6, in order of importance with
‘1’ being the most important and ‘6’ being the least important. ‘Unclear requirements for
verifications’ was rated highest according to its WA of 3.17 but ranked second in the number
of respondents placing it in the ‘1’ position (ratings 1: 5/24; 2: 8/24; 3: 3/24; 4: 4/24; 5: 2/24;
6: 2/24). In comparison, ‘high cost of testing’ was ranked most frequently in the ‘1’
position (1: 7/24; 2: 1/24; 3: 1/24; 4: 4/24; 5: 5/24; 6: 6/24) but had nearly the lowest WA, 2.29.
‘Incorrect use by participants’ had a WA of 2.54 (1: 4/24; 2: 7/24; 3: 1/24; 4: 3/24; 5: 4/24;
6: 5/24) and ‘cumbersome validations’ had a WA of 2.50 (1: 3/24; 2: 4/24; 3: 5/24; 4: 4/24; 5: 6/24;
6: 2/24). ‘Insufficient compliance experience’ had a WA of 2.29 (1: 5/24; 2: 0/24; 3: 7/24; 4:
3/24; 5: 3/24; 6: 6/24) and ‘cumbersome risk assessments’ had a WA of 2.21 (1: 0/24; 2: 4/24; 3:
7/24; 4: 6/24; 5: 4/24; 6: 3/24).
Figure 28: Potential Concerns in Implementation
Q - In your opinion, rate these potential concerns in order of importance when implementing the
consumer wearables? (Drag and Drop your response). n=24
Table 7: Potential Concerns in Implementation
WA: Weighted Average, 1-Most important … 6-Least important.

| Potential Concerns | 1 | 2 | 3 | 4 | 5 | 6 | WA |
| Unclear requirements for verifications | 21%, 5 | 33%, 8 | 13%, 3 | 17%, 4 | 8%, 2 | 8%, 2 | 3.17 |
| Incorrect use by participants | 17%, 4 | 29%, 7 | 4%, 1 | 13%, 3 | 17%, 4 | 21%, 5 | 2.54 |
| Cumbersome validations | 13%, 3 | 17%, 4 | 21%, 5 | 17%, 4 | 25%, 6 | 8%, 2 | 2.50 |
| High cost of testing | 29%, 7 | 4%, 1 | 4%, 1 | 17%, 4 | 21%, 5 | 25%, 6 | 2.29 |
| Insufficient compliance experience | 21%, 5 | 0%, 0 | 29%, 7 | 13%, 3 | 13%, 3 | 25%, 6 | 2.29 |
| Cumbersome risk assessments | 0%, 0 | 17%, 4 | 29%, 7 | 25%, 6 | 17%, 4 | 13%, 3 | 2.21 |

Two additional comments were received (Table 8).
Table 8: Additional Comments on Implementing Consumer Wearables
Additional Comments on Implementing Consumer Wearables
Ease of use for the wearable and patient compliance. Access to raw data from manufacturers.
Stability of the platform (ie will there be hardware or software updates happening mid-study
that may impact the data usability). One note, many of our wearables are not consumer devices
but operate in a similar manner (eg actigraphy). Just a small difference in that we can purchase
for use in our clinical trials but the tools aren't available off the shelf for consumers.
Separate compliance from acceptance. You may be perfectly compliant with a device, but that
means nothing in terms of anyone accepting the evidence. These are two very different things.
Part 11 reqs are poorly understood and applied to this kind of devices, with definitions needed
for data processing, what is a clinical trial system, what is source data etc... most people are
not even thinking about these. [sic]
Participants were asked if the CW was ultimately used for data collection (Figure 29).
More than half responded ‘yes’ (58%, 14/24), about a third that it was ‘still under consideration’
(29%, 7/24), and two that they were ‘not sure’ (8%, 2/24). One participant said ‘no’ and
clarified the response by saying:
FDA did not accept the data generated for a similar consumer wearable, and
we were concerned that the data we would generate would not accurately
represent the endpoint that we were trying to measure. We were concerned
about validation and the risk of introducing a consumer wearable for the first
time in a potential pivotal clinical trial.
Figure 29: Final Incorporation of Consumer Wearables
Q - Did you ultimately incorporate consumer wearables for data collection? n=24
Participants were asked to rate five challenges (Figure 30, Table 9) that they may have
faced when implementing the use of CW according to whether they were seen to have ‘a lot of
influence’ (ALIf), ‘some influence’ (SIf), ‘no influence’(NIf) or ‘cannot answer’(CA). The
response that was rated as having ‘a lot of influence’ most often was an ‘insufficient
understanding of validation methods’ (ALIf: 9/23; SIf: 7/23; NIf: 7/23, CA: 0/23). However, it
had a somewhat lower WA of 2.09 than ‘insufficient budget allocation’, which had the highest
WA of 2.18 but fewer responses in the number 1 position (ALIf: 7/23; SIf: 12/23; NIf: 3/23; CA:
1/23). It was also somewhat lower than ‘insufficient clarity on regulatory requirements’, which
had a WA of 2.13 (ALIf: 7/23; SIf: 12/23; NIf: 4/23; CA: 0/23). The lowest weighted averages
were associated with ‘insufficient technical expertise to perform validations’, with a WA of 2.05
(ALIf: 7/23; SIf: 7/23; NIf: 6/23; CA: /23) and ‘lack of leadership support’, with a WA of 1.76
(ALIf: 4/23; SIf: 8/23; NIf: 9/23; CA: 2/23). Two individuals chose ‘other’ and provided a
response of ‘NIf’ (1/2) and ‘CA’ (1/2), and one expanded on this choice by saying, “there are
almost no pre-qualified digital endpoints…”.
Figure 30: Challenges Affecting Implementation
Q - What challenges affected your use of consumer wearables for your clinical trial?
n=23, other n=2
Table 9: Challenges Affecting Implementation
WA-Weighted Average, 3-A lot of influence (ALIf), 2-Some influence (SIf), 1-No influence (NIf),
0-Cannot answer (CA).

| Challenges | A lot of influence | Some influence | No influence | Cannot answer | WA |
| Insufficient budget allocation | 30%, 7 | 52%, 12 | 13%, 3 | 4%, 1 | 2.18 |
| Insufficient clarity on regulatory requirements | 30%, 7 | 52%, 12 | 17%, 4 | 0%, 0 | 2.13 |
| Insufficient understanding of validation methods | 39%, 9 | 30%, 7 | 30%, 7 | 0%, 0 | 2.09 |
| Insufficient technical expertise to perform validations | 30%, 7 | 30%, 7 | 26%, 6 | 13%, 3 | 2.05 |
| Lack of leadership support | 17%, 4 | 35%, 8 | 39%, 9 | 9%, 2 | 1.76 |
| Other, please specify (n=2) | 0%, 0 | 0%, 0 | 50%, 1 | 50%, 1 | 1.00 |

Participants were asked if the preparations to use CW had delayed the start date of the
clinical trial (Figure 31). About two-thirds responded ‘no’ (68%, 15/22), two said ‘yes’ (9%, 2/22)
and five were ‘not sure’ (23%, 5/22). The two participants who chose ‘yes’ estimated the delays
to be “1” and “18 months”.

Figure 31: Delays in Start of Clinical Trials
Q - Did the preparations to use the consumer wearables delay the start date of the clinical trial?
n=22

Participants were asked to rate five factors that may have contributed to delays in the use
of CW (Figure 32, Table 10) as ‘most impactful’ (MoIp), ‘impactful’ (Ip), ‘marginally impactful’
(MaIp), ‘not impactful’ (NIp) or ‘not sure’ (NS). ‘Time required to complete validations’ had the
highest WA of 2.57 (ratings MoIp: 0/7, Ip: 4/7, MaIp: 3/7, NIp: 0/7, NS: 0/7); WAs of 2.43 were
associated with both ‘time to train personnel’ (MoIp: 1/7, Ip: 1/7, MaIp: 5/7, NIp: 0/7, NS: 0/7)
and ‘time required to complete risk assessments’ (MoIp: 0/7, Ip: 4/7, MaIp: 2/7, NIp: 1/7, NS:
0/7). WAs of 2.0 were associated with ‘time required for communication discussions with the
FDA’ (MoIp: 0/7, Ip: 2/7, MaIp: 3/7, NIp: 2/7, NS: 0/7) and ‘time to train subjects’ (MoIp: 0/7,
Ip: 1/7, MaIp: 4/7, NIp: 1/7, NS: 1/7).
Figure 32: Factors Resulting in Delays
Q - How impactful were the following factors in delaying the use of consumer wearables in your
company’s clinical trial? n=7
Table 10: Factors Resulting in Delays
WA-Weighted Average, 4-Most impactful (MoIp), 3-Impactful (Ip), 2-Marginally impactful
(MaIp), 1-Not impactful (NIp), 0-Not sure (NS).

| Delays | Most impactful | Impactful | Marginally impactful | Not impactful | Not sure | WA |
| Time required to complete validations | 0%, 0 | 57%, 4 | 43%, 3 | 0%, 0 | 0%, 0 | 2.57 |
| Time to train personnel | 14%, 1 | 14%, 1 | 71%, 5 | 0%, 0 | 0%, 0 | 2.43 |
| Time required to complete risk assessments | 0%, 0 | 57%, 4 | 29%, 2 | 14%, 1 | 0%, 0 | 2.43 |
| Time required for communication discussions with the FDA | 0%, 0 | 29%, 2 | 43%, 3 | 29%, 2 | 0%, 0 | 2.00 |
| Time to train subjects | 0%, 0 | 14%, 1 | 57%, 4 | 14%, 1 | 14%, 1 | 2.00 |
| Other, please specify | 0%, 0 | 0%, 0 | 0%, 0 | 0%, 0 | 0%, 0 | N/A |
Participants were asked to rate the six challenges (Figure 33, Table 11) that they may
have faced when completing validations for CW as ‘most challenging’ (MoC), ‘very
challenging’ (VC), ‘moderately challenging’ (MdC), ‘not challenging’ (NC) and ‘not sure’ (NS).
‘Meeting timelines’ had the highest WA of 2.44 (ratings MoC: 0/21, VC: 10/21, MdC: 6/21, NC:
2/21, NS: 3/21). ‘Ensuring knowledge to perform validations’ had WA of 2.26 (MoC: 0/21, VC:
8/21, MdC: 8/21, NC: 3/21, NS: 2/21); ‘securing financial resources’ had WA of 2.20 (MoC:
1/21, VC: 4/21, MdC: 7/21, NC: 3/21, NS: 6/21); ‘documenting plans and procedures’ had WA
of 2.16 (MoC: 0/21, VC: 6/21, MdC: 10/21, NC: 3/21, NS: 2/21); ‘assuring manpower to
perform validations’ had WA of 2.00 (MoC: 0/21, VC: 5/21, MdC: 9/21, NC: 5/21, NS: 2/21)
and ‘understanding good clinical practices (GCPs) related to validations’ had WA of 1.74 (MoC:
0/21, VC: 2/21, MdC: 10/21, NC: 7/21, NS: 2/21).
Figure 33: Challenges with Completing Validations
Q- How challenging were the following in completing validations for consumer wearables
(CW)? n=21
Table 11: Challenges with Completing Validations
WA-Weighted Average, 4-Most challenging (MoC), 3-Very challenging (VC), 2-Moderately
challenging (MdC), 1-Not challenging (NC), 0-Not sure (NS).

| Validations | Most challenging | Very challenging | Moderately challenging | Not challenging | Not sure | WA |
| Meeting timelines | 0%, 0 | 48%, 10 | 29%, 6 | 10%, 2 | 14%, 3 | 2.44 |
| Ensuring knowledge to perform validations | 0%, 0 | 38%, 8 | 38%, 8 | 14%, 3 | 10%, 2 | 2.26 |
| Securing financial resources | 5%, 1 | 19%, 4 | 33%, 7 | 14%, 3 | 29%, 6 | 2.20 |
| Documenting plans and procedures | 0%, 0 | 29%, 6 | 48%, 10 | 14%, 3 | 10%, 2 | 2.16 |
| Assuring manpower to perform validations | 0%, 0 | 24%, 5 | 43%, 9 | 24%, 5 | 10%, 2 | 2.00 |
| Understanding Good Clinical Practices (GCPs) related to validations | 0%, 0 | 10%, 2 | 48%, 10 | 33%, 7 | 10%, 2 | 1.74 |

Participants were asked if they had set up meetings with the FDA to discuss compliance
requirements (Figure 34). Most commonly, responses were ‘yes, for early-stage exploratory data’
(62%, 13/21) and/or ‘yes, for endpoint data collection’ (48%, 10/21). Less than a fourth of the
participants chose ‘no’ (19%, 4/21) or ‘not sure’ (10%, 2/21).

Figure 34: FDA Compliance Discussions for Consumer Wearables
Q - As you prepared to use consumer wearables in your study protocol, did you set up meetings
with the FDA to discuss the necessary compliance requirements? (Select all that apply). n=21

Participants who said yes were asked three additional questions. First, they were asked
how long it took to set up and complete the discussions with the FDA (Figure 35). One
respondent described the turnaround time for these discussions as ‘almost immediate’ (7%,
1/15), more than half as ‘up to 6 months’ (60%, 9/15), two as ‘6-12 months’ (13%, 2/15) and one
as ‘greater than 12 months’ (7%, 1/15). Two were ‘not sure’ (13%, 2/15).
Figure 35: Time Needed for FDA Compliance Discussions
Q - How long did it take to set up and complete the discussions with the FDA before you
implemented the consumer wearable? n=15
Participants were then asked if they had sufficient time to address the FDA’s feedback
before the trial started (Figure 36). More than half said ‘yes’ (67%, 10/15), one said ‘no’ (7%,
1/15), two were ‘not sure’ (13%, 2/15) and two chose ‘not applicable’ (13%, 2/15) along with
written responses of “Did not seek feedback on Part 11 compliance as part of HA discussions”
and “We didn't discuss these questions”.
Figure 36: Time to address FDA’s Feedback
Q - Did you have enough time to address the FDA’s feedback regarding the requirements for
Part 11 compliance before your trial started? n=15
The participants were asked to rate the feedback received from the FDA by responding to
four statements (Figure 37, Table 12) with options of ‘disagree’ (D), ‘neither disagree nor agree’
(NDA), ‘agree’ (A) and ‘not sure’ (NS). Almost all agreed that the feedback ‘yielded a clear
understanding of potential impact on data collection’ (A: 87%, 13/15; D: 7%, 1/15; NDA:
7%, 1/15; NS: 0%, 0/15) and that it ‘provided opportunity for follow up/discussion’ (A:
80%, 12/15; D: 7%, 1/15; NDA: 13%, 2/15; NS: 0%, 0/15). Responses were mixed for the
choices of ‘provided sufficient information for decision-making’ (A: 67%, 10/15; D: 0%, 0/15;
NDA: 33%, 5/15; NS: 0%, 0/15) and ‘provided sufficient advice about data needed for a
marketing application’ (A: 33%, 5/15; D: 7%, 1/15; NDA: 53%, 8/15; NS: 7%, 1/15).
Figure 37: Feedback from FDA discussions
Q - When thinking about your discussions with FDA, do you agree or disagree with the
following statements regarding those discussions? n=15
Table 12: Usefulness of Feedback from FDA Discussions
Agree (A), Neither disagree nor agree (NDA), Disagree (D), Not sure (NS).
| FDA Discussions | Disagree | Neither disagree nor agree | Agree | Not sure |
| Yielded a clear understanding of potential impact on data collection | 7%, 1 | 7%, 1 | 87%, 13 | 0%, 0 |
| Provided opportunity for follow up/discussion | 7%, 1 | 13%, 2 | 80%, 12 | 0%, 0 |
| Provided sufficient information for decision-making | 0%, 0 | 33%, 5 | 67%, 10 | 0%, 0 |
| Provided sufficient advice about data needed for a marketing application | 7%, 1 | 53%, 8 | 33%, 5 | 7%, 1 |
Participants were asked if they performed risk assessments (Figure 38). More than half
said ‘yes’ (62%, 13/21), two said ‘no’ (10%, 2/21), and two said they ‘did not need risk
assessments as the device/sensor was validated previously for this purpose’ (10%, 2/21).
Nineteen percent were ‘not sure’ (4/21).
Figure 38: Performed Risk Assessments for Consumer Wearables
Q - Have you carried out risk assessments related to the consumer wearables? n=21
RA-Risk Assessment.
Participants answering ‘yes’ to the previous question were asked to explain any
additional challenges they faced (Table 13).
Table 13: Risk Assessment Challenges
Q - What challenges did you face when performing risk assessments? n=4
Risk Assessment Challenges
Which risks were being mitigated and how to accept the ones that were not for exploratory
purposes.
What do we need to really consider for a go/no go decision.
Lack of expertise.
Differentiating intended use for the trial in question vs the ultimate context of use for the
wearable. Risk assessment changes based on immediate use and ultimate use and can change
the potential regulatory oversight.
Understanding the User Groups - Usability Risks
Participants were asked to rate eight challenges (Figure 39, Table 14) they may have
faced when first using CW as ‘very challenging’ (VC), ‘moderately challenging’ (MdC), ‘not
challenging’ (NC) or ‘not sure’ (NS). The challenge of ‘trial participants made errors’ had the
highest WA of 2.07 (VC: 3/18, MdC:
10/18, NC: 2/18, NS: 3/18). ‘Participants were not compliant’ had WA of 2.00 (VC: 3/18, MdC:
8/18, NC: 3/18, NS: 4/18); ‘device did not work as anticipated’ had WA of 1.93 (VC: 2/18,
MdC: 10/18, NC: 3/18, NS: 3/18); ‘operating system upgrades caused software incompatibility’
had WA of 1.85 (VC: 2/18, MdC: 7/18, NC: 4/18, NS: 5/18); ‘devices failed’ had WA of 1.80
(VC: 3/18, MdC: 6/18, NC: 6/18, NS: 3/18); ‘device updates changed data or algorithms’ had
similar WA of 1.80 (VC: 3/18, MdC: 6/18, NC: 6/18, NS: 3/18); ‘data transmission was
unreliable’ had WA of 1.71 (VC: 2/17, MdC: 6/17, NC: 6/17, NS: 3/17) and ‘technical support
was insufficient’ had WA of 1.43 (VC: 0/18, MdC: 6/18, NC: 8/18, NS: 4/18).
Figure 39: Challenges for Consumer Wearables
Q - When you first used the consumer wearables for the clinical trial, what were the challenges?
Response numbers varied between 17 and 18.
Table 14: Challenges for Consumer Wearables
WA-Weighted Average, 3-Very challenging (VC), 2-Moderately challenging (MdC), 1-Not
challenging (NC), 0-Not sure (NS).

| Challenges | Very challenging | Moderately challenging | Not challenging | Not sure | WA |
| Trial participants made errors | 17%, 3 | 56%, 10 | 11%, 2 | 17%, 3 | 2.07 |
| Participants were not compliant | 17%, 3 | 44%, 8 | 17%, 3 | 22%, 4 | 2.00 |
| Device did not work as anticipated | 11%, 2 | 56%, 10 | 17%, 3 | 17%, 3 | 1.93 |
| Operating system upgrades caused software incompatibility | 11%, 2 | 39%, 7 | 22%, 4 | 28%, 5 | 1.85 |
| Devices failed | 17%, 3 | 33%, 6 | 33%, 6 | 17%, 3 | 1.80 |
| Device updates changed data or algorithms | 17%, 3 | 33%, 6 | 33%, 6 | 17%, 3 | 1.80 |
| Data transmission was unreliable (n=17) | 12%, 2 | 35%, 6 | 35%, 6 | 18%, 3 | 1.71 |
| Technical support was insufficient | 0%, 0 | 33%, 6 | 44%, 8 | 22%, 4 | 1.43 |

Participants were asked to rate five factors (Figure 40, Table 15) related to the ongoing
compliance of the CW as ‘extremely important’ (EI), ‘very important’ (VI), ‘somewhat
important’ (SI), ‘not important’ (NI) or ‘not sure’ (NS). ‘Organizational readiness (i.e.,
update procedures, processes, trainings, etc.)’ had the highest WA of 3.00 (ratings EI: 5/18, VI:
8/18, SI: 3/18, NI: 1/18, NS: 1/18). ‘Adequate time’ (EI: 4/18, VI: 5/18, SI: 7/18, NI: 1/18, NS:
1/18) and ‘adequate finances’ (EI: 3/18, VI: 7/18, SI: 6/18, NI: 1/18, NS: 1/18) had identical WAs
of 2.71. ‘Reassessments when regulatory requirements/guidances changed’ had a WA of 2.65
(EI: 4/18, VI: 6/18, SI: 4/18, NI: 3/18, NS: 1/18) and ‘adequate manpower’ had a WA of 2.47 (EI:
1/18, VI: 7/18, SI: 8/18, NI: 1/18, NS: 1/18).
Figure 40: Ongoing Compliance for Consumer Wearables
Q - How important were the following factors in assuring ongoing Part 11 compliance for
consumer wearables, as the clinical trial progressed? n=18
Table 15: Ongoing Compliance for Consumer Wearables
WA-Weighted Average, 4-Extremely important (EI), 3-Very important (VI), 2-Somewhat
important (SI), 1-Not important (NI), 0-Not sure (NS).

| Ongoing Compliance | Extremely important | Very important | Somewhat important | Not important | Not sure | WA |
| Organizational readiness (i.e., update procedures, processes, trainings, etc.) | 28%, 5 | 44%, 8 | 17%, 3 | 6%, 1 | 6%, 1 | 3.00 |
| Adequate time | 22%, 4 | 28%, 5 | 39%, 7 | 6%, 1 | 6%, 1 | 2.71 |
| Adequate finances | 17%, 3 | 39%, 7 | 33%, 6 | 6%, 1 | 6%, 1 | 2.71 |
| Reassessments when regulatory requirements/guidances changed | 22%, 4 | 33%, 6 | 22%, 4 | 17%, 3 | 6%, 1 | 2.65 |
| Adequate manpower | 6%, 1 | 39%, 7 | 44%, 8 | 6%, 1 | 6%, 1 | 2.47 |

Participants were asked to rate eight factors (Figure 41, Table 16) they may have faced
when trying to maintain compliance for CW as ‘very confident’ (VC), ‘confident’ (C), ‘not very
confident’ (NVC) or ‘not sure’ (NS). ‘Managing identified risks’ had the highest WA of 2.35
(ratings VC: 7/19, C: 9/19, NVC: 1/19, NS: 2/19). ‘Conducting risk assessments’ and ‘updating
trial protocols and procedures’ had identical WAs of 2.29 and identical rating distributions (VC:
7/19, C: 8/19, NVC: 2/19, NS: 2/19). ‘Performing reverifications and revalidations’ had WA of
2.20 (VC: 5/19, C: 8/19, NVC: 2/19, NS: 4/19); ‘training new staff’ had WA of 2.00 (VC: 4/19,
C: 9/19, NVC: 4/19, NS: 2/19); ‘managing software updates’ had WA of 1.88 (VC: 3/19, C:
8/19, NVC: 5/19, NS: 3/19); ‘allocating sufficient time’ had WA of 1.75 (VC: 3/19, C: 6/19,
NVC: 7/19, NS: 3/19) and ‘adding manpower’ had WA of 1.73 (VC: 2/19, C: 7/19, NVC: 6/19,
NS: 4/19).
Figure 41: Maintenance of Compliance for Consumer Wearables
Q - How confident were you that the following activities were successful in maintaining
compliance with Part 11? n=19
Table 16: Maintenance of Compliance for Consumer Wearables
WA: Weighted Average, 3-Very confident (VC), 2-Confident (C), 1-Not very confident (NVC),
0-Not sure (NS).

| Maintenance of Compliance | Very confident | Confident | Not very confident | Not sure | WA |
| Managing identified risks | 37%, 7 | 47%, 9 | 5%, 1 | 11%, 2 | 2.35 |
| Conducting risk assessments | 37%, 7 | 42%, 8 | 11%, 2 | 11%, 2 | 2.29 |
| Updating trial protocols and procedures | 37%, 7 | 42%, 8 | 11%, 2 | 11%, 2 | 2.29 |
| Performing reverifications and revalidations | 26%, 5 | 42%, 8 | 11%, 2 | 21%, 4 | 2.20 |
| Training new staff | 21%, 4 | 47%, 9 | 21%, 4 | 11%, 2 | 2.00 |
| Managing software updates | 16%, 3 | 42%, 8 | 26%, 5 | 16%, 3 | 1.88 |
| Allocating sufficient time | 16%, 3 | 32%, 6 | 37%, 7 | 16%, 3 | 1.75 |
| Adding manpower | 11%, 2 | 37%, 7 | 32%, 6 | 21%, 4 | 1.73 |

4.4 Software Platforms (SP) – Survey Responses
Participants who agreed to provide their input on software platforms (SP) were asked if
the SP would be used to support patient reported outcomes (PROs) (Figure 42). Respondents
stated that the data would be used ‘to report on clinical or performance outcome measures’ (72%,
21/29), ‘to record activities of daily living’ (69%, 20/29), ‘to respond to surveys’ (62%, 18/29),
or ‘to monitor compliance with therapy’ (55%, 16/29). One chose ‘other’ and wrote, “my
companies have supplied them to research companies”. The percentages totaled over 100%
because participants were asked to select all answers that applied.

Figure 42: Use of Data to Support Patient Reported Outcomes
Q - For what purpose(s) has your company considered software platforms for patient reported
outcomes? (Select all that apply). n=29
Participants were asked about their role(s) in assuring Part 11 compliance (Figure 43).
Nearly half identified that they worked in ‘quality assurance’ (46%, 13/28), ‘communication to
regulators’ (43%, 12/28), ‘implementation of the device’ (43%, 12/28), ‘development of device’
(39%, 11/28) and ‘validation of the device’ (39%, 11/28). Identified less frequently were
‘selection of the device’ (32%, 9/28), ‘monitoring field performance’ (29%, 8/28) and ‘training
the trainer or other users’ (25%, 7/28). ‘Other’, chosen by 11% (3/28), had responses of “none –
I’m in sales”, “none directly, only inspection readiness in terms of required documents. Other
responsibilities are outsourced to CRO” and “privacy and security”.
Figure 43: Role Played in Assuring Part 11 Compliance
Q - What roles have you played in assuring Part 11 compliance for software platforms? (Select
all that apply). n=28
Participants were then offered several sources of information that might be used as
educational resources related to regulatory requirements for SPs (Figure 44).
For the choice of regulations and guidance documents, more than half responded with
‘used, very helpful’ (61%, 17/28), 10 with ‘used, moderately helpful’ (36%, 10/28) and one with
‘used, but not helpful’ (4%, 1/28).
For the choice of input from colleagues, nearly half responded with ‘used, very helpful’
(44%, 12/27) or ‘used, moderately helpful’ (52%, 14/27); one chose ‘used but not helpful’ (4%,
1/27).
For the choice of manufacturer’s documentation, less than half responded with ‘used,
very helpful’ (37%, 10/27) or ‘used, moderately helpful’ (41%, 11/27). Two responded with
‘used, but not helpful’ (7%, 2/27) and four with ‘have not used’ (15%, 4/27).
For the choice of input from consultants, less than half responded with ‘used, very
helpful’ (33%, 9/27) or ‘used, moderately helpful’ (30%, 8/27). Four participants chose ‘used,
but not helpful’ (15%, 4/27) and six chose ‘have not used’ (22%, 6/27).
For the choice of discussion/meeting with regulatory authorities, a third responded with
‘used, very helpful’ (33%, 9/27) and a fifth with ‘used, moderately helpful’ (22%, 6/27). One
chose ‘used, but not helpful’ (4%, 1/27) and eleven chose ‘have not used’ (41%, 11/27).
For the choice of trade journals, news articles and blogs, two responded with ‘used, very
helpful’ (7%, 2/27) and more than half with ‘used, moderately helpful’ (63%, 17/27). However,
about a fifth chose ‘used but not helpful’ (19%, 5/27) and three had ‘not used’ these sources
(11%, 3/27).
For the choice of seminars/conferences, only two responded with ‘used, very helpful’
(7%, 2/27), about half with ‘used, moderately helpful’ (52%, 14/27), 15% with ‘used, but not
helpful’ (15%, 4/27) and a quarter with ‘have not used’ (26%, 7/27).
For the choice of landscape analysis, about half responded with ‘used, moderately
helpful’ (48%, 13/27), a few with ‘used, but not helpful’ (11%, 3/27), and many ‘have not used’
this tool (41%, 11/27); none chose ‘used, very helpful’.
For the choice of other, please specify, one participant chose ‘used, very helpful’ (25%,
1/4) with a text response, “regulator’s input”; three chose ‘have not used’ (75%, 3/4).
Figure 44: Sources Used to Educate About Software Platforms
Q - When you explored using software platforms in a clinical trial, what sources of information
did you use to educate yourself about regulatory requirements?
Response numbers varied between 27 or 28, other n=4
RA-Regulatory Authorities.
Participants were asked to rate six factors (Figure 45, Table 17) that contributed to the
adoption of the SP as ‘very important’ (VI) ‘somewhat important’ (SI), ‘not important’ (NI) or
‘not sure’ (NS). ‘Capabilities to satisfy Part 11 and Good Clinical Practices (GCP)’ was rated
highest according to its WA of 2.92 (ratings VI: 23/28; SI: 2/28; NI: 0/28, NS: 3/28). ‘Ease of
use’ had a slightly lower WA of 2.85 with the same number of ‘very important’ ratings (VI:
23/28; SI: 4/28; NI: 0/28; NS: 1/28). ‘Flexibility in data collection’ had a WA of 2.56 (VI: 16/28,
SI: 10/28, NI: 1/28, NS: 1/28) and ‘feedback from FDA’ had a similar WA of 2.52 (VI: 12/27;
SI: 8/27, NI: 1/27; NS: 6/27). Both ‘time to validate’ (VI: 13/28, SI: 14/28, NI: 0/28, NS: 1/28)
and ‘corporate strategy’ (VI: 13/28, SI: 11/28, NI: 1/28, NS: 3/28) had the same WAs of 2.48.
‘Other’ was chosen by five participants (VI: 1/5, SI: 1/5, NS: 3/5), with “cost” and “ensures
privacy and security of data” as additional responses.
Figure 45: Importance of Factors When Implementing Software Platforms
Q - How important were the following factors when deciding to adopt software platforms in your
clinical trials? Response numbers varied between 27 and 28; other n=5
Table 17: Importance of Factors When Implementing Software Platforms
WA: Weighted Average, 3-Very important (VI), 2-Somewhat important (SI), 1-Not important
(NI), 0-Not sure (NS).
NS values were removed from WA calculation.
| Factor | Very important | Somewhat important | Not important | Not sure | WA |
| Capabilities to satisfy Part 11 and Good Clinical Practices (GCP) | 82%, 23 | 7%, 2 | 0%, 0 | 11%, 3 | 2.92 |
| Ease of use | 82%, 23 | 14%, 4 | 0%, 0 | 4%, 1 | 2.85 |
| Flexibility in data collection | 57%, 16 | 36%, 10 | 4%, 1 | 4%, 1 | 2.56 |
| Feedback from FDA (n=27) | 44%, 12 | 30%, 8 | 4%, 1 | 22%, 6 | 2.52 |
| Time to validate | 46%, 13 | 50%, 14 | 0%, 0 | 4%, 1 | 2.48 |
| Corporate strategy | 46%, 13 | 39%, 11 | 4%, 1 | 11%, 3 | 2.48 |
| Other, please specify (n=5) | 20%, 1 | 20%, 1 | 0%, 0 | 60%, 3 | 2.50 |
Participants were asked if they had assessed whether the SP was “fit-for-purpose” before
the trial commenced (Figure 46). Most (82%) said ‘yes’ (23/28) and five said ‘not sure’ (18%,
5/28).
Figure 46: Fit-for-purpose Assessment
Q - Before beginning your trial, did your company ensure that the software platform was “fit-for-purpose”, i.e. the level of validation was sufficient to support its use in the clinical investigation?
(n=28)
Participants were asked to rate the degree of concern associated with six possible
challenges that might hamper the implementation of the SP (Figure 47, Table 18), on a scale of
1-6, with ‘1’ being the most important and ‘6’ being the least important. ‘Incorrect use by
participants’ was rated highest according to its weighted average (WA) of 3.19 and was most
commonly selected as most important as well (1: 10/27; 2: 4/27; 3: 2/27, 4: 5/27, 5: 4/27 6:
2/27). ‘Unclear requirements for verifications’ had WA of 2.74 (1: 2/27; 2: 10/27; 3: 3/27, 4:
5/27, 5: 5/27 6: 2/27); ‘insufficient compliance experience’ had WA of 2.52 (1: 7/27, 2: 4/27, 3:
4/27, 4: 1/27, 5: 3/27 6: 8/27); ‘cumbersome risk assessments’ had WA of 2.44 (1: 1/27, 2: 4/27,
3: 8/27, 4: 8/27, 5: 5/27, 6: 1/27); ‘cumbersome validations’ had WA of 2.33 (1: 3/27, 2: 3/27, 3:
6/27, 4: 6/27, 5: 6/27, 6: 3/27) and ‘high cost of testing’ had WA of 1.78 (1: 4/27, 2: 2/27, 3:
4/27, 4: 2/27, 5: 4/27, 6: 11/27).
Figure 47: Potential Concerns in Implementation
Q - In your opinion, rate these potential concerns in order of importance when implementing the
software platforms? (Drag and Drop your response). n=27
Table 18: Potential Concerns in Implementation
WA: Weighted Average, 1-Most important … 6-Least important.
One additional comment was provided specifying a concern with cost-benefit assessments: “The
benefit of using software systems, including wearables, into clinical trials needs to be significant
in order to justify the increased cost for implementation”.
Participants were asked if the SP was ultimately used for data collection (Figure 48).
More than half responded ‘yes’ (68%, 19/28), a fourth identified that it was ‘still under
consideration’ (25%, 7/28), and two were ‘not sure’ (7%, 2/28).
| Potential Concerns | 1 | 2 | 3 | 4 | 5 | 6 | WA |
| Incorrect use by participants | 37%, 10 | 15%, 4 | 7%, 2 | 19%, 5 | 15%, 4 | 7%, 2 | 3.19 |
| Unclear requirements for verifications | 7%, 2 | 37%, 10 | 11%, 3 | 19%, 5 | 19%, 5 | 7%, 2 | 2.74 |
| Insufficient compliance experience | 26%, 7 | 15%, 4 | 15%, 4 | 4%, 1 | 11%, 3 | 30%, 8 | 2.52 |
| Cumbersome risk assessments | 4%, 1 | 15%, 4 | 30%, 8 | 30%, 8 | 19%, 5 | 4%, 1 | 2.44 |
| Cumbersome validations | 11%, 3 | 11%, 3 | 22%, 6 | 22%, 6 | 22%, 6 | 11%, 3 | 2.33 |
| High cost of testing | 15%, 4 | 7%, 2 | 15%, 4 | 7%, 2 | 15%, 4 | 41%, 11 | 1.78 |
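The WAs for the drag-and-drop ranking questions (Tables 7 and 18) follow a different convention from the Likert-style items: the reported values are consistent with scoring each response as (6 − rank), so a rank of ‘1’ earns 5 points and a rank of ‘6’ earns 0. A short sketch in Python (the function name is an illustrative assumption):

```python
def rank_weighted_average(rank_counts):
    """WA for a six-option ranking question, ranks 1 (most important)
    through 6 (least important); each response scores (6 - rank)."""
    n = sum(rank_counts)
    total = sum((6 - rank) * c for rank, c in enumerate(rank_counts, start=1))
    return round(total / n, 2)

# 'Incorrect use by participants' row of Table 18: counts for ranks 1..6
print(rank_weighted_average([10, 4, 2, 5, 4, 2]))  # 3.19
```

The same scoring reproduces Table 7, e.g. ‘unclear requirements for verifications’ ([5, 8, 3, 4, 2, 2]) yields 3.17.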
Figure 48: Final Incorporation of Software Platforms
Q - Did you ultimately incorporate software platforms for patient reported outcomes in your
clinical trials? n=28
Participants were asked to rate five challenges (Figure 49, Table 19) that they may have
faced when implementing the use of SP as having ‘a lot of influence’ (ALIf), ‘some influence’
(SIf), ‘no influence’ (NIf) or ‘cannot answer’ (CA). The response that was rated as having ‘a lot
of influence’ most often was ‘insufficient technical expertise to perform validations’, which also
had the highest WA of 2.11 (ratings ALIf: 10/29, SIf: 10/29, NIf: 7/29, CA: 2/29).
‘Insufficient budget allocation’ had a WA of 2.04 (ALIf: 8/29, SIf: 9/29, NIf: 7/29, CA: 5/29);
‘insufficient clarity on regulatory requirements’ had WA of 2.00 (ALIf: 8/29, SIf: 10/29, NIf:
8/29, CA: 3/29); ‘insufficient understanding of validation methods’ had WA of 1.92 (ALIf: 5/28,
SIf: 14/28, NIf: 7/28, CA: 2/28) and ‘lack of leadership support’ had WA of 1.88 (ALIf: 5/29, SIf:
13/29, NIf: 8/29, CA: 3/29). Two individuals chose ‘other’ (CA: 2/2).
Figure 49: Challenges Affecting Implementation
Q - What challenges affected your use of software platforms for patient reported outcomes in
your clinical trial? Response numbers varied between 28 or 29; other n=2
Table 19: Challenges Affecting Implementation
WA-Weighted Average, 3-A lot of influence (ALIf), 2-Some influence (SIf), 1-No influence (NIf),
0-Cannot answer (CA).

| Challenges | A lot of influence | Some influence | No influence | Cannot answer | WA |
| Insufficient technical expertise to perform validations | 34%, 10 | 34%, 10 | 24%, 7 | 7%, 2 | 2.11 |
| Insufficient budget allocation | 28%, 8 | 31%, 9 | 24%, 7 | 17%, 5 | 2.04 |
| Insufficient clarity on regulatory requirements | 28%, 8 | 34%, 10 | 28%, 8 | 10%, 3 | 2.00 |
| Insufficient understanding of validation methods (n=28) | 18%, 5 | 50%, 14 | 25%, 7 | 7%, 2 | 1.92 |
| Lack of leadership support | 17%, 5 | 45%, 13 | 28%, 8 | 10%, 3 | 1.88 |
| Other, please specify (n=2) | 0%, 0 | 0%, 0 | 0%, 0 | 100%, 2 | N/A |

Participants were asked if the preparations to use SP had delayed the start date of the
clinical trial (Figure 50). More than half said ‘no’ (55%, 16/29), eight said ‘not sure’ (28%,
8/29) and five said ‘yes’ (17%, 5/29). The five participants who chose ‘yes’ responded with ‘4
months’, ‘3-6’, ‘inadequate expertise at all phases – 4 months’, ‘2’ and ‘went with a split go-live
so the delay is approx. 3 months to get the translations’.
Figure 50: Delays in Start of Clinical Trials
Q - Did the preparations to use the software platforms for patient reported outcomes cause delays
in the start date of the clinical trial? n=29
Participants were asked to rate five factors that may have contributed to delays in the use
of SP (Figure 51, Table 20) as ‘most impactful’ (MoIp), ‘impactful’ (Ip), ‘marginally impactful’
(MaIp), ‘not impactful’ (NIp) or, if applicable, ‘not sure’ (NS). ‘Time to train subjects’ (MoIp:
3/12, Ip: 4/12, MaIp: 2/12, NIp: 0/12, NS: 3/12) and ‘time to train personnel’ (MoIp: 2/12, Ip:
6/12, MaIp: 1/12, NIp: 0/12, NS: 3/12) had the same WAs of 3.11. ‘Time required to complete
validations’ had WA of 3.00 (MoIp: 1/12, Ip: 7/12, MaIp: 1/12, NIp: 0/12, NS: 3/12); ‘time
required to complete risk assessments’ had WA of 2.67 (MoIp: 1/12, Ip: 5/12, MaIp: 2/12, NIp:
1/12, NS: 3/12) and ‘time required for communication discussions with the FDA’ had WA of
2.56 (MoIp: 3/12, Ip: 1/12, MaIp: 3/12, NIp: 2/12, NS: 3/12). Two individuals chose ‘other’ and
provided a response of ‘NS’ (2/2).
Figure 51: Factors Resulting in Delays
Q - How impactful were the following factors in delaying the use of software platforms in your
company’s clinical trial? n=12, other n=2
Table 20: Factors Resulting in Delays
WA-Weighted Average, 4-Most impactful (MoIp), 3-Impactful (Ip), 2-Marginally impactful
(MaIp,) 1-Not impactful (NIp), 0-Not sure (NS).
Delays | Most impactful | Impactful | Marginally impactful | Not impactful | Not sure | WA
Time to train subjects | 25%, 3 | 33%, 4 | 17%, 2 | 0%, 0 | 25%, 3 | 3.11
Time to train personnel | 17%, 2 | 50%, 6 | 8%, 1 | 0%, 0 | 25%, 3 | 3.11
Time required to complete validations | 8%, 1 | 58%, 7 | 8%, 1 | 0%, 0 | 25%, 3 | 3.00
Time required to complete risk assessments | 8%, 1 | 42%, 5 | 17%, 2 | 8%, 1 | 25%, 3 | 2.67
Time required for communication discussions with the FDA | 25%, 3 | 8%, 1 | 25%, 3 | 17%, 2 | 25%, 3 | 2.56
Other, please specify | 0%, 0 | 0%, 0 | 0%, 0 | 0%, 0 | 100%, 2 | N/A
Participants were asked to rate the severity of six challenges (Figure 52, Table 21) that
they may have faced when completing validations for SP as 'most challenging' (MoC), 'very
challenging' (VC), 'moderately challenging' (MdC), 'not challenging' (NC) and 'not sure' (NS).
'Ensuring knowledge to perform validations' had the highest WA of 2.48 (MoC: 4/28, VC: 6/28,
MdC: 13/28, NC: 2/28, NS: 3/28). 'Meeting timelines' (MoC: 4/28, VC: 6/28, MdC: 11/28,
NC: 5/28, NS: 2/28) and 'assuring manpower to perform validations' (MoC: 2/28, VC: 6/28,
MdC: 17/28, NC: 1/28, NS: 2/28) had similar WAs of 2.35. 'Documenting plans and procedures'
had WA of 2.19 (MoC: 3/28, VC: 3/28, MdC: 16/28, NC: 4/28, NS: 2/28); 'securing financial
resources' had WA of 2.08 (MoC: 2/28, VC: 6/28, MdC: 9/28, NC: 8/28, NS: 3/28) and
'understanding good clinical practices (GCPs) related to validations' had WA of 1.92 (MoC:
1/28, VC: 5/28, MdC: 10/28, NC: 9/28, NS: 3/28).
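The weighted averages (WAs) reported throughout these tables are not derived explicitly in the text, but they are reproduced exactly if 'not sure' (weight 0) responses are excluded from both the numerator and the denominator. A minimal sketch of that assumed computation, using the 'time to train subjects' row of Table 20:

```python
# Weighted average (WA) of Likert-style response counts. Assumption (not stated
# explicitly in the text): 'not sure' carries weight 0 and its respondents are
# excluded from the denominator, so the WA reflects only expressed opinions.
def weighted_average(counts):
    """counts maps a numeric weight (e.g. 4=MoIp .. 1=NIp, 0=NS) to a response count."""
    opinionated = {w: n for w, n in counts.items() if w > 0}
    total = sum(opinionated.values())
    return sum(w * n for w, n in opinionated.items()) / total if total else float("nan")

# 'Time to train subjects' (Table 20): MoIp=3, Ip=4, MaIp=2, NIp=0, NS=3
wa = weighted_average({4: 3, 3: 4, 2: 2, 1: 0, 0: 3})
print(round(wa, 2))  # 3.11
```

The same rule reproduces the three-point scales as well, e.g. 'trial participants made errors' in Table 23: (3·4 + 2·15 + 1·3) / 22 = 45/22 ≈ 2.05.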
Figure 52: Challenges with Completing Validations
Q- How challenging were the following in completing validations for software platforms? n=28
Table 21: Challenges with Completing Validations
WA-Weighted Average, 4-Most challenging (MoC), 3-Very challenging (VC), 2-Moderately
challenging (MdC), 1-Not challenging (NC), 0-Not sure (NS).
Validations | Most challenging | Very challenging | Moderately challenging | Not challenging | Not sure | WA
Ensuring knowledge to perform validations | 14%, 4 | 21%, 6 | 46%, 13 | 7%, 2 | 11%, 3 | 2.48
Meeting timelines | 14%, 4 | 21%, 6 | 39%, 11 | 18%, 5 | 7%, 2 | 2.35
Assuring manpower to perform validations | 7%, 2 | 21%, 6 | 61%, 17 | 4%, 1 | 7%, 2 | 2.35
Documenting plans and procedures | 11%, 3 | 11%, 3 | 57%, 16 | 14%, 4 | 7%, 2 | 2.19
Securing financial resources | 7%, 2 | 21%, 6 | 32%, 9 | 29%, 8 | 11%, 3 | 2.08
Understanding Good Clinical Practices (GCPs) related to validations | 4%, 1 | 18%, 5 | 36%, 10 | 32%, 9 | 11%, 3 | 1.92
Participants were asked if they had set up meetings with the FDA to discuss compliance
requirements (Figure 53). More identified that they used the meetings for 'end point data
collection' (25%, 7/28) than for 'early stage exploratory data' (11%, 3/28). Less than half of the
participants responded with a 'no' (43%, 12/28) or 'not sure' (21%, 6/28).
Figure 53: FDA Compliance Discussions for Software Platforms
Q - As you prepared to use software platforms for patient reported data in your study protocol,
did you set up meetings with the FDA to discuss the necessary compliance requirements? (Select
all that apply). n=28
Participants who said yes were asked three additional questions. First, they were asked
how long it took to set up and complete the discussions with the FDA (Figure 54). Most selected
‘up to 6 months’ (40%, 4/10) or ‘6-12 months’ (20%, 2/10). Three participants were ‘not sure’.
None identified that the time was ‘almost immediate’ or that it was ‘greater than 12 months’.
Figure 54: Time Needed for FDA Compliance Discussions
Q - How long did it take to set up and complete the discussions with the FDA before you
implemented the software platform? n=10
Participants were asked if they had sufficient time to address the FDA’s feedback before
the trial started (Figure 55). Most said 'yes' (80%, 8/10) and two were 'not sure' (20%, 2/10).
No one selected 'no' or 'not applicable'.
Figure 55: Time to address FDA’s Feedback
Q - Did you have enough time to address the FDA’s feedback regarding the requirements for
Part 11 compliance before your trial started? n=10
The participants were asked to rate the feedback received from the FDA by responding to
four statements (Figure 56, Table 22) with options of ‘disagree’ (D), ‘neither disagree nor agree’
(NDA), ‘agree’ (A) and ‘not sure’ (NS). Most respondents agreed that the feedback ‘yielded a
clear understanding of potential impact on data collection’ (A: 78%, 7/9; D: 11%, 1/9, NDA:
11%, 1/9) and that it ‘provided opportunity for follow up/discussion’ (A: 78%, 7/9, D: 11%, 1/9,
NDA: 0%, 0/9, NS: 11%, 1/9). Fewer agreed that it 'provided sufficient information for decision-making' (A: 56%, 5/9, D: 0%, 0/9, NDA: 44%, 4/9) or that it 'provided sufficient advice about
data needed for a marketing application’ (A: 56%, 5/9, D: 0%, 0/9, NDA: 33%, 3/9, NS: 11%,
1/9).
Figure 56: Feedback from FDA Discussions
Q - When thinking about your discussions with FDA, do you agree or disagree with the
following statements regarding those discussions? n=9
Table 22: Usefulness of Feedback from FDA Discussions
Agree (A), Neither disagree nor agree (NDA), Disagree (D), Not sure (NS).
FDA Discussions | Disagree | Neither disagree nor agree | Agree | Not sure
Yielded a clear understanding of potential impact on data collection | 11%, 1 | 11%, 1 | 78%, 7 | 0%, 0
Provided opportunity for follow up/discussion | 11%, 1 | 0%, 0 | 78%, 7 | 11%, 1
Provided sufficient information for decision-making | 0%, 0 | 44%, 4 | 56%, 5 | 0%, 0
Provided sufficient advice about data needed for a marketing application | 0%, 0 | 33%, 3 | 56%, 5 | 11%, 1
Participants were asked if they performed risk assessments (Figure 57). About half said
'yes' (48%, 13/27), four said 'no' (15%, 4/27) and four said 'not sure' (15%, 4/27). Six 'did not
need risk assessments as the device/sensor was validated previously for this purpose' (22%,
6/27).
Figure 57: Performed Risk Assessments for Software Platforms
Q - Have you carried out risk assessments related to the software platforms? n=27
RA-Risk Assessment.
Participants answering ‘yes’ to the previous question were asked to explain any
additional challenges they faced. Two comments were “No - standard practice at our company.
Dedicated teams with extensive working knowledge” and “user groups - user risk analysis”.
Participants were asked to rate eight challenges (Figure 58, Table 23) they may have
faced when first using SP as ‘very challenging’ (VC), ‘moderately challenging’ (MdC), ‘not
challenging’ (NC) and ‘not sure’ (NS). ‘Trial participants made errors’ had a WA of 2.05 (VC:
4/28, MdC: 15/28, NC: 3/28, NS: 6/28). The choice of ‘participants were not compliant’ had a
marginally lower WA of 2.00 and a marginally more frequent identification as ‘very challenging’
(VC: 5/28, MdC: 12/28, NC: 5/28, NS: 6/28). ‘Device did not work as anticipated’ had WA of
1.76 (VC: 2/27, MdC: 12/27, NC: 7/27, NS: 6/27); ‘devices failed’ had WA of 1.63 (VC: 0/28,
MdC: 12/28, NC: 7/28, NS: 9/28); ‘technical support was insufficient’ had WA of 1.55 (VC:
1/28, MdC: 9/28, NC: 10/28, NS: 8/28); ‘data transmission was unreliable’ had WA of 1.52 (VC:
1/28, MdC: 9/28, NC: 11/28, NS: 7/28); ‘operating system upgrades caused software
incompatibility’ had WA of 1.50 (VC: 0/28, MdC: 8/28, NC: 8/28, NS: 12/28) and ‘device
updates changed data or algorithms’ had WA of 1.44 (VC: 0/28, MdC: 8/28, NC: 8/28, NS:
12/28).
Figure 58: Challenges for Software Platforms
Q - When you first used the software platforms for patient reported outcomes for the clinical
trial, what were the challenges? Response numbers varied between 27 and 28.
Table 23: Challenges for Software Platforms
WA-Weighted Average, 3-Very challenging (VC), 2-Moderately challenging (MdC), 1-Not
challenging (NC), 0-Not sure (NS).
Challenges | Very challenging | Moderately challenging | Not challenging | Not sure | WA
Trial participants made errors | 14%, 4 | 54%, 15 | 11%, 3 | 21%, 6 | 2.05
Participants were not compliant | 18%, 5 | 43%, 12 | 18%, 5 | 21%, 6 | 2.00
Device did not work as anticipated (n=27) | 7%, 2 | 44%, 12 | 26%, 7 | 22%, 6 | 1.76
Devices failed | 0%, 0 | 43%, 12 | 25%, 7 | 32%, 9 | 1.63
Technical support was insufficient | 4%, 1 | 32%, 9 | 38%, 10 | 29%, 8 | 1.55
Data transmission was unreliable | 4%, 1 | 32%, 9 | 39%, 11 | 25%, 7 | 1.52
Operating system upgrades caused software incompatibility | 0%, 0 | 29%, 8 | 29%, 8 | 43%, 12 | 1.50
Device updates changed data or algorithms | 0%, 0 | 29%, 8 | 38%, 10 | 38%, 10 | 1.44
Participants were asked if they had any additional comments or concerns. Three
responses of ‘we did not implement yet’, ‘we did not face any issues that were mentioned’ and
‘N/A’ were received.
Participants were asked to rate five factors (Figure 59, Table 24) related to the ongoing
compliance of the SP as 'extremely important' (EI), 'very important' (VI), 'somewhat important'
(SI), 'not important' (NI) and 'not sure' (NS). 'Adequate time' had the highest WA of 2.96 (EI:
5/27, VI: 13/27, SI: 4/27, NI: 1/27, NS: 4/27). 'Adequate finances' had WA of 2.82 (EI: 4/27,
VI: 12/27, SI: 4/27, NI: 2/27, NS: 5/27); 'organizational readiness (i.e., update procedures,
processes, trainings, etc.)' had WA of 2.78 (EI: 3/27, VI: 13/27, SI: 6/27, NI: 1/27, NS: 4/27);
'reassessments when regulatory requirements/guidances changed' had WA of 2.77 (EI: 6/27, VI:
8/27, SI: 5/27, NI: 3/27, NS: 5/27) and 'adequate manpower' had a similar WA of 2.77 (EI: 4/27,
VI: 10/27, SI: 7/27, NI: 1/27, NS: 5/27).
Figure 59: Ongoing Compliance for Software Platforms
Q - How important were the following factors in assuring ongoing Part 11 compliance for
software platforms, as the clinical trial progressed? n=27
Table 24: Ongoing Compliance for Software Platforms
WA- Weighted Average, 4-Extremely important (EI), 3-Very important (VI), 2-Somewhat
important (SI), 1-Not important (NI), 0-Not sure (NS).
Ongoing Compliance | Extremely important | Very important | Somewhat important | Not important | Not sure | WA
Adequate time | 19%, 5 | 48%, 13 | 15%, 4 | 4%, 1 | 15%, 4 | 2.96
Adequate finances | 15%, 4 | 44%, 12 | 15%, 4 | 7%, 2 | 19%, 5 | 2.82
Organizational readiness (i.e., update procedures, processes, trainings, etc.) | 11%, 3 | 48%, 13 | 22%, 6 | 4%, 1 | 15%, 4 | 2.78
Reassessments when regulatory requirements/guidances changed | 22%, 6 | 30%, 8 | 19%, 5 | 11%, 3 | 19%, 5 | 2.77
Adequate manpower | 15%, 4 | 37%, 10 | 26%, 7 | 4%, 1 | 19%, 5 | 2.77
Participants were asked to rate their confidence in eight activities (Figure 60, Table 25)
used to maintain compliance for SP, with options of 'very confident' (VC), 'confident' (C), 'not
very confident' (NVC) and 'not sure' (NS). 'Updating trial protocols and procedures' had the
highest WA of 2.38 (VC: 9/27, C: 15/27, NVC: 0/27, NS: 3/27). 'Performing reverifications and
revalidations' had WA of 2.23 (VC: 6/27, C: 15/27, NVC: 1/27, NS: 5/27); 'training new staff'
(VC: 8/27, C: 13/27, NVC: 3/27, NS: 3/27) and 'managing identified risks' (VC: 7/27, C: 15/27,
NVC: 2/27, NS: 3/27) had similar WAs of 2.21; 'conducting risk assessments' had WA of 2.19
(VC: 6/27, C: 13/27, NVC: 2/27, NS: 6/27); 'managing software updates' had WA of 2.05 (VC:
3/27, C: 17/27, NVC: 2/27, NS: 5/27); 'allocating sufficient time' had WA of 1.86 (VC: 3/26, C:
13/26, NVC: 6/26, NS: 4/26) and 'adding manpower' had WA of 1.86 (VC: 2/27, C: 15/27,
NVC: 5/27, NS: 5/27).
Figure 60: Maintenance of Compliance for Software Platforms
Q - How confident were you that the following activities were successful in maintaining
compliance with Part 11? Response numbers varied between 26 and 27.
Table 25: Maintenance of Compliance for Software Platforms
WA: Weighted Average, 3-Very confident, (VC), 2-Confident (C), 1-Not very confident (NVC),
0-Not sure (NS).
Maintenance of Compliance | Very confident | Confident | Not very confident | Not sure | WA
Updating trial protocols and procedures | 33%, 9 | 56%, 15 | 0%, 0 | 11%, 3 | 2.38
Performing reverifications and revalidations | 22%, 6 | 56%, 15 | 4%, 1 | 19%, 5 | 2.23
Training new staff | 30%, 8 | 48%, 13 | 11%, 3 | 11%, 3 | 2.21
Managing identified risks | 26%, 7 | 56%, 15 | 7%, 2 | 11%, 3 | 2.21
Conducting risk assessments | 22%, 6 | 48%, 13 | 7%, 2 | 22%, 6 | 2.19
Managing software updates | 11%, 3 | 63%, 17 | 7%, 2 | 19%, 5 | 2.05
Allocating sufficient time (n=26) | 12%, 3 | 50%, 13 | 23%, 6 | 15%, 4 | 1.86
Adding manpower | 7%, 2 | 56%, 15 | 19%, 5 | 19%, 5 | 1.86
Participants who completed one set of questions regarding either CW or SP were asked if
they were willing to answer questions on the alternate section. For those first answering
questions on CW, 61% answered 'yes' (14/23), 26% answered 'no' (6/23) and 13% stated that
they 'have already completed that section' [on SP] (3/23). For those first answering questions on
SP, 46% answered 'no' (13/28), 18% answered 'yes' (5/28) and 36% stated that they 'have
already completed that section' (10/28).
4.5 Concluding Questions
When participants finished the survey questions that they were willing to address, they
were asked six final questions. First, they were asked to characterize the FDA guidances that
support the Part 11 regulation in just one word from four choices with an option to provide other
descriptors (Figure 61). More than half responded with 'reasonable' (63%, 20/32), 19% with
'insufficient' (6/32) and 13% with 'confusing' (4/32). No one picked 'excessive'. Two
participants chose to provide an 'other' response (6%, 2/32): 'adequate' and 'non-specific'.
Cross tabulations were performed to see if the number of clinical trials on which the
respondent worked over the past 5 years was related to their characterization of the FDA
guidances, but no obvious conclusions could be drawn given the small ‘n’ (Appendix B;
Table 39).
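Cross tabulations of this kind reduce to pairwise counting of respondent attributes. A minimal sketch (the category labels and records below are illustrative only, not the survey's actual data):

```python
from collections import Counter

# Illustrative respondent records: (number of trials worked on in the past
# 5 years, one-word characterization of the FDA guidances). Toy data only.
records = [
    ("1-5", "reasonable"), ("1-5", "insufficient"), ("1-5", "reasonable"),
    ("6-10", "reasonable"), ("6-10", "confusing"), (">10", "reasonable"),
]

# Cross tabulation: count respondents falling into each (row, column) cell.
xtab = Counter(records)
print(xtab[("1-5", "reasonable")])  # prints 2
```

With a small overall n, most cells hold counts of 0, 1 or 2, which is why no reliable differences could be concluded from the stratified comparisons.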
Figure 61: Characterization of FDA Guidances
Q - In one word, how would you characterize FDA guidances to support the Part 11 regulation?
n=32
Participants were asked if cybersecurity risks were considered for CW and SP
(Figure 62). Most answered ‘yes’ (84%, 27/32), 9% answered ‘no’ (3/32) and 6% identified that
they were ‘not sure’ (2/32).
Figure 62: Assessment of Cybersecurity Risks
Q - Before deploying the discussed digital health technologies (consumer wearables, software
platforms for patient reported data and FDA cleared biosensors) for your clinical trial did you
consider cybersecurity risks? n=32
Participants were asked if their company had procedures in place to handle any data
breaches (Figure 63). Most said ‘yes’ (88%, 28/32), two said ‘no’ (6%, 2/32) and two said that
they were ‘not sure’ (6%, 2/32).
Figure 63: Handling Data Breaches
Q - Does your company have procedures to handle a data breach? n=32
Participants were asked if their company assures that employees understand and stay
current with the requirements of ongoing compliance for DHTs (Figure 64). More than half said
‘yes’ (59%, 19/32) and a third said ‘no’ (34%, 11/32). A few responded with ‘not sure’ (6%,
2/32).
Figure 64: Ongoing Compliance
Q - Do you feel that your company pays sufficient attention to assure that employees understand
and stay current with the requirements of Part 11 compliance for digital health technologies?
n=32
Participants were encouraged to write about any other experiences regarding compliance
with Part 11 when implementing DHTs and four additional comments were received (Table 26).
Table 26: Additional Compliance Information
Additional Comments When Implementing DHTs
1) It takes far too much time and effort to pull together these aspects of studies.
2) Enforcement and expectations vary far too widely
3) One must become their own expert to get this done with remote possibility of being
effective
4) Lack of expertise in a pharma company is a key challenge.
All the participants were thanked and asked to provide their emails if they would like a
copy of the survey result summary. Nine participants shared their email addresses and two
provided a text response of, “be sure email is clearly from research and not junk if possible” and
“feel free to share a list or compendium of resources, that's where it all starts”.
Chapter 5. Discussion
5.1 Introduction
The use of DHTs to collect data in clinical trials represents a transformative shift in how
clinical research is conducted. However, the shift from paper to digital data collection poses
challenges, including those related to compliance with 21 CFR Part 11, as the present results
show. This study explored how companies implemented, managed and maintained compliance
with Part 11 for DHTs [specifically consumer wearables (CW) and software platforms (SP)]
with Part 11 for DHTs [specifically consumer wearables (CW) and software platforms (SP)]
used in their clinical trials. Using an implementation research framework developed by Fixsen as
a guide, it gives new insights into whether and how companies have progressed along the path of
implementation. It also examines the nature of challenges along that implementation pathway in
more detail. However, the conclusions that can be drawn will be affected by some of the
limitations and delimitations inherent in this study, which are best discussed to give context to
the results.
5.2 Methodological Considerations
5.2.1 Delimitations
The field of decentralized trials presents many opportunities for research, all of which
cannot be addressed in a single study. The research reported here was therefore confined to one
aspect of decentralized trials, specifically, the implementation of DHTs in compliance with 21
CFR Part 11 for electronic records and signatures. This delimitation helped to sharpen the focus
of the study. It allowed the research questions in the survey to be more specific and relevant, so
that I could collect quite detailed information within the defined scope. Nonetheless those
delimitations can also run the risk of affecting the external validity of the results, so care must be
taken to consider the degree to which the results presented here can be generalized from the
research study’s sample to the larger target population (Price, 2004, Ross, 2019). In this context,
generalization is the ability to make general statements about the effect of some treatment. The
level of confidence an experimenter can have in generalizing findings to the target population
remains uncertain, as true random sampling from the exact target population is unattainable
(Bracht, 1968).
One factor that could pose a challenge relates to the fact that the DHTs used in clinical
trials vary widely in their purposes and configurations. DHTs such as heart or glucose monitors
obviously differ from electronic patient diaries, for example. Thus, it would not be surprising if
they also differed in the implementation challenges that they pose when trying to comply with
Part 11. The choice of two specified categories of DHTs with different characteristics appeared
to be important to this research study because it provided insight into which implementation
challenges are generalizable across DHTs of different types and which are not, as
discussed below. If the respondents had instead been told to give their experiences on an
unspecified "DHT", they might have reported experiences with different types of DHTs; the
results therefore might have been blurred by combining "apples and oranges". Further, care must be taken to
recognize that other types of DHTs not considered here, such as implantable medical devices,
might have more complex regulatory and logistical paths and hurdles that would not be captured
by this study. However, tradeoffs had to be made in this exploratory survey to constrain the
length of the survey. Exploration of even two types of DHT required an unusually long survey,
as discussed in the “limitations” section below.
Ensuring the survey’s validity also hinges on selecting an appropriate respondent sample
to represent the target population (Devroe, 2019). This purposive approach differs from random
sampling methods, which are often viewed as a central way to bolster external validity
(Druckman, 2011). However, only a small and highly experienced set of individuals have the
type of expertise needed to implement DHTs. Thus, narrowing the inclusion to a specific set of
stakeholders who could be identified to have this expertise allowed us to gain a better
understanding of industry trends and challenges from professionals at the forefront of this type of
implementation. Within this delimitation were individuals who performed a range of functions,
including regulatory, clinical and IT specializations. These groups might be expected to have
different views related to the unique challenges, interactions and contributions associated with
their differing job functions. However, stratification and cross tabulations yielded relatively
small subsamples of individuals with no identifiable differences in perspectives. This might be
explained by the fact that individuals in different roles are part of a collaborative undertaking in
which extensive interaction is needed to accomplish a common goal. If differences do exist, they
might only be recognized in a more directed study based on larger numbers of individuals.
Alternatively, I might see differences if I expanded the surveyed population to include more
peripheral functions such as legal, marketing or financial areas. However, the views of these
individuals might be limited by the fact that they are unlikely to have deep domain knowledge of
Part 11 requirements needed to provide meaningful insights into implementation. I also looked
for differences between company sizes. Companies with fewer resources might be anticipated to
have more challenges with resource-intensive activities such as training and validations. Further,
these small companies are often neglected in studies carried out by professional organizations,
which often draw their responses from large pharmaceutical companies. However, from the
admittedly limited sample of respondents here, most of the activities causing most difficulty
appeared to be common to companies of all sizes.
The scope of this study was also delimited to the relevant regulations, guidances and
standards that govern clinical trials in the United States only. It therefore might be difficult to
generalize these findings to the ‘rest of world’, where regulations might differ. However, it is
unlikely that the dissonance would be great. The US pharmaceutical industry is larger than that
of any other country in terms of the numbers of clinical trials conducted globally (Hawkins,
2022). Further, many of these trials are multinational, and the approaches of sites outside of the
US would have to meet US requirements if marketing authorization was sought in the US.
Similarly, US companies also must satisfy the regulations of other countries in which they wish
to market. However, this work cannot identify whether these global considerations add
challenges for implementation, which may be an interesting subject for further research.
Survey research such as this captures a snapshot in time. Because the use of DHTs is both
novel and rapidly evolving, it seems reasonable to expect that clinical trial practitioners and
regulatory policy experts will become more comfortable with the use of DHTs as time goes
forward. As stated by Beauchamp in reference to the use of DHTs in one specific therapeutic
area, "At present, there are no specific recommendations for the use of wearables in
oncology, and little research has examined the purpose of using wearables in oncology”
(Beauchamp, 2020). In future, better guidance from regulatory and standards-setting agencies
related to certain troublesome areas such as validations, highlighted here and elsewhere
(Izmailova, 2023) may decrease the degree to which industry is challenged from a Part 11
perspective.
5.2.2 Limitations
The research described here was based on survey methods. A well-designed survey is
useful because it reduces costs while yielding responses from a geographically dispersed set of
respondents within a defined period of time (Marshall, 2014). An electronic survey can further
shorten turnaround times and improve the accuracy of data collection compared to paper-based
surveys (Sue, 2013, Saleh, 2017). Importantly for this study, it allowed me to engage with a
diverse group of participants who could then respond anonymously, so that their proprietary
company knowledge could be protected. The confidentiality of results can increase response
rates and decrease the likelihood of bias (Marshall, 2014). Anecdotally, several respondents told
me that they would participate in the survey only if their responses were protected.
At the same time, the use of an electronic survey had limitations. Although email
communications can be expedient, a recipient may not respond for long periods. If a business
email address is used, email communications that are not clearly related to company business
may be deleted or placed in spam or junk folders (Evans, 2018). The use of email may also have
problems if the email addresses of some potential participants are outdated. Nonetheless, 62
participants entered the survey, yielding a response rate of 22% (62/281).
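The rates discussed in this section and below reduce to simple arithmetic on the figures reported in the text:

```python
# Participation figures as reported in this chapter.
invited, entered, completed = 281, 62, 48

response_rate = entered / invited      # 62/281 -> reported as 22%
completion_rate = completed / entered  # 48/62 -> reported as 77%
print(f"{response_rate:.0%}, {completion_rate:.0%}")  # 22%, 77%
```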
Response rate is important to understand because it can affect the development of a
representative sample. Very low rates of participation can call into question the validity and
generalizability of a survey (Parker, 2014). However, in a study such as this, a response rate of
20-30% was seen to be adequate, given Nair's finding that surveys achieving a response rate as
low as 10% have often been considered acceptable (Nair, 2008). The challenges of interpreting
response rate have been discussed by Hesse-Biber and Leavy, who emphasized that the essence
of qualitative research lies in the pursuit of profound understanding, often involving the
examination of small sample sizes. Unlike quantitative research, which seeks to draw
generalizations about the extent of a problem or a set of patterns, qualitative research is focused
on understanding more deeply the meanings individuals assign to a given situation (Hesse-Biber,
2010).
In this study, I was satisfied that the response rate was sufficiently high to provide
meaningful insights. I believe that the deep domain knowledge of the survey respondents
strengthened the validity of the research results. As suggested by Orcher, it is more important to
obtain a representative sample rather than collecting a larger sample size (Orcher, 2007).
What the response rate does limit is the confidence in results that emerge from stratifying the
population. Stratifying the sample into three or more groups, such as by job function or company
size, necessarily gave subgroups with small numbers. When working with small numbers, a
researcher must use judgment to determine the point at which the collected data is considered to
be saturated (Sue, 2013). Thus it was not surprising to find that cross-tabulation methods yielded
few convincing differences.
Given the need to engage qualified respondents from a relatively small overall population
of experts, it was important to assure that the participants who entered the survey felt engaged as
they progressed through the survey. The social science literature cautions against constructing
long surveys that may induce survey fatigue and reduce the likelihood that participants would
fail to complete the survey (Herzog, 1981, Porter, 2004). This potential limitation was of
particular concern because the survey in this study included questions for two DHTs, wearables
and software platforms, so some participants with experience with both types of DHTs would
have to invest much time to complete the survey. However, the relatively high completion rate of
77% (48/62) exceeds the 50% response rate that has been suggested to be necessary
to reduce bias and nonresponse errors (Draugalis, 2008). It is not clear why 14 participants left
the survey prematurely. Some may have found that the survey would take too much time. Others
may have believed that they were insufficiently knowledgeable to be qualified respondents. This is not
necessarily a bad thing, if it reduced the likelihood that poorly knowledgeable participants would
respond with guesses or other strategies for answering questions that might reduce the quality of
the data (Herzog, 1981).
Because survey length had to be limited, I had to assure that the questions were able to
address the research topic sufficiently. Two strategies were used to strengthen the survey's
internal validity and its capacity to capture the desired information. First, I used an
implementation framework derived from the National Implementation Research Network
(NIRN) to guide the formulation of questions across well-characterized stages of
implementation. Such an approach can foster the systematic exploration of barriers at different
stages and help investigators to select research questions, measures and results relevant to
implementation outcomes (Moullin, 2019). Second, a focus group was convened to critique the
survey questions. This approach can benefit survey construction by clarifying ambiguous or
inappropriate questions. The focus group also helps to reduce the ‘Experimenter Effect’, in
which expectations of the researcher may bias the survey and decrease the internal validity of the
study (Bracht, 1968, Starr, 2012). Nevertheless, there remains a possibility that the survey might
not have identified crucial areas that could have enhanced our understanding of issues linked to
Part 11 compliance for DHTs in clinical trials.
5.3 Consideration of Results or Research Insights
Companies improve when they find new ways to streamline their processes and reduce
their costs. DHTs provide one such opportunity. However, inserting a new method into an
already complex trial protocol can have challenges from the time at which the change was first
considered to the later installation and implementation of that change (Fixsen, 2005). The
decision to add a DHT to a clinical trial program is complicated by the fact that DHTs exist in
various forms, each with its own specific needs for validation and implementation. However, a
side-by-side comparison of certain key results in this study suggested more similarities than
differences in the challenges faced along the implementation path. Certain key takeaways from
the survey results are discussed below with reference to different stages of implementation that
must be carried out before a DHT can be integrated successfully in clinical trial operations.
5.3.1 Exploration
Implementation begins with a decision to determine operational readiness and feasibility
of the various DHTs that might be used for data collection in a clinical trial. Sponsors must first
become aware that DHTs even exist for this purpose and then gather information so that
available options can be compared. A large volume of literature has examined “diffusion” of
information and “adoption decisions” by individuals and organizations (Rogers, 2003). These
identify that time and thoughtfulness are important to evaluate whether a new option aligns with
needs, evidence-based practices, program requirements and available resources. Exploration
typically ends with a decision on whether to proceed (Fixsen, 2005). In this study, I wanted to
understand how the industry had performed these determinations.
5.3.1.1 Regulatory Documents and Guidance
It seemed clear that most respondents believed, even at this early stage, that their
organizations understood the FDA regulatory requirements well enough to comply with Part 11
for DHTs. To this end, they would have had to educate themselves about several regulations
affecting DHT implementation. Not surprisingly, that level of proficiency was ranked highest for
more general requirements such as ‘Good Clinical Practice (GCP)’ and ‘21 CFR Part 11
Regulation’. The 1997 Part 11 regulation lays the foundation by which compliance is assessed
and GCP regulations were called out as key to managing Part 11 compliance in the 2003
Guidance for Industry: Part 11, Electronic Records; Electronic Signatures – Scope and
Application (FDA, 2003). Both GCP and Part 11 regulations have been in place for a relatively long time, so have
been important for many other aspects of clinical trials even before the options of using DHTs
were available.
At the same time, however, the level of satisfaction with these documents appears mixed.
These results seem consistent with literature that has previously identified issues such as
‘variability in guidance’ and ‘uncertainty on requirements’ (Sharpe, 2021). As discussed in
Chapter 2, some critics have expressed concern that the scope of regulatory requirements is
minimally defined for DHTs in drug development and that clarity is lacking about the logistical
expectations when adopting DHTs in clinical trials. For example, Pan identifies uncertainties
about whether certain DHTs are defined as medical devices and when medical device regulations
would apply to them (Pan, n.d.). Bradley Merrill Thompson, attorney at Epstein Becker Green
and general counsel for the Clinical Decision Support (CDS) Coalition, was reported as saying
that the FDA tends to require that some DHTs adhere to device quality systems when it is not
warranted (Al-Faruque, 2023). Additionally, the community has requested clarification on
“whether a medical device clearance is preferred or irrelevant to the agency”, and “who is able to
make this distinction, and through what mechanism this distinction should be pursued”
(Goldsack, 2022, p. 4).
The challenges experienced by some respondents might suggest that they need more
specific help with policy-related materials. Further, respondents ranked ‘having sufficient
regulatory requirements/guidances’ second only to ‘understanding GCPs to implement Part 11
compliance’ as their most troublesome regulatory hurdle. However, the idea that the challenges
posed by the regulations would vanish if more documents were written must be tempered by
recognizing that busy regulatory professionals may be slow to read and respond to those
documents. This concern appears to be recognized by the FDA, which has announced plans to
convene five public meetings at the end of summer, FY 2023, to discuss more directly the use of
DHTs in clinical trials (Al-Faruque, 2023). These meetings might help to educate industry about
current guidances, thus increasing the “diffusion of knowledge” which has been recognized as
important (Balas, 2018). It could also help to clarify FDA’s expectations and illuminate areas in
which guidance is currently deficient.
Notably, the document with which expertise was judged to be lowest in this study was the
2022 guidance, Digital Health Technologies for Remote Data Acquisition in Clinical Investigations. It is
not clear whether some of the 67% who felt the need for more regulatory guidance had yet to
become familiar with this document, which should have been able to address some aspects of
implementation more specifically. Alternatively, some may be familiar with the document but
found it wanting. This interpretation may be supported by the fact that individuals from various
companies and organizations have already asked the FDA to clarify various parts of this
guidance document and suggested ways in which it could be leveraged to make clinical studies
more successful (Al-Faruque, 2022). Their concerns resonate with comments in this study. For
example, one respondent stated that “interpretation, practical implementation”, was a challenge
with GCPs when implementing compliance with Part 11. However, from a high-level view,
many respondents did seem to feel that the regulations were ‘reasonable’, so issues appear to be
associated with specific areas in which more information was thought to be helpful.
A contributing factor to confusion comes from the fact that DHTs can have regulations
spanning two FDA Centers that govern drugs and devices respectively. As noted by Al-Faruque:
The Center for Devices and Radiological Health (CDRH) often develops
guidance on digital topics without the Center for Drugs Evaluation and
Research (CDER), and then when those guidance documents are relied upon
in connection with a digital health technology used with drugs, CDER wants to
take a fresh look at the policy and may or may not follow CDRH's lead. That
creates great uncertainty for industry (Al-Faruque, 2023, par. 5).
The FDA acknowledges this gap and states that the DHT Steering Committee (set up to oversee
the implementation of the PDUFA VII commitments related to evaluating DHT-based
measurements in human drug development) will put forth consistent approaches to their review
and evaluation (FDA, 2023d).
5.3.1.2 Initial Discussions/Meetings with Regulatory Authorities
Historically, one way that industry has clarified expectations when introducing a new
technology has been to consult the FDA directly in some form of meeting. As identified by
CBER:
Meetings with industry and sponsor-investigators are a forum for the Agency
to provide guidance to representatives of the regulated industry (including
sponsors/applicants of user fee related products) and/or individual
sponsor-investigators during product development and facility design, and to
facilitate their compliance with the regulations governing development and
post-approval marketing of products (FDA, 2023c).
Respondents here also identify that such meetings are ‘most important’ as a source of early
education. Further, the feedback that they received from FDA was most commonly ranked as
‘very important’ for their decision-making. Their responses are consistent with the findings of
Sharpe that evidentiary requirements for regulatory purposes can lack clarity, and that a key
challenge is presented by the “lack of ways to engage with regulators and make decisions on
digital tools in a timely manner that keeps pace with innovation” (Sharpe, 2021, p. 63). Early
discussions and meetings with the regulatory authorities are therefore key to explore a variety of
topics (Izmailova, 2023). To this end, companies can use normally scheduled meetings such as
pre-IND and pre-submission meetings, and have access to meetings with the DHT steering
committee, if needed. However, FDA limits the number of meetings and requires quite extensive
documentation and lead time prior to the meetings, so companies may be reluctant to “waste” the
opportunities by pursuing them before a more mature development plan has been put into place.
Earlier input that might be helpful could include topics such as the usefulness and acceptability
of specific types of DHTs for clinical investigations, the regulatory status of DHTs under
consideration and the expectations for verification/validation of DHTs (FDA, 2023d).
5.3.1.3 Input from Colleagues
Survey responses also pointed to several other educational resources that were being used
to explore the potential suitability of a proposed device. Feedback from industry colleagues was
an important source of education for respondents working with both CW and SP. As identified
elsewhere, frequent information exchange amongst industry stakeholders would appear to be
valuable so that practitioners could share their learning experiences which would facilitate the
development and adoption of best practices (Izmailova, 2018). As DHTs become more common
for data collection, it will be important to develop venues for such exchange and to drive the
development of appropriate standards by which compliance can be assessed. Optimizing the role
of DHTs in drug development through such interactions could improve interoperability among
DHTs and formulate effective strategies for their efficient validation (Khandekar, 2020). One
successful example of an initiative to create such a collaborative space has been the “Digital
Drug Development Tools” (3DT) workstream of the Critical Path for Parkinson’s Consortium
(CPP). This public-private partnership led by the Critical Path Institute engages industry
members, academic experts, clinicians, patient advocacy organizations, and regulatory agencies, all
of whom work precompetitively to share data, knowledge, and costs to advance the regulatory
maturity of DHTs (Izmailova, 2023).
5.3.1.4 Additional Key Inputs to Decision-Making
Respondents stated that ‘ease of use’ was also an important factor for both CW and SP
when initially exploring the use of DHTs for their clinical trials. As identified by Raval during an
IQVIA webinar titled, Clinical Trials Moving From Site to Home—Lessons Learned from
Digital Health Technologies:
Always keep ourselves in the foot of the end user, i.e. if they are cancer patient or
a pediatric or geriatric patient, what is the feasibility of using the digital endpoint
at the end stage. How beneficial is the data collection to the sponsor or the study.
We must base our decisions on compassion and deriving decisions on who the end
user will be and how difficult it will be for them to utilize the technology. This is
the most significant and important aspect (IQVIA, 2023).
Validation also appears to be recognized by some as a factor, but it is ranked
lower as a decision-making criterion at the exploration phase. Exploratory activities instead tend
to focus on business objectives and impediments to trial success. It is only as the implementation
proceeds that this factor begins to take precedence over others, as described below.
Given the many variables underlying a decision to implement a new DHT, it may be
understandable that not all companies found the case for DHT use compelling enough to
implement its use. As stated by one respondent, for example:
FDA did not accept the data generated for a similar consumer wearable, and
we were concerned that the data we would generate would not accurately
represent the endpoint that we were trying to measure. We were concerned
about validation and the risk of introducing a consumer wearable for the first
time in a potential pivotal clinical trial.
5.3.2 Installation
During installation, a company assigns resources such as staff and infrastructure to ensure
that the chosen DHT meets the requirements for Part 11 compliance (Fixsen, 2005, Khandekar,
2020). Others who have implemented digital technologies in clinical trials have noted that many
challenges associated with this implementation can be successfully circumvented by providing
the appropriate training to staff and patients and ensuring the availability of appropriate technical
support (Mitsi, 2022). Insufficient budget allocation was also recognized by respondents as a
primary hurdle. It was interesting, however, that respondents did not seem so concerned about a
lack of leadership support. This seems counterintuitive. It might be explained if senior
management was simply inexperienced with DHTs and so underestimated the financial requirements
associated with preparatory activities such as the validations and user testing needed to qualify
DHTs. Clinical trials are already complex and expensive; the annual biopharmaceutical R&D
expenditures have been estimated to account for 51% and 94% of the out-of-pocket cost per
approved molecule (DiMasi, 2016). The costs of installing a DHT will add to this cost.
The added costs for DHT implementation are often justified by the opportunity to save
costs and time in the longer term. Much has been said about the potential of digital interventions
to increase safety, efficacy and quality of care, but it is not clear if the implementation costs
eventually result in cost savings overall (Gentili, 2022). For example, when AstraZeneca
introduced digital solutions later in one such study, the change in the overall study design
increased the number of virtual visits and added workload for sites and patients - changes that
contributed to higher rather than lower study costs (Duran, 2023). This observation emphasizes
the importance of selecting the DHT at beginning stages of the trial and ensuring that the study
design takes into consideration well-estimated costs of DHT implementation.
5.3.2.1 Regulatory Requirements
Other challenges with regulatory implications are also seen when introducing new DHTs
in a clinical trial. One of the most difficult of these challenges appeared to be how to assure the
quality and integrity of the data, accomplished in part through extensive validation to ensure that
the device can capture accurately the requisite data related to clinical endpoints. This is further
complicated because validations are taking place in an environment of changing regulatory
policies. Thus, Misti recommends that companies planning to use a new DHT should anticipate
these challenges and establish risk management protocols before study launch (Mitsi, 2022).
Digital practices and concerns related to data validity/integrity and security were also mentioned
as stumbling blocks at the CIRS 2020 and 2021 workshops on Digital Technologies (Sharpe,
2020, Sharpe, 2021). CTTI (Clinical Trials Transformation Initiative) recommends that a
sponsor’s decision should be driven by the appropriateness of the use, which should be justified
through the verification and validation process (CTTI, 2021, Khandekar, 2020). However, this is
easier said than done. At the installation phase, some of the problems and risks associated with
the use of the device may not be obvious. Thus, ‘unclear requirements for verification’,
‘cumbersome validations’ and ‘cumbersome risk assessments’ were all identified as potential
concerns in this survey. One respondent also expanded his/her perception of this hurdle as an
‘insufficient understanding of validation methods’. These comments resonate with remarks of
Goldsack during a workshop titled The Role of Digital Health Technologies in Drug
Development (Khandekar, 2020), who discussed the need for well-established standards with a
single “source of truth” and common frameworks to perform verifications and validations. In
further research, it would be interesting to probe how companies are performing their
validations, and in particular to explore whether they are using the V3 (Verification, Analytical
Validation, Clinical Validation) framework recently developed by the Digital Medicine Society to
determine if a device is fit-for-purpose. As stated by Goldsack, “V3 are foundational to
determine whether a digital medicine tool is fit-for-purpose” (Goldsack, 2020). CTTI further
recommends conducting small-scale feasibility studies before the protocol design is finalized to
ensure the correct analytical approach to the data outputs from the DHTs. Further, it suggests
establishing standards to guide the collection and reporting of data captured by DHTs which
should include:
Transparency of information related to digital technology specifications,
calibration, and verification bench-tests, and Transparency requirements for the
development of algorithms used to convert the data into physiologically and
medically useful endpoints (CTTI, 2021, p. 13).
All of these activities are time consuming and can further cause delays in study timelines if not
planned at study conception.
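The V3 logic mentioned in this section can be made concrete with a short sketch encoding its three evidence layers as a checklist. This is an illustrative reading of the framework, assuming a binary pass/fail status for each layer; the class and method names are this author's shorthand, not any official schema from the Digital Medicine Society.

```python
from dataclasses import dataclass

@dataclass
class V3Assessment:
    """Tracks the three evidence layers of the V3 framework for a candidate DHT
    (illustrative field names, assuming each layer resolves to pass/fail)."""
    verification: bool           # sensor-level: output matches a bench reference
    analytical_validation: bool  # algorithm output matches a reference measure
    clinical_validation: bool    # measure reflects the clinical state of interest

    def fit_for_purpose(self) -> bool:
        # A tool is only considered fit-for-purpose when all three layers hold.
        return (self.verification
                and self.analytical_validation
                and self.clinical_validation)
```

Under this reading, a device that passes bench verification and analytical validation but lacks clinical validation would still fail the fit-for-purpose determination, which matches Goldsack's characterization of V3 as foundational.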
Not surprisingly, FDA also recognizes the importance of thoughtful validation and
verification. As stated in a recent draft guidance:
When measurements made by DHTs (e.g., glucometers) are used to modify the
administration of the investigational product or the treatment of the participant, it
is critical to evaluate the risk of erroneous measurements resulting in excessive,
deficient, or inappropriate treatment (FDA, 2021b, p. 16).
The Agency ties validations to a risk-based approach. Like MHRA (Medicines and Healthcare
products Regulatory Agency), the FDA recommends that risk assessment be conducted as early
as possible and should include cross-functional members such as legal, RA, QA, statistics and
research departments (MHRA, 2022). In this research, respondents also found the risk
management process to be challenging. They identified diverse concerns in early stages of
implementation including “Which risks were being mitigated and how to accept the ones that
were not for exploratory purposes. What do we need to really consider for a go/no go decision”;
“Lack of expertise”; and “Understanding the User Groups - Usability Risks”. As stated by one
respondent, “Risk assessment changes based on immediate use and ultimate use and can change
the potential regulatory oversight”.
The complications of validation and risk evaluation are compounded by the fact that most
DHTs require the cooperation of patients or caregivers. Survey respondents in both CW and SP
groups identified that ‘incorrect use by participants’ was an early concern as validations were
planned and risk assessments were developed. Not only must the in-house clinical staff
understand the DHT but clinical support personnel at the sites must be knowledgeable about the
devices so that they can recognize when the devices are malfunctioning. They also must interact
with and train the users, who typically vary in their cognitive capability, technical expertise and
motivation. Their concerns reflect comments made by Tracy Smith, in a recent webinar titled,
Clinical Trials Moving From Site to Home—Lessons Learned from Digital Health Technologies,
who spoke about the importance of training and support to reduce the burden for patients and
improve data generation and collection (IQVIA, 2023). In addition to those potential concerns
mentioned above, other responses from industry when deciding to use CW and SP included
‘insufficient compliance experience’, ‘insufficient technical expertise to perform validations’ and
‘insufficient clarity on regulatory requirements’. The draft guidance (FDA, 2021b) warns that
sponsors and stakeholders (e.g. clinical investigators and other regulated entities) have the
ultimate responsibility to ascertain that their technology fits the intended purpose of the study, so
a clear understanding is needed to be sure that the DHT will be accepted as a valid tool to collect
clinical outcomes (Sharpe, 2021).
5.3.3 Initial Implementation
During the initial implementation phase, teams learn to exercise new skills and practices
before fully integrating the DHT into accepted procedures (Fixsen, 2009). In this dynamic phase,
different approaches are used as practitioners try to achieve target benchmarks. It was clear from
survey responses that some of the challenges identified in the installation phase continued into
the implementation phase. Resource limitations continued to exist related to time, staffing and
cost. However, these limitations were associated with somewhat different activities, including
the time needed to train personnel and subjects, the time and expertise needed to complete risk
assessments and validations (including efforts to secure the required financial resources) and
delays associated with FDA interactions.
5.3.3.1 Technical Expertise and Training
One set of risks underestimated as implementation begins appears to be the time
commitment required for training personnel and patients. These issues were found to be most
impactful in causing study delays and complications, illustrated by comments such as ‘trial
participants made errors’; ‘participants were not compliant’; and the ‘device did not work as
anticipated’. Such challenges probably also accounted for the fact that ‘time to train subjects’
was rated somewhat higher for SP than CW. Software platforms can be difficult to design,
especially if they are to be used by patients who can have limited technical or cognitive
capabilities (Baig, 2017). Challenges are not restricted to the training of patients. Study
coordinators who often lack technical or computing expertise may have to become adept with
novel operations such as using a specialized watch, making iPod pairings or assuring software
and portal functionality. In the face of these challenges, it is understandable why ‘meeting
timelines’ was identified as a major challenge. Further, as noted by Richeson, training must be
repeated when new personnel are added or changes to a mobile platform are released (Richeson,
2017). The FDA also recommends that study participants be reassessed and retrained if the
mobile tool is complex or poses a significant risk to the conduct of the study even if it has not
been significantly changed over time.
5.3.3.2 Interactions with FDA
Considering that many respondents acknowledged problems when trying to understand
FDA expectations, it was not surprising that more than half of respondents working with CWs
and a third with SPs had participated in one or more meetings with FDA staff. Those interactions
appear to be well appreciated; most considered them to be very helpful to provide guidance for
decision making on the use of the DHT. However, the time needed to prepare for and have those
discussions was an issue for some respondents. It was ranked as a key factor that could delay the
trial almost as much as did the challenges of training. Notably, the FDA recognizes the need to
improve its ability to provide more detailed and timely assistance, perhaps outside of the meeting
format. It has identified plans to expand its own technical expertise and to develop training
programs within the Agency to increase its own knowledge and sensitivity to issues related to the
use of DHTs.
Validations are a particular source of concern for FDA. Foci include assessments of
verification and validation requirements and the possibility of including usability and
interoperability studies as part of the validation; the use of a participant’s own DHT or a general
purpose computing platform to ensure that measurements are standardized across different
protocol-specified DHTs; and the impact of upgrades and updates to understand what constitutes
“a meaningful difference in results observed before and after the updates and how the differences
impact interpretability of those results in their context of use” (FDA, 2023d, p. 10). Ongoing
communication will be important because validation plans may change based on field experience
after the DHT is deployed for use in the trial. Further, the numerous types of DHTs stand in the
way of a ‘one size fits all’ plan. This concern was also voiced by survey respondents:
Ease of use for the wearable and patient compliance. Access to raw data from
manufacturers. Stability of the platform (ie will there be hardware or software
updates happening mid-study that may impact the data usability). One note, many
of our wearables are not consumer devices but operate in a similar manner (eg
actigraphy). Just a small difference in that we can purchase for use in our clinical
trials but the tools aren't available off the shelf for consumers.
and
Separate compliance from acceptance. You may be perfectly compliant with a
device, but that means nothing in terms of anyone accepting the evidence. These
are two very different things. Part 11 reqs are poorly understood and applied to
this kind of devices, with definitions needed for data processing, what is a clinical
trial system, what is source data etc... most people are not even thinking about
these.
Another area that must be revisited as companies obtain their first real world experience
are the preliminary risk assessments begun in installation. Those assessments can be tested
against user experiences as implementation proceeds. They can drive revisions of the protocol
and of the benefit-risk relationship that may have to be communicated to the FDA. Another area
of concern that may be recognized only after the DHT is deployed relates to the functionality of
the software in the devices. Concerns that ‘operating system upgrades caused software
incompatibility’ shine a light on early implementation as a time when unanticipated risks are
recognized and must be addressed (MHRA, 2022).
5.3.4 Full Implementation
During full implementation, the organization is at a point where the chosen DHT is an
established part of the clinical protocol. Continuing are challenges that had already been
identified during earlier stages, such as ‘adequate time’, ‘training’ and ‘conducting risk
assessments’. The primary tasks at this later stage, however, seem to focus on maintaining the
operational capabilities of the devices and assuring their ongoing compliance with regulations.
These concerns are reflected in comments about revisions or maintenance activities:
‘reassessments when regulatory requirements/guidances changed’, ‘managing identified risks’
and ‘performing reverifications and revalidations’.
The ongoing management of risks is important for most aspects of product development,
and this is no different for the management of DHTs. The challenge is exacerbated when regulatory
requirements change mid-trial, but even without such changes, risk reassessment and control will
need ongoing attention. Risk management can only be carried out effectively if experience in the
field is used effectively. The feedback can help to assure that risks have been identified
appropriately and that control measures are working well (ICH, 2023b). In some cases, that
feedback may require a minor change that does not impact the benefit-risk relationship, if it does
not change the following:
the validity of the data or information resulting from the completion of the
approved protocol, or the relationship of likely patient risk to benefit relied upon
to approve the protocol; the scientific soundness of the investigational plan; or
the rights, safety, or welfare of the human subjects involved in the investigation
(FDA, 2001, p. 2-3).
In this case, the change could be simply described in the annual report. If, however, the change
adversely impacts the risk profile, a supplement must be submitted to the Agency. As stated by
FDA, “Any change to the basic principles of operation of a device is considered to be a
significant change and, thus, requires prior FDA approval” (FDA, 2001, p. 16). Sponsors of the
trial may find that these exchanges require additional time and resources to satisfy and so can
delay timelines.
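The reporting logic quoted from FDA (2001) reduces to a simple triage, sketched below under the assumption that each criterion can be answered yes or no. The function name and parameter names are hypothetical illustrations chosen for readability, not regulatory terminology.

```python
def reporting_route(affects_data_validity_or_benefit_risk: bool,
                    affects_scientific_soundness: bool,
                    affects_subject_rights_safety_welfare: bool) -> str:
    """Illustrative triage of a mid-trial change against the FDA (2001) criteria:
    a change touching none of the three may simply be described in the annual
    report; otherwise a supplement must be submitted to the Agency first."""
    if (affects_data_validity_or_benefit_risk
            or affects_scientific_soundness
            or affects_subject_rights_safety_welfare):
        return "prior-approval supplement"
    return "annual report"
```

For example, a cosmetic firmware update that leaves the device's basic principles of operation untouched would route to the annual report, whereas any change to those basic principles would route to a supplement requiring prior FDA approval.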
Survey results showed that validation continues to be a concern as implementation
progresses but the challenges center around concerns about revalidation. Expectations for
revalidation have not received much attention in the literature to date. Because the use of DHTs
is quite new, many companies may only now be reaching a stage of full implementation when
revalidation becomes an issue. Further, industry may not recognize it as a separate activity
specifically designated as ‘revalidation’, but instead consider it as part of its “validation of
changes” which “should be based on risk and consider both previously collected and new data”
(ICH, 2023a). The nomenclature around revalidation is currently fuzzy. For example, the draft
version of the ICH E6 (R3) talks about achieving quality datasets by “implementing timely and
reliable processes for data capture, verification, validation, review and rectification of errors and
omissions that have a meaningful impact on the safety of trial participants and/or the reliability
of the trial results” (ICH, 2023a). This mention of ‘review and rectification’ appears to be
synonymous with the terms ‘reassessments’, ‘revalidation’ and ‘reverifications’.
Validation and revalidation appear to be activities in which industry as well as
governmental agencies have a vested interest to improve and systematize. Respondents
underlined this need by stating that “enforcement and expectations vary far too widely” and that
“one must become their own expert to get this done with remote possibility of being effective”.
How do these stakeholders then go forward? The public-private partnership, CTTI, is one
organization where some of the challenges identified in the later stages of implementation and
validation might be examined. CTTI has a stated mission to identify legal, regulatory, and
practical barriers to DCTs and inform policies that affect their implementation (Medable, 2021).
Through its collaborations, it is well placed to provide a forum for such discussions, but to date,
most of its work has been directed at the earlier stages of selecting and managing the right
devices (Izmailova, 2018). It has not yet been as active in addressing the requirements for
reassessments in the full implementation stage.
The whole validation process, including needs for revalidations and reverifications, could
also profit from additional guidance from recognized expert sources. It might be a topic
well-suited for study by standards-setting or guidance-developing organizations such as ISO
(International Organization for Standardization) or ICH (International Council for
Harmonisation of Technical Requirements for Pharmaceuticals for Human Use). Early planning
and coordination in partnership with regulatory agencies and industry partners would allow
linkages across datasets and help to ensure that validation processes are completed in a more
standardized way (Izmailova, 2018, Baig, 2017).
5.3.5 Conclusions and Future Directions
Decentralized trials are becoming more common because they can increase patient
recruitment and reduce trial costs. Key to decentralization, however, is the ability to use digital
health technologies instead of human staff, who previously had to record trial data and monitor
the health of patients as they visited brick-and-mortar clinical sites. Yet DHTs are not trivial to
manage. Confusion related to regulatory requirements and resource needs was commonly
identified in this study, especially in areas such as training of users and ongoing maintenance and
validation of the DHTs. The results suggest that a concerted effort amongst stakeholders,
including industry and regulatory agencies, will be needed to realize the full potential of this
technology. This is particularly important because DHTs have a broad range of applications
across drug and device development, so no “one-size-fits-all” evidence framework will likely be
sufficient to meet current areas of challenge. Different approaches may be needed to manage
different types of devices in the same way that guidance documents are often available for
individual classes of medical devices with specific performance requirements.
One purpose of this research has been to assist companies to identify where they will face
the greatest hurdles when trying to implement DHTs. The research suggests that the installation
and use of DHTs can be more expensive than it might seem at first glance, so companies should
plan early to assure that sufficient resources and experienced staff are put into place, especially
in areas of training and validation. It also suggests that ongoing interactions may be needed with
FDA until more standardized approaches and regulatory guidance have been provided in some
areas of concern. Participation of industry in collaborations with other companies and
professional organizations may help to develop best practices. Efforts to standardize approaches
to training and validation might help to relieve the FDA, which now carries the burden
of clarifying expectations as a primary educator. They would also help industry to design validations
with more confidence, and would improve the quality of data that is ultimately collected. Without
assurance that the DHTs and their associated data outputs will be accepted by FDA, many
companies will be skeptical about introducing a new technology into an already expensive and
complicated trial protocol.
The current dissertation focused on the compliance of DHTs with Part 11 requirements to
be sure that the data collected by those devices will be accepted by regulatory agencies and
scientific experts. However, these challenges should not be viewed as the only areas in which the
use of DHTs may need more study. For example, software management and cybersecurity were
not investigated here. Further research will be needed to explore the current views and
experiences when assuring privacy and security from data breaches or software incursions,
regulated by the US Health Insurance Portability and Accountability Act as well as a variety of
trial-related regulations.
In conclusion, DHTs are a novel and powerful approach to data collection and analysis in
clinical trials. As they coevolve with the use of machine learning, artificial intelligence,
distributed hash tables and blockchain technology for healthcare, they have great promise to
decrease the time required to complete clinical trials, decrease costs, improve patient access and
the quality of research data, all while ensuring data security and privacy. However, the results of
this dissertation may provide insight into areas of challenge when implementing DHTs in an
environment where compliance with regulations could limit entry into mainstream clinical trials.
Appendices
Industry Survey
Industry Survey - 21 CFR Part 11 Compliance of Digital Health Technologies in Clinical Trials
Start of Block: Block 1 - Demographics
Q1 Thank you for sharing your experiences about how you comply with 21 CFR Part 11
(identified below as “Part 11”) when you work with digital health technologies in your clinical
trials. Your responses will be kept anonymous. We will ask you about your familiarity with two
types of devices, consumer wearables (e.g. smart watches) and software platforms to collect
patient reported outcomes (e.g. eDiaries). We will give you the choice of answering questions
about one or both of these device types. If you find that you cannot answer a question, please
feel free to skip it or to choose the “not sure/I don’t know” options.
Q2 Please tell us about the organizations with which you have worked in the last five years.
(Select all that apply).
▢Contract Research Organization (1)
▢Pharmaceutical/Biotechnology Company (2)
▢Medical Device/IVD Company (3)
▢Consulting Company (4)
▢Other, please specify: (5)
__________________________________________________
Q3 With which functional groups have you been affiliated in the past and presently? (Select all
that apply)
▢Clinical Trials Team (Data Management, Clinical Operations, Biostatistics, Medical
Writing etc.) (1)
▢Regulatory Affairs Team (5)
▢Quality Assurance Team (2)
▢Information Systems/Digital Technology Team (3)
▢Other, please specify: (4)
__________________________________________________
Q4 What is your current role?
o Vice President/President (1)
o Director/Senior Director/Executive Director (2)
o Manager/Senior Manager (3)
o Specialist/Associate (4)
o Consultant (6)
o Other, please specify: (5)
__________________________________________________
Q5 What is the size of your company?
o Less than 200 employees (1)
o 201-2000 employees (2)
o 2001-20,000 employees (3)
o More than 20,000 employees (6)
o Not sure (4)
Q6 How many clinical trials does your company conduct in a year?
o None (1)
o 5 or less (2)
o 6-20 (5)
o 21-100 (6)
o More than 100 (3)
o Not sure (4)
Q7 With how many clinical trials have you personally worked over the past 5 years?
o None (1)
o 5 or less (2)
o 5-15 (5)
o More than 15 (3)
o Not sure (4)
Q8 With what phase(s) of clinical trials do you have experience? (Select all that apply).
▢Phase 1 (1)
▢Phase 2 (2)
▢Phase 3 (3)
▢Phase 4 (4)
▢None (5)
▢Not sure (6)
Q9 How familiar are you with the following regulations, standards and guidances? (Rate each
item: No knowledge (1); Some knowledge (2); A lot of knowledge (3); Expert knowledge (4))
o 21 CFR Part 11 Regulation (1)
o Good Clinical Practices (GCP) (2)
o 2003 - Guidance for Industry Part 11, Electronic Records; Electronic Signatures — Scope and
Application (3)
o 2007 - Guidance for Industry Computerized Systems Used in Clinical Investigations (4)
o 2017 - Use of Electronic Records and Electronic Signatures in Clinical Investigations Under
21 CFR Part 11 – Questions and Answers (5)
o 2022 - Digital Health Technologies for Remote Data Acquisition in Clinical Investigations (6)
Q10 How would you characterize your level of experience with Part 11 compliance for clinical
trials?
o Expert (1)
o Intermediate (2)
o Beginner (3)
o None (4)
Q11 What challenges did you face when trying to understand the role of Good Clinical Practices
(GCPs) in implementing compliance with Part 11 for clinical trials? (Rate each item: Most
Challenging (1); Very Challenging (2); Marginally Challenging (3); Not Challenging (4); Not
Sure (5))
o Understanding GCPs to implement 21 CFR Part 11 compliance (2)
o Having sufficient regulatory requirements/guidances (3)
o Incorporating IT standards such as NIST, ISO, CDISC (5)
o Other, please specify: (4)
Q12 Have you or your company explored using any of the following digital health technologies
for your clinical trials: consumer wearables (e.g. smart watches, step trackers, vital sign
monitors) or software platforms to capture patient reported outcomes or experiences (e.g. to
monitor compliance with therapy, to record activities of daily living, to respond to surveys, to
report on clinical performance outcome measures)? (Select all that apply)
▢Yes, Consumer Wearables (1)
▢Yes, Software Platforms (3)
▢No (2)
Skip To: End of Block If Have you or your company explored using any of the following digital health
technologies for your... = No
Q13 This survey has 2 blocks of questions, one to explore your experience with consumer
wearables and the second to explore your experience with software platforms collecting patient
reported outcomes. Which digital health technology questions would you like to answer now?
o Consumer Wearables (1)
o Software Platforms (2)
o Neither (4)
End of Block: Block 1 - Demographics
Start of Block: Block 6 - Generic Concluding Questions
Q1 Does your company have plans to use any digital health technologies for their clinical trials?
o Definitely not (1)
o Probably not (2)
o Might or might not (3)
o Not sure (4)
o Probably yes, provide details on which ones: (5)
__________________________________________________
o Definitely yes, provide details on which ones: (6)
__________________________________________________
Q2 How well do you think that your company understands Part 11 requirements for clinical
trials?
o Very confident (1)
o Confident (2)
o Moderately confident (3)
o Not very confident (4)
o Not sure (5)
Q3 In one word, how would you characterize FDA guidances to support the Part 11 regulation?
o Confusing (1)
o Excessive (2)
o Insufficient (3)
o Reasonable (4)
o Other, please specify: (6)
__________________________________________________
Q4 Thank you for supporting my doctoral work. I appreciate your time and value your input on
this topic. I will be providing a brief summary of the results to those who have participated in the
survey once the analysis of the survey is complete. If you wish to receive the summary, please
provide your email address below.
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
End of Block: Block 6 - Generic Concluding Questions
Start of Block: Block 2 - Consumer Wearables
Q1 If you have had experience with more than one consumer wearable, please answer this
section of the survey with your most recent experiences.
Thank you for completing this survey.
Q2 Was the consumer wearable provided by the sponsor of the trial (i.e. provisioned) or did the
clinical trial participant bring their own device (i.e. BYOD)?
o Provisioned device (1)
o BYOD (2)
o Both (3)
Q3 How was/will the data from the consumer wearable be used to support the study? (Select all
that apply)
▢Pilot, feasibility studies (1)
▢Early phase exploratory studies (2)
▢Late phase confirmatory studies (3)
▢Phase IV studies (4)
▢Not sure (5)
Q4 What roles have you played in assuring Part 11 compliance for consumer wearables?
(Select all that apply)
▢Selection of the device (1)
▢Implementation of the device (2)
▢Validation of the device (3)
▢Quality Assurance of the device (4)
▢Communication to regulators (5)
▢Monitoring field performance (6)
▢Training the trainer or other users (7)
▢Other, please specify: (8)
__________________________________________________
Q5 When you explored using consumer wearables in a clinical trial, what sources of information
did you use to educate yourself about regulatory requirements?
(Matrix question; each row rated: Used, very helpful (1) / Used, moderately helpful (2) / Used, but not helpful (3) / Have not used (4))
Regulations and guidance documents (1)
Discussion/meeting with regulatory authorities (3)
Manufacturer’s documentation (4)
Trade journals, news articles and blogs (5)
Seminars/conferences (6)
Input from consultants (7)
Input from colleagues (9)
Landscape analysis, Clinicaltrials.gov (2)
Other, please specify: (8)
Q6 How important were the following factors when deciding to adopt consumer wearables in
your clinical trials?
(Matrix question; each row rated: Very Important (1) / Somewhat Important (2) / Not Important (3) / Not Sure (4))
Corporate strategy (1)
Capabilities to satisfy Part 11 and Good Clinical Practices (GCP) (2)
Ease of use (4)
Time to validate (5)
Flexibility in data collection (7)
Feedback from FDA (9)
Other, please specify: (10)
Q7 Before beginning your trial, did your company ensure that the consumer wearable was
“fit-for-purpose,” i.e., the level of validation was sufficient to support its use in the clinical
investigation?
o Yes (1)
o No (2)
o Not sure (3)
Q8 In your opinion, rank these potential concerns in order of importance when implementing
consumer wearables. (Drag and drop your response)
1. ______ Cumbersome risk assessments (1)
2. ______ Unclear requirements for verifications (2)
3. ______ Cumbersome validations (3)
4. ______ High cost of testing (4)
5. ______ Insufficient compliance experience (6)
6. ______ Incorrect use by participants (5)
Q9 Are there additional comments or concerns you would like to share other than those
mentioned above?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q10 Did you ultimately incorporate consumer wearables for data collection?
o Yes (1)
o Still under consideration (3)
o Not sure (4)
o No, comment why not? (2)
__________________________________________________
Skip To: Q25 If Did you ultimately incorporate consumer wearables for data collection? = No, comment
why not?
Q11 What challenges affected your use of consumer wearables for your clinical trial?
(Matrix question; each row rated: No influence (1) / Some influence (2) / A lot of influence (3) / Cannot answer (4))
Lack of leadership support (1)
Insufficient clarity on regulatory requirements (2)
Insufficient budget allocation (3)
Insufficient understanding of validation methods (4)
Insufficient technical expertise to perform validations (5)
Other, please specify: (6)
Q12 Did the preparations to use the consumer wearables delay the start date of the clinical
trial?
o No (1)
o Not sure (2)
o Yes, with the following delay (approximate number of months): (3)
__________________________________________________
Skip To: Q14 If Did the preparations to use the consumer wearables delay the start date of the clinical
trial? = No
Q13 How impactful were the following factors in delaying the use of consumer wearables in your
company’s clinical trial?
(Matrix question; each row rated: Most impactful (1) / Impactful (2) / Marginally Impactful (3) / Not impactful (4) / Not sure (5))
Time required to complete risk assessments (1)
Time required to complete validations (2)
Time to train personnel (3)
Time to train subjects (4)
Time required for communication discussions with the FDA (5)
Other, please specify: (6)
Q14 How challenging were the following in completing validations for consumer wearables?
(Matrix question; each row rated: Most challenging (1) / Very challenging (2) / Moderately challenging (3) / Not challenging (4) / Not Sure (5))
Understanding Good Clinical Practices (GCPs) related to validations (2)
Documenting plans and procedures (3)
Assuring manpower to perform validations (4)
Ensuring knowledge to perform validations (5)
Meeting timelines (6)
Securing financial resources (7)
Q15 As you prepared to use consumer wearables in your study protocol, did you set up
meetings with the FDA to discuss the necessary compliance requirements? (Select all that
apply)
▢Yes, for early stage exploratory data (1)
▢Yes, for end point data collection (5)
▢No (2)
▢Not sure (3)
Skip To: Q19 If As you prepared to use consumer wearables in your study protocol, did you set up
meetings with th... = No
Skip To: Q19 If As you prepared to use consumer wearables in your study protocol, did you set up
meetings with th... = Not sure
Q16 How long did it take to set up and complete the discussions with the FDA before you
implemented the consumer wearable?
o Almost immediate, e.g., phone call discussion (1)
o Up to 6 months (2)
o 6-12 months (3)
o Greater than 12 months (4)
o Not sure (5)
o Not Applicable (6)
Q17 Did you have enough time to address the FDA’s feedback regarding the requirements for
Part 11 compliance before your trial started?
o No (1)
o Yes (2)
o Not sure (3)
o Not Applicable, please comment: (4)
__________________________________________________
Q18 When thinking about your discussions with FDA, do you agree or disagree with the
following statements regarding those discussions?
(Matrix question; each row rated: Disagree (1) / Neither disagree nor agree (2) / Agree (3) / Not sure (4))
Provided sufficient information for decision-making (2)
Provided opportunity for follow up/discussion (3)
Yielded a clear understanding of potential impact on data collection (4)
Provided sufficient advice about data needed for a marketing application (5)
Q19 Have you carried out risk assessments related to the consumer wearables?
o Yes (1)
o No (2)
o Did not need risk assessments as the device/sensor was validated previously for this
purpose (3)
o Not sure (4)
Skip To: Q21 If Have you carried out risk assessments related to the consumer wearables? = No
Skip To: Q21 If Have you carried out risk assessments related to the consumer wearables? = Not sure
Skip To: Q21 If Have you carried out risk assessments related to the consumer wearables? = Did not
need risk assessments as the device/sensor was validated previously for this purpose
Q20 What challenges did you face when performing risk assessments?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q21 When you first used the consumer wearables for the clinical trial, what were the
challenges?
(Matrix question; each row rated: Very challenging (2) / Moderately challenging (3) / Not challenging (4) / Not sure (5))
Trial participants made errors (2)
Participants were not compliant (3)
Device did not work as anticipated (4)
Devices failed (5)
Device updates changed data or algorithms (6)
Operating system upgrades caused software incompatibility (7)
Data transmission was unreliable (8)
Technical support was insufficient (9)
Q22 Are there additional comments or concerns you would like to share other than those
mentioned above?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q23 How important were the following factors in assuring ongoing Part 11 compliance for
consumer wearables as the clinical trial progressed?
(Matrix question; each row rated: Extremely Important (1) / Very Important (2) / Moderately important (3) / Not important (4) / Not sure (5))
Reassessments when regulatory requirements/guidances changed (1)
Organizational readiness (i.e., update procedures, processes, trainings, etc.) (3)
Adequate manpower (4)
Adequate finances (5)
Adequate time (6)
Q24 How confident were you that the following activities were successful in maintaining
compliance with Part 11?
(Matrix question; each row rated: Very confident (1) / Confident (2) / Not very confident (3) / Not sure (4))
Conducting risk assessments (1)
Managing identified risks (2)
Performing reverifications and revalidations (3)
Managing software updates (5)
Updating trial protocols and procedures (6)
Training new staff (7)
Adding manpower (8)
Allocating sufficient time (9)
Q25 Thank you for your valuable input and time to complete this block of questions. Would you
be willing to answer questions about software platforms used to collect patient reported
outcomes? We value your insights as an industry expert.
o Yes (1)
o No (2)
o Have already completed that section (4)
End of Block: Block 2 - Consumer Wearables
Start of Block: Block 5 - Concluding Questions
Q1 In one word, how would you characterize FDA guidances to support the Part 11 regulation?
o Confusing (1)
o Excessive (2)
o Insufficient (3)
o Reasonable (4)
o Other, please specify: (6)
__________________________________________________
Q2 Before deploying the discussed digital health technologies (consumer wearables, software
platforms for patient reported data and FDA cleared biosensors) for your clinical trial did you
consider cybersecurity risks?
o Yes (1)
o No (2)
o Not sure (3)
Q3 Does your company have procedures to handle a data breach?
o Yes (1)
o No (2)
o Not sure (3)
Q4 Do you feel that your company pays sufficient attention to assure that employees
understand and stay current with the requirements of Part 11 compliance for digital health
technologies?
o Yes (1)
o No (2)
o Not sure (3)
Q5 Is there anything else you would like to tell us about your experiences when implementing
Part 11 compliance for discussed digital health technologies (consumer wearables, software
platforms for patient reported data and/or FDA cleared biosensors) in clinical trials that I may not
have captured?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q6 Thank you for supporting my doctoral work. I appreciate your time and value your input on
this topic. I will be providing a brief summary of the results to those who have participated in the
survey once the analysis of the survey is complete. If you wish to receive the summary, please
provide your email address below.
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
End of Block: Block 5 - Concluding Questions
Start of Block: Block 3 - Software Platforms
Q1 If you have had experience with more than one software platform to collect patient reported
outcomes, please answer this section of the survey with your most recent experiences.
Thank you for completing this survey.
Q2 For what purpose(s) has your company considered software platforms for patient reported
outcomes? (Select all that apply).
▢To monitor compliance with therapy (1)
▢To record activities of daily living (2)
▢To respond to surveys (3)
▢To report on clinical or performance outcome measures (4)
▢Other, please specify: (5)
__________________________________________________
Q3 How was/will the data from the software platform be used to support the study? (Select all
that apply)
▢Pilot, feasibility studies (1)
▢Early phase exploratory studies (2)
▢Late phase confirmatory studies (3)
▢Phase IV studies (4)
▢Not sure (5)
Q4 What roles have you played in assuring Part 11 compliance for software platforms? (Select
all that apply)
▢Selection of the device (1)
▢Development/modification of the device (2)
▢Implementation of the device (3)
▢Validation of the device (4)
▢Quality Assurance of the device (5)
▢Communication to regulators (6)
▢Monitoring field performance (7)
▢Training the trainer or other users (8)
▢Other, please specify: (9)
__________________________________________________
Q5 When you explored using software platforms in a clinical trial, what sources of information
did you use to educate yourself about regulatory requirements?
(Matrix question; each row rated: Used, very helpful (1) / Used, moderately helpful (2) / Used, but not helpful (3) / Have not used (4))
Regulations and guidance documents (1)
Discussion/meeting with regulatory authorities (3)
Manufacturer’s documentation (4)
Trade journals, news articles and blogs (5)
Seminars/conferences (6)
Input from consultants (7)
Input from colleagues (9)
Landscape analysis, Clinicaltrials.gov (2)
Other, please specify: (8)
Q6 How important were the following factors when deciding to adopt software platforms in your
clinical trials?
(Matrix question; each row rated: Very Important (1) / Somewhat Important (2) / Not Important (3) / Not Sure (4))
Corporate strategy (1)
Capabilities to satisfy Part 11 and Good Clinical Practices (GCP) (2)
Ease of use (4)
Time to validate (5)
Flexibility in data collection (7)
Feedback from FDA (9)
Other, please specify: (10)
Q7 Before beginning your trial, did your company ensure that the software platform was
“fit-for-purpose,” i.e., the level of validation was sufficient to support its use in the clinical
investigation?
o Yes (1)
o No (2)
o Not sure (3)
Q8 In your opinion, rank these potential concerns in order of importance when implementing the
software platforms. (Drag and drop your response)
1. ______ Cumbersome risk assessments (1)
2. ______ Unclear requirements for verifications (2)
3. ______ Cumbersome validations (3)
4. ______ High cost of testing (4)
5. ______ Insufficient compliance experience (6)
6. ______ Incorrect use by participants (5)
Q9 Are there additional comments or concerns you would like to share other than those
mentioned above?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q10 Did you ultimately incorporate software platforms for patient reported outcomes in your
clinical trials?
o Yes (1)
o Still under consideration (3)
o Not sure (4)
o No, comment why not? (2)
__________________________________________________
Skip To: Q25 If Did you ultimately incorporate software platforms for patient reported outcomes in your
clinical... = No, comment why not?
Q11 What challenges affected your use of software platforms for patient reported outcomes in
your clinical trial?
(Matrix question; each row rated: No influence (1) / Some influence (2) / A lot of influence (3) / Cannot answer (4))
Lack of leadership support (1)
Insufficient clarity on regulatory requirements (2)
Insufficient budget allocation (3)
Insufficient understanding of validation methods (4)
Insufficient technical expertise to perform validations (5)
Other, please specify: (6)
Q12 Did the preparations to use the software platforms for patient reported outcomes cause
delays in the start date of the clinical trial?
o No (1)
o Not sure (2)
o Yes, with the following delay (approximate number of months): (3)
__________________________________________________
Skip To: Q14 If Did the preparations to use the software platforms for patient reported outcomes cause
delays in... = No
Q13 How impactful were the following factors in delaying the use of software platforms in your
company’s clinical trial?
(Matrix question; each row rated: Most impactful (1) / Impactful (2) / Marginally Impactful (3) / Not impactful (4) / Not sure (5))
Time required to complete risk assessments (1)
Time required to complete validations (2)
Time to train personnel (3)
Time to train subjects (4)
Time required for communication discussions with the FDA (5)
Other, please specify: (6)
Q14 How challenging were the following in completing validations for software platforms
recording patient reported outcomes?
(Matrix question; each row rated: Most challenging (1) / Very challenging (2) / Moderately challenging (3) / Not challenging (4) / Not Sure (5))
Understanding Good Clinical Practices (GCPs) related to validations (2)
Documenting plans and procedures (3)
Assuring manpower to perform validations (4)
Ensuring knowledge to perform validations (5)
Meeting timelines (6)
Securing financial resources (7)
Q15 As you prepared to use software platforms for patient reported data in your study protocol,
did you set up meetings with the FDA to discuss the necessary compliance requirements?
o Yes, for early stage exploratory data (1)
o Yes, for end point data collection (4)
o No (2)
o Not sure (3)
Skip To: Q19 If As you prepared to use software platforms for patient reported data in your study protocol,
did y... = No
Skip To: Q19 If As you prepared to use software platforms for patient reported data in your study protocol,
did y... = Not sure
Q16 How long did it take to set up and complete the discussions with the FDA before you
implemented the software platform?
o Almost immediate, phone call discussion (1)
o Up to 6 months (2)
o 6-12 months (3)
o Greater than 12 months (4)
o Not sure (5)
o Not Applicable (6)
Q17 Did you have enough time to address the FDA’s feedback regarding the requirements for
Part 11 compliance before your trial started?
o No (1)
o Yes (2)
o Not sure (3)
o Not Applicable, please comment: (4)
__________________________________________________
Q18 When thinking about your discussions with FDA, do you agree or disagree with the
following statements regarding those discussions?
(Matrix question; each row rated: Disagree (1) / Neither disagree nor agree (2) / Agree (3) / Not sure (4))
Provided sufficient information for decision-making (2)
Provided opportunity for follow up/discussion (3)
Yielded a clear understanding of potential impact on data collection (4)
Provided sufficient advice about data needed for a marketing application (5)
Q19 Have you carried out risk assessments related to the software platforms for patient
reported outcomes?
o Yes (1)
o No (2)
o Did not need risk assessments because the platform was validated previously for this
purpose (3)
o Not sure (4)
Skip To: Q21 If Have you carried out risk assessments related to the software platforms for patient
reported outc... = No
Skip To: Q21 If Have you carried out risk assessments related to the software platforms for patient
reported outc... = Not sure
Skip To: Q21 If Have you carried out risk assessments related to the software platforms for patient
reported outc... = Did not need risk assessments because the platform was validated previously for this
purpose
Q20 Did you have any challenges with performing the risk assessments?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q21 When you first used the software platforms for patient reported outcomes for the clinical
trial, what were the challenges?
(Matrix question; each row rated: Very challenging (2) / Moderately challenging (3) / Not challenging (4) / Not sure (5))
Trial participants made errors (2)
Participants were not compliant (3)
Device did not work as anticipated (4)
Devices failed (5)
Device updates changed data or algorithms (6)
Operating system upgrades caused software incompatibility (7)
Data transmission was unreliable (8)
Technical support was insufficient (9)
Q22 Are there additional comments or concerns you would like to share other than those
mentioned above?
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
________________________________________________________________
Q23 How important were the following factors in assuring ongoing Part 11 compliance for
software platforms as the clinical trial progressed?
(Matrix question; each row rated: Extremely Important (1) / Very Important (2) / Moderately important (3) / Not important (4) / Not sure (5))
Reassessments when regulatory requirements/guidances changed (1)
Organizational readiness (i.e., update procedures, processes, trainings, etc.) (3)
Adequate manpower (4)
Adequate finances (5)
Adequate time (6)
Q24 How confident were you that the following activities were successful in maintaining
compliance with Part 11?
(Matrix question; each row rated: Very confident (1) / Confident (2) / Not very confident (3) / Not sure (4))
Conducting risk assessments (1)
Managing identified risks (2)
Performing reverifications and revalidations (3)
Managing software updates (5)
Updating trial protocols and procedures (6)
Training of new staff (7)
Adding manpower (8)
Allocating sufficient time (9)
Q25 Thank you for your valuable input and time to complete this block of questions. Would you
be willing to answer the same type of questions for consumer wearables (e.g., smart watches)?
We appreciate your insights as an industry expert.
o Yes (1)
o No (2)
o Have already completed that section (4)
End of Block: Block 3 - Software Platforms
Cross Tabulations
Table 27: Q2: How well do you think that your company understands Part 11 requirements for clinical trials?
(Cross-tabulated by Q2: Please tell us about the organizations with which you have worked in the last five years. (Select all that apply). Cells show count (column %).)

Response | Total | Contract Research Organization | Pharma/Biotech Company | Medical Device/IVD Company | Consulting Company | Other, please specify:
Very confident | 6 (6.7%) | 1 (12.5%) | 2 (5.0%) | 1 (4.3%) | 1 (8.3%) | 1 (16.7%)
Confident | 8 (9.0%) | 1 (12.5%) | 4 (10.0%) | 0 (0.0%) | 2 (16.7%) | 1 (16.7%)
Moderately confident | 3 (3.4%) | 0 (0.0%) | 2 (5.0%) | 0 (0.0%) | 1 (8.3%) | 0 (0.0%)
Not very confident | 1 (1.1%) | 0 (0.0%) | 0 (0.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Not sure | 2 (2.2%) | 0 (0.0%) | 1 (2.5%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
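Each cell in these cross tabulations reports a count together with a column percentage, i.e., the count divided by the total number of respondents in that column. As an illustration only, and not the dissertation's actual analysis pipeline, the same computation can be sketched in Python with pandas; the respondent records below are hypothetical:

```python
# Illustrative sketch: cross-tabulate survey responses and derive the
# "count (column %)" cells used in Tables 27-32. Hypothetical data only.
import pandas as pd

responses = pd.DataFrame({
    "organization": ["Pharma/Biotech", "Pharma/Biotech", "CRO",
                     "Consulting", "Pharma/Biotech", "CRO"],
    "q2_understanding": ["Very confident", "Confident", "Confident",
                         "Very confident", "Not sure", "Confident"],
})

# Raw counts: one cell per (response option, organization) pair.
counts = pd.crosstab(responses["q2_understanding"], responses["organization"])

# Column percentage = cell count / column total * 100,
# matching the percentages printed beneath each count in the tables.
col_pct = pd.crosstab(responses["q2_understanding"],
                      responses["organization"],
                      normalize="columns") * 100

print(counts)
print(col_pct.round(1))
```

Running the sketch prints the raw counts and the column percentages; for instance, both hypothetical CRO respondents chose "Confident," so that cell would read 2 (100.0%).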
Table 28: Q9: How familiar are you with the following regulations, standards and guidances?
(Cross-tabulated by Q2: Please tell us about the organizations with which you have worked in the last five years. (Select all that apply). Cells show count (column %).)

Familiarity | Total | Contract Research Organization | Pharma/Biotech Company | Medical Device/IVD Company | Consulting Company | Other, please specify:
Total Count (All) | 89 | 8 | 40 | 23 | 12 | 6

21CFR Part 11 Regulation
No knowledge | 2 (2.2%) | 0 (0.0%) | 0 (0.0%) | 1 (4.3%) | 0 (0.0%) | 1 (16.7%)
Some knowledge | 30 (33.7%) | 4 (50.0%) | 15 (37.5%) | 7 (30.4%) | 2 (16.7%) | 2 (33.3%)
A lot of knowledge | 37 (41.6%) | 3 (37.5%) | 16 (40.0%) | 9 (39.1%) | 6 (50.0%) | 3 (50.0%)
Expert Knowledge | 20 (22.5%) | 1 (12.5%) | 9 (22.5%) | 6 (26.1%) | 4 (33.3%) | 0 (0.0%)

Good Clinical Practices (GCP)
No knowledge | 2 (2.2%) | 0 (0.0%) | 1 (2.5%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Some knowledge | 15 (16.9%) | 2 (25.0%) | 4 (10.0%) | 6 (26.1%) | 1 (8.3%) | 2 (33.3%)
A lot of knowledge | 45 (50.6%) | 2 (25.0%) | 24 (60.0%) | 10 (43.5%) | 6 (50.0%) | 3 (50.0%)
Expert Knowledge | 27 (30.3%) | 4 (50.0%) | 11 (27.5%) | 6 (26.1%) | 5 (41.7%) | 1 (16.7%)

2003 - Guidance for Industry Part 11, Electronic Records; Electronic Signatures - Scope and Application
No knowledge | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Some knowledge | 37 (41.6%) | 5 (62.5%) | 16 (40.0%) | 9 (39.1%) | 3 (25.0%) | 4 (66.7%)
A lot of knowledge | 38 (42.7%) | 2 (25.0%) | 19 (47.5%) | 10 (43.5%) | 5 (41.7%) | 2 (33.3%)
Expert Knowledge | 14 (15.7%) | 1 (12.5%) | 5 (12.5%) | 4 (17.4%) | 4 (33.3%) | 0 (0.0%)

2007 - Guidance for Industry Computerized Systems Used in Clinical Investigations
No knowledge | 11 (12.4%) | 0 (0.0%) | 5 (12.5%) | 3 (13.0%) | 2 (16.7%) | 1 (16.7%)
Some knowledge | 37 (41.6%) | 6 (75.0%) | 17 (42.5%) | 8 (34.8%) | 3 (25.0%) | 3 (50.0%)
A lot of knowledge | 32 (36.0%) | 2 (25.0%) | 14 (35.0%) | 10 (43.5%) | 4 (33.3%) | 2 (33.3%)
Expert Knowledge | 9 (10.1%) | 0 (0.0%) | 4 (10.0%) | 2 (8.7%) | 3 (25.0%) | 0 (0.0%)

2017 - Use of Electronic Records and Electronic Signatures in Clinical Investigations Under 21 CFR Part 11 - Questions and Answers
No knowledge | 1 (1.1%) | 0 (0.0%) | 0 (0.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Some knowledge | 44 (49.4%) | 5 (62.5%) | 20 (50.0%) | 11 (47.8%) | 4 (33.3%) | 4 (66.7%)
A lot of knowledge | 31 (34.8%) | 2 (25.0%) | 15 (37.5%) | 7 (30.4%) | 5 (41.7%) | 2 (33.3%)
Expert Knowledge | 13 (14.6%) | 1 (12.5%) | 5 (12.5%) | 4 (17.4%) | 3 (25.0%) | 0 (0.0%)

2022 - Digital Health Technologies for Remote Data Acquisition in Clinical Investigations
No knowledge | 19 (21.3%) | 2 (25.0%) | 10 (25.0%) | 4 (17.4%) | 2 (16.7%) | 1 (16.7%)
Some knowledge | 39 (43.8%) | 4 (50.0%) | 15 (37.5%) | 11 (47.8%) | 6 (50.0%) | 3 (50.0%)
A lot of knowledge | 20 (22.5%) | 2 (25.0%) | 9 (22.5%) | 4 (17.4%) | 3 (25.0%) | 2 (33.3%)
Expert Knowledge | 11 (12.4%) | 0 (0.0%) | 6 (15.0%) | 4 (17.4%) | 1 (8.3%) | 0 (0.0%)
Table 29: Q11: What challenges did you face when trying to understand the role of Good Clinical Practices (GCPs) in implementing compliance with Part 11 for clinical trials?
(Cross-tabulated by Q2: Please tell us about the organizations with which you have worked in the last five years. (Select all that apply). Cells show count (column %).)

Rating | Total | Contract Research Organization | Pharma/Biotech Company | Medical Device/IVD Company | Consulting Company | Other, please specify:
Total Count (All) | 89 | 8 | 40 | 23 | 12 | 6

Understanding GCPs to implement 21 CFR Part 11 compliance
Most Challenging | 1 (1.1%) | 0 (0.0%) | 1 (2.5%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very Challenging | 9 (10.1%) | 0 (0.0%) | 6 (15.0%) | 2 (8.7%) | 0 (0.0%) | 1 (16.7%)
Marginally Challenging | 53 (59.6%) | 6 (75.0%) | 25 (62.5%) | 11 (47.8%) | 8 (66.7%) | 3 (50.0%)
Not Challenging | 15 (16.9%) | 1 (12.5%) | 5 (12.5%) | 6 (26.1%) | 3 (25.0%) | 0 (0.0%)
Not Sure | 9 (10.1%) | 1 (12.5%) | 2 (5.0%) | 3 (13.0%) | 1 (8.3%) | 2 (33.3%)

Having sufficient regulatory requirements/guidances
Most Challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very Challenging | 15 (16.9%) | 1 (12.5%) | 8 (20.0%) | 4 (17.4%) | 2 (16.7%) | 0 (0.0%)
Marginally Challenging | 41 (46.1%) | 4 (50.0%) | 21 (52.5%) | 9 (39.1%) | 5 (41.7%) | 2 (33.3%)
Not Challenging | 21 (23.6%) | 2 (25.0%) | 7 (17.5%) | 7 (30.4%) | 4 (33.3%) | 1 (16.7%)
Not Sure | 7 (7.9%) | 1 (12.5%) | 2 (5.0%) | 2 (8.7%) | 0 (0.0%) | 2 (33.3%)

Incorporating IT Standards such as NIST, ISO, CDISC
Most Challenging | 1 (1.1%) | 0 (0.0%) | 1 (2.5%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very Challenging | 26 (29.2%) | 3 (37.5%) | 12 (30.0%) | 6 (26.1%) | 3 (25.0%) | 2 (33.3%)
Marginally Challenging | 30 (33.7%) | 2 (25.0%) | 15 (37.5%) | 7 (30.4%) | 6 (50.0%) | 0 (0.0%)
Not Challenging | 9 (10.1%) | 1 (12.5%) | 3 (7.5%) | 3 (13.0%) | 1 (8.3%) | 1 (16.7%)
Not Sure | 19 (21.3%) | 2 (25.0%) | 8 (20.0%) | 5 (21.7%) | 2 (16.7%) | 2 (33.3%)

Other, please specify:
Most Challenging | 4 (4.5%) | 1 (12.5%) | 1 (2.5%) | 1 (4.3%) | 1 (8.3%) | 0 (0.0%)
Very Challenging | 2 (2.2%) | 0 (0.0%) | 1 (2.5%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Marginally Challenging | 1 (1.1%) | 0 (0.0%) | 0 (0.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Not Challenging | 3 (3.4%) | 0 (0.0%) | 1 (2.5%) | 1 (4.3%) | 0 (0.0%) | 1 (16.7%)
Not Sure | 9 (10.1%) | 2 (25.0%) | 5 (12.5%) | 1 (4.3%) | 1 (8.3%) | 0 (0.0%)
Table 30: Q14: How challenging were the following in completing validations for consumer wearables?
(Cross-tabulated by Q2: Please tell us about the organizations with which you have worked in the last five years. (Select all that apply). Cells show count (column %).)

Rating | Total | Contract Research Organization | Pharma/Biotech Company | Medical Device/IVD Company | Consulting Company | Other, please specify:
Total Count (All) | 89 | 8 | 40 | 23 | 12 | 6

Understanding Good Clinical Practices (GCPs) related to validations
Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very challenging | 3 (3.4%) | 0 (0.0%) | 1 (2.5%) | 2 (8.7%) | 0 (0.0%) | 0 (0.0%)
Moderately challenging | 15 (16.9%) | 0 (0.0%) | 9 (22.5%) | 3 (13.0%) | 2 (16.7%) | 1 (16.7%)
Not challenging | 11 (12.4%) | 0 (0.0%) | 3 (7.5%) | 6 (26.1%) | 1 (8.3%) | 1 (16.7%)
Not Sure | 3 (3.4%) | 0 (0.0%) | 2 (5.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)

Documenting plans and procedures
Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very challenging | 9 (10.1%) | 0 (0.0%) | 5 (12.5%) | 4 (17.4%) | 0 (0.0%) | 0 (0.0%)
Moderately challenging | 17 (19.1%) | 0 (0.0%) | 7 (17.5%) | 5 (21.7%) | 3 (25.0%) | 2 (33.3%)
Not challenging | 3 (3.4%) | 0 (0.0%) | 1 (2.5%) | 2 (8.7%) | 0 (0.0%) | 0 (0.0%)
Not Sure | 3 (3.4%) | 0 (0.0%) | 2 (5.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)

Assuring manpower to perform validations
Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very challenging | 7 (7.9%) | 0 (0.0%) | 3 (7.5%) | 3 (13.0%) | 1 (8.3%) | 0 (0.0%)
Moderately challenging | 16 (18.0%) | 0 (0.0%) | 7 (17.5%) | 5 (21.7%) | 2 (16.7%) | 2 (33.3%)
Not challenging | 6 (6.7%) | 0 (0.0%) | 3 (7.5%) | 3 (13.0%) | 0 (0.0%) | 0 (0.0%)
Not Sure | 3 (3.4%) | 0 (0.0%) | 2 (5.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)

Ensuring knowledge to perform validations
Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very challenging | 11 (12.4%) | 0 (0.0%) | 6 (15.0%) | 4 (17.4%) | 1 (8.3%) | 0 (0.0%)
Moderately challenging | 12 (13.5%) | 0 (0.0%) | 5 (12.5%) | 4 (17.4%) | 1 (8.3%) | 2 (33.3%)
Not challenging | 6 (6.7%) | 0 (0.0%) | 2 (5.0%) | 3 (13.0%) | 1 (8.3%) | 0 (0.0%)
Not Sure | 3 (3.4%) | 0 (0.0%) | 2 (5.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)

Meeting timelines
Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%)
Very challenging | 15 (16.9%) | 0 (0.0%) | 6 (15.0%) | 5 (21.7%) | 2 (16.7%) | 2 (33.3%)
Moderately challenging | 10 (11.2%) | 0 (0.0%) | 5 (12.5%) | 4 (17.4%) | 1 (8.3%) | 0 (0.0%)
Not challenging | 2 (2.2%) | 0 (0.0%) | 1 (2.5%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Not Sure | 5 (5.6%) | 0 (0.0%) | 3 (7.5%) | 2 (8.7%) | 0 (0.0%) | 0 (0.0%)

Securing financial resources
Most challenging | 1 (1.1%) | 0 (0.0%) | 0 (0.0%) | 1 (4.3%) | 0 (0.0%) | 0 (0.0%)
Very challenging | 5 (5.6%) | 0 (0.0%) | 3 (7.5%) | 2 (8.7%) | 0 (0.0%) | 0 (0.0%)
Moderately challenging | 9 (10.1%) | 0 (0.0%) | 5 (12.5%) | 2 (8.7%) | 2 (16.7%) | 0 (0.0%)
Not challenging | 6 (6.7%) | 0 (0.0%) | 1 (2.5%) | 3 (13.0%) | 1 (8.3%) | 1 (16.7%)
Not Sure | 11 (12.4%) | 0 (0.0%) | 6 (15.0%) | 4 (17.4%) | 0 (0.0%) | 1 (16.7%)
Table 31: Q2: How well do you think that your company understands Part 11 requirements for clinical trials?
(Cross-tabulated by Q4: What is your current role? Cells show count (column %).)

Response | Total | VP/President | Dir/Sr Dir | Mgr/Sr Mgr | Specialist/Associate | Consultant | Other, please specify:
Total Count (All) | 60 | 8 | 27 | 10 | 4 | 7 |
Very confident | 4 (6.7%) | 0 (0.0%) | 2 (7.4%) | 0 (0.0%) | 1 (25.0%) | 1 (14.3%) |
Confident | 5 (8.3%) | 0 (0.0%) | 2 (7.4%) | 1 (10.0%) | 0 (0.0%) | 2 (28.6%) |
Moderately confident | 2 (3.3%) | 1 (12.5%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
Not very confident | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) | 0 (0.0%) |
Not sure | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) |
Table 32: Q9: How familiar are you with the following regulations, standards and
guidances? (cross-tabulated by Q4: What is your current role? - Selected Choice)

| Item / Response | Total | VP/President | Dir/Sr Dir | Mgr/Sr Mgr | Specialist/Associate | Consultant | Other, please specify: |
| Total Count (All) | 60 | 8 | 27 | 10 | 4 | 7 | 4 |

21 CFR Part 11 Regulation
| No knowledge | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) | 0 (0.0%) | 1 (25.0%) |
| Some knowledge | 21 (35.0%) | 3 (37.5%) | 12 (44.4%) | 2 (20.0%) | 2 (50.0%) | 0 (0.0%) | 2 (50.0%) |
| A lot of knowledge | 22 (36.7%) | 4 (50.0%) | 6 (22.2%) | 5 (50.0%) | 1 (25.0%) | 5 (71.4%) | 1 (25.0%) |
| Expert Knowledge | 15 (25.0%) | 1 (12.5%) | 9 (33.3%) | 3 (30.0%) | 0 (0.0%) | 2 (28.6%) | 0 (0.0%) |

Good Clinical Practices (GCP)
| No knowledge | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Some knowledge | 11 (18.3%) | 0 (0.0%) | 5 (18.5%) | 0 (0.0%) | 3 (75.0%) | 2 (28.6%) | 1 (25.0%) |
| A lot of knowledge | 32 (53.3%) | 5 (62.5%) | 16 (59.3%) | 6 (60.0%) | 1 (25.0%) | 3 (42.9%) | 1 (25.0%) |
| Expert Knowledge | 16 (26.7%) | 3 (37.5%) | 6 (22.2%) | 3 (30.0%) | 0 (0.0%) | 2 (28.6%) | 2 (50.0%) |

2003 - Guidance for Industry Part 11, Electronic Records; Electronic Signatures — Scope and Application
| No knowledge | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Some knowledge | 23 (38.3%) | 4 (50.0%) | 9 (33.3%) | 4 (40.0%) | 3 (75.0%) | 0 (0.0%) | 3 (75.0%) |
| A lot of knowledge | 26 (43.3%) | 3 (37.5%) | 12 (44.4%) | 4 (40.0%) | 1 (25.0%) | 5 (71.4%) | 1 (25.0%) |
| Expert Knowledge | 11 (18.3%) | 1 (12.5%) | 6 (22.2%) | 2 (20.0%) | 0 (0.0%) | 2 (28.6%) | 0 (0.0%) |

2007 - Guidance for Industry Computerized Systems Used in Clinical Investigations
| No knowledge | 7 (11.7%) | 1 (12.5%) | 2 (7.4%) | 1 (10.0%) | 1 (25.0%) | 0 (0.0%) | 2 (50.0%) |
| Some knowledge | 23 (38.3%) | 4 (50.0%) | 9 (33.3%) | 5 (50.0%) | 2 (50.0%) | 1 (14.3%) | 2 (50.0%) |
| A lot of knowledge | 22 (36.7%) | 2 (25.0%) | 11 (40.7%) | 3 (30.0%) | 1 (25.0%) | 5 (71.4%) | 0 (0.0%) |
| Expert Knowledge | 8 (13.3%) | 1 (12.5%) | 5 (18.5%) | 1 (10.0%) | 0 (0.0%) | 1 (14.3%) | 0 (0.0%) |

2017 - Use of Electronic Records and Electronic Signatures in Clinical Investigations Under 21 CFR Part 11 – Questions and Answers
| No knowledge | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) | 0 (0.0%) | 0 (0.0%) |
| Some knowledge | 27 (45.0%) | 5 (62.5%) | 11 (40.7%) | 4 (40.0%) | 2 (50.0%) | 1 (14.3%) | 4 (100.0%) |
| A lot of knowledge | 21 (35.0%) | 2 (25.0%) | 11 (40.7%) | 3 (30.0%) | 1 (25.0%) | 4 (57.1%) | 0 (0.0%) |
| Expert Knowledge | 11 (18.3%) | 1 (12.5%) | 5 (18.5%) | 3 (30.0%) | 0 (0.0%) | 2 (28.6%) | 0 (0.0%) |

2022 - Digital Health Technologies for Remote Data Acquisition in Clinical Investigations
| No knowledge | 12 (20.0%) | 1 (12.5%) | 3 (11.1%) | 4 (40.0%) | 1 (25.0%) | 2 (28.6%) | 1 (25.0%) |
| Some knowledge | 23 (38.3%) | 5 (62.5%) | 8 (29.6%) | 3 (30.0%) | 3 (75.0%) | 2 (28.6%) | 2 (50.0%) |
| A lot of knowledge | 16 (26.7%) | 2 (25.0%) | 9 (33.3%) | 1 (10.0%) | 0 (0.0%) | 3 (42.9%) | 1 (25.0%) |
| Expert Knowledge | 9 (15.0%) | 0 (0.0%) | 7 (25.9%) | 2 (20.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
Table 33: Q11: What challenges did you face when trying to understand the role of
Good Clinical Practices (GCPs) in implementing compliance with Part 11 for
clinical trials? (cross-tabulated by Q4: What is your current role? - Selected Choice)

| Item / Response | Total | VP/President | Dir/Sr Dir | Mgr/Sr Mgr | Specialist/Associate | Consultant | Other, please specify: |
| Total Count (All) | 60 | 8 | 27 | 10 | 4 | 7 | 4 |

Understanding GCPs to implement 21 CFR Part 11 compliance
| Most Challenging | 1 (1.7%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 7 (11.7%) | 1 (12.5%) | 3 (11.1%) | 2 (20.0%) | 0 (0.0%) | 1 (14.3%) | 0 (0.0%) |
| Marginally Challenging | 34 (56.7%) | 3 (37.5%) | 17 (63.0%) | 6 (60.0%) | 1 (25.0%) | 5 (71.4%) | 2 (50.0%) |
| Not Challenging | 12 (20.0%) | 4 (50.0%) | 5 (18.5%) | 1 (10.0%) | 1 (25.0%) | 1 (14.3%) | 0 (0.0%) |
| Not Sure | 5 (8.3%) | 0 (0.0%) | 0 (0.0%) | 1 (10.0%) | 2 (50.0%) | 0 (0.0%) | 2 (50.0%) |

Having sufficient regulatory requirements/guidances
| Most Challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 12 (20.0%) | 1 (12.5%) | 6 (22.2%) | 3 (30.0%) | 0 (0.0%) | 2 (28.6%) | 0 (0.0%) |
| Marginally Challenging | 26 (43.3%) | 2 (25.0%) | 17 (63.0%) | 3 (30.0%) | 1 (25.0%) | 3 (42.9%) | 0 (0.0%) |
| Not Challenging | 14 (23.3%) | 4 (50.0%) | 3 (11.1%) | 3 (30.0%) | 1 (25.0%) | 1 (14.3%) | 2 (50.0%) |
| Not Sure | 5 (8.3%) | 1 (12.5%) | 0 (0.0%) | 1 (10.0%) | 2 (50.0%) | 0 (0.0%) | 1 (25.0%) |

Incorporating IT Standards such as NIST, ISO, CDISC
| Most Challenging | 1 (1.7%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 16 (26.7%) | 1 (12.5%) | 8 (29.6%) | 4 (40.0%) | 0 (0.0%) | 3 (42.9%) | 0 (0.0%) |
| Marginally Challenging | 21 (35.0%) | 4 (50.0%) | 12 (44.4%) | 1 (10.0%) | 1 (25.0%) | 2 (28.6%) | 1 (25.0%) |
| Not Challenging | 7 (11.7%) | 0 (0.0%) | 5 (18.5%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) |
| Not Sure | 12 (20.0%) | 2 (25.0%) | 0 (0.0%) | 4 (40.0%) | 3 (75.0%) | 2 (28.6%) | 1 (25.0%) |

Other, please specify:
| Most Challenging | 1 (1.7%) | 1 (12.5%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Marginally Challenging | 1 (1.7%) | 1 (12.5%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Challenging | 3 (5.0%) | 0 (0.0%) | 2 (7.4%) | 0 (0.0%) | 1 (25.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Sure | 6 (10.0%) | 1 (12.5%) | 4 (14.8%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
Table 34: Q14: How challenging were the following in completing validations for
consumer wearables? (cross-tabulated by Q4: What is your current role? - Selected Choice)

| Item / Response | Total | VP/President | Dir/Sr Dir | Mgr/Sr Mgr | Specialist/Associate | Consultant | Other, please specify: |
| Total Count (All) | 60 | 8 | 27 | 10 | 4 | 7 | 4 |

Understanding Good Clinical Practices (GCPs) related to validations
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 2 (3.3%) | 1 (12.5%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (14.3%) | 0 (0.0%) |
| Moderately challenging | 10 (16.7%) | 0 (0.0%) | 6 (22.2%) | 2 (20.0%) | 1 (25.0%) | 1 (14.3%) | 0 (0.0%) |
| Not challenging | 7 (11.7%) | 0 (0.0%) | 3 (11.1%) | 2 (20.0%) | 1 (25.0%) | 1 (14.3%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) |

Documenting plans and procedures
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 6 (10.0%) | 1 (12.5%) | 3 (11.1%) | 1 (10.0%) | 0 (0.0%) | 1 (14.3%) | 0 (0.0%) |
| Moderately challenging | 10 (16.7%) | 0 (0.0%) | 5 (18.5%) | 2 (20.0%) | 1 (25.0%) | 2 (28.6%) | 0 (0.0%) |
| Not challenging | 3 (5.0%) | 0 (0.0%) | 1 (3.7%) | 1 (10.0%) | 1 (25.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) |

Assuring manpower to perform validations
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 5 (8.3%) | 1 (12.5%) | 2 (7.4%) | 1 (10.0%) | 0 (0.0%) | 1 (14.3%) | 0 (0.0%) |
| Moderately challenging | 9 (15.0%) | 0 (0.0%) | 4 (14.8%) | 2 (20.0%) | 1 (25.0%) | 2 (28.6%) | 0 (0.0%) |
| Not challenging | 5 (8.3%) | 0 (0.0%) | 3 (11.1%) | 1 (10.0%) | 1 (25.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) |

Ensuring knowledge to perform validations
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 8 (13.3%) | 1 (12.5%) | 4 (14.8%) | 1 (10.0%) | 0 (0.0%) | 2 (28.6%) | 0 (0.0%) |
| Moderately challenging | 8 (13.3%) | 0 (0.0%) | 4 (14.8%) | 2 (20.0%) | 1 (25.0%) | 1 (14.3%) | 0 (0.0%) |
| Not challenging | 3 (5.0%) | 0 (0.0%) | 1 (3.7%) | 1 (10.0%) | 1 (25.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) |

Meeting timelines
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 10 (16.7%) | 1 (12.5%) | 5 (18.5%) | 1 (10.0%) | 0 (0.0%) | 3 (42.9%) | 0 (0.0%) |
| Moderately challenging | 6 (10.0%) | 0 (0.0%) | 3 (11.1%) | 1 (10.0%) | 2 (50.0%) | 0 (0.0%) | 0 (0.0%) |
| Not challenging | 2 (3.3%) | 0 (0.0%) | 1 (3.7%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Sure | 3 (5.0%) | 0 (0.0%) | 1 (3.7%) | 1 (10.0%) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) |

Securing financial resources
| Most challenging | 1 (1.7%) | 0 (0.0%) | 1 (3.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 4 (6.7%) | 1 (12.5%) | 3 (11.1%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Moderately challenging | 7 (11.7%) | 0 (0.0%) | 4 (14.8%) | 0 (0.0%) | 2 (50.0%) | 1 (14.3%) | 0 (0.0%) |
| Not challenging | 3 (5.0%) | 0 (0.0%) | 1 (3.7%) | 1 (10.0%) | 0 (0.0%) | 1 (14.3%) | 0 (0.0%) |
| Not Sure | 6 (10.0%) | 0 (0.0%) | 1 (3.7%) | 3 (30.0%) | 0 (0.0%) | 1 (14.3%) | 1 (25.0%) |
Table 35: Q2: How well do you think that your company understands Part 11
requirements for clinical trials? (cross-tabulated by Q5: What is the size of
your company? (Number of Employees))

| Response | Total | Less than 200 | 201-2000 | 2001-20,000 | More than 20,000 | Not sure |
| Total Count (All) | 60 | 18 | 15 | 6 | 21 | 0 |
| Very confident | 4 (6.7%) | 0 (0.0%) | 1 (6.7%) | 2 (33.3%) | 1 (4.8%) | 0 (0.0%) |
| Confident | 5 (8.3%) | 4 (22.2%) | 1 (6.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Moderately confident | 2 (3.3%) | 2 (11.1%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Not very confident | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 1 (16.7%) | 0 (0.0%) | 0 (0.0%) |
| Not sure | 1 (1.7%) | 0 (0.0%) | 1 (6.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
Table 36: Q9: How familiar are you with the following regulations, standards and
guidances? (cross-tabulated by Q5: What is the size of your company? (Number of
Employees))

| Item / Response | Total | Less than 200 | 201-2000 | 2001-20,000 | More than 20,000 | Not sure |
| Total Count (All) | 60 | 18 | 15 | 6 | 21 | 0 |

21 CFR Part 11 Regulation
| No knowledge | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 1 (16.7%) | 1 (4.8%) | 0 (0.0%) |
| Some knowledge | 21 (35.0%) | 5 (27.8%) | 5 (33.3%) | 2 (33.3%) | 9 (42.9%) | 0 (0.0%) |
| A lot of knowledge | 22 (36.7%) | 10 (55.6%) | 7 (46.7%) | 0 (0.0%) | 5 (23.8%) | 0 (0.0%) |
| Expert Knowledge | 15 (25.0%) | 3 (16.7%) | 3 (20.0%) | 3 (50.0%) | 6 (28.6%) | 0 (0.0%) |

Good Clinical Practices (GCP)
| No knowledge | 1 (1.7%) | 0 (0.0%) | 1 (6.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Some knowledge | 11 (18.3%) | 2 (11.1%) | 3 (20.0%) | 3 (50.0%) | 3 (14.3%) | 0 (0.0%) |
| A lot of knowledge | 32 (53.3%) | 7 (38.9%) | 8 (53.3%) | 3 (50.0%) | 14 (66.7%) | 0 (0.0%) |
| Expert Knowledge | 16 (26.7%) | 9 (50.0%) | 3 (20.0%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |

2003 - Guidance for Industry Part 11, Electronic Records; Electronic Signatures — Scope and Application
| No knowledge | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Some knowledge | 23 (38.3%) | 5 (27.8%) | 6 (40.0%) | 3 (50.0%) | 9 (42.9%) | 0 (0.0%) |
| A lot of knowledge | 26 (43.3%) | 10 (55.6%) | 8 (53.3%) | 0 (0.0%) | 8 (38.1%) | 0 (0.0%) |
| Expert Knowledge | 11 (18.3%) | 3 (16.7%) | 1 (6.7%) | 3 (50.0%) | 4 (19.0%) | 0 (0.0%) |

2007 - Guidance for Industry Computerized Systems Used in Clinical Investigations
| No knowledge | 7 (11.7%) | 3 (16.7%) | 3 (20.0%) | 0 (0.0%) | 1 (4.8%) | 0 (0.0%) |
| Some knowledge | 23 (38.3%) | 5 (27.8%) | 6 (40.0%) | 3 (50.0%) | 9 (42.9%) | 0 (0.0%) |
| A lot of knowledge | 22 (36.7%) | 8 (44.4%) | 5 (33.3%) | 1 (16.7%) | 8 (38.1%) | 0 (0.0%) |
| Expert Knowledge | 8 (13.3%) | 2 (11.1%) | 1 (6.7%) | 2 (33.3%) | 3 (14.3%) | 0 (0.0%) |

2017 - Use of Electronic Records and Electronic Signatures in Clinical Investigations Under 21 CFR Part 11 – Questions and Answers
| No knowledge | 1 (1.7%) | 1 (5.6%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Some knowledge | 27 (45.0%) | 6 (33.3%) | 7 (46.7%) | 3 (50.0%) | 11 (52.4%) | 0 (0.0%) |
| A lot of knowledge | 21 (35.0%) | 7 (38.9%) | 6 (40.0%) | 1 (16.7%) | 7 (33.3%) | 0 (0.0%) |
| Expert Knowledge | 11 (18.3%) | 4 (22.2%) | 2 (13.3%) | 2 (33.3%) | 3 (14.3%) | 0 (0.0%) |

2022 - Digital Health Technologies for Remote Data Acquisition in Clinical Investigations
| No knowledge | 12 (20.0%) | 6 (33.3%) | 5 (33.3%) | 0 (0.0%) | 1 (4.8%) | 0 (0.0%) |
| Some knowledge | 23 (38.3%) | 5 (27.8%) | 6 (40.0%) | 4 (66.7%) | 8 (38.1%) | 0 (0.0%) |
| A lot of knowledge | 16 (26.7%) | 6 (33.3%) | 4 (26.7%) | 1 (16.7%) | 5 (23.8%) | 0 (0.0%) |
| Expert Knowledge | 9 (15.0%) | 1 (5.6%) | 0 (0.0%) | 1 (16.7%) | 7 (33.3%) | 0 (0.0%) |
Table 37: Q11: What challenges did you face when trying to understand the role of
Good Clinical Practices (GCPs) in implementing compliance with Part 11 for
clinical trials? (cross-tabulated by Q5: What is the size of your company?
(Number of Employees))

| Item / Response | Total | Less than 200 | 201-2000 | 2001-20,000 | More than 20,000 | Not sure |
| Total Count (All) | 60 | 18 | 15 | 6 | 21 | 0 |

Understanding GCPs to implement 21 CFR Part 11 compliance
| Most Challenging | 1 (1.7%) | 0 (0.0%) | 1 (6.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 7 (11.7%) | 3 (16.7%) | 3 (20.0%) | 0 (0.0%) | 1 (4.8%) | 0 (0.0%) |
| Marginally Challenging | 34 (56.7%) | 9 (50.0%) | 8 (53.3%) | 3 (50.0%) | 14 (66.7%) | 0 (0.0%) |
| Not Challenging | 12 (20.0%) | 6 (33.3%) | 2 (13.3%) | 1 (16.7%) | 3 (14.3%) | 0 (0.0%) |
| Not Sure | 5 (8.3%) | 0 (0.0%) | 1 (6.7%) | 2 (33.3%) | 2 (9.5%) | 0 (0.0%) |

Having sufficient regulatory requirements/guidances
| Most Challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 12 (20.0%) | 5 (27.8%) | 4 (26.7%) | 1 (16.7%) | 2 (9.5%) | 0 (0.0%) |
| Marginally Challenging | 26 (43.3%) | 7 (38.9%) | 7 (46.7%) | 1 (16.7%) | 11 (52.4%) | 0 (0.0%) |
| Not Challenging | 14 (23.3%) | 5 (27.8%) | 2 (13.3%) | 2 (33.3%) | 5 (23.8%) | 0 (0.0%) |
| Not Sure | 5 (8.3%) | 0 (0.0%) | 1 (6.7%) | 2 (33.3%) | 2 (9.5%) | 0 (0.0%) |

Incorporating IT Standards such as NIST, ISO, CDISC
| Most Challenging | 1 (1.7%) | 0 (0.0%) | 1 (6.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very Challenging | 16 (26.7%) | 7 (38.9%) | 7 (46.7%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |
| Marginally Challenging | 21 (35.0%) | 3 (16.7%) | 3 (20.0%) | 3 (50.0%) | 12 (57.1%) | 0 (0.0%) |
| Not Challenging | 7 (11.7%) | 3 (16.7%) | 1 (6.7%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |
| Not Sure | 12 (20.0%) | 5 (27.8%) | 2 (13.3%) | 2 (33.3%) | 3 (14.3%) | 0 (0.0%) |

Other, please specify:
| Most Challenging | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (4.8%) | 0 (0.0%) |
| Very Challenging | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 1 (4.8%) | 0 (0.0%) |
| Marginally Challenging | 1 (1.7%) | 0 (0.0%) | 0 (0.0%) | 1 (16.7%) | 0 (0.0%) | 0 (0.0%) |
| Not Challenging | 3 (5.0%) | 2 (11.1%) | 1 (6.7%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Not Sure | 6 (10.0%) | 1 (5.6%) | 2 (13.3%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |
Table 38: Q14: How challenging were the following in completing validations for
consumer wearables? (cross-tabulated by Q5: What is the size of your company?
(Number of Employees))

| Item / Response | Total | Less than 200 | 201-2000 | 2001-20,000 | More than 20,000 | Not sure |
| Total Count (All) | 60 | 18 | 15 | 6 | 21 | 0 |

Understanding Good Clinical Practices (GCPs) related to validations
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 2 (3.3%) | 2 (11.1%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Moderately challenging | 10 (16.7%) | 1 (5.6%) | 1 (6.7%) | 0 (0.0%) | 8 (38.1%) | 0 (0.0%) |
| Not challenging | 7 (11.7%) | 3 (16.7%) | 1 (6.7%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |

Documenting plans and procedures
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 6 (10.0%) | 2 (11.1%) | 1 (6.7%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |
| Moderately challenging | 10 (16.7%) | 3 (16.7%) | 1 (6.7%) | 0 (0.0%) | 6 (28.6%) | 0 (0.0%) |
| Not challenging | 3 (5.0%) | 1 (5.6%) | 0 (0.0%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |

Assuring manpower to perform validations
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 5 (8.3%) | 2 (11.1%) | 1 (6.7%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |
| Moderately challenging | 9 (15.0%) | 3 (16.7%) | 1 (6.7%) | 0 (0.0%) | 5 (23.8%) | 0 (0.0%) |
| Not challenging | 5 (8.3%) | 1 (5.6%) | 0 (0.0%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |

Ensuring knowledge to perform validations
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 8 (13.3%) | 3 (16.7%) | 1 (6.7%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |
| Moderately challenging | 8 (13.3%) | 3 (16.7%) | 1 (6.7%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |
| Not challenging | 3 (5.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |
| Not Sure | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |

Meeting timelines
| Most challenging | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 10 (16.7%) | 5 (27.8%) | 1 (6.7%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |
| Moderately challenging | 6 (10.0%) | 1 (5.6%) | 1 (6.7%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |
| Not challenging | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 2 (9.5%) | 0 (0.0%) |
| Not Sure | 3 (5.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |

Securing financial resources
| Most challenging | 1 (1.7%) | 1 (5.6%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Very challenging | 4 (6.7%) | 1 (5.6%) | 0 (0.0%) | 0 (0.0%) | 3 (14.3%) | 0 (0.0%) |
| Moderately challenging | 7 (11.7%) | 2 (11.1%) | 0 (0.0%) | 0 (0.0%) | 5 (23.8%) | 0 (0.0%) |
| Not challenging | 3 (5.0%) | 1 (5.6%) | 1 (6.7%) | 0 (0.0%) | 1 (4.8%) | 0 (0.0%) |
| Not Sure | 6 (10.0%) | 1 (5.6%) | 1 (6.7%) | 0 (0.0%) | 4 (19.0%) | 0 (0.0%) |
Table 39: Q1: In one word, how would you characterize FDA guidances to support the
Part 11 regulation? (cross-tabulated by Q7: With how many clinical trials have you
personally worked over the past 5 years?)

| Response | Total | None | 5 or less | 5-15 | More than 15 | Not sure |
| Total Count (All) | 60 | 10 | 13 | 22 | 14 | 1 |
| Confusing | 4 (6.7%) | 1 (10.0%) | 0 (0.0%) | 3 (13.6%) | 0 (0.0%) | 0 (0.0%) |
| Excessive | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) |
| Insufficient | 6 (10.0%) | 1 (10.0%) | 1 (7.7%) | 2 (9.1%) | 2 (14.3%) | 0 (0.0%) |
| Reasonable | 20 (33.3%) | 0 (0.0%) | 6 (46.2%) | 7 (31.8%) | 7 (50.0%) | 0 (0.0%) |
| Other, please specify: | 2 (3.3%) | 0 (0.0%) | 0 (0.0%) | 2 (9.1%) | 0 (0.0%) | 0 (0.0%) |
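Each cell in the cross-tabulations above pairs a count with that count expressed as a percentage of the column's Total Count (All). A minimal sketch of that relationship is shown below; the respondent records and the `cell` helper are hypothetical illustrations, not data or code from this survey.

```python
# Sketch: how a count/percentage cell in a cross-tab like Table 39 is formed.
# The percentage base is the number of respondents in the column, not the
# number who answered the question (which is why column percentages may not
# sum to 100%).
from collections import Counter

# Hypothetical respondent-level records: (Q1 answer, Q7 experience bracket).
records = [
    ("Reasonable", "5 or less"),
    ("Confusing", "5-15"),
    ("Reasonable", "5 or less"),
    ("Insufficient", "5-15"),
]

cell_counts = Counter(records)                              # (answer, bracket) -> count
column_totals = Counter(bracket for _, bracket in records)  # bracket -> column n

def cell(answer: str, bracket: str) -> str:
    """Format one cell as 'count (percent of column total)'."""
    n = cell_counts[(answer, bracket)]
    pct = 100.0 * n / column_totals[bracket]
    return f"{n} ({pct:.1f}%)"

print(cell("Reasonable", "5 or less"))  # -> "2 (100.0%)"
```

Applied to the real table, the same arithmetic reproduces, for example, the Confusing/Total cell: 4 of 60 respondents is 6.7% after rounding to one decimal place.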
Asset Metadata
Creator: Malhotra, Anjali (author)
Core Title: 21 CFR Part 11 compliance for digital health technologies in clinical trials
School: School of Pharmacy
Degree: Doctor of Regulatory Science
Degree Program: Regulatory Science
Degree Conferral Date: 2023-12
Publication Date: 12/21/2023
Defense Date: 12/15/2023
Publisher: Los Angeles, California (original); University of Southern California (original); University of Southern California. Libraries (digital)
Tag: 21 CFR Part 11, clinical trials, compliance, CT, CTS, DHT, DHTs, digital health technologies, digital health technology, OAI-PMH Harvest, part 11
Format: theses (aat)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Richmond, Frances (committee chair); Church, Terry (committee member); Myles, Lequina (committee member); Pire-Smerkanich, Nancy (committee member)
Creator Email: anjali.malhotra@immunitybio.com, anjalima@usc.edu
Permanent Link (DOI): https://doi.org/10.25549/usctheses-oUC113797580
Unique Identifier: UC113797580
Identifier: etd-MalhotraAn-12580.pdf (filename)
Legacy Identifier: etd-MalhotraAn-12580
Document Type: Dissertation
Internet Media Type: application/pdf
Type: texts
Rights: Malhotra, Anjali
Source: 20231221-usctheses-batch-1117 (batch); University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the author, as the original true and official version of the work, but does not grant the reader permission to use the work if the desired use is covered by copyright. It is the author, as rights holder, who must provide use permission if such use is covered by copyright.
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Repository Email: cisadmin@lib.usc.edu